Validating Product Ideas - Tomer Sharon

participant criteria into measurable benchmarks (e.g., travels for business at least three times a year).

      3 Transform the benchmark into a screening question or questions (e.g., How often do you go on an airplane?). If a person chooses the “right” answer, she’s in; if not, she’s out (see the sketch after this list).

      4 Craft a screening questionnaire (also called a screener) you can send people. (Here is a sample screener—bit.ly/validating-chapter-1-screener.)

      5 Pilot-test the screener with a couple of people and make improvements.

      6 Use social media to find research participants quickly and effectively. Chapter 9 guides you through using social media to find research participants, along with detailed steps and examples for creating great screeners.
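
      The screening logic in step 3 boils down to a simple filter: keep the people whose answers meet the benchmark and drop the rest. Here is a minimal Python sketch of that filter, assuming the screener responses were exported as a CSV (the file name, column header, and qualifying answer are all hypothetical):

```python
import csv

# Hypothetical screener: only people who fly "3 or more times a year" qualify.
QUALIFYING_ANSWERS = {"3 or more times a year"}
QUESTION_COLUMN = "How often do you go on an airplane?"  # assumed column header

def screen_responses(path):
    """Return only the rows whose answer to the screening question qualifies."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        return [row for row in reader
                if row.get(QUESTION_COLUMN, "").strip() in QUALIFYING_ANSWERS]

qualified = screen_responses("screener_responses.csv")  # assumed export file name
print(f"{len(qualified)} people passed the screener")
```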

      Carefully consider the time it will take study participants to answer the question each time you ask it. You can certainly ask a lead question plus a few follow-up questions, and you can even ask participants to take a picture if it better explains their answer. As a general rule of thumb, try not to take more than one minute of each participant’s time per prompt, so if you ask the question five times a day, the required effort is five minutes per day. The more time you demand, the fewer answers you are likely to get; participants will feel overwhelmed and quickly lose interest in the study.

      You need to make three decisions before you start the study:

      • The number of study participants

      • The length of the study

      • The frequency at which you ask the question

      First, think about how many valuable data points (single responses to the question) you want to have by the end of the study. You have two bad options and one good one:

      1 Too much: 20,000 answers will be extremely time-consuming to analyze. Even 2,000 answers are a lot to handle. As you’ll see in a later step, each answer needs to be read, understood, classified, and verified, which can take 1 to 5 minutes per answer. That translates to 4–20 workdays of analysis for 2,000 answers. That’s too much.

      2 Too little: 20, 50, or even 200 answers are not going to give you enough information to work with. You won’t feel very confident in your data or be sure what to do next.

      3 Just about right: 500–1,000 answers is a range you can work with, be confident is comprehensive, and handle alone or with a team that supports the analysis. Analyzing this many data points takes roughly 1–4 days alone, or a couple of hours to a day with a team (see the sketch after this list).
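
      To get a feel for these numbers before committing to a target, you can run the arithmetic yourself. Here is a minimal sketch that turns an answer count and a per-answer analysis time into workdays (the 8-hour workday is an assumption):

```python
def analysis_workdays(num_answers, minutes_per_answer, hours_per_day=8):
    """Estimate workdays needed to read, classify, and verify every answer."""
    total_hours = num_answers * minutes_per_answer / 60
    return total_hours / hours_per_day

# 2,000 answers at 1-5 minutes each: roughly 4-21 workdays of solo analysis.
print(analysis_workdays(2000, 1))  # ~4 workdays
print(analysis_workdays(2000, 5))  # ~21 workdays

# 500-1,000 answers stay within about 1-4 workdays.
print(analysis_workdays(500, 1))   # ~1 workday
print(analysis_workdays(1000, 2))  # ~4 workdays
```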

      As soon as you have a target number of answers to collect, calculate backward. Take into account that about a third of the answers you want will be lost (see Figure 1.2) because participants don’t give you all the answers they committed to, because they submit duplicates (several identical answers within a matter of seconds), or because they submit useless answers you can’t make sense of. For example, let’s assume that you want 1,000 valuable responses. If one-third will be lost, you need to collect 1,500 responses. If you ask the question 5 times a day for 3 days and 100 people participate in the study, you will potentially get 1,500 responses (5 times a day × 3 days × 100 participants = 1,500 potential answers).

[Figure 1.2]
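
      The backward calculation is easy to script so you can play with the numbers. A minimal sketch, assuming the one-third loss rate described above (the function and variable names are mine):

```python
import math

def responses_to_collect(target_valuable, loss_rate=1/3):
    """Raw responses needed so that, after losses, the target number remains."""
    return math.ceil(target_valuable / (1 - loss_rate))

def participants_needed(raw_responses, prompts_per_day, days):
    """Participants needed to reach the raw-response target."""
    return math.ceil(raw_responses / (prompts_per_day * days))

# The example from the text: 1,000 valuable responses with one-third lost.
raw = responses_to_collect(1000)       # 1,500 raw responses to collect
print(raw)
print(participants_needed(raw, 5, 3))  # 100 participants at 5 prompts/day for 3 days
```

      The same arithmetic covers lower-frequency studies, such as the once-a-week example below: one prompt a week for 10 weeks with 50 people yields 500 potential answers.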

      The number of notifications you send each day is a trade-off between how often you think the behavior actually happens and the point at which notifications annoy or overwhelm participants so much that they stop responding. For example, if you think people take notes on a piece of paper 10 times a day, ask about it 3–5 times a day; asking 10 times is clearly too much. Your estimate might also be wrong: if you ask 10 times a day about something that happens only 5 times a day, your participants will feel uncomfortable.

      Behaviors that occur relatively rarely are another case. If you estimate that your audience updates their websites twice a week, don’t ask about it every day. Ask the question once a week over a period of 10 weeks. If you do that with 50 people, you will collect up to 500 reasons why people update their sites, which is a good number of data points to learn from. Table 1.2 provides additional examples of how frequently to ask an experience sampling question.

Behavior at Stake | Assumed Behavior Frequency | Question Frequency
Using paper to write something down | 10 times a day | 3–5 times a day
Updating a website | Twice a week | Once a week
Grocery shopping | 2–3 times a week | Once a week
Searching on Google | 5 times a day | 1–2 times a day
Using a smartphone | 150 times a day | 5–8 times a day
Boarding a plane | 1–2 times a year | Not a good candidate for experience sampling; better to apply interviewing (Chapter 2), observation (Chapter 3), or a diary study (Chapter 4)

      Decide how you are going to send the question to your study participants: text message (SMS), email, an app, or voice message are all good options. Think about how the data will be collected and choose the simplest way for you. For example, use email to send a simple Google Form. If you are technically savvy, consider using text messages combined with an automation tool (e.g., an IFTTT recipe) that drops responses into a Google spreadsheet. Don’t plan to copy and paste 1,500 text messages into a spreadsheet by hand; that is extremely tedious and time-consuming. Whatever medium you choose, make sure your data is collected in one spreadsheet, ready to be analyzed the moment you need it, which is about 10 minutes after you launch the study.
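
      Whatever collection mechanism you pick, a few lines of cleanup code can strip out the near-duplicate submissions mentioned earlier before you start reading answers. Here is a minimal Python sketch, assuming the spreadsheet is exported as a CSV with Timestamp, Participant, and Answer columns and ISO-formatted timestamps (the column names, file name, and 30-second window are all assumptions; adjust them to whatever your form or automation tool actually produces):

```python
import csv
from datetime import datetime, timedelta

DUPLICATE_WINDOW = timedelta(seconds=30)  # assumed meaning of "a matter of seconds"

def load_answers(path):
    """Load the CSV export and sort answers by participant and time."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    for row in rows:
        row["_ts"] = datetime.fromisoformat(row["Timestamp"])
    return sorted(rows, key=lambda r: (r["Participant"], r["_ts"]))

def drop_duplicates(rows):
    """Drop identical answers a participant submitted within the duplicate window."""
    kept, last_seen = [], {}
    for row in rows:
        key = (row["Participant"], row["Answer"].strip().lower())
        previous = last_seen.get(key)
        if previous is None or row["_ts"] - previous > DUPLICATE_WINDOW:
            kept.append(row)
        last_seen[key] = row["_ts"]
    return kept

answers = drop_duplicates(load_answers("responses.csv"))  # assumed file name
print(f"{len(answers)} answers left after removing near-duplicates")
```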

      The main activity you will perform during

