Today I got a shotgun lesson in surveying potential customers. At the last Startup Weekend, my team and I blasted friends and family with requests for survey responses, which netted around 40 responses. My goal this time was to significantly increase the number of responses, do a better job validating the assumptions in our business model, and avoid spamming friends.
To start, we took a close look at the elements of our business model and identified the key assumptions underpinning the business. From these, we clarified what we wanted to learn from the study, came up with a set of questions, and created a Google Form to capture the responses. Going back over our initial questions, we felt they were too complicated: the mental energy required to answer them would cause participants to disengage before completing the survey.
After modifying the survey, we got feedback from the Startup Weekend mentors, spending one-on-one time with them walking through the survey and asking for feedback on the questions. Interestingly, this kind of feedback was difficult to get: once someone started reading the survey, they would jump into answering it, and it took significant focus to keep the conversation on track and ferret out opinions on the strengths and weaknesses of the questions and the structure of the survey itself.
Here's a big lesson we learned: every mentor we spoke with gave a different opinion, frequently contradicting what other mentors had advised. It seems simple and obvious to say that everyone has a unique opinion, but it's another thing to actually experience receiving contrary suggestions from smart, experienced practitioners who are trying to help you. As has been said many times elsewhere, it's important to apply the right advice to the right situation and to make one's own decisions. Beyond our own revisions, we revised the survey three more times based on mentor feedback. After the third revision, we took a step back and realized we had been churning without making real progress. What really matters is understanding the feedback mentors are offering and the reasoning behind it; that knowledge can then be applied to the task at hand.
After this realization, we tweaked the survey one last time and were satisfied it would gather the knowledge we sought. Once ready, we sent the survey out using Amazon's Mechanical Turk. This was highly effective: at about 5 cents per question, we filled our initial round of 100 responses in about 30 minutes. At that speed, we realized we had overpaid, though all in all it wasn't too expensive. We sought an additional 100 responses priced at about 3 cents per question. This worked as well, but took roughly 6 to 8 times as long to collect the same number of responses.
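For anyone curious about the cost math, here's a minimal sketch. The per-question rates and batch sizes are from our runs; the survey length (`questions_per_survey`) is an illustrative assumption, since I haven't given our actual question count here:

```python
def batch_cost(responses, questions_per_survey, cents_per_question):
    """Total payout in dollars for a batch of survey responses."""
    return responses * questions_per_survey * cents_per_question / 100

# Assumption: a 10-question survey (adjust to your actual length).
questions_per_survey = 10

fast_batch = batch_cost(100, questions_per_survey, 5)  # filled in ~30 minutes
slow_batch = batch_cost(100, questions_per_survey, 3)  # took ~6-8x longer

print(f"5 cents/question batch: ${fast_batch:.2f}")
print(f"3 cents/question batch: ${slow_batch:.2f}")
print(f"Savings at the lower rate: ${fast_batch - slow_batch:.2f}")
```

Under that assumption, the higher rate costs $20 more per 100 responses but fills an order of magnitude faster, which is an easy trade to make during a time-boxed weekend.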
While all of this survey-related learning was going on, two teammates did an amazing job hacking together a prototype over the course of the day using Twitter Bootstrap and CodeIgniter. They got a functional site up and running before we wrapped up for the night.
With all our feedback and a functional prototype, things are looking promising!