I thought this was going to be a post about customer development, where the issue is not generally the formatting of the questions, but asking the wrong questions in the first place, or misinterpreting the answers.
Surveys can be a useful tool, but nothing replaces a real conversation. Even asking more open-ended questions in an email or survey form can give you so much more information than radio-button answers (and unless you're a huge company, it's not too hard to read a few hundred open-ended responses).
Along those lines, I'll share a snippet from patio11's Microconf presentation last year (posted on HN yesterday), regarding interpreting the responses you get to a more open-ended conversation:
"If you’re solving a problem people actually have, they will say at this point, 'Shut up and take my money.' If someone says, 'That’s kind of interesting, tell me when that exists,' you have not successfully identified a problem that people actually have."
You don't get that kind of insight from radio buttons.
Agreed. One of the biggest mistakes I see startups making when talking to users is using surveys at all.
The theory seems to be "Oh, we'll get more information that way." Or, from the more honest ones, "This was less scary than trying to talk with people one on one." But surveys are mainly useful for confirming or denying particular generalizations from real conversations.
Surveys throw out all sorts of data. Every raised eyebrow. Every excited look. Every long pause where they try to figure out what the hell you are talking about. Every gesture, every tone of voice, every tangent. You really need that data when you're first starting out. Early on, the problem isn't the things you know you don't know. It's the things that you're pretty sure about that are false.
I'd say it's more of a "this is more tractable" thing. Given that the barrier to doing a survey is so much lower for your users, the people who agree to "real conversations" will probably be significantly more optimistic about your product than those who don't.
Not true in my experience. In fact, I see exactly the opposite. The folks doing surveys seem far more likely to fool themselves that they're onto something. There's usually some way to cut up the quant data so it looks like "success" by some category or another.
Surveys are a great tool in the right sort of context - but early customer development isn't one of 'em.
I do a little workshop on survey design. The first slide is:
The First Rule of Surveys: Don't do them.
The Second Rule of Surveys (for experts only!): Don't do them yet.
(yes - it is a deliberate paraphrase of Michael Jackson's rules of optimisation ;-)
People who are optimistic are the only people that matter. A company with only 1% of the world population as customers would be one of the greatest companies in history.
I also think the article title is misleading or off-base.
As for the contents: it's a promotional PR post for the startup disguised as a list of tips. The article also misses the point: "talking" involves dialogue, which is a two-way street. A survey, by contrast, is a formulaic, one-way exchange of information that pales in comparison to a real conversation.
While the headline doesn't really match the article, because the article talks about surveys and not customer development, I want to point out one error I see quite frequently when startups or SaaS businesses talk to users.
It can be summed up as: Asking the user for some sort of interaction and then failing to respond to the user's action.
This can be a blog post that ends with an open question meant to engage readers, where the author then fails to respond to interesting comments.
But I see it more frequently with SaaS trials: my account expires after a week or two, they ask me for feedback, and since I'm not yet convinced by the product, I state my concerns. That is actually their last chance to change my mind.
And they miss it. They simply fail to reply or to address my concerns that I stated in my feedback.
I don't have any numbers but my educated guess is that they lose quite a few customers this way. Some lost me this way and I bet there are others.
If you actually want to look at how to get better at talking to users - rather than writing surveys - I'd recommend getting a copy of Andrew Travers' pocket guide "Interviewing for Research" http://www.fivesimplesteps.com/products/interviewing-for-res...
They forgot the #1 mistake: Too Many Damn Questions.
There have been plenty of times when I have willingly taken surveys to help out companies I like, then immediately gave up after being shown a survey with 20-60 questions. No.
What pisses me off is that they're asking for my time and effort without giving me anything in return. Like they feel entitled to taking 15 minutes of my time.
The worst is when they only display 1 question per page and there's a little progress bar at the top. And after 10 questions you only see the progress bar 7% full. At least it's not as bad as the time Starbucks/Chase Duetto Cards hired someone with a stuttering speech impediment to call me for a phone survey... That lasted 30 minutes.
Somewhat related are the ones that are organized the same way the company is. I recently answered one for Safeway (supermarket chain).
At the beginning they asked about overall experience (it sucked due to long checkout queues so I got the bare minimum, waited and left). They then asked me which "departments" I shopped in using their internal terminology which doesn't match what real people use (eg I think "cold goods" was one of them). Then for each department they asked me exactly the same set of questions, then they asked them again for the store as a whole. Then I got asked about 10 questions for the self checkout (which took longer than using the thing).
Did I mention I had to enter a 19 digit code to start, which included 7 zeroes in a row in the middle of the sequence?
A nice platitude, but when folks from UX, marketing, sales, engineering, bizdev, and executives all want to know different things - that's how you end up with the 60-question surveys.
For anyone interested in the art and science of surveys/customer interviews, I found this book to be quite helpful: http://www.amazon.com/Asking-Questions-Definitive-Questionna... It's written for sociology/anthropology students, but the principles are equally applicable in product conversations. It's filled with best practices for designing questions, interview flows, surveys, and other customer tools. There are actually a lot of mistakes that can be made without realizing it, and this book helps prevent many of them - e.g. switching up types of questions (from yes/no, to scales, to free response) to prevent answers based on momentum; a sketch of that idea follows below.
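To make that last tip concrete, here's a minimal Python sketch of question-type rotation. The question bank and type tags are made up, and this is just one way to implement the principle, not anything taken from the book:

  import random

  # Hypothetical question bank: each question is tagged with its response type.
  questions = [
      ("Do you use the product daily?", "yes_no"),
      ("How satisfied are you overall?", "scale"),
      ("What would you change first?", "free_response"),
      ("Would you recommend us to a colleague?", "yes_no"),
      ("How easy was setup?", "scale"),
      ("Describe your last support experience.", "free_response"),
  ]

  def interleave_by_type(questions):
      """Order questions so consecutive items rarely share a response type,
      which helps keep respondents from answering on autopilot."""
      remaining = list(questions)
      random.shuffle(remaining)
      ordered = []
      while remaining:
          prev_type = ordered[-1][1] if ordered else None
          # Prefer a question whose type differs from the previous one;
          # fall back to whatever is left if no such question exists.
          pick = next((q for q in remaining if q[1] != prev_type), remaining[0])
          remaining.remove(pick)
          ordered.append(pick)
      return ordered

  for text, qtype in interleave_by_type(questions):
      print(f"[{qtype}] {text}")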
The problem with this data is that it comes from Survata, which is one of those god awful spammy survey walls that makes users fill out a survey before they can access the desired content. When your data comes from being hostile and annoying to users, I don't think it can be trusted.
Sometimes unbalanced scales (#4 on this article's list) are appropriate. If you have a five-option list where 95% of your users are answering 4 or 5, then it's time to unbalance the scale so that you can get better information. However, you need to have collected the balanced scale version for long enough to know it's an actual trend.
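A rough sketch of how you might check for that kind of top-heavy skew before switching scales - the response data and the 95% threshold here are illustrative, not from the article:

  from collections import Counter

  # Hypothetical responses collected on a balanced 1-5 scale.
  responses = [5, 4, 5, 5, 4, 3, 5, 4, 5, 5, 4, 5, 5, 4, 5, 2, 5, 4, 5, 5]

  counts = Counter(responses)
  top_two_share = (counts[4] + counts[5]) / len(responses)

  # If nearly all answers cluster in the top two options, the balanced scale
  # is no longer discriminating between respondents; an unbalanced scale with
  # more positive gradations may recover useful signal.
  if top_two_share >= 0.95:
      print(f"{top_two_share:.0%} of answers are 4 or 5 - consider unbalancing the scale")
  else:
      print(f"{top_two_share:.0%} top-two share - keep the balanced scale for now")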
I still couldn't truthfully answer most of the "improved" questions with anything other than "it depends" or "none of the above", because they still make too many assumptions about people's behavior and choices that simply don't match the diversity of reality.
Pay for a streaming video service? Depends; 99% of current services I wouldn't pay for. How often do I get a new mobile phone? Depends; sometimes two in a year, but my current one is already 3 years old. Fuck if I know the "average". How often do I check my mail per hour? Depends on where I am and what the fuck I'm doing, of course.
Mistake #11: assuming that an online survey equates to discussions with your users. This is not a knock on Survata, but rather about what it means to engage with a company's userbase.
Survata is one tool, but nothing beats direct, 1-1 customer interaction. Can't remember where I read it, but one CEO of a startup made it his job to handle a decent amount of customer support email. Of course, that only scales so far, but out of the gate -- that's priceless interaction.