Found in 3 comments on Hacker News
For anyone interested in the art and science of surveys/customer interviews, I found this book to be quite helpful: http://www.amazon.com/Asking-Questions-Definitive-Questionna... It's written for sociology/anthropology students, but the principles are equally applicable in product conversations. It's filled with best practices for designing questions, interview flows, surveys, and other customer research tools. There are a lot of mistakes you can make without realizing it, and this book helps prevent many of them, e.g. switching up question types from yes/no, to scales, to free response to prevent answers based on momentum.
adrianhoward · 2012-09-10 · Original thread
I'd ditto the recommendation of http://www.amazon.com/Asking-Questions-Definitive-Questionna...

Also, if you're close to a university, check their social science departments for courses on survey/questionnaire design that you might be able to audit. There'll usually be something in that area on social psych courses, for example.

The biggest mistakes I see are:

* Surveys being too long. Every additional question you have makes it less likely that people will bother to complete it.

* Not tracking abandonment rates. High abandonment rates mean people either can't complete the survey (because you've messed up a question's design) or don't want to (because of perceived bias, because it's too long, etc.). It's a sign of a bug you need to fix; see the rough sketch after this list for one way to see where people drop off.

* Trying to do too much in a single survey. If you have eight assumptions in your product that you're trying to explore, do eight small surveys rather than one large one.

* Questions that assume the answer. You're often running a survey because you hope people will answer in a particular way ("Yes! People are going to be interested in my product!"). Get somebody else to read the survey and see if they can figure out from the questions what answers you want. If they can, rephrase the questions.

* Questions with no correct answer. For example "Do you prefer to log in with Facebook or LinkedIn? yes/no". I can't answer that truthfully (the real answer is "it depends"). People who can't answer a question truthfully either "lie" or abandon the survey. Both bias your results. Look at each of your questions and ask whether there's a way of answering it that you haven't included.

* A preference for quantitative rather than qualitative data. Checkboxes and radio buttons and hard numbers make it easy to draw pretty graphs and fool yourself into thinking you're being scientific. The key info you need is often in the "soft" written answers.

* Doing surveys too early when you should actually be talking to people.

* More of a form design issue, but people tend to fit their answers to the box size. If you have a small text entry box, people are more likely to give short answers, which may be less useful than the longer answers you'd get from a bigger text area.
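On the abandonment point, here's a minimal sketch of how you might see where people drop off, assuming you can export responses as one record per respondent with unanswered questions left blank. The question IDs, the drop_off helper, and the sample data are all made up for illustration, not any particular survey tool's format:

    # Rough sketch, not production code: assumes one dict per respondent,
    # with unanswered questions missing or blank. Question IDs and sample
    # data below are invented for illustration.
    from collections import Counter

    QUESTIONS = ["q1_role", "q2_team_size", "q3_tools", "q4_pain_points", "q5_email"]

    def drop_off(responses):
        """Count how many respondents stopped at each question."""
        stopped_at = Counter()
        for r in responses:
            for q in QUESTIONS:
                if r.get(q) in (None, ""):
                    # first unanswered question is a rough proxy for where they bailed
                    stopped_at[q] += 1
                    break
        return stopped_at

    responses = [
        {"q1_role": "dev", "q2_team_size": 5, "q3_tools": "vim",
         "q4_pain_points": "builds", "q5_email": "a@b.c"},
        {"q1_role": "pm", "q2_team_size": 12},   # abandoned at q3
        {"q1_role": "dev"},                      # abandoned at q2
    ]
    for q, n in drop_off(responses).most_common():
        print(f"{q}: {n}/{len(responses)} respondents dropped off here")

A question with a disproportionate share of drop-offs is usually the one to redesign or cut.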

I'll shut up now since I should be doing actual work :-)

Will Evans has a nice series of slides on UX research basics. There's one that covers some bits of survey design that you might find useful: http://www.slideshare.net/willevans/introduction-to-user-exp...

malloc47 · 2012-09-09 · Original thread
This is the book I've noticed several social scientists reference when discussing survey/questionnaire development:

http://www.amazon.com/Asking-Questions-Definitive-Questionna...
