I'm in the process of designing a survey for existing users of a
consumer food site but don't have a lot of time to design it. Most
likely I will use SurveyMonkey unless anyone has a better suggestion.
Can anyone point me to best practices for quick online survey design?
Catriona Lohan-Conway asked:
> Can anyone point me to best practices for quick online survey design?
If it's a single-question quick poll that's mainly there to amuse the
visitor, then the crucial rules are:
- one question, answers with radio buttons
- find a way to track whether this user has voted or not
- make the results so far available immediately the user presses 'send' (or
whatever you call the button: 'vote' also works nicely)
- don't treat the results as anything other than an amusement.
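Those rules can be sketched in a few lines of Python. This is illustrative only: the visitor ids, answer options, and in-memory storage are all made up, and a real site would track "has this user voted" with a cookie or account id rather than a set.

```python
# Hypothetical sketch of a quick poll: one question, track who has voted,
# and show the results immediately after the 'vote' button is pressed.

votes = {"Yes": 0, "No": 0, "Maybe": 0}   # one question, fixed answer options
voted = set()                              # visitor ids that have already voted

def cast_vote(visitor_id, answer):
    """Record a vote once per visitor, then return the results so far."""
    if visitor_id not in voted and answer in votes:
        voted.add(visitor_id)
        votes[answer] += 1
    total = sum(votes.values())
    # Results (as rounded percentages) are shown straight away.
    return {a: round(100 * n / total) for a, n in votes.items()} if total else {}

print(cast_vote("u1", "Yes"))
print(cast_vote("u1", "Yes"))  # second attempt by the same visitor is ignored
print(cast_vote("u2", "No"))
```

The results returned here are, as above, an amusement only, not data.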
If it's a survey that you want to gather data to support business decisions,
that's a lot trickier. I'd almost go so far as to say that you shouldn't do
it if you don't have time to do it properly. But I know that organisational
realities often force us into doing things quickly when that's a bad idea.
Biggest rule: test, test, test. It's incredibly difficult to write good
survey questions that really reflect user opinion and obtain the insights
you want. You'll only find out if you're close to OK if you test.
It's a bit easier if you go for all 'open' questions (i.e., a question with
a box for free text). Downside: those questions take longer to analyse.
If you must offer radio button/checkbox questions, always include an option
for 'other' with a box to type into. At least for the first three or four
iterations of your test/analyse cycle.
Shorter is better, obviously.
Find some way of selecting the people that you offer the survey to, and
track your response rate. You'll see claims along the lines of "a 2%
response rate is typical for online surveys" - this claim is true, but it
omits the crucial point that a 2% response rate is useless. That's right,
most online surveys are useless. Longer discussion of this point:
and a follow-up article on response rates:
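The response-rate arithmetic is worth doing explicitly before you trust any survey numbers. The figures below are invented for illustration:

```python
# Sanity-check the response rate before treating survey results as data.
# Invitation and completion counts here are made up.

def response_rate(invited, completed):
    """Percentage of invited people who completed the survey."""
    return 100.0 * completed / invited

rate = response_rate(invited=5000, completed=100)
print(f"{rate:.1f}% response rate")  # 2.0% - 'typical' for online surveys,
                                     # and far too low to be representative
```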
If you include a rating scale, my advice is to go for a 5-point scale (plus
a 'don't know/prefer not to answer' type of option). Discussion of why:
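One analysis pitfall with that extra option: 'don't know' answers should stay out of the mean rather than being coded as a middle value. A small sketch, with made-up responses:

```python
# Analysing a 5-point scale: keep 'don't know' out of the mean instead of
# treating it as a rating. The answer list below is invented.

DONT_KNOW = None
answers = [5, 4, DONT_KNOW, 3, 5, DONT_KNOW, 4]

rated = [a for a in answers if a is not None]
mean = sum(rated) / len(rated)
dk_share = (len(answers) - len(rated)) / len(answers)

print(f"mean rating: {mean:.2f} (n={len(rated)})")
print(f"'don't know' share: {dk_share:.0%}")  # worth reporting separately
```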
You probably don't have the time to tackle a book on survey design, but if
you do then the one to go for is
"Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method" by
Dillman, Smyth and Christian (or if you prefer not to shell out US$50 plus,
then any of the previous editions will also be OK).
There's also a lot of overlap between survey design and forms design. For
example, our book "Forms that work: Designing web forms for usability" has a
chapter on 'persuading people to answer' that would be very relevant to
survey design. www.formsthatwork.com or any bookshop.
Hope this helps
You could also use Google spreadsheets. I used it and it works well.
Their basic design works fine.
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
Posted from the new ixda.org
I agree with Caroline, especially about the 5-point scale. This is
also my experience - except for the "don't know" option. I have
never experienced any problems with a slightly separated "don't
know" answer option (I am referring to the linked article).
One thing that I also agree upon: you need some time / experience to
create a good questionnaire that will give you usable data. If you
are making major mistakes in the way you ask or layout the
questionnaire, it can happen that the data you are generating is
misleading. So take your time (you won't need weeks). Think about
it, put yourself in the situation of a respondent, get your family /
friends / coworkers / the target group to comment on the
questionnaire, and you will avoid most of the traps that are out there.
We have a blog that's full of best practices for survey design - you
could look there for some additional tips. If you want
to create a strong survey and make the user experience a great one
(aka higher response rates, more reliable data, etc.) I would
strongly suggest looking at other survey solutions. While Cvent Web
Surveys is certainly not the only solution out there, you may want to
include it on your short list (I'm obviously a little biased).
You did mention you don't have a lot of time for this project, so you
could also look at some professional services that will reduce the
time you spend on it and allow you not to get too bogged
down with having to know all the best practices. Cvent's
Professional Services group can help you from little things like
reviewing questions and responses to make sure you don't have any
poorly worded questions that won't result in the data you need, to
building the survey out in the survey tool (if it's a long survey,
this is a huge help because it frees up all that time doing data
entry), or they can run the entire project for you.
No matter what tool you decide to go with, I do recommend taking a
look at some of the online survey best practice tips that are shared
on the blog. I think there are a lot of good tips there for beginners
and advanced survey designers alike.
Blog link: http://survey.cvent.com
Feel free to reach out if you have any questions!
I used to install LimeSurvey <http://www.limesurvey.org/>, which is an
open-source survey tool.
小龙(Think Big, Act Small)
My general advice is to make it short and relevant as pointed out by
others here. Go through the questions and ask yourself what you will
use the results of each question for, and remove all the nice-to-know
questions. You do not say what the purpose is, but I guess it is to
improve the site. In general make sure all questions are relevant for
the participants and for your purpose. If you want to repeat the
survey after the changes, you should include questions that target
the purpose of the redesign.
In general, I would include an overall satisfaction question, e.g.
"How satisfied are you with the site overall?", and then go into
details. Group questions according to theme and end with an open-ended
question to let the participants comment or give you suggestions.
Here is a short checklist that could be helpful:
Logica UX Norway
To add to what Caroline said, if you have room, add a couple of control
questions. These should be inserted pseudo-randomly, and should be copies of
the most important questions in the survey, but with different wording. This
gives you a tool to double-check the answers and make sure the respondent is
consistent in her opinion (you can compare and remove the respondents that
give very different answers to the real item vs the control item).
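That consistency check is easy to automate once the data is in. A minimal sketch, assuming a 5-point scale and an arbitrary threshold of 2 points (both the respondent data and the threshold are invented):

```python
# Control-question check: each respondent answers the real item and a
# reworded copy on the same 5-point scale; large gaps flag careless answers.

respondents = {
    "r1": {"real": 4, "control": 4},
    "r2": {"real": 5, "control": 1},  # inconsistent - probably not reading
    "r3": {"real": 3, "control": 4},
}

THRESHOLD = 2  # max acceptable gap between the real item and its control copy

def consistent(answers, threshold=THRESHOLD):
    return abs(answers["real"] - answers["control"]) <= threshold

kept = {rid: a for rid, a in respondents.items() if consistent(a)}
print(sorted(kept))  # r2 is removed before analysis
```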
As for the sample, you said this is for existing customers. Well, take a bit
of time and pick a sample that's representative of the entire population of
customers, and email them, asking them to complete the survey. Maybe offer
them a coupon to spend on-site, or some other incentive.
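For a plain random sample, the standard library is enough. The customer list below is made up, and in practice you might stratify by something like signup date or activity level rather than sampling uniformly:

```python
# Pick a simple random sample of existing customers to invite.
import random

customers = [f"user{i}@example.com" for i in range(1000)]  # invented list

random.seed(42)                       # fixed seed only for reproducibility
invited = random.sample(customers, k=100)  # sampling without replacement

print(len(invited), invited[:3])
```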
Other than that... limesurvey 2 is really good, but it's only in alpha. Keep
an eye out for it!
The current version can be used; try * * for a free account.
Also, I think wufoo.com might be a solution. You can quickly design any form
you like, and the reports are very useful.
This is good advice. I like to ask people doing surveys to list
business and user experience (or other goals) explicitly and then link
any question directly to those goals. There should always be a link
between business goals, user experience goals, and specific questions.
It is useful to create a matrix, before you do detailed survey design,
that lists that connection.
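One lightweight way to keep that matrix honest is to map each draft question to the goals it serves and drop any question with no goal attached. The goals and questions below are invented examples, not from the thread:

```python
# Goal-question matrix: every question must link to a business or UX goal;
# questions with an empty goal list are nice-to-know only and get cut.

matrix = {
    "How satisfied are you with the site overall?": ["retention"],
    "How easy was it to find a recipe?":            ["findability", "redesign"],
    "What is your favourite colour?":               [],  # no goal - drop it
}

keep = [q for q, goals in matrix.items() if goals]
drop = [q for q, goals in matrix.items() if not goals]
print(f"keep {len(keep)} questions, drop {len(drop)}")
```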
As Runar said, avoid the tendency to "tack on some interesting
questions that others may have".
There is some research (I'll see if I can dig it up) about the effects
of the location of general satisfaction questions - do you put it
early, before you ask specific questions about a product or service,
or do you put it at the end, when you have asked questions that will
likely evoke more critical thinking about a product?
One of the principles that I push on survey designers is to put
demographic questions (that are not explicit screeners) at the end of
any survey, since these questions - to the respondent - are not
interesting or relevant (though they are to the survey lead). This
was advised by Dillman in several of his books on survey design.
I wrote a short essay in the ACM journal Interactions related to this
topic: "Designing useful and usable questionnaires: you can't just
'throw a questionnaire together'". It was in the May 2007 issue of
Interactions and is available in the ACM Digital Library.
Good topic. Thanks to those who responded.
On Wed, Nov 18, 2009 at 7:48 PM, Runar Normark <runarnormark at gmail.com> wrote:
Depends on the degree of confidence you want to have that your results
are representative of the larger population. If you want to have a
high confidence level and a small margin of error, then I agree with
the comments about statistical rigor above.
If, on the other hand, you are trying to discover concepts, variables,
motivations, etc. for further processing, and the survey is the only
means you have to reach the people you're interested in, then treat
the survey like a mini-interview and gather semi-structured data to
bring into other exercises to develop more fully. These results are
not representative in terms of percentages of the larger population.
Many would say that is an improper use of the survey technique, but
people do non-statistical surveys all the time to suit their
purposes. You just need to know what you're getting out of it so
that you don't misrepresent the results.