The Value Of User Surveys

6 Oct 2010 - 10:24am
gthomas10
2010

My client contact, several of my contact's employees, and the founder of the company have indicated a strong belief that, based on user surveys they have conducted, their site is "pretty darn good".

Part of the process I am working to implement will include user testing during the design phase. Does anyone have good counterpoints as to why user surveys should be taken with a grain of salt and not considered the "end-all-be-all" judgment of a website?

Note: I haven't yet had the chance to review the surveys so I can't comment as to the quality of the questions (i.e. are they leading). I do know that they conducted the surveys by emailing users and asking them to participate.

Examples I thought of:

  • When you obtain survey results based on an existing design that is 'in the wild', users are typically compelled to be nice. People taking the survey don't want to be rude. If you test a prototype with users who understand that what they are seeing is not final, you are more likely to get honest feedback.
  • Typically, users who volunteer to take the time to complete a survey are happy with the product in question. People who are unhappy with a product/site complain to customer support, are annoyed already, and therefore won't waste any more time completing a survey. We need to hear the bad stuff as well as the good.

 

Thanks in advance

GT

 

Comments

6 Oct 2010 - 11:55am
Caroline Jarrett
2007

Your points are definitely plausible.

Other things to think about:

- What did the survey ask? Did it ask about general 'ease of use', or was there anything in there about the goals that the users wanted to accomplish? Or about the goals that the business wanted them to accomplish? Real life example: a site where users were completely happy because they came to the site to find paper forms, and it was very easy to find paper forms. Unfortunately, the site's owners wanted them to use the web forms. A survey wouldn't have picked up that problem.

- Good for what? Example: I worked on a thick pack of 'help' information that had consistently good survey ratings for ease-of-use from its customers. But in fact, customers only used small parts of the information and sometimes not even the current year's stuff. The rest of the pack was useless. The survey didn't discriminate.

- What is the response rate? Are there likely to be any systematic differences between people who do respond and people who don't? Or, even more important, between people who do respond and your target audience? As you point out, the responses may be biased because people only answer if they're positive. Or all the respondents might be opinionated people with time on their hands, which would be a bit worrying if your site is aimed at especially busy people.

- Surveys tend to work best if they're aimed at relatively high-level questions e.g. "why did you come to this site? Were you able to do what you wanted to do?". Usability testing can help a bit with that, but is best at digging into the precise bits and pieces that led to the overall success. In the web forms/paper forms example, the users probably wouldn't have been able to report in a survey why they didn't find the web forms. (The issue was that the web forms were hidden under a button with an obscure jargon term, whereas the paper forms had a link that included the word 'forms'.)

- Why not do both? It shouldn't be an either/or. It should be about getting as much data as you can from as many sources as you can. You've already got survey data: great, build on it. Explain that usability testing will add to that and give more depth (and is generally a lot cheaper and quicker than a good survey).
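The response-rate point above can be made concrete with a small simulation. This is a minimal sketch with entirely hypothetical numbers (the satisfaction distribution and response probabilities are made up for illustration): if unhappy users are less likely to answer an emailed survey, the survey mean will sit well above the true population mean.

```python
import random

random.seed(1)

# Hypothetical population: satisfaction scores on a 1-5 scale,
# with a sizeable unhappy minority (numbers are invented).
population = [5] * 300 + [4] * 300 + [2] * 250 + [1] * 150

# Hypothetical nonresponse mechanism: happier users are more
# likely to answer the emailed survey.
response_prob = {5: 0.30, 4: 0.20, 2: 0.05, 1: 0.02}

respondents = [s for s in population if random.random() < response_prob[s]]

true_mean = sum(population) / len(population)
survey_mean = sum(respondents) / len(respondents)

print(f"true mean satisfaction:   {true_mean:.2f}")
print(f"survey mean satisfaction: {survey_mean:.2f}")
print(f"response rate: {len(respondents) / len(population):.0%}")
```

Under these assumptions the survey mean comes out noticeably higher than the true mean, even though no individual answer was dishonest. The survey accurately describes the respondents; it just doesn't describe the audience.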

Best

Caroline Jarrett

www.formsthatwork.com

http://www.rosenfeldmedia.com/books/survey-design/

6 Oct 2010 - 1:05pm
Moses Wolfenstein
2010

This is not web design specific but regarding your second point (and feeding back into your first):

* Typically users who volunteer to take the time to complete a survey are happy with the product in question. People who are *unhappy* with a product/site complain to customer support and are annoyed already and therefore won't waste any more time to complete a survey. We need to hear the bad stuff as well as the good.


It's always worth asking what the response rate looked like, or otherwise uncovering the mechanism used for survey dissemination. Basically, the concern you're raising here is very relevant if sampling wasn't random, which should be easy enough to determine. If the research design was solid, the sampling was in fact random, and the response rate was high, then maybe users actually are satisfied with the site. Of course, the other concern you raise regarding the design of the instrument itself (i.e. how the items are written) is also very relevant and could skew results towards positive attitudes.


Additionally, attitudinal surveys that rely on subject self-assessment will always give you a different part of the picture from user testing. If the only relevant question is whether or not users are happy with the tool (or its features), a survey is probably fine. However, if you're looking to improve the user experience around issues that users may or may not realize they're having with the tool, you're going to want more information, and that's information a survey can't really provide. The data grain-size issue is also very relevant here: if you design a concise survey to get a better completion rate, chances are you're getting less fine-grained data in favor of an instrument that subjects will find easy to complete.

Those are a few general thoughts on the topic. There's a heck of a lot that's been written on methods more generally that can be brought in to support some of these points. While I'm not personally as familiar with writing on surveys vs. other research tools specifically within the user experience context, this is an issue that plagues social science researchers more broadly. Personally, I believe that surveys are way overused as a research instrument, in large part because they're relatively easy to design and deploy. Making sure you're asking the right questions for a survey and designing a good one . . . not so easy.

my 2¢,
-Moses


 

--
Moses Wolfenstein, Ph.D.
Associate Director of Research
Academic ADL Co-Lab
www.moseswolfenstein.com

6 Oct 2010 - 9:05pm
Prof. AProfeta
2009

Hello Tony,

I tried to find the TED talk by Malcolm Gladwell you mentioned, but on ted.com I found only the one from 2004 where he talks about spaghetti sauce. Is this the one you are showing?

Thank you, -Herbert

On Oct 6, 2010, at 5:50 PM, Prof. Anthony Profeta wrote:

> GT,
> In addition to being a practitioner, I teach Usability and Interactive Design classes at Philadelphia University. When discussing the quality of user feedback I show Malcolm Gladwell's amazing video on TED. In the video he discusses the faults, variance, and effectiveness of user testing. In less than 15 minutes the lesson is learned and stays with the student.
> On another point, this week's episode of House drives the same point home in less than 20 seconds. I won't spoil it for the DVR time-shifting crew, but it is quite amusing.
> Hope this helps,
> TonyPRO
> PDA Media, Inc
> http://PDAmedia.net

7 Oct 2010 - 12:44am
Prof. AProfeta
2009

Hi Herbert,

Yes, strange as it may sound, that is the video. My students love it! Gladwell has great finesse in relaying an understanding of the subject matter. He makes observations and studies compelling. In fact, I find that there is a strong correlation between a student's initial reactions to the video and their overall performance/success in class.

Here is the link: http://www.ted.com/talks/lang/eng/malcolm_gladwell_on_spaghetti_sauce.html

Regards,

TonyPRO  

PDA Media, Inc 

http://PDAmedia.net 

7 Oct 2010 - 11:06am
herbert68
2010

Thank you, I am going to follow your example and will report back.

Best, Herbert

7 Oct 2010 - 11:06am
heidi newell
2008

Gladwell gave the same talk at PopTech a couple years ago: http://www.poptech.org/popcasts/malcolm_gladwell_human_potential

On a side note, PopTech's 2010 conference will be live-streamed on Oct 20-23. There should be some great talks this year.


7 Oct 2010 - 10:01am
gthomas10
2010

Thank you everyone for the input. Can't wait to get my hands on the survey questions. Will let you all know what else I find out.

@Caroline I completely agree that they should continue user surveys.

@Moses response rate was the first thing that popped into my mind.

Greg Thomas

www.gthomas10.com
