Questioning User Research (was Eyetracking article on UXMatters)

12 Jul 2006 - 11:17am
Christopher Fahey

> While the diagrams are interesting and insightful, I would be
> very skeptical about using this article as a basis for design
> decisions, especially the conclusions.

I just wanted to add that I've been blogging a mini-series of posts this
week about my skepticism in general about a lot of user research: about how
bad much of it is, how it is often used inappropriately, and how common
sense and individual expertise are too often neglected in favor of
questionable research conclusions. I say some pretty strident things, and
I'm probably wrong about a lot of it. I'd love to hear what this community
thinks.

http://www.graphpaper.com/

Cheers,
-Cf

Christopher Fahey
____________________________
Behavior
http://www.behaviordesign.com
me: http://www.graphpaper.com

Comments

12 Jul 2006 - 11:22am
Dave Malouf

I tend to agree w/ Chris. I'm very suspicious of most user research,
especially the quantitative stuff. I'm just too empathetic to get into
numbers. But quantitative is not the only type of user research, and
something I read today from Mark Hurst was a nice counterpoint. It is in
this week's "episode" of Good Experience:
http://goodexperience.com/blog/index.php

The blog post is a preview of a case study that Mark did for del.icio.us
and the user research he did for them as part of their design work. I like
the holistic approach and the importance it gives user research.

The case study is at: http://creativegood.com/casestudies/delicious.html

- dave

Christopher Fahey wrote:
> I just wanted to add that I've been blogging a mini-series of posts this
> week about my skepticism in general about a lot of user research. [...]
>
--

David (Heller) Malouf
Vice President
dave(at)ixda(dot)org
http://ixda.org/
http://synapticburn.com/

AIM: bolinhanyc // Y!: dave_ux //
MSN: hippiefunk(at)hotmail.com // Gtalk: dave.ixd(at)gmail.com

12 Jul 2006 - 11:49am
Sabine Junginger

Christopher,

I am glad you are asking this question. Your comment touches on several
important aspects of user research that have tremendous implications for
designers and design.

User research is indeed a tool for designers to draw the client, the
organization, into the design process. It allows non-designers a pathway
into design thinking. It also allows non-designers to begin to experience
the limits of existing approaches, a necessary step toward imagining new
alternatives. I argue in my dissertation that human-centered product
development, of which user research is a key element, is a strategic tool
for organizational change because of this very same aspect.

In organizational change theory, existing perceptions are called
"fundamental assumptions." Fundamental assumptions are the kinds of beliefs
and values we all have and hold without articulating them, because more
often than not we are not even fully aware of them. It is rather difficult
to "unearth" these deeply held beliefs and values, and user research has
been shown to produce evidence and materials that challenge some of these
assumptions. However, user research without a larger framework of inquiry
can be questionable, too. Design researchers need to know when to engage in
user research, what questions to ask, how to gather the data and, most
importantly, for what purpose. It appears that you have discovered a
"secondary" use of user research, one that has its purpose in redesigning
the organization.

Cheers,

Sabine
°·..·••

> I just wanted to add that I've been blogging a mini-series of posts this
> week about my skepticism in general about a lot of user research. [...]

12 Jul 2006 - 1:57pm
Robert Hoekman, Jr.

I've yet to read all of it, but so far, this is a fantastic series of
posts. I'm 100% behind you on this one. For all the focus on research and
testing, so many things we learn from it are "common sense" and are
conclusions just as likely to be reached by an experience designer.

It's true where I work as well. In fact, ironically, I just got an email
from someone who said he asked another usability guru about the value of
remote testing, and that person told him it wasn't as good as real-life
testing but was far better than nothing (if you're going to test), and
that WebEx was a good solution for performing remote tests.

Um, yeah. Thanks for the great insight. Didn't I say the same thing
yesterday?

-r-

On 7/12/06, Christopher Fahey <chris.fahey at behaviordesign.com> wrote:
>
> I just wanted to add that I've been blogging a mini-series of posts this
> week about my skepticism in general about a lot of user research. [...]

12 Jul 2006 - 2:10pm
Sabine Junginger

Robert,

Two questions for you:

1) Are you talking about "user testing," "user research," or "experience design"?

2) How do experience designers decide what is common sense?

Sabine
°·..·••

> I've yet to read all of it, but so far, this is a fantastic series of
> posts. I'm 100% behind you on this one. For all the focus on research
> and testing, so many things we learn from it are "common sense" and are
> conclusions just as likely to be reached by an experience designer. [...]

Sabine
°·..·••

::::

Sabine Junginger, PhD Design
Changing Around Customers
Atlanta GA 30309

412.726.8906

::::

12 Jul 2006 - 3:59pm
Robert Hoekman, Jr.

> 1) Are you talking about "user testing," "user research," or "experience
> design"?
>
> 2) How do experience designers decide what is common sense?

Wow - two very loaded questions. :)

1) I just mean that so much of what we learn via testing is so obvious.
More often than not, we learn things plenty of people already know and/or
could have deduced from a good experience level and working knowledge of a
particular domain (e.g., web apps).

2) I was considering writing a blog post about this tonight.

We all know common sense isn't that common, but I think that's primarily
because of context. Everyone has common sense about something, and that
common sense is only common to those who also operate within that context.
A designer's common sense, IMHO, comes from knowing his domain, from
experience with that domain, and from experience *designing* for that
domain. If that domain is about designing *experiences*, common sense can
come from having studied the qualities of an experience (see Shedroff's
site and his book Experience Design 1, for example), learning to choose
which aspects are important for each new design, and learning to reproduce
those qualities in appropriate ways. If you know what makes an experience
good, you can parlay that knowledge into new designs for new experiences.

So many things are knowable by those who study something inside and out.
Everything you need to know about anything can be learned by studying one
thing completely. As in, if you study a single rock thoroughly enough, you
can eventually write an encyclopedia about geology. Your common sense
about "experience design" is derived from having lived with that notion
for a while. Once you know the domain, because you've gotten really cozy
with it for a while, you can certainly step up and start morphing yourself
into an expert.

That said, I'm not sure if "experiences" can be designed. There are way too
many factors that are outside the designer's control. A limited definition
of the term "experience", though, is probably what is implied with the title
Experience Designer. As in, we can design something to be experienced, with
the intent to have it experienced a certain way, but we cannot truly design
a whole experience, because the person doing the experiencing has to become
part of it in his own way.

-r-


13 Jul 2006 - 7:57am
Mark Schraad

I think we often have unrealistic expectations of research. In market
research, the quantitative side is useful for breaking audiences into
segments and observing large-scale trends, but it can only 'indicate'
most other inferences. Qualitative research usually comes in the form
of focus groups and is virtually useless.
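
To make "breaking audiences into segments" concrete, here is a minimal,
hypothetical sketch (invented numbers, not from any study mentioned in
this thread): plain k-means clustering in Python, which is one common way
quantitative survey data gets turned into segments.

import numpy as np

rng = np.random.default_rng(0)
# Fake survey respondents: two loose clusters of (visits/month, avg spend).
data = np.vstack([
    rng.normal([2, 20], [1, 5], size=(20, 2)),    # light users
    rng.normal([12, 80], [2, 10], size=(20, 2)),  # heavy users
])

# Plain k-means, initialized with one point from each end of the data.
k = 2
centers = data[[0, -1]]
for _ in range(20):
    # Assign each respondent to the nearest center, then re-average.
    labels = np.argmin(((data[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
    centers = np.array([data[labels == j].mean(axis=0) for j in range(k)])

print("segment centers (visits/month, avg spend):")
print(np.round(centers, 1))

The output is just two attribute profiles; everything beyond that (naming
the segments, acting on them) is interpretation, which is exactly where
the 'indicate' caveat applies.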

Any design project deserves some up-front research to bring insight
into the users' (or customers') needs and wants. There are tons of ways
to do this. Tons of preexisting research can be accessed via ergonomic
abstracts, articles, and books. Often designers can just put themselves
in a similar situation and learn a lot. These are, of course, almost
always small samplings and generalizations.

I believe that the minute a prototype is generated (no matter how
crude), you need to put it in front of a small group of users (either
together or individually). I can't tell you how many projects would
have headed in the wrong direction early had we not done this.

I am a fan of 'deep dive' ethnography. It is expensive, but it can
provide fascinating insight into opportunities and behaviors that you
cannot uncover any other way. The interpretation of the researcher
is absolutely crucial, so pick a good/proven team. It IS evidentiary
research!

I also think that eye tracking is virtually useless unless you are
working at the cognitive level. That rarely applies to design and
user experience work.

Surveys are the worst: very unreliable, and 98% of the time they are
terribly written and executed. With one exception... conjoint
analysis is a survey technique that can be very revealing. The
current Marriott chain of hotels was designed utilizing extensive CA
almost exclusively (Green et al - Courtyard by Marriott - the full
document is available from most university libraries... I can give
you the specifics if you want). Paul Green pioneered much of this; if
you intend to do a search, find him.
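
For anyone who has not run into conjoint analysis: the core idea is to
show respondents bundles of attribute levels and infer how much each
level contributes to their preference, rather than asking about each
attribute directly. A minimal, hypothetical ratings-based sketch in
Python (invented numbers, not the Courtyard study's data or method):

import numpy as np

# Each row is a hotel profile shown to respondents, dummy-coded as
# (has_pool, free_breakfast, high_price); ratings are 1-10 scores.
profiles = np.array([
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
    [0, 0, 0],
    [1, 1, 1],
    [0, 1, 0],
], dtype=float)
ratings = np.array([9.0, 5.0, 6.0, 4.0, 7.0, 7.5])

# Add an intercept column and fit ordinary least squares; the
# coefficients are the "part-worths" of each attribute level.
X = np.hstack([np.ones((len(profiles), 1)), profiles])
coefs, *_ = np.linalg.lstsq(X, ratings, rcond=None)
intercept, pool, breakfast, high_price = coefs
print(f"pool: {pool:+.2f}  breakfast: {breakfast:+.2f}  "
      f"high price: {high_price:+.2f}")

Because respondents trade attributes off against each other instead of
rating each one in isolation, the results tend to be far more revealing
than a direct "how important is a pool to you?" question.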

The problem with nearly all user research is that it is "simulated".
The real incentive is lacking and cannot be faked. For marketers,
predictive markets show great promise. For webbies, real-time use
analysis is the best thing we have.
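
As an illustration of what I mean by real-time use analysis, here is a
toy Python sketch (hypothetical event format, not any particular
analytics product): counting how far real sessions get through a signup
funnel, so behavior is observed rather than simulated.

from collections import Counter

# Each event: (session_id, step). In practice these come from logs.
events = [
    ("s1", "landing"), ("s1", "form"), ("s1", "confirm"),
    ("s2", "landing"), ("s2", "form"),
    ("s3", "landing"),
]

funnel = ["landing", "form", "confirm"]
reached = Counter(step for _, step in events)
total = reached[funnel[0]]
for step in funnel:
    print(f"{step:8s} {reached[step]}/{total} sessions "
          f"({100.0 * reached[step] / total:.0f}%)")

No incentive problem here: the visitors were actually trying to sign up.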

Usability testing (in my world) implies that the product is finished.
IMO, this is too late unless we are talking about a site redesign.

Also, read Gerald Zaltman's book "How Customers Think." He argues that
the problem with most research is how little conscious awareness we
have of our own actions and motivations... which leaves the consumer
research field a bit ill-equipped.

Finally, the absolute toughest part of this process comes when it is
time for the researcher to interpret for the designer... and for the
designer to apply the results and indications. Then it is optimal to
retest. We rarely have the time or budget to do this right. This is
the golden goose for designers and probably the least defined or
documented part.

Mark

On Jul 12, 2006, at 11:22 AM, Dave (Heller) Malouf wrote:

> I tend to agree w/ Chris. I'm very suspicious of most user research,
> especially the quantitative stuff. [...]

13 Jul 2006 - 8:55am
Mark Schraad

I agree with your statement here, Robert. But I would still maintain
that a heuristic evaluation is hopefully based upon a deep knowledge
base (good) and previous experience (fine, but dangerous). When we
rely on common sense, we often make assumptions and miss
opportunities.

Mark

On Jul 12, 2006, at 1:57 PM, Robert Hoekman, Jr. wrote:

> For all the focus on research and
> testing, so many things we learn from it are "common sense" and are
> conclusions just as likely to be reached by an experience designer.
