Feature control on user interfaces

18 Sep 2006 - 6:35pm
oliver green

Hi All,

As applications grow, so does the demand for new features. I am curious
about two things. In the industry:

[1] What methods are used to determine the most "used" features as opposed
to the most "wanted" features?
[2] How often is this process repeated?

Any pointers will be appreciated.

Thanks,
Oliver

Comments

19 Sep 2006 - 12:17pm
Nasir Barday

Oliver,

Your e-mail alias has "hci" in it, so I'm assuming you're already
comfortable with the academics, and that you want to know about real-life
industry applications. Here's how I've done it at my company:

-) Put up a quote from Kelly Goto outside my workspace: "Ask users, and they
will guess at what they do. Watch them, and you'll know." I recite this in
meetings when people bring up topics like "cool features this client *has*
to have next quarter."
-) Have product developers and engineers help you conduct contextual
inquiries
-) Run usability tests on the existing product as a benchmark
-) Model users with your team on the whiteboard and come up with insights
about workflow together. The most useful outputs of this exercise are
personas and their goals.

In short:
-) "Wants" vs. "Needs" can be easy to ferret out once you have your users
modelled, not only for you, but for the whole team.
-) Failing that, usability tests (well-benchmarked with the original design)
show if a feature really is useful or just gums up the works

If you want more info on implementing UCD (with a focus on IxD), I'd highly
recommend About Face 2.0 as a read. I feel like I'm spitting out standard
stuff you already know-- let me know if you're looking for a deeper cut.

- Nasir

19 Sep 2006 - 6:27pm
oliver green

Hi Nasir,

Thanks for the reply. You are right - I am aware of the standardized methods
taught in school. I am trying to learn the "practical" aspects of usability
testing.

The problem I am pondering is this: after we have done the contextual
inquiries, usability tests, etc., and we actually deploy the application,
what techniques (if any) are used in practice to keep track of the "most
used" features (e.g., log analysis)? Also, are these tests done on the fly,
or only before a version release?

Basically, what I am trying to understand is: how can we validate the
application's UI during real-time usage?

Thanks,
Oliver.
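(One common approach to the log-analysis idea above: have the application
emit one usage event per feature invocation, then aggregate the logs
offline. A minimal sketch in Python - the tab-separated log format, the
field names, and the feature names are all invented here for illustration,
not from any particular product:)

```python
from collections import Counter

# Hypothetical log format, one event per line:
# "timestamp<TAB>user_id<TAB>feature"
sample_log = """\
2006-09-19T18:27:00\tu42\texport_pdf
2006-09-19T18:28:10\tu42\tsearch
2006-09-19T18:29:55\tu17\tsearch
2006-09-19T18:31:02\tu17\tsearch
"""

def feature_usage(lines):
    """Count total uses of each feature, and how many distinct users used it."""
    uses = Counter()
    users_by_feature = {}
    for line in lines:
        if not line.strip():
            continue  # skip blank lines
        _timestamp, user, feature = line.rstrip("\n").split("\t")
        uses[feature] += 1
        users_by_feature.setdefault(feature, set()).add(user)
    reach = {f: len(u) for f, u in users_by_feature.items()}
    return uses, reach

uses, reach = feature_usage(sample_log.splitlines())
for feature, count in uses.most_common():
    print(f"{feature}: {count} uses by {reach[feature]} users")
```

(Distinguishing raw use counts from distinct-user "reach" matters: a feature
hammered by one power user looks very different from one touched once by
everyone. Running this continuously on production logs, rather than only
before a release, is what makes the validation "real-time.")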

19 Sep 2006 - 6:43pm
Oleh Kovalchuke

Oliver wrote:
"Basically, what I am trying to understand is: how can we validate the
application's UI during real-time usage?"

I would recommend this well-written book by Isaacs and Walendowski:
http://tinyurl.com/egeud . It describes usage testing, including log
analysis, among other things. Good book.

--
Oleh Kovalchuke
Interaction Design is Design of Time
http://www.tangospring.com/IxDtopicWhatIsInteractionDesign.htm

