What's the proper name for motor memory within interaction?

20 Nov 2006 - 2:28pm
.pauric

In responding to Christian's question on the PSP interface, I tried to
describe an aspect of the interaction as "memory mapping of category element
placement".

Examples:
I know my ATM card PIN as [bottom middle] + [2x top right] + [top middle],
not by the actual numbers.
I've learned to navigate the UI of my mp3 player by remembering the sequence
of physical key presses, not by looking at the UI while I'm driving: hold
[play] 3 seconds, then [down \/ ], then [right > ] to tag the current song
into the favourites list.
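(A toy illustration of the second example, using hypothetical button names
and menu states rather than any real player's firmware: eyes-free
navigation works because each physical press is a deterministic transition,
so a memorized sequence always lands on the same function whether or not
you can see the screen.)

# Toy model: every (state, press) pair maps to exactly one next state,
# which is what makes a memorized press sequence reliable.
TRANSITIONS = {
    ("now_playing", "hold_play_3s"): "song_menu",
    ("song_menu", "down"): "tag_song",
    ("tag_song", "right"): "added_to_favourites",
}

def replay(state, presses):
    # Replay a memorized sequence of physical presses from a known state.
    for press in presses:
        state = TRANSITIONS[(state, press)]
    return state

# Hold [play] 3 seconds, then [down \/ ], then [right > ]:
print(replay("now_playing", ["hold_play_3s", "down", "right"]))
# -> "added_to_favourites"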

I'm sure there is a more eloquent way to describe this aspect of how we
navigate certain UIs.

Thanks in advance - pauric

Comments

20 Nov 2006 - 2:48pm
Tanya Rabourn

Sounds to me like you're talking about "muscle memory."

-Tanya

20 Nov 2006 - 3:29pm
.pauric

Thank you, Tanya. Motor memory is also known as muscle memory/learning;
sorry for the confusion.
http://en.wikipedia.org/wiki/Motor_memory

The wiki entry discusses this in relation to a direct brain-computer
interface; I am thinking of the higher-level processes that take place when
I consciously decide to select a function of a UI, and of the decisions I
make to commit to this learning, and the contexts for those decisions. It
is harder to motor-learn a particular sequence initially. What are the
personal and environmental factors that cause me to commit to motor
learning? Habitual use, not wanting to look at the UI while driving, not
wanting to take an mp3 player out of my pocket, etc.

For example, I've come across a virtual phone keypad design that reversed
the convention of having [1] at the top and the row starting with [7] at
the bottom. I was physically unable to enter my seven-digit voicemail
passcode because it broke my motor memory map.
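(A sketch of why this breaks, with assumed layouts and a made-up
three-press gesture rather than a real passcode: if the PIN lives in muscle
memory as positions, the digits it produces depend entirely on the layout
underneath the fingers.)

# Standard phone keypad and the reversed design described above.
STANDARD = [["1", "2", "3"],
            ["4", "5", "6"],
            ["7", "8", "9"],
            ["*", "0", "#"]]

REVERSED = [["7", "8", "9"],  # row starting with [7] moved to the top
            ["4", "5", "6"],
            ["1", "2", "3"],
            ["*", "0", "#"]]

# A made-up gesture: the (row, col) presses the hand remembers.
GESTURE = [(0, 0), (1, 2), (2, 1)]

def typed(gesture, layout):
    # Translate a memorized position sequence into the digits it produces.
    return "".join(layout[row][col] for row, col in gesture)

print(typed(GESTURE, STANDARD))  # "168", what the motor memory intends
print(typed(GESTURE, REVERSED))  # "762", same gesture, wrong passcode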

To sum up, this seems to me to be an important factor in interaction
design. It's a combination of scenario definition, motor learning, haptics
and, of course, UI design to meet these requirements. So there is probably
a name for this within our context, right?

regards - pauric

20 Nov 2006 - 5:41pm
Vibol Hou

I believe there's something similar to what you're describing called
distributed cognition (http://en.wikipedia.org/wiki/Distributed_cognition).
It places a strong emphasis on tools as a natural extension of one's own
cognition. In the case of the MP3 player you describe, its functions, and
how you interact with it, distributed cognition would describe that entire
system as a natural extension of your cognitive abilities, as you've
internalized the interaction to the point where performing a function
becomes automatic.

-Vibol

21 Nov 2006 - 10:34am
ldebett

Distributed cognition is about using your environment and objects to
extend your own memory. For example, we do this when we hang our keys in
the same place every day so as not to lose them, park in the same spot, or,
when we have to disassemble something to fix it, lay out the pieces in a
way that will help us reassemble it later, so we don't have to remember so
much about which screw goes where. In the case of the mp3 player,
organizing your favorite music into a playlist that you name, say, "aaaaa"
so it sorts first would be using distributed cognition.

Cheers,
Lisa

21 Nov 2006 - 11:00am
.pauric

Thank you both, Vibol & Lisa.

So, my next question would be: can we make design choices that will result
in interfaces that have a natural 'intuitivity', based on an understanding
of what makes distributed cognition work?

Just to help get my head around the idea, I like to visualise concepts in
practical terms. Is the following an example of designing an interface that
aids our distributed cognition?

Car manufacturers usually put the indicator stalk on the right-hand side
of the steering column. It usually toggles up for going left, and down to
indicate the intention to go right*. This obvious design decision, probably
made for safety/consistency, has resulted in my subconscious ability to
comfortably operate almost any model of car without having to learn the
basics of the system controls.

regards - pauric

21 Nov 2006 - 11:04am
Dan Saffer

On Nov 21, 2006, at 8:00 AM, pauric wrote:

> Can we make design choices that will result in interfaces that have a
> natural 'intuitivity', based on an understanding of what makes
> distributed cognition work?

You probably already do. Most visual interfaces (GUIs) use distributed
cognition, with on-screen items like menus, icons, buttons, etc. Compare
that to, say, the DOS or Unix command line, where users had to remember
the commands to manipulate the system.
Dan

21 Nov 2006 - 11:26am
.pauric
2006

Thank you, Dan; that follows nicely into my next question. I have a feeling
there is something new here, for me at least.

More than the distributed cognition (D.C.) that we take for granted as
designers. That is: users will naturally use the left mouse button to
click, select, etc. They will probably know that the right mouse button has
some advanced functionality. It is a convention to underline links because
users have the D.C. that the physical act of left-clicking on underlined
text will navigate them to more information on that text.

To me, this is the nuts and bolts of design. I have a gut feeling there's
something more that we can tap into: another tool that we can utilise to
help close a design choice. To use the 'Add song to playlist' example, my
mp3 player is actually quite bad at this. As it's one of my top 5 features,
and part of a higher system flow (listening to favourite songs), I'm sure
that if they had factored in my scenario of navigating the UI blind
(driving a car, keeping the player in my pocket) they would have built a UI
that was more intuitive, more D.C., if the two are related.

I see what you are saying, in that it's something we probably end up doing
in good design practice. But I think that separating D.C. out as a
standalone input into the process has merit, depending on the
device/interface.

thanks folks

21 Nov 2006 - 11:55am
ldebett

> That is: users will naturally use the left mouse button to click,
> select, etc. They will probably know that the right mouse button has
> some advanced functionality.

Just to play devil's advocate for a moment...

There have been quite a number of tests done with older populations where
they've found that the computer/keyboard/mouse design isn't intuitive, but
learned. They've seen people pick up the mouse and hold it in the air,
wondering what it does. Also, someone who has never used a two-button mouse
(e.g. a Mac user) may never know what that right button does when they come
across one!

Distributing cognition is something we, as humans, do to help us remember
or do things in our world. We, as designers, can build a "scaffolding"
around the user in our designs to allow them to distribute their cognition
(thinking, memory, etc.) more easily and to do tasks faster and with less
effort.

~Lisa

21 Nov 2006 - 12:35pm
Adrian Howard

On 21 Nov 2006, at 16:55, Lisa deBettencourt wrote:
[snip]
> There have been quite a number of tests done with older populations
> where they've found that the computer/keyboard/mouse design isn't
> intuitive, but learned. They've seen people pick up the mouse and hold
> it in the air, wondering what it does.
[snip]

Yup. I remember helping out with an adult education course in the
late eighties when a (bright) business studies teacher hadn't
encountered a mouse before. We ended up taking the mouse apart and
poking the rollers with a biro before she was comfortable with how it
worked :-)

Adrian
