'Transparent' Apple Devices

14 May 2007 - 2:38pm
Dan Saffer
2003

Via Core77:

http://www.core77.com/blog/object_culture/new_invisible_interface_6296.asp
"Although touchscreens are apparently the wave of the future, there's
one design flaw no one's yet addressed: when your finger's on the
screen, it's obscuring the very elements you're supposed to be
interacting with.

Leave it to Apple to come up with a workaround."

Comments

15 May 2007 - 9:45am
.pauric
2006

Apple: "When the cursor gets where you want it, you press the rear of the device, and it registers visibly in the front."

Hmmm, with my old human 1.0 hands I'm having a little trouble
figuring out how I both hold the device in the balls of my palms and
gently slide my index fingers across the entire rear of the device.

This solution feels especially awkward as you try to reach the lower
left/right extremities of the screen.

Core77 "when your finger's on the screen, it's obscuring the very
elements you're supposed to be interacting with."

Isn't this cart before horse? I move my fingers over the element after
I decide that's what I want to press, not the other way around. Moving
my fingers momentarily out of the way to confirm a selection seems a
minimal cost, at least compared to the motor mapping and cursor lag
you'd get doing a reach-around.

15 May 2007 - 10:08am
Jack L. Moffett
2005

On May 15, 2007, at 10:45 AM, pauric wrote:

> Hmmm, with my old human 1.0 hands I'm having a little trouble
> figuring out how I both hold the device in the balls of my palms and
> gently slide my index fingers across the entire rear of the device.

Even more awkward: try holding your iPod (or a device of similar form
factor) in one hand and using your index finger to turn a scroll wheel
on the back. This is nowhere near as natural as using your thumb on
the face. There isn't the same degree of flexibility. Yet the
abstract claims that this UI would allow for single-handed interaction.

> Isn't this cart before horse? I move my fingers over the element after
> I decide that's what I want to press, not the other way around.

True, but the fact that your finger is covering a button as you press
it precludes using visual feedback on the button itself to indicate
that your button press registered. This extrapolates to drag actions
with drop area indicators and so forth.

Of course, there are other ways of providing feedback, but visual
feedback on the interactive element is the most direct method.

Jack

Jack L. Moffett
Interaction Designer
inmedius
412.459.0310 x219
http://www.inmedius.com

You could design a process to catch
everything, but then you're overprocessing.
You kill creativity. You kill productivity.
By definition, a culture like ours that
drives innovation is managed chaos.

-Alex Lee
President, OXO International

15 May 2007 - 10:46am
Tracy Boyington
2007

>>> Jack Moffett <jmoffett at inmedius.com> 05/15/07 10:08 AM >>>

>> Isn't this cart before horse? I move my fingers over the element
>> after I decide that's what I want to press, not the other way around.

> True, but the fact that your finger is covering a button as you press
> it precludes using visual feedback on the button itself to indicate
> that your button press registered.

Couldn't you provide visual feedback that wasn't limited to the button
- i.e., the whole screen subtly getting brighter or shifting color when
a button is pressed?

From Core77:
> To interact with them, you place your finger on the rear of the
> device, which is touch-sensitive (but not a screen), since the rear
> is where your fingers would end up anyway, simply by virtue of
> holding the thing up.

When I hold my iPod, my fingers are on the sides, as they would be when
I'm holding a photo or CD or anything else I don't want to get
fingerprints on.

~~~~~
Tracy Boyington tracy_boyington at okcareertech.org
Oklahoma Department of Career & Technology Education
Stillwater, OK http://www.okcareertech.org/cimc

15 May 2007 - 10:48am
.pauric
2006

Jack "True, but the fact that your finger is covering a button as you
press it precludes using visual feedback on the button itself to
indicate that your button press registered. This extrapolates to drag
actions with drop area indicators and so forth."

It would seem to me that a primary function of handheld devices is to
complement some form of real-world activity (as opposed to desktop
content manipulation). As such, actions have stepped/modal feedback.

Select address > details appear
Make call > tones emit from speaker/call placed
Play music > music plays
Write sms/email > text appears in content box, separate to keyboard

Drag and drop could be handled by illuminating a larger surround,
fixing that issue. And aside from the positives of haptic feedback,
how big a problem is not seeing what's under your finger, really?

I have a hard time envisioning portable devices as anything other
than facilitating actions in our environment, with the exception of
games. Manipulating cells in a spreadsheet or.... well, as always
I'm wrong more than 4 thirds of the time so I'm sure I'll get
skooled on this (o;

15 May 2007 - 11:06am
Jack L. Moffett
2005

Tracy Boyington wrote:

> Couldn't you provide visual feedback that wasn't limited to the button
> - i.e., the whole screen subtly getting brighter or shifting color
> when
> a button is pressed?

Yes, but if there are multiple buttons (a keyboard, for example) that
doesn't confirm which button was pressed, only that a button was
pressed.

pauric wrote:

> As such, actions have stepped/modal feedback.
> Select address > details appear

But if you have pressed the wrong button and an action occurs that
you weren't expecting, that leads to confusion. Visual feedback at
the time the button is pressed at least provides some context.

Alright, I'm picking nits here. There are certainly other solutions
to the problem. Notice what they do in the iPhone UI. The key that
was pressed flashes on the screen directly above your finger, and at
about twice the size of the button. Taps on other UI elements result
in a "pond ripple".
http://www.apple.com/iphone/phone/ (View the animation under
Revolutionary Phone > SMS).

Jack

Jack L. Moffett
Interaction Designer
inmedius
412.459.0310 x219
http://www.inmedius.com

When I am working on a problem,
I never think about beauty.
I think only of how to solve the problem.

But when I have finished,
if the solution is not beautiful,
I know it is wrong.

- R. Buckminster Fuller

15 May 2007 - 11:39am
.pauric
2006

Jack "Notice what they do in the iPhone UI. The key that was pressed
flashes on the screen directly above your finger, and at about twice
the size of the button."

You have a better handle on this than me. Can you explain the need
to bubble the pressed key in addition to the pressed key appearing
in the text entry box before sending?

If you press the wrong button, an enlarged pop-up of that key
doesn't fix anything more than reading your error in the text-entry
box does.

thanks!

15 May 2007 - 12:00pm
Jack L. Moffett
2005

On May 15, 2007, at 12:39 PM, pauric wrote:

> You have a better handle on this than me. Can you explain the need
> to bubble the pressed key in addition to the pressed key appearing
> in the text entry box before sending?
>
> If you press the wrong button, an enlarged pop-up of that key
> doesn't fix anything more than reading your error in the text-entry
> box does.

It's a matter of focused attention. Unlike typing on a keyboard, or
even the keypad on a Blackberry, you will have to look at the keys as
you type. You shouldn't have to keep flicking your attention between
the keys and the text entry box. The "bubbled" keys are enough
confirmation that you are typing what you intend to type. You don't
have to change focus. If you hit a wrong letter, you will know, and
can fix it. Without the bubbled keys, you would either be constantly
changing your focus, and therefore typing slower, or you would miss
the typos.

Jack

Jack L. Moffett
Interaction Designer
inmedius
412.459.0310 x219
http://www.inmedius.com

The World is not set up to facilitate the best
any more than it is set up to facilitate the worst.
It doesn't depend on brilliance or innovation
because if it did, the system would be unpredictable.
It requires averages and predictables.

So, good deeds and brilliant ideas go against the
grain of the social contract almost by definition.
They will be challenged and will require
enormous effort to succeed.

Most fail.
- Michael McDonough

15 May 2007 - 4:06pm
Will Parker
2007

On May 15, 2007, at 10:00 AM, Jack Moffett wrote:

> On May 15, 2007, at 12:39 PM, pauric wrote:
>
>> You have a better handle on this than me. Can you explain the need
>> to bubble the pressed key in addition to the pressed key appearing
>> in the text entry box before sending?
>>
>> If you press the wrong button, an enlarged pop-up of that key
>> doesn't fix anything more than reading your error in the text-entry
>> box does.

The iPhone touchscreen sensor array is either capacitance-based or
some offshoot of the internal-reflection design that Jeff Han has
been demoing. In either case, it should be possible to distinguish
between 'touch' and 'press' events based on the area covered by the
fingertip. Even without fancy analysis of the touch signature, the
keyboard software could simply require that the user hold the same
key for, say, 0.75 seconds before the keypress is registered.
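The dwell-time idea Will describes can be sketched in code. The
following is a minimal, hypothetical Python sketch; the class name,
the 0.75-second threshold, and the callback shape are all
illustrative assumptions, not anything known about the actual iPhone
firmware:

```python
import time

class DwellKeypad:
    """Register a keypress only after the finger rests on the same
    on-screen key for a dwell period (distinguishing 'touch' from
    'press' by time rather than by contact area)."""

    def __init__(self, dwell=0.75, clock=time.monotonic):
        self.dwell = dwell        # seconds to hold before a press registers
        self.clock = clock        # injectable clock, handy for testing
        self.current_key = None
        self.touch_started = None
        self.fired = False

    def on_touch(self, key):
        """Called for each sensed touch sample; returns the key once
        the press registers, otherwise None."""
        now = self.clock()
        if key != self.current_key:
            # Finger slid onto a different key: restart the dwell timer.
            self.current_key = key
            self.touch_started = now
            self.fired = False
            return None
        if not self.fired and now - self.touch_started >= self.dwell:
            self.fired = True     # fire once per continuous hold
            return key
        return None

    def on_release(self):
        self.current_key = None
        self.touch_started = None
        self.fired = False
```

In this scheme the key bubble would appear on the first touch sample,
and the acceptance signal would be emitted when on_touch() finally
returns the key.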

Without iPhones on hand, the key to determining whether this is the
case would be finding evidence in the iPhone demo of a sequence like
[ user touch / phone displays key bubble / user waits / phone signals
keypress acceptance ]. If there's no acceptance signal at the end of
each press, then it does come down to the advantage of focus.

Also, please note that there was some discussion in the iPhone demo
of dictionary-based auto-completion, including entries from the
user's contact list. For many text entry functions, it seems likely
that the number of correct keystrokes required would be reduced,
possibly by as much as 50%.
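As a rough illustration of how dictionary-plus-contacts completion
could cut keystrokes, here is a hypothetical Python sketch; the
function name, ranking rule, and wordlists are invented for the
example and say nothing about Apple's actual implementation:

```python
def completions(prefix, dictionary, contacts, limit=3):
    """Return up to `limit` completions for `prefix`, preferring the
    user's contact list over the general dictionary."""
    prefix = prefix.lower()
    matches = []
    # Contacts are scanned first so personal names outrank dictionary words.
    for word in list(contacts) + sorted(dictionary):
        if word.lower().startswith(prefix) and word not in matches:
            matches.append(word)
            if len(matches) == limit:
                break
    return matches

# Typing "jo" surfaces "Joanna" before any dictionary word, so the
# remaining letters of the name cost zero keystrokes.
print(completions("jo", {"join", "jog", "joke"}, ["Joanna", "John"]))
# → ['Joanna', 'John', 'jog']
```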

> It's a matter of focused attention. <snip>

This is the crucial part of the design for me. Because the key
bubbles provide visual feedback, the user's attention is drawn to the
keyboard area, which in turn informs and improves their fine motor
control. That in turn (probably) improves accuracy and therefore
the user's experience. One persona we almost certainly won't see
among iPhone users is the Traveling Texter, driving one-handed while
IM'ing his Fave Fives, and doing both badly.

> The "bubbled" keys are enough
> confirmation that you are typing what you intend to type. You don't
> have to change focus. If you hit a wrong letter, you will know, and
> can fix it. Without the bubbled keys, you would either be constantly
> changing your focus, and therefore typing slower, or you would miss
> the typos.

Furthermore, if the keyboard includes a Delete key which is always in
the same screen location (affording muscle memory), the experienced
user can choose to attend to _either_ the key bubbles or the text
entry field.

- Will

Will Parker
wparker at ChannelingDesign.com

“I wish developing great products was as easy as writing a check. If
that were the case, then Microsoft would have great products.” -
Steve Jobs

16 May 2007 - 8:57am
.pauric
2006

Jack: "Focus"
Thanks for the explanation; what you describe makes sense, although
I'm not sure I see the need to avoid switching focus on a small
screen.

Will: "it should be possible to distinguish between 'touch' and
'press' events based on the area covered by the fingertip."

That leads into the problem of calibration. A fat-fingered
Neanderthal such as myself is going to have a different footprint
spread compared to someone who looks after their fingers.

Here's a direct link to the SMS demo, which allows for tracking
control. I think if they currently had the ability to show the
bubble between placement and pressing of the finger, they would
have included it in the demo.

http://images.apple.com/movies/us/apple/iphone/300/phone_sms_f_20070109.mov

That's not to say it isn't possible; I just think it's impractical
given the spread of input-device characteristics (aka 'fingers').

Will: "Because the key bubbles provide visual feedback, the user's
attention is drawn to the keyboard area, which in turn informs and
improves their fine motor control."

http://www.flickr.com/photos/pauric/500837495/

I had wondered about this. A counter-argument is that placing the
bubble over other keys (in this case the D appearing over both E and
R) negates that value. What do you think?

And to tie this back in to Dan's original comment. Given the
complexity of the design characteristics highlighted by the
subsequent comments, a rear panel input sensor seems a long, long,
way off.

16 May 2007 - 12:02pm
Will Parker
2007

On May 16, 2007, at 6:57 AM, pauric wrote:

> Will: "it should be possible to distinguish between 'touch' and
> 'press' events based on the area covered by the fingertip."
>
> That leads into the problem of calibration. A fat-fingered
> Neanderthal such as myself is going to have a different footprint
> spread compared to someone who looks after their fingers.

Actually, there are two different sensor issues, depending on whether
the touch-screen uses capacitance or optical sensors.

For capacitance sensors, the thing being sensed is the 2D 'footprint'
of the electrical field around the fingertip. I've seen some research
(which I'll have to look up again) which shows that it's possible to
use the characteristics of that electrical footprint to determine
both the extent of the fingertip flesh and the distance of the finger
bone from the sensor surface. Using changes in those two factors, the
researchers were able to distinguish between 'touch' and 'press' events.

For optical sensors, you'd have to look for a two-stage event - a
touch for a certain period of time, followed by a firmer press. Look
for a plateau followed by a spike in area covered, in other words. I
think that would be a trainable user behavior.
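That plateau-then-spike heuristic could look something like the
following Python sketch; the window length and thresholds are
illustrative guesses, not measured values from any real sensor:

```python
def detect_press(areas, plateau_len=5, tol=0.05, spike_ratio=1.3):
    """Scan a stream of fingertip contact-area samples for a stable
    plateau (the resting touch) followed by a sudden jump in area
    (the pad of the finger flattening as it presses harder).
    Returns the sample index of the press, or None."""
    for i in range(plateau_len, len(areas)):
        window = areas[i - plateau_len:i]
        mean = sum(window) / plateau_len
        # Plateau: every sample in the window stays within tol of the mean.
        stable = all(abs(a - mean) <= tol * mean for a in window)
        # Spike: the next sample jumps well above the plateau level.
        if stable and areas[i] >= spike_ratio * mean:
            return i
    return None
```

A light touch that never firms up produces no press, which is exactly
the trainable two-stage behavior described above.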

Capacitance-based touch screens are known quantities for engineers,
so Apple probably used that solution, but Jeff Han
(http://cs.nyu.edu/~jhan/ftirtouch/) made some interesting but vague
comments regarding future announcements by his team immediately
after the iPhone demo, so we're left guessing.

> Here's a direct link to the sms demo which allows for tracking
> control. I think if they currently had the ability to show the
> bubble in between placement and pressing of the finger they would
> have included it in the demo.
>
> http://images.apple.com/movies/us/apple/iphone/300/phone_sms_f_20070109.mov

That's not strictly a movie -- it's an informational animation. Tufte
Representation Rules apply. };->

I'll try to get time to review the real-life iPhone demo movies this
evening and see if there are any visual or behavioral clues there.

> That's not to say it isn't possible; I just think it's impractical
> given the spread of input-device characteristics (aka 'fingers')

I think a larger problem (albeit for a smaller audience) arises if
the iPhone screen uses capacitance sensors. These can't respond to
touch by inanimate objects, such as mouthsticks, styli, or even
gloved fingers, so there's a potential accessibility problem.

BTW, I'm keeping an eye out for a solution in this area for a blog-
buddy of mine who must use a mouthstick, so if you have ideas along
these lines, please respond _in a separate thread_ or directly to me.

> Will: "Because the key bubbles provide visual feedback, the user's
> attention is drawn to the keyboard area, which in turn informs and
> improves their fine motor control."
>
> http://www.flickr.com/photos/pauric/500837495/
>
> I had wondered about this. A counter-argument is that placing the
> bubble over other keys (in this case the D appearing over both E
> and R) negates that value. What do you think?

There's a whole can of human-factors worms in that argument.

Do keyboard users (QWERTY or otherwise) use vertically-adjacent keys,
horizontally-adjacent keys, or some other mix of visual cues to
orient to the keyboard? Is there a significant difference between
experienced and inexperienced user populations? Are there cultural
differences in visual keyboard orientation factors between Hopi and
Yoruba keyboarders?

Let's just not go there right now. Please. ;-)

> And to tie this back in to Dan's original comment. Given the
> complexity of the design characteristics highlighted by the
> subsequent comments, a rear panel input sensor seems a long, long,
> way off.

Oooooh, yes. I think the most relevant comment I've seen on that
matter is that thumbs are much better suited to independent side-to-
side and rotary motions than fingers, especially when the fingers are
ALSO being used to grasp the object being manipulated.

But what if this principle were adapted to interactions with, say, on-
screen UI along the vertical sides of a laptop screen? It's rare that
people interact with a laptop while holding it in their hands, so the
whole finger-vs-thumb argument drops away.

I for one would welcome the ability to, say, control screen
brightness by caressing the edge of my PowerBook screen, but only if
that were an _addition_ to the control methods, rather than the
primary method.

- Will

Will Parker
wparker at ChannelingDesign.com

“I wish developing great products was as easy as writing a check. If
that were the case, then Microsoft would have great products.” -
Steve Jobs
