Touch is the new Click - Gestures are the new UI

18 Oct 2007 - 2:36am
Morten Hjerde

Did any of you see the announcements in London two days ago? I know it's
outside the US so it doesn't really count, but here is the heads-up anyway:

A mobile, open development platform with touchscreen, haptic feedback,
presence, GPS location, WiFi/VoIP and an AJAX-capable web browser. Only
Google Gears is missing. Nokia showed a touch-UI phone with a development
environment. It is not multi-touch, darn! The touch screen in the demo was a
resistive, single-point type. The UI changes are evolutionary. Nokia still
sees the phone as a single-hand device, and they are keeping menus, just
making them finger-sized. Nokia said they wanted to stay backwards
compatible and keep old applications working. There is no iPhone-style
finger keyboard; you need to use a stylus for writing.

The new phones will support haptics, probably by using the vibration motor.
Nokia licensed this technology from Immersion earlier this summer. It is
fairly primitive, so don't expect any religious experiences. But it's a start.

*Cool gestures!*

They introduced a "Sensor framework" and mentioned motion, orientation,
proximity and light sensors. They showed motion and acceleration. A video
at http://www.unwiredview.com showed a woman diverting a call by flipping
the phone over while it was lying on a table. All these sensors make
gestures an additional UI component. The really cool thing is that
developers will have access to all the sensors. This is very exciting and
certainly opens some new opportunities for application and game developers!
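
To make the flip-to-divert demo concrete, here is a minimal sketch of how an
application might detect that gesture from raw accelerometer readings. The
function and the numbers are my own invention for illustration -- Nokia's
actual Sensor framework API was not shown -- but the idea is simply watching
the z-axis flip from roughly +g to roughly -g:

```python
# Hypothetical sketch: detecting a "flipped face-down" gesture from
# accelerometer z-axis samples (m/s^2). The API and thresholds are
# illustrative assumptions, not the real Sensor framework.

def is_face_down(z_readings, threshold=-8.0, min_samples=3):
    """Return True once z has been strongly negative (gravity pointing
    out of the screen) for min_samples consecutive readings."""
    streak = 0
    for z in z_readings:
        streak = streak + 1 if z < threshold else 0
        if streak >= min_samples:
            return True
    return False

# Phone lying face-up (~ +9.8), then flipped over (~ -9.8):
samples = [9.8, 9.7, 2.1, -4.0, -9.6, -9.8, -9.7]
if is_face_down(samples):
    print("divert call")
```

Requiring a few consecutive samples below the threshold is a cheap way to
avoid triggering on a single noisy reading while the phone is being handled.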

You may have seen the crazy guys who made it possible to control a Nokia
phone with a Wii remote
(http://www.engadget.com/2007/10/02/wiimote-used-to-control-nokia-n95);
next year you may be able to control the Wii with your Nokia phone!

Gestures are going to be user configurable.
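
If gestures really are user-configurable, the simplest mental model is a
user-editable mapping from recognized gestures to actions. The names below
are illustrative assumptions, not Nokia's actual configuration format:

```python
# Hypothetical sketch of user-configurable gestures: a plain mapping from
# recognized gesture names to actions, which the user could edit in a
# settings screen. All names here are made up for illustration.

gesture_actions = {
    "flip_face_down": "divert_call",
    "double_tap": "unlock",
    "shake": "shuffle_track",
}

def handle_gesture(gesture):
    # Look up the user's chosen action; unmapped gestures do nothing.
    return gesture_actions.get(gesture, "no_op")

print(handle_gesture("shake"))      # shuffle_track
print(handle_gesture("tilt_left"))  # no_op
```

The nice property of a flat table like this is that recognition (deciding a
"shake" happened) stays separate from policy (what the user wants a shake
to do), so either side can change without touching the other.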

*ScreenPlay and FreeWay*

ScreenPlay and FreeWay are Symbian 9.5 technologies for video acceleration
and network access. Snazzy names! Something must have happened since the
iPhone. Symbian used to call their stuff something like "Host Controller
Transport Layer" or "Logical Link Control and Adaptation Protocol (L2CAP)".

ScreenPlay goes hand in hand with Symbian's recently announced support for
Symmetric Multi Processing (SMP). SMP is basically a dual-core processor in
your phone. Current Symbian phones are painfully slow and desperately need
faster drawing speed. Thanks, iPhone!

FreeWay is Symbian's network architecture. Nokia has promised UMA devices
next year. The promise is that you can move seamlessly between networks: if
you are on the phone and move into a known WiFi zone, the call seamlessly
switches to VoIP. Almost sounds too good to be true.

Flash Lite 3 was announced; that means Flash Video in the browser, catering
to all your YouTube entertainment needs. Potentially a much better solution
than today's awkward video playback.

--
Morten Hjerde
http://sender11.typepad.com

Comments

18 Oct 2007 - 7:10am
SemanticWill

Has anyone else noticed that the more the technology moves forward in some
ways, the more, from an HCI perspective, interactions and direct
manipulation are bringing devices into line with the way humans were built
to see and interact with the world for the last -- well -- however long we
have walked upright?

In the beginning we had a viewport, also called our field of vision. To
"Select," we would raise our arms and "Point" at an object in our field of
view and "Grunt," which is just like clicking, in that it informed the
other humans around us that we had "Selected" one object out of many.

Pointing and clicking was always a least-bad, but-at-least-it-works
mechanism for sending instructions to a machine. It was/is a weak metaphor,
two orders of abstraction removed from the user's intent; when Engelbart
created the mouse and invented point-and-click, he was solving a particular
CS problem from an engineer's perspective. Issues of human-centeredness or
affordance never entered the equation, so it's interesting that the more
interaction technology moves 'forward,' the more it's actually reverting to
how human cognition works.

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
Posted from the new ixda.org
http://gamma.ixda.org/discuss?post=21621

19 Oct 2007 - 11:40am
Christopher Fahey

> Anyone else notice that the more the technology moves forward in some
> ways - the more, from an HCI perspective, interactions and direct
> manipulation are bringing devices more into line with the ways humans
> were built to see and interact with the world for the last -- well --
> since we could walk upright.

Touch and motion sensors (iPhone, Wii) are essentially bringing us
closer to being real cyborgs, by smoothing over the short chain of
connections between our brains and our devices. By using, as Will
notes, our more natural "hardware" neuromuscular systems to bridge
the brain/machine gap, the layers of abstraction are removed. The
mouse, by the way, was also a big step in this direction.

What I find interesting is that our visceral connection to computers
and to information has gotten more and more intense without actually
crossing the line into Gibson-esque neural implants. That's the fun
part -- that we are in cyberspace but only virtually so. :-)

I look forward to the Grunt interface.

-Cf

Christopher Fahey
____________________________
Behavior
biz: http://www.behaviordesign.com
me: http://www.graphpaper.com
