Mind Control

15 Jun 2005 - 11:35am
Dan Saffer

"A company in Cupertino, California, called NeuroSky is...developing
a technology that uses electrical signals produced by the brain and
eye movement to control electronic devices."

http://www.thefeature.com/article?articleid=101692

Dan Saffer
Sr. Interaction Designer, Adaptive Path
http://www.adaptivepath.com
http://www.odannyboy.com

Comments

15 Jun 2005 - 4:31pm
Juan Lanus

Shall we start discussing MUI (Mind User Interface) design now? :-)
--
Juan Lanus
TECNOSOL
Argentina

15 Jun 2005 - 4:45pm
Dave Malouf

Why not?

How do we control things without tangible or noticeable feedback forces? That's a big
question that always boggles my mind when I think about direct synaptic
control over user interfaces.

Another core question: when using synaptic control, would the flow
paths differ? Does muscle use make us assume certain things about how
interfaces work?

I'm sure there are a ton of other questions that we can ponder here.

Even with gesture-based controls I feel this. I got to try some demos of
applications that allow hand-motion control, and you do feel a bit lost
when the feedback doesn't feel quite right. How can we overcome that?

-- dave

On 6/15/05 5:31 PM, "Juan Lanus" <juan.lanus at gmail.com> wrote:

> Shall we start discussing MUI (Mind User Interface) design now? :-)
> --
> Juan Lanus
> TECNOSOL
> Argentina


David Heller
http://synapticburn.com/
http://ixdg.org/
Dave (at) ixdg (dot) org
Dave (at) synapticburn (dot) com
AIM: bolinhanyc || Y!: dave_ux || MSN: hippiefunk at hotmail.com

15 Jun 2005 - 6:40pm
Juan Lanus

In fact, this kind of interface converges with, for example, a fighter
plane crewed by a single person.
Since the pilot has to fly the plane, engage the target, evade missiles,
stay aware of the aircraft's status, talk with the boss, navigate, and
watch the landscape, and the interface HAS to be responsive or else
... then more and more controls will be set up to be operated with
anything but the hands, consuming the least possible amount of
conscious-level attention.
Like a child riding a bike mostly with the body, without
wasting any conscious thought on it.
Or like when I watch my 10-year-old son Nicolas operating his PC.
--
J

17 Jun 2005 - 9:49am
Diego Moya

On 15/06/05, David Heller <dave en ixdg.org> wrote:
> Why not?
>
> How do we control things without tangible or noticeable feedback forces? That's a big
> question that always boggles my mind when I think about direct synaptic
> control over user interfaces.

But maybe synaptic control *will* have bi-directional flow ;-)

> Even with gesture-based controls I feel this. I got to try some demos of
> applications that allow hand-motion control, and you do feel a bit lost
> when the feedback doesn't feel quite right. How can we overcome that?

That is only a problem for motion control, not for command invocation.
Maybe non-feedback gesture-based controls should only be used for
launching commands, not for precise object placement?
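The commands-vs-placement distinction could be sketched in code. The following is a minimal, purely illustrative sketch (the gesture names, threshold value, and `invoke` function are all hypothetical, not from any real gesture toolkit): a recognizer that only fires discrete commands above a confidence threshold, so that with no tactile feedback an uncertain gesture becomes a silent no-op rather than a wrong action.

```python
# Hypothetical sketch: low-feedback gestures used only for discrete
# command invocation, never for continuous object placement.

COMMANDS = {
    "swipe_left": "previous",
    "swipe_right": "next",
    "circle": "undo",
}

CONFIDENCE_THRESHOLD = 0.8  # illustrative value; real systems would tune this


def invoke(gesture, confidence):
    """Map a recognized gesture to a command name, or return None.

    Without tangible feedback, a silent no-op on an uncertain
    recognition is safer than invoking the wrong command.
    """
    if confidence < CONFIDENCE_THRESHOLD:
        return None
    return COMMANDS.get(gesture)


print(invoke("swipe_left", 0.9))  # confident: fires "previous"
print(invoke("swipe_left", 0.5))  # uncertain: None, nothing happens
```

Continuous placement is deliberately absent here: dragging an object by hand motion would need the very feedback loop the thread says is missing.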
