Computers in Things (was Re: Ecommerce web site book recommendations)

3 Jan 2006 - 11:54am
Dan Saffer

On Jan 3, 2006, at 7:56 AM, Bernie Monette [c] wrote:

> There's always Alan Cooper's "The Inmates are Running the Asylum".
> Very readable-one of my favourites: what do you get when you
> combine a computer and a battleship? A computer

I think it's time we challenged this notion. Combining a physical
object with a microprocessor doesn't get you a computer, it gets you
something else: a physical object that can demonstrate different
behaviors. Some of those behaviors are good (love the indicator on my
dishwasher that tells me how long before the dishes are clean), some
are bad (my dishwasher could, in theory, crash like Windows 95). I
don't, however, call or think of my dishwasher as a computer.

Only someone immersed in the world of computers would see everything
as a computer. Microprocessors are becoming as cheap and ubiquitous
as the small motors our world is filled with. I don't look at my car
and see it as one giant motor surrounded by many small motors!

We're entering an era where microprocessors will be everywhere, and
more embodied interactions will be the norm. We need to think about how
the behaviors inherent in computers can enhance the interactions
with physical things, not port the mental model of computers over to
everything.

Dan

Dan Saffer, M.Des., IDSA
Sr. Interaction Designer, Adaptive Path
http://www.designingforinteraction.com
http://www.odannyboy.com

Comments

3 Jan 2006 - 1:08pm
Robert Hoekman, Jr.

"Only someone immersed in the world of computers would see everything
as a computer. Microprocessors are becoming as cheap and ubiquitous
as the small motors our world is filled with. I don't look at my car
and see it as one giant motor surrounded by many small motors!"

I agree with this. A mechanic may very well look at every car that enters
the shop as just another engine with just anbother body wrapped around it,
because his entire job is about fixing them, but I know I don't see my iPod
as a "computer", per se. "Computers" tie me to a desk, cost loads of money
and time, and occasionally make me cringe at the thought of them.

My iPod, on the other hand, goes with me wherever I go, happily plays all
the music and videos I want it to play, and doesn't give me a hard time
about anything. So while it does have an OS, it doesn't make me cringe. The
difference is that the iPod lacks the stigma that computers have. Many, many
people look at a typical productivity tool, like Excel or something, and
think "I don't get it. How the heck do you make this thing work?" Many of
those same people look at iPods and think "Wow. That's cool." The difference
is how well the tool helps me accomplish my goals.

I understand Cooper's perspective. He sees interaction design issues in each
of the objects he mentions, as do most of us. I just don't necessarily agree
with the notion that there's no x-factor in the machine. Sometimes, if not
most of the time, a tool with a chip is still a tool.

-r-

3 Jan 2006 - 1:51pm
Josh Seiden

--- Dan Saffer <dan at odannyboy.com> wrote:

> > There's always Alan Cooper's "The Inmates are
> Running the Asylum".
> > Very readable-one of my favourites: what do you
> get when you
> > combine a computer and a battleship? A computer
>
> I think it's time we challenged this notion.

I don't disagree with you, Dan, but neither do I think
you are challenging Alan's point. Rather, I think you
are agreeing with it.

His claim is that *in the absence of interaction
design* the nature of microprocessors combined with
the nature of engineering-centric approaches will
always produce a system in which the computer-like
properties dominate.

Alan is not arguing that the inclusion of a
microprocessor alone automatically creates a computer.
He's making a comment about the current state of the
business. But encapsulated in a slogan, the argument
only goes so far...

JS

3 Jan 2006 - 1:52pm
Rick Cecil

On 1/3/06, Dan Saffer <dan at odannyboy.com> wrote:
>
> On Jan 3, 2006, at 7:56 AM, Bernie Monette [c] wrote:
>
> > There's always Alan Cooper's "The Inmates are Running the Asylum".
> > Very readable-one of my favourites: what do you get when you
> > combine a computer and a battleship? A computer
>
> I think it's time we challenged this notion. Combining a physical
> object with a microprocessor doesn't get you a computer, it gets you
> something else: a physical object that can demonstrate different
> behaviors. Some of those behaviors are good (love the indicator on my
> dishwasher that tells me how long before the dishes are clean), some
> are bad (my dishwasher could, in theory, crash like Windows 95). I
> don't, however, call or think of my dishwasher as a computer.
>

Dan,

Once you add a microchip to an object, it becomes a computer because you
have hardware that allows you to interact with software, which, in turn,
allows you to control other hardware. So, your dishwasher is more of a
dishwashing computer--a specialized computer, if you will.

As designers of specialized computers, we must design both a software
component and a hardware component. We will have some amazing opportunities
as we see the convergence of device->computer->network to create specialized
input and display devices that meet the unique demands of the software--as
opposed to PC or Mac software that is designed for one of three types of
input: keyboard, mouse, or keyboard and mouse. And it's this opportunity
that makes designing these specialized computers so exciting. In the end,
though, it's still hardware controlling software and that is just another
type of computer with the same design problems and complexities (probably
more) inherent in designing a computer.
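
To make that concrete, here is a minimal sketch in C (with invented pin
names, states, and helper functions - a toy, not real appliance firmware)
of what "hardware controlling software to control other hardware" looks
like inside a specialized computer such as a dishwasher:

  /* A hypothetical sketch: dishwasher firmware as a tiny state machine.
     Input hardware (a button, sensors) feeds software, which drives
     output hardware (valve, pump, display). All names are invented. */
  #include <stdbool.h>

  typedef enum { IDLE, FILLING, WASHING, DRAINING } state_t;

  /* Imaginary hardware-access helpers a real board would supply. */
  extern bool start_pressed(void);
  extern bool tub_full(void);
  extern bool tub_empty(void);
  extern int  minutes_left(void);
  extern void open_valve(bool open);
  extern void run_pump(bool on);
  extern void show_minutes(int m);

  int main(void)
  {
      state_t state = IDLE;
      for (;;) {                    /* the classic embedded "superloop" */
          switch (state) {
          case IDLE:
              if (start_pressed()) { open_valve(true); state = FILLING; }
              break;
          case FILLING:
              if (tub_full()) { open_valve(false); run_pump(true); state = WASHING; }
              break;
          case WASHING:
              show_minutes(minutes_left());   /* the indicator Dan likes */
              if (minutes_left() == 0) state = DRAINING;
              break;
          case DRAINING:
              if (tub_empty()) { run_pump(false); state = IDLE; }
              break;
          }
      }
  }

Toy or not, that is a complete computer: software mediating between input
hardware and output hardware, whether or not the user ever thinks of it
that way.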

Also, how we, as users, view the device is irrelevant. I don't view my DVR
as a computer, but it is still a computer designed to record television
shows.

Just my .02.

-Rick

3 Jan 2006 - 1:56pm
Baron Lane

Great points, Dan. But I now read "add a microprocessor = a computer" to mean that if you take a relatively simple object (a dishwasher) and add a microprocessor, you get a Bruce Sterling neologism: a "Spime."

An artifact that increases in relative complexity in its behavior, in its possibilities, and in the relationship it has with its user.

A computer is the most typical metaphor we currently have to apply to this phenomenon (in technical make-up and complexity of behavior).

I can't wait for the day I can formally refer to myself as a "spime wrangler".

http://en.wikipedia.org/wiki/Spime

3 Jan 2006 - 5:38pm
Juan Lanus

Josh wrote, referring to Alan Cooper's obsession with computers everywhere:
> His claim is that *in the absence of interaction
> design* the nature of microprocessors combined with
> the nature of engineering-centric approaches will
> always produce a system in which the computer-like
> properties dominate.

Josh is right. In a well-designed device all components merge to
produce a single outcome: the device.
When one of the parts stands out because of a failure, then instead of
the device you get that part, which happens to be the computer in all
of Cooper's examples.

In a building, for example, all components merge silently and people
do their things oblivious to it, like fish in water. But if one
component fails, say the elevator, then people will have trouble and
talk about the elevator and curse it.
It's a case of a technology making the front pages instead of being
there, contributing silently to the general welfare.

Cooper's book is about the computers making headlines, not simply
because they are computers but because of the lack of proper design by
the software developers, the "inmates" (I'm one such person, but I
design well ... most of the time ...).

It was during the www bubble that software developers took over
management and billions of dollars were burned. Alan Cooper's book is
about this nonsense.

The whole book is a hidden message to the managers (those who gave up
control of the asylum) about the urgency of reclaiming that control,
or else <all those examples in the book>.

----------------------------
As for computers set into devices, they started to become cheap in 1971
with Intel's 4004, priced at something over US$200 (not sure), slow and
primitive, with a 4-bit word size.
A few years later Motorola sold more than 1 billion units of their
680X series, successfully integrated into almost anything. Successfully,
because the computers went largely unnoticed. This was good design. I
think these micros were priced well below US$10. The successor of the
680X series was the 68000 series, which powered the Macintoshes for
years.

You might want to see ... (not very interesting):
http://www.cpu-collection.de/?tn=1&l0=cl&l1=680x
http://www.answers.com/topic/motorola-6800
http://www.microprocessor.sscc.ru/great/s1.html#6809
http://www.cpu-world.com/CPUs/6809/index.html
--
Juan Lanus
TECNOSOL
Argentina

3 Jan 2006 - 6:23pm
Tom George

On Jan 3, 2006, at 05:38 PM, Juan Lanus wrote:

> It was during the www bubble that software developers took over
> management and billions of dollars were burned. Alan Cooper's book is
> about this nonsense.
>

I'm not sure you can lay "the bubble" on the heads of software
developers. Many professional software developers I know were not
that happy about "the bubble" (apart from the money) and were often
asked to compromise their standards and quality (by management) for
the sake of $$.

What I took away from Cooper's books and course is that when we put
software developers in a conflict of interest (ease of use vs. ease
of implementation) we get a very predictable outcome. Formal
interaction design (Goal Directed or otherwise) can help improve that
outcome.

-Tom

3 Jan 2006 - 6:31pm
Adrian Chan

Isn't this all about transparency? That our goal ought to be to produce
products whose complexities, computational functions especially, are
invisible to users? The idea being that, since we're all sensitive to
computer functions and operations, to messages and information, hiding
unnecessary operations from users minimizes the risk that they become
overwhelmed with options, decisions, etc., etc.? Those have been our goals for
a long time and still are, for the most part. (Who would want to know what's
going on under the hood of a new car, chip-wise, really?)

I think sometimes that interaction design should worry less about
transparency, and users, and aim for pleasure in interaction. That's part of
the iPod's success, I think. Not that the iPod is less technological, or
less like a computer, but that it's got more personality: its wheel, its
affordances, its ability to appeal as fashion and lifestyle product... Users
will accept a technology's presence if its interaction is engaging. The
wheel succeeds in spite of the fact that it's a brand-new navigation system;
users enjoy learning it and making it their own.

All this assumes, of course, that the damn thing works! And unfortunately
for us, we still have to accept a high level of failure, inefficiency,
ineffectiveness, maintenance, and all of that..

Cheers and happy new year!

Adrian

www.gravity7.com/blog/media/

Juan wrote:
>
> Josh wrote, referring to Alan Cooper's obsession of computers everywhere:
>> His claim is that *in the absence of interaction
>> design* the nature of microprocessors combined with
>> the nature of engineering-centric approaches will
>> always produce a system in which the computer-like
>> properties dominate.
>
> Josh is right. In a well-designed device all components merge to
> produce a single outcome: the device.
> When one of the parts stands out because of a failure, then instead of
> the device you get that part, which happens to be the computer in all
> of Cooper's examples.
>
> In a building, for example, all components merge silently and people
> do their things oblivious to it, like fish in water. But if one
> component fails, say the elevator, then people will have trouble and
> talk about the elevator and curse it.
> It's a case of a technology making the front pages instead of being
> there, contributing silently to the general welfare.
>

3 Jan 2006 - 9:40pm
penguinstorm

On Jan-3-2006, at 10:52 AM, Rick Cecil wrote:

> Once you add a microchip to an object, it becomes a computer
> because you
> have hardware that allows you to interact with software

This is an extremely narrow and highly technical distinction. It
certainly doesn't match the user's perspective: my mom doesn't
consider her Dodge SX 2.0 a computer.

Joke's on her: I don't consider it a car, either.

The fact that microchips are increasingly embedded in things has a great
deal to do with efficiencies: making things cheaper, making things
smaller, making things work more efficiently (fuel injection vs.
carburetion). The intent, in many cases, is to keep the function
substantially the same and simply change the "way things work."

That doesn't make it a computer.

Start adding new systems, fundamentally changing the way things work
- digital speedometers, GPS units, DVD players? Maybe your car is now
a computer - but only when you choose to use them.
--
Scott Nelson
skot (at) penguinstorm (dot) com
http://www.penguinstorm.com/

skype. skot.nelson

3 Jan 2006 - 10:32pm
Robert Hoekman, Jr.

> > I know I don't see my iPod
> > as a "computer", per se. "Computers" tie me to a desk, cost loads
> > of money
> > and time, and occasionally make me cringe at the thought of them.

> What about your Palm Pilot?

I use a PocketPC on occasion, and yes, it feels like a computer to me
as well, which is why I only use it occasionally.

> So computers have to cause pain?

(I know you meant this as a joke, but I have a not-so-funny reply.)

I might be examining really inexperienced users a little too much, as
they're usually not in my target audience, but my wife teaches
computer classes at her library (she's the head librarian there) and
she comes home often with stories about people who literally pick up
the mouse and point it at the screen like a remote (yup - they want to
flip the channel and go to Yahoo!), wonder why Word doesn't tell you
what you're supposed to do next, and simply cannot grasp the concept
of memorizing a password to retrieve messages from their brand new
email accounts.

Yes, computers are mentally painful. We "savvy" users don't have these
issues, but trying to teach Excel to a 70-year-old woman whose husband
just died and who needs to handle the finances for the first time in her
adult life is no small task, and my wife rejoices whenever one of her
students makes a breakthrough. It's an awesome event to see the
aforementioned woman come in on a Monday afternoon to check her
email. She's suddenly been given the power to stay in touch with her
relatives at will, and she "gets" it. And that's an amazing thing. The
web has given her this power, which is fantastic, but it took her some
serious work to reach a point where she felt comfortable enough to
venture out into the e-world and walk around.

Whenever I am working on a computer, I'm thinking about these things
as I work. I'm constantly watching for things that would trap someone
who doesn't know that software can be good and is settling for the
programmer-designed crap we call a "toolset".

When I work with my iPod, I don't feel the general disdain I do about
the software I use on a daily basis. I feel good, in control, and able
to overcome anything that goes wrong (primarily because nothing ever
does).

So, no, I don't see a chip-equipped dishwasher as a computer. I see it
as a better dishwasher (depending on how easy it is to use, that is).
If only computers worked the same way. I'd like to see my computer as
a better way to get things done, but so far, it hasn't been proven
(can you say "productivity paradox"?). As a result, my computer makes
me cringe sometimes. My dishwasher never does.

-r-

4 Jan 2006 - 1:18am
penguinstorm

On Jan-3-2006, at 7:32 PM, Robert Hoekman, Jr. wrote:

>> So computers have to cause pain?
>
> (I know you meant this as a joke, but I have a not-so-funny reply.)

I meant it sort of half-joking, and I think your reply hits a lot of
very good points.

My personal belief is that the 'general purpose' computer -- the one
that causes so much of this difficulty by trying to be all things to
all people -- will fade from daily life for the majority of users.
Many will still use one, but increasingly people will interact with
purpose-built specialized pieces of technology.

When these pieces of technology do their job seamlessly, they will
cease to be thought of as computers.

Examples abound: the telephone, your microwave (for most people),
your car, your iPod, your fancy dishwasher.

And yes, I find Pocket PCs and Palm Pilots too much like computers.
Not my Newton though. Just did the job, and let me carry everything I
needed with me when I was away from the office. Just worked.

Some devices take backwards leaps: some cell phones are now so
complicated that people find them harder to use than they should be
and are looking for old, simple models.

I have yet to see a 'seamless' email device - although the old pocket
mail devices seemed pretty cool: essentially an email-only interface
with an acoustic-coupler-type modem on it. Not good for seniors.

I think it's a killer market though. Technology that lets people use
the Internet to communicate, instead of using the Internet -- the
next wave I suspect.
--
Scott Nelson
skot (at) penguinstorm (dot) com
http://www.penguinstorm.com/

skype. skot.nelson

4 Jan 2006 - 4:01am
gretchen anderson

> my mom doesn't
> consider her Dodge SX 2.0 a computer.
> Joke's on her: I don't consider it a car, either.

A friend with an Audi got into a fender bender in the summer. Turns out the
thermometer is in the bumper of the car, and it stopped working. The Audi
thought it was 0 degrees outside and started churning out the heat.

She *couldn't turn it off* as there is no off button for the climate control
system, only a way to set the desired temperature for the car.

I didn't think of Audis as computers either, until then. If the driver has
sufficient, elegant control over the system, it doesn't feel like a
computer. When your technology holds you hostage and makes you sweat,
the curse words start flying.

I might laugh if it happened in a Dodge SX 2.0 (2.0!! How au courant!) but
in an Audi, it's a crime. ;)

4 Jan 2006 - 10:37am
Rick Cecil

On 1/3/06, Skot Nelson <skot at penguinstorm.com> wrote:
>
> On Jan-3-2006, at 10:52 AM, Rick Cecil wrote:
>
> > Once you add a microchip to an object, it becomes a computer
> > because you
> > have hardware that allows you to interact with software
>
> This is an extremely narrow and highly technical distinction. It
> certainly doesn't match the user's perspective: my mom doesn't
> consider her Dodge SX 2.0 a computer.

It was intended to be a technical distinction, as I think it is
important to draw that technical distinction--and that's the
distinction Cooper was making in his book when he argued that
computer + object = computer. That doesn't mean you don't get some
amazing synergy out of this new device; it's just that it's still
hardware controlling software to control other hardware.

All that said, it's equally important to define how the user
views the object: users don't consider cars or dishwashers or iPods--or
any convergent device, for that matter--to be computers. And that's the
beauty of this new breed of device: we can design controls that meet
the user's needs better than a keyboard or mouse ever could. And, if
done well, we give them a better device than the mechanical version.

> Joke's on her: I don't consider it a car, either.

LOL

> The fact that microchips are increasingly embedded in things has a great
> deal to do with efficiencies: making things cheaper, making things
> smaller, making things work more efficiently (fuel injection vs.
> carburetion). The intent, in many cases, is to keep the function
> substantially the same and simply change the "way things work."

Very true. And a car is a great example of invisible computing. At
least, it's invisible because cars have pretty much stuck with the
same controls and displays as their mechanical counterparts.

Then, you take the BMW wheel--or whatever it's called. Here they have
substantially changed how the car works--suddenly the computer in the
car is presented to the user. And it shows: they offer a 3-day
training course just to learn how to work the damn wheel! (At least,
that's what I hear.)

-Rick

4 Jan 2006 - 12:14pm
Lada Gorlenko

RHJ> When I work with my iPod, I don't feel the general disdain I do about
RHJ> the software I use on a daily basis. I feel good, in control, and able
RHJ> to overcome anything that goes wrong (primarily because nothing ever
RHJ> does).

Oh, please :-)

Seriously, colleagues: religious Applesianism and sleek coolness
aside, would you call iPod an exemplary case of the user in control?

To me, iPod is a classic example of adding a computer to a music
player to get a computer. Dependency on a PC and a specific piece of
software to perform basic functions with a digital music player (such
as deleting an item) is anything but control and independence.

iPod will be as mentally (and physically) painful to many 70-year-olds
as Excel. Watch a couple of people with dexterity somewhat below the
young-healthy average struggle with the accelerated scrolling and
light touch of the Click Wheel and you may start feeling for them, too.

I am not blaming iPod for its design - it is designed brilliantly for
those who can appreciate its mental model. So is Excel, which works
wonders for its core target users. You may be the archetypal user of
the iPod, but not of Excel; that's why the mental pain and the sense of
"this is a computer and this is not" differ for you. Compare apples
to apples, e.g., target user to target user and non-target to non-target,
and you may get a very different picture. [If you observe those who
have no concept of a spreadsheet or have poor math literacy, don't
forget to observe those who have no concept of a music player and are
not literate in file management.]

Lada

ps. If anyone has a chance of running user studies with iPod, check if
hard-core UNIXoids are any different; they are your ultimate Homo
Logicus craving for control.

4 Jan 2006 - 2:54pm
aelman.861302 a...

I agree with the fundamental point you're making -- I don't see my iPod as
a computer either. But I don't think the difference lies in the lack of "stigma"
that the iPod has. The iPod is designed (well, IMHO) to do one thing very
well: play music. (I haven't used a video iPod, so I can't speak to that.)
Once you've loaded your music onto the iPod, it's easy and intuitive to find
a particular song on a particular album by a particular artist, or just throw
it into shuffle mode.

But here's a challenge: try to set the clock on your
iPod. Now, under normal circumstances, you should never have to do this;
when you synchronize with your computer, the clock is set automatically.
But pretend for a moment that you had to set it.

The answer is, there's
a way to do it. It involves navigating through a bunch of menus, choosing
to set various pieces of the time (time zone, hour/minute/second, etc.).
It's not too hard, but it's definitely a heckuva lot more computer-like than
listening to music.

The genius of the iPod is that it exposes the functionality
for the most frequently used task in an easy, intuitive way, and hides little-used
functionality (without making it impossible to find). How to do this is a
central problem in IxD, and is exactly the issue that Cooper is reacting to.
General-purpose computers are difficult to use because the number of things
you can do with them is greater than the ability of Microsoft et al.'s UI
designers to present them to you (cf. MS Office and the 97% of features that
you don't use, don't know how to use, and probably don't even know are there.
:). When product designers add additional functionality to a device, the
complexity often quickly approaches the point where the designers feel it
necessary to introduce organizing metaphors like menus, toolbars, modal dialogs,
etc.

Microwave ovens are a fantastic example. The old knob-style microwave
is a great design for when you just need to put something in on HIGH for about
a minute. Today's microwaves offer a bunch of additional functionality --
you can usually say "I want to thaw 8 oz. of chicken" and it will automatically
heat it with exactly the right settings. The problem is, there are a lot
of those kinds of things that the user might want to do. So you end up with
a bunch of menus and modal dialogs. And as a result, on a poorly-designed
microwave, you have to go through several options just to do the simple task
of heating something for about a minute.

Now, my own microwave oven (a
Sharp -- it's built into the apartment so I can't take credit for choosing
it :) has a nice little feature where if you hit the "1" button without pressing
anything else, it'll just start going for a minute on high. There's another
button labeled "Add 30 seconds." This design definitely has its downsides
(have to press "Cook time" first before I try to enter 15 seconds or I overcook),
but mostly, it makes the stuff I need to do frequently quick and easy.
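
The logic behind those shortcuts is tiny. Here's a hypothetical sketch
in C (all names invented; real keypad firmware differs) - note that it
also reproduces the downside above, since a bare "1" fires immediately
unless "Cook time" was pressed first:

  /* A hypothetical sketch of the keypad logic. A bare "1" with no
     pending time entry means "one minute on HIGH"; a dedicated button
     adds 30 seconds. All names are invented for illustration. */
  #include <stdbool.h>

  extern void start_on_high(int seconds);   /* imaginary heating control */

  static bool entering_time = false;        /* set by the "Cook time" key */
  static int  entered_seconds = 0;

  void on_cook_time_pressed(void)
  {
      entering_time = true;
      entered_seconds = 0;
  }

  void on_digit(int d)
  {
      if (entering_time) {
          /* normal entry; digits treated as raw seconds for simplicity */
          entered_seconds = entered_seconds * 10 + d;
      } else if (d == 1) {
          /* the one-touch shortcut - and the downside above: typing
             "15" without "Cook time" starts a full minute on HIGH */
          start_on_high(60);
      }
  }

  void on_start_pressed(void)
  {
      if (entering_time && entered_seconds > 0)
          start_on_high(entered_seconds);
      entering_time = false;
  }

  void on_add_30_seconds(void)
  {
      start_on_high(30);   /* a fuller model would extend a run in progress */
  }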

Digital cameras (as discussed in the hard/soft button thread here) are another
great example. You want the frequently-accessed functions easy to access;
hence, hard buttons/switches for power, shoot, movie/still mode, and playback/shoot
mode for casual photographers (and soft buttons/menus for other functions),
while professionals need lots of easy-to-find hard buttons/controls and fewer
menus because they will need to access many more functions for fine-grained
controls. The learning curve is higher with more controls, but pros (and
"prosumers") are willing to put in the time to learn as long as it's easy
to use once they do.

Adam

--- Robert wrote:
> My iPod, on the other hand, goes with me wherever I go, happily plays all
> the music and videos I want it to play, and doesn't give me a hard time
> about anything. So while it does have an OS, it doesn't make me cringe. The
> difference is that the iPod lacks the stigma that computers have. Many, many
> people look at a typical productivity tool, like Excel or something, and
> think "I don't get it. How the heck do you make this thing work?" Many of
> those same people look at iPods and think "Wow. That's cool." The difference
> is how well the tool helps me accomplish my goals.

4 Jan 2006 - 3:17pm
Rimantas Liubertas

> Oh, please :-)
>
> Seriously, colleagues: religious Applesianism and sleek coolness
> aside, would you call iPod an exemplary case of the user in control?

Removing as much control from the user as possible is something I'd call
good design in many cases (I am aware of the very high probability of being
misunderstood on this one, so be it). Or rather, removing the need to
control. iPod Shuffle, anyone?

> To me, iPod is a classic example of adding a computer to a music
> player to get a computer. Dependency on a PC and a specific piece of
> software to perform basic functions with a digital music player (such
> as deleting an item) is anything but control and independence.

Wait... How did I remove items from my LP, MC, and CD back in those
times... not sure I remember...
Ah, digital music player... I'd argue that its basic function is
to play music, not to remove items.

Regards,
Rimantas
--
http://rimantas.com/

6 Jan 2006 - 2:15am
penguinstorm

On Jan-4-2006, at 12:17 PM, Rimantas Liubertas wrote:

>> Dependency on a PC and a specific piece of
>> software to perform basic functions with a digital music player (such
>> as deleting an item) is anything but control and independence.
>
> Wait... How did I remove items from my LP, MC, and CD back in those
> times... not sure I remember...
> Ah, digital music player... I'd argue that its basic function is
> to play music,

Not necessarily disagreeing with the latter point, it's important to
acknowledge the former.

The iPod does suffer from severe design constraints caused by a
business case: the absolute dependence on iTunes (the interface has
been hacked, but Apple has broken the hacks). iPods impose a storage
structure while other players do not (you have no control over how to
file your music - it uses ID3 tags to create a hierarchy). Arguably,
DRM and the ability to purchase music only at the iTunes Music Store
fit into this category.

Other players suffer the same problem. I need a player for running
(don't want to toss that 20GB hard drive on the ground) and would
gladly consider any number of flash-based players; alas, the number
that play AAC files is very low. Bad idea, Sony - you'd have my money
otherwise. (Why does my Sony phone play AAC, while their music
players do not?)

All that said, I will readily admit that I love my iPod, and I tried
quite a few other toys; some of the iPod's design constraints are
what make it do what I want more effectively (plus, I'm a Mac guy, so
you know....it fits in exceptionally well.)
--
Scott Nelson
skot (at) penguinstorm (dot) com
http://www.penguinstorm.com/

skype. skot.nelson
