Touch screen application interfaces have been following a common frame design for quite some time now. When I got an opportunity to work on a couple of similar projects, the first thing I did was observe the way people interact with these devices. I also went through most of the similar applications, and as I did, a lot of questions came to mind. A few of them were:
1. Technology has grown so much; hasn't the way we interact with it grown as well?
2. Normally a person uses just one hand to interact with a non-touch-screen phone, but most people use both hands to interact with a touch screen. (Maybe one hand is holding the phone while the other is interacting; either way, both hands are occupied.)
3. Why aren't we thinking a little differently from the adopted pattern? Are we just trying to play it safe?
That is when I thought I would take a chance and try to come up with something different: a solution that makes life with touch screens easier. After weeks of brainstorming and head-breaking work exploring different options, I got a little closer to a possible solution.
What I did here was look closely into how exactly the interaction happens. What do we use the most to interact? The answer I found was the thumb. Now all I had to do was understand how the thumb actually moves. I found that our thumb moves mainly along one axis without any strain.
Keeping the thumb as primary, I tried to come up with a design solution.
What are the most used elements (controls) in a media player?
1. Play
2. Pause
3. Next
4. Previous
5. Stop
6. Volume
Looking at this, I moved the first five controls to the right side of the screen to make them easily accessible with the thumb. For volume, there have always been physical controls on the top side of the phone, which means that when you are holding the phone, the volume controls are always within easy reach of your index and middle fingers.
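The reasoning above can be sketched with a small thought experiment: if the thumb pivots near the bottom-right corner of the screen, controls stacked along the right edge all fall inside its comfortable arc. This is only an illustrative sketch; the screen size, pivot point, and reach radius below are assumed values, not measurements from the article.

```python
import math

# Hypothetical values for illustration (not from the article):
SCREEN_W, SCREEN_H = 320, 480        # screen size in pixels
THUMB_PIVOT = (SCREEN_W, SCREEN_H)   # thumb anchored near the bottom-right corner
THUMB_REACH = 320                    # assumed comfortable reach radius, in pixels

def within_thumb_reach(x, y):
    """Return True if the point (x, y) lies inside the comfortable thumb arc."""
    dx = x - THUMB_PIVOT[0]
    dy = y - THUMB_PIVOT[1]
    return math.hypot(dx, dy) <= THUMB_REACH

# The five primary controls stacked along the right edge of the screen
controls = ["Previous", "Stop", "Play", "Pause", "Next"]
positions = {name: (SCREEN_W - 40, SCREEN_H - 60 * (i + 1))
             for i, name in enumerate(controls)}

for name, (x, y) in positions.items():
    print(f"{name}: reachable = {within_thumb_reach(x, y)}")
```

With these assumed numbers, every control on the right edge stays inside the thumb's arc, while a control at the far top-left corner would not; that is the gist of why the right-edge placement works for one-handed use.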