Tuesday, September 2, 2008

Brain Scanners, Fingercams Take Computer Interfaces Beyond Multitouch


With their easy-to-use touch screens, Apple's iPhone and iPod Touch are driving home the idea that computing can be more than just tapping away at a keyboard and clicking a mouse.

So it's no surprise that multitouch displays (screens that can register the touch of more than one finger at a time) are capturing the imaginations of other manufacturers, including Samsung, Palm and Hewlett-Packard.
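
To make the idea concrete, here is a minimal sketch, in Python with an invented touch-point format rather than any vendor's actual API, of what tracking two fingers at once enables: the changing spread between the fingers becomes a pinch-to-zoom factor.

```python
import math

def pinch_scale(prev_touches, curr_touches):
    """Compute a zoom factor from two simultaneous touch points.

    Each argument is a list of two (x, y) tuples for the same two fingers,
    before and after they move. Returns the ratio of the new finger spread
    to the old one: >1 means zoom in, <1 means zoom out.
    """
    def spread(touches):
        (x1, y1), (x2, y2) = touches
        return math.hypot(x2 - x1, y2 - y1)

    old, new = spread(prev_touches), spread(curr_touches)
    return new / old if old else 1.0

# Fingers moving apart: the spread grows from 50 px to 80 px, so zoom in 1.6x.
print(pinch_scale([(100, 200), (150, 200)], [(90, 200), (170, 200)]))
```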

But multitouch is merely the first step of a coming revolution in the way people interact with computers.

That future may include using neurotransmitters to help translate thoughts into computing actions, face detection combined with eye tracking and speech recognition, and haptics technology that uses the sense of touch to communicate with the user.

"Computing of today is primarily designed for seated individuals doing office work in the developed world," says Scott Klemmer, co-director of the Human Computer Interaction Group at Stanford University. "If you flip any one of those bits -- look at mobile users, or users outside of the developed world or social computing instead of individual computing -- then the future is wide open."

Neurotransmitters Play on Thought

Users are increasingly looking for richer experiences from the digital world and want more seamless interaction with computing devices, says Klemmer. That's particularly true in entertainment, one area where advancements in interface design are booming.

For instance, the Nintendo Wii popularized the idea of taking natural, gestural actions and translating them into movements on screen. The Wii controller turns a user's swinging motion into a golf swing, or a forward thrust of the remote into a punch on the screen.
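
In software terms, that translation is a mapping from motion-sensor readings to game events. Here is a hypothetical sketch of the idea; the axes, acceleration values and thresholds are invented for illustration and are not Nintendo's actual algorithm.

```python
def classify_gesture(accel_samples):
    """Map a short window of 3-axis accelerometer readings to a game action.

    accel_samples: list of (ax, ay, az) readings in g's covering roughly half
    a second of motion. A broad sideways sweep is read as a golf swing; a
    sharp forward spike is read as a punch. Thresholds are illustrative only.
    """
    peak_side = max(abs(ax) for ax, _, _ in accel_samples)
    peak_forward = max(az for _, _, az in accel_samples)

    if peak_side > 2.0 and peak_side >= peak_forward:
        # Swing power scales with how hard the remote was moved.
        return "golf_swing", min(peak_side / 4.0, 1.0)
    if peak_forward > 2.5:
        return "punch", min(peak_forward / 5.0, 1.0)
    return "none", 0.0

# A sweep peaking at 3 g sideways registers as a golf swing at 75% power.
print(classify_gesture([(0.5, 0.1, 0.2), (3.0, 0.2, 0.5), (1.2, 0.0, 0.3)]))
```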

At Drexel University's RePlay Lab, students are working on taking that idea to the next level. They are trying to measure the level of neurotransmitters in a subject's brain to create games where mere thought controls gameplay.

The lab created a 3-D game called Lazybrains that connects a neuro-monitoring device and a gaming engine. The game uses feedback from the players' brains as an additional method of input.

At its core is the Functional Near-Infrared Imaging Device, which shines infrared light into the user's forehead. It then records the amount of light that gets transmitted back and looks for changes to deduce information about the amount of oxygen in the blood.

When a user concentrates, his or her frontal lobe needs more oxygen and the device can detect the change. That means a gamer's concentration level can be used to manipulate the height of platforms in the game, slow down the motion of objects, or even change the color of virtual puzzle pieces.
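
In practice that means turning a noisy physiological signal into a normalized control value the game engine can apply to a platform's height or an object's speed. Below is a rough sketch of that plumbing, assuming the player first calibrates a resting baseline and a peak reading; the class, values and ranges are invented, not the Lazybrains code.

```python
class ConcentrationInput:
    """Smooth a raw fNIR oxygenation reading into a 0-1 control signal."""

    def __init__(self, baseline, peak, smoothing=0.9):
        self.baseline = baseline    # reading measured while the player is at rest
        self.peak = peak            # reading measured during a calibration task
        self.smoothing = smoothing  # exponential smoothing factor to damp noise
        self.value = 0.0

    def update(self, raw_reading):
        # Scale the reading into [0, 1] relative to the calibration range,
        # then smooth it so the game doesn't jitter with every sample.
        span = (self.peak - self.baseline) or 1.0
        level = min(max((raw_reading - self.baseline) / span, 0.0), 1.0)
        self.value = self.smoothing * self.value + (1 - self.smoothing) * level
        return self.value

# In a game loop, concentration might drive a platform's height, e.g.:
# platform.height = MIN_H + controller.update(sensor.read()) * (MAX_H - MIN_H)
```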

The technology is in a very experimental stage, says Paul Diefenbach, assistant professor of digital media at Drexel University, who is supervising the project. Researchers have yet to fully understand the human brain, or how to correlate the data measured from its activity and map it onto actions in an application, he says.

"How do you map brain activity to, say, speed in an application, and how does this translate into the user interface?" says Diefenbach.

Diefenbach believes limitations of brain interface devices and our understanding of how the brain works will mean other technologies such as eye tracking and speech recognition will be combined with it to create a computing experience that will mimic actions in real life.

Meanwhile, companies such as Panasonic and Sony are reportedly taking steps to bring alternative interfaces to the world of entertainment, according to a Gartner analyst quoted in a recent BBC report. These include bringing face recognition technology to televisions and making entertainment systems that respond to gestures.

Remote Control Will Rule

While multitouch requires contact between the user and the device, the future will have a lot more remote control, meaning people will be able to manipulate objects and perform traditional computing actions from a distance, say researchers.

An example is FingerSight, a technology labeled by the researchers working on the project as a "new concept in sensory substitution and remote control for the visually impaired, as well as for those who simply want to wave their hands and have things happen."

FingerSight attaches a miniature camera to the fingertip (shown below), along with a device that offers feedback such as a vibration. As you wave your finger, software on your computer recognizes graphical controls on the screen and deduces your motion relative to those controls, enabling you to turn a dial, for instance, without actually touching it, says John Galeotti, a postdoctoral fellow at the Robotics Institute at Carnegie Mellon University who has been working on the system.

"The camera tells how much the finger has moved and it becomes the equivalent of being able to use your fingers to turn a knob," says Galeotti.

[Image: the FingerSight fingertip camera, from Galeotti's SIGGRAPH 2008 paper]

Before the system can take off, we need a more ubiquitous computing environment, with chips embedded in everyday objects and smart homes in place, so that these technologies can be used seamlessly and widely, says Galeotti.

Advancements in human computer interaction will also come from users looking to improve their personal experience by hacking, mashing and modifying devices, says Klemmer.

"Users will soon be tailoring, customizing and hacking the technology out there to suit their own needs," he says. "By gluing a few things together, users will find they can get an experience that is radically different from using things off the shelf."

Still, the keyboard and the mouse aren't going to disappear completely. For word processing, the keyboard remains the most efficient method of input, say researchers.

The developers of these futuristic interfaces will also need patience: multitouch took roughly a quarter-century to take hold. In 1982, Nimish Mehta at the University of Toronto demonstrated what he called the "Flexible Machine Interface," one of the first multitouch interfaces; Apple didn't release the iPhone until 25 years later.

"It takes a long time to go from research to reality," says Klemmer.
