Researchers Develop Better Car Controls

By ABC News
July 14, 2003, 12:14 PM

July 15 -- If someday in the future you see a driver shake his fist, don't panic. It may not be a road-rage incident. He might just be changing the station on his car's radio.

Researchers at Carnegie Mellon University in Pittsburgh are developing a high-tech system to convert drivers' hand gestures into something more useful than impromptu communications with fellow highway pilots.

Make a few jabbing motions in the air, for instance, and you might adjust the car's electronic control system. A "twirl" of an index finger and the radio's volume goes up or down, depending on the direction of the twirl. "Wave" and incoming cell phone calls are automatically answered by a voice-mail system.
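The mapping the article describes — a distinct gesture triggering a distinct in-car action — can be pictured as a simple dispatch table. The sketch below is purely illustrative; the article does not describe the project's actual software, so every name here (`Gesture`, `dispatch`, the state keys) is a hypothetical stand-in.

```python
# Hypothetical sketch of a gesture-to-command mapping, not the real CMU code.
from enum import Enum, auto

class Gesture(Enum):
    TWIRL_CW = auto()   # clockwise index-finger twirl: volume up
    TWIRL_CCW = auto()  # counter-clockwise twirl: volume down
    WAVE = auto()       # wave: route an incoming call to voice mail

def dispatch(gesture, state):
    """Map a recognized gesture to a change in the car's control state."""
    new_state = dict(state)
    if gesture is Gesture.TWIRL_CW:
        new_state["volume"] = min(10, state["volume"] + 1)
    elif gesture is Gesture.TWIRL_CCW:
        new_state["volume"] = max(0, state["volume"] - 1)
    elif gesture is Gesture.WAVE:
        new_state["call_routing"] = "voicemail"
    return new_state

state = {"volume": 5, "call_routing": "ring"}
state = dispatch(Gesture.TWIRL_CW, state)  # twirl: volume 5 -> 6
state = dispatch(Gesture.WAVE, state)      # wave: calls go to voice mail
print(state)
```

The point of the table-driven design is that the recognition layer and the car's controls stay decoupled: recognizing a twirl and deciding what a twirl does are separate problems.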

The "gesture interface" system is one of a few advanced control systems being studied by university scientists for the General Motors Collaboration Laboratory. The goal of the lab, a five-year, $8 million joint project with the Detroit automaker, is to help develop new technologies to make cars "smarter" and allow drivers to stay focused on the road.

"Computers are providing the smarts to make car driving easier," says Professor Asim Smailagic, a senior researcher on the gesture interface project. "Performing other tasks taking a phone call, adjusting the radio or air conditioning should not take away from the task of driving."

Watchful Eye

The workings of the system are fairly simple and cheap.

The scientists place a common computer camera on the floor of a Pontiac Montana minivan, roughly where a gear-shift lever would be positioned. The camera is attached to a standard laptop computer that contains special computer algorithms developed by Carnegie Mellon computer engineers over the past three years.

When a driver passes his hand above the camera, the computer program analyzes the images, looking for specific hand features (the fingers, palm, the outline of a fist, and so on) and tracking the hand's motion across the camera's field of view. The program then translates that gesture into the appropriate commands.
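The tracking step described above — following the hand's position frame by frame and then deciding which gesture the motion represents — can be sketched in a few lines. The real Carnegie Mellon algorithms are not described in the article, so this is only an illustrative toy: it takes a list of hand-centroid positions (one per camera frame, names hypothetical) and labels the path by its dominant direction of motion.

```python
# Toy illustration of gesture classification from a tracked hand path.
# This is an assumption-laden sketch, not the CMU lab's actual algorithm.

def classify_path(path):
    """Label a path of (x, y) hand centroids, one per frame.

    Mostly horizontal motion is read as a side-to-side "wave";
    mostly vertical motion is read as an up-and-down "jab".
    """
    dx = sum(abs(x2 - x1) for (x1, _), (x2, _) in zip(path, path[1:]))
    dy = sum(abs(y2 - y1) for (_, y1), (_, y2) in zip(path, path[1:]))
    if dx == 0 and dy == 0:
        return "none"
    return "wave" if dx > dy else "jab"

wave_path = [(0, 0), (10, 1), (0, 0), (10, 1)]  # side-to-side sweep
jab_path = [(0, 0), (1, 10), (0, 0)]            # up-and-down motion
print(classify_path(wave_path))  # wave
print(classify_path(jab_path))   # jab
```

In a real system the centroids would come from the feature-detection stage (finding the fingers, palm, or fist outline in each frame), and the classifier would be far more robust than a simple displacement comparison.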