When it comes to processing information, our eyes do most of the work. Musicians use their ears, perfumers their noses, but by and large, we humans rely on our vision.
For most of us, most of the time, this is just fine. But in visually demanding situations, the added strain can be distracting. Hence the popularity of audio command systems in cars and on cellphones: the more info we can divert toward our ears, the better we can use our eyes. And yet while driving, there's plenty of stuff to hear, too: the policeman's whistle, the buzz of the radio, the kids in the back seat, the hum of the road.
That leaves three untapped senses, and after vision and hearing, the next step for information transmission seems obvious: touch.
Lynette Jones, a senior research scientist at the Massachusetts Institute of Technology, wants us to think of skin as the next frontier. "We're exploring using skin as a medium of communication," she says.
Jones develops tactile "displays": body pads with small vibrating motors, called tactons, that convey important information to drivers, soldiers, firefighters and other people whose eyes and ears are busy.
One of the first practical uses of tactile displays is likely to be in-car navigation. Aided by GPS, a pad on the driver's back could distribute directions through targeted vibrations. In addition to varying vibration patterns for wide or sharp turns, tactile displays could warn of obstacles near the car.
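The scheme described above can be sketched in a few lines of code. This is a minimal illustration, not Jones's actual system: the tacton positions, pulse durations and repeat counts below are all assumptions chosen to show how a navigation cue might map onto a distinct vibration pattern.

```python
# Hypothetical mapping from navigation cues to vibration patterns on a
# back-worn tacton pad. Positions, durations, and repeat counts are
# illustrative assumptions, not an actual device specification.

def cue_to_pattern(cue):
    """Translate a navigation cue into (tacton_position, pulse_ms, repeats)."""
    patterns = {
        "turn_left":      ("left_column",  200, 1),  # short buzz on the left side
        "sharp_left":     ("left_column",  500, 2),  # longer, repeated buzz
        "turn_right":     ("right_column", 200, 1),
        "sharp_right":    ("right_column", 500, 2),
        "obstacle_ahead": ("center_top",   100, 3),  # rapid warning pulses
    }
    return patterns[cue]
```

The key design idea, reflected in the sketch, is that *where* the pad buzzes encodes direction while *how* it buzzes encodes urgency or turn sharpness.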
There are two reasons why this combination -- automobile navigation and directions delivered to a person's lower back -- would be a great debut for the technology. As Jones's recent research shows, the back is a particularly good area for a tactile display: it is a large expanse of skin that doesn't move very much, and it always faces the same way relative to our bodies.
"The skin does not have nearly the communicative capacity of the eye or ear," Jones says. "The advantage of the back is you can space things out." Better still, with tactile display embedded in the seat back, drivers could skip the hassle of putting on a device.
Navigation is also considered a good target because humans are quite good at detecting the relative position of vibrations -- left or right, say -- and less good at detecting subtler differences in location. Directional signals are intuitive. But with more than 15 to 20 tactons, we have a lot of trouble telling what's what. If someone were to draw a QWERTY keyboard on your back, for example, you would have trouble telling "let" from "mad."
This makes it hard to transmit language through a tactile display. But it's not impossible. Braille, for example, is a widely used form of tactile communication, though it is difficult to master even though it is read with the body's finest sensory instruments, the fingertips. Still, those working with the visually impaired have tried to develop better tactile communication for well over a hundred years. Helen Keller's sense of touch was apparently so finely developed that she could perceive music by putting her hand on a resonant tabletop. Cellphones, too, are a basic form of tactile communication: short buzz for an email, long buzz for a text message, multiple buzzes for a call. Tactile pavements are common on subway platforms and at crosswalks.
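The cellphone buzz codes mentioned above amount to a tiny tactile protocol, which can be sketched as follows. The specific pulse durations here are assumptions for illustration; real handsets vary.

```python
# Illustrative sketch of the buzz-code idea: event types distinguished by
# the length and count of vibration pulses. Durations (in milliseconds)
# are assumed values, not taken from any real phone.

BUZZ_CODES = {
    "email": [150],            # one short buzz
    "text":  [600],            # one long buzz
    "call":  [300, 300, 300],  # repeated buzzes
}

def identify(pulses_ms, tolerance=50):
    """Guess the event type from a list of felt pulse durations (ms)."""
    for event, pattern in BUZZ_CODES.items():
        if len(pulses_ms) == len(pattern) and all(
            abs(p - q) <= tolerance for p, q in zip(pulses_ms, pattern)
        ):
            return event
    return "unknown"
```

Even this toy decoder shows why the channel is narrow: with only pulse length and count to work with, a handful of distinct signals is about all the skin can reliably tell apart.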
But shrinking motors, GPS technology, and wireless communication have rejuvenated the field. As recently as the 1970s, tactile communication systems resembled nightmarish torture machines. Now, you can receive electrode signals of letters on your tongue! (OK, so that last one still has some resemblance to a torture machine.)
For Jones, new technology means tactile navigation is edging closer to reality, even for the casual consumer. The same device that helps the visually impaired navigate the street grid, or sends commands to soldiers in a war zone at night, could provide directional cues to a biker on a joyride.
Image courtesy of Lynette Jones.