Hearing a Rembrandt, tasting a landscape

A blind mountain climber tastes his way to the summit. A deaf musician hears melodies on his skin. A color-blind painter listens to hues and shades.

New “sensory substitution” technologies are helping the brain compensate for a damaged sense by training a healthy one to step in and do the same job.

These assistive technologies capitalize on the brain’s capacity for “synesthesia” — the ability of one sensory system to impinge on another. In some individuals, the dividing line between senses is weak or missing, allowing two senses to mingle. A person born with the condition might say that a slamming door feels yellow or that a damp chill tastes like cherries.

If the brain has the capacity to join senses, Paul Bach-y-Rita thought, it’s logical to assume that when one sense fails, another can fill in. A neuroscientist at the University of Wisconsin’s medical school, Bach-y-Rita exploited the concept of “neuroplasticity” and created early versions of assistive technologies. In one, a blind person could sit in a chair decked out with 400 vibrating plates in its back and a camera mounted above. A computer translated shifts in the image captured by the camera into patterns of vibrations in the plates; when a blind person leaned back against the plates, he or she could learn to translate the patterns of vibrations into the objects or movements the camera was capturing.

SEEING WITH YOUR TONGUE

Bach-y-Rita died in 2006, but his inspiration lives on in the BrainPort V100, a device marketed by Wicab, a private Wisconsin firm. The BrainPort mounts a digital video camera on the nose bridge of a pair of sunglasses. Images from the camera travel to a controller the size of a smartphone. The controller translates the digital images into a pattern of electrical pulses among 400 tiny electrodes embedded in a wafer held on the tongue. Shapes become patterns of sensation on the tongue: bright areas of a scene translate into strong blips, dark areas don’t blip at all, and gray areas pulse at intermediate strengths matched to their brightness. Hand controls allow the user to zoom in or out and sharpen or soften contrast.
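
The logic of that translation is simple enough to sketch in code. Below is a minimal illustration, in Python, of the camera-to-tongue step just described, the same principle behind Bach-y-Rita’s vibrating chair. The 20 x 20 grid matches the BrainPort’s 400 electrodes, but the function name, the grayscale input format and the pulse scale are assumptions made for the example, not Wicab’s actual firmware.

    # Illustrative sketch only: maps a grayscale camera frame to a
    # 20 x 20 grid of pulse strengths, per the mapping described above.
    # Assumes the frame is at least 20 x 20 pixels.
    def image_to_pulse_grid(image, grid=20, max_pulse=1.0):
        """image: rows of 0-255 grayscale values; returns a grid x grid
        array of pulse strengths in [0, max_pulse]."""
        h, w = len(image), len(image[0])
        pulses = []
        for gy in range(grid):
            row = []
            for gx in range(grid):
                # Average the block of pixels over this electrode.
                y0, y1 = gy * h // grid, (gy + 1) * h // grid
                x0, x1 = gx * w // grid, (gx + 1) * w // grid
                block = [image[y][x]
                         for y in range(y0, y1) for x in range(x0, x1)]
                # Bright -> strong pulse, dark -> none, gray -> between.
                row.append(max_pulse * sum(block) / len(block) / 255)
            pulses.append(row)
        return pulses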

The tongue has more tactile nerves than any other part of the body but the lips, and can detect sensations separated by less than three one-hundredths of an inch. Users say the pulses from the electrodes feel like bubbles on the tongue, and after about 10 hours of supervised training they can begin to distinguish shapes and patterns. Over time, users become deft enough to take up a range of ordinary, and even extraordinary, tasks: Erik Weihenmayer, the first blind person to climb Mount Everest, now uses the BrainPort to help him scale rock faces.

The US Food and Drug Administration has approved the $10,000 BrainPort for sale by prescription. So far, about 50 users are out and about with it. Wicab is planning to set up satellite training centers this year in New York, Chicago and San Francisco and will triple the number of locations before 2018 to reach the estimated 7 million US adults with serious visual impairment.

If you want to be able to eat or talk while you “see,” there’s The vOICe (as in “Oh, I see”) from Dutch inventor Peter Meijer. The system sends live digital images from a camera to a smartphone, where an app — available at no charge — converts the images into patterns of sound and sends them through a set of headphones. The higher an object sits in the scene, the higher the pitch; the brighter it is, the louder the tone; and the scene is swept from left to right, so position becomes timing. Studies show that The vOICe conveys visual detail comparable to specialized devices costing thousands of dollars.
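
That mapping, too, fits in a few lines of code. The sketch below follows the rules just described: higher in the image means higher pitch, brighter means louder, and the left-to-right sweep becomes earlier-to-later in time. The frequency range, scan time and sample rate here are illustrative guesses, not The vOICe’s actual settings.

    import math

    # Illustrative sketch of a vOICe-style scan, not Meijer's algorithm.
    def image_to_soundscape(image, scan_seconds=1.0, f_low=500.0,
                            f_high=5000.0, sample_rate=22050):
        """image: rows of 0-255 grayscale values, row 0 at the top.
        Returns raw audio samples for one left-to-right scan."""
        h, w = len(image), len(image[0])
        per_col = int(scan_seconds * sample_rate / w)
        samples = []
        for x in range(w):                       # left to right = time
            for n in range(per_col):
                t = n / sample_rate
                s = 0.0
                for y in range(h):               # one oscillator per row
                    # Higher rows sound higher (exponential pitch steps).
                    frac = (h - 1 - y) / max(h - 1, 1)
                    freq = f_low * (f_high / f_low) ** frac
                    amp = image[y][x] / 255 / h  # brighter = louder
                    s += amp * math.sin(2 * math.pi * freq * t)
                samples.append(s)
        return samples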

FROM VISIONS TO SOUNDS

EyeMusic, developed at the Hebrew University of Jerusalem, is another free app that transforms visual images into stereo soundscapes, but does so musically. Notes sounding soonest are closest to the left side of the view, and the tune proceeds toward the right; the higher a feature sits in the image, the higher its note. A smile is thus heard as a series of notes that descend and then rise again from left to right, a frown the opposite; the distinctive tones of different musical instruments each represent a different color. With around 70 hours of training and experience, the brain can learn to translate the melodies into readable words, recognizable faces and landscapes, each of which is interpreted by a different part of the brain.
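
In code, the scheme might look like the toy version below: columns play left to right, higher pixels sound as higher notes, and color picks the instrument. The scale and the color-to-instrument table here are stand-ins chosen for illustration; EyeMusic’s actual musical choices differ.

    # Toy EyeMusic-style mapping; the scale and instrument table are
    # illustrative stand-ins, not the app's real choices.
    INSTRUMENTS = {"white": "choir", "blue": "trumpet",
                   "red": "organ", "green": "reed"}
    SCALE = [60, 62, 65, 67, 69, 72, 74, 77]  # MIDI notes, low to high

    def image_to_notes(pixels, col_duration=0.1):
        """pixels: 2D grid of color names ('black' = silence), row 0 on
        top. Returns (start_time, midi_note, instrument) events."""
        events = []
        n_rows = len(pixels)
        for x, column in enumerate(zip(*pixels)):  # sweep left to right
            for y, color in enumerate(column):
                if color == "black":
                    continue
                # Higher in the image -> higher note in the scale.
                i = ((n_rows - 1 - y) * (len(SCALE) - 1)
                     // max(n_rows - 1, 1))
                events.append((x * col_duration, SCALE[i],
                               INSTRUMENTS.get(color, "piano")))
        return events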

Neil Harbisson can see well enough; he just can’t see color. Born in a gray world, he became obsessed with knowing what color was and took up art. Ultimately, he convinced surgeons to solve his problem by implanting an audiovisual system in his head.

His tiny video camera sits at the end of a gooseneck stem that rises from the back of his skull and droops over his forehead. The camera sends images to a chip at the base of the stem, inside Harbisson’s head, which renders each color as a specific vibration frequency he feels through the bone of his skull. Harbisson’s brain has learned to distinguish among scores of those frequencies, letting him “hear” specific colors as he paints. He also perceives colors the eye can’t see, such as ultraviolet, and has translated colorscapes into musical scores for orchestras.
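
One simple way to build such a color-to-tone mapping in software is to wrap the color wheel around a musical octave, as in the sketch below. The base frequency and the use of HSV hue are assumptions made for illustration; Harbisson’s implant uses its own scale, one that also reaches past the visible spectrum.

    import colorsys

    # Illustrative color-to-tone mapping, not Harbisson's actual scale.
    def color_to_frequency(r, g, b, f_base=264.0):
        """Map an RGB color (0-255 channels) to a tone in
        [f_base, 2 * f_base): one trip around the color wheel
        spans one octave."""
        hue, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        return f_base * 2 ** hue

    print(color_to_frequency(255, 0, 0))  # red: 264.0 Hz
    print(color_to_frequency(0, 0, 255))  # blue: higher in the octave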

For people who can see but not hear, David Eagleman’s research team is developing the Versatile Extra-Sensory Transducer or, simply, the VEST. It’s indeed a vest, weighing about 10 pounds, with a checkerboard of small vibrating transducers embedded in the back.

A microphone at the vest’s collar captures sounds and sends them through a Bluetooth link to a controller, which maps the audio onto patterns of vibration across as many as three dozen of those transducers.
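
The sound-to-touch step can be sketched as a frequency analysis: chop the microphone signal into short frames, split each frame into bands, and drive one motor per band. The 32-motor count below fits within the “three dozen” mentioned above, but the framing and band math are illustrative, not Eagleman’s actual signal chain.

    import cmath

    # Illustrative sound-to-vibration mapping, not the VEST's firmware.
    def frame_to_motor_levels(frame, n_motors=32):
        """frame: list of audio samples (floats). Returns n_motors
        vibration intensities in [0, 1], lowest band first."""
        n = len(frame)
        # Naive discrete Fourier transform (slow, but fine for a sketch).
        spectrum = [abs(sum(frame[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                            for t in range(n)))
                    for k in range(n // 2)]
        band = max(len(spectrum) // n_motors, 1)
        levels = [sum(spectrum[m * band:(m + 1) * band]) / band
                  for m in range(n_motors)]
        peak = max(levels) or 1.0
        return [lv / peak for lv in levels]  # loudest band buzzes hardest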

This isn’t just a way to feel sound; it’s a way for people who can’t hear to sense the audible world. At first, users report feeling nothing but undifferentiated vibration, but after a few days of intense work they begin to match distinct patterns of vibration with specific words. No one knows quite how the brain adapts in this way, only that it does.

Eagleman, a neuroscientist at Baylor College of Medicine, recently completed a $40,000 Kickstarter campaign to continue development. But his vision ranges far beyond the short term. He sees the device as not only allowing people to communicate across sensory obstacles but also enabling humans to communicate with machines — for example, allowing astronauts aboard a space station to feel patterns of vibrations from the craft that identify specific mechanical or electrical problems. TJ