
  HOW BLIND PEOPLE ARE LEARNING TO SEE WITH SOUND

Deprived of sight, blind people manage to squeeze an amazing amount of information out of their other senses. Doing this requires their brains to do some reorganizing. To learn about some of these changes, scientists studied the brains of blind people who’ve learned to use an augmented reality system that converts images into soundscapes.

The system was invented in the early ’90s, but it’s not widely used. A person puts on a pair of goggles with a built-in camera, and software converts the images captured by the camera into sounds. For example, the pitch of a sound (high or low) indicates an object’s vertical position; the timing and duration of the sound indicate the object’s horizontal position and width. For real-world scenes, the sounds are complex; in fact, they sound a bit like a garbled transmission from an alien spacecraft.
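The pitch-and-timing mapping described above can be sketched in a few lines of Python. This is a simplified illustration, not the device's actual software; the function name, frequency range, and scan time are invented for the example:

```python
def image_to_soundscape(image, scan_seconds=1.0, f_low=200.0, f_high=3000.0):
    """Convert a 2D grid of 0/1 pixels into (start_time, frequency) tone events.

    image: list of rows, top row first; 1 = bright pixel.
    Returns a list of (start_time_s, frequency_hz) tuples.
    """
    n_rows = len(image)
    n_cols = len(image[0])
    events = []
    for col in range(n_cols):              # the scan sweeps left to right,
        t = col * scan_seconds / n_cols    # so a pixel's column sets its timing
        for row in range(n_rows):
            if image[row][col]:            # each bright pixel becomes a tone
                # top of the image (row 0) maps to the highest pitch
                frac = 1.0 - row / max(n_rows - 1, 1)
                freq = f_low + frac * (f_high - f_low)
                events.append((round(t, 3), round(freq, 1)))
    return events

# A diagonal line from top-left to bottom-right: the pitch falls over time.
diagonal = [[1 if r == c else 0 for c in range(4)] for r in range(4)]
tones = image_to_soundscape(diagonal)
# tones == [(0.0, 3000.0), (0.25, 2066.7), (0.5, 1133.3), (0.75, 200.0)]
```

Even this toy version hints at why real scenes sound garbled: every bright pixel in a camera frame contributes its own tone, so a cluttered room produces hundreds of overlapping frequencies per sweep.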

But with enough practice, people can learn to interpret the sounds and form a mental image of the objects, including people, that appear in front of them.

When sighted people see an outline or silhouette of a human body, areas of the cerebral cortex that specialize in making sense of visual stimuli become active. One of these, the extrastriate body area, seems particularly interested in bodies: It responds more strongly to images of the human body than to other types of objects.

But blindness cuts off the usual flow of information from the eyes to this part of the brain, and people who’ve been blind since birth have never actually seen a human form. Something must change in their brains when they learn to perceive body shapes using sound. Do visual parts of the brain start responding to sounds? Or do auditory parts of the brain start responding to body shapes? It’s a neat trick either way.

To find out what really happens, Ella Striem-Amit and Amir Amedi of the Hebrew University of Jerusalem scanned the brains of seven congenitally blind people who’d trained for an average of 73 hours on the augmented reality system. After training, they achieved 78 percent accuracy at classifying three different types of objects: people, everyday objects (like a cellphone), or textured patterns.

In some cases, they could do even more: “During training, the participants were asked to report the body posture of the people in the images they ‘saw,’ and could verbally describe it quite well, and also mimic it themselves.”

Functional MRI brain scans of the blind subjects and a group of sighted people showed that many of the same visual brain regions became active in both groups when they perceived images of the human form. Even though the blind subjects weren’t actually seeing anything, their extrastriate body area fired up when they heard a soundscape corresponding to a person’s body. As in the sighted subjects, the responses in this area were specific: everyday objects and textures didn’t elicit as much of a response, Striem-Amit and Amedi reported in Current Biology.

Striem-Amit and Amedi also found that in blind and sighted people alike, body shapes activated an area called the temporal-parietal junction, which some researchers think is involved in figuring out the intentions of other people.

The study illustrates that the brain can be remarkably malleable, says Kalanit Grill-Spector, a neuroscientist at Stanford University. When blind people learn to read Braille, their visual cortex becomes sensitive to touch, she notes. “However, there has been little evidence for auditory stimuli driving responses in visual cortex in the blind,” Grill-Spector said. “For example, making human sounds such as clapping or laughing does not seem to activate visual cortex in the blind.” (One exception is sounds of motion, such as footsteps, which can trigger activity in a brain region that normally responds to the sight of moving objects, but not to sounds.)

At the same time, the study also shows aspects of brain organization that remain stable, Grill-Spector says. The extrastriate body area, for example, appears to specialize in detecting human forms in sighted and blind people alike. It just uses different inputs to do it.

Striem-Amit and Amedi acknowledge in their paper that this type of augmented reality device has yet to be widely adopted by blind people. With some improvements, they hope it might be. Amedi’s lab just released a free iPhone app called EyeMusic that adds new algorithms and different musical instruments to provide information about color.

Whether that’s enough to make the technology finally take off remains to be heard.

Source: Wired.com

Copyright © 2014 MJF Media (PVT) Ltd. All rights reserved. 
