In a recent study, University of Utah engineers found that language comprehension relies on vision as well as on sound.
“For the first time, we were able to link the auditory signal in the brain to what a person said they heard when what they actually heard was something different. We found vision is influencing the hearing part of the brain to change your perception of reality – and you can’t turn off the illusion,” says the study’s first author, Elliot Smith, in a press release posted on the University of Utah's website.
The study shows that, though the brain weighs both auditory and visual cues when processing speech, visual cues prevail when the sound a person hears differs slightly from the lip movements they see. This phenomenon is known as the McGurk effect, says the press release.
The new study attributes the source of the McGurk effect to brain signals in the temporal cortex, the region of the brain that processes sound.
Certain retinal degenerative diseases, such as macular degeneration and Stargardt disease, make it difficult for those afflicted to see a speaker's face well enough to lip-read. The new study helps explain why these people may have more trouble than sighted people understanding what others are saying in loud, crowded places.