3-D tongue images to improve speech

Print edition : December 11, 2015

3-D images of the tongue. Photo: University of Texas at Dallas

A NEW study by researchers at the University of Texas at Dallas indicates that watching 3-D images of tongue movements can help individuals learn speech sounds. William Katz, professor at UT Dallas’ Callier Centre for Communication Disorders, said the findings could especially help stroke patients seeking to improve their speech articulation. The study was co-authored with Sonya Mehta, a doctoral student at the centre.

“These results show that individuals can be taught consonant sounds in part by watching 3-D tongue images,” said Katz. “But we also are seeking to use visual feedback to get at the underlying nature of apraxia (the inability to perform particular purposive actions owing to brain damage) and other related disorders.” The study has been published in the journal Frontiers in Human Neuroscience. Although the study was small, it showed that participants became more accurate at learning new sounds when they received visual feedback training. Katz is one of the first researchers to suggest that visual feedback on tongue movements could help stroke patients recover speech.

“People with apraxia of speech can have trouble with this process. They typically know what they want to say but have difficulty getting their speech plans to the muscle system, causing sounds to come out wrong,” said Katz. Advances in technology allowed the researchers to move from 2-D displays to the Opti-Speech technology, which shows 3-D images of the tongue.

Part of the study looked at an effect called compensatory articulation, in which acoustics are rapidly shifted so that subjects think they are making a certain sound with their mouths but hear feedback of a different sound.

“In our paradigm, we were able to visually shift people. Their tongues were making one sound but, little by little, we start shifting it,” Katz said. “People changed their sounds to match the tongue image.”

R. Ramachandran