Summary: A new study reports that hearing plays a key role in coordinating speech movements. Researchers found that people’s ability to coordinate their jaw and tongue movements declined when they were briefly unable to hear themselves speak.
The discovery helps explain how people with hearing loss, including cochlear implant users, produce speech. For people with hearing impairments, the findings may lead to new therapeutic approaches centered on oral-motor training.
Key Facts:
Hearing loss impairs the real-time coordination of speech movements.
When auditory feedback is limited, people may rely more on oral-motor feedback.
New therapies could improve speech for people with hearing loss, including cochlear implant users.
Source: McGill University
Research from McGill University shows that hearing plays an essential role in how people plan and control their speech movements in real time.
Published in The Journal of the Acoustical Society of America, the research shows that when people cannot hear their own speech, even briefly, their ability to move their jaw and tongue in a coordinated manner is impaired.
“People rely on immediate auditory feedback to coordinate and control the movements of their vocal tract in service to speech production,” said Matthew Masapollo, the paper’s lead author and a Research Associate in McGill’s Motor Neuroscience Laboratory.
Using electromagnetic articulography (EMA), the researchers tracked jaw and tongue-tip movements in participants with normal hearing under two conditions: when their speech was audible and when it was masked by multi-talker noise.
Speech motor performance declined in the latter condition, when participants briefly lost the ability to hear themselves.
The finding has important implications for understanding how people with hearing loss, particularly cochlear implant users, produce speech.
“It is clearly because the auditory signals available through CIs are degraded that certain aspects of speech production remain impaired, even years after implantation,” Masapollo said.
The researchers noted that a better understanding of how degraded sound affects speech could inform how best to teach children with severe hearing loss to speak and help ensure that cochlear implants are effective.
Masapollo, working with Susan Nittrouer and McGill researchers David J. Ostry and Lucie Ménard, is currently examining how the limited sound access provided by cochlear implants affects the speech of implant recipients.
Preliminary results suggest that these individuals may rely more on how their mouth and tongue feel than on auditory feedback to control their speech movements.
If confirmed, this could point clinical research toward new oral-motor training-based interventions to help children and adults with hearing loss.
About this auditory neuroscience research news
Author: Claire Loewen
Source: McGill University
Contact: Claire Loewen, McGill University
Image: The image is credited to Neuroscience News
Original Research: Closed access.
“Immediate auditory feedback regulates inter-articulator speech coordination in service to phonetic structure” by Matthew Masapollo et al. The Journal of the Acoustical Society of America
Abstract
Immediate auditory feedback regulates inter-articulator speech coordination in service to phonetic structure
Studies have demonstrated that talkers can consistently synchronize the timing of articulator movements despite changes in syllable stress and production rate. This accuracy in inter-articulator timing creates phonetic structure in the resulting acoustic signal.
Here, we investigated the hypothesis that consistent articulatory timing control is regulated in part by immediate auditory feedback.
Talkers with normal hearing were recorded with electromagnetic articulography while producing 480 /tVCat/ utterances with alternating V (/ɑ/-/ɛ/) and C (/t/-/d/), across variation in production rate (fast-normal) and stress (first syllable stressed-unstressed). The utterances were split between two listening conditions: unmasked and masked.
To assess the impact of immediate auditory feedback on coordination between the jaw and tongue-tip, the onset timing of tongue-tip raising for C was measured relative to the jaw opening-closing cycle for V in each listening condition.
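To make this latency measure concrete, here is a minimal Python sketch, not the authors’ analysis code, that computes the onset of tongue-tip raising relative to the onset of jaw opening from hypothetical EMA position traces. The sampling rate, variable names, synthetic signals, and the velocity-threshold onset criterion are all illustrative assumptions.

```python
# Illustrative sketch only: latency of tongue-tip raising onset relative to the
# onset of the jaw opening movement, from hypothetical EMA position traces.
import numpy as np

FS = 400.0  # assumed EMA sampling rate in Hz

def movement_onset(position: np.ndarray, fs: float = FS, frac: float = 0.1) -> int:
    """Index where movement speed first exceeds a fraction of its peak
    (an assumed kinematic onset criterion, not the paper's)."""
    speed = np.abs(np.gradient(position) * fs)   # numerical derivative, units/s
    return int(np.argmax(speed > frac * speed.max()))

def tongue_tip_latency(jaw_vertical: np.ndarray,
                       tongue_tip_vertical: np.ndarray,
                       fs: float = FS) -> float:
    """Latency (s) of tongue-tip raising onset for C relative to the onset of
    the jaw opening movement for V within one utterance."""
    return (movement_onset(tongue_tip_vertical, fs)
            - movement_onset(jaw_vertical, fs)) / fs

# Synthetic traces standing in for real EMA data:
t = np.linspace(0, 0.5, int(FS * 0.5))
jaw = -np.sin(2 * np.pi * 2 * t)                              # one opening-closing cycle
tongue = np.where(t > 0.15, np.sin(2 * np.pi * 4 * (t - 0.15)), 0.0)
print(f"tongue-tip latency: {tongue_tip_latency(jaw, tongue):.3f} s")
```

Onset criteria for EMA kinematics vary across studies; the 10% peak-speed threshold here is only a stand-in.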
In both listening conditions, any manipulation that shortened the jaw opening-closing cycle also reduced the latency of tongue-tip raising onset relative to the onset of jaw opening. Furthermore, tongue-tip latencies were strongly correlated with utterance type when speech was unmasked.
In contrast, tongue-tip latencies were less strongly correlated with utterance type during auditory masking, suggesting that talkers use afferent auditory signals in real time to control the precision of inter-articulator timing in support of phonetic structure.
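As a rough illustration of the kind of comparison described above, and not the paper’s statistical analysis, the sketch below contrasts how much latency variance utterance type explains in each listening condition using eta-squared; the condition labels, utterance-type names, and numbers are fabricated placeholders.

```python
# Illustrative sketch only: is the latency-by-utterance-type relationship tighter
# when feedback is unmasked than when it is masked?
import numpy as np

def eta_squared(latencies_by_type: dict[str, np.ndarray]) -> float:
    """Proportion of latency variance explained by utterance type."""
    all_vals = np.concatenate(list(latencies_by_type.values()))
    grand_mean = all_vals.mean()
    ss_total = ((all_vals - grand_mean) ** 2).sum()
    ss_between = sum(len(v) * (v.mean() - grand_mean) ** 2
                     for v in latencies_by_type.values())
    return ss_between / ss_total

rng = np.random.default_rng(0)
# Fabricated latencies (s) for two hypothetical utterance types per condition.
unmasked = {"taCat-fast": rng.normal(0.10, 0.01, 30),
            "teCat-fast": rng.normal(0.14, 0.01, 30)}
masked   = {"taCat-fast": rng.normal(0.12, 0.03, 30),
            "teCat-fast": rng.normal(0.13, 0.03, 30)}

print(f"unmasked eta^2: {eta_squared(unmasked):.2f}")  # tighter coupling expected
print(f"masked   eta^2: {eta_squared(masked):.2f}")    # weaker coupling expected
```

In practice a mixed-effects model across talkers and utterance types would be more appropriate; the sketch only conveys the shape of the comparison.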