Improving noise filters in hearing aids

Image: A young woman speaks into an older woman's ear (Colourbox.de)

Researchers at FAU have developed a method that makes it easier to understand speech at high levels of background noise.

Human beings are very skilled at understanding language, even under difficult conditions. However, loud background noise makes it difficult for people who are hard of hearing to follow what a speaker is saying. Hearing aids are of limited help here, as currently available models do not filter out enough of the background noise. Researchers at FAU are developing a method that helps people understand speech better despite background noise. The results have been published in the scientific journal PNAS.

When noise such as loud music, rattling cutlery in a restaurant or murmuring voices disrupts what we hear, it can become difficult for us, and for hearing aid users in particular, to understand our conversation partners. This is because hearing aids can amplify quiet sounds but cannot adequately cover the full bandwidth of our hearing.

Additional vibration impulses

Researchers at FAU demonstrated that transmitting a vibration impulse at the same time as the speech signal significantly improves participants' speech comprehension in rooms with high levels of noise. In their experiment, the researchers used a small vibration motor that participants held between their thumb and forefinger, a part of the body that is particularly sensitive to tactile signals.

Understanding the speech signal

The impulse emitted by the motor is similar to the vibration of a smartphone screen when it is touched. The motor emits a low-level vibration at the centre of each spoken syllable. This impulse improves speech comprehension because the part of the nervous system responsible for the sense of touch transmits the signal to the auditory cortex, the area of the brain that processes sound and speech signals. This part of the brain reproduces the rhythm of the syllables and thus supports the decoding of syllables and the comprehension of the speech signal.

In their experiment, the researchers played selected speech signals to the participants. The centres of the spoken syllables were calculated in advance, and the vibration motor was controlled so that it emitted a signal at exactly those times. The next phase of the research involves finding out whether the impulses can still be delivered at the right time when the syllable centres are not calculated before the experiment, i.e. in real time. The researchers also want to determine whether the forefinger is the best location for the tactile signal: placing the motor near the ear would be beneficial if the system is to be combined with a hearing aid. The results already available will be used to develop new multi-sensory hearing aids that help users understand speech better.
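The triggering scheme described above can be sketched in code. The following is a minimal illustration, not the researchers' actual implementation: it estimates syllable centres from a speech signal's amplitude envelope by peak picking, yielding the times at which a vibration pulse would be scheduled. The envelope-peak heuristic and all parameter values (8 Hz low-pass cutoff, minimum syllable spacing, amplitude threshold) are assumptions chosen for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def syllable_centres(signal, fs, min_gap_s=0.15):
    """Estimate syllable centres (in seconds) from a mono speech signal.

    Heuristic sketch: rectify the waveform, low-pass filter it to obtain
    a slow amplitude envelope, then pick envelope peaks -- roughly one
    per syllable, since syllables arrive at only a few hertz.
    """
    envelope = np.abs(signal)
    b, a = butter(2, 8.0 / (fs / 2))  # ~8 Hz low-pass: syllable-rate envelope
    envelope = filtfilt(b, a, envelope)  # zero-phase, so peaks are not delayed
    peaks, _ = find_peaks(
        envelope,
        distance=int(min_gap_s * fs),   # minimum spacing between syllables
        height=0.2 * envelope.max(),    # ignore quiet background noise
    )
    return peaks / fs

# Synthetic example: three "syllables" as amplitude bursts at 0.2 s, 0.6 s, 1.0 s
fs = 16000
t = np.arange(0, 1.3, 1 / fs)
burst = lambda c: np.exp(-((t - c) ** 2) / (2 * 0.03 ** 2))
speech = (burst(0.2) + burst(0.6) + burst(1.0)) * np.sin(2 * np.pi * 150 * t)

centres = syllable_centres(speech, fs)
print(np.round(centres, 2))  # vibration pulses would be scheduled at these times
```

In the published experiment the centres were computed offline from known recordings; a real-time variant of such a system would have to estimate the envelope with low latency instead.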

More information:

Prof. Dr. Tobias Reichenbach

Phone: +49 9131 85 23354

tobias.j.reichenbach@fau.de