By Hearology Publishing | Wed Apr 30 2025

[Infographic: a brain sending out sound waves]

New research from MIT reveals how precise neural timing underpins our ability to hear, understand and connect – and could transform hearing technology

The next time you pick out a voice in a noisy room or instinctively turn towards the source of a sound, spare a thought for your auditory neurons. Behind even the simplest act of listening lies an astonishing feat of biological precision.

Now scientists at MIT’s McGovern Institute have come closer than ever to explaining how that precision works – and what happens when it goes wrong. 

Their latest study, published in Nature Communications, reveals that the exact timing of electrical signals in the auditory nerve is critical to how we process sound. Lose that precision, and we lose our grip on the world of sound.


A spike in understanding

Sound waves entering the ear are converted into electrical signals by specialised cells. These signals – or “spikes” – travel along the auditory nerve to the brain. What MIT’s researchers found is that it’s not just the number of spikes that matters, but precisely when they occur relative to the original sound wave.

This alignment, known as phase-locking, ensures that the information reaching the brain reflects not just the presence of sound, but its tone, timing, pitch and direction. “The action potentials in an auditory nerve get fired at very particular points in time relative to the peaks in the stimulus waveform,” said lead researcher Professor Josh McDermott.
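For the technically curious, phase-locking can be illustrated with a toy simulation. The sketch below (not the study’s model – the neuron, frequency and threshold are arbitrary choices for illustration) fires a “spike” whenever a pure tone rises past a threshold, then shows that every spike lands at almost exactly the same point in the stimulus cycle:

```python
import numpy as np

# Illustrative sketch only: a toy "neuron" that fires once per cycle of a
# 220 Hz tone, whenever the waveform rises past a threshold. Because the
# threshold crossing happens at the same point in every cycle, the spikes
# are "phase-locked" to the stimulus. (Real auditory-nerve coding is far
# more complex; all parameter values here are arbitrary.)
fs = 44_100                      # sample rate, Hz
freq = 220.0                     # stimulus frequency, Hz
t = np.arange(0, 0.1, 1 / fs)    # 100 ms of signal
tone = np.sin(2 * np.pi * freq * t)

# Detect rising threshold crossings: one spike per cycle
crossings = (tone[1:] >= 0.9) & (tone[:-1] < 0.9)
spike_times = t[1:][crossings]

# Spike phase relative to the stimulus cycle (0..1 of a period)
phases = (spike_times * freq) % 1.0
print(f"{len(spike_times)} spikes, phase spread = {phases.ptp():.4f} cycles")
```

Running this shows the spikes’ phases spread over only a tiny fraction of a cycle – the kind of temporal regularity the brain can exploit to read off pitch and timing.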

In other words: our sense of hearing relies not only on detecting sound, but also on an exquisite form of internal choreography between the ear and the brain.


Artificial models reveal the limits of human hearing

To explore this further, McDermott’s team built artificial neural networks capable of simulating auditory processing. Previous models often outperformed humans – not because they were more realistic, but because they didn’t face the same challenges as human listeners. This time, the researchers designed tasks that reflect the true complexity of real-life hearing: picking out words or voices in background noise, for example.

When they degraded the timing of the simulated spikes, the system faltered. Voice recognition and sound localisation suffered – just as they do in people with hearing loss.
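The effect of degraded timing can also be made concrete with a standard neuroscience measure called vector strength, which is 1.0 for perfectly phase-locked spikes and near 0 for random ones. The toy demo below (again, an illustration with assumed parameters, not the researchers’ actual simulation) adds increasing amounts of temporal jitter to a phase-locked spike train and watches the phase information collapse:

```python
import numpy as np

# Toy demonstration (not the study's model): vector strength measures how
# tightly spikes cluster at one phase of the stimulus cycle. Gaussian jitter
# on the spike times smears that clustering, just as degraded neural timing
# would smear the cues the brain uses for pitch and localisation.
rng = np.random.default_rng(0)
freq = 220.0                     # stimulus frequency, Hz
n_spikes = 1000
period = 1.0 / freq

# Perfectly phase-locked train: one spike per cycle, always at the same phase
clean = np.arange(n_spikes) * period + 0.2 * period

def vector_strength(spike_times, freq):
    """Magnitude of the mean phase vector: 1.0 = perfect locking, ~0 = random."""
    phases = 2 * np.pi * ((spike_times * freq) % 1.0)
    return abs(np.mean(np.exp(1j * phases)))

for jitter_ms in (0.0, 0.1, 1.0, 5.0):
    jittered = clean + rng.normal(0.0, jitter_ms / 1000.0, n_spikes)
    print(f"jitter {jitter_ms} ms -> vector strength "
          f"{vector_strength(jittered, freq):.3f}")
```

Even a millisecond of jitter sharply reduces vector strength at this frequency – a simple way to see why sub-millisecond precision matters so much for hearing.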

“You need quite precise spike timing in order to both account for human behaviour and to perform well on the task,” said Mark Saddler, who worked on the study and is now at the Technical University of Denmark.


From neural timing to better hearing aids

The findings don’t just add to our understanding of how hearing works. They could also change how we help people when it doesn’t.

Hearing aids and cochlear implants, while transformative for millions, can still fall short in noisy environments. By mimicking the brain’s natural reliance on precise timing, future devices could become much better at separating signal from noise – especially in complex environments like restaurants or classrooms.

“Now that we have these models that link neural responses in the ear to auditory behaviour, we can ask, ‘If we simulate different types of hearing loss, what effect is that going to have on our auditory abilities?’” said McDermott.


The beauty – and fragility – of human hearing

For Hearology®, which provides hearing tests, tinnitus support, hearing aids and microsuction ear wax removal among other services, this research is a reminder of both the complexity and the delicacy of the hearing system.

“This work is astonishing – and important,” said Vincent Howard, co-founder of Hearology®. “It shows just how much of our ability to listen, understand and connect with others depends on timing that’s accurate to the millisecond. That’s a marvel of evolution – but it also explains why even small disruptions to the auditory system can have such a profound effect.”

The more we learn about the brain’s role in hearing, the more it becomes clear that hearing loss is not just about the ears. It’s about how the whole system – sound, nerves, brain – works together.

And understanding this may be the key to building the next generation of hearing support: devices that don’t just amplify, but truly listen like we do.

