A brain signal associated with the conversion of speech into understanding has been identified by researchers at Trinity College Dublin for the first time.

The discovery could have a wide range of applications, such as assessing brain function in unresponsive patients or detecting the early onset of dementia.

It could also be used to track language development in infants and children.

The neuroscience team found that the signal was present when the listener had understood what they heard.

However, there was no signal if the individual was not paying attention or did not understand what was being said.

The researchers made their discovery using the same techniques that enable computers and smartphones to comprehend speech.

The process involves feeding computers large quantities of text so they can learn patterns of association between words with similar meanings.
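Those patterns are typically captured as word "embeddings": each word becomes a list of numbers, and words with related meanings end up with nearby vectors. The sketch below illustrates the general idea only; the toy vectors are invented for the example, while real models such as word2vec learn vectors with hundreds of dimensions from billions of words of text.

```python
import numpy as np

# Toy 4-dimensional word vectors, invented purely for illustration.
# A real embedding model learns these from very large text corpora.
embeddings = {
    "cat": np.array([0.9, 0.1, 0.3, 0.0]),
    "dog": np.array([0.8, 0.2, 0.4, 0.1]),
    "car": np.array([0.1, 0.9, 0.0, 0.7]),
}

def cosine_similarity(u, v):
    """Similarity of two word vectors: close to 1 for related words."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["cat"], embeddings["dog"]))  # high: similar meaning
print(cosine_similarity(embeddings["cat"], embeddings["car"]))  # lower: less related
```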

The scientists then recorded electrical brainwave signals from participants as they listened to audiobooks.

These recordings were then analysed to see whether listeners' brains compute the similarity between words as they hear speech.

In doing so, the researchers identified a specific brain response that indicated how similar or different in meaning each word was to the words that came before it.
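The article does not spell out the exact computation, but one plausible formulation of such a word-by-word measure, sketched here as an assumption rather than the team's published pipeline, is to score each word by how dissimilar its vector is from the average of the vectors of the words heard so far:

```python
import numpy as np

def semantic_dissimilarity(word_vectors):
    """Score each word by 1 minus its cosine similarity to the mean
    vector of the preceding words. Higher scores mark words that are
    more semantically surprising given the context so far.

    word_vectors: list of 1-D numpy arrays, in listening order.
    """
    if not word_vectors:
        return []
    scores = [0.0]  # the first word has no preceding context
    for i in range(1, len(word_vectors)):
        vec = word_vectors[i]
        context = np.mean(word_vectors[:i], axis=0)
        cos = np.dot(vec, context) / (np.linalg.norm(vec) * np.linalg.norm(context))
        scores.append(1.0 - float(cos))
    return scores
```

A per-word score like this can then be lined up in time with the EEG recordings to test whether the brain's response tracks it, which is the kind of relationship the study reported.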

But, most significantly, that signal was absent when the participants either could not understand the speech or were simply not paying attention to it.

The discovery, details of which were published in the journal Current Biology, is important because it was previously unclear how our brains process the meaning of words in context.

"The presence or absence of the signal may also confirm if a person in a job that demands precision and speedy reactions - such as an air traffic controller, or soldier - has understood the instructions they have received, and it may perhaps even be useful for testing for the onset of dementia in older people based on their ability to follow a conversation," said Ed Lalor, Ussher Assistant Professor in TCD's School of Engineering who led the study.

Further work is required, he said, to get a complete picture of all the computations performed in the understanding of speech.
 
"However, we have already begun searching for other ways that our brains might compute meaning, and how those computations differ from those performed by computers," he said.

"We hope the new approach will make a real difference when applied in some of the ways we envision."
 
"There is more work to be done before we fully understand the full range of computations that our brains perform when we understand speech."