Neuroscientists discover a brain signal that indicates whether speech has been understood

on 25 February 2018
it ain't what you think, it's the way that you think it

It seems that, based on what you hear and understand (maybe even what you think you hear), the brain then determines what to do; this could merely be to comprehend what is said, or to carry out a complex procedure. But what about your internal voice, your self talk . . .

I very often say to clients, "My job as a therapist is to help you, the client, to get what you want . . . so, what do you want?" In my 18 years as a therapist, I can say that I have rarely had a client who could clearly and specifically articulate what they want. Some say, "Oh, that's easy, I want to be happy." So I then ask, "What is happiness?" Again, I rarely get a clearly stated definition of happiness. Some, after a little thought, do give a reasonably good one (bearing in mind it will be different for each of us). It is the pause, the additional thinking, that is the point I am making here. We need to know what we want at the level of conscious awareness and the subconscious, and ultimately without thought, although the initial phase of developing the "I want" statement does require a lot of thought. In order to get what we want, in the many areas of our life, it has to be clear in our minds. So the more clearly and easily we can articulate what we want to others, the greater our chance of achieving it becomes.

The research below helps us to understand, in some way, how and why different parts of our brain do, or do not, come online in response to speech. That speech is linked to emotion is quite easy to understand: if someone says something pleasant or complimentary to us, we are likely to be aware of nice feelings. On the other hand, if they say something unpleasant or nasty, that will generate very different feelings! So it seems very logical that, if we respond emotionally to what we hear or think, then life will very likely follow that pattern.

So, part of my work as a hypnotherapist is to learn what the client wants. But clients don't usually come for therapy because they want something; they usually come to get rid of something they don't want, e.g. anxiety, depression, smoking etc. It is quite common, when I ask "What do you want?", for them to say, "I don't want this anxiety." But that is not what they want, it's what they don't want! Have you heard of this saying: whatever you do right now, do not think of a pink elephant? Well, what did you think of? Most likely it was a pink elephant. That is because, in order to not think of a pink elephant, you actually have to think of one. As an aside, if you are not familiar with a pink elephant, then substitute it with an apple, smelly cheese or something like that. Essentially, hypnosis creates the presence of a new concept, idea or way of thinking that allows subconscious inculcation of something new, basically laying down the foundations of new or extended neural pathways, and reconsolidating old memories with newer, better alternatives. This interrupts the way in which an old emotional memory expresses itself; change the expression, change the outcome!

So, if life is not panning out the way you wanted or hoped it would, maybe your language, self talk etc. is getting in the way? If that's the case, then why not try hypnosis - you've got nothing to lose, except your problem!

My aim here is to highlight the way hypnosis can help many people who have conditions that confuse them, or that are not responding to other medical interventions.

Hypnotherapy: essentially helping ordinary people live a more ordinary life! To find out more, why not make an appointment for a Free Consultation - here

The Research:  

The presence or absence of a unique brain signal after a listener has heard some speech indicates whether or not that listener has understood what has been said. The discovery has a number of practical applications, including tracking language development, assessing brain function post-injury, and confirming whether important instructions have been understood in high-pressure jobs.

[Figure caption] When a listener understands speech, a strong response signal is seen over the mid-back part of the scalp (top row; blue and green waveforms show the response at two specific recording locations). When they can't understand (because, for example, the speech is played backwards), the signal completely disappears (bottom row; red and yellow waveforms show the lack of response at the same two recording locations).

Neuroscientists from Trinity College Dublin and the University of Rochester have identified a specific brain signal associated with the conversion of speech into understanding. The signal is present when the listener has understood what they have heard, but it is absent when they either did not understand, or weren't paying attention.

The uniqueness of the signal means that it could have a number of potential applications, such as tracking language development in infants, assessing brain function in unresponsive patients, or determining the early onset of dementia in older persons.

During our everyday interactions, we routinely speak at rates of 120-200 words per minute. For listeners to understand speech at these rates -- and to not lose track of the conversation -- their brains must comprehend the meaning of each of these words very rapidly. It is an amazing feat of the human brain that we do this so easily -- especially given that the meaning of words can vary greatly depending on the context. For example, the word bat means very different things in the following two sentences: "I saw a bat flying overhead last night"; "The baseball player hit a home run with his favourite bat."

However, precisely how our brains compute the meaning of words in context has, until now, remained unclear. The new approach, published today in the international journal Current Biology, shows that our brains perform a rapid computation of the similarity in meaning that each word has to the words that have come immediately before it.

To discover this, the researchers began by exploiting state-of-the-art techniques that allow modern computers and smartphones to "understand" speech. These techniques are quite different to how humans operate. Human evolution has been such that babies come more or less hardwired to learn how to speak based on a relatively small number of speech examples. Computers on the other hand need a tremendous amount of training, but because they are fast, they can accomplish this training very quickly. Thus, one can train a computer by giving it a lot of examples (e.g., all of Wikipedia) and by asking it to recognise which pairs of words appear together a lot and which don't. By doing this, the computer begins to "understand" that words that appear together regularly, like "cake" and "pie," must mean something similar. And, in fact, the computer ends up with a set of numerical measures capturing how similar any word is to any other.
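
To make the idea concrete, here is a minimal sketch in Python (using only numpy). It builds a tiny co-occurrence table from a toy corpus, treats each word's counts as a numerical "meaning" vector, and compares vectors with cosine similarity. The corpus, window size and word choices are my own illustrative assumptions, not the researchers' actual setup, which trains far more sophisticated models on vastly larger text collections.

```python
import numpy as np

# Toy corpus standing in for "all of Wikipedia" (illustrative only).
corpus = [
    "she baked a cake for the party",
    "she baked a pie for the party",
    "the bat flew overhead last night",
    "the player hit the ball with his bat",
]

window = 2  # how many neighbouring words count as "appearing together"

# 1. Count how often each pair of words appears within the window.
vocab = sorted({w for line in corpus for w in line.split()})
index = {w: i for i, w in enumerate(vocab)}
counts = np.zeros((len(vocab), len(vocab)))

for line in corpus:
    words = line.split()
    for i, w in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if j != i:
                counts[index[w], index[words[j]]] += 1

# 2. Each row of `counts` is now a crude numerical "meaning" vector;
#    words used in similar contexts end up with similar rows.
def similarity(a, b):
    va, vb = counts[index[a]], counts[index[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-9))

print(similarity("cake", "pie"))  # high: the two words share near-identical contexts
print(similarity("cake", "bat"))  # low: the two words rarely share context
```

Real systems (word2vec-style models and their successors) refine this basic counting idea, but the end product is the same: a set of numbers from which the similarity of any two words can be read off.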

To test if human brains actually compute the similarity between words as we listen to speech, the researchers recorded electrical brainwave signals from the human scalp -- a technique known as electroencephalography or EEG -- as participants listened to a number of audiobooks. Then, by analysing their brain activity, they identified a specific brain response that reflected how similar or different a given word was from the words that preceded it in the story.
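
As a rough illustration of that per-word measure, here is a hedged sketch in Python. The hand-made word vectors, their values and the five-word context window are assumptions for illustration only, not values from the study. It computes, for each word, one minus the cosine similarity between that word's vector and the average vector of the words just before it -- the kind of "semantic dissimilarity" value that can then be lined up with the EEG recorded around each word's onset.

```python
import numpy as np

# Hypothetical word vectors (in practice these would come from a trained
# model, such as the co-occurrence sketch above); the numbers are made up.
word_vectors = {
    "the":      np.array([0.10, 0.90, 0.20]),
    "baseball": np.array([0.80, 0.10, 0.30]),
    "player":   np.array([0.70, 0.20, 0.40]),
    "hit":      np.array([0.60, 0.30, 0.20]),
    "a":        np.array([0.15, 0.85, 0.25]),
    "bat":      np.array([0.75, 0.15, 0.35]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def semantic_dissimilarity(words, context_size=5):
    """For each word, return 1 - cosine similarity to the mean vector of the
    preceding words: a per-word value that can be compared with the EEG
    recorded at each word's onset."""
    scores = []
    for i, w in enumerate(words):
        context = words[max(0, i - context_size):i]
        if not context:
            scores.append(0.0)  # first word has no preceding context
            continue
        context_vec = np.mean([word_vectors[c] for c in context], axis=0)
        scores.append(1.0 - cosine(word_vectors[w], context_vec))
    return scores

sentence = "the baseball player hit a bat".split()
for w, s in zip(sentence, semantic_dissimilarity(sentence)):
    print(f"{w:10s} dissimilarity = {s:.2f}")
```

With realistic vectors, a word that fits the running context gives a low dissimilarity, while an unexpected word gives a high one; the study's finding is that a component of the EEG response tracks exactly this kind of value.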

Crucially, this signal disappeared completely when the subjects either could not understand the speech (because it was too noisy), or when they were just not paying attention to it. Thus, this signal represents an extremely sensitive measure of whether or not a person is truly understanding the speech they are hearing, and, as such, it has a number of potential important applications.

Ussher Assistant Professor in Trinity College Dublin's School of Engineering, Trinity College Institute of Neuroscience, and Trinity Centre for Bioengineering, Ed Lalor, led the research.

Professor Lalor said: "Potential applications include testing language development in infants, or determining the level of brain function in patients in a reduced state of consciousness. The presence or absence of the signal may also confirm if a person in a job that demands precision and speedy reactions -- such as an air traffic controller, or soldier -- has understood the instructions they have received, and it may perhaps even be useful for testing for the onset of dementia in older people based on their ability to follow a conversation."

"There is more work to be done before we fully understand the full range of computations that our brains perform when we understand speech. However, we have already begun searching for other ways that our brains might compute meaning, and how those computations differ from those performed by computers. We hope the new approach will make a real difference when applied in some of the ways we envision."

Story Source:

Materials provided by Trinity College Dublin. Note: Content may be edited for style and length.

Journal Reference:

  1. Michael P. Broderick, Andrew J. Anderson, Giovanni M. Di Liberto, Michael J. Crosse, Edmund C. Lalor. Electrophysiological Correlates of Semantic Dissimilarity Reflect the Comprehension of Natural, Narrative Speech. Current Biology, 2018; DOI: 10.1016/j.cub.2018.01.080