A group of international scientists from Israel and the United States has revealed how the brain processes words in everyday conversation. The study was published in Nature Human Behaviour (NHB). The team analyzed more than 100 hours of brain activity recorded during real-life dialogues. The researchers used electrocorticography to record brain signals during natural conversation. For the analysis, they used a speech-to-text model called Whisper, which breaks speech down into three levels: sound (the acoustic signal), speech patterns (phonetic structure), and word meaning (semantics).

Each level was compared with brain activity using computational algorithms. The model showed that specific parts of the brain activate when processing the different levels of speech: auditory areas respond to sound, while higher cognitive areas process the meaning of words. The study also showed that the brain processes language sequentially.
Before we speak, the brain shifts from representing words to forming sounds, and after we listen, it works in the opposite direction to reconstruct the meaning of what was said. The framework used in this study proved more effective than older methods at capturing these complex processes, the scientists noted.
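The comparison the article describes, aligning a model's representations with recorded brain activity, is commonly implemented as a linear "encoding model": a regression is fit from the model's embeddings to the signal at each electrode, and the fit is scored on held-out data. Below is a minimal sketch of that idea with synthetic data. It is not the study's actual pipeline; the embedding size, word count, and simulated electrode signal are all stand-ins.

```python
# Illustrative sketch of an encoding model (synthetic data, not the study's code):
# fit a ridge regression from word embeddings to a brain signal, then score
# the prediction on held-out words.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_words, emb_dim = 500, 64  # hypothetical word count and embedding size
embeddings = rng.normal(size=(n_words, emb_dim))  # stand-in for Whisper embeddings

# Simulate one electrode whose activity is a noisy linear function of the embeddings
true_weights = rng.normal(size=emb_dim)
brain_signal = embeddings @ true_weights + rng.normal(scale=1.0, size=n_words)

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, brain_signal, test_size=0.2, random_state=0)

model = Ridge(alpha=1.0).fit(X_train, y_train)
pred = model.predict(X_test)

# Correlation between predicted and actual signal on held-out words
r = np.corrcoef(pred, y_test)[0, 1]
print(f"held-out correlation: {r:.2f}")
```

In the real study this fit would be computed separately for each representational level (acoustic, phonetic, semantic) and each electrode, which is what lets the researchers map which brain areas track which level.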