
A collaboration between Google and Osaka University is transforming the way we understand the relationship between music and the mind.
Using a “mind-reading AI”, researchers were able to recreate music similar to what people were listening to when their brains were scanned.
The groundbreaking study opens up new perspectives on how music is processed by the human brain. This could be an important step towards understanding how our everyday experiences are encoded in brain activity. Understand better below!
Research demonstrates that Artificial Intelligence (AI) is capable of generating music based on brain patterns, opening the door to understanding how music affects the mind and emotions.
The results suggest that brain activity can be translated into musical characteristics such as genre, rhythm, mood and instrumentation.
So far, reconstructing sounds from brain activity has mostly been explored in relation to simple sounds like human speech and animal noises.
However, this pioneering study by Google and Osaka University has expanded those boundaries, delving into the complex world of music.
The AI's music creation process, called Brain2Music, uses brain images obtained by functional magnetic resonance imaging (fMRI).
From this data, the AI makes connections between brain patterns and musical elements such as genre and mood.
This personalized approach enables the technology to produce music clips that resemble the snippets of music participants heard during the scan.
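To make the idea concrete, here is a minimal sketch of what such a decoding pipeline could look like in code. It is not the study's actual implementation: the synthetic data, the ridge-regression decoder, and the embedding size are assumptions chosen purely for illustration.

```python
# Minimal sketch (not the study's code): map fMRI voxel patterns to
# music-embedding features with a linear model; the predicted embedding
# could then condition a music generator or retrieve a similar clip.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical data: one flattened fMRI response per music clip, paired with
# an embedding summarising that clip (genre, rhythm, mood, instrumentation).
n_clips, n_voxels, embed_dim = 500, 2000, 128
fmri_responses = rng.normal(size=(n_clips, n_voxels))      # stand-in brain activity
music_embeddings = rng.normal(size=(n_clips, embed_dim))   # stand-in music features

X_train, X_test, y_train, y_test = train_test_split(
    fmri_responses, music_embeddings, test_size=0.2, random_state=0
)

# Linear decoding: predict each clip's music embedding from brain activity.
decoder = Ridge(alpha=1.0).fit(X_train, y_train)
predicted_embeddings = decoder.predict(X_test)

# In a full pipeline, predicted_embeddings would feed a music-generation model
# or be matched against a library of real clips by nearest neighbour.
```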
While the AI has achieved notable success in generating music similar to the originals, the researchers point out that the process is not perfect.
The concordance between the recreated music and the original music was around 60%, indicating that the AI captures certain aspects of songs more accurately than others.
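To give a sense of what such an agreement figure could mean in practice, the hypothetical snippet below scores the similarity between feature vectors of an original and a recreated clip using cosine similarity; the metric and features used in the actual study may differ.

```python
# Hypothetical similarity check between an original clip and its recreation,
# each summarised as a feature vector (e.g. genre, rhythm, mood descriptors).
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity in [-1, 1]; 1 means identical direction of features."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
original = rng.normal(size=128)                          # features of the heard clip
recreated = original + rng.normal(scale=1.0, size=128)   # imperfect reconstruction

print(f"agreement score: {cosine_similarity(original, recreated):.2f}")
```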
However, this groundbreaking discovery hints at significant potential for future research and development in this field. The study also revealed specific brain regions involved in music processing.
The primary auditory cortex, where sounds are interpreted, is activated when listening to music, while the lateral prefrontal cortex, associated with meaning and interpretation, plays an important role.
Furthermore, the research raises the fascinating possibility of reconstructing songs that people are imagining in their minds, going beyond physical hearing.
With this, scientists are unlocking the mysteries of the profound connection between music and the mind, opening doors to a more comprehensive understanding of how music shapes the human experience.
The study could also have practical implications for developing personalized music therapies and more intuitive interactions between humans and technology.