Hasso-Plattner-Institut
Neurodesign
 

Brain data is frequently visualized through colourful fMRI images or EEG plots. However, many aspects of brain function cannot be intuitively grasped or analyzed from static images alone: brain activity involves dynamics and rhythms that visual depictions convey only to a limited extent. By using 3D sonification models tailored for headphones and for multi-speaker setups in a room, brain data can be presented acoustically, offering a distinctive experience of the dynamics of ever-changing neural activity.

The objective of a project by Luca Hilbrich, Lukas Hartmann, Philipp Steigerwald, and Tim Strauch is to present EEG data via a 3D sound installation. Specifically, the project aims to convey the rhythms and topographies of brain activity in an intuitively understandable way, even for listeners without extensive prior training in neuroscience. To control audio parameters, the team tracks power over time in different EEG frequency bands, such as alpha, low beta, and high beta. The distribution of audio channels corresponds to the EEG channel locations on the skull: activity recorded at frontal positions triggers speakers at the front of the room, activity from lateral/medial positions is represented by speakers in the middle of the room, and activity at occipital positions, at the back of the head, is conveyed through speakers at the back of the room.
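To make this mapping concrete, the sketch below shows one way such a pipeline could look: per-channel band power is estimated with Welch's method and averaged over frontal, central, and occipital channel groups. This is an illustration only, not the project's code; the sampling rate, band edges, and channel grouping are assumptions.

```python
# Illustrative sketch (not the project's code): per-channel EEG band power,
# averaged over assumed front / middle / back channel groups.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed EEG sampling rate in Hz

# Assumed band edges in Hz; the project uses alpha, low beta, and high beta.
BANDS = {"alpha": (8, 12), "low_beta": (12, 20), "high_beta": (20, 30)}

# Assumed 10-20 channel names grouped by scalp region.
REGIONS = {
    "front":  ["Fp1", "Fp2", "F3", "F4"],
    "middle": ["C3", "C4", "T7", "T8"],
    "back":   ["P3", "P4", "O1", "O2"],
}

def band_powers(signal: np.ndarray, fs: int = FS) -> dict:
    """Integrate the Welch PSD of one channel over each frequency band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    df = freqs[1] - freqs[0]
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].sum() * df)
            for name, (lo, hi) in BANDS.items()}

def region_band_powers(eeg: dict) -> dict:
    """Average band power over the channels of each scalp region.

    `eeg` maps channel names to 1-D sample arrays for one time window.
    """
    result = {}
    for region, channels in REGIONS.items():
        per_channel = [band_powers(eeg[ch]) for ch in channels if ch in eeg]
        result[region] = {band: float(np.mean([p[band] for p in per_channel]))
                          for band in BANDS}
    return result
```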

You can try out the web application HERE.

 

About Sonification
Sonification involves the use of non-speech audio to convey information or represent data perceptually. It offers advantages in terms of high temporal, spatial, amplitude, and frequency resolution, making it a promising alternative or complement to visualization techniques. Sonification has been applied in various contexts, including healthcare [1], the detection of gravitational waves [2], auditory altimeters [3], and weather sonification [4].

One significant consideration in sonification projects is the approach to synthesis: How can musical elements like rhythm or pitch be employed to enable listeners to intuitively grasp the data being presented? The design of a sonification depends strongly on its intended application. For instance, when the objective is to monitor EEG alpha activity in a participant engaged in a creative task, the acoustic signal for alpha activity should stand out within the overall auditory experience.
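As a minimal illustration of this design consideration (not the approach used in this project), alpha power relative to a resting baseline could drive the loudness of a distinct foreground layer, so that rises in alpha remain clearly audible above the rest of the mix:

```python
# Illustrative sketch only: making alpha activity stand out by letting it
# control the gain of a distinct foreground sound. The baseline handling and
# smoothing factor are assumptions, not part of the project described here.
import numpy as np

def alpha_gain(alpha_power: float, baseline: float,
               previous_gain: float, smoothing: float = 0.9) -> float:
    """Map above-baseline alpha power to a gain in [0, 1], smoothed over time."""
    ratio = alpha_power / max(baseline, 1e-12)
    target = float(np.clip(ratio - 1.0, 0.0, 1.0))  # only above-baseline alpha is audible
    return smoothing * previous_gain + (1.0 - smoothing) * target
```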

 

The Approach
The approach used in this project differs from typical sonification applications, in which the data is sonified directly (e.g., by mapping it to pitch). Instead, the data is converted into control signals that govern the distribution of sound in 3D space. For EEG, this makes it possible to use the spatial distribution of frequency bands to create an acoustic topography. Because the data merely controls where sound is placed in the room, the acoustic experience itself remains flexible: the sound material can be refined for a variety of applications, including diagnostic, educational, or artistic purposes.
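A rough sketch of this idea, assuming the six-speaker layout described under Project Results and region-wise band powers as in the earlier sketch: the EEG only determines how much of the (arbitrary) sound material each speaker receives.

```python
# Sketch of the distribution idea (assumed six-speaker layout in front/middle/
# back pairs; region band powers as in the earlier sketch). The EEG only sets
# per-speaker gains; the audio material itself can be anything.
import numpy as np

SPEAKERS = ["front_L", "front_R", "middle_L", "middle_R", "back_L", "back_R"]
REGION_OF_SPEAKER = {s: s.rsplit("_", 1)[0] for s in SPEAKERS}

def speaker_gains(region_power: dict, band: str = "alpha") -> dict:
    """Turn one band's power per region into equal-power gains per speaker."""
    powers = np.array([region_power[REGION_OF_SPEAKER[s]][band] for s in SPEAKERS])
    total = powers.sum()
    weights = powers / total if total > 0 else np.full(len(SPEAKERS), 1 / len(SPEAKERS))
    return dict(zip(SPEAKERS, np.sqrt(weights)))  # sqrt keeps overall power constant

def distribute(mono_audio: np.ndarray, gains: dict) -> dict:
    """Apply the gains to a mono buffer, yielding one feed per speaker."""
    return {s: gains[s] * mono_audio for s in SPEAKERS}
```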

 

Project Results
This project yields two outcomes. The first is a web-browser-based binaural rendering for demonstration and research purposes, optimized for headphone use; it includes a built-in dataset to showcase the concept of spatial sonification within a virtual room. The second is an installation in a physical room, featuring six speakers and a custom-written Pure Data program that controls the spatial distribution of sound within the room. The installation maps different regions of the brain onto the room, allowing visitors to explore "the brain" spatially and experience how brainwaves move through it.
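For the headphone version, a very rough stand-in (not the project's browser-based binaural renderer) is to mix the six virtual speaker feeds down to stereo with constant-power panning based on assumed speaker azimuths; a real binaural rendering would apply HRTFs instead.

```python
# Rough headphone approximation: stereo downmix of six virtual speaker feeds
# using constant-power panning. Azimuths are assumptions; a proper binaural
# renderer would convolve each feed with head-related transfer functions.
import numpy as np

# Assumed azimuths in degrees (0 = front of the listener, positive = right).
AZIMUTH = {"front_L": -30, "front_R": 30, "middle_L": -90, "middle_R": 90,
           "back_L": -150, "back_R": 150}

def stereo_downmix(speaker_feeds: dict) -> np.ndarray:
    """Mix per-speaker mono buffers (all of equal length) down to stereo."""
    n = len(next(iter(speaker_feeds.values())))
    left, right = np.zeros(n), np.zeros(n)
    for name, feed in speaker_feeds.items():
        # Map the azimuth to a pan position in [0, 1] and use constant-power gains.
        pan = (np.sin(np.radians(AZIMUTH[name])) + 1.0) / 2.0
        left += np.cos(pan * np.pi / 2) * feed
        right += np.sin(pan * np.pi / 2) * feed
    return np.stack([left, right], axis=0)
```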

 

References
[1] Parvizi, J., Gururangan, K., Razavi, B., & Chafe, C. (2018). Detecting silent seizures by their sound. Epilepsia, 59(4), 877-884.
[2] Georgia Tech. (2016, February 11). LIGO Gravitational Wave Chirp [Video]. YouTube. www.youtube.com/watch
[3] Montgomery, E. T., & Schmitt, R. W. (1997). Acoustic altimeter control of a free vehicle for near-bottom turbulence measurements. Deep Sea Research Part I: Oceanographic Research Papers, 44(6), 1077-1084.
[4] Schuett, J. H., Winton, R. J., Batterman, J. M., & Walker, B. N. (2014, October). Auditory weather reports: demonstrating listener comprehension of five concurrent variables. In Proceedings of the 9th Audio Mostly: A Conference on Interaction With Sound (pp. 1-7).