
 Rodrigo Gutierrez

 

www.llupiy.com – In Quechua, the native American language of the regions of Ecuador, Peru and Bolivia, llupiy means to think, to meditate.

 

Research interests in cognition, with a central focus on sound and affect: bridging psychophysiology, philosophy and conversational AI.

 

Present: Supporting the development and testing of algorithms for immersive 3D audio standards, i.e., MPEG-I and MPEG-H, for virtual and augmented reality applications. Dolby GmbH, Nuremberg, Germany.

2017-2020. MSc degree in Digital Media Informatics. University of Bremen / HFK Bremen, Germany. Thesis on the psychophysiological effects of acoustic environments at Dolby GmbH, Nuremberg, Germany. Research-oriented program linking computer science, media theory and design, and human-computer interaction.

2017-2019. Student assistant at the Spatial Cognition Center, University of Bremen, Germany. Psychometrics: just-noticeable differences in auditory display.

2015. Bachelor of Information Systems Engineering. UEES, Ecuador.

2002-2017. Academic Staff / Research / IT / Sound Production at www.igad.edu.ec, Ecuador.

2000-2017. Freelance: interactive media installations, video mapping and live coding performances for several events including symphony orchestra concerts, music festivals, TV presentations, and museum exhibitions.

 

 

 The Stone and The Ripples

 

Think of a forest. Have you ever been to one? Even if you haven’t, you can probably imagine how it may sound or look. “Hearing and seeing” spaces inside our heads, modeled through cognition, is what auralization and visualization refer to. Modern technology allows us to model acoustic and visual environments accurately and make them audible and visible through speakers, headphones, screens and head-mounted displays, e.g., in augmented or virtual reality. Then how about auralizing and visualizing the internal functioning of the human body? This is called audiovisual biofeedback: the mapping of biometrics, such as heart or breathing rate, to sounds, shapes and colors. Audiovisual biofeedback provides a way to become aware of our organs (interoception) and to exercise the integration of our senses by stimulating the assembly of sensory neural networks (synesthesia). As a whole, these bio-technological processes support the perception of the bonds between physical and immaterial dimensions, long described in the scriptures of ancient Hindu civilizations.
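To make the mapping idea concrete, here is a minimal sketch in Python of how a biometric value such as heart rate could be turned into a pitch and a color. The value ranges and the linear interpolation are illustrative assumptions, not the mapping used in this work.

# A minimal sketch of an audiovisual biofeedback mapping, assuming a stream of
# heart-rate samples in beats per minute. The bpm and pitch ranges below are
# illustrative choices, not the mapping used in this work.
import numpy as np

def map_biosignal(bpm: float,
                  bpm_range: tuple = (50.0, 100.0),
                  pitch_range: tuple = (220.0, 440.0)) -> dict:
    """Map a heart-rate sample to a sound pitch (Hz) and an RGB color."""
    lo, hi = bpm_range
    # Normalize the biometric value into [0, 1].
    x = float(np.clip((bpm - lo) / (hi - lo), 0.0, 1.0))
    # Sonification: interpolate between two pitches.
    pitch_hz = pitch_range[0] + x * (pitch_range[1] - pitch_range[0])
    # Visualization: blue for low arousal, red for high arousal.
    color_rgb = (x, 0.0, 1.0 - x)
    return {"pitch_hz": pitch_hz, "color_rgb": color_rgb}

if __name__ == "__main__":
    for bpm in (55, 70, 95):
        print(bpm, map_biosignal(bpm))

The same normalize-then-interpolate pattern generalizes to any biometric and any audiovisual parameter, which is what lets heart or breathing rate drive sounds, shapes and colors at once.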

 

Here, if you sync your breathing with the guiding sounds, you can experience Heart Rate Variability (HRV) Audiovisual Biofeedback with a standard camera, such as the ones in mobile devices or laptops. HRV is measured from color changes in the skin of the face. Its value is optimized by breathing slowly, i.e., at the pace of the oscillating nature sounds. Red tones become more visible when you fall out of sync, and blue tones become more visible when your biosignals sync in. The latter leads to nervous system coherence (Pranayama) and supports interoception (Vipassana), i.e., relaxation and awareness of your body's functions.
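For readers curious about the signal-processing side, the sketch below shows one way HRV could be estimated from a camera-derived skin-color signal and mapped to the red/blue feedback described above. It assumes the pulsatile signal has already been extracted from a face region (remote photoplethysmography); the 30 fps frame rate, the RMSSD measure and the 50 ms threshold are illustrative assumptions, not the algorithm behind this work.

# A hedged sketch of HRV estimation from a camera signal, assuming a time
# series of mean skin-color values from a face region is already available.
import numpy as np
from scipy.signal import find_peaks

FS = 30.0  # assumed camera frame rate in Hz

def rmssd_from_skin_signal(skin_signal: np.ndarray, fs: float = FS) -> float:
    """Estimate HRV (RMSSD, in ms) from a pulsatile skin-color time series."""
    # Remove the slow baseline so that cardiac pulses stand out.
    baseline = np.convolve(skin_signal, np.ones(int(fs)) / fs, mode="same")
    detrended = skin_signal - baseline
    # One peak per heartbeat; enforce a refractory period of ~0.5 s.
    peaks, _ = find_peaks(detrended, distance=int(0.5 * fs))
    ibi_ms = np.diff(peaks) / fs * 1000.0  # inter-beat intervals in ms
    return float(np.sqrt(np.mean(np.diff(ibi_ms) ** 2)))  # RMSSD

def feedback_color(rmssd_ms: float, threshold_ms: float = 50.0) -> str:
    """Blue when HRV is high (slow, coherent breathing), red otherwise."""
    return "blue" if rmssd_ms >= threshold_ms else "red"

if __name__ == "__main__":
    # Synthetic pulse at roughly 66 bpm with breathing-driven variation.
    t = np.arange(0, 60, 1 / FS)
    beat_freq = 1.1 + 0.05 * np.sin(2 * np.pi * 0.1 * t)
    signal = np.sin(2 * np.pi * np.cumsum(beat_freq) / FS) \
        + 0.05 * np.random.randn(t.size)
    hrv = rmssd_from_skin_signal(signal)
    print(f"RMSSD ~ {hrv:.1f} ms -> {feedback_color(hrv)}")

Slow, paced breathing increases the beat-to-beat variation of the pulse, which raises RMSSD and shifts the feedback toward blue; falling out of sync lowers it and shifts the feedback toward red.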

 

This work makes use of open-source algorithms (see Fig. 1) and makes the practice of audiovisual biofeedback freely available to those interested in exploring the connections between mind and body and the cognitive processes that underpin them. The ultimate questions of science and spirituality both approach the nature of consciousness and should be a matter of debate for everyone, since therein lies the present and future of the realities we want to construct with our attention.
