How is your auditory perception influenced by context?

Dear reader, this post is the first of many presenting the efforts currently taking place at DELTA SenseLab. Although I am the author of this post, this is an official blog covering the work and thoughts on perceptual audio-visual testing at SenseLab. This format of communication is an experiment in improving our interaction with professionals in the audio industry, and we hope it will help ensure that the services in development have value for the industry. So please comment on the posts and share ideas, critiques, thoughts, inspiration, etc.

Understanding the interactions between audio & video

We are currently working on a research project, which will continue for the next two years. In this project, we will investigate methods of evaluating audio with video as well as audio and video. In the case of audio with video, new tests will be conducted to investigate to which degree human auditory perception is influenced by visual context. For example, whether noise from trains is perceived as more or less annoying if one can also see them. Or whether a car’s warning sounds (e.g. “Fasten your seatbelt”) are perceived as more intuitive when a visual context is provided, compared to sitting in a laboratory. In Belgium, similar experiments have already been conducted: there, the company Haystack equipped test subjects with virtual reality glasses and placed them in a virtual bar while characterising beer (link)! Their experience so far is that visual context significantly influences the subjects’ evaluations in tasting and smelling tasks. At SenseLab we want to investigate the degree to which this is also the case when evaluating sound.

The aim of this project is to increase our know-how of perceptual audio-visual testing, focusing particularly on the influence of video on audio evaluation. Furthermore, most experiences and results will be made publicly available, either through publications (e.g. DELTA Tech Documents or AES conference papers) or through other channels, such as our annual mini conference SenseCamp. Within this project, we will gain experience with 360° video capture and A/V testing methods in our new SenseLabOnline tool, and run audio tests in artificial visual contexts. This will entail both normal video on a flat screen or projector and presentations in virtual reality environments. Our first demonstrator project, using video as visual context for an audio evaluation test, will be described in a December post.

If you are interested in updates about our project, please follow me here on LinkedIn (https://dk.linkedin.com/in/christervolk), and new posts will appear on your LinkedIn front page.

DELTA SenseLab publication library: http://senselab.madebydelta.com/about/publications/
