ICSA 2017 4th International Conference on Spatial Audio | September 7th to 10th, 2017, Graz, Austria #vdticsa
The spread of VR technology in consumer devices also opens new
possibilities for using spatial audio with head-tracking, making this
progress available to nearly everyone. In recent years, the major
social and video platforms have also implemented first-order
Ambisonics in a user-friendly way. Binaural audio has played only a
minor role over the past decades due to the lack of head-tracking.
Nowadays every smartphone is head-tracking enabled, but the latency
involved requires careful monitoring of the stability of the sound
scene.
This paper evaluates the influence of visual information, presented in
a VR context, on the perception of different ambience recordings. Data
will be gathered through a comparative listening test conducted with VR
goggles. Audiovisual scenes recorded simultaneously will be played
back to the test subjects, enabling them to switch between different
audio signals in real time. The audio signals will comprise recording
techniques based on coincidence and equivalence stereophony that showed
significant preference in loudspeaker-based comparative listening tests.
Following an elicitation procedure, we will investigate the effects and
evaluate the results.