A Novel Sound Localization Experiment for Mobile Audio Augmented Reality Applications

Access & Terms of Use: open access
Abstract
This paper describes a subjective experiment in progress to study human sound localization using mobile audio augmented reality systems. The experiment also serves to validate a new methodology for studying sound localization in which the subject is outdoors and freely mobile, experiencing virtual sound objects that correspond to real visual objects. Subjects indicate the perceived location of a static virtual sound source presented over headphones by walking to a position where the auditory image coincides with a real visual object. This novel response method accounts for multimodal perception and interaction via self-motion, both of which are ignored by traditional sound localization experiments performed indoors with a seated subject and minimal visual stimuli. Results for six subjects show a mean localization error of approximately thirteen degrees, significantly lower error for discrete binaural rendering than for ambisonic rendering, and no significant effect of filter lengths of 64, 128 and 200 samples.
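The walking response yields an indicated source position from which an angular localization error can be derived. As a minimal sketch (not the authors' analysis code; the coordinate convention, function names, and example positions are assumptions for illustration), the error for a single trial could be computed as the wrapped difference between the bearing to the true virtual source and the bearing to the subject's chosen position:

```python
import math

def azimuth_deg(origin, target):
    """Bearing from origin to target in degrees, clockwise from the +y axis."""
    dx = target[0] - origin[0]
    dy = target[1] - origin[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def localization_error_deg(listener, true_source, indicated):
    """Absolute angular difference between the true source direction and the
    subject-indicated direction, wrapped into [0, 180] degrees."""
    diff = abs(azimuth_deg(listener, true_source) - azimuth_deg(listener, indicated))
    return min(diff, 360.0 - diff)

# Hypothetical trial: true source 10 m due "north", subject indicates a
# position offset roughly 15 degrees to the east of it.
err = localization_error_deg((0.0, 0.0), (0.0, 10.0), (2.68, 10.0))
```

Averaging such per-trial errors over subjects and conditions would give summary statistics like the mean error reported in the abstract.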
Author(s)
Mariette, Nick
Conference Proceedings Editor(s)
Pan, Zhigeng
Cheok, Adrian
Haller, Michael
Lau, Rynson W.H.
Saito, Hideo
Liang, Ronghua
Publication Year
2006
Resource Type
Book Chapter
Files: Nick_Mariette_ICAT2006.pdf (2.45 MB, Adobe Portable Document Format)