A Bayesian computational basis for auditory selective attention using head rotation and the interaural time-difference cue

Date
2017
Authors
Hambrook, Dillon A.
Ilievski, Marko
Mosadeghzad, Mohamad
Tata, Matthew S.
Publisher
Public Library of Science
Abstract
The process of resolving mixtures of several sounds into their separate individual streams is known as auditory scene analysis, and it remains a challenging task for computational systems. It is well known that animals use binaural differences in arrival time and intensity at the two ears to find the arrival angle of sounds in the azimuthal plane, and this localization function has sometimes been considered sufficient to enable the un-mixing of complex scenes. However, the ability of such systems to resolve distinct sound sources in both space and frequency remains limited. The neural computations for detecting interaural time difference (ITD) have been well studied and have served as the inspiration for computational auditory scene analysis systems; however, a crucial limitation of ITD models is that they produce ambiguous or “phantom” images in the scene. This has been thought to limit their usefulness at frequencies above about 1 kHz in humans. We present a simple Bayesian model and an implementation on a robot that uses ITD information recursively. The model makes use of head rotations to show that ITD information is sufficient to unambiguously resolve sound sources in both space and frequency. Contrary to commonly held assumptions about sound localization, we show that the ITD cue used with high-frequency sound can provide accurate and unambiguous localization and resolution of competing sounds. Our findings suggest that an “active hearing” approach could be useful in robotic systems that operate in natural, noisy settings. We also suggest that neurophysiological models of sound localization in animals could benefit from revision to include the influence of top-down memory and sensorimotor integration across head rotations.
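The core idea of the abstract — that a recursive Bayesian update over candidate directions, combined with a head rotation, disambiguates the ITD cue — can be illustrated with a minimal sketch. This is not the authors' implementation: the inter-ear spacing, noise level, rotation angle, and the idealized far-field ITD model τ(θ) = (d/c)·sin θ are all assumptions made for illustration. A single ITD measurement is consistent with two directions (θ and 180° − θ, the front-back "phantom"); a second measurement after rotating the head keeps only the hypothesis consistent with both.

```python
import numpy as np

# Hypothetical sketch (not the paper's code): recursive Bayesian update
# over candidate source azimuths using the ITD cue, with a head rotation
# to break the front-back ambiguity inherent in ITD alone.

EAR_SEP = 0.18   # assumed inter-ear distance (m)
C = 343.0        # speed of sound (m/s)

def itd(azimuth_rad):
    """Idealized far-field ITD: tau = (d / c) * sin(theta)."""
    return (EAR_SEP / C) * np.sin(azimuth_rad)

def likelihood(observed_itd, azimuths, sigma=2e-5):
    """Gaussian observation model on ITD (sigma is an assumed noise level)."""
    return np.exp(-0.5 * ((observed_itd - itd(azimuths)) / sigma) ** 2)

# Candidate directions in a fixed world frame; flat prior to start.
world_az = np.deg2rad(np.arange(-180, 180, 1))
posterior = np.ones_like(world_az)
posterior /= posterior.sum()

true_az = np.deg2rad(40.0)                  # true source direction (world frame)
for head_yaw in np.deg2rad([0.0, 30.0]):    # listen, rotate the head, listen again
    obs = itd(true_az - head_yaw)           # ITD measured in the rotated head frame
    posterior *= likelihood(obs, world_az - head_yaw)
    posterior /= posterior.sum()

best = np.rad2deg(world_az[np.argmax(posterior)])
# After the first measurement the posterior has two peaks (40 deg and 140 deg);
# the rotated second measurement leaves only the true 40 deg source.
print(int(round(best)))
```

The disambiguation works because the phantom direction implied by the first measurement predicts a different ITD after the head rotates, so its posterior mass collapses under the second update — the "active hearing" principle the abstract describes.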
Description
Open access; distributed under the terms of the Creative Commons Attribution License
Keywords
Bayesian, Selective attention, Head rotation, Computational systems, Interaural time difference, Sound localization
Citation
Hambrook, D. A., Ilievski, M., Mosadeghzad, M., & Tata, M. S. (2017). A Bayesian computational basis for auditory selective attention using head rotation and the interaural time-difference cue. PLoS ONE, 12(10), e0186104. Retrieved from https://doi.org/10.1371/journal.pone.0186104