Rendering visual events as sounds: spatial attention capture by auditory augmented reality

dc.contributor.author: Stone, Scott A.
dc.contributor.author: Tata, Matthew S.
dc.date.accessioned: 2018-09-27T17:27:47Z
dc.date.available: 2018-09-27T17:27:47Z
dc.date.issued: 2017
dc.description: This is an open access article distributed under the terms of the Creative Commons Attribution License.
dc.description.abstract: Many salient visual events tend to coincide with auditory events, such as seeing and hearing a car pass by. Information from the visual and auditory senses can be combined to create a stable percept of a stimulus, and access to coincident visual and auditory information helps with spatial tasks such as localization. However, not all visual information has an analogous auditory percept, such as viewing a computer monitor. Here, we describe a system capable of detecting salient visual events and rendering them as localizable auditory events. The system uses a neuromorphic camera (DAVIS 240B) to detect logarithmic changes of brightness intensity in the scene, which can be interpreted as salient visual events. Participants were blindfolded and asked to use the device to detect new objects in the scene, as well as to determine the direction of motion of a moving visual object. Results suggest the system is robust enough to allow for simple detection of new salient stimuli, as well as accurate encoding of the direction of visual motion. Future successes are probable, as neuromorphic devices are likely to become faster and smaller, making this system much more feasible.
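The abstract describes mapping events from a neuromorphic camera to spatially localizable sounds. A minimal illustrative sketch of that idea, assuming an event carries a pixel position on the DAVIS 240B's 240-pixel-wide sensor and is panned between stereo channels accordingly (the `Event` type and `pan_for_event` function are hypothetical, not from the paper):

```python
# Hypothetical sketch: map a neuromorphic-camera "event" (a pixel whose log
# brightness changed) to stereo channel gains, so that a visual event on the
# left side of the sensor is heard on the left. Names here are illustrative.

from dataclasses import dataclass

SENSOR_WIDTH = 240  # DAVIS 240B horizontal resolution


@dataclass
class Event:
    x: int          # pixel column, 0 .. SENSOR_WIDTH - 1
    y: int          # pixel row
    polarity: bool  # True = brightness increase, False = decrease


def pan_for_event(ev: Event) -> tuple[float, float]:
    """Return (left_gain, right_gain) in [0, 1] from the event's x position."""
    pan = ev.x / (SENSOR_WIDTH - 1)  # 0.0 = far left, 1.0 = far right
    return (1.0 - pan, pan)


# An event at the left edge of the sensor drives only the left channel.
left, right = pan_for_event(Event(x=0, y=50, polarity=True))
```

In the actual system, many such events would be aggregated per frame and used to drive spatialized audio; this sketch only shows the core position-to-pan idea.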
dc.description.peer-review: Yes
dc.identifier.citation: Stone, S. A., & Tata, M. S. (2017). Rendering visual events as sounds: Spatial attention capture by auditory augmented reality. PLoS ONE, 12(8), e0182635. https://doi.org/10.1371/journal.pone.0182635
dc.identifier.uri: https://hdl.handle.net/10133/5205
dc.language.iso: en_US
dc.publisher: Public Library of Science
dc.publisher.department: Department of Neuroscience
dc.publisher.faculty: Arts and Science
dc.publisher.institution: University of Lethbridge
dc.subject: Visual events
dc.subject: Auditory events
dc.subject: Sensory events
dc.subject: Spatial information
dc.subject: Neuromorphic
dc.subject: Sounds
dc.subject.lcsh: Directional hearing
dc.subject.lcsh: Auditory scene analysis
dc.subject.lcsh: Motion perception (Visual)
dc.subject.lcsh: Auditory perception
dc.subject.lcsh: Hearing--Research
dc.subject.lcsh: Vision--Research
dc.title: Rendering visual events as sounds: spatial attention capture by auditory augmented reality
dc.type: Article
Files
Original bundle
Name: Tata rendering visual events.pdf
Size: 4.77 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.75 KB
Format: Item-specific license agreed upon to submission
Collections