- A Bayesian computational basis for auditory selective attention using head rotation and the interaural time-difference cue (Public Library of Science, 2017). Hambrook, Dillon A.; Ilievski, Marko; Mosadeghzad, Mohamad; Tata, Matthew S.
  The process of resolving mixtures of several sounds into their separate individual streams is known as auditory scene analysis, and it remains a challenging task for computational systems. It is well known that animals use binaural differences in arrival time and intensity at the two ears to find the arrival angle of sounds in the azimuthal plane, and this localization function has sometimes been considered sufficient to enable the un-mixing of complex scenes. However, the ability of such systems to resolve distinct sound sources in both space and frequency remains limited. The neural computations for detecting interaural time difference (ITD) have been well studied and have served as the inspiration for computational auditory scene analysis systems; however, a crucial limitation of ITD models is that they produce ambiguous or “phantom” images in the scene. This has been thought to limit their usefulness at frequencies above about 1 kHz in humans. We present a simple Bayesian model, and an implementation on a robot, that uses ITD information recursively. The model makes use of head rotations to show that ITD information is sufficient to unambiguously resolve sound sources in both space and frequency. Contrary to commonly held assumptions about sound localization, we show that the ITD cue used with high-frequency sound can provide accurate and unambiguous localization and resolution of competing sounds. Our findings suggest that an “active hearing” approach could be useful in robotic systems that operate in natural, noisy settings. We also suggest that neurophysiological models of sound localization in animals could benefit from revision to include the influence of top-down memory and sensorimotor integration across head rotations.
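The idea of using the ITD cue recursively across head rotations can be sketched as a grid-based Bayesian filter. This is a hypothetical illustration, not the authors' implementation: the microphone spacing, the noise level, and the spherical-head ITD model `(d/c)·sin(azimuth)` are all assumptions. The key point it demonstrates is that a single ITD measurement is ambiguous (front/back "phantom"), but measurements taken at different head angles are only jointly consistent with the true source direction.

```python
import numpy as np

# Hypothetical sketch: recursive Bayesian localization over world-frame
# azimuth hypotheses using the ITD cue plus head rotation.
# Assumptions: far-field source, ITD = (D/C)*sin(azimuth relative to head),
# Gaussian ITD measurement noise.

D = 0.18        # inter-microphone distance in metres (assumed)
C = 343.0       # speed of sound, m/s
SIGMA = 2e-5    # ITD measurement noise in seconds (assumed)

angles = np.deg2rad(np.arange(-180, 180))   # world-frame hypothesis grid

def itd(az_rel):
    """ITD for a source at azimuth az_rel relative to the head.
    sin() makes front (az) and back (pi - az) indistinguishable,
    which is the source of the 'phantom' image."""
    return (D / C) * np.sin(az_rel)

def update(prior, measured_itd, head_angle):
    """One recursive Bayes step: weight each world-frame hypothesis by
    how well its predicted ITD (in the current head frame) matches."""
    predicted = itd(angles - head_angle)
    like = np.exp(-0.5 * ((measured_itd - predicted) / SIGMA) ** 2)
    post = prior * like
    return post / post.sum()

# Simulate: true source fixed at +30 deg in the world; the head rotates
# between measurements, which shifts the true source and its phantom
# differently and so disambiguates them.
true_az = np.deg2rad(30)
rng = np.random.default_rng(0)
belief = np.full(angles.size, 1.0 / angles.size)   # flat prior
for head in np.deg2rad([0.0, 20.0, 40.0]):
    z = itd(true_az - head) + rng.normal(0, SIGMA)
    belief = update(belief, z, head)

est = np.rad2deg(angles[np.argmax(belief)])
print(f"estimated azimuth: {est:.0f} deg")   # posterior peaks near +30
```

With a single measurement at head angle 0, the posterior has two peaks (near +30 and +150 degrees); after the head rotates, only the +30 degree hypothesis remains consistent with every measurement, so the phantom is suppressed.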
- Correction: Dynamics of distraction: competition among auditory streams modulates gain and disrupts inter-trial phase coherence in the human electroencephalogram (Public Library of Science, 2013). Ponjavic-Conte, Karla D.; Hambrook, Dillon A.; Pavlovic, Sebastian; Tata, Matthew S.
- Dynamics of distraction: competition among auditory streams modulates gain and disrupts inter-trial phase coherence in the human electroencephalogram (Public Library of Science, 2013). Ponjavic-Conte, Karla D.; Hambrook, Dillon A.; Pavlovic, Sebastian; Tata, Matthew S.
  Auditory distraction is a failure to maintain focus on a stream of sounds. We investigated the neural correlates of distraction in a selective-listening pitch-discrimination task with high (competing speech) or low (white noise) distraction. High distraction impaired performance and reduced the N1 peak of the auditory event-related potential evoked by probe tones. In a series of simulations, we explored two theories to account for this effect: a disruption of sensory gain or a disruption of inter-trial phase consistency. When compared to these simulations, our data were consistent with both effects of distraction. Distraction reduced the gain of the auditory evoked potential and disrupted the inter-trial phase consistency with which the brain responds to stimulus events. Tones at a non-target, unattended frequency were more susceptible to the effects of distraction than tones within an attended frequency band.
- Rendering visual events as sounds: spatial attention capture by auditory augmented reality (Public Library of Science, 2017). Stone, Scott A.; Tata, Matthew S.
  Many salient visual events tend to coincide with auditory events, such as seeing and hearing a car pass by. Information from the visual and auditory senses can be used to create a stable percept of a stimulus, and having access to related coincident visual and auditory information can help with spatial tasks such as localization. However, not all visual information has analogous auditory percepts, such as viewing a computer monitor. Here, we describe a system capable of detecting salient visual events and rendering them as localizable auditory events. The system uses a neuromorphic camera (DAVIS 240B) to detect logarithmic changes of brightness intensity in the scene, which can be interpreted as salient visual events. Participants were blindfolded and asked to use the device to detect new objects in the scene, as well as to determine the direction of motion of a moving visual object. Results suggest the system is robust enough to allow for the simple detection of new salient stimuli, as well as accurate encoding of the direction of visual motion. Neuromorphic devices are likely to become faster and smaller, making systems such as this increasingly feasible.
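The visual-to-auditory pipeline described above can be sketched in software. This is a hypothetical illustration, not the authors' system: a real DAVIS 240B emits per-pixel events asynchronously in hardware, whereas here we difference two frames in log-intensity; the change threshold and the constant-power pan law mapping event position to left/right gains are assumptions.

```python
import numpy as np

# Hypothetical sketch: emit "events" where log brightness changes by more
# than a threshold (as a neuromorphic sensor does per pixel in hardware),
# then map the events' mean column to stereo gains so the visual event
# can be heard at a corresponding azimuth.

W, H = 240, 180          # DAVIS 240B sensor resolution
THETA = 0.15             # log-intensity change threshold (assumed)

def events(prev_frame, frame):
    """Pixels whose log intensity changed by more than THETA."""
    dlog = np.log(frame + 1e-6) - np.log(prev_frame + 1e-6)
    ys, xs = np.nonzero(np.abs(dlog) > THETA)
    return xs, ys, np.sign(dlog[ys, xs])       # x, y, polarity

def pan(xs):
    """Constant-power left/right gains from the mean event column."""
    p = (xs.mean() / (W - 1)) * 2 - 1          # -1 (far left) .. +1 (far right)
    ang = (p + 1) * np.pi / 4
    return np.cos(ang), np.sin(ang)            # (gain_left, gain_right)

# Simulate: a bright object appears on the right half of a static scene.
rng = np.random.default_rng(1)
prev = rng.uniform(0.4, 0.6, (H, W))
cur = prev.copy()
cur[60:100, 180:220] *= 2.0                    # brightness step -> events
xs, ys, pol = events(prev, cur)
g_left, g_right = pan(xs)
# Only the changed region fires events, and since it lies on the right,
# the right channel gain dominates the left.
```

In a real-time version, the two gains would scale an auditory token (e.g. a click or tone burst) sent to headphones, so each burst of events is heard at the azimuth of the visual change.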
- Mental rotational ability is correlated with spatial but not verbal working memory performance and P300 amplitude in males (Public Library of Science, 2013). Christie, Gregory J.; Cook, Charles M.; Ward, Brian J.; Tata, Matthew S.; Sutherland, Janice; Sutherland, Robert J.; Saucier, Deborah M.
  This study investigated how both sex and individual differences in a mental rotation test (MRT) influence performance on working memory (WM). To identify the neural substrate supporting these differences, brain electrical activity was measured using the event-related potential technique. No significant sex differences were observed in a test of verbal WM; however, males were significantly faster than females to respond to probe stimuli in a test of spatial WM. This difference was no longer significant after controlling for differences in MRT score, suggesting that rotational ability mediates performance in the spatial memory task for both sexes. A posterior P300 was observed in both tasks as participants encoded information into memory; however, the amplitude of the P300 correlated with RT in the spatial task but not in the verbal task. Individual differences in the MRT also correlated with RT and with the amplitude of the P300, but again only in the spatial task. After splitting the analysis by sex, partial correlations controlling for MRT revealed that for males, individual differences in rotational ability completely mediated the correlation between the P300 and RT in the spatial task. This mediating effect was not observed for the female participants. The results therefore suggest a relatively stronger association in males between innate mental rotational ability, spatial memory performance, and brain electrophysiological processes supporting spatial memory.