Neural and Perceptual Bases of Auditory Scene Analysis
Project Description
We effortlessly parse an incoming acoustic waveform into perceptual objects (such as words or notes) or streams (such as speech or melodies), yet very little is known about the underlying neural processing beyond the level of the cochlea. This project aims to uncover neural correlates of auditory object and stream formation by combining behavioral (psychoacoustic) measures with measures of brain activation, using magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI), in acoustic situations where listeners hear one, two, or many auditory streams.
Selected Publications
Allen EJ, Burton PC, Mesik J, Olman CA, Oxenham AJ (2019). Cortical correlates of attention to auditory features. J Neurosci 39:3292-3300. PMCID: PMC6788818.
Ruggles DR, Tausend AN, Shamma SA, Oxenham AJ (2018). Cortical markers of auditory stream segregation revealed for streaming based on tonotopy but not pitch. J Acoust Soc Am 144:2424-2433.
David M, Tausend AN, Strelcyk O, Oxenham AJ (2018). Effect of age and hearing loss on auditory stream segregation of speech sounds. Hear Res 364:118-128. PMCID: PMC5984159.