Bimodal Cochlear Implant Listeners' Ability to Perceive Minimal Audible Angle Differences.
J Am Acad Audiol. 2018 Nov 12. [Epub ahead of print]
Authors: Zaleski-King A, Goupell MJ, Barac-Cikoja D, Bakke M
Abstract
BACKGROUND: Bilateral inputs should ideally improve sound localization and speech understanding in noise. However, for many bimodal listeners [i.e., individuals using a cochlear implant (CI) with a contralateral hearing aid (HA)], such bilateral benefits are, at best, inconsistent. The degree to which clinically available HA and CI devices can function together to preserve interaural time and level differences (ITDs and ILDs, respectively) well enough to support the localization of sound sources is a question with important ramifications for speech understanding in complex acoustic environments.
PURPOSE: To determine if bimodal listeners are sensitive to changes in spatial location in a minimum audible angle (MAA) task.
RESEARCH DESIGN: Repeated-measures design.
STUDY SAMPLE: Seven adult bimodal CI users (28-62 years). All listeners reported regular use of digital HA technology in the nonimplanted ear.
DATA COLLECTION AND ANALYSIS: Seven bimodal listeners were asked to balance the loudness of prerecorded single-syllable utterances. The loudness-balanced stimuli were then presented via the direct audio inputs of the two devices with an ITD applied. The listeners' judgments were used to determine the perceived difference in processing delay (the interdevice delay [IDD]) between the CI and HA devices. Finally, virtual free-field MAA performance was measured for different spatial locations, both with and without inclusion of the IDD correction, which was added with the intent to perceptually synchronize the devices.
RESULTS: During the loudness-balancing task, all listeners required increased acoustic input to the HA relative to the CI most comfortable level to achieve equal interaural loudness. During the ITD task, three listeners could perceive changes in intracranial position by distinguishing sounds coming from the left or from the right hemifield; when the CI was delayed by 0.73, 0.67, or 1.7 msec, the signal lateralized from one side to the other. When MAA localization performance was assessed, only three of the seven listeners consistently achieved above-chance performance, even when an IDD correction was included. It is not clear whether the listeners who were able to consistently complete the MAA task did so via binaural comparison or by extracting monaural loudness cues. Four listeners could not perform the MAA task, even though they could have used a monaural loudness cue strategy.
CONCLUSIONS: These data suggest that sound localization is extremely difficult for most bimodal listeners. This difficulty does not seem to be caused by large loudness imbalances or IDDs. Sound localization is best when performed via a binaural comparison, in which frequency-matched inputs convey ITD and ILD information. Although low-frequency acoustic amplification with an HA, when combined with a CI, may produce an overlapping region of frequency-matched inputs and thus provide an opportunity for binaural comparisons for some bimodal listeners, our study showed that this may not be beneficial or useful for spatial location discrimination tasks. The inability of our listeners to use monaural level cues to perform the MAA task highlights the difficulty of using an HA and CI together to glean information about the direction of a sound source.
PMID: 30417825 [PubMed - as supplied by publisher]