Abstract
Cochlear implants (CIs) convey fundamental-frequency information primarily through temporal cues. However, temporal pitch perception in CI users is weak and, when measured with rate discrimination tasks, deteriorates markedly as the rate increases beyond 300 pulses per second (pps). Rate pitch may be weak because electrical stimulation of the implant recipient's surviving neural population does not allow accurate coding of inter-pulse time intervals. If so, this limitation should also prevent listeners from detecting when a pulse train is physically temporally jittered. Performance in a jitter detection task was compared to that in a rate-pitch discrimination task. Stimuli were delivered using direct stimulation in cochlear implants, on a mid-array and an apical electrode, and at two different rates (100 and 300 pps). Average performance on both tasks was worse at the higher pulse rate and did not depend on electrode. However, there was large variability across and within listeners that did not correlate between the two tasks, suggesting that rate-pitch judgement and regularity detection are to some extent limited by task-specific processes. Simulations with filtered pulse trains presented to normal-hearing (NH) listeners yielded broadly similar results, except that, for the rate discrimination task, the difference between performance with 100- and 300-pps base rates was smaller than that observed for CI users.
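The abstract does not specify how the temporal jitter was applied. A minimal sketch of one common approach, independently perturbing each inter-pulse interval by a uniform fraction of its nominal duration, might look like the following (the function name, the jitter distribution, and all parameter values here are illustrative assumptions, not the authors' method):

```python
import random

def jittered_pulse_times(rate_pps, duration_s, jitter_frac, seed=0):
    """Return pulse onset times (seconds) for a pulse train at rate_pps.

    Each inter-pulse interval is independently scaled by a factor drawn
    uniformly from [1 - jitter_frac, 1 + jitter_frac]. With jitter_frac = 0
    the train is perfectly periodic. NOTE: this is a hypothetical sketch of
    one plausible jitter scheme, not the stimulus generation from the study.
    """
    rng = random.Random(seed)
    nominal_ipi = 1.0 / rate_pps
    n_pulses = int(round(duration_s * rate_pps))
    times, t = [], 0.0
    for _ in range(n_pulses):
        times.append(t)
        t += nominal_ipi * (1.0 + rng.uniform(-jitter_frac, jitter_frac))
    return times

# Unjittered reference trains at the two base rates used in the study
reference_100 = jittered_pulse_times(100, 0.5, 0.0)  # periodic, 100 pps
reference_300 = jittered_pulse_times(300, 0.5, 0.0)  # periodic, 300 pps

# A jittered comparison train (10% interval jitter, value chosen arbitrarily)
jittered_100 = jittered_pulse_times(100, 0.5, 0.10)
```

In a two-interval detection task, a listener would then be asked to distinguish a periodic train (jitter fraction 0) from a jittered one, with the jitter fraction adaptively varied to estimate a detection threshold.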