
Analysis of human auditory system on sound source localization using auditory-evoked potentials

Abstract
To improve human auditory perception, many audio processing algorithms have been implemented in audio devices such as sound bars and high-fidelity headsets. Accordingly, it is necessary to explore the effect of such algorithms on actual human perception. In this paper, we analyze the properties of the human auditory system using two electroencephalography (EEG) experiments. First, the magnitude spectra of the EEG signals in several cortical regions were investigated depending on the sound rendering method. Then, we observed the effects of amplitude panning applied to virtual sound source localization in the human brain using auditory-evoked potentials. In particular, the panned audio sources were generated using the stereophonic law of sines, a typical amplitude panning technique for stereo loudspeaker listening environments. Depending on the azimuth direction, the event-related potentials (ERPs) elicited by actually located sound sources and by virtually panned ones were compared. The experiments imply that subjects can distinguish actually located sound sources according to their azimuth direction, whereas it is difficult to distinguish between virtually panned sound sources. Finally, it was also demonstrated that sound source localization performance can be measured using EEG signals.
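
For reference, the sketch below illustrates the stereophonic law of sines mentioned in the abstract, sin(θ)/sin(θ0) = (gL − gR)/(gL + gR), where θ is the intended source azimuth and θ0 is the half-angle of the loudspeaker base. This is not the authors' experimental code; the ±30° base angle, the sign convention (positive azimuth toward the left loudspeaker), and the constant-power gain normalization are assumptions added for illustration.

```python
import numpy as np

def law_of_sines_gains(theta_deg, theta0_deg=30.0):
    """Left/right loudspeaker gains for a virtual source at azimuth theta_deg
    (degrees) using the stereophonic law of sines:

        sin(theta) / sin(theta0) = (gL - gR) / (gL + gR)

    theta0_deg is the half-angle of the loudspeaker base (assumed 30 deg for a
    standard +/-30 deg stereo setup). The law only fixes the gain ratio, so the
    gains are additionally normalized here so that gL**2 + gR**2 = 1
    (constant-power panning, an assumption for this sketch)."""
    theta = np.radians(theta_deg)
    theta0 = np.radians(theta0_deg)
    ratio = np.sin(theta) / np.sin(theta0)  # equals (gL - gR) / (gL + gR)
    # Fix gL + gR = 1 to solve for the individual gains, then renormalize.
    gL = (1.0 + ratio) / 2.0
    gR = (1.0 - ratio) / 2.0
    norm = np.hypot(gL, gR)
    return gL / norm, gR / norm

if __name__ == "__main__":
    # Panning gains for a few azimuths inside the loudspeaker base.
    for az in (-30, -15, 0, 15, 30):
        gL, gR = law_of_sines_gains(az)
        print(f"azimuth {az:+3d} deg -> gL = {gL:.3f}, gR = {gR:.3f}")
```

At 0° the two gains are equal (a phantom center source), and at ±30° the signal is routed entirely to one loudspeaker, which is the expected behavior of amplitude panning over a stereo pair.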
Author(s)
Chun, Chan Jun; Cho, Hohyun; Jun, Sung Chan; Kim, Hong Kook
Issued Date
2015-07
Type
Article
URI
https://scholar.gist.ac.kr/handle/local/14648
Publisher
Asia Life Sciences
Citation
Asia Life Sciences, pp.635 - 644
ISSN
0117-3375
Appears in Collections:
Department of AI Convergence > 1. Journal Articles
Department of Electrical Engineering and Computer Science > 1. Journal Articles
Access and License
  • Access: Open
File List
  • No related files are available.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.