OAK

EEG 신호 간 유사도 분석을 위한 DTW-N 기법 적용 연구 (A Study on the Application of the DTW-N Technique for Similarity Analysis between EEG Signals)

Author(s)
이가흔; 임철기; 백종화; 전성찬; 이성한; 서현
Type
Article
Citation
의공학회지, v.46, no.2, pp.208-214
Issued Date
2025-04
Abstract
Although Euclidean Distance (ED) has limitations in fully capturing the inherent similarity between signals, it has demonstrated higher accuracy in personal identification than Dynamic Time Warping (DTW) when applied in Electroencephalogram (EEG) signal-based authentication systems. In this study, we aim to compare the performance of the ED, DTW, and DTW-Normalization (DTW-N) algorithms in assessing EEG signal similarity. Furthermore, this study evaluates the effects of normalization on similarity measurement across different channels, participants, and signal counts. EEG data were collected from ten participants during speech tasks with auditory stimuli, and all 32 EEG channels were analyzed. The evaluation metric is an indicator used to quantitatively evaluate the difference between signals from the same subject and signals from different subjects; a higher value indicates a greater difference in signal similarity. DTW-N achieved the highest metric values compared to ED and DTW. Across all channels, DTW-N showed the highest values, with the FC1 channel having the highest average DTW-N value of 3.4110 × 10⁻². Additionally, the metric for participants 3 and 9 reached 4.7225 × 10⁻⁵, approximately 55.79% higher than the DTW-N mean, while the metric for participants 7 and 8 was the lowest at 4.7225 × 10⁻⁵. As the number of signals increased, the metric values decreased. The DTW-N algorithm effectively addressed temporal distortion and amplitude variations in EEG signals, making it highly effective for distinguishing individuals based on EEG patterns. Future research will explore optimal representative metrics for EEG data and enhance individual identification performance using DTW-N-based classification models.
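
To make the comparison in the abstract concrete, the sketch below contrasts the three measures on a pair of toy signals: Euclidean Distance (ED), classic dynamic-programming DTW, and one plausible reading of DTW-N in which each signal is z-score normalized before DTW. The normalization choice, the toy signals, and all function names here are illustrative assumptions; the abstract does not specify the paper's exact DTW-N formulation.

```python
# Minimal sketch of ED, DTW, and an assumed DTW-N (z-score normalization
# followed by DTW). Illustrative only; not the paper's implementation.
import numpy as np

def euclidean_distance(x, y):
    """Point-wise Euclidean distance; requires equal-length signals."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.sqrt(np.sum((x - y) ** 2)))

def dtw_distance(x, y):
    """Classic DTW via dynamic programming with |x_i - y_j| local cost."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

def dtw_n_distance(x, y):
    """Assumed DTW-N: z-score each signal, then apply DTW."""
    def zscore(s):
        s = np.asarray(s, float)
        return (s - s.mean()) / (s.std() + 1e-12)
    return dtw_distance(zscore(x), zscore(y))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 256)
    # Toy "EEG-like" pair: same 10 Hz rhythm, but the second signal is
    # time-shifted and amplitude-rescaled relative to the first.
    a = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
    b = 3.0 * np.sin(2 * np.pi * 10 * (t - 0.02)) + 0.1 * rng.standard_normal(t.size)
    print("ED   :", euclidean_distance(a, b))
    print("DTW  :", dtw_distance(a, b))
    print("DTW-N:", dtw_n_distance(a, b))
```

The three outputs are not on a common scale, so the meaningful comparison is within each measure across signal pairs. On this toy pair, DTW's warping absorbs the time shift that ED penalizes point by point, and the normalized variant additionally discounts the 3× amplitude rescaling that raw DTW still accumulates, mirroring the abstract's claim that DTW-N handles both temporal distortion and amplitude variation.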
Publisher
대한의용생체공학회
ISSN
1229-0807
DOI
10.9718/JBER.2025.46.2.208
URI
https://scholar.gist.ac.kr/handle/local/31600
Access and License
  • Access type: Open
File List
  • No related files are available.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.