
Deep-Learning-Based Medication Action Recognition Using Information Fusion between 3D Skeleton Coordinate and Object-Classification of Hand-ROI

Abstract
As the elderly population grows in an aging society, the number of patients with chronic diseases is also increasing, and automated sensor-based medication monitoring systems are therefore under continuous development. In this paper, we propose a robot-based scenario in which a robot guides medication by interacting directly with a patient, going beyond simply monitoring behavior, and we perform action recognition within this scenario. Previous studies lacked robustness to variations in human behavior and had difficulty distinguishing similar actions. In addition, action recognition in a mobile-robot scenario faces a camera viewpoint that changes continuously, unlike conventional action-recognition settings. To address this, we present an action-recognition algorithm that collects and fuses time-series information by applying action segmentation, extracting a hand ROI, and adding object information to the joint (skeleton) information. Compared with state-of-the-art algorithms on a dataset that includes such variations, the proposed method improves the action recognition rate by 20%, reaching 97%. Action segmentation makes the system far more robust to variation than existing algorithms, and the hand ROI supplements action recognition with object information that is difficult to capture from the skeleton alone, freeing the system from dependence on a background that changes with the viewpoint.
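The full text is not attached to this record, so the sketch below is only an illustrative reading of the fusion idea described in the abstract: per-frame 3D skeleton coordinates are concatenated with object-class probabilities from a hand-ROI classifier and fed to a temporal model. The module name, layer sizes, class counts, and the choice of a GRU are assumptions, not the thesis implementation.

```python
# Minimal sketch (assumed, not the thesis code) of late fusion between
# 3D skeleton coordinates and hand-ROI object-classification outputs.
import torch
import torch.nn as nn


class FusionActionRecognizer(nn.Module):
    def __init__(self, num_joints=25, num_object_classes=3, num_actions=10, hidden=128):
        super().__init__()
        # Per-frame skeleton feature: (x, y, z) of every joint, flattened.
        skel_dim = num_joints * 3
        # Hypothetical hand-ROI classifier output: per-frame class probabilities
        # (e.g., pill bottle, cup, none) produced by a separate image model.
        fused_dim = skel_dim + num_object_classes
        # Temporal model over the fused per-frame feature sequence.
        self.gru = nn.GRU(fused_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_actions)

    def forward(self, skeleton_seq, object_prob_seq):
        # skeleton_seq:    (batch, time, num_joints * 3)
        # object_prob_seq: (batch, time, num_object_classes)
        fused = torch.cat([skeleton_seq, object_prob_seq], dim=-1)  # fusion by concatenation
        _, h_n = self.gru(fused)
        return self.head(h_n[-1])  # action logits for the whole segment


# Usage with stand-in data for one 60-frame action segment.
model = FusionActionRecognizer()
skel = torch.randn(1, 60, 25 * 3)
obj = torch.softmax(torch.randn(1, 60, 3), dim=-1)
print(model(skel, obj).shape)  # torch.Size([1, 10])
```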
Author(s)
Yundong Lee
Issued Date
2022
Type
Thesis
URI
https://scholar.gist.ac.kr/handle/local/19063
Alternative Author(s)
이윤동
Department
Graduate School, Interdisciplinary School of Integrated Technology (Intelligent Robotics Program)
Advisor
Kim, Mun Sang
Degree
Master
Appears in Collections:
Department of AI Convergence > 3. Theses(Master)
Access and License
  • Access status: Public
File List
  • No related files available.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.