
Automatic Sign Dance Generation with Diffusion Based Inpainting

Author(s)
Eunhee Kim
Type
Thesis
Degree
Master
Department
Graduate School, Interdisciplinary Division of Integrated Technology (Culture Technology Program)
Advisor
Kim, KyungJoong
Abstract
Our research presents a novel method for generating sign language for music. Our approach automatically generates sign language that reflects both the musical and lyrical aspects of a song, producing full-body sign-dance choreography. We created the first sign-dance dataset, which is unique in its emphasis on lyric-driven sign language and its inclusion of full-body dance movements. The dataset emphasizes the expressiveness of terminal joints, such as the fingers, and rhythmic body movement in response to music, making it a valuable contribution to sign language research. Our methodology builds on a dance-generating diffusion model and integrates spatio-temporal motion inpainting into the sampling process. This approach keeps body movement naturally synchronized with the music while preserving the key joint movements essential for sign language. To validate our approach, we conducted quantitative and qualitative evaluations, including a user study with certified sign language translators. The results demonstrate that our method aligns effectively with the music beat and has potential applications in enhancing sign language translation workflows.
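
The abstract describes preserving sign-essential joints through motion inpainting during diffusion sampling. The following is a minimal sketch of one common way such spatio-temporal masking can be realized (RePaint-style replacement of known joints at each denoising step); the function names, noise schedule, mask layout, and conditioning interface are illustrative assumptions, not the thesis implementation.

```python
# Illustrative sketch (not the thesis code): at every denoising step, joints
# marked as sign-essential (e.g. hands and fingers) are overwritten with the
# reference sign motion noised to the matching level, while the remaining
# joints are generated by the music-conditioned diffusion model.
import numpy as np

def inpaint_sample(model, betas, sign_motion, joint_mask, music_feats, rng=None):
    """
    model(x_t, t, music_feats) -> predicted noise, same shape as x_t (assumed API)
    betas:       (T,) diffusion noise schedule
    sign_motion: (frames, joints, 3) reference sign-language motion to preserve
    joint_mask:  (frames, joints, 1) 1 where sign motion must be kept, else 0
    music_feats: conditioning features extracted from the song (assumed given)
    """
    rng = rng or np.random.default_rng(0)
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)

    x = rng.standard_normal(sign_motion.shape)           # start from pure noise
    for t in reversed(range(len(betas))):
        eps = model(x, t, music_feats)                    # model's noise estimate
        # standard DDPM reverse-step mean for the generated (unmasked) joints
        coef = betas[t] / np.sqrt(1.0 - alpha_bar[t])
        mean = (x - coef * eps) / np.sqrt(alphas[t])
        noise = rng.standard_normal(x.shape) if t > 0 else 0.0
        x_gen = mean + np.sqrt(betas[t]) * noise

        # forward-diffuse the known sign motion to noise level t-1
        if t > 0:
            known = (np.sqrt(alpha_bar[t - 1]) * sign_motion
                     + np.sqrt(1.0 - alpha_bar[t - 1])
                     * rng.standard_normal(sign_motion.shape))
        else:
            known = sign_motion

        # spatio-temporal inpainting: keep sign joints, generate the rest
        x = joint_mask * known + (1.0 - joint_mask) * x_gen
    return x
```

In this formulation the mask can vary over both joints and frames, so hand and finger trajectories that carry lyrical meaning stay fixed while the torso, legs, and timing-flexible segments are free to follow the music conditioning.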
URI
https://scholar.gist.ac.kr/handle/local/18958
Fulltext
http://gist.dcollection.net/common/orgView/200000880210
Alternative Author(s)
김은희
Appears in Collections:
Department of AI Convergence > 3. Theses(Master)
Access and License
  • Access status: Open
File List
  • No related files are available.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.