
GraspClutter6D: A Large-Scale Real-World Dataset for Robust Perception and Grasping in Cluttered Scenes

Author(s)
Back, Seunghyeok; Lee, Joosoon; Kim, Kangmin; Rho, Heeseon; Lee, Geonhyup; Kang, Raeyoung; Lee, Sangbeom; Noh, Sangjun; Lee, Youngjin; Lee, Taeyeop; Lee, Kyoobin
Type
Article
Citation
IEEE ROBOTICS AND AUTOMATION LETTERS, v.10, no.10, pp.10498 - 10505
Issued Date
2025-10
Abstract
Robust grasping in cluttered environments remains an open challenge in robotics. While benchmark datasets have significantly advanced deep learning methods, they mainly focus on simplistic scenes with light occlusion and insufficient diversity, limiting their applicability to practical scenarios. We present GraspClutter6D, a large-scale real-world grasping dataset featuring: (1) 1,000 highly cluttered scenes with dense arrangements (14.1 objects/scene, 62.6% occlusion), (2) comprehensive coverage across 200 objects in 75 environment configurations (bins, shelves, and tables) captured using four RGB-D cameras from multiple viewpoints, and (3) rich annotations including 736K 6D object poses and 9.3B feasible robotic grasps for 52K RGB-D images. We benchmark state-of-the-art segmentation, object pose estimation, and grasp detection methods to provide key insights into challenges in cluttered environments. Additionally, we validate the dataset's effectiveness as a training resource, demonstrating that grasping networks trained on GraspClutter6D significantly outperform those trained on existing datasets in both simulation and real-world experiments.
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI
10.1109/LRA.2025.3601045
URI
https://scholar.gist.ac.kr/handle/local/32093
Access and License
  • Access type: Open
File List
  • No associated files are available.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.