Online 3D Deformable Object Classification for Mobile Cobot Manipulation

Conference: ISR Europe 2023 - 56th International Symposium on Robotics
26.09.2023-27.09.2023 in Stuttgart, Germany

Proceedings: ISR Europe 2023

Pages: 8
Language: English
Type: PDF

Authors:
Nguyen, Khang; Dang, Tuan; Huber, Manfred (Learning and Adaptive Robotics Laboratory, Department of Computer Science and Engineering, University of Texas at Arlington, Texas, USA)

Abstract:
Vision-based object manipulation in assistive mobile cobots essentially relies on classifying the target objects based on their 3D shapes and features, whether they are deformed or not. In this work, we present an auto-generated dataset of deformed objects specifically for assistive mobile cobot manipulation, built with an intuitive Laplacian-based mesh deformation procedure. We first determine the graspable region of the robot hand on the given object’s mesh. Then, we uniformly sample handle points within the graspable region and perform deformation with multiple handle points based on the robot gripper configuration. In each deformation, we identify the orientation of the handle points and prevent self-intersection to guarantee the object’s physical meaning when multiple handle points are simultaneously applied to the mesh at different deformation intensities. We also introduce a lightweight neural network for 3D deformable object classification. Finally, we test our generated dataset on the Baxter robot with two 7-DOF arms, an integrated RGB-D camera, and a 3D deformable object classifier. The results show that the robot is able to classify real-world deformed objects from point clouds captured at multiple views by the RGB-D camera. The source code is available at https://github.com/mkhangg/deformable_cobot.
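The deformation procedure described above is based on Laplacian mesh editing with handle points. As a rough illustration of that general idea (not the authors' implementation), the sketch below deforms a triangle mesh by solving a least-squares system that preserves the mesh's Laplacian coordinates while softly pulling a set of handle vertices toward target positions. The function names, the use of a uniform (rather than cotangent) Laplacian, and the soft-constraint weight `w` are assumptions made only for this example.

```python
import numpy as np
from scipy.sparse import lil_matrix, csr_matrix, vstack
from scipy.sparse.linalg import lsqr


def uniform_laplacian(n_vertices, faces):
    """Build a uniform (graph) Laplacian from triangle faces."""
    L = lil_matrix((n_vertices, n_vertices))
    for tri in faces:
        for i in range(3):
            a, b = tri[i], tri[(i + 1) % 3]
            L[a, b] = -1.0
            L[b, a] = -1.0
    L = csr_matrix(L)
    deg = -np.asarray(L.sum(axis=1)).ravel()  # vertex degrees
    L.setdiag(deg)
    return L


def laplacian_deform(vertices, faces, handle_ids, handle_targets, w=10.0):
    """Move handle vertices to target positions while preserving local
    shape (Laplacian coordinates) in a least-squares sense."""
    V = np.asarray(vertices, dtype=float)
    targets = np.asarray(handle_targets, dtype=float)
    n = len(V)

    L = uniform_laplacian(n, faces)
    delta = L @ V                             # original Laplacian coordinates

    # Soft positional constraints on the handle vertices.
    C = lil_matrix((len(handle_ids), n))
    for row, vid in enumerate(handle_ids):
        C[row, vid] = w
    A = vstack([L, csr_matrix(C)])

    # Solve each coordinate (x, y, z) independently.
    V_new = np.zeros_like(V)
    for dim in range(3):
        b = np.concatenate([delta[:, dim], w * targets[:, dim]])
        V_new[:, dim] = lsqr(A, b)[0]
    return V_new
```

In practice, cotangent weights and the handle-point orientations and intensities described in the abstract would replace the simplifications here; the sketch only conveys the constrained least-squares structure common to Laplacian-based deformation.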