Improving the Generalizability of Robot Assembly Tasks Learned from Demonstration via CNN-based Segmentation

Inigo Iturrate*, Etienne Roberge, Esben Hallundbak Ostergaard, Vincent Duchaine, Thiusius Rajeeth Savarimuthu

*Corresponding author for this work

Publication: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer review

Abstract

Kinesthetic teaching and Dynamic Movement Primitives (DMPs) enable fast and adaptable learning of robot tasks based on a human demonstration. A task encoded as a dynamic movement primitive can be reused with a different goal position, albeit with a resulting distortion in the approach trajectory with regard to the original task. While this is sufficient for some robotic applications, the accuracy requirements for assembly tasks in an industrial context, where tolerances are tight and workpieces are small, are much higher. In such a context it is also preferable to keep the number of demonstrations and of external sensors low. Our approach relies on a single demonstration and a single force-torque sensor at the robot tool. We make use of a Convolutional Neural Network (CNN) trained on the force-torque sensor data to segment the task into several movement primitives for the different phases: pickup - approach - insertion - retraction, allowing us to achieve better positional accuracy when generalizing the task primitives to new targets. To the best of our knowledge, we are the first to utilize a CNN as a segmentation tool to improve the generalization performance of DMPs.
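For context, the goal-generalization behaviour described in the abstract follows from the standard discrete DMP transformation system (after Ijspeert et al.); the sketch below is an assumed textbook formulation, not necessarily the exact variant used in the paper:

\tau \dot{z} = \alpha_z \left( \beta_z (g - y) - z \right) + f(x), \qquad \tau \dot{y} = z, \qquad \tau \dot{x} = -\alpha_x x

Here y is the robot state, g the goal, x a phase variable, and f(x) a forcing term learned from the demonstration, commonly scaled by (g - y_0). Because g enters both the spring-damper term and the forcing-term scaling, reusing a primitive with a new goal reshapes the whole trajectory rather than rigidly shifting it; this is the approach-trajectory distortion that segmenting the task into pickup, approach, insertion and retraction primitives is intended to mitigate.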

Original language: English
Title: 2019 IEEE 15th International Conference on Automation Science and Engineering, CASE 2019
Publisher: IEEE
Publication date: 2019
Pages: 553-560
ISBN (Print): 978-1-7281-0357-0
ISBN (Electronic): 9781728103556, 978-1-7281-0356-3
DOI
Status: Published - 2019
Event: 15th IEEE International Conference on Automation Science and Engineering, CASE 2019 - Vancouver, Canada
Duration: 22 Aug 2019 – 26 Aug 2019

Conference

Conference: 15th IEEE International Conference on Automation Science and Engineering, CASE 2019
Country: Canada
City: Vancouver
Period: 22/08/2019 – 26/08/2019
Sponsors: ABB Robotics, University of Wisconsin–Madison, et al., IEEE, IEEE Robotics and Automation Society, National Science Foundation (NSF)
Series: IEEE International Conference on Automation Science and Engineering
Volume: 2019-August
ISSN: 2161-8070

