Abstract
This paper presents a method for segmentation and classification of kinesthetic demonstrations of robot peg-in-hole tasks using a deep neural network. The presented method depends only on sensor data that is readily available on collaborative robots (kinematic state and force-torque readings) and does not need any additional sensors. Our method can be used to automatically derive program structures from a single demonstration of a robot task. This can reduce programming time and make it easier to revise sections of a larger task. We introduce a combined architecture consisting of a Convolutional Neural Network block for raw feature extraction and a Long Short-Term Memory block for tracking the time-evolution of these features. We also extend the model from binary insertion segmentation to multi-class segmentation covering pick-up, place, free-air motion, insertion, and extraction. Through an ablation study and a comparison to previous work, we show that the new model performs better on the test set (which contains a single demonstrator) and significantly better on a generalization set of ten previously unseen demonstrators, reaching an overall accuracy of 70%, which is 15 percentage points better than the next-best method.
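The CNN-plus-LSTM architecture summarized above can be sketched as follows. This is a minimal illustrative model, not the authors' implementation: the channel count (assuming 7 joint signals plus 6 force-torque readings), layer sizes, and kernel sizes are all hypothetical, written here in PyTorch.

```python
import torch
import torch.nn as nn

class CNNLSTMSegmenter(nn.Module):
    """Sketch: 1-D CNN feature extractor followed by an LSTM that emits
    per-timestep class logits (hypothetical layer sizes)."""

    def __init__(self, n_channels=13, n_classes=5, hidden=64):
        super().__init__()
        # Convolutions over the time axis extract local features from the
        # raw kinematic and force-torque channels.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # The LSTM tracks the time-evolution of the extracted features.
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        # Linear head maps each timestep to one of the five phase classes
        # (pick-up, place, free-air motion, insertion, extraction).
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, channels); Conv1d expects (batch, channels, time)
        f = self.cnn(x.transpose(1, 2)).transpose(1, 2)
        h, _ = self.lstm(f)
        return self.head(h)  # (batch, time, n_classes) logits

model = CNNLSTMSegmenter()
logits = model(torch.randn(2, 100, 13))  # 2 demonstrations, 100 timesteps
```

Taking an argmax over the class dimension of `logits` yields a per-timestep segmentation of the demonstration.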
Original language | English |
---|---|
Title | 2023 IEEE 19th International Conference on Automation Science and Engineering (CASE) |
Number of pages | 7 |
Publisher | IEEE Computer Society |
Publication date | 2023 |
ISBN (Electronic) | 9798350320695 |
DOI | |
Status | Published - 2023 |
Event | 19th IEEE International Conference on Automation Science and Engineering, CASE 2023 - Auckland, New Zealand. Duration: 26 Aug 2023 → 30 Aug 2023 |
Conference
Conference | 19th IEEE International Conference on Automation Science and Engineering, CASE 2023 |
---|---|
Country/Territory | New Zealand |
City | Auckland |
Period | 26/08/2023 → 30/08/2023 |
Sponsor | Beckhoff, CTEK - Combined Technologies |
Name | Proceedings - IEEE International Conference on Automation Science and Engineering |
---|---|
Volume | 2023-August |
ISSN | 2161-8070 |
Bibliographical note
Publisher Copyright: © 2023 IEEE.