TY - GEN
T1 - Autonomous Bi-Manual Surgical Suturing Based on Skills Learned from Demonstration
AU - Schwaner, Kim Lindberg
AU - Iturrate, Iñigo
AU - Andersen, Jakob Kristian Holm
AU - Jensen, Pernille Tine
AU - Savarimuthu, Thiusius Rajeeth
N1 - Conference code: 34
PY - 2021
Y1 - 2021
N2 - We present a novel application of Learning from Demonstration to realize a fully autonomous bi-manual surgical suturing task, including needle pick up, insertion, re-grasping, extraction and hand-over. Surgical action primitives are learned from a single human demonstration and encoded into an action library from which they are pulled to compose more elaborate tasks at planning/execution time. The method is demonstrated in a non-clinical setting, using unmodified surgical instruments with a custom surgical robot system. We use stereo vision to automatically detect the suture needle and entry points to close the control loop and generalize tasks to different task conditions. The suturing task is shown to generalize well to differing initial conditions with a success rate of 17% for the full task, a mean subtask success rate of 75% and mean needle insertion error of 3.3 mm over the course of 46 trial task executions at human speed. Failures could all be attributed to erroneous vision-based detection, pose estimation and robot calibration.
DO - 10.1109/IROS51168.2021.9636432
M3 - Article in proceedings
T3 - IEEE International Conference on Intelligent Robots and Systems. Proceedings
SP - 4017
EP - 4024
BT - 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
PB - IEEE
T2 - 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Y2 - 27 September 2021 through 1 October 2021
ER -