TY - GEN
T1 - Detecting Worker Attention Lapses in Human-Robot Interaction
T2 - 2023 28th International Conference on Automation and Computing (ICAC)
AU - Dai, ZhuangZhuang
AU - Park, Jinha
AU - Kaszowska, Aleksandra
AU - Li, Chen
N1 - Funding Information:
ACKNOWLEDGMENT This work was funded by 2022/23 Aston Pump Priming Scheme and AAU Bridging Project “A Multimodal Attention Tracking In Human-robot Collaboration For Manufacturing Tasks.” We thank the Aalborg 5G Smart Production Lab for supporting our data collection campaign.
PY - 2023
Y1 - 2023
N2 - The advent of industrial robotics and autonomous systems enables human-robot collaboration on a massive scale. However, current industrial robots are restricted from working alongside humans in close proximity because they cannot interpret human agents' attention. Studying human attention is non-trivial since it involves multiple aspects of the mind: perception, memory, problem solving, and consciousness. Human attention lapses are particularly problematic, and potentially catastrophic, in industrial workplaces, from assembling electronics to operating machinery. Attention is complex and cannot be easily measured with single-modality sensors. Eye state, head pose, posture, and manifold environmental stimuli can all play a part in attention lapses. To this end, we propose a pipeline to annotate a multimodal dataset for human attention tracking, including eye tracking, fixation detection, third-person surveillance video, and sound. We produce a pilot dataset containing two fully annotated phone assembly sequences in a realistic manufacturing environment. We evaluate existing fatigue and drowsiness prediction methods for attention lapse detection. Experimental results show that human attention lapses in production scenarios are more subtle and less perceptible than well-studied fatigue and drowsiness.
AB - The advent of industrial robotics and autonomous systems enables human-robot collaboration on a massive scale. However, current industrial robots are restricted from working alongside humans in close proximity because they cannot interpret human agents' attention. Studying human attention is non-trivial since it involves multiple aspects of the mind: perception, memory, problem solving, and consciousness. Human attention lapses are particularly problematic, and potentially catastrophic, in industrial workplaces, from assembling electronics to operating machinery. Attention is complex and cannot be easily measured with single-modality sensors. Eye state, head pose, posture, and manifold environmental stimuli can all play a part in attention lapses. To this end, we propose a pipeline to annotate a multimodal dataset for human attention tracking, including eye tracking, fixation detection, third-person surveillance video, and sound. We produce a pilot dataset containing two fully annotated phone assembly sequences in a realistic manufacturing environment. We evaluate existing fatigue and drowsiness prediction methods for attention lapse detection. Experimental results show that human attention lapses in production scenarios are more subtle and less perceptible than well-studied fatigue and drowsiness.
KW - Human Attention Monitoring
KW - Eye Tracking
KW - Industrial Robots
KW - Human-Robot Interaction
U2 - 10.1109/ICAC57885.2023.10275177
DO - 10.1109/ICAC57885.2023.10275177
M3 - Article in proceedings
AN - SCOPUS:85175554257
SP - 558
EP - 563
BT - 2023 28th International Conference on Automation and Computing (ICAC)
PB - IEEE
Y2 - 30 August 2023 through 1 September 2023
ER -