Multisensory guidance of goal-oriented behaviour of legged robots

Publication: Contribution to book/anthology/report/conference proceeding › Conference article in proceedings › Research › peer-reviewed

Abstract

Biological systems often combine cues from two different sensory modalities to execute goal-oriented sensorimotor tasks that cannot be accurately executed with either sensory stream in isolation. When auditory cues alone are not sufficient to accurately localise an audio-visual target by orienting towards it, visual cues can complement their auditory counterparts and improve localisation accuracy. We present a multisensory goal-oriented locomotion control architecture that uses visual feedback to adaptively improve the acoustomotor orientation response of the hexapod robot AMOS II. The robot is tasked with localising an audio-visual target by turning towards it. The architecture extracts sound-direction information with a model of the peripheral auditory system of lizards to modulate the locomotion control parameters driving the turning behaviour. The visual information adaptively changes the strength of the acoustomotor coupling to adjust the turning speed of the robot. Our experiments demonstrate improved orientation towards an audio-visual target emitting a 2.2 kHz tone located at an angular offset of 45 degrees from the robot.
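The control loop described in the abstract can be sketched roughly as follows. This is an illustrative approximation only, not the paper's implementation: the function names, the log-ratio direction cue, and the linear gain law are assumptions standing in for the lizard-ear model and the adaptive acoustomotor coupling.

```python
import math


def lizard_ear_direction(left_signal, right_signal):
    """Approximate sound-direction cue from the two ear signals.

    Hypothetical simplification: the lizard peripheral auditory system
    acoustically couples both eardrums, amplifying interaural differences.
    Here that output is reduced to the log amplitude ratio (in dB) of the
    two ear signals; positive values mean the source is to the left.
    """
    rms = lambda s: math.sqrt(sum(x * x for x in s) / len(s))
    return 20.0 * math.log10(rms(left_signal) / rms(right_signal))


def turning_command(direction_db, visual_offset_deg,
                    base_gain=0.1, visual_scale=0.02):
    """Map the auditory direction cue to a turning speed.

    The visual offset of the target adaptively scales the acoustomotor
    coupling gain: the further the target sits from the visual centre,
    the stronger the coupling and the faster the turn.
    """
    gain = base_gain * (1.0 + visual_scale * abs(visual_offset_deg))
    return gain * direction_db
```

Under these assumed gains, the same auditory cue produces a proportionally faster turn when the target is seen 45 degrees off-centre than when it is visually centred, which is the qualitative behaviour the abstract reports.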
Original language: English
Title: Human-centric Robotics: Proceedings of the 20th International Conference on CLAWAR 2017
Editors: Manuel F. Silva, Gurvinder S. Virk, Mohammad O. Tokhi, Benedita Malheiro, Paulo Ferreira, Pedro Guedes
Publisher: World Scientific
Publication date: 2017
Pages: 97-105
ISBN (print): 978-981-3231-03-0
ISBN (electronic): 978-981-3231-05-4, 978-981-3231-04-7
DOI: 10.1142/9789813231047_0015
Status: Published - 2017
Event: 20th International Conference on Climbing and Walking Robots and Support Technologies for Mobile Machines - Instituto Superior de Engenharia do Porto, Porto, Portugal
Duration: 11 Sep 2017 - 13 Sep 2017
https://clawar.org/clawar2017/

Conference

Conference: 20th International Conference on Climbing and Walking Robots and Support Technologies for Mobile Machines
Location: Instituto Superior de Engenharia do Porto
Country: Portugal
City: Porto
Period: 11/09/2017 - 13/09/2017
Internet address: https://clawar.org/clawar2017/

Fingerprint

Robots
Biological systems
Acoustic waves
Feedback
Experiments

Cite this

Shaikh, D., Manoonpong, P., Tuxworth, G., & Bodenhagen, L. (2017). Multisensory guidance of goal-oriented behaviour of legged robots. In M. F. Silva, G. S. Virk, M. O. Tokhi, B. Malheiro, P. Ferreira, & P. Guedes (Eds.), Human-centric Robotics: Proceedings of the 20th International Conference on CLAWAR 2017 (pp. 97-105). World Scientific. https://doi.org/10.1142/9789813231047_0015
Shaikh, Danish ; Manoonpong, Poramate ; Tuxworth, Gervase ; Bodenhagen, Leon. / Multisensory guidance of goal-oriented behaviour of legged robots. Human-centric Robotics: Proceedings of the 20th International Conference on CLAWAR 2017. Ed. / Manuel F. Silva ; Gurvinder S. Virk ; Mohammad O. Tokhi ; Benedita Malheiro ; Paulo Ferreira ; Pedro Guedes. World Scientific, 2017. pp. 97-105
@inproceedings{b346bb0c574947d6a9d73b4a6560f83a,
title = "Multisensory guidance of goal-oriented behaviour of legged robots",
abstract = "Biological systems often combine cues from two different sensory modalities to execute goal-oriented sensorimotor tasks, which otherwise cannot be accurately executed with either sensory stream in isolation. When auditory cues alone are not sufficient to accurately localise an audio-visual target by orienting towards it, visual cues can complement their auditory counterparts and improve localisation accuracy. We present a multisensory goal-oriented locomotion control architecture that uses visual feedback to adaptively improve acoustomotor orientation response of the hexapod robot AMOS II. The robot is tasked with localising an audio-visual target by turning towards it. The architecture extracts sound direction information with a model of the peripheral auditory system of lizards to modulate locomotion control parameters driving the turning behaviour. The visual information adaptively changes the strength of the acoustomotor coupling to adjust turning speed of the robot. Our experiments demonstrate improved orientation towards the audio-visual target emitting a tone of frequency 2.2kHz located at an angular offset of 45 degrees from the robot.",
keywords = "lizard peripheral auditory system, audio-visual localisation, multisensory integration, hexapedal locomotion",
author = "Danish Shaikh and Poramate Manoonpong and Gervase Tuxworth and Leon Bodenhagen",
year = "2017",
doi = "10.1142/9789813231047_0015",
language = "English",
isbn = "978-981-3231-03-0",
pages = "97--105",
editor = "Silva, {Manuel F.} and Virk, {Gurvinder S.} and Tokhi, {Mohammad O.} and Benedita Malheiro and Paulo Ferreira and Pedro Guedes",
booktitle = "Human-centric Robotics",
publisher = "World Scientific",

}

Shaikh, D, Manoonpong, P, Tuxworth, G & Bodenhagen, L 2017, Multisensory guidance of goal-oriented behaviour of legged robots. in MF Silva, GS Virk, MO Tokhi, B Malheiro, P Ferreira & P Guedes (eds), Human-centric Robotics: Proceedings of the 20th International Conference on CLAWAR 2017. World Scientific, pp. 97-105, 20th International Conference on Climbing and Walking Robots and Support Technologies for Mobile Machines, Porto, Portugal, 11/09/2017. https://doi.org/10.1142/9789813231047_0015

Multisensory guidance of goal-oriented behaviour of legged robots. / Shaikh, Danish; Manoonpong, Poramate; Tuxworth, Gervase; Bodenhagen, Leon.

Human-centric Robotics: Proceedings of the 20th International Conference on CLAWAR 2017. Ed. / Manuel F. Silva; Gurvinder S. Virk; Mohammad O. Tokhi; Benedita Malheiro; Paulo Ferreira; Pedro Guedes. World Scientific, 2017. pp. 97-105.


TY - GEN

T1 - Multisensory guidance of goal-oriented behaviour of legged robots

AU - Shaikh, Danish

AU - Manoonpong, Poramate

AU - Tuxworth, Gervase

AU - Bodenhagen, Leon

PY - 2017

Y1 - 2017

N2 - Biological systems often combine cues from two different sensory modalities to execute goal-oriented sensorimotor tasks, which otherwise cannot be accurately executed with either sensory stream in isolation. When auditory cues alone are not sufficient to accurately localise an audio-visual target by orienting towards it, visual cues can complement their auditory counterparts and improve localisation accuracy. We present a multisensory goal-oriented locomotion control architecture that uses visual feedback to adaptively improve acoustomotor orientation response of the hexapod robot AMOS II. The robot is tasked with localising an audio-visual target by turning towards it. The architecture extracts sound direction information with a model of the peripheral auditory system of lizards to modulate locomotion control parameters driving the turning behaviour. The visual information adaptively changes the strength of the acoustomotor coupling to adjust turning speed of the robot. Our experiments demonstrate improved orientation towards the audio-visual target emitting a tone of frequency 2.2kHz located at an angular offset of 45 degrees from the robot.

AB - Biological systems often combine cues from two different sensory modalities to execute goal-oriented sensorimotor tasks, which otherwise cannot be accurately executed with either sensory stream in isolation. When auditory cues alone are not sufficient to accurately localise an audio-visual target by orienting towards it, visual cues can complement their auditory counterparts and improve localisation accuracy. We present a multisensory goal-oriented locomotion control architecture that uses visual feedback to adaptively improve acoustomotor orientation response of the hexapod robot AMOS II. The robot is tasked with localising an audio-visual target by turning towards it. The architecture extracts sound direction information with a model of the peripheral auditory system of lizards to modulate locomotion control parameters driving the turning behaviour. The visual information adaptively changes the strength of the acoustomotor coupling to adjust turning speed of the robot. Our experiments demonstrate improved orientation towards the audio-visual target emitting a tone of frequency 2.2kHz located at an angular offset of 45 degrees from the robot.

KW - lizard peripheral auditory system

KW - audio-visual localisation

KW - multisensory integration

KW - hexapedal locomotion

U2 - 10.1142/9789813231047_0015

DO - 10.1142/9789813231047_0015

M3 - Article in proceedings

SN - 978-981-3231-03-0

SP - 97

EP - 105

BT - Human-centric Robotics

A2 - Silva, Manuel F.

A2 - Virk, Gurvinder S.

A2 - Tokhi, Mohammad O.

A2 - Malheiro, Benedita

A2 - Ferreira, Paulo

A2 - Guedes, Pedro

PB - World Scientific

ER -

Shaikh D, Manoonpong P, Tuxworth G, Bodenhagen L. Multisensory guidance of goal-oriented behaviour of legged robots. In Silva MF, Virk GS, Tokhi MO, Malheiro B, Ferreira P, Guedes P, editors, Human-centric Robotics: Proceedings of the 20th International Conference on CLAWAR 2017. World Scientific. 2017. p. 97-105. https://doi.org/10.1142/9789813231047_0015