Abstract
Perceptual anchoring traditionally relies on data from sensors mounted on a mobile robot. This places the sensors close to objects in the environment, making it possible to acquire details with high accuracy. IoT sensors are becoming increasingly ubiquitous and are found in both private and public buildings. IoT cameras are often mounted on ceilings or walls, allowing them to observe a larger part of the environment than robot-mounted sensors, but making them unsuitable for acquiring detailed visual information; to remain cost-effective, they often also have a lower sampling rate. We hypothesize that IoT and robot sensors can be combined, embracing ubiquitous sensing, to exploit both the detail of the robot sensors and the immediate overview provided by the IoT sensors. In this work, we evaluate and compare different methods for associating IoT and robot sensing data, including a novel context-based similarity measure and a simple geometric baseline. The results support our hypothesis: all methods outperform the baseline method in most scenarios. Using context similarity is most beneficial for the affinity propagation clustering algorithm in setups with 16 and 12 objects. These results can serve as a guideline for designing anchoring or world-modeling systems that use IoT and robot sensing data.
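The "simple geometric baseline" mentioned in the abstract can be illustrated with a minimal sketch: greedy nearest-neighbour association of IoT and robot detections by Euclidean distance. The detection coordinates, identifiers, and the `max_dist` threshold below are hypothetical, chosen only for illustration; the paper's actual methods (including the context-based similarity measure) are not reproduced here.

```python
import math

# Hypothetical 2D detections of the same scene from an IoT camera
# and a robot-mounted sensor (identifiers and coordinates invented).
iot = {"a": (0.0, 0.0), "b": (2.0, 2.0), "c": (5.0, 1.0)}
robot = {"x": (0.1, -0.1), "y": (2.1, 1.9), "z": (4.8, 1.2)}

def associate(iot_dets, robot_dets, max_dist=0.5):
    """Greedily pair each IoT detection with its nearest unmatched
    robot detection, if one lies within max_dist."""
    pairs = []
    unmatched = dict(robot_dets)
    for iot_id, (ix, iy) in iot_dets.items():
        best_id, best_d = None, max_dist
        for rob_id, (rx, ry) in unmatched.items():
            d = math.hypot(ix - rx, iy - ry)
            if d < best_d:
                best_id, best_d = rob_id, d
        if best_id is not None:
            pairs.append((iot_id, best_id))
            del unmatched[best_id]
    return pairs

matches = associate(iot, robot)
# Each pair groups an IoT detection and a robot detection believed
# to refer to the same physical object.
```

A purely geometric rule like this ignores appearance and context, which is precisely the gap a context-based similarity measure aims to close.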
Original language | English |
---|---|
Title | 2024 21st International Conference on Ubiquitous Robots, UR 2024 |
Publisher | IEEE |
Publication date | Jun. 2024 |
Pages | 550-557 |
ISBN (Electronic) | 9798350361070 |
DOI | |
Status | Published - Jun. 2024 |
Event | 21st International Conference on Ubiquitous Robots, UR 2024 - New York, USA. Duration: 24 Jun. 2024 → 27 Jun. 2024 |
Conference
Conference | 21st International Conference on Ubiquitous Robots, UR 2024 |
---|---|
Country/Territory | USA |
City | New York |
Period | 24/06/2024 → 27/06/2024 |
Bibliographical note
Publisher Copyright: © 2024 IEEE.