Robots acting in everyday environments need good knowledge of how a manipulation action can affect pairs of objects in a relationship such as ‘inside’, ‘behind’, or ‘on top’. These relationships afford certain means-end actions, such as pulling a container to retrieve its contents, or pulling a tool to retrieve a desired object. We investigate how these relational affordances could be learnt by a robot from its own action experience. A major challenge in this approach is to reduce the number of training samples needed to achieve accurate predictions, and hence we investigate an approach which can leverage past knowledge to accelerate current learning (which we call bootstrapping). We learn Random Forest based affordance predictors from visual inputs and demonstrate two approaches to knowledge transfer for bootstrapping. In the first approach (direct bootstrapping), the state-space for a new affordance predictor is augmented with the outputs of previously learnt affordance predictors. In the second approach (category based bootstrapping), we form categories that capture underlying commonalities of a pair of existing affordances and augment the state-space with this category classifier’s output. In addition, we introduce a novel heuristic which suggests how a large set of potential affordance categories can be pruned, leaving only those categories that are most promising for bootstrapping future affordances. Our results show that both bootstrapping approaches outperform learning without bootstrapping, and that there is no significant difference in performance between direct and category based bootstrapping.
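The direct bootstrapping idea described in the abstract can be sketched as follows: a Random Forest predictor for a previously learnt affordance is trained first, and its output is appended to the feature vector (the state-space) before training a predictor for a new affordance. This is a minimal illustrative sketch using scikit-learn; the synthetic features, affordance labels, and hyperparameters below are assumptions for demonstration, not the paper's actual visual features or experimental setup.

```python
# Illustrative sketch of "direct bootstrapping" (assumed setup, not the
# paper's actual data): augment a new affordance predictor's state-space
# with the output of a previously learnt affordance predictor.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical visual features for object pairs (e.g. relative pose,
# size ratios): 200 samples, 6 features.
X = rng.normal(size=(200, 6))

# Previously learnt affordance, e.g. "target is inside the container".
y_inside = (X[:, 0] + X[:, 1] > 0).astype(int)
inside_clf = RandomForestClassifier(n_estimators=50, random_state=0)
inside_clf.fit(X, y_inside)

# New affordance, e.g. "pulling the container retrieves the target";
# in this toy setup it depends on the 'inside' relation plus one more feature.
y_pull = ((X[:, 0] + X[:, 1] > 0) & (X[:, 2] > 0)).astype(int)

# Direct bootstrapping: append the prior predictor's output probability
# to the feature vector before training the new predictor.
inside_out = inside_clf.predict_proba(X)[:, 1:]   # shape (200, 1)
X_aug = np.hstack([X, inside_out])                # augmented state-space

pull_clf = RandomForestClassifier(n_estimators=50, random_state=0)
pull_clf.fit(X_aug, y_pull)
print(X_aug.shape)
```

Category based bootstrapping follows the same augmentation pattern, except that the appended output comes from a classifier trained on a category formed from a pair of existing affordances rather than from a single prior affordance predictor.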
|Journal||IEEE Transactions on Cognitive and Developmental Systems|
|Publication status||Published - Mar 2018|
- Artificial intelligence
- autonomous mental development
- intelligent systems
Fichtl, S., Kraft, D., Krüger, N., & Guerin, F. (2018). Bootstrapping relational affordances of object pairs using transfer. IEEE Transactions on Cognitive and Developmental Systems, 10(1), 56-71. https://doi.org/10.1109/TCDS.2016.2616496