Service robots will be deployed in the future as general assistive devices in dynamic human environments such as households, schools and hospitals. To be valuable and cost-effective assistants, robots must allow a wide range of customization, especially regarding their skills. As pre-programming robots for every situation is impossible, robots need to gain new skills and adapt their behavior after deployment, possibly by interacting with domain experts (e.g. therapists and nurses in hospitals) who lack the technical expertise required to shape a robot's behavior. We therefore believe robots should learn from people in a natural and effective way.
To this end, we investigated the use of Active Learning, a Machine Learning technique that enables robots to actively participate in the learning process by making queries to their end users. Focusing on both the Machine Learning and the Human-Robot Interaction perspectives, we developed an Active Learning technique for learning temporal task models (in our HRI'18 paper), investigated the effects of different query selection mechanisms on the user's and the robot's performance (in our HRI'19 paper), and developed an Active Learning-aided End-User Programming framework for the tuning of robotic primitives (in our HRI'20 paper).
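To give a flavor of the core idea, here is a minimal, self-contained sketch of active learning on a toy problem. It is not the method from any of the papers above; the function names and the 1D-threshold task are purely illustrative. The learner maintains an interval of uncertainty about a decision threshold and always queries the teacher at its most uncertain point, which shrinks uncertainty exponentially faster than asking random questions.

```python
# Illustrative sketch only (not the papers' method): pool-style active
# learning of a 1D decision threshold via uncertainty sampling.
# All names below (oracle, active_learn_threshold) are hypothetical.

def oracle(x, true_threshold=0.62):
    """Stands in for the human teacher: answers a yes/no query."""
    return x >= true_threshold

def active_learn_threshold(query_budget, lo=0.0, hi=1.0):
    """Repeatedly query the point the learner is most uncertain about
    (the midpoint of the current uncertainty interval)."""
    for _ in range(query_budget):
        mid = (lo + hi) / 2.0   # most informative query point
        if oracle(mid):         # ask the teacher
            hi = mid            # threshold is at or below mid
        else:
            lo = mid            # threshold is above mid
    return (lo + hi) / 2.0      # current best estimate

estimate = active_learn_threshold(query_budget=10)
print(f"estimated threshold: {estimate:.3f}")  # close to 0.62
```

With 10 queries the uncertainty interval shrinks to about 1/1024 of its original width; a passive learner labeling random points would need far more teacher effort for the same accuracy, which is exactly the trade-off between query informativeness and teacher workload that our HRI'19 study examined.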
- M. Racca, V. Kyrki and M. Cakmak, “Interactive Tuning of Robot Parameter Programs via Expected Divergence Maximization”, accepted to the 15th ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, United Kingdom, 2020. (HRI’20) <preprint> <video> <code>
- M. Racca, A. Oulasvirta and V. Kyrki, “Teacher-Aware Active Robot Learning”, in Proceedings of the 2019 14th ACM/IEEE International Conference on Human-Robot Interaction, Daegu, Korea (South), 2019, pp. 335-343. (HRI’19) <DOI link> <preprint> <code> <slides>
- M. Racca and V. Kyrki, “Active Robot Learning for Temporal Task Models”, in Proceedings of the 2018 13th ACM/IEEE International Conference on Human-Robot Interaction, ACM, New York, NY, USA, 2018, pp. 123-131. (HRI’18) <free access link> <slides> <code> <preprint>