When studying the use of assistive robots in home environments, and especially how such robots can be personalised to meet the needs of the resident, key concerns are issues related to behaviour verification, behaviour interference and safety. Here, personalisation refers to the teaching of new robot behaviours by both technical and non-technical end users. In this article, we consider the issue of behaviour interference, caused by situations where newly taught robot behaviours may affect, or be affected by, existing behaviours, such that those behaviours will not, or might not, ever be executed. We focus in particular on how such situations can be detected and presented to the user. We describe the human–robot behaviour teaching system that we developed, as well as the formal behaviour checking methods used. The online use of behaviour checking, based on static analysis of behaviours during the operation of the robot, is demonstrated and evaluated in a user study. We conducted a proof-of-concept human–robot interaction study with an autonomous, multi-purpose robot operating within a smart-home environment. Twenty participants individually taught the robot behaviours according to instructions they were given, some of which caused interference with other behaviours. A mechanism for detecting behaviour interference provided feedback to participants and suggestions on how to resolve those conflicts. We assessed the participants' views on detected interference as reported by the behaviour teaching system. Results indicate that interference warnings given to participants during teaching provoked an understanding of the issue. We did not find a significant influence of participants' technical background. These results highlight a promising path towards verification and validation of assistive home companion robots that allow end-user personalisation.

Keywords: human–robot interaction; companion robots; behaviour interference; formal verification

1 Introduction

A long-term goal of robotics research is the use of assistive robots in the home. Such robots have started to appear in various guises, ranging from stationary helpers to cleaning robots to robotic companions. In previous research, companion robots have been designed to serve useful functions for their users while carrying out those tasks in a socially acceptable manner. The combination of an autonomous mobile robot and a "smart-home" environment, where the robot is able to extend its capabilities via access to the home sensor network, has been investigated in a number of large-scale projects, motivated by the use of robotic solutions to address the cost and care concerns resulting from an ageing population.

The use of home assistance robots to help older adults stay independent in their homes faces many challenges. One of these challenges is to allow the robot to be personalised, e.g. so that the robot can be taught to change its functional behaviours in response to the changing needs of the older adult. End-user personalisation is an active area of research, since it has been recognised that robots used "in the field" need to allow users to adapt, modify and teach them. Our previous research investigated these issues, proposing and evaluating a teaching system in a human–robot interaction (HRI) experiment. Results were encouraging, showing the potential of the system to be used easily by a number of stakeholders, including health professionals, formal and informal carers, relatives, friends and the older persons themselves, to adapt the robot to meet changing needs.
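The behaviour checking described here statically compares a newly taught behaviour against the robot's existing repertoire and warns the user about interference. As a loose illustration only, and not the paper's actual formal method, the following Python sketch models behaviours as priority-ordered precondition rules and flags a new behaviour that can never execute because a higher-priority behaviour triggers in all of the same situations. All names (`Behaviour`, `shadowed_by`, `check_interference`, and the example behaviours) are invented for this sketch.

```python
from dataclasses import dataclass

@dataclass
class Behaviour:
    name: str
    priority: int            # higher value wins when several behaviours match
    preconditions: frozenset # sensor facts that must all hold to trigger

def shadowed_by(new: Behaviour, existing: Behaviour) -> bool:
    """True if `new` can never run: `existing` has higher priority and
    triggers whenever `new` does (its preconditions are a subset)."""
    return (existing.priority > new.priority
            and existing.preconditions <= new.preconditions)

def check_interference(new: Behaviour, repertoire: list) -> list:
    """Return human-readable warnings, as a teaching UI might present them."""
    warnings = []
    for b in repertoire:
        if shadowed_by(new, b):
            warnings.append(
                f"'{new.name}' will never execute: '{b.name}' has higher "
                f"priority and matches the same situations. Consider raising "
                f"the priority of '{new.name}' or adding a distinguishing "
                f"precondition.")
        if shadowed_by(b, new):
            warnings.append(
                f"'{new.name}' would prevent existing behaviour "
                f"'{b.name}' from ever executing.")
    return warnings

# Example: a newly taught reminder is shadowed by an existing behaviour.
existing = Behaviour("answer_doorbell", 5, frozenset({"doorbell_rang"}))
new = Behaviour("remind_medication", 3,
                frozenset({"doorbell_rang", "evening"}))
for warning in check_interference(new, [existing]):
    print(warning)
```

In this toy model the check is a simple subset test over preconditions, which is enough to surface the "this behaviour can never fire" conflicts that participants in the study were warned about; a real system would reason over richer behaviour representations.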