With the advent of machine intelligence, more robots now work in physical contact with humans. These collaborative robots are found in manufacturing, in assistive devices for physically impaired individuals, and in surgical aids. However, today's contact robots engage in little or no truly interactive physical behaviour with their human users.
In its first issue, Nature Machine Intelligence reports a systematic methodology for producing versatile behaviours in contact robots, developed by researchers from Imperial College London, the University of Sussex and Nanyang Technological University in Singapore. The new study shows how the joint team developed, for the first time, an interactive robot controller able to observe human users and react optimally to their movements.
Helping humans is difficult because their motions are constantly changing, imprecise and unpredictable. Thus, current contact robots are typically either fully controlled by the operator in master-slave mode, reproducing the operator's movements at a remote location, or used to amplify the human's force, as in exoskeletons that help workers carry heavy loads. Alternatively, the robot operates independently of the human user, as in current rehabilitation robots. These approaches drastically reduce the possibilities for assisting humans in carrying out tasks.
The team set out to investigate how a contact robot should be controlled to provide a stable and appropriate response to a user with unknown control behaviour, during activities ranging from sports training to physical rehabilitation and shared driving. They developed a versatile interactive motion behaviour for a robot in physical contact with a human user, with which the two agents can react optimally to each other by learning each other's control. The reactive robotic programming system enables the robot to continuously learn the human user's control and adapt its own control accordingly. This allows the robot to 'understand' the human user's actions and respond in a way that helps the user perform tasks successfully, with minimal physical and mental effort.
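The idea of continuously learning the user's control and adapting the robot's own assistance can be illustrated with a deliberately simplified sketch. The one-dimensional model, the least-squares gain estimate and the `L_total` parameter below are illustrative assumptions, not the controller from the paper: the robot estimates the human's feedback gain from observed behaviour and supplies only the complementary assistance, so a capable user receives less help and a struggling user receives more.

```python
import numpy as np

def estimate_human_gain(errors, human_forces):
    """Least-squares fit of the hypothesised human law u_h = -L_h * e."""
    e = np.asarray(errors)
    u = np.asarray(human_forces)
    return -np.dot(e, u) / np.dot(e, e)

def robot_gain(L_h, L_total=10.0):
    """Robot contributes whatever gain the human does not (never negative)."""
    return max(L_total - L_h, 0.0)

# Simulated interaction: a human with true gain 4.0 tracking a target,
# observed through noisy force measurements.
rng = np.random.default_rng(0)
errors = rng.normal(size=200)
human_forces = -4.0 * errors + 0.1 * rng.normal(size=200)

L_h = estimate_human_gain(errors, human_forces)   # recovers roughly 4.0
L_r = robot_gain(L_h)                             # robot fills the gap
print(L_h, L_r)
```

In a real controller the estimate would be updated online at every control step rather than in one batch, but the complementary division of effort is the same.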
Associate Professor Domenico Campolo, Director of the Robotics Research Centre at Nanyang Technological University and co-author of the paper, said: "Imagine a situation where two people are carrying a bulky item together, for example a large piece of furniture. A great deal of information is often non-verbal and tacitly exchanged through subtle force interactions. We used this type of information to enable a robot to adapt to its human partner, with possible applications, for example, in robot-aided rehabilitation."
Professor Etienne Burdet, Chair of Human Robotics in the Department of Bioengineering and senior author of the paper, said: "It is not clear how best to assist human motion. When observing how humans physically interact with each other, we found that they succeed in this task very well through their unique ability to understand each other's control. We applied a similar principle to design human-robot interaction."
Dr. Li Yanan, now a Lecturer at the University of Sussex and lead author of the paper, explained: "We enabled the robot to identify the human user's behaviour, so it could exploit game theory to react optimally to any human user."
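The game-theoretic flavour of "reacting optimally to each other" can be sketched with a toy static quadratic game; the cost functions and parameters `q_h`, `q_r` below are illustrative assumptions, not the paper's differential-game formulation. Human and robot each choose a force to reduce a shared tracking error while penalising their own effort, and iterated best responses converge to the Nash equilibrium at which neither agent can improve unilaterally.

```python
def best_response(e, u_other, q):
    """Minimise q*(e + u_self + u_other)**2 + u_self**2 over u_self
    (closed form from setting the derivative to zero)."""
    return -q * (e + u_other) / (1.0 + q)

def nash(e, q_h, q_r, iters=100):
    """Iterate best responses until they settle at the Nash equilibrium."""
    u_h = u_r = 0.0
    for _ in range(iters):
        u_h = best_response(e, u_r, q_h)  # human reacts to robot
        u_r = best_response(e, u_h, q_r)  # robot reacts to human
    return u_h, u_r

# A robot that cares more about the error (q_r > q_h) ends up
# contributing the larger share of the corrective force.
u_h, u_r = nash(e=1.0, q_h=2.0, q_r=4.0)
print(u_h, u_r)
```

For these parameters the fixed point is u_h = -2/7 and u_r = -4/7: the robot, with the stronger error penalty, takes on twice the human's effort, which is the kind of automatic task sharing the controller aims for.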
The joint team will next apply this interactive control behaviour to robot-assisted neurorehabilitation and to shared driving in semi-autonomous vehicles.
For more information, see the paper: https://www.nature.com/articles/s42256-018-0013-0