The new dressing robot can “imitate” the actions of healthcare workers

Scientists have developed a new robot that can “imitate” the two-handed movements of healthcare workers while dressing a person.

Until now, assistive dressing robots, designed to help an elderly or disabled person get dressed, have been developed in the laboratory as one-armed machines, but research has shown that this can be uncomfortable or impractical for the person being dressed.

To address this problem, Dr. Jihong Zhu, a robotics researcher at York University’s Institute for Safe Autonomy, proposed a two-armed dressing-assistance scheme, which had not been attempted in previous research. It was inspired by caregivers, whose practice shows that specific actions are needed to reduce the discomfort and distress of the person in their care.

It is believed that this technology could be important in the social care system to allow care workers to spend less time on practical tasks and more time on people’s mental health and wellbeing.

Dr. Zhu collected important information about how care workers moved during a dressing exercise, allowing a robot to observe and learn from human movements and then, through AI, generate a model that mimics how human helpers perform the task.

This allowed the researchers to collect enough data to show that dressing requires two hands rather than one, along with information about the angles the arms form and the points at which a human needs to intervene to stop or alter certain movements.

Dr Zhu, from the Institute for Safe Autonomy at the University of York and the School of Physics, Engineering and Technology, said: “We know that practical tasks, such as dressing, can be carried out by a robot, freeing the caregiver to focus more on providing companionship and observing the overall wellbeing of the person in their care. It’s been tested in the lab, but for this to work outside the lab, we really needed to understand how care workers did this task in real time.

“We adopted a method called learning from demonstration, which means that an expert is not needed to program the robot: a human only needs to demonstrate the required movement, and the robot learns that action. It was clear that care workers needed two arms to adequately meet the needs of people with different abilities.
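The idea of learning from demonstration described above can be illustrated with a deliberately simple sketch (this is hypothetical toy code, not the team’s actual system): several human demonstrations of the same dressing motion are recorded as waypoint trajectories, and a reference trajectory for the robot is obtained by averaging them.

```python
# Toy learning-from-demonstration sketch (hypothetical, not the authors' code):
# several human demonstrations of an arm motion are averaged into a single
# reference path that a robot could replay.

def learn_from_demonstrations(demos):
    """Average time-aligned demonstrations, each a list of (x, y) waypoints."""
    n = len(demos)
    length = len(demos[0])
    reference = []
    for t in range(length):
        x = sum(d[t][0] for d in demos) / n
        y = sum(d[t][1] for d in demos) / n
        reference.append((x, y))
    return reference

# Three demonstrations of guiding a hand through a sleeve (made-up data).
demos = [
    [(0.0, 0.0), (0.1, 0.2), (0.2, 0.4)],
    [(0.0, 0.1), (0.1, 0.3), (0.2, 0.5)],
    [(0.0, 0.2), (0.1, 0.4), (0.2, 0.6)],
]
reference = learn_from_demonstrations(demos)
print(reference)
```

Real systems fit far richer statistical models to the demonstrations, but the principle is the same: the robot’s motion is derived from observed human movement rather than hand-written by a programmer.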

“One hand holds the individual’s hand to guide it comfortably through the arm of a shirt, for example, while at the same time the other hand moves the garment up and around or over it. With the current single-arm schemes, the patient has to do too much of the work for the robot to help them, moving their arm in the air or bending it in ways that perhaps they couldn’t.”

The team was also able to build algorithms that made the robotic arm flexible enough in its movements to perform the actions of pulling and lifting, while also allowing it to be stopped mid-action by the gentle touch of a human hand, or guided out of an action by a human hand moving the arm left or right, up or down, without the robot resisting.
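This kind of compliant behaviour is often implemented with admittance-style control: the arm tracks a reference position, but an external force from a human hand shifts its motion instead of being resisted. A minimal one-dimensional sketch, entirely hypothetical and not the team’s algorithm:

```python
# Toy admittance-control sketch (hypothetical): the arm moves toward a
# reference position, and a human's push adds a compliant offset rather
# than being fought by the controller.

def admittance_step(position, reference, external_force,
                    k_track=0.5, compliance=0.2):
    """One control step: tracking term plus a yielding term from the push."""
    tracking = k_track * (reference - position)
    yielding = compliance * external_force
    return position + tracking + yielding

# With no human push, the arm converges toward the reference point.
pos = 0.0
for _ in range(20):
    pos = admittance_step(pos, reference=1.0, external_force=0.0)
print(round(pos, 3))

# A steady human push moves the arm away from the reference without resistance.
pushed = admittance_step(1.0, reference=1.0, external_force=-2.0)
print(pushed)
```

The `compliance` gain is what makes the robot yield: setting it to zero would give a stiff arm that ignores the human’s guidance.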

Dr Zhu said: “Human modelling can really help with efficient and safe human-robot interactions, but it is not only important to ensure the robot performs the task; it must be possible to stop or change the action mid-way if the individual requires it. That is an important part of this process, and the next step in this research is to test the safety limitations of the robot and whether it will be accepted by those who need it most.”

The research, in collaboration with researchers from TU Delft and Honda Research Institute Europe, was funded by Honda Research Institute Europe.