Robotic arm helps paraplegic patient eat independently

Wednesday, March 5, 2025
Robotics
News

Scientists at the University of Washington have spent a decade developing a robotic arm to help people eat who, due to illness or paralysis, can no longer handle a knife and fork themselves. Recently, the robotic arm, called ADA (Assistive Dexterous Arm), was tested outside the lab on and by a paraplegic patient.

Robots in healthcare are not new. Surgical robots in many operating rooms not only make surgeries less invasive and more precise, but also enable procedures in places that were previously inaccessible. A relatively new development is the use of social robots in elderly and home care. Work is also under way on robots, or exoskeletons, that help healthcare professionals lift patients out of beds or wheelchairs and allow people with spinal cord injuries to walk independently again. The robotic arm designed at the University of Washington should enable people who can no longer use their arms or hands to eat independently again.

Robotic arm feeds patients

That the development of the ADA (Assistive Dexterous Arm) robotic arm took 10 years has everything to do with the fact that eating is mechanically more complicated than it seems. The first challenge the scientists had to overcome was getting the robotic arm to handle a fork. Once they had succeeded in that, the arm could be developed further so that it could pick up food from a plate and bring it to (and into) a person's mouth.

The robotic arm was then tested in a laboratory setting for an extended period of time. Recently, it was used for the first time in a pair of studies outside the lab. In the first study, six users with motor limitations used the robot to eat a meal in a cafeteria, an office, or a university conference room. In the second study, one of the researchers used the robotic arm at home. How that worked can be seen in the video below.

“Our previous studies took place in the lab because if you want to evaluate specific system components in isolation, you have to control all other aspects of the meal,” says Amal Nanavati, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering.

Development and functionality

The system consists of a robotic arm that can be attached close to the user, for example to an electric wheelchair or a hospital table. Using an app, the user can then indicate what they want to eat from the plate. The robotic arm picks up exactly that bite and brings it to the user's mouth. The arm also has a force sensor and a camera to distinguish between hard and soft foods.
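The article does not publish the system's code, but the idea of using a force reading to distinguish hard from soft foods and adjust the handling strategy can be illustrated with a rough, hypothetical sketch. All names, values, and thresholds below are assumptions for illustration, not the team's actual implementation:

```python
# Hypothetical sketch: classify a probed food item as hard or soft from a
# force-sensor reading and pick a skewering strategy accordingly.
# The threshold and strategy names are illustrative assumptions only.

from dataclasses import dataclass


@dataclass
class Bite:
    label: str            # item the user selected in the app
    probe_force_n: float  # force (newtons) measured when the fork probes it


HARD_FOOD_THRESHOLD_N = 2.0  # assumed cutoff; a real system would calibrate this


def choose_strategy(bite: Bite) -> str:
    """Pick a handling strategy based on the probed stiffness."""
    if bite.probe_force_n >= HARD_FOOD_THRESHOLD_N:
        return "vertical-skewer"  # firm items (e.g. carrot) tolerate more force
    return "angled-tilt"          # soft items (e.g. banana) need a gentler approach


if __name__ == "__main__":
    for item in (Bite("carrot", 3.1), Bite("banana", 0.6)):
        print(item.label, "->", choose_strategy(item))
```

In practice the camera would refine this classification (for example by recognizing the food type before probing), but the basic decision structure stays the same.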

In both studies, users successfully fed themselves. In developing the robotic arm, the scientists accounted for varying conditions during use. In the first study, the robot achieved an accuracy of about 80 percent. In the second study, the system's standard functionality was challenged by the different conditions and environments in the home; these included successful tests of eating in bed and in low light.

“It was a really important step to get the robot out of the lab. You eat in different environments and there are little variables you don't think about. If the robot is too heavy, it can tip over a table. Or if the lighting isn't right, facial recognition might struggle, but lighting is something you really don't think about when you're eating,” Jonathan Ko said.

The team plans to further improve the system in terms of effectiveness and adaptability. The team's research will be presented March 5 at the ACM/IEEE International Conference on Human-Robot Interaction in Melbourne.