Researchers are making progress in the effort to develop safe and practical supernumerary robotic limbs.
- Unlike exoskeletons or prostheses, supernumerary robotic limbs function independently of the human skeleton.
- This new example of the technology attaches to the wearer's hips and can lift 11 pounds.
- The arm currently isn't autonomous. Before A.I. can control supernumerary limbs, researchers must first figure out how to make the technology understand and execute the wearer's intentions.
Supernumerary robotic limbs<p>When movies depict wearable robots, they usually show exoskeletons ("Iron Man") or prostheses (<a href="https://www.youtube.com/watch?v=cik8cl_n9AE" target="_blank">Luke Skywalker's robotic hand</a>). But supernumerary robotic limbs — like the new robotic arm — remain underrepresented, at least in the popular consciousness. The term describes robotic limbs that function independently of the human skeleton and "actively perform tasks similar to or beyond natural human capabilities," as a <a href="https://www.medien.ifi.lmu.de/pubdb/publications/pub/alsada2017amplify/alsada2017amplify.pdf" target="_blank">2017 research paper</a> states.</p><p>One hurdle in developing safe and effective supernumerary robotic limbs is attaching the technology to the body so that it doesn't interfere with the wearer. A robotic arm could throw someone off balance if it swings too fast, for example, or become uncomfortable if it isn't attached strategically.</p><p>With the new robotic arm, the researchers attached the device to the wearer's hips with a rigid harness, close to the body's center of mass. It seems to work well enough, though you can see how a fast swing could still throw someone off balance. The arm must also be physically tethered to a nearby power supply.</p>
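<p>Why mounting matters can be seen with a little lever-arm physics: the torque a payload applies about the wearer's center of mass grows with the distance of the mounting point from it. A minimal sketch, with purely illustrative numbers (the mounting distances below are assumptions, not measurements from the research):</p>

```python
# Hedged sketch: why mounting near the wearer's center of mass matters.
# All numbers are illustrative assumptions, not values from the paper.
G = 9.81  # gravitational acceleration, m/s^2

def tipping_torque(payload_kg: float, moment_arm_m: float) -> float:
    """Torque (N*m) a payload applies about the wearer's center of mass."""
    return payload_kg * G * moment_arm_m

payload = 5.0  # roughly the arm's 11 lb payload, in kg

# Same payload, two hypothetical mounting distances from the center of mass:
at_hip = tipping_torque(payload, 0.15)       # rigid harness at the hips
at_shoulder = tipping_torque(payload, 0.45)  # mounted farther out

print(f"hip mount: {at_hip:.1f} N*m, shoulder mount: {at_shoulder:.1f} N*m")
```

<p>Under these assumed distances, the hip mount cuts the destabilizing torque to a third of the shoulder mount's, which is the intuition behind placing the harness close to the center of mass.</p>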
Robotic limbs and human intent<p>But the biggest obstacle in developing supernumerary robotic limbs lies in artificial intelligence. For a robotic arm (or legs, fingers, etc.) to be practical, the device has to understand and execute what the wearer wants it to do. Here's how <a href="https://www.linkedin.com/in/catherine-v%C3%A9ronneau-7710a3140/?originalSubdomain=ca" target="_blank">Catherine Véronneau</a>, the lead author of a recent paper about the technology, described this problem to <a href="https://spectrum.ieee.org/automaton/robotics/robotics-hardware/robotic-third-arm-can-smash-through-walls" target="_blank">IEEE Spectrum</a>:</p><p style="margin-left: 20px;">"For instance, if the job of a supernumerary pair of arms is opening a door while the user is holding something, the controller should detect when is the right moment to open the door. So, for one particular application, it's feasible. But if we want that SRL to be multifunctional, it requires some AI or intelligent controller to detect what the human wants to do, and how the SRL could be complementary to the user (and act as a coworker). So there are a lot of things to explore in that vast field of 'human intent.'"</p>
This is huge news for the 285 million visually impaired people around the world.
The world's 285 million visually impaired people rarely experience the technological advances and everyday conveniences that sighted people take for granted. From libraries without braille books to street signs and smart devices, navigating life is an entirely different and far more difficult experience. Eric Kim noticed this divide at university, where sighted students could instantly pull information from smart devices while a visually impaired classmate had to lug around heavy braille books, and was inspired to develop a cheap, impairment-friendly smartwatch.