New MIT-developed glove teaches A.I.-based robots to 'identify' everyday objects

Our clever human hands may soon be outdone.

Image source: MIT
  • MIT-affiliated researchers develop a hypersensitive glove that can capture the way in which we handle objects.
  • The data captured by the glove can be "learned" by a neural net.
  • Smart tactile interaction will be invaluable when A.I.-based robots start to interact with objects — and us.

Our hands are amazing things. For sighted people, it may be surprising how good they are at recognizing objects by feel alone, though the vision-impaired know perfectly well how informative hands can be. Looking ahead to a future in which A.I.-based robotic agents become fully adept at interacting with the physical world — including with us — scientists are investigating new ways to teach machines to perceive the world around them.

With that in mind, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have hit upon a clever approach: a tactile glove that collects data on how our hands feel objects, in a form A.I. systems can learn from. It could prove invaluable to designers of robots and prosthetics, and could make robots' interactions with humans safer.

Subramanian Sundaram, the lead author of the paper, which was published in Nature on May 29, says, "Humans can identify and handle objects well because we have tactile feedback. As we touch objects, we feel around and realize what they are. Robots don't have that rich feedback. We've always wanted robots to do what humans can do, like doing the dishes or other chores. If you want robots to do these things, they must be able to manipulate objects really well."

The STAG

The "scalable tactile glove," or "STAG," that the CSAIL scientists are using for data-gathering contains 550 tiny pressure sensors. These track and capture how hands interact with objects as they touch, move, pick up, put down, drop, and feel them. The resulting data is fed into a convolutional neural network for learning. So far, the team has taught their system to recognize 26 everyday objects — among them a soda can, a pair of scissors, a tennis ball, a spoon, a pen, and a mug — with an impressive 76 percent accuracy rate. The STAG system can also predict objects' weights to within about 60 grams.
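To get a feel for how a classifier can recognize objects from pressure patterns, here is a deliberately simplified sketch on synthetic data. The real system trains a convolutional neural network on roughly 135,000 recorded frames; as a toy stand-in, this uses three made-up "objects," a 16x16 pressure grid, and a nearest-centroid rule. All numbers and data here are invented for illustration, not drawn from the STAG dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for STAG data: each "object" produces pressure frames that
# cluster around a characteristic contact pattern.
n_objects, frames_per_object, grid = 3, 50, 16

def make_frames():
    base = rng.random((grid, grid))          # characteristic pressure pattern
    noise = rng.normal(0, 0.1, (frames_per_object, grid, grid))
    return base + noise                      # noisy grasps of the same object

data = [make_frames() for _ in range(n_objects)]

# "Training": average each object's frames into a centroid pressure map.
centroids = np.stack([frames.mean(axis=0) for frames in data])

def classify(frame):
    # Predict the object whose centroid map is closest in pressure space.
    dists = np.linalg.norm(centroids - frame, axis=(1, 2))
    return int(np.argmin(dists))

print(classify(data[1][0]))  # 1
```

A nearest-centroid rule is far weaker than the CNN the researchers actually used, but it shows the core idea: different objects leave reliably different pressure signatures on the hand.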

There are other tactile gloves available, but the CSAIL glove is different. While other versions tend to be very expensive — costing in the thousands of dollars — this one is made from just $10 worth of readily available materials. In addition, other gloves typically sport a meager 50 or so sensors, and are thus not nearly as sensitive or informative.

The STAG is laminated with an electrically conductive polymer whose resistance changes as pressure is applied. Woven into the glove are overlapping conductive threads; each point where two threads cross acts as a pressure sensor, registering the change in resistance there. When the wearer touches an object, the glove picks up each point of contact as a pressure reading.
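The resistance-to-pressure relationship can be sketched numerically. The model below is purely illustrative (it is not the actual STAG electronics, and the constants `R0` and `K` are invented): it assumes each thread crossing behaves like a resistor whose conductance rises linearly with pressure, and shows how a readout circuit could invert measured resistance back into pressure.

```python
import numpy as np

R0 = 1000.0   # assumed resistance (ohms) of an untouched crossing
K = 50.0      # assumed sensitivity constant linking pressure to conductance

def pressure_from_resistance(r_matrix):
    """Invert the assumed sensor model R = R0 / (1 + K*p) to recover p."""
    return (R0 / r_matrix - 1.0) / K

# Simulate a touch on a 4x4 patch of crossings: one firm press, one light one.
pressure_true = np.zeros((4, 4))
pressure_true[1, 2] = 0.4   # firm contact
pressure_true[1, 1] = 0.1   # light contact
r_measured = R0 / (1.0 + K * pressure_true)   # what the circuit would see

recovered = pressure_from_resistance(r_measured)
print(round(float(recovered[1, 2]), 3))  # 0.4
```

The key point survives the simplification: because pressure maps monotonically onto resistance at each crossing, scanning the grid of crossings recovers a full pressure image of the hand.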

Touching stuff

Image source: Jackie Niam/Shutterstock

An external circuit creates "tactile maps" of pressure points, brief videos that depict each contact point as a dot sized according to the amount of pressure applied. The 26 objects assessed so far were mapped out to some 135,000 video frames that show dots growing and shrinking at different points on the hand. That raw dataset had to be massaged in a few ways for optimal recognition by the neural network. (A separate dataset of around 11,600 frames was developed for weight prediction.)
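The dot-based "tactile map" idea is easy to mimic in a few lines. This sketch (synthetic frame, hypothetical threshold and scaling) keeps only sensors above a contact threshold and sizes each dot in proportion to its pressure, which is essentially what one frame of the researchers' videos depicts.

```python
import numpy as np

def frame_to_dots(pressure, threshold=0.05, max_radius=10.0):
    """Turn one pressure frame into (row, col, radius) dots for display."""
    peak = pressure.max()
    dots = []
    for (r, c), p in np.ndenumerate(pressure):
        if p > threshold:
            dots.append((r, c, max_radius * p / peak))  # dot grows with pressure
    return dots

frame = np.zeros((8, 8))
frame[2, 3] = 0.9   # firm contact
frame[5, 5] = 0.3   # light contact
print(len(frame_to_dots(frame)))  # 2
```

Stacking thousands of such frames over time is what produces the growing-and-shrinking dot videos the network learns from.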

In addition to capturing pressure information, the researchers also measured the manner in which a hand's joints interact while handling an object. Certain relationships turn out to be predictable: When someone engages the middle joint of their index finger, for example, they seldom use their thumb. On the other hand (no pun intended), using the index and middle fingertips together always means that the thumb will be involved. "We quantifiably show for the first time," says Sundaram, "that if I'm using one part of my hand, how likely I am to use another part of my hand."
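That kind of finding is a conditional probability, and it can be estimated directly from frames of "which hand regions are active" data. The sketch below uses synthetic activation data (not the STAG measurements) with the reported pattern deliberately built in, just to show the calculation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-frame indicators of whether each hand region is in use.
n_frames = 1000
index_tip = rng.random(n_frames) < 0.5
middle_tip = rng.random(n_frames) < 0.5
# Build in the reported pattern: the thumb joins whenever both fingertips do,
# and otherwise shows up occasionally on its own.
thumb = (index_tip & middle_tip) | (rng.random(n_frames) < 0.2)

# P(thumb active | index tip and middle tip both active)
both_tips = index_tip & middle_tip
p_thumb_given_both = thumb[both_tips].mean()
print(p_thumb_given_both)  # 1.0 by construction
```

On the real dataset, the same estimator run over every pair of hand regions yields exactly the "if I'm using one part of my hand, how likely I am to use another" numbers Sundaram describes.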

Feelings


The type of convolutional neural network that learned the team's tactile maps is typically used for classifying images, and it's able to associate patterns with specific objects so long as it has adequate data regarding the many ways in which an object may be handled.

The hope is that the insights gathered by the CSAIL researchers' STAG can eventually be conveyed to sensors on robot joints, allowing them to touch and feel much as we do. The result? You'll be able to vigorously shake hands with an automaton without getting your hand crushed.

Massive 'Darth Vader' isopod found lurking in the Indian Ocean

The father of all giant sea bugs was recently discovered off the coast of Java.

A close up of Bathynomus raksasa

SJADE 2018
  • A new species of isopod with a resemblance to a certain Sith lord was just discovered.
  • It is the first known giant isopod from the Indian Ocean.
  • The finding extends the list of giant isopods even further.

Astronomers find more than 100,000 "stellar nurseries"

Every star we can see, including our sun, was born in one of these violent clouds.

Credit: NASA / ESA via Getty Images

This article was originally published on our sister site, Freethink.

An international team of astronomers has conducted the biggest survey of stellar nurseries to date, charting more than 100,000 star-birthing regions across our corner of the universe.

Stellar nurseries: Outer space is filled with clouds of dust and gas called nebulae. In some of these nebulae, gravity pulls the dust and gas into clumps that eventually grow so massive that they collapse under their own gravity — and a star is born.

These star-birthing nebulae are known as stellar nurseries.

The challenge: Stars are a key part of the universe — they lead to the formation of planets and produce the elements needed to create life as we know it. A better understanding of stars, then, means a better understanding of the universe — but there's still a lot we don't know about star formation.

This is partly because it's hard to see what's going on in stellar nurseries — the clouds of dust obscure optical telescopes' view — and also because there are just so many of them that it's hard to know what the average nursery is like.

The survey: The astronomers conducted their survey of stellar nurseries using the massive ALMA telescope array in Chile. Because ALMA is a radio telescope, it captures the radio waves emanating from celestial objects rather than visible light.

"The new thing ... is that we can use ALMA to take pictures of many galaxies, and these pictures are as sharp and detailed as those taken by optical telescopes," Jiayi Sun, an Ohio State University (OSU) researcher, said in a press release.

"This just hasn't been possible before."

Over the course of the five-year survey, the group was able to chart more than 100,000 stellar nurseries across more than 90 nearby galaxies, expanding the amount of available data on the celestial objects tenfold, according to OSU researcher Adam Leroy.

New insights: The survey is already yielding new insights into stellar nurseries, including the fact that they appear to be more diverse than previously thought.

"For a long time, conventional wisdom among astronomers was that all stellar nurseries looked more or less the same," Sun said. "But with this survey we can see that this is really not the case."

"While there are some similarities, the nature and appearance of these nurseries change within and among galaxies," he continued, "just like cities or trees may vary in important ways as you go from place to place across the world."

Astronomers have also learned from the survey that stellar nurseries aren't particularly efficient at producing stars and tend to live for only 10 to 30 million years, which isn't very long on a universal scale.

Looking ahead: Data from the survey is now publicly available, so expect to see other researchers using it to make their own observations about stellar nurseries in the future.

"We have an incredible dataset here that will continue to be useful," Leroy said. "This is really a new view of galaxies and we expect to be learning from it for years to come."

Protecting space stations from deadly space debris

Tiny specks of space debris can move faster than bullets and cause way more damage. Cleaning it up is imperative.

  • NASA estimates that more than 500,000 pieces of space trash larger than a marble are currently in orbit. Estimates exceed 128 million pieces when factoring in smaller pieces from collisions. At 17,500 MPH, even a paint chip can cause serious damage.
  • To prevent this untrackable space debris from taking out satellites and putting astronauts in danger, scientists have been working on ways to retrieve large objects before they collide and create more problems.
  • The team at Clearspace, in collaboration with the European Space Agency, is on a mission to capture one such object using an autonomous spacecraft with claw-like arms. It's an expensive and very tricky mission, but one that could have a major impact on the future of space exploration.
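The "even a paint chip can cause serious damage" claim is easy to check with basic kinetic energy arithmetic. The chip's mass below is an illustrative assumption (1 gram); only the 17,500 MPH speed comes from the text.

```python
# Back-of-the-envelope kinetic energy of orbital debris: E = (1/2) * m * v^2.
mph_to_ms = 0.44704            # exact conversion factor, miles/hour -> m/s
v = 17500 * mph_to_ms          # ~7,823 m/s closing speed from the article
m = 0.001                      # assumed 1-gram paint chip, in kg
energy_j = 0.5 * m * v**2
print(round(energy_j))         # ~30,600 J
```

Roughly 30 kilojoules is on the order of ten rifle bullets' worth of energy, which is why even untrackably small debris threatens satellites and crews.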

This is the first episode of Just Might Work, an original series by Freethink, focused on surprising solutions to our biggest problems.

Catch more Just Might Work episodes on their channel:
https://www.freethink.com/shows/just-might-work


Meet the worm with a jaw of metal

Metal-like materials have been discovered in a very strange place.
