Robotic hand moves objects with human-like grasps
A robotic hand developed at EPFL can pick up 24 different objects with human-like movements that emerge spontaneously, thanks to compliant materials and structures rather than programming.
Ecole Polytechnique Fédérale de Lausanne
Image: The ADAPT robotic hand (Adaptive Dexterous Anthropomorphic Programmable sTiffness). Credit: CREATE Lab EPFL
When you reach out your hand to grasp an object like a bottle, you generally don’t need to know the bottle’s exact position in space to pick it up successfully. But as EPFL researcher Kai Junge explains, if you want to make a robot that can pick up a bottle, you must know everything about the surrounding environment very precisely.
“As humans, we don’t really need too much external information to grasp an object, and we believe that’s because of the compliant – or soft – interactions that happen at the interface between an object and a human hand,” says Junge, a PhD student in the School of Engineering’s Computational Robot Design & Fabrication (CREATE) Lab, led by Josie Hughes. “This compliance is what we are interested in exploring for robots.”
In robotics, compliant materials are those that deform, bend, and squish. In the case of the CREATE Lab’s robotic ADAPT hand (Adaptive Dexterous Anthropomorphic Programmable sTiffness), the compliant materials are relatively simple: strips of silicone wrapped around a mechanical wrist and fingers, plus spring-loaded joints, combined with a bendable robotic arm. But this strategically distributed compliance is what allows the device to pick up a wide variety of objects using “self-organized” grasps that emerge automatically, rather than being programmed.
In a series of experiments, the ADAPT hand, which can be controlled remotely, was able to pick up 24 objects with a success rate of 93%, using self-organized grasps that matched natural human grasps with 68% direct similarity. The research has been published in Communications Engineering, a Nature Portfolio journal.
‘Bottom-up’ robotic intelligence
While a traditional robotic hand would need a motor to actuate each joint, the ADAPT hand has only 12 motors, housed in the wrist, for its 20 joints. The rest of the mechanical control comes from springs, which can be made stiffer or looser to tune the hand’s compliance, and from the silicone ‘skin’, which can also be added or removed.
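Under-actuation of this kind, where 12 motors drive 20 joints and springs take up the slack, can be sketched as a fixed coupling of motor commands to joint angles. The sketch below is purely illustrative; the coupling pattern, rest angle, and spring weight are invented placeholders, not the hand's actual tendon routing:

```python
# Hypothetical sketch of an under-actuated hand: 12 motor commands
# drive 20 joints through a fixed coupling; unactuated slack is taken
# up by springs that pull each joint toward a rest angle.

NUM_MOTORS, NUM_JOINTS = 12, 20

# Coupling: each joint follows one motor (some motors drive two joints).
# Purely illustrative -- the real tendon routing differs.
JOINT_TO_MOTOR = [j % NUM_MOTORS for j in range(NUM_JOINTS)]

REST_ANGLE = 0.1      # radians, spring equilibrium (hypothetical)
SPRING_WEIGHT = 0.25  # how strongly springs pull toward rest (hypothetical)

def joint_angles(motor_cmds):
    """Blend motor-commanded angles with spring rest angles."""
    assert len(motor_cmds) == NUM_MOTORS
    return [
        (1 - SPRING_WEIGHT) * motor_cmds[JOINT_TO_MOTOR[j]]
        + SPRING_WEIGHT * REST_ANGLE
        for j in range(NUM_JOINTS)
    ]

angles = joint_angles([0.5] * NUM_MOTORS)
print(len(angles))  # 20 joint angles from only 12 commands
```

Tuning `SPRING_WEIGHT` up or down mirrors the press release's point that the hand's compliance can be adjusted by making the springs stiffer or looser, or by adding or removing the silicone skin.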
As for software, the ADAPT hand is programmed to move through just four general waypoints, or positions, to lift an object. Any further adaptation required to complete the task occurs without additional programming or feedback; in robotics, this is called 'open loop' control. For example, when the team programmed the robot with a single fixed motion, it adapted its grasp pose to objects ranging from a single bolt to a banana. The researchers analyzed this extreme robustness, enabled by the robot's spatially distributed compliance, across more than 300 grasps and compared the results against a rigid version of the hand.
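The open-loop scheme described above can be sketched in a few lines: the controller steps through four fixed waypoints and never reads a sensor, so any adaptation to the object comes entirely from the hardware's compliance. All waypoint values below are hypothetical placeholders, not the paper's actual trajectory:

```python
# Open-loop grasp: step through four fixed waypoints.
# No sensing, no feedback -- the compliant hand absorbs any mismatch
# between the planned poses and the actual object.

# (x, y, z, grip) targets; values are illustrative placeholders.
WAYPOINTS = [
    (0.30, 0.00, 0.20, 0.0),  # approach above the object
    (0.30, 0.00, 0.05, 0.0),  # descend, hand open
    (0.30, 0.00, 0.05, 1.0),  # close the hand
    (0.30, 0.00, 0.20, 1.0),  # lift
]

def run_open_loop(send_command):
    """Send each waypoint once, in order, with no sensor feedback."""
    for wp in WAYPOINTS:
        send_command(wp)

log = []
run_open_loop(log.append)
print(len(log))  # 4 commands issued, regardless of the object
```

The point of the sketch is what is absent: there is no branch that reads object pose or contact forces, which is exactly what a rigid hand would need to succeed.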
“Developing robots that can perform interactions or tasks that humans do automatically is a lot harder than most people expect,” Junge says. “That’s why we are interested in exploiting this distributed mechanical intelligence of different body parts like skin, muscles, and joints, as opposed to the top-down intelligence of the brain.”
Balancing compliance and control
Junge emphasizes that the goal of the ADAPT study was not necessarily to create a robotic hand that can grasp like a human, but to show for the first time how much a robot can achieve through compliance alone.
Now that this has been demonstrated systematically, the EPFL team is building on the potential of compliance by re-integrating elements of closed-loop control into the ADAPT hand, including sensory feedback (via pressure sensors added to the silicone skin) and artificial intelligence. This synergistic approach could lead to robots that combine the robustness to uncertainty afforded by compliance with the precision of closed-loop control.
“A better understanding of the advantages of compliant robots could greatly improve the integration of robotic systems into highly unpredictable environments, or into environments designed for humans,” Junge summarizes.
Video: ADAPT robotic hand grasping a banana. Credit: CREATE Lab EPFL
Journal: Communications Engineering
Article title: Spatially distributed biomimetic compliance enables robust anthropomorphic robotic manipulation
Handy octopus robot can adapt to its surroundings
Image: Suction cups using suction intelligence to grasp an object. Credit: Tianqi Yue
Scientists inspired by the octopus’s nervous system have developed a robot that can decide how to move or grip objects by sensing its environment.
The team from the University of Bristol's Faculty of Science and Engineering designed a simple yet smart robot that uses fluid flows of air or water to coordinate suction and movement, as octopuses do with hundreds of suckers and multiple arms.
The study, published today in the journal Science Robotics, shows how a soft robot can use suction flow not just to stick to things, but also to sense its environment and control its own actions—just like an octopus. A single suction system enables the robot to grab delicate items, sense whether it’s touching air, water, or a rough surface, and even predict how hard something is pulling on it—all at once, without needing a central computer.
Lead author Tianqi Yue explained: “Last year, we developed an artificial suction cup that mimicked how octopuses stick to rocks using soft materials and water sealing.
“This research brings that work on, from using a suction cup like an octopus sucker to connect to objects to using ‘embodied suction intelligence’ - mimicking key aspects of the neuromuscular structure of the octopus in soft robotic systems.”
The suction intelligence works at two levels: by coupling suction flow with local fluidic circuitry, soft robots can achieve octopus-like low-level embodied intelligence, including gently grasping delicate objects, adaptive curling and encapsulating objects of unknown geometries. By decoding the pressure response from a suction cup, robots can achieve high-level perception including contact detection, classification of environment and surface roughness, as well as prediction of interactive pulling force.
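The high-level perception described above amounts to classifying what the cup is touching from its suction-pressure response. The toy decoder below illustrates that decoding step with entirely made-up thresholds; the paper's actual features and classifier are not reproduced here:

```python
# Toy decoder: classify what a suction cup is touching from its
# pressure trace. Thresholds are invented for illustration only.

def classify_contact(pressures_kpa):
    """Guess the contact state from mean suction pressure (gauge, kPa)."""
    mean_p = sum(pressures_kpa) / len(pressures_kpa)
    if mean_p > -5:           # almost no vacuum builds up
        return "no contact (air)"
    if mean_p > -40:          # leaky seal, partial vacuum
        return "rough surface"
    return "sealed contact"   # strong vacuum, good seal

print(classify_contact([-1, -2, -1]))     # no contact (air)
print(classify_contact([-20, -25, -22]))  # rough surface
print(classify_contact([-60, -70, -65]))  # sealed contact
```

The same pressure signal that does the gripping also carries the sensing information, which is why the robot needs no central computer or internal electronics for this level of perception.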
This simple and low-cost suction intelligence could lead to a new generation of soft robots that are safer, smarter and more energy-efficient. Potential uses include picking fruit gently in agriculture, handling fragile items in factories, anchoring medical tools inside the human body, or creating soft toys and wearable tools that can interact safely with people.
The team are currently working on making the system smaller and more robust for real-world use. They also aim to combine it with smart materials and AI to improve its adaptability and decision-making in complex environments.
“It’s fascinating how a simple suction cup, with no electronics inside, can feel, think and act—just like an octopus arm,” concluded Tianqi. “This could help robots become more natural, soft and intuitive to use.”
Paper: 'Embodying soft robots with octopus-inspired hierarchical suction intelligence' by Tianqi Yue, Chenghua Lu, Kailuan Tang, Qiukai Qi, Zhenyu Lu, Loong Yi Lee, Hermes Bloomfield-Gadêlha, and Jonathan Rossiter, in Science Robotics.
Journal: Science Robotics
Method of research: Experimental study