Single-material electronic skin gives robots the human touch
University of Cambridge
Image: The robotic ‘skin’ can be added to robotic hands like a glove. Credit: University of Cambridge
Scientists have developed a low-cost, durable, highly sensitive robotic ‘skin’ that can be added to robotic hands like a glove, enabling robots to detect information about their surroundings in a way that’s similar to how humans do.
The researchers, from the University of Cambridge and University College London (UCL), developed the flexible, conductive skin, which is easy to fabricate and can be melted down and formed into a wide range of complex shapes. The technology senses and processes a range of physical inputs, allowing robots to interact with the physical world in a more meaningful way.
Unlike other solutions for robotic touch, which typically work via sensors embedded in small areas and require different sensors to detect different types of touch, the entirety of the electronic skin developed by the Cambridge and UCL researchers is a sensor, bringing it closer to our own sensory system: our skin.
Although the robotic skin is not as sensitive as human skin, it can detect signals from over 860,000 tiny pathways in the material, enabling it to recognise different types of touch and pressure – like the tap of a finger, a hot or cold surface, damage caused by cutting or stabbing, or multiple points being touched at once – in a single material.
The researchers used a combination of physical tests and machine learning techniques to help the robotic skin ‘learn’ which of these pathways matter most, so it can sense different types of contact more efficiently.
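The release does not say how this pathway selection works in detail. Below is a minimal sketch of one standard approach, ranking each measurement channel by mutual information with the touch label and keeping only the highest-scoring channels; the synthetic data, channel count and scoring method are illustrative assumptions, not the authors’ pipeline.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

# Illustrative stand-in data: each row is one touch event, each column is one
# measurement channel (conductive pathway); y labels the type of touch applied.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 1000))    # 500 events x 1,000 channels (toy scale)
y = rng.integers(0, 4, size=500)    # 4 touch classes, assigned randomly here

# Score how informative each channel is about the touch label, then keep only
# the top-ranked channels so later sensing and classification run faster.
scores = mutual_info_classif(X, y, random_state=0)
top_channels = np.argsort(scores)[::-1][:200]
print("Ten most informative channels:", top_channels[:10])
```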
In addition to potential future applications for humanoid robots or human prosthetics, where a sense of touch is vital, the researchers say the robotic skin could be useful in areas as varied as the automotive sector and disaster relief. The results are reported in the journal Science Robotics.
Electronic skins work by converting physical information – like pressure or temperature – into electronic signals. In most cases, different types of sensors are needed for different types of touch – one type of sensor to detect pressure, another for temperature, and so on – which are then embedded into soft, flexible materials. However, the signals from these different sensors can interfere with each other, and the materials are easily damaged.
“Having different sensors for different types of touch leads to materials that are complex to make,” said lead author Dr David Hardman from Cambridge’s Department of Engineering. “We wanted to develop a solution that can detect multiple types of touch at once, but in a single material.”
“At the same time, we need something that’s cheap and durable, so that it’s suitable for widespread use,” said co-author Dr Thomas George Thuruthel from UCL.
Their solution uses one type of sensor that reacts differently to different types of touch, known as multi-modal sensing. While it’s challenging to separate out the cause of each signal, multi-modal sensing materials are easier to make and more robust.
The researchers melted down a soft, stretchy and electrically conductive gelatine-based hydrogel, and cast it into the shape of a human hand. They tested a range of different electrode configurations to determine which gave them the most useful information about different types of touch. From just 32 electrodes placed at the wrist, they were able to collect over 1.7 million pieces of information over the whole hand, thanks to the tiny pathways in the conductive material.
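The release does not spell out how 32 wrist electrodes give rise to hundreds of thousands of sensing pathways, but the paper’s title points to electrical impedance tomography (EIT). A quick back-of-the-envelope count, assuming standard four-terminal EIT measurements (one electrode pair injects current while a second, disjoint pair measures voltage) – my assumption, not a detail from the release – is consistent with the ‘over 860,000 pathways’ figure:

```python
from itertools import permutations

N_ELECTRODES = 32  # electrodes placed at the wrist, per the article

# Four-terminal (tetrapolar) measurement: an ordered pair of electrodes drives
# a current and a second, disjoint ordered pair measures the resulting voltage.
# That is one ordered quadruple of distinct electrodes per configuration.
n_configs = N_ELECTRODES * (N_ELECTRODES - 1) * (N_ELECTRODES - 2) * (N_ELECTRODES - 3)
print(n_configs)  # 863040 -- in line with the "over 860,000 pathways" figure

# Brute-force check of the same count:
assert sum(1 for _ in permutations(range(N_ELECTRODES), 4)) == n_configs
```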
The skin was then subjected to different types of touch: the researchers blasted it with a heat gun, pressed it with their fingers and a robotic arm, touched it gently, and even cut it open with a scalpel. The team then used the data gathered during these tests to train a machine learning model so the hand could recognise what the different types of touch meant.
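Below is a minimal sketch of that final step, training a classifier on labelled touch recordings so new signals can be mapped to a touch type; the synthetic data, label set and choice of a random-forest model are illustrative assumptions, not the authors’ model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Illustrative stand-in data: each row is one touch event described by its
# selected channel readings; each label names the contact applied during testing.
rng = np.random.default_rng(1)
labels = ["light_touch", "press", "heat", "cut"]
X = rng.normal(size=(800, 200))              # 800 events x 200 channels
y = rng.integers(0, len(labels), size=800)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=1)
clf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_train, y_train)
print("Held-out accuracy (random toy data, so about chance):",
      accuracy_score(y_test, clf.predict(X_test)))
```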
“We’re able to squeeze a lot of information from these materials – they can take thousands of measurements very quickly,” said Hardman, who is a postdoctoral researcher in the lab of co-author Professor Fumiya Iida. “They’re measuring lots of different things at once, over a large surface area.”
“We’re not quite at the level where the robotic skin is as good as human skin, but we think it’s better than anything else out there at the moment,” said Thuruthel. “Our method is flexible and easier to build than traditional sensors, and we’re able to calibrate it using human touch for a range of tasks.”
In future, the researchers are hoping to improve the durability of the electronic skin, and to carry out further tests on real-world robotic tasks.
The research was supported by the Samsung Global Research Outreach Program, the Royal Society, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Fumiya Iida is a Fellow of Corpus Christi College, Cambridge.
Journal: Science Robotics
Article Title: Multimodal Information Structuring with Single-Layer Soft Skins and High-Density Electrical Impedance Tomography
Article Publication Date: 11-Jun-2025
Soft robotic gripper injects leaves with precision
Image: The soft robotic leaf gripper injects leaves with sensors that help the plant detect and communicate with its environment. Credit: Savan DeSouza/Cornell University
Tools that offer early and accurate insight into plant health – and allow individual plant interventions – are key to increasing crop yields as environmental pressures increasingly impact horticulture and agriculture.
In response to this challenge, Cornell researchers have developed a soft robotic device that gently grips living plant leaves and injects them with sensors that help the plant detect and communicate with its environment. The robot can also inject genetic material that could be used for bioengineering plants in the future.
The device delivers sensors and genetic material in a reliable, repeatable and plant-safe way – an essential step toward precision, data-driven agriculture. The team’s findings were published in Science Robotics.
Funding for the research was provided by the National Science Foundation under a five-year, $25 million grant supporting the Cornell-led Center for Research on Programmable Plant Systems (CROPPS).
“Plants, like people, have different responses to the environment, and precision agriculture is an effort to move closer and closer to single-plant-level intervention – and the soil surrounding it,” said the paper’s senior author, Robert F. Shepherd, professor in the Sibley School of Mechanical and Aerospace Engineering in Cornell Engineering, and a research lead at CROPPS.
Horticulturists, farmers and agriculturalists face rising pressures from environmental impacts, such as drought and fertilizer runoff. By implanting sensors into leaves, researchers can monitor the impact of drought or an overdose of fertilizer on the plant.
To demonstrate, the team used the gripper to deliver two types of probes. The first, AquaDust, is a tiny gel particle that fluoresces in response to water stress, allowing researchers to non-invasively monitor a plant’s hydration levels. The second probe, RUBY, is a gene-encoded biological reporter that causes red pigmentation to appear where genetic transformation occurs within the plant.
“AquaDust allowed us to ‘see’ the water stress inside a leaf, and similarly, by injecting a bacterium that transforms the injection region with RUBY reporter genes, we were able to ‘see’ that this part of the leaf experienced a genetic transformation,” said first author Mehmet Mert Ilman, previously a postdoctoral researcher in the Organic Robotics Lab and now an assistant professor in mechanical engineering at Manisa Celal Bayar University in Turkey.
“It was fascinating to be able to robotically transform the local genetics of the plant leaf and then see it change back,” he said.
The researchers tested the device on sunflower and cotton leaves – plants known for their structural resistance to infiltration. The gripper achieved more than 91% success in delivery while causing significantly less damage than syringe-based methods and expanding the effective infiltration area by more than 12 times.
The soft robotic system works hands-free, delivers materials more evenly and causes little to no damage, even in tough and durable species like cotton. The technique is an improvement over traditional manual methods such as vacuum infiltration, which uses low air pressure to force liquids into plant tissues, or needle injections, which can injure leaves, are labor-intensive and often fail in tough plant types. This is especially important for horticultural crops – soft-skinned plants cultivated for their fruits, vegetables, flowers or ornamental value.
The device applies gentle, uniform pressure through a sponge tip that holds the nanoparticle or genetic probes. The designs of the soft material and the actuator (the portion of a machine that produces force or torque) were optimized using simulation software and 3D printing, allowing the gripper to function with a variety of leaf types and shapes.
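As a rough illustration of ‘gentle, uniform pressure’, here is a minimal sketch of a capped proportional pressure loop for a soft gripper tip; the actuator model, gains and pressure limits are hypothetical and are not taken from the Cornell device.

```python
# Hypothetical values for illustration only -- not from the Cornell gripper.
TARGET_KPA = 8.0   # desired contact pressure at the sponge tip
MAX_KPA = 12.0     # hard cap so the leaf is never over-pressed
KP = 0.4           # proportional gain

pressure = 0.0     # simulated sensed contact pressure, in kPa
for step in range(50):
    error = TARGET_KPA - pressure
    command = KP * error
    # Never command past the safety cap, whatever the error says.
    command = min(command, MAX_KPA - pressure)
    pressure += command          # toy first-order response of the soft actuator
    if abs(error) < 0.1:
        break

print(f"Settled near {pressure:.2f} kPa after {step + 1} control steps")
```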
“Its low stiffness allows it to warp the gripper’s shape to adapt to the orientation and surface of the leaf with little health implications to the leaf,” Shepherd said. “The shape of the extending actuator allowed for a large displacement and ability to adjust orientation without bulky or complex motor control.”
The research lays the foundation for real-time, minimally invasive plant monitoring, Shepherd said.
“Soft grippers to inject physical or biological probes unlock new and incredible capabilities,” he said. “The immediate use of our system would likely be in greenhouses, where a robot would persistently inject and monitor individual plants to infer how much water they need.”
In the long term, similar grippers could be used to deliver or retrieve other diagnostic materials, including sensors for nitrogen uptake, disease presence or even real-time metabolic changes, opening up new possibilities for smart agriculture and plant research.
“New nanoparticles will also eventually be created that will inform us about many other health aspects of the plant,” Shepherd said. “With this information, plants will yield more, and we will waste less.”
The team is now exploring the integration of the gripper onto robotic arms for automated greenhouse systems, with the long-term goal of adapting it for field-deployable platforms.
“Once translated out of greenhouses, the implications will be larger,” Shepherd said. “I am particularly interested in limiting the waste streams into lakes to prevent harmful algal blooms.”
Co-authors include researchers from the Boyce Thompson Institute, the Smith School of Chemical and Biomolecular Engineering and the School of Integrative Plant Science in the College of Agriculture and Life Sciences.
The work was supported by the National Science Foundation, the USDA National Institute of Food and Agriculture and the Scientific and Technological Research Council of Turkey.
Stephen D’Angelo is communications manager for Cornell Research and Innovation.
Journal: Science Robotics
Article Title: In situ foliar augmentation of multiple species for optical phenotyping and bioengineering using soft robotics
Article Publication Date: 11-Jun-2025
