Wednesday, November 13, 2024

Robot learns how to be a surgeon



By Dr. Tim Sandle
DIGITAL JOURNAL
November 12, 2024


This handout photo released by the University of Maryland School of Medicine on January 10, 2022 shows surgeons performing a transplant of a heart from a genetically modified pig to patient David Bennett, Sr. Copyright AFP/File JIM WATSON

A robot, trained by watching videos of experienced surgeons at work, executed the same surgical procedures as skilfully as the human doctors.

The process of ‘imitation learning’ used to train surgical robots removes the need to program each individual move of a medical procedure, and it brings the field of robotic surgery closer to true autonomy, in which robots can perform complex surgeries without any human help.

The research was led by Johns Hopkins University and presented at the Conference on Robot Learning in Munich in November 2024.

The device tested was the da Vinci Surgical System robot, and the trials involved three fundamental surgical procedures: manipulating a needle, lifting body tissue, and suturing.

The model combined imitation learning with the same machine learning architecture that underpins ChatGPT. Imitation learning is a close relative of reinforcement learning: rather than learning from trial-and-error rewards, the agent learns to perform a task through supervised training on expert demonstrations.

Where ChatGPT works with words and text, this model speaks “robot” with kinematics, a language that breaks down the angles of robotic motion into mathematics.
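To make the idea concrete, here is a minimal behavior-cloning sketch in Python with PyTorch. Everything in it is an assumption for illustration: the observation and action dimensions, the network, and the synthetic data are stand-ins, not the Johns Hopkins team's actual model or demonstration recordings.

# Minimal behavior-cloning sketch (illustrative only, not the JHU code).
# A policy network is trained with supervised learning to map observations
# to the expert's recorded kinematic actions (e.g., joint angles).
import torch
import torch.nn as nn

class KinematicPolicy(nn.Module):
    """Maps an observation embedding to a kinematic action vector."""
    def __init__(self, obs_dim=64, act_dim=7, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, act_dim),  # e.g., 7 joint angles of a robot arm
        )

    def forward(self, obs):
        return self.net(obs)

# Synthetic stand-in for (observation, expert action) pairs extracted from
# demonstration videos; real data would come from wrist-camera frames.
obs = torch.randn(1024, 64)
expert_actions = torch.randn(1024, 7)

policy = KinematicPolicy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):
    pred = policy(obs)
    loss = loss_fn(pred, expert_actions)  # imitate the expert's motions
    opt.zero_grad()
    loss.backward()
    opt.step()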

For the training process, the researchers fed their model hundreds of videos recorded from wrist cameras placed on the arms of da Vinci robots during surgical procedures.

These videos, recorded by surgeons all over the world, are used for post-operative analysis and then archived.

Nearly 7,000 da Vinci robots are used worldwide, and more than 50,000 surgeons are trained on the system, creating a large archive of data for robots to “imitate.”

While the da Vinci system is widely used, it is relatively imprecise. The scientists nonetheless found a way to make the flawed input work: they trained the model to output relative movements rather than absolute positions, which the robot's imprecise kinematics make unreliable.
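As a rough sketch of why relative actions tolerate an imprecise robot, consider the toy example below. It is illustrative only, with invented array shapes; it simply shows that per-step deltas reproduce a demonstrated motion even when the sensed starting pose is offset by a calibration error.

# Illustrative sketch of relative (delta) actions vs. absolute targets;
# shapes and values are assumptions, not the study's actual code.
import numpy as np

def to_relative_actions(poses):
    """Convert an absolute pose trajectory into per-step deltas.

    poses: (T, D) array of demonstrated poses (e.g., joint angles).
    Returns (T-1, D) deltas: the action at step t moves pose[t] -> pose[t+1].
    """
    return np.diff(poses, axis=0)

def apply_actions(start_pose, deltas):
    """Roll out relative actions from the robot's *current* pose, so a
    constant calibration offset in the reported pose cancels out."""
    poses = [start_pose]
    for d in deltas:
        poses.append(poses[-1] + d)
    return np.stack(poses)

demo = np.cumsum(np.random.randn(50, 7) * 0.01, axis=0)  # fake trajectory
deltas = to_relative_actions(demo)

# Even if the robot's sensed start pose is offset from the demo's, the
# relative rollout reproduces the same shape of motion.
offset_start = demo[0] + 0.5
rollout = apply_actions(offset_start, deltas)
assert np.allclose(rollout - rollout[0], demo - demo[0])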

The key was an update to the image input that permits the AI system to find the right actions. The model has the potential to train a robot quickly on any type of surgical procedure.

Robot identifies plants by “touching” their leaves



Cell Press




Researchers in China have developed a robot that identifies different plant species at various stages of growth by “touching” their leaves with an electrode. The robot can measure properties such as surface texture and water content that cannot be determined using existing visual approaches, according to the study, published November 13 in the journal Device. The robot identified ten different plant species with an average accuracy of 97.7% and identified leaves of the flowering bauhinia plant with 100% accuracy at various growth stages.

Eventually, large-scale farmers and agricultural researchers could use the robot to monitor the health and growth of crops and to make tailored decisions about how much water and fertilizer to give their plants and how to approach pest control, says Zhongqian Song, an associate professor at the Shandong First Medical University & Shandong Academy of Medical Sciences and an author of the study.

“It could revolutionize crop management and ecosystem studies and enable early disease detection, which is crucial for plant health and food security,” he says.

Rather than making physical contact with a plant, existing devices capture more limited information using visual approaches, which are vulnerable to factors such as lighting conditions, changes in the weather, or background interference.

To overcome these limitations, Song and colleagues developed a robot that “touches” plants using a mechanism inspired by human skin, with structures working together in a hierarchical way to gain information through touch. When an electrode in the robot makes contact with a leaf, the device learns about the plant by measuring several properties: the amount of charge that can be stored at a given voltage, how difficult it is for electrical current to move through the leaf, and contact force as the robot grips the leaf.

Next, this data is processed with machine learning to classify the plant, since the values of each measurement correlate with particular plant species and growth stages.
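As a hedged illustration of that classification step, the sketch below trains an off-the-shelf classifier on synthetic per-touch feature vectors (capacitance, impedance, contact force). The features, clusters, and model choice are stand-ins for illustration; the study's actual pipeline may differ.

# Illustrative sketch: classifying plants from per-touch feature vectors.
# All values are synthetic; not the paper's data or model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_species, per_species = 10, 100

# Each species gets a distinct cluster in (capacitance, impedance, force) space.
centers = rng.normal(size=(n_species, 3))
X = np.concatenate([c + 0.1 * rng.normal(size=(per_species, 3)) for c in centers])
y = np.repeat(np.arange(n_species), per_species)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")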

While the robot shows “vast and unexpected” potential applications in fields ranging from precision agriculture to ecological studies to plant disease detection, it has several weaknesses that have yet to be addressed, says Song. For example, the device is not yet versatile enough to consistently identify types of plants with complicated structures, such as burrs and needle-like leaves. This could be remedied by improving the design of the robot’s electrode, he says. 

“It may take a relatively long period of time to reach large-scale production and deployment depending on technological and market developments,” says Song.

As a next step the researchers plan to expand how many plants the robot can recognize by collecting data from a wider variety of species, boosting the plant species database they use to train algorithms. The researchers also hope to further integrate the device’s sensor so that it can display results in real time, even without an external power source, says Song.

###

Device, Chen et al., “Iontronic tactile sensory system for plant species and growth stage classification” https://www.cell.com/device/fulltext/S2666-9986(24)00570-2

Device (@Device_CP) is a physical science journal from Cell Press along with Chem, Joule, and Matter. Device aims to be the breakthrough journal to support device- and application-oriented research from all disciplines, including applied physics, applied materials, nanotechnology, robotics, energy research, chemistry, and biotechnology, under a single title that focuses on the integration of these diverse disciplines in the creation of the cutting-edge technology of tomorrow. Visit https://www.cell.com/device/home. To receive Cell Press media alerts, contact press@cell.com.
