Showing posts sorted by date for query ROBOT DOG. Sort by relevance Show all posts

Wednesday, March 18, 2026

AI-Powered Robot Dogs Guarding Reviled Data Centers Is Where We Have Arrived

These robots, known as “quadrupeds,” are being used to patrol the sprawling energy-sucking complexes, which are increasingly being met with protest around the country.



A US Secret Service robot dog manufactured by Boston Dynamics patrols the grounds at then President-elect Donald Trump’s Mar-a-Lago resort on November 18, 2024, in Palm Beach, Florida.
(Photo by Scott Olson/Getty Images)


Stephen Prager
Mar 17, 2026
COMMON DREAMS

As Americans grow fed up with the rapid encroachment of artificial intelligence data centers into their communities, tech companies are embracing a novel solution to protect their energy-sucking behemoths from danger: Even more robots... robot dogs, to be exact.

According to a report from Business Insider on Monday:
As companies pour billions into sprawling industrial campuses for cloud and AI computing, some data center operators are experimenting with four-legged bots—about the size of large dogs—that can patrol fences, inspect equipment, and flag any issues before they turn into costly outages.



These robots, known as “quadrupeds,” are being used to patrol the complexes, which can sometimes reach the size of multiple football fields.



According to Fortune, tech companies are already pouring nearly $700 billion into building data centers across the US and are now spending hundreds of thousands of dollars more to enlist mechanical canines as security forces.

One model from Boston Dynamics, known as “Spot,” can cost anywhere from $175,000 to $300,000. And while the technology may seem futuristic, Spot and other quadrupeds like it have already been enlisted in law enforcement and public safety for years.

Another company—Ghost Robotics—advertises its quadrupeds for “reconnaissance, intelligence, and surveillance use by the military.”

With more than 5,000 data centers now in the US and 800-1,000 new ones in the process of being built, Michael Subhan, the chief growth officer for Ghost Robotics, told Business Insider he expects boom times are ahead for his industry.

As data centers expand their reach at breakneck speed, there may be more interlopers for the programmable pooches to sniff out.

Due to skyrocketing energy costs and water shortages in places where large data centers have been built, the sites of proposed projects from Illinois to Minnesota to South Carolina have drawn crowds of dozens and even hundreds of demonstrators in recent weeks.

Saturday, March 14, 2026

New AI-evolved robots refuse to die


By Dr. Tim Sandle
SCIENCE EDITOR
DIGITAL JOURNAL
March 10, 2026


Aigen's solar-powered autonomous robots aim to take the chemicals and toil out of industrial weeding - Copyright GETTY IMAGES NORTH AMERICA/AFP/File RICK DIAMOND

AI-designed metamachines run in the wild, recover from damage and transform into new shapes, according to recent studies. These new modular legged robots are said to possess athletic intelligence and are composed of multiple smaller autonomous robots, with each module being a complete robot with its own motor, battery and brain.

Together, the modules form a larger machine that can be rapidly assembled, repaired or reshaped. The study, from Northwestern University engineers, describes the first evolved robot to set foot outdoors and the first modular robot with agility.

These robots can be combined and recombined in the wild, recover from injury and keep moving no matter what challenges they face.


If flipped upside down, the robots instinctively bring themselves upright and continue their journey. They can survive being chopped in half or cut up into many pieces. When separated, every module within the metamachine can become an individual agent.

Called “legged metamachines,” the creations are made from autonomous, Lego-like modules that snap together into an endless number of configurations. Each module by itself is a complete robot with its own motor, battery and computer. Alone, a module can roll, turn and jump. But the real agility and indestructibility emerges when the modules combine.




To design the most effective combinations, the engineers used artificial intelligence (AI) to evolve novel body configurations. Instead of sticking with standard dog- or human-like designs, the AI churned out strange new “species” of machines that no human engineer would have conceived. When connected to other modules, the metamachines undulate like seals, bound like lizards or spring like kangaroos.

The robots also can flip themselves upright when turned over, hop over obstacles and perform acrobatics like spinning in air. Because a metamachine is essentially a robot made up of other robots, it can resist catastrophic damage. Broken parts don’t become dead weight; they keep rolling, crawling and rejoin the team.

By combining physical modularity with AI-driven design, the researchers have opened the door to a new class of robots that don’t just survive the real world — they adapt to it. These machines point toward a future where robots are less like fragile, pre-designed tools and more like resilient, evolving lifeforms.

Evolution accelerated by computers

While today’s robots can be fast and agile, their body shapes are often fixed and rigid. Most robots cannot adapt to new tasks, environments or physical damage. If a robotic dog breaks a leg, for example, it’s basically useless. To escape those limitations, the engineering team turned to AI — not to copy familiar designs but to evolve something entirely new.

The researchers started with an evolutionary algorithm that mimics natural selection. As a starting point, the team gave the algorithm the building blocks for the robot. These building blocks are half-meter-long modular legs, which look like a pair of sticks joined by a central sphere.

The researchers gave the algorithm a goal: Design a robot with efficient, versatile movement. By mixing and matching the modules in different combinations, the algorithm generated new body types. It then simulated each design, keeping the best performers and discarding the weak. It also iteratively “bred” new designs by combining or mutating them. Depending on the robot’s body, modular legs became legs, spines or tails.

Traversing rugged terrain

To test the designs, the engineers assembled the best three-, four- and five-legged designs found by evolution. In outdoor tests, the metamachines ran across rough terrain, including gravel, grass, tree roots, leaves, sand, mud and uneven bricks. They jumped, spun and righted themselves when flipped — all without complicated setup or retraining.

Unlike traditional robots that fail when a single part breaks, these machines can adapt, recover and survive. Even when a leg breaks off, the metamachine remains resilient. The modules adapt to a missing leg and keep moving. The missing leg, too, can roll home and rejoin its team.

The research appears in the journal Proceedings of the National Academy of Sciences. The study is titled “Agile legged locomotion in reconfigurable modular robots.”

Friday, March 13, 2026

 

What can dogs tell us about how robots can locate objects? Gestures may be as important as words





Brown University
Fetch, Robot 

Image: A new approach to helping robots locate objects incorporates both language cues and gesture.

Credit: Tellex Lab / Brown University





PROVIDENCE, R.I. [Brown University] — Whether in the kitchen or on a workshop floor, robot assistants that can fetch items for people could be extremely useful. Now, a team of Brown University researchers has developed a way of making robots better at figuring out exactly which items a user might want them to retrieve.

The new approach enables robots to use inputs from both human language and gesture as they reason about how to locate and retrieve target objects. In a study that will be presented on Tuesday, March 17, during the International Conference on Human-Robot Interaction in Edinburgh, Scotland, the researchers show that the approach had an 89% success rate in finding the correct object in complex environments, outperforming other object retrieval approaches.

“Searching for things requires a robot to navigate large environments,” said Ivy He, a graduate student at Brown and the study’s lead author. “With current technology, robots are pretty good at identifying objects, but when the environment is cluttered, things are moving around or things are hidden by other objects, that makes things much more difficult. So this work is about using both language and gesture to help in that search task.”

The research makes use of an approach to robot planning called a POMDP (partially observable Markov decision process), a mathematical framework that allows a robot to reason under uncertainty. In the real world, robots rarely have a perfect understanding of the world. Different types of objects can look similar. There may be more than one of a particular object in a room. Items might be partially or completely hidden from view.

To succeed in a search, a robot has to act even when it isn’t sure what it’s seeing. Without a way to manage that uncertainty, it might freeze. Or worse, it might make overconfident final decisions based on incomplete information. A POMDP turns ambiguities into a probabilistic framework that helps the robot track how confident it is about what’s in the world, and update those beliefs according to new information, including information from large vision and language models. In the process, it can choose actions that help it learn more — for example, moving to get a better view — before committing to a final decision.
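The belief-tracking loop described here can be illustrated with a minimal discrete Bayes filter. This is only a sketch of the general idea, not the Brown team's implementation: the candidate locations, the likelihood numbers, and the `update` helper are all illustrative assumptions.

```python
# Minimal discrete Bayes filter over candidate object locations.
# The robot keeps a belief (a probability per location) and multiplies
# it by the likelihood of each new observation, acting only once its
# confidence passes a threshold. All numbers are illustrative.

def update(belief, likelihood):
    """One Bayes step: multiply prior by observation likelihood, renormalize."""
    posterior = {loc: belief[loc] * likelihood.get(loc, 1e-6) for loc in belief}
    total = sum(posterior.values())
    return {loc: p / total for loc, p in posterior.items()}

# Uniform prior over three candidate locations for a target mug.
belief = {"table": 1/3, "shelf": 1/3, "counter": 1/3}

# A vision-language model scores how well each view matches "the red mug".
language_likelihood = {"table": 0.7, "shelf": 0.2, "counter": 0.1}
belief = update(belief, language_likelihood)

# A pointing gesture toward the table sharpens the belief further.
gesture_likelihood = {"table": 0.8, "shelf": 0.15, "counter": 0.05}
belief = update(belief, gesture_likelihood)

best = max(belief, key=belief.get)
if belief[best] > 0.9:
    print(f"retrieve from {best}")    # confident enough to act
else:
    print("gather more information")  # e.g. move for a better view
```

In this toy run the two cues combined concentrate the belief on one location far more sharply than either cue alone would, which mirrors the study's finding that gesture plus language outperforms either input by itself.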

The innovation in this latest research is a POMDP that incorporates inputs from both language and human gestures, such as pointing toward the object of interest. To incorporate the gesture component, He drew on insights from a Brown laboratory led by Associate Professor of Cognitive and Psychological Sciences Daphna Buchsbaum, on how the undisputed world champions of fetch — dogs — interpret human pointing.

Building on this expertise, He and Ph.D. student Madeline Pelgrim performed a study of the finer points of human pointing, as well as how dogs interpret pointing gestures. The study helped He to model the target of a pointing gesture within a cone of probability.

“What we have found is that humans use eye gaze to align with what they’re pointing to,” He said. “So it was natural to create a cone based on a connecting line from the eye to elbow to the wrist. That turns out to be a fairly good approximation of where someone is pointing.”
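The cone He describes can be sketched with basic vector geometry: take the line from the eye through the wrist as the cone's axis and test whether a candidate object falls within some angular spread. The 25° half-angle and the coordinates below are illustrative assumptions, and the actual model assigns graded probabilities rather than this hard in/out test.

```python
import math

def in_pointing_cone(eye, wrist, obj, half_angle_deg=25.0):
    """Return True if obj lies within the cone whose axis runs from the
    eye through the wrist, with the given half-angle. Points are (x, y, z)."""
    axis = tuple(w - e for e, w in zip(eye, wrist))
    to_obj = tuple(o - e for e, o in zip(eye, obj))
    dot = sum(a * b for a, b in zip(axis, to_obj))
    norm = (math.sqrt(sum(a * a for a in axis))
            * math.sqrt(sum(b * b for b in to_obj)))
    if norm == 0:
        return False
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= half_angle_deg

# Eye at 1.6 m height, wrist extended forward and slightly down.
eye, wrist = (0.0, 0.0, 1.6), (0.5, 0.0, 1.2)
print(in_pointing_cone(eye, wrist, (3.0, 0.2, 0.0)))  # True: near the axis
print(in_pointing_cone(eye, wrist, (0.0, 3.0, 1.0)))  # False: far off-axis
```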

Buchsbaum adds, “Our work in the Brown Dog Lab has shown just how sophisticated dogs are in their communication with humans, solving many of the cooperation problems we want robots to solve. This makes them a natural model for intuitive human-non-human cooperation. This work translates the dog's intuitive understanding of human gaze and pointing into a probabilistic model, which allows the robot to handle the ambiguity inherent in human communication. It moves us closer to truly intuitive robotic assistants.”

He then combined the gesture model with a vision language model or VLM, an AI system designed to interpret visual scenes together with natural language descriptions. The result was a POMDP capable of incorporating both language and gesture for robot planning.

In lab experiments, the researchers asked a quadruped robot to find various objects scattered around the lab space. The experiments showed that the robot was able to locate the correct object nearly 90% of the time using combined gesture and language, far better than using either input alone.

For He and her coauthors, the research is a step toward robots that are able to operate side-by-side with people at home and in the workplace.

“The framework we developed helps pave the way for seamless multimodal human-robot interaction,” said research co-author Jason Liu, a postdoctoral researcher at MIT who worked on the project while completing his Ph.D. at Brown. “In the future, we can communicate with our assistant robots the same way people interact through language, gestures, eye gazes, demonstrations and much more.”

The work was supported through Brown’s AI Research Institute on Interaction for AI Assistants (ARIA), which is funded by the National Science Foundation.

"This is a really great illustration of how we can enable more natural and effective human-machine interaction by strengthening collaborations between computer science and cognitive science,” said Ellie Pavlick, an associate professor of computer science at Brown who leads ARIA. “Embracing what we know about how humans naturally want to communicate, and building systems aligned with those human tendencies and intuitions about behavior, is the right way forward.”

The work was supported by the National Science Foundation (2433429) and the Long-Term Autonomy for Ground and Aquatic Robotics program (GR5250131), and by the Office of Naval Research (N0001424-1-2784, N0001424-1-2603).

The new approach models human pointing as a cone encompassing possible objects of interest. 

Credit: Tellex Lab / Brown University

Sunday, March 08, 2026

 

Evolved robots are born to run and refuse to die



AI-designed metamachines run in the wild, recover from damage and transform into new shapes




Northwestern University

Legged metamachine with three modules 

Video: Called “legged metamachines,” the creations are made from autonomous, Lego-like modules that snap together into an endless number of configurations. Each module by itself is a complete robot with its own motor, battery and computer.

Credit: Sam Kriegman/Northwestern University



Northwestern University engineers have developed the first modular robots with athletic intelligence. They can be combined and recombined in the wild, recover from injury and keep moving no matter what’s thrown at them.

Called “legged metamachines,” the creations are made from autonomous, Lego-like modules that snap together into an endless number of configurations. Each module by itself is a complete robot with its own motor, battery and computer. Alone, a module can roll, turn and jump. But the real agility and indestructibility emerges when the modules combine.

The study was published today (March 6) in the Proceedings of the National Academy of Sciences.

To design the most effective combinations, the engineers used artificial intelligence (AI) to evolve novel body configurations. Instead of sticking with standard dog- or human-like designs, the AI churned out strange new “species” of machines that no human engineer would have conceived. When connected to other modules, the metamachines undulate like seals, bound like lizards or spring like kangaroos.

The robots also can flip themselves upright when turned over, hop over obstacles and perform acrobatics like spinning in air. Because a metamachine is essentially a robot made up of other robots, it can resist catastrophic damage. Broken parts don’t become dead weight; they keep rolling, crawling and rejoin the team.

By combining physical modularity with AI-driven design, the researchers have opened the door to a new class of robots that don’t just survive the real world — they adapt to it. These machines point toward a future where robots are less like fragile, pre-designed tools and more like resilient, evolving lifeforms.

“These are the first robots to set foot outdoors after evolving inside of a computer,” said Northwestern’s Sam Kriegman, who led the study. “They are rapidly assembled and then quite literally hit the ground running. They can move freely in the wild and easily recover from major injuries that would be fatal to every other wild robot. If flipped upside down, they instinctively bring themselves upright and continue their journey. They can survive being chopped in half or cut up into many pieces. When separated, every module within the metamachine can become an individual agent.”

An expert in biorobotics and AI, Kriegman is an assistant professor of computer science, mechanical engineering and chemical and biological engineering at Northwestern’s McCormick School of Engineering, where he is a member of the Center for Robotics and Biosystems (CRB). The study’s co-first authors are Chen Yu, David Matthews and Jingxian Wang, who are all Ph.D. students in the CRB. 

Evolution accelerated by computers

While today’s robots can be fast and agile, their body shapes are often fixed and rigid. Most robots cannot adapt to new tasks, environments or physical damage. If a robotic dog breaks a leg, for example, it’s basically useless. To escape those limitations, Kriegman’s team turned to AI — not to copy familiar designs but to evolve something entirely new.

Kriegman and his team started with an evolutionary algorithm that mimics natural selection. As a starting point, the team gave the algorithm the building blocks for the robot. These building blocks are half-meter-long modular legs, which look like a pair of sticks joined by a central sphere.

“Inside the sphere, the robot has everything it needs to survive: a ‘nervous system,’ a ‘metabolism’ and ‘muscle,’” Kriegman said. “By that, I mean a circuit board, a battery and a motor. The modules are mechanically simple. They can only rotate around a single axis, but they are surprisingly athletic and smart.”

Then, Kriegman and his team gave the algorithm a goal: Design a robot with efficient, versatile movement. By mixing and matching the modules in different combinations, the algorithm generated new body types. It then simulated each design, keeping the best performers and discarding the weak. It also iteratively “bred” new designs by combining or mutating them. Depending on the robot’s body, modular legs became legs, spines or tails. 

“We simulated the Darwinian process of mutation and selection within a virtual, physical environment,” Kriegman said. “This is survival of the fittest — accelerated by computers and made real by athletic modular building blocks.”
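The mutate-simulate-select loop Kriegman describes can be sketched generically. Everything below is an illustrative stand-in, not the team's actual pipeline: a body plan is reduced to a list of module angles, and the `fitness` function is a toy proxy for the locomotion score a physics simulator would compute.

```python
import random

# Toy evolutionary loop over modular body plans. A "body" is a list of
# module attachment angles; fitness stands in for simulated locomotion.
random.seed(0)

def fitness(body):
    # Illustrative proxy: reward bodies whose module angles are distinct,
    # lightly penalizing extreme angles.
    return len(set(round(a, 1) for a in body)) - 0.01 * sum(abs(a) for a in body)

def mutate(body):
    child = list(body)
    i = random.randrange(len(child))
    child[i] += random.uniform(-0.5, 0.5)  # perturb one module's angle
    return child

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]               # "breed" two designs

# Start with random 4-module bodies, then iterate selection.
population = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]            # keep the best performers
    children = [mutate(random.choice(survivors)) for _ in range(5)]
    children += [crossover(random.choice(survivors), random.choice(survivors))
                 for _ in range(5)]
    population = survivors + children

best = max(population, key=fitness)
print("best body plan:", [round(a, 2) for a in best])
```

In the real system each candidate would be evaluated in a physics simulation, and the surviving designs would then be assembled from the physical half-meter modules.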

Traversing rugged terrain

To test the designs, Kriegman and his team assembled the best three-, four- and five-legged designs found by evolution. In outdoor tests, the metamachines ran across rough terrain, including gravel, grass, tree roots, leaves, sand, mud and uneven bricks. They jumped, spun and righted themselves when flipped — all without complicated setup or retraining.

Unlike traditional robots that fail when a single part breaks, these machines can adapt, recover and survive. Even when a leg breaks off, the metamachine remains resilient. The modules adapt to a missing leg and keep moving. The missing leg, too, can roll home and rejoin its team.

“It can sense its surroundings, move from place to place, compute and learn,” Kriegman said. “Metamachines can be rapidly assembled, repaired, redesigned and recombined. Once assembled, they immediately move themselves across a wide array of unstructured environments.”

The new study builds off previous work from Kriegman’s lab, in which his team designed the first AI algorithm to intelligently design robots from scratch. By compressing billions of years of evolution into mere seconds, the algorithm successfully designed a small, flexible walking robot. While those robots could not do more than walk across a table, they proved that AI can instantly evolve working robots.

“Our previously evolved robots couldn’t sense their own bodies or coordinate themselves,” Kriegman said. “But they still taught us a lot about how evolution works and how to distill those lessons into useful technologies. Evolution can reveal new designs that are different from or even beyond what humans were previously capable of imagining. So, we really wanted to study how and why it works. The best way — or at least the most fun way — is to evolve structures in realistic conditions.”

The study, “Agile legged locomotion in reconfigurable modular robots,” was supported by Schmidt Sciences AI2050 (award number G-22-64506) and the National Science Foundation (award numbers FRR-2331581 and FRR-2440412).

One module of a legged metamachine is itself a complete robot with its own motor, battery and computer.

Quadruped metamachine (IMAGE)

Northwestern University