Do ANYmal quadrupedal robots really hike? How’s that possible?

ETH Zurich researchers have developed a new control approach that enables a legged robot, called ANYmal, to move quickly and robustly over difficult terrain. The team, led by ETH Zurich robotics professor Marco Hutter, uses machine learning to let the robot combine its visual perception of the environment with its sense of touch for the first time.

ANYmal quadrupedal robots learn to hike

 


How does the training work?

Overview of the training methods and deployment. We first train a teacher policy with access to privileged simulation data using reinforcement learning (RL). This teacher policy is then distilled into a student policy, which is trained to imitate the teacher’s actions and to reconstruct the ground-truth environment state from noisy observations. We deploy the student policy zero-shot on real hardware using height samples from a robot-centric elevation map.
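
To make the two-stage pipeline concrete, here is a minimal sketch of privileged teacher-student distillation in Python (PyTorch). The observation and action sizes, the loss weighting, and names such as TeacherPolicy, StudentPolicy and reconstruction_head are illustrative assumptions rather than the authors' implementation; the teacher is assumed to have already been trained with RL on privileged, noise-free simulation state.

```python
# Minimal sketch of privileged teacher-student distillation (illustrative only).
# Dimensions, architectures, and loss weights are assumptions, not the paper's values.
import torch
import torch.nn as nn

PRIV_DIM, NOISY_DIM, ACT_DIM = 64, 96, 12  # hypothetical observation/action sizes

class TeacherPolicy(nn.Module):
    """Assumed to be trained first with RL; sees noise-free, privileged simulation state."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(PRIV_DIM, 256), nn.ELU(),
                                 nn.Linear(256, ACT_DIM))
    def forward(self, priv_obs):
        return self.net(priv_obs)

class StudentPolicy(nn.Module):
    """Sees only noisy observations; imitates the teacher and reconstructs
    the privileged state from its internal representation."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(NOISY_DIM, 256), nn.ELU())
        self.action_head = nn.Linear(256, ACT_DIM)
        self.reconstruction_head = nn.Linear(256, PRIV_DIM)
    def forward(self, noisy_obs):
        belief = self.encoder(noisy_obs)
        return self.action_head(belief), self.reconstruction_head(belief)

teacher, student = TeacherPolicy(), StudentPolicy()
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

def distillation_step(priv_obs, noisy_obs):
    with torch.no_grad():
        target_action = teacher(priv_obs)        # teacher provides action labels
    action, reconstruction = student(noisy_obs)
    imitation = ((action - target_action) ** 2).mean()
    recon_loss = 0.5 * ((reconstruction - priv_obs) ** 2).mean()  # weight is an assumption
    loss = imitation + recon_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example update on random stand-in data:
loss = distillation_step(torch.randn(32, PRIV_DIM), torch.randn(32, NOISY_DIM))
```

In this sketch, the reconstruction term pushes the student's internal representation to encode the true environment state even though the student only ever sees noisy observations, which is the property the deployed policy relies on.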

Supplementary

Walking over stairs in different directions

For traversing stairs, other quadrupedal robots typically require a dedicated stair mode to be engaged, and the robot must be properly oriented with respect to the stairs. In contrast, our controller does not require any special mode for stairs and can traverse them natively in any direction and orientation, such as sideways, diagonally, or turning around on the stairway.

Baseline comparison

We have demonstrated the extreme robustness of our controller in the real world, but does exteroceptive input actually help improve locomotion performance? To answer this, we conducted controlled experiments to quantitatively evaluate the contribution of exteroception. We compared our controller to a proprioceptive baseline that does not use exteroception.

Robustness evaluation

To examine how our controller integrates proprioception and exteroception, we conducted a number of controlled experiments and visualized the reconstructed features from the belief state.
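
As a rough picture of what reconstructing features from a belief state can look like, the sketch below shows an assumed recurrent belief encoder that fuses proprioceptive and exteroceptive inputs through a learned gate, plus a decoder head that reconstructs the exteroceptive features for visualization. The GRUCell-plus-gate structure, the module names, and the dimensions are assumptions made for illustration, not the exact architecture reported in the paper.

```python
# Illustrative belief-state module (assumed structure; not the paper's exact network).
import torch
import torch.nn as nn

PROPRIO_DIM, EXTERO_DIM, HIDDEN = 48, 208, 256  # hypothetical sizes

class BeliefEncoder(nn.Module):
    """Fuses proprioception with (possibly noisy) exteroception.
    A learned gate decides how much of the exteroceptive input to pass through,
    so the belief can fall back on proprioception when height samples are unreliable."""
    def __init__(self):
        super().__init__()
        self.gru = nn.GRUCell(PROPRIO_DIM + EXTERO_DIM, HIDDEN)
        self.gate = nn.Sequential(nn.Linear(HIDDEN, EXTERO_DIM), nn.Sigmoid())
        self.extero_decoder = nn.Linear(HIDDEN, EXTERO_DIM)  # reconstructs extero features

    def forward(self, proprio, noisy_extero, hidden):
        hidden = self.gru(torch.cat([proprio, noisy_extero], dim=-1), hidden)
        gated_extero = self.gate(hidden) * noisy_extero   # down-weight untrusted input
        belief = torch.cat([proprio, gated_extero], dim=-1)
        reconstruction = self.extero_decoder(hidden)      # what gets visualized
        return belief, reconstruction, hidden

# Example rollout step with stand-in data:
enc = BeliefEncoder()
h = torch.zeros(1, HIDDEN)
belief, recon, h = enc(torch.randn(1, PROPRIO_DIM), torch.randn(1, EXTERO_DIM), h)
```

The gate is what lets such a controller down-weight unreliable height samples and fall back on proprioception, which is exactly the behaviour these robustness experiments probe.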

 


How robots learn to hike

Steep sections on slippery ground, high steps, scree and forest trails full of roots: the path up the 1,098-metre-high Mount Etzel at the southern end of Lake Zurich is peppered with numerous obstacles. But ANYmal, the quadrupedal robot from the Robotic Systems Lab at ETH Zurich, overcomes the 120 vertical metres effortlessly in a 31-minute hike. That’s 4 minutes faster than the estimated duration for human hikers — and with no falls or missteps.

This is made possible by a new control technology, which researchers at ETH Zurich led by robotics professor Marco Hutter recently presented in the journal Science Robotics. “The robot has learned to combine visual perception of its environment with proprioception — its sense of touch — based on direct leg contact. This allows it to tackle rough terrain faster, more efficiently and, above all, more robustly,” Hutter says. In the future, ANYmal can be used anywhere that is too dangerous for humans or too impassable for other robots.

Perceiving the environment accurately

To navigate difficult terrain, humans and animals quite automatically combine the visual perception of their environment with the proprioception of their legs and hands. This allows them to easily handle slippery or soft ground and move around with confidence, even when visibility is low. Until now, legged robots have been able to do this only to a limited extent.

“The reason is that the information about the immediate environment recorded by laser sensors and cameras is often incomplete and ambiguous,” explains Takahiro Miki, a doctoral student in Hutter’s group and lead author of the study. For example, tall grass, shallow puddles or snow appear as insurmountable obstacles or are partially invisible, even though the robot could actually traverse them. In addition, the robot’s view can be obscured in the field by difficult lighting conditions, dust or fog.

“That’s why robots like ANYmal have to be able to decide for themselves when to trust the visual perception of their environment and move forward briskly, and when it is better to proceed cautiously and with small steps,” Miki says. “And that’s the big challenge.”

A virtual training camp

Thanks to a new controller based on a neural network, the legged robot ANYmal, which was developed by ETH Zurich researchers and commercialized by the ETH spin-off ANYbotics, is now able to combine external and proprioceptive perception for the first time. Before the robot could put its capabilities to the test in the real world, the scientists exposed the system to numerous obstacles and sources of error in a virtual training camp. This let the network learn the ideal way for the robot to overcome obstacles, as well as when it can rely on environmental data — and when it would do better to ignore that data.
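
One way to picture this virtual training camp is as noise injection on the simulated height samples: during training the robot sometimes sees jittered, offset, or entirely bogus terrain readings, so the network must learn when the map can be trusted and when to fall back on its legs. The snippet below is a simplified, assumed noise model in Python; the function name corrupt_height_samples, the noise types, and the magnitudes are placeholders rather than the published training curriculum.

```python
# Simplified sketch of exteroceptive noise injection during simulated training.
# Noise types and magnitudes are illustrative assumptions, not the published curriculum.
import numpy as np

rng = np.random.default_rng(0)

def corrupt_height_samples(heights: np.ndarray, difficulty: float) -> np.ndarray:
    """Return noisy height samples around the robot.

    heights: ground-truth terrain heights sampled from the simulator.
    difficulty: 0..1 factor scaling how aggressive the corruption is.
    """
    noisy = heights + rng.normal(0.0, 0.02 * difficulty, heights.shape)  # per-sample jitter
    noisy += rng.normal(0.0, 0.1 * difficulty)                           # whole-map offset/drift
    # Occasionally replace readings entirely, mimicking occlusion, snow, or soft grass
    # that makes the map useless and forces reliance on proprioception.
    mask = rng.random(heights.shape) < 0.05 * difficulty
    noisy[mask] = rng.uniform(-0.5, 0.5, mask.sum())
    return noisy

# Example: corrupt a 10x10 patch of flat terrain at full difficulty.
clean = np.zeros((10, 10))
noisy = corrupt_height_samples(clean, difficulty=1.0)
```

Scaling the corruption with a difficulty factor mimics a curriculum that gradually exposes the policy to harsher perception failures.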

“With this training, the robot is able to master the most difficult natural terrain without having seen it before,” says ETH Zurich Professor Hutter. This works even if the sensor data on the immediate environment is ambiguous or vague. ANYmal then plays it safe and relies on its proprioception. According to Hutter, this allows the robot to combine the best of both worlds: the speed and efficiency of external sensing and the safety of proprioceptive sensing.

Use under extreme conditions

Whether after an earthquake, after a nuclear disaster, or during a forest fire, robots like ANYmal can be used primarily wherever it is too dangerous for humans and where other robots cannot cope with the difficult terrain.

What are 5 different types of robots?

Generally, there are five types of robots:
1) Pre-Programmed Robots.
2) Humanoid Robots.
3) Autonomous Robots.
4) Teleoperated Robots.
5) Augmenting Robots.

Who invented the first robot?

The earliest robots as we know them were created in the early 1950s by George C. Devol, an inventor from Louisville, Kentucky. He invented and patented a reprogrammable manipulator called “Unimate,” from “Universal Automation.” For the next decade, he attempted to sell his product in the industry, but did not succeed.

What are robots useful for?

Robotics is the design, construction, and use of machines (robots) to perform tasks traditionally done by human beings. Robots are widely used in industries such as automobile manufacturing to perform simple repetitive tasks, and in industries where work must be performed in environments hazardous to humans.

Why do we need robots?

Most robots today are used to do repetitive actions or jobs considered too dangerous for humans. … Robots are now used in medicine, for military tactics, for finding objects underwater, and to explore other planets. Robotic technology has also helped people who have lost arms or legs. Robots are a great tool to help mankind.
