<?xml version="1.0" encoding="utf-8" standalone="no"?><rss xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://search.yahoo.com/mrss/" version="2.0"><channel><title>Automaton</title><link>https://spectrum.ieee.org/</link><description>IEEE Spectrum's robotics blog</description><atom:link href="https://spectrum.ieee.org/feeds/topic/robotics.rss" rel="self"/><language>en-us</language><lastBuildDate>Wed, 13 May 2026 13:02:50 -0000</lastBuildDate><image><url>https://spectrum.ieee.org/media-library/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNjg4NDUyMC9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTgyNjE0MzQzOX0.N7fHdky-KEYicEarB5Y-YGrry7baoW61oxUszI23GV4/image.png?width=210</url><link>https://spectrum.ieee.org/</link><title>IEEE Spectrum</title></image><item><title>Hello Robot Sets the Standard for Practical, Safe Home Robots</title><link>https://spectrum.ieee.org/stretch-4-home-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/tall-wheeled-home-robot-with-an-extended-arm-in-a-modern-living-room-near-a-potted-cactus.jpg?id=66719760&width=1245&height=700&coordinates=0%2C187%2C0%2C187"/><br/><br/><p>Many roboticists (and at least one robotics journalist) have been seduced by the dream of a robot butler. And the rampant popularity of videos showing <a href="https://www.youtube.com/watch?v=CAdTjePDBfc" rel="noopener noreferrer" target="_blank">humanoid robots doing household tasks</a> in improbably clean kitchens and unrealistically tidy bedrooms suggests that we’re not the only ones interested in a robot that can do our chores. But <a href="https://spectrum.ieee.org/humanoid-robot-scaling" target="_self">for all kinds of reasons</a>, legged humanoids are not yet ready for industrial or commercial applications at scale, and home applications (<a href="https://spectrum.ieee.org/home-humanoid-robots-survey" target="_self">if people even <em><em>want</em></em> them</a>), I would argue, are even farther away. Even so, ludicrously well-funded humanoid robotics companies are now <a href="https://www.1x.tech/manufacturing" rel="noopener noreferrer" target="_blank">ramping production</a> while explicitly promising that their robots will be doing ‘<a href="https://www.figure.ai/news/ramping-figure-03-production" rel="noopener noreferrer" target="_blank">housework</a>.’</p><p>So what about that robot butler dream, then? It still exists! All you have to do is forget about legs, arms, hands, faces, and focus on what really matters: mobility and manipulation. This is what <a href="https://spectrum.ieee.org/hello-robots-stretch-mobile-manipulator" target="_self">Hello Robot’s</a> <a href="https://spectrum.ieee.org/stretch-assistive-robot" target="_self">Stretch robot</a> is unapologetically all about, and the <a href="https://spectrum.ieee.org/hello-robot-stretch-3" target="_self">newest version</a> being announced today, Stretch 4, is closer than ever to a robot that could safely do practical work in the home at an accessible cost.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="c78e812c3287c07fcc7c2ab2bf8279de" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uyHa-Gk4THw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">Hello Robot says Stretch 4 is “built for the real world.”</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>“With Stretch 4, we wanted to make the transition from a research platform to something that is truly deployable,” explains <a href="https://www.linkedin.com/in/aaron-edsinger/" rel="noopener noreferrer" target="_blank">Aaron Edsinger</a>, Hello Robot co-founder and CEO. This version, while ready for research and enterprise customers now, is designed for pilot deployments to help Hello Robot understand how to scale in the home. “This has been our most difficult design process,” adds co-founder and CTO <a href="https://www.linkedin.com/in/charlie-kemp/" rel="noopener noreferrer" target="_blank">Charlie Kemp</a>. “We had a lot of fear of ‘second-system syndrome,’ where you add all the features you didn’t get to initially and end up with a monstrosity. 
But since we founded the company on making simple, minimalist robots, every time we added complexity it was an emotional challenge. Navigating that fear resulted in a nice compromise that sits in a great spot, rather than being a maximalist humanoid.”</p><h2>Stretch 4 Upgrades</h2><p>The biggest change from the previous version of Stretch is the addition of an omnidirectional base, meaning that the robot can translate in any direction without having to turn first. This makes it much easier to control (especially for novice users), but omnidirectional bases are significantly more complicated to design and build. What ultimately made it possible for Stretch were new types of omnidirectional wheels developed for powered wheelchairs, along with a solid six months of focused development by Hello Robot.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Close-up of a white robotic head with cameras, sensors, and glowing blue lights." class="rm-shortcode" data-rm-shortcode-id="a337347c7b7553dc4c62836ae58ff620" data-rm-shortcode-name="rebelmouse-image" id="67a32" loading="lazy" src="https://spectrum.ieee.org/media-library/close-up-of-a-white-robotic-head-with-cameras-sensors-and-glowing-blue-lights.jpg?id=66719735&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">A redesigned sensorized head gives Stretch more options for teleoperation and autonomy.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>Stretch 4 also ditches the <a href="https://hello-robot.com/stretch-3-whats-new" target="_blank">cute little pan-tilt head</a> for a more complex sensor suite with a much wider field of view. “We started out wanting to use lots of cheap cameras to keep costs low, like Tesla does,” Edsinger tells us. “But we ended up with an approach closer to Waymo’s: the richer and more reliable your data, the safer and more intelligent the robot can be.” There are a pair of hemispherical lidars, <a href="https://www.luxonis.com/" target="_blank">Luxonis</a> cameras for vision and navigation, and a wrist-mounted depth camera for manipulation. The robot’s primary system runs on an Intel NUC 15, plus an Nvidia Jetson Orin NX for researchers to play with for visual processing or AI.</p><h2>Philosophy on Autonomy</h2><p>Hello Robot’s general philosophy on autonomy is to have a human in the loop, but that can take many different forms ranging from direct control to purely supervisory control. The robot will ship with a baseline of autonomous capabilities that include mapping, navigation, and self-charging, along with demo-ready features like autonomous grasping. But unlike most other robotics companies, Hello Robot isn’t looking to use their hardware to collect a stupendous amount of data in the concerningly vague hope that commercially viable autonomy will follow. </p><p>“Stretch has huge advantages in safety, cost, and capability,” Kemp says. “I’d much rather be the platform that foundation model developers target.” Edsinger agrees: “We do want to partner with foundation model companies to explore things like dexterous in-home manipulation, but we aren’t the ones to build those foundation models.”</p><h2>In-Home Pilots</h2><p>While earlier versions of Stretch were primarily for research, Kemp tells us that Stretch 4 has been explicitly designed to be piloted in the homes of people with severe mobility impairments. 
Hello Robot will be happy to sell you one (or lots, I’m guessing) for commercial or industrial applications, but the broader goal with Stretch 4 is to use remote testing and in-home evaluations to work towards a robot that’s useful and reliable enough that it can provide consistent daily value for disabled users.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A series of 5 images of the robot shows its arm at different heights and extended lengths." class="rm-shortcode" data-rm-shortcode-id="5670ef548d8bbce284a871baf088ddc7" data-rm-shortcode-name="rebelmouse-image" id="45d3d" loading="lazy" src="https://spectrum.ieee.org/media-library/a-series-of-5-images-of-the-robot-show-it-s-arm-at-different-heights-and-extended-lengths.jpg?id=66719740&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">A holonomic base and an extendable arm make for a capable robot without the complexity.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>Part of why I’m optimistic about Stretch finding near-term success in this role is precisely <em>because</em> it’s not a humanoid. One of the primary arguments for humanoids is that they’re worth pursuing because they can better operate in environments designed for humans, where legs and five-fingered hands are tangible advantages. But those very same environments often exclude an entire subset of humanity—a subset of humanity that we will all likely join at some point, because the best that any of us can ever say is that we are not disabled <em>yet</em>.</p><h2>Why Not Humanoids?</h2><p>A key partner for Hello Robot throughout the Stretch development process has been <a href="https://spectrum.ieee.org/stretch-assistive-robot" target="_self">Henry Evans</a>. Evans is paralyzed and cannot speak, although he can use a computer (for controlling robots, among other things) and type at about 15 words per minute. I spoke with Evans about his thoughts on the idea of a humanoid assistive robot, compared to a robot like Stretch. “The question is: What benefit does a bipedal robot offer to a person who can’t walk?” Evans asks. “Their entire environment has been modified to accommodate wheeled conveyances. Automobiles don’t have legs, and neither should home robots. Wheels are cheap, stable, precise, require very few controls, and don’t have to be invented.”</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A man lies in bed looking up at a robotic hand." class="rm-shortcode" data-rm-shortcode-id="ccf3a458f19daf1eec03f02da9576826" data-rm-shortcode-name="rebelmouse-image" id="9e077" loading="lazy" src="https://spectrum.ieee.org/media-library/a-man-lies-in-bed-looking-up-at-a-robotic-hand.jpg?id=66719738&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Henry Evans has been testing a Stretch 4 as a home assistive robot.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>Evans also points out that humanoids can require the simultaneous control of dozens of degrees of freedom. “A paralyzed person who can’t talk (like yours truly) can control maybe one or two joints at a time with today’s control mechanisms, if they are lucky.” Evans believes that AI, along with brain-computer interfaces (BCIs), shows promise for dramatically increasing what he can do when it comes to motion.
“Remember, though, a paralyzed person has no movements to mimic, so until a perfectly tuned BCI gets here and facilitates a true humanoid body surrogate, I don’t think it will work. And even then, I don’t see the advantage of legs for assistive care robots. I am willing to be proven wrong, though, and will test-drive almost anything once, so bring it on!”</p><p>Kemp and Edsinger, who have many decades of humanoid experience between them, feel similarly. “There are applications where the human form is fundamental,” Kemp says. “But for many applications, the value of the human form is unclear or even problematic. Jumping to the conclusion that robots must be humanoid means missing opportunities to take advantage of the structured indoor environments that we’ve already created.”</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="2466daf5440094ed584445540285be84" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zk2C3KJeuto?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">Georgena Moran and her sisters tested Stretch 4 at the California Academy of Sciences Museum, allowing her to interact with the exhibits from home.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Hello Robot</small></p><p>And of course there’s the question of safety, which Evans brings up. “My caregivers and I have been testing robots in my home to assist us for about 15 years, and the very first concerns are: Where is the emergency stop, and how do you activate it? It gets used surprisingly often. The thing is, when a wheeled robot gets emergency stopped, it freezes in place. When a bipedal robot gets run-stopped, it collapses on anything under it, including the patient.” Kemp agrees. “The safety aspect of humanoids in a home freaks me out. I don’t know how someone can confidently think about safety with a humanoid in a home.”</p><h2>Robots for Sale</h2><p>However you feel about humanoids, here’s one more reason why Stretch feels like a much more realistic solution for in-home assistive robots right now: You can actually buy one, and at US $29,950, it’s very affordable, <a href="https://robotsguide.com/robots/tiago" target="_blank">as mobile manipulators go</a>. Edsinger and Kemp are planning to leverage in-home Stretch 4 pilot deployments to make the <em><em>next</em></em> version of Stretch the one that can be commercially sold for home assistance. At the rate that Hello Robot has been releasing new hardware, that could easily be within the next year or so—and my guess is that Stretch 5 is very likely to be the first practical, affordable assistive robot for home use. 
It may not look like Rosie, but it promises to be safe, and it works.</p>]]></description><pubDate>Tue, 12 May 2026 15:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/stretch-4-home-robot</guid><category>Hello-robot</category><category>Home-robots</category><category>Humanoid-robots</category><category>Mobile-manipulator</category><category>Mobility-impaired</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/tall-wheeled-home-robot-with-an-extended-arm-in-a-modern-living-room-near-a-potted-cactus.jpg?id=66719760&amp;width=980"/></item><item><title>Video Friday: AI Gives Robot Hands Humanlike Dexterity</title><link>https://spectrum.ieee.org/video-friday-robotic-hand-dexterity</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robot-hand-grips-a-blender-pitcher-to-pour-a-pink-smoothie-into-a-cup-held-in-another-robot-hand.png?id=66709264&width=1245&height=700&coordinates=0%2C62%2C0%2C63"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://roboticsconference.org/">RSS 2026</a>: 13–17 July 2026, SYDNEY</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><h5><a href="https://actuate.foxglove.dev/">Actuate 2026</a>: 18–19 August 2026, SAN FRANCISCO</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="6k_bgh54lti"><em>Introducing GENE-26.5—the first AI brain to give robots human-level physical manipulation capabilities. Cooking a full meal. Cracking an egg one-handed. Conducting lab experiments. Wire harnessing. Even playing the piano. Tasks that were impossible for robots. Until now.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a8e4ae208b291c232e100dfd59cecf1e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6K_bGH54ltI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.genesis.ai/">Genesis AI</a> ] via [ <a href="https://techcrunch.com/2026/05/06/khosla-backed-robotics-startup-genesis-ai-has-gone-full-stack-demo-shows/">TechCrunch</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ve6zyrgxqzw"><em>This is Labububot—one of the rarest monsters on Earth. Twelve Labubu heads are reconstituted into a single spherical form: a Frankenstein’s Monster of pop culture iconography. 
Labububot is a playful critique of <a data-linked-post="2655919083" href="https://spectrum.ieee.org/social-robots-children" target="_blank">social robots</a>, and a question made physical—what do the monsters we make reveal about the monsters we are?</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="35d39988b0c0e318a7d00c00f87a0274" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ve6ZYrgxqZw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.media.mit.edu/projects/labububot/overview/">MIT Media Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="0yzjvaefq5w"><em>Watch Spot crouch, jump, climb boxes, and leap across gaps, controlled by a neural network trained with reinforcement learning (RL) and multi-expert distillation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="49a459734245d6d4b35ca0a4453b58c4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0YZjvAEFQ5w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rai-inst.com/">Robotics and AI Institute</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xjkgyr8l7ea">Good, now there is a robot that can take over exercise for me.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="acc80fe4b452b6a677874565a473916e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XJKgYR8L7eA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://gotokepler.com">Kepler</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gfd_k30syms"><em>Additive manufacturing has become an enabling technology, but existing techniques are not capable of directly <a data-linked-post="2675666255" href="https://spectrum.ieee.org/3d-printed-linear-motor" target="_blank">3D printing</a> high-current electromagnetic actuators due to material and design limitations. In this work, a novel 3D-printable, multilayer, wave-winding topology is created for high-efficiency electric motors.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bd82967ac82276c8212f45f6982de73c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gFD_k30SYms?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.gatech.edu/chen-mazumdar/">Sensing Technologies Laboratory</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="abjntvayt9g"><em>NASA is pushing the limits of <a data-linked-post="2650251618" href="https://spectrum.ieee.org/my-10-favorite-mars-novels" target="_blank">flight on Mars</a>—by spinning helicopter rotor blades so fast, they’re breaking the sound barrier. 
During recent tests at NASA’s Jet Propulsion Laboratory, engineers accelerated the tips of next-generation rotor blades beyond Mach 1 inside a special chamber that simulates the atmospheric conditions of the Red Planet.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="444a49b45e34e772b665a1f6b1b6c4bb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/aBJNtvAyt9g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.jpl.nasa.gov/news/nasa-pushes-next-gen-mars-helicopter-rotor-blades-past-mach-1/">NASA Jet Propulsion Laboratory</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="uohfghlhrkg"><em>Balancing commercial goals and robotics research can be tricky, but with Atlas, we’re making it work.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="180b190ec971c22d723823a3d05de3f9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UoHfGhLHRkg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="2tsjxsuixb4">Open Duck Mini is an open-source version of Disney’s BDX droids, and you can play with it in your browser.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="43791706fc31151bfffa1aafaf3d2a64" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2tsJxsuiXB4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mertcookimg.github.io/Open_Duck_Mini_Viewer/">Open Duck Mini Viewer</a> ]</p><p>Thanks, Masato!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="0_ad8sdj1gc"><em>Automated inspection of steel structures using magnetic climbing robots can reduce costs and improve safety, but many such structures feature interior corners that are challenging for wheeled or tracked robots to traverse. 
We present the first magnetic-wheeled robot to use X-ray fluorescence for steel structure inspection, Sally, capable of overcoming all interior corner transition types, traversing small obstacles, and maneuvering in tight spaces.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e06f5d0f9676f6a30ded533c3dd41350" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0_AD8SDj1gc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cmu.edu/me/robomechanicslab/">Robomechanics Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="4djthku2kjo">I don’t know what this is, but it’s coming soon from SwitchBot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0bd24132d67958351833a63c385e92be" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4dJthkU2kjo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://us.switch-bot.com/pages/katafriends">SwitchBot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="jzwuctc2sou">You probably know the answers to these questions already, but this ELI5 from Aaron Ames is still fun.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="54476bc0d07f1438817842d782b75ff7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jZwuCtc2SoU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/@WIRED">Wired</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="3y8aq_ofevs"><em>Jim Fan, who leads the embodied autonomous research group at Nvidia, returns to AI Ascent to argue that robotics is entering its endgame—and that the playbook is already written.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cad5aef0dfe85d2403c36d2d64162a9b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3Y8aq_ofEVs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/@sequoiacapital">Sequoia</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Sat, 09 May 2026 16:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robotic-hand-dexterity</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Manipulation</category><category>Robot-videos</category><category>Autonomous-robots</category><category>Quadruped-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robot-hand-grips-a-blender-pitcher-to-pour-a-pink-smoothie-into-a-cup-held-in-another-robot-hand.png?id=66709264&amp;width=980"/></item><item><title>iRobot Founder Wants to Put a Robotic Familiar Into Your 
Home</title><link>https://spectrum.ieee.org/familiar-machines-and-magic</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-gif-shows-a-short-clip-of-a-teenager-sitting-with-and-then-hugging-a-torso-sized-animal-like-robot.gif?id=66675837&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>Two years ago, <a href="https://spectrum.ieee.org/irobot-amazon" target="_self">Colin Angle stepped down as CEO of iRobot</a>, <a href="https://spectrum.ieee.org/irobot-bankruptcy-colin-angle-amazon" target="_self">the company that he cofounded</a> and the most successful home robot company the world has ever seen. Angle almost immediately founded a stealthy new “physical AI” company called <a href="https://www.familiarmachines.com/" rel="noopener noreferrer" target="_blank">Familiar Machines & Magic</a> (FM&M), which in short order managed to attract a combination of exceptionally talented robotics folks, including <a href="https://spectrum.ieee.org/u/morgan-pope" target="_self">Morgan Pope from Disney Research</a>, which got us very curious.</p><p>Today, Familiar Machines & Magic is announcing its first robot, a “physically embodied AI system designed to perceive, adapt, and interact with people in ways that feel natural and consistent,” the press release says. This robot is not a toy, and it’s not specifically for kids. Rather, it’s for adults to purchase for themselves and their families. It will get to know you, seek you out for attention, and actively help you positively pursue an idealized routine in your life.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Gif shows a short clip of a cute white bear like robot looking around a doorframe and nodding." class="rm-shortcode" data-rm-shortcode-id="5113f3353932d28e5e93351b4c826ea7" data-rm-shortcode-name="rebelmouse-image" id="bc585" loading="lazy" src="https://spectrum.ieee.org/media-library/gif-shows-a-short-clip-of-a-cute-white-bear-like-robot-looking-around-a-doorframe-and-nodding.gif?id=66675850&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Intended for adults, Familiar is pet-like in that it will seek you out for attention.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Familiar Machines & Magic</small></p> <p><span>Here are the (limited) technical details from the press release:</span></p><p><em><em>The first Familiar is a quadruped, specifically designed for human-robot interaction, with 23 degrees of freedom enabling both lifelike movement and expressive behaviors. The Familiar is covered with a custom touch-sensitive coat, a vision system, and a microphone array and audio system, to support rich interactions. Its onboard edge AI stack is powered by a custom small multimodal model optimized for social reasoning, combining vision, audio, language, and memory to create socially responsive behaviors in real time.</em></em></p><p>FM&M <a href="https://www.familiarmachines.com/" target="_blank">CEO and cofounder Colin Angle</a> tells us that this first prototype Familiar is designed to look like a sort of highly abstracted bear. 
It’s very deliberately nothing like a dog or a cat, following the successful strategy of other social robots like <a href="https://spectrum.ieee.org/paro-the-robotic-seal-could-diminish-dementia" target="_self">Paro</a> and <a href="https://spectrum.ieee.org/new-pleo-robotic-dinosaur-much-more-advanced-than-original" target="_self">Pleo</a>—if you can’t connect the form factor to an animal that you have direct experience with, you won’t bring expectations to your interactions with the robot.</p><h3>What Does It Do?</h3><p>“Our goal is to position this as a robot familiar that lives with you and helps reinforce healthy routines,” Angle says. He explains that thinking of a Familiar as a pet is a strong analogy, but one that also undersells what the robot can do. The Familiar behaves a little more like a service animal, in the narrow sense of being able to recognize activities and intervene to motivate you to do more or less of them, as the case may be. One easy example is screen time—the Familiar can note how much time you spend on your phone, and if it’s too much, it can actively try to engage you in other activities, including taking it for a walk outside. “The idea,” says Angle, “is that you can have a bit of technology in your home which is hyperloyal to you, gets to know you, helps you figure out an idealized routine, and then plays a positive role.”</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="A man reaches out to touch a white robot while lying on the couch looking at his phone." class="rm-shortcode" data-rm-shortcode-id="16e94ab841236e2af33c4f467f0e9beb" data-rm-shortcode-name="rebelmouse-image" id="97de4" loading="lazy" src="https://spectrum.ieee.org/media-library/a-man-reaches-out-to-touch-a-white-robot-while-lying-on-the-couch-looking-at-his-phone.jpg?id=66675852&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Spending too much time on your phone? Familiar can help with that.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Familiar Machines & Magic</small></p><p>Cramming this amount of intelligence into a robot that you can take for a walk outside (at regular human walking pace) is extremely ambitious. I asked FM&M’s creative director <a href="https://www.linkedin.com/in/morganthomaspope/" target="_blank">Morgan Pope</a> what made him feel that a robot like a Familiar was possible, with enough confidence that he was willing to leave Disney Research to join the startup. “Two recent advancements made it feel tractable,” Pope says. “First, seeing <a href="https://spectrum.ieee.org/disney-robot" target="_self">Disney’s bipedal robots walk flexibly over various terrain</a> using reinforcement learning proved you can execute dynamic motion without needing perfect, zero-backlash actuators or crazy expensive hardware. And second, while I am often skeptical of generative AI hype, it is a perfect fit here because it excels at creating the plausible assumption of intelligence, which helps the character feel coherent and lifelike.”</p><h3>The Challenge of Social Home Robots</h3><p>As a social home robot, the Familiar will have quite a lot of work to do to single-pawedly reestablish a category that burned itself out between 2012 and 2019.
A series of high-profile and very well-funded startups, including <a href="https://spectrum.ieee.org/consumer-robotics-company-anki-abruptly-shuts-down" target="_self">Anki</a>, <a href="https://spectrum.ieee.org/mayfield-robotics-cancels-kuri-social-home-robot" target="_self">Mayfield</a>, and <a href="https://spectrum.ieee.org/jibo-is-probably-totally-dead-now" target="_self">Jibo</a>, were not able to sustain social home robots as a business, <a href="https://spectrum.ieee.org/anki-jibo-and-kuri-what-we-can-learn-from-social-robotics-failures" target="_self">primarily because</a> of a struggle with longer-term engagement. It’s not enough for a robot to be cute and charming in the short term; it has to continue enthralling its users or at least providing value after the initial novelty has worn off. In other words, a flashy demo is arguably counterproductive, which is a real problem, since robots excel at flashy demos.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Animated gif shows a woman doing yoga while a soft-looking animal-like white robot imitates her pose." class="rm-shortcode" data-rm-shortcode-id="f8819e686639af106e4245421a8a7456" data-rm-shortcode-name="rebelmouse-image" id="45c39" loading="lazy" src="https://spectrum.ieee.org/media-library/animated-gif-shows-a-woman-doing-yoga-while-a-soft-looking-animal-like-white-robot-imitates-her-pose.gif?id=66675858&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Part of the value of Familiar is that it will help you establish healthy routines.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Familiar Machines & Magic</small></p><p>“It’s about creating the right expectation and delivering on that expectation,” says Angle. “Familiars live in your world and play by your rules, and if you don’t find yourself hanging out with it, petting it, and engaging with it, then we haven’t succeeded.”</p><p>In what is very much not a coincidence, the term <em>familiar</em> really is the best way of thinking about this robot—a sort of vaguely magical nonhuman entity that has some amount of independence but whose existence and motivation are fundamentally tied to its human. “This isn’t trying to be a replacement for a real friend,” Angle explains. “It’s artificial life that lives in your world, has its own personality and goals, and has a special link to its guardian where it wants attention and wants its guardian to be active.”</p><h3>Creating Long-Term Value</h3><p>This philosophy is a key differentiator for FM&M. A Familiar is more than a companion; it has long-term objectives that it’s trying to fulfill to improve your life in a targeted way, says Angle. It’ll attempt to connect with you socially to encourage you to spend time with it in service of those goals, but the goals are the end, er, goals, rather than just the social connection itself, which was the primary draw of the previous generation of social robots. “Within a few days of bringing your Familiar home,” Angle tells us, “it’s figured out what its role in your life is. It’s trying to reinforce a healthy routine, whether that be summoning people to dinner or cuddling up while you watch TV, or greeting you when you get home.
And then the way you sustain that relationship is by having it evolve, with both characters playing an active role—you’re also helping it with the things required to keep a robot operating.”</p><h3>Human-Familiar Interaction</h3><p>The temptation to leverage recent advances in AI to make a robot like a Familiar talk, especially in the context of regularly interacting with humans in pursuit of specific goals, must have been overwhelming. But to its credit, FM&M managed to resist. “I don’t believe that the technology exists today for AI to talk to humans in a safe, responsible fashion,” Angle explains. Consequently, a Familiar does not currently speak, although it does make sounds, and has plenty of other ways of communicating. “Through careful design, you’d be amazed what you can powerfully convey using a tail, wiggly ears, blinking eyes, and a brow that can be happy, sad, angry, or annoyed,” Angle says. This will likely resonate strongly with dog owners, somewhat less strongly with cat owners, and only very slightly with reptile owners like me.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A white animal-like soft-looking robot poses next to a golden retriever." class="rm-shortcode" data-rm-shortcode-id="ab3f471e93e12344eb07005a12656679" data-rm-shortcode-name="rebelmouse-image" id="68a49" loading="lazy" src="https://spectrum.ieee.org/media-library/a-white-animal-like-soft-looking-robot-poses-next-to-a-golden-retriever.jpg?id=66675856&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Familiar is capable enough to keep up with you on walks outdoors.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Familiar Machines & Magic</small></p><p>Going the other direction is more complicated. Those same recent advances in AI mean that a Familiar can very likely understand everything you say and obey you perfectly, if it chooses to. But doing so would break the illusion that the robot has its own desires and goals and personality, so FM&M had to be careful. “The way we’ve trained it from an AI perspective is really cool,” Angle explains. “We’re using a tableau of speech and vision inputs presented to a small multimodal model trained on stories, and for a given tableau of inputs, it goes through a generative process to decide at a high level what it is going to do. That decision is handed to a behavior engine which builds out those behavior trees into goals and drives a reinforcement learning unified motion model. There is nothing fully deterministic about your Familiar’s behavior; it truly tries to live its life with a variety of personality-driven emotions.”</p><h3>Safety at Home</h3><p>A Familiar is not big, as robots go, but it’s not exactly small, either. And as something with legs, there’s always a concern about what happens if it falls over. “Its low center of gravity helps immensely,” says Pope. “If we pull power, it collapses downward safely rather than tipping over. Furthermore, it is wrapped in soft rubber, fur, and padding, so even if a leg impacts you, it won’t have a lot of force behind it.” Interestingly, FM&M is also leveraging the “character experience” to mitigate risks to both robot and user. “We can use emotions to communicate hazards effectively,” explains Pope.
“For example, if someone carries it somewhere high or puts it near an open flame, the Familiar can act visibly scared to directly communicate that it doesn’t like the situation.”</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A young child reads a book while the white soft robot looks on." class="rm-shortcode" data-rm-shortcode-id="d57550d63bfc728a4f056d826b80c96f" data-rm-shortcode-name="rebelmouse-image" id="915a2" loading="lazy" src="https://spectrum.ieee.org/media-library/a-young-child-reads-a-book-while-the-white-soft-robot-looks-on.jpg?id=66675878&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">While not a toy or specifically intended for children, Familiar can provide gentle, warm attention to your family.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Familiar Machines & Magic</small></p><p>Besides physical safety, the makers of social robots must also consider emotional safety. The better job you do emotionally connecting with people, the more responsibility you have to make sure that those connections are positive. “We take this very seriously,” Pope tells us. “We must follow a ‘do no harm’ philosophy, ensuring we don’t trigger unhealthy dependency or monopolize people’s attention the way a phone does. We are designing carefully to ensure the overall impact remains positive and never crosses the line into harm.” Additionally, the Familiar’s AI runs onboard the robot, and the robot does not stream private data to the cloud. It will, in fact, run just fine if you disconnect it from the internet entirely, although you’ll lose access to any new features that come out.</p><h3>Managing Expectations</h3><p>Alongside the many engineering and human-robot interaction (HRI) challenges that FM&M is having to manage is one other challenge that, in the near term, sounds rather dull but may be the most difficult: marketing. The company obviously has to promote this robot, but there’s a real danger (which has had <a href="https://spectrum.ieee.org/anki-jibo-and-kuri-what-we-can-learn-from-social-robotics-failures" target="_self">dire consequences for many robotics companies in the past</a>) of selling an idea of what the robot <em>could be</em> rather than the reality of what the robot <em>actually is</em>.</p><p>From my conversations with Pope, FM&M seems to understand that robots have always been the most successful when the experience or task is incidental to the robot itself—in other words, what’s most compelling is what the robot <em>will do</em>, rather than the fact that it’s a robot. “The best way to understand a Familiar is that we are not building a robot; we are building a relationship,” Pope explains.</p><p>Whether in the context of locomotion or relationships, we can be absolutely certain that a robot of this level of sophistication is not going to do what it’s supposed to every single time. Fortunately, the folks at FM&M have been building robots for long enough that they’re prepared for this. “We’ve explicitly tried to design it to motivate forgiveness,” Angle tells us. “This is not a precise robotic entity in its motion or dexterity. It’s supposed to be imperfect, but it’s going to get some of it right.
By actively working to manage expectations to a place we can achieve, we want consumers to appreciate what it can do.”</p><p>What customers expect, what they appreciate, and how much forgiveness they’re willing to bestow are, for better or worse, highly dependent on how much a Familiar will cost. “For the cost of ownership of something like a pet, you’re getting something that can help you live a healthier life, feel attended to, and provide social benefit,” Angle says. This could mean many things, depending on the pet, but <a href="https://www.rover.com/blog/cost-of-dog-parenthood/#h-how-much-does-a-dog-cost-per-year-nbsp" target="_blank">one source</a> puts the low end of the cost for a cat at around $65 per month, with a dog somewhat more expensive at closer to $100 per month. FM&M’s press release stresses that today’s announcement ‘is not a commercial product launch,’ and specific pricing and a timeline will come later.</p><h3>A Future Platform</h3><p>While it’s much too early for us to be speculating about what the future might hold for FM&M’s robots, Angle is of course already thinking about other places where Familiars might be at home. “This first robot is meant to be a platform with general appeal and an opportunity to specialize into things like elder care and parental support,” Angle says. “From the ground up we are designing machines focused on human connection, and the underlying technology can further generalize into other form factors.”</p><p>This will require the Familiar to find success, and it’s important to reiterate how much of a challenge this will be. A legged robot, designed for human interaction, in the home—everything about what FM&M is doing is hard. Because of his experience launching and leading iRobot, Angle is one of the very few people equipped to really understand this, but his excitement and optimism about the Familiar are undiminished. “Do we know exactly how it’s going to land? I don’t,” says Angle. “But do I think it’s going to work? Absolutely. We’re going to find out, with a mission and goals that are noble at heart.”</p>]]></description><pubDate>Mon, 04 May 2026 17:30:02 +0000</pubDate><guid>https://spectrum.ieee.org/familiar-machines-and-magic</guid><category>Irobot</category><category>Social-robots</category><category>Colin-angle</category><category>Robot-animals</category><category>Home-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/a-gif-shows-a-short-clip-of-a-teenager-sitting-with-and-then-hugging-a-torso-sized-animal-like-robot.gif?id=66675837&amp;width=980"/></item><item><title>DAIMON Robotics Wants to Give Robot Hands a Sense of Touch</title><link>https://spectrum.ieee.org/daimon-robotics-physical-ai</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/man-wearing-glasses-and-a-gray-shirt-smiles-at-camera-while-surrounded-by-futuristic-robots-and-tech-devices-in-a-photo-illustra.jpg?id=66444415&width=1245&height=700&coordinates=0%2C83%2C0%2C83"/><br/><br/><p><em>This article is brought to you by <a href="https://www.dmrobot.com/" rel="noopener noreferrer" target="_blank">DAIMON Robotics</a>.</em></p><p>This April, Hong Kong-based <a href="https://www.dmrobot.com/" target="_blank">DAIMON Robotics</a> has released <a href="https://modelscope.cn/datasets/daimonrobotics/Daimon-Infinity" target="_blank">Daimon-Infinity</a>, which it describes as the largest omni-modal robotic dataset for physical AI, featuring high resolution tactile sensing and spanning a wide range of tasks from folding laundry at home to manufacturing on factory assembly lines. The project is supported by collaborative efforts of partners across China and the globe, including Google DeepMind, Northwestern University, and the National University of Singapore.</p><p>The move signals a key strategic initiative for DAIMON, a two-and-a-half-year-old company known for its advanced tactile sensor hardware, most notably a monochromatic, vision-based tactile sensor that packs over 110,000 effective sensing units into a fingertip-sized module. Drawing on its high-resolution tactile sensing technology and a distributed out-of-lab collection network capable of generating millions of hours of data annually, DAIMON is building large-scale robot manipulation datasets that include vast amounts of tactile sensing data. To accelerate the real-world deployment of embodied AI, the company has also open-sourced 10,000 hours of its data.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Person in navy suit and blue striped tie against a blue studio backdrop" class="rm-shortcode" data-rm-shortcode-id="8cece378ab4c77c48b623176c4b987f1" data-rm-shortcode-name="rebelmouse-image" id="75715" loading="lazy" src="https://spectrum.ieee.org/media-library/person-in-navy-suit-and-blue-striped-tie-against-a-blue-studio-backdrop.jpg?id=66443402&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Prof. Michael Yu Wang, co-founder and chief scientist at DAIMON Robotics, has pioneered Vision-Tactile-Language-Action (VTLA) architecture, elevating the tactile to a modality on par with vision.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DAIMON Robotics</small></p><p>Behind the strategy is Prof. Michael Yu Wang, DAIMON’s co-founder and chief scientist. Prof. Wang earned his PhD at Carnegie Mellon — studying manipulation under <a href="https://mtmason.com/" target="_blank">Matt Mason</a> — and went on to found the Robotics Institute at the Hong Kong University of Science and Technology. An IEEE Fellow and former Editor-in-Chief of <em>IEEE Transactions on Automation Science and Engineering</em>, he has spent roughly four decades in the field. His objective is to address the missing “insensitivity” of robot manipulation, which practically relies on the dominant Vision-Language-Action (VLA) model. He and his team have pioneered Vision-Tactile-Language-Action (VTLA) architecture, elevating the tactile to a modality on par with vision.</p><p>We spoke with Prof. 
Wang about how tactile feedback aims to change dexterous manipulation, how the dataset initiative is foreseen to improve our understanding of robotic hands in natural environments, and where — from hotels to convenience stores in China — he sees touch-enabled robots making their first real-world inroads.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="aefd06e65c87457b36383efcb6824f8b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ui2Wby0Rty4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span><small class="image-media media-caption" placeholder="Add Photo Caption...">Daimon-Infinity is the world’s largest omni-modal dataset for Physical AI, featuring million-hour scale multimodal data, ultra-high-res tactile feedback, data from 80+ real scenarios and 2,000+ human skills, and more.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DAIMON Robotics</small></p><h2>The Dataset Initiative</h2><p><strong>This </strong><strong>month, DAIMON Robotics </strong><strong>release</strong><strong>d the <a href="https://modelscope.cn/datasets/daimonrobotics/Daimon-Infinity" target="_blank">largest and most comprehensive robotic manipulation dataset</a> with multiple leading academic institutions and enterprises. Why releas</strong><strong>ing the dataset now, rather than continuing to focus on product</strong><strong> development? What impact will this have on the embodied intelligence industry?</strong></p><p>DAIMON Robotics has been around for almost two and a half years. We have been committed to developing high-resolution, multimodal tactile sensing devices to perceive the interaction between a robot’s hand (particularly its fingertips) and objects. Our devices have become quite robust. They are now accepted and used by a large segment of users, including academic and research institutes as well as leading humanoid robotics companies.</p><p>As embodied AI continues to advance, the critical role of data has been clearer. Data scarcity remains a primary bottleneck in robot learning, particularly the lack of physical interaction data, which is essential for robots to operate effectively in the real world. Consequently, data quality, reliability, and cost have become major concerns in both research and commercial development.</p><p>This is exactly where DAIMON excels. Our vision-based tactile technology captures high-quality, multimodal tactile data. Beyond basic contact forces, it records deformation, slip and friction, material properties and surface textures — enabling a comprehensive reconstruction of physical interactions. 
Building on our expertise in multimodal fusion, we have developed a robust data processing pipeline that seamlessly integrates tactile feedback with vision, motion trajectories, and natural language, transforming raw inputs into training-ready datasets for machine learning models.</p><p>Recognizing the industry-wide data gap, we view large-scale data collection not only as our unique competitive advantage, but as a responsibility to the broader community.</p><p>By building and open-sourcing the dataset, we aim to provide the high-quality “fuel” needed to power embodied AI, ultimately accelerating the real-world deployment of general-purpose robotic foundation models.</p><p><strong>The robotics industry is highly competitive, and many teams have chosen to focus on data. DAIMON is releasing a large and highly comprehensive cross-embodiment, vision-based tactile multimodal robotic manipulation dataset. How were you able to achieve this?</strong></p><p>We have a dedicated in-house team focused on expanding our capabilities, including building hardware devices and developing our own large-scale model. Although we are a relatively small company, our core tactile sensing technology and innovative data collection paradigm enable us to build large-scale datasets.</p><p>Our approach is to broaden our offering. We have built the world’s largest distributed out-of-lab data collection network. Rather than relying on centralized data factories, this lightweight and scalable system allows data to be gathered across diverse real-world environments, enabling us to generate millions of hours of data per year.</p><p class="pull-quote">“To drive the advancement of the entire embodied AI field, we have open-sourced 10,000 hours of the dataset for the broader community.” <strong>—Prof. Michael Yu Wang, DAIMON Robotics</strong></p><p><strong>This dataset is being jointly developed with several institutions worldwide. What roles did they play in its development, and how will the dataset benefit their research and products?</strong></p><p>Besides China-based teams, our partners include leading research groups from universities, such as Northwestern University and the National University of Singapore, as well as top global enterprises like Google DeepMind and China Mobile. Their decision to partner with DAIMON is a strong testament to the value of our tactile-rich dataset.</p><p>Some of the companies involved have already built their own models and are now incorporating tactile information. By deploying our data collection devices across research, manufacturing, and other real-world scenarios, they help us gather highly practical, application-driven data. In turn, our partners leverage the data to train models tailored to their specific use cases.
Furthermore, to drive the advancement of the entire embodied AI field, we have open-sourced 10,000 hours of the dataset for the broader community.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Robotic gripper delicately holding a cracked eggshell in a dimly lit room" class="rm-shortcode" data-rm-shortcode-id="e2dc7370e54c8fc89b1c0d53a044f79c" data-rm-shortcode-name="rebelmouse-image" id="30fd8" loading="lazy" src="https://spectrum.ieee.org/media-library/robotic-gripper-delicately-holding-a-cracked-eggshell-in-a-dimly-lit-room.png?id=66495381&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Equipped with Daimon’s visuotactile sensor, the gripper delicately senses contact and precisely controls force to pick up a fragile eggshell.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Daimon Robotics</small></p><h2>From VLA to VTLA: Why Tactile Sensing Changes the Equation</h2><p><strong>The mainstream paradigm in robotics is currently the Vision-Language-Action (VLA) model, but your team has proposed a Vision-Tactile-Language-Action (VTLA) model. Why is it necessary to incorporate tactile sensing? What does it enable robots to achieve, and which tasks are likely to fail without tactile feedback?</strong></p><p>Over these years of working to make generalist robots capable of performing manipulation tasks, especially dexterous manipulation — not just power grasping or holding an object, but manipulating objects and using tools to impart forces and motion onto parts — we see these robots being used in household as well as industrial assembly settings.</p><p>It is well established that tactile information is essential for providing feedback about contact states so that robots can guide their hands and fingers to perform reliable manipulation. Without tactile sensing, robots are severely limited. They struggle to locate objects in dark environments, and without slip detection, they can easily drop fragile items like glass. Furthermore, the inability to precisely control force often leads to failed manipulation tasks or, in severe cases, physical damage. Naturally, the VLA approach needs to be enhanced to incorporate tactile information. We expanded the VLA framework to incorporate tactile data, creating the VTLA model.</p><p>An additional benefit of our tactile sensor is that it is vision-based: We capture visual images of the deformation on the fingertip surface. We capture multiple images in a time sequence that encodes contact information, from which we can infer forces and other contact states. This aligns well with the visual framework that VLA is based upon. Having tactile information in a visual image format makes it naturally suitable for integration into the VLA framework, transforming it into a VTLA system. 
<p>That is the key advantage: Vision-based tactile sensors provide very high resolution at the pixel level, and this data can be incorporated into the framework, whether it is an end-to-end model or another type of architecture.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Close-up of a vision-based tactile sensor with 110,000 sensing units, resembling a smartwatch screen glowing with colorful digital static in the dark" class="rm-shortcode" data-rm-shortcode-id="9c723ec3951683491dace7c3aae69f1f" data-rm-shortcode-name="rebelmouse-image" id="58650" loading="lazy" src="https://spectrum.ieee.org/media-library/close-up-of-a-vision-based-tactile-sensor-with-110000-sensing-units-resembling-a-smartwatch-screen-glowing-with-colorful-digit.png?id=66495588&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">DAIMON is known for its vision-based tactile sensors, which can pack over 110,000 effective sensing units.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DAIMON Robotics</small></p><h2>The Technology: Monochromatic Vision-based Tactile Sensing</h2><p><strong>You and your team have spent many years deeply engaged in vision-based tactile sensing and have developed the world’s first monochromatic vision-based tactile sensing technology. Why did you choose this technical path?</strong></p><p>Once we started investigating tactile sensors, we understood our needs. We wanted sensors that closely mimic what we have under our fingertip skin. Physiological studies have well documented the capabilities humans have at their fingertips — knowing what we touch, what kind of material it is, how forces are distributed, and whether an object is moving into the right position as our brain controls our hands. We knew that replicating these capabilities on a robot hand’s fingertips would help considerably.</p><p>When we surveyed existing technologies, we found many types, including vision-based tactile sensors with tri-color optics and other simpler designs. We decided to integrate the best of these into a robust engineering solution that works well without being overly complicated, keeping cost, reliability, and sensitivity within a satisfactory range; this ultimately led to our monochromatic vision-based tactile sensing technique. It is fundamentally an engineering approach rather than a purely scientific one, since a great deal of foundational research already existed. With the growing realization of the necessity of tactile data, all of this will advance hand in hand.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Daimon tactile sensor showing force, geometry, material, and contact data visualizations." class="rm-shortcode" data-rm-shortcode-id="d09e9760397ad4cc2faa8b8a54386c20" data-rm-shortcode-name="rebelmouse-image" id="d69d7" loading="lazy" src="https://spectrum.ieee.org/media-library/daimon-tactile-sensor-showing-force-geometry-material-and-contact-data-visualizations.png?id=66495899&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">DAIMON’s vision-based tactile sensor captures high-quality, multimodal tactile data.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DAIMON Robotics</small></p><p><strong>Last year, DAIMON launched a multi-dimensional, high-resolution, high-frequency vision-based tactile sensor. Compared with traditional tactile sensors, where does its core advantage lie? 
Which industries could it potentially transform?</strong></p><p>The key features of our sensors are the density of distributed force measurement and the deformation we can capture over the area of a fingertip. I believe we have the highest density in terms of sensing units. That is one very important metric. The other is dynamics: the frequency and bandwidth — how quickly we can detect force changes, transmit signals, and process them in real time. Other important aspects are largely engineering-related, such as reliability, drift, durability of the soft surface, and resistance to interference from magnetic, optical, or environmental factors.</p><p>A growing number of researchers and companies are recognizing the importance of tactile sensing and adopting our technology. I believe the advances in tactile sensing will elevate the entire community and industry to a higher level. One of our potential customers is deploying humanoid robots in a small convenience store, with densely packed shelves where space is at a premium. The robot needs to reach into very tight spaces — tighter than books on a shelf — to pick out an object. Current two-jaw parallel grippers cannot fit into most of these spaces. If you observe how humans pick up such objects, it is clear that you need at least three slim fingers to touch the object, roll it toward you, and secure it. Thus, we are starting to see very specific needs where tactile sensing capabilities are essential.</p><h2>From Academia to Startup</h2><p><strong>After 40 years in academia — founding the HKUST Robotics Institute, earning prestigious honors including IEEE Fellow, and serving as Editor-in-Chief of IEEE TASE — what motivated you to found DAIMON Robotics?</strong></p><p>I have come a long way. I started learning robotics during my Ph.D. at Carnegie Mellon, where there were truly remarkable groups working on locomotion under Marc Raibert, who founded Boston Dynamics, and on manipulation under my advisor, Matt Mason, a leader in the field. We have been working on dexterous manipulation for many years, not only at Carnegie Mellon but globally.</p><p>However, progress has been limited for a long time, especially in building dexterous hands and making them work. Only recently have locomotion robots truly taken off, and only in the last few years have we begun to see major advancements in robot hands. There is clearly room for advancing manipulation capabilities, which would enable robots to do work like humans. While at Hong Kong University of Science and Technology, I saw a growing number of people entering this area as students and postdoctoral researchers. We wanted to jumpstart our effort by leveraging the available capital and talent resources.</p><p>Fortunately, one of my postdocs, <a href="https://www.dmrobot.com/en/news/55.html" target="_blank">Dr. Duan Jianghua</a>, has a strong sense for commercial opportunities. Recognizing the rapid growth of the robotics market and the unique value that our vision-based tactile sensing technology could bring, together we started DAIMON Robotics, and it has progressed well. 
The community has grown tremendously in China, Japan, Korea, the U.S., and Europe.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Humanoid robots assembling electronics on an automated factory production line" class="rm-shortcode" data-rm-shortcode-id="4b3c36c692c89677062b5292d09e4650" data-rm-shortcode-name="rebelmouse-image" id="851b9" loading="lazy" src="https://spectrum.ieee.org/media-library/humanoid-robots-assembling-electronics-on-an-automated-factory-production-line.png?id=66496027&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Robots equipped with DAIMON technology have been deployed in factory settings. The company aims to enable robots to achieve “embodied intelligence” and close the gap between what they can see and what they can feel.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DAIMON Robotics</small></p><h2>Business Model and Commercial Strategy</h2><p><strong>What is DAIMON’s current business model and strategic focus? What role does the dataset release play in your commercial strategy?</strong></p><p>We started as a device company focused on making highly capable tactile sensors, especially for robot hands. But as the technology and business developed, everyone realized it is not just about one component but about the entire technology chain: devices, data of adequate quality and quantity, and finally the right framework to build, train, and deploy models on robots in real application environments.</p><p>Our business strategy is best described as “3D”: Devices, Data, and Deployment. We build devices for data collection, for our own ecosystem, and for deployment in our partners’ potential application domains. This enables the collection of real-world tactile-rich data and complete closed-loop validation, and it will become an integral part of the 3D business model. Most startups in this space are following a similar path; eventually, some may become more specialized or more tightly integrated with other companies. For now, it is mostly vertical integration.</p><h2>Embodied Skills and the Convergence Moment</h2><p><strong>You’ve introduced the concept of “embodied skills” as essential for humanoid robots to move beyond having just an advanced AI “brain.” What prompted this insight? What new capabilities could embodied skills enable? After the rapid evolution of models and hardware over the past two years, has your definition or roadmap for embodied skills evolved?</strong></p><p>We have come a long way and now see a convergence point: Electrical, electronic, and mechatronic hardware technologies have advanced tremendously in the last two decades. Robots are now fully electric and no longer require hydraulics because the hardware has evolved so rapidly. Modern electronics provide tremendous bandwidth along with high torque. If we can build intelligence into these systems, we can create truly capable humanoid robots with the ability to operate in unstructured environments, make decisions, and take actions autonomously.</p><p class="pull-quote">“Our vision is for robots to achieve robust manipulation capabilities and evolve into reliable partners for humans.” <strong>—Prof. Michael Yu Wang, DAIMON Robotics</strong></p><p>AI has arrived at exactly the right time. Enormous resources have been invested in AI development, especially large language models, which are now being generalized into world models that enable physical AI capabilities. 
We would like to see these manifested in real-world systems.</p><p>While both AI and core hardware technologies continue to evolve, the focus is much clearer now. For example, human-sized robots are preferred in a home environment. This is an exciting domain with the promise of great societal benefit if we can eventually achieve safe, reliable, and cost-effective robots.</p><h2>The Road to Real-World Deployment</h2><p><strong>Today, many robots can deliver impressive demos, yet there remains a gap before they truly enter real-world applications. What could be a potential trigger for real-world deployment? Which scenarios are most likely to achieve large-scale deployment first?</strong></p><p>I think the road toward large-scale deployment of generalist robots is still long, but we are starting to see signs of feasibility within specific domains. It is very similar to autonomous vehicles: We have yet to see full deployment of robo-taxis, but mobile robots and smaller vehicles are already widely deployed in the hospitality industry. Virtually every major hotel in China now has a delivery robot — no arms, just a vehicle that picks up items from the hotel lobby (e.g., food deliveries). The delivery person just loads the food and selects the room number; the robot then navigates to the guest’s room, riding the elevator along the way, to deliver the food. This is already nearly 100 percent deployed in major Chinese hotels.</p><p>Hotel and restaurant robots are viewed as a model for deploying humanoid robots in specific domains like overnight drugstores and convenience stores. I expect complete deployment in such settings within a short timeframe, followed by other applications. Overall, we can expect autonomous robots, including humanoids, to progressively penetrate specific sectors, delivering value in each and expanding into others.</p><p>Ultimately, our vision is for robots to achieve robust manipulation capabilities and evolve into reliable partners for humans. By seamlessly integrating into our homes and daily lives, they will genuinely benefit and serve humanity.</p><p><em>This interview has been edited for length and clarity.</em></p>]]></description><pubDate>Mon, 04 May 2026 11:08:34 +0000</pubDate><guid>https://spectrum.ieee.org/daimon-robotics-physical-ai</guid><category>Type-sponsored</category><category>Factory-robots</category><category>Tactile-sensing</category><category>Ai-models</category><category>Embodied-intelligence</category><dc:creator>Sujeet Dutta</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/man-wearing-glasses-and-a-gray-shirt-smiles-at-camera-while-surrounded-by-futuristic-robots-and-tech-devices-in-a-photo-illustra.jpg?id=66444415&amp;width=980"/></item><item><title>Video Friday: Figure, 1X Ramp Up Humanoid Robot Production</title><link>https://spectrum.ieee.org/video-friday-humanoid-robot-production</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/rows-of-identical-humanoid-robots-standing-on-platforms-in-a-large-industrial-hall.jpg?id=66666641&width=1245&height=700&coordinates=0%2C62%2C0%2C63"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://roboticsconference.org/">RSS 2026</a>: 13–17 July 2026, SYDNEY</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><h5><a href="https://actuate.foxglove.dev/">Actuate 2026</a>: 18–19 August 2026, SAN FRANCISCO</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="yzh1csmhndo">Figure is now able to produce 55 robots per week, which will be “allocated to internal research and development groups, data collection, efforts for robots to perform end-to-end housework, and commercial use-case development.” Er, that seems like a lot of robots to be making when commercial use cases are still “in development,” doesn’t it?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a1d02b36d3a487ff9b647ea46d3a94ee" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/YZH1csMhnDo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figure.ai/news/ramping-figure-03-production">Figure</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ag_rfhvsnme"><em>The opening of the NEO Factory in Hayward, Calif., marks a fundamental shift in humanoid robotics: The United States’ most vertically integrated robot factory has now begun full-scale production, bringing end-to-end manufacturing of NEO under one roof. Spanning 58,000 square feet and employing over 200 team members, 1X designs and builds every critical component in-house—motors, batteries, transmissions, sensors, structures, and final assembly—enabling faster iteration, superior safety, and true American scale. 
With the first robots already coming off the line and consumer shipments planned for 2026, this is the critical milestone that turns the vision of abundant, general-purpose home robots into reality.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7dd3abd4be4b35e34183b2fd7d779c33" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ag_rFhvSNmE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Scale will fix everything...?</p><p>[ <a href="https://www.1x.tech/">1X</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="a5_0m84rjcg"><em>Unlike statically stable robots, a <a data-linked-post="2650278041" href="https://spectrum.ieee.org/building-robots-that-can-go-where-we-go" target="_blank">dynamically balanced robot </a>can shift its center of mass to accommodate loads without tipping over, so we like to see just how far we can push our software. Getting Digit to stand on one leg pushes the limits of our sim-to-real pipeline training methodologies—even the slightest model mismatches can lead to instability.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d4082492a954917b413201af0a3efdf7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/A5_0M84rJCg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.agilityrobotics.com/">Agility</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qt9j6zmlnpy"><em>In this work, we develop a tactile-enabled whole-body humanoid manipulation system for stable, dexterous, contact-rich real-world manipulation. Our system combines VR-based whole-body teleoperation, a lower-body controller based on reinforcement learning, dexterous hand retargeting, distributed tactile sensing, and a multimodal policy called Humanoid Transformer with Touch Dreaming (HTD).</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="567192d2a46e5f880312d9c2611c9c34" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QT9J6zMlNpY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://humanoid-touch-dream.github.io/">Humanoid Touch Dream</a> ]</p><p>Thanks, Yaru!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="enizqmpwgsu"><em>Originally posted two years ago, “Can I Have a Pet T. Rex?” is a short interdisciplinary portrait documentary. It features paleontologist and Kod*lab postdoc Aja Mia Carter and the Kod*lab robotics researchers Wei-Hsi Chen (also a postdoc) and J. Diego Caporale, a Ph.D. 
student.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1912fbad2bf4b1a1296a23b3368d43a9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/enIzqmpwGSU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>It’s been two years! Where is her pet <em>T. rex</em>!?</p><p>[ <a href="https://kodlab.seas.upenn.edu/kod/">Kod*Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="u6wcnee1ade">I am not entirely sure why CMU and HEBI had robots at the 2026 NFL Draft, but I’m entirely sure that it made it more interesting to watch.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d2d3b6ebb7bfb7e7e025e78980282e48" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U6wCnEe1aDE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.hebirobotics.com/">HEBI Robotics</a> ]</p><p>Thanks, Trevor!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bgi1fp0tziq"><em>Ethan Lauer, a software engineer, answers your questions about <a data-linked-post="2672234096" href="https://spectrum.ieee.org/video-friday-atlas-robot-sees-world" target="_blank">robot perception</a>, world modeling, and what spooks our Stretch robot.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="36767fc6b992b51ee1f2803c9cc4a555" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bgI1fp0TZIQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/products/stretch/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="gdxoqpbnnxo">Yet another thing that a robot is consistently better at than I am.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9dc6708f31385766ee572599ed81e1b3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GdxoQpBnnXo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://generalistai.com/blog/apr-02-2026-GEN-1">Generalist</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="baehg_j9mig">If you’re wondering where all those reported humanoid robot sales are coming from, it’s because every big company needs one or two for this sort of thing.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="49bc0543da443317dd231b866af97766" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BaehG_j9mIg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.watch.impress.co.jp/docs/news/2104981.html">Impress</a> ]</p><div 
class="horizontal-rule"></div><p class="rm-anchors" id="0jhxwclllpg">Full-color laser yo-yo zapper, a phrase never before written in the history of the universe.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="53d6d45a3e25ef56c223d164e80c09cd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0jhXWcLLLpg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ishikawa-vision.org/">Ishikawa Group Laboratory</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="l-qnujfjzza">The future of the L’Oréal Pro 2026 Le Hair Show is...a bald robot?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bba867c054e8348d81adb5a66bdf78a5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/L-QNUjfjzZA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bwgnetspzvm"><em>Meet MagicHand H01, our all-new dexterous hand.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="90f61b42f8614b0cc57d8ecbb7f77962" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BwgneTspZvM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.magiclab.top/en/parts/hand">MagicLab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="autewtv2bae">This is briefly one of the flattest quadrupeds I have ever seen.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="314bb1329467924e0eb0854f5493938e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AuteWtv2BAE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="0bv35cuxf6o">I appreciate that Engineered Arts did not try to cover up the sound in this video.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d41c3fd69589fa7d169b160c14c2c0b8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0BV35CuxF6o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://engineeredarts.com/robots/ameca">Engineered Arts</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ngpe3-jrlyo">This is very impressive considering that magnets are basically indistinguishable from magic.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="815587efcae927c2aa103935a1386336" 
style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ngPe3-jrLyo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sung.seas.upenn.edu/">Sung Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="q_p0swqazdk"><em>NASA has two rovers on Mars, but they’re exploring entirely different eras of the planet’s past. Separated by 2,300 miles, the two rovers are uncovering clues from very different moments in Martian history. <a data-linked-post="2675264402" href="https://spectrum.ieee.org/perseverance-rover-nasa-anthropic-ai" target="_blank">Perseverance</a> is on the rim of Jezero Crater, where it’s studying some of the oldest Martian terrain ever explored while searching for signs of ancient microbial life. Meanwhile, <a data-linked-post="2676801565" href="https://spectrum.ieee.org/curiosity-rover-organic-molecules-mars" target="_blank">Curiosity</a> is climbing Mount Sharp inside Gale Crater, where layers of rock reveal how Mars’s climate changed as water dried up from its surface.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4ff221bd896052e04182ac3c453d32fb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Q_P0swqaZDk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://science.nasa.gov/mars/">NASA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ko53gwuqzuq"><em>We’ve built a <a data-linked-post="2672130583" href="https://spectrum.ieee.org/star-autonomous-surgical-robot" target="_blank">surgical robot</a> to automate key steps in the process of receiving a <a data-linked-post="2650279456" href="https://spectrum.ieee.org/what-is-neural-implant-neuromodulation-brain-implants-electroceuticals-neuralink-definition-examples" target="_blank">Neuralink implant</a> to promote safety, reliability, and scalability.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1186270187fdbebd1610a962a9315286" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KO53gwuqZUQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://neuralink.com/">Neuralink</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ucy9vtldwpu"><em>The Chinese-made Unitree G1 humanoid robots are making their way into the United States. And they aren’t just in viral videos but in major tech companies like OpenAI and Nvidia, and top academic institutions. Most arrive through Robostore, a robotics reseller based on Long Island. I went there to watch them come off the pallet, then brought one to my home to see what it could actually do. Are these the future of home robots? A security risk? A Chinese surveillance system on legs? 
I got answers—and a broken toe.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a11a8f30f0108e035d7cf395f93c59a1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ucy9VTLDwPU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://thenewthings.com/">New Things</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hgpraidvaqa"><em>How do autonomous robots make decisions when the world is unpredictable? From self-driving cars to drone swarms, autonomous systems must operate under uncertainty—making real-time decisions with incomplete or unreliable data. In this video, Harvard SEAS Prof. Stephanie Gil explains how AI-powered robots coordinate, adapt, and stay safe in complex, real-world environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b6c97f1ae08edc70084a24cc18cdd4b4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hgPRAidvAQA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://seas.harvard.edu/">Harvard University</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 01 May 2026 16:30:01 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-humanoid-robot-production</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Robot-videos</category><category>Robot-manipulation</category><category>Industrial-robots</category><category>Robot-hands</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/rows-of-identical-humanoid-robots-standing-on-platforms-in-a-large-industrial-hall.jpg?id=66666641&amp;width=980"/></item><item><title>Video Friday: Who Wins in Robot vs. Pro Ping-Pong Player?</title><link>https://spectrum.ieee.org/video-friday-ping-pong-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robotic-arm-with-paddle-and-an-orange-ping-pong-ball-hovering-over-a-sony-labeled-table-tennis-table.png?id=65961797&width=1245&height=700&coordinates=0%2C60%2C0%2C61"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://roboticsconference.org/">RSS 2026</a>: 13–17 July 2026, SYDNEY</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="frgq8ltb-_e"><em>Sony AI’s latest research, published on the cover of </em>Nature<em>, addresses a long-standing challenge in physical AI: Can a high-speed autonomous system master the complex perception and dynamic control required to compete against professional athletes?</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="65bad2307d3e3da0543d00c4de449a16" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FrGq8ltb-_E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ace.ai.sony/">Sony AI</a> ]</p><div class="horizontal-rule"></div><blockquote><span>In this video, we present Ringbot Quad, a novel <a data-linked-post="2667223117" href="https://spectrum.ieee.org/video-friday-monocycle-robot-with-legs" target="_blank">monocycle robot</a> with four legs that combines wheeled and legged locomotion on a single platform. Ringbot Quad is designed as a unique monocycle mechanism that replaces the traditional drivetrain with four individually actuated driving modules, each integrated with an articulated leg.<br/></span><span>Ringbot Quad aims to provide versatile and efficient mobility through two distinct locomotion modes. In driving mode, the four legs assist with balance and steering, while in walking mode, they fully support the body for quadruped locomotion. 
By switching between these modes, Ringbot Quad can navigate diverse terrains and overcome obstacles that are difficult for either wheeled or legged systems alone.</span></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="831cea79dd4d291051295c00a25b6298" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zI0Mv2Ga3FA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">Kinetic Intelligent Machine Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="1vunusbznmq"><em>Humanoid robots have beaten human runners in a Beijing half-marathon, marking a breakthrough in China’s rapidly advancing robotics industry. More than 100 robots competed alongside 12,000 people in the 21-kilometer race, with three crossing the finish line ahead of any human.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c4ccc1d37a866a0f3b5c17a98197d67e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1vUnusbzNMQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.aljazeera.com/sports/2026/4/19/humanoid-robot-breaks-half-marathon-world-record-in-beijing">Al Jazeera</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="d4hqks4fev0"><em>Watch AthenaZero juggle barehanded using on-board sensory feedback only. No motion capture. No funnels. No help adding the third ball. The robot learns to adapt to the uncertainties from contact and the appropriate hand-eye coordination.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6b1a3e759a227786d74ab63ae929e5af" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/d4HqKs4fEV0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rai-inst.com/">Robotics and AI Institute</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="srpz8trpz_8">From the look of this, it’s based on data capture from humans. 
What I want to know is what this will look like when it’s not based on data capture from humans.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ae0865cdf55df9e8806e81cbb60c1978" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/srPz8TRpZ_8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="hc1zy6-f_z4">Looks like Sphero would like to fill that sad gap in educational robotics left by <a data-linked-post="2650256027" href="https://spectrum.ieee.org/mindstorms-not-just-a-kids-toy" target="_blank">LEGO Mindstorms</a>.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2b59e0fd71b3b073effdcf9c95ae929b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HC1zy6-f_Z4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sphero.com/products/blueprint-robotics/#sp">Sphero</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="hj79hby1uje">I am pretty sure that this is not how the shell game is played.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="afcb6298cc833e28cae6c5aa83ee1200" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Hj79HBy1UjE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://generalistai.com/blog/apr-02-2026-GEN-1">Generalist</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ng4ibozehig">At this point, real value from robots in warehouses much more commonly comes from systems like these, not humanoids.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8964b18918da211bcc2490800b4bde17" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ng4IBozehig?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.berkshiregrey.com/">Berkshire Grey</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fgxsjls_o-o"><em>Scientists at the Max Planck Institute for Intelligent Systems propose a method to measure the efficiency of soft electrostatic actuators, enabling systematic evaluation of electrical-to-mechanical energy conversion. 
Using Peano-HASEL actuators, they demonstrate efficiencies up to 63.6%, over three times higher than previously reported, and validate the approach across other actuator types, paving the way for more energy-efficient soft electrostatic robotic systems.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="015fd4c4fabecd354cc80cf389b49f36" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FGXSJLs_O-o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://is.mpg.de/news/the-future-of-actuation-is-soft-and-efficient">Max Planck Institute</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="gjt1dqgbsxk">Already deployed in North America, quadruped robots provide continuous patrol, real-time monitoring, and faster incident detection across residential communities—day and night.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="05b155006c9e55c99c98c3215356b051" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gjT1dQgBSxk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Um, thanks, but no thanks.</p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pahog1egeog">Catching drones with what looks like a UR20 robot arm is a neat trick.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dd40e9178f5a6e15e278e96492d5ebaf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pAhog1EGEOg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.skydio.com/f10">Skydio</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ybt80myb9_y"><a data-linked-post="2650277182" href="https://spectrum.ieee.org/flying-dragon-robot-transforms-itself-to-squeeze-through-gaps" target="_blank">Overactuated drones</a> performing aerial maneuvers will always look just a little bit wrong to me.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cb7fb79b6bd75d06910780cbcd28d39b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ybT80MyB9_Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2412.16107">Paper</a> ] from [ <a href="https://rsl.ethz.ch/" target="_blank">ETH Zurich</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="vh9o9zhkpdc">Need a rugged and reliable mobile manipulator? 
Please consider a not-humanoid.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="30675d047aaf2e3146468b541f19e50d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vH9O9zhKPdc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://clearpathrobotics.com/husky-a300-unmanned-ground-vehicle-robot/">Clearpath</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ydnosmbyyqo">This Robotics Institute talk is from CMU’s Raj Reddy, on “The Future of AI: Doomers vs. Abundance.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7d5e4a3bf547f2b80d79383ad095e721" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ydnOSMbyyQo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote style="margin-left: 60px;"><em>The last decade has seen extraordinary advances in AI. The potential arrival of Artificial General Intelligence (AGI) has profound implications for the future of our society. We anticipate a world where AI assistants and humanoid robots will perform most of the tasks requiring human expertise and skill at 10% of current costs. In this paradigm, essential services—including food, housing, energy, education, healthcare, and transportation—will be provided via Universal Basic Services, signaling a historic shift from a society of scarcity to one of abundance. This transformation raises a critical concern: widespread displacement of traditional labor. What is the human role when AI can do everything? This talk presents an alternative scenario: a “Human-in-the-Loop” evolution. In this model, humans transition into high-level supervisory roles, collaborating with AGI to train robots in novel skills and adapt them to unforeseen tasks.</em><br/><em>We explore this as the “Maharaja Model” where technology serves humanity so comprehensively that work will be optional for humans. Finally, we will discuss how institutions like the Robotics Institute must lead this transition, developing the hybrid technologies and ethical frameworks necessary to bridge the gap between our current economy and a robot-assisted future.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/ri-faculty/raj-reddy/">Carnegie Mellon University Robotics Institute</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 24 Apr 2026 16:30:02 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-ping-pong-robot</guid><category>Robotics</category><category>Humanoid-robots</category><category>Video-friday</category><category>Quadruped-robots</category><category>Robot-videos</category><category>Industrial-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robotic-arm-with-paddle-and-an-orange-ping-pong-ball-hovering-over-a-sony-labeled-table-tennis-table.png?id=65961797&amp;width=980"/></item><item><title>This Roboticist-Turned-Teacher Built a Life-Size Replica of ENIAC</title><link>https://spectrum.ieee.org/roboticist-turned-teacher-eniac-replica</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/man-crouches-behind-three-robots.png?id=65575461&width=1245&height=700&coordinates=0%2C219%2C0%2C219"/><br/><br/><p><a href="https://linkedin.com/in/thomas-burick" rel="noopener noreferrer" target="_blank">Tom Burick</a> has always considered himself a builder. Over the years he’s designed robots, constructed a <a href="https://www.youtube.com/watch?v=po58YSF8UKs&t=596s" rel="noopener noreferrer" target="_blank">vintage teardrop trailer</a>, and most recently, led a group of students in building a full-scale replica of a pivotal 1940s computer. </p><p>Burick is a technology instructor at PS Academy in Gilbert, Ariz., a middle and high school for students with <a href="https://spectrum.ieee.org/tag/autism-spectrum-disorder" target="_blank">autism</a> and other specialized learning needs. At the start of the 2025–26 school year, he began a project with his students to build a full-scale replica of the Electronic Numerical Integrator and Computer, or ENIAC, for the <a href="https://spectrum.ieee.org/eniac-80-ieee-milestone" target="_self">80th anniversary of the historic computer’s construction</a>. ENIAC was one of the world’s first programmable electronic computers. When it was built, it was about one thousand times as fast as other machines.</p><p>Before becoming a teacher, Burick owned a robotics company for a decade in the 2000s. But when a financial downturn forced him to close the business, he turned to teaching. “I had so many amazing people help me when I was young [who] really gave me their time and resources, and really changed the trajectory of my life,” Burick says. “I thought I need to pay that forward.”</p><h2>Becoming a Roboticist</h2><p>As a young child in Latrobe, Pa., Burick watched the television show <em><em>Lost in Space</em></em>, which includes a robot character who protects the family. “He was the young boy’s best friend, and I was so captivated by that. I remember thinking to myself, I want that in my life. And that started that lifelong love affair with robotics and technology.”</p><p>He started building toy robots out of anything he could find, and in junior high school, he began adding electronics. “By early high school, I was building full-fledged autonomous, microprocessor-controlled machines,” he says. At age 15, he built a 150-pound steel firefighting robot, for which he won awards from IEEE and other organizations. </p><p>Burick kept building robots and reached out for help from local colleges and universities. He first got in touch with a student at <a href="https://www.cmu.edu/" rel="noopener noreferrer" target="_blank">Carnegie Mellon University</a>, who invited him to visit campus. “My parents drove me down the next weekend, and he gave me a tour of the robotics lab. I was mesmerized. He sent me home with college textbooks and piles of metal and gears and wires,” Burick says. He would read the textbook a page at a time, reading it again and again until he felt he had an understanding of it. Then, to help fill gaps in his understanding, he got in touch with a robotics instructor at <a href="https://www.stvincent.edu/index.html" rel="noopener noreferrer" target="_blank">Saint Vincent College</a>, in his hometown of Latrobe, who let him sit in on classes. Each of these adults, he says, “helped change the trajectory of my life.” </p><p>Toward the end of high school, Burick realized that college wouldn’t be the right environment for him. 
“I was drawn to real-world problem-solving rather than structured coursework, and I chose to continue along that path,” he says. Additionally, Burick has <a href="https://my.clevelandclinic.org/health/diseases/23949-dyscalculia" rel="noopener noreferrer" target="_blank">dyscalculia</a>, which makes traditional mathematics more challenging for him. “It pushed me to develop alternative methods of engineering.”</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="recreation of a large machine arranged in a U shape. A podium in the middle reads “ENIAC 80”" class="rm-shortcode" data-rm-shortcode-id="11b834e11cfecce37836f1a912816b02" data-rm-shortcode-name="rebelmouse-image" id="2528e" loading="lazy" src="https://spectrum.ieee.org/media-library/recreation-of-a-large-machine-arranged-in-a-u-shape-a-podium-in-the-middle-reads-u201ceniac-80-u201d.png?id=65575467&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The ENIAC replica Burick’s students built precisely matches what the original computer would have looked like before it was disassembled in the 1950s. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Robert Gamboa</small></p><p>When he graduated, he worked in several tech jobs before starting his own company. In 2000, he opened a computer retail store and adjacent robotics business, White Box Robotics. The idea for the company came when Burick was building a “white box” PC from standard, off-the-shelf components and realized there was no comparable product for robotics. </p><p>So, he started developing a modular, general-purpose platform that applied white box PC standards to mobile robots. “The robot’s chassis was like a box of Legos,” he says. You could click together two torsos to double its payload, switch out the drive system, or swap its head for a different set of sensors. He filed utility and design <a href="https://patents.justia.com/inventor/thomas-j-burick" target="_blank">patents</a> for the platform, called the 914 PC-Bot, and after merging with a Canadian defense robotics company called Frontline Robotics, started production. They sold about 200 robots in 17 countries, Burick says. </p><p>Then the 2008 financial crisis hit. White Box Robotics held on for a couple of years, shuttering in late 2010. “I got to live my life’s dream for 10 years,” he says. After closing White Box, “there was some soul searching” about what to do next. He recalled the impact his own mentors had, and decided to pay it forward by teaching. </p><h2>Neurodiversity as a Superpower</h2><p> In 2013, Burick started working in a vocational training program for young adults living with autism. The program didn’t have a technical arm, so he started one and ran it until 2019, when he was hired to be a technology instructor at PS Academy Arizona. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" rel="float: left;" style="float: left;"> <img alt="Student using power drill on wood under instructor’s guidance in workshop." 
class="rm-shortcode" data-rm-shortcode-id="f2ffb116874f4573ed0d154a8392678a" data-rm-shortcode-name="rebelmouse-image" id="bd65a" loading="lazy" src="https://spectrum.ieee.org/media-library/student-using-power-drill-on-wood-under-instructor-u2019s-guidance-in-workshop.png?id=65575500&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Burick and one of his students assemble the base for one of ENIAC’s three portable function tables, which contained banks of switches that stored numerical constants. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Bri Mason</small></p><p> Burick feels he can connect with his students, because he is also neurodivergent. Throughout his childhood, he was told what he wasn’t able to do because of his dyscalculia diagnosis. “People tell you what it takes, but they never tell you what it gives,” Burick says. </p><p>In adulthood, he realized that some of his strengths are linked to dyscalculia, too, like strong 3D spatial reasoning. “I have this CAD program that runs in my head 24 hours a day,” he says. “I think the reason I was successful in robotics, truly, was because of the dyscalculia…. To me, [it] has always been a superpower.” </p><p>Whenever his students say something disparaging about living with autism, he shares his own experience. “You need to have maybe just a bit more tenacity than others, because there are parts of it you do have to fight through, but you come through with gifts and strengths,” he tells them. </p><p>And Burick’s classes aim to play to those strengths. “I didn’t want my technology program to feel like craft hour,” he says. Instead, through projects like the ENIAC replica, students can leverage traits many of them share, like the abilities to hyperfocus and to precisely repeat tasks. </p><h2>Recreating ENIAC</h2><p> Burick has taught his students about ENIAC for several years. While reading about it, he learned that the massive, 27-tonne computer was dismantled and partially destroyed after being decommissioned in 1955. Although a few of ENIAC’s 40 original panels are on display at museums, “there was no hope of ever seeing it together again. We wanted to give the world that experience,” Burick says. </p><p> He and his students started by learning about ENIAC, and even Burick was surprised by how complex the 80-year-old computer was. They built a one-twelfth scale model to help the students better understand what it looked like. Seeing the students light up, Burick became confident in their ability to move onto the full-scale model, and he started ordering supplies. </p><p> ENIAC was composed of 40 large metal panels arranged in a U-shape that housed its many vacuum tubes, resistors, capacitors, and switches. Twenty of the panels were accumulators with the same design, so the students started with these, then worked through smaller groupings of panels. The repeating panels brought symmetry to ENIAC, Burick says, but it was also one of the main challenges of recreating it. If one part was slightly out of place, the next one would be too and the mistake would compound. </p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Group of students in a gym holding large silver patterned boards facing the camera." 
class="rm-shortcode" data-rm-shortcode-id="ec54f1caeb938893258637e62d3d7e21" data-rm-shortcode-name="rebelmouse-image" id="1cc34" loading="lazy" src="https://spectrum.ieee.org/media-library/group-of-students-in-a-gym-holding-large-silver-patterned-boards-facing-the-camera.png?id=65575510&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The students installed 500 simulated vacuum tubes in each of the panels here, for a total of 18,000 vacuum tubes.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Robert Gamboa</small></p><p> Once they constructed the panels, they added ENIAC’s three function tables, which stored numerical constants in banks of switches, then two punch-card machines. Finally, they installed 18,000 simulated vacuum tubes. In total, the project used nearly 300 square meters of thick-ream cardboard, 1,600 hot-glue-gun sticks, and 7 gallons of black paint. </p><p> The scale of the machine—and his students’ work—left Burick in awe. “By the time we were done, I felt like I was in a room full of scientists,” he says.</p><p> Previously, Burick’s students built an 8-foot-long drivable Tesla Cybertruck (“complete with a 400-watt stereo system and a subwoofer”) and he plans to keep the momentum with another recreation—maybe from the Apollo moon missions. </p><p>“I go to work every day, and I feel passionate about robotics [and] technology. I get to share that passion with the students,” Burick says. “I get to feel what it’s like to be in the position of the people that helped me. It closes that loop, and I find that really rewarding.”</p>]]></description><pubDate>Thu, 23 Apr 2026 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/roboticist-turned-teacher-eniac-replica</guid><category>Robotics</category><category>Eniac</category><category>Teaching</category><category>Neurodivergent</category><category>Computer-history</category><dc:creator>Gwendolyn Rak</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/man-crouches-behind-three-robots.png?id=65575461&amp;width=980"/></item><item><title>Proposed Chinese Robot Ban Is Latest U.S. Tech Sovereignty Move</title><link>https://spectrum.ieee.org/chinese-robots-us-ban</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/photo-illustration-shows-a-chinese-made-humanoid-drone-quadruped-robot-on-a-red-background-with-yellow-highlights.jpg?id=65566471&width=1245&height=700&coordinates=0%2C187%2C0%2C188"/><br/><br/><p>The <a href="https://stefanik.house.gov/2026/3/stefanik-cotton-introduce-bipartisan-bill-to-propel-america-s-robotics-superiority-protect-u-s-national-security" target="_blank">American Security Robotics Act,</a> a bipartisan bill introduced in March by Senators Tom Cotton (R-Ark.) and Chuck Schumer (D-N.Y.) and Representative Elise Stefanik (R-N.Y.), proposes to limit U.S. government use of Chinese ground robots including humanoids, dogs, and crawlers. The proposal came just a few days after the Federal Communications Commission (FCC) <a href="https://apnews.com/article/fcc-foreign-router-ban-national-security-technology-7e5333aeaf82496ce6350f57699db5ba" rel="noopener noreferrer" target="_blank">tightened its rules</a> for new foreign-made routers. The two changes are part of a much broader decoupling of sensitive U.S. tech from China, which include <a href="https://spectrum.ieee.org/us-takes-strategic-step-to-onshore-electronics-manufacturing" target="_self">semiconductors</a>, <a href="https://maritime-executive.com/article/konecranes-to-build-ports-cranes-in-the-u-s-to-loosen-china-s-monopoly" rel="noopener noreferrer" target="_blank">port cranes</a>, <a href="https://sanctionsnews.bakermckenzie.com/us-president-signs-defense-policy-bill-with-implications-for-export-controls-sanctions-and-supply-chain-restrictions/" rel="noopener noreferrer" target="_blank">logistics data</a>, <a href="https://spectrum.ieee.org/huawei-and-zte-eu" target="_self">telecom cellular base stations and network hardware</a>, <a href="https://www.fcc.gov/document/fcc-bans-authorizations-devices-pose-national-security-threat" rel="noopener noreferrer" target="_blank">security cameras</a>, <a href="https://www.insideglobaltech.com/2025/04/23/u-s-tech-legislative-regulatory-update-first-quarter-2025/" rel="noopener noreferrer" target="_blank">passenger vehicles</a>, and, in December 2025, <a href="https://arstechnica.com/gadgets/2025/12/djis-new-drones-will-not-be-available-in-the-us-as-fcc-ban-takes-effect/" rel="noopener noreferrer" target="_blank">uncrewed aircraft systems (UAS)</a> including those sold by DJI. </p><p>“I see the robots and the routers as being the latest in a long line of growing tech security concerns in the U.S. vis-à-vis Chinese technology,” says sociologist <a href="https://www.kyleichan.com/" target="_blank">Kyle Chan</a> of the Brookings Institute in Washington, D.C, who <a href="https://docs.house.gov/meetings/ZS/ZS00/20260416/119165/HHRG-119-ZS00-Wstate-ChanK-20260416.pdf" rel="noopener noreferrer" target="_blank">testified</a> on 16 April 2026 before the <a href="https://chinaselectcommittee.house.gov/" rel="noopener noreferrer" target="_blank">Congressional Select Committee on the Strategic Competition Between the United States and the Chinese Communist Party</a>.</p><p>Certain U.S. firms, such as <a href="https://spectrum.ieee.org/tag/ghost-robotics" target="_self">Ghost Robotics</a>, may benefit, because they are among the few companies that can handle demand for ground robots from U.S. government buyers. Ground robots are finished products at the top of the chain of added value, unlike semiconductors, which are “lower” down the value chain since they are always components of other products. 
If the proposed ground robot ban were to move lower down the value chain, preventing American robot makers from buying Chinese-made components, those companies might have a harder time fulfilling U.S. demand. The U.S. robotics industry is in a pickle: Companies would benefit from eliminating Chinese competitors at their level of the value chain, so long as they can retain their Chinese suppliers. </p><p class="pull-quote">“The U.S. does not have a serious, overarching strategy guiding its approach to the U.S.-China techno-economic competition.” <strong>—Stephen Ezell, Information Technology and Innovation Foundation</strong></p><p>It’s still early for the ground robotics industry in the U.S. Adoption is not yet widespread, nor are its supply chains mature. South Korea and Japan make many crucial robot components, for example, so if they or other countries the U.S. considers friendly can replace Chinese components the U.S. government declares unsafe, the U.S. robotics industry may be able to adapt and build its competitiveness. </p><p>For other technologies, it’s Chinese tech all the way down the chain. The UAS market, for example, is dominated by Chinese producers. The U.S. Department of Commerce <a href="https://www.aerotime.aero/articles/us-drops-chinese-drone-import-ban-but-dji-still-impacted" rel="noopener noreferrer" target="_blank">has sought to ban them</a> for more than a year, and in December, the FCC added <a href="https://www.fcc.gov/document/fcc-updates-covered-list-add-certain-uas-and-uas-components-0" rel="noopener noreferrer" target="_blank">UAS’s to its import ban list</a>, called the Covered List.</p><p>“That was a problem with the drone ban,” Chan says. “Rather than thinking about how you would ramp up domestic production and then have this tapering off of dependence on Chinese drones, it was a sharp and fast switch, which left industry in the lurch.” </p><h2>Many Supply Chains Already Extend Beyond China</h2><p>The FCC’s March <a href="https://apnews.com/article/fcc-foreign-router-ban-national-security-technology-7e5333aeaf82496ce6350f57699db5ba" rel="noopener noreferrer" target="_blank">ban on new foreign-made routers</a> was a surprise to that industry. In 2025, the U.S. imported nearly US $31 billion of routers, according to the <a href="https://emails.ipc.org/links/Global-Electronics-Association-routers-report26.pdf" rel="noopener noreferrer" target="_blank">Global Electronics Association</a>. Yet China produced only 1.1 percent of that, by value, down from around 20.5 percent of the U.S. market share in 2019. In 2025, the top three sources of routers in the U.S. by value were Vietnam, Mexico, and Thailand, together accounting for 68.4 percent of the market.</p><p>“A lot of this is more nuanced than the regulatory approaches suggest. The real vulnerabilities are outdated software, patches that haven’t been installed, unchanged default passwords,” says Global Electronics Association economist <a href="https://www.electronics.org/meet-shawn-dubravac-ipcs-chief-economist" target="_blank">Shawn DuBravac</a>, one of the authors of the association’s report.</p><p>On 14 April, the <a href="https://docs.fcc.gov/public/attachments/DA-26-351A1.pdf" target="_blank">FCC issued conditional approvals</a> for U.S. distribution of certain <a href="https://www.netgear.com/" target="_blank">Netgear</a> and <a href="https://www.adtran.com/en" target="_blank">Adtran</a> routers, along with Sees.ai UAS’s.
U.S.-headquartered Netgear manufactures routers in Vietnam and Taiwan, according to <a href="https://www.consumerreports.org/electronics-computers/wireless-routers/foreign-made-routers-fcc-ban-a1057564057/" target="_blank"><em>Consumer Reports</em></a>. DuBravac says the fact that the FCC took only about three weeks to exempt those imports is positive, but that since the exemptions last only 18 months, manufacturers must still contend with a lot of uncertainty.</p><p>“If you’re a company you’re going to have to have clear visibility into your suppliers and into your suppliers’ suppliers,” DuBravac says. “There’s much, much more scrutiny.”</p><p>The last several U.S. administrations, from both political parties, have restricted a growing list of Chinese tech. “I see this as bipartisan,” Chan says, “and I would expect continued scrutiny.”</p><p>Companies building technology subject to security controls should also prepare for speed. A White House interagency task force determined that foreign routers were a security risk, leading to the FCC’s Public Safety and Homeland Security Bureau announcing first the UAS ban and later the router ban. Because UAS’s use radio to communicate, they are subject to FCC oversight. Unlike conventional FCC rulemaking, neither security-related determination required public notice or a comment period. </p><p>“There hasn’t been much of a back and forth process into [the UAS] rule,” Chan says. </p><p>The electronics industry is also accustomed to more dialogue around trade-related changes, DuBravac says. “When you see a problem, you open an investigation and stakeholders can submit input into that investigation so it feels a little more like a two-way conversation, so you’re actually hearing from industry on this.” So far, that has not happened.</p><p>Instead, even analysts who welcome U.S. security scrutiny of Chinese technology are finding the fits and starts of the associated policymaking jarring, says <a href="https://itif.org/person/stephen-ezell/" target="_blank">Stephen Ezell</a> of the Information Technology and Innovation Foundation, a think tank in Washington, D.C.: “The U.S. does not have a serious, overarching strategy guiding its approach to the U.S.-China techno-economic competition.” </p>]]></description><pubDate>Wed, 22 Apr 2026 12:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/chinese-robots-us-ban</guid><category>Robots</category><category>Robot-policy</category><category>China</category><category>Us-congress</category><category>Trump-administration</category><category>American-security-robotics-act</category><dc:creator>Lucas Laursen</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/photo-illustration-shows-a-chinese-made-humanoid-drone-quadruped-robot-on-a-red-background-with-yellow-highlights.jpg?id=65566471&amp;width=980"/></item><item><title>The USC Professor Who Pioneered Socially Assistive Robotics</title><link>https://spectrum.ieee.org/socially-assistive-robotics</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-smiling-blonde-woman-poses-with-a-humanoid-robotic-torso-wearing-a-usc-sweatshirt.jpg?id=65574156&width=1245&height=700&coordinates=0%2C187%2C0%2C188"/><br/><br/><p>When the robotics engineering field that <a href="https://www.linkedin.com/in/maja-mataric-5b670014/" rel="noopener noreferrer" target="_blank">Maja Matarić</a> wanted to work in didn’t exist, she helped create it. In 2005 she helped define the new area of socially assistive robotics.</p><p>As an associate professor of computer science, neuroscience, and pediatrics at the <a href="https://www.usc.edu/" rel="noopener noreferrer" target="_blank">University of Southern California</a>, in Los Angeles, she developed robots to provide personalized therapy and care through social interactions.</p><h3>Maja Matarić</h3><br/><p><strong>Employer </strong></p><p><strong></strong>University of Southern California, Los Angeles</p><p><strong>Job Title </strong></p><p><strong></strong>Professor of computer science, neuroscience, and pediatrics</p><p><strong>Member grade</strong></p><p>Fellow</p><p><strong>Alma maters </strong></p><p><strong></strong>University of Kansas and MIT</p><p>The robots could have conversations, play games, and respond to emotions.</p><p>Today the IEEE Fellow is a professor at USC. She studies how robots can help students with anxiety and depression undergo cognitive behavioral therapy. CBT focuses on changing a person’s negative thought patterns, behaviors, and emotional responses.</p><p>For her work, she received a 2025 Robotics Medal from <a href="https://www.massrobotics.org/" rel="noopener noreferrer" target="_blank">MassRobotics</a>, which recognizes female researchers advancing robotics. The Boston-based nonprofit provides robotics startups with a workspace, prototyping facilities, mentorship, and networking opportunities.</p><p>When receiving the award at the ceremony in Boston, Matarić was overcome with joy, she says.</p><p>“I’ve been very fortunate to be honored with several awards, which I am grateful for. But there was something very special about getting the MassRobotics medal, because I knew at least half the people in the room,” she says. “Everyone was just smiling, and there was a great sense of love.”</p><h2>Seeing herself as an engineer</h2><p>Matarić grew up in Belgrade, Serbia. Her father was an engineer, and her mother was a writer. After her father died when she was 16, Matarić and her mother moved to the United States.</p><p>She credits her father for igniting her interest in engineering, and her uncle who worked as an aerospace engineer for introducing her to computer science.</p><p>Matarić says she didn’t consider herself an engineer until she joined USC’s faculty, since she always had worked in computer science.</p><p>“In retrospect, I’ve always been an engineer,” Matarić says. 
“But I didn’t set out specifically thinking of myself as one—which is just one of the many things I like to convey to young people: You don’t always have to know exactly everything in advance.”</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="d2fd2dba0701e451f2378a616fd4821c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NbTDF3_djI8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">Maja Matarić and her lab are exploring how socially assistive robots can help improve the communication skills of children with autism spectrum disorder.</small> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">National Science Foundation News</small> </p><p>While pursuing her bachelor’s degree in computer science at the <a href="https://www.ku.edu/" rel="noopener noreferrer" target="_blank">University of Kansas</a> in Lawrence, she was introduced to industrial robotics through a textbook. After earning her degree in 1987, she had an opportunity to continue her education as a graduate student at MIT’s AI Lab (now the <a href="https://www.csail.mit.edu/node/2873" rel="noopener noreferrer" target="_blank">Computer Science and Artificial Intelligence Lab</a>). During her first year, she explored the different research projects being conducted by faculty members, she said in a <a href="https://ethw.org/Oral-History:Maja_Mataric" rel="noopener noreferrer" target="_blank">2010 oral history</a> conducted by the <a href="https://www.ieee.org/content/dam/ieee-org/ieee/web/org/about/history-center/ieee-history-center-newsletter-114.pdf" rel="noopener noreferrer" target="_blank">IEEE History Center</a>. She met IEEE Life Fellow <a href="https://spectrum.ieee.org/rodney-brooks-three-laws-robotics" target="_self">Rodney Brooks</a>, who was working on novel reactive and behavior-based robotic systems. His work so excited her that she joined his lab and conducted her master’s thesis under his tutelage.</p><p>Inspired by the way animals use landmarks to navigate, Matarić developed <a href="https://dspace.mit.edu/bitstream/handle/1721.1/7027/AITR-1228.pdf?...#:~:text=Toto%20is%20an%20example%20of,learn%2D%20ing%20and%20path%20planning." rel="noopener noreferrer" target="_blank">Toto</a>, the first navigating behavior-based robot. Toto used distributed models to map the AI Lab building where Matarić worked and plan its path to different rooms. Toto used sonar to detect walls, doors, and furniture, according to Matarić’s paper, “<a href="https://pages.ucsd.edu/~ehutchins/cogs8/mataric-primer.pdf" rel="noopener noreferrer" target="_blank">The Robotics Primer</a>.”</p><p>After earning her master’s degree in AI and robotics in 1990, she continued to work under Brooks as a doctoral student, pioneering distributed algorithms that allowed a team of up to 20 robots to execute complex tasks in tandem, including searching for objects and exploring their environment.</p><p>Matarić earned her Ph.D. in AI and robotics in 1994 and joined <a href="https://www.brandeis.edu/" rel="noopener noreferrer" target="_blank">Brandeis University</a>, in Waltham, Mass., as an assistant professor of computer science. 
There she founded the Interaction Lab, where she developed autonomous robots that work together to accomplish tasks.</p><p>Three years later, she relocated to California and joined USC’s <a href="https://viterbischool.usc.edu/" rel="noopener noreferrer" target="_blank">Viterbi School of Engineering</a> as an assistant professor in computer science and neuroscience.</p><p>In 2002 she helped to found the Center for Robotics and Embedded Systems (now the <a href="https://rasc.usc.edu/" rel="noopener noreferrer" target="_blank">Robotics and Autonomous Systems Center</a>). The RASC focuses on research into human-centric and scalable robotic systems and promotes interdisciplinary partnerships across USC.</p><p>The shift in Matarić’s research came after she gave birth to her first child in 1998. When her daughter was a bit older and asked why she worked with robots, Matarić wanted to be able to “say something better than ‘I publish a lot of research papers,’ or ‘it’s well-recognized,’” she says.</p><p class="pull-quote">“In academia, you can be in a leadership role and still do research. It’s a wonderful and important opportunity that lets academics be on top of our field and also train the next generation of students and help the next generation of faculty colleagues.”</p><p>“Kids don’t consider those good answers, and they’re probably right,” she says. “This made me realize I was in a position to do something different. And I really wanted the answer to my daughter’s future question to be, ‘Mommy’s robots help people.’”</p><p>Matarić and her doctoral student <a href="https://www.unr.edu/cse/people/david-feil-seifer" rel="noopener noreferrer" target="_blank">David Feil-Seifer</a> presented a paper defining socially assistive robotics at the 2005 <a href="https://icorr-c.org/" rel="noopener noreferrer" target="_blank">International Conference on Rehabilitation Robotics</a>. It was the only paper there that talked about helping people complete tasks and learn skills by speaking with them rather than by performing physical jobs, she says.</p><p>Feil-Seifer is now a professor of computer science and engineering at the <a href="https://www.unr.edu/" rel="noopener noreferrer" target="_blank">University of Nevada</a> in Reno.</p><p>At the same time, she founded the <a href="https://uscinteractionlab.web.app/" rel="noopener noreferrer" target="_blank">Interaction Lab at USC</a> and made its focus creating robots that provide social, rather than physical, support.</p><p>“At this point in my career journey, I’ve matured to a place where I don’t want to do just curiosity-driven research alone,” she says. “Plenty of what my team and I do today is still driven by curiosity, but it is answering the question: ‘How can we help someone live a better life?’”</p><p>In 2006 she was promoted to full professor and made senior associate dean for research in USC’s Viterbi School of Engineering. In 2012 she became vice dean for research.</p><p>“In academia, you can be in a leadership role and still do research,” she says.
“It’s a wonderful and important opportunity that lets academics be on top of our field and also train the next generation of students and help the next generation of faculty colleagues.”</p><h2>Research in socially assistive robotics</h2><p>One of the longest-running research projects Matarić has led at her Interaction Lab explores how socially assistive robots can help improve the communication skills of children with <a href="https://www.mayoclinic.org/diseases-conditions/autism-spectrum-disorder/symptoms-causes/syc-20352928" rel="noopener noreferrer" target="_blank">autism spectrum disorder</a>. ASD is a lifelong neurological condition that affects the way people interact with others and the way they learn. Children with ASD often struggle with social behaviors such as reading nonverbal cues, playing with others, and making eye contact.</p><p>Matarić and her team developed a robot, <a href="https://spectrum.ieee.org/041910-bandit-little-dog-and-more-usc-shows-off-its-robots" target="_self">Bandit</a>, that can play games with a child and give the youngster words of affirmation. Bandit is 56 centimeters tall and has a humanlike head, torso, and arms. Its head can pan and tilt. The robot uses two <a href="https://www.edmundoptics.com/c/firewire-cameras/1014/?srsltid=AfmBOopjvhJQdzbmxyRP-Bgi50iYGeAIcQp3WkFHPM4R78EHqgr4buL0" rel="noopener noreferrer" target="_blank">FireWire</a> cameras as its eyes, and it has a movable mouth and eyebrows, allowing it to exhibit a variety of facial expressions, according to <a href="https://spectrum.ieee.org/" target="_self"><em><em>IEEE Spectrum</em></em></a>’s <a href="https://robotsguide.com/robots/bandit" rel="noopener noreferrer" target="_blank">robots guide</a>. Its torso is attached to a wheeled base.</p><p>One study showed that when interacting with Bandit, children with ASD exhibited social behaviors that were out of the ordinary for them, such as initiating play and imitating the robot.</p><p>Matarić and her team also studied how the robot could serve as a social and cognitive aid for elderly people and stroke patients. Bandit was programmed to instruct and motivate users to perform daily movement exercises such as seated aerobics.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A smiling blonde woman gestures at a customizable tabletop robot that wears a knit outfit of a cute animal over its shell." class="rm-shortcode" data-rm-shortcode-id="d0240a8f48f895ca49e2fdac2114e5f9" data-rm-shortcode-name="rebelmouse-image" id="e361f" loading="lazy" src="https://spectrum.ieee.org/media-library/a-smiling-blonde-woman-gestures-at-a-customizable-tabletop-robot-that-wears-a-knit-outfit-of-a-cute-animal-over-its-shell.jpg?id=65574186&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Maja Matarić and doctoral student Amy O’Connell testing Blossom, which is being used to study how it can aid students with anxiety or depression.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">University of Southern California</small></p><p>Over the years, Matarić’s lab developed other robots including <a href="https://magazine.viterbi.usc.edu/spring-2020/features/say-hi-to-kiwi/" target="_blank">Kiwi</a> and <a href="https://dl.acm.org/doi/10.1145/3310356" rel="noopener noreferrer" target="_blank">Blossom</a>.
Kiwi, which looked like an owl, helped children with ASD learn social and cognitive skills, helped motivate elderly people living alone to be more physically active, and mediated discussions among family members. Blossom, originally developed at <a href="https://www.cornell.edu/" rel="noopener noreferrer" target="_blank">Cornell</a>, was adapted by the Interaction Lab to make it less expensive and more personalizable for individuals. The robot is being used to study how it can help students with anxiety or depression practice cognitive behavioral therapy.</p><p>That line of research began when Matarić learned that large language model (LLM) chatbots were being promoted to help people with mental health struggles, she said in an <a href="https://edhub.ama-assn.org/jn-learning/audio-player/18985349" rel="noopener noreferrer" target="_blank">episode of the AMA Medical News podcast</a>.</p><p>“It is generally not easy to get [an appointment with a] therapist, or there might not be insurance coverage,” she said. “These, combined with the rates of anxiety and depression, created a real need.”</p><p>That made the chatbot idea appealing, she says, but she was interested to see how effective chatbots were compared with a friendly robot such as Blossom.</p><p>Matarić and her team used the same LLMs to power CBT practice with a chatbot and with Blossom. They ran a two-week study in the USC dorms, where students were randomly assigned to complete CBT exercises daily with either a chatbot or the robot. Participants filled out a clinical assessment to measure their psychiatric distress before and after each session.</p><p>The study showed that students who interacted with the robot experienced a significant decrease in psychiatric distress, Matarić said in the podcast, while students who interacted with the chatbot did not.</p><p class="pull-quote">“Joining an [IEEE] society has an impact, and it can be personal. That’s why I recommend my students join the organization—because it’s important to get out there and get connected.”</p><p>She and her team also reviewed transcripts of conversations between the students and the robot to evaluate how well the LLM responded to the participants. They found the robot was more effective than the chatbot, even though both were using the same model.</p><p>Based on those findings, in 2024 Matarić received a <a href="https://reporter.nih.gov/search/l8sqmMXycEaOMmv3hQHU1A/project-details/11064932" rel="noopener noreferrer" target="_blank">grant</a> from the U.S. <a href="https://www.nimh.nih.gov/" rel="noopener noreferrer" target="_blank">National Institute of Mental Health</a> to conduct a six-week clinical trial to explore how effective a socially assistive robot could be at delivering CBT practice. The trial, currently underway, is also expected to study how Blossom can be personalized to adapt to each user’s preferences and progress, including the way the robot moves, which exercises it recommends, and what feedback it gives.</p><p>During the trial, the 120 students participating are wearing <a href="https://spectrum.ieee.org/fitbit" target="_self">Fitbits</a> so the researchers can track their physiologic responses.
The participants fill out a clinical assessment to measure their psychiatric distress before and after each session.</p><p>Data including the participants’ feelings of relating to the robot, intrinsic motivation, engagement, and adherence will be assessed by the research team, Matarić says.</p><p>She says she’s proud of the graduate students working on this project, and seeing them grow as engineers is one of the most rewarding parts of working in academia.</p><p>“Engineers generally don’t anticipate having to work with human study participants and needing to understand psychology in addition to the hardcore engineering,” she says. “So the students who choose to do this research are just wonderful, caring people.”</p><h2>Finding a community at IEEE</h2><p>Matarić joined IEEE as a graduate student in 1992, the year she published her first paper in <a href="https://ieeexplore.ieee.org/document/1303682" rel="noopener noreferrer" target="_blank">IEEE Transactions on Robotics and Automation</a>. The paper, “<a href="https://ieeexplore.ieee.org/document/143349/" rel="noopener noreferrer" target="_blank">Integration of Representation Into Goal-Driven Behavior-Based Robots</a>,” described her work on Toto.</p><p>As a member of the <a href="https://www.ieee-ras.org/" rel="noopener noreferrer" target="_blank">IEEE Robotics and Automation Society</a>, she says she has gained a community of like-minded people. She enjoys attending conferences including the <a href="https://2025.ieee-icra.org/" rel="noopener noreferrer" target="_blank">IEEE International Conference on Robotics and Automation</a>, the <a href="https://www.ieee-ras.org/conferences-workshops/financially-co-sponsored/iros/" rel="noopener noreferrer" target="_blank">IEEE/RSJ International Conference on Intelligent Robots and Systems</a>, and the <a href="https://humanrobotinteraction.org/2026/" rel="noopener noreferrer" target="_blank">ACM/IEEE International Conference on Human-Robot Interaction</a>, which is closest to her field of research.</p><p>Matarić credits IEEE Life Fellow <a href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10896982" rel="noopener noreferrer" target="_blank">George Bekey</a>, the founding editor in chief of the <a href="https://dl.acm.org/journal/tor" rel="noopener noreferrer" target="_blank"><em><em>IEEE Transactions on Robotics</em></em></a>, for recruiting her for the USC engineering faculty position. He knew of her work through her graduate advisor Brooks, who published a paper in the journal that introduced reactive control and the subsumption architecture, which became the foundation of a new way to control robots. It is his <a href="https://ieeexplore.ieee.org/document/108703" rel="noopener noreferrer" target="_blank">most cited paper</a>. Bekey, who was editor in chief at the time, helped guide Brooks through the challenging review process. Matarić joined Brooks’s lab at MIT two years after its publication, and her work on Toto built on that foundation.</p><p>“Joining a society has an impact, and it can be personal,” she says. 
“That’s why I recommend my students join the organization—because it’s important to get out there and get connected.”</p>]]></description><pubDate>Mon, 20 Apr 2026 18:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/socially-assistive-robotics</guid><category>Ieee-member-news</category><category>Robots</category><category>Socially-assistive-robotics</category><category>Mental-health</category><category>Ieee-robotics-and-automation-soc</category><category>Type-ti</category><dc:creator>Joanna Goodrich</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-smiling-blonde-woman-poses-with-a-humanoid-robotic-torso-wearing-a-usc-sweatshirt.jpg?id=65574156&amp;width=980"/></item><item><title>Video Friday: Digit Learns to Deadlift</title><link>https://spectrum.ieee.org/robot-learning</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/humanoid-industrial-robot-lifting-a-barbell-with-weighted-plates-in-a-testing-facility.png?id=65564472&width=1245&height=700&coordinates=0%2C55%2C0%2C55"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://roboticsconference.org/">RSS 2026</a>: 13–17 July 2026, SYDNEY</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="32aiuwkocf8"><em>Training a policy for Digit to perform a dead lift isn’t just about Digit impressing colleagues—it lets us push the limits of our hardware and training methodologies. The heavier the object (in this case, 65 pounds [29.5 kg]) the more whole-body coordination we need in our controller, and the more resilience Digit’s actuators and joints require. By including whatever object we want Digit to lift in simulation as we train a new policy, we’re able to account for load distribution, grip forces, and changes to Digit’s center of mass. The result is a policy that translates to a dynamically balanced lift in the real world. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4d192f59ec9b0b2207f97d772708da11" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/32AiUwKOCf8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>New robot, you say...?</p><p>[ <a href="https://www.agilityrobotics.com/">Agility</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nfxyep79gpa"><em>Gatlin Robotics is proud to unveil our first commercial, showcasing our robots in action for our debut Robot-as-a-Service (RaaS) contract!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3ae4826e934010220e6ac80d8a6e2ce5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/NFxYep79GpA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://grid.gatlinrobotics.com/p/gatlin-robotics-showcases-fleet-progress-in-first-commercial">Gatlin Robotics</a> ]</p><p>Thanks, Erika!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ikrx4lirzje"><em>At Dexterity, we build robots designed for precision, adaptability, and real-world problem solving. 
But every now and then, we like to remind ourselves (and everyone else) that motion intelligence isn’t just about efficiency—it can be expressive, fluid, even a little playful.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="69d56dd887204ffc25996528b66b492e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IKrX4LIrZjE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://dexterity.ai/">Dexterity</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="m7obdhxipnq"><em>Harvard researchers built a <a data-linked-post="2650258032" href="https://spectrum.ieee.org/warehouse-robots-get-smarter-with-ant-intelligence" target="_blank">swarm of simple antlike robots </a>(RAnts) that can collectively excavate and construct structures without central control. By tuning just two parameters—cooperation strength and material-deposition rate—the same swarm can switch between building new structures and dismantling existing ones. Adaptive group behavior can emerge from the interaction between many simple agents and their environment, with potential applications in many fields. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ebd9d37c7c79a129bb4e41fe11a8d816" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/m7oBdhXiPNQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://seas.harvard.edu/news/simple-robots-collectively-build-and-excavate-are-inspired-ants">Harvard University</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="jj_4zyjilde">I really appreciate companies who <a data-linked-post="2650279611" href="https://spectrum.ieee.org/honda-research-institute-haru-social-robot" target="_blank">give their robots the ability to entertain themselves</a>.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f4dbd553626889cf87c98a120bffc1ac" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JJ_4ZYJiLDE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://generalistai.com/">Generalist</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hggqvzyc_qw"><em>“Spark of Color.” Manvi Saxena, Yihao Geng, Jason Brown, Daniel Newman, Cameron Aubin. A tiny controlled explosion inflates the soft membrane of a microcombustion actuator, sending colorful, carefully arranged water droplets skyward. The actuator measures just 8 millimeters in diameter, while the high-speed sequence captures only 3 milliseconds of motion. 
The work challenges the assumption that <a data-linked-post="2673861853" href="https://spectrum.ieee.org/soft-robot-actuators-bugs" target="_blank">soft actuators</a> must be slow or gentle, showing instead how softness can also be fast, forceful, and explosive.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4bf07d62e25fa3146f39feab3c3ae8b2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hgGQVzyc_qw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.umich.edu/news/2026/spark-of-color-wins-soft-robotics-art-awards/">Michigan Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zomdadpqlka"><em>With the physique of an ordinary person, running at a world champion’s speed!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="28fe6d925e4c9408ea8621b643be78ce" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zoMDadPQLKA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I am questioning whether it knows how to stop.</p><p>[ <a href="https://www.unitree.com/h1">Unitree Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="mitkqvslw8u">Awww. <3</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f9254a961f44a4e39b44c52e6af081b7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mITkQVSLw8U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="io_uqyngasc"><em>In this episode of Innovator Story, the FotoBot team from the University of Hong Kong made an appearance and conducted on-site tests with their AI photography robot at Shenzhen Bay Talent Park. 
Relying on TRON 1, it easily handles complex terrains such as grasslands, slopes, and stairs, unlocking a brand-new “Robot + Photography” experience for the public.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5c223c1149ff25ffd0772aa8722d4b8c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Io_uqYngasc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://limxdynamics.com/en/products/tron1">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="f6kmjw0teyc">The objective of this game is to cover up as much of the hole as possible, right?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="aae32510483f1dd4a04a301cae713b53" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/f6kMJW0tEyc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">Kinetic Intelligent Machine Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="bf-p70gkm9e">MagicLab Robotics just deployed a massive swarm of robot dogs and humanoids at the Jiangsu Super League opening ceremony. Beyond a stunning spectacle, the company says, it’s live proof of embodied AI at scale: Coordinating a cross-category fleet in a complex, open-air environment shows that its multiagent control systems are ready for real-world deployment.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c6e773d436f4e9a55624e64bb0ec43e0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Bf-P70GkM9E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.magiclab.top/en/">MagicLab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="76lh8g-gwpy">A <a data-linked-post="2653906691" href="https://spectrum.ieee.org/video-friday-intelligent-drone-swarms" target="_blank">swarm of drones</a> being launched out of the back of a Chinook would be terrifying except that from this angle, it looks like the drones are being puked out by an astonished frog.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="57f20921cae6214721d201f002da07fc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/76Lh8g-gwpY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.boeing.com/defense/military-rotorcraft/h-47-chinook">Boeing</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="anarpczkgzc">Welcome to Robot Talk, from IHMC Robotics!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d18939e7e600170f5c30a147bc71f68c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no"
src="https://www.youtube.com/embed/AnARPCZKGZc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robots.ihmc.us/">IHMC Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="r7emahhfmsm"><em>Third-year ‪Michigan Engineering‬ undergrad Yulei Fu sits down with Professor Jessy Grizzle to talk about what it’s actually like to major in robotics at ‪the University of Michigan. What makes it different from computer science or mechanical engineering? Where do graduates end up? Are the courses brutal? And what makes the department feel like a community instead of a competition?</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d99734a78261075ba1522e9045dc3b7d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/R7eMAHHFMsM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.umich.edu/">Michigan Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ua7umj7jmja">This CMU RI Yata Memorial Lecture, “Journeys From Research to Commercialization: Lessons from Anki, Waymo, and Bedrock Robotics,” is by Boris Softman.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="989278f672ccc74b0728e4d2617de175" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UA7UMj7jMJA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>In this lecture, Boris will share an honest account of that journey and its lessons, including the energizing wins, the wrong turns and painful surprises, and the moments where an earlier experience turned out to matter more than expected. Closing with a deeper look at Bedrock, he will share why he believes autonomous construction is one of the most important problems robotics can tackle right now, driven by a unique convergence of maturing technology and critical industry need. 
For students at the beginning of their own paths, this is a talk about how a career in robotics and entrepreneurship might actually unfold, the many variables one navigates in the journey, and why the connections you cannot yet see may end up being the most valuable ones.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/the-teruko-yata-memorial-lecture-in-robotics-boris-sofman/">Carnegie Mellon University Robotics Institute</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 17 Apr 2026 15:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/robot-learning</guid><category>Industrial-robots</category><category>Humanoid-robots</category><category>Video-friday</category><category>Swarm-robotics</category><category>Dancing-robot</category><category>Bipedal-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/humanoid-industrial-robot-lifting-a-barbell-with-weighted-plates-in-a-testing-facility.png?id=65564472&amp;width=980"/></item><item><title>​Boston Dynamics and Google DeepMind Teach Spot to Reason​</title><link>https://spectrum.ieee.org/boston-dynamics-spot-google-deepmind</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/photo-of-yellow-boston-dynamics-robot-dog-using-its-arm-to-load-laundry-into-a-white-basket.png?id=65521323&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p><span><strong></strong><strong></strong>The amazing and frustrating thing about robots is that they can do almost anything you want them to do, as long as you know how to ask properly. In the not-so-distant past, asking properly meant writing code, and while we’ve thankfully moved beyond that brittle constraint, there’s still an irritatingly inverse correlation between ease of use and complexity of task. </span></p><p><span>AI has promised to change that. The idea is that when AI is embodied within robots—giving AI software a physical presence in the world—those robots will be imbued with reasoning and understanding. This is cutting-edge stuff, though, and while we’ve seen plenty of examples of embodied AI in a research context, finding applications where reasoning robots can provide reliable commercial value has not been easy. <a href="https://bostondynamics.com/" target="_blank">Boston Dynamics</a> is one of the few companies to commercially deploy legged robots at any appreciable scale; there are now several thousand hard at work. Today the company is <a href="https://bostondynamics.com/blog/tools-for-your-to-do-list-with-spot-and-gemini-robotics/" target="_blank">announcing</a> that its quadruped robot <a href="https://spectrum.ieee.org/tag/spot-robot" target="_self">Spot</a> is now equipped with <a href="https://deepmind.google/blog/gemini-robotics-er-1-6/">Google DeepMind’s Gemini Robotics-ER 1.6</a>, a <a href="https://spectrum.ieee.org/gemini-robotics" target="_blank">high-level embodied reasoning model</a> that brings usability and intelligence to complex tasks.</span></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="155eddc016bd1bedcfb5b83c4b4a54c3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LP4-c5AK30g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">YouTube.com</small></p><p><span>Although this video shows Spot in a home context, the focus of this partnership is on one of the very few applications where legged robots have proven themselves to be commercially viable: inspection. That is, wandering around industrial facilities, checking to make sure that nothing is imminently exploding. With the new AI onboard, Spot is now able to autonomously look for dangerous debris or spills, read complex gauges and sight glasses, and call on tools like vision-language-action models when it needs help understanding what’s going on in the environment around it.</span></p><p>“Advances like Gemini Robotics-ER 1.6 mark an important step toward robots that can better understand and operate in the physical world,” <a href="https://www.linkedin.com/in/marco-da-silva-447b72/" target="_blank">Marco da Silva</a>, vice president and general manager of Spot at Boston Dynamics, says <a href="https://bostondynamics.com/blog/aivi-learning-now-powered-google-gemini-robotics/" target="_blank">in a press release</a>. 
“Capabilities like instrument reading and more reliable task reasoning will enable Spot to see, understand, and react to real-world challenges completely autonomously.”</p><h2>Understanding Robot Understanding</h2><p>The words “reasoning” and “understanding” are increasingly being applied to AI and robotics, but as <a href="https://spectrum.ieee.org/humanoid-robots-gill-pratt-darpa" target="_self">Toyota Research Institute’s Gill Pratt recently pointed out</a>, what those words actually <em><em>mean</em></em> for robots in practice isn’t always clear. “The benchmark we measure ourselves against when it comes to understanding is that the system should answer the way a human would,” <a href="https://www.linkedin.com/in/carolinaparada/" target="_blank">Carolina Parada</a>, head of robotics at Google DeepMind, explained in an interview. For robots to reliably and safely perform tasks, this connection between how robots understand the world and how humans do is critical. Otherwise, there may be a disconnect between the instructions that a human gives a robot and how the robot decides to carry out that task.</p><p>Boston Dynamics’ video above is a potentially messy example of this. One of the instructions to Spot was to “recycle any cans in the living room.” It has no problem completing the task, as the video shows, but in doing so, it grips the can sideways, which is not going to end well for cans that have leftover liquid in them. We humans would avoid this because we can draw on a lifetime of experience to know how cans should be held, but robots don’t (yet) have that kind of world knowledge.</p><p>Parada says that Gemini Robotics-ER 1.6 approaches situations like this from a safety perspective. “If you ask the robot to bring you a cup of water, it will reason not to place it on the edge of a table where it could fall. We track this using our <a href="https://asimov-benchmark.github.io/v1/" target="_blank">ASIMOV benchmark</a>, which includes a whole lot of natural language examples of things the robot should not do.” The current version of Spot doesn’t use these semantic safety models for manipulation, but the plan is to make future versions reason about holding objects in ways that are safe.</p><p class="shortcode-media shortcode-media-youtube" style="background-color: rgb(255, 255, 255);"><span class="rm-shortcode" data-rm-shortcode-id="5934a9a019325c2e996f3f0dab47b3c4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kBwxmlI2yHQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">YouTube.com</small></p><p><span>There does still seem to be a disconnect between Gemini Robotics-ER 1.6 as a high-level reasoning model for a robot, and the robot itself as an interface with the physical world. One of the new features of 1.6 is </span><em><em>success detection</em></em><span>, which combines multiple camera angles to tell more reliably when Spot has successfully grasped an object. This is great if you’re relying entirely on vision for your object interaction, but robots have all kinds of other well-established ways to detect a successful grasp, including touch sensors and force sensors, that 1.6 is not using.
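</span></p><p><span>To make that distinction concrete, here is a minimal, hypothetical sketch of what fusing those extra signals might look like. The function name, inputs, and thresholds are invented for illustration; they are not part of Spot’s or Gemini Robotics-ER 1.6’s actual interfaces.</span></p><pre><code>
# Hypothetical sketch: combining a vision-based success detector with
# the gripper's own force and touch readings. All names and thresholds
# here are invented for illustration.

def grasp_succeeded(vision_confidence: float,
                    grip_force_newtons: float,
                    fingertip_contact: bool) -> bool:
    """Return True if the robot has most likely grasped the object."""
    # A vision-only detector would stop at a single score, e.g.:
    #     return vision_confidence >= 0.8
    # Proprioceptive signals catch failures that vision can miss,
    # such as fingers that closed on air behind an occluding object.
    if not fingertip_contact:
        return False                 # touch sensors report no contact
    if grip_force_newtons < 0.5:
        return False                 # gripper closed but holding nothing
    # With touch and force corroborating, a weaker vision score suffices.
    return vision_confidence >= 0.5

# Example: an uncertain camera view plus firm contact still counts.
print(grasp_succeeded(vision_confidence=0.6,
                      grip_force_newtons=2.3,
                      fingertip_contact=True))  # prints: True
</code></pre><p><span>The particular thresholds matter less than the principle: touch and force readings, where the hardware already has them, are cheap corroborating evidence that a vision-only model leaves unused.</span></p><p><span>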
The reason why this is the case speaks to a fundamental problem that the robotics field is still trying to figure out: how to train models when you need physical data.</span></p><p><span>“At the moment, these models are strictly vision only,” Parada explains. “There is lots of [visual] information on the web about how to pick up a pen. If we had enough data with touch information, we could easily learn it, but there is not a lot of data with touch sensing on the internet.” Customers who use these new capabilities for inspection with Spot will be required to share their data with Boston Dynamics, which is where some of this data will come from.</span></p><h2>Real-World Robots That Are Useful</h2><p>The fact that Boston Dynamics <em><em>has </em></em>customers makes them something of an anomaly when it comes to legged robots that rely on AI in commercial deployments. And those customers will have to be able to trust the robot—<a href="https://spectrum.ieee.org/ai-hallucination" target="_self">always a problem when AI is involved</a>. “We take this very seriously,” da Silva said in an interview. “We roll out new DeepMind capabilities through beta programs to a smaller set of customers to understand what to anticipate, and we only actively advertise features we are confident will work.” There’s a threshold of usefulness that robots like Spot need to reach, and fortunately, the real world doesn’t demand perfection. “Most critical infrastructure in a facility will be instrumented to tell you whether something is wrong,” da Silva says. “But there is a lot of stuff that is not instrumented that can still cause a problem if you aren’t paying attention to it. We’ve found that somewhere north of 80 percent is the threshold where it’s not annoying. Below that, basically the robot is crying wolf, and the operators will start ignoring it.”</p><p><span></span><span>Both da Silva and Parada agree that there’s still plenty of room for improvement in robotic inspection. As Parada points out, Spot’s rarefied status as a scalable commercial platform provides a valuable opportunity to learn how models like Gemini Robotics-ER 1.6 can be the most useful, and then apply that knowledge to other embodied AI platforms, including </span><a href="https://spectrum.ieee.org/boston-dynamics-atlas-scott-kuindersma" target="_self">Boston Dynamics’ Atlas</a><span>. Does that mean that Atlas is going to be the next industrial inspection robot? Probably not. But if this real-world experience can get us closer to safe and reliable robots that can pick up laundry, take a dog for a walk, and clear away soda cans without making a mess, that’s something we can all get excited about.</span></p>]]></description><pubDate>Tue, 14 Apr 2026 19:45:01 +0000</pubDate><guid>https://spectrum.ieee.org/boston-dynamics-spot-google-deepmind</guid><category>Boston-dynamics</category><category>Spot-robot</category><category>Google-deepmind</category><category>Inspection-robots</category><category>Quadruped-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/photo-of-yellow-boston-dynamics-robot-dog-using-its-arm-to-load-laundry-into-a-white-basket.png?id=65521323&amp;width=980"/></item><item><title>Video Friday: This Floor Lamp Will Do Your Chores</title><link>https://spectrum.ieee.org/video-friday-robot-lamp</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robot-arms-hold-up-a-white-t-shirt-in-a-warm-wood-paneled-bedroom.png?id=65502238&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://roboticsconference.org/">RSS 2026</a>: 13–17 July 2026, SYDNEY</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="ahbf2xka9no"><em>Lume is a sculptural floor lamp designed to feel at home the moment you place it. It’s crafted from anodized aluminum and high-gloss finishes, shaped into a slender, balanced form that quietly conceals its complexity. Every surface is refined to feel smooth, precise, and enduring. When it moves, it’s quiet and deliberate. When it’s still, it holds its place with ease.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d40d3980d838fa0341599d8d391d1516" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ahBF2XkA9No?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Apparently, and let me stress that “apparently,” Lume can make the bed, <a data-linked-post="2674304335" href="https://spectrum.ieee.org/robots-folding-clothes" target="_blank">fold laundry</a>, and do other chores involving soft materials. I’m intensely skeptical because it feels like that video has more footage of people staring out of windows and dancing for no reason beyond the robot actually doing anything. And when you do see the robot working at a task, it’s cut up into lots of different pieces of footage in a way that is typically used to distract from either plodding speed, frequent failures, or both. So, yeah. There may be a lot to like about the philosophy here, but even at a suspiciously cheap US $2,500 for a pair of these robots, more detail is certainly called for before they’ve earned your preorder.</p><p>[ <a href="https://syncere.com/">Syncere</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="_p6qoe8zgw0"><em>In Science Robotics, researchers from MIT Media Lab and collaborators from Politecnico di Bari present Electrofluidic Fiber Muscles, a new class of artificial muscle fibers for robots and wearables. Unlike the rigid servo motors used in most robots, these fiber-shaped muscles are soft and flexible. They combine electrohydrodynamic (EHD) fiber pumps—slender tubes that move liquid using electric fields to generate pressure with no moving parts—with fluidic fiber actuators. 
The muscles are driven by electric fields and operate silently, with no external pumps or reservoirs.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6a2a7924462095e48590bf2423837ee1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_P6QoE8zGw0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.mit.edu/2026/new-type-electrically-driven-artificial-muscle-fiber-0409">MIT</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="akehvalnlb4">We first saw this thing at <a data-linked-post="2669267948" href="https://spectrum.ieee.org/epfl-lasa" target="_blank">ICRA@40</a> a few years ago, but the paper is out now.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c4b27b74c847c4b2739bc9d300669b05" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AKEHvalnLb4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nature.com/articles/s41467-025-67675-8">Nature Communications</a> ] via [ <a href="https://www.epfl.ch/labs/lasa/">LASA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="z-p9qizwezu">I do like tea, and I suppose there could be worse applications for a robot than this one, since it leverages both payload and complex terrain mobility.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="126462e5a4e20dca2ce29e218ab9a574" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/z-P9qiZwEzU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="2-p6yzprhg0"><em>We’ve created GEN-1, our latest milestone in scaling robot learning. We believe it to be the first general-purpose AI model that crosses a new performance threshold: mastery of simple physical tasks. It improves average success rates to 99 percent on tasks where previous models achieve 64 percent, completes tasks roughly 3x faster than state-of-the-art, and requires only one hour of robot data for each of these results. 
GEN-1 unlocks commercial viability across a broad range of applications—and while it cannot solve all tasks today, it is a significant step toward our mission of creating generalist intelligence for the physical world.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7196b1368642643983789adf118058e9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2-P6YZPrHg0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://generalistai.com/blog/apr-02-2026-GEN-1">Generalist</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="g3uo2vawg64"><em>Legged manipulators offer high mobility and versatile manipulation. However, robust interaction with heterogeneous articulated objects, such as doors, drawers, and cabinets, remains challenging because of the diverse articulation types of the objects and the complex dynamics of the legged robot. In this paper, we propose a robust and sample-efficient framework for opening heterogeneous articulated objects with a legged manipulator.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4c8fd46f3a6762d3ce21fe34b5cad8dd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/g3Uo2vAWG64?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://openheart-icra.github.io/OpenHEART/">OpenHEART</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="obr6dncstbm"><em>By deeply coupling real-time depth perception with reinforcement learning motion control, Adam achieves natural humanlike stair-stepping gait, showing outstanding dynamic stability and environmental adaptability.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="298b7a30f4935bda98c7c8902b731b97" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OBR6DncstbM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="kswyn7ihhbu">The way these robots deliver packages will never not be amusing to me.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f77e1a0210ec4d02f17ed729e7abb860" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kSwYN7IhHbU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="spw8rm0a6gw"><em>Tether performs autonomous real-world functional play involving structured, task-directed interactions. 
We introduce a policy that performs trajectory warping anchored by keypoint correspondences, which is extremely data-efficient and robust to significant spatial and semantic environment variation. Running the policy within a VLM-guided multitask loop, we generate a stream of play data that consistently improves downstream policy learning over time.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1e1d2556d5297447f840842f6e9920d6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SPW8RM0a6gw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://tether-research.github.io/">Tether</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xvurnpqryok"><em>What happens when your walls begin to move? This paper explores the design of human-robot interaction for architectural-scale, shape-changing environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d1689bf5fae640e1c1ded34adde8daa7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xvUrNpQRYok?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://interactive-structures.org/publications/2026-04-fluent-interaction-cyber-physical-architecture/">Interactive Structures Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="j_suvhilbx4">I will admit to being somewhat disappointed about the reality of the Unreal Robotics Lab.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="46f9283ce1c88214401e69b13fcb1d4a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/J_suVHiLBX4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://github.com/URLab-Sim/UnrealRoboticsLab">URLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="9ecw7io-md4"><em>We’re not done yet! Illinois is back in the Final Four for the first time since 2005, and we’re cheering all the way to the championship. This video features teleoperated G1 and AI Worker robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d425261c79a348a97240984e9d86ada8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9eCw7io-MD4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="iy2rzubvhhq"><em>Fighting robots are cool. Destroying expensive electronics while fighting robots is not cool. 
We make robots out of plastic so our electronics survive.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="aa6096b1552bcaaffab7ae289d7974cb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iy2RzuBVHhQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://weaponizedplastic.com/">Weaponized Plastic Fighting League</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 10 Apr 2026 17:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-lamp</guid><category>Home-robots</category><category>Video-friday</category><category>Artificial-muscle</category><category>Agricultural-robots</category><category>Robot-ai</category><category>Quadruped-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robot-arms-hold-up-a-white-t-shirt-in-a-warm-wood-paneled-bedroom.png?id=65502238&amp;width=980"/></item><item><title>GoZTASP: A Zero-Trust Platform for Governing Autonomous Systems at Mission Scale</title><link>https://content.knowledgehub.wiley.com/goztasp-a-zero-trust-platform-for-governing-autonomous-systems-at-mission-scale/</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/technology-innovation-institute-logo-with-stylized-tii-and-curved-line.png?id=65498963&width=980"/><br/><br/><p>ZTASP is a mission-scale assurance and governance platform designed for autonomous systems operating in real-world environments. It integrates heterogeneous systems—including drones, robots, sensors, and human operators—into a unified zero-trust architecture. Through Secure Runtime Assurance (SRTA) and Secure Spatio-Temporal Reasoning (SSTR), ZTASP continuously verifies system integrity, enforces safety constraints, and enables resilient operation even under degraded conditions.</p><p>ZTASP has progressed beyond conceptual design, with operational validation at Technology Readiness Level (TRL) 7 in mission critical environments. Core components, including Saluki secure flight controllers, have reached TRL8 and are deployed in customer systems. While initially developed for high-consequence mission environments, the same assurance challenges are increasingly present across domains such as healthcare, transportation, and critical infrastructure.</p><p><span><a href="https://content.knowledgehub.wiley.com/goztasp-a-zero-trust-platform-for-governing-autonomous-systems-at-mission-scale/" target="_blank">Download this free whitepaper now!</a></span></p>]]></description><pubDate>Thu, 09 Apr 2026 15:06:39 +0000</pubDate><guid>https://content.knowledgehub.wiley.com/goztasp-a-zero-trust-platform-for-governing-autonomous-systems-at-mission-scale/</guid><category>Autonomous-systems</category><category>Drones</category><category>Sensors</category><category>Transportation</category><category>Type-whitepaper</category><dc:creator>Technology Innovation Institute</dc:creator><media:content medium="image" type="image/png" url="https://assets.rbl.ms/65498963/origin.png"/></item><item><title>What Happened When We Set Up a Robotics Lab in a Mall</title><link>https://spectrum.ieee.org/boston-dynamics-spot-interaction</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/families-watch-a-colorful-robotic-dog-demo-at-a-robotics-and-ai-institute-exhibit.jpg?id=65453180&width=1245&height=700&coordinates=0%2C62%2C0%2C63"/><br/><br/><p>Building the next generation of robots for successful integration into our homes, offices, and factories is more than just solving the hardware and software problems—we also need to understand how they will be perceived and how they can work effectively with people in those spaces.</p> <p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <a href="https://rai-inst.com/"></a><a class="shortcode-media-lightbox__toggle shortcode-media-controls__button material-icons" title="Select for lightbox">aspect_ratio</a><a href="https://rai-inst.com/" target="_blank"><img alt="Robotics and AI Institute logo with text about post originally appearing there" class="rm-shortcode" data-rm-shortcode-id="09961581414b810cff45f77932185cb3" data-rm-shortcode-name="rebelmouse-image" id="89ff0" loading="lazy" src="https://spectrum.ieee.org/media-library/robotics-and-ai-institute-logo-with-text-about-post-originally-appearing-there.png?id=65453513&width=980"/></a> </p><p>In summer 2025, <a href="https://spectrum.ieee.org/boston-dynamics-ai-institute-hyundai" target="_blank">RAI Institute</a> set up a free pop-up robot experience in the CambridgeSide mall, designed to let people experience state-of-the-art robotics first hand. While news stories about robots and AI are common, with some being overly critical and some overly optimistic, most people have not encountered robots in the flesh (or metal) as it were. With no direct experience, their opinions are largely shaped by pop culture and social media, both of which are more focused on sensational stories instead of accurate information about how the robots might be used effectively and where the technology still falls short. Our goal with the pop-up was twofold: first, to give people an opportunity to see robots that they would otherwise not have a chance to experience; and second, to better understand how the public feels about interacting with these robots.</p><h2>Designing a Robot Experience for the General Public</h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Three experimental robotic prototypes displayed behind barriers in a bright gallery." class="rm-shortcode" data-rm-shortcode-id="a1fe59976ca74226f29b65137649c4d4" data-rm-shortcode-name="rebelmouse-image" id="c9163" loading="lazy" src="https://spectrum.ieee.org/media-library/three-experimental-robotic-prototypes-displayed-behind-barriers-in-a-bright-gallery.webp?id=65453673&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Some earlier version legged robots, built by the RAI Institute’s Executive Director, Marc Raibert</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Red robot dog and electric bike displayed in glass cases inside a modern mall." 
class="rm-shortcode" data-rm-shortcode-id="f0c655444535aac7e11e20510c8bbbae" data-rm-shortcode-name="rebelmouse-image" id="6b96a" loading="lazy" src="https://spectrum.ieee.org/media-library/red-robot-dog-and-electric-bike-displayed-in-glass-cases-inside-a-modern-mall.webp?id=65453671&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The ANYmal by ANYrobotics (left) and a previous model of the RAI Institute’s UMV (right)</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small></p><p><span>The pop-up space had two areas: a museum area where people could see historical and modern robots, including some <a href="https://spectrum.ieee.org/marc-raibert-boston-dynamics-instutute" target="_blank">RAI Institute</a> builds like the </span><a href="https://rai-inst.com/resources/blog/designing-wheeled-robotic-systems/" target="_blank">UMV</a>,<span> and an interactive experience called “Drive-a-Spot.” This area was a driving arena where anyone who came by could take the controls of a Spot quadruped, one of the more recognizable, commercially available robots today.</span></p><p>The guest robot drivers used a custom controller built on an adaptive video game controller that was designed so that anyone of any age could use it. It featured basic controls: move forward, back, left, right, adjust height, sit, stand, and tilt. The buttons were large so that tiny or elderly hands could use the controller, and the people who drove Spot ranged in age from 2 to over 90.<br/></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Adaptive gaming controller with large programmable buttons on a black table." class="rm-shortcode" data-rm-shortcode-id="d191483045e332282c7d73dac0962f80" data-rm-shortcode-name="rebelmouse-image" id="2545f" loading="lazy" src="https://spectrum.ieee.org/media-library/adaptive-gaming-controller-with-large-programmable-buttons-on-a-black-table.jpg?id=65453210&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The guest robot drivers used a custom controller built on an adaptive video game controller that was designed so that anyone of any age could use it.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small></p><p>The demo area was designed to be a bit challenging for the Spot robot to maneuver in—it contained tight passages, low obstacles to step over, a barrier to crouch under, and taller objects the robot had to avoid. Much to the surprise of many of our guests, Spot is able to autonomously adjust itself to traverse and avoid those obstacles when being supervised by the joystick.<br/></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="1c2dcee3b7a437fc3f967b9095f81e91" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dPjUkJGC5Xg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small> </p><p><span>The driving arena’s theme rotated every few weeks across four scenarios: a factory, a home, a hospital, and an outdoor/disaster environment. 
<p><span>The driving arena’s theme rotated every few weeks across four scenarios: a factory, a home, a hospital, and an outdoor/disaster environment. These were chosen to contrast settings where robots are broadly accepted (industrial, emergency response) with settings where public ambivalence is well documented (domestic, healthcare).</span></p><p>The visitors who chose to drive the Spot robot could also participate in a short survey before and after their driving experience. The survey focused on two core dimensions:</p><ul><li><strong>Comfort:</strong> How comfortable would you feel if you encountered a robot in a factory, home, hospital, office, or outdoor/disaster scenario?</li><li><strong>Suitability:</strong> How well would this robot work in each of those contexts?</li></ul><p>The survey also recorded emotional reactions immediately after driving, likelihood to recommend the experience, and open-ended responses about what participants found memorable or surprising. We were careful to separate the environment participants drove through from the scenarios they were asked to evaluate in the survey. This distinction is important for interpreting the results given below.</p><h2>Did Interacting With the Robot Change People’s Feelings about Robots?</h2><p><span>Of the approximately 10,000 guests who visited the Robot Lab, about 10 percent drove Spot and opted in to our surveys. Of those surveyed, more than 65 percent had seen images or videos of Spot robots online, but most had never seen one of the robots in person.</span></p><h3>Increased Comfort Through Experience</h3><p>Across all five contexts presented in the survey (factory, home, hospital, office, and outdoor/disaster scenarios), comfort scores increased significantly after the driving session. The effects were small to moderate in magnitude, but they were consistent and statistically robust after correcting for multiple comparisons across all participants spanning children to older adults.</p>
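<p>The analysis itself isn’t included in this post, but the shape of the test just described, a paired pre/post comparison per context with a correction across the five contexts, can be sketched in a few lines. The following sketch uses toy data and a Holm–Bonferroni correction, which is one common choice; the study may well have used a different procedure.</p><pre><code>import numpy as np
from scipy.stats import wilcoxon

CONTEXTS = ["factory", "home", "hospital", "office", "outdoor/disaster"]

def holm(pvals, alpha=0.05):
    """Holm-Bonferroni step-down: test the smallest p-value first."""
    pvals = np.asarray(pvals)
    m = len(pvals)
    reject = np.zeros(m, dtype=bool)
    for rank, i in enumerate(np.argsort(pvals)):
        if pvals[i] > alpha / (m - rank):
            break  # once one test fails, all larger p-values fail too
        reject[i] = True
    return reject

# Toy stand-in data: per-participant 1-5 comfort ratings, before and after.
rng = np.random.default_rng(0)
pre = {c: rng.integers(1, 6, size=200) for c in CONTEXTS}
post = {c: np.clip(pre[c] + rng.integers(0, 2, size=200), 1, 5) for c in CONTEXTS}

# One paired, nonparametric test per context (the ratings are ordinal).
pvals = [wilcoxon(pre[c], post[c]).pvalue for c in CONTEXTS]
for c, p, sig in zip(CONTEXTS, pvals, holm(pvals)):
    print(f"{c}: p={p:.2g} significant={sig}")
</code></pre><p>On toy data this skewed, every context comes out significant; in the real study, the interesting part is the pattern of effect sizes across contexts, not the p-values themselves.</p>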
<p>The largest gain appeared in the outdoor/disaster context, which started with low comfort despite high perceived suitability. People already thought Spot would be useful in search-and-rescue scenarios; they just weren’t comfortable with it performing in that scenario. This discomfort may stem from media portrayals of quadruped robots in military contexts. A few minutes of hands-on control appears to partially dissolve that apprehension.</p><p>Participants who drove through the factory-themed arena showed no significant increase in comfort, but this scenario already had the highest baseline rating of any context, leaving little room for improvement.</p><p>No matter their previous experience, most people were neutral about having a Spot robot in their home before their driving experience. However, after the experience of controlling the Spot robot, people had a statistically significant increase in their comfort at having a Spot in their home and also felt that a Spot robot was more suitable for work in any environment, not just the one they had driven it in.</p><h3>Better Understanding of Where Robots Can Fit Into Daily Life</h3><p>Perceived suitability for Spot to operate in each context also increased. However, the pattern in the data is different. The largest gains weren’t in the high-baseline industrial and outdoor contexts. They were in home, office, and hospital—the very environments where people started out most skeptical.</p><p>Participants who drove the Spot robot in a home-themed environment didn’t just consider homes more suitable for robots; they also rated hospitals and offices as more suitable. This result suggests that hands-on control alters something more fundamental than just context-specific familiarity. It may change a person’s underlying understanding of a robot’s capabilities and, consequently, where they believe robots are appropriate.</p><h3>Results by Demographic</h3><p>The hands-on experience seems to be similarly effective across genders, although it does not completely eliminate existing disparities. For example, men reported higher baseline comfort than women across all five contexts. However, all genders improved at similar rates after interaction. The gap didn’t significantly widen or close in most contexts, though it did narrow in factory and office settings.</p><p>Age effects were more context-dependent. Children (aged 8–17) rated factory environments as less comfortable and less suitable before the study. However, this could be because most children do not have experience with factory settings or industrial environments. After interaction, this gap largely persisted. By contrast, children showed stronger gains in office comfort than older adults and entered the study rating home contexts more favorably than adults did.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Stacked bar chart of survey participants by age group and gender categories." class="rm-shortcode" data-rm-shortcode-id="91a6e3f855ba0152f034182d4710df9d" data-rm-shortcode-name="rebelmouse-image" id="313e6" loading="lazy" src="https://spectrum.ieee.org/media-library/stacked-bar-chart-of-survey-participants-by-age-group-and-gender-categories.jpg?id=65453246&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Participants ranged from age 8 to over age 75.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small></p><p><span>Participants who had previously driven Spot (mainly robotics professionals) began with higher comfort across the board. But after the hands-on session, people with no prior exposure caught up to experienced drivers. This level of familiarity would be difficult to replicate with images and videos alone.</span></p><h3>Post-Interaction Results</h3><p>Post-interaction emotional data was overwhelmingly positive. “Excitement” was reported by 74 percent of participants, “happiness” by 50 percent, and only 12 percent reported “nervousness.” Over 55 percent rated the experience as “brilliant,” and 62 percent said they were very likely to recommend it to a friend.</p><p>The open-ended responses added a lot more color. The most commonly mentioned moments were locomotion and terrain adaptation (22 percent), which included the way Spot navigated steps, tight spaces, and uneven ground, and expressive tilt movements (also 22 percent), which people found surprisingly doglike or dancelike. A smaller set of responses (3 percent) described anthropomorphic reactions: worrying about “hurting” the robot or finding its behavior “silly” in a way that prompted a genuine emotional response.</p><p>When asked what tasks they’d want a robot to perform, responses shifted meaningfully. Before driving, answers clustered around domestic assistance and heavy or hazardous labor. 
After driving, domestic help remained prominent, but entertainment and play jumped from 7.5 percent to 19.4 percent. Companionship also appeared at 5 percent. References to hazardous or industrial tasks declined as people who had operated the robot began imagining it as a companion and playmate, not just a labor-replacement tool.</p><h2>Key Takeaways from the Robot Lab</h2><p>In the not-so-distant future, robots will become more common in public and private spaces. But whether that integration into daily life will be accepted by the general public remains to be seen. The standard approach to building acceptance has been passive exposure such as videos, exhibits, and articles. This study suggests that giving people agency and letting them actually operate a robot is a qualitatively different intervention.</p><p>Short, well-designed, hands-on encounters can raise comfort in precisely the social domains where ambivalence is highest and where future robotics deployment will likely take place. This hands-on experience shouldn’t be limited to tech conferences and museums, as it may be valuable for more than just entertainment.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Children control a robot car at a tech booth with staff and jungle-themed backdrop" class="rm-shortcode" data-rm-shortcode-id="561f653ae87e1468c7ac31ac92d0fe00" data-rm-shortcode-name="rebelmouse-image" id="a32d5" loading="lazy" src="https://spectrum.ieee.org/media-library/children-control-a-robot-car-at-a-tech-booth-with-staff-and-jungle-themed-backdrop.jpg?id=65453264&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Fun for all ages!</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small></p><p><span>We consider the pop-up a success, but as with all experiments, we also learned a lot along the way. In addition to the increased comfort with robots, we found that the guests who visited our space really enjoyed talking to the robotics experts who staffed the location. For many people, the opportunity to talk to a roboticist was as unique as the opportunity to drive a robot, and in the future, we are excited to continue to share our technical work as well as the experiences of our humans, in addition to our humanoids.</span></p><p>Does building a space where folks can experience robots firsthand have the potential to create meaningful, long-term attitude shifts? That remains an open question. 
But the effect’s direction and consistency across different situations, ages, and genders are hard to ignore.</p><div class="horizontal-rule"></div><p><a href="https://rai-inst.com/wp-content/uploads/2026/03/HRI26-Pop-Up_Encounters_with_Spot.pdf" target="_blank">Pop-Up Encounters With Spot: Shaping Public Perceptions of Robots Through Hands-On Experience</a>, by Hae Won Park, Georgia Van de Zande, Xiajie Zhang, Dawn Wendell, and Jessica Hodgins from the RAI Institute and the MIT Media Lab, was presented last month at the <a href="https://humanrobotinteraction.org/2026/" target="_blank">2026 ACM/IEEE International Conference on Human-Robot Interaction</a> in Edinburgh, Scotland.</p>]]></description><pubDate>Sun, 05 Apr 2026 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/boston-dynamics-spot-interaction</guid><category>Boston-dynamics</category><category>Legged-robots</category><category>Spot-robot</category><dc:creator>Dawn Wendell</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/families-watch-a-colorful-robotic-dog-demo-at-a-robotics-and-ai-institute-exhibit.jpg?id=65453180&amp;width=980"/></item><item><title>Video Friday: Digit Learns to Dance—Virtually Overnight</title><link>https://spectrum.ieee.org/video-humanoid-dancing</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/bipedal-teal-robot-practices-side-to-side-dance-move-with-arm-movement.gif?id=65460048&width=1245&height=700&coordinates=0%2C47%2C0%2C47"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://roboticsconference.org/">RSS 2026</a>: 13–17 July 2026, SYDNEY</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="pc-n6aciusu"><em>Getting Digit to dance takes more than putting on some fancy shoes—our AI Team can teach Digit new whole-body control capabilities overnight. Using raw motion data from mocap, animation, and teleop methods, Digit gets new skills through sim-to-real reinforcement training.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4477bcbaf1f5072afe88c2c0015eebd1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Pc-n6ACIuSU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.agilityrobotics.com/">Agility</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="sy2xyrmv44y"><em>We’ve created GEN-1, our latest milestone in scaling robot learning. We believe it to be the first general-purpose AI model that crosses a new performance threshold: mastery of simple physical tasks. It improves average success rates to 99% on tasks where previous models achieve 64%, completes tasks roughly 3x faster than state of the art, and requires only 1 hour of robot data for each of these results. GEN-1 unlocks commercial viability across a broad range of applications—and while it cannot solve all tasks today, it is a significant step towards our mission of creating generalist intelligence for the physical world.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bbbeecb0e15f3b78f50b3ebf230ecf33" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SY2xyrmV44Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://generalistai.com/blog/apr-02-2026-GEN-1">Generalist</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pn_bj5-qyw8"><em>Unitree open-sources UnifoLM-WBT-Dataset—high-quality real-world humanoid robot <a data-linked-post="2650273084" href="https://spectrum.ieee.org/mit-humanoid-robot-teleoperation-dynamic-tasks" target="_blank">whole-body teleoperation</a> (WBT) dataset for open environments. 
Publicly available since March 5, 2026, the dataset will continue to receive high-frequency rolling updates. It aims to establish the most comprehensive real-world humanoid robot dataset in terms of scenario coverage, task complexity, and manipulation diversity.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bd19da6e3dfeb2ede20007b534d1b9a6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pN_bj5-QyW8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://huggingface.co/collections/unitreerobotics/unifolm-wbt-dataset">Hugging Face</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="79mr-_-a9js"><em>Autonomous mobile robots operating in human-shared indoor environments often require paths that reflect human spatial intentions, such as avoiding interference with pedestrian flow or maintaining comfortable clearance. This paper presents MRReP, a Mixed Reality-based interface that enables users to draw a Hand-drawn Reference Path (HRP) directly on the physical floor using hand gestures.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="783457e452248043a5ec6e2898ae5289" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/79mR-_-a9js?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mertcookimg.github.io/mrrep/">MRReP</a> ]</p><p>Thanks, Masato!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="97qialc5hnm"><em>Eye contact, even momentarily between strangers, plays a pivotal role in fostering human connection, promoting happiness, and enhancing belonging. Through autonomous navigation and adaptive mirror control, Mirrorbot facilitates serendipitous, nonverbal interactions by dynamically transitioning reflections from self-focused to mutual recognition, sparking eye contact, shared awareness, and playful engagement.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="232f93e3a45a2e11d81366bb7ed95286" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/97qIaLC5hNM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arl.human.cornell.edu/research-MirrorBot.html">ARL</a> ] via [ <a href="https://news.cornell.edu/stories/2026/04/mirrorbot-fostering-human-connection">Cornell University</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jya06ffonyg"><em>Experience PAL Robotics’ new teleoperation system for TIAGo Pro, the AI-ready mobile manipulator designed for advanced research. 
This real-time VR teleoperation setup allows precise control of TIAGo Pro’s dual arms in Cartesian space, ideal for remote manipulation, AI data collection, and robot learning.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="86699af54f2bfd064590b0cd59aa3f8c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jya06FFONyg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pal-robotics.com/robot/tiago-pro/">PAL Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="t52sq8gk5ks">Utter brilliance from Robust AI. No notes.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="71e7d47e220a5b61b914c1491f1df3dc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/T52SQ8Gk5Ks?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robust.ai/">Robust AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="w8lqu8dkvp4"><em>Come along with our Senior Test Engineer, Nick L., as he takes us on a tour of the <a data-linked-post="2650277831" href="https://spectrum.ieee.org/qa-irobot-roomba-i7" target="_blank">Home Test Labs</a> inside the iRobot HQ.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="56a753f2b7e0640f199e35246a22843f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/W8lQU8dKvP4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.irobot.com/en_US/our-story.html">iRobot</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gjukjrwjpxg"><em>By automating the final “magic 5%” of production—the precise trimming of swim goggles’ silicone gaskets based on individual face scans—UR cobots allow THEMAGIC5 to deliver affordable, custom-fit goggles, enabling the company to scale from a Kickstarter sensation to selling over 400,000 goggles worldwide.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="76ebeda03bf930b9cd576a8e870f8dad" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GJukJRWjPxg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.universal-robots.com/case-stories/non-stop-robot-precision-for-7-years-cobots-deliver-the-last-magic-5-in-swim-goggle-production/">Universal Robots</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="x16ht1erjhk"><em>Sanctuary AI has once again demonstrated its industry-leading approach to training dexterous manipulation policies for its advanced hydraulic hands. 
In this video, their proprietary hydraulic hand autonomously manipulates a lettered cube, continuously reorienting it to match a specified goal (displayed in the bottom-left corner of the video).</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ad1d77f7ce4f331c7e74b0b779ff6cae" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/X16Ht1ERjHk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.sanctuary.ai/">Sanctuary AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="r3toz2pgppy"><em>China’s Yuxing 3-06 commercial experimental satellite, the first of its kind to be equipped with a flexible robotic arm, has recently completed an in-orbit refueling test and verification of key technologies. The test paves the way for Yuxing 3-06, dubbed a “space refueling station,” to refuel other satellites in orbit, manage space debris, and provide other in-orbit services.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="eaf9d2765bb1e0ebff60f038ccba42fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/R3TOZ2PgPPY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mp.weixin.qq.com/s/1c-9aNwuXv_p-VhojMkwwA">Sanyuan Aerospace</a> ] via [ <a href="https://spacenews.com/chinese-startup-tests-flexible-robotic-arm-in-space-for-on-orbit-servicing/">Space News</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="z4poalprrhe"><em>This is a demonstration of natural walking, whole-body teleoperation, and motion tracking with our custom-built humanoid robot. The control policies are trained using large-scale parallel reinforcement learning (RL). By deploying robust policies learned in a physics simulator onto the real hardware, we achieve dynamic and stable whole-body motions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="703bacdcb0167fb3aa9bfe36e1da07ac" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/z4POaLPRRhE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.tokyo/">Tokyo Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="5olcwku7l9u"><em>Faced with aging railway infrastructure, a shrinking workforce and rising construction costs, Japan Railway West asked construction innovator Serendix to replace an old wooden building at its Hatsushima railway station using its 3D printing technology. 
An ABB robot enabled the company to assemble the new building in a single night ready for the first train service the next day.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="031eec5b200f86cdad72129d9a002cfc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5olcWkU7l9U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.abb.com/global/en/news/134689">ABB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="1k1phiqcfty"><em>Humanoid, SAP, and Martur Fompak team up to test humanoid robots in automotive manufacturing logistics. This joint proof of concept explores how robots can streamline operations, improve efficiency, and shape the future of smart factories.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cc54aa14687108db3bc231b8cc456fea" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1K1phiQCftY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://thehumanoid.ai/">Humanoid</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="oqglmefwbt8">This MIT Robotics Seminar is from Dario Floreano at EPFL, on “Avian Inspired Drones.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7013e7fe97df52eb328681b647c9fddc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oqglMEFWBt8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.mit.edu/robotics-seminar/">MIT</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="etk5es0jvm4">This MIT Robotics Seminar is from Ken Goldberg at UC Berkeley: “Good Old-Fashioned Engineering Can Close the 100,000 Year ‘Data Gap’ in Robotics.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="710bc514cbab6092dc5f439cf03127c6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EtK5es0jVM4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.mit.edu/robotics-seminar/">MIT</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 03 Apr 2026 16:30:01 +0000</pubDate><guid>https://spectrum.ieee.org/video-humanoid-dancing</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Robot-ai</category><category>Human-robot-interaction</category><category>Teleoperation</category><category>Industrial-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/bipedal-teal-robot-practices-side-to-side-dance-move-with-arm-movement.gif?id=65460048&amp;width=980"/></item><item><title>Humanoid Robots Hit a Turning Point as Their Brains Catch 
Up</title><link>https://spectrum.ieee.org/humanoid-robots-gill-pratt-darpa</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-smiling-bespectacled-bearded-man-kneels-posed-behind-a-robotic-torso.jpg?id=65446567&width=1245&height=700&coordinates=0%2C187%2C0%2C188"/><br/><br/><p>In 2012, the U.S. Defense Advanced Research Projects Agency announced the <a href="https://spectrum.ieee.org/darpa-robotics-challenge-here-are-the-official-details" target="_self">DARPA Robotics Challenge</a> (DRC). The multiyear, multimillion-dollar competition for disaster robotics resulted in <a href="https://spectrum.ieee.org/darpa-unveils-atlas-drc-robot" target="_self">Boston Dynamics’ Atlas</a>, some <a href="https://spectrum.ieee.org/darpa-robotics-challenge-amazing-moments-lessons-learned-whats-next" target="_self">absolutely incredible moments</a> from one of the very first generations of useful humanoid robots, and <a href="https://www.youtube.com/watch?v=g0TaYhjpOfo" rel="noopener noreferrer" target="_blank">a blooper video</a> that will live on forever.</p><p><a href="https://www.tri.global/about-us/dr-gill-pratt" rel="noopener noreferrer" target="_blank">Gill Pratt</a>, the architect of the competition, had a very clear understanding of what the DRC was going to do for robotics. “The reason [for the DARPA Robotics Challenge] is actually to push the field forward and make this capability a reality,” <a href="https://spectrum.ieee.org/darpa-robotics-challenge-qa-with-gill-pratt" target="_self">Pratt told <em><em>IEEE Spectrum</em></em> in 2012</a>. At the time, he pointed out that before the <a href="https://spectrum.ieee.org/sand-trap" target="_self">DARPA Grand Challenge</a> in 2004 and the <a href="https://spectrum.ieee.org/autonomous-vehicles-complete-darpa-urban-challenge" target="_self">DARPA Urban Challenge</a> in 2007, driverless cars for complex environments essentially did not exist. He saw the DRC doing the same thing for robotics.</p><p>It’s been about a decade since <a href="https://spectrum.ieee.org/darpa-robotics-challenge-finals-winner" target="_self">the conclusion of the DARPA Robotics Challenge</a>, and many in the industry believe humanoid robots are about to have the transformative moment that Pratt predicted. But as is common in robotics, things tend to be far more difficult than it seems like they should be. <em><em>Spectrum</em></em> checked in with Pratt, now the <a href="https://www.linkedin.com/in/gillpratt/" rel="noopener noreferrer" target="_blank">CEO of the Toyota Research Institute</a> (TRI), to find out what’s holding humanoid robotics back, what he thinks these robots should be doing (or not doing), and how to navigate the humanoid hype bubble. </p><p><strong>What do you think about this robotics moment that we’re in?</strong></p><p><strong>Gill Pratt:</strong> What has changed is actually not about humanoids. Many people have been building research robots in the humanoid form for a long time. What’s different now isn’t the body, but the brain. We have always had this disparity in the robotics field where the mechanisms we were building were incredibly capable, but we didn’t really have the means for making the utility of the robot match that potential. Now we actually do, and that’s because of the AI revolution that has happened over the last few years.</p><p><strong>It’s very tempting to look back 10 years and directly credit the DRC with a lot of what is now happening with commercial humanoids. 
Is there any reason </strong><em><strong><em>not </em></strong></em><strong>to do that?</strong></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A smiling man poses with his arm around two humanoid robots, one with a shell on, and one with electronics exposed." class="rm-shortcode" data-rm-shortcode-id="7958768fc634cfa9e3071e39840d118e" data-rm-shortcode-name="rebelmouse-image" id="47f73" loading="lazy" src="https://spectrum.ieee.org/media-library/a-smiling-man-poses-with-his-arm-around-two-humanoid-robots-one-with-a-shell-on-and-one-with-electronics-exposed.jpg?id=65446571&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Gill Pratt poses with an early version of NASA’s Valkyrie DRC robot.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Gill Pratt</small></p><p><strong>Pratt:</strong> No, but I want to be humble about it. The DRC was focused on half autonomy and half teleoperation in real time. There was remote supervision, and then semiautonomy to amplify that supervision to handle tasks in real time while the remote person was telling the robot what to do. That was all before the breakthroughs that have happened in AI recently.</p><p>What has changed now is that we have a way to essentially teach robots what to do, and make them competent in a way that doesn’t require writing code; you can just demonstrate the task to the robot instead. With a sufficient amount of that data and new AI methods, robots can be far more performant than ever before.</p><p><strong>But that data is a bottleneck, right? How do we know what it should consist of, and what a sufficient amount is to get a robot to do something reliably?</strong></p><p><strong>Pratt:</strong> This mirrors exactly the debate going on in large language models [LLMs]. You have certain people who believe that if you take LLMs—which are autoregressive predictors that guess what the next word should be based on past words—and patch them up with a variety of methods to solve their hallucinations, we’ll eventually get to a point where we can trust the AI system. And then there are other people, and I think Yann LeCun is the most well-known of them, who say that’s nonsense, and we need something else. His view, and I agree, is that we need world models. We need some way for the AI system to imagine, try things out, and truly reason.</p><p>And I know that we’re applying words like ‘reason’ to what are essentially pattern-matching systems. Saying that there’s ‘reasoning’ is just a sticker we put on whatever we’ve built; it’s not true reasoning.</p><h2>Data Bottlenecks in Robot Learning</h2><p><strong>This is an example of </strong><a href="https://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow" target="_blank"><strong>”system one” versus “system two”</strong></a><strong> thinking, right?</strong></p><p><strong>Pratt:</strong> Yes. System one is the fast, reflexive thinking we have, which is the kind of pattern matching that current LLMs do. System two is the slow reasoning that involves imagination and world models. That’s what we have not done yet. Progress on system one has been extraordinary, but we still don’t have system two. These attempts to patch system one to make it system two are like trying to squeeze a balloon filled with water; you squeeze it on one side and the water bulges out on the other side. 
You keep getting surprised that you fix one thing and something else breaks, and the performance overall doesn’t really get that much better.</p><p><strong>How have you been approaching this problem at TRI?</strong></p><p><strong>Pratt:</strong> Two years ago, <a href="https://medium.com/toyotaresearch/tris-robots-learn-new-skills-in-an-afternoon-here-s-how-2c30b1a8c573" target="_blank">we came up with diffusion policy</a>, and then we came up with what I call <a href="https://spectrum.ieee.org/boston-dynamics-toyota-research" target="_self">large behavior models</a> (LBMs). That involves having one model trained on many tasks, and showing that as you add each task, it actually helps with the other tasks and cuts down on the amount of training data needed to reach a given level of performance. These have been incredible system-one advances.</p><p>The breakthrough happened when we realized that diffusion could be applied to robot behavior. We discovered that operating in the behavior space, from vision in, to action out, worked incredibly well. That kicked off the whole field, and since then, I think every robotics demonstration that we’ve seen is using some form of diffusion policy to do what it’s doing. But again, this is system-one pattern matching: ‘If I see the world like this, I act on the world like that.’ The robot’s not imagining, thinking, and planning the way traditional robotics with hand coding used to do. It’s just reacting.</p>
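<p>To make that idea concrete: a diffusion policy starts from random noise and iteratively “denoises” it into a short sequence of actions, conditioned on what the robot currently sees. Below is a toy, DDPM-style version of that inference loop. It is an illustration, not TRI’s implementation: <code>eps_model</code> stands in for a trained noise-prediction network, and <code>obs</code> for whatever visual embedding conditions it.</p><pre><code>import torch

@torch.no_grad()
def sample_action_sequence(eps_model, obs, horizon=16, act_dim=7, steps=50):
    """Denoise pure noise into a short trajectory of future actions."""
    acts = torch.randn(1, horizon, act_dim)        # start from Gaussian noise
    for t in reversed(range(steps)):               # iterative denoising
        eps = eps_model(acts, torch.full((1,), t), obs)
        alpha = 1.0 - 0.02 * (t + 1) / steps       # toy noise schedule
        acts = (acts - (1.0 - alpha) * eps) / alpha**0.5
        if t > 0:                                  # re-inject a little noise
            acts = acts + (1.0 - alpha)**0.5 * torch.randn_like(acts)
    return acts  # executed receding-horizon: run a few steps, then re-plan
</code></pre><p>Training, in this framing, just teaches <code>eps_model</code> to predict the noise that was added to demonstrated action snippets, which is why human demonstrations, rather than hand-written code, are the input. It is also why the result is “vision in, action out” pattern matching rather than planning.</p>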
It’s wonderful seeing so many resources flowing into the robotics field, and I do think that something special has occurred. Things are not the way they were before, and there are so many possibilities when you think about people teaching robots how to do things.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="A smiling man gazes up at a humanoid robotic structure that is many times larger than him." class="rm-shortcode" data-rm-shortcode-id="c0e4ba89ea81f79fe4fce5cddd5edb27" data-rm-shortcode-name="rebelmouse-image" id="528a3" loading="lazy" src="https://spectrum.ieee.org/media-library/a-smiling-man-gazes-up-at-a-humanoid-robotic-structure-that-is-many-times-larger-than-him.jpg?id=65446572&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Gill Pratt admires a robot on the roof of the Ghibli Museum in Tokyo.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Gill Pratt</small></p><p><strong>What kinds of things should humans be teaching robots to do?</strong></p><p><strong>Pratt: </strong>For 10 years at TRI, <a href="https://spectrum.ieee.org/gill-pratt-toyota-elder-care-robots" target="_self">we’ve been thinking about society and aging</a>. It’s not just about physical disability; it’s about loneliness and loss of purpose, which are far more prevalent (and far worse) problems. And so the question is, what can we do technologically to help people feel younger?</p><p>At TRI, we’re exploring “care-receiving robots”—robots that receive teaching from a human. We have evolved to be creatures that love giving and love helping. When you program a machine by demonstration, and that machine goes on to help someone else, you feel a sense of purpose. We think robots can be bidirectional things to improve quality of life psychologically, not only physically.</p><p><strong>When you started TRI 10 years ago, I asked you what you would be focusing on, and your answer really stuck with me: You said elder care, because “we don’t have a choice.”</strong></p><p><strong>Pratt:</strong> Yes. The statistics in Japan and the U.S. are only getting worse, and we <em><em>don’t</em></em> have a choice. It’s important to remember that an aging society has a huge impact on young people. This is because of the dependency ratio, which is how many people in the workforce are supporting both those too young to work and those too old to work. Those numbers keep getting worse and worse.</p><p><strong>How do we solve this?</strong></p><p><strong>Pratt:</strong> We’ve had some incredible breakthroughs with system one, but it doesn’t mean the robots are going to be doing all that much, unless somebody makes a system-two breakthrough also. Or we need a system in which humans provide some level of system-two supervisory control.</p><p><strong>That kind of human supervisory control takes us right back to the DRC, doesn’t it?</strong></p><p><strong>Pratt: </strong>[Laughs] That’s exactly right!
Look, I’m not going to tell you not to praise the DRC… There was someone who called it the “<a href="https://www.youtube.com/watch?v=w222KFAiMQc" target="_blank">Woodstock of Robots</a>,” which just warmed my heart; that was so cool!</p><p><strong>So, 10 years later, how do you feel about the amount of hype in humanoid robotics right now?</strong></p><p><strong>Pratt: </strong>We are approaching what (I hope!) is a peak of inflated expectations for humanoids. And that’s because nobody’s thinking deeply enough about the system-one versus system-two thing.</p><p>Right now, our physical AI systems are just pattern matching. They’re incredibly capable, and it’s astonishing how good these things are—we are so proud of it. And we do believe that aggregating learning from many tasks through large behavior models will be incredibly effective. But it’s still not system two. There’s a lot of overpromising going on, and it’s very sad because it’s setting us up for a fall. What I’m worried about is the trough of disillusionment that will follow.</p><p><strong>How do we avoid that crash in robotics when the humanoid hype bubble bursts?</strong></p><p><strong>Pratt: </strong>For now, we need damping. In control systems, you stabilize an unstable system by adding damping. The press and the academic world can add lead compensation by reminding everyone that what we’re seeing in humanoids now isn’t really reasoning.</p><p>We should also remember that the automated-driving field went through a bubble burst of its own, and just a few companies survived it by keeping the hype down and being persistent. I think we should do that here, too.</p>]]></description><pubDate>Thu, 02 Apr 2026 15:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/humanoid-robots-gill-pratt-darpa</guid><category>Humanoid-robots</category><category>Darpa</category><category>Artificial-intelligence</category><category>Drc</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-smiling-bespectacled-bearded-man-kneels-posed-behind-a-robotic-torso.jpg?id=65446567&amp;width=980"/></item><item><title>Wi-Fi That Can Withstand a Nuclear Reactor</title><link>https://spectrum.ieee.org/robotics-in-nuclear-industry</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/close-up-of-a-receiver-chip.jpg?id=65428613&width=1245&height=700&coordinates=0%2C187%2C0%2C188"/><br/><br/><p>Researchers have made a Wi-Fi receiver that’s tough enough to work inside a nuclear reactor. They hope the receiver might be part of a wireless communications system for robotics used to <a href="https://www.iaea.org/topics/decommissioning" rel="noopener noreferrer" target="_blank">decommission</a> reactors.</p><p>Yasuto Narukiyo, a graduate student at the Institute of Science Tokyo, <a href="https://ieeexplore.ieee.org/document/11408968" rel="noopener noreferrer" target="_blank">presented</a> the wireless receiver at the <a href="https://www.isscc.org/" rel="noopener noreferrer" target="_blank">IEEE International Solid-State Circuits Conference</a> (<a href="https://spectrum.ieee.org/tag/isscc" target="_blank">ISSCC</a>), in San Francisco in February. The receiver endured a total radiation dose of 500 kilograys, orders of magnitude higher than the doses typically tolerated by electronics in outer space.</p><p>After the 2011 nuclear disaster at the <a href="https://spectrum.ieee.org/special-reports/fukushima-and-the-future-of-nuclear-power/" target="_self">Fukushima Daiichi</a> plant, engineers began using robots to help characterize and clean up the site. Most of these require local area network (LAN) cables that can get tangled, says Narukiyo. His team, which includes his advisor <a href="https://strdb.s.isct.ac.jp/html/100002402_en.html" rel="noopener noreferrer" target="_blank">Atsushi Shirane</a> and <a href="https://www2.kek.jp/qup/member/miyahara.html" rel="noopener noreferrer" target="_blank">Masaya Miyahara</a> of Japan’s High Energy Accelerator Research Organization (KEK), is aiming to develop a wireless system for controlling robots in this harsh environment.</p><p>Even under less dramatic circumstances, nuclear plants don’t last forever, and they need to be safely dismantled and decontaminated so the sites can be reused, a process called decommissioning. The process is lengthy, and risks exposing people to radiation, which is why engineers hope robots can come to the rescue. </p><p>The need for such robots is only growing. According to a <a href="https://www.sciencedirect.com/science/article/pii/S1364032124003472" rel="noopener noreferrer" target="_blank">2024 study</a>, of 204 reactors that have been closed, only 11 plants with a capacity over 100 megawatts have been fully decommissioned, and 200 more reactors will reach the end of their lifetimes in the next 20 years.</p><p>While electronics for space exploration are typically required to endure radiation doses of 100 to 300 grays over three years, a robot operating in a nuclear reactor needs to endure more than 500 kGy over the course of six months, says Narukiyo—at least 1,000 times the dosage. A robotic arm made by KUKA was able to <a href="https://www.frontiersin.org/journals/robotics-and-ai/articles/10.3389/frobt.2020.00006/full" rel="noopener noreferrer" target="_blank">withstand</a> just 164.55 Gy of damage before failing. 
For comparison, the lens of the eye absorbs just <a href="https://www.epa.gov/radiation/radiation-terms-and-units" rel="noopener noreferrer" target="_blank">60 milligrays</a> during a CT scan of the brain.</p><h2>Radiation Hardening</h2><p>To “<a href="https://spectrum.ieee.org/self-healing-electronics-jupiter" target="_blank">harden</a>” the 2.4-gigahertz Wi-Fi receiver against intense levels of radiation, Narukiyo and his team changed its mix of components, minimized the total number of transistors, and tinkered with the geometry of the transistors that were left.</p><p>The transistors, silicon MOSFETs (metal-oxide semiconductor field-effect transistors), contain an oxide layer that’s particularly vulnerable to radiation damage. Blasts of gamma rays can trap positive charges in the oxide, degrading the device’s performance and causing errors. So using fewer transistors minimizes the problem. The researchers also made each transistor’s gate longer and wider. The gate controls the flow of current—longer, wider gates perform better under exposure to radiation.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="A tabletop metal cylinder with a circuit board connected to power plugs on top of it." class="rm-shortcode" data-rm-shortcode-id="f6dd940d1127aa3f80e4b75a102fc43c" data-rm-shortcode-name="rebelmouse-image" id="49944" loading="lazy" src="https://spectrum.ieee.org/media-library/a-tabletop-metal-cylinder-with-a-circuit-board-connected-to-power-plugs-on-top-of-it.jpg?id=65428642&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Researchers tested the Wi-Fi receiver by placing it on top of a radiation source.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Yasuto Narukiyo, Sena Kato, et al.</small></p><p>The group also considered the differences in how radiation affects PMOS transistors, MOSFETs in which current is carried primarily by positive charges, and NMOS, in which it is carried by electrons. PMOS transistors are more vulnerable to radiation damage because positive charge gets trapped both in the oxide and at the interface between the oxide and the rest of the semiconductor. These trapped charges add up and shift the transistor towards the “off” state, says Narukiyo. To compensate, the new receiver design minimizes the use of PMOS, replacing these transistors with other elements such as inductors that don’t have an oxide layer. NMOS transistors are more resilient, says Narukiyo, because positive charges trapped in the oxide are to some extent canceled out by negative charges that get trapped at the interface.</p><p>Narukiyo and his team measured the performance of the receiver before exposure to radiation, and again after blasting it with a total dose of 300 kGy and then 500 kGy. Before being irradiated, it showed comparable performance to typical Wi-Fi receivers. After reaching the highest radiation dose, the gain of the receiver had decreased by about 1.5 decibels (a factor of roughly 1.4).</p><p>Narukiyo says the receiver is hardened enough, and now he hopes to improve its performance. He’s also working on a transmitter, which would allow for two-way communications. This is more challenging due to the need to produce high levels of current to generate the Wi-Fi signal. He says an earlier version he tried was broken by a 300 kGy dose.
The group is exploring using other semiconductors, such as <a href="https://spectrum.ieee.org/diamond-electronics" target="_blank">diamond</a>, to toughen the transmitter.</p>]]></description><pubDate>Thu, 02 Apr 2026 14:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/robotics-in-nuclear-industry</guid><category>Wi-fi</category><category>Nuclear-reactors</category><category>Isscc</category><category>Decommissioning</category><category>Industrial-robots</category><category>Radiation-hardening</category><dc:creator>Katherine Bourzac</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/close-up-of-a-receiver-chip.jpg?id=65428613&amp;width=980"/></item><item><title>Scientists Build Living Robots With Nervous Systems</title><link>https://spectrum.ieee.org/neurobot-living-robot-nervous-system</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/close-up-of-a-neuro-robot-that-has-been-stained-to-highlight-multi-ciliated-cells-around-its-periphery.jpg?id=65444408&width=1245&height=700&coordinates=0%2C187%2C0%2C188"/><br/><br/><p>Engineers have long tried to mimic life. They’ve built machine learning algorithms <a href="https://spectrum.ieee.org/topographic-neural-network" target="_self"><span><span>modeled after the human brain</span></span></a>, designed machines that <a href="https://spectrum.ieee.org/boston-dynamics-research-spot" target="_self"><span><span>walk like dogs</span></span></a> or <a href="https://spectrum.ieee.org/flying-robot-bug" target="_self"><span><span>fly like insects</span></span></a>, and taught robots to adapt, <a href="https://spectrum.ieee.org/video-friday-morphing-robots" target="_self"><span><span>however clumsily</span></span></a>, to the world around them.</p><p>Now they are skipping imitation altogether.</p><p>Instead of taking inspiration from biology, they are building robots out of it: fashioning tiny, <a href="https://spectrum.ieee.org/aidesigned-living-robots-crawl-heal-themselves" target="_self">free-swimming assemblages of living cells</a> that organize into self-directed systems, complete with neurons that wire themselves into functional circuits.</p><p>The result, <a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/advs.202508967" target="_blank">reported last month in <em>Advanced Science</em></a>, is what the researchers call a “neurobot.”</p><p>These living machines could help scientists better understand how simple neural networks give rise to complex behaviors, a foundational step toward building cyborg systems that integrate biological tissue with engineered control. And with further refinement, they could be put to use in applications ranging from precision tissue repair to environmental cleanup.</p><p>“My general reaction is, ‘Wow, this is amazing!’ ” says <a href="https://cbs.umn.edu/directory/kate-adamala" target="_blank"><span>Kate Adamala</span></a>, a synthetic biologist at the University of Minnesota Twin Cities, who was not involved in the research. “This truly puts the engineering component into bioengineering.”</p><h2>Toward Internal Control</h2><p>Neurobots mark the latest advance in a <a href="https://journals.sagepub.com/doi/10.1089/soro.2022.0142" target="_blank">series of increasingly sophisticated biological machines</a> developed by Tufts University biologist <a href="https://allencenter.tufts.edu/our-team/michael-levin/" target="_blank">Michael Levin</a> and his collaborators.</p><p><a href="https://www.pnas.org/doi/10.1073/pnas.1910837117" target="_blank">First described in 2020</a>, these clusters of living cells, when removed from their normal developmental context and cultured in simple saline conditions, spontaneously self-organize in such a manner that they move and act in novel ways. 
Under the microscope, they look like irregular, translucent blobs of tissue, but their coordinated motion reveals an emergent order that is unlike anything found in the natural world.</p><p>“These things don’t occur naturally,” says <a href="https://www.binghamton.edu/ssie/people/profile.html?id=cgg" target="_blank">Carlos Gershenson</a>, a computer scientist at Binghamton University, State University of New York, who <a href="https://direct.mit.edu/artl/article/29/2/153/114834/Emergence-in-Artificial-Life?guestAccessKey=" target="_blank"><span>studies artificial life</span></a> and complex systems but was not involved in the neurobot research. “They’re made with natural cells, but we’re the ones arranging them.”</p><p>The <a href="https://www.science.org/doi/full/10.1126/scirobotics.abf1571" target="_blank">earliest examples of this technology</a>, called xenobots, were built from frog-derived tissues and mainly from a single type of structural cell. Despite the simplicity of their construction, however, they could propel themselves through water using beating hair-like projections called cilia. They survived for days without added nutrients. And they could repair minor damage, all without any scaffolding materials or genetic manipulation. <a href="https://www.pnas.org/doi/10.1073/pnas.2112672118" target="_blank">Some could even self-replicate</a> by spontaneously sweeping up loose stem cells.</p><p>Still, for all the novelty of these biological machines, their behavior was essentially mechanical. Their movements were driven by anatomy and physics, not by anything resembling internal control. They could sense chemical cues, change direction accordingly, and even retain traces of past experiences, as <a href="https://www.biorxiv.org/content/10.64898/2026.03.17.712168v1" target="_blank">detailed in a preprint posted 17 March on <em>bioRxiv</em></a>.</p><p>But many other simple organisms—fungi, protists, and bacteria included—can do much the same. To achieve more flexible, coordinated behavior, they would need a way to integrate information across the body and dynamically direct their actions. Neurobots begin to provide that missing layer of control.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="f86434d62c5577170353478e6aeab577" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wrIpHUmYKBE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">Small tufts of hairlike cilia, combined with the neurobot’s nervous system, allow it to move on its own.</small> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Haleh Fotowat</small> </p><h2>Linking Neural Activity to Action</h2><p>Like earlier xenobots, neurobots are still built from frog cells, but they are now endowed with neurons that mature from partially differentiated stem cells. These nerve cells develop alongside structural tissues, forming branching connections throughout the autonomous beings.
This means they can relay electrochemical signals from cell to cell.</p><p>And unlike other laboratory models of the nervous system—<a href="https://spectrum.ieee.org/organoid-intelligence-computing-on-brain" target="_self">brain organoids</a>, say, or <a href="https://spectrum.ieee.org/biological-computer-for-sale" target="_self">lab-on-a-chip</a> technologies—neurobots move. They swim, explore, and respond to their surroundings in ways that tie electrical signaling to observable movement, producing patterns of physical activity distinct from those of their non-neural counterparts.</p><p>Neurobots spend less time idling and more time exploring. They also trace looping and spiraling paths rather than repeating simple trajectories. And they respond differently to neuroactive drugs.</p><p>If the organizing principles that enable these internally guided motions and reflexes can now be deciphered, they could then be harnessed to produce more predictable biological functions, says <a href="https://wyss.harvard.edu/team/advanced-technology-team/haleh-fotowat/" target="_blank">Haleh Fotowat</a>, a neuroengineer from Harvard’s Wyss Institute for Biologically Inspired Engineering, who collaborated with Levin’s team on the study.</p><p>“We’re still very early in terms of understanding the system and its capabilities.” But once the scientists understand how the neurobots self-organize, she says, “then we can begin to engineer on top of that.”</p><p>Beyond the practical, neurobots also raise deeper epistemological questions about the nature of biological organization, notes Levin. “Where does form and function come from in the first place?” he asks. “When it’s not evolved and it’s not engineered, where do these patterns come from?”</p><p>“This is a model system for asking those kinds of questions,” Levin says—in frog and human constructs alike.</p><h2>From Discovery to Deployment</h2><p>Among the many variations on the biobot theme are “<a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/advs.202303575" target="_blank"><span><span>anthrobots</span></span></a><span>,</span>” built from clusters of human lung cells instead of frog tissue.</p><p>Levin’s team now plans to add human neural cells to their anthrobots, extending the neurobot framework into a fully human context. Then, through further conditioning and guided learning, these living machines—like <a href="https://spectrum.ieee.org/using-a-twopronged-approach-to-detect-explosive-substances-from-bombs" target="_self">dogs trained to sniff for bombs</a>—may become capable of adapting their behavior in predictable ways.</p><p>“The hope would be that you could teach them or train them to do what you want them to do,” says <a href="https://www.uvm.edu/cems/cs/profile/josh-bongard" target="_blank">Josh Bongard</a>, a computer scientist and roboticist at the University of Vermont.</p><p>Bongard was not involved in the neurobot study but is a frequent collaborator of Levin’s.
Together, they cofounded the nonprofit <a href="https://icdorgs.org/" target="_blank">Institute for Computationally Designed Organisms</a> and a commercial startup, <a href="https://www.faunasystems.com/" target="_blank">Fauna Systems</a>, to advance biobot-related technologies.</p><p>According to Fauna CEO <a href="https://www.linkedin.com/in/naimish-patel-925a84" target="_blank">Naimish Patel</a>, the company is initially targeting environmental sensing applications, aiming to deploy xenobots in settings such as aquaculture, wastewater monitoring, and pollutant detection, where the technology’s ability to integrate multiple signals could provide an early readout of ecosystem health.</p><p> If the xenobots encounter a mixture of stressors—say, elevated heavy metals, shifts in pH, and traces of agricultural runoff—their collective changes in movement or activity could provide a sensitive, real-time signal that something in the environment is amiss. </p><p>Precedent for this idea comes from Poland, where many cities already use <a href="https://www.atlasobscura.com/articles/wild-life-excerpt-water-quality-mussels" target="_blank">freshwater mussels as living sentinels of water quality</a>, wired with sensors that register when the animals clamp their shells shut in response to pollutants. Xenobots could extend this concept further, Patel says, potentially offering greater sensitivity and specificity by integrating multiple environmental cues into a single, measurable behavioral response. And neurobots could eventually push this fusion of sensing and computation into ever more sophisticated territory, he adds.</p><p>But the technical hurdles remain substantial—and the practical opportunities with simpler, non-neural versions are already compelling—so the first-gen xenobots, for the time being,  remain the focus of Fauna’s initial product-development efforts, Patel says. “Right now, we’re looking for the intersection between unmet commercial need and emerging capability.” </p>]]></description><pubDate>Thu, 02 Apr 2026 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/neurobot-living-robot-nervous-system</guid><category>Bioengineering</category><category>Frog</category><category>Living-cells</category><category>Biomimetics</category><category>Bioinspired-robots</category><dc:creator>Elie Dolgin</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/close-up-of-a-neuro-robot-that-has-been-stained-to-highlight-multi-ciliated-cells-around-its-periphery.jpg?id=65444408&amp;width=980"/></item><item><title>Video Friday: Beep! Beep! Roadrunner Bipedal Bot Breaks the Mold</title><link>https://spectrum.ieee.org/roadrunner-bipedal-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/two-wheeled-balancing-robot-leans-while-rolling-in-an-indoor-testing-lab.png?id=65415603&width=1245&height=700&coordinates=3%2C0%2C4%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://roboticsconference.org/">RSS 2026</a>: 13–17 July 2026, SYDNEY</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="9kae-uame1u"><em>“Roadrunner” is a new bipedal wheeled robot prototype designed for multimodal locomotion. It weighs around 15 kg (33 lb) and can seamlessly switch between its side-by-side and in-line wheel modes and stepping configurations depending on what is required for navigating its environment. The robot’s legs are entirely symmetric, allowing it to point its knees forward or backward, which can be used to avoid obstacles or manage specific movements. A single control policy was trained to handle both side-by-side and in-line driving. Several behaviors, including standing up from various ground configurations and balancing on one wheel, were successfully deployed zero-shot on the hardware.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="76bd6c7edd7ff24700dad004edd086aa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9kae-UAME1U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rai-inst.com/">Robotics and AI Institute</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="tyasuwrkv4e">Incredibly (INCREDIBLY!) <a data-linked-post="2657767692" href="https://spectrum.ieee.org/nasa-mars-sample-return" target="_blank">NASA</a> says that this is actually happening.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bc72d2ac20028faf8c32287c722f0ce9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TYasUWRkv4E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>NASA’s SkyFall mission will build on the success of the Ingenuity Mars helicopter, which achieved the first powered, controlled flight on another planet. 
Using a daring midair deployment, SkyFall will deliver a team of next-gen Mars helicopters to scout human landing sites and map subsurface water ice.</em></blockquote><p>[ <a href="https://www.nasa.gov/news-release/nasa-unveils-initiatives-to-achieve-americas-national-space-policy/">NASA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jsk-ff2mycg"><em>NASA’s MoonFall mission will blaze a path for future <a data-linked-post="2662067231" href="https://spectrum.ieee.org/video-friday-training-artemis" target="_blank">Artemis</a> missions by sending four highly mobile drones to survey the lunar surface around the Moon’s South Pole ahead of astronauts’ arrival there. MoonFall is built on the legacy of NASA’s Ingenuity Mars Helicopter. The drones will be launched together and released during descent to the surface. They will land and operate independently over the course of a lunar day (14 Earth days) and will be able to explore hard-to-reach areas, including permanently shadowed regions (PSRs), surveying terrain with high-definition optical cameras and other potential instruments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="24cd6ef18a5608c71e3afdc55a0d2507" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JsK-ff2Mycg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>For what it’s worth, <a data-linked-post="2671177906" href="https://spectrum.ieee.org/moon-landing-2025" target="_blank">Moon landings</a> have a success rate well under 50%. So let’s send some robots there to land over and over!</p><p>[ <a href="https://www.nasa.gov/news-release/nasa-unveils-initiatives-to-achieve-americas-national-space-policy/">NASA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hdjiukrfvca"><em>In Science Robotics, researchers from the Tangible Media group led by Professor Hiroshi Ishii, together with colleagues from Politecnico di Bari, present Electrofluidic Fiber Muscles: a new class of artificial muscle fibers for robots and wearables. Unlike the rigid servo motors used in most robots, these fiber-shaped muscles are soft and flexible. They combine electrohydrodynamic (EHD) fiber pumps—slender tubes that move liquid using electric fields to generate pressure silently, with no moving parts—with fluid-filled fiber actuators. These artificial muscles could enable more agile untethered robots, as well as wearable assistive systems with compact actuation integrated directly into textiles.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="401e33c5be7f9feea5a4219dd786d2ab" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HdjIukrfvcA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.media.mit.edu/projects/electrofluidicmuscle/overview/">MIT Media Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xzfzkmq2rrq"><em>In this study, we developed MEVIUS2, an open-source quadruped robot. It is comparable in size to the Boston Dynamics Spot, equipped with two lidars and a C1 camera, and can freely climb stairs and steep slopes! 
All hardware, software, and learning environments are released as open source.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ef4dd2071d09d4ac4c97d9e6993be2ea" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xzfZkmQ2rrQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://github.com/haraduka/mevius2">MEVIUS2</a> ]</p><p>Thanks, Kento!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zj07hhjnrto"><em>What goes into preparing for a live performance? Arun highlights the reliability testing that goes into trying a new behavior for Spot.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="075596c69914e064444994a7d74fe2dc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zj07hHJnrto?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="41kpw6jwxty"><em>In this work, a multirobot planning and control framework is presented and demonstrated with a team of 40 indoor robots, including both ground and aerial robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e8811d7981e9be82f23859aafea31249" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/41kPW6JwXtY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>That soundtrack, though.</p><p>[ <a href="https://proroklab.github.io/agile-mapf/">GitHub</a> ]</p><p>Thanks, Keisuke!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="img5a_ykjms"><em>Quadrupedal robots can navigate cluttered environments like their animal counterparts, but their floating-base configuration makes them vulnerable to real-world uncertainties. Controllers that rely only on proprioception (body sensing) must physically collide with obstacles to detect them. Those that add exteroception (vision) need precisely modeled terrain maps that are hard to maintain in the wild. DreamWaQ++ bridges this gap by fusing both modalities through a resilient multimodal reinforcement learning framework. 
The result: a single controller that handles rough terrains, steep slopes, and high-rise stairs—while gracefully recovering from sensor failures and situations it has never seen before.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="61dd08501e1c8f10d63a43acb5bb2911" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Img5a_yKjMs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>That cliff behavior is slightly uncanny.</p><p>[ <a href="https://dreamwaqpp.github.io/">DreamWaQ++</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="toh8pd4o34u">I take issue with this from iRobot:</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1d86fae43d52011c45db0102b9fdc86b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tOH8pD4O34U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>While the <a data-linked-post="2650276443" href="https://spectrum.ieee.org/robotic-blimp-could-explore-hidden-chambers-of-great-pyramid" target="_blank">pyramid exploration</a> that iRobot did was very cool, they did it with a custom-made robot designed for a very specific environment. Cleaning your floors is way, way harder. Here’s a bit more detail on the pyramids thing:</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1b4538cb0137311b0b433425e56096f0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Pts3w2Pw8F4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/watch?v=Pts3w2Pw8F4">iRobot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="t1vub0knci4">More robots in the circus, please!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="89aa286bf5c7d16563d9223df6cc3d2b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/T1VUb0kncI4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://danielsimu.com/acrobot/">Daniel Simu</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="f2hasoladgm"><em>MIT engineers have designed a wristband that lets wearers control a robotic hand with their own movements. 
By moving their hands and fingers, users can direct a robot to perform specific tasks, or they can manipulate objects in a virtual environment with high-dexterity control.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="88281d6e7db31cc58ef4b327756809b2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/F2HaSoladgM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.mit.edu/2026/wristband-enables-wearers-control-robotic-hand-with-own-movements-0325">MIT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="0ozaw6rryie"><em>At <a data-linked-post="2676218078" href="https://spectrum.ieee.org/nvidia-groq-3" target="_blank">Nvidia GTC 2026</a>, we showcased how AI is moving into the physical world. Visitors interacted with robots using voice commands, watching them interpret intent and act in real time—powered by our KinetIQ AI brain.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="95460eeec4fec87fd729fe5aa4314531" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0oZAw6rryIE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://thehumanoid.ai/">Humanoid</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="7sl93jl8_o8">Props to Sony for its continued support and updates for <a data-linked-post="2670284977" href="https://spectrum.ieee.org/aibo" target="_blank">Aibo</a>!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f05e5074c48cd251f832782efa434226" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7sL93Jl8_O8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://us.aibo.com/myaibo/">Aibo</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="yd7enmgniei">This robot looks like it could be a little curvier than normal?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3be1fe9e24c6ee745f0f1fa7a2a1b201" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Yd7eNmGNIeI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dncww0qmkce"><em>Developed by Zhejiang Humanoid Robot Innovation Center Co., Ltd., the Naviai Robot is an intelligent cooking device. It can autonomously process ingredients, perform cooking tasks with high accuracy, adjust smart kitchen equipment in real time, and complete postcooking cleaning. 
Equipped with multimodal perception technology, it adapts to daily kitchen environments and ensures safe and stable operation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f58863823d5082a3e5e104c47b9e68f6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dNcWW0qMkcE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>That 7x is doing some heavy lifting.</p><p>[ <a href="https://en.zhejianglab.com/institutescenters/researchunits/interdisciplinaryresearchcenters/researchcenterforintelligentrobot/">Zhejiang Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="gthxsfhdt8q">This CMU RI Seminar is by Hadas Kress-Gazit from Cornell, on “Formal Methods for Robotics in the Age of Big Data.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a0150919b813daa034367d7a41c9d68e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gthXSFhDt8Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Formal methods—mathematical techniques for describing systems, capturing requirements, and providing guarantees—have been used to synthesize robot control from high-level specification, and to verify robot behavior. Given the recent advances in robot learning and data-driven models, what role can, and should, formal methods play in advancing robotics? In this talk I will give a few examples for what we can do with formal methods, discuss their promise and challenges, and describe the synergies I see with data-driven approaches.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/formal-methods-for-robotics-in-the-age-of-big-data/">Carnegie Mellon University Robotics Institute</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 27 Mar 2026 16:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/roadrunner-bipedal-robot</guid><category>Video-friday</category><category>Nasa</category><category>Bipedal-robots</category><category>Quadruped-robots</category><category>Artificial-muscles</category><category>Humanoid-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/two-wheeled-balancing-robot-leans-while-rolling-in-an-indoor-testing-lab.png?id=65415603&amp;width=980"/></item><item><title>30 Years Ago, Robots Learned to Walk Without Falling</title><link>https://spectrum.ieee.org/honda-p2-robot-ieee-milestone</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/collage-of-hondas-p2-humanoid-robot-from-1996-against-a-background-of-figures-related-to-its-technical-features.jpg?id=65402169&width=1245&height=700&coordinates=0%2C187%2C0%2C188"/><br/><br/><p>When you hear the term <a href="https://spectrum.ieee.org/search/?q=humanoid+robot" target="_self"><em><em>humanoid robot</em></em></a>, you may think of <a href="https://starwars.fandom.com/wiki/C-3PO" rel="noopener noreferrer" target="_blank">C-3PO</a>, the human-cyborg-relations android from <a href="https://www.starwars.com/" rel="noopener noreferrer" target="_blank"><em><em>Star Wars</em></em></a><em><em>.</em></em> C-3PO was designed to assist humans in communicating with robots and alien species. The droid, which first appeared on screen in 1977, joined the characters on their adventures, walking, talking, and interacting with the environment like a human. It was ahead of its time.</p><p>Before the release of <em><em>Star Wars</em></em>, a few androids did exist and could move and interact with their environment, but none could do so without losing its balance.</p><p>It wasn’t until 1996 that the first autonomous robot capable of walking without falling was developed in Japan. <a href="https://www.google.com/aclk?sa=L&ai=DChsSEwjfwP2lmviSAxUdF60GHVa5APUYACICCAEQARoCcHY&ae=2&co=1&ase=2&gclid=CjwKCAiA-__MBhAKEiwASBmsBMZ3C7eg4qpf1gS-s4hmogZZL-Tr0YQ7T1h4mn0IoFztQ7NVCqHCjhoCXqoQAvD_BwE&cid=CAASZuRo0CEpRkUaLKjRvxVglDhyNNQqb9IGBGToAJFwXbXIyMx3bZTVg0T8ishwxc5PTKrYMjYnaSzvAx3ewj0dizuR563LtzuoBcRH9l0T-TNDiYKEN25LZQWjdGD6NduB7UgbPw6wRg&cce=2&category=acrcp_v1_71&sig=AOD64_0hkFjU2fo-VGEWLhz4zejdBOhDxw&q&nis=4&adurl&ved=2ahUKEwif9vWlmviSAxU-ITQIHZAXPSIQ0Qx6BAg8EAE" rel="noopener noreferrer" target="_blank">Honda</a>’s <a href="https://hondanews.com/en-US/photos/p2-robot" rel="noopener noreferrer" target="_blank">Prototype 2</a> (P2) was nearly 183 centimeters tall and weighed 210 kilograms. It could control its posture to maintain balance, and it could move multiple joints simultaneously.</p><p>In recognition of that decades-old feat, P2 has been honored as an <a href="https://ieeemilestones.ethw.org/Main_Page" rel="noopener noreferrer" target="_blank">IEEE Milestone</a>. The dedication ceremony is scheduled for 28 April at the <a href="https://www.mr-motegi.jp/eng/collection-hall/?from=navi_header_drawer_global_en" rel="noopener noreferrer" target="_blank">Honda Collection Hall</a>, located on the grounds of the <a href="https://en.wikipedia.org/wiki/Mobility_Resort_Motegi" rel="noopener noreferrer" target="_blank">Mobility Resort Motegi</a>, in Japan. 
The machine is on display in the hall’s robotics exhibit, which showcases the evolution of Honda’s humanoid technology.</p><p>In support of the Milestone nomination, members of the <a href="https://ieee-jp.org/section/nagoya/" rel="noopener noreferrer" target="_blank">IEEE Nagoya (Japan) Section</a> wrote: “This milestone demonstrated the feasibility of humanlike locomotion in machines, setting a new standard in robotics.” The <a href="https://ieeemilestones.ethw.org/Milestone-Proposal:Honda%27s_P2,_First_Bipedal_Robot,_1996" rel="noopener noreferrer" target="_blank">Milestone proposal</a> is available on the <a href="https://ethw.org/Main_Page" rel="noopener noreferrer" target="_blank">Engineering Technology and History Wiki</a>.</p><h2>Developing a domestic android</h2><p>In 1986 Honda researchers Kazuo Hirai, Masato Hirose, Yuji Haikawa, and <a href="https://research.com/u/toru-takenaka" rel="noopener noreferrer" target="_blank">Toru Takenaka</a> set out to develop what they called a “domestic robot” to collaborate with humans. It would be able to climb stairs, remove impediments in its path, and tighten a nut with a wrench, according to their <a href="https://www.cs.cmu.edu/~cga/humanoids/honda.pdf" rel="noopener noreferrer" target="_blank">research paper on the project</a>.</p><p>“We believe that a robot working within a household is the type of robot that consumers may find useful,” the authors wrote.</p><p>But to do household chores, the machine had to be able to move around obstacles such as furniture, stairs, and doorways. It needed to autonomously walk and read its environment like a human, according to the researchers.</p><p>No robot could do that at the time. The closest technologists got was the <a href="https://www.humanoid.waseda.ac.jp/booklet/kato_2.html" rel="noopener noreferrer" target="_blank">WABOT-1</a>. Built in 1973 at <a href="https://www.waseda.jp/top/en" rel="noopener noreferrer" target="_blank">Waseda University</a>, in Tokyo, the WABOT had eyes and ears, could speak Japanese, and used tactile sensors embedded on its hands as it gripped and moved objects. Although the WABOT could walk, albeit unsteadily, it couldn’t maneuver around obstacles or maintain its balance. It was powered by an external battery and computer.</p><p>To build an android, the Honda team began by analyzing how people move, using themselves as models.</p><p>That led to specifications for the robot that gave it humanlike dimensions, including the location of the leg joints and how far the legs could rotate.</p><p>Once they began building the machine, though, the engineers found it difficult to satisfy every specification. Adjustments were made to the number of joints in the robot’s hips, knees, and ankles, according to the research paper. Humans have four hip, two knee, and three ankle joints; P2’s predecessor had three hip, one knee, and two ankle joints. The arms were treated similarly. A human’s four shoulder and three elbow joints became three shoulder joints and one elbow joint in the robot.</p><p>The researchers installed existing Honda motors and hydraulics in the hips, knees, and ankles to enable the robot to walk. Each joint was operated by a DC motor with a harmonic-drive reduction gear system, which was compact and offered high torque capacity.</p><p>To test their ideas, the engineers built what they called E0. The robot, which was just a pair of connected legs, successfully walked.
It took about 15 seconds to take each step, however, and it moved in a straight line using static walking, according to <a href="https://global.honda/en/ASIMO/history/" rel="noopener noreferrer" target="_blank">a post about the project</a> on Honda’s website. (Static walking is when the projection of the body’s center of mass always stays within the sole of the supporting foot; a minimal version of this check is sketched below. Humans, whose center of mass sits below the navel, walk dynamically instead.)</p><p>The researchers created several algorithms to enable the robot to walk like a human, according to the Honda website. These algorithms allowed the robot to use dynamic walking, a locomotion mechanism whereby the robot stays upright by constantly moving and adjusting its balance rather than keeping its center of mass over its feet, according to a video on the YouTube channel <a href="https://www.youtube.com/watch?v=BCAZkjXgBE4" rel="noopener noreferrer" target="_blank">Everything About Robotics Explained</a>.</p><p class="pull-quote">“P2 was not just a technical achievement; it was a catalyst that propelled the field of humanoid robotics forward, demonstrating the potential for robots to interact with and assist humans in meaningful ways.” <strong>—IEEE Nagoya Section</strong></p><p>The Honda team installed rubber brushes on the bottom of the machine’s feet to reduce vibrations from the landing impacts (the force experienced when its feet touch the ground), which had been making the robot lose its balance.</p><p>Between 1987 and 1991, three more prototypes (E1, E2, and E3) were built, each testing a new algorithm. E3 was a success.</p><p>With the dynamic walking mechanism complete, the researchers continued their quest to make the robot stable. The team added six-axis force sensors to detect the force with which the ground pushed back against the robot’s feet, as well as the movements of each foot and ankle, allowing the robot to adjust its gait in real time for stability.</p><p>The team also developed a posture-stabilizing control system to help the robot stay upright. A local controller drove the electric motor actuators so that the legs tracked the desired joint angles while walking, according to the research paper.</p><p>During the next three years, the team tested the systems and built three more prototypes (E4, E5, and E6), which had boxlike torsos atop the legs.</p><p>In 1993 the team was finally ready to build an android with arms and a head that looked more like C-3PO, dubbed <em><em>Prototype 1</em></em> (P1). Because the machine was meant to help people at home, the researchers determined its height and limb proportions based on the typical measurements of doorways and stairs. The arm length was based on the ability of the robot to pick up an object when squatting.</p><p>When they finished building P1, it was 191.5 cm tall, weighed 175 kg, and used an external power source and computer. It could turn a switch on and off, grab a doorknob, and carry a 70 kg object.</p><p>P1 was not launched publicly but instead used to conduct research on how to further improve the design. The engineers looked at how to install an internal power source and computer, for example, as well as how to coordinate the movement of the arms and legs, according to Honda.</p><p>For P2, four video cameras were installed in its head—two for vision processing and the other two for remote operation.
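</p><p>To make the static-versus-dynamic distinction above concrete, here is a minimal illustrative sketch in Python. This is not Honda’s code; the function name and sole dimensions are invented for the example. It simply tests the static-walking condition: the ground projection of the center of mass must stay within the sole of the supporting foot.</p><pre>
# Minimal sketch of the static-walking condition (not Honda's controller).
# Static stability: the center-of-mass (CoM) ground projection must stay
# inside the support polygon -- here, the rectangular sole of the stance
# foot. The sole dimensions below are invented for illustration.

def com_is_statically_stable(com_xy, foot_center_xy,
                             sole_length=0.24, sole_width=0.12):
    """Return True if the CoM ground projection lies within the sole."""
    dx = abs(com_xy[0] - foot_center_xy[0])
    dy = abs(com_xy[1] - foot_center_xy[1])
    return dx &lt;= sole_length / 2 and dy &lt;= sole_width / 2

# CoM directly over the stance foot: statically stable.
print(com_is_statically_stable((0.02, 0.01), (0.0, 0.0)))  # True

# CoM well ahead of the foot in mid-stride: statically unstable. A static
# walker like E0 must avoid this; a dynamic walker like P2 tolerates it by
# constantly adjusting its balance and planning the next footfall.
print(com_is_statically_stable((0.30, 0.01), (0.0, 0.0)))  # False
</pre><p>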
The head was 60 cm wide and connected to the torso, which was 75.6 cm deep.</p><p>A computer with four <a href="https://en.wikipedia.org/wiki/MicroSPARC" target="_blank">MicroSPARC II</a> processors running a real-time operating system was installed in the robot’s torso. The processors were used to control the arms, legs, joints, and vision-processing cameras.</p><p>Also within the body were DC servo amplifiers, a 20-kg nickel-zinc battery, and a wireless Ethernet modem, according to the research paper. The battery lasted for about 15 minutes; the machine could also be charged by an external power supply.</p><p>The hardware was enclosed in white-and-gray casing.</p><p>P2, which was launched publicly in 1996, could walk freely, climb up and down stairs, push carts, and perform some actions wirelessly.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="4c1ac513d31347c699292e05c673df46" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FEXSqsW6rMM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">P2, which was launched publicly in 1996, could walk freely, climb up and down stairs, push carts, and perform some actions wirelessly.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">King Rose Archives</small></p><p><span>The following year, Honda’s engineers released the smaller and lighter </span><a href="https://www.youtube.com/watch?v=hS82TL73V3E" target="_blank">P3</a><span>. It was 160 cm tall and weighed 130 kg.</span></p><p>In 2000 the popular <a href="https://spectrum.ieee.org/honda-asimo" target="_self">ASIMO robot</a> was introduced. Although shorter than its predecessors at 130 cm, it could walk, run, climb stairs, and recognize voices and faces. The <a href="https://spectrum.ieee.org/honda-robotics-unveils-next-generation-asimo-robot" target="_self">most recent version</a> was released in 2011. Honda has retired the robot.</p><h2>Honda P2’s influence</h2><p>Thanks to P2, today’s androids are not just ideas in a laboratory. Robots have been deployed to work in factories and, increasingly, at <a href="https://spectrum.ieee.org/home-humanoid-robots-survey" target="_self">home</a>.</p><p>The machines are even being used for entertainment. During this year’s <a href="https://www.cgtn.com/specials/2026/spring-festival.html" target="_blank">Spring Festival</a> gala in Beijing, machines developed by Chinese startups <a href="https://www.unitree.com/" target="_blank">Unitree Robotics</a>, <a href="https://www.galbot.com/" rel="noopener noreferrer" target="_blank">Galbot</a>, <a href="https://en.noetixrobotics.com/" rel="noopener noreferrer" target="_blank">Noetix</a>, and <a href="https://www.magiclab.top/en" rel="noopener noreferrer" target="_blank">MagicLab</a><a href="https://spectrum.ieee.org/robot-martial-arts" target="_self"> performed synchronized dances, martial arts, and backflips</a> alongside human performers.</p><p>“P2’s development shifted the focus of robotics from industrial applications to human-centric designs,” the Milestone sponsors explained in the wiki entry.
“It inspired subsequent advancements in humanoid robots and influenced research in fields like biomechanics and artificial intelligence.</p><p>“It was not just a technical achievement; it was a catalyst that propelled the field of humanoid robotics forward, demonstrating the potential for robots to interact with and assist humans in meaningful ways.”</p><p>To learn more about robots, check out <a href="https://spectrum.ieee.org/" target="_self"><em><em>IEEE Spectrum</em></em></a>’s <a href="https://robotsguide.com/about" rel="noopener noreferrer" target="_blank">guide</a>.</p><h2>Recognition as an IEEE Milestone</h2><p>A plaque recognizing Honda’s P2 robot as an IEEE Milestone is to be installed at the <a href="https://www.mr-motegi.jp/eng/collection-hall/?from=navi_header_drawer_global_en" rel="noopener noreferrer" target="_blank">Honda Collection Hall</a>. The plaque is to read:</p><p><em><em>In 1996 Prototype 2 (P2), a self-contained autonomous bipedal humanoid robot capable of stable dynamic walking and stair-climbing, was introduced by Honda. Its legged robotics incorporated real-time posture control, dynamic balance, gait generation, and multijoint coordination. Honda’s mechatronics and control algorithms set technical benchmarks in mobility, autonomy, and human-robot interaction. P2 inspired new research in humanoid robot development, leading to increasingly sophisticated successors.</em></em></p><p>Administered by the <a href="https://www.ieee.org/about/history-center" rel="noopener noreferrer" target="_blank">IEEE History Center</a> and supported by <a href="https://secure.ieeefoundation.org/site/Donation2?df_id=1680&mfc_pref=T&1680.donation=form1" rel="noopener noreferrer" target="_blank">donors</a>, the Milestone program recognizes outstanding technical developments around the world.</p>]]></description><pubDate>Wed, 25 Mar 2026 18:00:05 +0000</pubDate><guid>https://spectrum.ieee.org/honda-p2-robot-ieee-milestone</guid><category>Ieee-history</category><category>Ieee-milestone</category><category>Honda</category><category>Robotics</category><category>Asimo</category><category>Type-ti</category><dc:creator>Joanna Goodrich</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/collage-of-hondas-p2-humanoid-robot-from-1996-against-a-background-of-figures-related-to-its-technical-features.jpg?id=65402169&amp;width=980"/></item><item><title>The Coming Drone-War Inflection in Ukraine</title><link>https://spectrum.ieee.org/autonomous-drone-warfare</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/person-holding-a-large-drone-outdoors-under-a-sunny-partly-cloudy-sky.jpg?id=65327386&width=1245&height=700&coordinates=0%2C104%2C0%2C104"/><br/><br/><p><strong>WHEN</strong><strong> </strong><strong>KYIV-BORN</strong><strong> </strong><strong>ENGINEER </strong><a href="https://www.instagram.com/yaroslavazhnyuk/?hl=en" rel="noopener noreferrer" target="_blank">Yaroslav Azhnyuk</a> thinks about the future, his mind conjures up dystopian images. He talks about “swarms of autonomous drones carrying other autonomous drones to protect them against autonomous drones, which are trying to intercept them, controlled by <a href="https://spectrum.ieee.org/ai-agents" target="_self">AI</a> <a href="https://spectrum.ieee.org/ai-agents" target="_self">agents</a> overseen by a human general somewhere.” He also imagines flotillas of autonomous submarines, each carrying hundreds of drones, suddenly emerging off the coast of California or Great Britain and discharging their cargoes en masse to the sky.</p><p>“How do you protect from that?” he asks as we speak in late December 2025; me at my quiet home office in London, he in Kyiv, which is bracing for another wave of <a href="https://spectrum.ieee.org/ukraine-air-defense" target="_self">missile attacks</a>.</p><p>Azhnyuk is not an alarmist. He cofounded and was formerly CEO of <a href="https://petcube.com/" rel="noopener noreferrer" target="_blank">Petcube</a>, a California-based company that uses smart cameras and an app to let pet owners keep an eye on their beloved creatures left alone at home. A self-described “liberal guy who didn’t even receive military training,” Azhnyuk changed his mind about developing military tech in the months following the <a href="https://commonslibrary.parliament.uk/research-briefings/cbp-9847/" rel="noopener noreferrer" target="_blank">Russian invasion of</a> <a href="https://commonslibrary.parliament.uk/research-briefings/cbp-9847/" rel="noopener noreferrer" target="_blank">Ukraine</a> in February 2022. By 2023, he had relinquished his CEO role at Petcube to do what many Ukrainian technologists have done—to help defend his country against a mightier aggressor.</p><p>It took a while for him to figure out what, exactly, he should be doing. He didn’t join the military, but through friends on the front line, he witnessed how, out of desperation, Ukrainian troops turned to off-the-shelf consumer drones to make up for their country’s lack of artillery.</p><p>Ukrainian troops first began using drones for battlefield surveillance, but within a few months they figured out how to strap explosives onto them and turn them into effective, <a href="https://spectrum.ieee.org/ukraine-hackers-war" target="_self">low-cost killing</a> <a href="https://spectrum.ieee.org/ukraine-hackers-war" target="_self">machines</a>. Little did they know they were fomenting a revolution in warfare.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Group observes a drone demonstration indoors, with a presenter explaining features." 
class="rm-shortcode" data-rm-shortcode-id="bfc4f902e7ae9ffa663bf3bcc8ff144c" data-rm-shortcode-name="rebelmouse-image" id="cc3bb" loading="lazy" src="https://spectrum.ieee.org/media-library/group-observes-a-drone-demonstration-indoors-with-a-presenter-explaining-features.jpg?id=65341730&width=980"/></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Compact black camera module with textured surface and orange ribbon cable on white background." class="rm-shortcode" data-rm-shortcode-id="e904e39e8ac7797c354a205ed343d150" data-rm-shortcode-name="rebelmouse-image" id="4d58e" loading="lazy" src="https://spectrum.ieee.org/media-library/compact-black-camera-module-with-textured-surface-and-orange-ribbon-cable-on-white-background.jpg?id=65341726&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">The Ukrainian robotics company The Fourth Law produces an autonomy module [above] that uses optics and AI to guide a drone to its target. Yaroslav Azhnyuk [top, in light shirt], founder and CEO of The Fourth Law, describes a developmental drone with autonomous capabilities to Ukrainian President Volodymyr Zelenskyy and German Chancellor Olaf Scholz.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Top: THE PRESIDENTIAL OFFICE OF UKRAINE; Bottom: THE FOURTH LAW</small></p><p>That revolution was on display last month, as the U.S. and Israel went to war with Iran. It soon became clear that attack drones are being extensively used by both sides. Iran, for example, is relying heavily on the Shahed drones that the country invented and that are now also being manufactured in Russia and launched by the thousands every month against Ukraine.</p><p>A thorough analysis of the Middle East conflict <span>will take some time to emerge. And so to understand the direction of this new way of war, look to Ukraine, where its next phase—autonomy—is already starting to come into view. Outnumbered by the Russians and facing increasingly sophisticated jamming and spoofing aimed at causing the drones to veer off course or fall out of the sky, Ukrainian technologists realized as early as 2023 that what could really win the war was autonomy. Autonomous operation means a drone isn’t being flown by a remote pilot, and therefore there’s no communications link to that pilot that can be severed or spoofed, rendering the drone useless.</span></p><p>By late 2023, <a href="https://www.linkedin.com/in/yaroslavazhnyuk/?locale=uk" target="_blank">Azhnyuk</a> set out to help make that vision a reality. He founded two companies, <a href="https://thefourthlaw.ai/blog/funding-products-video" target="_blank">The</a> <a href="https://thefourthlaw.ai/blog/funding-products-video" target="_blank">Fourth Law</a> and <a href="https://oddsystems.io/en/" target="_blank">Odd Systems</a>, the first to develop AI algorithms to help drones overcome jamming during final approach, the second to build thermal cameras to help those drones better sense their <span>surroundings.</span></p><p>“I moved from making devices that throw treats to dogs to making devices that throw explosives on Russian occupants,” Azhnyuk quips.</p><p>Since then, The Fourth Law has dispatched “more than thousands” of <a href="https://thefourthlaw.ai/#section3" target="_blank">autonomy modules</a> to troops in eastern Ukraine (it declines to give a more specific figure), which can be retrofitted on existing drones to take over navigation during the final <span>approach to the target. 
Azhnyuk says the autonomy modules, which cost around US $50, increase the drone-strike success rate to as much as four times that of purely operator-controlled drones.</p><p>And that is just the beginning. Azhnyuk is one of thousands of developers, including some who relocated from Western countries, who are applying their skills and other resources to advancing the drone technology that is the defining characteristic of the war in Ukraine. This eclectic group of startups and founders includes <a href="https://en.wikipedia.org/wiki/Eric_Schmidt" target="_blank">Eric Schmidt</a>, the former <a href="https://about.google/company-info/" target="_blank">Google</a> CEO, whose company <a href="https://epravda.com.ua/oborona/milyarder-ta-ekskerivnik-google-robit-droni-dlya-ukrajini-shcho-nim-ruhaye-809495/" target="_blank">Swift Beat</a> is churning out autonomous <a href="https://www.nytimes.com/2025/12/31/magazine/ukraine-ai-drones-war-russia.html" target="_blank">drones and modules for Ukrainian forces</a>. The frenetic pace of tech development is helping a scrappy, innovative underdog hold at bay a much larger and better-equipped foe.</p><p>All of this development is careening toward AI-based systems that enable drones to navigate by recognizing features in the terrain, lock on to and chase targets without an operator’s guidance, and eventually exchange information with each other through mesh networks, forming self-organizing robotic kamikaze swarms. Such an attack swarm would be commanded by a single operator from a safe distance.</p><p>According to some reports, autonomous swarming technology is also being developed <a href="https://www.usni.org/magazines/proceedings/2025/may/step-step-ukraine-built-technological-navy" target="_blank">for sea drones</a>. Ukraine has had some notable successes with sea drones, which have reportedly destroyed or damaged <a href="https://en.usm.media/sbu-naval-drones-hit-11-russian-ships-and-vessels-details/" target="_blank">around a dozen</a> Russian vessels.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Hand holding a drone with six rotors, outdoors against a blue sky." class="rm-shortcode" data-rm-shortcode-id="90f30978c5ba0e77e9b1873c155131d2" data-rm-shortcode-name="rebelmouse-image" id="7cf11" loading="lazy" src="https://spectrum.ieee.org/media-library/hand-holding-a-drone-with-six-rotors-outdoors-against-a-blue-sky.jpg?id=65341722&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">The Skynode X system, from Auterion, provides a degree of autonomy to a drone.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">AUTERION</small></p><p>For Ukraine, swarming can solve a major problem that puts the nation at a disadvantage against Russia—the lack of personnel. Autonomy is “the single most impactful defense technology of this century,” says Azhnyuk. “The moment this happens, you shift from a manpower challenge to a production challenge, which is much more manageable,” he adds.</p><p>The autonomous warfare future envisioned by Azhnyuk and others is not yet a reality.
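The pull toward that future, though, is ultimately arithmetic. A rough, hypothetical sketch makes the cost-exchange logic concrete; the US $50 module price and the up-to-fourfold success-rate gain are The Fourth Law’s figures cited above, while the airframe cost and baseline hit rate below are invented placeholders:</p><pre><code># Back-of-envelope cost-exchange arithmetic (illustrative only).
# From the article: module ~US $50, success rate "up to four times"
# that of operator-flown drones. The airframe cost and the baseline
# hit rate are assumed placeholders, not reported figures.

DRONE_COST = 500          # assumed FPV airframe cost, US $
MODULE_COST = 50          # autonomy-module price cited above
BASELINE_HIT_RATE = 0.15  # assumed operator-only success rate
AUTONOMY_MULTIPLIER = 4   # "up to four times," per The Fourth Law

def cost_per_hit(unit_cost: float, hit_rate: float) -> float:
    """Expected spend per successful strike."""
    return unit_cost / hit_rate

manual = cost_per_hit(DRONE_COST, BASELINE_HIT_RATE)
auto = cost_per_hit(DRONE_COST + MODULE_COST,
                    min(1.0, BASELINE_HIT_RATE * AUTONOMY_MULTIPLIER))
print(f"operator-flown: ~${manual:,.0f} per successful strike")  # ~$3,333
print(f"with module:    ~${auto:,.0f} per successful strike")    # ~$917
</code></pre><p>Under those assumptions, a $50 part cuts the cost of a successful strike by roughly two-thirds, and that is before one operator starts shepherding many drones at once.</p><p>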
But <a href="https://www.linkedin.com/in/marcclange/?skipRedirect=true" target="_blank">Marc Lange</a>, a German defense analyst and business strategist, believes that “an inflection point” is already in view. Beyond it, “things will be so dramatically different,” he says.</p><p>“Ukraine pretty rapidly realized that if the operator-to-drone ratio can be shifted from one-to-one to one-to-many, that creates great economies of scale and an amazing cost exchange ratio,” Lange adds. “The moment one operator can launch 100, 50, or even just 20 drones at once, this completely changes the economics of the war.”</p><h2>Drones With a View </h2><p>For a while, jammers that sever the radio links between drones and <span>operators or that spoof GPS receivers were able to provide fairly reliable defense against human-controlled first-person-view attack drones (FPVs). But as autonomous navigation progressed, those electronic shields have gradually become less effective. Defenders must now contend with unjammable drones—ones that are attached to hair-thin optical fibers or that are capable of </span><a href="https://spectrum.ieee.org/ukraine-killer-drones" target="_self">finding</a> <a href="https://spectrum.ieee.org/ukraine-killer-drones" target="_self">their way to their targets</a> without external guidance. In this emerging struggle, the defenders’ track records aren’t very encouraging: The typical countermeasure is to try to shoot down the attacking drone with a service weapon. It’s rarely successful.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Truck on rural road covered with camouflage netting, trees and fields in the background." class="rm-shortcode" data-rm-shortcode-id="7c7af1e395cf35752b367f8dd54130fc" data-rm-shortcode-name="rebelmouse-image" id="58155" loading="lazy" src="https://spectrum.ieee.org/media-library/truck-on-rural-road-covered-with-camouflage-netting-trees-and-fields-in-the-background.jpg?id=65341708&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A truck outfitted with signal-jamming gear drives under antidrone nets near Oleksandriya, in eastern Ukraine, on 2 October 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">ED JONES/AFP/GETTY IMAGES</small></p><p>“The attackers gain an immense advantage from unmanned systems,” says Lange. “You can have a drone pop up from anywhere and it can wreak havoc. But from autonomy, they gain even more.”</p><p>The self-navigating drones rely on image-recognition algorithms that have been around for over a decade, says Lange. And the mass deployments of drones on Ukrainian battlefields are enabling both Russian and Ukrainian technologists to create <a href="https://www.reuters.com/technology/ukraine-collects-vast-war-data-trove-train-ai-models-2024-12-20/" target="_blank">huge datasets</a> that improve the training and precision of those AI algorithms.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Six-wheeled robotic vehicle with mounted equipment in a grassy field." 
class="rm-shortcode" data-rm-shortcode-id="caa0a697b2d5752603687ac7f0278581" data-rm-shortcode-name="rebelmouse-image" id="1c591" loading="lazy" src="https://spectrum.ieee.org/media-library/six-wheeled-robotic-vehicle-with-mounted-equipment-in-a-grassy-field.jpg?id=65341706&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A Ukrainian land robot, the Ravlyk, can be outfitted with a machine gun.</small></p><p>While uncrewed aerial vehicles (UAVs) have received the most attention, the Ukrainian military is also deploying dozens of different kinds of drones on land and sea. Ukraine, struggling with the shortage of infantry personnel, began working on replacing a portion of human soldiers with wheeled ground robots in 2024. As of early 2026, thousands of ground robots are crawling across the gray zone along the front line in Eastern Ukraine. Most are used to deliver supplies to the front line or to help evacuate the wounded, but some “killer” ground robots fitted with turrets and remotely controlled machine guns have also been tested.</p><p>In mid-February, Ukrainian authorities released a video of a Ukrainian ground robot using its thermal camera to detect a Russian soldier in the dark of the night and then kill the invader with a round from a heavy machine gun. So far these robots are mostly controlled <span>by a human operator, but the makers of these uncrewed ground vehicles say their systems are capable of basic autonomous operations, such as returning to base when radio connection is lost. The goal is to enable them to swarm so that one operator controls not one, but a whole herd of mesh-connected killer robots.</span></p><p>But <a href="https://www.hudson.org/experts/1303-bryan-clark" target="_blank">Bryan <span>Clark</span></a>, senior fellow and <span>director of the Center for Defense Concepts and Technology at the </span><a href="https://www.hudson.org/" target="_blank">Hudson Institute</a>, questions how quickly ground robots’ abilities can progress. “Ground environments are very difficult to navigate in because of the terrain you have to address,” he says. “The line of sight for the sensors on the ground vehicles is really constrained because of terrain, whereas an air vehicle can see everything around it.”</p><p>To achieve autonomy, <a href="https://spectrum.ieee.org/sea-drone" target="_self">maritime drones</a>, too, will require <span>naviga</span><span>tional approaches beyond AI-based image recognition, possibly based on star positions or electronic signals from radios and cell towers that are within reach, says Clark. Such technologies are still being developed or are in a relatively early operational stage.</span></p><h2>How the Shaheds Got Better</h2><p>Russia is not lagging behind. In fact, some analysts believe its autonomous systems may be slightly ahead of Ukraine’s. For a good example of the Russian military’s rapid <span>evolu</span><span>tion, they say, consider the long-range Iranian-designed Shahed drones. Since 2022, Russia has been using them to attack Ukrainian cities and other targets hundreds of kilometers from the front line. 
“At the beginning, Shaheds just had a frame, a motor, and an inertial navigation system,” <a href="https://www.linkedin.com/in/oleksii-solntsev-aa0b72189?originalSubdomain=ua" target="_blank">Oleksii Solntsev</a>, CEO of Ukrainian defense tech startup MaXon Systems, tells me. “They used to be imprecise and pretty stupid. But they are becoming more and more autonomous.” Solntsev founded MaXon Systems in late 2024 to help protect Ukrainian civilians from the growing threat of Shahed raids.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Silhouette of a triangular drone flying in the sky." class="rm-shortcode" data-rm-shortcode-id="a9c89e21028ccf85e20a49ecead8309f" data-rm-shortcode-name="rebelmouse-image" id="72159" loading="lazy" src="https://spectrum.ieee.org/media-library/silhouette-of-a-triangular-drone-flying-in-the-sky.jpg?id=65341701&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A Russian Geran-2 drone, based on the Iranian Shahed-136, flies over Kyiv during an attack on 27 December 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">SERGEI SUPINSKY/AFP/GETTY IMAGES</small></p><p>First produced <a href="https://www.adaptinstitute.org/from-tehran-to-alabuga-the-evolution-of-shahed-drones-into-russias-strategic-asset/26/09/2025/" target="_blank">in Iran in the 2010s</a>, Shaheds can carry 90-kilogram warheads <a href="https://isis-online.org/isis-reports/alabugas-shahed-136-geran-2-warheads-a-dangerous-escalation" target="_blank">up to 650 km</a> (50-kilogram warheads can go twice as far). <a href="https://www.csis.org/analysis/calculating-cost-effectiveness-russias-drone-strikes" target="_blank">They cost around $35,000 per unit</a>, compared to a couple of million dollars, at least, for a ballistic missile. The low cost allows Russia to manufacture Shaheds in high quantities, unleashing entire fleets onto <a href="https://isis-online.org/isis-reports/a-comprehensive-analytical-review-of-russian-shahed-type-uavs-deployment-against-ukraine-in-2025" target="_blank">Ukrainian cities</a> <a href="https://isis-online.org/isis-reports/a-comprehensive-analytical-review-of-russian-shahed-type-uavs-deployment-against-ukraine-in-2025" target="_blank">and infrastructure almost every night</a>.</p><p>The early Shaheds were able to reach a preprogrammed location based on satellite-navigation coordinates. Even these early models could frequently overcome the jamming of satellite-navigation signals with the help of an onboard inertial navigation unit. This was essentially a dead-reckoning system of accelerometers and gyroscopes that estimates the drone’s position from continual measurements of its motion.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Silhouette of person with large equipment under a starry night sky."
class="rm-shortcode" data-rm-shortcode-id="37186ec06b71203ba4f30db497507797" data-rm-shortcode-name="rebelmouse-image" id="1aca7" loading="lazy" src="https://spectrum.ieee.org/media-library/silhouette-of-person-with-large-equipment-under-a-starry-night-sky.jpg?id=65341699&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">In the Donetsk Region, on 15 August 2025, a Ukrainian soldier hunts for Shaheds and other drones with a thermalimaging system attached to a ZU23 23-millimeter antiaircraft gun.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">KOSTYANTYN LIBEROV/LIBKOS/GETTY IMAGES</small></p><p>Ukrainian defense forces learned to down Shaheds with heavy machine guns, but as Russia continued to innovate, the daily onslaughts started to become <a href="https://euromaidanpress.com/2025/06/29/why-cant-ukraine-stop-russias-shahed-drones-anymore/" target="_blank">increasingly effective.</a></p><p>Today’s Shaheds fly faster and higher, and therefore are more difficult to detect and take down. Between January 2024 and August 2025, the number of Shaheds and Shahed-type attack drones launched by Russia into Ukraine per month <a href="https://united24media.com/war-in-ukraine/why-russias-shahed-drones-are-now-deadlier-and-harder-than-ever-to-stop-11693" target="_blank">increased more than tenfold</a>, from 334 to more than 4,000. In 2025, Ukraine found <a href="https://www.unmannedairspace.info/counter-uas-systems-and-policies/recently-downed-russian-shahed-demonstrates-new-levels-of-autonomous-capability/" target="_blank">AI-enabling</a> <a href="https://www.unmannedairspace.info/counter-uas-systems-and-policies/recently-downed-russian-shahed-demonstrates-new-levels-of-autonomous-capability/" target="_blank">N</a><a href="https://www.unmannedairspace.info/counter-uas-systems-and-policies/recently-downed-russian-shahed-demonstrates-new-levels-of-autonomous-capability/" target="_blank">vidia</a> <a href="https://www.unmannedairspace.info/counter-uas-systems-and-policies/recently-downed-russian-shahed-demonstrates-new-levels-of-autonomous-capability/" target="_blank">chipsets in wreckages of Shaheds</a>, as well as thermal-vision modules capable of locking onto targets at night.</p><p>“Now, they are interconnected, which allows them to exchange information with each other,” Solntsev says. “They also have cameras that allow them to autonomously navigate to objects. Soon they will be able to tell each other to avoid a <span>jammed</span> <span>region or an area where one of them got </span><span>intercepted.”</span></p><p>These Russian-manufactured Shaheds, which Russian forces call Geran-2s, are thought to be more capable than the garden variety Shahed-136s that Iran has lately been launching against targets throughout the Middle East. Even the relatively primitive Shahed-136s have done considerable damage, according to <a href="https://www.theguardian.com/world/2026/mar/02/iran-unleashes-hundreds-of-drones-aimed-at-targets-across-middle-east" target="_blank">press accounts</a>.</p><p>Those Shahed successes may accrue, at least in part, from the fact that the United States and Israel <span>lack Ukraine’s long experience with fending them off. In just two days in early March, upward of a thousand drones, mostly Shaheds, were launched against U.S. 
and Israeli targets, with </span><a href="https://www.theguardian.com/world/2026/mar/02/iran-unleashes-hundreds-of-drones-aimed-at-targets-across-middle-east" target="_blank">hundreds of</a> <a href="https://www.theguardian.com/world/2026/mar/02/iran-unleashes-hundreds-of-drones-aimed-at-targets-across-middle-east" target="_blank">them reportedly finding their marks</a>.</p><p>One attack, caught on video, shows a Shahed destroying a radar dome at the U.S. Navy base in Manama, Bahrain. U.S. forces were understood to be <a href="https://carnegieendowment.org/emissary/2026/03/iran-drones-shahed-us-lessons" target="_blank">attempting to fend off the drones</a> by striking launch platforms, dispatching fighter aircraft to shoot them down, and using some extremely costly air-defense interceptors, including ones meant to down ballistic missiles. On 4 March, <a href="https://www.cnn.com/2026/03/04/politics/us-air-defenses-iran-attack-drones-challenge" target="_blank">CNN</a> <a href="https://www.cnn.com/2026/03/04/politics/us-air-defenses-iran-attack-drones-challenge" target="_blank">reported</a> that in a congressional briefing the day before, top U.S. defense officials, including Secretary of Defense Pete Hegseth, acknowledged that U.S. air defenses weren’t keeping up with the onslaught of Shahed drones.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Broken drone on soil, cylindrical container nearby." class="rm-shortcode" data-rm-shortcode-id="769830682ff53a401780108ca11db2b6" data-rm-shortcode-name="rebelmouse-image" id="c9d58" loading="lazy" src="https://spectrum.ieee.org/media-library/broken-drone-on-soil-cylindrical-container-nearby.jpg?id=65341692&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Russian V2U attack drones are outfitted with Nvidia processors and run computer-vision software and AI algorithms to enable the drones to navigate autonomously.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">GUR OF THE MINISTRY OF DEFENSE OF UKRAINE</small></p><p>Russia is also starting to field a newer generation of attack drones. One of these, the V2U, has been used to strike targets in the Sumy region of northeastern Ukraine. <a href="https://euromaidanpress.com/2025/06/09/russias-v2u-drone-uses-ai-for-autonomous-strikes-in-ukraines-sumy-oblast/" target="_blank"><span>The V2U drones</span></a> are outfitted with Nvidia Jetson Orin processors and run computer-vision software and AI algorithms that allow the drones to navigate even where satellite navigation is jammed.</p><p>The sale of Nvidia chips to Russia is banned under U.S. sanctions against the country. However, press reports suggest that the chips are getting to Russia <a href="https://www.pravda.com.ua/eng/news/2024/10/28/7481703/" target="_blank">via intermediaries in India</a>.</p><h2>Antidrone Systems Step Up</h2><p>MaXon Systems is one of several companies working to fend off the nightly drone onslaught. Within a year, the company developed and battle-tested a Shahed interception system that hints at the sci-fi future envisioned by Azhnyuk.
For a system to be capable of reliably defending against autonomous weaponry, it, too, needs to be autonomous.</p><p>MaXon’s solution consists of ground turrets scanning the sky with infrared sensors, with additional input from a network of radars that detects approaching Shahed drones at distances of, typically, <a href="https://en.defence-ua.com/weapon_and_tech/2025_systems_to_shield_kyiv_from_shaheds_new_air_defense_details_from_maxon_where_balloons_carry_interceptor_drones-15499.html" target="_blank">12 to 16</a> km. The turrets fire autonomous fixed-wing interceptor drones, fitted with explosive warheads, toward the approaching Shaheds at speeds of nearly 300 km/h. To boost the chances of successful interception, MaXon <a href="https://en.defence-ua.com/weapon_and_tech/2025_systems_to_shield_kyiv_from_shaheds_new_air_defense_details_from_maxon_where_balloons_carry_interceptor_drones-15499.html" target="_blank">is also fielding</a> an airborne anti-Shahed fortification system consisting of helium-filled <a href="https://spectrum.ieee.org/airships-drones-ukraine" target="_self">aerostats</a> hovering above the city that dispatch the interceptors from a higher altitude.</p><p>“We are trying to increase the level of automation of the system compared to existing solutions,” says Solntsev. “We need automatic detection, automatic takeoff, and automatic mid-track guidance so that we can guide the interceptor before it can itself lock onto the target.”</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Gray drone on display stand, surrounded by military personnel in camouflage uniforms." class="rm-shortcode" data-rm-shortcode-id="592b19dbfc4fe9a54033067c6169aeec" data-rm-shortcode-name="rebelmouse-image" id="ab79b" loading="lazy" src="https://spectrum.ieee.org/media-library/gray-drone-on-display-stand-surrounded-by-military-personnel-in-camouflage-uniforms.jpg?id=65341687&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">An interceptor drone, part of the U.S. MEROPS defensive system, is tested in Poland on 18 November 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">WOJTEK RADWANSKI/AFP/GETTY IMAGES</small></p><p>In November 2025, the Ukrainian military announced it had been conducting successful trials of the <a href="https://www.forcesnews.com/nato/bang-your-buck-200m-worth-russian-drones-taken-out-15m-merops-uavs" target="_blank">Merops Shahed drone interceptor</a> system developed by the U.S. startup <a href="https://themerge.co/p/project-eagle" target="_blank">Project Eagle</a>, another of former Google CEO Eric Schmidt’s Ukraine defense ventures. Like the MaXon gear, the system can operate largely autonomously and has so far downed over 1,000 Shaheds.</p><h2>What Works in the Lab Doesn’t Necessarily Fly on the Battlefield</h2><p>Despite the progress on both sides, analysts say that the kind of robotic warfare imagined by Azhnyuk won’t be a reality for years.</p><p>“The software for drone collaboration is there,” says <a href="https://www.csis.org/people/kateryna-bondar" target="_blank">Kate Bondar</a>, a former policy advisor for the Ukrainian <span>government and currently a research fellow at the U.S.
</span><a href="https://www.csis.org/" target="_blank">Center for Stra</a><a href="https://www.csis.org/" target="_blank">tegic and International Studies</a><span>. “Drones can fly in labs, but in real life, [the forces] are afraid to deploy them because the risk of a mistake is too high,” she adds.</span></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Two people launching a drone in an open field using a catapult system." class="rm-shortcode" data-rm-shortcode-id="894baf9e936bef6f8c45a0363afac141" data-rm-shortcode-name="rebelmouse-image" id="7c4e9" loading="lazy" src="https://spectrum.ieee.org/media-library/two-people-launching-a-drone-in-an-open-field-using-a-catapult-system.jpg?id=65341682&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Ukrainian soldiers watch a GOR reconnaissance drone take to the sky near Pokrovsk in the Donetsk region, on 10 March 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">ANDRIY DUBCHAK/FRONTLINER/GETTY IMAGES</small></p>In Bondar’s view, powerful AI-equipped drones won’t be deployed in large numbers given the current prices for high-end processors and <span>other advanced components. And, she adds, the more autonomous the system needs to be, the more expensive are the processors and sensors it must have. “For these cheap attack drones that fly only once, you don’t install a high-resolution camera that [has] the resolution for AI to see properly,” she says. “[You install] the cheapest camera. You don’t </span><span>want expensive chips that can run AI algorithms either. Until we can achieve this balance of technological sophistication, when a system can conduct a mission but at the lowest price possible, it won’t be deployed en masse.”</span><p>While existing AI systems are doing a good job recognizing and following large objects like Shaheds or tanks, experts question their ability to reliably distinguish and pursue smaller and more nimble or inconspicuous targets. “When we’re getting into more specific questions, like can it distinguish a Russian soldier from a Ukrainian soldier or at least a soldier from a civilian? The answer is no,” says Bondar. “Also, it’s one thing to track a tank, and it’s another to track infantrymen riding buggies and motorcycles that are moving very fast. That’s really challenging for AI to track and strike precisely.”</p><p>Clark, at the Hudson Institute, says that although the AI algorithms used to guide the Russian and <span>Ukrainian drones are “pretty good,” they rely on information provided bysensors that “aren’t good enough.” “You need multiphenomenology sensors that are able to look at infrared and visual and, in some cases, different parts of the infrared spectrum to be able to figure out if something is a decoy or real target,” </span><span>he </span><span>says.</span></p><p><span>German defense analyst Lange agrees that right now, battlefield AI image-recognition systems are too easily fooled. “If you compress reality into a </span><span>2D</span><span> image, a lot of things can be easily camouflaged—like what Russia did recently, when they started drawing birds on the back of their drones,” he <span>says.</span></span></p><h2>Autonomy Remains Elusive on the Ground and at Sea, Too</h2><p>To make Ukraine’s <span>emerging uncrewed ground vehicles (UGVs) equally self-sufficient will be an even greater task, in Clark’s view. 
Still, </span><span>Bondar expects major advances to materialize within the next several years, even if humans are still going to be part of the decision-making loop.</span></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Military radar equipment in a grassy field." class="rm-shortcode" data-rm-shortcode-id="0b36a03b7582535b3d3319d7d9b74c33" data-rm-shortcode-name="rebelmouse-image" id="d65ea" loading="lazy" src="https://spectrum.ieee.org/media-library/military-radar-equipment-in-a-grassy-field.jpg?id=65341671&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A mobile electronic-warfare system built by PiranhaTech is demonstrated near Kyiv on 21 October 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DANYLO ANTONIUK/ANADOLU/GETTY IMAGES</small></p><p>“I think in two or three years, we will have pretty good full autonomy, at least in good weather conditions,” she says, referring to aerial drones in partic<span>ular. “Humans will still be in the loop for some years, simply because there are so many unpredictable situations when you need an intervention. We won’t be able to fully rely on the machine for at least another 10 or 15 years.”</span></p><p>Ukrainian defenders are apprehensive about that autonomous future. The boom of drone inno<span>vation has come hand in hand with the development of sophisticated jamming and radio-frequency detection systems. But a lot of that innovation will become obsolete once the pendulum swings away from human control. Ukrainians got their first taste of dealing with unjammable drones in mid-2024, when Russia began rolling out fiber-optic tethered drones. Now they have to brace for a threat on a much larger scale.</span></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Quadcopter drone flying with a fire extinguisher attached in a cloudy sky." class="rm-shortcode" data-rm-shortcode-id="70f326221988cb6004338272d1d8dd4d" data-rm-shortcode-name="rebelmouse-image" id="aa25d" loading="lazy" src="https://spectrum.ieee.org/media-library/quadcopter-drone-flying-with-a-fire-extinguisher-attached-in-a-cloudy-sky.jpg?id=65341673&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">An experimental drone is demonstrated at the Brave1 defense-tech incubator in Kyiv.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DANYLO DUBCHAK/FRONTLINER/GETTY IMAGES</small></p><p>“Today, we have a situation where we have lots of signals on the battlefield, but in the near future, <span>in maybe two to five years, UAVs are not going to be sending any signals,” says Oleksandr Barabash, CTO of </span><a href="https://www.falcons.com.ua/en" target="_blank">Falcons</a>, a Ukrainian startup that has developed a smart radio-frequency detection system capable <span>of revealing precise locations of enemy radio sources such as drones, control stations, and jammers.</span></p><p>Last September, Falcons secured funding from the U.S.-based dual-use tech fund <a href="https://www.greenflag.vc/" target="_blank">Green Flag Ven</a><a href="https://www.greenflag.vc/" target="_blank">tures</a> to scale production of its technology and work toward NATO certification. But Barabash admits that its system, like all technologies fielded in <span>Ukrainian war zones, has an expiration date. 
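</span></p><p>The localization trick behind such RF-detection gear is conceptually simple: a signal reaches spatially separated receivers at slightly different times, and those time differences pin down the emitter. The sketch below is a minimal, hypothetical illustration of that time-difference-of-arrival (TDOA) idea, with invented receiver positions and noise figures; it is not Falcons’ actual algorithm:</p><pre><code># Minimal TDOA (time-difference-of-arrival) emitter localization.
# Hypothetical illustration only; positions and noise are invented.
import numpy as np

C = 3e8  # signal propagation speed, m/s

# Assumed receiver positions on a 2-D plane, in meters.
receivers = np.array([[0.0, 0.0], [4000.0, 0.0],
                      [0.0, 4000.0], [4000.0, 4000.0]])
true_emitter = np.array([2600.0, 1200.0])

# Simulate measured TDOAs relative to receiver 0, with timing noise.
dists = np.linalg.norm(receivers - true_emitter, axis=1)
rng = np.random.default_rng(0)
tdoas = (dists - dists[0]) / C + rng.normal(0, 5e-9, size=len(dists))

# Brute-force grid search: pick the point whose predicted TDOAs
# best match the measured ones (robust, if inelegant).
xs = np.linspace(0, 4000, 201)
grid = np.array([[x, y] for x in xs for y in xs])
d = np.linalg.norm(grid[:, None, :] - receivers[None, :, :], axis=2)
residual = ((d - d[:, :1]) / C - tdoas) ** 2
best = grid[residual.sum(axis=1).argmin()]
print("estimated emitter position:", best)  # lands near (2600, 1200)
</code></pre><p>Production systems replace the grid search with least-squares solvers and demand nanosecond-level synchronization between receivers, but the geometry is the same.</p><p><span>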
Instead of radio-frequency detectors, Barabash thinks, the next R&D push needs to focus on passive radar systems capable of identifying small and fast-moving targets based on signals from sources like TV towers and radio transmitters, which propagate through the environment and reflect off those moving targets. Passive radars have a significant advantage in the war zone, according to Barabash. Since they don’t emit their own signal, they can’t be so easily discovered by the enemy.</span></p><p>“Active radar is emitting signals, so if you are using active radars, you are target No. 1 on the front line,” Barabash says.</p><p>Bondar, on the other hand, thinks that the increased onboard compute power needed for AI-controlled drones will, by itself, generate enough electromagnetic radiation to prevent autonomous drones from ever operating completely undetectably.</p><p>“You can have full autonomy, but you will still have systems onboard that emit electromagnetic radiation or heat that can be detected,” says Bondar. “Batteries emit electromagnetic radiation, motors emit heat, and [that heat can be] visible in infrared from far away. You just need to have the right sensors to be able to identify it in advance.” She adds that the takeaway is “how capable contemporary detection systems have become and how technically challenging it is to design drones that can reliably operate in the Ukrainian battlefield environment.”</p><h2>There Will Be Nowhere to Hide from Autonomous Drones</h2><p>When autonomous drones become a standard weapon of war, their threat will extend far beyond the battlefields of Ukraine. Autonomous turrets and drone-interceptor fortifications might soon dot the perimeters of European cities, particularly in the eastern part of the continent.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Person holding gray drone against a blue sky, preparing to launch it." class="rm-shortcode" data-rm-shortcode-id="c480e8fb2bdf2e560c142729e35c7320" data-rm-shortcode-name="rebelmouse-image" id="f9032" loading="lazy" src="https://spectrum.ieee.org/media-library/person-holding-gray-drone-against-a-blue-sky-preparing-to-launch-it.jpg?id=65327903&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A fixed-wing drone is tested in Ukraine in April 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">ANDREW KRAVCHENKO/BLOOMBERG/GETTY IMAGES</small></p><p>Nefarious actors from all over the world have closely watched Ukraine and taken notes, warns Lange. Today, FPV drones are being used by <a href="https://gnet-research.org/2025/07/30/weaponised-skies-the-expansion-of-terrorist-drone-use-across-africa/" target="_blank">terrorists in Africa</a> and by <a href="https://www.atlanticcouncil.org/blogs/new-atlanticist/drug-cartels-are-adopting-cutting-edge-drone-technology-heres-how-the-us-must-adapt/#%3A~%3Atext%3DIf%20confirmed%2C%20this%20would%20suggest%2CUS%20homeland%20security%E2%80%94are%20profound" target="_blank">Mexican drug cartels</a> to fight against local authorities.</p><p>When autonomous killing machines become widely available, it’s likely that no city will be safe. “We might see nets above city centers, protecting civilian streets,” Lange says. “In every case, the West needs to start performing similar kinetic-defense development that we see in Ukraine.
Very rapid iteration and testing cycles to find solutions.”</p><p>Azhnyuk is concerned that the historic defenders of Europe—the <span>United States and the European countries themselves—are falling behind. “We are in danger,” he says. While Russia and Ukraine made major strides in their drones and countermeasures over the past year, “Europe and the United States have progressed, in the best-case scenario, from the winter-of-2022 technology to the summer-of-2022 technology.</span></p><p>“The gap is getting wider,” he warns. “I think the next few years are very dangerous for the security of Europe.” <span class="ieee-end-mark"></span></p><p><em>This article appears in the April 2026 print issue as “Rise of the <span>Autonomous </span>Attack Drones.”<br/></em></p><p><em>A correction was made on 30 April 2026. The paragraph describing use of offensive drones by “nefarious actors” originally referred to “Islamic terrorists.” We regret the use of that phrase, which wrongly characterizes an entire religion. </em></p>]]></description><pubDate>Tue, 24 Mar 2026 13:00:05 +0000</pubDate><guid>https://spectrum.ieee.org/autonomous-drone-warfare</guid><category>Military-robots</category><category>Military-drones</category><category>Drone-war</category><category>Shahed-drones</category><category>Ai-agents</category><dc:creator>Tereza Pultarova</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/person-holding-a-large-drone-outdoors-under-a-sunny-partly-cloudy-sky.jpg?id=65327386&amp;width=980"/></item><item><title>Video Friday: Humanoid Learns Tennis Skills Playing Humans</title><link>https://spectrum.ieee.org/tennis-playing-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robot-playing-tennis-holding-racket-on-green-court-inset-shows-human-opponent-hitting-ball.png?id=65325604&width=1245&height=700&coordinates=19%2C0%2C20%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.</p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="23zsarayx6o"><em>Human athletes demonstrate versatile and highly dynamic tennis skills to successfully conduct competitive rallies with a high-speed tennis ball. However, reproducing such behaviors on humanoid robots is difficult, partially due to the lack of perfect humanoid action data or human kinematic motion data in tennis scenarios as reference. In this work, we propose LATENT, a system that Learns Athletic humanoid TEnnis skills from imperfect human motioN daTa.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b359b1966adb83fc68515b1a4514b8ca" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/23ZsaraYX6o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://zzk273.github.io/LATENT/">LATENT</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="cwithpe4hna">A beautifully designed robot inspired by Strandbeests.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1c60a43596b696ace279c9366e02ecd4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CwItHPe4HnA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cranfield.ac.uk/press/news-2026/wind-powered-robot-could-enable-long-term-exploration-of-hostile-environments">Cranfield University</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="uvqdqf8ppuw"><em>We believe we’re the first robotics company to demonstrate a robot peeling an apple with dual dexterous humanlike hands. This breakthrough closes a key gap in robotics, achieving bimanual, contact-rich manipulation and moving far beyond the limits of simple grippers.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2c50d7039587c10b8f33da57970bff7f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UVQdqf8ppuw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Today’s AI models (VLMs) are excellent at perception but struggle with action. 
Controlling high-degree-of-freedom hands for tasks like this is incredibly complex, and precise finger-level teleoperation is nearly impossible for humans.  Our first step was a shared-autonomy system: rather than controlling every finger, the operator triggers prelearned skills like a “rotate apple or tennis ball” primitive via a keyboard press or pedal. This makes scalable data collection and RL training possible.</em><br/><em>How does the AI manage this? We created “<a data-linked-post="2674040994" href="https://spectrum.ieee.org/video-friday-google-gemini-robotics" target="_blank">MoDE-VLA</a>” (Mixture of Dexterous Experts). It fuses vision, language, force, and touch data by using a team of specialist “experts,” making control in high-dimensional spaces stable and effective.  The combination of these two innovations allows for seamless, contact-rich manipulation. The human provides high-level guidance, and the robot executes the complex in-hand coordination required.</em></blockquote><p>[ <a href="https://www.sharpa.com/">Sharpa</a> ]</p><p>Thanks, Alex!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pczsnnwxvia"><em>It was great to see our name amongst the other “AI Native” companies during the <a data-linked-post="2676218078" href="https://spectrum.ieee.org/nvidia-groq-3" target="_blank">NVIDIA GTC</a> keynote. NVIDIA Isaac Lab helps us train reinforcement learning policies that enable the UMV to drive, jump, flip, and hop like a pro.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7b935f0fe975b31f175c1f1fb07566e0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pcZSNNWXviA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rai-inst.com/">Robotics and AI Institute</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="iojvnq-zhww"><em>This Finger-Tip Changer technology was jointly researched and developed through a collaboration between Tesollo and RoCogMan LaB at Hanyang University ERICA. The project integrates Tesollo’s practical robotic hand development experience with the lab’s expertise in robotic manipulation and gripper design.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="02d553395b82e93112b8f1739a601bd4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iojvNQ-Zhww?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I don’t know why more robots don’t do this. Also, those pointy fingertips are terrifying.</p><p>[ <a href="http://bmr.hanyang.ac.kr/">RoCogMan LaB</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="z55m_um_7fq">Here’s an upcoming ICRA paper from the Fluent Robotics Lab at the University of Michigan along with the <a href="https://www.laas.fr/en/teams/ris/" target="_blank">Robotics and Interactions Team at LAAS-CNRS</a> featuring an operational <a data-linked-post="2650254910" href="https://spectrum.ieee.org/this-is-what-pr2s-do-for-fun" target="_blank">PR2</a>! 
With functional batteries!!!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="df662d906aa6b4c85644b271ad7a281c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/z55M_um_7fQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://fluentrobotics.com/">Fluent Robotics Lab</a> ] and [ <a href="https://www.laas.fr/en/teams/ris/" target="_blank">RIS</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="9qzctmarvpk"><em>This video showcases the field tests and interaction capabilities of KAIST Humanoid v0.7, developed at the DRCD Lab featuring in-house actuators. The control policy was trained through deep reinforcement learning leveraging human demonstrations.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6868cb35447265d5d8ab10642b15acd5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9qZcTMARvpk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://dynamicrobot.kaist.ac.kr/">KAIST DRCD Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="_wnckaf2gb8">This needs to come in adult size.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="489e194d2beb7942474b8da6039ec082" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_WNckAf2GB8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="k5wgpmenpcq">I did not know this, but apparently shoeboxes are really annoying to manipulate because if you grab them by the lid, they just open, so specialized hardware is required.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b2f884b6d81248335c4efbff6414e328" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/k5WGpMENPCQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://nomagic.ai/news/zalando-to-install-up-to-50-ai-powered-nomagic-robots/">Nomagic</a> ]</p><p>Thanks, Gilmarie!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="clfpxcpza14"><em>This paper presents a method to recover quadrotor Unmanned Air Vehicles (UAVs) from a throw, when no control parameters are known before the throw.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="da02ec67edcf7a40100d406b105b468a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CLFPXcpzA14?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a 
href="https://ieeexplore.ieee.org/document/10801514">MAVLab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pmetcxgumhm">Uh-oh, robots can see glass doors now. We’re in trouble.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="31ecd9975c0baef1553d7e3372c79b98" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pMeTCxGumhM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en/products/oli">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pshyocgoc5u">This drone hugs trees <3</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="12d5d406c1777d91e696c722d9f0fba1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pSHYocGOC5U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://slap-perching.github.io/">Stanford BDML</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="afviggntkm8"><em>Electronic waste is one of the fastest-growing environmental problems in the world. As robotics and electronic systems become more widespread, their environmental footprint continues to increase. In this research, scientists developed a fully biodegradable soft robotic system that integrates electronic devices, sensors, and actuators yet completely decomposes after use.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="75a180ef9157983647255f5588abe215" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AFVIGgntKm8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nature.com/articles/s41893-026-01780-4">Nature</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="yhyvrk9wce8"><em>We developed a distributed algorithm that enables multiple aerial robots to flock together safely in complex environments, without explicit communication or prior knowledge of the surroundings, using only onboard sensors and computation. Our approach ensures collision avoidance, maintains proximity between robots, and handles uncertainties (tracking errors and sensor noise). 
Tested in simulations and real-world experiments with up to four drones in a dense forest, it proved robust and reliable.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0ff57e0c9dc071bc6306ca0c3798c944" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yHyvrk9WCE8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mrs.fel.cvut.cz/rbl">RBL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="b3v-ylwcaee"><em>The University of Pennsylvania’s 2025 President’s Sustainability Prize winner Piotr Lazarek has developed a system that uses satellite data to pinpoint inefficiencies in farmers’ fields, conducts real-time soil analysis with autonomous drones to understand why they occur, and generates precise fertilizer application maps. His startup Nirby aims to increase productivity in farm areas that are underperforming and reduce fertilizer in high-performing ones.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="796256fd6880d5e76310d5685661fa67" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/b3v-yLwcAEE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://penntoday.upenn.edu/news/2025-penn-presidents-sustainability-prize-recipient-nirby">University of Pennsylvania</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="wl0-pu_8f0u"><em>The production version of Atlas is a departure from the typical humanoid form factor, favoring industrial utility over human likeness. Intended for purposeful work in an industrial setting, Atlas has a form factor that signals its role as a machine rather than a companion or friendly assistant. Join two lead hardware engineers and our head of industrial design for a technical discussion of how key product requirements, ranging from passive thermal management to a modular architecture, dictated a bold new vision for a humanoid.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cce82a9b133af0d383e29e75c54cb937" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wL0-Pu_8F0U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/blog/atlas-evolution-from-research-robot-to-industrial-humanoid/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cmbbkd46z48"><em>Dr. 
Christian Hubicki gives a talk exploring the common themes of modern robotics research and his time on the reality competition show, “Survivor.”</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="04cf1a709c7b176620b8d56b2629431a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CmBbkd46Z48?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.optimalroboticslab.com/">Optimal Robotics Lab</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Sat, 21 Mar 2026 16:30:04 +0000</pubDate><guid>https://spectrum.ieee.org/tennis-playing-robot</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Robot-locomotion</category><category>Nvidia</category><category>Robot-manipulation</category><category>Quadruped-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robot-playing-tennis-holding-racket-on-green-court-inset-shows-human-opponent-hitting-ball.png?id=65325604&amp;width=980"/></item><item><title>Overcoming Core Engineering Barriers in Humanoid Robotics Development</title><link>https://content.knowledgehub.wiley.com/engineering-challenges-and-component-strategies-in-humanoid-robotics-from-prototype-to-production/</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/logo-of-murata-in-red-with-text-innovator-in-electronics-below.png?id=65106483&width=980"/><br/><br/><p><span>A technical examination of the sensing, motion control, power, and thermal challenges facing humanoid robotics engineers — with component-level design strategies for real-world deployment.</span></p><p><span>What Attendees will Learn</span></p><ol><li><span>Why motion control remains the hardest unsolved problem — Explore the modelling complexity, real-time feedback requirements, and sensor fusion demands of maintaining stable bipedal locomotion across dynamic environments.</span></li><li><span>How sensing architectures enable perception and safety — Understand the role of inertial measurement units, force/torque feedback, and tactile sensing in achieving reliable human-robot interaction and collision avoidance.</span></li><li><span>What power and thermal constraints mean for system design — Examine the trade-offs in battery chemistry selection (LFP vs. NCA), DC/DC converter topologies, and thermal protection strategies that determine operational endurance.</span></li><li><span>How the industry is transitioning from prototype to mass production — Learn about the shift toward modular architectures, cost-driven component selection, and supply chain readiness projected for the late 2020s.</span></li></ol><p><a href="https://content.knowledgehub.wiley.com/engineering-challenges-and-component-strategies-in-humanoid-robotics-from-prototype-to-production/" target="_blank">Download this free whitepaper now!</a></p>]]></description><pubDate>Thu, 19 Mar 2026 10:00:05 +0000</pubDate><guid>https://content.knowledgehub.wiley.com/engineering-challenges-and-component-strategies-in-humanoid-robotics-from-prototype-to-production/</guid><category>Sensor-fusion</category><category>Type-whitepaper</category><category>Motion-control</category><category>Humanoid-robots</category><dc:creator>Murata Manufacturing Co.</dc:creator><media:content medium="image" type="image/png" url="https://assets.rbl.ms/65106483/origin.png"/></item><item><title>Video Friday: These Robots Were Born to Run</title><link>https://spectrum.ieee.org/legged-modular-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/rolling-cannon-distant-cityscape-trees-and-water.gif?id=65282014&width=1245&height=700&coordinates=0%2C0%2C0%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="8vksx1zsg7q"><em>All legged robots deployed “in the wild” to date were given a body plan that was predefined by human designers and could not be redefined in situ. The manual and permanent nature of this process has resulted in very few species of agile terrestrial robots beyond familiar four-limbed forms. Here, we introduce highly athletic modular building blocks and show how they enable the automatic design and rapid assembly of novel agile robots that can “hit the ground running” in unstructured outdoor environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="508a07a4b7d915c6cfd07081bdc63e86" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8VKSx1zSg7Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.northwestern.edu/news-events/index.html" target="_blank">Northwestern University Center for Robotics and Biosystems</a> ] [ <a href="https://www.pnas.org/doi/10.1073/pnas.2519129123">Paper</a> ] via [ <a href="https://gizmodo.com/these-self-configuring-modular-robots-may-one-day-rule-the-world-2000731381">Gizmodo</a> ] </p><div class="horizontal-rule"></div><p class="rm-anchors" id="l2q3kpl4mjq">If you were going to develop the ideal urban delivery robot more or less from scratch, it would be this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ba83a841b32a7807384eeb10bc2c6b03" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/l2q3kPl4mJQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.rivr.ai/rivr-two">RIVR</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="cadtjepdbfc">Don’t get me wrong, there are some clever things going on here, but I’m still having a lot of trouble seeing where the unique, sustainable value is for a <a data-linked-post="2666662286" href="https://spectrum.ieee.org/humanoid-robots" target="_blank">humanoid robot</a> performing these sorts of tasks.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b6313fcff2b0315bed664e00897cf53a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CAdTjePDBfc?rel=0" 
style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figure.ai/news/helix-02-living-room-tidy">Figure</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xyhob9__qk0">One of those things that you don’t really think about as a human, but which is actually pretty important.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="53ef3877dae03acc90a17fd9dcba1e6b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xYhOb9__Qk0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2602.05760">Paper</a> ] via [ <a href="https://rsl.ethz.ch/" target="_blank">ETH Zurich</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="wi6u8bvofvc"><em>We propose TRIP-Bag (Teleoperation, Recording, Intelligence in a Portable Bag), a portable, puppeteer-style teleoperation system fully contained within a commercial suitcase, as a practical solution for collecting high-fidelity manipulation data across varied settings.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4d39bbac7f62958700b13bfd53bc8bfd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Wi6U8bvoFvc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://uiuckimlab.github.io/TRIP-Bag-pages/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nuouwhuzpwq"><em>We propose an open-vocabulary semantic exploration system that enables robots to maintain consistent maps and efficiently locate (unseen) objects in semi-static real-world environments using LLM-guided reasoning.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1724903fcd3e1d57df45824508205a87" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nUouwHUZPWQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.tum.de/en/news-and-events/all-news/press-releases/details/search-robot-thinks-for-itself">TUM</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="vrxamllkjko">That’s it, folks. We have no need for real pandas anymore—if we ever did in the first place. 
Be honest, what has a <a data-linked-post="2675288239" href="https://spectrum.ieee.org/robot-martial-arts" target="_blank">panda</a> done for you lately?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="af51d3f513d68d80617dd0b62738a1bb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VRxAMLlkjko?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.magiclab.top/en/">MagicLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="uhd6o6dem_o"><em>RoboGuard is a general-purpose guardrail for ensuring the safety of LLM-enabled robots. RoboGuard is configured offline with high-level safety rules and a robot description, reasons about how these safety rules are best applied in the robot’s context, and then synthesizes a plan that maximally follows user preferences while ensuring safety.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bfc2abc33b815af7c16c37617a485a87" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uhd6O6DEM_o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robo-guard.github.io/">RoboGuard</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="5ekki51q1sk"><em>In this demonstration, a small team responds to a (simulated) radiation contamination leak at a real nuclear reactor facility. The team deploys their reconfigurable robot to accompany them through the facility. As the station is suddenly plunged into darkness, the robot’s camera is hot-swapped to thermal so that it can continue on. Upon reaching the approximate location of the contamination, the team installs a Compton gamma-ray camera and pan-tilt illuminating device. The robot autonomously steps forward, locates the radiation source, and points it out with the illuminator.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7928c582f10167b05ca04c694c729b67" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5ekKI51q1Sk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/11078050">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zcnmhsg5bpw"><em>On March 6, 2025, the Robomechanics Lab at CMU was flooded with 4 feet of black water (i.e., mixed with sewage). We lost most of the robots in the lab, and as a tribute, my students put together this “In Memoriam” video. 
It includes some previously unreleased robots and video clips.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e1739161841c3f7f5fb2ae563d8b15bc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zcnMHsg5Bpw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cmu.edu/me/robomechanicslab/">Carnegie Mellon University Robomechanics Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="i3goczr4ya0">There haven’t been a lot of successful <a data-linked-post="2650267089" href="https://spectrum.ieee.org/your-kid-wants-a-thymio-ii-education-robot" target="_blank">education robots</a>, but here’s one of them.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="971e1ccf67faed9fa1a9a5292d6b5b49" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/i3goCzr4YA0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sphero.com/collections/all/products/rvr?variant=42004659142701">Sphero</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="35i9m-jc0oc">The opening keynote from the 2025 Silicon Valley Humanoids Summit: “Insights Into Disney’s Robotic Character Platform,” by Moritz Baecher, Director, Zurich Lab, Disney Research.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a7fc3671608ce481554dac55c022d319" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/35i9M-jc0Oc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://humanoidssummit.com/">Humanoids Summit</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 13 Mar 2026 16:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/legged-modular-robot</guid><category>Robotics</category><category>Humanoid-robots</category><category>Video-friday</category><category>Modular-robots</category><category>Robot-videos</category><category>Quadruped-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/rolling-cannon-distant-cityscape-trees-and-water.gif?id=65282014&amp;width=980"/></item><item><title>Video Friday: A Robot Hand With Artificial Muscles and Tendons</title><link>https://spectrum.ieee.org/video-friday-robot-hand-artificial-muscles</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robotic-hand-grasping-a-red-bull-can-against-a-dark-background.png?id=65162441&width=1245&height=700&coordinates=0%2C42%2C0%2C43"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="hd1hdfw1bhy"><em>The functional replication and actuation of complex structures inspired by nature is a longstanding goal for humanity. Creating such complex structures combining soft and rigid features and actuating them with artificial muscles would further our understanding of natural kinematic structures. We printed a biomimetic hand in a single print process composed of a rigid skeleton, soft joint capsules, tendons, and printed touch sensors.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d1520429687b7c6ef41cd204b2161ddc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hD1HDFw1BhY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/abstract/document/10522043">Paper</a> ] via [ <a href="https://srl.ethz.ch/">SRL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="u18ehtnvfd4">Two <a href="https://spectrum.ieee.org/tag/boston-dynamics" target="_blank">Boston Dynamics</a> product managers talk about their favorite classic BD robots, and then I talk about mine.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ee14e75b8b4fac354bdb72fef9eb1549" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U18EHTnvFd4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>And this is Boston Dynamics’ LittleDog, doing legged locomotion research 16 or so years ago in what I’m pretty sure is Katie Byl’s lab at UCSB.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="27626ccc6010288122cf616a0f35aa3d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AdWpo43b2FI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/about/history/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gocorcrlgb4"><em>This is our latest work on the trajectory planning method for floating-based articulated robots, enabling the global path for searching in complex and cluttered environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span 
class="rm-shortcode" data-rm-shortcode-id="f2d2afeed034c4c40136e41360360951" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GOcorcrLGb4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dragon.t.u-tokyo.ac.jp/">DRAGON Lab</a> ]</p><p>Thanks, Moju!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="i2jmf_z9ts8"><em>OmniPlanner is a unified solution for exploration and inspection-path planning (as well as target reach) across aerial, ground, and underwater robots. It has been verified through extensive simulations and a multitude of field tests, including in underground mines, ballast water tanks, forests, university buildings, and submarine bunkers.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fcaa6b98fc3995a5010528eb89bb8f14" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/I2JMF_Z9tS8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.autonomousrobotslab.com/">NTNU</a> ]</p><p>Thanks, Kostas!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="a_hwcpqbbly"><em>In the ARISE project, the <a href="https://www.fzi.de/en/" target="_blank">FZI Research Center for Information Technology</a> and its international partners ETH Zurich, University of Zurich, University of Bern, and University of Basel took a major step toward future lunar missions by testing cooperative autonomous multirobot teams under outdoor conditions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="17b9634bb780c7e02ba8230822684990" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/a_hwCPQbBlY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fzi.de/en/2025/02/26/one-step-closer-to-the-moon-through-international-cooperation/">FZI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="dmbjbwhwyeu">Welcome to the future, where there are no other humans.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="05f12866fdd4c32a9372563b0d407f5d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DmbJbwhWYEU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.zj-humanoid.com/">Zhejiang Humanoid</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8oot8cnpai0"><em>This is our latest work on robotic fish, and it’s also the first underwater robot from DRAGON Lab. 
</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2e719c55aa3bd82ab9f1c1123ecfe88f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8oot8CnpAi0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dragon.t.u-tokyo.ac.jp/">DRAGON Lab</a> ]</p><p>Thanks, Moju!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="awrnl8rcbmk">Watch this one simple trick to make <a href="https://spectrum.ieee.org/topic/robotics/humanoid-robots/" target="_blank">humanoid robots</a> cheaper and safer!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e69112d5d83fdcd0226a652b2b7cb898" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AwRnL8rcBmk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.zj-humanoid.com/">Zhejiang Humanoid</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="90twy79yffo"><em>Gugusse and the Automaton</em> is a 1897 French film by <a href="https://en.wikipedia.org/wiki/Georges_M%C3%A9li%C3%A8s" target="_blank">Georges Méliès</a> featuring a humanoid robot in a depiction that’s nearly as realistic as some of the humanoid promo videos we’ve seen lately.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4d2d9a469b74b0b57aa6d34c9859e471" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/90tWY79YfFo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.loc.gov/item/2026125501/?loclr=blogloc">Library of Congress</a> ] via [ <a href="https://gizmodo.com/first-film-to-depict-a-robot-discovered-in-michigan-2000727995">Gizmodo</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lm3htxushva"><em>At Agility, we create automated solutions for the hardest work. 
We’re incredibly proud of how far we’ve come, and can’t wait to show you what’s next.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="007204bb742016f199f77925109d19ef" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LM3hTXUShvA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.agilityrobotics.com/">Agility</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="si-jhnqcjt0"><a href="https://www.nist.gov/people/kamel-s-saidi" target="_blank">Kamel Saidi</a>, robotics program manager at the <a href="https://www.nist.gov/" target="_blank">National Institute of Standards and Technology (NIST)</a>, on how performance standards can pave the way for humanoid adoption.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4484b448f54ec06f10f3985953b03c9b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sI-jhnqcJt0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://humanoidssummit.com/">Humanoids Summit</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gwsl1oh1i4w"><em><a href="https://people.eecs.berkeley.edu/~anca/" target="_blank">Anca Dragan</a> is no stranger to Waymo. She worked with us for six years while also at UC Berkeley and now at Google DeepMind. Her focus on making AI safer helped Waymo as it launched commercially. In this final episode of our season, Anca describes how her work enables AI agents to work fluently with people, based on human goals and values.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bafb48d4c6dec0871809a152ad842b8e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GwSl1OH1i4w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/playlist?list=PLCkt0hth826G9AtnOrQsPbKKD5JmdaMXb">Waymo Podcast</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="r9ugdinfhbm">This <a href="https://www.grasp.upenn.edu/" target="_blank">UPenn GRASP</a> SFI Seminar is by Junyao Shi: “Unlocking Generalist Robots with Human Data and Foundation Models.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c1e2f8d6dc1171693ed8ee0180f30e9d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/r9UGdInfhBM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Building general-purpose robots remains fundamentally constrained by data scarcity and labor-intensive engineering. Unlike vision and language, robotics lacks large, diverse datasets that span tasks, environments, and embodiments, thus limiting both scalability and generalization. 
This talk explores how human data and foundation models trained at scale can help overcome these bottlenecks.</em></blockquote><p>[ <a href="https://www.grasp.upenn.edu/events/spring-2026-grasp-sfi-junyao-shi/">UPenn</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 06 Mar 2026 16:00:05 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-hand-artificial-muscles</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Underwater-robots</category><category>Bipedal-robots</category><category>Robot-videos</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robotic-hand-grasping-a-red-bull-can-against-a-dark-background.png?id=65162441&amp;width=980"/></item><item><title>What Military Drones Can Teach Self-Driving Cars</title><link>https://spectrum.ieee.org/military-drones-self-driving-cars</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/silhouette-from-the-back-of-an-adults-head-as-they-look-at-two-monitors-one-screen-displays-a-drone-and-the-other-shows-self-d.jpg?id=65098234&width=1245&height=700&coordinates=0%2C156%2C0%2C157"/><br/><br/><p><a href="https://spectrum.ieee.org/self-driving-cars/missy-cummings" target="_blank">Self-driving cars often struggle</a> with situations that are commonplace for human drivers. When confronted with construction zones, school buses, power outages, or misbehaving pedestrians, these vehicles often behave unpredictably, leading to crashes or freezing events, causing significant disruption to local traffic and possibly blocking first responders from doing their jobs. Because self-driving cars cannot successfully handle such routine problems, self-driving companies use human babysitters to remotely supervise them and intervene when necessary.</p><div class="rm-embed embed-media"><iframe height="110px" id="noa-web-audio-player" src="https://embed-player.newsoveraudio.com/v4?key=q5m19e&id=https://spectrum.ieee.org/military-drones-self-driving-cars&bgColor=F5F5F5&color=1b1b1c&playColor=1b1b1c&progressBgColor=F5F5F5&progressBorderColor=bdbbbb&titleColor=1b1b1c&timeColor=1b1b1c&speedColor=1b1b1c&noaLinkColor=556B7D&noaLinkHighlightColor=FF4B00&feedbackButton=true" style="border: none" width="100%"></iframe></div><p>This idea—humans supervising autonomous vehicles from a distance—is not new. The U.S. military has been doing it since the 1980s with unmanned aerial vehicles (UAVs). In those early years, the military experienced numerous accidents due to poorly designed control stations, lack of training, and communication delays.</p><p>As a Navy fighter pilot in the 1990s, I was one of the first researchers to examine how to improve the UAV remote supervision interfaces. The thousands of hours I and others have spent working on and observing these systems generated a deep body of knowledge about how to safely manage remote operations. With recent revelations that U.S. commercial self-driving car remote operations are handled by <a href="https://www.c-span.org/program/senate-committee/tesla-and-waymo-executives-others-testify-about-self-driving-cars/672835" rel="noopener noreferrer" target="_blank">operators in the Philippines</a>, it is clear that self-driving companies have not learned the hard-earned military lessons that would promote safer use of self-driving cars today.</p><p>While stationed in the Western Pacific during the Gulf War, I spent a significant amount of time in air operations centers, learning how military strikes were planned, implemented and then replanned when the original plan inevitably fell apart. After obtaining my PhD, I leveraged this experience to begin research on the remote control of UAVs for all three branches of the U.S. military. Sitting shoulder-to-shoulder in tiny trailers with operators flying UAVs in local exercises or from 4000 miles away, my job was to learn about the pain points for the remote operators as well as identify possible improvements as they executed supervisory control over UAVs that might be flying halfway around the world.</p><p>Supervisory control refers to situations where humans monitor and support autonomous systems, stepping in when needed. For self-driving cars, this oversight can take several forms. The first is teleoperation, where<strong> </strong>a human remotely controls the car’s speed and steering from afar. 
Operators sit at a console with a steering wheel and pedals, similar to a racing simulator. Because this method relies on real-time control, it is extremely sensitive to communication delays.</p><p>The second form of supervisory control is remote assistance. Instead of driving the car in real time, a human gives higher-level guidance. For example, an operator might click a path on a map (called laying “breadcrumbs”) to show the car where to go, or interpret information the AI cannot understand, such as hand signals from a construction worker. This method tolerates more delay than teleoperation but is still time-sensitive.</p><h2>Five Lessons From Military Drone Operations</h2><p>Over 35 years of UAV operations, the military consistently encountered five major challenges that provide valuable lessons for self-driving cars.</p><h3>Latency</h3><p>Latency—delays in sending and receiving information due to distance or poor network quality—is the single most important challenge for remote vehicle control. Humans also have their own built-in delay: neuromuscular lag. Even under perfect conditions, people cannot reliably respond to new information in less than 200–500 milliseconds. In remote operations, where communication lag already exists, this makes real-time control even more difficult.</p><p>In early drone operations, U.S. Air Force pilots in Las Vegas (the primary U.S. UAV operations center) attempted to take off and land drones in the Middle East using teleoperation. With at least a two-second delay between command and response, the accident rate was <a href="https://dsiac.dtic.mil/articles/reliability-of-uavs-and-drones/" rel="noopener noreferrer" target="_blank">16 times that of fighter jets conducting the same missions</a>. The military switched to local line-of-sight operators and eventually to fully automated takeoffs and landings. When I interviewed the pilots of these UAVs, they all stressed how difficult it was to control the aircraft with significant time lag.</p><p>Self-driving car companies typically rely on cellphone networks to deliver commands. These networks are unreliable in cities and prone to delays. This is one reason many companies prefer remote assistance instead of full teleoperation. But even remote assistance can go wrong. In <a href="https://www.forbes.com/sites/bradtempleton/2024/03/26/waymo-runs-a-red-light-and-the-difference-between-humans-and-robots/" rel="noopener noreferrer" target="_blank">one incident</a>, a Waymo operator instructed a car to turn left when a traffic light appeared yellow in the remote video feed—but the network latency meant that the light had already turned red in the real world. After Waymo moved its remote operations center from the U.S. to the Philippines, latency increased even further. It is imperative that control not be so remote, both to resolve the latency issue and to increase oversight of security vulnerabilities.</p><h3>Workstation Design</h3><p>Poor interface design has caused many drone accidents. The military learned the hard way that confusing controls, difficult-to-read displays, and unclear autonomy modes can have disastrous consequences.
Depending on the specific UAV platform, the FAA attributed between 20% and 100% of Army and Air Force UAV <a href="https://apps.dtic.mil/sti/pdfs/ADA460102.pdf" rel="noopener noreferrer" target="_blank">crashes caused by human error through 2004</a> to poor interface design.</p><h3>UAV crashes (1986–2004) caused by human-factors problems, including poor interface and procedure design. The interface and procedure categories do not sum to 100% because both factors could be present in the same accident.</h3><br/><table border="0" style="white-space: unset;" width="100%"><tbody><tr><th align="left" scope="col" style="color: #ffffff; background-color: #000000;"></th><th align="left" scope="col" style="color: #ffffff; background-color: #000000;" width="25%">Human Factors</th><th align="left" scope="col" style="color: #ffffff; background-color: #000000;" width="25%">Interface Design</th><th align="left" scope="col" style="color: #ffffff; background-color: #000000;" width="25%">Procedure Design</th></tr><tr><th align="left" scope="col" style="color: #ffffff; background-color: #000000;">Army Hunter</th><td align="left" style="background-color: #DFD5C1;">47%</td><td align="left" style="background-color: #DFD5C1;">20%</td><td align="left" style="background-color: #DFD5C1;">20%</td></tr><tr><th align="left" scope="col" style="color: #ffffff; background-color: #000000;">Army Shadow</th><td align="left" style="background-color: #E9E3D6;">21%</td><td align="left" style="background-color: #E9E3D6;">80%</td><td align="left" style="background-color: #E9E3D6;">40%</td></tr><tr><th align="left" scope="col" style="color: #ffffff; background-color: #000000;">Air Force Predator</th><td align="left" style="background-color: #DFD5C1;">67%</td><td align="left" style="background-color: #DFD5C1;">38%</td><td align="left" style="background-color: #DFD5C1;">75%</td></tr><tr><th align="left" scope="col" style="color: #ffffff; background-color: #000000;" width="25%">Air Force Global Hawk</th><td align="left" style="background-color: #E9E3D6;">33%</td><td align="left" style="background-color: #E9E3D6;">100%</td><td align="left" style="background-color: #E9E3D6;">0%</td></tr></tbody></table><p>Many UAV crashes have been caused by poorly designed human control systems. In one case, buttons were placed on the controllers such that it was relatively easy to <a href="https://spectrum.ieee.org/review-djis-new-fpv-drone-is-effortless-exhilarating-fun" target="_self">accidentally shut off the engine</a> instead of firing a missile, and in several accidents remote operators <a href="https://dspace.mit.edu/handle/1721.1/84129" rel="noopener noreferrer" target="_blank">did exactly that</a>.</p><p>The self-driving industry shows hints of comparable issues. Some autonomous shuttles use off-the-shelf gaming controllers, which—while inexpensive—were never designed for vehicle control. The off-label use of such controllers can lead to mode confusion, which was a factor in a <a href="https://www.govtech.com/transportation/after-crash-orlandos-self-driving-bus-back-on-the-road" rel="noopener noreferrer" target="_blank">recent shuttle crash</a>. Significant human-in-the-loop testing is needed to avoid such problems, not only prior to system deployment but also after major software upgrades.</p><h3>Operator Workload</h3><p>Drone missions typically include long periods of surveillance and information gathering, occasionally ending with a missile strike. 
These missions can sometimes last for days, for example while the military waits for a person of interest to emerge from a building. As a result, the remote operators experience extreme swings in workload: sometimes overwhelming intensity, sometimes crushing boredom. Both conditions can lead to errors.</p><p>When operators teleoperate drones, workload is high and fatigue can quickly set in. But when onboard autonomy handles most of the work, operators can become bored, complacent, and less alert. This pattern is <a href="https://www.airuniversity.af.edu/Wild-Blue-Yonder/Articles/Article-Display/Article/2144225/airmen-and-unmanned-aerial-vehicles-the-danger-of-generalization/" rel="noopener noreferrer" target="_blank">well documented in UAV research</a>.</p><p>Self-driving car operators are likely experiencing similar issues for tasks ranging from interpreting confusing signs to helping cars escape dead ends. In simple scenarios, operators may be bored; in emergencies—like driving into a flood zone or responding during a citywide power outage—they can become quickly overwhelmed.</p><p>The military has tried for years to have one person supervise many drones at once, because it is far more cost-effective. However, cognitive switching costs (regaining awareness of a situation after switching control between drones) result in workload spikes and high stress. Those costs, coupled with increasingly complex interfaces and communication delays, have made this extremely difficult.</p><p>Self-driving car companies likely face the same roadblocks. They will need to model operator workloads so they can reliably predict staffing levels and how many vehicles a single person can effectively supervise, especially during emergency operations. If every self-driving car turns out to need a dedicated human paying close attention, such operations would no longer be cost-effective.</p><h3>Training</h3><p>Early drone programs lacked formal training requirements, with training programs designed by pilots, for pilots. Unfortunately, supervising a drone is more akin to air traffic control than actually flying an aircraft, so the military often placed drone operators in critical roles with inadequate preparation. This caused many accidents. Only years later did the military conduct <a href="https://www.researchgate.net/publication/238795397_Enhancing_Unmanned_Aerial_System_Training_A_Taxonomy_of_Knowledge_Skills_Attitudes_and_Methods" rel="noopener noreferrer" target="_blank">a proper analysis of the knowledge, skills, and abilities needed to conduct safe remote operations</a> and change its training programs.</p><p>Self-driving companies do not publicly share their training standards, and no regulations currently govern the qualifications for remote operators. On-road safety depends heavily on these operators, yet very little is known about how they are selected or taught. Commercial aviation dispatchers, whose role is very similar to that of self-driving remote operators, are required to have formal training overseen by the FAA; we should hold commercial self-driving companies to similar standards.</p><h3>Contingency Planning</h3><p>Aviation has strong protocols for emergencies, including predefined procedures for lost communication, backup ground control stations, and highly reliable onboard behaviors when autonomy fails. In the military, drones may fly themselves to safe areas or land autonomously if contact is lost.
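</p><p>As a rough illustration of what such a predefined lost-link behavior can look like in software, here is a minimal sketch. It is an example with assumed thresholds, not the logic of any actual military or commercial system.</p><pre>
# Minimal lost-link contingency sketch (illustrative; thresholds are assumptions).
from enum import Enum, auto

class Mode(Enum):
    NOMINAL = auto()        # operator link healthy: proceed normally
    DEGRADED = auto()       # brief silence: slow down, behave predictably
    MINIMUM_RISK = auto()   # prolonged silence: pull over and stop safely

DEGRADED_AFTER_S = 1.0      # assumed: tolerate one second without a heartbeat
MINIMUM_RISK_AFTER_S = 5.0  # assumed: then execute a minimum-risk maneuver

def contingency_mode(seconds_since_heartbeat):
    """Choose a fallback behavior from the time since the operator's last heartbeat."""
    if seconds_since_heartbeat >= MINIMUM_RISK_AFTER_S:
        return Mode.MINIMUM_RISK
    if seconds_since_heartbeat >= DEGRADED_AFTER_S:
        return Mode.DEGRADED
    return Mode.NOMINAL

print(contingency_mode(7.0))  # Mode.MINIMUM_RISK: the car should already be pulling over
</pre><p>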
Systems are designed with cybersecurity threats—like GPS spoofing—in mind.</p><p>Self-driving cars appear far less prepared. The <a href="https://waymo.com/blog/2025/12/autonomously-navigating-the-real-world" rel="noopener noreferrer" target="_blank">2025 San Francisco power outage</a> left Waymo vehicles frozen in traffic lanes, blocking first responders and creating hazards. These vehicles are supposed to perform “minimum-risk maneuvers” such as pulling to the side—but many of them didn’t. This suggests gaps in contingency planning and basic fail-safe design.</p><div class="horizontal-rule"></div><p><span>The history of military drone operations offers crucial lessons for the self-driving car industry. Decades of experience show that remote supervision demands extremely low latency, carefully designed control stations, manageable operator workload, rigorous and well-designed training programs, and strong contingency planning.</span></p><p>Self-driving companies appear to be repeating many of the early mistakes made in drone programs. Remote operations are treated as a support feature rather than a mission-critical safety system. But as long as AI struggles with uncertainty, which will be the case for the foreseeable future, remote human supervision will remain essential. The military learned these lessons through painful trial and error, yet the self-driving community appears to be ignoring them. The self-driving industry has the chance—and the responsibility—to learn from our mistakes in combat settings before those mistakes harm road users everywhere.</p><p>A full paper on this topic will be presented at the <a href="https://www.ieeesmc.org/ichms-2026/" target="_blank">2026 IEEE International Conference on Human-Machine Systems (ICHMS)</a> meeting in Singapore in July.</p>]]></description><pubDate>Mon, 02 Mar 2026 12:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/military-drones-self-driving-cars</guid><category>Drones</category><category>Military-robots</category><category>Self-driving-cars</category><dc:creator>Missy Cummings</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/silhouette-from-the-back-of-an-adults-head-as-they-look-at-two-monitors-one-screen-displays-a-drone-and-the-other-shows-self-d.jpg?id=65098234&amp;width=980"/></item><item><title>Video Friday: Robot Dogs Haul Produce From the Field</title><link>https://spectrum.ieee.org/quadruped-farming-robots</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/quadruped-robots-with-crates-on-their-backs-carry-produce-on-a-path-amidst-lush-leafy-green-crops.png?id=65095903&width=1245&height=700&coordinates=0%2C54%2C0%2C55"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="vzjcvzyi2wq"><em>Our robots Lynx M20 help transport harvested crops in mountainous farmland—tackling the rural “last mile” logistics challenge.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ba185ed737063e503a9255c4a1cfd96d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VzjcvzYi2WQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="eqpyvr-b7hc">Once again, I would point out that now that we are reaching peak humanoid robots doing humanoid things, we are inevitably about to see humanoid robots doing nonhumanoid things.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="07f6309039fd04258b0e4abdcfae0617" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/eQpyvR-B7hc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="g9oyrplrig8"><em>In a study, a team of researchers from the Max Planck Institute for Intelligent Systems, the University of Michigan, and Cornell University show that groups of <a data-linked-post="2650267091" href="https://spectrum.ieee.org/magnetic-microbots-to-fight-cancer" target="_blank">magnetic microrobots</a> can generate fluidic forces strong enough to rotate objects in different directions without touching them. 
These microrobot swarms can turn gear systems, rotate objects much larger than the robots themselves, assemble structures on their own, and even pull in or push away many small objects.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="316646fb51cfe112cec7b3c3839dec72" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/G9oYrPLRIG8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/sciadv.aea9947">Science</a> ] via [ <a href="https://is.mpg.de/en/news/magnetic-microrobot-swarms-enable-contactless-manipulation-of-objects-through-fluidic-torque">Max Planck Institute</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="klhx6qfrzes"><em>Bipedal—or two-legged—autonomous robots can be quite agile. This makes them useful for performing tasks on uneven terrain, such as carrying equipment through outdoor environments or performing maintenance on an oceangoing ship. However, unstable or unpredictable conditions also increase the possibility of a robot wipeout. Until now, there’s been a significant lack of research into how a robot recovers when its direction shifts—for example, a robot losing balance when a truck makes a quick turn. The team aims to fix this research gap.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="de95bdf7f06151477581b83f9cd1d146" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/klhX6qFRZEs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.gatech.edu/news/2026/02/18/humanoid-robots-make-confident-strides-toward-walking-stability">Georgia Tech</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="u7tsdb4nuge"><em>Robotics is about controlling energy, motion, and uncertainty in the real world.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d6071f0de516f1571a9e1cc180c0f753" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U7TSDb4NugE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cs.cmu.edu/~16311/current/labs/lab01/index.html">Carnegie Mellon University</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fftskxohrxm"><em>Delicious dinner cooked by our robot Robody. 
We’ve asked our investors to speak about why they’re along for the ride.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e7e0e84ad77fb254d7071021f9b5677e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FfTSKxOhrxM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.devanthro.com/">Devanthro</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="7oc55almc4u"><em>Tilt-rotor aerial robots enable omnidirectional maneuvering through thrust vectoring, but introduce significant control challenges due to the strong coupling between joint and rotor dynamics. This work investigates reinforcement learning for omnidirectional aerial motion control on overactuated tiltable quadrotors that prioritizes robustness and agility.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4141e9b6c30d50877b06af77023c3bab" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7oc55aLMC4U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://zwt006.github.io/posts/BeetleOmni/">Dragon Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="3fzhnaoajae"><em>At the [Carnegie Mellon University] Robotic Innovation Center’s 75,000-gallon water tank, members of the TartanAUV student group worked to further develop their autonomous underwater vehicle (AUV) called Osprey. The team, which takes part in the annual <a data-linked-post="2650255401" href="https://spectrum.ieee.org/build-your-own-undersea-robot" target="_blank">RoboSub</a> competition sponsored by the U.S. Office of Naval Research, is composed primarily of undergraduate engineering and robotics students.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d138043667221480117b9978bd790ef3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3fzhNAoAjaE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cmu.edu/news/stories/archives/2026/february/cmus-robotics-innovation-center-propels-research-from-deep-sea-to-space">Carnegie Mellon University</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="1one4l_pghw">Sure seems like the only person who would want a robot dog is a person who does not in fact want a dog.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c384213c6605abee52a4b285faed7d20" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1ONE4l_pgHw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Compact size, industrial capability. Maximum torque of 90 N·m, over 4 hours of no-load runtime, IP54 rainproof design. With a 15-kg payload, range exceeds 13 km. 
Open secondary development, empowering industry applications.</em></blockquote><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="flmtgad-snu">If your robot video includes tasty baked goods it <strong><em>will</em></strong> be included in Video Friday.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bc37f01ece7f72aae9e972b63ed5f39d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fLMTgAD-SNU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://qbrobotics.com/">QB Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hdktpcuzwli"><em>Astorino is a 6-axis educational robot created for practical and affordable teaching of robotics in schools and beyond. It has been created with 3D printing, so it allows for experimentation and the possible addition of parts. With its design and programming, it replicates the actions of industrial robots, giving students the necessary skills for future work.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9175161ed153e8deaaa1697322641517" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HDKtpcUzwLI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://astorino.com.pl/en/">Astorino by Kawasaki</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zdqvhaoagcu">We need more <a data-linked-post="2668536834" href="https://spectrum.ieee.org/autonomous-vehicles-great-at-straights" target="_blank">autonomous driving datasets</a> that accurately reflect how sucky driving can be a lot of the time.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="32bf318792e72e34dbc1d1ef25b3d572" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zDQVhAOagcU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://asrl.utias.utoronto.ca/">ASRL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="g-g0wl_tqw4">This Carnegie Mellon University Robotics Institute Seminar is by CMU’s own Victoria Webster-Wood, on “Robots as Models for Biology and Biology as Materials for Robots.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5ad1f7c8448c962c5dc5ae76dffc81e9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/G-G0wL_TqW4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>In the last century, it was common to envision robots as shining metal structures with rigid and halting motion. This imagery is in contrast to the fluid and organic motion of living organisms that inhabit our natural world. 
The adaptability, complex control, and advanced learning capabilities observed in animals are not yet fully understood, and therefore have not been fully captured by current robotic systems. Furthermore, many of the mechanical properties and control capabilities seen in animals have yet to be achieved in robotic platforms. In this talk, I will share an interdisciplinary research vision for robots as models for neuroscience and biology as materials for robots.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/robots-as-models-for-biology-and-biology-and-materials-for-robots/">CMU RI</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 27 Feb 2026 18:00:55 +0000</pubDate><guid>https://spectrum.ieee.org/quadruped-farming-robots</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Swarm-robotics</category><category>Quadruped-robots</category><category>Farm-robots</category><category>Bipedal-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/quadruped-robots-with-crates-on-their-backs-carry-produce-on-a-path-amidst-lush-leafy-green-crops.png?id=65095903&amp;width=980"/></item><item><title>Perseverance Smashes Autonomous Driving Record on Mars</title><link>https://spectrum.ieee.org/perseverance-mars-rover-autonomous-driving</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-self-portrait-captured-by-nasa-s-perseverance-rover-while-traversing-mars-rocky-surface.jpg?id=65007226&width=1245&height=700&coordinates=0%2C156%2C0%2C157"/><br/><br/><p><em>This article is part of our exclusive <a href="https://spectrum.ieee.org/collections/journal-watch/" target="_self">IEEE Journal Watch series</a> in partnership with <a href="https://spectrum.ieee.org/tag/ieee-xplore" target="_self">IEEE Xplore</a>.</em></p><p>In past missions to Mars, like with the <a href="https://spectrum.ieee.org/nasa-mars-curiosity-rover-autonomous-driving-mode" target="_blank"><em>Curiosity</em></a> and <em>Opportunity</em> rovers, the robots relied mostly on human instructions from millions of miles away in order to safely navigate the Martian landscape. The <em>Perseverance</em> rover, on the other hand, has zipped across the alien, boulder-ridden land almost completely autonomously, smashing previous records for autonomous driving on Mars. </p><p>Whereas the <em>Curiosity</em> rover completed about 6.2 percent of its travels autonomously,<strong> </strong><em>Perseverance</em> had completed about 90 percent of its travels autonomously, as of its 1,312th Martian day since landing (28 October 2024). <em>Perseverance</em> was able to accomplish such a feat<span>—</span>using remarkably little computing power<span>—</span>thanks to its specially designed autonomous driving algorithm, Enhanced Autonomous Navigation, or ENav. </p><p>The full details on ENav’s inner workings and how well it has performed on Mars are described in a <a href="https://ieeexplore.ieee.org/document/11265757" target="_blank">study</a> published in <a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=10495159" target="_blank"><em>IEEE Transactions on Field Robotics</em></a> in November 2025. </p><p>There are some advantages, but some serious challenges when it comes to autonomous navigation on Mars. On the plus side, almost nothing on the planet moves. Rocks and gravel slopes—while formidable obstacles—remain stationary, offering rovers consistency and predictability in their calculations and pathfinding. On the other hand, Mars is in large part uncharted terrain. </p><p>“This enormous uncertainty is the major challenge,” says <a href="https://www-robotics.jpl.nasa.gov/who-we-are/people/masahiro_ono/" target="_blank">Masahiro “Hiro” Ono</a>, supervisor of the Robotic Surface Mobility Group at NASA’s Jet Propulsion Laboratory, who helped develop ENav.</p><h2>Creating a Highly Autonomous Rover </h2><p>While some images from the space-borne Mars Reconnaissance Orbiter exist, these are usually not high enough resolution for ground-based navigation by a rover. In December, NASA engineers performed the first <a href="https://spectrum.ieee.org/perseverance-rover-nasa-anthropic-ai" target="_self">test of a navigation technique</a> that uses a model based on Anthropic’s AI to analyze MRO images and generate waypoints—the coordinates used to guide the rover’s path—for more complete automation. </p><p class="ieee-inbody-related">RELATED: <a href="https://spectrum.ieee.org/perseverance-rover-nasa-anthropic-ai" target="_blank">NASA Let AI Drive the Perseverance Rover</a></p><p><span>But in the majority of today’s navigation, <em>Perseverance</em> must rely on images the rover itself takes, analyze these to assess thousands of different paths, and choose the right route that won’t end in its own demise. The kicker? 
It must do so with the equivalent computing capacity of an </span><a href="https://en.wikipedia.org/wiki/IMac_G3" target="_blank">iMac G3</a><span>, an Apple computer sold in the late 1990s.</span></p><p><span></span>The rover’s processor must undergo <a href="https://spectrum.ieee.org/europa-clipper" target="_blank">radiation hardening</a>, a process that makes them resilient to the extreme levels of solar radiation and cosmic rays experienced on Mars. Although other radiation-hardened CPUs with more computing power were available at the time of <em>Perseverance</em><span>‘s development, the one used is proven to be reliable in the harsh conditions of outer space. By reusing hardware from previous missions—the same CPU was used in <em>Curiosity</em>—NASA can reduce costs while minimizing risk.</span></p><p>Given its limited computing resources, the ENav algorithm was strategically designed to do the heaviest computing only when driving on challenging terrains. It works by analyzing images of its surroundings and assessing about 1,700 possible paths forward, typically within 6 meters from the rover’s current position. Assessing factors such as travel time and terrain roughness, it ranks possible paths. Finally, it runs a computationally heavy collision-checking algorithm, called ACE (approximate clearance estimation) on only on a handful of top-ranked potential paths. </p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="The driving path of NASA's Perseverance rover across Mars' surface, spanning 18.65 miles." class="rm-shortcode" data-rm-shortcode-id="e6b5e0be83268493f2e1836c913c6174" data-rm-shortcode-name="rebelmouse-image" id="e6dbd" loading="lazy" src="https://spectrum.ieee.org/media-library/the-driving-path-of-nasa-s-perseverance-rover-across-mars-surface-spanning-18-65-miles.jpg?id=65007264&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">As of October 2024, Perseverance has driven more than 30 kilometers (18.65 miles) and collected 24 samples of rock and regolith. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Source:  JPL-Caltech/ASU/MSSS/NASA</small></p><h2>Exploring the Red Planet with ENav</h2><p><em>Perseverance</em> landed on Mars on 18 February 2021. In their study, Ono and his colleagues describe how the rover was initially deployed with strong human navigation oversight during its first 64 Martian days on the Red Planet, but then went on to predominantly use ENav to travel to one of the major exploration targets: the delta formed by an ancient river that once flowed into Jezero Crater billions of years ago. Scientists believe it could be a prime spot for finding evidence of past alien life, if life ever existed on Mars.</p><p>After a brief exploration of an area southwest of its landing site, <em>Perseverance</em> jetted counterclockwise around sand dunes toward the ancient river delta at a crisp pace, averaging 201 meters per Martian day. (It’s too cold for the rover to travel at night.) Over the course of just 24 Martian days of driving, the rover traveled about 5 kilometers into the foothill of the delta. 95 percent of all driving that month was performed using the autonomous driving mode, resulting in an unprecedented amount of autonomous driving on Mars.</p><p>Past rovers, such as <em>Curiosity</em>, had to stop and “think” about their paths before moving forward. 
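<p>To picture how ENav’s select-then-verify budgeting works, here is a minimal, hypothetical sketch in Python—an illustration of the general pattern described earlier, not JPL’s flight code. The function names, cost weights, and check budget are all assumptions.</p><pre><code>import heapq

ACE_BUDGET = 10  # hypothetical cap on expensive clearance checks per cycle

def heuristic_cost(path):
    # Cheap ranking signal: travel time plus a terrain-roughness penalty.
    return path["travel_time"] + 2.0 * path["roughness"]

def ace_clearance_ok(path):
    # Stand-in for an ACE-style check: verify the rover body clears
    # every obstacle along the path. This is the expensive step.
    return all(c > 0.0 for c in path["clearances"])

def select_path(candidate_paths):
    # Rank all ~1,700 candidates with the cheap heuristic, then run the
    # costly clearance check on only the top few.
    ranked = heapq.nsmallest(ACE_BUDGET, candidate_paths, key=heuristic_cost)
    for path in ranked:
        if ace_clearance_ok(path):
            return path
    return None  # no verified-safe path: stop and replan
</code></pre><p>Past rovers, such as <em>Curiosity</em>, had to stop and “think” about their paths before moving forward. 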
“That was the main speed bump for <em>Curiosity</em>, why it was so slow to drive autonomously,” Ono explains. </p><p>In contrast, <em>Perseverance</em> is able to think and drive at the same time. “Sometimes [<em>Perseverance</em>] has to stop and think, particularly when it cannot figure out a safe path quickly. But most of the time, particularly on easy terrains, it can keep driving without stopping,” Ono says. “That made its autonomous driving an order of magnitude faster.”</p><p><em>Opportunity</em> held the previous record for autonomous driving on Mars, traveling 109 meters in a single Martian day. But on 3 April 2023, <em>Perseverance</em> set a new record by driving 331.74 meters autonomously (and 347.69 meters in total) in a single Martian day. </p><p>Ono says that fine-tuning the ENav algorithm took a lot of work, but he is happy with its performance. He also emphasizes that efforts to continue advancing autonomous navigation are critical if humans want to continue exploring even deeper into space, where Earthly communication with rovers and other spacecraft will become increasingly difficult.</p><p>“The automation of the space systems is unstoppable direction that we have to go if we want to explore deeper in space,” Ono says. “This is the direction that we must go to push the boundaries and frontiers of space exploration.”</p><p><em>This article was updated on 27 February to clarify NASA’s reasoning for selecting the CPU used in the </em>Perseverance<em> rover.</em></p><p><em>This article appears in the May 2026 print issue as “</em>Perseverance<em> Masters Self-Driving on Mars.”</em></p>]]></description><pubDate>Wed, 25 Feb 2026 15:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/perseverance-mars-rover-autonomous-driving</guid><category>Mars</category><category>Perseverance-rover</category><category>Autonomous-robots</category><category>Journal-watch</category><dc:creator>Michelle Hampson</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-self-portrait-captured-by-nasa-s-perseverance-rover-while-traversing-mars-rocky-surface.jpg?id=65007226&amp;width=980"/></item><item><title>Video Friday: Humanoid Robots Celebrate Spring</title><link>https://spectrum.ieee.org/robot-martial-arts</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/five-humanoid-robots-in-red-vests-perform-synchronized-movements-on-a-shiny-stage.png?id=64966934&width=1245&height=700&coordinates=87%2C0%2C87%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="mumlv814ajo">So humanoid robots are nearing peak human performance. I would point out, though, that this is likely very far from <a data-linked-post="2666662286" href="https://spectrum.ieee.org/humanoid-robots" target="_blank">peak robot performance</a>, which has yet to be effectively exploited, because it requires more than just copying humans.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="024bb0c442a4f9a923cb9d2287d65cd3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mUmlv814aJo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tvy0henpoto"><em>“The Street Dance of China”: Turning lightness into gravity, and rhythm into impact.This is a head-on collision between metal and beats. This Chinese New Year, watch PNDbotics Adam bring the heat with a difference.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9d2fff7bb612a25b39211999f00444b3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Tvy0HenPoTo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zg286rri750">You had me at robot pandas.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fedfa68ba524b95536171d927a20a6af" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zG286rRI750?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.magiclab.top/en/">MagicLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="koftfrgo4zs"><em>NASA’s <a data-linked-post="2675264402" href="https://spectrum.ieee.org/perseverance-rover-nasa-anthropic-ai" target="_blank">Perseverance rover</a> can now precisely determine its own location on Mars without waiting for human help from Earth. This is possible thanks to a new technology called Mars global localization. 
This technology rapidly compares panoramic images from the rover’s navigation cameras with onboard orbital terrain maps. It’s done with an algorithm that runs on the rover’s helicopter base station processor, which was originally used to communicate with the Ingenuity Mars helicopter. In a few minutes, the algorithm can pinpoint Perseverance’s position to within about 10 inches (25 centimeters). The technology will help the rover drive farther autonomously and keep exploring. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="401bcf4da451bde0e05bb8995d292bd6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KofTfRGO4Zs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.jpl.nasa.gov/news/nasas-perseverance-now-autonomously-pinpoints-its-location-on-mars/">NASA Jet Propulsion Laboratory</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="yjmfc4p-u60">Legs? Where we’re going, we don’t need legs!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d5d2299bc6edac5ee7dff1d6e24eef8d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yJmFc4p-U60?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/aisy.202500270">Paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ojioedp1mke">This is a bit of a tangent from robotics, but it gets a pass because of the cute jumping spider footage.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ffa1eda2a0ffbdd35216c4d08e332cee" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OjIoEDP1mKE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://als.lbl.gov/">Berkeley Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="7vuvvqap5vq"><em>Corvus One for Cold Chain is engineered to live and operate in freezer environments permanently, down to<br/> –20 °F, while maintaining full-flight and barcode-scanning performance.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="72c26bad197919021cd10399958457a7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7vuVVQAP5VQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I am sure there is an excellent reason for putting a cold-storage facility in the Mojave Desert.</p><p>[ <a href="https://www.corvus-robotics.com/">Corvus Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="plielbq6bzc"><em>The video documents the current progress made in the picking rate of the Shiva robot when picking strawberries. 
It first shows the previous status, then the further development, and finally the field test.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0cf52bc202c0d15d39ffa28de952a72d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PLIELBq6bZc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotik.dfki-bremen.de/de/forschung/projekte/roland">DFKI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mh4nlgi64mm"><em>Data powers an organization’s digital transformation, and ST Engineering MRAS is leveraging Spot to get a full view of critical equipment and a facility. Working autonomously, Spot collects information about machine health—and now, thanks to an integration of the Leica BLK ARC for reality capture, detailed and accurate point-cloud data for their digital twin.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="10813bef7dc9bee7221fe4245caf64c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mH4NLGI64MM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/case-studies/spot-at-st-engineering-mras/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="fhk28r9j-dq">The title of this video is “Get out and have fun!” Is that mostly what humanoid robots are good for right now, pretty much...?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="98b64102d3af8082637898f2bb44507d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fhK28R9J-DQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://en.engineai.com.cn/">Engine AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mey8uw65yw8"><em>Astorino is a modern six-axis <a data-linked-post="2666268821" href="https://spectrum.ieee.org/3d-printed-robot-hand" target="_blank">robot based on 3D-printing</a> technology. 
Programmable in AS language, the robot facilitates the preparation of classes with ready-made teaching materials, is easy both to use and to repair, and gives the opportunity to learn and make mistakes without fear of breaking it.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2343d89a61472a8c3764df56a794d75f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MeY8Uw65YW8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://astorino.com.pl/en/">Kawasaki</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="clw5-nkw1xs">Can I get this in my living room?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="412804fd295fdadb519e7277b046f10a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/clw5-NkW1xs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.yaskawa-global.com/centenary/robot">Yaskawa</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="-g8q-nx5fce"><em>What does it mean to build a humanoid robot in seven months, and the next one in just five? This documentary takes you behind the scenes at Humanoid, a U.K.-based AI and robotics company building reliable, safe, and  helpful humanoid robots. You’ll hear directly from our engineering, hardware, product, and other teams as they share their perspectives on the journey of turning physical AI into reality.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a984d29bd7b9e36f3603259780ba8687" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-G8q-NX5FCE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://thehumanoid.ai/">Humanoid</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ckr6h6vsuz8">This IROS 2025 keynote is from <a data-linked-post="2650279622" href="https://spectrum.ieee.org/darpa-tim-chung-on-subt-challenge-urban-circuit" target="_blank">Tim Chung</a>—now at Microsoft—on catalyzing the future of human, robot, and AI agent teams in the physical world.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="71d714793cd8568276d695e1a470b03e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ckr6h6vSuz8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The convergence of technologies—from foundation AI models to diverse sensors and actuators to ubiquitous connectivity—is transforming the nature of interactions in the physical and digital world. People have accelerated their collaborative connections and productivity through digital and immersive technologies, no longer limited by geography or language or access. 
Humans have also leveraged and interacted with AI in many different forms, with the advent of hyperscale AI models (that is, large language models) forever changing (and at an ever-astonishing pace) the nature of human–AI teams, realized in this era of the AI “copilot.” Similarly, robotics and automation technologies now afford greater opportunities to work with and/or near humans, allowing for increasingly collaborative physical robots to dramatically impact real-world activities. It is the compounding effect of enabling all three capabilities, each complementary to one another in valuable ways, and we envision the triad formed by human–robot–AI teams as revolutionizing the future of society, the economy, and technology.</em></blockquote><p>[ <a href="https://www.iros25.org/KeynoteSessions.html">IROS 2025</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="3dgvyq66yhm">This GRASP SFI talk is by Chris Paxton at Agility Robotics: “How Close Are We to Generalist Humanoid Robots?”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d4595fc1a5a1e20ee73a48779ac78576" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3DgvYq66YhM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>With billions of dollars of funding pouring into robotics, general-purpose humanoid robots seem closer than ever. And certainly it feels like the pace of robotics is faster than ever, with multiple companies beginning large-scale deployments of humanoid robots. In this talk, I’ll go over the challenges still facing scaling robot learning, looking at insights from a year of discussions with researchers all over the world.</em></blockquote><p>[ <a href="https://www.grasp.upenn.edu/events/spring-2026-grasp-sfi-chris-paxton/">University of Pennsylvania GRASP Laboratory</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ry8itipzbfe">This week’s Carnegie Mellon University Robotics Institute Seminar is from Jitendra Malik at University of California, Berkeley: “Robot Learning, With Inspiration From Child Development.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8f92598b30d5a78a3c0f4c212b77230a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ry8itipzBFE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>For intelligent robots to become ubiquitous, we need to “solve” locomotion, navigation, and manipulation at sufficient reliability in widely varying environments. In locomotion, we now have demonstrations of humanoid walking in a variety of challenging environments. In navigation, we pursued the task of “Go to Any Thing”: A robot, on entering  a newly rented Airbnb, should be able to find objects such as TV sets or potted plants. RL in simulation and sim-to-real have been workhorse technologies for us, assisted by a few technical innovations. 
I will sketch promising directions for future work.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/robot-learning-with-inspiration-from-child-development/">Carnegie Mellon University Robotics Institute</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 20 Feb 2026 18:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/robot-martial-arts</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Agility-robotics</category><category>Perseverance-rover</category><category>Insect-robots</category><category>Industrial-robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/five-humanoid-robots-in-red-vests-perform-synchronized-movements-on-a-shiny-stage.png?id=64966934&amp;width=980"/></item><item><title>Tech Is Taking Over Olympic Curling</title><link>https://spectrum.ieee.org/olympics-curling-robot-ai</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/curling-players-sweeping-a-red-stone-on-ice-motion-blur-emphasizes-speed-and-action.jpg?id=64953312&width=1245&height=700&coordinates=0%2C62%2C0%2C63"/><br/><br/><p>At this year’s <a href="https://spectrum.ieee.org/winter-olympics-2026-tech" target="_blank">Winter Olympics in Italy</a>, the controversy began with a fingertip.</p><p>A disputed double-touch—whether a curler had <a href="https://www.nytimes.com/athletic/7045743/2026/02/13/curling-canada-sweden-marc-kennedy-cheating/" target="_blank">brushed a moving stone twice</a>—sparked protests, profanity-laced exchanges, and heated debate about sportsmanship. In a game that prides itself on mutual trust and the idea of competition as a shared test of skill, even the suggestion of impropriety can ripple far beyond a single end.</p><p>But if a double-touch can shake the sport, what happens when the controversy isn’t about a fingertip but an algorithm?</p><p>That’s the question shadowing the rise of analytics driven by machine learning and a new breed of AI-powered robots that can throw stones, read the ice, and calculate strategy with machine precision.</p><p class="ieee-inbody-related">RELATED: <a href="https://spectrum.ieee.org/winter-olympics-2026-tech" target="_blank">Milan-Cortina Winter Olympics Debut Next-Generation Sport Smarts</a></p><p>Some of these robots, such a “<a href="https://www.smithsonianmag.com/smart-news/curly-curling-robot-can-beat-pros-their-own-game-180975951/" target="_blank">Curly</a>,” have already toppled elite human opponents in head-to-head competitions. Others, engineered either to replicate the biomechanics of human shot delivery or to fire stones consistently with repeatable speed and rotation, are transforming the sport by dissecting technique and strategy with a level of rigor no coach with a stopwatch could match.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="7fb2199a88bd9f3a0f97671101939e8c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uj3ur1uW-7Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">Seen here in action, the two-part robot system named Curly made its debut in 2018 ahead of that year’s Paralympic Winter Games in Pyeongchang.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/@TUBerlinTV" rel="noopener noreferrer" target="_blank">TUBerlinTV/YouTube</a></small></p><p>“The amount of innovation I’m seeing is just tremendous,” says <a href="https://glennpaulley.ca" rel="noopener noreferrer" target="_blank">Glenn Paulley</a>, a retired computer scientist who now runs Throwing Rocks Consulting Services, where he coaches curlers and advises teams on analytics.</p><p><a href="https://www.tabletmag.com/sections/sports/articles/israeli-curling" target="_blank">Fueled by investments from governments</a> and sporting bodies around the world, the pursuit of a competitive edge has escalated into a data-driven push for marginal gains ahead of each Olympic cycle. 
“They’re trying like crazy to elevate their national team programs,” Paulley says, “and they’re doing it in every way possible.” <span>By the time medals are handed out in Cortina d’Ampezzo this weekend, the imprint of this full-throttle tech offensive could be etched into every sheet of ice.</span></p><p>Yet, as algorithms begin suggesting shots, the contours of fair play blur. Regulators and coaches alike are grappling with where to draw the line. And as top curlers lean more into AI and robotic systems, some fear the loss of something fundamental: the quiet, hard-earned feel for ice that separates veterans from novices.</p><p>“It’s a big debate!” says <a href="https://en.wikipedia.org/wiki/Emily_Zacharias" target="_blank">Emily Zacharias</a>, a former elite curler from Manitoba who captured gold representing Canada at the 2020 World Junior Curling Championships.</p><p>Three decades ago, Garry Kasparov sat across from IBM’s Deep Blue and discovered that even the most cerebral of games could be <a href="https://spectrum.ieee.org/how-ibms-deep-blue-beat-world-champion-chess-player-garry-kasparov" target="_blank">unsettled by silicon</a>. Curling, long called “<a href="https://chessonice.ca/" target="_blank">chess on ice</a>,” may now be entering its own version of that reckoning.</p><h2>Can New Tech Comply With the “Spirit of Curling”?</h2><p>Curling has been at this kind of crossroads before. A decade back, the sweeping-fabric controversy known as “<a href="https://www.cbc.ca/listen/cbc-podcasts/1427-broomgate-a-curling-scandal" target="_blank"><span>Broomgate</span></a>” triggered accusations of <a href="https://spectrum.ieee.org/motor-doping-cycling" target="_blank">technological doping</a>, a dispute that tore at the heart of the sport’s ethos of trust and bonhomie.</p><p>The World Curling Federation responded by clamping down on brush materials, but AI now poses a broader challenge. It is not just a better broom but a decision engine, capable of shifting authority from a player’s judgment in the “<a href="https://www.curlingbasics.com/en/images/z_house_03.png" target="_blank">house</a>” to a model running in the cloud.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A hexapod robot stands on a demonstration curling course " class="rm-shortcode" data-rm-shortcode-id="58f9a3f0bebfe5952e925b87a6f208f8" data-rm-shortcode-name="rebelmouse-image" id="44f61" loading="lazy" src="https://spectrum.ieee.org/media-library/a-hexapod-robot-stands-on-a-demonstration-curling-course.jpg?id=64957715&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The six-legged “hexapod” curling robot is displayed at the World Robot Conference 2022 in Beijing, where that year’s Olympic Games were also held.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Anna Ratkoglo/Sputnik/AP</small></p><p>It’s a prospect that unsettles some athletes and ethicists, who worry about what gets lost as optimization tightens its grip on a sport long governed by the so-called <a href="https://www.olympics.com/en/video/the-spirit-of-curling" target="_blank">Spirit of Curling</a>, an unwritten code of integrity, fairness, and respect.</p><p>“We’re at a point now where just about everything that we used to hold up as uniquely human is now being eroded by technology—and we feel a loss,” says <a href="https://www.craiedl.ca/team/jason-millar" target="_blank">Jason Millar</a>, who runs the Canadian Robotics and AI Ethical Design Lab at the University of Ottawa.</p><p>“The AI doesn’t care,” he adds. “There’s no ‘spirit’ there.”</p><h2>Building Rock-Solid Curling Robots</h2><p>The Curly robot first made waves in 2018 when, ahead of that year’s Paralympic Winter Games in Pyeongchang, engineers at Korea University, in Seoul, <a href="https://www.youtube.com/watch?v=uj3ur1uW-7Q" target="_blank">unveiled the AI-powered device</a>—or, rather, two coordinated devices, a pair of “skip” and “thrower” units, designed to read the ice and deliver stones.</p><p>Driven by a physics-based simulator and an adaptive deep-reinforcement-learning framework, the robot didn’t simply replay preprogrammed shots. It learned from its own misses, updated its aim based on the distance gaps between intended and actual stone positions, and factored in the cumulative wear of pebbled ice as a match unfolded.</p>
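<p>A rough, purely illustrative sketch of that kind of closed-loop correction—Curly’s actual deep-reinforcement-learning update is far more sophisticated than this—might track a running estimate of the systematic miss and aim past it. All coordinates, gains, and throws below are hypothetical.</p><pre><code>LEARNING_RATE = 0.3  # how much to trust each newly observed miss (assumed)

def update_bias(bias, intended, actual):
    """Exponential moving average of the miss vector (actual - intended)."""
    miss_x = actual[0] - intended[0]
    miss_y = actual[1] - intended[1]
    return (bias[0] + LEARNING_RATE * (miss_x - bias[0]),
            bias[1] + LEARNING_RATE * (miss_y - bias[1]))

def corrected_aim(target, bias):
    """Offset the aim point so the stone's systematic drift lands it on target."""
    return (target[0] - bias[0], target[1] - bias[1])

bias = (0.0, 0.0)
throws = [((0.0, 28.0), (0.3, 28.5)),  # (intended, actual), hypothetical meters
          ((1.0, 27.0), (1.2, 27.4))]
for intended, actual in throws:
    bias = update_bias(bias, intended, actual)
print(corrected_aim((0.0, 28.0), bias))  # aim adjusted for estimated drift
</code></pre><p>That capacity was put to the test in a series of mini-games against top-ranked Korean athletes. As reported in the journal <em><a href="https://www.science.org/doi/10.1126/scirobotics.abb9764" target="_blank">Science Robotics</a></em>, Curly started slow, dropping the opening match as it calibrated to the live ice. 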
But it then went on to win the next three contests, demonstrating what its creators called “human-level performance” under real-world conditions.</p><p>The next Winter Olympics—the <a href="https://spectrum.ieee.org/carbon-neutral-winter-olympics" target="_blank">Beijing 2022 Games</a>—then brought a more agile machine: <a href="https://doi.org/10.1016/j.eng.2023.10.018" target="_blank">a “hexapod” curling robot</a> built to walk, align, and throw like a human curler.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="4313dd37d8c0249258f89615b9114690" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IXQ7MjwdZ3A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">With six legs, the hexapod robot can act more like a human curler when launching the stone, putting a new spin on curling-robot tech.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/watch?v=IXQ7MjwdZ3A" target="_blank">FlyingDumplings/YouTube</a></small></p><p>With its six-legged gait for stable traction and flexibility on the ice, the robot could pivot at the “hack,” the rubber foothold curlers use to launch their delivery. From there, the hexapod set its angle, kicked off, and glided on a skateboard-like undercarriage before releasing the stone, imparting competition-level spin.</p><p>Equipped with lidar and cameras, the robot scanned the sheet to map stone positions and fed those data into software that <a href="https://doi.org/10.1007/s11465-025-0835-5" target="_blank">calculated collision paths</a> and solved for the precise release parameters needed to execute a chosen strategy.</p><h2>Curling Bots Leave Broom for Improvement</h2><p>For all the technical prowess of Curly and the hexapod, one stubborn constraint remains: No robot can sweep—at least not yet.</p>
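<p>Without sweepers to coax a stone along, everything rides on those release parameters. As a back-of-the-envelope illustration—assuming, unrealistically, that the stone decelerates uniformly under a constant friction coefficient, and using made-up numbers—the release speed for a draw of a given length can be solved in a couple of lines:</p><pre><code>import math

G = 9.81  # gravitational acceleration, m/s^2

def release_speed(distance_m, mu):
    """Speed for a stone to coast distance_m before friction stops it.

    Assumes uniform deceleration a = mu * g, giving v = sqrt(2 * mu * g * d).
    Real pebbled ice is far less uniform, which is precisely why sweeping
    and ice-reading matter so much.
    """
    return math.sqrt(2.0 * mu * G * distance_m)

# Hypothetical draw: 28 meters of travel on ice with an effective mu of 0.008.
print(round(release_speed(28.0, 0.008), 2), "m/s")  # roughly 2.1 m/s
</code></pre><p>There are no <a href="https://spectrum.ieee.org/irobot-roomba-history" target="_self">Roomba</a>-like machines flanking the stone, frantically brushing to extend its travel or hold its line. 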
Once released, the robot’s shot is left to fate, untouched by the vigorous, broom-flailing choreography that so often determines whether a stone bites the button or drifts wide.</p><p>“These robots are leaving out a huge chunk of potential that humans are bringing to the game,” says <a href="https://umanitoba.ca/kinesiology-recreation-management/faculty-staff/steven-passmore-phd" target="_blank">Steven Passmore</a>, a human-movement scientist at the University of Manitoba in Winnipeg who, together with Zacharias, coauthored a <a href="https://doi.org/10.3389/fspor.2024.1291241" target="_blank">comprehensive review of the scientific literature</a> on curling.</p><p>At the time of their data cutoff, in 2021, they found nearly two dozen published studies about robotics, AI, and emerging tech in the sport. But as Zacharias points out, the most sophisticated tools shaping elite play often never appear in academic journals; they are developed behind closed doors and closely guarded as competitive secrets.</p><p>For her part, Zacharias—who <a href="https://stats.curling.io/players/zacharias-emily" target="_blank">competed at four Canadian women’s curling championships</a> between 2021 and 2024—says she never once practiced against a robot. But she has trained with a rock launcher, a mechanized delivery system that fires stones at precisely calibrated speeds and rotations, over and over.</p><p>By standardizing the throw, the device allows athletes to isolate how different sweeping techniques, brush-head fabrics, or ice temperatures alter a stone’s path, explains Paulley. “It means you can run repeated experiments in order to test the impact of different variables,” he says. “And in curling, there are <em>a lot</em> of variables.”</p><h2>Cutting-Edge Tech Helps Athletes Train</h2><p>In Japan, all these technologies and more are being explored in a government-backed initiative called <a href="https://xfuture.info/en/curling/" target="_blank">Curling of the Future</a>.</p><p>The program brings together university engineers, sporting agencies, and elite athletes to prototype delivery robots and sweep-assist machines, along with AI strategy engines, instrumented “smart stones,” and rock-launcher systems for controlled training. </p><p>“The core objective is elite performance: improving decision-making and the quality of training so that Japan can strengthen its competitiveness in international competition,” says <a href="https://www.fun.ac.jp/en/faculty/takegawa-yoshinari" target="_blank"><span>Yoshinari Takegawa</span></a>, an information scientist at the Future University Hakodate who is co-leading the project.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A set of photos shows a blonde woman in a wheelchair wearing a VR headset, and images of a VR curling environment." class="rm-shortcode" data-rm-shortcode-id="641c890960e19b2fb95aa6b42a14a6a0" data-rm-shortcode-name="rebelmouse-image" id="7c584" loading="lazy" src="https://spectrum.ieee.org/media-library/a-set-of-photos-shows-a-blonde-woman-in-a-wheelchair-wearing-a-vr-headset-and-images-of-a-vr-curling-environment.jpg?id=64953348&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Dylan Rusnak, a kinesiology student at Red Deer Polytechnic, contributed to the project by developing a VR system for curling. Rusnak wears a Meta Quest headset [left] while demonstrating the system, which shows athletes immersive views of the rink [right]. 
</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Red Deer Polytechnic</small></p><p>The technology push isn’t confined to Olympic play either. At the Paralympics next month, the Canadian national wheelchair curling squad will be coming primed with training sessions inside a full virtual replica of the Cortina Curling Olympic Stadium, courtesy of a <a href="https://spectrum.ieee.org/tag/virtual-reality" target="_blank">VR system</a> developed by mechanical engineer <a href="https://rdpolytech.ca/about-us/faculty/jennifer-dornstauder" target="_blank">Jennifer Dornstauder</a> and her students at Red Deer Polytechnic in Alberta. </p><p><del></del>The setup drops athletes into an immersive curling rink via a <a href="https://spectrum.ieee.org/tag/meta" target="_self">Meta</a> Quest headset, where they can look down and see virtual renderings of their legs, wheelchair, throwing stick, stones, and the ice surface beneath them.<del></del></p><p>According to <a href="https://paralympic.ca/news/coach-spotlight-mick-lizmore-reconnects-para-sport-wheelchair-curling-coach/" target="_blank">Mick Lizmore</a>, head coach of Canada’s National Wheelchair Curling Program, his team has used the VR to help visualize the venue where they will be competing and for group tactical training<strong></strong>, even when they can’t meet together in person. Beyond sharpening elite preparation, Dornstauder says, the same tool should help expand access to wheelchair curling for <a href="https://spectrum.ieee.org/tag/disability" target="_blank">people with disabilities</a> who face mobility challenges or limited ice availability.</p><p>“VR is just this amazing tool that is almost designed for getting around these barriers,” she says.</p><h2>Will Tech Change Curling?</h2><p>Many of the technologies entering curling are, in many ways, benign—tools for analysis, accessibility, and incremental refinement rather than wholesale disruption. A rock launcher standardizes practice. A VR headset extends rehearsal beyond the rink. A strategy engine offers probabilities, not ultimatums.<del></del><span><br/></span></p><p><span>Taken together, however, they reveal how thoroughly digital systems are seeping into every layer of the sport.</span></p><p>AI-powered sparring machines tuned to mimic a rival team’s tendencies, and thus capable of playing out fully simulated preparatory matches, remain a fantasy. National curling programs operate on tight budgets, limiting how far and how fast innovation can go. And even well-funded federations must balance software and robotics against coaching, travel, and ice time. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="A device shaped long a stretched letter H sits on a curling rink next to equipment." 
class="rm-shortcode" data-rm-shortcode-id="d1ad93daab6f9bd8fa1655e3c6b33c50" data-rm-shortcode-name="rebelmouse-image" id="2de6f" loading="lazy" src="https://spectrum.ieee.org/media-library/a-device-shaped-long-a-stretched-letter-h-sits-on-a-curling-rink-next-to-equipment.png?id=64953356&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Rock launchers provide a consistent throw to help athletes practice sweeping.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Sean Maw/University of Saskatchewan</small></p><p>Yet as money continues to flow into high-performance curling, those possibilities draw closer<em><em>.</em></em></p><p>“It’s probably just a matter of time,” says <a href="https://engineering.usask.ca/people/sopd/maw,sean.php" target="_blank">Sean Maw</a>, a sports engineer at the University of Saskatchewan who has <a href="https://harvest.usask.ca/items/7b2c095a-5f96-416e-86af-8ca6f073dc12" target="_blank">built rock launchers</a> and studies the complexities of curling<em><em></em></em>. </p><p>For now, the stones still leave human hands—hands capable of brilliance, instinct, and the occasional double-touch—and the final call still rests with the skip in the house. But the algorithms are edging closer to the button.</p>]]></description><pubDate>Wed, 18 Feb 2026 15:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/olympics-curling-robot-ai</guid><category>Robotics</category><category>Artificial-intelligence</category><category>Virtual-reality</category><category>Sports</category><category>Canada</category><category>Olympic-games</category><dc:creator>Elie Dolgin</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/curling-players-sweeping-a-red-stone-on-ice-motion-blur-emphasizes-speed-and-action.jpg?id=64953312&amp;width=980"/></item></channel></rss>