<?xml version="1.0" encoding="utf-8" standalone="no"?><rss xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:media="http://search.yahoo.com/mrss/" version="2.0"><channel><title>IEEE Spectrum</title><link>https://spectrum.ieee.org/</link><description>IEEE Spectrum</description><atom:link href="https://spectrum.ieee.org/feeds/type/video.rss" rel="self"/><language>en-us</language><lastBuildDate>Tue, 12 Nov 2024 01:11:48 -0000</lastBuildDate><image><url>https://spectrum.ieee.org/media-library/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNjg4NDUyMC9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTgyNjE0MzQzOX0.N7fHdky-KEYicEarB5Y-YGrry7baoW61oxUszI23GV4/image.png?width=210</url><link>https://spectrum.ieee.org/</link><title>IEEE Spectrum</title></image><itunes:explicit>no</itunes:explicit><itunes:subtitle>IEEE Spectrum</itunes:subtitle><itunes:category text="Technology"><itunes:category text="Tech News"/></itunes:category><item><title>Unitree Demos New $16k Robot</title><link>https://spectrum.ieee.org/unitree-g1</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://www.youtube.com/embed/B2pmDShvGOY?rel=0" width="100%"></iframe><br/><p>At ICRA 2024, <em>Spectrum</em> editor <a href="https://spectrum.ieee.org/u/evan-ackerman" target="_blank">Evan Ackerman</a> sat down with <a href="https://www.unitree.com/" target="_blank">Unitree</a> founder and CEO Xingxing Wang and Tony Yang, VP of Business Development, to talk about the company’s newest humanoid, the <a href="https://www.unitree.com/g1/" target="_blank">G1 model</a>. </p><p>Smaller, more flexible, and elegant, the G1 robot is designed for general use in service and industry, and is one of the cheapest—if not <em>the</em> cheapest—of a new wave of advanced <a href="https://spectrum.ieee.org/ai-robots" target="_blank">AI humanoid robots</a>.</p>]]></description><pubDate>Fri, 30 Aug 2024 17:01:06 +0000</pubDate><guid>https://spectrum.ieee.org/unitree-g1</guid><category>Humanoid-robots</category><category>Unitree</category><category>Robotics</category><dc:creator>IEEE Spectrum</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/ieee-spectrum.jpg?id=53562309&amp;width=980"/></item><item><title>DIY: Classic 555 Timer Kit</title><link>https://spectrum.ieee.org/discrete-555-timer</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://www.youtube.com/embed/ei7OKotML38?rel=0" width="100%"></iframe><br/><p>Follow along as we build and test one of our favorite kits of all time, the Discrete 555 Timer! Build a huge version of one of the most iconic and surprisingly versatile integrated circuits of all time from transistors and resistors. </p><p>The 555 chip has been used at one time or another by nearly every E.E. alive, and you can use it to detect pulses, make lights blink, debounce inputs, trigger alarms, and even make music (terrible music, but music nonetheless!). We first <a href="https://spectrum.ieee.org/build-your-own-giant-555-timer-chip" target="_blank">wrote up the kit</a> in our <a href="https://spectrum.ieee.org/type/hands-on/" target="_blank">Hands On column</a> in <em>Spectrum</em>, and this is the second version, which features some improvements over the original.</p><p><strong>Chapters:</strong></p><p><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=0s" target="_blank">00:00</a> Introduction and hello! 
<br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=39s" target="_blank">00:39</a> The 555 integrated circuit <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=65s" target="_blank">01:05</a> Inside the 555 <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=144s" target="_blank">02:24</a> Oscillator mode applications <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=186s" target="_blank">03:06</a> Origin of the 555 <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=208s" target="_blank">03:28</a> Unboxing the Discrete 555 timer kit <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=268s" target="_blank">04:28</a> Comparing the 555 kit to the 555 chip <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=321s" target="_blank">05:21</a> Soldering <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=354s" target="_blank">05:54</a> Kit creator Eric Schlaepfer and <a href="https://spectrum.ieee.org/open-circuits" target="_blank">his book “Open Circuits”</a> <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=420s" target="_blank">07:00</a> Problems with my jig! 
<br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=453s" target="_blank">07:33</a> The voltage divider <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=606s" target="_blank">10:06</a> Inside Tea <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=785s" target="_blank">13:05</a> Transistors <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=824s" target="_blank">13:44</a> Cleaning up the leads <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=884s" target="_blank">14:44</a> Silkscreen details <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=1243s" target="_blank">20:43</a> Radio Shack soldering iron magic goo <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=1374s" target="_blank">22:54</a> Getting the decorative legs on <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=1476s" target="_blank">24:36</a> 555 chip test circuit <br/><a href="https://www.youtube.com/watch?v=ei7OKotML38&t=1515s" target="_blank">25:15</a> Wiring in the 555 kit</p>]]></description><pubDate>Tue, 23 Jul 2024 14:01:00 +0000</pubDate><guid>https://spectrum.ieee.org/discrete-555-timer</guid><category>555</category><category>Hands-on</category><category>Integrated-circuits</category><category>Timer</category><category>Electronics-kits</category><category>Chip-design</category><dc:creator>Stephen Cass</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/ieee-spectrum.jpg?id=53562079&amp;width=980"/></item><item><title>360 Video: Zoom Over Zanzibar With Tanzania’s Drone Startups</title><link>https://spectrum.ieee.org/360-video-zoom-over-zanzibar-with-tanzanias-drone-startups</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://www.youtube.com/embed/hPChaC0wyd0?rel=0" width="100%"></iframe><br/><p>With 360-degree video, <em>IEEE Spectrum</em> puts you aboard drones that are flying high above the Tanzanian landscape: You’ll ride along as drones soar above farms, towns, and the blue expanse of Lake Victoria. You’ll also meet the local entrepreneurs who are creating a new industry, finding applications for their drones in land surveying and delivery. And you’ll get a close-up view from a bamboo grove as a drone pilot named Bornlove builds a flying machine from bamboo and other materials.</p> <p>You can follow the action in a 360-degree video in three ways: 1) Watch on your computer, using your mouse to click and drag on the video; 2) watch on your phone, moving the phone around to change your view; or 3) watch on a VR headset for the full immersive experience.</p> <p><em><strong>If you’re watching on an iPhone:</strong> <a href="https://www.youtube.com/watch?v=hPChaC0wyd0">Go directly to the YouTube page</a> for the proper viewing experience.</em></p> <p>For more stories of how drones are changing the game in Africa, check out <em>IEEE Spectrum</em>’s “<a href="https://spectrum.ieee.org/special-reports/east-africas-big-bet-on-drones/">Tech Expedition: East Africa’s Big Bet on Drones</a>.”</p>]]></description><pubDate>Thu, 09 May 2019 14:30:00 +0000</pubDate><guid>https://spectrum.ieee.org/360-video-zoom-over-zanzibar-with-tanzanias-drone-startups</guid><category>Type-video</category><category>Drones</category><category>Tanzania</category><category>East-africa-drones</category><category>Gadgets</category><category>Africa</category><category>Mapping</category><category>Delivery-drones</category><category>360-video</category><dc:creator>Eliza Strickland</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=27053036&amp;width=980"/></item><item><title>360 Video: Go on 
a Mission With Zipline’s Delivery Drones</title><link>https://spectrum.ieee.org/360-video-go-on-a-mission-with-ziplines-delivery-drones</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://www.youtube.com/embed/T9ZCi29mAiI?rel=0" width="100%"></iframe><br/><p>With 360 video, <em>IEEE Spectrum</em> takes you behind the scenes with one of the world’s first drone-delivery companies. <a href="https://flyzipline.com/">Zipline</a>, based in California, is using drones to deliver blood to hospitals throughout Rwanda. At an operations center in Muhanga, you’ll watch as Zipline technicians assemble the modular drones, fill their cargo holds, and launch them via catapult. You’ll see a package float down from the sky above a rural hospital, and you’ll get a closeup look at Zipline’s ingenious method for capturing returning drones.</p>
<p>You can follow the action in a 360-degree video in three ways: 1) Watch on your computer, using your mouse to click and drag on the video; 2) watch on your phone, moving the phone around to change your view; or 3) watch on a VR headset for the full immersive experience.</p>
<p><em><strong>If you’re watching on an iPhone:</strong> <a href="https://www.youtube.com/watch?v=T9ZCi29mAiI">Go directly to the YouTube page</a> for the proper viewing experience.</em></p>
<p>For more about Zipline’s technology and operations, check out the feature article “<a href="/robotics/drones/in-the-air-with-ziplines-medical-delivery-drones">In the Air With Zipline’s Medical Delivery Drones.</a>”</p>]]></description><pubDate>Mon, 06 May 2019 18:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/360-video-go-on-a-mission-with-ziplines-delivery-drones</guid><category>Type-video</category><category>Drones</category><category>East-africa-drones</category><category>Delivery-drones</category><category>360-video</category><category>Rwanda</category><category>Zipline</category><dc:creator>Eliza Strickland</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=27053035&amp;width=980"/></item><item><title>A Techie’s Tour of New York City</title><link>https://spectrum.ieee.org/a-techies-tour-of-nyc</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://www.youtube.com/embed/BhBlorTfFcE?rel=0" width="100%"></iframe><br/><p>
	Do your travel plans include New York City? Are you a techie? If the answer to those questions is yes, let <em>IEEE Spectrum</em> be your guide! We've put together a list of some of our favorite places to visit, including important locations in the history of electrotechnology (New York was <a href="/geek-life/reviews/review-nycs-computing-history-on-display">once the center</a> of the electrical and electronic world) and places where fun and interesting things are happening today. See where <a href="https://en.wikipedia.org/wiki/Nikola_Tesla">Nikola Tesla</a> lived, check out cutting-edge artists working with technology, or take the kids to see an Atlas and Titan rocket.
</p><p>
	All the locations are accessible via the <a href="https://www.mta.info">subway</a>, and many are free to visit. If you do visit, take a selfie and post a link in the comments below.
</p>]]></description><pubDate>Wed, 17 Oct 2018 15:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/a-techies-tour-of-nyc</guid><category>Type-video</category><category>New-york-city</category><category>Rockets</category><category>Hackerspace</category><category>Videos</category><category>Video</category><category>Nikola-tesla</category><category>Tourism</category><category>Nyc-tourist-video</category><category>Events</category><category>Nyc-tourist-guide</category><category>Tech-tour-nyc</category><category>History</category><category>Technology</category><category>Nyc-tech-tour</category><category>Diy</category><dc:creator>Stephen Cass</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-techie-s-tour-of-nyc.jpg?id=27387689&amp;width=980"/></item><item><title>Don Eyles: Space Hacker</title><link>https://spectrum.ieee.org/don-eyles-space-hacker</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://www.youtube.com/embed/SZpoKeIsrXY?rel=0" width="100%"></iframe><br/><p>In the early hours of 5 February 1971, <a href="https://www.doneyles.com/">Don Eyles</a> had a big problem: <a href="https://www.lpi.usra.edu/lunar/missions/apollo/apollo_14/">Apollo 14</a> astronauts Alan Shepard and Edgar Mitchell were orbiting the moon, preparing to land, but it looked like they were going to have to come home without putting so much as a single footprint on the surface. The only way to save the mission was for Eyles to hack his own software.</p><hr/> <p class="shortcode-media shortcode-media-rebelmouse-image rm-float-right rm-container-resized rm-resized-container-50">
<img alt="A book cover for 'Sunburst and Luminary' with a photograph depicting a spindly spacecraft above the lunar surface." class="rm-shortcode rm-resized-image" data-rm-shortcode-id="48c5a519b04aad1cace927d3305629f5" data-rm-shortcode-name="rebelmouse-image" id="881e9" loading="lazy" src="https://spectrum.ieee.org/media-library/a-book-cover-for-sunburst-and-luminary-with-a-photograph-depicting-a-spindly-spacecraft-above-the-lunar-surface.jpg?id=25586136&width=980" style="float: right"/>
<small class="image-media media-caption" placeholder="Add Photo Caption...">Don Eyles's recent memoir details how the mission software for the lunar module was developed.</small>
</p> <p>Shepard and Mitchell were onboard their lunar module, the <em>Antares</em>. The <em>Antares</em> flight computer was registering occasional presses of an Abort button in the cabin, even though the astronauts hadn't touched it. A loose ball of solder was floating around in zero gravity inside the switch and shorting it out. The button was intended for extreme emergencies. But once the descent to the lunar surface had begun, the rogue bit of solder could activate the switch, ordering the <em>Antares</em> computer to try to rocket the lunar module back into orbit. Eyles had written the mission software running in the <em>Antares</em> computer, and his challenge at that moment was this: Find a way to lock out the emergency switch behavior that he had carefully programmed into the computer.</p> <p>Eventually, Eyles was able to come up with a few lines of instructions that the astronauts were to punch into their computer, bypassing the code that paid attention to the switch. Apollo 14 landed on the moon later that day. His fix was elegant and creative, and it's not hard to see why Eyles finds no discontinuity between engineering and art. In fact, in later life, he himself went on to become a photographer and sculptor.</p> <p><em>IEEE Spectrum</em> was able to speak to Eyles at the <a href="https://vcfed.org/wp/festivals/vintage-computer-festival-east/">Vintage Computer Festival East</a> in May. He shared interesting anecdotes from his career, including how he saved the Apollo 14 mission. 
He was there to give a talk and promote his recently released book, <a href="https://www.sunburstandluminary.com/SLhome.html"> <em>Sunburst and Luminary: An Apollo Memoir</em></a>, which provides a wealth of inside detail about how the Apollo software was developed at the <a href="https://www.draper.com/"> Charles Stark Draper Laboratory</a>, then a part of MIT.</p>]]></description><pubDate>Tue, 10 Jul 2018 15:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/don-eyles-space-hacker</guid><category>Type-video</category><category>Editorial</category><category>Nasa</category><category>Moon</category><category>Apollo</category><category>Spaceflight</category><category>Video</category><category>Lunar-landing</category><category>Space-exploration</category><category>Space-flight</category><category>Software</category><category>Space</category><category>Software-engineering</category><dc:creator>Stephen Cass</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=27387688&amp;width=980"/></item><item><title>We Grew Algae and Asked Spectrum Editors to Taste It</title><link>https://spectrum.ieee.org/we-grew-algae-and-asked-spectrum-editors-to-taste-it</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=https://content.jwplatform.com/players/IWFQ048a-5Zdv3OJ1.js" width="100%"></iframe><br/><p><span>When was the last time you sipped algae? Chances are, you’ve never done that. But while working on </span><a href="https://spectrum.ieee.org/special-reports/blueprints-for-a-miracle/"><span>a special report</span></a><span> about potential climate-saving technologies, </span><em>IEEE Spectrum</em><span> decided to try to grow </span><em>Spirulina</em><span>, which proponents have pitched as a </span><a href="/energy/environment/the-green-promise-of-vertical-farms"><span>sustainable food</span></a><span>, in a five-gallon plastic bucket in a back room of our New York City office over the course of six weeks. </span></p>
<p><span>The point was to not only explore algae’s ability to reduce carbon dioxide emissions and serve as a stupendous source of vegetable protein—two topics addressed in </span><a href="/energy/environment/new-tech-could-turn-algae-into-the-climates-slimy-savior"><span>“New Tech Could Turn Algae Into the Climate’s Slimy Savior”</span></a><span>—but to also figure out whether anyone would actually eat the stuff. Though the majority of algae grown by humans is cultivated in large farms to be dried or added to processed foods, a company called </span><a href="https://www.livespira.com/"><span>Spira</span></a><span> sells a home-growing kit (which we bought) to encourage people to harvest and eat fresh algae every day.    </span></p>
<p><span>We originally thought this humble experiment would take between 10 days and 2 weeks, but growing algae wasn’t as straightforward as we’d hoped. After replacing an LED light, pouring in liberal amounts of baking soda, carefully adding iron droplets, and starting over when we accidentally killed it—we finally harvested our first batch. Then, we asked </span><em>Spectrum</em><span> editors if they’d be willing to drink it, for the planet’s sake.</span></p>]]></description><pubDate>Sat, 30 Jun 2018 15:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/we-grew-algae-and-asked-spectrum-editors-to-taste-it</guid><category>Type-video</category><category>Green-energy</category><category>Blueprints</category><category>Food-technology</category><category>Climate-change</category><category>Sustainability</category><category>Algae</category><category>Agriculture</category><dc:creator>Amy Nordrum</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/photo-ieee-spectrum.jpg?id=25586096&amp;width=980"/></item><item><title>Nanoscale 3D Printing Technique Uses Micro-Pyramids to Build Better Biochips</title><link>https://spectrum.ieee.org/nanoscale-3d-printing-technique-uses-photochemistry-and-micropyramids-to-build-better-biochips</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=https://content.jwplatform.com/players/SEcp8HlN-rDStukqM.js" width="100%"></iframe><br/><p><span>Making biochips, a key technology in studying disease, just got a little easier. This new </span><a href="https://www.cell.com/chem/fulltext/S2451-9294(18)30042-1"><span>nanoprinting process</span></a><span> uses gold-plated pyramids, an LED light, and photochemical reactions to print more organic material on the surface of one single biochip than ever before.</span></p>
<p><span>The technique uses an array of polymer pyramids that are covered in gold and mounted onto an atomic force microscope. These arrays, which are one square centimeter in size, contain thousands of tiny pyramids with holes that allow light through and direct it only to specific places on the surface of a chip below, immobilizing delicate organic reagents on the chip’s surface without damaging them.</span></p>
<p><span>Processes like this, known as </span><a href="/nanoclast/semiconductors/nanotechnology/reconfigurable-nanopatterning-technique-promises-new-generation-of-metamaterials"><span>tip-based lithography</span></a><span>, are widely considered to be the best way to 3D print organic material with nanoscale feature resolution. But in the past, they were limited by the fact that they could only print one kind of molecule at a time.</span></p>
<p><span>Now researchers at Hunter College and the Advanced Science Research Center (<a href="https://www.asrc.cuny.edu/">ASRC</a>) at The Graduate Center of the City University of New York think they have solved that problem.</span></p>
<p><span>They’re using microfluidics, the manipulation of fluids on a molecular level, to expose each biochip to the desired combination of chemicals. Then, they use photochemistry to shine light through the apertures in the pyramids. As the light reacts with the molecules, it adheres them to the chip.</span></p>
<p><span>With typical tip-based lithography systems, the light can overpower the chip, destroying some molecules. But the CUNY research team uses </span><a href="/nanoclast/semiconductors/nanotechnology/desktop-nanofabrication-becomes-much-cheaper"><span>beam-pen lithography</span></a><span>, where the light is confined and channeled through small apexes. This allows the team to control the light and protect the organic materials that they have already printed on the biochip.</span></p>
<p><span><a href="https://nanoscience.asrc.cuny.edu/people/dr-adam-braunschweig/">Adam Braunschweig</a>, the lead researcher and an associate professor with the ASRC’s Nanoscience Initiative and Hunter College’s Department of Chemistry, says this method of 3D printing biochips will help scientists understand cells and biological pathways. That’s because this technology should make it easier and more efficient to study disease development and solve other biological puzzles, such as detecting </span><a href="/the-human-os/biomedical/imaging/ai-makes-anthrax-bioterror-detection-easier"><span>bioterrorism agents</span></a><span>.</span></p>]]></description><pubDate>Sat, 21 Apr 2018 13:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/nanoscale-3d-printing-technique-uses-photochemistry-and-micropyramids-to-build-better-biochips</guid><category>Nanotechnology</category><category>Type-video</category><category>Biotechnology</category><category>3d-printing</category><category>Medical-devices</category><dc:creator>Christina Dabney</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/this-nanoprinting-process-allows-researchers-to-3d-print-more-on-a-biochip-than-ever-before-making-it-easier-to-study-biomedica.jpg?id=25585589&amp;width=980"/></item><item><title>Constructing a Better Bike Light</title><link>https://spectrum.ieee.org/constructing-a-better-bike-light</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=https://content.jwplatform.com/players/eAwffzTD-5Zdv3OJ1.js" width="100%"></iframe><br/><p>Bicycling at night can be dangerous, particularly if you don't put much effort into making yourself visible to drivers. Alas, many people don't. This video describes the construction of an Arduino-controlled rear light meant to make a cyclist more visible by throwing a sequence of red spots on the ground adjacent to the bike. It's not a perfect insurance policy by any means, but it's better than what many people are doing—riding about on bikes at night with little or no light to advertise their presence.</p>
<p>Read more: <a href="/geek-life/hands-on/build-an-attentiongrabbing-bicycle-light">Build an Attention-Grabbing Bicycle Light</a></p>]]></description><pubDate>Wed, 18 Apr 2018 20:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/constructing-a-better-bike-light</guid><category>Type-video</category><category>Leds</category><category>Safety</category><category>Bicycling</category><category>Lighting</category><category>Arduino</category><category>Lights</category><category>Diy</category><category>Hands-on</category><dc:creator>David Schneider</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/projecting-light-onto-the-ground-makes-a-bicyclist-easy-to-spot-but-it-isnt-easy-to-build.jpg?id=25585565&amp;width=980"/></item><item><title>The Most Interesting Thing About Stephen Colbert's Monologues Is The Wall Behind Him</title><link>https://spectrum.ieee.org/the-most-interesting-thing-about-stephen-colberts-monologues-is-the-wall-behind-him</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://www.youtube.com/embed/3NsCGxJVPrs?rel=0" width="100%"></iframe><br/><p>On Broadway, a few blocks north of Times Square in New York City, visitors flock to the <a href="https://en.wikipedia.org/wiki/Ed_Sullivan_Theater">Ed Sullivan Theater</a>. The theater is currently home to television’s <em><a href="https://www.cbs.com/shows/the-late-show-with-stephen-colbert/">The Late Show with Stephen Colbert</a></em>, and nearly every weeknight, Colbert takes to the famous stage to tape a new comedy monologue in front of a live audience. But right around the corner from the illuminated marquee of the Ed Sullivan Theater, another building draws a steady, if small, crowd of devotees. This building is Substation 13, and it’s been a vital cog in the running of New York’s <a href="https://www.mta.info/">MTA subway</a> system since 1904.</p>
<p>The star attraction of Substation 13 is an enormous rotary converter weighing 45 tons with a spinning armature 3 meters in diameter, dubbed Rotary #1. Trains in the NYC subway drive their electric motors by tapping a third rail that is energized with 600 volts of <a href="https://engineering.mit.edu/engage/ask-an-engineer/whats-the-difference-between-ac-and-dc/">direct current</a>. But electricity generated by the power company is transmitted over the grid as alternating current, so the subway must convert this AC power to DC, and do so at power levels high enough to speed trains full of people beneath the streets. Today, this job is done in Substation 13 and other MTA substations by nondescript grey cabinets full of solid-state rectifiers. But for decades, it was the job of converters like Rotary #1.</p>
<p>These converters essentially pair an AC motor with a DC generator on the same shaft. AC power at 25 hertz is fed into the enormous windings, the converter spins at 250 revolutions per minute, and up to 1,500 kilowatts of DC power emerges from the other side. The converter and its connection to the third rail of the subway are controlled using a set of panels, each over 2 meters tall, that are studded with the kind of dials and knife switches that most people associate with the laboratories of old-school mad scientists. Converters would be spun up and connected to subway lines as required to handle shifting power needs over the course of the day.</p>
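<p>Those figures are self-consistent: the AC side of a rotary converter is a synchronous machine, so its speed is locked to the supply frequency by the number of magnetic poles, and 250 RPM on a 25-hertz supply implies a 12-pole winding. A minimal sketch of that back-of-the-envelope check (the helper names are ours, not from the article):</p>

```python
# Synchronous machine speed relation: n (RPM) = 120 * f / P,
# where f is the supply frequency in hertz and P is the pole count.

def synchronous_speed_rpm(freq_hz: float, poles: int) -> float:
    """Speed at which a synchronous machine must spin on a given supply."""
    return 120.0 * freq_hz / poles

def poles_for_speed(freq_hz: float, rpm: float) -> int:
    """Solve n = 120 * f / P for the pole count."""
    return round(120.0 * freq_hz / rpm)

# Figures quoted for Rotary #1: 25 Hz supply, 250 RPM.
poles = poles_for_speed(25.0, 250.0)
print(poles)                              # -> 12
print(synchronous_speed_rpm(25.0, poles))  # -> 250.0
```

Running the check confirms that a 25-hertz, 250-RPM machine is exactly a 12-pole synchronous motor, which is why the converter's speed never wavered as long as the power company held its frequency steady.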
<p>Rotary #1 was in operation until 1999, when the local power company stopped supplying 25-hertz AC power. The engineer who took the converter offline for the last time was Robert Lobenstein. Lobenstein later helped lead the restoration of Rotary #1, and he now gives tours of Substation 13 (tickets can be obtained via the MTA’s museum website).</p>
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=https://content.jwplatform.com/players/NNkVjZDF-5Zdv3OJ1.js" width="100%"></iframe><br/>
<p>Residents settle into swivel chairs in a beauty shop on the campus of Ebenezer, a senior living community in the heart of Minneapolis, Minn. They strap on <a href="https://www.samsung.com/global/galaxy/gear-vr/">Samsung Gear VR</a> headsets. With a couple of clicks on the side of their goggles, they are transported to Stonehenge. Or find themselves sitting on the deck of a paddle boat cruising down the Mississippi River. Or standing on stage with violinist and vocalist <a href="https://violinscratches.com/">Gaelynn Lea</a> as she sings a haunting ballad.</p>
<p>Some rock back and forth in their chairs, narrating aloud the scene in which they find themselves. Some squeal in delight. Others fall totally silent, craning their necks this way and that, immersed in a new experience.</p>
<p>They’re all part of a virtual-reality wellness pilot overseen by Joel Prevost, the administrator for <a href="https://www.ebenezercares.org/">Ebenezer’s</a> Minneapolis campus, and Chris Mangold, Ebenezer’s lifelong learning coordinator.</p>
<p>“We focused on a group of around 30 seniors who would be part of the original study,” Prevost explained. These residents experienced virtual reality twice a week for one month, in sessions that lasted about 10 minutes each. Once a week, participants were asked about their feelings of wellness. About 90 percent of the pilot participants reported increased feelings of relaxation and well-being that lasted long after their VR session was over.</p>
<p>The startup behind the trial, Minneapolis-based Visual, provided the mobile phone-based VR rig, which comes loaded with their proprietary software  that allows the residents to click through a menu of scenarios. That menu features content created by Visual in the last year with partners like <a href="https://www.mpr.org/">Minnesota Public Radio</a>, Minnesota Opera, USA Dance, and the comedy venue Brave New Workshop.</p>
<p>It’s not exactly <a href="https://readyplayeronemovie.com/"><em>Ready Player One</em>’s</a> OASIS virtual world. The VR experience itself is limited to 360-degree videos that let the user look around a scene. But it’s enough to give the residents a taste of what VR can do.</p>
<p>“You feel like you’re part of [the scene],” says Ebenezer resident David Zimmer. “And it’s real easy to forget that you’re sitting in a chair looking through some goggles.”</p>
<p>The spillover effects from the VR experience surprised Visual’s president Esra Kucukciftci and founder/CEO Chuck Olsen. “People were more likely to socialize with friends and family—we presume because they had something to talk about that’s not, like, who broke their hip. It’s like, ‘Oh my gosh, did you go to the Louvre?’” says Olsen.</p>
<p></p>
<p>Speaking to a reporter, several enthusiastic participants expressed a desire for Visual to make the 360-degree experience a true VR environment, where they could interact with what they see and potentially with avatars of other residents or family and friends outside of Ebenezer.</p>
<p></p>
<p>Visual is exploring the development of a true VR experience, which Olsen says will require much more powerful VR computers and headsets. For now, the company is focused on making its current product more social.</p>
<p></p>
<p>“Step one will be, everyone is seeing the same content at the same time and can talk to each other,” says Olsen. “We want to do avatars, but that adds a lot of processing overhead that would be taxing to the devices, so that’s the next step.”</p>
<p></p>
<p>As for the residents of Ebenezer, they will continue to have access to Visual’s wellness VR package and equipment that can take them out of their community in the frozen North and transport them to places they might otherwise only dream of seeing in reality.</p>
<p></p>]]></description><pubDate>Fri, 30 Mar 2018 19:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/better-living-through-virtual-reality</guid><category>Type-video</category><category>Wellness-apps</category><category>Elder-care</category><category>Virtual-reality</category><category>Vr</category><category>Gadgets</category><category>Audio</category><category>360-video</category><category>Gaming</category><category>Wellness</category><category>Medical-devices</category><dc:creator>Harry Goldstein</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/photo.jpg?id=25585447&amp;width=980"/></item><item><title>Origami-Folded Hydrogel Paper Instantly Generates 110 Volts of Electricity</title><link>https://spectrum.ieee.org/origamifolded-hydrogel-paper-instantly-generates-110-volts-of-electricity</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=https://content.jwplatform.com/players/PaNzDe3y-5Zdv3OJ1.js" width="100%"></iframe><br/><span> </span>
<p><span><a href="/video/robotics/robotics-hardware/octopusinspired-camouflage-for-soft-robotics">Animal-inspired technology</a> has gone electric. These brightly colored, 3D-printed gels have the potential to create up to 110 volts of electricity in an instant, similar to the electric eel.</span></p>
<p><span>Rows of small hydrogel dots are packed with positively and negatively charged ions that combine to mimic an electric eel’s cellular structure. Printing and stacking these hydrogels produces the highest voltage, while connecting them over a larger contact area produces the highest current</span><span>. Scientists hope this system could potentially lead to a device that generates power from inside the human body. </span></p>
<p><span>“The electric eel is able to create very, very large amounts of power. And we thought that this was remarkable,” said Anirvan Guha, one of the researchers on <a href="https://www.nature.com/articles/nature24670">the project</a>, designed at the University of Fribourg in Switzerland.  “So we started to think about whether or not we could create a system that could generate electricity in the same way.”</span></p>
<p><span>An eel’s unique ability comes from a specialized organ housing thousands of cells called electrocytes. The chemical makeup of these cells allows for a positive or negative charge. The surrounding membranes control the charge by allowing ions to pass through, inciting an electric reaction, or by blocking the ions and returning the organ to a neutral, dormant state. </span></p>
<p><span>When an eel is threatened or stalking prey, a neural impulse is sent to the membranes in the electrocytes, and positive ions flood into the cells. In an instant, the voltage in each cell can jump from zero to 150 millivolts, producing a total of up to 600 volts.</span></p>
<p><span>This new power generator works in a similar way. </span></p>
<p><span>It uses four different types of hydrogels to mimic the eel’s electrical system: one with a high salt concentration, one with a low salt concentration, and two charged membranes—one negative and one positive.</span></p>
<p><span>The first attempt at putting this system together involved using a fluidic autosampler that pushed the gels into sequence in tubes. The more gels in a sequence, the higher the voltage. But the researchers couldn’t build an array long enough to produce the desired voltage.</span></p>
<p><span>So the researchers moved on to <a href="/tech-talk/computing/hardware/mechanical-metamaterials-and-other-3d-printing-tech-from-chi-2017">3D printing</a>. They printed a sequence of about 2,500 gels on two plastic sheets the size of regular printer paper. When they connected two gel papers, they were able to produce 110 volts within seconds.</span></p>
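The series-stacking arithmetic above can be sanity-checked with a quick back-of-the-envelope calculation. The per-cell voltages and totals come from the article; the implied cell counts are rough estimates, not measured values:

```python
# Back-of-the-envelope check of series stacking, using figures from the
# article. The implied cell counts are rough estimates, not measured values.

def cells_for_voltage(target_volts, volts_per_cell):
    """How many cells must be stacked in series to reach target_volts."""
    return target_volts / volts_per_cell

# An eel electrocyte peaks at about 150 mV; reaching ~600 V therefore
# implies on the order of 4,000 cells in series.
eel_cells = cells_for_voltage(600, 0.150)

# The printed array: ~2,500 gels producing ~110 V implies roughly
# 44 mV contributed per gel junction.
volts_per_gel = 110 / 2500

print(round(eel_cells), round(volts_per_gel * 1000))  # ~4000 cells, ~44 mV
```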
<p><span>This was a huge jump in electricity from the previous method, but the current was still too low for most practical applications.</span></p>
<p><span>At the suggestion of a colleague, they tried connecting the gels through a Miura-ori fold, a type of origami fold that allows the gels to stack on a folded sheet. The gels connect simultaneously with a large contact area, more closely resembling the geometry of the eel’s cells.  This method increased the current and prevented energy waste by decreasing the time it took for the gels to connect.</span></p>
<p><span>Guha says he and his team would love to find a way to make the hydrogels thinner, which would allow for an even higher current. They imagine that one </span><span>day this system, or one like it, could be used to power internal biological devices such as pacemakers. </span></p>
<p><span>“Because the power source is ionic gradients,” Guha says, “our hope is that you could implant one of these devices and the power could be maintained from the ionic gradients within the human body.” </span></p>
<span> </span>]]></description><pubDate>Sat, 17 Mar 2018 13:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/origamifolded-hydrogel-paper-instantly-generates-110-volts-of-electricity</guid><category>Type-video</category><category>Switzerland</category><category>Power-grid</category><category>Electronics</category><category>Hydrogel</category><category>3d-printing</category><category>Animals</category><category>Biomimicry</category><category>Electric-eel</category><category>Medical-devices</category><dc:creator>Christina Dabney</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/researchers-combine-3d-printed-hydrogels-and-origami-to-create-an-electric-eel-inspired-power-source.jpg?id=25585348&amp;width=980"/></item><item><title>Extended Director’s Cut: Ted Nelson on What Modern Programmers Can Learn From the Past</title><link>https://spectrum.ieee.org/extended-directors-cut-ted-nelson-on-what-modern-programmers-can-learn-from-the-past</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=https://content.jwplatform.com/players/wklF65si-5Zdv3OJ1.js" width="100%"></iframe><br/><p><em>Editor’s Note: Due to popular demand, we’re releasing this extended version of our interview with Ted Nelson, in which he talks about the work of Douglas Engelbart, goes into more detail about the origins of Xanadu, and explains how he views programs as art.</em></p>
<p>Ted Nelson is one of the original prophets of the information age. In the 1960s he coined the word <em>hypertext</em> and created Project <a href="https://xanadu.com/">Xanadu</a>, which prefigured many of the elements of the World Wide Web.</p>
<p></p>
<p>Nelson was part of personal computing at a time when it saw itself as an outgrowth of the countercultural movement that flourished in the 1960s. This computing was done either via terminals connected to minicomputers or on microprocessors with transistor counts measured only in the thousands. Back in the summer of 2016, Nelson was a keynote speaker at <a href="https://vcfed.org/wp/festivals/vintage-computer-festival-east/">Vintage Computing Festival East</a> in New Jersey, and <em>IEEE Spectrum</em> had the chance to interview him offstage.</p>
<p></p>
<p>We thought this was a good time to dust off that interview. We’re entering a period when the possibilities and dangers of computing are looming large in our minds, thanks to the explosion of machine learning, debates over the governance of the Internet, the impacts of automation, and unexpected weaknesses revealed by the <a href="/tech-talk/semiconductors/processors/how-the-intel-processor-meltdown-vulnerability-was-thwarted">Spectre and Meltdown</a> hardware bugs. Nelson talks about how he and his fellow pioneers thought the future would be a world of citizen programmers, how the Web omits much of the architecture underlying <a href="https://www.xanadu.net/">Xanadu</a>, and his advice for breaking through the current limits to new conceptual ground.</p>
<p>Producer: Celia Gorman<br/> Videographer: Kristen Clark</p>]]></description><pubDate>Wed, 31 Jan 2018 21:30:00 +0000</pubDate><guid>https://spectrum.ieee.org/extended-directors-cut-ted-nelson-on-what-modern-programmers-can-learn-from-the-past</guid><category>Type-video</category><category>Profiles</category><category>Ted-nelson</category><category>Xanadu</category><category>Personal-computing</category><category>Hypertext</category><dc:creator>Stephen Cass</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/ted-nelson.jpg?id=25585009&amp;width=980"/></item><item><title>Ted Nelson on What Modern Programmers Can Learn From the Past</title><link>https://spectrum.ieee.org/ted-nelson-on-what-modern-programmers-can-learn-from-the-past</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=https://content.jwplatform.com/players/lin3eKDo-5Zdv3OJ1.js" width="100%"></iframe><br/><p><span>Ted Nelson is one of the original prophets of the information age. In the 1960s he coined the word </span><em>hypertext</em><span> and created Project <a href="https://xanadu.com/">Xanadu</a>, which prefigured many of the elements of the World Wide Web.</span></p>
<p></p>
<p><span>Nelson was part of personal computing at a time when it saw itself as an outgrowth of the countercultural movement that flourished in the 1960s. This computing was done either via terminals connected to minicomputers or on microprocessors with transistor counts measured only in the thousands. Back in the summer of 2016, Nelson was a keynote speaker at </span><a href="https://vcfed.org/wp/festivals/vintage-computer-festival-east/"><span>Vintage Computing Festival East</span></a><span> in New Jersey, and </span><em>IEEE Spectrum</em><span> had the chance to interview him offstage.</span></p>
<p></p>
<p><span>We thought this was a good time to dust off that interview. We’re entering a period when the possibilities and dangers of computing are looming large in our minds, thanks to the explosion of machine learning, debates over the governance of the Internet, the impacts of automation, and unexpected weaknesses revealed by the </span><a href="/tech-talk/semiconductors/processors/how-the-intel-processor-meltdown-vulnerability-was-thwarted"><span>Spectre and Meltdown</span></a><span> hardware bugs. Nelson talks about how he and his fellow pioneers thought the future would be a world of citizen programmers, how the Web omits much of the architecture underlying <a href="https://www.xanadu.net/">Xanadu</a>, and his advice for breaking through the current limits to new conceptual ground.</span></p>
<p><span>Producer: Celia Gorman</span><br/> <span>Videographer: Kristen Clark</span></p>]]></description><pubDate>Sat, 20 Jan 2018 14:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/ted-nelson-on-what-modern-programmers-can-learn-from-the-past</guid><category>Type-video</category><category>Profiles</category><category>Ted-nelson</category><category>Xanadu</category><category>Personal-computing</category><category>Hypertext</category><dc:creator>Stephen Cass</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/ted-nelson-on-what-modern-programmers-can-learn-from-the-past.jpg?id=25584949&amp;width=980"/></item><item><title>Casting a $20 Million Mirror for the World’s Largest Telescope</title><link>https://spectrum.ieee.org/casting-a-20-million-mirror-for-the-worlds-largest-telescope</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://www.youtube.com/embed/FTe6KKyIWqs?rel=0" width="100%"></iframe><br/><p>Building a mirror for any giant telescope is no simple feat. The sheer size of the glass, the nanometer precision of its curves, its carefully calculated optics, and the adaptive software required to run it make this a task of herculean proportions. But the recent castings of the 15-metric-ton, off-axis mirrors for the Giant Magellan Telescope (GMT) forced engineers to push the design and manufacturing process beyond all previous limits.</p>
<p>Building the GMT is not a task of years, but of decades. The <a href="https://www.gmto.org/">Giant Magellan Telescope Organization</a> (GMTO) and a team at the <a href="https://mirrorlab.arizona.edu/">University of Arizona’s Richard F. Caris Mirror Laboratory</a> cast the first of seven mirrors back in 2005; they expect to complete construction of the telescope in 2025. Once complete, it’s expected to be the largest telescope in the world. The seven 8.4-meter-wide mirrors will combine to serve as a 24.5-meter mirror telescope with 10 times the resolution of the Hubble Space Telescope. This will allow astronomers to gaze back in time to, they hope, the materialization of galaxies.</p>
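The tenfold resolution claim follows from the diffraction limit, which scales inversely with aperture diameter. A minimal sketch, assuming visible light at 550 nanometers and Hubble’s 2.4-meter primary mirror (figures not given in the article):

```python
import math

# Angular resolution (Rayleigh criterion): theta = 1.22 * wavelength / diameter.
# Sharpness improves linearly with aperture, so a 24.5 m combined mirror vs.
# Hubble's 2.4 m gives roughly a factor of 10.

def diffraction_limit_arcsec(wavelength_m, diameter_m):
    theta_rad = 1.22 * wavelength_m / diameter_m
    return math.degrees(theta_rad) * 3600  # radians -> arcseconds

wavelength = 550e-9                                   # visible light, ~550 nm
hubble = diffraction_limit_arcsec(wavelength, 2.4)    # ~0.058 arcsec
gmt = diffraction_limit_arcsec(wavelength, 24.5)      # ~0.0057 arcsec

print(round(hubble / gmt, 1))  # ~10.2x finer resolution
```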
<p>Each mirror costs US $20 million and takes more than two years to build. Every stage of the manufacturing process calls for careful thought and meticulous planning. To begin, more than 17,000 kilograms of special glass are ordered and inspected for flaws. Next, a crew must build a 15-metric-ton ceramic structure to serve as a mold, into which they carefully place the glass one chunk at a time. The glass is slowly melted and continuously spun in a furnace to create a parabolic shape, then cooled by fractions of degrees over the course of three months. And that’s only the beginning.</p>
<p>Once cooled, massive machinery lifts the mirror and tilts it to a vertical position. Engineers purge the ceramic mold from the mirror, wait for it to dry, and then rotate it again. They grind and refine the back of the mirror with exacting precision. Then they reposition the mirror in order to shape and polish the front face to within 20 nanometers of perfection—a process that takes about 18 months. Along the way, it undergoes four optical tests, some of which were engineered specifically for this project.<br/>  <br/> Any mammoth mirror requires much of the same engineering, but six of the seven GMT mirrors have an off-axis, parabolic shape. Producing an off-axis mirror at this scale is a new achievement for the Caris Mirror Laboratory and for the field in general.<br/>  <br/> Once four of the mirrors are complete, they must be transported to the Chilean Andes, where the giant telescope will be constructed on the peak of a mountain range. Even the transport to Chile will be a challenge—so much so that the teams have yet to decide exactly how they’ll pull it off. Still, GMTO says it is on course for the four-mirror installation and “First Light” in 2023, when the telescope will be turned to the night skies for the first time.</p>
<p>And then we’ll all get a chance to peer into the maternity ward of the cosmos and see galaxies being born.</p>]]></description><pubDate>Sat, 06 Jan 2018 21:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/casting-a-20-million-mirror-for-the-worlds-largest-telescope</guid><category>Type-video</category><category>Gmt</category><category>Mirror</category><category>Telescope</category><category>Astrophysics</category><category>University-of-arizona</category><dc:creator>Celia Gorman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/casting-a-20-million-mirror.jpg?id=34862457&amp;width=980"/></item><item><title>Building Alaska’s Internet</title><link>https://spectrum.ieee.org/building-alaskas-internet</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=https://content.jwplatform.com/players/bp7GzJwa-5Zdv3OJ1.js" width="100%"></iframe><br/><p><span>One of the most ambitious telecommunications projects in the rural United States was completed this year, after US $300 million of investment and six years of construction. <a href="https://terra.gci.com/">TERRA</a>, General Communication Inc.’s new hybrid fiber-microwave network, uses a combination of repeater data links and fiber optics to form <a href="/telecom/wireless/109-microwave-towers-bring-the-internet-to-remote-alaska-villages">a giant, 5,000-kilometer ring</a> around southwest Alaska. If the network had been built in the contiguous United States, it would stretch from Washington, D.C., to Seattle. But it will serve only about as many customers as live in Twin Falls, a small city in Idaho.</span></p>
<p><span>TERRA relies primarily on 109 microwave towers, a classic technology that can still be deployed faster and more cheaply than fiber-optic cables in most of rural Alaska. But the project nevertheless took GCI a very long time and a lot of money to complete. It shows what a challenge it can be to build and maintain a communications network in one of the most remote areas in the world—even when you are using the most tried-and-true technology available.</span></p>
<p><span>Because there is no road system throughout much of <a href="/tech-talk/telecom/wireless/scientists-in-alaska-attempt-to-produce-fake-aurora-with-giant-antenna-array">Alaska</a>, each of those microwave towers had to be flown or barged in. Some were installed in villages, and 23 were hoisted to mountaintop sites by a small fleet of helicopters. These towers, all 109 of them, poke up every 15 to 65 km in the Alaskan bush. </span></p>
<p><span>Ultimately, one of TERRA’s greatest strengths comes from its shape. Circles or rings create extra resiliency within a network: If one site does go down, all the traffic can be immediately routed in the opposite direction, taking the long way around the ring but ultimately getting where it needs to go.</span></p>
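That resiliency can be illustrated with a toy model: on a ring, every pair of nodes is joined by two disjoint paths, so traffic can always take the other way around a single failed link. The node names below are illustrative, not actual TERRA sites:

```python
# Toy model of ring resiliency: between any two nodes on a ring there are
# two disjoint paths, so one broken link never disconnects the network.

def ring_paths(nodes, src, dst):
    """Return the clockwise and counterclockwise paths from src to dst."""
    i, j = nodes.index(src), nodes.index(dst)
    n = len(nodes)
    clockwise = [nodes[(i + k) % n] for k in range((j - i) % n + 1)]
    counterclockwise = [nodes[(i - k) % n] for k in range((i - j) % n + 1)]
    return clockwise, counterclockwise

sites = ["A", "B", "C", "D", "E", "F"]  # hypothetical tower sites
cw, ccw = ring_paths(sites, "A", "C")
print(cw)   # ['A', 'B', 'C'] -- the short way around
print(ccw)  # ['A', 'F', 'E', 'D', 'C'] -- the fallback if a link on cw fails
```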
<p><span>Read more: </span><a href="/telecom/wireless/109-microwave-towers-bring-the-internet-to-remote-alaska-villages">109 Microwave Towers Bring the Internet to Remote Alaska Villages</a></p>]]></description><pubDate>Sat, 16 Dec 2017 14:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/building-alaskas-internet</guid><category>Type-video</category><category>Internet</category><category>Alaska</category><category>Broadband</category><category>Telecommunications</category><category>Last-mile-projects</category><category>Microwaves-networks</category><category>3g</category><category>Wireless</category><dc:creator>Amy Nordrum</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-photo-of-a-crew-of-workers-unloading-a-large-box-of-supplies-from-cables-attached-to-a-helicopter-above-surrounding-by-mounta.jpg?id=25584552&amp;width=980"/></item><item><title>Build a Cordless Soldering Iron</title><link>https://spectrum.ieee.org/build-a-cordless-soldering-iron</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/s4rryock-5Zdv3OJ1.js" width="100%"></iframe><br/><p>Good soldering requires good tools. In particular, a soldering iron that uses feedback to keep the tip at a set temperature works much better than one that doesn’t. That functionality is easy to find in a bench soldering station, but try to find a cordless iron with it, and you’ll be out of luck. This DIY solution solves that problem by marrying the business end of a Weller “Magnastat” soldering iron with a Maglite flashlight.</p>
<p><strong>Detailed Instructions:</strong> <a href="/geek-life/hands-on/want-a-temperaturecontrolled-cordless-soldering-iron-heres-how-to-make-one"><em>How to Make a Temperature-Controlled Cordless Soldering Iron</em></a></p>]]></description><pubDate>Sat, 04 Nov 2017 13:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/build-a-cordless-soldering-iron</guid><category>Type-video</category><category>Cordless-soldering</category><category>Soldering-iron</category><category>Hands-on</category><dc:creator>David Schneider</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/photo-david-schneider.jpg?id=25584290&amp;width=980"/></item><item><title>Testing DIY Digital Video for FPV Flying</title><link>https://spectrum.ieee.org/testing-diy-digital-video-for-fpv-flying</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/NFHZziWZ-5Zdv3OJ1.js" width="100%"></iframe><br/><p><span><a href="https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/">Wifibroadcast</a> </span><span>is a collection of open-source software that uses ordinary Wi-Fi dongles and Raspberry Pi hardware at each end to provide telemetry, recording capability, and high-resolution video for first-person-view, radio-controlled flight. Only commodity hardware is needed, so this approach is relatively inexpensive.</span></p>
<p><em>IEEE Spectrum</em><span> Senior Editor David Schneider found that the video quality was quite good when the radio link used to control his model airplane was on a different band (72 megahertz) from the one used to transmit video (2.4 gigahertz). But the Raspberry Pi computer aboard the plane created enough electromagnetic interference to compromise the control link. A more modern spread-spectrum control link, operating at 2.4 GHz, could cope with that interference, but the control signals then degraded the video quality.</span></p>
<p><span>Schneider explains the experiments he did and why they convinced him that this DIY system for digital FPV video isn’t yet ready for prime time.</span></p>
<p><span>Read More: </span><a href="/geek-life/hands-on/wifi-hd-video-drones-still-wobbly">Wi-Fi + HD Video + Drones = Still Wobbly</a></p>]]></description><pubDate>Sat, 21 Oct 2017 15:17:00 +0000</pubDate><guid>https://spectrum.ieee.org/testing-diy-digital-video-for-fpv-flying</guid><category>Type-video</category><category>Drones</category><category>Fpv</category><category>Wi-fi</category><category>Hands-on</category><category>Raspberry-pi</category><dc:creator>David Schneider</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/wifibroadcast-promises-digital-video-for-fpv-flyers-using-raspberry-pi-computers-and-ordinary-wifi-dongles-watch-ieee-spectrum.jpg?id=25583909&amp;width=980"/></item><item><title>Octopus-Inspired Camouflage for Soft Robotics</title><link>https://spectrum.ieee.org/octopusinspired-camouflage-for-soft-robotics</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/U2kkzJMN-5Zdv3OJ1.js" width="100%"></iframe><br/><p>There are many reasons to admire the octopus, including its ability to instantly pop up tiny protrusions of various shapes from its skin to match the texture of its background. This technique, combined with other camouflage tricks such as changing its color, allows an octopus to blend into almost anything—even boats.</p>
<p>Those protrusions, called dermal papillae, were the bio-inspiration behind a new elastic material that can morph into various shapes, and could provide a shape-shifting surface for <a href="/automaton/robotics/robotics-hardware/robot-octopus-takes-to-the-sea">soft robots</a>.</p>
<p>Researchers from Cornell University in New York and the <a href="https://www.mbl.edu/">Marine Biological Laboratory</a> in Massachusetts decided to build a material based on muscle groups that control papillae along the surface of an octopus tentacle. The material consists of a fiber mesh that simulates an octopus’s erector muscles, which contract to squeeze a protrusion into shape. They embedded that mesh in concentric rings within a rubber skin, which mimics an octopus’s connective tissue.</p>
<p>Using a compressed air cylinder, they inflated the rubber skin much like one might blow up a balloon. The fiber mesh held parts of the rubber in place while others expanded out. The team found that, with the right number and spacing of rings, they could form the skin into shapes that resembled a rock and an aloe plant.</p>
<p>Robert Shepherd, a co-author and assistant professor at Cornell University, says the material could be reformed hundreds of thousands of times without degrading. He and his collaborators recently described <a href="https://science.sciencemag.org/cgi/doi/10.1126/science.aao5345">their research</a>, funded by the U.S. Army and Air Force, in the journal <em>Science</em>.</p>
<p>Shepherd thinks their morphable skin could be applied to furniture, or used to create immersive virtual reality experiences in which participants can feel their surroundings. He also says it could someday be worn by robots—maybe even <a href="/robotics/robotics-hardware/robot-octopus-points-the-way-to-soft-robotics-with-eight-wiggly-arms">robots that look and move like an octopus</a>.</p>
<p>The model that his group created is limited by the elasticity of the rubber itself, which can stretch only so far, and by the fact that they can’t change the arrangement of the fiber rings once they’re enmeshed in rubber. Each model can therefore adopt only one shape, rather than continuously morphing like an octopus’s papillae. But Shepherd has ideas for how to make that possible, and for how to enable the material to change color as well.</p>
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/jWOK3JIy-5Zdv3OJ1.js" width="100%"></iframe><br/><p>Last year, SRI’s <a href="https://www.linkedin.com/in/alexander-kernbaum-9997a628/">Alexander Kernbaum</a> introduced us to <a href="/automaton/robotics/robotics-hardware/sri-demonstrates-abacus-rotary-transmission">Abacus Drive, a new kind of rotary transmission based on pure rolling motion</a> that promises to be much cheaper and much more energy efficient than harmonic gears, which are the current (and quite expensive) standard. Now Kernbaum is back with another ingenious transmission design: an ultra-compact, infinitely variable transmission based on a novel nested-pulley configuration. This new kind of actuator could make robots—and all kinds of other things—safer, more affordable, and vastly more efficient.</p>
<p>In an <a href="https://en.wikipedia.org/wiki/Continuously_variable_transmission#Infinitely_variable_transmission_.28IVT.29">infinitely variable transmission</a>, which is a specific kind of <a href="https://en.wikipedia.org/wiki/Continuously_variable_transmission">continuously variable transmission</a>, the transmission ratio includes a zero point that can be approached from either a positive side or a negative side. In other words, a constant input, like an electric motor turning the same direction at the same speed, can be converted to an output that’s turning faster, turning slower, turning in the opposite direction, or not turning at all (in this “geared neutral” mode, you’d need infinite input revolutions to cause one output revolution, hence the name “infinitely variable transmission”).</p>
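The geared-neutral idea can be sketched numerically: with a constant input speed, sweeping the transmission ratio through zero takes the output from forward, through a standstill, into reverse. The numbers below are illustrative, not SRI’s specifications:

```python
# Illustrative IVT behavior: constant input, output determined entirely by
# the (signed) transmission ratio, which can pass smoothly through zero.

def output_rpm(input_rpm, ratio):
    """Output shaft speed for a given output/input transmission ratio."""
    return input_rpm * ratio

motor_rpm = 3000  # constant motor speed (hypothetical)
for ratio in (0.5, 0.1, 0.0, -0.1, -0.5):
    print(f"ratio {ratio:+.1f} -> {output_rpm(motor_rpm, ratio):+.0f} rpm")
# Ratio 0.0 is "geared neutral": the output stands still even though the
# motor keeps spinning; negative ratios reverse the output direction.
```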
<p>If you can’t quite understand how it works from the video (and even Kernbaum admits that it’s difficult to visualize), read the explanation here: <a href="/automaton/robotics/robotics-hardware/inception-drive-a-compact-infinitely-variable-transmission-for-robotics">Inception Drive: A Compact, Infinitely Variable Transmission for Robotics</a></p>]]></description><pubDate>Sat, 14 Oct 2017 13:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/sri-inception-drive-infinitely-variable-transmission</guid><category>Type-video</category><category>Inception-drive</category><category>Alexander-kernbaum</category><category>Cvt</category><category>Abacus-drive</category><category>Robot-hardware</category><category>Sri</category><category>Actuators</category><category>Motors</category><category>Robotics-hardware</category><category>Gears</category><category>Harmonic-drive</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/wrap-your-head-around-sri-s-new-actuator-based-on-an-ultra-compact-infinitely-variable-transmission.jpg?id=25584131&amp;width=980"/></item><item><title>Radio Frequency Hunters: Saving Your Cell Signal</title><link>https://spectrum.ieee.org/radio-frequency-hunters-saving-your-cell-signal</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/JkxpRhWU-5Zdv3OJ1.js" width="100%"></iframe><br/><p>Every day, <a href="https://www.linkedin.com/in/kevinargentieri/">Kevin Argentieri</a> tracks down devices that are causing radio-frequency interference, and then tries to persuade their owners to shut them off. This is a delicate process that involves knocking on strangers’ doors, searching their homes for the offending device, and politely reminding the owner that they could be fined up to $16,000 a day for keeping the gadget switched on.</p>
<p>Argentieri works for <a href="https://www.p3-group.com/en/">P3</a>, a company that performs radio-frequency interference hunting for nationwide carriers including Verizon, AT&T, T-Mobile, and Sprint. American carriers pay <a href="https://arstechnica.com/information-technology/2017/04/t-mobile-dominates-spectrum-auction-will-boost-lte-network-across-us/">big bucks</a> to the U.S. Federal Communications Commission for exclusive licenses to swaths of the radio-frequency spectrum.</p>
<p>Once a license is sold, the FCC prohibits anyone else from transmitting on that particular band. But rogue devices are out there, and it’s Argentieri’s job to find them. He says baby monitors, cordless phones, and cell phone repeaters are three of the most common violators.</p>
<p>Interference from any of these sources causes dropped calls for wireless customers in the area and worsens the overall performance of the network. Argentieri uses a portable spectrum analyzer to home in on the source. Most of the time, it takes him less than a day to track it down.</p>
<p>On a recent hunt in Brooklyn, Argentieri traced interference from a client’s rooftop base station to a construction site across the street, where he believes the crew nicked a cable TV antenna. The interference popped up in the 700-megahertz band, which is popular with both cable TV companies and wireless carriers.</p>
<p>Cable TV companies are permitted to use 700 MHz so long as their TV signals remain contained within the cables running to customers’ televisions. But if one of those cables is sliced in two by construction, or chewed up by squirrels (it happens, says Argentieri), the cable starts spewing TV signals into the air.</p>
<p>These “leaked” TV signals interfere with the cell signals of other customers in the area who also rely on 700 MHz. Cell signals can even bounce right back into the severed cable, distorting TV programs. Argentieri’s next step will be to inform the cable TV provider in the area of the suspected leak and ask them to repair it.</p>
<p>The proliferation of personal electronics means the potential for radio-frequency interference is ever greater. But because it’s so difficult and expensive for carriers to buy new spectrum, they will continue to vigorously defend the licenses they hold—and keep Argentieri busy out on the streets.</p>]]></description><pubDate>Sat, 02 Sep 2017 13:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/radio-frequency-hunters-saving-your-cell-signal</guid><category>Wireless</category><category>Internet</category><category>Type-video</category><category>Bandwidth</category><category>Telecom</category><category>Interference</category><category>Signal</category><dc:creator>Alyssa Pagano</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/photo-alyssa-pagano.jpg?id=25583881&amp;width=980"/></item><item><title>5G Bytes: Small Cells Explained</title><link>https://spectrum.ieee.org/5g-bytes-small-cells-explained</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/CmLV4dch-5Zdv3OJ1.js" width="100%"></iframe><br/><p></p>
<p>Today’s mobile users want faster data speeds and more reliable service. The next generation of wireless networks—5G—promises to deliver that, and much more. Right now, though, <a href="/tech-talk/telecom/wireless/5-myths-about-5g">5G</a> is still in the planning stages, and companies and industry groups are working together to figure out exactly what it will be. But they all agree on one thing: As the number of mobile users and their demand for data rises, 5G will have to handle far more traffic at much higher speeds than do the base stations that make up today’s cellular networks.</p>
<div class="ieee-sidebar-small">
<h3>Watch: <a href="/video/telecom/wireless/everything-you-need-to-know-about-5g">Everything You Need to Know About 5G</a></h3>
<a href="/video/telecom/wireless/everything-you-need-to-know-about-5g"> <p class="shortcode-media shortcode-media-rebelmouse-image rm-container-resized rm-resized-container-50">
<img alt="Everything You need to know about 5g" class="rm-shortcode" data-rm-shortcode-id="9fb05c064d727e09c752bbc2c5e81f90" data-rm-shortcode-name="rebelmouse-image" id="e3540" loading="lazy" src="https://spectrum.ieee.org/media-library/everything-you-need-to-know-about-5g.jpg?id=25582804&width=980"/>
</p> </a>
</div>
<p>To achieve this, wireless engineers are designing a suite of brand-new technologies. Together, these technologies will deliver data with less than a millisecond of delay (compared to<a href="https://opensignal.com/reports/2017/02/usa/state-of-the-mobile-network"> about 70 ms</a> on today’s 4G networks), and raise peak download speeds to <a href="https://www.itu.int/en/membership/Documents/missions/GVA-mission-briefing-5G-28Sept2016.pdf">20 gigabits per second</a> (compared with <a href="https://www.itu.int/en/membership/Documents/missions/GVA-mission-briefing-5G-28Sept2016.pdf">1 Gb/s on 4G</a>).</p>
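Those throughput figures are easier to feel as download times. A back-of-the-envelope sketch using the peak rates quoted above (real-world speeds will be lower):

```python
# Seconds to move a 10-gigabyte file at the peak rates quoted above,
# ignoring protocol overhead and congestion.
file_gbits = 10 * 8       # 10 gigabytes expressed in gigabits

t_4g = file_gbits / 1.0   # at 4G's ~1 Gb/s peak
t_5g = file_gbits / 20.0  # at 5G's 20 Gb/s target

print(t_4g, t_5g)  # 80.0 4.0
```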
<p>At the moment, it’s not yet clear which technologies will do the most for 5G in the long run, but a few early favorites have emerged. The front-runners include <a href="/video/telecom/wireless/5g-bytes-millimeter-waves-explained">millimeter waves</a>, small cells, <a href="/video/telecom/wireless/5g-bytes-massive-mimo-explained">massive MIMO</a>, <a href="/video/telecom/wireless/5g-bytes-full-duplex-explained">full duplex</a>, and <a href="/video/telecom/wireless/5g-bytes-beamforming-explained">beamforming</a>.</p>
<p><strong>Small Cells</strong></p>
<p><a href="/telecom/wireless/a-surge-in-small-cell-sites">Small cells</a> are portable miniature base stations that require minimal power to operate and can be placed every 250 meters or so throughout cities. To prevent signals from being dropped, carriers could blanket a city with thousands of these stations. Together, they would form a dense network that acts like a relay team, handing off signals like a baton and routing data to users at any location.</p>
<p>While traditional cell networks have also come to rely on an increasing number of base stations, achieving 5G performance will require even denser infrastructure. Luckily, antennas on small cells can be much smaller than traditional antennas if they are transmitting tiny millimeter waves. This size difference makes it even easier to stick cells unobtrusively on light poles and atop buildings.</p>
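The size claim follows from basic wave physics: antenna elements scale with wavelength (λ = c/f). A quick sketch, using the article's 700 MHz band and 28 GHz as an assumed millimeter-wave frequency (the article doesn't name a specific mmWave band):

```python
# Wavelength sets antenna size: a resonant element is on the order of
# half a wavelength long.
c = 3.0e8  # speed of light, m/s

def wavelength_m(freq_hz):
    return c / freq_hz

lam_700mhz = wavelength_m(700e6)  # ~0.43 m (the 700 MHz band above)
lam_28ghz = wavelength_m(28e9)    # ~11 mm (assumed mmWave band)

# Millimeter-wave elements come out roughly 40x smaller, which is why
# small-cell hardware can hide on light poles and rooftops.
ratio = lam_700mhz / lam_28ghz
```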
<p>What’s more, this radically different network structure should provide more targeted and efficient use of spectrum. Having more stations means the frequencies that one station uses to connect with devices in its small broadcast area can be reused by another station in a different area to serve another customer. There is a problem, though: The sheer number of small cells required to build a <a href="/video/telecom/wireless/everything-you-need-to-know-about-5g">5G network</a> may make it impractical to set up in rural areas.</p>]]></description><pubDate>Sat, 19 Aug 2017 13:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/5g-bytes-small-cells-explained</guid><category>Type-video</category><category>Small-cells</category><category>Networks</category><category>5g</category><category>Wireless</category><dc:creator>Kristen Clark</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/illustration.jpg?id=25583810&amp;width=980"/></item><item><title>Repairing Organs With the Touch of a Nanochip</title><link>https://spectrum.ieee.org/repairing-organs-with-the-touch-of-a-nanochip</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/q8JX6DyN-5Zdv3OJ1.js" width="100%"></iframe><br/><p><span><span>Researchers at <a href="https://wexnermedical.osu.edu/">Ohio State University</a> developed a way to change cells inside the body from one type to another—with just one touch from a nanochip. This new technology, called “<a href="https://www.nature.com/nnano/journal/vaop/ncurrent/full/nnano.2017.134.html">tissue nanotransfection</a>,” could be used to repair and regenerate body tissues, including organs, in a way that is non-invasive and painless.</span></span></p>
<p><span><span>“When these things come out for the first time, it’s basically crossing the chasm from impossible to possible,” says Chandan Sen, co-leader of the study. “We have established feasibility.”</span></span></p>
<p><span><span>Previously, experiments with this sort of cell type conversion were done outside the body, in petri dishes. Even if cells from the test subject were removed from the body, converted in the lab, and then reinserted into the body, those new cells often incited an immune response and were rejected. Sen’s method is unique because the conversion takes place entirely inside the body, preventing an immune response. But working in the body can be complicated.</span></span></p>
<p><span><span>“The moment you go <em>in vivo</em> the complexity is significantly elevated, and now you have to deal with a lot of parameters that are beyond your control,” says Sen.</span></span></p>
<p><span><span>To keep it simple, Sen’s trials focused on skin cells in mice and pigs. The first step is to place the nanochip on the affected area. When the chip touches the skin, it sends a snippet of synthetic DNA into the surface cells using an electric current, which lasts less than one tenth of a second. The genetic code of the synthetic DNA differs depending on the desired outcome. For example, if the researchers want to convert the skin cells to nerve cells, they use a different code than they would use to convert the skin cells to bone cells.</span></span></p>
<p><span><span>One of Sen’s experiments successfully healed a mouse’s injured leg. Because scans showed a lack of blood flow in the mouse’s leg, researchers inserted DNA that would change skin cells to the endothelial cells that form blood vessels. Within a week of treatment, the mouse had blood flowing through its leg again. Not only did the process work to reprogram local cells at the site of the injury, but the entire leg was affected. This is because the mouse’s body also did some work to repair itself.</span></span></p>
<p><span><span>“There are steps that the body itself mounts for its own rescue,” says Sen. “So what is happening here is we’re enabling the body’s rescue system by tipping it off in a positive way.”</span></span></p>
<p><span><span>In the experiment with the mouse’s leg, the mouse’s immune system took cues from the injected DNA and generated its own new cells to help it heal faster. It’s hard to know, however, if the body will always respond favorably. And that’s why more research has to be done.  </span></span></p>
<p><span><span><a href="https://www.biochemistry.vcu.edu/Faculty/Diegelmann.html">Robert Diegelmann</a>, an expert in tissue injury and tissue repair, sees Sen’s work as the key to unlocking a lot more information about how we can reprogram cells inside the body.</span></span></p>
<p><span><span>“A lot more research has to be done to specifically target cells other than fibroblast [skin] cells, and to reprogram certain genes, but what he’s got here is really exciting,” says Diegelmann.</span></span></p>
<p><span><span>Sen and his team want to continue experimenting with the technology, and they plan to run clinical trials on humans next year. Working on animals is less risky, but it’s also less realistic, because it doesn’t take into account the human body’s own response.</span></span></p>
<p><span><span>“Of course, there will be problems but over time we will fix those problems,” says Sen. </span></span></p>
<div></div>]]></description><pubDate>Sat, 12 Aug 2017 14:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/repairing-organs-with-the-touch-of-a-nanochip</guid><category>Medical-devices</category><category>Medical-diagnostics</category><category>Type-video</category><category>Tissue-engineering</category><category>Gene-therapy-vectors</category><dc:creator>Alyssa Pagano</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/photo-wexner-medical-center-at-ohio-state-university.jpg?id=25583760&amp;width=980"/></item><item><title>Kuri Robot Brings Autonomous Video to a Home Near You</title><link>https://spectrum.ieee.org/kuri-robot-brings-autonomous-video-to-a-home-near-you</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/4JIA0lOM-5Zdv3OJ1.js" width="100%"></iframe><br/><p></p>
<p><span><span>Most <a href="/robotics/home-robots">home robots</a> are designed primarily for convenience and function. Not Kuri. Silicon Valley startup <a href="https://www.mayfieldrobotics.com/">Mayfield Robotics</a> </span></span><span>designed <a href="https://robots.ieee.org/robots/kuri/?utm_source=spectrum">Kuri specifically to be an adorable home companion</a>. <span>And that means it needed to have one quality you won’t find in most robotic vacuums and other home bots: cuteness. </span></span></p>
<p><span><span><a href="/automaton/robotics/home-robots/mayfield-robotics-announces-kuri-a-700-mobile-home-robot">Mayfield introduced Kuri earlier this year</a> at the Consumer Electronics Show in Las Vegas. Since then, the Mayfield team has made several updates to the robot. The most significant one </span></span><span>is the home video feature called “Kuri Vision,” which allows Kuri to take video autonomously.</span></p>
<p><span><span>To do that, Kuri has two high-definition 1080p cameras, one behind each eye. These cameras take videos intermittently throughout the day, capturing candid moments. </span></span><span><span>You can then review those clips through the app, which runs on iOS and Android, and choose which ones you like best. Then Kuri’s machine learning and image processing kick in: Based on which images you favorite or delete, Kuri learns to take videos that you’ll like. </span></span></p>
<p><span>For those who </span><span>find this robot-stalking feature a bit creepy, the company says you can always review the photos and videos and delete what you don’t want. And you can also <span>program the robot to avoid filming in specific rooms. This is useful if, say, you don’t want Kuri filming in the bathroom.</span></span></p>
<p><span><span>Kuri knows the layout of your house because when you first bring the robot home, it uses a laser sensor array to create a map. It then uses that map for reference, so it keeps track of its location. This map makes it easy to tell Kuri where to go. You can say “Kuri, go to the kitchen,” and it will know exactly how to get there.</span></span></p>
<p><span><span>But don’t expect Kuri to respond in English—or in any other human language. Mayfield says Kuri speaks “robot,” which makes it extra cute. Kuri chirps, beeps, and bloops. This helps with a big challenge in human-robot interaction. When robots try to communicate in human language, they often make mistakes that frustrate the user. Even when it’s not annoying, sometimes it’s just uncanny and creepy. Mayfield reduced the chance of that confusion by giving Kuri a language of its own. However, if you need Kuri to deliver a message to someone across the house, you can use the app to talk through Kuri’s speakers: “Hey! That’s my ice cream!” </span></span></p>
<p><span><span>Other new features include </span></span><span>track wheels that collapse up into the robot’s body to absorb shock and help it go over obstacles like door thresholds<span>, and an ergonomic handle to make it easier to transport. </span></span></p>
<p>The first production wave is sold out and will ship to customers in December. The next shipment, which is available for pre-order now <span>for US $799</span>, will go out in spring 2018.</p>
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/M9PrIFRJ-5Zdv3OJ1.js" width="100%"></iframe><br/><p>Let’s say you are the program manager of a very large, complex system. Perhaps it’s an aircraft, or a building, or a communications network. Your system is valued at over US $500 million. Could you imagine being told that you won’t ever be able to maintain it? That once it’s operational, it will never be inspected, repaired, or upgraded with new hardware?</p>
<p>Welcome to the world of satellite building. After a satellite is launched, it is on a one-way journey to disrepair and obsolescence, and there is little anyone can do to alter that path. Faults (which are called anomalies in the space business) can only be diagnosed remotely, using data and inferential reasoning. Software fixes and upgrades may be possible, but the nuts and bolts remain untouched. The upshot: Even if a satellite is operating well, it could lose its state-of-the-art status just a few years into a typical 15-year lifetime.</p>
<p>If governments and private companies could actively repair and revitalize their satellites in geosynchronous orbit—and move them to new orbits as needed—they could extend the lifespans of their investments and substantially defer the cost of building and launching replacements.</p>
<p>To that end, the U.S. Defense Advanced Research Projects Agency (DARPA) has sponsored a project to develop a robotic servicing spacecraft that can work on satellites that were never designed to be repaired—which is pretty much all of the ones in orbit today. The public-private partnership, called the <span><a href="https://www.darpa.mil/program/robotic-servicing-of-geosynchronous-satellites">Robotic Servicing of Geosynchronous Satellites</a></span><span> (RSGS) program, builds on a decade of work by DARPA and the U.S. Naval Research Laboratory, as well as the efforts of university researchers and space agencies around the world.</span></p>
<p>When RSGS launches in the early 2020s, its robot arm could move GEO satellites to new orbits, fix stuck solar panels, and perform other important repairs. Independently, NASA plans to launch, around the same time, a robotic mission called <a href="https://ssco.gsfc.nasa.gov/restore-L.html">Restore-L</a>; its aim is to refuel and relocate a government-owned satellite now in low Earth orbit.</p>
<p>If successful, these two missions will push the limits of automation and robotic operation in space. They could be the first steps toward space construction projects such as vast solar arrays that can beam energy back to Earth, robots that could mine asteroids and deflect those that pose a danger to Earth, and many other applications that would revolutionize the way we operate beyond the bounds of Earth’s atmosphere.</p>
<p>Read more: <a href="/aerospace/satellites/inside-darpas-mission-to-send-a-repair-robot-to-geosynchronous-orbit">Inside DARPA’s Mission to Send a Repair Robot to Geosynchronous Orbit</a></p>]]></description><pubDate>Sat, 29 Jul 2017 13:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/robot-mechanic-could-prevent-satellites-from-becoming-space-junk</guid><category>Type-video</category><category>Space-robot</category><category>Rsgs</category><category>Robotic-exploration</category><category>Geosynchronous-orbit</category><category>Satellites</category><category>Robot</category><category>Darpa</category><category>Satellite</category><dc:creator>Alyssa Pagano</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/photo-darpa.jpg?id=25583639&amp;width=980"/></item><item><title>Robots Learn to Speak Body Language</title><link>https://spectrum.ieee.org/robots-learn-to-speak-body-language</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://www.youtube.com/embed/FnuhbbLGEjw?rel=0" width="100%"></iframe><br/><p>If your friend says she feels relaxed, but you see that her fists are clenched, you may doubt her sincerity. Robots, on the other hand, might take her word for it. Body language says a lot, but even with advances in computer vision and facial recognition technology, robots struggle to notice subtle body movement and can miss important social cues as a result.</p><p>Researchers at Carnegie Mellon University developed a <a href="https://www.cmu.edu/news/stories/archives/2017/july/computer-reads-body-language.html">body-tracking system</a> that might help solve this problem. Called <a href="https://github.com/CMU-Perceptual-Computing-Lab/openpose">OpenPose</a>, the system can track body movement, including hands and face, in real time. It uses computer vision and machine learning to process video frames, and can even keep track of multiple people simultaneously. This capability could ease human-robot interactions and pave the way for more interactive virtual and augmented reality as well as intuitive user interfaces.
</p><p>
	One notable feature of the OpenPose system is that it can track not only a person’s head, torso, and limbs but also individual fingers. To do that, the researchers used CMU’s <a href="/tech-talk/computing/software/camerafilled-dome-recreates-full-3d-motion-scenes">Panoptic Studio</a>, a dome lined with 500 cameras, where they captured body poses at a variety of angles and then used those images to build a data set.
</p><p>
	They then passed those images through what is called a keypoint detector to identify and label specific body parts. The software also learns to associate the body parts with individuals, so it knows, for example, that a particular person’s hand will always be close to his or her elbow. This makes it possible to track multiple people at once.
</p><p>
	The images from the dome were captured in 2D. But the researchers took the detected keypoints and triangulated them in 3D to help their body-tracking algorithms to understand how each pose appears from different perspectives. With all of this data processed, the system can determine how the whole hand looks when it’s in a particular position, even if some fingers are obscured.
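The 2D-to-3D step described here is standard multi-view triangulation. A minimal sketch of the idea (illustrative, not the CMU code), using the direct linear transform with two assumed camera projection matrices:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its 2D projections in two calibrated
    views via the direct linear transform (DLT)."""
    # Each view contributes two linear constraints A @ X = 0 on the
    # homogeneous 3D point X.
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)  # null space of A holds the solution
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point through a 3x4 camera matrix."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two hypothetical cameras: one at the origin, one shifted 1 m along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 4.0])
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

With 500 cameras, as in the Panoptic Studio, the same least-squares idea simply stacks more rows into the system.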
</p><p>
	Now that the system has this data set to draw from, it can run with only <a href="https://www.youtube.com/watch?v=LrCO8QcXfAY">one camera and a laptop</a>. It no longer requires the camera-lined dome to determine body poses, making the technology mobile and accessible. The researchers have already released <a href="https://github.com/CMU-Perceptual-Computing-Lab/openpose">their code</a> to the public to encourage experimentation.
</p><p>
	They say this technology could be applied to all sorts of interactions between humans and machines. It could play a huge role in VR experiences, allowing finer detection of the user’s physical movement without any additional hardware, like stick-on sensors or <a href="/view-from-the-valley/at-work/start-ups/for-precise-hand-tracking-in-virtual-reality-start-with-a-magnetic-field">gloves</a>.
</p><p>
	It could also facilitate more natural interactions with a home robot. You could tell your robot to “pick that up,” and it could immediately understand what you’re pointing at. By perceiving and interpreting your physical gestures, the robot may even learn to read emotions by tracking body language. So when you’re silently crying with your face in your hands because a robot has taken your job, it might offer you a tissue.
</p>]]></description><pubDate>Sat, 22 Jul 2017 13:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/robots-learn-to-speak-body-language</guid><category>Type-video</category><category>Robot-software</category><category>Cmu</category><category>Panoptic-studio</category><category>Computer-vision</category><category>Software</category><category>Machine-learning</category><dc:creator>Alyssa Pagano</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=29308831&amp;width=980"/></item><item><title>5G Bytes: Beamforming Explained</title><link>https://spectrum.ieee.org/5g-bytes-beamforming-explained</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/mNGnwKuO-5Zdv3OJ1.js" width="100%"></iframe><br/><p>Today’s mobile users want faster data speeds and more reliable service. The next generation of wireless networks—5G—promises to deliver that, and much more. Right now, though, <a href="/tech-talk/telecom/wireless/5-myths-about-5g">5G</a> is still in the planning stages, and companies and industry groups are working together to figure out exactly what it will be. But they all agree on one matter: As the number of mobile users and their demand for data rises, 5G must handle far more traffic at much higher speeds than the base stations that make up today’s cellular networks. Beamforming is one of the burgeoning technologies that will help get us there.</p>
<p><a href="https://en.wikipedia.org/wiki/Beamforming">Beamforming</a> is a traffic-signaling system for cellular base stations that identifies the most efficient data-delivery route to a particular user, and it reduces interference for nearby users in the process. Depending on the situation and the technology, there are several ways to implement it in 5G networks.</p>
<div class="ieee-sidebar-small">
<a href="/video/telecom/wireless/everything-you-need-to-know-about-5g"> <h3>Watch: Everything You Need to Know About 5G</h3> <p class="shortcode-media shortcode-media-rebelmouse-image rm-container-resized rm-resized-container-50">
<img alt="Everything You need to know about 5g" class="rm-shortcode" data-rm-shortcode-id="9fb05c064d727e09c752bbc2c5e81f90" data-rm-shortcode-name="rebelmouse-image" id="e3540" loading="lazy" src="https://spectrum.ieee.org/media-library/everything-you-need-to-know-about-5g.jpg?id=25582804&width=980"/>
</p> </a>
</div>
<p>Beamforming can help <a href="/video/telecom/wireless/5g-bytes-massive-mimo-explained">massive MIMO arrays</a>, which are base stations arrayed with dozens or hundreds of individual antennas, to make more efficient use of the spectrum around them. The primary challenge for massive MIMO is to reduce interference while transmitting more information from many more antennas at once. At massive MIMO base stations, signal-processing algorithms plot the best transmission route through the air to each user. Then they can send individual data packets in many different directions, bouncing them off buildings and other objects in a precisely coordinated pattern. By choreographing the packets’ movements and arrival time, beamforming allows many users and antennas on a massive MIMO array to exchange much more information at once.</p>
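The per-antenna phase coordination described above can be sketched with a simple uniform linear array: a progressive phase shift across the elements steers the array's peak response toward a chosen angle. The element count, spacing, and steering angle below are illustrative assumptions, not figures from the article:

```python
import numpy as np

N = 16            # antenna elements (illustrative assumption)
d = 0.5           # element spacing in wavelengths (illustrative)
steer_deg = 25.0  # direction we want the beam to point

n = np.arange(N)
# Progressive phase shifts: delay each element so its wave adds
# constructively in the steering direction.
weights = np.exp(-2j * np.pi * n * d * np.sin(np.deg2rad(steer_deg)))

# Evaluate the normalized array factor across all look angles.
look = np.deg2rad(np.linspace(-90.0, 90.0, 721))
steering = np.exp(2j * np.pi * np.outer(np.sin(look), n) * d)
pattern = np.abs(steering @ weights) / N

peak_deg = float(np.degrees(look[np.argmax(pattern)]))
```

The pattern peaks at the steering angle while off-axis directions see much weaker side lobes, which is the interference reduction the article describes.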
<p>For <a href="/video/telecom/wireless/5g-bytes-millimeter-waves-explained">millimeter waves</a>, which are high-frequency waves expected to play a key role in 5G networks, beamforming is primarily used to address a different set of problems: Cellular signals are easily blocked by objects and tend to weaken over long distances. In this case, beamforming can help by focusing a signal in a concentrated beam that points only in the direction of a user, rather than broadcasting in many directions at once. This approach can strengthen the signal’s chances of arriving intact and reduce interference for everyone else.</p>
<p>With beamforming and other 5G technologies, engineers hope to build the wireless network that future smartphone users, VR gamers, and autonomous cars will rely on every day. Already, researchers and companies have set high expectations for 5G by promising ultralow latency and record-breaking data speeds for consumers. If they can solve the remaining challenges, and figure out how to make all these systems work together, ultrafast 5G service could reach consumers in the next five years.</p>]]></description><pubDate>Sat, 15 Jul 2017 14:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/5g-bytes-beamforming-explained</guid><category>Type-video</category><category>Beamforming</category><category>Networks</category><category>5g</category><category>Wireless</category><dc:creator>Kristen Clark</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/beamforming-illustration.jpg?id=25583525&amp;width=980"/></item><item><title>Underwater Robots Learn a New Language, JANUS</title><link>https://spectrum.ieee.org/underwater-robots-learn-a-new-language-janus</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://www.youtube.com/embed/Q-kQWr_heqg?rel=0" width="100%"></iframe><br/><p>For decades, global standards defining Wi-Fi and cellular networks have allowed people to exchange data over the air. But those technologies are worthless below the waves, and until now there have been no such standards for underwater communications.</p>
<p><span>“We live in a time of wild west communications underwater,” says João Alves, a principal scientist for NATO.</span></p>
<p>Now, Alves and other NATO researchers have <span>established the first international standard</span> for underwater communications. Named <a href="https://ieeexplore.ieee.org/document/7017134/"><span>JANUS</span></a>, after the <a href="https://www.britannica.com/topic/Janus-Roman-god"><span>Roman god of gateways</span></a>, it creates a common protocol for an acoustic signal with which underwater systems can connect.</p>
<p>Acoustics has long been a popular medium for underwater communications. Generally, optical signals can deliver high data rates underwater at distances of up to 100 meters, while sound waves cover much greater distances at lower data rates.</p>
<p>The main role of JANUS is to bring today’s acoustic systems into sync with one another. It does this in part by defining a common frequency—<a href="https://www.youtube.com/watch?v=rSW5eDoZgAw"><span>11.5 kilohertz</span></a>—over which all systems can announce their presence. Once two systems make contact through JANUS, they may decide to switch to a different frequency or protocol that could deliver higher data rates or travel further.</p>
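As a loose illustration of signaling tones in the band around 11.5 kHz, the toy binary frequency-shift keying sketch below renders bits as two alternating tones. The sample rate, tone offset, and bit duration are invented for illustration; the real JANUS specification defines its own frequency-hopped scheme with different parameters:

```python
import numpy as np

FS = 96_000        # audio sample rate, Hz (illustrative)
CENTER = 11_500.0  # near the common JANUS announcement frequency, Hz
SHIFT = 160.0      # tone offset for 0 vs. 1 (invented for illustration)
BIT_S = 0.01       # seconds per bit (invented for illustration)

def bfsk(bits):
    """Render a bit sequence as two alternating tones around CENTER.
    A toy stand-in, not the actual JANUS modulation."""
    t = np.arange(int(FS * BIT_S)) / FS
    tones = {
        0: np.sin(2 * np.pi * (CENTER - SHIFT) * t),
        1: np.sin(2 * np.pi * (CENTER + SHIFT) * t),
    }
    return np.concatenate([tones[b] for b in bits])

signal = bfsk([0, 1, 1, 0])
```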
<p>In this way, Alves compares JANUS to the English language—two visitors to a foreign country may speak English to one another before realizing they are both native Spanish speakers, and switch to their native tongue.</p>
<div class="ieee-pullquote">
  “We live in a time of wild west communications underwater.”
</div>
<p>The JANUS standard was developed by Alves’ team at NATO’s <a href="https://www.cmre.nato.int/"><span>Centre for Maritime Research and Experimentation</span></a> in La Spezia, Italy, and sponsored by NATO’s <a href="https://www.act.nato.int/"><span>Allied Command Transformation</span></a>. It is the first underwater communications standard to be defined by an international body.</p>
<p>To create JANUS, Alves’ team relied on the Littoral Ocean Observatory Network, a collection of acoustic tripods that NATO researchers have placed on the seafloor in the harbor of La Spezia, Italy. In another series of tests, researchers aboard the research vessel <em>Alliance</em>, a NATO ship operated by the Italian Navy, measured the performance of JANUS signals along the surface of the ocean.</p>
<p>Once deployed, aquatic systems could use JANUS to send data directly to each other, or to “gateway buoys” bobbing on the water’s surface. The buoys could then use radio waves to relay that data to nearby control centers.</p>
<p>Based on their work, Alves says submarines could also use JANUS to issue calls for help to ships and rescue crews. “Using an open scheme like JANUS to issue distress calls would increase incredibly the chances of those being picked up,” he says.</p>]]></description><pubDate>Sat, 08 Jul 2017 15:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/underwater-robots-learn-a-new-language-janus</guid><category>Type-video</category><category>Internet</category><category>Marine</category><category>Underwater-vehicles</category><category>Underwater-standard</category><category>Telecom</category><category>Underwater-communications</category><category>Acoustics</category><category>Robots</category><category>Signal</category><category>Nato</category><category>Standards</category><dc:creator>Alyssa Pagano</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/image.jpg?id=33332027&amp;width=980"/></item><item><title>Anki Makes Programming Easy With Drag and Drop Coding</title><link>https://spectrum.ieee.org/anki-makes-programming-easy-with-drag-and-drop-coding</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/K8uQx5Jh-5Zdv3OJ1.js" width="100%"></iframe><br/><p>When <a href="https://anki.com/en-us/cozmo">Anki introduced Cozmo</a> almost exactly one year ago, <a href="/automaton/robotics/home-robots/anki-cozmo-robotic-toy">we started off with a bit of skepticism</a>, and a feeling that Anki was going slightly overboard with the kinds of promises that it was making for this cute and capable little robot. What was more exciting to us was when Anki followed up a few weeks later with <a href="/automaton/robotics/robotics-software/anki-sdk-cozmo-robot">Cozmo’s software development kit</a>, or SDK, allowing access to a variety of very sophisticated features through relatively simple lines of code.</p>
<p>This week, <a href="https://anki.com/en-us/cozmo/code-lab">Anki announced Code Lab</a>, which takes that SDK and adds a graphical drag-and-drop interface that makes it incredibly simple to get Cozmo to do complex tasks involving vision, manipulation, and decision making, even if you have zero programming experience (like me). It’s fun, it’s easy, it’s affordable, and last week, I tried it out for myself, with a little help from Anki co-founder and president <a href="https://anki.com/en-us/company">Hanns Tappeiner</a>.</p>
<p>As an absolute amateur, even an easy SDK is over my head. But Anki’s new Code Lab is designed to be used by anyone—including 5-year-old children. Cozmo is charming when it learns your name and face. Code Lab draws you in with coding challenges that are more games than lessons. And now even I can program a facial recognition app.</p>
<p>Read More: <a href="/automaton/robotics/diy/anki-code-lab-brings-sophisticated-graphical-programming-to-cozmo-robot">Anki's Code Lab Brings Sophisticated Graphical Programming to Cozmo Robot</a></p>]]></description><pubDate>Sat, 01 Jul 2017 13:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/anki-makes-programming-easy-with-drag-and-drop-coding</guid><category>Type-video</category><category>Educational-robots</category><category>Scratch</category><category>Sdk</category><category>Python</category><category>Education</category><category>Robot-toys</category><category>Personal-robots</category><category>Cozmo</category><category>Software</category><category>Anki</category><category>Robots-for-kids</category><dc:creator>Celia Gorman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/with-an-easy-to-use-interface-based-on-scratch-you-can-now-command-cozmo-to-do-complex-tasks-without-any-programming-experience.jpg?id=25583462&amp;width=980"/></item><item><title>Watch This Robot Navigate Like a Rat</title><link>https://spectrum.ieee.org/watch-this-robot-navigate-like-a-rat</link><description><![CDATA[
<iframe frameborder="0" height="100%" scrolling="no" src="https://spectrum.ieee.org/res/scraper/embed/?jwplayer_video_url=//content.jwplatform.com/players/x8R35UQM-5Zdv3OJ1.js" width="100%"></iframe><br/>
<p>Rats are nimble navigators, able to find their way around, under, and over obstacles, and through the tightest spaces. Roboticists have long dreamed of giving their creations similar navigation skills. To be useful in the real world, robots must be able to find their way around on their own. Some are already learning to do that in <a href="/automaton/robotics/home-robots/irobot-brings-visual-mapping-and-navigation-to-the-roomba-980">homes,</a> <a href="/automaton/robotics/industrial-robots/cobalt-robotics-introduces-mobile-security-robot">offices</a>, <a href="/automaton/robotics/industrial-robots/clearpath-otto-can-autonomously-haul-a-ton-of-stuff">warehouses</a>, <a href="/automaton/robotics/robotics-hardware/indoor-robots-for-commercial-spaces">hospitals</a>, and <a href="/video/robotics/industrial-robots/saviokes-robot-butler-brings-you-room-service">hotels</a>—and in the case of <a href="/transportation/self-driving">self-driving cars</a>, entire cities. Despite that progress, robots still struggle to perform the tasks for which they’re designed even under mildly challenging conditions.</p>
<p>At the <a href="https://www.qut.edu.au/">Queensland University of Technology</a>, in Brisbane, Australia, <a href="https://wiki.qut.edu.au/display/cyphy/Michael+Milford">Michael Milford</a> and his collaborators have spent the past 14 years honing a robot navigation system modeled on the brains of rats. This biologically inspired approach, they hope, could help robots navigate dynamic environments without requiring advanced, costly sensors and computationally intensive algorithms.</p>
<p>Read More: <a href="/robotics/robotics-software/why-ratbrained-robots-are-so-good-at-navigating-unfamiliar-terrain"><em>Why Rat-Brained Robots Are So Good at Navigating Unfamiliar Terrain</em></a></p>]]></description><pubDate>Wed, 21 Jun 2017 20:00:00 +0000</pubDate><guid>https://spectrum.ieee.org/watch-this-robot-navigate-like-a-rat</guid><category>Type-video</category><category>Robot-software</category><category>Industrial-robots</category><category>Ratslam</category><category>Brain-inspired-robots</category><category>Biomimicry</category><category>Robot-navigation</category><category>Autonomous-systems</category><category>Machine-learning</category><dc:creator>Jean Kumagai</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/specialized-neurons-in-a-rats-brain-may-be-the-key-to-autonomous-robot-navigation.jpg?id=25583288&amp;width=980"/></item></channel></rss>