<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" media="screen" href="/~d/styles/rss2full.xsl"?><?xml-stylesheet type="text/css" media="screen" href="http://feeds.feedburner.com/~d/styles/itemcontent.css"?><rss xmlns:media="http://search.yahoo.com/mrss/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:feedburner="http://rssnamespace.org/feedburner/ext/1.0" version="2.0">
<channel>
<title>IEEE Spectrum Recent Content full text</title>
<link>http://spectrum.ieee.org</link>
<description>IEEE Spectrum Recent Content headlines</description>
<pubDate>Wed, 12 Jul 2017 14:00:00 GMT</pubDate>

<atom10:link xmlns:atom10="http://www.w3.org/2005/Atom" rel="self" type="application/rss+xml" href="http://feeds.feedburner.com/IeeeSpectrumFullText" /><feedburner:info uri="ieeespectrumfulltext" /><atom10:link xmlns:atom10="http://www.w3.org/2005/Atom" rel="hub" href="http://pubsubhubbub.appspot.com/" /><item>
<title>AI Creates Fake Obama</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/vkFZA_hA2FU/ai-creates-fake-obama</link>
<description>Videos of Barack Obama made from existing audio, video of him</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Videos of Barack Obama made from existing audio, video of him<figure>
<img src="http://spectrum.ieee.org/image/MjkyNTQ4Nw.jpeg"/>
<figcaption>Photo: University of Washington.</figcaption>
</figure>
<div>
<style type="text/css">&lt;!--
		@page { margin: 0.79in }
		P { margin-bottom: 0.08in }
		A:link { so-language: zxx }
	--&gt;
</style>
<p>Artificial intelligence software could generate highly realistic fake videos of former president Barack Obama using existing audio and video clips of him,<a shape="rect" href="http://grail.cs.washington.edu/projects/AudioToObama/siggraph17_obama.pdf"> a new study</a> [PDF] finds.</p>
<p/>
<p>Such work could one day help generate digital models of a person for virtual reality or augmented reality applications, researchers say.</p>
<p/>
<p>Computer scientists at the University of Washington previously revealed they could generate <a shape="rect" href="http://spectrum.ieee.org/tech-talk/consumer-electronics/gaming/celebrity-digital-dopplegangers">digital doppelgängers</a> of anyone by analyzing images of them collected from the Internet, from celebrities such as Tom Hanks and Arnold Schwarzenegger to public figures such as George W. Bush and Barack Obama. Such work suggested it could one day be relatively easy to create such models of anybody, given the untold numbers of digital photos of people on the Internet.</p>
<p/>
<p>The researchers chose Obama for their latest work because there were hours of high-definition video of him available online in the public domain. The research team had a neural net analyze millions of frames of video to determine how elements of Obama's face moved as he talked, such as his lips and teeth and wrinkles around his mouth and chin.</p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/MVBe6_o4cMI" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>
<span>In an artificial neural network, components known as artificial neurons are fed data, and work together to solve a problem such as identifying faces or recognizing speech. The neural net can then alter the pattern of connections among those neurons to change the way they interact, and the network tries solving the problem again. Over time, the neural net learns which patterns are best at computing solutions, an AI strategy that mimics the human brain.</span>
</p>
<p/>
<p>
<span>In the new study, the neural net learned which mouth shapes were linked to various sounds. The researchers took audio clips and dubbed them over the original sound of a video. They then took the mouth shapes matching the new audio and grafted and blended them onto the video. Essentially, the researchers synthesized videos in which Obama lip-synched words he had said, in some cases decades earlier.</span>
</p>
<p/>
<p>
<span>The researchers note that similar previous research involved filming people saying sentences over and over again to map which mouth shapes were linked to various sounds, which is expensive, tedious, and time-consuming. In contrast, this new work can learn from millions of hours of video that already exist on the Internet or elsewhere.</span>
</p>
<p/>
<p>
<span>One potential application for this new technology is improving videoconferencing, says study co-author <a shape="rect" href="http://homes.cs.washington.edu/~kemelmi/">Ira Kemelmacher-Shlizerman</a> at the University of Washington. Although teleconferencing video feeds may stutter, freeze, or suffer from low resolution, the audio feeds often still work, so future videoconferencing systems might transmit only audio and use this software to reconstruct what speakers looked like while they talked. This work could also help people talk with digital copies of a person in virtual reality or augmented reality applications, Kemelmacher-Shlizerman says.</span>
</p>
<p/>
<p>
<span>The researchers note their videos are not yet perfect. For example, when Obama tilted his face away from the camera in a target video, imperfect 3-D modeling of his face could cause parts of his mouth to be superimposed outside the face, onto the background.</span>
</p>
<p/>
<p>
<span>In addition, the research team notes their work did not model emotions, and so Obama's facial expressions in the output videos could appear too serious for casual speeches or too happy for serious speeches. However, they suggest that it would be interesting to see if their neural network could learn to predict emotional states from audio to produce corresponding visuals.</span>
</p>
<p>The researchers were careful not to generate videos where they put words in Obama's mouth that he did not at some other time utter himself. However, such fake videos are “likely possible soon,” says study lead author <a shape="rect" href="http://homes.cs.washington.edu/~supasorn/">Supasorn Suwajanakorn</a>, a computer scientist at the University of Washington.</p>
<p>However, this new research also suggests ways to detect fake videos in the future. For instance, the video manipulation the researchers practiced can blur mouths and teeth. “This may be not noticeable by human eyes, but a program that compares the blurriness of the mouth region to the rest of the video can easily be developed and will work quite reliably,” Suwajanakorn says.</p>
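<p>Suwajanakorn describes that detector only in outline. As a rough illustration of the idea (our own sketch, not the study's code), the snippet below compares the sharpness of a mouth region against the whole frame, using the variance of a discrete Laplacian as a blur proxy; the mouth coordinates and the scoring threshold are placeholders that would come from a face detector in practice.</p>

```python
import numpy as np

def laplacian_variance(gray):
    """Sharpness proxy: variance of a discrete Laplacian (higher = sharper)."""
    lap = (np.roll(gray, 1, axis=0) + np.roll(gray, -1, axis=0) +
           np.roll(gray, 1, axis=1) + np.roll(gray, -1, axis=1) - 4 * gray)
    return lap.var()

def mouth_blur_score(frame_gray, mouth_box):
    """Ratio of mouth-region sharpness to whole-frame sharpness.
    Scores well below 1.0 mean the mouth is blurrier than the rest of
    the frame -- one possible sign of a synthesized video."""
    top, left, bottom, right = mouth_box  # placeholder: from a face detector
    mouth = frame_gray[top:bottom, left:right]
    return laplacian_variance(mouth) / laplacian_variance(frame_gray)

# Demo on synthetic data: flatten the "mouth" region of a noisy frame
# to mimic the blurring that video manipulation can introduce.
rng = np.random.default_rng(0)
frame = rng.random((100, 100))
frame[60:80, 30:70] = frame[60:80, 30:70].mean()  # simulate a blurred mouth
score = mouth_blur_score(frame, (60, 30, 80, 70))
```

<p>On an unmanipulated frame the ratio should hover near 1.0; a real detector would also have to account for compression blur and lighting before flagging anything.</p>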
<p/>
<p>
<span>The researchers speculated that the link between mouth shapes and utterances may be to some extent universal for people. This suggests that a neural network trained on Obama and other public figures could be adapted to work for many different people.</span>
</p>
<p/>
<p>
<span>The research was funded by Samsung, Google, Facebook, Intel, and the University of Washington. The scientists will detail </span>
<span lang="zxx">
<a shape="rect" href="http://grail.cs.washington.edu/projects/AudioToObama/siggraph17_obama.pdf">
<span>their findings</span>
</a>
</span>
<span> [PDF] on Aug. 2 at the <a shape="rect" href="http://www.siggraph.org/">SIGGRAPH </a>conference in Los Angeles.</span>
</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/vkFZA_hA2FU" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Wed, 12 Jul 2017 14:00:00 GMT</pubDate>
<dc:creator>Charles Q. Choi</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/tech-talk/robotics/artificial-intelligence/ai-creates-fake-obama</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyNTUxMw.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyNTUxMQ.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/tech-talk/robotics/artificial-intelligence/ai-creates-fake-obama</feedburner:origLink></item>
<item>
<title>Iran's Newest Robot Is an Adorable Dancing Humanoid</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/L7JcniELBSU/iran-surena-mini</link>
<description>University of Tehran roboticists have built a dancing, karate-chopping little humanoid called Surena Mini</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>University of Tehran roboticists have built a dancing, karate-chopping little humanoid called Surena Mini<figure>
<img src="http://spectrum.ieee.org/image/MjkyNTQ0OA.jpeg"/>
<figcaption>Photo: University of Tehran</figcaption>
<figcaption>Iranian researchers have recently unveiled a new robot called Surena Mini.</figcaption>
</figure>
<div>
<p>Over the last several years, a team of roboticists at the University of Tehran has been working on <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/humanoids/iran-humanoid-robot-surena-iii">increasingly large and complex life-size humanoids</a>. For their latest project, however, the Iranian researchers decided to build something smaller—and cuter.</p>
<p>Surena Mini is a knee-high robot with a sleek 3D-printed body, articulated limbs, and a round head with two camera-eyes. Twenty small servomotors power its arms, legs, and neck, allowing the little robot to walk, gesture, and dance:</p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/MhqqzMWMt0o" width="620" frameborder="0" height="349"/>
</p>
<p>“The main purpose of this robot is to provide researchers and students with a reliable robotic platform for educational and research applications,” Aghil Yousefi-Koma, a professor of mechanical engineering at the University of Tehran, told <em>IEEE Spectrum.</em>
</p>
<p>
<span>He added that his group also has plans to offer the robot “for helping autistic and deaf children.”</span>
</p>
<div/>
<p>
<span>A team of 15 researchers at the University of Tehran’s Center for Advanced Systems and Technologies worked for over a year to design and build Surena Mini, which is 50 centimeters tall and weighs 3.4 kilograms.</span>
</p>
<p>
<span>Packed inside the robot are a compact computer with an Intel Core CPU, cameras and infrared sensors, speakers, and an IMU, or </span>
<span>inertial measurement unit.</span> Its hands aren’t designed for grasping objects, but Surena Mini can push on small things—or karate-chop them:</p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/h69aiaV0NCU" width="620" frameborder="0" height="349"/>
</p>
<p>A little over a year ago, <span>the same</span> group unveiled <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/humanoids/iran-humanoid-robot-surena-iii">Surena III, an advanced adult-size humanoid</a> designed for researching bipedal locomotion, human-robot interaction, and other challenges in robotics.</p>
<p>Surena III, <span>equipped with cameras, 3D sensor, and a computer running </span>
<a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/robotics-software/celebrating-9-years-of-ros">ROS, or Robot Operating System</a>
<span>,</span> was able to pick up bottles, imitate a person’s gestures, and stand on one foot.</p>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkyNTQ0NA.jpeg" alt="Surena III, a life-size humanoid robot developed by Iranian researchers at the University of Tehran"/>
<figcaption class="hi-cap">
  Photos: University of Tehran
 </figcaption>
<figcaption>
  Iranian researchers unveiled Surena III in 2015. The robot, almost 2 meters tall and weighing 98 kilograms, can kick a ball, go up a ramp, and walk down a set of steps.
 </figcaption>
</figure>
<p>The Iranian roboticists plan to continue working on Surena III, but they also want to explore the possibility of creating marketable products based on their research, <span>Professor Yousefi-Koma explained, </span>and one of the ideas they had was building a “kid-size version of Surena.”</p>
<p>Surena Mini’s overall size and design appear similar to that of other small humanoids like <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/humanoids/new-next-gen-nao-is-now-the-new-nao">Nao</a>, developed by <a shape="rect" href="http://spectrum.ieee.org/robotics/home-robots/how-aldebaran-robotics-built-its-friendly-humanoid-robot-pepper">French robotics company Aldebaran (now SoftBank Robotics)</a>, and Robotis OP2, <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/humanoids/darwin-op-humanoid-robot-demo">created by U.S. and South Korean roboticists</a>.</p>
<p>But the Iranian robot has yet to show that it has some of the same skills already demonstrated by those other humanoids. <a shape="rect" href="http://spectrum.ieee.org/tag/Nao">Nao</a> and <a shape="rect" href="http://spectrum.ieee.org/tag/darwin-op">Robotis OP2</a> have been used in research labs, <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/humanoids/aldebaran-robotics-nao-robot-autism-solution-for-kids">schools</a>, and <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/artificial-intelligence/robot-companions-to-befriend-sick-kids-at-european-hospital">hospitals</a> for nearly a decade. <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/humanoids/virginia-techs-romela-rocks-robocup-2011">Both are</a>
<a shape="rect" href="https://youtu.be/HV0WNO1c_4I?t=1m8s">also used</a> in the <a shape="rect" href="http://www.robocup.org/">RoboCup</a> robot soccer competition.</p>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkyNTQ0NQ.jpeg" alt="A team of Iranian researchers from the University of Tehran designed and built the humanoid robot Surena Mini"/>
<figcaption class="hi-cap">
  Photo: University of Tehran
 </figcaption>
<figcaption>
  Researchers from the University of Tehran’s Center for Advanced Systems and Technologies, led by Professor Aghil Yousefi-Koma (standing between the robots with red and green feet/hands), worked for over a year to design and build Surena Mini.
 </figcaption>
</figure>
<p>Despite their small size, these little robots are pricey. <a shape="rect" href="http://www.robotlab.com/store/nao-standard-edition">Nao</a> and <a shape="rect" href="http://www.robotis.us/robotis-OP2/">Robotis OP2</a> each sell for nearly US $10,000. Professor Yousefi-Koma said Surena Mini will be available for 260,000,000 Iranian rials, or $8,000, but he hopes the cost will come down if the robot can be produced in large batches.</p>
<p>One of the biggest challenges of the project, he explained, has been implementing features like face detection and voice recognition, which would let the robot perform with a greater level of autonomy. His team has developed such capabilities for their large robots, but replicating them using Surena Mini’s limited hardware is a trickier task.</p>
<p>To program the robot, advanced users can modify the source code to create different behaviors. But the researchers wanted to make Surena Mini accessible to less experienced users as well. So they created a programming environment with a graphical interface “designed to be attractive and user-friendly,” Professor Yousefi-Koma said.</p>
<p>“It gives users full access to all available features of the robot,” he added, “so even beginners can make the robot walk and move its arms and head.”</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/L7JcniELBSU" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Wed, 12 Jul 2017 13:12:00 GMT</pubDate>
<dc:creator>Erico Guizzo</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/automaton/robotics/humanoids/iran-surena-mini</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyNTQ2Mw.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyNTQ2MQ.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/automaton/robotics/humanoids/iran-surena-mini</feedburner:origLink></item>
<item>
<title>The Audi A8: the World's First Production Car to Achieve Level 3 Autonomy</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/2AtEtqk_s4Q/the-audi-a8-the-worlds-first-production-car-to-achieve-level-3-autonomy</link>
<description>It's also the first to sport lidar</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>It's also the first to sport lidar<figure>
<img src="http://spectrum.ieee.org/image/MjkyNTIwNg.octet-stream"/>
</figure>
<div>
<p>The 2018 Audi A8, just unveiled in Barcelona, counts as the world’s first production car to offer Level 3 autonomy.</p>
<p>Level 3 means the driver needn’t supervise things at all, so long as the car stays within guidelines. Here that involves driving no faster than 60 kilometers per hour (37 mph), which is why Audi calls the feature AI Traffic Jam Pilot.</p>
<p>Go ahead, Audi’s saying, read your newspaper or just zone out while traffic creeps along. Take a look at the company’s promotional video. Beginning around the 3:55 mark you’ll see an indulgent father who ends up horsing around with his kid in the back seat. When the car up ahead stops, the A8’s AI hits the brakes in time to avoid rear-ending it.</p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/U1U8w-I_tAY" width="620" frameborder="0" height="349"/>
</p>
<p>Other cars now on the market may physically allow the driver to zone out, too, but not without squawking a lot. “Put your hand on the wheel within five seconds, Buster, or I’m pulling over to the side of the road,” they’ll say, in so many words.</p>
<p>Mercedes-Benz sounds an alarm if you take your hand off the wheel. Tesla makes you <a shape="rect" href="http://spectrum.ieee.org/cars-that-think/transportation/self-driving/tesla-robocar-to-driver-accept-the-liability-buster">hit the turn signal</a> to show you want to change lanes. The new Cadillac CT6 goes so far as to <a shape="rect" href="http://spectrum.ieee.org/cars-that-think/transportation/self-driving/cadillac-adds-level-2-highway-autonomy-with-super-cruise">monitor your eye movements</a> with infrared cameras to make sure you’re paying attention to the road. That’s an emphatic statement of the car’s status: Level 2.</p>
<p>To be sure, the A8 also monitors the driver, even while the traffic jam persists, and continues to do so as the speed edges up over the limit. If the driver falls asleep, it’ll wake him up; if it can’t get his attention, it will stop the car.</p>
<p>There’s no one feature that seems to be behind the company’s decision to go up to Level 3, but there are certainly a lot of new technologies. There are computers from Nvidia and other firms, an image processor from Mobileye, and a really huge array of sensors: 12 ultrasound sensors, five cameras, five radars, one infrared camera for night vision. Most notable of all, there’s lidar—the first ever offered on a production car. The unit, a forward-looking one, <a shape="rect" href="http://www.autonews.com/article/20170123/OEM06/301239847/cheaper-lidar-on-the-way">comes from Valeo</a>.</p>
<p>Besides allowing for Level 3 autonomy, the panoply of devices also makes for a smoother ride. For instance, when the sensors see a pothole coming up they prime the active suspension so that it can handle the challenge more easily. Oh, and the car is also a mild hybrid, with a plug-in version in the works.</p>
<p>Electric drive is suddenly catching on—just last week <a shape="rect" href="http://spectrum.ieee.org/cars-that-think/transportation/advanced-cars/volvo-says-goodbye-to-gasoline">Volvo announced</a> that all its 2019 cars would have at least some degree of electric drive in them. Nowhere has the change in attitude been more extraordinary than at the Volkswagen Group, Audi's parent company, which had based its emissions policy on what it called clean diesel. VW had to give that up last year, when <a shape="rect" href="http://spectrum.ieee.org/cars-that-think/transportation/advanced-cars/how-professors-caught-vw-cheating">it was shown to have cheated</a> on diesel emissions tests.</p>
<p>If you want to buy the new A8, you’ll have to check whether your jurisdiction will accept it as a Level 3 car. Audi said in a statement that it will follow “<a shape="rect" href="https://www.audi-mediacenter.com/en/press-releases/the-new-audi-a8-future-of-the-luxury-class-9124">a step-by-step approach</a>” to introducing the traffic jam pilot. It plans to sell the base model in Europe this fall for €90,600, or about US $103,000, and to enter the United States market shortly afterwards. A longer-wheelbase model will cost a few percent more.</p>
<p>What’s next? A Level 4 car, naturally. <a shape="rect" href="http://spectrum.ieee.org/cars-that-think/transportation/self-driving/nvidia-ceo-announces">Back in January</a>, Audi and Nvidia said they’d have one on the roads by 2020.</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/2AtEtqk_s4Q" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Tue, 11 Jul 2017 17:00:00 GMT</pubDate>
<dc:creator>Philip E. Ross</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/cars-that-think/transportation/self-driving/the-audi-a8-the-worlds-first-production-car-to-achieve-level-3-autonomy</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyNTIxOQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyNTIxNw.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/cars-that-think/transportation/self-driving/the-audi-a8-the-worlds-first-production-car-to-achieve-level-3-autonomy</feedburner:origLink></item>
<item>
<title>NATO Unveils JANUS, First Standardized Acoustic Protocol for Undersea Systems</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/QOjvNFomc6A/nato-develops-first-standardized-acoustic-signal-for-underwater-communications</link>
<description>The new acoustic communications protocol is a step towards an Internet of Underwater Things</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>The new acoustic communications protocol is a step towards an Internet of Underwater Things<figure>
<img src="http://spectrum.ieee.org/image/MjkyNTI3OA.jpeg"/>
<figcaption>Photo: NATO</figcaption>
</figure>
<div>
<p>Aquatic robots are busier than ever. They have seabeds to mine, cable pathways to plough, and marine data to gather. But they and their aquatic brethren—including submarines and scuba divers—still struggle to communicate.</p>
<p>For decades, global standards defining Wi-Fi and cellular networks have allowed people to exchange data over the air. But those technologies are worthless below the waves, and no such standards have existed for underwater communications.</p>
<aside class="inlay pullquote lt med">
 “Using an open scheme like JANUS to issue distress calls would increase incredibly the chances of those being picked up.”
</aside>
<p>Aquatic systems have instead used a mishmash of acoustic and optical signals to send and receive messages. However, manufacturers sell acoustic modems that operate at many different frequencies, which means those systems often can’t speak to each other.</p>
<p>“We live in a time of wild west communications underwater,” says <a shape="rect" href="https://www.researchgate.net/profile/Joao_Alves19">
<span>João Alves</span>
</a>, a principal scientist for NATO.</p>
<p>Now, Alves and other NATO researchers have <a shape="rect" href="http://www.nato.int/cps/bu/natohq/news_143247.htm">
<span>established the first international standard</span>
</a> for underwater communications. Named <a shape="rect" href="http://ieeexplore.ieee.org/document/7017134/">
<span>JANUS</span>
</a>, after the<a shape="rect" href="https://www.britannica.com/topic/Janus-Roman-god">
<span> Roman god of gateways</span>
</a>, it creates a common protocol for an acoustic signal with which underwater systems can connect.</p>
<p>Acoustics has long been a popular medium for underwater communications. Generally, optical signals can deliver high data rates underwater at distances up to 100 meters, while sound waves cover much greater distances at lower data rates.</p>
<p>The main role of JANUS is to bring today’s acoustic systems into sync with one another. It does this in part by defining a common frequency—<a shape="rect" href="https://www.youtube.com/watch?v=rSW5eDoZgAw">
<span>11.5 kilohertz</span>
</a>—over which all systems can announce their presence. Once two systems make contact through JANUS, they may decide to switch to a different frequency or protocol that could deliver higher data rates or travel further.</p>
<p>In this way, Alves compares JANUS to the English language—two visitors to a foreign country may speak English to one another before realizing they are both native Spanish speakers, and switch to their native tongue.</p>
<p class="jwcode">
<script src="//content.jwplatform.com/players/nsFmpt3a-7pFgM9ap.js" class="jwembed"/>
</p>
<p>
<a shape="rect" href="http://reti.dsi.uniroma1.it/eng/petrioli/chiara-petrioli.html">
<span>Chiara Petrioli</span>
</a>, a specialist in underwater sensors and embedded systems at La Sapienza, the University of Rome, says JANUS could be the first step toward an “Internet of Underwater Things”—a submerged digital network of sensors and vessels.
</p>
<p>In addition to designating a frequency, JANUS also provides a modulation encoding scheme to describe how data should be encoded onto a sound wave, and describes the particular waveform that should be used (known as FH-BFSK). It also spells out which redundancies should be added to the data stream to minimize transmission errors.</p>
<p>In order to use JANUS, a system would first emit three optional tones to indicate that it intends to broadcast a JANUS data packet hitched to a sound wave. Then, the system would pause for about 400 milliseconds to allow other devices in its vicinity to “wake up.” Next, the system would broadcast a fixed series of tones to ensure both systems were properly synchronized to the JANUS protocol. Finally, the system would send the JANUS packet, consisting of 56 bits followed by a redundancy check, which tests for transmission errors.</p>
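<p>The framing sequence described above can be sketched in a few lines of Python. This is our own illustration, not the official ANEP-87 definition: the CRC-8 polynomial and the field layout are placeholder assumptions, and only the 56-bit payload, the roughly 400-millisecond pause, and the optional wake-up tones come from the description above.</p>

```python
def crc8(bits, poly=0x07):
    """Bitwise CRC-8 over a string of '0'/'1' characters.
    The polynomial is illustrative, not JANUS's actual redundancy check."""
    reg = 0
    for b in bits:
        reg ^= int(b) << 7
        reg = ((reg << 1) ^ poly) & 0xFF if reg & 0x80 else (reg << 1) & 0xFF
    return format(reg, "08b")

def build_janus_style_frame(payload_bits):
    """Assemble the transmission stages the article describes, in order."""
    assert len(payload_bits) == 56, "a JANUS packet carries 56 bits"
    return {
        "wakeup_tones": 3,      # optional tones announcing a broadcast
        "pause_ms": 400,        # lets nearby devices "wake up"
        "sync_preamble": True,  # fixed tone series for synchronization
        "payload": payload_bits,
        "check": crc8(payload_bits),  # redundancy check on the payload
    }
```

<p>A receiver would recompute the check over the received 56 bits and discard the packet on a mismatch, which is how the redundancy check catches transmission errors.</p>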
<aside class="inlay pullquote rt med">
 “We live in a time of wild west communications underwater.”
</aside>
<p>The JANUS standard was developed by Alves’ team at NATO’s <a shape="rect" href="http://www.cmre.nato.int/">
<span>Centre for Maritime Research and Experimentation</span>
</a> in La Spezia, Italy and sponsored by NATO’s <a shape="rect" href="http://www.act.nato.int/">
<span>Allied Command Transformation.</span>
</a> It is the first underwater communications standard to be defined by an international body.
</p>
<p>
<a shape="rect" href="http://millitsa.coe.neu.edu/">
<span>Milica Stojanovic</span>
</a>, an expert in oceanic engineering at Northeastern University, expects other standards will soon follow. She says the 11.5 kHz frequency used by JANUS is great for transmitting data across 1 to 10 kilometers, but a lower frequency, perhaps 1 kHz, would be better for sending data over longer distances of 10 to 100 kilometers.
</p>
<p>Even with JANUS and other standards, any future underwater Internet will probably be cursed by far lower data rates than modern Wi-Fi or cellular networks. Sound travels at much lower frequencies, and at much longer wavelengths, than the radio signals used in consumer electronics. And though sound waves travel faster in water than in air, they still travel far more slowly through water than radio waves do through air.</p>
<p>To develop JANUS, Alves’ team relied on the Littoral Ocean Observatory Network, a collection of tripods that NATO researchers have placed on the seafloor in the harbour of La Spezia, Italy. Each tripod emits acoustic signals to other tripods, which send performance reports to researchers through undersea cables. Those reports helped the team understand how fluctuations in water temperature, and other environmental changes, will affect JANUS signals.</p>
<p>The tripods also allowed researchers to build a JANUS receiver, advanced versions of which could minimize decoding errors and account for the Doppler effect. The Doppler effect describes shifts in sound waves caused by motion, such as the whirl of an ambulance siren as it drives by.</p>
<p>In another series of tests, researchers aboard the research vessel <em>Alliance</em>, a NATO ship operated by the Italian Navy, measured the performance of JANUS signals along the surface of the ocean.</p>
<figure role="img" class="lt med">
<img src="http://spectrum.ieee.org/image/MjkyNTI3Nw.jpeg" alt="Three ships and a satellite point to an ocean buoy. Yellow concentric half-circles emanate from below the buoy toward a submarine."/>
<figcaption class="hi-cap">
  Illustration: NATO
 </figcaption>
<figcaption>
  Buoys would act as transceivers between RF signals and JANUS’ sonic messages.
 </figcaption>
</figure>
<p>Once deployed, aquatic systems could use JANUS to send data directly to each other, or to “gateway buoys” bobbing on the water’s surface. The buoys could then use radio waves to relay that data to nearby control centers.</p>
<p>In one demonstration, Alves’ group helped the Portuguese Navy set up a buoy that converted data about the positions and speeds of nearby ships to JANUS. The buoy rebroadcast this information to Portuguese submarines lurking below.</p>
<p>Based on their work, Alves says submarines could also use JANUS to issue calls for help to ships and rescue crews. “Using an open scheme like JANUS to issue distress calls would increase incredibly the chances of those being picked up,” he says.</p>
<p>Now that JANUS is available, manufacturers of aquatic systems must decide whether or not to adopt it. Alves is confident they will, and Petrioli, who contributed feedback to the development of JANUS, agrees that adoption is essential to the industry’s future.</p>
<p>But Stojanovic is not so sure. “If there starts to develop a serious market, then everybody will have to play to the same tune,” she says. “If not, and everybody finds their own niche market with their own protocols, then they will do that.”</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/QOjvNFomc6A" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Tue, 11 Jul 2017 16:30:00 GMT</pubDate>
<dc:creator>Amy Nordrum</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/tech-talk/telecom/wireless/nato-develops-first-standardized-acoustic-signal-for-underwater-communications</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyNTMwMA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyNTI5OA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/tech-talk/telecom/wireless/nato-develops-first-standardized-acoustic-signal-for-underwater-communications</feedburner:origLink></item>
<item>
<title>How a One-Man Team from California Won NASA's Space Robotics Challenge</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/aNSRu8QYPho/coordinated-robotics-winner-nasa-space-robotics-challenge</link>
<description>By mastering a Mars robot simulator, an engineer and stay-at-home dad took home the $125,000 top prize</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>By mastering a Mars robot simulator, an engineer and stay-at-home dad took home the $125,000 top prize<figure>
<img src="http://spectrum.ieee.org/image/MjkyNDg5MQ.jpeg"/>
<figcaption>Image: NASA SRC</figcaption>
<figcaption>In NASA's Space Robotics Challenge, participants had to command a virtual Valkyrie robot to perform a series of repair tasks in a simulated Mars base hit by a dust storm.</figcaption>
</figure>
<div>
<p>NASA’s Space Robotics Challenge (SRC) took place last month, <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/space-robots/nasa-space-robotics-challenge-tasks-prizes-how-to-participate">full of virtual Valkyries wandering around a virtual Mars base trying to fix virtual stuff</a>. Anyone was allowed to participate, and since the virtual nature of the competition means there was no need for big expensive <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/humanoids/darpa-robotics-challenge-robots-falling">robots that mostly didn’t fall over</a>, anyone actually could (and did) participate. Of the 93 teams initially signed up to compete, NASA selected 20 finalist teams based on their performance completing some tasks in the <a shape="rect" href="http://gazebosim.org/">Gazebo 3D robot simulator</a>, and each of those finalists had to program a <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/military-robots/nasa-jsc-unveils-valkyrie-drc-robot">Valkyrie humanoid</a> to complete a repair mission on a simulated Mars base.</p>
<p>The winner of the SRC was team Coordinated Robotics, which also was the only team to manage a perfect run with 100 percent task completion, taking home the <span>US $125,000 top prize plus a</span> <span>$50,000 “perfect run” bonus</span>. “Team” may be a little bit of a misnomer, though, since Coordinated Robotics consists entirely of one dude: Kevin Knoedler. We spoke with Kevin about his epic win, and also checked in with Nate Koenig from <a shape="rect" href="https://www.openrobotics.org/">Open Robotics</a>, which leads the development of Gazebo and helped organize the SRC, to get more info on the competition, along with footage of all the best outtakes.</p>
<p>The SRC was very similar to the <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/humanoids/darpa-vrc-challenge-results-heres-who-gets-an-atlas-humanoid">VRC (the qualifier for the DARPA Robotics Challenge)</a>, in that all of the teams competed by running their code in a Gazebo virtual environment. “The tasks themselves were somewhat inspired by <a shape="rect" href="http://spectrum.ieee.org/at-work/tech-careers/the-martian-andy-weir-explains-what-he-got-right-and-wrong">
<em>The Martian</em>
</a>,” Open Robotics CTO <a shape="rect" href="https://www.osrfoundation.org/team/nate-koenig/">Nate Koenig</a> told us. “Valkyrie is on Mars, preparing the way for human settlement, and a dust storm comes.” Post dust storm, Val has to align a communications dish, repair a solar array, and locate and fix a leak in the habitat. Here are some highlights from the competition:</p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/vOssEL1xqNs?rel=0" width="620" frameborder="0" height="349"/>
</p>
<p>“The competition overall went pretty smoothly,” says Koenig. “A unique aspect of the SRC, as opposed to the VRC, is that we were emphasizing sequential completion of tasks. You get more points for completing more tasks in order without having Valkyrie fall or require a reset, so the more reliable you are in terms of walking and manipulating, the better you’ll do.” </p>
<p>As with the <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/humanoids/darpa-robotics-challenge-amazing-moments-lessons-learned-whats-next">DRC</a>, the time limits on the tasks were set such that teams were heavily encouraged to use as much autonomy as possible. And it sounds like most of them did; only a few timed out. Making things even more challenging were severe restrictions on bandwidth coupled with latency designed to emulate (to some extent) what it would be like trying to teleoperate a robot somewhere out in space, as Koenig explains:</p>
<blockquote>
<p>
<em>“Network latency and bandwidth limitations were more severe than the VRC. We wanted to simulate something closer to what you might experience with a round trip delay to Mars, but that would have been too extreme, so we toned it down to a maximum of 20 seconds delay. Some of the tasks had bandwidth limits of 380 bits/second, and if you look at those numbers, that essentially kills TCP. </em>
</p>
<p>
<em>People had to get creative, and we did see some unique things: one person ran an IRC server and client to pass information, and some other people used just straight text-based console messages, getting no visualized data, which was pretty awesome: It was like reading The Matrix. One team [Team Xion] ran completely autonomously: They just deployed their code and hit go, and they were able to complete a lot of the tasks, which was impressive. </em>
</p>
</blockquote>
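To put the quoted bandwidth figure in perspective: at 380 bits per second, a standard 40-byte TCP/IP header alone takes the better part of a second to transmit, which is why teams fell back to terse text messages. Here is a minimal Python sketch, with an entirely hypothetical message format, of the kind of compact status packing such a link forces:

```python
import struct

LINK_BPS = 380  # bandwidth limit quoted above for some SRC tasks

def pack_status(task_id: int, x_dm: int, y_dm: int, battery_pct: int) -> bytes:
    """Pack a robot status report into 4 bytes (hypothetical format).

    Positions are sent in decimeters as signed bytes, trading precision
    for airtime on the 380 bit/s link.
    """
    return struct.pack("bbbB", task_id, x_dm, y_dm, battery_pct)

def airtime_s(payload: bytes) -> float:
    """Seconds needed to send the payload at the link rate."""
    return len(payload) * 8 / LINK_BPS

msg = pack_status(task_id=2, x_dm=-15, y_dm=42, battery_pct=87)
print(len(msg), round(airtime_s(msg), 3))   # → 4 0.084
print(round(airtime_s(b"\x00" * 40), 2))    # → 0.84 (one bare TCP/IP header)
```

A four-byte report fits comfortably in the budget; anything chatty, like TCP's handshakes and acknowledgments, does not.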
<p>Koenig said he and his colleagues weren’t expecting any of the teams to complete all of the tasks in sequence. “But Kevin proved us wrong,” he added. “And he was the only team that was able to perform that feat.”</p>
<p>Kevin is, of course, Kevin Knoedler, who is the entirety of Team Coordinated Robotics. As Nate pointed out, Kevin managed to complete all of the Space Robotics Challenge tasks flawlessly in a row, which is pretty amazing. We spoke with Kevin over email to learn more about how he pulled it off.</p>
<p>
<strong>
<em>IEEE Spectrum:</em> What’s your background, and what made you decide to enter the SRC by yourself?</strong>
</p>
<p>
<strong>Kevin Knoedler:</strong>
<em>After graduating from MIT I worked as an engineer and engineering manager at Teradyne. I left in 2007 to be a stay-at-home dad. Both during my time at Teradyne and in my current role as a stay-at-home dad, I have continued to be involved in various contests—Robot Wars, Battlebots, the three DARPA autonomous vehicle grand challenges, and the DRC. The SRC looked challenging and fun, so I signed up to compete in it.</em>
</p>
<p>
<em>I was busy coaching two soccer teams when the qualification round started (fall 2016), and I knew I would be busy coaching track and <a shape="rect" href="https://en.wikipedia.org/wiki/Odyssey_of_the_Mind">Odyssey of the Mind</a> when the finals started (early 2017). It is usually key to contribute and coordinate with teams early in the project cycle. Since I would be busy with other things during those key times, I decided to do it alone to avoid frustration for myself and any team I worked with. Working with teams is generally a better choice as more people have more creative ideas. I have worked with teams on all of the previous contests.</em>
</p>
<aside class="inlay pullquote xlrg">
 During my time at Teradyne and in my current role as a stay-at-home dad, I have continued to be involved in various contests . . . It is usually key to contribute and coordinate with teams early in the project cycle. Since I would be busy with other things, I decided to do it alone.
</aside>
<p>
<strong>How much autonomy did your strategy rely on?</strong>
</p>
<p>
<em>I approached the design for the contest assuming I would always have the maximum time delay, so the robot needed to do shorter tasks on its own. Even without the design work, the up to 20-second delay was not a major problem given that the allowed time was in the hours. My perception code was not as reliable and accurate as I would like, so I focused on the robot doing the planning and execution. It was mostly supervised autonomy with human perception help.</em>
</p>
<p>
<strong>You sent us a video of one of your runs [below]. Can you take us through it?</strong>
</p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/waOs9L84pVU" width="620" frameborder="0" height="349"/>
</p>
<p>
<em>The video is a short third-person view of the robot completing the three tasks. The first is turning handles to align the antenna. The second task shows the robot removing a solar panel from the trailer, placing it on a table, and plugging in a cable. The final task is climbing the stairs, opening the habitat door, using a tool to locate the leak, and then another tool to fix the leak. One of the fun parts for me was when the robot would find the leak. There was a lot of area to be covered, some of which was partially obstructed, which made it exciting to actually find the leak each run.</em>
</p>
<p>
<em>The leak was found by the robot doing sweeps up and down and using torso rotation to minimize the amount of walking necessary. As the robot looked for the leak it kept track of the search area as either un-searched, clear, or leaky. That information was displayed to the operator via an interactive marker in <a shape="rect" href="http://wiki.ros.org/rviz">Rviz</a> [a 3D visualization tool for ROS] to make it easy to see what had been searched, and when the leak was found, easy to visualize.</em>
</p>
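Knoedler's bookkeeping of the search area can be sketched as a simple coverage grid in which each cell is marked unsearched, clear, or leaky, and the next scan target is chosen by a vertical sweep. A minimal Python illustration (not his actual code):

```python
from enum import Enum

class Cell(Enum):
    UNSEARCHED = 0
    CLEAR = 1
    LEAKY = 2

class LeakSearchGrid:
    """Track which patches of the habitat wall have been scanned."""

    def __init__(self, rows: int, cols: int):
        self.cells = [[Cell.UNSEARCHED] * cols for _ in range(rows)]

    def mark(self, r: int, c: int, leaky: bool) -> None:
        self.cells[r][c] = Cell.LEAKY if leaky else Cell.CLEAR

    def next_unsearched(self):
        """First cell still to scan, sweeping each column top to bottom
        (mimicking the up-and-down sweeps described above)."""
        for c in range(len(self.cells[0])):
            for r in range(len(self.cells)):
                if self.cells[r][c] is Cell.UNSEARCHED:
                    return (r, c)
        return None  # fully searched

grid = LeakSearchGrid(3, 4)
grid.mark(0, 0, leaky=False)
grid.mark(1, 0, leaky=False)
print(grid.next_unsearched())  # → (2, 0): continue the current vertical sweep
```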
<p>
<strong>What was the trickiest part for you?</strong>
</p>
<p>
<em>I would say the most challenging part was the manipulation and use of tools. Getting a good grasp on the tool and then having the robot use the tools as an extension of the robot were hard to do consistently. I created a scenario in Gazebo where the robot started right at the tools with nothing else around. That allowed testing of picking up the tools from various starting positions and putting them down over and over. </em>
</p>
<p>
<em>An interesting story from the contest: Sometimes real hardware gets stuck and has to be pushed to get it moving again, and the simulated [Valkyrie robot] in Gazebo also had this behavior. [Open Robotics called that an “interesting emergent behavior” that wasn’t programmed in deliberately.] It was possible for the robot’s thumb to get stuck and no longer respond to commands. That happened to me during the contest on my third run. But, much like in real life, I was able to push the thumb against the table to get it unstuck and moving again to be able to complete the tasks. </em>
</p>
<p>
<strong>What kinds of things are easier in simulation than they are in real life?</strong>
</p>
<aside class="inlay pullquote rt med-lrg">
 The main thing that makes simulation easier is the hardware reliability—the simulation hardware doesn’t break like real hardware frequently does. You can also try riskier experiments.
</aside>
<p>
<em>Everything is easier in simulation. It is not dramatically easier, but you can solve 90 percent of the problems in simulation. … The main thing that makes simulation easier is the hardware reliability—the simulation hardware doesn’t break like real hardware frequently does. You can also try riskier experiments. A falling humanoid robot in Gazebo does not cost $100,000 to repair and cause a multi-week delay. The other big advantage to simulation is that one person can run one or multiple tests simultaneously. With a real robot it generally takes multiple people to run a single test.</em>
</p>
<p>
<strong>If NASA put a real Valkyrie inside of a physical mock-up of a Mars base and asked you to complete the same set of tasks, how do you think you’d do?</strong>
</p>
<p>
<em>The robot should be able to complete the tasks after some initial testing to identify and fix differences between simulation and hardware. I had a layered approach where I could fall back to lower level control if the primary method did not succeed. There always seem to be enough differences between simulation and real hardware that some adaptations are needed for success. But, given some testing and adaptations, I do think it would be a success!</em>
</p>
<p>
<strong>After participating in the DRC and now the SRC, how do you feel about the potential for humanoid robots to be realistically useful in disaster areas or planetary exploration?</strong>
</p>
<p>
<em>After the DRC and SRC we are getting closer to being able to use humanoid robots in disaster areas on Earth and for planetary exploration. The main challenges I see on Earth are making the hardware robust, handling falls, and being able to do manipulation in difficult situations (crawling, obstructed or constricted working environments, situations requiring an arm for support, etc.). In space there are the same challenges, plus the distances require giving the robot more perception and autonomy.</em>
</p>
<hr/>
<p>Kevin made sure to remind us to thank <a shape="rect" href="https://www.nasa.gov/">NASA</a>, <a shape="rect" href="https://spacecenter.org/">Space Center Houston</a>, <a shape="rect" href="http://www.ninesigma.com/">Nine Sigma</a>, <a shape="rect" href="https://www.osrfoundation.org/">Open Robotics</a>, and <a shape="rect" href="https://www.ihmc.us/">IHMC</a> on his behalf, which we’re more than happy to do, because we’re also constantly wanting to thank them for what they’ve all done for robotics.</p>
<p>Oh, and before we forget: outtakes!</p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/JmvQbkNiL4s?rel=0" width="620" frameborder="0" height="349"/>
</p>
<p>[ <a shape="rect" href="https://ninesights.ninesigma.com/web/space-robotics-challenge">SRC</a> ] via [ <a shape="rect" href="http://gazebosim.org/blog/src_results">Gazebo</a> ]</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/aNSRu8QYPho" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Tue, 11 Jul 2017 15:30:00 GMT</pubDate>
<dc:creator>Evan Ackerman</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/automaton/robotics/robotics-software/coordinated-robotics-winner-nasa-space-robotics-challenge</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyNTEyMA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyNTExOA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/automaton/robotics/robotics-software/coordinated-robotics-winner-nasa-space-robotics-challenge</feedburner:origLink></item>
<item>
<title>Nuclear to Coal to Hydrogen: Sheldon Station Blazes a Trail</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/RyLVcnhc3YI/nuclear-to-coal-to-hydrogen-sheldon-station-blazes-a-trail</link>
<description>In a corner of Nebraska, a power plant continues a 60-year history of innovation as it aims to burn hydrogen for electric power generation</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>In a corner of Nebraska, a power plant continues a 60-year history of innovation as it aims to burn hydrogen for electric power generation<figure>
<img src="http://spectrum.ieee.org/image/MjkyNDkxNg.jpeg"/>
<figcaption>Photo: NPPD</figcaption>
</figure>
<div>
<p>You’d have to be pretty lost on a road trip through the southeastern part of the Cornhusker state to run across Nebraska Public Power District’s (<a shape="rect" href="http://www.nppd.com/innovation/">NPPD’s</a>) Sheldon power plant.</p>
<p>And that would be too bad, because Sheldon may deserve at least a mention in the annals of industrial history. If so, then it’s on track to add to that citation.</p>
<p>The plant was first built between 1958 and 1963 as an experimental nuclear power plant for the U.S. Atomic Energy Commission. That equipment is long gone. In its place is a two-unit coal-fired power plant that for a time tried (without much luck) to use old tires as fuel. It burned Eastern U.S. coal and today takes delivery of trainloads of low-sulfur coal from Wyoming’s Powder River Basin.</p>
<aside class="inlay pullquote lt med">
 Sheldon Station is on track to become the largest hydrogen-fueled power plant in the U.S.
</aside>
<p>That’s set to change as NPPD engineers advance plans to convert Sheldon’s 125 megawatt (MW) Unit 2 from coal to hydrogen. Doing so would make Sheldon the largest hydrogen-fueled electric power station in the United States.</p>
<p>“Contracts are in place to move the project forward,” says John Swanson, generation strategy manager for NPPD. His job is to “look under rocks” for new opportunities. It was NPPD’s work on carbon storage and sequestration methods that led to introductions being made about a possible source of hydrogen for the power plant. </p>
<p>In 2015, NPPD agreed to work with California-based <a shape="rect" href="http://monolithmaterials.com/olive-creek/">Monolith</a>, which broke ground last September on the first of a planned two-stage $50 million plant to produce carbon black from natural gas. (A sort of high-surface-area soot, carbon black is formed by the incomplete combustion of petroleum products.) The carbon black would find commercial uses in everything from plastics to car tires. Meanwhile, the hydrogen left behind as a byproduct would travel a short distance by pipeline to the Sheldon station. There it would fire a new dual-fuel boiler and generate up to 125 MW of electricity for NPPD’s distribution customers.</p>
<p>The first stage of Monolith’s carbon black plant is expected to be mechanically complete in 2018. When it’s up and running, the plant will buy electricity from the Norris Public Power District, a local reseller of NPPD’s electric power.</p>
<p>And, by 2020 work could be underway to convert Sheldon to burn hydrogen. Under current plans, by 2022 much of the plant’s power output could help supply electricity to multiple production lines at the Monolith plant, which could employ up to 600 people.</p>
<p>Between now and then, however, a lot of engineering work needs to be done to prep Unit 2 for conversion.</p>
<p>The thinking now, says John Meacham, senior staff engineer at NPPD, is to tear out the coal-fired boiler and use the existing structural framework to hang a new boiler. But because hydrogen’s heat characteristics are different from coal’s, a larger boiler will be needed to maintain the power plant’s 125 MW generating capacity. To fit, the new, larger boiler will extend down into the plant’s basement area, which currently houses coal ash collection equipment. That equipment becomes obsolete after conversion, Meacham says.</p>
<p>The boiler’s burner configuration will be changed from a cyclone pattern to a wall-fired arrangement with a windbox. The new burners will be segmented with one set dedicated to hydrogen and a separate set for natural gas.</p>
<p>NPPD is planning on dual-fuel capability in case the hydrogen supply from Monolith is interrupted for any reason, including business failure.</p>
<p>Meacham says that some structural steel will need to be modified to handle rapid heat change if the hydrogen-fueled fireball goes out. Not only does a hydrogen fireball have a hotter flame front than coal, but, he says, it also goes away faster in an outage. That creates a thermal transient that needs to be planned for in the boiler’s support structure.</p>
<p>Unsurprisingly, Sheldon station’s environmental profile will improve with the hydrogen conversion. NPPD says that about 1 million metric tons of carbon dioxide emissions will be eliminated each year with the end of coal combustion at Unit 2. And the utility’s portfolio of carbon-free generation sources will grow to nearly 50 percent.</p>
<p>Engineers plan to add emission control equipment at Sheldon to deal with nitrogen oxide (NO<sub>x</sub>) produced when the hydrogen fuel mixes with ambient air during combustion. The plant also may produce a plume of water vapor, the result of hydrogen combustion.</p>
<p>“With the appropriate materials there will be no issue with the vapor,” Meacham says.</p>
<p>For now, Meacham’s biggest worry is how pure the hydrogen stream will be. The gas will be around 95 percent hydrogen “with some cats and dogs” in the stream as well, he says. That means the gas isn’t pure enough for use in fuel cells or food production, but it is good enough for a power plant boiler. The gas stream quality is likely to vary as Monolith produces different forms of carbon black.</p>
<p>“We’re awaiting a final spec on the fuel from Monolith,” Meacham says.  The quality is unlikely to affect boiler operation, but likely will impact Sheldon’s NO<sub>x</sub> emission profile.</p>
<p>Both Swanson and Meacham declined to reveal conversion cost estimates. But Meacham says that NPPD looked at several different technologies before concluding that the conversion offered the lowest-cost option.</p>
<p>One technology looked at was a gas-fired combustion turbine. The U.S. Department of Energy estimates that the cost to develop a greenfield power plant with a combustion turbine may range from $680 to $1,100 per kilowatt. That’s on the order of $85 to $137.5 million for a brand-new 125 MW plant. But Sheldon will be a conversion with a lot of equipment already in place, including the expensive turbine and generator. <span>The project to date has received green lights in part because its conversion will not increase the cost of electricity to NPPD’s wholesale customers.</span>
</p>
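The DOE range scales straightforwardly to Sheldon's capacity; a quick back-of-the-envelope check:

```python
# Scale DOE's greenfield combustion-turbine estimate to Sheldon's Unit 2.
CAPACITY_KW = 125_000          # 125 MW
COST_PER_KW = (680, 1100)      # DOE range, US$ per kilowatt

low, high = (c * CAPACITY_KW / 1e6 for c in COST_PER_KW)
print(f"${low:.1f} million to ${high:.1f} million")  # → $85.0 million to $137.5 million
```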
<p>The project is “a cost share with Monolith” and does not include any government funding, Swanson says, adding that “it’s mostly them and a little of us.”</p>
<p>For now, it’s also mostly Monolith on the construction side of things. Aside from Meacham and Swanson, NPPD has not yet fully staffed a project development team.</p>
<p> But in discussing the planned conversion, Swanson betrays some of the entrepreneurial drive that led NPPD on its journey from nuclear to coal to hydrogen at the out-of-the-way Sheldon station.</p>
<p>“We always have one eye on the future,” he says.</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/RyLVcnhc3YI" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Tue, 11 Jul 2017 13:00:00 GMT</pubDate>
<dc:creator>David Wagman</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/energywise/energy/fossil-fuels/nuclear-to-coal-to-hydrogen-sheldon-station-blazes-a-trail</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMzE1MQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMzE0OQ.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/energywise/energy/fossil-fuels/nuclear-to-coal-to-hydrogen-sheldon-station-blazes-a-trail</feedburner:origLink></item>
<item>
<title>2D Material Could Make Pseudocapacitors Charge in Milliseconds</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/Hn6KKbrqqOM/twodimensional-material-could-make-pseudocapacitors-charge-in-milliseconds</link>
<description>MXene supercapacitors could charge as fast as rival supercapacitor technology but with as much as 10 times the storage capacity</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>MXene supercapacitors could charge as fast as rival supercapacitor technology but with as much as 10 times the storage capacity<figure>
<img src="http://spectrum.ieee.org/image/MjkyNDQ1MQ.octet-stream"/>
<figcaption>Illustration: Nature Energy</figcaption>
</figure>
<div>
<p>Over two decades ago, researchers at Drexel University identified a new kind of material they dubbed <a shape="rect" href="http://max.materials.drexel.edu/">MAX phase</a> (the M is for transition metal, the A for “A group” metal, and the X for carbon and/or nitrogen). At the time, the scientists believed the material could serve as a kind of primordial goo from which all things came—it contained all the elements but needed to be organized by scientists.</p>
<p>Since 2011, Drexel researchers have been working with an iteration of this Max phase goo called “MXene.” It was essentially a two-dimensional (2D) material that derived its name from the process of etching and exfoliating atomically thin layers of aluminum from MAX phases. In 2013, the Drexel researchers first discovered that one of the dozens of MXene materials they were working with <a shape="rect" href="http://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/previously-unheralded-2d-material-exhibits-high-charge-storage-capacity">could serve as a material for energy storage</a>.</p>
<p>Now in research described in the journal <a shape="rect" href="http://nature.com/articles/doi:10.1038/nenergy.2017.105">
<em>Nature Energy</em>
</a>, the Drexel scientists, in collaboration with researchers at the Université Paul Sabatier in Toulouse, France, have applied a number of improvements to a previously reported MXene material made from titanium and molybdenum carbide, pushing pseudocapacitive materials to millisecond charging times.</p>
<p>Unlike <a shape="rect" href="http://spectrum.ieee.org/semiconductors/materials/how-a-microscopic-supercapacitor-will-supercharge-mobile-electronics">other varieties of </a>
<a shape="rect" href="http://spectrum.ieee.org/searchContent?q=supercapacitors">supercapacitor</a>, such as electric double-layer capacitors (EDLCs), <a shape="rect" href="http://spectrum.ieee.org/searchContent?q=pseudocapacitors&amp;type=&amp;sortby=newest">pseudocapacitors</a> store charge via a chemical mechanism, as batteries do. They sit somewhere between batteries and EDLCs, fully charging in anywhere from tens of seconds to several minutes. Their main advantage is that they can store as much charge as some batteries while operating much faster.</p>
<p>The Drexel researchers have shrunk the charging times for pseudocapacitors from the tens of seconds down to milliseconds.</p>
<p>“The key to the faster charging times was the electrode architecture, which allowed easy access of ions on the surface of MXene sheets,” said Maria Lukatskaya, a PhD student at Drexel and co-author of the research, in an e-mail interview with <em>IEEE Spectrum</em>. “Also, in this work we demonstrate an extended operational voltage window which leads to a higher energy density.”</p>
<p>The researchers haven’t yet provided energy density values, because such metrics are characteristics of a device, not a material. They concede that the energy density of a supercapacitor—even a pseudocapacitor using battery-like charge storage—is going to be lower than that of a Li-ion battery. </p>
<p>However, if you want to have energy storage devices that allow today’s numerous electronic gadgets to be charged quickly—in seconds to minutes, instead of hours—pseudocapacitors could offer this opportunity.</p>
<p>“The key value of this work is in the demonstration that pseudocapacitive charge storage can be achieved at the same high rates as in double-layer capacitors using physical charge storage (but with 5-10 times more charge stored per unit of weight or volume), as long as the electrode material offers sufficient electronic and ionic conductivity,” Yury Gogotsi, whose lab led the research, said in an e-mail interview with <em>IEEE Spectrum</em>. “We believe this is a game-changing development in the energy storage field.”</p>
<p>In order for this work to become a real-world energy storage device, Gogotsi believes that they will initially need to scale up material synthesis. At present, they are producing up to 100 grams per batch in their labs. After that, they will need to design energy storage devices employing MXenes and other highly conductive pseudocapacitive materials.</p>
<p>Gogotsi added: “We are developing matching anodes to MXene cathodes that will further expand the voltage window—this means doubling the energy density.”</p>
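That claim is consistent with basic capacitor arithmetic. In a full cell, the two electrode capacitances sit in series, so pairing an MXene cathode with a matched anode halves the cell capacitance while doubling the voltage window; since stored energy is E = ½CV², the net effect is a doubling. A quick check under that simplifying symmetric-electrode assumption:

```python
def cell_energy(c_electrode: float, v_window: float, n_series: int = 1) -> float:
    """Energy (J) of n identical electrode capacitances in series:
    C_cell = C/n, V_cell = n * v_window, E = 0.5 * C_cell * V_cell**2."""
    return 0.5 * (c_electrode / n_series) * (n_series * v_window) ** 2

single = cell_energy(c_electrode=1.0, v_window=1.0, n_series=1)  # one electrode
paired = cell_energy(c_electrode=1.0, v_window=1.0, n_series=2)  # matched anode
print(paired / single)  # → 2.0: widening the window this way doubles the energy
```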
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/Hn6KKbrqqOM" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Mon, 10 Jul 2017 20:00:00 GMT</pubDate>
<dc:creator>Dexter Johnson</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/nanoclast/semiconductors/materials/twodimensional-material-could-make-pseudocapacitors-charge-in-milliseconds</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyNDQ2Ng.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyNDQ2NA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/nanoclast/semiconductors/materials/twodimensional-material-could-make-pseudocapacitors-charge-in-milliseconds</feedburner:origLink></item>
<item>
<title>How Analytics Can Be Used to Drive More Effective Business Decisions Around Additive Manufacturing</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/tYSI_JT6ooM/how-analytics-can-be-used-to-drive-more-effective-business-decisions-around-additive-manufacturing</link>
<description>Learn about how to best take advantage of additive manufacturing and provide insight on the direction your business needs to go, not just to survive but to thrive in today’s world.</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Learn about how to best take advantage of additive manufacturing and provide insight on the direction your business needs to go, not just to survive but to thrive in today’s world.<div>
<p>The digitalization journey continues. From our foundation presentation, “How Digitalization is Transforming the Electronics Industry,” we continue along the connected journey toward adopting a digitalization strategy. Manufacturers are increasingly looking to disruptive technologies like additive manufacturing to enable on-demand production, produce components never before possible, and reduce order-to-cash cycle time. Even though 3D printers are more accessible than ever, moving from prototyping to meaningful volume production still requires a significant investment in capital, process change, and retooling of personnel. Going beyond the basic technology of 3D printing, this session draws on recent collaboration with leading manufacturers who are seeking to fully understand the “why” of additive manufacturing and how data analytics can be used to drive more effective business decisions.</p>
<p/>
<p>In this presentation we’ll uncover details and best practices that can shed light on how to best take advantage of additive manufacturing and provide insight on the direction your business needs to go, not just to survive but to thrive in today’s world:</p>
<p/>
<ul>
<li>If 3D Printing can solve a problem, is it the most profitable solution?</li>
<li>How to uncover essential attributes and rules for cost, quantity, materials, size, assemblies, etc.: details that lack meaningful value unless they can be evaluated together to drive effective decisions</li>
<li>Uncovering and collecting the details that are typically hidden in a mess of disparate and out-of-context IT systems, information that can fundamentally alter your decision making with regard to additive manufacturing</li>
<li>Understanding and interpreting this information and decision points, and using it to determine if a part can or should be printed</li>
</ul>
<p/>
<p>We believe a new data analytics approach to unify, contextualize, and present simplified business insights for proper component/part selection has the potential to dramatically improve the return from 3D printing, by focusing valuable capital resources on producing the right parts at the right time. Please join us as we take a deeper dive into some of the new and fascinating areas and technological advances that are transforming the electronics industry.</p>
<div>
<strong>PRESENTERS:</strong>
</div>
<div>
<div data-widget="SImage" class="imgWrapper lt sm">
<img src="data:image/jpeg;base64,/9j/4AAQSkZJRgABAQEAYABgAAD/4QDkRXhpZgAATU0AKgAAAAgABAEPAAIAAAAIAAAAPgEQAAIAAAARAAAARgEyAAIAAAAUAAAAWIdpAAQAAAABAAAAbAAAAABzYW1zdW5nAFNBTVNVTkctU0dILUkzMzcAADIwMTY6MTI6MjUgMTc6NDE6MDkAAAaCmgAFAAAAAQAAALqCnQAFAAAAAQAAAMKIJwADAAAAAQJYAACSBAAKAAAAAQAAAMqSCgAFAAAAAQAAANKkAwADAAAAAQAAAAAAAAAAAAAAAQAAAA8AAAD1AAAAZAAAAAAAAAAKAAAAuQAAAGQAAP/gABBKRklGAAEBAAABAAEAAP/bAEMAAgEBAgEBAgICAgICAgIDBQMDAwMDBgQEAwUHBgcHBwYHBwgJCwkICAoIBwcKDQoKCwwMDAwHCQ4PDQwOCwwMDP/bAEMBAgICAwMDBgMDBgwIBwgMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDP/AABEIAGQAOQMBIgACEQEDEQH/xAAfAAABBQEBAQEBAQAAAAAAAAAAAQIDBAUGBwgJCgv/xAC1EAACAQMDAgQDBQUEBAAAAX0BAgMABBEFEiExQQYTUWEHInEUMoGRoQgjQrHBFVLR8CQzYnKCCQoWFxgZGiUmJygpKjQ1Njc4OTpDREVGR0hJSlNUVVZXWFlaY2RlZmdoaWpzdHV2d3h5eoOEhYaHiImKkpOUlZaXmJmaoqOkpaanqKmqsrO0tba3uLm6wsPExcbHyMnK0tPU1dbX2Nna4eLj5OXm5+jp6vHy8/T19vf4+fr/xAAfAQADAQEBAQEBAQEBAAAAAAAAAQIDBAUGBwgJCgv/xAC1EQACAQIEBAMEBwUEBAABAncAAQIDEQQFITEGEkFRB2FxEyIygQgUQpGhscEJIzNS8BVictEKFiQ04SXxFxgZGiYnKCkqNTY3ODk6Q0RFRkdISUpTVFVWV1hZWmNkZWZnaGlqc3R1dnd4eXqCg4SFhoeIiYqSk5SVlpeYmZqio6Slpqeoqaqys7S1tre4ubrCw8TFxsfIycrS09TV1tfY2dri4+Tl5ufo6ery8/T19vf4+fr/2gAMAwEAAhEDEQA/APwBQU9VzSKKkRM/QfrXdTiZsdDH5h6ZAGcYq5ZrbLaySSRvJNISqc4VD+eeBz3HTNSeH7ixt7ljfrcPGVIAiAPPryRUerT28mqzNZrItpvPkrJw23sTycE9cZP1OK25egNaElvfrZ2anlixJwKl0XVftl4sUiqN3Qr7etZnOBz0qbS1b+0ofL2BiSBuOFzg1MqasTyq1ztn0TNpuxkMPQcmsj+yl/ya7U6BJDpO231Gzn/d5C3ERt3wOOBlxnOB1A6Vx39i+Iv+fWD/AMCIv8a8213ubLY5fS7BtUv4bdWVHuHEas5wu49Mn3NWtc8P3nhjUGs763e2nQA7W7g9CD6H1qbVLFbCJI2Xbk5V/wDCpdXutU8caiLy5Z7m5lRYxgY3BFC8D8MnHck16VO/yM20lqV/DHhjUPGviOz0nSbOa+1LUJRBb28Qy0rnoBn+Z4Fe/a//AMEr/i/4f+y79L024W4HzPDdblgPBwxKgcZ7Zr60/wCCDf7EUOofEqbxz4ksma4toDb6ZbNHkoWXc07cHHUKuSM4Ppx+res/Av7QrIsMajHy8nAxzzgdv514eYZ5KjW9nSSdt/U9TB4GnOHNWvqfzbfFP9jT4hfCKFJtS0Ke6t2bYZbIGZVb0IAzzjqBj3zxXlsqNFIyOrKynDKwwQfQiv6Pvit8ELeJG+1WMM3BRgw7Hjg9iP8ACvzv/b+/YO0vWvDt5daPp8NrrFqXlt5lUK0hJLbXYfeDEkc9CcgdjWDztVGo1V8zStlSS5qT+TPztg+IFwnhy40+4Vp2YoI5N2CFUg7W456
Csn/hJLv/AJ6fpUFzC0ErJIrRuhKOrDDKR1BHtUOw160qa6HjnVeL9Pk0/wAOafHccXETFCM9V/hP5YFfZH7FH7MGk674S0vUPsA1C5vrOEsSxUP5sW4rz0wz4OOTtGCMjHzZ8GtP0v4o/F/4Zafq0cd1aS6tBp2pWzMR9qjSRMIdvzYkT5cg5yTg55r9oPGX7KHh/wDZ48B6FrXgvTbPRbbUNUSDUNHMai3t5ZFEoaFSSYEBWZii4GXJ2gEivLzDFOnFU1uz0sLRjKfM9v8AgHsP7APwNh8I39/JDbLDbwuyAmQhZSTjOAMY2gKCT/DwP4j9n2/hq3a1G9Y9uMkmvIf2SvCUWk+B9Lt/3a/uxLLIqkBnbknHPP1zXTftWfF3QPhd8LNQubjwv478ceWgRotCmjtoow4I+9JIm4nhcKGbLjGACV+PqXnU82ehW+KxR+OfwmtLzR5pIWhYspJG3J/n+VfB/wAf/h62peGNWd4lElqWUkDBBwD/ACI+lfR3w+g1rW/ipqHh/TfF/iPQdN02KF30nVLiDW7NBKm/yUlwJI3VSMjzGwQ2R/e+Sf8Ago1+2ZofwG+OOofC7zo7fVtQ019Wa/ms5buC3iEcmS0cZDttWCVjtPQDrkCurDU5ufLDV7m8ZKmrTZ+Nv7TuhR+HPjx4otY12r9rE5HoZEWQ/qxrg67X9obxHH4s+MfiDUIdYh8QRTSqq6jDafZI7oLGqbliySqjbgAknAyTk1xVfeU4vkjfsj5qs05truzovhT42/4Vt8UvDXiPY8g8P6ra6lsT7ziGZZNoz67cc+tf0NeLtWtfHf7Pnhm7OraVqnhTWtbs9Q0+5gkYSmKeOeXLHkGIt5ZRhtYA4I4Br+cWPhK/YL/ghRpmoeNv2D9as7q4uLq20/xtIbOO4fzo7SAQ2rsI0YlVXzJJXIAxuZzgknPj55RXs41uzt953ZfUtPkfqfqp8MfFNvHoMMen7tkaiB1bho2X5WX8CMV12qeKbzwto032PUGtZ54+UKAo2Rkgg/8A1v1rxX9nTV9mvSaPO0kaJtdpWG5VUgKSMfe28Zx645wK7v8AaOVLLSN0eqQw6fpwK/ayGUTAn5SFPJJAyFPI79DXyXIup7SjHnUX1OVsdQ02Oe81SfULP+2rp22ecyo0xHVI19s9vUdq+Cv+CgfhyG5+L51K/s7eSTVNP8syvEGOAQGUEjjJ5wODgelex/Fn9urQPgN4TRV8L2WuWtupu3ur2cxgBSR5u7K7cuu3PIzuxja2Pjn9vf8A4KieFfjJ4Ut7vTfDtzpF5p9iLxBK6lLgzAoqJtzyGI3ZxtHOOuPTwWGqOacUaVKkKb9/Y/L7xnAtr4s1aNWMix3k6hic7gHYZrLqW4ne5eSSRi8khLux/iJ5JqDb719ttoz497kkZ/Sv1+/4NzfitY3fwd8YeC1mjj1DTdUXWI4twDSxSxpG599rxLn/AK6LX4/xHmvYP2Hv2h9c/Zo/aL0PxBod59lkZzazhhujmifgo691PH06jBANcWYYf6xh3Bb7/cdGHqctRH9CPhz4qWvhT4uajo8un6tY29q9gqalPbvFYXBuGcLHHIVCSFdpDKrcEpu2kqWz/wBoTxvb/GL4j6Z4f1a3ns/DukE3t+ZH+W4n2looMZDHjc7EArtTHzBq8n+GP7cfh74yeH5NP1K2Wx1KSAloJlEkF2uPmCE8MPVSPwxXn3hTxVpc3jGa5tYprzwz5pYLON1xausiYSCRm3rENjEp1y2AwXCj4yFFp+8rNH0VNP4i5/wVA8E+CPEf7POpaPfalqllDqkSRKq6gztEVdXQgy7sqJAh2Mw44BGa/FT4t/2l4b8V3fh641ybxBY6MyQW877tpRExHgMSVCq23bkgAYBwBX6Z/t2ftS/COW6vmt/H1jqVxJFMhtbKR7oxzbVTygighACr53gDLY9DX5b+LtbXxL4mvr6OHyY7qUskfGVXoM44zgc4r6XKMPJQu7o8vM6
0JWUdWZjSB0470lNljAPy8YqP5q9SVRrSR5Q9TzVvTr+TTryG5hbbNbyLIh9GU5FUlORUiNV05aWFsfdvwW/av8L+JvCVrJLqdnpOoQkCS2vLlYpI3A/hZiNy+hH4gHivTvD/AO1B4d0eHd/aVuY4/wB6FikVlc9Tgg4yfcjrX5krLlCuF+YgnKjPHv179q2fEniez1rR9EtrPQdN0ebSbVoLi6tXmabVJC5bzpfMdgGwQuECrweOgHDPLabejZ6cM0mo2aMq6vJNQupbiZt01w7SO3qzHJP61EzbRQWCimO+efyr03LQ8wa5ptBNR1yyldjSuOQ807OKKKVMch4+ZaXH1ooroiSDHJLdzzT7C6W1uC8lvDcrtYbJd23p1+Ug8fWiipmBWlbc3154FNoorlnuaI//2Q==" alt=""/>
</div>
<strong>Jeff Spencer – </strong>
<strong>Portfolio Development Executive, Siemens</strong>
<div>
<p>Jeff Spencer is a Portfolio Development Executive at Siemens with over 22 years of industry experience in Big Data Analytics, 3D Design, and Product Lifecycle Management.</p>
</div>
</div>
<p>
<strong>Attendees of this IEEE Spectrum webinar have the opportunity to earn PDHs or Continuing Education Certificates!</strong>  To request your certificate, you will need a webinar code: once you have registered and viewed the webinar, send a request to techinsider@ieee.org. Then complete the form here: <a shape="rect" href="https://fs25.formsite.com/ieeevcep/form112/index.html">https://fs25.formsite.com/ieeevcep/form112/index.html</a>
</p>
<div>
<em>
<strong>Attendance is free. To access the event please register.</strong>
</em>
</div>
<div>
<em>
<strong>NOTE: By registering for this webinar, you understand and agree that IEEE Spectrum will share your contact information with the sponsors of this webinar and that both IEEE Spectrum and the sponsors may send email communications to you in the future.</strong>
</em>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/tYSI_JT6ooM" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Mon, 10 Jul 2017 19:30:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/webinar/how-analytics-can-be-used-to-drive-more-effective-business-decisions-around-additive-manufacturing</guid>
<feedburner:origLink>http://spectrum.ieee.org/webinar/how-analytics-can-be-used-to-drive-more-effective-business-decisions-around-additive-manufacturing</feedburner:origLink></item>
<item>
<title>Roomba Inventor Joe Jones: Why I Think Home Robots Will Become Invisible</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/HEIT-Fa03Lk/why-i-think-home-robots-will-become-invisible</link>
<description>Joe Jones, the inventor of the Roomba, argues that home robots will follow computers into the shadows</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Joe Jones, the inventor of the Roomba, argues that home robots will follow computers into the shadows<figure>
<img src="http://spectrum.ieee.org/image/MjkyNDU2NA.jpeg"/>
<figcaption>Photo-illustration: IEEE Spectrum; Roomba image: iRobot</figcaption>
<figcaption>In this guest post, Joe Jones, the inventor of the Roomba, argues that home robots will follow computers into the shadows.</figcaption>
</figure>
<div>
<p>How many computers do you own?</p>
<div/>
<p>If you picked a number close to three (say, laptop, tablet, and smartphone) you’re way off. The answer is probably dozens. There are computers in your car, in your appliances, in your thermostat, and maybe even in your light bulbs. Every year the number goes up.</p>
<p>Today, visible computers are just the slimmest tip of the iceberg. Most computers are hidden away, quietly performing their jobs without you even being aware of the work they do for you. That’s as it should be. You have no interest in the computers themselves; you just want certain tasks done.</p>
<div style="page-break-after: always"/>
<p>
<a shape="rect" href="http://spectrum.ieee.org/tag/Social+Robots">Cute, social robots</a> currently get a lot of press, but are these engaging devices early emissaries of our robotic future? Are we entering an era where no one would dream of living without a <a shape="rect" href="http://spectrum.ieee.org/robotics/home-robots/so-where-are-my-robot-servants">cheerful electromechanical companion</a>? In my view, companion robots offer novelty over utility, but once the novelty wears off, it’s only utility that people will pay for.</p>
<aside class="inlay pullquote rt med-lrg">
 Rather than being front and center, home robots, I believe, will follow computers into the shadows. Why? Because people don’t want robots.
</aside>
<p>Rather than being front and center, home robots, I believe, will follow computers into the shadows. Why? Because <em>people don’t want robots</em>. (I say this despite 30-plus years as a practicing roboticist.) Consumers want a spotless floor, not a machine buzzing around underfoot. Every morning, you want to find your dresser filled with clean clothes; you have no need to socialize with a laundry-bot, no matter how exuberant it may be. People want the things a robot can do for them; the robot itself may just get in the way.</p>
<p>Acknowledging that consumers don’t love robots the way we do might help roboticists build better products. The robot, I think, should not be an end in itself but instead should be the simplest, most cost-effective way to deliver what our customers truly want. Furthermore, if a proposed robot is not the simplest, most cost-effective solution to a problem consumers want solved, <em>then we shouldn’t build that robot</em>.</p>
<p>In the fairytale of the <a shape="rect" href="https://en.wikipedia.org/wiki/The_Elves_and_the_Shoemaker">shoemaker and the elves</a>, the shoemaker awakens each morning to find that his work is done. Discovering how the work was accomplished requires effort on the part of the shoemaker. This, I think, is good inspiration for robot developers.</p>
<p>Home robotics hasn’t achieved that happy ideal yet. We can program <a shape="rect" href="https://store.irobot.com/default/robot-vacuum-roomba/">Roomba</a> to emerge and work when no one is home, but it’s still necessary to empty the dirt compartment and clean the brushes. My newest robot, <a shape="rect" href="https://www.kickstarter.com/projects/rorymackean/tertill-the-solar-powered-weeding-robot-for-home-g?ref=9bb7cp">Tertill, which is available on Kickstarter</a>, is another step in the direction of invisibility—delivering a weed-free garden with <em>almost</em> no attention from the gardener.</p>
<p>I look forward to the day when the logistics of home life will simply run smoothly and no one need trouble themselves with the details. Unless they want to.</p>
<p>
<em>
<a shape="rect" href="https://www.linkedin.com/in/joe-jones-293b2b4/">Joe Jones</a> is co-founder and CTO of <a shape="rect" href="http://www.franklinrobotics.com/">Franklin Robotics</a>, which is developing a solar-powered garden-weeding robot named <a shape="rect" href="http://www.kickstarter.com/projects/rorymackean/tertill-the-solar-powered-weeding-robot-for-home-g?ref=9bb7cp">Tertill</a>. Previously, he was co-founder and CTO of <a shape="rect" href="https://www.public.harvestai.com/">Harvest Automation</a> and a senior roboticist at <a shape="rect" href="http://www.irobot.com/">iRobot</a>, where he was the co-inventor of the Roomba vacuuming robot. Follow him on Twitter: <a shape="rect" href="https://twitter.com/JoeRobotJones">@JoeRobotJones</a>
</em>
</p>
<h4>The views expressed in this guest post are solely those of the author and do not represent positions of <em>
<span>IEEE Spectrum</span>
</em> or the IEEE. This article was originally <a shape="rect" href="https://www.linkedin.com/pulse/home-robots-cobblers-elves-joe-jones">published</a> on LinkedIn.</h4>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/HEIT-Fa03Lk" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Mon, 10 Jul 2017 17:50:00 GMT</pubDate>
<dc:creator>Joe Jones</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/automaton/robotics/home-robots/why-i-think-home-robots-will-become-invisible</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyNDU4OA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyNDU4Ng.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/automaton/robotics/home-robots/why-i-think-home-robots-will-become-invisible</feedburner:origLink></item>
<item>
<title>Danish Electric Bike-Sharing Dodges Failure</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/Z_pnwLQI9uc/danish-electric-bikesharing-dodges-failure</link>
<description>Copenhagen's public electric bicycles are great to ride, but costly enough to nearly sink the system</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Copenhagen's public electric bicycles are great to ride, but costly enough to nearly sink the system<figure>
<img src="http://spectrum.ieee.org/image/MjkyMzk2OA.jpeg"/>
<figcaption>Photo: Lucas Laursen</figcaption>
</figure>
<div>
<p>Copenhagen's <a shape="rect" href="https://bycyklen.dk/">public electric bikes</a> are kind of a pain to get started: they are heavy and their coaster brake prevents riders from kicking the pedal around to a convenient starting place. The business side of the operation has also had a rough start, marked by <a shape="rect" href="http://www.cycling-embassy.dk/2015/11/02/the-bycyklen-bikes-have-arrived/">delivery delays</a>, bankruptcy, and restructuring. Once you do manage to push the bikes to a start, however, their 250-W electric motors kick in and they are a breeze to power around Copenhagen's well-marked and protected bike lanes.</p>
<p>It may not have been electricity, but something has also boosted the Copenhagen bike-sharing program: <a shape="rect" href="https://bycyklen.dk/en/news/user-statistics-june/">usage grew</a> from just 169,000 rides in 2015 to 933,000 last year, and the program, <span>called Bycyklen,</span> is on track for similar usage this year. That might be just enough to keep Bycyklen from falling over.</p>
<p>Private cycling is still much bigger, at 408 million journeys in 2016, according to a <a shape="rect" href="http://www.cycling-embassy.dk/wp-content/uploads/2017/07/19129-Cykelregnskab-2017_A4-bred_ENG_enkeltsider_final.pdf">report</a> [PDF] this month by Denmark's so-called “Cycling Embassy”. But raw ridership isn't the point: these bikes were supposed to encourage people to reduce car use, and the report offers no statistics on how many people made bike journeys instead of car journeys, or what Bycyklen’s contribution to that might have been. City planners around the world want to get people out of <a shape="rect" href="http://spectrum.ieee.org/cars-that-think/transportation/advanced-cars/how-pedestrians-can-protect-themselves-from-diesel-exhaust">polluting</a>, <a shape="rect" href="http://www.copenhagenize.com/2017/05/arrogance-of-space-copenhagen-hans.html">space-hogging</a> cars and into public transport, and electric bicycles are a tantalizing means to that end.</p>
<figure role="img" class="lt med">
<img src="http://spectrum.ieee.org/image/MjkyNDUwOQ.jpeg" alt="A row of white electric bicycles in front of a stately brick building in Copenhagen."/>
<figcaption class="hi-cap">
  Photo: Lucas Laursen
 </figcaption>
</figure>
<p>When IEEE Spectrum last visited Copenhagen (see <a shape="rect" href="http://spectrum.ieee.org/transportation/alternative-transportation/copenhagen-pioneers-smart-electricbike-sharing">"Copenhagen Pioneers Smart Electric-Bike Sharing"</a>, <span>IEEE Spectrum, </span>17 December 2013), the electric-bike-sharing program was in its pilot phase, with just a few dozen users. Bycyklen took two years to reach maturity, with all 1,860 bikes delivered and 87 of a planned 105 docking stations installed around Copenhagen by late 2015. The bicycles are ambitious: each carries a tablet computer on the handlebar that lets users log in and unlock bicycles, manage their accounts, and navigate to a destination or find nearby docking stations.</p>
<p>But they are expensive: a <a shape="rect" href="http://www.cycling-embassy.dk/2013/11/26/a-city-bike-costs-17000-kroner/">2013 story</a> reported that they cost around 17,000 DKK (about US $2,833 at the time), though they were leased for about a third of that per year. The Danish national train service and Copenhagen's municipal governments paid around US $1.23 million a year to the operators. Users <a shape="rect" href="https://bycyklen.dk/en/pricing/">pay</a> 30 DKK (US $4.60) per hour or get discounts for paying in advance or via subscription.</p>
<p>Despite those sources of revenue, the company that imported the bikes, Gobike, went under in March 2017 and the Bycyklen operator, Bikeshare, underwent a restructuring that included the pro-public transport <a shape="rect" href="http://www.cycling-embassy.dk/presentation-of-the-members/by-og-pendercykel-fonden/">City and Commuter Bike Foundation</a> stepping in to <a shape="rect" href="http://politiken.dk/indland/kobenhavn/art5906076/De-hvide-bycykler-bliver-på-de-københavnske-cykelstier">rescue the system</a> in April.</p>
<p>Such failures are common. Madrid, which launched its service in 2014 (<a shape="rect" href="http://spectrum.ieee.org/tech-talk/transportation/alternative-transportation/madrid-begins-electric-bike-sharing">"Madrid Begins Electric Bike Sharing"</a>, <em>IEEE Spectrum</em>, 29 July 2014), has had to take over operations from Bonopark, the private company that held the €25 million, ten-year contract. Even Bixi, the bike provider to New York City, London, and Montreal, had <a shape="rect" href="https://gigaom.com/2014/01/20/bike-sharing-faces-bankruptcy-cities-withhold-money-as-bixi-struggles/">financial troubles</a>. Last month, Wukong Bikes in China also <a shape="rect" href="http://www.bbc.com/news/business-40351409">declared bankruptcy</a>.</p>
<p>Despite such discouraging precedents, public docking stations in Copenhagen, Madrid, and other cities now face competition from smaller private conventional bike-sharing networks launched in 2015 by a company called <a shape="rect" href="https://www.donkey.bike/about/">Donkey Republic</a>. Users can unlock the bikes via their mobile phones using a Bluetooth signal. (The name comes from the simplicity of using a donkey for transport.) But to make it in the “<a shape="rect" href="https://www.donkey.bike/about/">vandal-burns-bicycle</a>” world out there, Donkey Republic and every other bike-sharing system will have to be more robust and affordable than the first-generation attempts have proven.</p>
<p>Here's hoping that the electric tablet-enabled bikes and the simple black Donkey bikes are both up to the challenge. Bicycles may offer city dwellers <a shape="rect" href="https://www.scientificamerican.com/article/health-benefits-of-bike-sharing-depend-on-age-gender/">health benefits</a> over cars, and despite their slow start, even clunky electric city bikes offer a better ride.</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/Z_pnwLQI9uc" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Mon, 10 Jul 2017 15:59:00 GMT</pubDate>
<dc:creator>Lucas Laursen</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/cars-that-think/transportation/alternative-transportation/danish-electric-bikesharing-dodges-failure</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMzk4MQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMzk3OQ.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/cars-that-think/transportation/alternative-transportation/danish-electric-bikesharing-dodges-failure</feedburner:origLink></item>
<item>
<title>DARPA Wants Brain Implants That Record From 1 Million Neurons</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/yl_uvSljm74/darpa-wants-brain-implants-that-record-from-1-million-neurons</link>
<description>In DARPA's new brain implant program, teams race to build functional prosthetics within four years</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>In DARPA's new brain implant program, teams race to build functional prosthetics within four years<figure>
<img src="http://spectrum.ieee.org/image/MjkyNDE1NQ.jpeg"/>
<figcaption>Image: Paradromics</figcaption>
<figcaption>One system will use implanted chips connected to bundles of microwires to interface with 1 million neurons. </figcaption>
</figure>
<div>
<p>DARPA is known for issuing big challenges. Still, the mission statement for its new <a shape="rect" href="http://www.darpa.mil/program/neural-engineering-system-design">Neural Engineering Systems Design</a> program is a doozy: Make neural implants that can record high-fidelity signals from 1 million neurons.</p>
<p>Today’s best brain implants, like the experimental system that a paralyzed man <a shape="rect" href="http://spectrum.ieee.org/biomedical/bionics/a-better-way-for-brains-to-control-robotic-arms">used to control a robotic arm</a>, record from just a few hundred neurons. Recording from 1 million neurons would provide a much richer signal that could be used to better control external devices such as wheelchairs, robots, and computer cursors.</p>
<p>What’s more, the DARPA program calls for the tech to be bidirectional; the implants must be able to not only record signals, but also to transmit computer-generated signals to the neurons. That feature would allow for neural prosthetics that provide blind people with visual information or deaf people with auditory info. </p>
<p>Today the agency <a shape="rect" href="http://www.darpa.mil/news-events/2017-07-10">announced the six research groups</a> that have been awarded grants under the NESD program. In a press release, DARPA says that even the 1-million-neuron goal is just a starting point. “A million neurons represents a miniscule percentage of the 86 billion neurons in the human brain. Its deeper complexities are going to remain a mystery for some time to come,” says Phillip Alvelda, who <a shape="rect" href="http://www.darpa.mil/news-events/2015-01-19">launched the program</a> in January. “But if we’re successful in delivering rich sensory signals directly to the brain, NESD will lay a broad foundation for new neurological therapies.”</p>
<div class="imgWrapper rt med">
<figure>
<img src="http://spectrum.ieee.org/image/MjkyNDE3Ng.jpeg" alt="A close-up image shows a bundle of microwires "/>
<figcaption class="hi-cap">
   Image: Paradromics
  </figcaption>
<figcaption>
   Paradromics will use bundles of microwire electrodes to interface with neurons.
  </figcaption>
</figure>
</div>
<p>One of the teams taking on the challenge is the Silicon Valley startup <a shape="rect" href="https://paradromics.com/">Paradromics</a>. Company CEO Matt Angle says his company is developing a device called the Neural Input-Output Bus (NIOB) that will use bundles of microwire electrodes to interface with neurons. With four bundles containing a total of 200,000 microwires, he says, the NIOB could record from or stimulate 1 million neurons. </p>
<p>“Microwire electrodes have been used since the 1950s, but traditionally they’re un-scaleable,” Angle tells <em>IEEE Spectrum</em> in an interview. With existing systems “you need to wire up one microwire to one amplifier—so if you want to use 100,000 microwires, that’s a lot of soldering work for a grad student,” he says. </p>
<p>Paradromics gets around this problem by polishing the end of a microwire bundle to make it very flat, and then bonding the whole bundle to a chip containing an array of CMOS amplifiers. “We make sure the probability of a single wire coming down and touching the pad on the CMOS is very, very high,” says Angle, “but if you have a few spots that don’t get wires, that doesn’t matter much.”</p>
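<p>A rough way to see why a few unconnected wires "don't matter much" is to treat each wire's contact with a pad as an independent coin flip. The back-of-envelope binomial model below is illustrative only; the 99 percent per-wire contact probability is an assumption for the sketch, not a figure from Paradromics.</p>

```python
import math

# Back-of-envelope model of microwire-to-CMOS bonding yield.
# p_connect = 0.99 is an assumed per-wire contact probability,
# not a number from Paradromics.
def bonding_yield(n_wires: int, p_connect: float):
    """Mean and standard deviation of the number of connected wires,
    treating each wire's contact as an independent Bernoulli trial."""
    mean = n_wires * p_connect
    sd = math.sqrt(n_wires * p_connect * (1 - p_connect))
    return mean, sd

mean, sd = bonding_yield(50_000, 0.99)
# About 49,500 of 50,000 wires connect, give or take a couple dozen,
# so scattered dead spots barely dent the channel count.
```

<p>The point of the model: at these wire counts the yield concentrates tightly around its mean, which is why per-wire soldering can be replaced by bulk bonding without losing a meaningful fraction of channels.</p>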
<div class="imgWrapper xlrg">
<figure>
<img src="http://spectrum.ieee.org/image/MjkyNDE1NA.jpeg" alt="A microscopic image shows the polished end of a bundle of microwire electrodes. "/>
<figcaption class="hi-cap">
   Image: Mina Hanna and Abdul Obaid, Stanford University
  </figcaption>
<figcaption>
   Each microwire in the bundle has a diameter of less than 20 micrometers.
  </figcaption>
</figure>
</div>
<p>As always, DARPA emphasizes the practical application of technology. By the end of the four-year NESD program, the teams are expected to have working prototypes that can be used in therapies for sensory restoration.</p>
<p>Paradromics’ goal is a speech prosthetic. The NIOB device’s microwires will record signals from the <a shape="rect" href="https://en.wikipedia.org/wiki/Superior_temporal_gyrus">superior temporal gyrus</a>, a brain area involved in audio processing that decodes speech at the level of sound units called <a shape="rect" href="https://en.wikipedia.org/wiki/Phoneme">phonemes</a> (other areas of the brain deal with higher-level semantics).</p>
<p>The company drew inspiration from neuroscientist <a shape="rect" href="http://knightlab.berkeley.edu/">Robert Knight</a> at University of California Berkeley, who has shown that when people read aloud or read silently to themselves the neural signal in the superior temporal gyrus <a shape="rect" href="http://knightlab.berkeley.edu/publications/detail/513/">can be used to reconstruct the words</a>. This finding suggests that a user could just imagine speaking a phrase, and a neural implant could record the signal and send the information to a speech synthesizer. </p>
<p>While Paradromics has chosen this speech prosthetic as its DARPA-funded goal, its hardware could be used for any number of neural applications. The differences would come from changing the location of the implant and from the software that decodes the signal.</p>
<p>The challenges ahead of Paradromics are significant. Angle imagines a series of implanted chips, each bonded to 50,000 microwires, that send their data to one central transmitter that sits on the surface of the skull, beneath the skin of the scalp. To deal efficiently with all that data, the implanted system will have to do some processing: “You need to make some decisions inside the body about what you want to send out,” Angle says, “because you can’t have it digitizing and transmitting 50 GB per second.” The central transmitter must then wirelessly send data to a receiver patch worn on the scalp, and must also wirelessly receive power from it. </p>
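<p>The scale of that data problem is easy to check with back-of-envelope arithmetic. The sketch below assumes a 30-kilohertz sampling rate and 16-bit samples, typical extracellular-recording figures that the article does not itself specify, so the exact totals are illustrative.</p>

```python
# Raw data rate of a neural recorder: channels x sample rate x bytes/sample.
# 30 kHz and 16 bits are assumed typical extracellular-recording figures;
# the article quotes only the ~50 GB/s order of magnitude.
def raw_rate_gb_per_s(channels: int, sample_hz: float, bits_per_sample: int) -> float:
    return channels * sample_hz * bits_per_sample / 8 / 1e9

full_system = raw_rate_gb_per_s(1_000_000, 30_000, 16)  # 60.0 GB/s for 1M channels
per_chip = raw_rate_gb_per_s(50_000, 30_000, 16)        # 3.0 GB/s per 50,000-wire chip
```

<p>Under these assumptions, even a single chip produces gigabytes per second of raw samples, which is why spike detection or compression has to happen inside the body before anything reaches the wireless link.</p>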
<p>The other five teams that won NESD grants are research groups investigating vision, speech, and the sense of touch. The group from Brown University, led by neural engineer <a shape="rect" href="http://nurmikko.engin.brown.edu/?q=node/9">Arto Nurmikko</a>, is working on a speech prosthetic using tens of thousands of independent “neurograins,” each about the size of a grain of table salt. Those grains will <a shape="rect" href="https://news.brown.edu/articles/2017/10/neurograins">interface with individual neurons</a>, and send their data to one electronics patch that will either be worn on the scalp or implanted under the skin. </p>
<div class="imgWrapper ct lrg">
<figure>
<img src="http://spectrum.ieee.org/image/MjkyNDIxNg.jpeg" alt="An illustration shows an electronics patch on the skull of a man, and tiny electrodes labeled &quot;neurograin network&quot; implanted in the brain."/>
<figcaption class="hi-cap">
   Image: Brown University
  </figcaption>
<figcaption>
   The team from Brown University is developing a system using a network of independent "neurograins."
  </figcaption>
</figure>
</div>
<p>In an email, Nurmikko writes that his team is working on such challenges as how to implant the neurograins, how to ensure that they’re hermetically sealed and safe, and how to handle the vast amount of data that they’ll generate. And the biggest challenge of all may be networking 10,000 or 100,000 neurograins together to make one coherent telecommunications system that provides meaningful data.</p>
<p>“Even with a hundred thousand such grains, we would still not reach every neuron—and that’s not the point,” Nurmikko writes. “You want to listen to a sufficiently large number of neurons to understand how, say, the auditory cortex computes ‘the Star Spangled Banner’ for us to have a clear perception of both the music and the words.”</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/yl_uvSljm74" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Mon, 10 Jul 2017 14:00:00 GMT</pubDate>
<dc:creator>Eliza Strickland</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/the-human-os/biomedical/devices/darpa-wants-brain-implants-that-record-from-1-million-neurons</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyNDE3NA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyNDE3Mg.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/the-human-os/biomedical/devices/darpa-wants-brain-implants-that-record-from-1-million-neurons</feedburner:origLink></item>
<item>
<title>Automotive FMCW Radar System Design Using 3D Framework for Scenario Modeling</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/9UAZQ6djKYk/automotive-fmcw-radar-system-design-using-3d-framework-for-scenario-modeling</link>
<description>Preventing human casualties caused by car collisions is a high priority. Engineers are using the SystemVue Scenario Framework Solution for automotive frequency modulated continuous waveform (FMCW) radar system simulation to increase design fidelity and save cost during design and test.</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Preventing human casualties caused by car collisions is a high priority. Engineers are using the SystemVue Scenario Framework Solution for automotive frequency modulated continuous waveform (FMCW) radar system simulation to increase design fidelity and save cost during design and test.<div>
<p>
<span>Preventing human casualties caused by car collisions is a high priority. Today, radar systems, such as adaptive cruise control, stop-and-go, blind spot detection, lane change assist, and rear crash warning, are being developed and integrated into automobiles to reduce possible collisions. Engineers are using the SystemVue Scenario Framework Solution for automotive </span>frequency modulated continuous waveform<span> (FMCW) radar system simulation to increase design fidelity and save cost during design and test.</span>
</p>
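<p>For readers unfamiliar with FMCW, the core idea fits in a few lines: the radar transmits a linear frequency chirp, mixes the echo with the transmitted signal, and reads target range off the resulting beat frequency. The sketch below uses illustrative parameters (150 MHz sweep bandwidth, 1 ms sweep time), not values from the white paper.</p>

```python
C = 3e8  # speed of light, m/s

def range_from_beat(f_beat_hz: float, bandwidth_hz: float, sweep_s: float) -> float:
    """Linear-chirp FMCW: a target at range R delays the echo by 2R/c,
    producing a beat frequency f_b = 2*R*B/(c*T); invert for R."""
    return C * sweep_s * f_beat_hz / (2 * bandwidth_hz)

# With an assumed 150 MHz / 1 ms chirp, a 100 kHz beat corresponds
# to a target roughly 100 m away.
r = range_from_beat(100e3, 150e6, 1e-3)
```

<p>Scenario-level simulators such as the one described here layer many such targets, antenna patterns, and clutter on top of this basic relationship.</p>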
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/9UAZQ6djKYk" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Mon, 10 Jul 2017 13:30:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/whitepaper/automotive-fmcw-radar-system-design-using-3d-framework-for-scenario-modeling</guid>
<feedburner:origLink>http://spectrum.ieee.org/whitepaper/automotive-fmcw-radar-system-design-using-3d-framework-for-scenario-modeling</feedburner:origLink></item>
<item>
<title>In FutureLearn's MOOCs, Conversation Powers Learning at Massive Scale</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/w503f8BKWpk/conversation-powers-personalized-learning-in-futurelearn-mooc</link>
<description>Personalized learning has to get social. Students learn better through conversation</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Personalized learning has to get social. Students learn better through conversation<figure>
<img src="http://spectrum.ieee.org/image/MjkyNDAzNg.jpeg"/>
<figcaption>Illustration: iStockphoto</figcaption>
</figure>
<div>
<p>“Personalized learning” is one of the hottest trends in education these days. The idea is to create software that tracks the progress of each student and then adapts the content, pace of instruction, and assessment to the individual’s performance. These systems succeed by providing immediate feedback that addresses the student’s misunderstandings and offers additional instruction and materials.</p>
<p>The Bill &amp; Melinda Gates Foundation has reportedly spent more than US $300 million on personalized learning R&amp;D, while the Chan Zuckerberg Initiative—the investment and philanthropic company created by Facebook CEO Mark Zuckerberg and his wife, Priscilla Chan—has also signaled its commitment to personalized learning (which <a shape="rect" href="https://www.facebook.com/zuck/posts/10102342632826831">Zuckerberg announced on Facebook</a>, of course). Just last month, the two groups teamed up for the first time to jointly fund <a shape="rect" href="http://www.newprofit.org/new-profit-launches-personalized-learning-initiative/">a $12 million program</a> to promote personalized classroom instruction.</p>
<p>But personalized learning is hard to do. It requires breaking down a topic into its component parts in order to create different pathways through the material. It can be done, with difficulty, for well-structured and well-established topics, such as algebra and computer programming. But it really can’t be done for subjects that don’t form neat chunks, such as economics or psychology, nor for still-evolving areas, such as cybersecurity.</p>
<p>What’s more, this latest wave of personalized learning may have the unintended consequence of isolating students because it ignores the biggest advance in education of the past 50 years: learning through cooperation and conversation. It’s ironic that the inventor of the world’s leading social media platform is promoting education that’s the opposite of social.</p>
<p>Interestingly, one early proponent of personalized learning had a far more expansive view. In the 1960s, <a shape="rect" href="http://www.independent.co.uk/news/people/obituary-professor-gordon-pask-1304435.html">Gordon Pask</a>, a deeply eccentric British scientist who pioneered the application of cybernetics to entertainment, architecture, and education, co-invented the <a shape="rect" href="http://hackeducation.com/2015/03/28/pask">first commercial adaptive teaching machine</a>, which trained typists in keyboard skills and adjusted the training to their personal characteristics. A decade later, Pask extended personalized learning into a grand unified theory of learning as conversation.</p>
<p>For the layperson and even for a lot of experts, Pask’s <a shape="rect" href="http://www.aect.org/edtech/08.pdf">Conversation Theory</a> is impenetrable. But for those who manage to grasp it, it’s quite exciting. In essence, it explains how language-using systems, including people and artificial intelligences, can come to know things through well-structured conversation. He proposed that all human learning involves conversation. We converse with ourselves when we relate new experience to what we already know. We converse with teachers when we respond to their questions and they correct our misunderstandings. We converse with other learners to reach agreement.</p>
<p>This is more than an abstract theory of learning. It is a blueprint for designing educational technology. Pask himself developed teaching machines that conversed with students in a formalized language, represented as dynamic maps of interconnected concepts. He also introduced conversational teaching methods, such as <a shape="rect" href="http://www.open.ac.uk/blogs/innovating/?page_id=276">Teachback</a>, where the student explains to the teacher what has just been taught.</p>
<p>Pask’s theory still has relevance today. I know, because for the past four years, I’ve helped develop a new MOOC (Massive Open Online Course) platform based on his ideas. The platform is operated by <a shape="rect" href="https://www.futurelearn.com/">FutureLearn</a>, a company owned by <a shape="rect" href="http://www.open.ac.uk">The Open University</a>, the UK’s 48-year-old public distance learning and research university.</p>
<p>As Academic Lead for FutureLearn, I was determined not to copy existing MOOC platforms, which primarily focus on <a shape="rect" href="http://spectrum.ieee.org/tech-talk/at-work/education/how-the-pioneers-of-the-mooc-got-it-wrong">delivering lectures at a distance</a>. Instead, we designed FutureLearn for learning as conversation, and in such a way that learning would improve with scale, so that the more people who signed up, the better the learning experience would be.</p>
<p>Every course involves conversation as a core element. Each teaching step, whether video, text, or interactive exercise, has a flow of comments, questions, and replies from learners running alongside it. The steps make careful use of questions to prompt responses: What was the most important thing you learned from the video? Can you give an example from your own experience?</p>
<p>There are also dedicated discussions, in which learners reflect on the week’s activity, describe how they performed on assessments, or answer an open-ended question about the course. And online study groups allow learners to work together on a task and discuss their learning goals.</p>
<p>Even student assessment has a conversational component. Learners write short structured reviews of other students’ assignments, and in return they receive reviews of their assignments from their peers. Quizzes and tests are marked by computer, but the results come with pre-written responses from the educator.</p>
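The reciprocal reviews described above can be sketched as a simple circular assignment. This is a hypothetical illustration of one way such pairings could be computed, not FutureLearn's actual implementation:

```python
def assign_reviews(learners, reviews_per_learner=2):
    """Circularly assign peer reviewers so that every learner gives and
    receives the same number of reviews, and nobody reviews their own
    work. Assumes reviews_per_learner is smaller than the class size."""
    n = len(learners)
    assignments = {}
    for i, learner in enumerate(learners):
        # Each learner reviews the next k learners in circular order.
        assignments[learner] = [
            learners[(i + offset) % n]
            for offset in range(1, reviews_per_learner + 1)
        ]
    return assignments

pairs = assign_reviews(["ana", "ben", "chi", "dee"])
# Every learner writes two reviews and receives two in return.
```

The circular offset is what makes the exchange reciprocal: the number of reviews written always equals the number received.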
<p>When we began designing FutureLearn, previous research suggested that students don’t like to collaborate and converse online. Other online learning platforms that provide forums to discuss a course find these features are generally not well used. But that may be because these features are peripheral, whereas we put conversation at the heart of learning.</p>
<p>From the start, the conversations took off. In June 2015, the British Council ran the <a shape="rect" href="https://www.timeshighereducation.com/news/biggest-ever-mooc-starts-on-futurelearn/2020257.article">largest-ever MOOC</a>, on preparing for the IELTS English language proficiency exam. Some 271,000 people joined the FutureLearn course, including many based in the Middle East and Asia. Just one video on that course attracted over 60,000 comments from learners. By then, we had realized that the scale of conversation needed to be tamed by using the social media techniques of liking and following. We also encouraged course facilitators to reply to the most-liked comments so that learners who were following the facilitators would see them.</p>
<p>We had expected to deal with abusive comments on courses like “Muslims in Britain” and “Climate Change.” That hasn’t happened, and we aren’t entirely sure why. The initial testers of FutureLearn were Open University alumni, so perhaps they modelled good practice. Comments are moderated to remove the occasional abusive remark, but most of the conversation streams are so overwhelmingly positive that dissenters get constructive responses rather than triggering flame wars.</p>
<p>To be clear, students aren’t required to take part in a discussion to complete a FutureLearn course, but the learning is definitely enriched when students read the responses of other learners and join in. On average, a third of learners on a FutureLearn course contribute comments and replies.</p>
<p>FutureLearn is now a worldwide MOOC platform, with more than six million total registrations. We’re continuing to consider new conversational features, such as reflective conversations where learners write and discuss annotations on the teaching material, and experiential learning where learners share their personal insights and experiences.</p>
<p>FutureLearn has taken the path of social learning and proven that it can work at scale. Going forward, the big challenge for FutureLearn and for educational technology in general will be to find ways of combining the individual pathways and adaptive content of personalized learning with the benefits of learning through conversation and collaboration.</p>
<p/>
<p>About the Author</p>
<p>
<a shape="rect" href="http://www.open.ac.uk/people/ms8679">Mike Sharples</a> is Professor of Educational Technology at The Open University and Academic Lead at FutureLearn. He is Associate Editor in Chief of <em>IEEE Transactions on Learning Technologies</em> and a Senior Member of IEEE.</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/w503f8BKWpk" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Mon, 10 Jul 2017 12:00:00 GMT</pubDate>
<dc:creator>Mike Sharples</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/tech-talk/at-work/education/conversation-powers-personalized-learning-in-futurelearn-mooc</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMzM1Mw.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMzM1MQ.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/tech-talk/at-work/education/conversation-powers-personalized-learning-in-futurelearn-mooc</feedburner:origLink></item>
<item>
<title>Underwater Robots Learn a New Language, JANUS</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/AX4ONU38ciI/underwater-robots-learn-a-new-language-janus</link>
<description>The new JANUS acoustic signal will connect aquatic robots and sensors into an “Internet of Underwater Things”</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>The new JANUS acoustic signal will connect aquatic robots and sensors into an “Internet  of Underwater Things”<figure>
<img src="http://spectrum.ieee.org/image/MjkyMjIwMw.jpeg"/>
<figcaption>Photo: NATO</figcaption>
</figure>
<div>
<p>For decades, global standards defining Wi-Fi and cellular networks have allowed people to exchange data over the air. But those technologies are worthless below the waves, and until now there have been no such standards for underwater communications.</p>
<p>
<span>“We live in a time of wild west communications underwater,” says João Alves, a principal scientist for NATO.</span>
</p>
<p>Now, Alves and other NATO researchers have <a shape="rect" href="http://www.nato.int/cps/bu/natohq/news_143247.htm">
<span>established the first international standard</span>
</a> for underwater communications. Named <a shape="rect" href="http://ieeexplore.ieee.org/document/7017134/">
<span>JANUS</span>
</a>, after the<a shape="rect" href="https://www.britannica.com/topic/Janus-Roman-god">
<span> Roman god of gateways</span>
</a>, it creates a common protocol for an acoustic signal with which underwater systems can connect.</p>
<p>Acoustics has long been a popular medium for underwater communications. Generally, optical signals can deliver high data rates underwater at distances of up to 100 meters, while sound waves cover much greater distances at lower data rates.</p>
<p>The main role of JANUS is to bring today’s acoustic systems into sync with one another. It does this in part by defining a common frequency—<a shape="rect" href="https://www.youtube.com/watch?v=rSW5eDoZgAw">
<span>11.5 kilohertz</span>
</a>—over which all systems can announce their presence. Once two systems make contact through JANUS, they may decide to switch to a different frequency or protocol that could deliver higher data rates or travel further.</p>
<p>In this way, Alves compares JANUS to the English language—two visitors to a foreign country may speak English to one another before realizing they are both native Spanish speakers, and switch to their native tongue.</p>
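That discover-then-switch pattern can be sketched in a few lines. This is a simplified, hypothetical illustration: real JANUS packets carry structured capability data, and the protocol names and rates below are invented.

```python
JANUS_FREQ_KHZ = 11.5  # the common discovery channel defined by the standard

def negotiate(ours, theirs):
    """After two systems have found each other over JANUS, pick the
    fastest protocol both support; fall back to JANUS itself if none.
    `ours` and `theirs` map protocol name -> supported data rate."""
    shared = set(ours) & set(theirs)
    if not shared:
        return "JANUS"  # no common upgrade; keep speaking the lingua franca
    # Choose the protocol whose slower side is still the fastest option.
    return max(shared, key=lambda p: min(ours[p], theirs[p]))

# Two "native speakers" of the same vendor protocol discover each other
# on the common channel and switch to it:
best = negotiate({"vendorA-hi": 80, "generic": 10},
                 {"vendorA-hi": 80, "generic": 10})
```

As in the English-speakers analogy, the common channel is only for introductions; the real conversation can happen on whatever faster link both sides share.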
<aside class="inlay pullquote rt med">
 “We live in a time of wild west communications underwater.”
</aside>
<p>The JANUS standard was developed by Alves’ team at NATO’s <a shape="rect" href="http://www.cmre.nato.int/">
<span>Centre for Maritime Research and Experimentation</span>
</a> in La Spezia, Italy and sponsored by NATO’s <a shape="rect" href="http://www.act.nato.int/">
<span>Allied Command Transformation.</span>
</a> It is the first underwater communications standard to be defined by an international body.
</p>
<p>To create JANUS, Alves’ team relied on the Littoral Ocean Observatory Network, a collection of acoustic tripods that NATO researchers have placed on the seafloor in the harbour of La Spezia, Italy. In another series of tests, researchers aboard the research vessel <em>Alliance</em>, a NATO ship operated by the Italian Navy, measured the performance of JANUS signals along the surface of the ocean.</p>
<p>Once deployed, aquatic systems could use JANUS to send data directly to each other, or to “gateway buoys” bobbing on the water’s surface. The buoys could then use radio waves to relay that data to nearby control centers.</p>
<p>Based on their work, Alves says submarines could also use JANUS to issue calls for help to ships and rescue crews. “Using an open scheme like JANUS to issue distress calls would increase incredibly the chances of those being picked up,” he says.</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/AX4ONU38ciI" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Sat, 8 Jul 2017 15:00:00 GMT</pubDate>
<dc:creator>Amy Nordrum and Alyssa Pagano</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/video/telecom/standards/underwater-robots-learn-a-new-language-janus</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMjIxNw.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMjIxOQ.gif" height="" width="" />
<feedburner:origLink>http://spectrum.ieee.org/video/telecom/standards/underwater-robots-learn-a-new-language-janus</feedburner:origLink></item>
<item>
<title>Building a Battery-Free Cellphone</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/Il5VKQTXnmA/building-a-batteryfree-cellphone</link>
<description>A prototype from the University of Washington leverages a backscattered radiofrequency wave to transmit analog signals</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>A prototype from the University of Washington leverages a backscattered radiofrequency wave to transmit analog signals<figure>
<img src="http://spectrum.ieee.org/image/MjkyMzExMA.jpeg"/>
<figcaption>Photo: University of Washington</figcaption>
</figure>
<div>
<p>Batteries can be a real drag. They’re expensive and must be constantly recharged. Though some battery-free sensors can passively transmit small amounts of data, most consumer electronics today still rely on bulky batteries to store power.</p>
<p/>
<p>A team from the University of Washington has built a <a shape="rect" href="http://batteryfreephone.cs.washington.edu/">battery-free cellphone</a> that can harness power from radiofrequency (RF) waves sent to it from a nearby base station. The phone not only harnesses the power it needs to operate from those waves, but can also place a voice call by modifying and reflecting the same waves back to the base station, through a technique known as backscattering.</p>
<p>The UW team has shown that their device (built from off-the-shelf components) can use harvested power to place a call from 9.4 meters away from a customized base station. They also built a version outfitted with photodiodes that collect ambient light to passively power the device, allowing them to place a call from a distance of 15.2 meters.</p>
<p/>
<p>To place or receive a call, the entire device consumes just 2 to 3 microwatts of power. The group’s design supports only voice calls—there’s no data plan—but its creators say it would still prove quite useful in certain circumstances.</p>
<p>“Imagine a scenario where your phone died but you could at least have enough power to make a 9-1-1 call,” says <a shape="rect" href="http://vamsitalla.com.s3-website-us-west-1.amazonaws.com/">Vamsi Talla</a>, who built the phone while a post-doc in electrical engineering at the University of Washington. “That could be a lifesaver.”</p>
<p>Many of today’s passive sensors transmit data only occasionally, perhaps every minute or so, due to power constraints. Others, such as RFID tags, must be very close to a reader to harness enough power to transmit a message.</p>
<p>
<span>In </span>
<a shape="rect" style="font-family: Georgia, serif; font-size: 18px;" href="http://batteryfreephone.cs.washington.edu/files/batteryFreePhone.pdf">a conference paper </a>
<span>published earlier this month, </span>Talla, who now serves as chief technology officer of <a shape="rect" href="https://www.jeevawireless.com/">Jeeva Wireless</a>, and his colleagues call their design “a major leap” toward the creation of battery-free devices. Ultimately, they want to build devices that can constantly transmit or receive data and voice calls over long distances without batteries.</p>
<p>“Now we’re showing the world that a battery-free device doesn’t have to be a sensor, but it can be a whole system where, in real time, you can actually do something useful,” Talla says.</p>
<p/>
<p>
<a shape="rect" href="https://users.ece.cmu.edu/~raj/">Raj Rajkumar</a>, a professor in electrical engineering at Carnegie Mellon University, says the research is “another interesting step in the evolution of wireless power transmission.” He also noted that follow-up studies would need to evaluate the safety of transmitting power to mobile devices in this way.</p>
<p>For now, the UW device only works with customized base stations within close range of the user. Being near a base station may not always be possible for users who need to place an urgent call. But Talla says this could change with the anticipated rollout of 5G networks, in which providers are expected to dramatically increase the density of base stations—at least in cities.</p>
<p>He also expects to achieve greater distances at other frequencies. In their initial tests, the base station broadcast a single tone on the 915 megahertz frequency band to the device.</p>
<p/>
<p>To place a call, the battery-free phone uses an <a shape="rect" href="https://en.wikipedia.org/wiki/Electret_microphone">electret microphone</a> to generate an analog signal. An electret microphone contains a diaphragm with a fixed electrostatic charge. Within the microphone, the diaphragm forms a capacitor with a metal plate. When a person speaks, mechanical vibrations from their voice cause the diaphragm to change shape relative to the metal plate. This affects the capacitance of the device and generates a small voltage.</p>
<p/>
<p>The microphone connects to an antenna through an RF switch. The voltage from the microphone travels to the antenna, where it directly alters the amplitude of the single tone embedded in the RF wave. The altered signal is then reflected back to the base station using backscattering techniques. These methods reduce the phone’s power consumption by three or four orders of magnitude compared to a traditional radio.</p>
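In signal terms, the microphone voltage amplitude-modulates the reflected tone, and the base station recovers the voice by envelope detection. The toy sketch below illustrates that round trip under idealized assumptions; the function, modulation depth, and sampling are ours for illustration, and real backscatter works by switching the antenna's impedance rather than by arithmetic on samples.

```python
import math

CARRIER_HZ = 915e6  # the single tone broadcast by the base station

def backscatter(voice, depth=0.5):
    """Amplitude-modulate a reflected carrier with one voice sample in
    [-1, 1], then recover it at the base station by envelope detection.
    Returns (peak reflected amplitude, recovered voice sample)."""
    amplitude = 1.0 + depth * voice  # reflection strength tracks the mic voltage
    # Sample one carrier cycle; the envelope (peak) carries the voice signal.
    peak = max(amplitude * math.sin(2 * math.pi * CARRIER_HZ * t)
               for t in [i / (CARRIER_HZ * 64) for i in range(64)])
    recovered = (peak - 1.0) / depth  # envelope detection at the base station
    return peak, recovered

_, sample = backscatter(0.8)
```

Because only the reflection is modulated, the phone never has to generate a carrier of its own, which is where the bulk of the power saving comes from.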
<p/>
<p>The phone’s design was inspired in part by the <a shape="rect" href="http://www.cryptomuseum.com/covert/bugs/thing/index.htm">Great Seal Bug</a>, a passive surveillance device planted in the desk of the U.S. Ambassador to Moscow by Russian authorities in the late 1940s. The UW phone is also half-duplex, which means a user can either listen or talk, but can’t do both at the same time. A microcontroller manages the RF switch, connecting the microphone to the antenna when a user presses a button to talk, and connecting the earphones when the user wants to listen.</p>
<p/>
<p>To minimize power consumption, the team moved much of the processing that would typically be performed on a phone to their customized base station. Smartphones today contain components that convert analog sound to digital signals before transmission, and other components that convert the digital signals received from a base station to analog sound.</p>
<p>In the UW system, the base station performs these conversions and connects to the nationwide cellular network, forwarding calls or sending signals it receives back to the user. Talla says the group will continue to refine the technology through a licensing agreement with Jeeva Wireless.</p>
<p/>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/Il5VKQTXnmA" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 7 Jul 2017 16:00:00 GMT</pubDate>
<dc:creator>Amy Nordrum</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/building-a-batteryfree-cellphone</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMzEzNQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMzEzMw.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/building-a-batteryfree-cellphone</feedburner:origLink></item>
<item>
<title>Video Friday: DARPA's LUKE Arm, Human Support Robot, and Starting a Robotics Company</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/sg-vxVxMEDc/video-friday-darpa-luke-arm-human-support-robot-starting-robotics-company</link>
<description>Your weekly selection of awesome robot videos</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Your weekly selection of awesome robot videos<figure>
<img src="http://spectrum.ieee.org/image/MjkyMzI5OQ.jpeg"/>
<figcaption>Photo: Mobius Bionics</figcaption>
</figure>
<div>
<p>Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (<a shape="rect" href="mailto:e.guizzo@ieee.org; evan.ackerman@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a>!):</p>
<h5>
<a shape="rect" href="http://www.ee.cuhk.edu.hk/~qhmeng/icar2017/index.html">ICAR 2017</a> – July 10-12, 2017 – Hong Kong</h5>
<h5>
<a shape="rect" href="http://www.roboticsconference.org/">RSS 2017</a> – July 12-16, 2017 – Cambridge, Mass., USA</h5>
<h5>
<a shape="rect" href="http://marss-conference.org/">MARSS</a> – July 17-21, 2017 – Montreal, Canada</h5>
<h5>
<a shape="rect" href="http://soma-summerschool.dlr.de/">Summer School on Soft Manipulation</a> – July 17-21, 2017 – Lake Chiemsee, Germany</h5>
<h5>
<a shape="rect" href="http://livingmachinesconference.eu/2017/">Living Machines Conference</a> – July 25-28, 2017 – Stanford, Calif., USA</h5>
<h5>
<a shape="rect" href="http://www.robocup2017.org/eng/index.html">RoboCup 2017</a> – July 27-31, 2017 – Nagoya, Japan</h5>
<h5>
<a shape="rect" href="http://case2017.org/">IEEE CASE 2017</a> – August 20-23, 2017 – Xi’an, China</h5>
<h5>
<a shape="rect" href="http://www.ieee-arm.org/">IEEE ICARM 2017</a> – August 27-31, 2017 – Hefei, China</h5>
<p>
<a shape="rect" href="mailto:e.guizzo@ieee.org; evan.ackerman@ieee.org?subject=Robot%20video%20suggestion%20for%20Video%20Friday">Let us know</a> if you have suggestions for next week, and enjoy today’s videos.</p>
<hr/>
<p>Dean Kamen’s DEKA R&amp;D firm, with support from DARPA’s Revolutionizing Prosthetics Program, <a shape="rect" href="http://spectrum.ieee.org/biomedical/bionics/dean-kamens-luke-arm-prosthesis-readies-for-clinical-trials">designed the advanced prosthetic LUKE Arm</a> to give amputees “dexterous arm and hand movement through a simple, intuitive control system.” The <a shape="rect" href="http://spectrum.ieee.org/video/biomedical/bionics/dean-kamens-artificial-arm">LUKE Arm</a>, which <span>stands for Life Under Kinetic Evolution but is also a reference to <a shape="rect" href="https://www.youtube.com/watch?v=cik8cl_n9AE">Luke Skywalker’s bionic hand</a>, “</span>
<span>allows users to control multiple joints simultaneously and provides a variety of grips and grip forces by means of wireless signals generated by sensors worn on the feet or via other easy-to-use controllers.</span>
<span>” It </span>
<a shape="rect" href="http://spectrum.ieee.org/automaton/biomedical/bionics/dean-kamen-luke-arm-prosthesis-receives-fda-approval">received FDA approval in 2014</a>, and will now be commercialized by Mobius Bionics of Manchester, N.H. </p>
<blockquote>
<p>
<em>On Friday, June 30th, at a ceremony at the Manhattan campus of the Department of Veterans Affairs’ New York Harbor Health Care System, two veterans living with arm amputations became the first recipients of a new generation of prosthetic limb that promises them unprecedented, near-natural arm and hand motion. The modular, battery-powered arms, designed and developed by DEKA Research and Development Corporation for DARPA, represent the most significant advance in upper extremity prosthetics in more than a century.</em>
</p>
</blockquote>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/Zg-FH1Gn2Ls" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>[ <span>
<a shape="rect" href="http://www.mobiusbionics.com/">Mobius Bionics</a> ] and</span>
<em> </em>[ <a shape="rect" href="http://www.darpa.mil/program/revolutionizing-prosthetics">DARPA</a> ]</p>
<hr/>
<blockquote>
<p>
<em>On September 16, 2008, the life of U.S. Army Chief Warrant Officer Romulo "Romy" Camargo changed. During a humanitarian mission for his third deployment in Afghanistan, Romy’s team was ambushed and he was struck by a bullet. At that moment, he unwittingly began a new life of limited mobility — he had just become paralyzed from the shoulders down. With the idea of ’Mobility for All’ being a key driving principle behind Toyota, Romy’s story has provided a perfect opportunity to demonstrate just what that means. To Toyota, mobility goes beyond selling automobiles — it means helping people navigate their world and live the life they want to live, regardless of their circumstances. Toyota is dedicated to developing ways to utilize advanced technology to improve quality of life, especially among people with limited mobility.</em>
</p>
</blockquote>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/t5-uqGllcp8" width="620" frameborder="0" height="349"/>
</p>
<p/>
<blockquote>
<p>
<em>Nowhere is this principle more evident than in Toyota’s Technology for Human Support division. And with the help of Romy and his family, Toyota was able to carry out North America’s first in-home trial of the amazing Human Support Robot (HSR). While the HSR is still in an experimental research phase, and not destined for mass production anytime soon, its unique design and abilities immediately proved useful in assisting Romy with everyday activities like opening doors and bringing food from the kitchen, helping him to regain some independence and live more freely.</em>
</p>
</blockquote>
<p>[ <a shape="rect" href="https://www.toyota.com/usa/toyota-effect/romy-robot.html">Toyota</a> ]</p>
<hr/>
<p>Nick Kohut from <a shape="rect" href="http://dashrobotics.com">Dash Robotics</a> talks to Udacity about learning robotics and starting a hardware-based robotics company. It’s a good interview, and worth your time.</p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/fGDn6FenCaA?rel=0" width="620" frameborder="0" height="349"/>
</p>
<p>[ <a shape="rect" href="https://sites.google.com/knowlabs.com/udacity-robotics-slack/interviews?authuser=0">Udacity</a> ] via [ <a shape="rect" href="http://robohub.org/udacity-robotics-video-interview-series/">Robohub</a> ]</p>
<hr/>
<blockquote>
<p>
<em>Spacecraft equipped with gecko-inspired dry adhesive grippers can dynamically grasp objects having a wide variety of featureless surfaces. In this paper we propose an optimization-based control strategy to exploit the dynamic robustness of such grippers for the task of grasping a free-floating, spinning object.</em>
</p>
</blockquote>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/TnV2Hbe-Qps" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>More info in the paper at the link below.</p>
<p>[ <a shape="rect" href="https://asl.stanford.edu/wp-content/papercite-data/pdf/MacPherson.Hockman.Bylard.ea.FSR17.pdf">Paper</a> ] via [ <a shape="rect" href="http://news.stanford.edu/2017/06/16/engineers-space-robot-technology-helps-self-driving-cars/">Stanford</a> ]</p>
<p/>
<p/>
<p/>
<hr/>
<p/>
<p>EMYS knows the best way to teach kids a new language: bribery!</p>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/LGPBnLRMRgc" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>[ <a shape="rect" href="http://emys.co/">EMYS</a> ]</p>
<p>
<em>Thanks Jan!</em>
</p>
<p/>
<hr/>
<p/>
<p>
<a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/diy/anki-code-lab-brings-sophisticated-graphical-programming-to-cozmo-robot">Cozmo</a> plus Apple ARKit makes for an awfully cute augmented reality fireworks show:</p>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/ZBvvPZXgNwU" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>[ <a shape="rect" href="https://anki.com/en-us/4thofjuly">Anki</a> ]</p>
<p/>
<p/>
<p/>
<hr/>
<p/>
<blockquote>
<p>
<em>In June and July 2017, the Institute of Robotics and Mechatronics conducted tests on the volcano Etna in Sicily under the Helmholtz Alliance Robex. This video shows part of the moon-analog mission. In the experiment, the rover drives to the lunar lander and uses its robot arm to release the seismic instrument from its screw joint. The instrument is then placed in a holder on the rover, which drives to a predetermined point to set the seismometer on the ground. The seismometer takes a measurement, is picked up again, and is carried autonomously by the robot arm to a further point on the traverse and set down there.</em>
</p>
</blockquote>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/-wXQf0b1bqQ" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>[ <a shape="rect" href="http://www.dlr.de/irs/en/desktopdefault.aspx/tabid-11109/">DLR</a> ]</p>
<p/>
<hr/>
<p/>
<blockquote>
<p>
<em>We are excited to show off a simulation of a Prius in Mcity using ROS Kinetic and Gazebo 8. ROS enabled the simulation to be developed faster by using existing software and libraries. The vehicle’s throttle, brake, steering, and transmission are controlled by publishing to a ROS topic. All sensor data is published using ROS, and can be visualized with RViz.</em>
</p>
</blockquote>
<p>
<iframe scrolling="auto" mozallowfullscreen="" allowfullscreen="allowfullscreen" src="https://player.vimeo.com/video/223812225?title=0&amp;byline=0" width="620" webkitallowfullscreen="" frameborder="0" height="349"/>
</p>
<blockquote>
<p>
<em>We leveraged Gazebo’s capabilities to incorporate existing models and sensors. The world contains a new model of Mcity and a freeway interchange. There are also models from the gazebo models repository including dumpsters, traffic cones, and a gas station. On the vehicle itself there is a 16 beam lidar on the roof, 8 ultrasonic sensors, 4 cameras, and 2 planar lidar.</em>
</p>
</blockquote>
<p>[ <a shape="rect" href="https://www.osrfoundation.org/simulated-car-demo/">OSRF</a> ]</p>
<p/>
<hr/>
<p/>
<p>Robot maker Hinamitetu outdoes himself again with a vault robot:</p>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/1MQ3eYevMso" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>At this rate, the Olympics are going to need to invent some extra gymnastics events just to give Hinamitetu something to do.</p>
<p>[ <a shape="rect" href="https://www.youtube.com/channel/UCYQDHzSrOA6sFVk7gPjnRWg">Hinamitetu</a> ]</p>
<hr/>
<blockquote>
<p>
<em>The U.S. Army Research Laboratory is experimenting with a hybrid unmanned aerial vehicle that transforms in flight and gives Soldiers an advantage on the battlefield of the future. Weighing in at just over half a pound, this UAV tilts its rotors to go from hovering like a helicopter to speeding along like a sleek airplane. The design has many efficiencies, but also provides many challenges to its creator, Dr. Steve Nogar, a postdoctoral researcher with the lab’s Vehicle Technology Directorate. With this hybrid UAV, transforming from hovering to horizontal flight offers speed, agility and mission flexibility.</em>
</p>
</blockquote>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/-71PMQKwCb0" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>[ <a shape="rect" href="https://www.army.mil/article/190401/experimental_drone_transforms_in_flight">ARL</a> ]</p>
<p/>
<p/>
<hr/>
<p/>
<blockquote>
<p>
<em>ARIAC is a simulation-based competition designed to promote agility in industrial robot systems by utilizing the latest advances in artificial intelligence and robot planning. The goal is to enable industrial robots on the shop floors to be more productive, more autonomous, and more responsive to the needs of shop floor workers. The virtual nature of the competition enabled participation of teams affiliated with companies and research institutions from across three continents. While autonomously completing pick-and-place kit assembly tasks, teams were presented with various agility challenges developed based on input from industry representatives. These challenges include failing suction grippers, notification of faulty parts, and reception of high-priority orders that would prompt teams to decide whether or not to reuse existing in-progress kits.</em>
</p>
</blockquote>
<p>
<iframe scrolling="auto" mozallowfullscreen="" allowfullscreen="allowfullscreen" src="https://player.vimeo.com/video/224134238?title=0&amp;byline=0" width="620" webkitallowfullscreen="" frameborder="0" height="349"/>
</p>
<blockquote>
<p>
<em>Teams had control over their system’s suite of sensors positioned throughout the workcell, made up of laser scanners, intelligent vision sensors, quality control sensors and interruptible photoelectric break-beams. Each team participating in the finals chose a unique sensor configuration with varying associated costs and impact on the team’s strategy.</em>
</p>
</blockquote>
<p>[ <a shape="rect" href="https://www.osrfoundation.org/ariac-finals-results-announced/">OSRF</a> ]</p>
<p/>
<p/>
<p/>
<hr/>
<p/>
<blockquote>
<p>
<em>NASA’s Dryden Flight Research Center has a heritage of developmental and operational experience with unmanned aircraft systems. Spanning from 1969 to the present, this 4-minute 48-second fast-paced visual survey produced in 2013 captures nearly a half-century of innovation in environmental and aeronautical research, showing the scope, scale, and variety of unmanned and remotely piloted vehicle projects flown at the center.</em>
</p>
</blockquote>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/_c2xYT7-RFM" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>We got closer looks at some of these aircraft (and others not in the video) when <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/drones/unmanned-aerial-systems-at-nasa-dryden">we visited NASA Dryden a few years ago</a>, but here are more videos that NASA just posted of some of their old-school flying robots:</p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/NMWnCLfrBpg" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p/>
<blockquote>
<p>
<em>This 57-second movie clip taken August 12, 2005 shows tests of NASA’s Autonomous Soaring Project with comments by Project Engineer Michael Allen. A series of research flights at NASA’s Dryden (now Armstrong) Flight Research Center in the summer of 2005 validated the premise that using thermal lift could significantly extend the range and endurance of small unmanned air vehicles (UAVs) without a corresponding increase in fuel requirements.</em>
</p>
</blockquote>
<p/>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/yR5fL_QR8ao" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p/>
<blockquote>
<p>
<em>This 1-minute, 53-second video taken on October 1, 2011 shows the NASA Dryden (now Armstrong) Flight Research Center’s Dryden Remotely Operated Integrated Drone (DROID) sub-scale test bed aircraft is moving up to the flight test big leagues! The center’s Automatic Collision Avoidance Technology team conducted test flights of new software architecture on the radio-controlled large model aircraft to demonstrate that even the simplest flight systems may benefit from Automatic Ground Collision Avoidance Software (GCAS).</em>
</p>
</blockquote>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/J3VtlF8JbHY" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/EGtF-Tcq2M8" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p/>
<blockquote>
<p>
<em>This video from June 7, 2003 shows the Helios Prototype solar-powered aircraft taking off on a checkout flight over Kauai, Hawaii.</em>
</p>
</blockquote>
<p>[ <a shape="rect" href="https://www.nasa.gov/centers/armstrong/news/FactSheets/index.html">NASA Dryden</a> ]</p>
<p/>
<hr/>
<p/>
<blockquote>
<p>
<em>On June 15th, WeRobotics co-founder Patrick Meier gave a talk at the 2017 National Geographic Explorer’s Festival. Patrick spoke about the activities ongoing at each of the Flying Labs along with their future plans.</em>
</p>
</blockquote>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/hoQByKRQOS0" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>[ <a shape="rect" href="http://werobotics.org/">WeRobotics</a> ]</p>
<p/>
<p/>
<p/>
<hr/>
<p/>
<blockquote>
<p>
<em>On July 4, 1997, NASA’s Mars Pathfinder lander and Sojourner rover successfully landed on the Red Planet utilizing a revolutionary airbag landing system. This special 20th anniversary show chronicles the stories and the people behind the groundbreaking mission that jump-started 20 years of continuous presence at Mars. Guests include: Former NASA Administrator Dan Goldin, former JPL Directors Ed Stone and Charles Elachi, JPL Director Michael Watkins and Pathfinder mission team members Jennifer Trosper and Brian Muirhead. Recorded June 27, 2017 at NASA Jet Propulsion Laboratory; aired on NASA TV on July 4, 2017.</em>
</p>
</blockquote>
<p/>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/XvVOgS9jWn0" width="620" frameborder="0" height="349"/>
</p>
<p/>
<p>[ <a shape="rect" href="https://www.jpl.nasa.gov/missions/mars-pathfinder-sojourner-rover/">JPL</a> ]</p>
<p/>
<hr/>
<p/>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/sg-vxVxMEDc" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 7 Jul 2017 15:35:00 GMT</pubDate>
<dc:creator>Evan Ackerman and Erico Guizzo</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-darpa-luke-arm-human-support-robot-starting-robotics-company</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMzMwOQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMzMwNw.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-darpa-luke-arm-human-support-robot-starting-robotics-company</feedburner:origLink></item>
<item>
<title>This Circuit Board Will Self-Destruct in 5, 4, 3…</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/KkKGXksM0u4/this-circuit-board-will-selfdestruct-in-5-4-3</link>
<description>Nanowire circuit boards that dissolve in cold water have some truly sci-fi applications</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Nanowire circuit boards that dissolve in cold water have some truly sci-fi applications<figure>
<img src="http://spectrum.ieee.org/image/MjkyMzAyNw.gif"/>
<figcaption>Gif: Vanderbilt University/IEEE Spectrum</figcaption>
</figure>
<div>
<p>Under the cover of night, enemy agents capture an elite soldier unit. The agents hold down the commander and cut through the skin of his upper arm, pulling out a slim, transparent circuit board containing the unit’s military directives. But as soon as the agents remove the device, it dissolves before their eyes.</p>
<p/>
<p>Sounds sci-fi, right? Yet such technology is one step closer to reality this month, thanks to a proof-of-concept study published in the journal <em>
<a shape="rect" href="http://pubs.acs.org/doi/abs/10.1021/acsami.7b04748">ACS Applied Materials &amp; Interfaces</a>
</em>. A pair of engineers at Vanderbilt University has constructed simple circuit boards, including conductive traces and capacitors, that work above room temperature but rapidly disintegrate when cooled below 32°C (89<span>°F).</span>
</p>
<p/>
<p>There are numerous types of transient electronics in development, but many are designed to self-destruct when energy, such as <a shape="rect" href="http://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/selfdestructing-gadgets-made-not-so-mission-impossible">heat or light</a>, is applied. Others must be <a shape="rect" href="http://spectrum.ieee.org/energywise/green-tech/fuel-cells/this-battery-will-selfdestruct-in-30-minutes">submerged in water</a>. The novelty of this new technology is that simple neglect leads to destruction: When warm, the technology works; if not, it comes apart.</p>
<p/>
<p>“It’s a little bit backwards of what people tend to think,” says senior author <a shape="rect" href="https://my.vanderbilt.edu/bellanlab/lmb/">Leon Bellan</a>, who develops micro/nanofabrication techniques for smart materials at Vanderbilt. “You have to provide heat to prevent it from dissolving.”</p>
<p/>
<p>The system involves a series of silver nanowires held together by a polymer that is hydrophobic at room temperature or warmer, but hydrophilic at lower temps. Bellan, along with graduate student Xin Zhang, placed a simple circuit board made of these materials in a warm water bath, where it was able to turn on an LED light.</p>
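<p>The resulting logic is inverted from most transient electronics, and a toy model makes it plain: conductivity persists only while the polymer binder stays above its transition temperature. (The 32 °C threshold is from the study; the function and state names here are our own illustrative shorthand, not anything from the Vanderbilt team.)</p>

```python
# Toy model of a thermally gated circuit: the polymer binding the silver
# nanowires is hydrophobic (insoluble) above ~32 C and hydrophilic
# (water-soluble) below it, so the board survives only while kept warm.
TRANSITION_C = 32.0  # transition temperature of the binder (from the study)

def board_state(temp_c: float, wet: bool) -> str:
    """Return the board's fate at a given temperature."""
    if temp_c >= TRANSITION_C:
        return "conducting"          # binder insoluble, nanowires in contact
    return "dissolving" if wet else "dormant"  # cold plus water: self-destruct

print(board_state(37.0, wet=True))   # body heat keeps it alive
print(board_state(20.0, wet=True))   # cooled in a water bath: disintegrates
```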
<p/>
<p>But when the engineers cooled the water bath, the board dissolved. The electronic components stopped working within seconds as the silver nanowires lost contact with each other, says Bellan. The whole thing disintegrated within a matter of minutes. </p>
<p/>
<p>Bellan previously used the temperature-sensitive polymer as the raw material in a cotton candy machine. The machine spun a network of fine threads that dissolved once embedded in a hydrogel, leaving behind capillary-like <a shape="rect" href="https://news.vanderbilt.edu/2016/02/08/cotton-candy-machines-may-hold-key-for-making-artificial-organs/">microfluidic networks</a>.</p>
<p/>
<p>The self-destruct system does have a number of sci-fi applications, admits Bellan, such as our example of the soldier with a device embedded in his arm. “If body heat were lost, because the soldier was killed or because the device was removed from the soldier, it would immediately dissolve,” says Bellan. </p>
<p/>
<p>There are also more mundane, but more near-term, applications of the tech. Metal nanoparticles have been <a shape="rect" href="https://www.ncbi.nlm.nih.gov/pubmed/25877089">widely studied</a> in biomedicine and appear to be largely non-toxic and even anti-microbial. So it could be feasible to implant an RFID tag made with this technology into a hospital patient, or even livestock, as a way to track movements. Then, if a patient is released from a hospital or an animal sold, the simple application of a cold pack to the site of implantation would dissolve the tag.</p>
<p/>
<p>Bellan’s group is now working to see if the circuit boards behave the same in tissue. (They’ve begun by embedding them in steaks. Yum.) They’re also building RFID tags into the material for wireless detection, and are working to construct devices that include transistors using semiconductors, says Bellan. “Once you start integrating more complex electronic components, then you start talking about circuits that do very interesting things.”</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/KkKGXksM0u4" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 7 Jul 2017 15:00:00 GMT</pubDate>
<dc:creator>Megan Scudellari</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/the-human-os/biomedical/devices/this-circuit-board-will-selfdestruct-in-5-4-3</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMzA1OQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMzA2MQ.gif" height="" width="" />
<feedburner:origLink>http://spectrum.ieee.org/the-human-os/biomedical/devices/this-circuit-board-will-selfdestruct-in-5-4-3</feedburner:origLink></item>
<item>
<title>Tool Reveals Mechanism Behind High-Temperature Superconductivity</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/Kbh-XYFctMo/tool-reveals-mechanism-behind-hightemperature-superconductivity</link>
<description>The atomic vibrations in a material and its electrons are much more closely bound than previously thought</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>The atomic vibrations in a material and its electrons are much more closely bound than previously thought<figure>
<img src="http://spectrum.ieee.org/image/MjkyMjk0MA.gif"/>
<figcaption>Greg Stewart/SLAC National Accelerator Laboratory</figcaption>
<figcaption>An animation shows how an infrared laser beam (orange) triggers atomic vibrations in a thin layer of iron selenide, which are then recorded by ultrafast X-ray laser pulses to create an ultrafast movie.</figcaption>
</figure>
<div>
<p>Scientists at the Department of Energy’s <a shape="rect" href="http://spectrum.ieee.org/searchContent?q=SLAC">SLAC National Accelerator Laboratory</a> and Stanford University have <a shape="rect" href="https://www.eurekalert.org/emb_releases/2017-07/dnal-sgf070317.php">combined two microscopy techniques to peer into the interactions that occur between electrons and the atomic vibrations of a material</a>. They found that the coupling between electrons and atomic vibrations is ten times stronger than anyone had previously believed.</p>
<p/>
<p>This new insight could enable <a shape="rect" href="http://spectrum.ieee.org/searchContent?q=superconductivity+&amp;type=&amp;sortby=relevance">superconductivity</a> at much higher temperatures than previously thought possible, with a large ripple effect on applications including improved energy transmission in cables and faster electronics and communication.</p>
<p/>
<p>In research described in the journal <a shape="rect" href="http://science.sciencemag.org/cgi/doi/10.1126/science.aak9946">
<em>Science</em>
</a>, the scientists combined an X-ray free-electron laser together with a technique called <a shape="rect" href="http://spectrum.ieee.org/searchContent?q=ARPES&amp;type=&amp;sortby=relevance">angle-resolved photoemission spectroscopy (ARPES) </a>to image the atomic vibrations of a material and to see how those vibrations affect the electrons in the same material.</p>
<p/>
<p>
<a shape="rect" href="https://portal.slac.stanford.edu/sites/lcls_public/Pages/Default.aspx">SLAC’s Linac Coherent Light Source (LCLS)</a> X-ray free-electron laser provided the measurements of the atomic vibrations known as <a shape="rect" href="http://spectrum.ieee.org/searchContent?q=phonons&amp;type=&amp;sortby=relevance">phonons</a>, while ARPES measured the momentum and energy of electrons in iron selenide.</p>
<p/>
<p>Iron selenide is a material that has garnered increased interest of late in the world of superconductivity. A team of researchers in China observed five years ago that when you place an atomically thin layer of it over a compound of strontium, titanium and oxygen (STO), the temperature for achieving superconductivity rose from 8 degrees to 60 degrees Celsius above absolute zero. That is still pretty cold, but in the world of superconductivity it represents a huge difference.</p>
<p/>
<p>While room-temperature superconductivity remains a distant prospect, this kind of research seems to put it squarely within the realm of possibility.</p>
<p/>
<p>“Higher temperature superconductivity itself with other good properties (such as critical field and current) would already be very impactful,” explained Zhi-Xun Shen, a professor at SLAC and Stanford and investigator with the Stanford Institute for Materials and Energy Sciences (SIMES) who led the study, in an e-mail interview with <em>IEEE Spectrum</em>. “We used to have a dogma that is impossible.  Now we know the old dogma (30-40 [degrees] Kelvin being the upper limit) is not correct, room-temperature superconductivity is extremely hard, but there is no known reason to believe it is impossible.”</p>
<p/>
<p/>
<p>Once these initial observations of iron selenide were reported, Shen started to investigate this material combination with the ARPES tools available at the SLAC labs. In a 2014 paper published in <em>Nature</em>, Shen and his colleagues <a shape="rect" href="https://www6.slac.stanford.edu/news/2014-11-12-study-slac-explains-atomic-action-high-temperature-superconductors.aspx">sorted out what was causing the effect</a>. It turns out that the atomic vibrations in the STO travel up into the iron selenide and give electrons the additional energy they need to pair up and carry electricity with zero loss at higher temperatures than they would on their own.</p>
<p/>
<p>This suggested that if one were to play around with the substrate material, it might be possible to raise the temperature for superconductivity even higher. But Shen wanted to see if this coupling between the atomic vibrations and the electrons in iron selenide would occur without any substrate, forming the basis of this most recent research.</p>
<p/>
<p>By using a slightly thicker version of the iron selenide that was atomically uniform in its structure, the scientists triggered 5-trillion-times-a-second atomic vibrations in the material by hitting it with infrared laser light. With the X-ray free-electron laser they could see and measure these vibrations and then, with the ARPES, image how the electrons behaved.</p>
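<p>Those vibrations set the experiment’s timescale: a 5-trillion-cycles-per-second (5 terahertz) oscillation has a period of just 200 femtoseconds and a phonon energy of roughly 21 milli-electron-volts, which is why an ultrafast X-ray source is needed to watch it. A quick back-of-the-envelope check (our arithmetic, not figures from the paper):</p>

```python
# Back-of-the-envelope numbers for a 5 THz phonon mode.
PLANCK_EV_S = 4.135667696e-15   # Planck constant, in eV*s

freq_hz = 5e12                  # "5-trillion-times-a-second" vibrations
period_s = 1.0 / freq_hz        # one oscillation: 2e-13 s = 200 fs
energy_ev = PLANCK_EV_S * freq_hz  # E = h*f, about 0.021 eV

print(f"period: {period_s * 1e15:.0f} fs")          # 200 fs
print(f"phonon energy: {energy_ev * 1e3:.1f} meV")  # 20.7 meV
```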
<p/>
<p>At this point, the scientists are not prepared to say that this strong coupling between phonons and electrons is what causes the higher-temperature superconductivity, but the combined microscopy techniques should help lead to an answer.</p>
<p/>
<p>There are several possible explanations for the higher-temperature superconductivity, according to Shen. For example, electron-electron and electron-phonon interactions could both contribute, or the electron-electron interaction could act through the electron-phonon coupling.</p>
<p/>
<p>“What this experiment has shown is that the fact previous simple theory of electron-phonon interaction cannot explain the superconductivity in this compound does not mean that we need to throw out the phonon,” said Shen. “It could be that all players are active, and we need to look at the problem in a more holistic way.”</p>
<p/>
<p>While iron selenide has led to this new technique and the resulting observations, this research has broader implications.</p>
<p>Shen added: “There are multiple players in action, atoms, electrons, etc. We used to think of them in isolation and in most simple terms. This work shows that their interplay can make the whole far more powerful than the individuals in isolation, dramatically. This is likely a route to a broader range of materials with interesting and extreme properties.”</p>
<p/>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/Kbh-XYFctMo" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Thu, 6 Jul 2017 20:00:00 GMT</pubDate>
<dc:creator>Dexter Johnson</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/nanoclast/semiconductors/materials/tool-reveals-mechanism-behind-hightemperature-superconductivity</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMjkzNw.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMjkzOQ.gif" height="" width="" />
<feedburner:origLink>http://spectrum.ieee.org/nanoclast/semiconductors/materials/tool-reveals-mechanism-behind-hightemperature-superconductivity</feedburner:origLink></item>
<item>
<title>Roomba Inventor Joe Jones on His New Weed-Killing Robot, and What's So Hard About Consumer Robotics</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/yZR0Dxw9pig/roomba-inventor-joe-jones-on-weed-killing-robot</link>
<description>The inventor of the Roomba tells us about his new solar-powered, weed-destroying robot</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>The inventor of the Roomba tells us about his new solar-powered, weed-destroying robot<figure>
<img src="http://spectrum.ieee.org/image/MjkyMjgxNg.jpeg"/>
<figcaption>Photo: Franklin Robotics</figcaption>
</figure>
<div>
<style type="text/css">&lt;!--
.topicList {
	list-style-type: none;
}
#TopPage {
	font-family: Arial, Helvetica, sans-serif;
	font-size: 12px;
}
--&gt;
</style>
<p>iRobot’s <a shape="rect" href="http://spectrum.ieee.org/tag/roomba">Roomba</a> robotic vacuum is, arguably, the most successful robot ever made. Some 15 million of them are cleaning floors all over the planet, and they’re doing so reliably and affordably and autonomously enough that people keep on buying them, a level of success no other consumer robot has been able to replicate.</p>
<p>Providing the vision for the small team that designed the Roomba was Joe Jones. What started out as his <a shape="rect" href="http://www.techrepublic.com/article/joe-jones-roomba-inventor-roboticist-vindicated-pioneer/">personal side project</a> at MIT’s Artificial Intelligence Lab in 1988 became a commercial product at iRobot in 2002, and while iRobot is still doing its best to make the Roomba better than ever, Jones left to found his own <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/industrial-robots/harvest-automation-beta-testing-robot-farmers">agricultural robotics company, Harvest Automation, in 2006</a>.</p>
<p>Now Jones has started his second robotics company, <a shape="rect" href="http://www.franklinrobotics.com/">Franklin Robotics</a>, which is funding its latest <a shape="rect" href="https://www.kickstarter.com/projects/rorymackean/tertill-the-solar-powered-weeding-robot-for-home-g">project through Kickstarter</a>: Tertill is a solar-powered, weed-destroying, fully autonomous and completely self-contained robot designed for your garden. Put it out there, forget about it (mostly), and it will brutally exterminate any weeds that it can find, as long as they’re short. </p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/VwTWhMbnq9g" width="620" frameborder="0" height="349"/>
</p>
<p>The genius thing about Tertill is that it’s self-sufficient. It has one button, you push that button, and then forget about the robot while it weeds your garden every day, forever. You can talk to it via Bluetooth and get updates and statistics and whatnot, but that’s optional. With a waterproof design and batteries charged by the sun, you really can just leave the robot alone and be confident that your garden will be weed free.</p>
<p>While it’s true that Tertill’s method of weeding (decapitation) is not as effective as pulling weeds out entirely, it doesn’t make any difference: It’s not like you’re the one doing the weeding, and the robot is perfectly happy to inefficiently weed over and over every single day, until the sun explodes.</p>
<figure role="img" class="rt med-lrg">
<img src="http://spectrum.ieee.org/image/MjkyMjY2Nw.jpeg" alt="Robotics engineer Joe Jones, who helped create the Roomba, is a co-founder of Franklin Robotics, which is launching a weed-killing robot named Tertill. "/>
<figcaption class="hi-cap">
  Image: Franklin Robotics
 </figcaption>
<figcaption>
  Robotics engineer Joe Jones, who helped create the Roomba at iRobot, is a co-founder of Franklin Robotics, a Boston-area startup that has developed a weed-killing autonomous robot named Tertill. 
 </figcaption>
</figure>
<p>As with all robots, Tertill comes with some caveats. First, you’ll need a border of some sort around your garden (at least 5 centimeters, or about 2 inches, above the ground surface) to make sure that the robot doesn’t escape. Depending on your particular garden, this could be either more or less annoying than installing an edge wire. Tertill differentiates weeds from not-weeds the same way that it differentiates garden from not-garden: It won’t run over anything over 5 cm in height, and will instead navigate away.</p>
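<p>In other words, the entire plant-versus-weed decision collapses to a single height threshold. A hypothetical sketch of that rule (the names and structure are ours, not Franklin Robotics’ firmware):</p>

```python
# Hypothetical sketch of Tertill's height rule as described: anything
# taller than ~5 cm (a garden plant, or the border) is avoided; anything
# shorter is treated as a weed and driven over by the whacker.
HEIGHT_LIMIT_CM = 5.0

def react(obstacle_height_cm: float) -> str:
    """Decide what to do about something the robot has bumped into."""
    if obstacle_height_cm > HEIGHT_LIMIT_CM:
        return "turn away"   # a plant worth keeping, or the garden border
    return "drive over"      # short green thing: weed, whack it

print(react(20.0))  # tomato plant
print(react(2.0))   # seedling weed
```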
<p>During a typical day (Boston day, we’re guessing), Tertill will spend between 60 and 90 minutes hard at work, and will then spend the rest of its time sunbathing to recharge its batteries, much like myself. If you have a garden of unusual size, you can populate it with multiple Tertills, which will coordinate their schedules (although not their coverage area) so that they’re not running at the same time. This maximizes the amount of time that there’s a moving robot in your garden, which can help scare away wildlife and attract passers-by.</p>
<p>
<a shape="rect" href="https://www.kickstarter.com/projects/rorymackean/tertill-the-solar-powered-weeding-robot-for-home-g">Tertill is on Kickstarter</a> until July 12, and you can pledge US $250 for one that’s scheduled to be delivered in May of 2018. We’re obligated to remind you that this is a crowdfunded project and you’re backing an idea, and not paying for a product that already exists. Jones and his colleagues, though, have a lot of experience in building affordable, reliable robots, and we’re looking forward to seeing how well Tertill performs in practice.</p>
<p>
<a shape="rect" name="TopPageAnchor" id="TopPageAnchor"/>For more on Tertill, Roomba, and why consumer robotics is so hard, we spoke with Jones via email.</p>
<p>
<strong>Joe Jones on . . .</strong>
</p>
<ol class="topicList">
<li>
<a shape="rect" href="#qaTopicOne">Lessons From Roomba’s Early Days</a>
</li>
<li>
<a shape="rect" href="#qaTopicTwo">From a Round to Rectangular to Round Again Robot</a>
</li>
<li>
<a shape="rect" href="#qaTopicThree">Robot Personality? No. Navigation Strategy? Yes.</a>
</li>
<li>
<a shape="rect" href="#qaTopicFour">Chasing Chipmunks and Other Future Features</a>
</li>
<li>
<a shape="rect" href="#qaTopicFive">“Robots Are Deceptively Hard”</a>
</li>
</ol>
<ol class="listicle">
<li>
<h3 class="listicle-item-hed">
<a shape="rect" name="qaTopicOne"/>Lessons From Roomba’s Early Days</h3>
<p>
<strong>
<em>IEEE Spectrum:</em> What did you learn from the process of bringing Roomba to market, and how did that influence your approach to Tertill?</strong>
</p>
<p>
<strong>Joe Jones: </strong>I remember working in a back room at iRobot long ago when they were located above the Twin City Plaza in Somerville [outside Boston]. Roomba needed a very low-power vacuum, and I was trying to develop one that used no more than 3 watts. I built the test mechanism from cardboard and packing tape and used the guts of an old heat gun as my vacuum source. I suddenly thought, “What do I need iRobot for? I could do any of this at home using only my own resources.”</p>
<p>But I came to appreciate that there’s actually a lot more to bringing a product to market than just the invention phase. Depending on the product, invention may be possible on a shoestring but then there’s testing, compliance, inventory, manufacturing, distribution, customer service, and repair. All the necessary but boring stuff in which I had no interest.</p>
<p>So mostly my view of product development matured as a result of Roomba. We have managed to do the early development of Tertill at a pretty low cost but we will need more resources for the next phase. Our <a shape="rect" href="https://www.kickstarter.com/projects/rorymackean/tertill-the-solar-powered-weeding-robot-for-home-g">Kickstarter campaign</a> is part of that.</p>
<p>
<strong>
<em>Spectrum:</em> What made you decide that now was the right time to introduce a consumer weeding robot?</strong>
</p>
<p>
<strong>Jones:</strong> Opportunity. There’s a need for the application and it can be accomplished using existing technology. Had I thought of it and been in a position to work on it, Tertill would have been possible years earlier. <a shape="rect" name="qaTopicTwo" id="qaTopicTwo"/>
</p>
<h4>
<a shape="rect" href="#TopPageAnchor" id="TopPage">BACK TO TOP↑</a>
</h4>
</li>
<li>
<h3 class="listicle-item-hed">From a Round to Rectangular to Round Again Robot</h3>
<p>
<strong>
<em>Spectrum:</em> It looks like you went through lots of design iterations before you ended up with those extreme camber wheels. Can you talk about the process that you went through before you settled on the final design for Tertill?</strong>
</p>
<p>
<strong>Jones:</strong> The first autonomous version of Tertill used two-wheel drive and was nearly round. But when we tried it in the garden it had problems with slopes and ruts. I did some calculations that demonstrated the limited slope-climbing ability of any two-wheel versus four-wheel drive robot. It was clear both practically and theoretically that two-wheel drive wouldn’t be versatile enough to handle realistic gardens. So we had to move to four-wheel drive.</p>
<p>Robots that have a coverage task (like robot vacuums or weeding robots) must come into contact with all obstacles in their workspace. (If you don’t touch the obstacles then you leave an unprocessed ring around each one.) We made Roomba round with the two wheels on a diameter because we knew that no matter how complex the geometry of nearby obstacles, the robot could always spin in place to find an escape path. That strategy simplifies motion planning (ergo the almost round shape of gen 1) but complicates the mechanics—it’s easier to fit components into a rectangular robot. </p>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkyMjcxOA.jpeg" alt="Franklin Robotics' Tertill weed-killing robot prototype 0 to prototype 3"/>
<figcaption class="hi-cap">
    Photos: Franklin Robotics
   </figcaption>
<figcaption>
    From left: First prototype and generations 1, 2, and 3 of the robot, which went from two-wheel to four-wheel drive to improve its ability to drive over slopes and ruts. 
   </figcaption>
</figure>
<p>Now that we had to have 4WD, a rectangular shape became much more attractive. This brought us to generation two—a long way from round but narrow enough to thread its way between closely planted rows. Our hope was that the garden environment was forgiving enough that we could get away with a non-round robot. </p>
<p>Traction was great, and the robot could go up and down slopes with no problem, but overall maneuverability was poor. Trying to turn corners, Tertill swept out a wide area and sometimes swung sideways into plants. If we had complete information about the local environment we could plan clever paths to avoid such problems. Unfortunately, complete information requires fancy sensors. Fancy sensors cost real money and result in robots that customers are unwilling to pay for. If we couldn’t afford the sensor we had to change the robot.</p>
<p>In generation three we tried a compromise, keeping the 4WD but making the robot a little closer to round. Better, but no cigar. The robot could still get hung up in a corner that wouldn’t have given Roomba the slightest problem. Eventually, Tertill could usually extricate itself from such spaces, but the many attempts required for final success were problematic. Every failed motion can cause the wheels to dig in farther, changing the garden surface and making escape more difficult. </p>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkyMjcxNg.jpeg" alt="Franklin Robotics' Tertill weed-killing robot prototype 4 to final version"/>
<figcaption class="hi-cap">
    Photos: Franklin Robotics
   </figcaption>
<figcaption>
    From left: Generations 4, 5, and 6, and the final version (available on Kickstarter) of the Tertill robot, featuring a round shape and camber wheels.
   </figcaption>
</figure>
<p>So the robot had to be round. That brought us to generation four. Packing four wheels into a round robot is tough. The wheels need to be as large as possible to get over rocks and holes and you want as large a wheelbase as possible in both dimensions to minimize the likelihood of the robot tipping over. Generation four worked better than all previous robots but the wheelbase was narrow and we began to worry about high-centering.</p>
<p>When something the robot has driven over (e.g. rock, dirt clod) touches the underside of the robot, that object supports some of the robot’s weight. The wheels thus lose traction and the robot may become stuck. Thinking about this and the wheelbase issue led us to the extreme camber wheels. This new configuration moves the contact point of the wheels out to the edge of the shell and means that more of the underside area of the robot is movable. That is, there’s less area that can participate in a high centering event. And we get the added bonus that we have more room for the whacker—we can cut a wider swath.</p>
<p>We think the high camber idea is interesting enough that we made it part of our patent (pending). <a shape="rect" name="qaTopicThree" id="qaTopicThree"/>
</p>
<h4>
<a shape="rect" href="#TopPageAnchor" id="TopPage">BACK TO TOP↑</a>
</h4>
</li>
<li>
<h3 class="listicle-item-hed">Robot Personality? No. Navigation Strategy? Yes.</h3>
<p>
<strong>
<em>Spectrum:</em> Does Tertill have a personality? Do you find that users develop emotional connections with Tertill like people commonly do with Roombas?</strong>
</p>
<aside class="inlay pullquote rt med">
   The most common response upon seeing Tertill for the first time is probably, “That’s adorable!” So I expect many Tertills will be named and become part of the family.
  </aside>
<p>
<span>
<strong>Jones:</strong> </span>Robot personality is in the eye of the beholder, in my view. Personality is not part of Tertill’s specs. And those of us who know it best (hardnosed technologists all) have not developed emotional connections. But the little robot is very engaging. The most common response upon seeing Tertill for the first time is probably, “That’s adorable!” So I expect many Tertills will be named and become part of the family.</p>
<p>
<strong>
<em>Spectrum:</em> iRobot always made a point of telling us that Roombas are making careful, data-driven and sensor-based decisions about where to go, as opposed to just bouncing randomly around a room (which is how it appears at times). What kind of navigation strategy does Tertill use?</strong>
</p>
<p>
<strong>Jones:</strong> I agree that from a high level it looks like both Roomba and Tertill just bounce around randomly. But both contain some hidden subtlety that makes a huge difference in performance. The effects have to do with coverage and escaping hazards. </p>
<p>A robot that does nothing but bounce randomly has a hard time fully covering a cluttered space. The trick is that some fraction of the time, rather than bounce, the robot should follow an obstacle or a wall or a row of crops. This allows it to escape tight spaces and fully cover a cluttered area. (If you’re really interested, the method is described in my book, “<a shape="rect" href="https://www.amazon.com/Robot-Programming-Practical-Behavior-Based-Robotics/dp/0071427783">Robot Programming: A Practical Guide to Behavior-Based Robotics</a>.”)</p>
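Jones’s mix of random bouncing with occasional edge-following can be sketched as a tiny behavior selector. This is a hypothetical illustration of the idea only, not Tertill’s actual code, and the `follow_fraction` value is an assumption:

```python
import random

def choose_behavior(rng, follow_fraction=0.25):
    """On hitting an obstacle, usually bounce away at a random angle,
    but some fraction of the time follow the obstacle's edge instead.
    Edge-following is what lets the robot work its way out of tight,
    cluttered corners that pure random bouncing covers poorly."""
    return "follow_edge" if rng.random() < follow_fraction else "bounce"

# Tally behavior choices over many simulated collisions.
rng = random.Random(42)
counts = {"bounce": 0, "follow_edge": 0}
for _ in range(10_000):
    counts[choose_behavior(rng)] += 1
```

Over many collisions, roughly a quarter of the reactions become edge-follows, enough to break out of clutter while keeping the even spread that random bouncing gives in open areas.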
<p>The second issue is more complex; it arises any time the robot must respond to a second hazard while still dealing with a first—say encountering a wall while avoiding a cliff or turning away from a plant and encountering a rut at the same time. The earliest version of Roomba (which didn’t deal with simultaneous hazards) had maybe half a dozen behaviors. But by the time we had worked out the bugs of all the special cases, Roomba had over 50 different behaviors. We on the Roomba team used to feel quite smug when we encountered a knockoff robot where the developers had clearly stopped working after writing just the first six behaviors. Such robots got stuck frequently and had a hard time getting away from even the most benign hazards. <a shape="rect" name="qaTopicFour" id="qaTopicFour"/>
</p>
<h4>
<a shape="rect" href="#TopPageAnchor" id="TopPage">BACK TO TOP↑</a>
</h4>
</li>
<li>
<h3 class="listicle-item-hed">Chasing Chipmunks and Other Future Features</h3>
<p>
<strong>
<em>Spectrum:</em> Are there features or capabilities that you originally wanted Tertill to have, but had to cut due to cost, reliability, or practicality?</strong>
</p>
<aside class="inlay pullquote rt med">
   Tertill can move suddenly at random intervals, and nearby pests are likely to be frightened away when this happens. Later models may include motion sensors enabling Tertill to give chase.
  </aside>
<p>
<strong>Jones: </strong>Yes, we had hoped to include deliberate pest scaring and extensive monitoring of the local environment in the first iteration. Unfortunately, our initial approaches to those features proved to be too complex, too costly, and required too much development time. We did, however, find lower-impact ways to achieve part of our goals. Although the robot doesn’t detect the presence of pests (like rabbits and chipmunks), we can schedule Tertill to move suddenly at random intervals. Any nearby pests are likely to be frightened away when this happens. Later models may include motion sensors enabling Tertill to give chase.</p>
<p>We would like to add fixed sensors in the garden to monitor things like soil moisture, conductivity, and so on. A future robot will collect data from these sensors and compile a report alerting the gardener to any developing problems that may affect plant health or yield. The initial version of the robot will collect some of the information we’d like to provide. This includes insolation, local temperature, and how often it finds weeds to chop.</p>
<p>
<strong>
<em>Spectrum:</em> My Roomba works best when I keep the floor mostly free of clutter and electrical cords. Is there a way to optimize a garden for Tertill?</strong>
</p>
<p>
<strong>Jones:</strong> Slopes that exceed 40 percent grade, big rocks, big ruts or holes, and persistent puddles all make it more difficult for Tertill to operate. Gardeners will make Tertill more effective if they minimize or eliminate those features. </p>
<p>We chose Tertill’s diameter after surveying the sort of plants that are most often grown in home gardens. My research suggests that most plant rows should be more than 8 inches apart, and within a row most plants grow best if spaced at a minimum of 8 inches. Respecting the recommended minimum plant spacing (or even giving them a bit of extra room) is good for both Tertill and your plants.<a shape="rect" name="qaTopicFive" id="qaTopicFive"/>
</p>
<h4>
<a shape="rect" href="#TopPageAnchor" id="TopPage">BACK TO TOP↑</a>
</h4>
</li>
<li>
<h3 class="listicle-item-hed">“Robots Are Deceptively Hard”</h3>
<p>
<strong>
<em>Spectrum:</em> Why is it so difficult to build reliable, affordable consumer robots?</strong>
</p>
<p>
<strong>Jones:</strong> Robots are deceptively hard. My co-founder at Harvest, Clara Vu, likes to say, “A robot application that a technology grad student thinks they can easily do in a couple of weeks has an outside chance of actually being practical. Anything harder is impossible.”</p>
<p>I think the issue that misleads developers is that robots have different strengths and weaknesses compared to people. What that means is that any application you want to roboticize must be re-imagined from the ground up. With Tertill we didn’t just take the base of an RC car, strap on a weed whacker, and call it done. That would work fine if you or I were at the controls. We would rarely drive the car into any debris or terrain from which it couldn’t escape. But the robot doesn’t have our sensory and cognitive advantages; it will simply drive itself into any rut or mass of foliage and only realize that there is a problem when it stops moving. Thus we spent a year designing and redesigning the chassis and mobility system to make it possible for the robot to blunder into difficult situations and still manage to get out.</p>
<p>
<strong>
<em>Spectrum:</em> What are you most excited about in robotics right now?</strong>
</p>
<p>
<strong>Jones:</strong> I’m always interested in low-cost robots. Robots that just about anyone can afford have the greatest potential to create new industries and change the world. Currently I’m thinking that cameras connected to deep-learning networks are the technology with the most promise to enable a new wave of low-cost robotic applications. It’s possible that I only think this way because I don’t yet know a lot about deep learning and am blissfully ignorant of its shortcomings. But I’m trying to learn more.</p>
<h4>
<a shape="rect" href="#TopPageAnchor" id="TopPage">BACK TO TOP↑</a>
</h4>
</li>
</ol>
<p>
<em>Updated 4:46 p.m. ET</em>
</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/yZR0Dxw9pig" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Thu, 6 Jul 2017 19:57:00 GMT</pubDate>
<dc:creator>Evan Ackerman</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/automaton/robotics/home-robots/roomba-inventor-joe-jones-on-weed-killing-robot</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMjgzNA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMjg5OA.gif" height="" width="" />
<feedburner:origLink>http://spectrum.ieee.org/automaton/robotics/home-robots/roomba-inventor-joe-jones-on-weed-killing-robot</feedburner:origLink></item>
<item>
<title>Low Current / Ultra-High Resistance Measurement Fundamentals</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/p6vy7i_GZrY/low-current-ultra-high-resistance-measurement-fundamentals</link>
<description>Attend this webcast to learn the measurement techniques, tricks and tools necessary to measure low currents and high resistances with high measurement confidence and repeatability.</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Attend this webcast to learn the measurement techniques, tricks and tools necessary to measure low currents and high resistances with high measurement confidence and repeatability.<div>
<p>Performing current vs. voltage characterization on devices and materials at very low current levels presents a unique set of measurement challenges. Normal measurement issues such as noise, transient signals, and cabling and fixturing parasitics are much harder to solve when dealing with currents in the femtoamp range. In addition, many cutting-edge materials have extremely high resistances that conventional DMMs and source/measure units (SMUs) cannot measure. In this seminar, Keysight will explain the measurement techniques, tricks, and tools necessary to measure currents down to 0.01 femtoamps and resistances up to 10 petaohms with both high measurement confidence and repeatability.</p>
<p>
<span name="summary" class="summary"> </span>
<strong>PRESENTER:</strong>
</p>
<div class="imgWrapper lt sm">
<img src="http://event.on24.com/event/14/63/38/1/rt/alan.jpg" width="148" alt="" height="206"/>
</div>
<div>
<p>
<strong>Alan Wadsworth</strong>, Marketing Brand Manager, Keysight</p>
<div/>
<p>Alan Wadsworth is currently the Marketing Brand Manager for Keysight’s semiconductor and power products. He has over 30 years of experience in the semiconductor industry in both design and test, and is the author of Keysight’s Parametric Measurement Handbook, which contains comprehensive information on semiconductor parametric test and measurement techniques.<br clear="none"/> <br clear="none"/> Alan joined Hewlett Packard in 1991 and worked for five years as the SRAM engineer in HP’s Memory Technology Center. Previously, he worked as an integrated circuit designer at Signetics/Philips, where he designed circuits in both bipolar and BiCMOS technologies. He holds bachelor’s and master’s degrees in electrical engineering from the Massachusetts Institute of Technology and an MBA from Santa Clara University.</p>
<p/>
<p/>
<div>
<p>
<strong>Attendees of this IEEE Spectrum webinar have the opportunity to earn PDHs or Continuing Education Certificates! </strong> To request your certificate you will need to get a code. Once you have registered and viewed the webinar send a request to <a shape="rect" href="mailto:webinarteam@ieeeglobalspec.com">webinarteam@ieeeglobalspec.com</a> for a webinar code. To request your certificate complete the form here: <a shape="rect" href="https://fs25.formsite.com/ieeevcep/form112/index.html">https://fs25.formsite.com/ieeevcep/form112/index.html</a>
</p>
</div>
</div>
<p>
<em>
<strong>Attendance is free. To access the event please register.<br clear="none"/> NOTE: By registering for this webinar you understand and agree that IEEE Spectrum will share your contact information with the sponsors of this webinar and that both IEEE Spectrum and the sponsors may send email communications to you in the future.​</strong>
</em>
</p>
<p>Please contact <a shape="rect" href="mailto:webinarteam@ieeeglobalspec.com">webinarteam@ieeeglobalspec.com</a> if you have questions.</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/p6vy7i_GZrY" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Thu, 6 Jul 2017 19:30:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/webinar/low-current-ultra-high-resistance-measurement-fundamentals</guid>
<feedburner:origLink>http://spectrum.ieee.org/webinar/low-current-ultra-high-resistance-measurement-fundamentals</feedburner:origLink></item>
<item>
<title>What Happens When Carpooling Laws Suddenly Change? Chaos!</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/RW-liCdQc04/what-happens-when-carpooling-laws-suddenly-change-chaos</link>
<description>A huge jump in traffic congestion in Jakarta shows how valuable carpooling had been—before the government ended it</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>A huge jump in traffic congestion in Jakarta shows how valuable carpooling had been—before the government ended it<figure>
<img src="http://spectrum.ieee.org/image/MjkyMjUwNA.octet-stream"/>
<figcaption>Photo: iStockphoto</figcaption>
</figure>
<div>
<p>A natural experiment is what social scientists settle for when people, as is their tendency, fail to volunteer as guinea pigs. And one heck of a natural experiment happened recently in Jakarta, where the government ended what had been one of the world’s most stringent carpooling laws.</p>
<p>Under the law, instituted in 1992, private cars had to carry at least three people to use special fast-track routes during rush hour. When the government suddenly ended the “3-in-1” law, in March 2016, rush-hour trip times jumped by 47 percent in the morning and by 87 percent in the evening. And congestion rose not only on the roads that had been restricted to carpoolers but along other, seemingly unrelated routes as well. </p>
<p>In May the government reinstituted the law. It had learned, the hard way, that carpooling can work.</p>
<p>Well duh, you may be thinking: More people per car, more throughput, end of story. But in fact many experts have wondered about high-occupancy lanes for carpoolers. How can we be sure that the extra people in each carpool aren’t offset by a lack of carpools—that the special lanes aren’t underused?</p>
<p>Traffic planning is a lot more complicated than it seems. You might lose not only by allocating existing highway lanes to carpoolers but even by adding <em>totally new</em> lanes for them or for everyone. And you can sometimes <em>improve</em> traffic in an entire system by closing down existing lanes. It’s all according to <a shape="rect" href="https://en.wikipedia.org/wiki/Braess%27s_paradox">Braess’s Paradox</a>, named after the mathematician who discovered it.</p>
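The paradox is easy to check with the standard textbook network (generic illustrative numbers, not Jakarta data): 4,000 drivers choose between two symmetric routes whose congestible links take t/100 minutes at traffic level t and whose wide links take a fixed 45 minutes. Adding a free shortcut between the congestible links makes everyone slower at the selfish equilibrium:

```python
N = 4000  # drivers in the classic Braess example

# Without the shortcut, traffic splits evenly between the two routes:
# each driver spends (N/2)/100 min on a congestible link + 45 min on a wide one.
time_without = (N / 2) / 100 + 45  # = 65.0 minutes

# With a zero-cost shortcut joining the two congestible links, the
# dominant strategy routes every driver over both congestible links.
time_with = N / 100 + 0 + N / 100  # = 80.0 minutes
```

Even though the shortcut itself is free, the equilibrium commute rises from 65 to 80 minutes, which is why removing capacity can sometimes help.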
<p>This means that the only way to judge a road-network reform is by running an experiment. And the Jakarta government did just that when it instituted an incentive to carpool and then years later, Lucy-like, yanked it away.</p>
<p>Jakarta is the poster child for traffic congestion: home to some 30 million people, it is second in size only to Tokyo. The central area where the carpooling law applied includes a 12-lane highway. Oh, and the city has no subway. So for commuters, it’s the roads or nothing.</p>
<p>When the city government up and announced it would eliminate the 3-in-1 rule, effective one week later, three economists at the Harvard Kennedy school sprang into action. Rema Hanna, Gabriel Kreindler, and Benjamin A. Olken used Google Maps to collect data on traffic in the high-occupancy roads and also in some other roads, and they continued to collect data right through the transition.</p>
<p>“The lifting of Jakarta’s three-in-1 policy not only had effects on traffic on former [high-occupancy roads] but also had spillovers to alternative roads and time periods,” the <a shape="rect" href="http://science.sciencemag.org/cgi/doi/10.1126/science.aan2747">economists write </a>in today’s issue of <em>Science </em>magazine. “The results therefore suggest that quantity restrictions on severely congested roads can have beneficial spillover effects on traffic throughout the city, whether by potentially eliminating hypercongestion or by getting cars off the road.” </p>
<p>The Jakarta law gives such powerful incentives to carpool in part because the default position in that city isn’t necessarily one driver per car, but two, because many Indonesians employ a driver. That’s why it mandates three people per pool. Other cities, notably London, cut congestion by charging drivers fees for using the roads at certain times of day. Others ration access by license plates, allowing odd-number plates on some days and even numbers on the other days.</p>
<p>A completely different incentive to carpooling has just been suggested by <a shape="rect" href="http://www.eng.uwaterloo.ca/~bghaddar/">Bissan Ghaddar</a>, an expert in operations research at the University of Waterloo, and her colleagues at IBM Research Ireland and two universities in Italy. They analyzed Twitter feeds of potential carpoolers and worked out each person’s social network, personality traits, and geo-tagged locations. They then used a matchmaking model to simulate how such socially compatible carpoolers would behave in two major cities. Total car use would drop by 57 percent in Rome and by 40 percent in San Francisco, they <a shape="rect" href="http://www.sciencedirect.com/science/article/pii/S0968090X17300657">wrote</a> recently in <em>Transportation Research Part C. </em>
</p>
<p>“As a carpooler myself,” Ghaddar <a shape="rect" href="https://www.eurekalert.org/pub_releases/2017-07/uow-eca063017.php">says</a>, “I can’t overestimate the importance of compatibility.”</p>
<p>It’s just a model, of course. Now we need an experiment.</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/RW-liCdQc04" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Thu, 6 Jul 2017 17:57:00 GMT</pubDate>
<dc:creator>Philip E. Ross</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/cars-that-think/transportation/efficiency/what-happens-when-carpooling-laws-suddenly-change-chaos</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMjUxOA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMjUxNg.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/cars-that-think/transportation/efficiency/what-happens-when-carpooling-laws-suddenly-change-chaos</feedburner:origLink></item>
<item>
<title>3D Electronic Nose Demonstrates Advantages of Carbon Nanotubes</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/n0bumJlrJL8/carbon-nanotube-computing-stacks-up</link>
<description>A 3D stack of silicon logic, resistive RAM, nanotube circuits, and sensors uses new architecture and devices to save energy</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>A 3D stack of silicon logic, resistive RAM, nanotube circuits, and sensors uses new architecture and devices to save energy<figure>
<img src="http://spectrum.ieee.org/image/MjkyMjI5Nw.jpeg"/>
<figcaption>Illustration: Max Shulaker/Nature</figcaption>
</figure>
<div>
<p>You’d think computers spend most of their time and energy doing, well, computation. But that’s not the case: about 90 percent of a computer’s execution time and electrical energy is spent transferring data between the processor and the memory banks, says <a shape="rect" href="http://www.stanford.edu/~subh/">Subhasish Mitra</a>, a computer scientist at Stanford University. Even if Moore’s law continued on indefinitely, computers would still be limited by this memory bottleneck.</p>
<p>This week in the journal <a shape="rect" href="http://www.nature.com/nature/journal/v547/n7661/full/nature22994.html?WT.feed_name=subjects_materials-science">
<em>Nature</em>
</a>, Mitra and collaborators describe a new computer architecture they say addresses this problem—and that Mitra believes will improve both the energy efficiency and speed of computers by a factor of 1000.</p>
<p>The new 3D architecture is based on novel devices including 2 million carbon nanotube transistors and over 1 million resistive RAM cells, all built on top of a layer of silicon using existing fabrication methods and connected by densely packed metal wiring between the layers. As a demonstration, the team built an electronic nose that can sense and identify several common vapors including lemon juice, rubbing alcohol, vodka, wine, and beer.</p>
<p>These novel nanodevices are interesting in themselves, says Mitra, but computing’s problems will not be solved by switching out existing devices with new ones that are slightly better. He says the important thing about the combination of technologies in their prototype is that it enabled them to develop a new, more efficient architecture that would not be possible to make in traditional CMOS.</p>
<p>Stacking circuits is a way to bring memory and processing closer together, but even 3D chips have a significant memory bottleneck. They are typically limited by the number and quality of connections between levels. It’s not possible to build conventional metal interconnects on top of one layer and then add another level of memory or processing because of the temperatures required—in excess of 1000 degrees Celsius to make silicon devices. Typically the layers are made separately, then bonded together and connected with relatively large, sparsely distributed connectors called through-silicon vias tens of micrometers apart, says <a shape="rect" href="https://www.eecs.mit.edu/people/faculty/max-shulaker">Max Shulaker</a>, a computer scientist at MIT. Shulaker worked with Mitra and Stanford electrical engineer <a shape="rect" href="https://profiles.stanford.edu/philip-wong">H.-S. Philip Wong</a> on the 3D nanosystem.</p>
<p>Carbon nanotube transistors and resistive RAM can both be fabricated at about 200 degrees Celsius. So they can be built on top of each other and connected with metal wiring, without researchers having to worry about vaporizing the metal. The interconnects in their prototype are more than a thousand times denser than through-silicon vias in conventional 3D chips.</p>
<p>The Stanford and MIT device has four layers. It’s built on a silicon wafer, and the first stratum is made up of silicon logic. This is topped with a layer of interconnects, then an array of carbon nanotube logic. Another layer of interconnects links the nanotubes up to a layer of resistive RAM. A final layer of interconnects is topped with another array of carbon nanotube logic and nanotube sensors that can pick up ambient gases. The order of the layers “reflects the data flow streaming vertically down through the chip,” says Shulaker.</p>
<p>
<a shape="rect" href="http://www.bakirlab.gatech.edu/people.shtml">Muhannad Bakir</a>, an electrical engineer who leads the Integrated 3D Systems Group at Georgia Tech and who was not involved with the work, is impressed by the multitiered prototype. “Being able to fabricate and demonstrate a four tier monolithic 3D integrated circuit comprised of millions of heterogeneous device technologies, let alone in a shared user facility, is a tremendous milestone and accomplishment,” he says. The use of metal interconnects has significant performance and energy-use advantages over through-silicon vias, he says, though he expects these technologies will complement one another. And he’s curious to see what the Stanford and MIT group builds next.</p>
<p>This team has been working on carbon nanotube computing for several years, and Shulaker believes it is now ready for commercial applications. Shulaker says something like chemical sensing is a good first application for carbon nanotube systems because even if one sensor doesn’t work, redundant ones will ensure that the whole system won’t fail. He sees niche sensing applications of this sort as a good “on-ramp” to getting carbon nanotube computing commercialized. “The only reason we have 10 nanometer silicon today is that we’ve gone through decades of process learning,” he says. Niche applications will help carbon nanotube computing gain a foothold. The group has partnered with <a shape="rect" href="http://www.analog.com/en/index.html">Analog Devices</a> to work on products.</p>
<p>Mitra is thinking big. The memory bottleneck is a huge problem, and as applications like machine learning become more widespread, computing’s data intensity is only growing. He thinks this sort of architectural redesign is a promising route forward. “From supercomputers to cellphones, everyone has to use this technology,” says Mitra.</p>
<p>Shulaker started his lab at MIT last summer. But he joined the Stanford lab his freshman year of college, and even slept there during fabrication runs. During his grad school days, he says, no one really believed a demonstration of carbon nanotube computing on this scale would be possible. Today, says Shulaker, “it would be difficult to bet against these technologies.”</p>
<p>
<em>This post was updated on 10 July to include comment from Muhannad Bakir.</em>
</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/n0bumJlrJL8" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Thu, 6 Jul 2017 15:00:00 GMT</pubDate>
<dc:creator>Katherine Bourzac</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/carbon-nanotube-computing-stacks-up</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMjMwOA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMjMwNg.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/carbon-nanotube-computing-stacks-up</feedburner:origLink></item>
<item>
<title>How Bots Win Friends and Influence People</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/_LxgC2oHv94/how-bots-win-friends-and-influence-people</link>
<description>Social and computer scientists parse online bot discourse</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Social and computer scientists parse online bot discourse<figure>
<img src="http://spectrum.ieee.org/image/MjkyMTk4Ng.octet-stream"/>
<figcaption>Illustration: iStockphoto</figcaption>
</figure>
<div>
<p>Every now and then sociologist <a shape="rect" href="http://comprop.oii.ox.ac.uk/the-team/research/philip-n-howard/">Phil Howard</a> writes messages to social media accounts accusing them of being bots. It’s like a Turing test of the state of online political propaganda. “Once in a while a human will come out and say, ‘I’m not a bot,’ and then we have a conversation,” he said at the <a shape="rect" href="http://www.ecsj2017.com/">European Conference for Science Journalists </a>in Copenhagen on June 29.</p>
<p>In his academic writing, Howard calls bots “highly automated accounts.” By default, the accounts publish messages on Twitter, Facebook, or other social media sites at rates even a teenager couldn’t match. Human puppet-masters manage them, just like the Wizard of Oz, but with a wide variety of commercial aims and political repercussions. Howard and colleagues at the <a shape="rect" href="https://www.oii.ox.ac.uk/">Oxford Internet Institute </a>in England published a working paper [<a shape="rect" href="http://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2017/06/Casestudies-ExecutiveSummary.pdf">PDF</a>] last month examining the influence of these social media bots on politics in nine countries.</p>
<p>“Our goal is to produce large amounts of evidence, gathered systematically, so that we can make some safe, if not conservative, generalizations about where public life is going,” Howard says. The working paper, available ahead of peer-review in draft form, reports on countries with a mixture of different types of governments: Brazil, Canada, China, Germany, Poland, Russia, Taiwan, Ukraine, and the United States.</p>
<p>“My biggest surprise (maybe disappointment) is how it’s seemingly taken the 2016 U.S. election outcome to elevate the conversation and concerns related to this issue... because it’s not new,” says John F. Gray, co-founder of <a shape="rect" href="http://mentionmapp.com/">Mentionmapp</a>, a social media analytics company in Vancouver, Canada. For years, bot companies have flooded protest movements’ hashtags with pro-government spam from Mexico [<a shape="rect" href="https://arxiv.org/pdf/1609.08239.pdf">PDF</a>] to Russia [<a shape="rect" href="http://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2017/06/Comprop-Russia.pdf">PDF</a>]. More sophisticated bots <a shape="rect" href="https://lucaslaursen.com/fake-facebook-pages-spin-web-of-deceit/">replicate real-life human networks</a> and post or <a shape="rect" href="https://datasociety.net/output/media-manipulation-and-disinfo-online/">promote “fake news” and conspiracy theories seeking to sway voters</a>. Indiana University researchers are building a taxonomy of social-network bots to simplify research <a shape="rect" href="http://spectrum.ieee.org/tech-talk/computing/networks/taxonomy-goes-digital-getting-a-handle-on-social-bots">(see “Taxonomy Goes Digital: Getting a Handle on Social Bots”,</a>
<em>IEEE Spectrum</em>, 9 June 2017).</p>
<p>Howard and colleagues have taken a social science approach: They found informants willing to provide access to programmers behind the botnets and have spent time with those programmers, getting to know their business models and motivations. One of their discoveries, Howard says, is that bot networks are, “not really bought and sold: they’re rented.” That’s because the older a profile is and the more varied its activity, the easier it is to evade detection by social networks’ security teams.</p>
<p>Private companies, <span>not just governments and political parties,</span> are major botnet users, Howard adds. The big business of renting botnets to influence public conversations may encourage firms to create ever-more realistic bots. The computation for spreading propaganda via bots, Howard says, isn’t that complicated. Instead, Gray says the sophistication of botnet design, their coordination, and how they manipulate social media has been “discouragingly impressive.”</p>
<p>Both Howard and Gray say they are pessimistic about the ability of regulations to keep up with the fast-changing social bot-verse. Howard and his team are instead trying to examine each country's situation and in the working paper they call for social media firms to revise their designs to promote democracy.</p>
<p>Gray calls it a literacy problem. Humans must get better at evaluating the source of a message to help them decide how much to believe the message itself, he says.</p>
<p>
<em>Note: Journalist Lucas Laursen attended the 2017 European Conference for Science Journalists, where Howard spoke, with a travel grant but had no obligation to cover conference events.</em>
</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/_LxgC2oHv94" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Thu, 6 Jul 2017 13:30:00 GMT</pubDate>
<dc:creator>Lucas Laursen</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/tech-talk/telecom/security/how-bots-win-friends-and-influence-people</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMTk5OA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMTk5Ng.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/tech-talk/telecom/security/how-bots-win-friends-and-influence-people</feedburner:origLink></item>
<item>
<title>EU Developing Robot Badgers for Underground Excavation</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/iPgYkYuPfdI/eu-developing-robot-badgers-for-underground-excavation</link>
<description>Using robots to dig holes for infrastructure installation could put an end to road work</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Using robots to dig holes for infrastructure installation could put an end to road work<figure>
<img src="http://spectrum.ieee.org/image/MjkyMjA1NQ.jpeg"/>
<figcaption>Image: BADGER Project</figcaption>
</figure>
<div>
<p>
<span id="docs-internal-guid-f81c6bac-d15a-644f-6318-02ba2aec9bf3">These days, you can find capable and confident robots <a shape="rect" href="http://spectrum.ieee.org/transportation/self-driving">driving</a>, <a shape="rect" href="http://spectrum.ieee.org/robotics/drones">flying</a>, <a shape="rect" href="http://spectrum.ieee.org/tag/robot+boats">swimming</a>, <a shape="rect" href="http://spectrum.ieee.org/tag/underwater+robot">diving</a>, <a shape="rect" href="http://spectrum.ieee.org/robotics/space-robots">out in space</a>, and even <a shape="rect" href="https://www.geek.com/news/nasas-ice-drilling-europa-robot-gets-tested-in-alaska-1600265/">boring through ice</a>. What we haven’t seen a lot of are robots designed to dig underground. There have been a few <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/diy/self-burying-robot-could-be-hiding-in-your-backyard-right-now">self-burying robots that use digging to their advantage</a>, but they’re not designed for underground locomotion. This is slightly strange, to be honest, considering how many animals make their living by digging tunnels, and also considering how often we humans need to do useful things underground.</span>
</p>
<p/>
<p>
<span id="docs-internal-guid-f81c6bac-d15a-644f-6318-02ba2aec9bf3">The European Union is sponsoring a project to make underground robotics happen through the development of a “robotic system that will be able to drill, maneuver, localize, map and navigate in the underground space, and which will be equipped with tools for constructing horizontal and vertical networks of stable bores and pipelines.” Called BADGER (roBot for Autonomous unDerGround trenchless opERations, mapping and navigation), it is both an interesting and innovative idea and also the most ridiculous acronym I have ever seen.</span>
</p>
<p>
<span id="docs-internal-guid-f81c6bac-d15a-644f-6318-02ba2aec9bf3">Construction companies already employ a wide variety of digging machinery to help excavate underground and install pipes and cables. But this equipment has limitations: Typically, it is unable to detect and navigate around rocks, pipes, roots, and other obstacles on its own, so it relies on humans to determine exactly where to dig. Furthermore, these machines are designed to travel in mostly straight lines and can’t build a winding, intricate network of tunnels. They’re also expensive to operate. As a result, construction companies often end up opening trenches, installing the pipes into the hole, and then covering it all up again.</span>
</p>
<p>
<span>B</span>ADGER, in contrast, will be able to autonomously burrow under the ground to create channels for the pipes, navigating around existing infrastructure while it does so. It will even be able to sort of 3D print walls for the conduit it creates as it goes:</p>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkyMjA1NA.jpeg" alt="BADGER underground drilling robot"/>
<figcaption class="hi-cap">
  Image: BADGER Project
 </figcaption>
</figure>
<p/>
<p/>
<p/>
<p>
<span id="docs-internal-guid-f81c6bac-d15a-644f-6318-02ba2aec9bf3">The drilling mechanism will be a combination of rotary and impact drilling tech, along with “a novel . . . ultrasonic drill-tool” designed to “foster pulverization of the rock.” All that pulverized rock will be sucked up and flushed out the back of the robot to keep the tunnel clear. The robot will propel itself through bioinspired peristaltic motion, kind of like a worm. Or your intestines. </span>
<span>The entire thing is modular, and the drive modules, joint modules, and tool modules can all be swapped out depending on what you’re trying to accomplish. Ground penetrating radar antenna arrays, electronic navigation sensors, and lasers help keep the robot on course via underground SLAM while avoiding underground obstacles like rocks, other pipes, and mole people.</span>
</p>
<p/>
<p>
<span id="docs-internal-guid-f81c6bac-d15a-644f-6318-02ba2aec9bf3">The BADGER project currently involves seven institutions from five European countries. It is coordinated by </span>
<a shape="rect" href="http://roboticslab.uc3m.es/roboticslab/user/9">Professor Carlos Balaguer</a>, Santiago Martínez de la Casa, and Carme de Andrés Sanchis from the RoboticsLab at the University Carlos III of Madrid. The project<span> only started this January (and it’s been funded for </span>€3.7 million over the next three years)<span>, so it’s understandable that they don’t yet have any prototype hardware that we’re able to show you. Eventually the idea is to end up with a “robotic system, following a highly modular approach and architecture, while at the same time ensuring reliable and dependable operation in real-life underground environments in synergy with existing robust market technologies for trenchless applications.” Sounds good! Just make sure and look out for those mole people, okay?</span>
</p>
<p/>
<p>
<span id="docs-internal-guid-f81c6bac-d15a-644f-6318-02ba2aec9bf3">[ <a shape="rect" href="http://badger-robotics.eu/badger/">BADGER Project</a> ]</span>
</p>
<p/>
<div/>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/iPgYkYuPfdI" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Wed, 5 Jul 2017 20:20:00 GMT</pubDate>
<dc:creator>Evan Ackerman</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/automaton/robotics/industrial-robots/eu-developing-robot-badgers-for-underground-excavation</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMjA3MA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMjA2OA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/automaton/robotics/industrial-robots/eu-developing-robot-badgers-for-underground-excavation</feedburner:origLink></item>
<item>
<title>Volvo Says Goodbye To Gasoline</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/l4onMs2h4-U/volvo-says-goodbye-to-gasoline</link>
<description>It's the first major carmaker to switch to electric drive, starting in model year 2019</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>It's the first major carmaker to switch to electric drive, starting in model year 2019<figure>
<img src="http://spectrum.ieee.org/image/MjkyMTkyNw.octet-stream"/>
<figcaption>Illustration: Volvo</figcaption>
</figure>
<div>
<p>Volvo says all of its 2019 models will have electric drive, making it the first big auto company to move away from traditional internal-combustion engines.</p>
<p>“This announcement marks the end of the solely combustion engine-powered car,” chief executive Håkan Samuelsson <a shape="rect" href="https://www.media.volvocars.com/us/en-us/media/pressreleases/210058/volvo-cars-to-go-all-electric">said in a statement.</a> “Volvo Cars has stated that it plans to have sold a total of one million electrified cars by 2025. When we said it we meant it. This is how we are going to do it.”</p>
<figure role="img" class="rt med-sm">
<img src="http://spectrum.ieee.org/image/MjkyMTkyNg.octet-stream" alt="photo of Håkan Samuelsson, President &amp; CEO, Volvo Car Group"/>
<figcaption class="hi-cap">
  Photo: Volvo
 </figcaption>
</figure>
<p>Samuelsson’s strategy, in a nutshell, is to begin with a few pure electric models and a larger number of hybrid-electric ones. True, every vehicle will run, at least in part, on electrons, but some will also include either the vestiges of an internal combustion engine—in a so-called plug-in hybrid, where the engine is basically just a range extender—or a beefier one, in a mild hybrid.</p>
<p>But by 2019 five new models will run purely on electricity, from either batteries or some other form of stored electric power, such as fuel cells. Two of those all-electrics will be high-performance models to be sold by Polestar, a Volvo subsidiary built in response to Tesla’s challenge. BMW has something like it in the form of BMW i, a brand rather than an independent subsidiary.</p>
<p>Volvo is based in Sweden, but it’s owned by Geely, a Chinese multinational. And China has done more than any other big car country to subsidize electric drive, with some 200 entities now chasing the technology. That’s too many, the government now appears to think, given that it <a shape="rect" href="https://www.ft.com/content/891d8264-5016-11e7-bfb8-997009366969?mhq5j=e1">cut subsidies by 20 percent earlier this year</a> and is taking further steps to weed out the startups that haven’t delivered much on their promises.</p>
<p>It could be that the Chinese want to concentrate electric drive in the hands of a few big players that would then benefit, as much as possible, from economies of scale. Battery prices are falling, and though the total ownership cost of an electric model is still high, it’s coming within reach of the internal combustion–powered competition.</p>
<p>Today the plug-in hybrid version of the <a shape="rect" href="http://spectrum.ieee.org/transportation/advanced-cars/2016s-top-ten-tech-cars-volvo-xc90">Volvo XC90</a> SUV has a suggested retail price of <a shape="rect" href="http://www.motortrend.com/cars/volvo/xc90-plug-in/2017/2017-volvo-xc90-t8-plug-in-hybrid-first-test-review/">US $67,800</a>, or about $18,000 more than the <a shape="rect" href="http://www.motortrend.com/cars/volvo/xc90/2016/2016-volvo-xc90-awd-inscription-review-arrival/">gasoline-powered version</a>. And that doesn’t include a tax credit worth as much as $4,600. Also, the watt-hours it drinks cost less per trip than the equivalent energy in gasoline.</p>
<p>The real cost benefits will come when the engine takes its final bow. Then cars will come with just one power train rather than two, and it will be by far the simpler one, and thus <a shape="rect" href="http://spectrum.ieee.org/cars-that-think/transportation/self-driving/why-robocars-will-run-on-electricity">cheaper to maintain</a>. That, and the magic of mass production, will bring the cost of owning an electric car down to that of a conventional car, if not lower, even without the subsidy. According to researchers at the Swiss bank UBS, cost parity could come as early as <a shape="rect" href="https://www.ft.com/content/6e475f18-3c85-11e7-ac89-b01cc67cfeec?mhq5j=e1">next year</a> in Europe, and by around 2025 in the United States.</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/l4onMs2h4-U" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Wed, 5 Jul 2017 18:00:00 GMT</pubDate>
<dc:creator>Philip E. Ross</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/cars-that-think/transportation/advanced-cars/volvo-says-goodbye-to-gasoline</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMTk0Mw.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMTk0MQ.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/cars-that-think/transportation/advanced-cars/volvo-says-goodbye-to-gasoline</feedburner:origLink></item>
<item>
<title>Nondestructive Microscopy Technique Offers a Path Toward In-Silicon Quantum Computers</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/XvUcU6uIBfs/nondestructive-microscopy-technique-offers-a-path-toward-insilicon-quantum-computers</link>
<description>For first time, scientists can peer deep into silicon to image atoms far from the surface without damaging the sample</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>For first time, scientists can peer deep into silicon to image atoms far from the surface without damaging the sample<figure>
<img src="http://spectrum.ieee.org/image/MjkyMDcwOQ.jpeg"/>
<figcaption>Illustration: Science Advances</figcaption>
</figure>
<div>
<p>An international team of researchers has developed a nondestructive imaging technique that can peer deep inside of silicon to locate and characterize various structures. While this should be a boon for testing and measuring conventional silicon chips currently used in information processing, it could have its greatest impact by enabling the next generation of devices for quantum information processing.</p>
<p>In research described in the journal <a shape="rect" href="http://advances.sciencemag.org/content/3/6/e1602586">
<em>Science Advances</em>
</a>, researchers from the University of Linz in Austria, University College London, ETH Zurich, and École Polytechnique Fédérale de Lausanne in Switzerland have adapted the well-established microscopy technique known as Scanning Microwave Microscopy (SMM) to identify dopants deep inside the silicon without causing any damage to the material. (Dopants are atoms that are added to a semiconductor to change its electrical and optical properties.)</p>
<p>SMM is used for a wide range of applications, from characterizing biological cells to examining new materials, such as graphene, and standard semiconductor samples. It’s done by combining an Atomic Force Microscope (AFM)—which has a nanoscopic probe that’s scanned over the sample of interest—with a Vector Network Analyzer (VNA) that sends out a microwave signal from the AFM probe. The signal is reflected within the volume of the sample and measured by the VNA, which gives information about the three-dimensional structure and electrical properties of the sample.</p>
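The quantity the VNA records is, in essence, a complex reflection coefficient, which shifts whenever the local tip-sample impedance changes (for instance, over a buried dopant layer). A minimal sketch of that relationship, using a generic 50-ohm reference and invented capacitance values rather than anything from the paper:

```python
import cmath, math

# Sketch of the measurement idea behind SMM: the VNA records the complex
# reflection coefficient S11 of the microwave signal at the tip, and local
# changes in tip-sample capacitance (e.g., from buried dopants) shift it.
# The 50-ohm reference and the capacitance values are generic illustrative
# numbers, not taken from the paper.
Z0 = 50.0      # VNA reference impedance, ohms
FREQ = 3e9     # an assumed microwave frequency, Hz

def s11(capacitance_f):
    """Reflection coefficient of an idealized, purely capacitive contact."""
    z_load = 1 / (1j * 2 * math.pi * FREQ * capacitance_f)
    return (z_load - Z0) / (z_load + Z0)

# A doped region raises the local tip-sample capacitance slightly; the VNA
# sees that as a phase shift in the reflected wave, while the magnitude of
# a lossless reflection stays near 1.
for c in (1e-15, 2e-15):    # 1 fF vs. 2 fF
    print(abs(s11(c)), cmath.phase(s11(c)))
```

The point of the sketch is only that a small capacitance change produces a measurable phase change; the real instrument's calibration and contrast mechanisms are far more involved.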
<p>The researchers used this SMM technique to image the electrical properties of a patterned layer of phosphorus atoms under a silicon surface. By using this technique, the researchers were able to image 1,900 to 4,200 densely packed atoms buried four to 15 nanometers below the surface.</p>
<p>Of course, there are other microscopy techniques, like Secondary Ion Mass Spectrometry (SIMS), that can image these dopants. However, the main advantage of SMM is that it doesn’t modify or damage the sample.</p>
<p>“We see potentially a global impact with our technique for standard silicon chips, which are becoming so sophisticated and complex that taking snapshots of their smallest working parts is extremely difficult and time-consuming, and currently involves destroying the chip,” explained Georg Gramse, a post-doc at the University of Linz who led the research, in an e-mail interview with <em>IEEE Spectrum</em>.</p>
<p>Gramse also notes that non-destructive imaging technologies are also becoming important for governments who are interested in knowing what is inside the foreign-made electronics they are using.</p>
<p>While SMM’s non-destructive scanning is sure to be of assistance to those making silicon chips for classical information processing, Gramse believes it will likely have an enormous impact on the fabrication of phosphorus-in-silicon quantum computers.</p>
<p>
<a shape="rect" href="http://spectrum.ieee.org/searchContent?q=quantum+computers">Quantum computers</a> operate quite differently from classical computers that switch transistors either on or off to represent data as ones and zeroes. Instead, quantum computers use quantum bits that, because of the laws of quantum mechanics, can be in a state of superposition where they simultaneously act as both 1 and 0.</p>
<p>Four years ago, an initial step was taken towards making it possible to fabricate such quantum computers by using the same silicon used in today’s computers. The trick was to <a shape="rect" href="http://spectrum.ieee.org/computing/hardware/a-big-step-toward-a-silicon-quantum-computer">implant a phosphorus atom in the silicon</a>. This approach manages to use the nuclear spin of the phosphorus atom that is embedded in silicon as a quantum bit, or qubit.</p>
<p>This latest research offers an important advance toward realizing these phosphorus-in-silicon devices, since the SMM can be integrated into the same scanning probe instrument used to pattern the silicon devices. This would dramatically increase the speed of producing three-dimensional patterned structures, because that same instrument also allows in situ and iterative control during the entire lithography and molecular beam epitaxy (MBE) process for atomic-scale doping.</p>
<p>Gramse adds: “Currently we are studying the physical behavior of phosphorus layers on powered devices, which is the next step on the way towards in-silicon quantum computers.”</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/XvUcU6uIBfs" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Wed, 5 Jul 2017 17:00:00 GMT</pubDate>
<dc:creator>Dexter Johnson</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/nondestructive-microscopy-technique-offers-a-path-toward-insilicon-quantum-computers</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMDcyNg.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMDcyNA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/nondestructive-microscopy-technique-offers-a-path-toward-insilicon-quantum-computers</feedburner:origLink></item>
<item>
<title>Under the Hood of Luminar's Long-Reach Lidar</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/fFvtMMmhA_M/under-the-hood-of-luminars-long-reach-lidar</link>
<description>Shifting to a longer wavelength that's safer for the eye lets Luminar raise its lidar power enough to stretch its range beyond 200 meters. Other innovations could cut system costs.</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Shifting to a longer wavelength that's safer for the eye lets Luminar raise its lidar power enough to stretch its range beyond 200 meters. Other innovations could cut system costs.<figure>
<img src="http://spectrum.ieee.org/image/MjkyMTgxOA.gif"/>
<figcaption>Gif: Luminar Technologies/IEEE Spectrum</figcaption>
</figure>
<div>
<p>Current automotive lidars scan their surroundings by firing pulses from semiconductor diode lasers emitting at 905 nanometers in the near infrared, recording the reflected light to build up a point cloud of the car’s environment. But laser-safety rules in the U.S. and other countries restrict the power in the laser pulse, limiting the lidar’s range to 30 to 40 meters, too short a distance for a car to stop safely at highway speeds. Makers of autonomous cars need to spot low-reflectivity objects at least 200 meters away to give the car enough time to identify hazards and stop, so they turned to other technologies. At least one lidar maker, however, kept tinkering. </p>
<p>This spring, the U.S. startup Luminar Technologies announced a lidar able to achieve that 200-meter range by using a 1550-nm laser and <a shape="rect" href="http://spectrum.ieee.org/cars-that-think/transportation/sensors/22yearold-lidar-whiz-claims-breakthrough">showed its system at <em>Spectrum</em>’s offices</a>. <span>Last week </span>
<a shape="rect" style="font-family: Georgia, serif; font-size: 18px;" href="http://optics.org/news/8/6/40">Luminar mounted its system on a standard Mercedes-Benz and showed it outside the Laser World of Photonics trade show in Munich</a>.</p>
<figure role="img" class="rt med">
<img src="http://spectrum.ieee.org/image/MjkyMTgzNg.jpeg" alt="photo of Luminar's lidar"/>
<figcaption class="hi-cap">
  Photo: Randi Klett
 </figcaption>
<figcaption>
  Luminar’s lidar contains one laser, one sensor, and moving optics.
 </figcaption>
</figure>
<p>The key to that success is looser safety rules that let Luminar’s lasers fire pulses 40 times more powerful at 1550 nm than 905-nm lasers are allowed to fire in a conventional car. That <a shape="rect" href="https://www.luminartech.com/technology/index.html">extra power extends the long-wavelength lidar’s range by a factor of 10 and its resolution by a factor of 50.</a> That extra margin makes a big difference at highway speeds—at 75 miles per hour, a car travels 34 meters in a second.</p>
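The highway-speed arithmetic can be checked in a few lines. This is a back-of-envelope sketch with assumed figures (a 1.5-second reaction budget and braking at roughly 0.7 g on dry pavement); neither number comes from Luminar:

```python
# Why ~200 m of sensing range matters at highway speed. The reaction time
# and braking deceleration below are illustrative assumptions.
G = 9.81              # gravitational acceleration, m/s^2
MPH_TO_MS = 0.44704   # miles per hour -> meters per second

def stopping_distance(speed_mph, reaction_s=1.5, decel_g=0.7):
    """Distance covered during the reaction time plus braking to a stop."""
    v = speed_mph * MPH_TO_MS           # speed in m/s
    reaction = v * reaction_s           # distance before braking begins
    braking = v**2 / (2 * decel_g * G)  # from v^2 = 2*a*d
    return reaction + braking

print(f"75 mph = {75 * MPH_TO_MS:.1f} m/s")   # ~33.5 m covered every second
print(f"stop from 75 mph: {stopping_distance(75):.0f} m")
```

Under these assumptions the car needs on the order of 130 meters to stop, so a 30-to-40-meter sensing range leaves no margin at all, while 200 meters leaves room for detection and decision-making as well.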
<p>The difference in laser safety rules comes from a crucial difference in hazards posed by the two wavelengths. The part of the body most vulnerable to damage from a laser beam is the retina at the back of the eye. Just staring at the sun can damage the fragile retina, which is why our eyes instinctively turn away if the sun comes into our field of view. Because laser light is collimated into a narrow beam, it’s particularly hazardous: The eye focuses the parallel rays to a tiny point on the retina, which is why safety rules limit laser-pointer beams to five milliwatts.</p>
<p>The retina does not respond to the 905-nm infrared light used in current car lidars, so we can’t see it. But the eye transmits 905 nm to the retina, so it’s subject to the same restrictions as visible light. In fact, it’s even more hazardous because the eye cannot automatically turn away from a bright source that the retina can’t sense.</p>
<figure role="img" class="rt lrg">
<img src="http://spectrum.ieee.org/image/MjkyMTgzNw.jpeg" alt="eye diagram showing lidar visibility to retina"/>
<figcaption class="hi-cap">
  Illustration: Jeff Hecht/WPClipart
 </figcaption>
<figcaption>
  The eye’s interior is transparent to 905-nm light [blue arrow]; lidar at that wavelength can reach the vulnerable retina. However, the eye is opaque to 1550-nm light [red arrows] so that wavelength cannot reach the retina, allowing the use of higher power lidar without endangering the eye.
 </figcaption>
</figure>
<p>The interior of the eye—the lens, the cornea, and the watery fluid inside the eyeball—becomes less transparent at longer wavelengths, mainly because of water absorption. Essentially no light at wavelengths beyond 1400 nm reaches the retina, so laser safety standards allow higher output at longer wavelengths. Long wavelengths can damage the cornea, but only at intensities orders of magnitude higher than the threshold for retinal damage.</p>
<p>Police lidar guns use 905-nm lasers because they don’t need long range, the lasers are cheap, and the guns can use inexpensive silicon detectors. But developers of non-automotive lidars consider 1550 nm a “sweet spot,” says lidar pioneer Dennis Killinger, distinguished university professor emeritus at the University of South Florida. Those lidars are used in open air, so using an eye-safe wavelength allows long-range measurement without worrying about eye damage to bystanders. <a shape="rect" href="http://seminex.com/1550nm-lasers.aspx">Lasers emitting at 1550 nm are widely used in telecommunications and other fields, and available from companies like SemiNex in Massachusetts</a>. Silicon sensors don’t respond to 1550 nm, but room-temperature indium-gallium arsenide sensors do, and they also are standard communication products.</p>
<p>Lasers and sensors for 1550 nm cost more than those for 905 nm, but Killinger says what contributes the most to the cost of today’s US $10,000-range automotive lidars are complex opto-mechanical scanners that spin assemblies of up to dozens of lasers and sensors many times a second. He says the best hope for slashing lidar prices to affordable levels is to replace today’s costly opto-mechanical scanners with optical versions of phased-array antennas. Used in radars and in cell phones, they contain slotted elements arranged side by side, with electronic signals directing their focus without moving the elements.</p>
<p>“Several groups are working on small, slotted, phased-array optical arrays with flat solid-state panels,” says Killinger. Last year <a shape="rect" href="http://spectrum.ieee.org/tech-talk/semiconductors/optoelectronics/mit-lidar-on-a-chip">MIT’s Photonic Systems Group and DARPA described solid-state lidars fabricated on 300-millimeter wafers that they said could be mass-produced for $10 each</a>. <a shape="rect" href="http://spectrum.ieee.org/cars-that-think/transportation/sensors/velodyne-announces-a-solidstate-lidar">Three companies announced forthcoming solid-state lidars earlier this year</a>, with projected prices as low as $100. </p>
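The steering principle Killinger describes can be sketched numerically: give each element a phase offset proportional to its position, and the combined wavefront peaks in the chosen direction with no moving parts. The element count, wavelength, and half-wavelength pitch below are illustrative assumptions, not any announced product's design:

```python
import numpy as np

# Toy phased-array beam steering at an optical wavelength. Values are
# illustrative: a 1550-nm carrier and lambda/2 element spacing.
WAVELENGTH = 1.55e-6   # m (1550 nm)
PITCH = 0.775e-6       # element spacing, here half a wavelength

def steering_phases(n_elements, theta_deg):
    """Per-element phase shift (radians) that steers the main lobe to theta."""
    theta = np.radians(theta_deg)
    n = np.arange(n_elements)
    return 2 * np.pi * PITCH * n * np.sin(theta) / WAVELENGTH

def array_factor(phases, theta_deg):
    """Normalized field strength radiated toward theta for given phases."""
    theta = np.radians(theta_deg)
    n = np.arange(len(phases))
    geometric = 2 * np.pi * PITCH * n * np.sin(theta) / WAVELENGTH
    return abs(np.exp(1j * (geometric - phases)).sum()) / len(phases)

phases = steering_phases(64, 10.0)   # steer a 64-element array to +10 degrees
print(array_factor(phases, 10.0))    # ~1.0: the elements add in phase there
print(array_factor(phases, 0.0))     # small: little energy goes straight ahead
```

Changing the electronic phase profile re-points the beam in microseconds, which is what lets a solid-state lidar scan without the spinning assemblies that dominate current costs.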
<figure role="img" class="rt med-lrg">
<img src="http://spectrum.ieee.org/image/MjkyMTgzOA.jpeg" alt="Luminar's lidar mounted in a Mercedes-Benz"/>
<figcaption class="hi-cap">
  Photo: Luminar Technologies
 </figcaption>
<figcaption>
  Luminar’s lidar looks forward in a test car.
 </figcaption>
</figure>
<p>Luminar is pinning its hopes on a different approach—mounting lidars pointing in different directions on the car. Each contains only a single laser and receiver, plus customized mirrors that move inside the sealed box. They don’t expect a $100 lidar anytime soon, but <a shape="rect" href="http://spectrum.ieee.org/cars-that-think/transportation/sensors/22yearold-lidar-whiz-claims-breakthrough">Luminar founder and CEO Austin Russell told <em>Spectrum</em> earlier this year that for autonomous cars, “Cost is not the most important issue; performance is.”</a>
</p>
<p/>
<p>
<em>This post was corrected on 6 July to give the proper full name of Luminar Technologies.</em>
</p>
<p/>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/fFvtMMmhA_M" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Wed, 5 Jul 2017 15:00:00 GMT</pubDate>
<dc:creator>Jeff Hecht</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/cars-that-think/transportation/self-driving/under-the-hood-of-luminars-long-reach-lidar</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMTgzMw.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMTgzNQ.gif" height="" width="" />
<feedburner:origLink>http://spectrum.ieee.org/cars-that-think/transportation/self-driving/under-the-hood-of-luminars-long-reach-lidar</feedburner:origLink></item>
<item>
<title>Autonomous Vehicles vs. Kangaroos: the Long Furry Tail of Unlikely Events</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/s3XDBVuFY5A/autonomous-cars-vs-kangaroos-the-long-furry-tail-of-unlikely-events</link>
<description>Self-driving cars in Australia are preparing to handle kangaroos, but what about autonomous cars everywhere else?</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Self-driving cars in Australia are preparing to handle kangaroos, but what about autonomous cars everywhere else?<figure>
<img src="http://spectrum.ieee.org/image/MjkyMTU0MQ.jpeg"/>
<figcaption>Photo: Theo Allofs/Minden Pictures/Getty Images</figcaption>
</figure>
<div>
<p>
<span id="docs-internal-guid-de32d100-f565-5b80-82cb-6193f538a179">One of the biggest challenges for autonomous cars is being able to deal with potentially dangerous events that don't happen very often. This is why most autonomous car companies are spending an enormous amount of time and effort driving around and collecting sensor data: they’re hoping to expose their algorithms to as many unusual things as possible, to give them more experience with making decisions even when weird stuff happens. “Weird stuff that's hard to deal with” can mean almost anything, but it tends to mean different things in different places. In San Francisco, it probably means bicycles. In Sweden, it means moose. In Australia, it means kangaroos, and according to Volvo, <a shape="rect" href="http://jalopnik.com/volvos-driverless-cars-cant-figure-out-kangaroos-1796418109">kangaroos are proving to be very weird stuff indeed</a>.</span>
</p>
<p>
<span id="docs-internal-guid-de32d100-f565-5b80-82cb-6193f538a179">Most autonomous cars do have animal detection systems, but they’re primarily designed for detecting medium to large quadrupeds, like cats, dogs, and deer. Kangaroos not only don’t look like quadrupeds, they don’t act like quadrupeds either, and all of that charming hopping about really does a number on an autonomous system trying to figure out where they are and what they’re doing, <a shape="rect" href="http://www.abc.net.au/news/2017-06-24/driverless-cars-in-australia-face-challenge-of-roo-problem/8574816">as ABC News Australia reported last week</a>:</span>
</p>
<blockquote>
<p>
<em>
<span id="docs-internal-guid-de32d100-f565-5b80-82cb-6193f538a179">It turns out the unusual way that kangaroos move completely throws off the car's animal detection system. “We've noticed with the kangaroo being in mid-flight ... when it’s in the air it actually looks like it’s further away, then it lands and it looks closer,” Volvo Australia’s technical manager David Pickett said. Because the cars use the ground as a reference point, they become confused by a hopping kangaroo, unable to determine how far away it is.</span>
</em>
</p>
</blockquote>
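Pickett's explanation maps onto a simple piece of geometry: a sensor at a known height can infer an object's distance from the angle at which the object's base meets the road. A toy sketch, with an invented camera height and an invented hop, shows why an airborne kangaroo fools that model:

```python
import math

# Toy illustration of the "ground as reference" heuristic described above:
# a camera at known height estimates distance from where an object touches
# the road in the image. All numbers here are invented for the example.
CAMERA_HEIGHT = 1.2   # m above the road

def ground_plane_distance(depression_deg):
    """Distance to a road point seen at this angle below the horizon."""
    return CAMERA_HEIGHT / math.tan(math.radians(depression_deg))

# A kangaroo standing on the road 15 m away:
angle_standing = math.degrees(math.atan(CAMERA_HEIGHT / 15.0))
print(ground_plane_distance(angle_standing))   # recovers ~15 m

# The same kangaroo mid-hop, body 1 m off the ground: its lowest point sits
# higher in the image (a smaller depression angle), so the ground-plane
# model reports it much farther away than it really is.
angle_airborne = math.degrees(math.atan((CAMERA_HEIGHT - 1.0) / 15.0))
print(ground_plane_distance(angle_airborne))   # ~90 m: wildly overestimated
```

In this made-up geometry, one meter of hop inflates the estimated range by a factor of six, which is exactly the "looks further away in the air, closer when it lands" behavior Pickett describes.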
<p>
<span id="docs-internal-guid-de32d100-f565-5b80-82cb-6193f538a179">Furthermore, kangaroo identification is made more complex because of how different kangaroos look depending on what they're up to. To an autonomous system, a standing kangaroo looks different than a grazing kangaroo, which looks different still than a hopping kangaroo. What’s more, any given kangaroo can rapidly (and sometimes erratically) switch between these three states. The upshot is that you need a system that's able to reliably identify kangaroos despite this variability, which is a very complicated thing to put together. Apparently, Volvo sent a research team to Tidbinbilla Nature Reserve in Canberra 18 months ago, specifically to study kangaroos. They’re still working on solving this “quite interesting” problem.</span>
</p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/xZOfu7Jf0jw?rel=0" width="620" frameborder="0" height="349"/>
</p>
<p>
<span id="docs-internal-guid-de32d100-f565-5b80-82cb-6193f538a179">The kangaroo problem illustrates several potential issues with autonomous cars. First, there's localization—not the “where am I” localization, but rather the “being prepared for a new place” localization. Practically, this involves tweaking software to recognize and handle the particular characteristics of a new environment, which may include differently oriented traffic lights, new types of street signs, foreign languages, completely new rules of the road, or (in the case of Australia) marsupials with cyclically variable altitudes. </span>
</p>
<p>
<span id="docs-internal-guid-de32d100-f565-5b80-82cb-6193f538a179">Secondly, kangaroos are, everywhere but in Australia, a good example of autonomous cars' long tail problem (heh): as the number of possible driving situations you may encounter approaches infinity, the probability of encountering any particular one of them approaches zero. In other words, lots of weird stuff could potentially happen, and there's no way to predict all of it. If you're driving around in the United States, encountering a kangaroo would be very weird, but it's by no means impossible: here's one that was running around eastern Oklahoma in December of 2013:</span>
</p>
<p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="//www.youtube.com/embed/Obs7NTWx9wQ?rel=0" width="620" frameborder="0" height="349"/>
</p>
<p>
<span id="docs-internal-guid-de32d100-f565-5b80-82cb-6193f538a179">As it turns out, that particular kangaroo was not </span>
<a shape="rect" href="http://kfor.com/2012/11/26/missing-pet-kangaroo-lucy-sparkles/">Lucy Sparkles</a>, a kangaroo that had gone missing in Oklahoma the year before. It was also not a kangaroo that belonged to a nearby exotic animal farm (all of their kangaroos were accounted for). Eventually, the kangaroo was traced to (and subsequently recaptured by) <a shape="rect" href="http://www.news9.com/story/24299464/oklahoma-hunter-catches-kangaroo-on-camera">another guy in the same town who also had a bunch of kangaroos for some reason</a>. So yeah, a kangaroo encounter is probably more likely than you think.</p>
<p>
<span id="docs-internal-guid-de32d100-f565-5b80-82cb-6193f538a179">The question raised by this long and sometimes furry tail of unlikely events (including unconstrained extra-Australian macropods) is to what extent autonomous car companies should reasonably be expected to anticipate and devote resources to preparing for things that almost never happen. It seems obvious that an autonomous car (in North America) should know what to do when it senses a deer on the road. You're less likely to encounter a moose: should your car know what to do then? What about a bear? I’m sure kangaroos are farther down the list, but it’s a list that's infinitely long, and there’s no clear point at which a potential situation goes from “worth preparing for” to “not worth the effort.” More to the point, if your autonomous car runs into an occasionally airborne pouched hopper, who is at fault?</span>
</p>
<p>
<span id="docs-internal-guid-de32d100-f565-5b80-82cb-6193f538a179">We don't mean to suggest that an autonomous car that has not been specifically programmed to avoid a kangaroo is just going to plow straight into the first kangaroo that it sees, since “Don't hit things” is useful, practical advice that tends to be pretty high up on an autonomous car's generalized priority list. However, there are many different ways of not hitting a thing depending on what that thing is, and sometimes that advice can even cause problems if the car senses something that it doesn't understand. A common example might be a plastic bag blowing around in the middle of a highway. Since we humans understand what a plastic bag is and what the consequences would be if we were to collide with it (probably mild) as opposed to what the consequences would be if we tried not to hit it (possibly serious), we can make the correct decision about what to do. And thanks to having lots of experience driving, and also just generally living in the world, we can rapidly assess unfamiliar objects and make informed decisions about them. Robots of all kinds are notoriously bad at this.</span>
</p>
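<p>The plastic-bag dilemma is, at bottom, an expected-cost comparison: weigh the likely harm of driving through the object against the risk of an emergency maneuver. Here is a minimal sketch of that tradeoff; the object classes and cost numbers are invented for illustration and are not any manufacturer's actual planning logic.</p>

```python
# Toy expected-cost comparison for "hit it or swerve?" decisions.
# All object classes, probabilities, and cost values are made up
# for illustration -- no real planner works from a table like this.

# cost_hit:   estimated harm from driving through the object
# cost_avoid: estimated risk of a hard brake or swerve maneuver
OBJECT_COSTS = {
    "plastic_bag": {"cost_hit": 1,   "cost_avoid": 50},
    "deer":        {"cost_hit": 500, "cost_avoid": 50},
    "kangaroo":    {"cost_hit": 500, "cost_avoid": 50},
}
UNKNOWN = {"cost_hit": 500, "cost_avoid": 50}  # assume the worst for unknowns

def choose_action(detected_class: str) -> str:
    """Pick the cheaper of driving on vs. evading."""
    costs = OBJECT_COSTS.get(detected_class, UNKNOWN)
    return "drive_on" if costs["cost_hit"] < costs["cost_avoid"] else "evade"

print(choose_action("plastic_bag"))  # a bag is cheaper to ignore
print(choose_action("kangaroo"))     # an animal is worth evading
print(choose_action("wombat"))       # unknowns default to caution
```

<p>The hard part, of course, is the perception step this sketch skips: assigning the right class (and the right costs) to something the car has never seen before.</p>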
<p>
<span id="docs-internal-guid-de32d100-f565-5b80-82cb-6193f538a179">Volvo isn't particularly worried about kangaroos specifically; they're a challenge, but an identifiable and easily definable (and therefore solvable) one. There are other Australian things that autonomous cars need to be prepared for as well, including unsealed roads, unmarked roads, and road trains. But again, the real problem here is that Volvo, and every other autonomous car company, has no idea what might try to cross the road. Whether it's a kangaroo, a human on a pogo stick, or something far stranger, autonomous cars will (somehow) need to be ready for it.</span>
</p>
<div/>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/s3XDBVuFY5A" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Wed, 5 Jul 2017 13:30:00 GMT</pubDate>
<dc:creator>Evan Ackerman</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/cars-that-think/transportation/self-driving/autonomous-cars-vs-kangaroos-the-long-furry-tail-of-unlikely-events</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMTU1NA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMTU1Mg.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/cars-that-think/transportation/self-driving/autonomous-cars-vs-kangaroos-the-long-furry-tail-of-unlikely-events</feedburner:origLink></item>
<item>
<title>Anki Makes Programming Easy with Drag and Drop Coding</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/96JYZc8PoG0/anki-makes-programming-easy-with-drag-and-drop-coding</link>
<description>With an easy-to-use interface based on Scratch, you can now command Anki's Cozmo to do complex tasks without any programming experience</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>With an easy-to-use interface based on Scratch, you can now command Anki's Cozmo to do complex tasks without any programming experience<figure>
<img src="http://spectrum.ieee.org/image/MjkyMTQyOQ.jpeg"/>
<figcaption>Celia Gorman</figcaption>
</figure>
<div>
<p>When <a shape="rect" href="https://anki.com/en-us/cozmo">Anki introduced Cozmo</a> almost exactly one year ago, <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/home-robots/anki-cozmo-robotic-toy">we started off with a bit of skepticism</a>, and a feeling that Anki was going slightly overboard with the kinds of promises that it was making for this cute and capable little robot. What was more exciting to us was when Anki followed up a few weeks later with <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/robotics-software/anki-sdk-cozmo-robot">Cozmo’s software development kit</a>, or SDK, allowing access to a variety of very sophisticated features through relatively simple lines of code.</p>
<p/>
<p>This week, <a shape="rect" href="https://anki.com/en-us/cozmo/code-lab">Anki announced Code Lab</a>, which takes that SDK and adds a graphical drag-and-drop interface that makes it incredibly simple to get Cozmo to do complex tasks involving vision, manipulation, and decision making, even if you have zero programming experience (like me). It’s fun, it’s easy, it’s affordable, and last week, I tried it out for myself, with a little help from Anki co-founder and president <a shape="rect" href="https://anki.com/en-us/company">Hanns Tappeiner</a>.</p>
<p/>
<p>For an absolute amateur like me, even an easy SDK is over my head. But Anki’s new Code Lab is designed to be used by anyone—including 5-year-old children. Cozmo is charming when it learns your name and face. Code Lab draws you in with coding challenges that are more games than lessons. And now even I can program a facial recognition app.</p>
<p/>
<p>Read More: <a shape="rect" href="http://spectrum.ieee.org/automaton/robotics/diy/anki-code-lab-brings-sophisticated-graphical-programming-to-cozmo-robot">Anki's Code Lab Brings Sophisticated Graphical Programming to Cozmo Robot</a>
</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/96JYZc8PoG0" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Sat, 1 Jul 2017 13:00:00 GMT</pubDate>
<dc:creator>Celia Gorman</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/video/robotics/diy/anki-makes-programming-easy-with-drag-and-drop-coding</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMTQ0NQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMTQ4Ng.gif" height="" width="" />
<feedburner:origLink>http://spectrum.ieee.org/video/robotics/diy/anki-makes-programming-easy-with-drag-and-drop-coding</feedburner:origLink></item>
<item>
<title>The Corporate Blockchain</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/eHE3_kNFqAM/the-corporate-blockchain</link>
<description>It looks a lot different than its decentralized predecessors. Can it last?</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>It looks a lot different than its decentralized predecessors. Can it last?<figure>
<img src="http://spectrum.ieee.org/image/MjkyMTE4MA.jpeg"/>
<figcaption>Illustration: Blockchain Technologies</figcaption>
</figure>
<div>
<p>Hundreds of financiers, Wall Street analysts, and C-suite executives gathered in New York City this week to peer into the future of finance at the CB Insights’ <a shape="rect" href="http://events.cbinsights.com/future-of-fintech">Future of Fintech</a> conference. And on Wednesday afternoon, they took a moment to ponder one of the greatest existential threats to their industry—and how they might turn it to their advantage.</p>
<p/>
<p>Attendees crammed into a standing-room-only session to hear about the role that blockchains would play in existing businesses. To many in finance, it’s a perplexing topic. After all, the Bitcoin blockchain was long ago predicted to render modern finance—and financial firms—obsolete.</p>
<p/>
<p>Instead, many financial firms have embraced blockchain technology, and even become rather bullish about it in the process. <span>But companies have also found that preparing a blockchain to go live, and integrating it with existing systems, can be a daunting process. </span>
</p>
<p/>
<p>Up on stage, and tasked with guiding the crowd through its mixed bag of emotions, were: <a shape="rect" href="https://www.linkedin.com/in/marleygray/">Marley Gray</a>, principal program manager for Microsoft’s Azure Blockchain Engineering; <a shape="rect" href="https://consensys.net/team/">Joe Lubin</a>, founder of the blockchain consulting firm ConsenSys; and <a shape="rect" href="https://www.linkedin.com/in/rumimorales/">Rumi Morales</a>, executive director of <a shape="rect" href="http://www.cmegroup.com/cme-ventures.html">CME Ventures</a>, the investment arm of CME Group, which operates the Chicago Mercantile Exchange.</p>
<p/>
<p>Gray set the tone for the discussion from his vantage point at Microsoft, which offers a platform that it calls <a shape="rect" href="https://azure.microsoft.com/en-us/solutions/blockchain/">blockchain-as-a-service</a> (BaaS) to help companies build their own blockchain-based networks and applications. As a result, Gray has seen how early experiments have fared across many industries.</p>
<p/>
<p>“One of our goals was to make it ridiculously easy to roll [blockchains] out,” he said. “Now we’re at the next phase of—now I’ve got this blockchain, what do I do with it? So we’re kind of stuck on that piece right now.”   </p>
<p/>
<p>Many banks and stock exchanges are on the cusp of moving from pilots and proof-of-concepts to actual blockchain implementations. Morales, who has overseen her firm’s investments into <a shape="rect" href="https://ripple.com/">Ripple</a> and <a shape="rect" href="http://dcg.co/">Digital Currency Group</a> (which owns the cryptocurrency news site <a shape="rect" href="http://www.coindesk.com/">CoinDesk</a> and has funded <a shape="rect" href="https://www.coinbase.com/">Coinbase</a>, a trading service), suggested the industry is facing a moment of truth.</p>
<p/>
<p>“Last year, we saw a number of companies announcing that they would be building things, or had a use case, for [the blockchain],” she said. “This is the year they need to prove that.”  </p>
<p/>
<p>There has been some progress on that front—in May, Nasdaq, Citi, and Chain revealed a <a shape="rect" href="http://spectrum.ieee.org/tech-talk/telecom/internet/citi-launches-blockchainbased-payments-service-with-nasdaq-for-private-equity">blockchain-based payments system for private equity</a> and earlier this week, IBM<a shape="rect" href="https://www.reuters.com/article/us-banks-blockchain-ibm-idUSKBN19H2M6"> announced</a> that it was building a system to manage trade finance with seven European banks that would go live by the end of the year.</p>
<p/>
<p>But there’s a significant back-office bottleneck for people looking to deploy systems. Developers have a limited set of software tools at their disposal, and there is fierce competition for their talent. Consortiums, startups, and incumbents such as IBM and Microsoft are developing dozens of different ways to build blockchain-based networks and applications, without any reference architecture or standards to lean on.</p>
<p/>
<p>This process can be frustrating, to say the least, said Morales. “For many people I know, they’ve moved on to pulling out their eyelashes because they’ve finished pulling out their hair,” she said. “It can be very painful.”</p>
<p/>
<p>Even so, Morales and her fellow panelists were not keen on the idea of establishing comprehensive standards anytime soon. “I really think we’re going to have to be very, very specific about the definition of blockchain if we’re going to talk about standards,” she said.</p>
<p/>
<p>Gray from Microsoft put it more bluntly. “It’s way too early for standards,” he said.</p>
<p/>
<p>In the end, of course, the agony of blockchain development could very well result in big payoffs. For many, the thrill of the technology is its potential to overturn so many aspects of how business is done today. Throughout the week, I heard attendees and speakers batting around dozens of possible uses for blockchains in sessions and hallway meetings.</p>
<p/>
<p>On stage, Lubin described one of his favorite projects at ConsenSys—<a shape="rect" href="https://www.ethnews.com/consensys-introduces-grid-based-solution-for-energy-inefficiency">a solar power system</a> in which batteries automatically sell or buy extra juice through a blockchain, thereby improving the efficiency of the entire grid. “It prevents the need to spin up billion-dollar petrol plants to handle peak load in hot days in the summer,” he said.  </p>
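<p>The peer-to-peer energy trading Lubin describes rests on the core blockchain data structure: a ledger in which each record commits to a hash of the record before it. A minimal Python sketch follows, with hypothetical field names and no consensus or networking layer; a real system such as the Ethereum-based one ConsenSys builds on is vastly more involved.</p>

```python
import hashlib
import json

# Minimal hash-chained ledger sketch: each trade record stores the
# SHA-256 hash of the previous record, so tampering with any earlier
# trade invalidates every record after it.

def record_hash(record: dict) -> str:
    # Canonical JSON so the same record always hashes the same way
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def append_trade(chain: list, seller: str, buyer: str, kwh: float) -> None:
    prev = record_hash(chain[-1]) if chain else "0" * 64
    chain.append({"seller": seller, "buyer": buyer, "kwh": kwh, "prev": prev})

def chain_is_valid(chain: list) -> bool:
    return all(chain[i]["prev"] == record_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append_trade(ledger, "battery_A", "house_B", 2.5)
append_trade(ledger, "battery_C", "house_B", 1.0)
print(chain_is_valid(ledger))  # True
ledger[0]["kwh"] = 99.0        # tamper with an early trade...
print(chain_is_valid(ledger))  # ...and the chain no longer verifies: False
```

<p>That tamper-evidence is what lets batteries and meters that don't trust each other share one settlement record.</p>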
<p/>
<p>And for every discussion of a practical use that has already been identified, there were countless mentions of the technology’s unexplored possibility. “It’s like trying to predict Facebook back in 1995,” Gray said. “Who would have known?”</p>
<p/>
<p>While everyone else is dreaming about blockchain’s killer app, Gray believes the highest value of the technology will be to bridge industries and simplify all kinds of interactions across companies, individuals, public entities, and real-world events. “The true promise is ultimately getting to a place where we can have business contracts that weave together across verticals,” he said.</p>
<p/>
<p>This also means that Gray expects the current industry-wide preference for permissioned blockchains—those which are cordoned off from public access—will eventually erode. Instead, he thinks society will gradually embrace the power and functionality of decentralized, public chains, such as the one that underlies Bitcoin.</p>
<p/>
<p>First, though, public blockchains must prove that they can scale up to handle millions upon millions of transactions every day. Currently, no public blockchain can do this, said Lubin.</p>
<p/>
<p>Looking ahead, Lubin expects both public and private blockchains to evolve over a long development period that has only just begun. “Blockchains in two, five, and 10 years from now are going to look completely different,” he said.</p>
<p/>
<p>For all the work ahead, many speakers and attendees at the conference remained optimistic—and at times, positively upbeat—about the future of blockchain technology. For the finance industry, the promise of reducing costs, settling trades, and streamlining transactions is particularly intoxicating. “That gain is hopefully going to be worth the pain,” Morales said.</p>
<p>
<em>Editor’s note: This story was updated on 07/10/17 to clarify the roles of CoinDesk, the cryptocurrency news site, and Coinbase, the trading service. </em>
</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/eHE3_kNFqAM" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 19:00:00 GMT</pubDate>
<dc:creator>Amy Nordrum</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/tech-talk/computing/networks/the-corporate-blockchain</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMTIwMw.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMTIwMQ.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/tech-talk/computing/networks/the-corporate-blockchain</feedburner:origLink></item>
<item>
<title>‘NotPetya’: Latest Ransomware is a Warning Note From the Future</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/J0uHxK7wHcw/notpetya-latest-ransomware-is-a-warning-note-from-the-future</link>
<description>This week’s Ukrainian malware attack cribbed from last month’s ‘WannaCry’ ransomware outbreak—but foreshadows worse to come</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>This week’s Ukrainian malware attack cribbed from last month’s ‘WannaCry’ ransomware outbreak—but foreshadows worse to come<figure>
<img src="http://spectrum.ieee.org/image/MjkyMTQ2OQ.jpeg"/>
<figcaption>Photo: Vladimir Trefilov/Sputnik/AP Photo</figcaption>
</figure>
<div>
<p>[<em>Correction: An earlier version of this post inaccurately implied that the NSA did not inform Microsoft about the EternalBlue exploit. They did so once the NSA’s systems had been compromised. It also stated that Windows systems that are updated with latest Windows security updates will not be susceptible to the NotPetya ransomware. In fact, even patched systems could still be exploited via other means like NotPetya’s infection route via phony “updates” to the accounting program MeDoc. We apologize for the error.</em>]</p>
<p>First it “<a shape="rect" href="https://www.cnet.com/au/news/unprecedented-cyberattack-hits-businesses-across-europe/" target="_blank">slammed</a>” the Internet and “<a shape="rect" href="https://www.wired.com/story/petya-ransomware-outbreak-eternal-blue/" target="_blank">swept</a>” Europe, then it was “<a shape="rect" href="https://arstechnica.com/security/2017/06/petya-outbreak-was-a-chaos-sowing-wiper-not-profit-seeking-ransomware/" target="_blank">something much worse</a>,” and now it’s a “<a shape="rect" href="http://fortune.com/2017/06/29/police-suggest-petya-ransomware-attack-was-a-distraction/" target="_blank">distraction</a>.” This week’s “NotPetya” malware attack on Windows systems has, depending on who you believe, either spread like a devastating cyber-pandemic or amounted to an over-hyped flash-in-the-pan. </p>
<p>In the Ukraine, which took the <a shape="rect" href="https://en.wikipedia.org/wiki/2017_cyberattacks_on_Ukraine" target="_blank">brunt of the attack</a>, NotPetya certainly disrupted government and business operations, affecting hundreds of companies and offices. <span>The Russian government has been suspected as a possible origin for NotPetya, and on Friday NATO said they strongly suspected a “state actor” or private entity with close ties to a state. Yet, amidst speculation about the outbreak’s source, another part of the NotPetya story could be important down the line too: How might it inspire future malware outbreaks? </span>
</p>
<p>“It’s very disturbing that ransomware has started to move laterally,” says Mounir Hahad, senior director and head of Cyphort Labs. “You could do a lot of damage this way.” By lateral movement, Hahad means that NotPetya is designed to spread within local networks from computer to computer, devastating organizations.</p>
<p>Hahad and colleagues have been studying samples of NotPetya in their sandboxed network and <a shape="rect" href="https://www.cyphort.com/petrwrap-ransomware-wave-wake-wannacry/" target="_blank">posted their findings</a> on Cyphort’s blog earlier this week. Hahad says that NotPetya is a kind of mashup piece of malware that takes WannaCry’s ransomware approach and combines it with a 2016 piece of ransomware called <a shape="rect" href="https://en.wikipedia.org/wiki/Petya_(malware)" target="_blank">Petya</a>. NotPetya’s creators also threw three modules into the mix (one of which was <a shape="rect" href="https://en.wikipedia.org/wiki/EternalBlue" target="_blank">hacked from the NSA</a>) that effectively create a virulent spreading mechanism for the malware.</p>
<p>It’s this last part that Hahad says could be further mutated to make more dangerous attacks still.</p>
<p>
<a shape="rect" href="http://spectrum.ieee.org/tech-talk/computing/it/wannacry-updates-microsoft-touts-digital-geneva-convention-to-thwart-future-cyberattacks" target="_blank">WannaCry</a>, he says, encrypted a user’s files in affected computers and on mounted disks attached to those computers. Then it flashed the now famous <a shape="rect" href="https://en.wikipedia.org/wiki/File:Wana_Decrypt0r_screenshot.png" target="_blank">warning screen</a> that demanded payment in Bitcoin to decrypt the files.</p>
<p>NotPetya does all this too, upon infection of a system through a hacked “update” to accounting software from a Ukrainian software company. And if NotPetya were pure ransomware—designed to maximize the number of ransom payments—it might have stopped there. But NotPetya can also, depending on the level of access it has, mount a further devastating attack by rewriting a hard drive’s so-called <a shape="rect" href="https://en.wikipedia.org/wiki/Master_boot_record" target="_blank">master boot record</a>, which tells the computer what operating system to run and where to find it.</p>
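<p>The master boot record is the first 512-byte sector of the disk, and its last two bytes must hold the signature 0x55 0xAA for the BIOS to treat the disk as bootable. A toy check of that invariant, run here on synthetic bytes rather than a real disk (this is an illustration of the format, not a recovery tool):</p>

```python
# The MBR occupies the disk's first 512-byte sector; byte offsets
# 510-511 must contain the boot signature 0x55 0xAA. Malware that
# rewrites this sector, as NotPetya does, breaks normal booting.

def looks_like_mbr(sector: bytes) -> bool:
    return len(sector) == 512 and sector[510:512] == b"\x55\xaa"

# Synthetic examples -- not read from real hardware:
healthy = bytes(510) + b"\x55\xaa"
overwritten = bytes(512)  # signature destroyed
print(looks_like_mbr(healthy))      # True
print(looks_like_mbr(overwritten))  # False
```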
<p>A hacked computer running this second encryption routine will display a misleading boot screen telling the user it’s trying to “repair” the hard drive’s file system. It says, “WARNING: DO NOT TURN OFF YOUR PC! IF YOU ABORT THIS PROCESS, YOU COULD DESTROY ALL OF YOUR DATA! PLEASE ENSURE THAT YOUR POWER CABLE IS PLUGGED IN!”</p>
<p>Users who, understandably, heed the dire warning are unfortunately allowing the computer time to both encrypt the disk and search for ways to infect other systems within the computer’s local area network.</p>
<p>Ultimately the process completes and puts up a <a shape="rect" href="https://en.wikipedia.org/wiki/File:PetyaA.jpg" target="_blank">text-only screen</a> that tells the user to send $300 in Bitcoin to a fixed address and then to send an email to an address where one can allegedly receive the decryption key.</p>
<p>Hahad says he’s not aware of anyone actually being able to decrypt their systems. In any event, the email address (which reportedly is the same address on every infected computer) was disabled by the ISP soon after NotPetya began spreading.</p>
<p>“The only communication with these threat actors was going to be through that one email account that got terminated pretty quickly by the ISP,” he says. “The second mistake was the fact that there’s a single Bitcoin wallet. That’s the way of tracking who’s making the payments and who isn’t. So if somebody posted a payment to that wallet, anybody else could say, ‘Hey, I’m the one who posted that payment, give me my key.’ There are multiple flaws with the payment method, which clearly indicates that those guys may not have been interested in generating revenue.”</p>
<p>The most original part of NotPetya, Hahad says, is its method of propagating itself within a local network that could infect many other computers within an organization.</p>
<p>“Previous ransomware was mostly targeting the computers they hit via phishing campaigns, and when they got really sophisticated, they started looking for mounted drives on your laptop and encrypted those as well,” he says. “This one goes well beyond that. It’s jumping the gap between your computer and other computers in your organization. So that’s a level above the typical ransomware that we’ve been seeing. So it definitely requires more attention.”</p>
<p>The gap-jumping mechanism, Hahad says, involved three known Windows exploits, including the EternalBlue hack that the NSA allegedly developed but kept from Microsoft until a hack of NSA systems threatened to compromise its secrecy.</p>
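<p>EternalBlue abused Microsoft's SMB file-sharing service, which listens on TCP port 445, so one blunt defensive check is simply asking which machines on a local subnet expose that port. A hedged audit sketch follows; the subnet address is an example value, and a real audit would need authorization and proper tooling, not a bare port probe.</p>

```python
import ipaddress
import socket

# Defensive sketch: enumerate a local /24 and note which hosts expose
# TCP port 445 (SMB, the service EternalBlue-style spreading abuses).
# Example subnet only -- run such scans solely on networks you own.

def smb_exposed(host: str, timeout: float = 0.2) -> bool:
    """True if the host accepts a TCP connection on port 445."""
    try:
        with socket.create_connection((host, 445), timeout=timeout):
            return True
    except OSError:
        return False

def audit_subnet(cidr: str) -> list:
    hosts = [str(h) for h in ipaddress.ip_network(cidr).hosts()]
    return [h for h in hosts if smb_exposed(h)]

# A /24 has 254 usable host addresses to probe:
print(len(list(ipaddress.ip_network("192.168.1.0/24").hosts())))  # 254
```

<p>Any host that shows up in such an audit is one an SMB-spreading worm could reach, which is why closing port 445 at network boundaries was standard advice after WannaCry.</p>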
<p>Hahad says that a system patched with the latest Windows updates won’t be susceptible to the “lateral spreading” he described via the EternalBlue exploit. And users who do not perform regular backups of their systems will simply lose their files, with no way of recovering them, he says.</p>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/J0uHxK7wHcw" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 18:00:00 GMT</pubDate>
<dc:creator>Mark Anderson</dc:creator>
<guid isPermaLink="false">http://spectrum.ieee.org/tech-talk/computing/it/notpetya-latest-ransomware-is-a-warning-note-from-the-future</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMTQ4NA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMTQ4Mg.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/tech-talk/computing/it/notpetya-latest-ransomware-is-a-warning-note-from-the-future</feedburner:origLink></item>
<item>
<title>Shaping Smarter Cities</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/cW3hn-QPoPw/shaping-smarter-cities</link>
<description>Mouser and Grant Imahara team up with the creative minds at WIRED Brand Lab to take a look at the modern city</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Mouser and Grant Imahara team up with the creative minds at WIRED Brand Lab to take a look at the modern city<figure>
<img src="http://spectrum.ieee.org/image/MjkyMTM2OQ.png"/>
</figure>
<div>
<div class="article-detail">
   
 <p>
<iframe scrolling="auto" allowfullscreen="allowfullscreen" src="https://www.youtube.com/embed/zFG8sSZYm4o" width="560" frameborder="0" height="315"/>
<br clear="none"/>
<span>Mouser and Grant Imahara team up with the creative minds at WIRED Brand Lab to take a look at the modern city. We’re traveling the world to see and learn from the innovators and progressive companies creating a more livable future for our cities. Get ready to explore new ideas and discover what a “smart” future may hold.</span>
</p>
<p>What role does technology play in making cities of the future smarter and more efficient? In this 5-part video series, Grant Imahara travels the globe to explore where humanity is heading and what companies are driving us there.</p>
<p>In the next video, Grant Imahara will visit Porto, Portugal, to meet with Veniam, a local company that is transforming the city into a Wi-Fi mesh network made up of mobile hotspots.</p>
<p>Then in Tokyo, Japan, get ready to explore a new concept in indoor farming. Grant Imahara visits Mirai inside a repurposed former semiconductor fabrication factory. You won't believe what ideas are growing inside the world's largest indoor farm.</p>
<p>The last stop on the tour will be Los Angeles, California. Here we will discover new realities in Augmented Reality (AR). Grant Imahara will visit DAQRI, a company in the City of Dreams developing AR products and technologies that enhance human capabilities in manufacturing applications.</p>
<p>Be sure to also take a look at Wired’s article <strong>Shaping Smarter Cities: Assessing Global Challenges</strong> for a look at how technology will address the challenges across the globe.</p>
<p>Learn more about Mouser’s <a shape="rect" href="http://www.mouser.com/empowering-innovation/smarter-cities/?utm_source=IEEEGlobalSpecFeaturedVideoNewsletter&amp;utm_medium=display&amp;utm_campaign=mouser&amp;utm_content=smarter-cities-600x74">Empowering Innovation Together </a>campaign. </p>
</div>
<div class="article-detail"/>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/cW3hn-QPoPw" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:04:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/at-work/innovation/shaping-smarter-cities</guid>
<media:content url="http://spectrum.ieee.org/image/MjkyMTM4Mg.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkyMTM4MA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/at-work/innovation/shaping-smarter-cities</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Signetics NE555</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/wGGizCsPgQc/chip-hall-of-fame-signetics-ne555</link>
<description>A humble timer chip that became the Swiss Army knife of countless circuits</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>A humble timer chip that became the Swiss Army knife of countless circuits<figure>
<img src="http://spectrum.ieee.org/image/Mjg5ODEwMw.jpeg"/>
<figcaption>Hans Camenzind</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<style type="text/css">@media screen and (max-width:767px){
	.carousel-inner {
		height:410px!important;
	}
}
</style>
<div class="article-detail">
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/Mjg5ODA2Mg.jpeg" alt="chip"/>
<figcaption class="hi-cap">
   Photo: Hans Camenzind
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>NE555</h3>
<p>
<strong>Manufacturer: </strong>Signetics</p>
<p>
<strong>Category: </strong>Logic</p>
<p>
<strong>Year: </strong>1971</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>It was the summer of 1970, and chip designer Hans Camenzind was working as a consultant to Signetics, a Silicon Valley semiconductor firm. The economy was tanking. He was making less than US $15,000 a year and had a wife and four children at home. He really needed to invent something good.</p>
<p>And so he did. One of the greatest chips of all time, in fact. The 555 was a simple-to-use IC that could function as a timer or an oscillator. Still popular today, the chip was a smash hit, winding up in kitchen appliances, toys, spacecraft, and a few thousand other things.</p>
<p>“And it almost didn’t get made,” recalled Camenzind for <em>IEEE Spectrum</em> a few years before he <a shape="rect" href="http://www.eetimes.com/document.asp?doc_id=1262353">died in 2012</a>.</p>
<p>The idea for the 555 came to him when he was working on a circuit called a phase-locked loop. With some modifications, the circuit could work as a simple timer: You’d trigger it and it would run for a certain period. Simple as it may sound, there was nothing like that around.</p>
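<p>In the 555's astable (free-running oscillator) mode, two resistors and a capacitor set the output frequency and duty cycle, using the standard timing formulas from the chip's datasheet. A quick Python sketch of those formulas:</p>

```python
# Standard 555 astable-mode timing formulas (per the datasheet):
#   t_high = 0.693 * (R1 + R2) * C     (output high while C charges)
#   t_low  = 0.693 * R2 * C            (output low while C discharges)
#   f     ~= 1.44 / ((R1 + 2*R2) * C)

def astable_555(r1_ohms: float, r2_ohms: float, c_farads: float):
    """Return (frequency in Hz, duty cycle) for a 555 in astable mode."""
    t_high = 0.693 * (r1_ohms + r2_ohms) * c_farads
    t_low = 0.693 * r2_ohms * c_farads
    freq = 1.0 / (t_high + t_low)
    duty = t_high / (t_high + t_low)
    return freq, duty

# Example: R1 = 1 k, R2 = 10 k, C = 10 uF -> a roughly 6.9 Hz blinker
f, d = astable_555(1_000, 10_000, 10e-6)
print(round(f, 1), round(d, 2))  # prints: 6.9 0.52
```

<p>Being able to dial in a frequency with two resistors and a capacitor is a big part of why the chip ended up in so many appliances and toys.</p>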
<p>At first, Signetics’ engineering department rejected the idea. The company was already selling components that customers could combine to make timers. That could have been the end of it. But Camenzind insisted. He went to Art Fury, Signetics’ marketing manager. Fury liked it.</p>
<p>Camenzind spent nearly a year testing breadboard prototypes, drawing the circuit components on paper, and cutting sheets of Rubylith masking film. “It was all done by hand, no computer,” he said. His final design had 23 transistors, 16 resistors, and 2 diodes.</p>
<div id="392144247" class="carousel slide">
<div class="carousel-inner">
<div class="item active">
<img src="http://spectrum.ieee.org/image/MjkxMjI3MQ.jpeg" alt="One of the keys to the 555’s success was that designers managed to get the circuit down to requiring just eight pins, which meant a small and compact package as seen in this early version." data-original="/image/MjkxMjI3MQ.jpeg" id="392144247_0"/>
<span class="item-num">1/3</span>
<div class="carousel-caption">
<p>One of the keys to the 555’s success was that designers managed to get the circuit down to requiring just eight pins, which meant a small and compact package as seen in this early version. <em>Photo: Mark Richards/Computer History Museum</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/Mjg5ODA5OA.jpeg" alt="In the end it was the intuition of one engineer—Hans Camenzind that led to one of the most successful chips of all time." data-original="/image/Mjg5ODA5OA.jpeg" id="392144247_1"/>
<span class="item-num">2/3</span>
<div class="carousel-caption">
<p>In the end it was the intuition of one engineer—Hans Camenzind that led to one of the most successful chips of all time. <em>Photo: Hans Camenzind</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/Mjg5ODA4MQ.jpeg" alt="Fans of the 555 can build a working replica of the timer out of discrete transistors." data-original="/image/Mjg5ODA4MQ.jpeg" id="392144247_2"/>
<span class="item-num">3/3</span>
<div class="carousel-caption">
<p>Fans of the 555 can build a working replica of the timer out of discrete transistors. <em>Photo: Evil Mad Scientist Laboratories</em>
</p>
</div>
</div>
</div>
<a shape="rect" data-slide="prev" href="#392144247" class="left carousel-control">
<span class="glyphicon glyphicon-chevron-left"/>
</a>
<a shape="rect" data-slide="next" href="#392144247" class="right carousel-control">
<span class="glyphicon glyphicon-chevron-right"/>
</a>
<ol class="carousel-indicators">
<li data-target="#392144247" data-slide-to="0" class="active"/>
<li data-target="#392144247" data-slide-to="1" class=""/>
<li data-target="#392144247" data-slide-to="2" class=""/>
</ol>
<script>
                $(document).ready(function(){
                    $('#392144247').carousel({
                        pause: true,
                        interval: false
                    });
                });
</script>
</div>
<p>When the 555 hit the market in 1971, it was a sensation. In 1975 Signetics was absorbed by Philips Semiconductors, now <a shape="rect" href="http://www.nxp.com/">NXP</a>, which says that many billions have been sold. Engineers still use the 555 to create useful electronic modules as well as less useful things, like “Knight Rider”–style lights for car grilles. And for hard-core 555 fans, you can even build a drop-in “macrocircuit” replica kit, <a shape="rect" href="http://spectrum.ieee.org/geek-life/hands-on/build-your-own-giant-555-timer-chip">assembled out of discrete transistors</a>.</p>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/wGGizCsPgQc" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/devices/chip-hall-of-fame-signetics-ne555</guid>
<media:content url="http://spectrum.ieee.org/image/Mjg5ODEwOQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/Mjg5ODEwNw.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/devices/chip-hall-of-fame-signetics-ne555</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Intersil ICL8038 Waveform Generator</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/_KPpb0GOhXQ/chip-hall-of-fame-intersil-icl8038-waveform-generator</link>
<description>Intersil’s somewhat cranky chip brought complex sound generation to consumer electronics</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Intersil’s somewhat cranky chip brought complex sound generation to consumer electronics<figure>
<img src="http://spectrum.ieee.org/image/MjkxMTg0MQ.jpeg"/>
<figcaption>Illustration: Intersil</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
  </script>
<style type="text/css">.article-page #side-module {
       top: 300px!important;
}
</style>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkxMTgzOQ.jpeg" alt="Intersil ICL8038 Waveform Generator"/>
<figcaption class="hi-cap">
   Image: Intersil
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>ICL8038 Waveform Generator</h3>
<p>
<strong>Manufacturer: </strong>Intersil</p>
<p>
<strong>Category: </strong>Amplifiers and Audio</p>
<p>
<strong>Year: </strong>circa 1983</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>A good basic waveform—an electrical voltage varying in time—is the raw material from which much more complex behavior can be constructed. Intersil’s <a shape="rect" href="http://www.intersil.com/en/products/other-analog/special-analog/other-miscellaneous/ICL8038.html">ICL8038</a> integrated circuit was designed as a convenient, precise source of such waveforms: it could simultaneously generate sine, square, and sawtooth outputs with only a few supporting external components.</p>
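For illustration only (the analog 8038 shapes its waveforms continuously; it does not compute samples), the three output shapes can be sketched digitally:

```python
import math

def one_cycle(n=8):
    """Return n evenly spaced samples of one cycle of sine, square, and sawtooth waves."""
    sine, square, saw = [], [], []
    for i in range(n):
        phase = i / n                                # fraction of the cycle, 0.0 up to (n-1)/n
        sine.append(math.sin(2 * math.pi * phase))
        square.append(1.0 if phase < 0.5 else -1.0)  # high for the first half, low for the second
        saw.append(2.0 * phase - 1.0)                # linear ramp from -1 toward +1
    return sine, square, saw
```

Having all three shapes phase-locked from one source is exactly what made a single-chip function generator attractive.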
<p>Initially critics scoffed at the 8038’s limited performance and propensity for behaving erratically. And the chip was indeed a bit temperamental. But engineers soon learned how to use it reliably, and the 8038 became a major hit, eventually selling into the hundreds of millions and finding its way into countless applications—including later versions of the “<a shape="rect" href="http://spectrum.ieee.org/telecom/standards/phreaking-out-ma-bell">blue boxes” that phreakers used</a>
<span>into the 1980s </span>to beat the phone companies. Intersil discontinued the 8038 in 2002, but hobbyists still seek it today to make things like homemade function generators and modular analog synthesizers.</p>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/_KPpb0GOhXQ" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/devices/chip-hall-of-fame-intersil-icl8038-waveform-generator</guid>
<media:content url="http://spectrum.ieee.org/image/MjkxMTg0OQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkxMTg0Nw.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/devices/chip-hall-of-fame-intersil-icl8038-waveform-generator</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: STMicroelectronics STA2056 GPS Receiver</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/UUGyt3wD6Bc/chip-hall-of-fame-stmicroelectronics-sta2056-gps-receiver</link>
<description>Inexpensive and small, this GPS receiver turbocharged the market for integrated navigation in mobile devices</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Inexpensive and small, this GPS receiver turbocharged the market for integrated navigation in mobile devices<figure>
<img src="http://spectrum.ieee.org/image/MjkxMTgyNw.jpeg"/>
<figcaption>Photo: STMicroelectronics</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<style type="text/css">.article-page #side-module {
       top: 342px!important;
}
</style>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkxMTgyNQ.jpeg" alt="STA2056 GPS Receiver chip"/>
<figcaption class="hi-cap">
   Photo: STMicroelectronics
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>STA2056 GPS Receiver</h3>
<p>
<strong>Manufacturer: </strong>STMicroelectronics</p>
<p>
<strong>Category: </strong>Wireless</p>
<p>
<strong>Year: </strong>2004</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>A time-honored design stunt in the world of chipmaking is the kill-two-chips-with-one-chip move. Back in 2004, <a shape="rect" href="http://www.st.com/content/st_com/en.html">STMicroelectronics</a> did it with GPS receivers. Previously, one chip housed the GPS radio front end, which picks up the navigation signals sent from orbiting GPS satellites, while a second chip contained a microprocessor, some memory, and a signal correlator—GPS determines the location of each receiver by comparing the signals from multiple satellites. With the <a shape="rect" href="http://www.st.com/content/ccc/resource/technical/document/data_brief/f3/4c/27/44/36/29/44/91/CD00043779.pdf/files/CD00043779.pdf/jcr:content/translations/en.CD00043779.pdf">STA2056</a> [PDF], these two chips were smashed together. Although handheld GPS systems were already on the market, the STA2056 set a new standard for size and power consumption. And at US $8, the chip was cheap, driving the cost of GPS devices down and helping open up a mass market for them. Fiat used the chip in several Alfa Romeo models, and GPS vendor Becker put it in its handsets. It also helped propel the notion of GPS as something that could be integrated into devices and not just used as a standalone product or module. Today almost every phone—and quite a few watches—has a GPS chip, typically used in concert <a shape="rect" href="http://spectrum.ieee.org/telecom/wireless/new-indoor-navigation-technologies-work-where-gps-cant">with other techniques like Wi-Fi beacon mapping</a> to allow navigation even when a satellite isn’t in view. And, of course, the two-chips-into-one trick remains a favorite of chipmakers everywhere.</p>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/UUGyt3wD6Bc" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/aerospace/satellites/chip-hall-of-fame-stmicroelectronics-sta2056-gps-receiver</guid>
<media:content url="http://spectrum.ieee.org/image/MjkxMTgzNQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkxMTgzMw.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/aerospace/satellites/chip-hall-of-fame-stmicroelectronics-sta2056-gps-receiver</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Texas Instruments Digital Micromirror Device</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/Wis2u6GG-aw/chip-hall-of-fame-texas-instruments-digital-micromirror-device</link>
<description>An Oscar-winning invention brought digital video to movie theaters</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>An Oscar-winning invention brought digital video to movie theaters<figure>
<img src="http://spectrum.ieee.org/image/MjkwOTU2Mg.jpeg"/>
<figcaption>Image: Larry Hornbeck</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<style type="text/css">@media screen and (max-width:767px){
	.carousel-inner {
		height:390px!important;
	}
}
</style>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwOTU2MA.jpeg" alt="Digital Micromirror Device chip"/>
<figcaption class="hi-cap">
   Image: Larry Hornbeck
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>Digital Micromirror Device</h3>
<p>
<strong>Manufacturer: </strong>Texas Instruments</p>
<p>
<strong>Category: </strong>MEMS and Sensors</p>
<p>
<strong>Year: </strong>1987</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>On 18 June 1999, Larry Hornbeck took his wife, Laura, on a date. They went to watch <em>Star Wars: Episode 1—The Phantom Menace</em> at a theater in Burbank, Calif. Not that the graying engineer was an avid Jedi fan. The reason they were there was actually the projector. At the heart of the projector was a chip—the digital micromirror device—that Hornbeck had invented at <a shape="rect" href="http://www.ti.com/">Texas Instruments</a>. A DMD uses millions of hinged microscopic mirrors to direct light through a projection lens. The <em>Phantom Menace </em>screening was “the first digital exhibition of a major motion picture,” says Hornbeck, a TI Fellow. Today movie projectors based on this digital light-processing technology—or DLP, as TI branded it—are used in thousands of theaters. It’s also integral to rear-projection TVs, office projectors, and tiny projectors for cellphones. “To paraphrase Houdini,” Hornbeck says, “micromirrors, gentlemen. The effect is created with micromirrors.” For his efforts, Hornbeck was ultimately <a shape="rect" href="http://spectrum.ieee.org/view-from-the-valley/computing/software/the-oscar-goes-to-engineer-larry-hornbeck-and-his-digital-micromirrors">awarded an Oscar</a>—unlike <em>The Phantom Menace</em>.</p>
<div id="298793298" class="carousel slide">
<div class="carousel-inner">
<div class="item active">
<img src="http://spectrum.ieee.org/image/MjkwOTU3Mg.jpeg" alt="An early prototype digital mirror device. Descendants of this chip can be found in projectors worldwide." data-original="/image/MjkwOTU3Mg.jpeg" id="298793298_0"/>
<span class="item-num">1/3</span>
<div class="carousel-caption">
<p>An early prototype digital mirror device. Descendants of this chip can be found in projectors worldwide. <em>Photo: Larry Hornbeck</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/MjkwOTU3NQ.jpeg" alt="The first production digital mirror device." data-original="/image/MjkwOTU3NQ.jpeg" id="298793298_1"/>
<span class="item-num">2/3</span>
<div class="carousel-caption">
<p>The first production digital mirror device. <em>Photo: Texas Instruments</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/MjkwOTU3OA.jpeg" alt="Larry Hornbeck receiving his Oscar, proof that all the really cool stuff happens at the Scientific and Technical Awards." data-original="/image/MjkwOTU3OA.jpeg" id="298793298_2"/>
<span class="item-num">3/3</span>
<div class="carousel-caption">
<p>Larry Hornbeck receiving his Oscar, proof that all the really cool stuff happens at the Scientific and Technical Awards. <em>Photo: Michael Yada/A.M.P.A.S.</em>
</p>
</div>
</div>
</div>
<a shape="rect" data-slide="prev" href="#298793298" class="left carousel-control">
<span class="glyphicon glyphicon-chevron-left"/>
</a>
<a shape="rect" data-slide="next" href="#298793298" class="right carousel-control">
<span class="glyphicon glyphicon-chevron-right"/>
</a>
<ol class="carousel-indicators">
<li data-target="#298793298" data-slide-to="0" class="active"/>
<li data-target="#298793298" data-slide-to="1" class=""/>
<li data-target="#298793298" data-slide-to="2" class=""/>
</ol>
<script>
                $(document).ready(function(){
                    $('#298793298').carousel({
                        pause: true,
                        interval: false
                    });
                });
</script>
</div>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/Wis2u6GG-aw" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/optoelectronics/chip-hall-of-fame-texas-instruments-digital-micromirror-device</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwOTU3MA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwOTU2OA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/optoelectronics/chip-hall-of-fame-texas-instruments-digital-micromirror-device</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Intel 8088 Microprocessor</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/1ezjt304ys0/chip-hall-of-fame-intel-8088-microprocessor</link>
<description>The “castrated” processor that birthed the IBM PC</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>The “castrated” processor that birthed the IBM PC<figure>
<img src="http://spectrum.ieee.org/image/MjkwOTUwMw.jpeg"/>
<figcaption>Image: Intel</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<style type="text/css">@media screen and (max-width:767px){
	.carousel-inner {
		height:400px!important;
	}
}
</style>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwOTUwMQ.jpeg" alt="8088 Microprocessor chip"/>
<figcaption class="hi-cap">
   Image: Intel
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<aside class="inlay statbox lt sm">
<h3>8088 Micro-processor</h3>
<p>
<strong>Manufacturer: </strong>Intel</p>
<p>
<strong>Category: </strong>Processors</p>
<p>
<strong>Year: </strong>1979</p>
</aside>
<p>Was there any one chip that propelled Intel into the Fortune 500? Intel says there was: the 8088. This was the 16-bit CPU that IBM chose for its original line of PCs, which went on to dominate the desktop computer market.</p>
<p>In an odd twist of fate, the chip that established what would become known as the <em>x</em>86 architecture didn’t have a name appended with an “86.” The 8088 was basically a slightly modified 8086, Intel’s first 16-bit CPU. Or as Intel engineer and 8086 designer <a shape="rect" href="http://www.stevemorse.org/8086/index.html">Stephen Morse</a> once put it, the 8088 was “a castrated version of the 8086.” That’s because the new chip’s main innovation wasn’t exactly a step forward in technical terms: The 8088 processed data internally in 16-bit chunks, but it used an 8-bit external data bus.</p>
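The consequence of that narrowed bus is easy to sketch: every 16-bit word has to cross the external bus in two 8-bit transfers. This toy model (low byte first, matching x86’s little-endian byte order) is illustrative only, not a cycle-accurate account of the 8088’s bus interface unit:

```python
def split_for_8bit_bus(word):
    """Break a 16-bit value into the two transfers an 8-bit data bus needs."""
    assert 0 <= word <= 0xFFFF
    return [word & 0xFF, (word >> 8) & 0xFF]   # low byte, then high byte

def reassemble(low, high):
    """What the bus interface does on the way back in."""
    return low | (high << 8)
```

Half the bus width meant half the data per transfer, but also fewer pins and cheaper 8-bit support chips—the trade IBM was happy to make.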
<p>Intel managers kept the 8088 project under wraps until the 8086 design was mostly complete. “Management didn’t want to delay the 8086 by even a day by even telling us they had the 8088 variant in mind,” says <a shape="rect" href="https://www.linkedin.com/in/peter-stoll-327b34ab/">Peter Stoll</a>, a lead engineer for the 8086 project who did some work on the 8088.</p>
<figure role="promo" class="rt med">
<a shape="rect" rel="lightbox" href="http://spectrum.ieee.org/image/MjkwOTUxNg.jpeg" class="zoom">
<img src="http://spectrum.ieee.org/image/MjkwOTUyOA.jpeg" alt="img"/>
<span class="magnifier"> </span>
</a>
<div class="ai">
<figcaption class="hi-cap">
    Photo: Intel
   </figcaption>
<figcaption>
    With some favorable early press, Intel’s PR department was sure it had a winner.
   </figcaption>
</div>
</figure>
<p>It was only after the first functional 8086 came out that Intel shipped the 8086 artwork and documentation to a design unit in Haifa, Israel, where two engineers, Rafi Retter and Dany Star, altered the chip to an 8-bit bus.</p>
<p>The modification proved to be one of Intel’s best decisions. The 29 000-transistor 8088 CPU required fewer, less expensive support chips than the 8086 and had “full compatibility with 8-bit hardware, while also providing faster processing and a smooth transition to 16-bit processors,” as Intel’s Robert Noyce and Ted Hoff wrote in a 1981 article for <em>IEEE Micro</em> magazine.</p>
<p>The first PC to use the 8088 was IBM’s Model 5150, a monochrome machine that cost US $3,000. Now almost all the world’s PCs are built around CPUs that can claim the 8088 as an ancestor. Not bad for a castrated chip.</p>
<div id="479555721" class="carousel slide">
<div class="carousel-inner">
<div class="item active">
<img src="http://spectrum.ieee.org/image/MjkwOTUzMQ.jpeg" alt="Only 8 of the 8088’s pins were used for sending data back and forth with other chips, even though internally the processor could handle data that was 16 bits wide." data-original="/image/MjkwOTUzMQ.jpeg" id="479555721_0"/>
<span class="item-num">1/3</span>
<div class="carousel-caption">
<p>Only 8 of the 8088’s pins were used for sending data back and forth with other chips, even though internally the processor could handle data that was 16 bits wide. <em>Photo: Konstantin Lanzet/Wikipedia</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/MjkwOTUzNA.jpeg" alt="The 8088 found massive success as the heart of IBM’s seminal PC, launched in 1981." data-original="/image/MjkwOTUzNA.jpeg" id="479555721_1"/>
<span class="item-num">2/3</span>
<div class="carousel-caption">
<p>The 8088 found massive success as the heart of IBM’s seminal PC, launched in 1981. <em>Photo: Intel</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/MjkwOTUzNw.jpeg" alt="Just as the IBM PC was the “killer app” for the 8088 chip, the spreadsheet was the killer app for the IBM PC, propelling it into offices worldwide." data-original="/image/MjkwOTUzNw.jpeg" id="479555721_2"/>
<span class="item-num">3/3</span>
<div class="carousel-caption">
<p>Just as the IBM PC was the “killer app” for the 8088 chip, the spreadsheet was the killer app for the IBM PC, propelling it into offices worldwide. <em>Photo: Intel</em>
</p>
</div>
</div>
</div>
<a shape="rect" data-slide="prev" href="#479555721" class="left carousel-control">
<span class="glyphicon glyphicon-chevron-left"/>
</a>
<a shape="rect" data-slide="next" href="#479555721" class="right carousel-control">
<span class="glyphicon glyphicon-chevron-right"/>
</a>
<ol class="carousel-indicators">
<li data-target="#479555721" data-slide-to="0" class="active"/>
<li data-target="#479555721" data-slide-to="1" class=""/>
<li data-target="#479555721" data-slide-to="2" class=""/>
</ol>
<script>
                $(document).ready(function(){
                    $('#479555721').carousel({
                        pause: true,
                        interval: false
                    });
                });
</script>
</div>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/1ezjt304ys0" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-intel-8088-microprocessor</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwOTUwOA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwOTUwNg.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-intel-8088-microprocessor</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Western Digital WD1402A UART</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/dOV-30Hkv2Y/chip-hall-of-fame-western-digital-wd1402a-uart</link>
<description>Freeing processors from doing the grunt work of communications accelerated the connected world</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Freeing processors from doing the grunt work of communications accelerated the connected world<figure>
<img src="http://spectrum.ieee.org/image/MjkwOTQ0Ng.jpeg"/>
<figcaption>Illustration: Western Digital Corp.</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwOTQ0NA.jpeg" alt="WD1402A UART chip"/>
<figcaption class="hi-cap">
   Image: Western Digital Corp.
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>WD1402A UART</h3>
<p>
<strong>Manufacturer: </strong>Western Digital</p>
<p>
<strong>Category: </strong>Interfacing</p>
<p>
<strong>Year: </strong>1971</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>
<a shape="rect" href="https://gordonbell.azurewebsites.net/">Gordon Bell </a>is famous for launching the PDP series of minicomputers at Digital Equipment Corp. in the 1960s. These ushered in the era of networked and interactive computing that would come to full flower with the introduction of the personal computer in the 1970s. But while minicomputers as a distinct class now belong to the history books, Bell also invented a lesser-known but no less significant piece of technology that’s still in action all over the world: the universal asynchronous receiver/transmitter, or UART.</p>
<p>A UART lets two digital devices communicate by sending bits one at a time over a serial interface, without bothering either device’s primary processor with the details.</p>
<p>Today, more sophisticated serial setups are available, such as the ubiquitous USB standard, but for a time UARTs ruled supreme as the way to, for example, connect modems to PCs. And the simple UART still has its place, not least as the communication method of last resort with a lot of modern network equipment.</p>
<p>The UART was invented because of Bell’s own need to connect a <a shape="rect" href="https://en.wikipedia.org/wiki/Teleprinter">Teletype</a> to a PDP-1, a task that required converting parallel signals into serial signals. He cooked up a circuit that used some 50 discrete components. The idea proved popular and Western Digital, a small company making calculator chips, offered to create a single-chip version of the UART. Western Digital founder <a shape="rect" href="https://www.youtube.com/watch?v=gRAdGCliWJw">Al Phillips</a> still remembers when his vice president of engineering showed him the Rubylith sheets with the design, ready for fabrication. “I looked at it for a minute and spotted an open circuit,” Phillips says. “The VP got hysterical.” Western Digital introduced the WD1402A around 1971, and other versions soon followed.</p>
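The job a basic UART performs can be sketched in a few lines. This toy models the common 8-N-1 framing (one start bit, eight data bits sent least-significant bit first, one stop bit); it illustrates the idea, not any particular chip:

```python
def uart_frame(byte):
    """Serialize one byte as an 8-N-1 UART frame."""
    bits = [0]                                   # start bit pulls the idle-high line low
    bits += [(byte >> i) & 1 for i in range(8)]  # data bits, least-significant first
    bits.append(1)                               # stop bit returns the line to idle
    return bits

def uart_unframe(bits):
    """Reassemble the byte from a 10-bit frame (the receiving side's half of the job)."""
    assert bits[0] == 0 and bits[9] == 1, "bad start or stop bit"
    return sum(b << i for i, b in enumerate(bits[1:9]))
```

Doing this shifting in dedicated hardware, rather than in software on the main processor, is the whole point of the chip.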
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/dOV-30Hkv2Y" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/computing/hardware/chip-hall-of-fame-western-digital-wd1402a-uart</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwOTQ1NA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwOTQ1Mg.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/computing/hardware/chip-hall-of-fame-western-digital-wd1402a-uart</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Toshiba NAND Flash Memory</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/AnVqR9K9VT0/chip-hall-of-fame-toshiba-nand-flash-memory</link>
<description>Once, all bulk data storage was magnetic. Then along came flash</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Once, all bulk data storage was magnetic. Then along came flash<figure>
<img src="http://spectrum.ieee.org/image/MjkwOTM1Nw.jpeg"/>
<figcaption>Photo: Fujio Masuoka</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwOTM1NQ.jpeg" alt="Toshiba NAND Flash Memory chip"/>
<figcaption class="hi-cap">
   Image: Fujio Masuoka
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>NAND Flash Memory</h3>
<p>
<strong>Manufacturer: </strong>Toshiba</p>
<p>
<strong>Category: </strong>Memory and Storage</p>
<p>
<strong>Year: </strong>1989</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>The saga that is the invention of flash memory began when a <a shape="rect" href="http://us.toshiba.com/storage/home">Toshiba</a> factory manager named <a shape="rect" href="http://ethw.org/Fujio_Masuoka">Fujio Masuoka</a> decided he’d reinvent semiconductor memory. We’ll get to that in a minute. First, a bit of history is in order.</p>
<p>Before flash memory came along, the only way to store what passed for large amounts of data at the time was to use magnetic tapes, floppy disks, or hard disks. Many companies were trying to create solid-state alternatives, but the choices, such as EPROM (or erasable programmable read-only memory, which required ultraviolet light to erase the data) and EEPROM (the extra E stands for “electrically,” doing away with the UV), couldn’t store much data economically.</p>
<p>Enter Masuoka at Toshiba. In 1980, he recruited four engineers to a semisecret project aimed at designing a memory chip that could store lots of data and would be affordable. Their strategy was simple. “We knew the cost of the chip would keep going down as long as transistors shrank in size,” says Masuoka, now <a shape="rect" href="http://www.unisantis.com/about/professor-masuoka/">CTO of Unisantis Electronics</a>, in Tokyo.</p>
<p>Masuoka’s team came up with a variation of EEPROM that featured a memory cell consisting of a single transistor. At the time, conventional EEPROM needed two transistors per cell. It was a seemingly small difference that had a huge impact on cost.</p>
<p>In search of a catchy name, they settled on “flash” because of the chip’s ultrafast erasing capability. Now, if you’re thinking Toshiba rushed the invention into production and watched as the money poured in, you don’t know much about how huge corporations typically exploit internal innovations. As it turned out, Masuoka’s bosses at Toshiba told him to, well, erase the idea.</p>
<p>He didn’t, of course. In 1984 he presented a paper on his memory design at the IEEE International Electron Devices Meeting, in San Francisco. That prompted Intel to begin development of a type of flash memory based on NOR logic gates. In 1988, the company introduced a 256-kilobit chip that found use in vehicles, computers, and other mass-market items, creating a nice new business for Intel.</p>
<p>That finally pushed Toshiba to greenlight Masuoka’s invention. His flash chip was based on NAND technology, which offered greater storage densities but proved trickier to manufacture. Success came in 1989, when Toshiba’s first NAND flash hit the market. And just as Masuoka had predicted, prices kept falling.</p>
<p>Digital photography gave flash a big boost in the late 1990s, and Toshiba became one of the biggest players in a multibillion-dollar market. At the same time, though, Masuoka’s relationship with other executives soured, and he quit Toshiba. (He later sued for a share of the vast profits and won a cash payment.)</p>
<p>Today, NAND flash is a key piece of gadgets and <a shape="rect" href="http://pluto.jhuapl.edu/Mission/Spacecraft/Systems-and-Components.php">space probes</a> alike—and is beginning to replace even hard disks as the storage medium of choice in laptop and desktop computers.</p>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/AnVqR9K9VT0" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/memory/chip-hall-of-fame-toshiba-nand-flash-memory</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwOTM2NQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwOTM2Mw.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/memory/chip-hall-of-fame-toshiba-nand-flash-memory</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Transmeta Corp. Crusoe Processor</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/Tlok38ORxKA/chip-hall-of-fame-transmeta-corp-crusoe-processor</link>
<description>Ahead of its time, this chip heralded the mobile era when energy use, not processing power, would become the most important spec</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Ahead of its time, this chip heralded the mobile era when energy use, not processing power, would become the most important spec<figure>
<img src="http://spectrum.ieee.org/image/MjkwNzkzMA.jpeg"/>
<figcaption>Photo: Dave Ditzel</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
</div>
<div class="article-detail">
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkxMTc0OQ.jpeg" alt="Transmeta Corp. Crusoe Processor"/>
<figcaption class="hi-cap">
   Photo: Dave Ditzel
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>Crusoe Processor</h3>
<p>
<strong>Manufacturer: </strong>Transmeta Corp.</p>
<p>
<strong>Category: </strong>Processors</p>
<p>
<strong>Year: </strong>2000</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>With great power come great heat sinks. And short battery life. And crazy electricity consumption. Hence Transmeta’s goal of designing a low-power processor that’d put those hogs offered by Intel and AMD to shame. The plan: Software would translate <em>x</em>86 instructions on the fly into Crusoe’s own machine code, whose higher level of parallelism would save time and power. It was hyped as the greatest thing since sliced silicon, and for a while, it was. “Engineering wizards conjure up processor gold” was how <em>IEEE Spectrum</em>’s May 2000 cover put it. Crusoe and its successor, Efficeon, “proved that dynamic binary translation was commercially viable,” says <a shape="rect" href="https://www.linkedin.com/in/dave-ditzel-7aa72b3/">David Ditzel</a>, Transmeta’s cofounder, now at Esperanto Technologies. Unfortunately, he adds, the chips arrived several years before the market for low-power computers took off, and appeared in only a few products. In the end, while Transmeta did not deliver on its commercial promise, it did point the way toward a world in which a processor’s power use was as important as its raw power, and some of Transmeta’s technology found its way into Intel, AMD, and Nvidia chips.</p>
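<p>The translate-once-and-cache idea behind dynamic binary translation can be sketched in miniature: convert a block of guest instructions into host operations the first time it runs, cache the result, and reuse it thereafter. The two-instruction guest ISA below is invented purely for illustration and bears no relation to Transmeta’s actual translation software:</p>

```python
# Toy sketch of dynamic binary translation: guest instructions are
# translated into host callables once, cached, and reused. The "MOV"
# and "ADD" mini-ISA and register names are invented for illustration.

def translate(block):
    """Translate a block of guest instructions into host callables."""
    host_ops = []
    for op, *args in block:
        if op == "MOV":                      # MOV dst, immediate
            dst, imm = args
            host_ops.append(lambda r, d=dst, v=imm: r.__setitem__(d, v))
        elif op == "ADD":                    # ADD dst, src1, src2
            dst, a, b = args
            host_ops.append(lambda r, d=dst, x=a, y=b: r.__setitem__(d, r[x] + r[y]))
    return host_ops

translation_cache = {}

def run(block_id, block, regs):
    # Translate on first encounter; afterwards the cached host code runs
    # directly, which is where the speed and power savings come from.
    if block_id not in translation_cache:
        translation_cache[block_id] = translate(block)
    for host_op in translation_cache[block_id]:
        host_op(regs)
    return regs

regs = run("b0", [("MOV", "r0", 2), ("MOV", "r1", 3), ("ADD", "r2", "r0", "r1")], {})
print(regs["r2"])  # 5
```

<p>Running the same block a second time skips translation entirely and executes the cached host operations.</p>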
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwNzk0MQ.jpeg" alt="Spectrum cover"/>
<figcaption class="hi-cap">
   Photo: IEEE Spectrum
  </figcaption>
<figcaption>
   We here at 
   <em>Spectrum</em> were all aboard the Transmeta bandwagon. The cover photo included one of Transmeta’s most famous hires, Linus Torvalds, the creator of Linux [third from right].
  </figcaption>
</figure>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/Tlok38ORxKA" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-transmeta-corp-crusoe-processor</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwNzkzOA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwNzkzNg.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-transmeta-corp-crusoe-processor</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Texas Instruments TMS32010 Digital Signal Processor</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/gWHtYQ_lyYA/chip-hall-of-fame-texas-instruments-tms32010-digital-signal-processor</link>
<description>This chip put digital signal processors—specialists in handling the messy world outside the computer—on the map</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>This chip put digital signal processors—specialists in handling the messy world outside the computer—on the map<figure>
<img src="http://spectrum.ieee.org/image/MjkwNzkxNQ.jpeg"/>
<figcaption>Image: Texas Instruments</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<div class="article-detail">
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwNzkxMw.jpeg" alt="TMS32010 Digital Signal Processor"/>
<figcaption class="hi-cap">
   Image: Texas Instruments
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>TMS32010 Digital Signal Processor</h3>
<p>
<strong>Manufacturer: </strong>Texas Instruments</p>
<p>
<strong>Category: </strong>Processors</p>
<p>
<strong>Year: </strong>1983</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>The state of Texas has given us many great things, including the 10-gallon hat, chicken-fried steak, Dr Pepper, and perhaps less prominently, the TMS32010 digital signal processor (DSP) chip. DSPs are generally used to handle complex analog signals after they have been converted into a raw digital stream. This stream would overwhelm a general-purpose CPU, but DSPs can use specialized algorithms and hardware to process the stream into something the overall system can cope with.</p>
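<p>The bread-and-butter workload for a chip like this is the multiply-accumulate loop at the heart of a finite impulse response (FIR) filter, which is exactly what a fast hardware multiplier speeds up. A minimal sketch, with an arbitrary 3-tap moving-average filter chosen only for illustration:</p>

```python
# A minimal FIR filter: the multiply-accumulate workload DSPs are built
# to run fast. The 3-tap moving-average coefficients are illustrative.

def fir_filter(samples, coeffs):
    """Convolve an input stream with filter coefficients."""
    out = []
    history = [0.0] * len(coeffs)
    for x in samples:
        history = [x] + history[:-1]       # shift in the newest sample
        acc = 0.0
        for h, c in zip(history, coeffs):  # one multiply-accumulate per tap
            acc += h * c
        out.append(acc)
    return out

# Smooth a noisy step with a 3-tap moving average;
# the output ramps gradually toward the step value.
print(fir_filter([0, 0, 3, 3, 3], [1/3, 1/3, 1/3]))
```

<p>Each output sample costs one multiply-accumulate per tap, so a 200-nanosecond multiply directly sets how many filter taps, and how high a sample rate, a real-time system can afford.</p>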
<p>Created by <a shape="rect" href="http://www.ti.com/">Texas Instruments</a>, the TMS32010 wasn’t the first DSP (that’d be AT&amp;T/Western Electric’s <a shape="rect" href="http://www.computerhistory.org/collections/catalog/102740084">DSP1,</a> introduced in 1980), but it was surely the fastest. It could compute a multiply operation in 200 nanoseconds, a feat that made engineers all tingly. What’s more, it could execute instructions from both on-chip ROM and off-chip RAM, whereas competing chips had only canned DSP functions. “That made program development [for the TMS32010] flexible, just like with microcontrollers and microprocessors,” says <a shape="rect" href="http://www.witi.com/center/witimuseum/halloffame/150686/Wanda-Gass-Texas-Instruments-/">Wanda Gass</a>, a member of the DSP design team and IEEE Fellow. At US $500 apiece, the chip sold about 1,000 units the first year. Sales eventually ramped up, and the DSP became part of modems, medical devices, and military systems. Oh, and another application: <a shape="rect" href="http://www.dollinfo.com/wowjulie.htm">Worlds of Wonder’s Julie</a>, a Chucky-style creepy doll that could sing and talk (“Are we making too much noise?”). The chip was the first in a large DSP family that made—and continues to make—TI’s fortune.</p>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwNzkyNQ.jpeg" alt="doll"/>
<figcaption class="hi-cap">
   Photo: Janet M. Baker
  </figcaption>
<figcaption>
   1987 saw the introduction of the technologically advanced, but somewhat badly coiffed, interactive talking Julie doll.
  </figcaption>
</figure>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/gWHtYQ_lyYA" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-texas-instruments-tms32010-digital-signal-processor</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwNzkyMw.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwNzkyMQ.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-texas-instruments-tms32010-digital-signal-processor</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Kodak KAF-1300 Image Sensor</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/F31Iv_0UN6o/chip-hall-of-fame-kodak-kaf1300-image-sensor</link>
<description>The chip that brought digital photography outside the lab</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>The chip that brought digital photography outside the lab<figure>
<img src="http://spectrum.ieee.org/image/MjkwNzg5NA.jpeg"/>
<figcaption>Photo: Kodak</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
</div>
<style type="text/css">@media screen and (max-width:767px){
	.carousel-inner {
		height:390px!important;
	}
}
</style>
<div class="article-detail">
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwNzg5Mg.jpeg" alt="chip"/>
<figcaption class="hi-cap">
   Photo: Kodak
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>KAF-1300 Image Sensor</h3>
<p>
<strong>Manufacturer: </strong>Kodak</p>
<p>
<strong>Category: </strong>MEMS and Sensors</p>
<p>
<strong>Year: </strong>1986</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>Image sensors are so small and cheap now that it’s hard to buy a phone <em>without</em> a built-in camera. That’s a result few casual observers would have predicted in 1991 at the launch of the <a shape="rect" href="http://www.nikonweb.com/dcs100/">Kodak DCS 100</a> digital camera. The DCS 100 cost as much as US $25,000 and required a 5-kilogram external data storage unit that users had to carry on a shoulder strap. Still, the camera’s electronics—housed inside a Nikon F3 body—included one impressive piece of hardware: a thumbnail-size chip that could capture images at a resolution of 1.3 megapixels, enough for sharp 5-by-7-inch prints. “At the time, 1 megapixel was a magic number,” says <a shape="rect" href="https://www.linkedin.com/in/eric-stevens-3479ab24/">Eric Stevens</a>, the chip’s lead designer. The chip—a true two-phase charge-coupled device—became the basis for future CCD sensors, helping to kick-start the digital photography revolution. What, by the way, was the very first photo made with the KAF-1300? “Uh,” says Stevens, “we just pointed the sensor at the wall of the laboratory.”</p>
<div id="871916416" class="carousel slide">
<div class="carousel-inner">
<div class="item active">
<img src="http://spectrum.ieee.org/image/MjkwNzkwNA.jpeg" alt="The DCS 100 digital camera required a separate and bulky pack to store its 1.3-megapixel images." data-original="/image/MjkwNzkwNA.jpeg" id="871916416_0"/>
<span class="item-num">1/2</span>
<div class="carousel-caption">
<p>The DCS 100 digital camera required a separate and bulky pack to store its 1.3-megapixel images. <em>Photo: Kodak</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/MjkwNzkwNw.jpeg" alt="The camera itself mounted the KAF-1300 image sensor into a modified film camera: The spindles for holding the film are still present." data-original="/image/MjkwNzkwNw.jpeg" id="871916416_1"/>
<span class="item-num">2/2</span>
<div class="carousel-caption">
<p>The camera itself mounted the KAF-1300 image sensor into a modified film camera: The spindles for holding the film are still present. <em>Photo: SSPL/Getty Images</em>
</p>
</div>
</div>
</div>
<a shape="rect" data-slide="prev" href="#871916416" class="left carousel-control">
<span class="glyphicon glyphicon-chevron-left"/>
</a>
<a shape="rect" data-slide="next" href="#871916416" class="right carousel-control">
<span class="glyphicon glyphicon-chevron-right"/>
</a>
<ol class="carousel-indicators">
<li data-target="#871916416" data-slide-to="0" class="active"/>
<li data-target="#871916416" data-slide-to="1" class=""/>
</ol>
<script>
                $(document).ready(function(){
                    $('#871916416').carousel({
                        pause: true,
                        interval: false
                    });
                });
</script>
</div>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/F31Iv_0UN6o" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/optoelectronics/chip-hall-of-fame-kodak-kaf1300-image-sensor</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwNzkwMg.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwNzkwMA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/optoelectronics/chip-hall-of-fame-kodak-kaf1300-image-sensor</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Computer Cowboys Sh-Boom Processor</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/h8tmQjs2rBY/chip-hall-of-fame-computer-cowboys-shboom-processor</link>
<description>You’ve never heard of it. But this processor’s high-speed architecture has been duplicated in every modern computer</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>You’ve never heard of it. But this processor’s high-speed architecture has been duplicated in every modern computer<figure>
<img src="http://spectrum.ieee.org/image/MjkwNjExOQ.jpeg"/>
<figcaption>Image: Chuck Moore</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<style type="text/css">.article-page #side-module {
       top: 342px!important;
}
</style>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwNjExNw.jpeg" alt="chip"/>
<figcaption class="hi-cap">
   Image: Chuck Moore
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>Sh-Boom Processor</h3>
<p>
<strong>Manufacturer: </strong>Computer Cowboys</p>
<p>
<strong>Category: </strong>Processors</p>
<p>
<strong>Year: </strong>1988</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>Two chip designers walk into a bar. They are Russell H. Fish III and <a shape="rect" href="http://www.greenarraychips.com/home/about/bios.html">Chuck H. Moore</a> (the creator of the <a shape="rect" href="http://www.forth.org/">Forth computer language</a>), and the bar is called Sh-Boom. No, this is not the beginning of a joke. It’s actually part of a technology tale filled with discord and lawsuits. Lots of lawsuits. It all started in 1988 when Fish and Moore created a bizarre processor called Sh-Boom. The chip was so streamlined that it could run faster than the clock on the circuit board that drove the rest of the computer. So the two designers found a way to have the processor run its own superfast internal clock while staying synchronized with the rest of the computer. Sh-Boom was never a commercial success, and after patenting its innovative parts, Moore and Fish moved on. Fish later sold his patent rights to a Carlsbad, Calif.–based firm, <a shape="rect" href="http://ptsc.com/home.html">Patriot Scientific</a>, which remained a profitless speck of a company until its executives had a revelation: In the years since Sh-Boom’s invention, the speed of processors had far surpassed that of motherboards, and so practically every maker of computers and consumer electronics wound up using an approach just like the one Fish and Moore had patented. <em>Ka-ching!</em> Patriot fired a barrage of lawsuits against U.S. and Japanese companies. Whether these companies’ chips depend on the Sh-Boom ideas is a matter of controversy. But since 2006, Patriot and Moore have reaped over US $125 million in licensing fees from Intel, AMD, Sony, Olympus, and others. As for the name Sh-Boom, Moore, now at GreenArrays, in Cupertino, Calif., told <em>IEEE Spectrum</em>: “It supposedly derived from the name of a bar where Fish and I drank bourbon and scribbled on napkins. There’s little truth in that. But I did like the name he suggested.”</p>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/h8tmQjs2rBY" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-computer-cowboys-shboom-processor</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwNjEyNw.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwNjEyNQ.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-computer-cowboys-shboom-processor</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Micronas Semiconductor MAS3507 MP3 Decoder</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/kkLkN2x5Jbo/chip-hall-of-fame-micronas-semiconductor-mas3507-mp3-decoder</link>
<description>This chip began the digital music revolution</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>This chip began the digital music revolution<figure>
<img src="http://spectrum.ieee.org/image/MjkwOTM0NA.jpeg"/>
<figcaption>Photo: Eirik Solheim</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwOTM0Mw.jpeg" alt="Micronas Semiconductor MAS3507 MP3 Decoder"/>
<figcaption class="hi-cap">
   Photo: 
   <a shape="rect" href="https://eirikso.com/">Eirik Solheim</a>
</figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>MAS3507 MP3 Decoder</h3>
<p>
<strong>Manufacturer: </strong>Micronas Semiconductor</p>
<p>
<strong>Category: </strong>Amplifiers and Audio</p>
<p>
<strong>Year: </strong>1997</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>Before the iPod, there was the <a shape="rect" href="https://en.wikipedia.org/wiki/Rio_PMP300">Diamond Rio PMP300</a>. Not that you’d remember. Launched in 1998, the PMP300 became an instant hit, but then the hype faded faster than <a shape="rect" href="https://www.theatlantic.com/entertainment/archive/2015/11/milli-vanilli-25-years-later-pop-music-drake-meek-mill/417522/">Milli Vanilli</a>. One thing, though, was notable about the player. It carried the MAS3507 MP3 decoder chip—a <a shape="rect" href="https://cs.stanford.edu/people/eroberts/courses/soco/projects/risc/risccisc/">RISC-based</a> digital signal processor with an instruction set optimized for audio compression and decompression. The chip, developed by Micronas (now <a shape="rect" href="https://www.micronas.com/en">TDK-Micronas</a>), let the Rio squeeze about a dozen songs into its flash memory. This is laughable by today’s standards, but at the time it was just enough to compete with portable CD players, which suffered from a tendency to skip if jostled. The Rio and its successors paved the way for the iPod, and now you can carry thousands of songs—and all of Milli Vanilli’s albums and music videos—in your pocket.</p>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwNTY4Nw.jpeg" alt="diagram"/>
<figcaption class="hi-cap">
   Image: Micronas
  </figcaption>
<figcaption>
   As this Micronas design document shows, the MAS3507 was built around doing only one thing well—decoding MPEG Audio Layer III data, a.k.a. MP3 files. Originally developed simply as the storage subsystem for holding the soundtrack of MPEG videos, the MP3 format took on a life of its own. 
  </figcaption>
</figure>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/kkLkN2x5Jbo" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/computing/embedded-systems/chip-hall-of-fame-micronas-semiconductor-mas3507-mp3-decoder</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwOTM1MQ.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwOTM0OQ.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/computing/embedded-systems/chip-hall-of-fame-micronas-semiconductor-mas3507-mp3-decoder</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Amati Communications Overture ADSL Chip Set</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/LQQLlZXRBWE/chip-hall-of-fame-amati-communications-overture-adsl-chip-set</link>
<description>This communications chip helped to usher in the age of broadband Internet</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>This communications chip helped to usher in the age of broadband Internet<figure>
<img src="http://spectrum.ieee.org/image/MjkwNTY1OQ.jpeg"/>
<figcaption>Photo: Peter Chow</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<style type="text/css">.article-page #side-module {
       top: 342px!important;
}
</style>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwNTY1Nw.jpeg" alt="chip"/>
<figcaption class="hi-cap">
   Photo: Peter Chow
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>Overture ADSL Chip Set</h3>
<p>
<strong>Manufacturer: </strong>Amati Communications</p>
<p>
<strong>Category: </strong>Interfacing</p>
<p>
<strong>Year: </strong>1994</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>Remember when DSL came along and you chucked that pathetic 56-kilobit-per-second modem into the trash? Okay, a few years later you ended up chucking that DSL modem into the trash too as dedicated fiber-optic-based broadband networks rolled out. But for many consumers, DSL was the first taste of what high-speed Internet could do, not least as a distribution system for music and movies. It was a great transitional technology: As long as the subscriber wasn’t too far from an exchange, DSL turned ordinary analog telephone lines into high-speed digital connections.</p>
<p>The epicenter of this broadband revolution was Amati Communications, a startup out of Stanford University. In the 1990s, it came up with a DSL modulation approach called discrete multitone, or DMT. It’s basically a way of making one phone line look like hundreds of subchannels and improving transmission using an inverse Robin Hood strategy. “Bits are robbed from the poorest channels and given to the wealthiest channels,” says <a shape="rect" href="http://web.stanford.edu/group/cioffi/">John M. Cioffi</a>, a cofounder of Amati and now an engineering professor at Stanford. DMT beat competing approaches—including ones from giants like AT&amp;T—and became a global standard for DSL. In the mid-1990s, Amati’s DSL chip set (one analog, two digital) sold in modest quantities, but by 2000, volume had increased to millions. In the early 2000s, sales exceeded 100 million chips per year. Texas Instruments bought Amati in 1997.</p>
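<p>Cioffi’s “inverse Robin Hood” description corresponds to what is generally called bit loading: each bit of a symbol goes to the subchannel where it is cheapest to send. A greedy sketch, with invented SNR figures and a simplified 3-dB-per-bit cost model rather than Amati’s actual algorithm:</p>

```python
# Greedy bit loading in the spirit of DMT: each extra bit goes to the
# subchannel that currently needs the least incremental power, so
# strong ("wealthy") channels carry more bits than weak ones. The SNR
# values and the ~3 dB-per-bit cost model are illustrative only.
import heapq

def bit_load(snrs_db, total_bits, max_bits_per_channel=15):
    """Distribute total_bits across subchannels, best SNR first."""
    bits = [0] * len(snrs_db)
    # Higher SNR means a cheaper next bit; min-heap on negated SNR.
    heap = [(-snr, ch) for ch, snr in enumerate(snrs_db)]
    heapq.heapify(heap)
    for _ in range(total_bits):
        cost, ch = heapq.heappop(heap)
        bits[ch] += 1
        if bits[ch] < max_bits_per_channel:
            # Each added bit roughly doubles the power required (~3 dB),
            # making this channel less attractive for the next bit.
            heapq.heappush(heap, (cost + 3, ch))
    return bits

# Four subchannels with very different SNRs sharing 8 bits per symbol.
print(bit_load([30, 24, 10, 3], 8))  # → [5, 3, 0, 0]
```

<p>The two weakest channels end up carrying nothing at all, which is exactly how DMT sidesteps the noisy parts of a phone line’s spectrum.</p>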
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/LQQLlZXRBWE" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/telecom/internet/chip-hall-of-fame-amati-communications-overture-adsl-chip-set</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwNTY2Nw.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwNTY2NQ.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/telecom/internet/chip-hall-of-fame-amati-communications-overture-adsl-chip-set</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Sun Microsystems SPARC Processor</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/fPs-iGlSbP4/chip-hall-of-fame-sun-microsystems-sparc-processor</link>
<description>Using an unproven new architecture, this processor put Sun Microsystems on the map</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Using an unproven new architecture, this processor put Sun Microsystems on the map<figure>
<img src="http://spectrum.ieee.org/image/MjkwNzk0NQ.jpeg"/>
<figcaption>Photo: Mark Richards</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<style type="text/css">@media screen and (max-width:767px){
	.carousel-inner {
		height:420px!important;
	}
}
</style>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkxNzE5Mg.jpeg" alt="Sun Microsystems SPARC Processor"/>
<figcaption class="hi-cap">
   Photo: Mark Richards
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>SPARC Processor</h3>
<p>
<strong>Manufacturer: </strong>Sun Microsystems</p>
<p>
<strong>Category: </strong>Processor</p>
<p>
<strong>Year: </strong>1987</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>There was a time, long ago (the early 1980s), when people wore neon-colored leg warmers and watched “Dallas,” and microprocessor architects sought to increase the complexity of CPU instructions as a way of getting more accomplished in each compute cycle. But then a group at the University of California, Berkeley, always a bastion of counterculture, called for the opposite: Simplify the instruction set, they said, and you’ll process instructions at a rate so fast you’ll more than compensate for doing less each cycle. The Berkeley group, led by David Patterson, called their approach <a shape="rect" href="https://cs.stanford.edu/people/eroberts/courses/soco/projects/risc/risccisc/">RISC</a>, for reduced-instruction-set computing.</p>
<p>As an academic study, RISC sounded great. But was it marketable? Sun Microsystems (now part of <a shape="rect" href="https://www.oracle.com/sun/index.html">Oracle</a>) bet on it. In 1984, a small team of Sun engineers set out to develop a 32-bit RISC processor called SPARC (for Scalable Processor Architecture). The idea was to use the chips in Sun’s new line of workstations. One day, <a shape="rect" href="https://www.linkedin.com/in/smcnealy/">Scott McNealy</a>, then Sun’s CEO, showed up at the SPARC development lab. “He said that SPARC would take Sun from a $500-million-a-year company to a billion-dollar-a-year company,” recalls Patterson, a consultant to the SPARC project.</p>
<p>If that weren’t pressure enough, many outside Sun had expressed doubt the company could pull it off. Worse still, Sun’s marketing team had had a terrifying realization: SPARC spelled backward was…CRAPS! Team members had to swear they would not utter that word to anyone, even inside Sun—lest the word get out to archrival MIPS Technologies, which was also exploring the RISC concept.</p>
<p>The first version of the minimalist SPARC consisted of a “20,000-gate-array processor without even integer multiply/divide instructions,” says <a shape="rect" href="https://www.linkedin.com/in/robertgarner/">Robert Garner</a>, the lead SPARC architect and now an IBM researcher. Yet, at 10 million instructions per second, it ran about three times as fast as the complex-instruction-set computer (CISC) processors of the day.</p>
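Garner's point about shipping without multiply hardware can be made concrete: on a RISC machine with no MUL instruction, software builds multiplication out of the shifts, adds, and tests it does have. A minimal sketch of that classic shift-and-add routine, in illustrative Python rather than actual SPARC code:

```python
def shift_add_multiply(a, b):
    """Multiply two non-negative integers using only shifts, adds, and
    bit tests -- the kind of routine a compiler or runtime library
    supplies on a RISC machine that lacks an integer multiply
    instruction."""
    result = 0
    while b:
        if b & 1:        # low bit of b set: add the shifted multiplicand
            result += a
        a <<= 1          # multiplicand doubles each step
        b >>= 1          # consume one bit of the multiplier
    return result

print(shift_add_multiply(10, 7))   # 70
```

Each loop iteration maps to a handful of single-cycle instructions, which is exactly the trade RISC made: more instructions, each one fast and simple.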
<p>Sun would use SPARC to power profitable workstations and servers for years to come. The first SPARC-based product, introduced in 1987, was the Sun-4 line of workstations, which quickly dominated the market and helped propel the company’s revenues past the billion-dollar mark—just as McNealy had prophesied.</p>
<div id="350478332" class="carousel slide">
<div class="carousel-inner">
<div class="item active">
<img src="http://spectrum.ieee.org/image/MjkwNzk1NA.jpeg" alt="The SPARC team in 1988, the year after the first SPARC-based products started building Sun Microsystems into a huge Silicon Valley player." data-original="/image/MjkwNzk1NA.jpeg" id="350478332_0"/>
<span class="item-num">1/3</span>
<div class="carousel-caption">
<p>The SPARC team in 1988, the year after the first SPARC-based products started building Sun Microsystems into a huge Silicon Valley player. <em>Photo: Robert B. Garner</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/MjkwNzk1Nw.jpeg" alt="The SPARCstation 1+ was the first in a pioneering series of workstations aimed at meeting the computing needs of engineers and scientists." data-original="/image/MjkwNzk1Nw.jpeg" id="350478332_1"/>
<span class="item-num">2/3</span>
<div class="carousel-caption">
<p>The SPARCstation 1+ was the first in a pioneering series of workstations aimed at meeting the computing needs of engineers and scientists. <em>Photo: Mike Chapman/Wikipedia</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/MjkwNzk2MA.jpeg" alt="The first SPARC chipset resulted in a crowded mainboard. Note that most of the chips are still using “through hole” packages, where the chips have legs that are held by sockets, rather than the surface-mounted chips used today for high-density circuitry." data-original="/image/MjkwNzk2MA.jpeg" id="350478332_2"/>
<span class="item-num">3/3</span>
<div class="carousel-caption">
<p>The first SPARC chipset resulted in a crowded mainboard. Note that most of the chips are still using “through hole” packages, where the chips have legs that are held by sockets, rather than the surface-mounted chips used today for high-density circuitry. <em>Photo: Robert B. Garner</em>
</p>
</div>
</div>
</div>
<a shape="rect" data-slide="prev" href="#350478332" class="left carousel-control">
<span class="glyphicon glyphicon-chevron-left"/>
</a>
<a shape="rect" data-slide="next" href="#350478332" class="right carousel-control">
<span class="glyphicon glyphicon-chevron-right"/>
</a>
<ol class="carousel-indicators">
<li data-target="#350478332" data-slide-to="0" class="active"/>
<li data-target="#350478332" data-slide-to="1" class=""/>
<li data-target="#350478332" data-slide-to="2" class=""/>
</ol>
<script>
                $(document).ready(function(){
                    $('#350478332').carousel({
                        pause: true,
                        interval: false
                    });
                });
</script>
</div>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/fPs-iGlSbP4" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-sun-microsystems-sparc-processor</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwNzk1Mg.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwNzk1MA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-sun-microsystems-sparc-processor</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Microchip Technology PIC 16C84 Microcontroller</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/1K1Cuyc20T8/chip-hall-of-fame-microchip-technology-pic-16c84-microcontroller</link>
<description>Adding easily reprogrammable onboard memory to store software revolutionized microcontrollers</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Adding easily reprogrammable onboard memory to store software revolutionized microcontrollers<figure>
<img src="http://spectrum.ieee.org/image/MjkwNTYzNg.jpeg"/>
<figcaption>Image: Microchip Technology</figcaption>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<div class="article-detail">
<style type="text/css">span.BackArrowBlkBkgrd {
    color: #fff;
    background: #000;
    margin: 0px 10px 0px 0px;
    padding: 5px 8px;
}

h3.RptHdBackBarMobile {
    font-weight: 400;
    font-size: 16px;
    color: #fff;
    background: #00acee;
    padding: 5px 10px 5px 0px;
    margin: 10px 0 20px 0;
    letter-spacing: 0.05em;
    text-align: left;
}
</style>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwNTYzMw.jpeg" alt="chip"/>
<figcaption class="hi-cap">
   Image: Microchip Technology
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>PIC 16C84 Microcontroller</h3>
<p>
<strong>Manufacturer: </strong>Microchip Technology</p>
<p>
<strong>Category: </strong>Processors</p>
<p>
<strong>Year: </strong>1993</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>Back in the early 1990s, the huge 8-bit microcontroller universe belonged to one company, the almighty Motorola. Then along came a small contender with a nondescript name, <a shape="rect" href="http://www.microchip.com/">Microchip Technology</a>. Microchip developed the PIC 16C84, which took an 8-bit microcontroller and added a type of memory called EEPROM, for electrically erasable programmable read-only memory. Unlike its progenitor, EPROM, EEPROM doesn’t need UV light to be erased. Such read-only memory is generally used to store program code or small bits of data. Eliminating the need for a UV light meant that “users could change their code on the fly,” says <a shape="rect" href="https://www.linkedin.com/in/rod-drake-54b77a14/">Rod Drake</a>, the chip’s lead designer and now a director at Microchip. Even better, the whole chip cost less than US $5, or a quarter the cost of existing alternatives at the time.</p>
<p>The 16C84 was used in smart cards, remote controls, and wireless car keys. It was the beginning of a line of microcontrollers that became electronics superstars among Fortune 500 companies and weekend hobbyists alike. While the 16C84 has been retired, the <a shape="rect" href="http://www.microchip.com/design-centers/microcontrollers">PIC line is still in production</a> and billions have been sold, used in things like industrial controllers, unmanned aerial vehicles, digital pregnancy tests, chip-controlled fireworks, LED jewelry, and a septic-tank monitor named the Turd Alert.</p>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/MjkwNTYzNA.jpeg" alt="Diagram US5351216 "/>
<figcaption class="hi-cap">
   Image: Microchip Technology/U.S. Patent and Trademark Office
  </figcaption>
<figcaption>
   This sketch from a Microchip patent shows how PIC controllers differed from other computers. In most computers, such as your PC, programs and working data are stored in the same memory—an arrangement known as a “von Neumann architecture.” But PIC controllers keep program and working-data memory separate—an arrangement known as a “Harvard architecture.” This allowed programs to be stored in cheap read-only memory.
  </figcaption>
</figure>
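The architectural split the caption describes can be sketched as a toy memory model. This is purely illustrative, not Microchip's implementation; the opcodes and sizes are invented.

```python
# Toy contrast of the two memory arrangements described above.

class VonNeumannMachine:
    """Programs and working data share one memory, as in a typical PC."""
    def __init__(self, program, data):
        self.memory = list(program) + list(data)  # one address space

class HarvardMachine:
    """Program and data memories are physically separate, so the
    program store can live in cheap read-only memory."""
    def __init__(self, program, data):
        self.program = tuple(program)  # immutable, ROM-like
        self.data = list(data)         # mutable, RAM-like

pic_like = HarvardMachine(program=[0x30, 0x2A], data=[0])
pic_like.data[0] = 42       # working data can change at run time
# pic_like.program[0] = 0   # would raise TypeError: program store is read-only
```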
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/1K1Cuyc20T8" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/computing/embedded-systems/chip-hall-of-fame-microchip-technology-pic-16c84-microcontroller</guid>
<media:content url="http://spectrum.ieee.org/image/MjkwNTY0NA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/MjkwNTY0Mg.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/computing/embedded-systems/chip-hall-of-fame-microchip-technology-pic-16c84-microcontroller</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Texas Instruments TMC0281 Speech Synthesizer</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/UtgH1PVWdpA/chip-hall-of-fame-texas-instruments-tmc0281-speech-synthesizer</link>
<description>The world’s first speech synthesizer on chip—and accidental supporting star of E.T.</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>The world’s first speech synthesizer on chip—and accidental supporting star of <em>E.T.</em>
<figure>
<img src="http://spectrum.ieee.org/image/Mjg5ODI1Mw.jpeg"/>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<style type="text/css">@media screen and (max-width:767px){
	.carousel-inner {
		height:390px!important;
	}
}
</style>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/Mjg5ODI1Mg.jpeg" alt="die"/>
<figcaption class="hi-cap">
   Image: Gene Frantz/Texas Instruments
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>TMC0281 Speech Synthesizer</h3>
<p>
<strong>Manufacturer: </strong>Texas Instruments</p>
<p>
<strong>Category: </strong>Amplifiers and Audio</p>
<p>
<strong>Year: </strong>1978</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>If it weren’t for the TMC0281, E.T. would’ve never been able to “phone home.” That’s because the <a shape="rect" href="http://www.ti.com/corp/docs/company/history.html#1970s">TMC0281</a>, the first single-chip speech synthesizer, was the heart (or should we say the mouth?) of <a shape="rect" href="http://www.ti.com/">Texas Instruments’</a> Speak &amp; Spell learning toy. In Steven Spielberg’s <a shape="rect" href="http://www.imdb.com/title/tt0083866/">1982 blockbuster movie</a>, the eponymous flat-headed alien hacks the toy to build an interplanetary communicator. (For the record, E.T. also uses a coat hanger, a coffee can, and a circular saw.) Today, we’re increasingly accustomed to our consumer electronics talking to us; the TMC0281 was the first step toward our world of ubiquitous synthesized speech.</p>
<p>Released in 1978, the TMC0281 produced speech using a technique called linear predictive coding; the sound emerges from a combination of buzzing, hissing, and popping. It was a surprising solution for something deemed “impossible to do in an integrated circuit,” <a shape="rect" href="https://www.ece.rice.edu/genefrantz.aspx">Gene A. Frantz</a> told <em>IEEE Spectrum</em>. Frantz, one of the four engineers who designed the toy, retired from TI in 2013. Variants of the TMC0281 were used in Atari arcade games and Chrysler’s K-cars. In 2001, TI sold its speech-synthesis chip line to Sensory, which discontinued it in late 2007. But if you ever need to place a very, very long-distance phone call, you can find Speak &amp; Spell units in excellent condition on eBay for about US $50.</p>
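The buzz-hiss recipe can be sketched in miniature: an excitation signal (a periodic impulse "buzz" for voiced sounds, noise "hiss" for unvoiced ones) drives an all-pole filter whose coefficients model the vocal tract. The two coefficients below are arbitrary stand-ins chosen only for stability; the TMC0281's actual model was far richer.

```python
import random

def lpc_synthesize(coeffs, excitation):
    """Linear predictive synthesis in miniature: each output sample is
    the excitation plus a weighted sum of previous outputs (an all-pole
    filter). Real speech LPC uses on the order of ten coefficients."""
    out = []
    for e in excitation:
        s = e + sum(a * out[-(k + 1)]
                    for k, a in enumerate(coeffs) if k < len(out))
        out.append(s)
    return out

# "Buzzing": a periodic impulse train, the excitation for voiced sounds.
buzz = [1.0 if n % 40 == 0 else 0.0 for n in range(200)]
# "Hissing": white noise, the excitation for unvoiced sounds like /s/.
random.seed(0)
hiss = [random.uniform(-0.1, 0.1) for _ in range(200)]

voiced = lpc_synthesize([0.5, -0.3], buzz)
unvoiced = lpc_synthesize([0.5, -0.3], hiss)
```

Swapping the excitation while keeping the filter is what lets one small chip produce both vowels and consonants.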
<div id="519047881" class="carousel slide">
<div class="carousel-inner">
<div class="item active">
<img src="http://spectrum.ieee.org/image/Mjg5ODI2Mg.jpeg" alt="Gene Frantz, Richard Wiggins, Paul Breedlove, and Larry Brantingham were the creators of the TMC0281-powered Speak &amp; Spell toy." data-original="/image/Mjg5ODI2Mg.jpeg" id="519047881_0"/>
<span class="item-num">1/3</span>
<div class="carousel-caption">
<p>Gene Frantz, Richard Wiggins, Paul Breedlove, and Larry Brantingham were the creators of the TMC0281-powered Speak &amp; Spell toy. <em>Photo: Gene Frantz/Texas Instruments</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/Mjg5OTc0MA.jpeg" alt="The Speak &amp; Spell toy was an international success for Texas Instruments." data-original="/image/Mjg5OTc0MA.jpeg" id="519047881_1"/>
<span class="item-num">2/3</span>
<div class="carousel-caption">
<p>The Speak &amp; Spell toy was an international success for Texas Instruments. <em>Photo: The Advertising Archives/Alamy</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/Mjg5ODI2OA.jpeg" alt="A few years after its launch, the Speak &amp; Spell found itself in a notable cameo in 1982's &lt;em&gt;E.T.&lt;/em&gt;, as part of the alien’s improvised distress beacon." data-original="/image/Mjg5ODI2OA.jpeg" id="519047881_2"/>
<span class="item-num">3/3</span>
<div class="carousel-caption">
<p>A few years after its launch, the Speak &amp; Spell found itself in a notable cameo in 1982's <em>E.T.</em>, as part of the alien’s improvised distress beacon. <em>Photo: Universal Studios</em>
</p>
</div>
</div>
</div>
<a shape="rect" data-slide="prev" href="#519047881" class="left carousel-control">
<span class="glyphicon glyphicon-chevron-left"/>
</a>
<a shape="rect" data-slide="next" href="#519047881" class="right carousel-control">
<span class="glyphicon glyphicon-chevron-right"/>
</a>
<ol class="carousel-indicators">
<li data-target="#519047881" data-slide-to="0" class="active"/>
<li data-target="#519047881" data-slide-to="1" class=""/>
<li data-target="#519047881" data-slide-to="2" class=""/>
</ol>
<script>
                $(document).ready(function(){
                    $('#519047881').carousel({
                        pause: true,
                        interval: false
                    });
                });
</script>
</div>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/UtgH1PVWdpA" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/computing/embedded-systems/chip-hall-of-fame-texas-instruments-tmc0281-speech-synthesizer</guid>
<media:content url="http://spectrum.ieee.org/image/Mjg5ODI2MA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/Mjg5ODI1OA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/computing/embedded-systems/chip-hall-of-fame-texas-instruments-tmc0281-speech-synthesizer</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Fairchild Semiconductor μA741 Op-Amp</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/D4NcjbUuPXM/chip-hall-of-fame-fairchild-semiconductor-a741-opamp</link>
<description>This chip became the de facto standard for analog amplifier ICs. Still in production, it’s available everywhere there are electronics</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>This chip became the de facto standard for analog amplifier ICs. Still in production, it’s available everywhere there are electronics<figure>
<img src="http://spectrum.ieee.org/image/Mjg5ODIzNg.jpeg"/>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/Mjg5ODIzMg.jpeg" alt="Fairchild Semiconductor die"/>
<figcaption class="hi-cap">
   Image: David Fullagar
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>μA741<br clear="none"/> Op-Amp</h3>
<p>
<strong>Manufacturer: </strong>Fairchild Semiconductor</p>
<p>
<strong>Category: </strong>Amplifiers and Audio</p>
<p>
<strong>Year: </strong>1968</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>Operational amplifiers are the sliced bread of analog design. You can slap them together with almost anything and get something satisfying. Designers use them to make audio and video preamplifiers, voltage comparators, precision rectifiers, and many other subsystems essential to everyday electronics.</p>
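The building blocks listed above all follow from two ideal-op-amp rules: the inputs draw no current, and with negative feedback the op-amp drives its inputs to the same voltage. A hedged sketch using the standard textbook formulas, with arbitrary component values:

```python
def noninverting_gain(r_feedback, r_ground):
    """Ideal non-inverting amplifier: closed-loop gain = 1 + Rf/Rg.
    Textbook formula; resistor values in the example are arbitrary."""
    return 1 + r_feedback / r_ground

def comparator_out(v_plus, v_minus, v_rail=15.0):
    """With no feedback, an ideal op-amp's huge open-loop gain slams
    the output to a supply rail -- a voltage comparator."""
    return v_rail if v_plus > v_minus else -v_rail

print(noninverting_gain(9000, 1000))   # 10.0
print(comparator_out(2.0, 1.5))        # 15.0
```

The same part serves both roles, which is why a cheap, predictable op-amp like the 741 became such a universal ingredient.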
<p>In 1963, a 26-year-old engineer named <a shape="rect" href="http://www.nytimes.com/1991/03/06/obituaries/robert-widlar-53-designer-of-computer-circuits.html">Robert Widlar</a> designed the first monolithic op-amp integrated circuit, the <a shape="rect" href="http://smithsonianchips.si.edu/augarten/p16.htm">μA702</a>, at <a shape="rect" href="https://www.fairchildsemi.com/">Fairchild Semiconductor</a>. It sold for US $300 a pop. Widlar followed up with an improved design, the μA709, cutting the cost to $70 and making the chip a huge commercial success. The story goes that the freewheeling Widlar then asked for a raise. When he didn’t get it, he quit. National Semiconductor (now part of Texas Instruments) was only too happy to scoop up a guy who was then helping establish the discipline of analog IC design. In 1967, Widlar created an even better op-amp for National, the LM101, a version of which is <a shape="rect" href="http://www.ti.com/product/LM101A-N">still in production</a>.</p>
<figure role="img" class="lt med">
<a shape="rect" rel="lightbox" href="http://spectrum.ieee.org/image/Mjg5ODIzMw.jpeg" class="zoom">
<img src="http://spectrum.ieee.org/image/Mjg5ODIzMw.jpeg" alt="img"/>
<span class="magnifier"> </span>
</a>
<div class="ai">
<figcaption class="hi-cap">
    Photo: David Fullagar
   </figcaption>
<figcaption>
    As this internal memo from Fairchild’s marketing department shows, the company quickly realized it had a hit on its hands.
   </figcaption>
</div>
</figure>
<p>While Fairchild managers fretted about the sudden Widlar-powered competition, over at Fairchild’s R&amp;D lab a recent hire, David Fullagar, scrutinized the LM101. He realized that the chip, however brilliant, had a couple of drawbacks. The biggest of these was that the IC’s input stage, the so-called front end, was overly sensitive to noise in some chips, because of semiconductor quality variations.</p>
<p>“The front end looked kind of kludgy,” he says.</p>
<p>Fullagar embarked on his own design. The solution to the front-end problem turned out to be profoundly simple—“it just came to me, I don’t know, driving to Tahoe”—and consisted of a couple of extra transistors. That additional circuitry made the amplification smoother and more consistent from chip to chip.</p>
<p>Fullagar took his design to the head of R&amp;D at Fairchild, a guy named <a shape="rect" href="http://ethw.org/Gordon_E._Moore">Gordon Moore</a>, who sent it to the company’s commercial division. The new chip, the <a shape="rect" href="http://www.ti.com/product/UA741">μA741</a>, would become <em>the</em> standard for op-amps. The IC—and variants created by Fairchild’s competitors—have sold in the hundreds of millions. Now, for $300—the price tag of that primordial 702 op-amp—you can get about 2,000 of today’s 741 chips.</p>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/D4NcjbUuPXM" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/devices/chip-hall-of-fame-fairchild-semiconductor-a741-opamp</guid>
<media:content url="http://spectrum.ieee.org/image/Mjg5ODI0Mg.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/Mjg5ODI0MA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/devices/chip-hall-of-fame-fairchild-semiconductor-a741-opamp</feedburner:origLink></item>
<item>
<title>Chip Hall of Fame: Acorn Computers ARM1 Processor</title>
<link>http://feedproxy.google.com/~r/IeeeSpectrumFullText/~3/nfeQR-HcmeQ/chip-hall-of-fame-acorn-computers-arm1-processor</link>
<description>Reading this on a smartphone? Then you’re using a direct descendant of this processor right now</description>
<content:encoded><![CDATA[<?xml version="1.0" encoding="UTF-8"?><html>
<body>Reading this on a smartphone? Then you’re using a direct descendant of this processor right now<figure>
<img src="http://spectrum.ieee.org/image/Mjg5ODI0NA.jpeg"/>
</figure>
<div>
<div class="article-detail"/>
<link rel="stylesheet" href="/ns/interactive/0617-ChipHallofFame/css/chof_styles.css"/>
<div class="article-detail">
<script>
$(function () {
	$('.medium-bottom-ad').css('display','none');
});
</script>
<style type="text/css">@media screen and (max-width:767px){
	.carousel-inner {
		height:400px!important;
	}
}
</style>
<figure role="img" class="xlrg">
<img src="http://spectrum.ieee.org/image/Mjg5ODIyMg.jpeg" alt="Acorn chip"/>
<figcaption class="hi-cap">
   Photo: Acorn Computers
  </figcaption>
</figure>
<div class="mobileHide">
<div class="chofIconList">
<a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">
<figure role="img" class="lt sm">
<ul>
<li class="chofIcon"/>
</ul>
</figure>
</a>
</div>
</div>
<aside class="inlay statbox lt sm">
<h3>ARM1</h3>
<p>
<strong>Manufacturer: </strong>Acorn Computers</p>
<p>
<strong>Category: </strong>Processors</p>
<p>
<strong>Year: </strong>1985</p>
</aside>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
<p>In the early 1980s, Acorn Computers was a small company with a big product. The firm, based in Cambridge, England, had sold over 1.5 million 8-bit <a shape="rect" href="http://spectrum.ieee.org/tech-talk/computing/software/the-golden-age-of-basic">BBC Micro</a> desktop computers as part of the BBC’s national <a shape="rect" href="http://www.naec.org.uk/organisations/bbc-computer-literacy-project/towards-computer-literacy-the-bbc-computer-literacy-project-1979-1983">Computer Literacy Project</a>. It was now time to design a new computer. Unsatisfied with the processors then available on the market, the Acorn engineers decided to make the leap to creating their own 32-bit microprocessor.</p>
<p>They called it the Acorn RISC Machine, or ARM. RISC, which stood for reduced-instruction-set computer, was an approach to designing processors that traded <a shape="rect" href="https://cs.stanford.edu/people/eroberts/courses/soco/projects/risc/risccisc/">more complex machine code for higher efficiency</a>. The engineers knew it wouldn’t be easy; in fact, they half expected they’d encounter an insurmountable design hurdle and have to scrap the whole project. “The team was so small that every design decision had to favor simplicity—or we’d never finish it!” says codesigner <a shape="rect" href="https://www.research.manchester.ac.uk/portal/steve.furber.html">Steve Furber</a>, now a computer engineering professor at the University of Manchester. In the end, the simplicity made all the difference. The ARM was small, low power, and easy to program. <a shape="rect" href="http://www.computerhistory.org/fellowawards/hall/sophie-wilson/">Sophie Wilson</a>, who designed the instruction set, still remembers when they first tested the chip in a computer. “We did ‘PRINT PI’ at the prompt, and it gave the right answer,” she says. “We cracked open the bottles of champagne.” In 1990, Acorn spun off its <a shape="rect" href="https://www.arm.com/">ARM division</a>, and the ARM architecture went on to become the dominant 32-bit processor for embedded applications. More than 10 billion ARM cores have been used in all sorts of gadgetry, including one of Apple’s most humiliating flops, the Newton handheld, and one of its most glittering successes, the iPhone. Indeed, ARM chips are now found in more than <a shape="rect" href="https://www.arm.com/markets/mobile">95 percent</a> of the world’s smartphones.</p>
<div id="1401104277" class="carousel slide">
<div class="carousel-inner">
<div class="item active">
<img src="http://spectrum.ieee.org/image/Mjg5OTcyNw.jpeg" alt="One of the earliest sketches of the ARM1's architecture." data-original="/image/Mjg5OTcyNw.jpeg" id="1401104277_0"/>
<span class="item-num">1/2</span>
<div class="carousel-caption">
<p>One of the earliest sketches of the ARM1's architecture. <em>Photo: Steve Furber/Acorn Computers</em>
</p>
</div>
</div>
<div class="item">
<img src="http://spectrum.ieee.org/image/Mjg5OTczMA.jpeg" alt="ARM chips were originally the heart of a computer called the Archimedes, the architecture of which is shown here, before the processor design was licensed to embedded-device manufacturers." data-original="/image/Mjg5OTczMA.jpeg" id="1401104277_1"/>
<span class="item-num">2/2</span>
<div class="carousel-caption">
<p>ARM chips were originally the heart of a computer called the Archimedes, the architecture of which is shown here, before the processor design was licensed to embedded-device manufacturers. <em>Illustration: Acorn Computers/IEEE</em>
</p>
</div>
</div>
</div>
<a shape="rect" data-slide="prev" href="#1401104277" class="left carousel-control">
<span class="glyphicon glyphicon-chevron-left"/>
</a>
<a shape="rect" data-slide="next" href="#1401104277" class="right carousel-control">
<span class="glyphicon glyphicon-chevron-right"/>
</a>
<ol class="carousel-indicators">
<li data-target="#1401104277" data-slide-to="0" class="active"/>
<li data-target="#1401104277" data-slide-to="1" class=""/>
</ol>
<script>
                $(document).ready(function(){
                    $('#1401104277').carousel({
                        pause: true,
                        interval: false
                    });
                });
</script>
</div>
<div class="mobileShow">
<h3 class="RptHdBackBarMobile">
<span class="BackArrowBlkBkgrd">&lt;</span> <a shape="rect" href="http://spectrum.ieee.org/static/chip-hall-of-fame">Back to the Chip Hall of Fame</a>
</h3>
</div>
</div>
</div>
</body>
</html>
<img src="http://feeds.feedburner.com/~r/IeeeSpectrumFullText/~4/nfeQR-HcmeQ" height="1" width="1" alt=""/>]]></content:encoded>
<pubDate>Fri, 30 Jun 2017 17:00:00 GMT</pubDate>
<guid isPermaLink="false">http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-acorn-computers-arm1-processor</guid>
<media:content url="http://spectrum.ieee.org/image/Mjg5ODI1MA.jpg" height="373" width="620" />
<media:thumbnail url="http://spectrum.ieee.org/image/Mjg5ODI0OA.jpg" height="225" width="300" />
<feedburner:origLink>http://spectrum.ieee.org/semiconductors/processors/chip-hall-of-fame-acorn-computers-arm1-processor</feedburner:origLink></item>
</channel>
</rss>
