<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://search.yahoo.com/mrss/"><channel><title>IEEE Spectrum</title><link>https://spectrum.ieee.org/</link><description>IEEE Spectrum</description><atom:link href="https://spectrum.ieee.org/feeds/topic/biomedical.rss" rel="self"></atom:link><language>en-us</language><lastBuildDate>Wed, 08 Apr 2026 21:54:20 -0000</lastBuildDate><image><url>https://spectrum.ieee.org/media-library/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNjg4NDUyMC9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTgyNjE0MzQzOX0.N7fHdky-KEYicEarB5Y-YGrry7baoW61oxUszI23GV4/image.png?width=210</url><link>https://spectrum.ieee.org/</link><title>IEEE Spectrum</title></image><item><title>Tiny Graphene Drums Let Doctors Identify Bacteria by Sound</title><link>https://spectrum.ieee.org/soundcell-nanodrums-identify-bacteria-sound</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/close-up-of-gloved-hands-using-a-dropper-to-deposit-antibiotics-into-a-bacteria-test-tray.jpg?id=65454830&width=1200&height=400&coordinates=0%2C417%2C0%2C417"/><br/><br/><p>Identifying bacteria by sight can be quite difficult. Why not listen to them instead?</p><p>Researchers at TU Delft in the Netherlands and the university’s spinoff company <a href="https://soundcell.nl/" rel="noopener noreferrer" target="_blank">SoundCell</a> think that bacterial infections could be diagnosed with sound. They’ve crafted a nanoscale drum kit that uses some of the world’s smallest percussion instruments to turn a bacterium’s motions into song.</p><p>Previously, the Delft researchers <a href="https://spectrum.ieee.org/graphene-drums-alert-to-antibiotic-resistance" target="_self">showed that listening to a germ’s drumbeat</a> could quickly screen it for antibiotic resistance. Now, the same researchers have discovered that different bacteria play different sounds on the drum. They’ve shown they can identify a bacterium from its song alone.</p><p>“We can really look at the level of a single cell,” says <a href="https://www.tudelft.nl/staff/f.alijani/" target="_blank">Farbod Alijani</a>, a mechanical engineer at TU Delft and one of the authors of a new paper. “We have that sensitivity.” Alijani and colleagues <a href="https://pubs.acs.org/doi/10.1021/acssensors.5c04649" target="_blank">reported their latest findings</a> this March in <em>ACS Sensors</em>.</p><h2>How to Build the World’s Tiniest Drum</h2><p>The Delft researchers call their instrument of choice <a href="https://soundcell.nl/technology/#MelodyOne" rel="noopener noreferrer" target="_blank">a “nanodrum.”</a> Its drumhead is fashioned from two graphene sheets, less than 1 nanometer thick, laid atop an 8-micrometer-wide cavity. 
This size fits most bacteria, which are about 1 to 10 micrometers in length.</p><p>Several years ago, the Delft researchers noticed that, if a living bacterium settled on the graphene sheet, it would beat a pattern on the drumhead. They were detecting the life-form’s subtle motions, such as the whirling of the propeller-like <a href="https://en.wikipedia.org/wiki/Flagellum" rel="noopener noreferrer" target="_blank">flagellum</a> the bacterium uses to move about. When the drumhead moved, it left signals on a beam of laser light reflected off the surface, allowing the researchers to record the bacterium’s motion.</p><p>The drum’s tiny size is key to pinpointing individual bacteria. The Delft researchers were not the first to capture bacteria in motion, but <a href="https://pubs.aip.org/tu/npe/article/7/1/013001/2920715/Nanomotion-of-bacteria-to-determine-metabolic" rel="noopener noreferrer" target="_blank">older methods</a>, built at the microscale, usually had to average the movements of an ensemble of many bacteria. By comparison, each graphene drumhead is small enough to isolate—and record—a single bacterium. </p><p>Graphene is central to this instrument’s construction. The material is both strong enough to support a bacterium’s weight and sensitive enough to bend with each subtle bounce on the drum.</p><p>Then, by converting its drumbeat to a soundtrack, it’s possible to literally hear the motions of a living bacterium. “It’s very noisy, like a wind tunnel,” says <a href="https://soundcell.nl/team/" rel="noopener noreferrer" target="_blank">Aleksandre Japaridze</a>, SoundCell’s chief technical officer, who is also an author of the paper. 
</p><p>By contrast, “if you kill it with a drug, it’s immediately very silent, and you don’t hear anything.” In <a href="https://www.nature.com/articles/s41565-022-01111-6" target="_blank">previously published work</a>, when the researchers pumped an antibiotic onto drums played by <em>E. coli</em>, the drums fell quiet within hours. But when they did the same to <em>E. coli</em> they knew to be antibiotic-resistant, the bacteria played on, seemingly unaffected.</p><h2>From One Song to Many</h2><p>Over the following years, the Delft researchers refined their technology’s ability to screen bacteria for <a href="https://spectrum.ieee.org/technologies-to-combat-antimicrobial-resistance" target="_blank">antibiotic resistance</a>. Let a patient’s bacteria play the drums, then administer a given antibiotic—if the music stops, that antibiotic should work.</p><p>But their work took an unexpected turn after an attendee at a conference asked Alijani if different bacteria made different sounds. Unsure of the answer, the researchers wondered how they could find out. </p><p>It was clearly possible to tell a living bacterium from a dead one by listening alone, but separating one bacterial species from another required a more sophisticated approach. The Delft researchers recorded the drumbeats of different infectious bacteria from real patients’ samples. Instead of using raw sounds, the researchers processed them into <a href="https://people.ece.cornell.edu/land/PROJECTS/ReassignFFT/index.html" target="_blank">time-frequency spectrograms</a>, which allowed them to more carefully study the patterns of each bacterium’s motion.</p><p>The researchers trained two different machine learning models to examine a spectrogram and identify its drummer as one of three species: <em>E. 
coli</em>, <em>Staphylococcus aureus</em> (responsible for staph infections), or <em>Klebsiella pneumoniae</em> (one of the germs that can cause pneumonia). </p><p>Both models, each with a different underlying architecture, scored high marks in testing: One classified bacteria with 87 percent accuracy, and the other achieved 88 percent. These results suggest that each species plays different characteristic notes when it moves on the drum. </p><p>“It’s a completely different way of interpreting the different species,” Japaridze says. “Not chemically or biologically, with markers and genes, but just purely on...mechanical behavior.”</p><h2>Diagnosis Through Song and Dance</h2><p>The Delft researchers think their drums are a powerful diagnostic tool. SoundCell was originally spun off to commercialize the ability to quickly and easily determine whether a bacterium is resistant to a given antibiotic, and the researchers hope hospitals in the future will also listen to the songs of a patient’s sample to identify the infection.</p><p>Antimicrobial-resistant germs may be responsible for <a href="https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(24)01867-1/fulltext" target="_blank">more than 1 million deaths each year</a> and may play a part in millions more. Antibiotic-resistant bacteria are potent threats for many reasons—one is that the <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC7924329/" target="_blank">tests for whether a microbe is resistant are relatively slow</a>. Today’s tools may take days to report if a microbe is resistant to a given antibiotic. By comparison, SoundCell’s technology can do this in as little as an hour.</p><p>First, SoundCell must show its nanodrums can work in the hospital. The Delft researchers’ published work was conducted on a bulky apparatus on an optical table, within the controlled confines of a laboratory. 
So, SoundCell has repackaged its nanodrums into <a href="https://soundcell.nl/technology/#MelodyOne" target="_blank">a smaller device</a> better suited for hospital use.</p><p>SoundCell has now deployed this device at two hospitals in the Netherlands, Japaridze says, to verify its performance.</p>]]></description><pubDate>Wed, 08 Apr 2026 11:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/soundcell-nanodrums-identify-bacteria-sound</guid><category>Bacteria</category><category>Antibiotics</category><category>Graphene</category><category>Biosensors</category><category>Graphene-membranes</category><dc:creator>IEEE Spectrum</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/close-up-of-gloved-hands-using-a-dropper-to-deposit-antibiotics-into-a-bacteria-test-tray.jpg?id=65454830&amp;width=980"></media:content></item><item><title>“Living Pharmacy” Implant Keeps Drug-producing Cells Alive Longer</title><link>https://spectrum.ieee.org/biologic-drugs-implant-bioelectronics-medicine</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-man-in-a-lab-coat-holding-a-miniature-device-in-his-hand-that-resembles-a-complex-flash-drive.jpg?id=65428662&width=1200&height=400&coordinates=0%2C417%2C0%2C417"/><br/><br/><p>Cells that have been genetically engineered to produce drugs are a promising way to deliver medicines inside the human body, but keeping those cells alive is challenging. A new bioelectronic implant can now support populations of three different drug-producing cells for more than a month. The researchers behind the result say it’s a step toward “living pharmacies” that can deliver a range of drugs on demand.</p><p>The approach uses genetic engineering to turn cells into living drug factories that pump out a class of medicines known as “biologics”—drugs derived from living organisms. The U.S. <a href="https://www.fda.gov/" rel="noopener noreferrer" target="_blank">Food and Drug Administration</a> has approved biologics targeting a wide range of conditions, including <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC12239671/" rel="noopener noreferrer" target="_blank">various cancers</a>, autoimmune diseases like <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC6857978/" rel="noopener noreferrer" target="_blank">arthritis</a> and <a href="https://www.psoriasis.org/current-biologics-on-the-market/" rel="noopener noreferrer" target="_blank">psoriasis</a>, and chronic conditions like <a href="https://aafa.org/asthma/asthma-treatment/biologics-asthma-treatment/" rel="noopener noreferrer" target="_blank">asthma</a> and <a href="https://www.crohnscolitisfoundation.org/sites/default/files/2025-03/Biologics%203.5.25.pdf" rel="noopener noreferrer" target="_blank">Crohn’s disease</a>.</p><p>For the approach to work, the cells need to stay alive long enough in the host’s body to produce the correct dose of the medicine. One of the biggest barriers is ensuring that the cells receive enough oxygen to thrive. 
The multi-institution team behind the latest development has created a bioelectronic implant the size of a thumb drive that houses drug-producing cells and also uses electrochemical reactions to provide a reliable supply of oxygen to them.</p><h2>Bioelectronic Implant Extends Cell Survival</h2><p>In a <a href="https://www.cell.com/device/abstract/S2666-9986(26)00058-X" target="_blank">recent paper</a> in <em>Device</em>, the team showed the implant could sustain three different strains of <a data-linked-post="2675304189" href="https://spectrum.ieee.org/cyborg-stem-cell-therapy-for-diabetes" target="_blank">engineered cells</a> for 31 days when implanted in rats, providing steady production of multiple drugs. While the implant remains a proof-of-concept, the long-term goal is to create a device that can control the timing and dosage of multiple therapies over extended periods, says <a href="https://www.mccormick.northwestern.edu/research-faculty/directory/profiles/rivnay-jonathan.html" rel="noopener noreferrer" target="_blank">Jonathan Rivnay</a>, a professor of biomedical engineering at Northwestern University.</p><p>“Imagine a device that’s a few millimeters that you can put under your skin, and it can serve this purpose of a multi-therapeutic living pharmacy that can last for months to years,” Rivnay says. “That would be game changing. I think we have a long way to go, but the kind of advances that we’re writing about in this article are laying the foundation for what that might look like.”</p><p>Access to oxygen is the main limitation for this kind of <a data-linked-post="2650275551" href="https://spectrum.ieee.org/smartphonecontrolled-cells-keep-diabetes-in-check" target="_blank">cell therapy</a>, says <a href="https://profiles.rice.edu/faculty/omid-veiseh" rel="noopener noreferrer" target="_blank">Omid Veiseh</a>, a professor of bioengineering at Rice University. 
The area directly under the skin, which is an attractive target for implants because it can be accessed using minimally invasive procedures, tends to be particularly poorly oxygenated.</p><p>One potential solution involves electrochemical approaches that convert water into oxygen and hydrogen. But these approaches have primarily been developed for industrial applications, and they don’t translate well to operating inside the body using its water. Specifically, they have high power requirements and potentially produce toxic byproducts like chlorine and hydrogen peroxide.</p><p>Previous work by the same researchers demonstrated a device that used a thin film of iridium oxide as a catalyst to generate oxygen, which enabled it to run at voltages between 1.6 and 1.9 volts (lower than other electrochemical reactions require) and minimized the creation of harmful byproducts. But the device still required an external power source.</p><h2>HOBIT Wireless Oxygenation Implant System</h2><p>Building on that work, the researchers have now built a device they call HOBIT (Hybrid Oxygenation Bioelectronics system for Implanted Therapy) that integrates an oxygenator, a chamber for housing cells, a wireless communication system to control oxygen production and transmit data, and an internal battery into a hermetically sealed implant just 4.5 centimeters long.</p><p>“I think the power is in the fully implantable nature of this platform,” says Chris Wright, a Ph.D. student at Rice. “You don’t need external power, you don’t need external devices that connect to it. That’s a big differentiator.”</p><p>The cells are encapsulated in permeable gel capsules that allow nutrients and drugs to pass through but prevent cells from escaping or being attacked by the body’s immune system.</p><p>The device is able to house drug-producing cells at a density as high as 60 million per milliliter. 
That density allowed the researchers to load three different engineered cell strains designed to produce an anti-HIV antibody, a hormone that regulates metabolism, and a peptide similar to the hormone GLP-1, which popular weight-loss drugs mimic.</p><p>These drugs all last different amounts of time in the body, but by balancing the ratios of the cells and controlling the oxygen supply, the researchers were able to maintain steady production of each drug therapy for 31 days. By the end of the trial, 64.6 percent of cells were still viable, compared with just 19.2 percent in a control device without an oxygenator.</p><p>This ability to produce several drugs at reliable levels over extended periods could significantly reduce the burden of administering complex multi-therapy treatment regimens, says Veiseh. The team is already working to apply the technology as part of a project funded by the Advanced Research Projects Agency for Health (ARPA-H) called <a href="https://arpa-h.gov/explore-funding/awards/646" rel="noopener noreferrer" target="_blank">THOR</a> (Targeted Hybrid Oncotherapeutic Regulation), which will produce multiple cancer-fighting immunotherapies with different half-lives in the abdomen.</p><p>Rivnay says that they hope to one day augment the device with sensors that can detect various biomarkers, as well as ways to control drug production using optogenetics and electrogenetics—methods for altering the genetic activity of cells using flashes of light or pulses of electricity, respectively. “All of those things layer onto a more complex living-pharmacy-type system, building that longer term vision of not only controlling dose but controlling exactly when you supply a dose,” he says.</p><p>One outstanding challenge will be getting approval from the FDA—the agency has yet to sanction a biohybrid device that combines both living and non-living components. 
But Rivnay remains confident that with the right approach they can win over regulators.</p><p>“It’s just a matter of showing that it’s safe and showing that it’s effective,” he says. “That’s why we have to start relatively simple and not throw all the bells and whistles at it straight away.”</p>]]></description><pubDate>Mon, 06 Apr 2026 12:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/biologic-drugs-implant-bioelectronics-medicine</guid><category>Drug-delivery</category><category>Bioelectronics</category><category>Medicine</category><dc:creator>Edd Gent</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-man-in-a-lab-coat-holding-a-miniature-device-in-his-hand-that-resembles-a-complex-flash-drive.jpg?id=65428662&amp;width=980"></media:content></item><item><title>Young Professional’s AI Tool Spots Mental Health Conditions</title><link>https://spectrum.ieee.org/abhishek-appaji-ai-diagnostic-tool</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/an-adult-indian-man-using-a-machine-to-capture-images-of-an-adult-womans-retina.jpg?id=65452299&width=1200&height=400&coordinates=0%2C729%2C0%2C730"/><br/><br/><p><a href="https://www.abhishekappaji.com/" rel="noopener noreferrer" target="_blank">Abhishek Appaji</a> has committed his career to bringing lifesaving technology to underresourced communities. The IEEE senior member weaves together artificial intelligence, biomedical engineering, deep learning, and neuroscience to make doctors’ jobs easier and to improve patient outcomes.</p><p>“The intersection of these fields is where the most impactful breakthroughs in diagnostic precision occur,” says Appaji, an associate professor of medical electronics engineering at the <a href="https://www.bmsce.ac.in/" target="_blank">B.M.S. College of Engineering</a>, in Bengaluru, India.</p><h3>Abhishek Appaji</h3><br/><p><strong>Employer</strong></p><p>B.M.S. College of Engineering, in Bengaluru, India</p><p><strong>Job title</strong></p><p>Associate professor of medical electronics engineering</p><p><strong>Member grade</strong></p><p>IEEE senior member</p><p><strong>Alma maters</strong></p><p>B.M.S. 
College of Engineering; University Visvesvaraya College of Engineering, in Bengaluru; Maastricht University, in the Netherlands</p><p>Many of his inventions have been deployed in remote areas of India, providing physicians with quality diagnostic tools, including an AI-powered machine that can scan retinas to detect medical conditions and a smart bed that continuously monitors a patient’s vital signs.</p><p>An active volunteer with the <a href="https://yp.ieee.org/" rel="noopener noreferrer" target="_blank">IEEE Young Professionals</a> <a href="https://yp.ieeebangalore.org/" rel="noopener noreferrer" target="_blank">Bangalore Section</a>, he has launched professional networking events, technology workshops, a mentorship program, and other initiatives.</p><p>For his “contributions to accessible AI-driven health care solutions and leadership in empowering young professionals,” Appaji is the recipient of this year’s <a href="https://corporate-awards.ieee.org/award/ieee-theodore-w-hissey-outstanding-young-professional-award/" rel="noopener noreferrer" target="_blank">IEEE Theodore W. Hissey Outstanding Young Professional Award</a>. The honor is sponsored by the <a href="https://ieeephotonics.org/" rel="noopener noreferrer" target="_blank">IEEE Photonics</a> and <a href="https://ieee-pes.org/" rel="noopener noreferrer" target="_blank">Power & Energy</a> societies as well as IEEE Young Professionals. The award is scheduled to be presented this month during the <a href="https://corporate-awards.ieee.org/event/laureate-forum-honors-ceremony-gala/" rel="noopener noreferrer" target="_blank">IEEE Honors Ceremony</a> in New York City.</p><p>“This award represents a significant milestone in my career,” Appaji says. 
“It validates my core belief that our success as engineers is not solely measured by research outcomes or publications but by the tangible impact we have on lives through accessible technology and the quality of the next generation of leaders we empower.”</p><h2>Developing a blood glucose measurement device</h2><p>After earning a bachelor’s degree in engineering from B.M.S. in 2010, he joined the school as a lecturer in its medical electronics engineering department. At the same time, he pursued a master’s degree in bioinformatics at the <a href="https://uvce.ac.in/" rel="noopener noreferrer" target="_blank">University Visvesvaraya College of Engineering</a>, also in Bengaluru. He graduated in 2013 and continued to teach at B.M.S.C.E.</p><p>Four years later, Appaji signed up for the <a href="https://openlearning.mit.edu/courses-programs/mit-bootcamps" rel="noopener noreferrer" target="_blank">MIT Global Entrepreneurship Bootcamp</a>, a two-week intensive hybrid program that includes webinars, online courses, and a five-day stay at MIT. It’s designed to give teams of aspiring entrepreneurs, innovators, and early-stage founders the structured mindset, tools, and frameworks they need to succeed.</p><p>Appaji says he discovered the program while researching opportunities in innovation.</p><p>“I had the technical expertise, but I needed a structured framework to transition my research from the laboratory to the market,” he says.</p><p>During the MIT boot camp, he and a team of four other participants were tasked with approaching a complex health care challenge. They developed a noninvasive blood glucose measurement device to manage gestational diabetes—a condition that causes high blood sugar and insulin resistance during pregnancy. 
When the program ended, Appaji and two of his Australia-based teammates continued their collaboration by founding <a href="https://au.linkedin.com/company/glucotekinc" rel="noopener noreferrer" target="_blank">Glucotek</a> in Brisbane, Australia.</p><p>Inspired to continue his research in health care technology, Appaji pursued a doctorate in mental health and neurosciences at <a href="https://www.maastrichtuniversity.nl/" rel="noopener noreferrer" target="_blank">Maastricht University</a>, in the Netherlands.</p><p>His <a href="https://cris.maastrichtuniversity.nl/en/publications/retinal-vascular-features-as-a-biomarker-for-psychiatric-disorder/" rel="noopener noreferrer" target="_blank">thesis</a> focused on computational methods to identify retinal vascular patterns.</p><p class="pull-quote">“The patterns we analyze—including the curvature of the vessels, the angles at which they branch out, and their dimensions—reveal the health of the microvascular system,” he says. “With conditions like schizophrenia and bipolar disorder, microvascular changes mirror neurovascular changes in the brain.”</p><p><span>“My journey has shown me that IEEE is much more than a professional society; it is a global platform that allows me to collaborate with a diverse network of experts to solve local humanitarian challenges.”</span></p><p>Examining and measuring the retinal vascular system offers physicians a noninvasive way to examine neural changes, which can be biomarkers for psychiatric illnesses, he says.</p><p>To bring his idea to life, he collaborated with an ophthalmologist, a psychiatrist, and colleagues from his engineering school to develop a screening device. They also created and trained the AI models that analyze retinal images.</p><p>Ideas from his thesis led to the creation of the Smart Eye Kiosk, an AI-powered tool that scans the network of small veins that deliver blood to the inner retina. The tool monitors stress levels and mental health. 
It also screens for basic eye diseases such as diabetic retinopathy, in which high blood sugar damages the retina’s blood vessels.</p><p>Retinal images also can reveal physiological changes in the brain associated with psychiatric disorders such as schizophrenia and bipolar disorder, Appaji says. The kiosk uses AI models to analyze measurements of the vasculature network, such as vessel thickness, which can be biomarkers for psychiatric conditions. Since mental illnesses can be linked to genetics, relatives of patients with schizophrenia and bipolar disorder were also invited to participate in a study funded by the <a href="https://dst.gov.in/cognitive-science-research-initiative-csri" target="_blank">Cognitive Science Research Initiative</a> of India’s Department of Science & Technology. The clinical data from this study can pave the way for earlier, more accurate diagnoses.</p><p>“The biological basis for this is fascinating,” Appaji says. “The retina is the only place in the human body where the central nervous system and the vascular system can be visualized directly and noninvasively. Anatomically, the retina is an extension of the posterior part of the brain. Therefore, physiological changes in the brain are often reflected in the eyes.”</p><p>This kiosk was developed in collaboration with <a href="https://www.ttsh.com.sg/" target="_blank">Tan Tock Seng Hospital</a> and <a href="https://www.ntu.edu.sg/" target="_blank">Nanyang Technological University</a>, with funding from the <a href="https://www.chi.sg/platformprogrammes/ourfundingprogrammes/ntfhip/" rel="noopener noreferrer" target="_blank">Ng Teng Fong Healthcare Innovation Program</a>.</p><p>He earned his Ph.D. in 2020 from Maastricht, and he received the Best Thesis Award from the university’s <a href="https://www.maastrichtuniversity.nl/research/mental-health-and-neuroscience-research-institute" rel="noopener noreferrer" target="_blank">Mental Health and Neuroscience Research Institute</a>. 
Appaji credits his time at the school for his multidisciplinary approach to developing medical devices.</p><p>“Having the perspectives of mentors from diverse fields was essential to help me move my research beyond theory into a data-driven diagnostic tool,” he says.</p><p>He was then named institutional coordinator of R&D at B.M.S. and later was promoted to be its head.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="An adult Indian man looking at a rectangular device in his hand, labeled “dozee”." class="rm-shortcode" data-rm-shortcode-id="bc22f80982f03961c7b5f5fd684014f2" data-rm-shortcode-name="rebelmouse-image" id="40db1" loading="lazy" src="https://spectrum.ieee.org/media-library/an-adult-indian-man-looking-at-a-rectangular-device-in-his-hand-labeled-u201cdozee-u201d.jpg?id=65452303&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Abhishek Appaji working on a smart bed sensor that continuously monitors a patient’s vital signs without the use of wires or wearable sensors.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Abhishek Appaji</small></p><h2>A wireless smart bed to monitor vital signs</h2><p>Appaji continues to develop technologies for patients who need them most. “I feel a deep need to bridge this gap and ensure innovations have a tangible impact on society,” he says. In addition to the Smart Eye Kiosk, he improved the performance of the sensors in smart beds that continuously monitor a patient’s vital signs without the use of wires or wearable sensors. The beds help hospital staff check on their patients in a noninvasive way.</p><p>The project was done in collaboration with health AI company <a href="https://www.dozeehealth.ai/" target="_blank">Dozee (Turtle Shell Technologies)</a> in Bengaluru. 
The system measures mechanical microvibrations produced by the body in response to the ejection of blood into the aorta, which occurs with each heartbeat. A thin, industrial-grade sensor sheet is placed underneath the mattress. Additional funding is being provided by India’s <a href="https://dst.gov.in" rel="noopener noreferrer" target="_blank">Department of Science and Technology</a>.</p><p>“These sensors are incredibly sensitive,” Appaji says. “They pick up minute mechanical tremors through the mattress material.”</p><p>The sensors detect the force of the patient’s heartbeat and the expansion and contraction of their chest during respiration. The vibrations are converted into electrical signals and analyzed using deep learning algorithms developed by Appaji and his team at the university in collaboration with Dozee.</p><p>The technology is used in more than 200 hospitals throughout India and in thousands of households, he says.</p><h2>Mentoring budding entrepreneurs </h2><p>Appaji is also executive director of the <a href="https://bigfoundation.org.in/" rel="noopener noreferrer" target="_blank">BMSreenivasiah Innovators Guild Foundation</a>, dedicated to nurturing entrepreneurial talent among students and faculty across the BMS group of Institutions. A not-for-profit company promoted by the BMS Education Trust, BIG Foundation provides a structured ecosystem for innovation, incubation, and startup growth.</p><p>There, Appaji mentors budding entrepreneurs, offering advice on business plans, product pitches, marketing strategies, and licensing. 
Participants are students and faculty members.</p><p>The foundation has incubated more than 10 ventures, according to Appaji.</p><p>“The majority are centered on health care applications,” he says, “and have successfully secured backing from investors and seed funds.”</p><h2>Taking IEEE’s mission to heart</h2><p>Appaji was introduced to IEEE as an undergraduate when one of his professors encouraged him to volunteer for a conference sponsored by the <a href="https://www.embs.org/" rel="noopener noreferrer" target="_blank">IEEE Engineering in Medicine and Biology Society</a>. He transcribed the seminars for session chairs, assisted with managing the talks, and helped answer attendees’ questions.</p><p>“That experience was transformative,” he recalls. “I was amazed to find myself in the same room with the speakers and scientists who had authored the very textbooks I was studying.</p><p>“It was then that I realized IEEE is far more than just technology and volunteering; it is a global platform for high-level networking with world-class scientists and technologists.”</p><p>Appaji has served in several IEEE leadership positions, including 2018–2019 chair of the Young Professionals Bangalore Section. 
He is now treasurer of the <a href="https://ieee-edusociety.org/home" rel="noopener noreferrer" target="_blank">IEEE Education Society</a>, chair of the <a href="https://ieeecsbangalore.org/" rel="noopener noreferrer" target="_blank">IEEE Computer Society Bangalore Chapter</a>, a member of the steering committee of <a href="https://ieee-dataport.org/" rel="noopener noreferrer" target="_blank">IEEE DataPort</a>, and a member of the IEEE <a href="https://www.ieee.org/communities/geographic-activities" rel="noopener noreferrer" target="_blank">Member and Geographic Activities</a> and <a href="https://ea.ieee.org/ea-programs" rel="noopener noreferrer" target="_blank">IEEE Educational Activities</a> boards.</p><p>“What motivates me to remain active within IEEE is the profound alignment between my personal goals and the organizational mission of advancing technology for the benefit of humanity,” he says. “My journey has shown me that IEEE is much more than a professional society; it is a global platform that allows me to collaborate with a diverse network of experts to solve local humanitarian challenges.”</p><p>The organization has helped fund some of Appaji’s lifesaving work. During the <a href="https://spectrum.ieee.org/tag/covid-19" target="_self">COVID-19 pandemic</a>, he received a grant from the <a href="https://ieeeht.org/" rel="noopener noreferrer" target="_blank">IEEE Humanitarian Technologies Board </a>and <a href="https://www.ieeer10.org/" rel="noopener noreferrer" target="_blank">Region 10</a> to develop <a href="https://spectrum.ieee.org/ieee-sections-receive-grants-for-their-innovative-ways-of-helping-to-fight-the-coronavirus" target="_self">3D-printed protective equipment</a> for people in Bengaluru’s underserved communities. The virus spread quickly in the high-density areas, where social distancing was nearly impossible. 
The kits, which included a door opener to avoid high-touch surfaces and an elbow-operated soap dispenser, were sent to nearly 500 households.</p><p>“This work remains one of my most meaningful contributions to humanitarian technology,” Appaji says, “demonstrating how engineering can be rapidly deployed to protect vulnerable populations during a global crisis.”</p><p>He advises younger IEEE members: “Say yes to taking on roles of responsibility. Don’t wait for a formal title to lead; instead, start by volunteering to do small, manageable tasks within your local chapter or section.</p><p>“The networking opportunities and leadership skills you gain through these early responsibilities will shape your professional career far more than any textbook ever could.”</p>]]></description><pubDate>Thu, 02 Apr 2026 18:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/abhishek-appaji-ai-diagnostic-tool</guid><category>Ieee-member-news</category><category>Health-care</category><category>Biomedical</category><category>Ieee-young-professionals</category><category>Ieee-awards</category><category>Type-ti</category><dc:creator>Amanda Davis</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/an-adult-indian-man-using-a-machine-to-capture-images-of-an-adult-womans-retina.jpg?id=65452299&amp;width=980"></media:content></item><item><title>Scientists Build Living Robots With Nervous Systems</title><link>https://spectrum.ieee.org/neurobot-living-robot-nervous-system</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/close-up-of-a-neuro-robot-that-has-been-stained-to-highlight-multi-ciliated-cells-around-its-periphery.jpg?id=65444408&width=1200&height=400&coordinates=0%2C417%2C0%2C417"/><br/><br/><p>Engineers have long tried to mimic life. They’ve built machine learning algorithms <a href="https://spectrum.ieee.org/topographic-neural-network" target="_self"><span><span>modeled after the human brain</span></span></a>, designed machines that <a href="https://spectrum.ieee.org/boston-dynamics-research-spot" target="_self"><span><span>walk like dogs</span></span></a> or <a href="https://spectrum.ieee.org/flying-robot-bug" target="_self"><span><span>fly like insects</span></span></a>, and taught robots to adapt, <a href="https://spectrum.ieee.org/video-friday-morphing-robots" target="_self"><span><span>however clumsily</span></span></a>, to the world around them.</p><p>Now they are skipping imitation altogether.</p><p>Instead of taking inspiration from biology, they are building robots out of it: fashioning tiny, <a href="https://spectrum.ieee.org/aidesigned-living-robots-crawl-heal-themselves" target="_self">free-swimming assemblages of living cells</a> that organize into self-directed systems, complete with neurons that wire themselves into functional circuits.</p><p>The result, <a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/advs.202508967" target="_blank">reported last month in <em>Advanced Science</em></a>, is what the researchers call a “neurobot.”</p><p>These living machines could help scientists better understand how simple neural networks give rise to complex behaviors, a foundational step toward building cyborg systems that integrate biological tissue with engineered control. 
And with further refinement, they could be put to use in applications ranging from precision tissue repair to environmental cleanup.</p><p>“My general reaction is, ‘Wow, this is amazing!’” says <a href="https://cbs.umn.edu/directory/kate-adamala" target="_blank">Kate Adamala</a>, a synthetic biologist at the University of Minnesota Twin Cities, who was not involved in the research. “This truly puts the engineering component into bioengineering.”</p><h2>Toward Internal Control</h2><p>Neurobots mark the latest advance in a <a href="https://journals.sagepub.com/doi/10.1089/soro.2022.0142" target="_blank">series of increasingly sophisticated biological machines</a> developed by Tufts University biologist <a href="https://allencenter.tufts.edu/our-team/michael-levin/" target="_blank">Michael Levin</a> and his collaborators.</p><p><a href="https://www.pnas.org/doi/10.1073/pnas.1910837117" target="_blank">First described in 2020</a>, these clusters of living cells, when removed from their normal developmental context and cultured in simple saline conditions, spontaneously self-organize in such a manner that they move and act in novel ways. Under the microscope, they look like irregular, translucent blobs of tissue, but their coordinated motion reveals an emergent order that is unlike anything found in the natural world.</p><p>“These things don’t occur naturally,” says <a href="https://www.binghamton.edu/ssie/people/profile.html?id=cgg" target="_blank">Carlos Gershenson</a>, a computer scientist at Binghamton University, State University of New York, who <a href="https://direct.mit.edu/artl/article/29/2/153/114834/Emergence-in-Artificial-Life?guestAccessKey=" target="_blank">studies artificial life</a> and complex systems but was not involved in the neurobot research. 
“They’re made with natural cells, but we’re the ones arranging them.”</p><p>The <a href="https://www.science.org/doi/full/10.1126/scirobotics.abf1571" target="_blank">earliest examples of this technology</a>, called xenobots, were built from frog-derived tissues and mainly from a single type of structural cell. Despite the simplicity of their construction, however, they could propel themselves through water using beating hair-like projections called cilia. They survived for days without added nutrients. And they could repair minor damage, all without any scaffolding materials or genetic manipulation. <a href="https://www.pnas.org/doi/10.1073/pnas.2112672118" target="_blank">Some could even self-replicate</a> by spontaneously sweeping up loose stem cells.</p><p>Still, for all the novelty of these biological machines, their behavior was essentially mechanical. Their movements were driven by anatomy and physics, not by anything resembling internal control. They could sense chemical cues, change direction accordingly, and even retain traces of past experiences, as <a href="https://www.biorxiv.org/content/10.64898/2026.03.17.712168v1" target="_blank">detailed in a preprint posted 17 March on <em>bioRxiv</em></a>.</p><p>But many other simple organisms—fungi, protists, and bacteria included—can do much the same. To achieve more flexible, coordinated behavior, the xenobots would need a way to integrate information across the body and dynamically direct their actions. 
Neurobots begin to provide that missing layer of control.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="f86434d62c5577170353478e6aeab577" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wrIpHUmYKBE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">Small tufts of hairlike cilia, combined with the neurobot’s nervous system, allow it to move on its own.</small> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Haleh Fotowat</small> </p><h2>Linking Neural Activity to Action</h2><p>Like earlier xenobots, neurobots are still built from frog cells, but they are now endowed with neurons that mature from partially differentiated stem cells. These nerve cells develop alongside structural tissues, forming branching connections throughout the autonomous beings. This means they can relay electrochemical signals from cell to cell.</p><p>And unlike other laboratory models of the nervous system—<a href="https://spectrum.ieee.org/organoid-intelligence-computing-on-brain" target="_self">brain organoids</a>, say, or <a href="https://spectrum.ieee.org/biological-computer-for-sale" target="_self">lab-on-a-chip</a> technologies—neurobots move. They swim, explore, and respond to their surroundings in ways that tie electrical signaling to observable movement, producing patterns of physical activity distinct from their non-neural counterparts.</p><p>Neurobots spend less time idling and more time exploring. They also trace looping and spiraling paths rather than repeating simple trajectories. 
And they respond differently to neuroactive drugs.</p><p>If the organizing principles that enable these internally guided motions and reflexes can now be deciphered, they could then be harnessed to produce more predictable biological functions, says <a href="https://wyss.harvard.edu/team/advanced-technology-team/haleh-fotowat/" target="_blank">Haleh Fotowat</a>, a neuroengineer from Harvard’s Wyss Institute for Biologically Inspired Engineering, who collaborated with Levin’s team on the study.</p><p>“We’re still very early in terms of understanding the system and its capabilities.” But once the scientists understand how the neurobots self-organize, she says, “then we can begin to engineer on top of that.”</p><p>Beyond the practical, neurobots also raise deeper epistemological questions about the nature of biological organization, notes Levin. “Where does form and function come from in the first place?” he asks. “When it’s not evolved and it’s not engineered, where do these patterns come from?”</p><p>“This is a model system for asking those kinds of questions,” Levin says—in frog and human constructs alike.</p><h2>From Discovery to Deployment</h2><p>Among the many variations on the biobot theme are “<a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/advs.202303575" target="_blank"><span><span>anthrobots</span></span></a><span>,</span>” built from clusters of human lung cells instead of frog tissue.</p><p>Levin’s team now plans to add human neural cells to their anthrobots, extending the neurobot framework into a fully human context. 
Then, through further conditioning and guided learning, these living machines—like <a href="https://spectrum.ieee.org/using-a-twopronged-approach-to-detect-explosive-substances-from-bombs" target="_self">dogs trained to sniff for bombs</a>—may become capable of adapting their behavior in predictable ways.</p><p>“The hope would be that you could teach them or train them to do what you want them to do,” says <a href="https://www.uvm.edu/cems/cs/profile/josh-bongard" target="_blank">Josh Bongard</a>, a computer scientist and roboticist at the University of Vermont.</p><p>Bongard was not involved in the neurobot study but is a frequent collaborator of Levin’s. Together, they cofounded the nonprofit <a href="https://icdorgs.org/" target="_blank">Institute for Computationally Designed Organisms</a> and a commercial startup, <a href="https://www.faunasystems.com/" target="_blank">Fauna Systems</a>, to advance biobot-related technologies.</p><p>According to Fauna CEO <a href="https://www.linkedin.com/in/naimish-patel-925a84" target="_blank">Naimish Patel</a>, the company is initially targeting environmental sensing applications, aiming to deploy xenobots in settings such as aquaculture, wastewater monitoring, and pollutant detection, where the technology’s ability to integrate multiple signals could provide an early readout of ecosystem health.</p><p> If the xenobots encounter a mixture of stressors—say, elevated heavy metals, shifts in pH, and traces of agricultural runoff—their collective changes in movement or activity could provide a sensitive, real-time signal that something in the environment is amiss. </p><p>Precedent for this idea comes from Poland, where many cities already use <a href="https://www.atlasobscura.com/articles/wild-life-excerpt-water-quality-mussels" target="_blank">freshwater mussels as living sentinels of water quality</a>, wired with sensors that register when the animals clamp their shells shut in response to pollutants. 
Xenobots could extend this concept further, Patel says, potentially offering greater sensitivity and specificity by integrating multiple environmental cues into a single, measurable behavioral response. And neurobots could eventually push this fusion of sensing and computation into ever more sophisticated territory, he adds.</p><p>But the technical hurdles remain substantial—and the practical opportunities with simpler, non-neural versions are already compelling—so the first-generation xenobots remain, for the time being, the focus of Fauna’s product-development efforts, Patel says. “Right now, we’re looking for the intersection between unmet commercial need and emerging capability.”</p>]]></description><pubDate>Thu, 02 Apr 2026 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/neurobot-living-robot-nervous-system</guid><category>Bioengineering</category><category>Frog</category><category>Living-cells</category><category>Biomimetics</category><category>Bioinspired-robots</category><dc:creator>Elie Dolgin</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/close-up-of-a-neuro-robot-that-has-been-stained-to-highlight-multi-ciliated-cells-around-its-periphery.jpg?id=65444408&amp;width=980"></media:content></item><item><title>What Exoskeletons Learned From One Relentless User</title><link>https://spectrum.ieee.org/exoskeleton-user-experience</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-man-wearing-a-full-body-robotic-exoskeleton-standing-on-a-city-sidewalk.png?id=65426945&width=1200&height=400&coordinates=0%2C1000%2C0%2C1001"/><br/><br/><p><strong>It’s easy to assume</strong> that Robert Woo was defined by the accident that took away his ability to walk.</p><p>Certainly, the day of his accident—14 December 2007—was a turning point. Woo, an architect working on the new Goldman Sachs headquarters in New York City, hadn’t attended his company’s holiday party the night before, and that morning he was the only one in the trailer that served as the construction-site office. He was bent over his laptop when, 30 floors above, a <a href="https://www.nydailynews.com/2007/12/14/west-side-crane-accident-injures-1-at-goldman-sachs-site/" target="_blank">crane’s nylon sling gave way</a>, sending about 6 tonnes of steel plummeting toward the trailer. The roof collapsed, folding Woo in half and smashing his face into his laptop, which crashed through his desk.</p><p>“I was conscious throughout the whole ordeal,” Woo remembers. “It was an out-of-body experience. I could hear myself screaming in pain. I could hear the voices of the rescue workers. I heard one firefighter say, ‘Don’t worry, we’re getting to you.’” The rescue workers hauled him out of the rubble and got him to the emergency room in 18 minutes flat; with one lung crushed and the other punctured, he wouldn’t have lasted much longer. In those frantic early moments, a doctor told him that he might be paralyzed from the neck down for the rest of his life. He remembers asking the doctors to let him die.</p><p>Woo simply couldn’t imagine how a paralyzed version of himself could continue living his life. Then 39 years old, he worked long hours and jetted around the world to supervise the construction of skyscrapers. More important, he had two young boys, ages 6 months and 2 years. 
“I couldn’t see having a life while being paralyzed from the neck down, not being able to teach my boys how to play ball,” he recalls. “What kind of life would that be?”</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="2986541a87f62bd11465a0fd835782ed" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UNddtkBGuAs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">Robert Woo walks inside the Wandercraft facility in New York City using the company’s latest self-balancing exoskeleton. </small> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Nicole Millman </small> </p><p><span>But in a Manhattan showroom last May, Woo showed that he’s not defined by that accident, which left him paralyzed from the chest down but with the use of his arms. Instead, he has defined himself by how he has responded to his injury, and the new life he built after it.</span></p><h3></h3><br/><div class="rblad-ieee_in_content"></div><p>In the showroom, Woo transferred himself from his wheelchair to an 80-kilogram (176-pound) exoskeleton suit. After strapping himself in, he manipulated a joystick in his left hand to rise from a chair and then proceeded to walk across the room on robotic legs. Woo’s steps were short but smooth, and he clanked as he walked.</p><p>This exoskeleton, from the French company <a href="https://en.wandercraft.eu/" target="_blank">Wandercraft</a>, is one of the first to let the user walk without arm braces or crutches, which most other models require to stabilize the user’s upper body. The battery-powered exoskeleton took care of both propulsion and balance; Woo just had to steer. 
The bulky apparatus had a backplate that extended above Woo’s head, a large padded collar, armrests, motorized legs, and footplates. Walking across the room, he appeared to be half man, half machine. On the other side of the showroom’s plate-glass window, on Park Avenue, a kid walking by with his family came to a dead halt on the sidewalk, staring with awe at the cyborg inside.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Person seated wearing a full lower-body robotic exoskeleton for mobility assistance" class="rm-shortcode" data-rm-shortcode-id="eeace6a9e987149ce383ccec6937a1b8" data-rm-shortcode-name="rebelmouse-image" id="73d05" loading="lazy" src="https://spectrum.ieee.org/media-library/person-seated-wearing-a-full-lower-body-robotic-exoskeleton-for-mobility-assistance.jpg?id=65427288&width=980"/></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Close-up of a hand operating the joystick and controls on a powered wheelchair armrest" class="rm-shortcode" data-rm-shortcode-id="c5dd0b296623bb32a2eb37c88ac0b5f0" data-rm-shortcode-name="rebelmouse-image" id="2d73d" loading="lazy" src="https://spectrum.ieee.org/media-library/close-up-of-a-hand-operating-the-joystick-and-controls-on-a-powered-wheelchair-armrest.jpg?id=65427286&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Robert Woo prepares to walk in a Wandercraft exoskeleton; the device’s controller enables him to stand up, initiate walk mode, and choose a direction. 
</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Bryan Anselm/Redux </small></p><p>The amazement on the boy’s face was reminiscent of Woo’s young sons’ reaction when they saw a photo of Woo trying out an early exoskeleton, back in 2011. “Their first comment was, ‘Oh, Daddy’s in an Iron Man suit,’” he remembers. Then they asked, “When are you going to start flying?” To which Woo replied, “Well, I’ve got to learn how to walk first.”</p><p>The title of exoskeleton superhero suits Woo. He’s as soft-spoken and mild-mannered as Clark Kent, with a smile that lights up his face. Yet the strength underneath is undeniable; he has built a new life out of sheer determination. </p><p>For 15 years, he’s been a test pilot, early adopter, and clinical-study subject for the most prominent exoskeletons under development around the world. He placed the first order for an exoskeleton that was approved for home use, and he learned what it was like to be Iron Man around the house. Throughout it all, he has given the companies detailed feedback drawn from both his architectural design skills and his user experience. He has shaped the technology from inside of it.</p><p><a href="https://people.njit.edu/profile/pal" target="_blank">Saikat Pal</a>, a researcher at the New Jersey Institute of Technology, in Newark, met Woo during clinical trials for Wandercraft’s first model. Like so many others in the field, Pal quickly recognized that Woo brought a lot to the table. “He’s a super-mega user of exoskeletons: very enthusiastic, very athletic,” Pal says. “He’s the perfect subject.”</p><p>By pushing the technology forward, Woo has paved the way for thousands of people with spinal cord injuries as well as other forms of paralysis, who are now benefiting from exoskeletons in rehab clinics and in their homes. 
“Our bionics program at Mount Sinai started with Robert Woo,” says <a href="https://profiles.mountsinai.org/angela-riccobonno" target="_blank">Angela Riccobono</a>, the director of rehabilitation neuropsychology at <a href="https://www.mountsinai.org/" target="_blank">Mount Sinai Hospital</a>, in New York City, where Woo became an outpatient after his accident. “We have a plaque that dedicates our bionics program to him.”</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="3b08ced9c1ebb53070cf467341ccabd1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/6kIvBtYeYUs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span><small class="image-media media-caption" placeholder="Add Photo Caption...">Robert Woo walks down a sidewalk in New York City in 2015 using a ReWalk exoskeleton, one of the first exoskeletons designed for use outside the rehab clinic. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Eliza Strickland</small></p><p>It’s a fitting tribute. Woo’s post-accident life has been marked by victories, frustrations, deep love, and one devastating loss, and yet he has continued to devote himself to bionics. And while his vision for exoskeletons hasn’t changed, experience has reshaped what he expects from them in his lifetime.</p><h2>Rebuilding a Life After his Spinal Cord Injury </h2><p>Long before Woo ever stood up in a robotic suit, he had developed the habits of mind that would later make him an unusually perceptive test pilot.</p><p>Woo has always been a builder, a tinkerer, a fixer. Growing up in the suburbs of Toronto, he put together model kits of battleships and airplanes without looking at the instructions. “I just put things together the way I thought it would work out,” he says. 
He trained as an architect and in 2000 joined the Toronto-based firm <a href="https://www.adamson-associates.com/" target="_blank">Adamson Associates Architects</a>, a job that soon had him traveling to Europe and Asia to work on corporate high-rises.</p><p>Adamson specializes in taking the stunning designs of visionary architects and turning them into practical buildings with elevators and bathrooms. “Most of the design architects don’t really have a clue about how to build buildings,” Woo says. He liked solving those problems; he liked reconciling beautiful designs with the stubborn reality of construction. That talent for understanding a structure from the inside and spotting the flaws would prove essential later.</p><p>After his accident, Woo had two major surgeries to stabilize his crushed spine, which required surgeons to cut through muscles and nerves that connected to his arms. For two months, he couldn’t feel or move his arms; there was a chance he never would again. Only when sensation began creeping back into his fingertips did he allow himself to imagine a different future. If he wasn’t paralyzed from the neck down, he thought, maybe more of his body could be brought back online. “My focus was to walk again,” he says.</p><p>Woo was discharged in March 2008 and went back to his New York City apartment. He was still bedridden and required around-the-clock care. He doesn’t much like to talk about this next part: By May, his then-wife had moved back to Canada and filed for divorce, asking for full custody of their two children. Woo remembers her saying, “I can’t look after three babies, and one of them for life.”</p><p>It was a dark time. Riccobono of Mount Sinai, who met Woo shortly after he became an outpatient there in 2008, recalls the despondent look on his face the first time they talked. “I wasn’t sure that he wasn’t going to take his life, to be honest,” she says. 
“He felt like he had nothing to live for.”</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="One photo shows a smiling man in an exoskeleton with his arm around a smiling woman. The other photo shows a metal plaque saying that the Rehabilitation Bionics Program was made possible by the advocacy and dedication of Robert Woo." class="rm-shortcode" data-rm-shortcode-id="24060627efe3d5ed4b5585e963e6cd34" data-rm-shortcode-name="rebelmouse-image" id="7a1d5" loading="lazy" src="https://spectrum.ieee.org/media-library/one-photo-shows-a-smiling-man-in-an-exoskeleton-with-his-arm-around-a-smiling-woman-the-other-photo-shows-a-metal-plaque-saying.jpg?id=65427290&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Angela Riccobono of Mount Sinai Hospital (left) credits Woo with jump-starting the hospital’s bionics program; a plaque in the department of rehabilitation medicine recognizes his role. </small></p><p>Yet Woo harbors no animosity toward his ex-wife. “If we hadn’t separated and gone through the custody hearing, I don’t think I would have gotten this far,” he says. To win partial custody of his children, Woo had to become independent. He had to get off narcotic pain medications, regain strength, and learn how to navigate life in a wheelchair. He had to show that he no longer needed constant nursing, and that he could take care of both himself and his boys.</p><p>There were milestones: learning how to get back into his wheelchair after a fall, learning to drive a car with hand controls, learning to manage his body as it was, not as it had been. The biggest change came when he reconnected with his high school sweetheart, a vivacious woman named Vivian Springer. She was then dividing her time between Toronto and New York City, and she had a son who was almost the same age as Woo’s two boys. 
Springer had worked in a nursing home and knew how to change the sheets without getting him out of bed; by then she was working in human resources and knew how to deal with insurance companies. “You wouldn’t believe how much stress it lifted off of me,” Woo says. Over time, they became a family.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Man using a robotic exoskeleton with support, shopping and standing with children." class="rm-shortcode" data-rm-shortcode-id="893ec3e7bbaf953f0fa9b20e639dd9a4" data-rm-shortcode-name="rebelmouse-image" id="54575" loading="lazy" src="https://spectrum.ieee.org/media-library/man-using-a-robotic-exoskeleton-with-support-shopping-and-standing-with-children.jpg?id=65427555&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Robert Woo’s wife, Vivian, was trained in how to operate the device he used at home. His sons, Tristan (left) and Adrien, grew up watching their dad test exoskeletons. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Left: Lifeward; Right: Robert Woo </small></p><p>Once Woo had that foundation in place, Riccobono witnessed a profound change. “He went from focusing on ‘what I can’t do anymore’ to ‘What’s still possible? What can I do with what I have?’” At Mount Sinai, Woo remembers asking his doctor <a href="https://profiles.mountsinai.org/kristjan-t-ragnarsson" target="_blank">Kristjan Ragnarsson</a>, who was then chairman of the department of rehabilitation medicine, if he would ever walk again. “His response was, ‘Yes, you can walk again,’” Woo remembers, “‘but not the way you used to walk.’”</p><h2>First Steps in an Exoskeleton </h2><p>As soon as he had regained use of his hands, Woo started googling, looking for anything that could get him back on his feet. 
He tried rehab equipment like the <a href="https://www.sralab.org/services/lokomat" target="_blank">Lokomat</a>, which used a harness suspended above a treadmill to enable users to walk. But at the time, it required three physical therapists: one to move each leg and one to control the machine. It was a far cry from the independent strides he dreamed of.</p><p>Several years in, he learned about two companies that had built something radically different: exoskeleton suits for people with spinal cord injuries. These prototypes had motors at the knees and the hips to move the legs, with the user stabilizing their upper body with arm braces. Woo desperately wanted to try one, although the technology was still experimental and far from regulatory approval. So he took the idea to Ragnarsson, asking if Mount Sinai could bring an exoskeleton into its rehab clinic for a test drive. Ragnarsson, who’s now retired, remembers the request well. “He certainly gave us the kick in the behind to get going with the technology,” he says.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Man in robotic exoskeleton walks with canes during rehab demo as clinicians observe" class="rm-shortcode" data-rm-shortcode-id="08a494fb70ca5c5d7c0e5a3bb263b28c" data-rm-shortcode-name="rebelmouse-image" id="16b99" loading="lazy" src="https://spectrum.ieee.org/media-library/man-in-robotic-exoskeleton-walks-with-canes-during-rehab-demo-as-clinicians-observe.jpg?id=65427556&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Robert Woo tries out an early exoskeleton from Ekso Bionics at Mount Sinai Hospital, where he first began testing the technology. 
</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Mario Tama/Getty Images</small></p><p>Ragnarsson had seen decades of failed attempts to get paraplegics upright, including “inflatable garments made of the same material the astronauts used when they went to the moon,” he says. All those devices had proved too tiring for the user; in contrast, the battery-powered exoskeletons promised to do most of the work. And he knew the CEO of <a href="https://eksobionics.com/" target="_blank">Ekso Bionics</a>, a Berkeley, Calif.–based company that had built exoskeletons for the military. In 2011, Ekso <a href="https://spectrum.ieee.org/goodbye-wheelchair-hello-exoskeleton" target="_blank">brought its new clinical prototype to Mount Sinai</a>.</p><p>The day came for Woo’s first walk. “I was excited, and I was also scared, because I hadn’t stood up for almost five years,” he remembers. “Standing up for the first time was like floating, because I couldn’t feel my feet.” In that first Ekso model, Woo didn’t control when he stepped forward; instead, he shifted his weight in preparation, and then a physical therapist used a remote control to trigger the step. Woo walked slowly across the room, using a walker to stabilize his upper body, his steps a symphony of clunks and creaks and whirs. He found it mentally and physically exhausting, but the effort felt like progress.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="996f7d01a8c62b70fe92b38fa003fe59" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/l-QJx8QWCyc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span><small class="image-media media-caption" placeholder="Add Photo Caption...">Robert Woo stands using an exoskeleton and embraces his wife, Vivian. 
Woo says that exoskeleton use has both physical and psychological benefits. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Mt. Sinai</small></p><p>Riccobono was there for those first steps, with tears running down her face. “I remembered how he looked the day I first met him, so defeated,” she says. “To see him rise from the chair, to see him rise to a standing position, to see how tall he was, to see him take those first steps—it was beautiful.” Ragnarsson saw clear benefits to the technology. “Any type of walking is good physiologically,” he says. “And it’s a tremendous boost psychologically to stand up and look someone in the eye.” Woo remembers hugging his partner, Springer, and for the first time not worrying about running over her toes with his wheelchair. I first met Woo a few days later, during his third session with the Ekso at Mount Sinai.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Two people stand outside; one uses blue exoskeleton crutches for mobility." class="rm-shortcode" data-rm-shortcode-id="69a52fa10854ff73f463efd70c6fbaac" data-rm-shortcode-name="rebelmouse-image" id="b81ad" loading="lazy" src="https://spectrum.ieee.org/media-library/two-people-stand-outside-one-uses-blue-exoskeleton-crutches-for-mobility.jpg?id=65427570&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Ann Spungen (left), a researcher at a Veterans Affairs hospital, led early clinical trials of exoskeletons. Her research focused on the medical benefits of exoskeleton use. 
</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Robert Woo </small></p><p>Later that same year, at a Department of Veterans Affairs (VA) hospital in the Bronx, Woo got to try a prototype of the world’s other leading exoskeleton: the <a href="https://golifeward.com/products/rewalkpersonal-exoskeleton/" target="_blank">ReWalk</a>, from the Israeli company of the same name (since renamed <a href="https://golifeward.com/" target="_blank">Lifeward</a>). VA researchers, led by <a href="https://www.linkedin.com/in/ann-spungen-3971b246/" target="_blank">Ann Spungen</a>, were keen to determine if exoskeleton use had real medical value for veterans with spinal cord injuries. Woo was part of <a href="https://clinicaltrials.gov/study/NCT01454570?lat=40.8673611&lng=-73.9065313&locStr=James%20J.%20Peters%20Department%20of%20Veterans%20Affairs%20Medical%20Center,%20West%20Kingsbridge%20Road,%20The%20Bronx,%20NY&distance=50&term=ReWalk&viewType=Card&rank=1" target="_blank">that clinical trial</a>, for which he had more than 70 walking sessions, and he’s since been in many others. But he remembers the first VA trial with the most gratitude. “Dr. Spungen’s first exoskeleton clinical trial really turned things around for me,” he says.</p><p>Over the course of the trial’s nine intense months, Woo says he saw noticeable improvements to many facets of his health. “By the end of the trial, I eliminated about three-quarters of my medication intake,” he says, including narcotic pain pills and medication for muscle spasms. He grew fitter, with <a href="https://www.sciencedirect.com/science/article/abs/pii/S1094695018300970" target="_blank">less body fat</a>, more muscle mass, and lower cholesterol. His circulation improved, he says, causing scrapes and cuts to heal more quickly, and his <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC7957745/" target="_blank">digestion improved too</a>. 
The results Woo experienced have generally been borne out in research studies at the VA and elsewhere—exoskeletons aren’t just good for the mind, they’re good for the body.</p><h2>Improving Exoskeletons From the Inside </h2><p>During the VA trial, Woo began to think of exoskeletons not as miraculous machines, but as works in progress.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Man wearing robotic exoskeleton and using crutches on a city sidewalk" class="rm-shortcode" data-rm-shortcode-id="c6e269240874c399dd042e63b52fc7f6" data-rm-shortcode-name="rebelmouse-image" id="8c60a" loading="lazy" src="https://spectrum.ieee.org/media-library/man-wearing-robotic-exoskeleton-and-using-crutches-on-a-city-sidewalk.jpg?id=65427579&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Pierre Asselin (right), a biomedical engineer, worked with Robert Woo during clinical trials of exoskeletons. He says Woo was always pushing the limits of the technology. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Robert Woo </small></p><p><a href="https://www.linkedin.com/in/pierre-asselin-195a0b4/" target="_blank">Pierre Asselin</a>, the biomedical engineer coordinating the VA’s study, watched participants respond very differently to the equipment. “These devices are not the equivalent of walking—you’re tired after walking a mile,” he says. He notes that later models of both the Ekso and ReWalk enabled users to initiate each step through software that recognized when they shifted their weight. 
Asselin adds that the cognitive load is “like learning to drive a manual transmission car, where at first you’re really struggling to coordinate the clutch and the brake.” Woo picked it up immediately, he remembers.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Man in a leg exoskeleton reaches into a kitchen cabinet while another observes." class="rm-shortcode" data-rm-shortcode-id="c537ce4f78539951c11063a9cb902729" data-rm-shortcode-name="rebelmouse-image" id="236cd" loading="lazy" src="https://spectrum.ieee.org/media-library/man-in-a-leg-exoskeleton-reaches-into-a-kitchen-cabinet-while-another-observes.jpg?id=65427582&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Robert Woo uses an exoskeleton to reach items in a kitchen cabinet during a test of the device’s utility for everyday tasks.  </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Eliza Strickland </small></p>Woo became an invaluable partner, Asselin says. “When we first started with the devices, there was no training manual. We developed all of that through collaboration with Robert and other participants.” Woo pushed the limits of the technology, Asselin says, whether it was seeing how many steps he could take on one battery charge or simulating a failure mode. “He’d say, ‘What happens if I was to fall? What would be the approach to getting up?’”<p><span>Woo approached the ReWalk the way he had approached buildings in his previous life: He looked inside the structure and found the weak points. An early model left some users with leg abrasions where the straps rubbed—a small injury for most people, but a serious risk for someone who can’t feel a wound forming. Woo suggested better padding and stronger abdominal supports to redistribute the load. 
He also hated the heavy backpack that carried the battery and computer, so one afternoon he grabbed an old pack, cut off the straps, and rebuilt it into a compact hip-mounted pouch. Then he snapped photos and sent them to the company. The next model arrived with a fanny pack.</span></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Hand-drawn concept sketch of a modular device labeled “ReWack 6.0” with notes and arrows" class="rm-shortcode" data-rm-shortcode-id="d0e09446b489c6a5f720b68d263450a3" data-rm-shortcode-name="rebelmouse-image" id="76e48" loading="lazy" src="https://spectrum.ieee.org/media-library/hand-drawn-concept-sketch-of-a-modular-device-labeled-u201crewack-6-0-u201d-with-notes-and-arrows.jpg?id=65427594&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Robert Woo sent detailed design sketches as part of his feedback to exoskeleton engineers.  </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Robert Woo </small></p><p>Sometimes his fixes were more ambitious. One Ekso unit that he used at Mount Sinai kept shutting down after 30 minutes. Woo felt the hip motors and found them hot to the touch. “I said, ‘Can I remove these? I’m going to make a really quick fix, okay? Give me a drill and I’ll put a couple of holes in it,’” he recalls telling the therapists, proposing to create a DIY heat sink. He wasn’t allowed to modify the prototype, but a year later the company introduced improved cooling around the hip motors. “There is a Robert Woo design on this device,” one therapist told him.</p><p><a href="https://www.linkedin.com/in/eythorbender/" target="_blank">Eythor Bender</a>, who was then the CEO of Ekso, called Woo to thank him for his feedback and invite him to spend a week at Ekso’s headquarters. 
“There was no lack of engineering power in that building,” says Bender. “But sometimes when you work with engineers, they overlook important things.” Bender says Woo brought both design skills and lived experience to his weeklong residency. “He told the engineers, ‘Guys, this has to be something that people actually like to wear.’”</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Patient in exoskeleton uses walker, flanked by doctor in lab coat and man in suit" class="rm-shortcode" data-rm-shortcode-id="0769a2526e44a9360c2a966a9839c4ee" data-rm-shortcode-name="rebelmouse-image" id="2e1fa" loading="lazy" src="https://spectrum.ieee.org/media-library/patient-in-exoskeleton-uses-walker-flanked-by-doctor-in-lab-coat-and-man-in-suit.jpg?id=65427643&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Ekso Bionics CEO Eythor Bender and Mount Sinai physician Kristjan Ragnarsson were both on hand for Woo’s early trials of the Ekso device. Ragnarsson says he saw physical and psychological benefits of exoskeleton use. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Robert Woo </small></p><p>The longer Woo tested, the further ahead he started thinking. With motors only at the hips and knees, every exoskeleton still required crutches. Add powered ankles, he told the Ekso and ReWalk teams, and the suits could balance themselves, freeing the user’s hands. But Woo was ahead of his time. “They said they weren’t going to do that. They weren’t going to change their whole platform,” he remembers. Years later, though, hands-free exoskeletons like those from Wandercraft would emerge built around exactly that principle.</p><h2>When the Exoskeleton Came Home </h2><p>By the mid-2010s, Woo had pushed the technology as far as he could in clinics. 
What he wanted now was to use an exoskeleton at home.</p><p>That milestone came after <a href="https://spectrum.ieee.org/rewalk-robotics-new-exoskeleton-lets-paraplegic-stroll-the-streets-of-nyc" target="_blank">ReWalk’s exoskeleton</a> became the first to win <a href="https://ir.rewalk.com/news-releases/news-release-details/rewalktm-personal-exoskeleton-system-cleared-fda-home-use" target="_blank">FDA approval for home use</a> in 2014. ReWalk engineers still remember Woo’s help on the final tests for that personal-use model. It was the end of May in 2015, recalls <a href="https://www.linkedin.com/in/david-hexner-8699413/" target="_blank">David Hexner</a>, the company’s vice president of research and development. “He said, ‘Guys, this is great. I’m going to buy it.’”</p><p>Woo was the first customer to buy an exoskeleton to bring home, paying US $80,000 out of pocket. His insurance wouldn’t cover the cost, but he was able to make the purchase in part because of a legal settlement after his accident. The home-use model came with a requirement that the user have at least one companion who was fully trained in operating the device. In Woo’s case, that meant that Springer learned to suit him up, realign his balance, and help him if he fell.</p><p>On delivery day, two SUVs drove up to a hotel down the street from Woo’s condo in the Toronto area. The technicians hauled two huge boxes into a hotel room and assembled his personal exoskeleton. They took Woo’s measurements, made adjustments, checked the software. This latest version could be controlled by either weight shifting or tapping commands on a smartwatch, and Woo had the app ready. He tested out everything in the hotel room, signed off, and then the technicians drove his robot legs to his home.</p><p>That was the start of his golden period with the ReWalk—similar to the excitement many people experience with a new piece of exercise equipment. 
“I used it every day for a few hours, and then I started logging how many steps I’d done,” Woo says. “My last count was probably just slightly over a million steps,” he adds, with half of those steps taken in his home unit and half in training programs and clinical trials.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Person using a ReWalk exoskeleton with crutches beside stacked ReWalk shipping boxes" class="rm-shortcode" data-rm-shortcode-id="3341315ea904071979a50c6d8ab999dd" data-rm-shortcode-name="rebelmouse-image" id="ddd70" loading="lazy" src="https://spectrum.ieee.org/media-library/person-using-a-rewalk-exoskeleton-with-crutches-beside-stacked-rewalk-shipping-boxes.jpg?id=65434618&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The ReWalk was the first exoskeleton available for use outside the clinic. Robert Woo’s ReWalk arrived in two large boxes. ReWalk engineers assembled it in a hotel room, and Woo tried it out in the hallway before taking it home.  </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Robert Woo</small></p><p>Tristan, Woo’s eldest son, remembers doing laps with his dad in the condo’s underground parking garage while his dad was training for a 5-kilometer race in New York City. Tristan admits that he had previously been embarrassed about his dad, but training for the race shifted something for him. “I was so used to not wanting to tell people that my dad was in a wheelchair, but then I shared his passion for the training,” he says. “When people would come up to us, I’d tell them about it.”</p><p>The ReWalk could turn ordinary moments into small engineering projects. On weekends, Woo would take his boys to the golf course behind their condo and bring a baseball. He had rigged two holsters to the sides of the suit so he could stash a crutch and stand on three points (two legs and one arm) while he pitched or caught. 
Throw, switch crutches, catch. On the day of his accident, he never thought such a scene would be possible. But with the exoskeleton, it became just another design problem to solve. “It’s a little more work. It’s not perfect,” he says. “But in the end, you still get to do what you want to do—which is play ball with your sons.”</p><p>Tristan, now a college student, says he didn’t realize at the time how hard his dad worked to make those mundane activities possible. “Reflecting on it now,” he says, “he has shaped almost every element of my life, and he definitely is my hero.”</p><p>But even during that golden stretch, the ReWalk had a way of asserting its limits. Every so often it would freeze mid-stride and require a reboot—a small technical hiccup in theory, but a serious problem when there’s a person strapped inside. Once, when he was walking on his own in the parking garage (without his mandated companion), the suit glitched and went into “graceful collapse” mode, lowering him to a seated position on the ground. Woo had to ask security to bring his wheelchair and a dolly.</p><p>He had imagined the exoskeleton would be most useful in the kitchen. Woo loves to cook, and he had pictured himself standing at the stove, looking down into pots, and moving easily between counter and sink. The reality, he found out, was more complicated. “It’s actually very time-consuming and troublesome” to cook in an exoskeleton, he says.</p><p>Preparing a meal meant first rolling through the kitchen in his wheelchair to gather every ingredient and utensil, then transferring himself into the ReWalk and moving himself into position at the counter, stopping at just the right moment. “That’s when I fell once,” Woo says. “I collided with the counter and then lost my balance and fell backward.” If all went well, he’d lean either on one crutch or the counter to keep his balance while he worked. 
But if he’d forgotten to grab the vinegar from the cabinet, he’d have to go into walk mode, crutch over to it, and figure out how to carry the bottle back to his workstation.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Powered exoskeleton suit and crutches positioned in a modern clinical room" class="rm-shortcode" data-rm-shortcode-id="a984e71926de8dd39f35b478e1bbe279" data-rm-shortcode-name="rebelmouse-image" id="6a40f" loading="lazy" src="https://spectrum.ieee.org/media-library/powered-exoskeleton-suit-and-crutches-positioned-in-a-modern-clinical-room.jpg?id=65434518&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Sitting unused in Robert Woo’s home, his ReWalk exoskeleton reflects both the promise and the limits of early devices.  </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Robert Woo</small></p><p>Gradually, he stopped trying. The suit, which he’d once worn every day, spent more time sitting idle in the hallway; like so many abandoned treadmills and stationary bikes, it gathered dust. Part of the reason was the exoskeleton’s practical limitations, but part of it was a shocking development: In 2024, Vivian was diagnosed with an aggressive form of breast cancer. She died in November of that year, at the age of 54.</p><p>Woo was scheduled to begin a new round of clinical trials for the Wandercraft home-use exoskeleton that month. In the aftermath of Vivian’s death, he postponed his sessions and questioned whether he would ever go back. “At the time, I thought, ‘What’s the point?’” he remembers.</p><p>He did go back, though. “He just rolled up, right into my office,” says Mount Sinai’s Riccobono. “He still had Vivian’s box of ashes on his lap. 
That’s how fresh it was.” Woo brought the box into a meeting of spinal cord injury patients and shared the story of losing the love of his life. And he told them that he heard his wife’s voice in his head every day, telling him to get back to work. Once again, he was figuring out how to move forward with what he had.</p><h2>How Close Are We to Everyday Exoskeletons? </h2><p>In the Wandercraft showroom last May, Woo steered toward the door to the street, technicians flanking him like spotters. The slope down to the sidewalk was barely an inch high, but everyone tensed. He shifted his weight and took a step forward. The suit halted automatically. He tried again—step, stop; step, stop—as the suit kept detecting the slight decline and a safety feature kicked in. The Wandercraft isn’t yet rated for slopes of more than 2 percent, and even the gentle pitch of Park Avenue was enough to trigger its safeguards. When he finally reached the sidewalk, Woo broke into a grin. A man in the back seat of a stopped Uber leaned out his window, filming.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Knee brace with straps and a leg showing a fresh, red incision scar." class="rm-shortcode" data-rm-shortcode-id="c7d7199f6643de021a7f81d6c256876e" data-rm-shortcode-name="rebelmouse-image" id="2235b" loading="lazy" src="https://spectrum.ieee.org/media-library/knee-brace-with-straps-and-a-leg-showing-a-fresh-red-incision-scar.jpg?id=65427649&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">During testing of the Wandercraft exoskeleton, straps caused an abrasion on Robert Woo’s leg, which he documented as part of his feedback to the company.   
</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Robert Woo </small></p><p>Woo had recently completed seven sessions with the Wandercraft at the VA hospital and had been impressed overall. But at the showroom, he rolled up his pants leg to reveal an abrasion on his shin, the result of a strap that had worn away a patch of skin during a long walking session. He would later send Wandercraft a nine-page assessment with photos and a technology wish list, asking the company to work on things like padding, variable walking speeds, and deeper squats.</p><p>Wandercraft’s engineers relish that kind of user feedback, says CEO <a href="https://www.linkedin.com/in/matthieu-masselin-64585537/" target="_blank">Matthieu Masselin</a>. Exoskeletons are a far more difficult engineering problem than humanoid robots, he explains. “You basically have two systems of equal importance. You know about the robot—it’s fully quantified and measured. But you don’t know what the person is doing, and how the person is moving within the device.”</p><p>Since Woo began testing exoskeletons 15 years ago, both the technology and the market have made strides. ReWalk and Ekso won FDA clearance for clinical use in the 2010s, and both now sell home-use versions. The companies have sold thousands of exoskeletons to rehab clinics and personal users, and they see room for growth; in the United States alone, about <a href="https://msktc.org/sites/default/files/Facts-and-Figures-2025-Eng-508.pdf" target="_blank">300,000 people live with spinal cord injuries</a>, and millions more have mobility impairments from stroke, multiple sclerosis, or other conditions. The VA began supplying devices to eligible veterans in 2015, and Medicare recently <a href="https://golifeward.com/blog/medicare-reimbursement-established-for-medically-eligible-beneficiaries/" target="_blank">established a system for reimbursement</a>, a move that private insurers are beginning to follow. 
What was once experimental is slowly becoming established.</p><p>Researchers who test the devices say the technology still has significant limits. Pal, of the New Jersey Institute of Technology, mentions battery life, dexterity, and reliability as ongoing challenges. But, he says with a laugh, “Our bodies have evolved over many millions of years—these machines will need a bit more time.” Pal hopes the companies will keep pushing the technological frontier. “My lifetime goal is to see the day when someone like Robert Woo can wake up in the morning, put this device on, and then live an ordinary life.”</p><p>For Woo, the real question about the self-balancing Wandercraft was: Could he cook with it? In the VA hospital’s home mockup, he tried it out in the kitchen, stepping sideways to retrieve items from cabinets and squatting to grab something from the fridge’s lower shelf. For the first time in years, he could work at a counter without leaning on crutches. “The self-standing exoskeleton changes everything,” he says. He imagines a user placing a Thanksgiving turkey on a tray attached to the suit and walking it into the dining room.</p><p>Back in the showroom, Woo finishes the demo and brings the suit to a seated position before transferring back to his wheelchair. After so many years of testing prototypes, he’s now realistic about the technology’s timeline. A truly all-day exoskeleton—the kind you live in, the kind that replaces a wheelchair—may be a decade or more away. “It may not be for me,” he says. But that’s no longer the point. He’s thinking about young people who are newly injured, who are lying in hospital beds and trying to imagine how their lives can continue. 
“This will give them hope.” <span class="ieee-end-mark"></span></p>]]></description><pubDate>Wed, 01 Apr 2026 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/exoskeleton-user-experience</guid><category>Bionics</category><category>Paralysis</category><category>Exoskeleton</category><category>Spinal-cord-injury</category><category>Assistive-technology</category><dc:creator>Eliza Strickland</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-man-wearing-a-full-body-robotic-exoskeleton-standing-on-a-city-sidewalk.png?id=65426945&amp;width=980"></media:content></item><item><title>Can Electrical Stimulation Restore Sight?</title><link>https://spectrum.ieee.org/optic-nerve-damage-electrical-stimulation</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/an-optic-nerve-wrap-recording-device-split-into-two-sections-by-a-central-gap-its-overall-structure-resembles-a-strip-of-photog.jpg?id=65327455&width=1200&height=400&coordinates=0%2C417%2C0%2C417"/><br/><br/><p>The optic nerve is like a high-speed fiber-optic cable between your eyes and your brain. But once that cable is cut, whether through trauma or <a href="https://spectrum.ieee.org/glaucoma-test" target="_blank">disease</a>, the nerve cannot be repaired and vision cannot be restored.</p><p>Some engineers are working to change that. </p><p><a href="https://jacobsschool.ucsd.edu/faculty/profile?id=351" rel="noopener noreferrer" target="_blank">Shadi Dayeh</a>, a professor of electrical and computer engineering at UC San Diego, has been developing a technology that could electrically stimulate and regenerate the optic nerve. His work is part of a multidisciplinary initiative called <a href="https://med.stanford.edu/ophthalmology/news-and-media/annual-reports/annualreport2024/vision-restoring-eye-transplants.html" rel="noopener noreferrer" target="_blank">VISION</a> (Viability, Imaging, Surgical, Immunomodulation, Ocular preservation, and Neuroregeneration) Strategies for Whole-Eye Transplant. The project aims to make vision-restoring, whole-eye transplantation a reality.</p><p>While <a href="https://nyulangone.org/news/worlds-first-whole-eye-partial-face-transplant-recipient-achieves-remarkable-recovery-viable-eye-one-year-after-landmark-surgery" rel="noopener noreferrer" target="_blank">whole-eye transplantation</a> was first achieved in 2023, the procedure cannot yet restore sight. Dayeh wants to make whole-eye transplantation “not only anatomically viable but also neurophysiologically useful,” he says. If he succeeds, transplant recipients will actually be able to see out of their new eye.</p><p>“The optic nerve is the main highway between the eye and the brain. 
It’s also one of the hardest pathways to repair,” Dayeh says. “So, from an engineering point of view, it’s a major challenge and a major opportunity.”</p><p>But before they can reconnect the optic nerve to the brain, Dayeh’s team first has to understand how these two parts of our bodies communicate. Recently, the team completed what Dayeh calls “a foundational step”: mapping how changes in light, color, and frequency affect the visual axis, from the retina to the optic nerve and the brain.</p><h2>Learning a visual language</h2><p>The optic nerve is small, but mighty. </p><p>An average adult’s optic nerve is only about 4.5 to 5 centimeters long and roughly 0.5 centimeter wide. But a cross-section of the optic nerve holds over a million axons, the threadlike projections of nerve cells that conduct electrical impulses.</p><p>“The optic nerve is very small and delicate,” Dayeh says. “It’s a densely packed cable that carries an enormous information bandwidth—probably the densest bandwidth cable in our nervous system.”</p><p>To understand exactly how this delicate cable transmits visual information, Dayeh’s team has developed biocompatible electrode arrays that wrap around the optic nerve and sit on the visual cortex (the part of the brain that processes visual information) in animal and cadaver studies.</p><p>The arrays send electrical pulses across the visual pathway, from the optic nerve to the brain, and record the eye’s and brain’s responses to electrical and visual stimulation. This means the team can see how the optic nerve reads certain visual signals—such as changes in light, color, and contrast—how the optic nerve sends these messages to the brain, and how the brain interprets them. 
</p><p>“It’s like a distributed set of sensors in a communication system,” Dayeh says.</p><p>As the technology collects high-resolution data, the team maps the optic nerve and visual cortex to understand what Dayeh calls “the language of the visual pathway”—how visual signals get encoded in the optic nerve and represented in the visual cortex. “The idea is not just to record, but to build a code book across the visual pathway.”</p><p>The optic nerve isn’t a straight, uniform cylinder. Its diameter varies along its curving structure. That’s why Dayeh’s team developed electrode arrays that are ultrathin and flexible, ensuring stable placement, “like an electronic skin on the surface of the neural tissue,” Dayeh says.</p><p>Adding to the difficulty is the very tricky matter of delivering electrical charge to optic and brain tissue. “The visual system is not like a muscle that you can electrically shock and then see what happens,” Dayeh says. </p><p>To avoid heating the tissue, Dayeh’s system maintains careful control of the density and spatial spread of the electrical charges. “The thermal load is very important for safety,” he says. “Much of our earlier engineering work went into electrode materials and geometries that can inject charge effectively and safely.”</p><h2>Regenerating the optic nerve</h2><p>Understanding the visual pathway’s language is one piece of a larger puzzle. Now that they have successfully mapped optic nerve and visual cortex signals, Dayeh’s team is investigating how their technology can help a severed optic nerve regenerate.</p><p>To that end, the electrode interface technology very precisely applies and records controlled, localized electrical stimulation to the optic nerve in order to determine where and how much stimulation can spur regeneration.</p><p>“The stimulation is not a magic switch,” Dayeh explains. 
“It’s a precision tool that assists and accelerates the biological processes of regenerating the neural pathway.”</p><p>Dayeh’s work contributes to several efforts aimed at restoring sight, which he considers “one of the most ambitious challenges in regenerative medicine and neurotechnology.” While Dayeh’s team measures, maps, and potentially guides the reconnection between the eye and the brain, other approaches include neuroprotection, or preserving the vision cells and circuits before they’re lost, and <a href="https://spectrum.ieee.org/neuralink-blindsight" target="_blank">visual prosthetics</a> and neural bypass systems, which restore sight by delivering information directly to the retina, optic nerve, or visual cortex when the natural pathway cannot function.</p><p>Dayeh cautions that optic nerve regeneration is a developing field, and much is as yet unknown. Still, research has shown that, when active, cells can survive longer and can better integrate with surrounding tissue. Dayeh’s technology activates cells electrically. <span>“In a simple sense,” he says, “our goal is to activate the cells so they survive longer.”</span></p><p>For now, optic-nerve regeneration technology is being tested in animals to show that a cut optic nerve can grow axons to the brain and restore vision. 
Dayeh anticipates that in perhaps three years, after rigorous tests of the technology’s efficacy and safety, it could be studied for the first time in humans.</p>]]></description><pubDate>Thu, 26 Mar 2026 13:00:06 +0000</pubDate><guid>https://spectrum.ieee.org/optic-nerve-damage-electrical-stimulation</guid><category>Vision</category><category>Nerve-stimulation</category><category>Eyesight-technologies</category><category>Organ-transplants</category><category>Neurotechnology</category><dc:creator>Novid Parsi</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/an-optic-nerve-wrap-recording-device-split-into-two-sections-by-a-central-gap-its-overall-structure-resembles-a-strip-of-photog.jpg?id=65327455&amp;width=980"></media:content></item><item><title>How Your Virtual Twin Could One Day Save Your Life</title><link>https://spectrum.ieee.org/living-heart-project-virtual-twins</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/two-color-coded-computer-simulations-of-a-human-heart-the-simulation-on-the-left-shows-the-muscle-structure-and-the-simulation.png?id=65278129&width=1200&height=400&coordinates=0%2C415%2C0%2C416"/><br/><br/><p><strong>One morning in May </strong>2019, a cardiac surgeon stepped into the operating room at Boston Children’s Hospital more prepared than ever before to perform a high-risk procedure to rebuild a child’s heart. The surgeon was experienced, but he had an additional advantage: He had already performed the procedure on this child dozens of times—virtually. He knew exactly what to do before the first cut was made. Even more important, he knew which strategies would provide the best possible outcome for the child whose life was in his hands.</p><p>How was this possible? Over the prior weeks, the hospital’s surgical and cardio-engineering teams had come together to build a fully functioning model of the child’s heart and surrounding vascular system from MRI and CT scans. They began by carefully converting the medical imaging into a 3D model, then used physics to bring the 3D heart to life, creating a dynamic <a href="https://spectrum.ieee.org/virtual-hearts-improve-cardiac-surgery" target="_self">digital replica</a> of the patient’s physiology. The mock-up reproduced this particular heart’s unique behavior, including details of blood flow, pressure differentials, and muscle-tissue stresses.</p><p>This type of model, known as a virtual twin, can do more than identify medical problems—it can provide detailed diagnostic insights. In Boston, the team used the model to predict how the child’s heart would respond to any cut or stitch, allowing the surgeon to test many strategies to find the best one for this patient’s exact anatomy.</p><p>That day, the stakes were high. 
With the patient’s unique condition—a heart defect in which large holes between the atria and ventricles were causing blood to flow between all four chambers—there was no manual or textbook to fully guide the doctors. The condition strains the lungs, so the doctors planned an open-heart surgery to reroute deoxygenated blood from the lower body directly to the lungs, bypassing the heart. Typically with this kind of surgery, decisions would be made on the fly, under demanding conditions, and with high uncertainty. But in this case, the plan had been tested in advance, and the entire team had rehearsed it before the first incision. The surgery was a complete success.</p><p>Such procedures have become routine at the Boston hospital. Since that first patient, nearly 2,000 procedures have been guided by virtual-twin modeling. This is the power of the technology behind the <a href="https://www.3ds.com/3dexperiencelab/portfolio/living-heart" rel="noopener noreferrer" target="_blank">Living Heart Project</a>, which I launched in 2014, five years before that first procedure. The project started as an exploratory initiative to see if modeling the human heart was possible. Now with more than 150 member organizations across 28 countries, the project includes dozens of multidisciplinary teams that regularly use multiscale virtual twins of the heart and other vital organs.</p><p>This technology is reshaping how we understand and treat the human body. To reach this transformative moment, we had to solve a fundamental challenge: building a digital heart accurate enough—and trustworthy enough—to guide real clinical decisions.</p><h2>A father’s concern</h2><p>Now entering its second decade, the Living Heart Project was born in part from a personal conviction. 
For many years, I had watched helplessly as my daughter Jesse faced endless diagnostic uncertainty due to a <a href="https://doi.org/10.1016/B978-1-4557-0599-3.00039-9" rel="noopener noreferrer" target="_blank">rare congenital heart condition</a> in which the position of the ventricles is reversed, threatening her life as she grew. As an engineer, I understood that the heart was an array of pumping chambers, controlled by an electrical signal, with its blood flow carefully regulated by valves. Yet I struggled to grasp the unique structure and behavior of my daughter’s heart well enough to contribute meaningfully to her care. Her specialists knew the bleak forecast children like her faced if left untreated, but because every heart with her condition is anatomically unique, they had little more than their best guesses to guide their decisions about what to do and when to do it. With each specialist, a new guess.</p><p>Then my engineering curiosity sparked a question that has guided my career ever since: Why can’t we simulate the human body the way we <a href="https://spectrum.ieee.org/selfdriving-cars-learn-about-road-hazards-through-augmented-reality" target="_self">simulate a car</a> or a plane?</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="woman facing away and looking at a wall where the simulated interior of a heart is projected" class="rm-shortcode" data-rm-shortcode-id="442abe00bb6d81b4be0ad13e4ec3880e" data-rm-shortcode-name="rebelmouse-image" id="09f25" loading="lazy" src="https://spectrum.ieee.org/media-library/woman-facing-away-and-looking-at-a-wall-where-the-simulated-interior-of-a-heart-is-projected.png?id=65301974&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">At a visualization center in Boston, VR imagery helps the mother of a young girl with a complex heart defect understand the inner workings of her child’s heart. 
</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Dassault Systèmes</small></p><p>I had spent my career developing powerful computational tools to help engineers build digital models of complex mechanical systems, ranging from the interactions of individual atoms to the components of entire vehicles. What most of these models had in common was the use of physics to predict behavior and optimize performance. But in medicine today, those same physics-based approaches rarely inform decision-making. In most clinical settings, treatment decisions still hinge on judgments drawn from static 2D images, statistical guidelines, and retrospective studies.</p><p>This was not always the case. Historically, physics was central to medicine. The word “physician” itself traces back to the Latin <em>physica</em>, which translates to “natural science.” Early doctors were, in a sense, applied physicists. They understood the heart as a pump, the lungs as bellows, and the body as a dynamic system. To be a physician meant you were a master of physics as it applied to the human body.</p><p>As medicine matured, biology and chemistry grew to dominate the field, and physics was left behind. But for patients like my daughter, that child in Boston, and millions like them, outcomes are governed by mechanics. No pill or ointment—no chemistry-based solution—would help, only physics. 
While I did not realize it at the time, virtual twins can reunite modern physicians with their roots, using engineering principles, simulation science, and artificial intelligence.</p><h2>A decade of progress</h2><p>The Living Heart Project’s concept was simple: Could we combine what hundreds of experts across many specialties knew about the human heart to build a digital twin accurate enough to be trusted, flexible enough to personalize, and predictive enough to guide clinical care?</p><p>We invited researchers, clinicians, device and drug companies, and government regulators to share their data, tools, and knowledge toward a common goal that would lift the entire field of medicine. The Living Heart Project launched with a dozen or so institutions on board. Within a year, we had created the first fully functional virtual twin of the human heart.</p><p>The Living Heart was not an anatomical rendering, tuned to simply replicate what we observed. It was a first-principles model, coupling the network of fibers in the <a href="https://spectrum.ieee.org/medtronics-cardioinsight-electrode-vest-maps-hearts-electrical-system" target="_self">heart’s electrical system</a>, the biological battery that keeps us alive, with the heart’s mechanical response, the muscle contractions that we know as the heartbeat.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="85d721660928d134fc0039fb17d76716" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ae_IqlxgCME?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">The Living Heart virtual twin simulates how the heart beats, offering different views to help scientists and doctors better predict how it will respond to disease or treatment. 
The center view shows the fine engineering mesh, the detailed framework that allows computers to model the heart’s motion. The image on the right uses colors to show the electrical wave that drives the heartbeat as it conducts through the muscle, and the image on the left shows how much strain is on the tissue as it stretches and squeezes. </small> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Dassault Systèmes</small> </p><p>Academic researchers had long explored computational models of the heart, but those projects were typically limited by the technology they had access to. Our version was built on industrial-grade simulation software from <a href="https://www.3ds.com/" target="_blank">Dassault Systèmes</a>, a company best known for modeling tools used in aerospace and automotive engineering, where I was working to develop the engineering simulation division. This platform gave teams the tools to personalize an individual heart model using the patient’s MRI and CT data, blood-pressure readings, and echocardiogram measurements, directly linking scans to simulations.</p><p>Surgeons then began using the Living Heart to model procedures. Device makers used it to design and test implants. Pharmaceutical companies used it to evaluate drug effects such as toxicity. Hundreds of publications have emerged from the project, and because they all share the same foundation, the findings can be reproduced, reused, and built upon. With each application, the research community’s understanding of the heart snowballed.</p><p>Early on, we also addressed an essential requirement for these innovations to make it to patients: regulatory acceptance. Within the project’s first year, the U.S. Food and Drug Administration <a href="https://www.3ds.com/newsroom/press-releases/dassault-systemes-signs-research-agreement-food-and-drug-administration-its-living-heart-project" target="_blank">agreed to join the project</a> as an observer. 
Over the next several years, methods for using virtual-heart models as scientific evidence began to take shape within regulatory research programs. In 2019, we formalized a second five-year collaboration with the FDA’s Center for Devices and Radiological Health with a specific goal.</p><p>That goal was to use the heart model to create a virtual patient population and re-create a pivotal trial of a previously approved device for repairing the heart’s mitral valve. This helped our team learn how to create such a population, and let the FDA experiment with evaluating virtual evidence as a replacement for evidence from flesh-and-blood patients. In August 2024, we <a href="https://pubmed.ncbi.nlm.nih.gov/39188879/" target="_blank">published the results</a>, creating the first FDA-led guidelines for in silico clinical trials and establishing a new paradigm for streamlining and reducing risk in the entire clinical-trial process.</p><p>In 10 years, we went from a concept that many people doubted could be achieved to regulatory reality. But building the heart was only the beginning. Following the template set by the heart team, we’ve expanded the project to develop virtual twins of other organs, including the lungs, liver, brain, eyes, and gut. Each corresponds to a different medical domain, which has its own community, data types, and clinical use cases. Working independently, these teams are progressing toward a breakthrough in our understanding of the human body: a multiscale, modular twin platform where each organ twin could plug into a unified virtual human.</p><h2>How a digital twin of the heart is constructed</h2><p>A cardiac digital twin starts with medical imaging, typically MRI, CT, or both. The slices are reconstructed into the 3D geometry of the heart and connected vessels. 
The geometry of the whole organ must then be segmented into its constituent parts, so each substructure—atria, ventricles, valves, and so on—can be assigned its unique properties.</p><p>At this point, the object is converted to a functional, computational model that can represent how the various cardiac tissues deform under load—the mechanics. The complete digital twin model becomes “living” when we integrate the electrical fiber network that drives mechanical contractions in the muscle tissue.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="two computer simulations of a heart. The simulation on left shows the left ventricle with a triangular grid across the 3D surface. The simulation on right shows the exterior of a heart including vasculature and fat. " class="rm-shortcode" data-rm-shortcode-id="8b175dd3f95e87ac7f36ab39b38f9784" data-rm-shortcode-name="rebelmouse-image" id="deda7" loading="lazy" src="https://spectrum.ieee.org/media-library/two-computer-simulations-of-a-heart-the-simulation-on-left-shows-the-left-ventricle-with-a-triangular-grid-across-the-3d-surfac.png?id=65301904&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Each part of the heart, such as the left ventricle [left], is superimposed with a detailed digital mesh to re-create its physiology. These pieces come together to form an anatomically accurate rendering of the whole organ [right].</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Dassault Systèmes</small></p><p>To simulate circulation, the twin adds computational models of hemodynamics, the physics of blood flow and pressure. The model is constrained by boundary conditions of blood flow, valve behavior, and vascular resistance set to closely match human physiology. 
This lets the model predict blood flow patterns, pressure differentials, and tissue stresses.</p><p>Finally, the model is personalized and calibrated using available patient data, such as how much the volume of the heart chambers changes during the cardiac cycle, pressure measurements, and the timing of electrical pulses. This means the twin reflects not only the patient’s anatomy but how their specific heart functions.</p><h2>Building bigger cohorts with generative AI</h2><p>When the <a href="https://discover.3ds.com/fda-enrichment-clinical-trial" target="_blank">FDA in silico clinical trial initiative</a> launched in 2019, the project’s focus shifted from these handcrafted virtual twins of specific patients to cohorts large enough to stand in for entire trial populations. That scale is feasible today only because virtual twins have converged with generative AI. Modeling thousands of patients’ responses to a treatment or projecting years of disease progression is prohibitively slow with conventional digital-twin simulations. Generative AI removes that bottleneck.</p><p>AI boosts the capability of virtual twins in two complementary ways. First, machine learning algorithms are unrivaled at integrating the patchwork of imaging, sensor, and clinical records needed to build a high-fidelity twin. The algorithms rapidly search thousands of model permutations, benchmark each against patient data, and converge on the most accurate representation. Workflows that once required months of manual tuning can now be completed in days, making it realistic to spin up population-scale cohorts or to personalize a single twin on the fly in the clinic.</p><p>Second, enriching AI models’ training sets with data from validated virtual patients grounds the AI simulations in physics. By contrast, many conventional AI predictions for patient trajectories rely on statistical modeling trained on retrospective datasets. 
Such models can drift beyond physiological reality, but virtual twins anchor predictions in the laws of hemodynamics, electrophysiology, and tissue mechanics. This added rigor is indispensable for both research and clinical care—especially in areas where real-world data are scarce, whether because a disease is rare or because certain patient populations, such as children, are underrepresented in existing datasets.</p><h2>Enabling in silico clinical trials</h2><p>On the research side, the FDA-sponsored In Silico Clinical Trial Project that we completed in 2024 opened a new world for medical innovations. A conventional clinical trial may take a decade, and 90 percent of new drug treatments fail in the process. Virtual twins, combined with AI methods, allow researchers to design and test treatments quickly in a simulated human environment. With a small library of virtual twins, AI models can rapidly create expansive virtual patient cohorts to cover any subset of the general population. As clinical data becomes available, it can be added into the training set to increase reliability and enable better predictions.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="3D simulations of the brain, foot, and lungs. A quadrant of the brain is cut out, showing a dense network of connections between color-coded sections of the brain. The foot shows a gray outline of bones and points of soft tissue strain in red at the ankle and heel. In the lung model, the trachea is colored green flowing into blue bronchi. 
" class="rm-shortcode" data-rm-shortcode-id="6c65f028c501081d47120dbb37f2d816" data-rm-shortcode-name="rebelmouse-image" id="90af6" loading="lazy" src="https://spectrum.ieee.org/media-library/3d-simulations-of-the-brain-foot-and-lungs-a-quadrant-of-the-brain-is-cut-out-showing-a-dense-network-of-connections-between.png?id=65302220&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The Living Heart Project has expanded beyond the heart, modeling organs throughout the body. The 3D brain reconstruction [top] shows major pathways in the brain’s white matter connecting color-coded regions of the brain. The lung virtual twin [middle] combines the organ’s geometry with a physics-based simulation of air flowing down the trachea and into the bronchi. And the cross section of a patient’s foot [bottom] shows points of strain in the soft tissue when bearing weight. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Dassault Systèmes</small></p><p>Virtual twin cohorts can represent a realistic population by building individual “virtual patients” that vary by age, gender, race, weight, disease state, comorbidities, and lifestyle factors. These twins can be used as a rich training set for the AI model, which can expand the cohort from dozens to hundreds of thousands. Next the virtual cohort can be filtered to identify patients likely to respond to a treatment, increasing the chances of a successful trial for the target population.</p><p>The trial design can also include a sampling of patient types less likely to respond or with elevated risk factors, thus allowing regulators and clinicians to understand the risks to the broader population without jeopardizing overall trial success. 
This methodology enhances precision and efficiency in clinical research, providing population-level insights previously available only after many years of real-world evidence.</p><p>Of course, though today’s heart digital twins are powerful, they’re not perfect replicas. Their accuracy is bounded by three main factors: what we can measure (for example, image resolution or the uncertainty of how tissue behaves in real life), what we must assume about the physiology, and what we can validate against real outcomes. Many inputs, like scarring, microvascular function, or drug effects, are difficult to capture clinically, so models often rely on population data or indirect estimation. That means predictions can be highly reliable for certain questions but remain less certain for others. Additionally, today’s digital twins lack validation for predicting long-term outcomes years in the future, because the technology has been in use for only a few years.</p><p>Over time, each of these limitations will steadily shrink. Richer, more standardized data will tighten personalization of the models. AI tools will help automate labor-intensive steps. And the collection of longitudinal data will improve the model’s ability to reliably predict how the body will evolve over time.</p><h2>How virtual twins will change health care</h2><p>Throughout modern medicine, new technologies have sharpened our ability to <a href="https://spectrum.ieee.org/ai-doctor" target="_self">diagnose</a>, providing ever-clearer images, lab data, and analytics that tell physicians what is presently happening inside a patient’s body. Virtual twins shift that paradigm, giving clinicians a predictive tool.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="gif of a lung simulation. The lungs are blue when deflated then grow and become green with points of red. 
" class="rm-shortcode" data-rm-shortcode-id="99cdfc0b66a34d7bf081125259464d73" data-rm-shortcode-name="rebelmouse-image" id="499fe" loading="lazy" src="https://spectrum.ieee.org/media-library/gif-of-a-lung-simulation-the-lungs-are-blue-when-deflated-then-grow-and-become-green-with-points-of-red.gif?id=65302107&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">This “Living Lung” virtual-twin simulation shows strain patterns during breathing. </small> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">            Mona Eskandari/UC Riverside        </small> </p><p>Early demonstrations are already appearing in many areas of medicine, including cardiology, orthopedics, and oncology. Soon, doctors will also be able to collaborate across specialties, using a patient-specific virtual twin as the common ground for discussing potential interactions or side effects they couldn’t predict independently.</p><p>Although these applications will take some time to become the standard in clinical care, more changes are on the horizon. Real-time <a href="https://spectrum.ieee.org/wearable-health-data-standards" target="_self">data from wearables</a>, for example, could continuously update a patient’s personalized virtual twin. This approach could empower patients to understand and engage more deeply in their care, as they could see the direct effects of medical and lifestyle changes. In parallel, their doctors could get comprehensive data feeds, using virtual twins to monitor progress.</p><p><span>Imagine a digital companion that shows how your particular heart will react to different amounts of salt intake, stress, or sleep deprivation. Or a visual explanation of how your upcoming surgery will affect your circulation or breathing. 
Virtual twins could demystify the body for patients, fostering trust and encouraging proactive health decisions.</span></p><h3>How are virtual twins being used in medicine?</h3><br/><ul><li>Virtual twins have guided <strong>cardiovascular surgeries</strong>, providing predictions and exposing hidden details that even expert clinicians might miss, such as subtle tissue responses and flow dynamics.<br/></li><li><strong>Oncologists</strong> are modeling tumor growth and the body’s response to different therapies, reducing the uncertainty in choosing the best treatment path for both medical and quality-of-life metrics.<br/></li><li><strong>Orthopedic</strong> specialists are personalizing implants to deliver custom-made solutions, considering not only the local environment but also the overall body kinematics that will govern long-term outcomes.</li></ul><h2>A new era of healing</h2><p>With the Living Heart Project, we’re bringing physics back to physicians. Modern physicians won’t need to be physicists, any more than they need to be chemists to use pharmacology. However, to benefit from the new technology, they will need to adapt their approach to care.</p><p>This means no longer seeing the body as a collection of discrete organs and considering only symptoms, but instead viewing it as a dynamic system that can be understood, and in most cases, guided toward health. It means no longer guessing what might work but knowing—because the simulation has already shown the result. By better integrating engineering principles into medicine, we can redefine it as a field of precision, rooted in the unchanging laws of nature. The modern physician will be a true physicist of the body and an engineer of health. 
<span class="ieee-end-mark"></span></p>]]></description><pubDate>Thu, 19 Mar 2026 12:00:05 +0000</pubDate><guid>https://spectrum.ieee.org/living-heart-project-virtual-twins</guid><category>Cardiology</category><category>Digital-twins</category><category>Personalized-medicine</category><category>Virtual-heart</category><category>Generative-ai</category><dc:creator>Steve Levine</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/two-color-coded-computer-simulations-of-a-human-heart-the-simulation-on-the-left-shows-the-muscle-structure-and-the-simulation.png?id=65278129&amp;width=980"></media:content></item><item><title>Lab-On-a-Chip Grippers Could Handle Human Cells</title><link>https://spectrum.ieee.org/lab-on-a-chip-grippers</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/micrograph-of-a-chip-with-a-micro-cage-array-featuring-claw-like-grippers.jpg?id=65284233&width=1200&height=400&coordinates=0%2C471%2C0%2C471"/><br/><br/><p>Living cells and tissues grown in the lab are vital tools for helping scientists learn about basic biology and test new drugs. Growing miniature organs on a chip from a person’s stem cells could even one day help doctors test <a href="https://spectrum.ieee.org/the-ultimate-in-personalized-medicine-your-body-on-a-chip" target="_blank">personalized treatments</a>. </p><p>Now, researchers have developed a lab-on-a-chip that adds a new feature to these systems: low-power grippers that can hold cells or tiny organ models called organoids in place. The CMOS-compatible lab-on-a-chip features shape-memory grippers and chemical sensors for detecting molecules such as neurotransmitters. The micro-cage array was <a href="https://submissions.mirasmart.com/ISSCC2026/Itinerary/PresentationDetail.aspx?evdid=126" rel="noopener noreferrer" target="_blank">presented</a> in San Francisco on 18 February at the IEEE International Solid-State Circuits Conference.</p><p>Researchers working on this multifunctional system hope it will be used to sense and manipulate biological samples of different sizes and potentially help direct the development of stem cells into organoids, which are used to study basic biology and drugs. Growing <a href="https://spectrum.ieee.org/organoid-intelligence-computing-on-brain" target="_blank">neural organoids</a> in lab-on-a-chip systems, for instance, can help biologists study brain development and how it’s impacted by chemicals or drugs. 
Cage-like grippers could be used to hold samples in place, or to bring tissue samples next to each other to encourage their development.</p><p>Building bioelectronic systems directly on a chip is attractive because it makes it easy to integrate many different features, including chemical sensing, electrical sensing and stimulation, and physical manipulation. However, manipulating biological samples on CMOS chips can be tricky, says <a href="https://ee.ethz.ch/the-department/people-a-z/person-detail.MzExMDIy.TGlzdC8zMjc5LC0xNjUwNTg5ODIw.html" rel="noopener noreferrer" target="_blank">Adam Wang</a>, an electrical engineer at ETH Zurich. Optical and acoustic tweezers, for example, can heat up, while the electric fields used to generate motion in <a href="https://www.sciencedirect.com/topics/biochemistry-genetics-and-molecular-biology/dielectrophoresis" target="_blank">dielectrophoresis</a> can be weakened by high concentrations of ions in the media used to support cells and tissues. These methods also require continuous power inputs. Wang presented the research on behalf of lead student <a href="https://ee.ethz.ch/the-department/people-a-z/person-detail.MjY1NDY3.TGlzdC8zMjc5LC0xNjUwNTg5ODIw.html" rel="noopener noreferrer" target="_blank">Zhikai Huang</a>, who was unable to attend.</p><h2>How the Microcages Work</h2><p>The ETH chip integrates tiny grippers to “cage” biological samples. These grippers are based on so-called <a href="https://spectrum.ieee.org/tag/shape-memory-alloy" target="_blank">shape-memory alloys</a>, layered metal structures that change their shape in response to electric signals, then hold that shape without the need for any additional power.</p><p>The ETH chip holds an array of nine sets of microcages, along with control electrodes and electrodes for chemical sensing. At each spot on the array, cages of three different sizes are nested together like rows of concentric flower petals. Their arms are 100, 150, and 280 micrometers long. 
The smallest might be used to grab single cells, while the largest is designed to grapple with whole organoids.</p><p>The arms are made of layered platinum and titanium. Each of the three differently sized sets has its own dedicated control electrode. In response to the polarity and magnitude of a signal, the cage arms will either bend and curl upward or flatten back down onto the surface. The electric signal triggers the movement by changing the electrochemical state of the platinum. Once the cages change shape, they stay in place with no additional power, unless they receive an electrical order to open or close again. <span>The array includes electrochemical sensors in the form of electrodes made of gold, platinum, and palladium. Using different electrode materials with different properties enhances the sensitivity of the system, says Wang. And all these materials can operate in electrolytes, including the cell culture media that help sustain biological cells and tissues in the lab. </span></p><p>At the conference, Wang presented the circuit design and initial tests using the cages to grip onto glass beads and measure concentrations of ferrocyanide, a chemical commonly used to test lab-on-a-chip sensors. Next, they hope to demonstrate that the array can delicately handle biological cells and organoids, and measure biochemicals such as neurotransmitters. 
Wang says future versions of the CMOS platform could integrate more electrodes for electrical sensing and stimulation of nerve cells.</p>]]></description><pubDate>Sat, 14 Mar 2026 13:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/lab-on-a-chip-grippers</guid><category>Lab-on-a-chip</category><category>Neuroscience</category><category>Cmos</category><category>Isscc</category><category>Shape-memory-alloy</category><dc:creator>Katherine Bourzac</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/micrograph-of-a-chip-with-a-micro-cage-array-featuring-claw-like-grippers.jpg?id=65284233&amp;width=980"></media:content></item><item><title>This RF Tag Is Lighter Than a Dewdrop</title><link>https://spectrum.ieee.org/rf-tags-wasps</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/close-up-of-a-gloved-hand-holding-a-live-wasp-with-a-miniature-circuit-board-on-its-back.jpg?id=65164229&width=1200&height=400&coordinates=0%2C292%2C0%2C292"/><br/><br/><p>Scientists don’t know much about how insects spend their time, but it’s well worth finding out. Insects play key roles in food webs and pollinate our crops, and social insects have a lot to teach us about the basics of friendship formation and communication. An ultralightweight <a href="https://spectrum.ieee.org/wi-fi-lora-hybrid" target="_blank">radio-frequency tag</a> designed to be worn by a paper wasp may help scientists get a glimpse of some basic behavioral information that’s long been missing: Where do the animals go when they leave the nest?</p><p>The tag is just 20 milligrams—about one-third the weight of a drop of water. It was <a href="https://submissions.mirasmart.com/ISSCC2026/Itinerary/PresentationDetail.aspx?evdid=53" rel="noopener noreferrer" target="_blank">presented</a> on 18 February at the IEEE <a href="https://www.isscc.org/" rel="noopener noreferrer" target="_blank">International Solid-State Circuits Conference</a> in San Francisco by doctoral student <a href="https://blaauw.engin.umich.edu/people/yi-shen-2/" rel="noopener noreferrer" target="_blank">Yi Shen</a>, who works in the lab of University of Michigan electrical engineer <a href="https://blaauw.engin.umich.edu/" rel="noopener noreferrer" target="_blank">David Blaauw</a>. University of Michigan computer scientist <a href="https://midas.umich.edu/directory/hun-seok-kim/" rel="noopener noreferrer" target="_blank">Hun-Seok Kim</a> developed localization algorithms to help spot the tag. Their challenge was to make an ultralightweight transmitter that had sufficient range (1.45 kilometers) and accuracy (0.9 meters) to locate these tiny insects.</p><p>They’re not the only ones trying to make more accurate, less intrusive trackers for small critters. 
<a href="https://celltracktech.com/" target="_blank">Cellular Tracking Technologies</a> (CTT) of Cape May, N.J., sells a 60-mg tracker that’s being used to follow the <a href="https://celltracktech.com/pages/project-monarch-collaboration-2025" rel="noopener noreferrer" target="_blank">migration patterns</a> of monarch butterflies. This tracker uses photovoltaics paired with a capacitor and transmits a Bluetooth signal. Anyone can download an app to help track the butterflies. Other versions of the tracker are designed to be worn by nocturnal bats and are fitted with batteries. To track birds that move during the night as well as during the day, CTT makes systems that combine photovoltaics with a rechargeable battery.</p><h2>What Wasps Want</h2><p>But even 60 mg would weigh down a wasp. “Every animal that has been tracked is much bigger than a wasp,” says <a href="https://sites.lsa.umich.edu/tibbetts-lab/" rel="noopener noreferrer" target="_blank">Elizabeth Tibbetts</a>, who studies wasp behavior and evolution at the University of Michigan. Tibbetts advised Blaauw on the tag’s design.</p><p>Honeybees and butterflies get a lot of attention, but “people forget to love wasps,” Tibbetts says. Paper wasps are a gardener’s friend. These pollinators eat nectar and prey on caterpillars. And they don’t typically sting humans.</p><p>They also have complex social lives and can even recognize each other’s faces. Tibbetts says life is different when you know that one wasp is Diana and the other is Susan, as opposed to a life where “everyone is just another wasp.” Wasps form friendships and partnerships, though some are loners. When they come out of hibernation in the spring, aggregations of about 10 wasps hang out, fight, scope each other out, and decide which others to join up with in cooperative groups. Some decide not to join a group.</p><p>Tibbetts says she and other researchers have been able to watch these complex behaviors because wasps usually return to their nests. 
Wasp researchers identify individuals by putting colored dots on them. “We don’t know anything about what they do when they’re not at their nests,” she says. Sometimes they don’t come back. Did Susan die, start her own nest, or join up with a different nest? With the right kind of tracker, Tibbetts hopes to find out.</p><p>Paper wasps weigh about 125 milligrams. They can carry heavy loads, ferrying caterpillars back to their nests. But Blaauw and Shen sought to keep the tag as light as possible, so that the animals can forage freely. They also had to make sure it would not interfere with the wasp’s aerodynamics, so it needed to be small in addition to lightweight.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="c56170af8202a795f1150ffa52a32a26" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/i59HuLkbdVg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Brendan Casey</small></p><p>Getting the right combination of <a href="https://spectrum.ieee.org/specksize-computers-now-with-deep-learning" target="_blank">light weight</a>, long range, and positional accuracy was key. Jettisoning the battery was the first step. “Batteries don’t scale,” says Blaauw. A <a href="https://spectrum.ieee.org/microbots" target="_blank">miniaturized battery</a> can’t provide enough current to generate a strong radio signal. Capacitors, which store energy by accumulating charges on surfaces, do better at small scales, Blaauw says. “Really small capacitors can store enough charge now to send a radio pulse,” he says. The capacitor used in the wasp tag weighs just 0.86 mg. 
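</p><p>To make the scaling argument concrete, here is a back-of-the-envelope sketch of the energy budget of a capacitor-powered transmitter. The component values below (a 10-microfarad capacitor charged to 3 volts, discharged over a 1-millisecond pulse) are illustrative assumptions, not the published specifications of the Michigan tag.</p>

```python
# Rough energy budget for a capacitor-powered radio pulse.
# All component values are illustrative assumptions, not the
# actual specs of the University of Michigan wasp tag.

def capacitor_energy_joules(capacitance_f: float, voltage_v: float) -> float:
    """Energy stored in a capacitor: E = 1/2 * C * V^2."""
    return 0.5 * capacitance_f * voltage_v ** 2

def pulse_power_watts(energy_j: float, pulse_s: float) -> float:
    """Average power if the stored energy is released over one pulse."""
    return energy_j / pulse_s

energy = capacitor_energy_joules(10e-6, 3.0)  # 45 microjoules stored
power = pulse_power_watts(energy, 1e-3)       # 45 milliwatts during a 1 ms pulse
```

<p>Even a modest capacitor can briefly supply tens of milliwatts of instantaneous power, far more than a battery of comparable mass can source, which is the point Blaauw makes about why capacitors scale down better than batteries.</p><p>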
A tiny photovoltaic array slowly charges up the capacitor until it has enough energy to generate a radio signal.</p><p>The need to aggressively miniaturize the entire system created constraints on the circuit design, Shen says. During transmission, the signal can interfere with other parts of the circuit, including the controller and oscillator. So these parts are isolated from the rest of the circuit during transmission. Blaauw says designing the circuit for a specific biological application led them to come up with new design ideas that would not have occurred to them otherwise. “This problem led us to circuit innovations,” says Blaauw.</p><p><a href="https://celltracktech.com/pages/team" rel="noopener noreferrer" target="_blank">Michael Lanzone</a>, a behavioral biologist and CEO of CTT, says the wasp tag is impressive. “A tag that weight gives the rest of us something to push for,” he says.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Close-up of a miniature program board. Its chip is equipped with a  loop antenna." class="rm-shortcode" data-rm-shortcode-id="7c340ff2c746cfe9aacbb95bb33df023" data-rm-shortcode-name="rebelmouse-image" id="3b9cb" loading="lazy" src="https://spectrum.ieee.org/media-library/close-up-of-a-miniature-program-board-its-chip-is-equipped-with-a-loop-antenna.jpg?id=65164244&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The 9-square-millimeter tag is attached to circuit board for programming.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Yi Shen and David Blaauw</small></p><p>Shen says since paper wasps are active only in the warmer months, the team rushed to test their transmitter on one of the pollinators in time to submit their work to ISSCC. 
In addition to the circuit design, the team used CT scans of a wasp to make sure the tag would fit on the insect and would be unlikely to interfere with its aerodynamics. A collaborator in the biology department put on two pairs of gloves to block the creature’s stinger and affixed the tag. The team took the animal outside, and it rapidly flew out of sight while they tracked it for about a kilometer and a half. So far, so good. This summer, they hope to conduct more tests.</p><p>Lanzone says he hopes the University of Michigan team gets funding to develop the tag further and get it into the hands of researchers. “There’s a lot of cool tech that comes out of university labs, but then you don’t hear about it again. I’m excited to see if they can expand it to the next level.”</p><p>“I hope this thing works—it’s going to be so fun to use on wasps,” says Tibbetts.</p>]]></description><pubDate>Mon, 09 Mar 2026 13:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/rf-tags-wasps</guid><category>Animals</category><category>Isscc</category><category>Radio-frequency</category><category>Rf-design</category><category>Agriculture</category><dc:creator>Katherine Bourzac</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/close-up-of-a-gloved-hand-holding-a-live-wasp-with-a-miniature-circuit-board-on-its-back.jpg?id=65164229&amp;width=980"></media:content></item><item><title>The Millisecond That Could Change Cancer Treatment</title><link>https://spectrum.ieee.org/flash-radiotherapy</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/photo-of-a-man-in-a-lab-coat-adjusting-a-large-piece-of-medical-equipment-thats-pointed-at-the-head-of-a-partial-mannequin.jpg?id=65111419&width=1200&height=400&coordinates=0%2C428%2C0%2C428"/><br/><br/><p><strong>Inside a cavernous hall</strong> at the Swiss-French border, the air hums with high voltage and possibility. From his perch on the wraparound observation deck, physicist <a href="https://www.researchgate.net/profile/Walter-Wuensch" rel="noopener noreferrer" target="_blank">Walter Wuensch</a> surveys a multimillion-dollar array of accelerating cavities, klystrons, modulators, and pulse compressors—hardware being readied to drive a new generation of linear particle accelerators.</p><p>Wuensch has spent decades working with these machines to crack the deepest mysteries of the universe. Now he and his colleagues are aiming at a new target: cancer. Here at <a href="https://home.cern/" rel="noopener noreferrer" target="_blank">CERN</a> (the European Organization for Nuclear Research) and other particle-physics labs, scientists and engineers are applying the tools of fundamental physics to develop a technique called FLASH radiotherapy that offers a radical and counterintuitive vision for treating the disease.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Photo of a white-haired man standing next to floor-to-ceiling experimental equipment with many tubes and wires. 
" class="rm-shortcode" data-rm-shortcode-id="ce95648ce39bd5c09f73bddf6af75766" data-rm-shortcode-name="rebelmouse-image" id="f8147" loading="lazy" src="https://spectrum.ieee.org/media-library/photo-of-a-white-haired-man-standing-next-to-floor-to-ceiling-experimental-equipment-with-many-tubes-and-wires.jpg?id=65111429&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">CERN researcher Walter Wuensch says the particle physics lab’s work on FLASH radiotherapy is “generating a lot of excitement.”</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">CERN</small></p><p>Radiation therapy has been a cornerstone of cancer treatment since shortly after <a href="https://medicalmuseum.health.mil/index.cfm/visit/exhibits/virtual/xraydiscovery/index" target="_blank">Wilhelm Conrad Röntgen</a> discovered X-rays in 1895. Today, more than half of all cancer patients receive it as part of their care, typically in relatively low doses of X-rays delivered over dozens of sessions. Although this approach often kills the tumor, it also wreaks havoc on nearby healthy tissue. Even with modern precision targeting, the potential for collateral damage limits how much radiation doctors can safely deliver.</p><p>FLASH radiotherapy flips the conventional approach on its head, delivering a single dose of ultrahigh-power radiation in a burst that typically lasts less than one-tenth of a second. In study after study, this technique causes significantly less injury to normal tissue than conventional radiation does, without compromising its antitumor effect.</p><p>At CERN, which I visited last July, the approach is being tested and refined on accelerators that were never intended for medicine. 
If ongoing experiments here and around the world continue to bear out these results, FLASH could transform radiotherapy—bringing stronger treatments, fewer side effects, and broader access to lifesaving care.</p><p>“It’s generating a lot of excitement,” says Wuensch, a researcher at CERN’s Linear Electron Accelerator for Research (CLEAR) facility. “We accelerator people are thinking, Oh, wow, here’s an application of our technology that has a societal impact which is more immediate than most high-energy physics.”</p><h2>The Unlikely Birth of FLASH Therapy</h2><p>The breakthrough that led to FLASH emerged from a line of experiments that began in the 1990s at <a href="https://institut-curie.org/" target="_blank">Institut Curie</a> in Orsay, near Paris. Researcher <a href="https://institut-curie.org/person/vincent-favaudon" target="_blank">Vincent Favaudon</a> was using a low-energy electron accelerator to study radiation chemistry. Targeting the accelerator at mouse lungs, Favaudon expected the radiation to produce scar tissue, or fibrosis. But when he exposed the lungs to ultrafast blasts of radiation, at dose rates a thousand times as high as those used in conventional radiation therapy, the expected fibrosis never appeared.</p><p>Puzzled, Favaudon turned to <a href="https://scholar.google.com/citations?user=xx8VQkMAAAAJ&hl=fr" target="_blank">Marie-Catherine Vozenin</a>, a radiation biologist at Curie who specialized in radiation-induced fibrosis. “When I looked at the slides, there was indeed no fibrosis, which was very, very surprising for this type of dose,” recalls Vozenin, who now works at <a href="https://www.hug.ch/en" target="_blank">Geneva University Hospitals</a>, in Switzerland.</p><h3>How to Measure Radiation Doses</h3><br/><p>Radiation therapy uses a variety of units to refer to the amount of energy received by the patient. 
Here are the main ones under the International System of Units, or SI.</p><p><strong>Gray (Gy):</strong> A measure of the absorbed dose—that is, how much radiation energy is absorbed by the body. One gray equals 1 joule of radiation energy per kilogram of matter. FLASH delivers a single dose of 40 Gy or more in a fraction of a second. Conventional radiation therapy, by contrast, may deliver a total dose of 40 to 80 Gy but over the course of several weeks.</p><p><strong>Sievert (Sv):</strong> A measure of the effective dose—that is, the health effects of the radiation, with different types of ionizing radiation (gamma rays, X-rays, alpha particles, and so on) having different effects. One sievert equals 1 joule per kilogram weighted for the biological effectiveness of the radiation and the tissues exposed.</p><h3></h3><br/><p>The pair expanded the experiments to include cancerous tumors. The results upended a long-held trade-off of radiotherapy: the idea that you can’t destroy a tumor without also damaging the host. “This differential effect is really what we want in radiation oncology, not damaging normal tissue but killing the tumors,” Vozenin says.</p><p>They repeated the protocol across different types of tissue and tumors. By 2014, they had gathered enough evidence to publish their findings in <a href="https://www.science.org/doi/10.1126/scitranslmed.3008973" target="_blank"><em>Science Translational Medicine</em></a>. Their experiments confirmed that delivering an ultrahigh dose of 10 gray or more in less than a tenth of a second could eradicate tumors in mice while leaving surrounding healthy tissue virtually unharmed. For comparison, a typical chest X-ray delivers about 0.1 milligray, while a session of conventional radiation therapy might deliver a total of about 2 gray per day. 
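</p><p>The figures above imply that the dramatic difference is in dose rate, not total dose. A quick sketch, using only the doses quoted in this article and an assumed two-minute conventional session:</p>

```python
# Compare dose rates (Gy/s) for FLASH vs. conventional radiotherapy,
# using the doses quoted in the article. The 120-second session
# length for conventional delivery is an assumption for illustration.

def dose_rate_gy_per_s(dose_gy: float, duration_s: float) -> float:
    """Dose rate is absorbed dose divided by delivery time."""
    return dose_gy / duration_s

flash_rate = dose_rate_gy_per_s(40.0, 0.1)          # 40 Gy in under 0.1 s
conventional_rate = dose_rate_gy_per_s(2.0, 120.0)  # 2 Gy over ~2 minutes
ratio = flash_rate / conventional_rate              # tens of thousands of times faster
```

<p>So even though the cumulative dose of a full conventional course can match or exceed a single FLASH dose, FLASH delivers its dose tens of thousands of times as fast.</p><p>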
(The authors called the effect “FLASH” because of the quick, high doses involved, but it’s not an acronym.)</p><h3></h3><br/><img alt="Three sets of images comparing highly magnified tissue samples." class="rm-shortcode" data-rm-shortcode-id="00fc1edc5ddb29e98aa8bb4755930278" data-rm-shortcode-name="rebelmouse-image" id="6ce44" loading="lazy" src="https://spectrum.ieee.org/media-library/three-sets-of-images-comparing-highly-magnified-tissue-samples.jpg?id=65111609&width=980"/><h3></h3><br/><p>Many cancer experts were skeptical. The FLASH effect seemed almost too good to be true. “It didn’t get a lot of traction at first,” recalls <a href="https://med.stanford.edu/profiles/Billy_Loo" target="_blank">Billy Loo</a>, a Stanford radiation oncologist specializing in lung cancer. “They described a phenomenon that ran counter to decades of established radiobiology dogma.”</p><p>But in the years since then, researchers have observed the effect across a wide range of tumor types and animals—beyond mice to zebra fish, fruit flies, and even a few human subjects, with the same protective effect in the brain, lungs, skin, muscle, heart, and bone.</p><p>Why this happens remains a mystery. “We have investigated a lot of hypotheses, and all of them have been wrong,” says Vozenin. Currently, the most plausible theory emerging from her team’s research points to metabolism: Healthy and cancerous cells may process reactive oxygen species—unstable oxygen-containing molecules generated during radiation—in very different ways.</p><h2>Adapting Accelerators for FLASH</h2><p>At the time of the first FLASH publication, Loo and his team at Stanford were also focused on dramatically speeding up radiation delivery. But Loo wasn’t chasing a radiobiological breakthrough. He was trying to solve a different problem: motion.</p><p>“The tumors that we treat are always moving targets,” he says. 
“That’s particularly true in the lung, where because of breathing motion, the tumors are constantly moving.”</p><p>To bring FLASH therapy out of the lab and into clinical use, researchers like Vozenin and Loo needed machines capable of delivering fast, high doses with pinpoint precision deep inside the body. Most early studies relied on low-energy electron beams like Favaudon’s 4.5-megaelectron-volt Kinetron—sufficient for surface tumors, but unable to reach more than a few centimeters into a human body. Treating deep-seated cancers in the lung, brain, or abdomen would require far higher particle energies.</p><h3></h3><br/><img alt="Photo of floor-to-ceiling electromagnetic hardware with many tubes and pipes, some of which is copper-colored." class="rm-shortcode" data-rm-shortcode-id="3b3bd74be1a8bc555eb51aa843114f06" data-rm-shortcode-name="rebelmouse-image" id="39797" loading="lazy" src="https://spectrum.ieee.org/media-library/photo-of-floor-to-ceiling-electromagnetic-hardware-with-many-tubes-and-pipes-some-of-which-is-copper-colored.jpg?id=65111435&width=980"/><h3></h3><br/><p>They also needed an alternative to conventional X-rays. In a clinical linac, X-ray photons are produced by dumping high-energy electrons into a bremsstrahlung target, which is made of a material with a high atomic number, like tungsten or copper. The target slows the electrons, converting their kinetic energy into X-ray photons. It’s an inherently inefficient process that wastes most of the beam power as heat and makes it extremely difficult to reach the ultrahigh dose rates required for FLASH. High-energy electrons, by contrast, can be switched on and off within milliseconds. And because they have a charge and can be steered by magnets, electrons can be precisely guided to reach tumors deep within the body. 
(Researchers are also investigating protons and carbon ions; see the sidebar, “What’s the Best Particle for FLASH Therapy?”)</p><p>Loo turned to the <a href="https://www6.slac.stanford.edu/" target="_blank">SLAC National Accelerator Laboratory</a> in Menlo Park, Calif., where physicist <a href="https://profiles.stanford.edu/sami-tantawi" rel="noopener noreferrer" target="_blank">Sami Gamal-Eldin Tantawi</a> was redefining how electromagnetic waves move through linear accelerators. Tantawi’s findings allowed scientists to precisely control how energy is delivered to particles—paving the way for compact, efficient, and finely tunable machines. It was exactly the kind of technology FLASH therapy would need to target tumors deep inside the body.</p><p>Meanwhile, Vozenin and other European researchers turned to CERN, best known for its 27-kilometer Large Hadron Collider (LHC) and the 2012 discovery of the Higgs boson, the “God particle” that gives other particles their mass. </p><p class="ieee-inbody-related">RELATED: <a href="https://spectrum.ieee.org/particle-physics-ai" target="_blank">AI Hunts for the Next Big Thing in Physics</a></p><p>CERN is also home to a range of smaller linear accelerators—including CLEAR, where Wuensch and his team are adapting high-energy physics tools for medicine.</p><h3>What’s the Best Particle for FLASH Therapy?</h3><br/><p>Even as research on FLASH radiotherapy advances, a central question remains: What kind of particle will deliver it best? The main contenders are electrons, protons, and carbon ions. Each has distinct advantages, limitations, and implications for cost, complexity, and clinical reach.</p><p><strong>Electrons</strong>—long used to treat surface tumors and to generate X-rays—are light, nimble particles, far easier to control than protons or carbon ions. At low energies, they stop quickly in tissue, but new high-energy systems can drive electrons deeper. 
Now researchers are working on machines that combine multiple high-energy beams at different angles to let doctors sculpt radiation doses that match the tumor’s shape.</p><p>That principle underpins Billy Loo’s PHASER (Pluridirectional High-energy Agile Scanning Electron Radiotherapy) system, developed at Stanford and SLAC and licensed to a startup called <a href="https://www.tibaray.com/" target="_blank">TibaRay</a>. An array of high-efficiency linacs generates X-ray beams from many directions at once. Their high output overcomes the inefficiency of electron-to-photon conversion to deliver the dose at FLASH speed. Beam convergence at the tumor and electronic shaping conform the dose in three dimensions, producing uniform coverage with relatively simple infrastructure. </p><p><strong>Protons</strong> have led the way in early clinical trials, largely because existing proton therapy centers can be adapted to deliver FLASH doses. In 2020, the University of Cincinnati Health launched the <a href="https://www.uchealth.com/en/media-room/articles/ground-breaking-cancer-research-is-in-your-backyard" rel="noopener noreferrer" target="_blank">first human FLASH trial</a> to use proton beams, to treat cancer that had metastasized to bones. “If I want to be pragmatic, the proton beam is ready to go, so let’s move with what we have,” says Geneva University Hospitals’ Marie-Catherine Vozenin.</p><p>Protons can penetrate up to 30 centimeters, reaching deep-seated tumors. But the delivery of protons in a continuous beam limits the dose rates. Also, proton systems are far larger and more expensive than, say, X-ray machines, which will likely constrain their availability to specialized centers.</p><p><strong>Carbon ions</strong>, used in a handful of elite facilities, offer even higher precision and biological effectiveness than electrons and protons. Their Bragg peak—a sudden deposition of energy at a specific depth—makes them appealing for deep or complex tumors. 
But that unmatched precision comes at a steep price, with each facility costing upward of US $300 million. —T.C.</p><h3></h3><br/><p>Unlike the LHC, which loops particles around a massive ring to build up energy before smashing them together, linear accelerators like CLEAR send particles along a straight, one-time path. That setup allows for greater precision and compactness, making it ideal for applications like FLASH.</p><p>At the heart of the CLEAR facility, Wuensch points out the 200-MeV linear accelerator with its 20-meter beamline. This is “a playground of creativity,” he says, for the physicists and engineers who arrive from all over the world to run experiments.</p><p>The process begins when a laser pulse hits a photocathode, releasing a burst of electrons that form the initial beam. These electrons travel through a series of precisely machined copper cavities, where high-frequency microwaves push them forward. The electrons then move through a network of magnets, monitors, and focusing elements that shape and steer them toward the experimental target with submillimeter precision.</p><p>Instead of a continuous stream, the electron beam is divided into nanosecond-long bunches—billions of electrons riding the radio-frequency field like surfers. Inside the accelerator’s cavities, the field flips polarity 12 billion times per second, so timing is everything: Only electrons that arrive perfectly in phase with the accelerating wave will gain energy. That process repeats through a chain of cavities, each giving the bunches another push, until the beam reaches its final energy of 200 MeV.</p><h3></h3><br/><img alt="Close-up photo of an etched copper disc being held under a microscope by a gloved hand." 
class="rm-shortcode" data-rm-shortcode-id="9cbcce34df51565a0cd0cea335517027" data-rm-shortcode-name="rebelmouse-image" id="6eeba" loading="lazy" src="https://spectrum.ieee.org/media-library/close-up-photo-of-an-etched-copper-disc-being-held-under-a-microscope-by-a-gloved-hand.jpg?id=65111478&width=980"/><p><span>Much of this architecture draws directly from the </span><a href="https://clic-study.org/" target="_blank">Compact Linear Collider study</a><span>, a decades-long CERN project aimed at building a next-generation collider. The proposed CLIC machine would stretch 11 kilometers and collide electrons and positrons at 380 gigaelectron volts. To do that in a linear configuration—without the multiple passes around a ring like the LHC—CERN engineers have had to push for extremely high acceleration gradients to boost the electrons to high energies over relatively short distances—up to 100 megavolts per meter.</span></p><p>Wuensch leads me to a large experimental hall housing prototype structures from the CLIC effort, and points out the microwave devices that now help drive FLASH research. Though the future of CLIC as a collider remains uncertain, its infrastructure is already yielding dividends: smaller, high-gradient accelerators that may one day be as suited for curing cancer as they are for smashing particles.</p><p class="ieee-inbody-related">RELATED: <a href="https://spectrum.ieee.org/supercolliders" target="_blank">Four Ways Engineers Are Trying to Break Physics</a></p><p>The power behind the high gradients comes from <a href="https://aries.web.cern.ch/xbox" target="_blank">CERN’s Xboxes</a>, the X-band RF systems that dominate the experimental hall. Each Xbox houses a klystron, modulator, pulse compressor, and waveguide network to generate and shape the microwave pulses. 
The pulse compressors store energy in resonant cavities and then release it in a microsecond burst, producing peaks of up to 200 megawatts. Sustained continuously, that would be enough to power at least 40,000 homes. The Xboxes let researchers fine-tune the power, timing, and pulse shape.</p><p>According to Wuensch, many of the recent accelerator developments were enabled by advances in computer simulation and high-precision three-dimensional machining. These tools allow the team to iterate quickly, designing new accelerator components and improving beam control with each generation.</p><p>Still, real-world challenges remain. The power demands are formidable, as are the space requirements; for all the talk of its “compact” design, the original CLIC was meant to span kilometers. Obviously, a hospital needs something that’s actually compact.</p><p>“A big challenge of the project,” says Wuensch, “is to transform this kind of technology and these kinds of components into something that you can imagine installing in a hospital, and it will run every day reliably.”</p><p>To that end, CERN researchers have teamed up with the <a href="https://www.lausanneuniversityhospital.com/home" target="_blank">Lausanne University Hospital</a> (known by its French acronym, CHUV) and the French medical technology company <a href="https://www.theryq-alcen.com/" target="_blank">Theryq</a> to design a hospital facility capable of treating large and deep-seated tumors with the very short time scales needed for FLASH and scaled down to fit in a clinical setting.</p><h2>Theryq’s Approach to FLASH</h2><p>Theryq’s research center and factory are located in southern France, near the base of Montagne Sainte-Victoire, a jagged spine of limestone that Paul Cézanne painted dozens of times, capturing its shifting light and form.</p><p>“The solution that we are trying to develop here is something which is extremely versatile,” says <a 
href="https://www.linkedin.com/in/ludovic-le-meunier-7084382?originalSubdomain=fr" target="_blank">Ludovic Le Meunier</a>, CEO of the expanding company. “The ultimate goal is to be able to treat any solid tumor anywhere in the body, which is about 90 percent of the cancer these days.”</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Futuristic scientific equipment setup, featuring streamlined machinery and intricate components." class="rm-shortcode" data-rm-shortcode-id="91c6f9815a719ce2a415181d8352df23" data-rm-shortcode-name="rebelmouse-image" id="5b999" loading="lazy" src="https://spectrum.ieee.org/media-library/futuristic-scientific-equipment-setup-featuring-streamlined-machinery-and-intricate-components.jpg?id=65111601&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Theryq’s FLASHDEEP system, under development with CERN and the company’s clinical partners, has a 13.5-meter-long, 140-MeV linear accelerator. That’s strong enough to treat tumors at depths of up to about 20 centimeters in the body. The patient will remain in a supported standing position during the split-second irradiation.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">THERYQ</small></p><p>Theryq’s push to bring FLASH radiotherapy from the lab to clinic has followed a three-pronged rollout, with each device engineered for a specific depth and clinical use. The first machine, <a href="https://www.theryq-alcen.com/flash-radiotherapy-products/flashknife/" target="_blank">FLASHKNiFE</a>, was unveiled in 2020. Designed for superficial tumors and intraoperative use, the system delivers electron beams at 6 or 9 MeV. 
A prototype installed that same year at CHUV is being used in a phase-two trial for patients with localized skin cancer.</p><p>More recently, Theryq launched <a href="https://www.theryq-alcen.com/flash-radiotherapy-products/flashlab/" target="_blank">FLASHLAB</a>, a compact, 7-MeV platform for radiobiology research.</p><p>The company’s most ambitious system, <a href="https://www.theryq-alcen.com/flash-radiotherapy-products/flashdeep/" target="_blank">FLASHDEEP</a>, is still under development. The 13.5-meter-long electron source will deliver very high-energy electrons of up to 140 MeV to depths of as much as 20 centimeters inside the body in less than 100 milliseconds. An integrated CT scanner, built into a patient-positioning system developed by <a href="https://leocancercare.com/" target="_blank">Leo Cancer Care</a>, captures images that stream directly into the treatment-planning software, enabling precise calculation of the radiation dose. “Before we actually trigger the beam or the treatment, we make stereo images to verify at the very last second that the tumor is exactly where it should be,” says Theryq technical manager <a href="https://www.linkedin.com/in/philippe-liger-977a3316?originalSubdomain=fr" target="_blank">Philippe Liger</a>.</p><h2>FLASH Therapy Moves to Animal Tests</h2><p>While CERN’s CLEAR accelerator has been instrumental in characterizing FLASH parameters, researchers seeking to study FLASH in living organisms must look elsewhere: CERN doesn’t allow animal experiments on-site. That’s one reason why a growing number of scientists are turning to PITZ, the Photo Injector Test Facility in Zeuthen, a leafy lakeside suburb of Berlin.</p><p>PITZ is part of Germany’s national accelerator lab and is responsible for developing the electron source for the <a href="https://www.xfel.eu/" target="_blank">European X-ray Free-Electron Laser</a>. 
Now PITZ is emerging as a hub for FLASH research, with an unusually tunable accelerator and a dedicated biomedical lab to ensure controlled conditions for preclinical studies.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="A photo showing a row of experimental electronic equipment on racks" class="rm-shortcode" data-rm-shortcode-id="b3c62ff858a14ceb04a3a4549f85d68a" data-rm-shortcode-name="rebelmouse-image" id="cfbfe" loading="lazy" src="https://spectrum.ieee.org/media-library/a-photo-showing-a-row-of-experimental-electronic-equipment-on-racks.jpg?id=65111551&width=980"/></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="A photo of a closeup of a gloved hand holding a sample of a purple liquid above a piece of equipment." 
class="rm-shortcode" data-rm-shortcode-id="e4f204a1631b000ef17c7be15995ef83" data-rm-shortcode-name="rebelmouse-image" id="82e52" loading="lazy" src="https://spectrum.ieee.org/media-library/a-photo-of-a-closeup-of-a-gloved-hand-holding-a-sample-of-a-purple-liquid-above-a-piece-of-equipment.jpg?id=65111525&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">At Germany’s Photo Injector Test Facility in Zeuthen (PITZ), the electron-beam accelerator [top] is used to irradiate biological targets in early-stage animal tests of FLASH radiotherapy [bottom].</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Top: Frieder Mueller; Bottom: MWFK</small></p><p>“The biggest advantage of our facility is that we can do a very stepwise, very defined and systematic study of dose rates,” says <a href="https://www.linkedin.com/in/anna-grebinyk-186a8245?originalSubdomain=de" target="_blank">Anna Grebinyk</a>, a biochemist who heads the new biomedical lab, “and systematically optimize the FLASH effect to see where it gets the best properties.”</p><p>The experiments begin with zebra-fish embryos, prized for early-stage studies because they’re transparent and develop rapidly. After the embryos, researchers test the most promising parameters in mice. To do that, the PITZ team uses a small-animal radiation research platform, complete with CT imaging and a robotic positioning system adapted from CERN’s CLEAR facility.</p><p>What sets PITZ apart is the flexibility of its beamline. The 30-meter accelerator system steers electrons with micrometer precision, producing electron bunches with exceptional brightness and low emittance—a key measure of beam quality. “We can dial in any distribution of bunches we want,” says Frank Stephan, group leader at PITZ. “That gives us tremendous control over time structure.”</p><p>Timing matters. 
At PITZ, the laser-struck photocathode generates electron bunches that are accelerated immediately, at up to 60 million volts per meter. A fast electromagnetic kicker system acts as a high-speed gatekeeper, selectively deflecting individual electron bunches from a high-repetition beam and steering them according to researchers’ needs. This precise, bunch-by-bunch control is essential for fine-tuning beam properties for FLASH experiments and other radiation therapy studies.</p><p>“The idea is to make the complete treatment within one millisecond,” says Stephan. “But of course, you have to [trust] that within this millisecond, everything works fine. There is not a chance to stop [during] this millisecond. It has to work.”</p><p>Regulating the dose remains one of the biggest technical hurdles in FLASH. The ionization chambers used in standard radiotherapy can’t respond accurately when dose rates spike hundreds of times higher in a matter of microseconds. So researchers are developing new detector systems to precisely measure these bursts and keep pace with the extreme speed of FLASH delivery.</p><h2>FLASH as a Research Tool</h2><p>Beyond its therapeutic potential, FLASH may also open new windows to illuminate cancer biology. “What is really, really superinteresting, in my opinion,” says Vozenin, “is that we can use FLASH as a tool to understand the difference between normal tissue and tumors. There must be something we’re not aware of that really distinguishes the two—and FLASH can help us find it.” Identifying those differences, she says, could lead to entirely new interventions, not just with radiation, but also with drugs.</p><p>Vozenin’s team is currently testing a hypothesis involving long-lived proteins present in healthy tissue but absent in tumors. 
If those proteins prove to be key, she says, “we’re going to find a way to manipulate them—and perhaps reverse the phenomenon, even [turn] a tumor back into a normal tissue.”</p><p>Proponents of FLASH believe it could help close the cancer care gap worldwide; in low-income countries, only about 10 percent of patients have access to radiotherapy, and in middle-income countries, only about 60 percent of patients do, according to the International Atomic Energy Agency. Because FLASH treatment can often be delivered in a single brief session, it could spare patients from traveling long distances for weeks of treatment and allow clinics to treat many more people.</p><p>High-income countries stand to benefit as well. Fewer sessions mean lower costs, less strain on radiotherapy facilities, and fewer side effects and disruptions for patients.</p><p>The big question now is, How long will it take? Researchers I spoke with estimate that FLASH could become a routine clinical option in about 10 years—after the completion of remaining preclinical studies and multiphase human trials, and as machines become more compact, affordable, and efficient. Much of the momentum comes from a growing field of startups competing to build devices, but the broader scientific community remains remarkably open and collaborative.</p><p>“Everyone has a relative who knows about cancer because of their own experience,” says Stephan. “My mother died of it. In the end, we want to do something good for mankind. 
That’s why people work together.” <span class="ieee-end-mark"></span></p><p><em>This article appears in the March 2026 print issue.</em></p>]]></description><pubDate>Fri, 06 Mar 2026 14:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/flash-radiotherapy</guid><category>Medical-technology</category><category>Cern</category><category>High-energy-physics</category><category>Linear-accelerator</category><category>Electron-beams</category><category>Cancer-treatments</category><dc:creator>Tom Clynes</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/photo-of-a-man-in-a-lab-coat-adjusting-a-large-piece-of-medical-equipment-thats-pointed-at-the-head-of-a-partial-mannequin.jpg?id=65111419&amp;width=980"></media:content></item><item><title>“Cyborg” Tissue Could Help Fast-Track Cures for Type 1 Diabetes</title><link>https://spectrum.ieee.org/cyborg-stem-cell-therapy-for-diabetes</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-three-dimensional-rendering-of-immunostains-on-a-pancreatic-organoid-the-stains-from-most-abundant-to-least-include-insulin.jpg?id=65112030&width=1200&height=400&coordinates=0%2C729%2C0%2C730"/><br/><br/><p>Lab-grown cell therapies for diabetes are <a href="https://www.nytimes.com/2025/06/20/health/diabetes-cure-insulin-stem-cell.html" rel="noopener noreferrer" target="_blank">edging toward the clinic</a>. Researchers can now coax stem cells to behave like pancreatic islets, the tiny clusters of cells that regulate blood sugar.<strong> </strong>But even the most promising candidate therapies still need months inside a patient’s body <span>to fully mature and work reliably—and some never quite get there.</span></p><p>Now researchers have found a way to watch—and even gently steer—that maturation in the lab.</p><p>A team led by Harvard bioengineer <a href="https://liulab.seas.harvard.edu/prof-jia-liu" target="_blank">Jia Liu</a> and University of Pennsylvania stem-cell biologist <a href="https://www.med.upenn.edu/apps/faculty/index.php/g275/p9558766" target="_blank">Juan Alvarez</a> embedded soft, stretchable electronics into the tiny clusters to create “cyborg” islet <a href="https://spectrum.ieee.org/tag/organoid" target="_self">organoids</a>. Woven through with a flexible web of microelectrodes, the miniature pancreas-like tissue can eavesdrop on the electrical chatter of individual lab-grown cells for months as they mature, learning to sense glucose and release hormones in tightly coordinated bursts.</p><p>That electrical activity is <span>essential</span>. 
Islets are part of the body’s neuroendocrine system: Like neurons in the brain, their cells fire voltage-driven signals—and those electrical spikes trigger the release of insulin and glucagon, the twin hormones that stabilize blood sugar levels.</p><p>The cyborg islets helped the researchers tease apart how the two main cell types that make up islets—insulin-producing beta cells and glucagon-secreting alpha cells—pass through distinct electrical maturation stages before settling into the synchronized firing patterns seen in mature tissue. <a href="https://www.science.org/doi/10.1126/science.aeb3295" target="_blank">Reporting 19 February in <em>Science</em></a>, the researchers also showed how exposures to rhythmic daily glucose cycles and brief pulses of electrical stimulation sharpened the glucose responsiveness of the cells, suggesting that the road to islet maturity can be engineered, not merely observed.</p><p>“It is a testimony of the magic that can happen when two very different fields—beta-cell biology and nano-electronics—collide,” says <a href="https://research.bidmc.org/torsten-meissner" target="_blank">Torsten Meissner,</a> a stem-cell-therapy researcher at Beth Israel Deaconess Medical Center who was not involved in the research.</p><h2>Silicon Meets Stem Cells</h2><p>Integrating flexible bioelectronics directly into lab-made islets opens the door to several practical applications, says Alvarez. 
For one, the approach could accelerate efforts to refine stem-cell-differentiation recipes, so that lab-grown islets are closer to maturity and “can hit the ground running when transplanted,” he explains.</p><p>The embedded electronics could also provide a built-in way to monitor the performance of implanted cell therapies—or even one day form the basis of a true <a href="https://spectrum.ieee.org/artificial-pancreas-could-conquer-diabetes" target="_self">“bionic” pancreas</a> system that automatically stimulates cells to sharpen their insulin response when blood sugar levels begin to rise.</p><p>“This is really the future,” Liu says. “I think flexible, stretchable, soft electronics integrated with organoids should become the gold standard for next-generation cell therapies, because you don’t want to transplant large numbers of cells if you have no way to monitor or control what they’re doing.”</p><p>Liu has been moving toward this vision for more than a decade, beginning with early work he did as a graduate student on <a href="https://www.nature.com/articles/nnano.2015.115" target="_blank">syringe-injectable mesh electronics</a> designed to blend into living brain tissue for long-term neural recording. Instead of rigid probes that scar the brain, the porous, ultraflexible meshes were built to match the softness of cells and move with them.</p><p><span>Together with his Harvard colleagues, Liu </span><a href="https://pubs.acs.org/doi/10.1021/acs.nanolett.9b02512" target="_blank">first applied the concept to organoids</a><span> in 2019. The researchers showed that by weaving the stretchable electronics into flat sheets of stem cells as they folded themselves into three-dimensional mini-organs, the devices could become an integral part of the tissue itself. 
Early demonstrations focused on </span><a href="https://www.science.org/doi/10.1126/sciadv.ade8513" target="_blank">cardiac tissue</a><span>, tracking the coordinated electrical waves that drive beating heart cells. Subsequent work pushed the platform into </span><a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/adma.202106829" target="_blank">brain organoids</a><span> and </span><a href="https://spectrum.ieee.org/embryo-electrode-array" target="_self">even developing embryos</a><span>.</span></p><h2>How Stem Cells Could Cure Diabetes</h2><p>Now, with pancreatic islets, Liu is bringing the technology to one of regenerative medicine’s most urgent challenges: building replacement cells for people with type 1 diabetes who, because of an immune system that turns on itself, have lost the cells necessary to keep blood sugar in balance.</p><p>Important hurdles remain. Most islet-cell therapies still require lifelong treatment with immune-suppressing drugs, restricting transplants to only the most severe cases in which patients can no longer manage their diabetes with insulin alone. Companies are pursuing two main workarounds: <a href="https://www.nature.com/articles/nm0114-9" target="_blank">encasing cells in protective capsules</a> or <a href="https://www.nature.com/articles/d41586-024-00590-y" target="_blank">genetically engineering them to evade immune attack</a>. 
But encapsulation efforts have been <a href="https://www.nature.com/articles/540S60a" target="_blank">plagued by device failures</a>, and gene-edited “stealth” cells remain in the <a href="https://www.nature.com/articles/d41586-025-02802-5" target="_blank">early stages of development</a>.</p><p>That is not to say the field has lacked breakthroughs.</p><p>Last year, Vertex Pharmaceuticals announced that a full dose of its stem-cell-derived islet therapy, named Zimislecel, had <a href="https://www.nejm.org/doi/10.1056/NEJMoa2506549?url_ver=Z39.88-2003#ap2" target="_blank"><span>helped people with severe type 1 diabetes produce their own insulin again</span></a>, enabling 10 of 12 study participants to stop taking insulin injections altogether. The results were hailed as a watershed moment: evidence that lab-grown cells can work inside the human body and a glimpse of a future in which a virtually limitless supply of replacement islets could free the field from its reliance on scarce donor pancreases.</p><p>But while the transplanted cells eventually performed as well as fully mature islets taken from deceased donors, it took months inside patients’ bodies for them to reach that level of function—and even then, they didn’t work for everybody.<strong> </strong></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Microscopic view of a device's embedded high density electrode array inside a cyborg pancreatic organoid." class="rm-shortcode" data-rm-shortcode-id="45a0d29ec6911024ff4b8fd846979dab" data-rm-shortcode-name="rebelmouse-image" id="670b5" loading="lazy" src="https://spectrum.ieee.org/media-library/microscopic-view-of-a-device-s-embedded-high-density-electrode-array-inside-a-cyborg-pancreatic-organoid.jpg?id=65112055&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Passing light through the cyborg organoid while it’s under a microscope shows the flexible electronic device. 
</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.science.org/doi/10.1126/science.aeb3295" target="_blank">Qiang Li et al.</a></small></p><h2>Toward Smarter Cell Therapies</h2><p>That is where the cyborg islets from Liu and Alvarez could make a difference. They won’t address the immune suppression challenge. But they could sharpen the cell product itself.</p><p>First, by providing a continuous, single-cell readout of electrical activity, the devices could help companies like Vertex fine-tune differentiation protocols in the manufacturing process—testing growth factors, for instance, or electrical stimulation patterns, and quickly identifying which combinations produce the most mature cells.</p><p>“Neuroendocrine connections are missing from current stem-cell differentiation protocols,” notes <a href="https://profiles.rice.edu/faculty/omid-veiseh" target="_blank">Omid Veiseh</a>, a bioengineer at Rice University who studies diabetes cell therapies but was not involved in the research. Incorporating cues like those delivered by the embedded electronics “could further enhance differentiation trajectories,” he says. 
“It’s really innovative.” </p><p>Second, the bioelectronic scaffolds could one day act as built-in health monitors, providing real-time feedback on islet performance so clinicians can adjust treatment if function begins to falter—a strategy also being explored by companies such as <a href="https://www.minutia.co/" target="_blank">Minutia</a>, though using different device-based approaches.<strong></strong> </p><p>And third, by coupling sensing with stimulation, the system points toward a closed-loop future: engineered islets equipped with AI-driven sensors that detect rising glucose and automatically boost insulin output through targeted electrical pulses that nudge the cells back on track.</p><p>“I see a lot of value here,” says <a href="https://sites.wustl.edu/millmanlab/" target="_blank">Jeffrey Millman,</a> a bioengineer at Washington University who <a href="https://linkinghub.elsevier.com/retrieve/pii/S0092-8674(14)01228-8" target="_blank">helped develop the protocol</a> used to create Vertex’s stem-cell-derived therapy and continues to work on <a href="https://doi.org/10.1016/j.stem.2023.04.002" target="_blank">improving the maturation and function</a> of lab-grown islets.</p><p>But with major engineering, safety, and regulatory questions still to be resolved, don’t expect cyborg islets to enter clinical trials anytime soon, he cautions. 
In Millman’s view, the near-term payoff is far more practical: using the system to fine-tune differentiation in the lab to produce islets that secrete insulin more powerfully, respond faster to glucose swings, and require fewer cells to achieve the same therapeutic effect.</p><p>It may not be as flashy as a high-tech, closed-loop implant, Millman notes, but getting the cells right from the start should yield better therapies in the end.</p>]]></description><pubDate>Wed, 04 Mar 2026 13:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/cyborg-stem-cell-therapy-for-diabetes</guid><category>Stem-cells</category><category>Diabetes</category><category>Regenerative-medicine</category><category>Soft-electronics</category><category>Pancreas</category><category>Organoid</category><dc:creator>Elie Dolgin</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-three-dimensional-rendering-of-immunostains-on-a-pancreatic-organoid-the-stains-from-most-abundant-to-least-include-insulin.jpg?id=65112030&amp;width=980"></media:content></item><item><title>Xiangyi Cheng Is Bringing AR to Classrooms and Hospitals</title><link>https://spectrum.ieee.org/xiangyi-cheng-ar-classrooms-hospitals</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/xiangyi-cheng-pointing-at-a-robotic-arm-in-a-lab-setting-next-to-her-is-a-young-adult-woman-wearing-a-virtual-reality-headset.jpg?id=65096065&width=1200&height=400&coordinates=0%2C729%2C0%2C730"/><br/><br/><p>When<a href="https://cse.lmu.edu/department/mechanicalengineering/faculty/?expert=xiangyi.cheng" rel="noopener noreferrer" target="_blank"> Xiangyi Cheng</a> published her first journal paper as a principal investigator in<a href="https://ieeeaccess.ieee.org/?gad_source=1&gad_campaignid=19948279603&gbraid=0AAAAApgaRM9zNBBlw-jYd7UE0gSXKor4y&gclid=Cj0KCQiA-YvMBhDtARIsAHZuUzLiREaQgsad40vwttsLGsVt00CzNOVcrZY4taO2lvzsqnbC8Q7hvBQaAuHqEALw_wcB" rel="noopener noreferrer" target="_blank"> <em><em>IEEE Access</em></em></a> in 2024, it marked more than a professional milestone. For Cheng, an IEEE member and an assistant professor of mechanical engineering at<a href="https://www.lmu.edu/" rel="noopener noreferrer" target="_blank"> Loyola Marymount University</a>, in Los Angeles, it was the latest waypoint in a career shaped by curiosity, persistence, and a belief that technology should serve people—not the other way around.</p><p>The paper’s title was “<a href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10419075" rel="noopener noreferrer" target="_blank">Mobile Devices or Head-Mounted Displays: A Comparative Review and Analysis of Augmented Reality in Healthcare</a>.”</p><h3>XIANGYI CHENG</h3><br/><p><strong>Employer </strong></p><p><strong></strong>Loyola Marymount University, in Los Angeles</p><p><strong>Title </strong></p><p><strong></strong>Assistant professor of mechanical engineering</p><p><strong>Member grade </strong></p><p><strong></strong>Member</p><p><strong>Alma maters </strong></p><p><strong></strong>China University of Mining and Technology; Texas A&M University</p><p>Cheng’s work spans<a href="https://spectrum.ieee.org/topic/robotics/" target="_self"> robotics</a>, intelligent systems, 
<a href="https://spectrum.ieee.org/tag/human-machine-interaction" target="_self">human-machine interaction</a>, and<a href="https://spectrum.ieee.org/topic/artificial-intelligence/" target="_self"> artificial intelligence</a>. It has applications in patient-specific surgical planning, an approach whereby treatment is customized to the anatomy and clinical needs of each individual.</p><p>Her research also covers<a href="https://spectrum.ieee.org/tag/wearables" target="_self"> wearables</a> for rehabilitation and<a href="https://www.ibm.com/think/topics/augmented-reality" rel="noopener noreferrer" target="_blank"> augmented-reality</a>-enhanced engineering education.</p><p>The throughline of her career is sound judgment based on critical thinking. She urges her students to avoid the temptation to accept the answers they’re given by AI without cross-checking them against their own foundational understanding of the subject matter.</p><p>“AI can give you ideas,” Cheng says, “but it should never lead your thinking.”</p><p>That principle—honed through uncertainty, disciplinary shifts, and hard-earned confidence—has made Cheng an emerging voice in applied intelligent systems and a thoughtful educator preparing students for an AI-saturated world.</p><h2>From Xi’an to Beijing: A mind drawn to mathematics</h2><p>Cheng, born in <a href="https://www.britannica.com/place/Xian-China" rel="noopener noreferrer" target="_blank">Xi’an, China</a>, grew up in a household shaped by her parents’ disparate careers. Her father was a mining engineer, and her mother taught Chinese and literature at a high school.</p><p>“That contrast between logical and literary thinking helped me understand myself early,” Cheng says. 
“I liked math, and STEM felt natural to me.”</p><p>Several teachers reinforced her inclination, she says, particularly a math teacher whose calm, fair approach emphasized reasoning over punishments such as detention for misbehavior or failure to complete assignments.</p><p>“It wasn’t about being right,” Cheng says. “It was about thinking clearly.”</p><p>In 2011 she enrolled at the <a href="https://english.cumtb.edu.cn/" rel="noopener noreferrer" target="_blank">China University of Mining and Technology (Beijing)</a>, where she studied mechanical engineering. After graduating with a bachelor’s degree in 2015, she was unsure where the field would take her.</p><h2>An IEEE paper changed her trajectory</h2><p>Later in 2015, she traveled to the United States to study at<a href="https://case.edu/?campaignid=20602013936&adgroupid=154678129432&adid=675596328898&gad_source=1&gad_campaignid=20602013936&gbraid=0AAAAADHbx0VJm2eRyZsMlLOp8nqtMVwNX&gclid=Cj0KCQiA-YvMBhDtARIsAHZuUzLeRv-IjpkjT25nzbJLmuPBgndVAcirkurp9VNZxYujWgU2vMAOML8aAnHyEALw_wcB" rel="noopener noreferrer" target="_blank"> Case Western Reserve University</a>, in Cleveland.</p><p>She initially viewed the move as exploratory rather than a long-term commitment.</p><p>“I wasn’t thinking about a Ph.D.,” she says. “I wasn’t even sure research was for me.”</p><p>That uncertainty shifted in 2017, when Cheng submitted her <a href="https://ieeexplore.ieee.org/document/8460779" rel="noopener noreferrer" target="_blank">“IntuBot: Design and Prototyping of a Robotic Intubation Device</a>” paper to the<a href="https://2025.ieee-icra.org/" rel="noopener noreferrer" target="_blank"> IEEE International Conference on Robotics and Automation</a> (ICRA)—which was accepted.</p><p class="pull-quote"><span>“AI can give you more possibilities, but thinking is still our responsibility.”</span></p><p>Intubation is a procedure in which an endotracheal tube is inserted into a patient’s airway—usually through the mouth—to help them breathe. 
Because placing the tube correctly is not simple and usually must be done quickly, it requires training. That’s why research into robotic or assisted intubation systems focuses on improving speed, accuracy, and safety.</p><p>She presented her findings at ICRA in 2018, giving her early exposure to a global research community.</p><p>“That acceptance gave me confidence,” she recalls. “It showed me I could contribute to the field.”</p><p>Her advisor at Case Western encouraged her to switch from the mechanical engineering master’s program to the Ph.D. track. When the advisor moved to<a href="https://www.tamu.edu/index.html" target="_blank"> Texas A&M University</a>, in College Station, in 2019, Cheng decided to transfer. She completed her Ph.D. in mechanical engineering at Texas A&M in 2022.</p><p>Although she didn’t earn a degree from Case Western, she credits her experience there with clarifying her professional direction.</p><p>Shortly after graduating with her Ph.D., Cheng was hired as an assistant professor of mechanical engineering at <a href="https://www.onu.edu/" target="_blank">Ohio Northern University</a>, in Ada. She left in 2024 to become an assistant professor at Loyola Marymount.</p><h2>Engineering for the body—and the classroom</h2><p>Cheng’s research focuses on human-centered engineering, particularly in health care. One of her major projects addresses<a href="https://my.clevelandclinic.org/health/diseases/23521-syndactyly-webbed-digits" target="_blank"> syndactyly</a>, a congenital condition in which a newborn’s fingers are fused at birth. 
Surgeons rely on their experience to estimate the size and shape of skin grafts to be taken from another part of the body for the corrective surgery.</p><p>She is developing technology to scan the patient’s hand, extract anatomical landmarks, and use finite element analysis—a computer-based method for predicting how a physical object will behave under real-world conditions—to determine the optimal graft size and shape.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Smiling portrait of Xiangyi Cheng." class="rm-shortcode" data-rm-shortcode-id="526dfbc3e04a391b6b58fa177291d09d" data-rm-shortcode-name="rebelmouse-image" id="399a0" loading="lazy" src="https://spectrum.ieee.org/media-library/smiling-portrait-of-xiangyi-cheng.jpg?id=65096141&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Xiangyi Cheng designs human-centered intelligent systems with applications in health care and education.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Xiangyi Cheng</small></p><p>“Everyone’s hand is different,” Cheng says. “So the surgery should be personalized.”</p><p>Another project centers on developing smart gloves to assist with hand rehabilitation, pairing the unaffected hand with the injured one so the person’s natural motion can help guide therapy.</p><p>She also is exploring augmented reality in engineering education, using immersive visualization and AI tools to help students grasp three-dimensional concepts that are difficult to convey through traditional learning tools. 
Such visualization lets students see and interact with a digital world as if they’re inside it instead of viewing it on a flat screen.</p><h2>Teaching balance in an AI-driven world</h2><p>Despite working at the forefront of AI-enabled systems, Cheng cautions her students to be judicious in their use of the technology so that they don’t rely on it too heavily.</p><p>“AI is not always right and perfect,” she says. “You still need to be able to judge whether the answers it provides are correct.”</p><p>As AI continues to reshape engineering, Cheng remains grounded in a simple principle, she says: “We should use these tools. But we should never let them replace our judgment. AI can give you more possibilities, but thinking is still our responsibility.”</p><p>In her lab and classroom, Cheng prioritizes independent thinking, critical evaluation, and persistence. Many of her research students are undergraduates, and she encourages them to take ownership of their work—planning ahead, testing ideas, and learning from failure.</p><p>“The students who succeed don’t give up easily,” she says.</p><p>What she finds most rewarding, she says, is watching students mature. Reserved first-year students often become confident seniors who can present complex work and manage demanding projects.</p><p>“Getting to witness that transformation is why I teach,” she says.</p><p>For students considering engineering, Cheng offers straightforward advice: “Focus on mathematics. Engineering looks hands-on, but math is the foundation behind everything.”</p><p>With practice and persistence, she says, students can succeed and find meaning in the field.</p><h2>Why IEEE continues to matter</h2><p>Cheng joined IEEE in 2017, the year she submitted her first paper to ICRA. 
The organization has remained central to her professional development, she says.</p><p>She has served as a reviewer for IEEE journals and conferences including<a href="https://ieeexplore.ieee.org/document/10368213" target="_blank"> <em><em>Robotics and Automation Letters</em></em></a>,<a href="https://www.ieee-tmrb.org/new/" target="_blank"> <em><em>Transactions on Medical Robotics and Bionics</em></em></a>,<a href="https://ieeexplore.ieee.org/document/6894708" target="_blank"> <em><em>Transactions on Robotics</em></em></a>, the<a href="https://www.iros25.org/" target="_blank"> International Conference on Intelligent Robots and Systems</a>, and ICRA.</p><p>IEEE’s interdisciplinary scope aligns naturally with her work, she says, adding that the organization is “one of the few places that truly welcomes research across boundaries.”</p><p>More personally, IEEE helped her see a future she had not initially imagined.</p><p>“That first conference was a turning point,” she says. “It helped me realize I belonged.”</p>]]></description><pubDate>Sat, 28 Feb 2026 19:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/xiangyi-cheng-ar-classrooms-hospitals</guid><category>Robotics</category><category>Ai</category><category>Ieee-member-news</category><category>Type-ti</category><category>Careers</category><category>Biomedical</category><dc:creator>Willie D. Jones</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/xiangyi-cheng-pointing-at-a-robotic-arm-in-a-lab-setting-next-to-her-is-a-young-adult-woman-wearing-a-virtual-reality-headset.jpg?id=65096065&amp;width=980"></media:content></item><item><title>Bond Strength, Biocompatibility, and Beyond</title><link>https://content.knowledgehub.wiley.com/a-guide-to-selecting-adhesives-for-medical-device-applications/</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/master-bond-logo.png?id=28859628&width=980"/><br/><br/><p>Designing a medical device? This whitepaper helps you evaluate adhesive options for biocompatibility, sterilization resistance, and manufacturability — so you can make the right material decision early.</p><p><strong>What Readers Will Learn</strong></p><ol><li>How to select between epoxy, silicone, cyanoacrylate, and UV/LED curable adhesives based on your device requirements</li><li>Which adhesive systems meet USP Class VI and ISO 10993-5 biocompatibility standards</li><li>How different sterilization methods, such as autoclaving, EtO, gamma, and chemical immersion, affect adhesive performance over repeated cycles</li><li>Why integrating adhesive selection early in the design process reduces costly trade-offs between performance and manufacturability</li></ol><p><span><a href="https://content.knowledgehub.wiley.com/a-guide-to-selecting-adhesives-for-medical-device-applications/" target="_blank">Download this free whitepaper now!</a></span></p>]]></description><pubDate>Fri, 27 Feb 2026 11:00:02 +0000</pubDate><guid>https://content.knowledgehub.wiley.com/a-guide-to-selecting-adhesives-for-medical-device-applications/</guid><category>Type-whitepaper</category><category>Adhesive</category><category>Medical-devices</category><category>Biocompatibility</category><dc:creator>Master Bond</dc:creator><media:content medium="image" type="image/png" url="https://assets.rbl.ms/28859628/origin.png"></media:content></item><item><title>Achieving Micron-Level Tolerances: CAD Optimization for Sub-10µm 3D Printing</title><link>https://content.knowledgehub.wiley.com/designing-for-precision-cad-tips-for-micro-scale-3d-printing/</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/boston-micro-fabrication-logo-with-red-cubic-design-next-to-bold-bmf-text.png?id=64987960&width=980"/><br/><br/><p><span>Achieve successful micro-scale 3D prints by optimizing tolerances, wall thickness, support strategies, microfluidic channels, and material selection in your CAD models from the start.</span></p><p><strong><span>What You Will Learn</span></strong></p><ol><li><span>Tolerance-driven design: How to define resolution and tolerance constraints that translate directly from CAD intent to sub-10µm printed geometry.</span></li><li><span>Geometry-aware fabrication: Principles for engineering wall thickness, aspect ratios, and orientation to maintain structural fidelity at micron scale.</span></li><li><span>Support-free design strategies: Leveraging self-supporting geometries and build orientation to preserve feature integrity without post-processing trade-offs.</span></li><li><span>Integrated material-process thinking: Matching resin properties, shrinkage behavior, and export parameters to your application’s functional requirements.</span></li></ol><div><span><a href="https://content.knowledgehub.wiley.com/designing-for-precision-cad-tips-for-micro-scale-3d-printing/" target="_blank">Download this free whitepaper now!</a></span></div>]]></description><pubDate>Thu, 26 Feb 2026 11:00:02 +0000</pubDate><guid>https://content.knowledgehub.wiley.com/designing-for-precision-cad-tips-for-micro-scale-3d-printing/</guid><category>3d-printing</category><category>Microfluidics</category><category>Fabrication</category><category>Type-whitepaper</category><dc:creator>Boston Micro Fabrication</dc:creator><media:content medium="image" type="image/png" url="https://assets.rbl.ms/64987960/origin.png"></media:content></item><item><title>Your Watch Will One Day Track Blood 
Pressure</title><link>https://spectrum.ieee.org/blood-pressure-monitor-smartwatch</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/illustration-of-a-smartwatch-on-a-hand-showing-a-blood-pressure-reading.jpg?id=64960268&width=1200&height=400&coordinates=0%2C292%2C0%2C292"/><br/><br/><p>Your smartwatch can track a lot of things, but at least for now, it can’t keep an accurate eye on your blood pressure. Last week <a href="https://sites.utexas.edu/yjia/" target="_blank">researchers from the University of Texas at Austin</a> showed a way your smartwatch someday could. They were able to discern blood pressure by reflecting radio signals off a person’s wrist, and they plan to integrate the electronics that did it into a smartwatch in a couple of years.</p><p>Besides the tried-and-true blood-pressure cuff, researchers have found several new ways to monitor blood pressure using pasted-on <a href="https://spectrum.ieee.org/wearable-ultrasound-wireless" target="_blank">ultrasound transducers</a>, electrocardiogram sensors, bioimpedance measurements, <a href="https://spectrum.ieee.org/measure-your-blood-pressure-using-just-your-phone" target="_blank">photoplethysmography</a>, and combinations of these measurements.</p><p>“We found that existing methods all face limitations,” <a href="https://www.researchgate.net/scientific-contributions/Yiming-Han-2262877830" target="_blank">Yiming Han</a>, a doctoral candidate in the lab of <a href="https://www.ece.utexas.edu/people/faculty/yaoyao-jia" target="_blank">Yaoyao Jia</a>, told engineers at the <a href="https://www.isscc.org/" target="_blank">IEEE International Solid-State Circuits Conference (ISSCC)</a> last week in San Francisco. For example, ultrasound sensing requires long-term contact with the skin. And as cool as <a data-linked-post="2672222291" href="https://spectrum.ieee.org/electronic-tattoo" target="_blank">electronic tattoos</a> seem, they’re not as convenient or comfortable as a smartwatch. 
Photoplethysmography, which detects the oxygenation state of blood using light, doesn’t need direct contact, and indeed <a href="https://www.nature.com/articles/s41598-025-07087-2" target="_blank">researchers in Tehran and California recently used it</a> and a heavy dose of machine learning to monitor blood pressure. However, these sensors are <a href="https://publichealth.jhu.edu/2024/pulse-oximeters-racial-bias" target="_blank">thought to be sensitive to a person’s skin tone</a> and were blamed when Black people in the United States received <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC9257583/" target="_blank">inadequate treatment during the COVID-19 pandemic</a>.</p><p>The University of Texas team sought a noncontact solution that was immune to skin-tone bias and could be integrated into a small device.</p><h2>Continuous Blood Pressure Monitoring</h2><p>Blood pressure measurements consist of two readings—<a href="https://en.wikipedia.org/wiki/Systole" target="_blank">systole</a>, the peak pressure when the heart contracts and forces blood into arteries, and <a href="https://en.wikipedia.org/wiki/Diastole" target="_blank">diastole</a>, the phase in between heart contractions when pressure drops. During systole, blood vessels expand and stiffen and blood velocity increases. The opposite occurs in diastole.</p><p>All these changes alter conductivity, dielectric properties, and other tissue characteristics, so they should show up in reflected near-field radio waves, Jia’s colleague <a href="https://www.ece.utexas.edu/people/faculty/deji-akinwande" target="_blank">Deji Akinwande</a> reasoned. Near-field waves are radiation impacting a surface that is less than one wavelength from the radiation’s source.</p><p>The researchers were able to test this idea using a common laboratory instrument called a <a href="https://www.tek.com/en/documents/primer/what-vector-network-analyzer-and-how-does-it-work" target="_blank">vector network analyzer</a>. 
Among its abilities, the analyzer can sense RF reflection, and the team was able to quickly correlate the radio response to blood pressure measured using standard medical equipment.</p><p>What Akinwande and Jia’s team saw was this: During systole, reflected near-field waves were more strongly out of phase with the transmitted radiation, while in diastole the reflections were weaker and closer to being in phase with the transmission.</p><p>You obviously can’t lug around a <a href="https://www.keysight.com/used/us/en/network-impedance-analyzers?gad_source=1&gad_campaignid=22103374539&gbraid=0AAAAApOLManrO_SNr8vg_JstXXglDwLFy&gclid=CjwKCAiAwNDMBhBfEiwAd7ti1CVGILi4MGmMcdQ7CW_vAlTM5pCKCuJSlycmsC0l440OSlc-ZrVjwxoC5DsQAvD_BwE" rel="noopener noreferrer" target="_blank">US $50,000 analyzer</a> just to keep track of your blood pressure, so the team created a wearable system to do the job. It consists of a patch antenna strapped to a person’s wrist. The antenna connects to a device called a circulator—a kind of traffic roundabout for radio signals that steers outgoing signals to the antenna and signals coming in from the antenna to a separate circuit. A custom-designed integrated circuit feeds a 2.4-gigahertz microwave signal into one of the circulator’s on-ramps and receives, amplifies, and digitizes the much weaker reflection coming in from another branch. The whole system consumes just 3.4 milliwatts.</p><p>“Our work is the only one to provide no skin contact and no skin-tone bias,” Han said.</p><p>The next version of the device will use multiple radio frequencies to increase accuracy, says Jia, “because different people’s tissue conditions are different,” and some might respond better to one or another. 
Like the 2.4 GHz used in the prototype, these other frequencies will be of the sort already in common use, such as 5 GHz (a <a href="https://spectrum.ieee.org/wi-fi-7" target="_blank">Wi-Fi</a> frequency) and 915 megahertz (a cellular frequency).</p><p>Following those experiments, Jia’s team will turn to building the device into a smartwatch form factor and testing it more broadly for possible commercialization.</p>]]></description><pubDate>Tue, 24 Feb 2026 15:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/blood-pressure-monitor-smartwatch</guid><category>Blood-pressure</category><category>Continuous-monitoring</category><category>Smart-watch</category><category>Wearable-sensors</category><dc:creator>Samuel K. Moore</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/illustration-of-a-smartwatch-on-a-hand-showing-a-blood-pressure-reading.jpg?id=64960268&amp;width=980"></media:content></item><item><title>Tomorrow’s Smart Pills Will Deliver Drugs and Take Biopsies</title><link>https://spectrum.ieee.org/ingestible-electronics</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/miniature-figures-in-lab-coats-seated-inside-half-of-a-red-capsule-next-to-a-circuit-board.png?id=64957587&width=1200&height=400&coordinates=0%2C306%2C0%2C307"/><br/><br/><p><strong>One day soon, </strong>a doctor might prescribe a pill that doesn’t just deliver medicine but also reports back on what it finds inside you—and then takes actions based on its findings.</p><p>Instead of scheduling an endoscopy or CT scan, you’d swallow an electronic capsule smaller than a multivitamin. As it travels through your digestive system, it could check tissue health, look for cancerous changes, and send data to your doctor. It could even release drugs exactly where they’re needed or snip a tiny biopsy sample before passing harmlessly out of your body.</p><div class="rm-embed embed-media"><iframe height="110px" id="noa-web-audio-player" src="https://embed-player.newsoveraudio.com/v4?key=q5m19e&id=https://spectrum.ieee.org/ingestible-electronics&bgColor=F5F5F5&color=1b1b1c&playColor=1b1b1c&progressBgColor=F5F5F5&progressBorderColor=bdbbbb&titleColor=1b1b1c&timeColor=1b1b1c&speedColor=1b1b1c&noaLinkColor=556B7D&noaLinkHighlightColor=FF4B00&feedbackButton=true" style="border: none" width="100%"></iframe></div><p><span>This dream of a do-it-all pill is driving a surge of research into ingestible electronics: smart capsules designed to monitor and even treat disease from inside the gastrointestinal (GI) tract. The stakes are high. GI diseases affect tens of millions of people worldwide, including such ailments as </span><a href="https://www.mayoclinic.org/diseases-conditions/inflammatory-bowel-disease/symptoms-causes/syc-20353315" target="_blank">inflammatory bowel disease</a><span>, </span><a href="https://www.mayoclinic.org/diseases-conditions/celiac-disease/symptoms-causes/syc-20352220" target="_blank">celiac disease</a><span>, and small intestinal bacterial overgrowth. 
Diagnosis often involves a frustrating maze of blood tests, imaging, and invasive endoscopy. Treatments, meanwhile, can bring serious side effects because drugs affect the whole body, not just the troubled gut.</span></p><p>If capsules could handle much of that work—streamlining diagnosis, delivering targeted therapies, and sparing patients repeated invasive procedures—they could transform care. Over the past 20 years, researchers have built a growing tool kit of ingestible devices, some already in clinical use. These capsule-shaped devices typically contain sensors, circuitry, a power source, and sometimes a communication module, all enclosed in a biocompatible shell. But the next leap forward is still in development: autonomous capsules that can both sense and act, releasing a drug or taking a tissue sample.</p><p>That’s the challenge that our lab—the <a href="https://umdmsal.com/" target="_blank">MEMS Sensors and Actuators Laboratory</a> (MSAL) at the University of Maryland, College Park—is tackling. Drawing on decades of advances in <a href="https://spectrum.ieee.org/collections/mems-at-40/" target="_self">microelectromechanical systems</a> (MEMS), we’re building swallowable devices that integrate sensors, actuators, and wireless links in packages that are small and safe enough for patients. The hurdles are considerable: power, miniaturization, biocompatibility, and reliability, to name a few. But the potential payoff will be a new era of personalized and minimally invasive medicine, delivered by something as simple as a pill you can swallow at home.</p><h2>The Origin of Ingestible Devices </h2><p>The idea of a smart capsule has been around since the late 1950s, when researchers first experimented with swallowable devices to record temperature, gastric pH, or pressure inside the digestive tract. 
At the time, it seemed closer to science fiction than clinical reality, bolstered by pop-culture visions like the 1966 film <a href="https://en.wikipedia.org/wiki/Fantastic_Voyage" target="_blank"><em>Fantastic Voyage</em></a>, where miniaturized doctors travel inside the human body to treat a blood clot.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A gloved hand holds a small electronic capsule, with a researcher in lab safety gear blurred in the background." class="rm-shortcode" data-rm-shortcode-id="804393e2863a310effa646fe1f2fe8af" data-rm-shortcode-name="rebelmouse-image" id="d9098" loading="lazy" src="https://spectrum.ieee.org/media-library/a-gloved-hand-holds-a-small-electronic-capsule-with-a-researcher-in-lab-safety-gear-blurred-in-the-background.png?id=64075029&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">One of the authors (Ghodssi) holds a miniaturized drug-delivery capsule that’s designed to release medication at specific sites in the gastrointestinal tract.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Maximilian Franz/Engineering at Maryland Magazine </small></p><p>For decades, though, the mainstay of GI diagnostics was endoscopy: a camera on a flexible tube, threaded down the throat or up through the colon. These procedures are quite invasive and require patients to be sedated, which increases both the risk of complications and procedural costs. What’s more, it’s difficult for endoscopes to safely traverse the circuitous pathway of the small intestine. The situation changed in the early 2000s, when video-capsule endoscopy arrived. 
The best-known product, <a href="https://www.medtronic.com/en-us/healthcare-professionals/products/digestive-gastrointestinal/capsule-endoscopy/endoscopy-systems/pillcam-sb-3-capsule-endoscopy-system.html" target="_blank">PillCam</a>, looks like a large vitamin but contains a camera, LEDs, and a transmitter. As it passes through the gut, it beams images and videos to a wearable device.</p><p>Today, capsule endoscopy is a routine tool in gastroenterology; ingestible devices can measure acidity, temperature, or gas concentrations. And researchers are pushing further, with experimental prototypes that deliver drugs or analyze the microbiome. For example, teams from <a href="https://engineering.tufts.edu/news-events/news/ingestible-microbiome-sampling-pill-technology-advances" target="_blank">Tufts University</a>, in Massachusetts, and <a href="https://www.sciencedirect.com/science/article/abs/pii/S1742706125002685" target="_blank">Purdue University</a>, in Indiana, are working on devices with dissolvable coatings and mechanisms to collect <a href="https://spectrum.ieee.org/swallowable-robotic-pill-gut-health" target="_blank">samples of liquid</a> for studies of the intestinal microbiome.</p><p>Still, all those devices are passive. They activate on a timer or by exposure to the neutral pH of the intestines, but they don’t adapt to conditions in real time. The next step requires capsules that can sense biomarkers, make decisions, and trigger specific actions—moving from clever hardware to truly autonomous “smart pills.” That’s where our work comes in.</p><h2>Building on MEMS technology </h2><p>Since 2017, MSAL has been pushing ingestible devices forward with the goal of making an immediate impact in health care. The group built on the MEMS community’s legacy in microfabrication, sensors, and system integration, while taking advantage of new tools like 3D printing and materials like biocompatible polymers. 
Those advances have made it possible to prototype faster and make devices smaller, sparking a wave of innovation in wearables, implants, and now ingestibles. Today, MSAL is collaborating with engineers, physicians, and data scientists to move these capsules from lab benches to pharmaceutical trials.</p><p>As a first step, back in 2017, we set out to design sensor-carrying capsules that could reliably reach the small intestine and indicate when they reached it. Another challenge was that sensors that work well on the benchtop can falter inside the gut, where shifting pH, moisture, digestive enzymes, and low-oxygen conditions can degrade typical sensing components.</p><p> Our earliest prototype adapted MEMS sensing technology to <a href="https://pubs.rsc.org/en/content/articlelanding/2020/lc/d0lc00133c" target="_blank">detect abnormal enzyme levels</a> in the duodenum that are linked to pancreatic function. The sensor and its associated electronics were enclosed in a biocompatible, 3D-printed shell coated with polymers that dissolved only at certain pH levels. This strategy could one day be used to spot biomarkers in secretions from the pancreas, enabling detection of early-stage cancer.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="High-speed footage shows a small mechanical arm extending from a capsule and contacting intestinal tissue." 
class="rm-shortcode" data-rm-shortcode-id="0755148f67e43a08b6fad935757e7959" data-rm-shortcode-name="rebelmouse-image" id="b0383" loading="lazy" src="https://spectrum.ieee.org/media-library/high-speed-footage-shows-a-small-mechanical-arm-extending-from-a-capsule-and-contacting-intestinal-tissue.gif?id=64075037&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A high-speed video shows how a capsule deploys microneedles to deliver drugs into intestinal tissue.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.cell.com/device/fulltext/S2666-9986(24)00281-3" target="_blank">University of Maryland/Elsevier</a> </small></p><p>That first effort with a passive device taught us the fundamentals of capsule design and opened the door to new applications. Since then, we’ve developed sensors that can track biomarkers such as <a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/adhm.202302897" target="_blank">the gas hydrogen sulfide</a>, neurotransmitters such as serotonin and dopamine, and bioimpedance—a measure of how easily ions pass through intestinal tissue—to shed light on the gut microbiome, inflammation, and disease progression. In parallel, we’ve worked on more-active devices: capsule-based tools for controlled drug release and tissue biopsy, using low-power actuators to trigger precise mechanical movements inside the gut.</p><p>Like all new medical devices and treatments, ingestible electronics face many hurdles before they reach patients—from earning physician trust and insurance approval to demonstrating clear benefits, safety, and reliability. Packaging is a particular focus, as the capsules must be easy to swallow yet durable enough to survive stomach acid. The field is steadily proving safety and reliability, progressing from proof of concept in tissue, through the different stages of animal studies, and eventually to human trials. 
Every stage provides evidence that reassures doctors and patients—for example, showing that ingesting a properly packaged tiny battery is safe, and that a capsule’s wireless signals, far weaker than those of a cellphone, pose no health risk as they pass through the gut.</p><h2>Engineering a Pill-Size Diagnostic Lab </h2><p>The gastrointestinal tract is packed with clues about health and disease, but much of it remains out of reach of standard diagnostic tools. Ingestible capsules offer a way in, providing direct access to the small intestine and colon. Yet in many cases, the concentrations of chemical biomarkers can be too low to detect reliably in early stages of a disease, which makes the engineering challenge formidable. What’s more, the gut’s corrosive, enzyme-rich environment can foul sensors in multiple ways, interfering with measurements and adding noise to the data.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Close-up of a microchip with a shiny surface and protruding thin pins." class="rm-shortcode" data-rm-shortcode-id="18db306a46dc405d56666f9fd9b5e3f4" data-rm-shortcode-name="rebelmouse-image" id="1d6c2" loading="lazy" src="https://spectrum.ieee.org/media-library/close-up-of-a-microchip-with-a-shiny-surface-and-protruding-thin-pins.png?id=64075109&width=980"/></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Close-up of a textured surface with triangular, raised patterns in a grid formation." 
class="rm-shortcode" data-rm-shortcode-id="8e62bb6ccfb3f8805b662480d24d3d4d" data-rm-shortcode-name="rebelmouse-image" id="79504" loading="lazy" src="https://spectrum.ieee.org/media-library/close-up-of-a-textured-surface-with-triangular-raised-patterns-in-a-grid-formation.png?id=64075101&width=980"/></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Electron microscope image of a microscale 3D printed pyramid with four conical structures." class="rm-shortcode" data-rm-shortcode-id="29456f61129308daca8309dfa4be54df" data-rm-shortcode-name="rebelmouse-image" id="1d64b" loading="lazy" src="https://spectrum.ieee.org/media-library/electron-microscope-image-of-a-microscale-3d-printed-pyramid-with-four-conical-structures.png?id=64075093&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Microneedle designs for drug-delivery capsules have evolved over the years. An early prototype [top] used microneedle anchors to hold a capsule in place. Later designs adopted molded microneedle arrays [center] for more uniform fabrication. The most recent version [bottom] integrates hollow microinjector needles, allowing more precise and controllable drug delivery.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">From top: <a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/admt.202201365" target="_blank">University of Maryland/Wiley;</a><a href="https://www.cell.com/device/fulltext/S2666-9986(24)00281-3" target="_blank">University of Maryland/Elsevier;</a><a href="https://pubs.acs.org/doi/10.1021/acsami.5c05183" target="_blank">University of Maryland/ACS</a> </small></p><p>Take, for example, inflammatory bowel disease, for which there is no standard clinical test. 
Rather than searching for a scarce biomarker molecule, our team focused on a physical change: the permeability of the gut lining, which is a key factor in the disease. We designed capsules that <a href="https://www.nature.com/articles/s41378-025-00877-8" target="_blank">measure the intestinal tissue’s bioimpedance</a> by sending tiny currents across electrodes and recording how the tissue resists or conducts those currents at different frequencies (a technique called impedance spectroscopy). To make the electrodes suitable for in vivo use, we coated them with a thin, conductive, biocompatible polymer that reduces electrical noise and keeps stable contact with the gut wall. The capsule finishes its job by transmitting its data wirelessly to our computers.</p><p>In our lab tests, the capsule performed impressively, delivering clean impedance readouts from excised pig tissue even when the sample was in motion. In our animal studies, it detected shifts in permeability triggered by calcium chelators, compounds that pry open the tight junctions between intestinal cells. These results suggest that ingestible bioimpedance capsules could one day give clinicians a direct, minimally invasive window into gut-barrier function and inflammation. We believe that ingestible diagnostics can serve as powerful tools—catching disease earlier, confirming whether treatments are working, and establishing a baseline for gut health.</p><h2>Drug Delivery at the Right Place, Right Time </h2><p>Targeted drug delivery is one of the most compelling applications for ingestible capsules. Many drugs for GI conditions—such as biologics for <a href="https://www.mayoclinic.org/diseases-conditions/inflammatory-bowel-disease/symptoms-causes/syc-20353315" target="_blank">inflammatory bowel disease</a>—can cause serious side effects that limit both dosage and duration of treatment. A promising alternative is delivering a drug directly to the diseased tissue. 
This localized approach boosts the drug’s concentration at the target site while reducing its spread throughout the body, which improves effectiveness and minimizes side effects. The challenge is engineering a device that can both recognize diseased tissue and deliver medication quickly and precisely.</p><p>With other labs making great progress on the sensing side, we’ve devoted our energy to designing devices that can deliver the medicine. We’ve developed miniature actuators—tiny moving parts—that meet strict criteria for use inside the body: low power, small size, biocompatibility, and long shelf life.</p><p>Some of our designs use <a href="https://www.cell.com/device/fulltext/S2666-9986(24)00281-3" target="_blank">soft and flexible polymer “cantilevers”</a> with attached microneedle systems that pop out from the capsule with enough force to release a drug, but without harming the intestinal tissue. While hollow microneedles can directly inject drugs into the intestinal lining, we’ve also demonstrated prototypes that use the <a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/admt.202201365" target="_blank">microneedles for anchoring</a> drug payloads, allowing the capsule to release a larger dose of medication that dissolves at an exact location over time.</p><p>In other experimental designs, we had the <a href="https://www.cell.com/device/fulltext/S2666-9986(24)00281-3" target="_blank">microneedles themselves dissolve after injecting a drug</a>. In still others, we used microscale 3D printing to <a href="https://www.cell.com/device/fulltext/S2666-9986(24)00281-3" target="_blank">tailor the structure of the microneedles</a> and control how quickly a drug is released—providing either a slow and sustained dose or a fast delivery. 
With this 3D printing, we created rigid microneedles that penetrate the mucosal lining and gradually diffuse the drug into the tissue, and soft microneedles that compress when the cantilever pushes them against the tissue, forcing the drug out all at once.</p><h2>Tissue Biopsy via Capsule</h2><div class="ieee-sidebar-medium"><h3>What Smart Capsules Can Do</h3><p><strong>Ingestible electronic capsules use miniaturized sensors and actuators to monitor the gut, deliver medication, and collect biological samples.</strong></p><h3>Sensing</h3><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Medical capsule emitting signals in a tube environment." class="rm-shortcode" data-rm-shortcode-id="275f3eb66ba35b660532a8ec93501ba8" data-rm-shortcode-name="rebelmouse-image" id="aecb7" loading="lazy" src="https://spectrum.ieee.org/media-library/medical-capsule-emitting-signals-in-a-tube-environment.png?id=64953223&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Embedded sensors can probe the gut—for example, measuring the bioimpedance of the intestinal lining to detect disease—and transmit the data wirelessly.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">All illustrations: Chris Philpot</small></p><h3>Drug delivery</h3><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Illustration of a capsule with spikes releasing medicine inside a transparent, tube-like structure." 
class="rm-shortcode" data-rm-shortcode-id="f762c675d0b5d405e3012639794eb86c" data-rm-shortcode-name="rebelmouse-image" id="1677e" loading="lazy" src="https://spectrum.ieee.org/media-library/illustration-of-a-capsule-with-spikes-releasing-medicine-inside-a-transparent-tube-like-structure.png?id=64953224&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Miniature actuators can trigger drug release at specific sites in the gut, boosting effectiveness while limiting side effects.</small></p><h3>Biopsy</h3><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Illustration of a capsule with gears, showing a magnified section with medicine release." class="rm-shortcode" data-rm-shortcode-id="03919d5e10aeb3b0b748887d2fd3896c" data-rm-shortcode-name="rebelmouse-image" id="1d94d" loading="lazy" src="https://spectrum.ieee.org/media-library/illustration-of-a-capsule-with-gears-showing-a-magnified-section-with-medicine-release.png?id=64953225&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A spring-loaded mechanism can collect a tiny biopsy sample from the gut wall and store it during the capsule’s passage through the digestive system.</small></p></div><p>Tissue sampling remains the gold standard diagnostic tool in gastroenterology, offering insights far beyond what doctors can glean from visual inspection or blood tests. Capsules hold unique promise here: They can travel the full length of the GI tract, potentially enabling more frequent and affordable biopsies than traditional procedures. But the engineering hurdles are substantial. To collect a sample, a device must generate significant mechanical force to cut through the tough, elastic muscle of the intestines—while staying small enough to swallow.</p><p><span>Different strategies have been explored to solve this problem. Torsion springs can store large amounts of energy but are difficult to fit inside a tiny capsule. 
Electrically driven mechanisms may demand more power than current capsule batteries can provide. Magnetic actuation is another option, but it requires bulky external equipment and precise tracking of the capsule inside the body.</span></p><p><span></span>Our group has developed a low-power biopsy system that builds on the torsion-spring approach. We compress a spring and use adhesive to “latch” it closed within the capsule, then attach a microheater to the latch. When we wirelessly send current to the device, the microheater melts the adhesive on the latch, triggering the spring. We’ve experimented with tissue-collection tools, integrating a bladed scraper or a biopsy punch (a cylindrical cutting tool) with our spring-activated mechanisms; either of those tools can cut and collect tissue from the intestinal lining. With advanced 3D printing methods like direct laser writing, we can put fine, microscale edges on these miniature cutting tools that make it easier for them to penetrate the intestinal lining.</p><p>Storing and protecting the sample until the capsule naturally passes through the body is a major challenge, requiring both preservation of the sample and resealing the capsule to prevent contamination. In one of our designs, residual tension in the spring keeps the bladed scraper rotating, pulling the sample into the capsule and effectively closing a hatch that seals it inside.</p><h2>The Road to Clinical Use for Ingestibles </h2><p>Looking ahead, we expect to see the first clinical applications emerge in early-stage screening. Capsules that can detect electrochemical, bioimpedance, or visual signals could help doctors make sense of symptoms like vague abdominal pain by revealing inflammation, gut permeability, tumors, or bacterial overgrowth. They could also be adapted to screen for GI cancers. 
This need is pressing: The American Cancer Society reports that as of 2021, <a href="https://www.cancer.org/content/dam/cancer-org/research/cancer-facts-and-statistics/colorectal-cancer-facts-and-figures/colorectal-cancer-facts-and-figures-2023.pdf" target="_blank">41 percent of eligible U.S. adults</a> were not up to date on colorectal cancer screening. What’s more, effective screening tools don’t yet exist for some diseases, such as <a href="https://www.mayoclinic.org/diseases-conditions/small-bowel-cancer/symptoms-causes/syc-20352497" target="_blank">small bowel adenocarcinoma</a>. Capsule technology could make screening less invasive and more accessible.</p><p>Of course, ingestible capsules carry risks. The standard hazards of endoscopy still apply, such as the possibility of bleeding and perforation, and capsules introduce new complications. For example, if a capsule gets stuck in its passage through the GI tract, it could cause bowel obstruction and require endoscopic retrieval or even surgery. And concerns that are specific to ingestibles, including the biocompatibility of materials, reliable encapsulation of electronics, and safe battery operation, all demand rigorous testing before clinical use.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A series of images shows a small paper-based battery gradually dissolving in a dish of water over 60 minutes. 
" class="rm-shortcode" data-rm-shortcode-id="5fde71dac4d588b943523a827629cebe" data-rm-shortcode-name="rebelmouse-image" id="8b2e3" loading="lazy" src="https://spectrum.ieee.org/media-library/a-series-of-images-shows-a-small-paper-based-battery-gradually-dissolving-in-a-dish-of-water-over-60-minutes.jpg?id=64075124&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">A microbe-powered biobattery designed for ingestible devices dissolves in water within an hour.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">            Seokheun Choi/Binghamton University        </small></p><p>Powering these capsules is a key challenge that must be solved on the path to the clinic. Most capsule endoscopes today rely on coin-cell batteries, typically silver oxide, which offer a safe and energy-dense source but often occupy 30 to 50 percent of the capsule’s volume. So researchers have investigated alternatives, from wireless power transfer to energy-harvesting systems. At the State University of New York at Binghamton, one team is exploring <a href="https://www.binghamton.edu/news/story/5482/binghamton-university-researchers-make-dissolvable-battery-using-probiotics" target="_blank">microbial fuel cells</a> that generate electricity from probiotic bacteria interacting with nutrients in the gut. At MIT, researchers used the <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC5404703/" target="_blank">gastric fluids of a pig’s stomach</a> to power a simple battery. In our own lab, we are exploring piezoelectric and electrochemical approaches to harvesting energy throughout the GI tract.</p><p>The next steps for our team are pragmatic ones: working with gastroenterologists and animal-science experts to put capsule prototypes through rigorous in vivo studies, then refining them for real-world use. 
That means shrinking the electronics, cutting power consumption, and integrating multiple functions into a single multimodal device that can sense, sample, and deliver treatments in one pass. Ultimately, any candidate capsule will require regulatory approval for clinical use, which in turn demands rigorous proof of safety and clinical effectiveness for a specific medical application.</p><p>The broader vision is transformative. Swallowable capsules could bring diagnostics and treatment out of the hospital and into patients’ homes. Whereas endoscopic procedures require anesthesia, patients could swallow ingestible electronics easily and routinely. Consider, for example, patients with inflammatory bowel disease who live with an elevated risk of cancer; a smart capsule could perform yearly cancer checks, while also delivering medication directly wherever necessary.</p><p>Over time, we expect these systems to evolve into semiautonomous tools: identifying lesions, performing targeted biopsies, and perhaps even analyzing samples and applying treatment in place. Achieving that vision will require advances at the very edge of microelectronics, materials science, and biomedical engineering, bringing together capabilities that once seemed impossible to combine in something the size of a pill. These devices hint at a future in which the boundary between biology and technology dissolves, and where miniature machines travel inside the body to heal us from within. 
<span class="ieee-end-mark"></span></p>]]></description><pubDate>Wed, 18 Feb 2026 15:14:00 +0000</pubDate><guid>https://spectrum.ieee.org/ingestible-electronics</guid><category>Drug-delivery</category><category>Microelectromechanical-systems</category><category>Mems</category><category>Ingestible-electronics</category><category>Sensors</category><dc:creator>Reza Ghodssi</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/miniature-figures-in-lab-coats-seated-inside-half-of-a-red-capsule-next-to-a-circuit-board.png?id=64957587&amp;width=980"></media:content></item><item><title>What the FDA’s 2026 Update Means for Wearables</title><link>https://spectrum.ieee.org/fda-medical-device-rules</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/illustration-of-a-smartwatch-with-an-eyeball-displayed-on-the-screen.jpg?id=64099717&width=1200&height=400&coordinates=0%2C229%2C0%2C230"/><br/><br/><p>As new consumer hardware and software capabilities have bumped up against medicine over the last few years, consumers and manufacturers alike have struggled to identify the line between “wellness” products (such as earbuds that can also amplify and clarify surrounding speakers’ voices) and regulated medical devices (such as conventional hearing aids). On 6 January 2026, the U.S. Food and Drug Administration issued new guidance documents clarifying how it interprets existing law for the review of wearable and AI-assisted devices. </p><p>The first document, for <a href="https://www.fda.gov/regulatory-information/search-fda-guidance-documents/general-wellness-policy-low-risk-devices" rel="noopener noreferrer" target="_blank">general wellness</a>, specifies that the FDA will interpret noninvasive sensors such as sleep trackers or heart rate monitors as low-risk wellness devices while treating invasive devices under conventional regulations. The other document defines how the FDA will exempt <a href="https://www.fda.gov/regulatory-information/search-fda-guidance-documents/clinical-decision-support-software" rel="noopener noreferrer" target="_blank">clinical-decision support tools</a> from medical device regulations, limiting such software to analyzing existing data rather than extracting data from sensors, and requiring them to enable independent review of their recommendations. The documents do not rewrite any statutes, but they refine interpretation of existing law, compared to the 2019 and 2022 documents they replace. 
They offer a fresh lens on how regulators see technology that sits at the intersection of consumer electronics, software, and medicine—a category many other countries are choosing to regulate more strictly rather than less.</p><h2>What the 2026 update changed</h2><p>The 2026 FDA update clarifies how the agency distinguishes “medical information” from physiological “signals” or “patterns” measured by monitoring systems. Earlier guidance discussed these concepts more generally, but the new version defines signal-measuring systems as those that collect continuous, near-continuous, or streaming data from the body for medical purposes, such as home devices transmitting blood pressure, <a href="https://spectrum.ieee.org/should-you-trust-apples-new-blood-oxygen-sensor" target="_blank">oxygen saturation</a>, or <a href="https://spectrum.ieee.org/smartphone-camera-senses-patients-pulse-breathing-rate" target="_blank">heart rate</a> to clinicians. It gives more concrete examples, like a blood glucose lab result as medical information versus continuous glucose monitor readings as signals or patterns.</p><p>The updated guidance also sharpens examples of what counts as medical information that software may display, analyze, or print. These include radiology reports or summaries from legally marketed software, ECG reports annotated by clinicians, blood pressure results from cleared devices, and lab results stored in electronic health records. </p><p>In addition, the 2026 update softens the FDA’s earlier stance on clinical decision tools that offer only one recommendation. While prior guidance suggested tools needed to present multiple options to avoid regulation, the FDA now indicates that a single recommendation may be acceptable if only one option is clinically appropriate, though it does not define how that determination will be made. 
</p><p>Separately, updates to the general wellness guidance clarify that some noninvasive wearables—such as optical sensors estimating blood glucose for wellness or nutrition awareness—may qualify as general wellness products, while more-invasive technologies would not.</p><h2>Wellness still requires accuracy</h2><p>For designers of wearable health devices, the practical implications go well beyond what label you choose. “Calling something ‘wellness’ doesn’t reduce the need for rigorous validation,” says <a href="https://ece.gatech.edu/directory/omer-t-inan" rel="noopener noreferrer" target="_blank">Omer Inan</a>, a medical device technology researcher at the Georgia Tech School of Electrical and Computer Engineering. A wearable that reports blood pressure inaccurately could lead a user to conclude that their values are normal when they are not, potentially influencing decisions about seeking clinical care.</p><p>“In my opinion, engineers designing devices to deliver health and wellness information to consumers should not change their approach based on this new guidance,” says Inan. Certain measurements—such as blood pressure or glucose—carry real medical consequences regardless of how they’re branded, Inan notes.</p><p>Unless engineers follow robust validation protocols for technology delivering health and wellness information, Inan says, consumers and clinicians alike face the risk of faulty information.</p><p>To address that, Inan advocates for transparency: Companies should publish their validation results in peer-reviewed journals, and independent third parties without financial ties to the manufacturer should evaluate these systems. 
That approach, he says, helps the engineering community and the broader public assess the accuracy and reliability of wearable devices.</p><h2>When wellness meets medicine</h2><p>The societal and clinical impacts of wearables are already visible, regardless of regulatory labels, says Sharona Hoffman, JD, a law and bioethics professor at Case Western Reserve University.</p><p>Medical metrics from devices like the Apple Watch or Fitbit may be framed as “wellness,” but in practice many users treat them like medical data, influencing their behavior or decisions about care, Hoffman points out.</p><p>“It could cause anxiety for patients who constantly check their metrics,” she notes. Alternatively, “A person may enter a doctor’s office confident that their wearable has diagnosed their condition, complicating clinical conversations and decision-making.”</p><p>Moreover, privacy issues remain unresolved, unmentioned in previous or updated guidance documents. Many companies that design wellness devices fall outside protections like the Health Insurance Portability and Accountability Act (HIPAA), meaning data about health metrics could be collected, shared, or sold without the same constraints as traditional medical data. “We don’t know what they’re collecting information about or whether marketers will get hold of it,” Hoffman says. </p><h2>International approaches</h2><p>The European Union’s Artificial Intelligence Act designates systems that process health-related data or influence clinical decisions as “high risk,” subjecting them to stringent requirements around data governance, transparency, and human oversight. China and South Korea have also implemented rules that tighten controls on algorithmic systems that intersect with health care or public-facing use cases. 
South Korea gives technology makers very specific regulatory categories, such as <a href="https://www.mfds.go.kr/eng/brd/m_40/list.do" rel="noopener noreferrer" target="_blank">standards on labeling and descriptions on medical devices and good manufacturing practices</a>. </p><p>Across these regions, regulators are classifying technology not only by its intended use but also by its potential impact on individuals and society at large.</p><p>“Other countries that emphasize technology are still worrying about data privacy and patients,” Hoffman says. “We’re going in the opposite direction.”</p><h2>Post-market oversight </h2><p>“Regardless of whether something is FDA approved, these technologies will need to be monitored in the sites where they’re used,” says Todd R. Johnson, a professor of biomedical informatics at the McWilliams School of Biomedical Informatics at UTHealth Houston, who has worked on FDA-regulated products and informatics in clinical settings. “There’s no way the makers can ensure ahead of time that all of the recommendations will be sound.”</p><p>Large health systems may have the capacity to audit and monitor tools, but smaller clinics often do not. Monitoring and auditing are not emphasized in the current guidance, raising questions about how reliability and safety will be maintained once devices and software are deployed widely.</p><h2>Balancing innovation and safety</h2><p>For engineers and developers, the FDA’s 2026 guidance presents both opportunities and responsibilities. By clarifying what counts as a regulated device, the agency may reduce upfront barriers for some categories of technology. But that shift also places greater weight on design rigor, validation transparency, and post-market scrutiny. </p><p>“Device makers do care about safety,” Johnson says. “But regulation can increase barriers to entry while also increasing safety and accuracy. 
There’s a trade-off.”</p>]]></description><pubDate>Thu, 12 Feb 2026 14:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/fda-medical-device-rules</guid><category>Wearable-devices</category><category>Fda</category><category>Medical-devices</category><dc:creator>Catherine Arnold</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/illustration-of-a-smartwatch-with-an-eyeball-displayed-on-the-screen.jpg?id=64099717&amp;width=980"></media:content></item><item><title>At-Home Brain Stimulation for Depression Is Just the Start</title><link>https://spectrum.ieee.org/flow-neuroscience-tdcs-depression-fda</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/person-wearing-a-futuristic-white-headband-device-with-a-sleek-design.jpg?id=63843851&width=1200&height=400&coordinates=0%2C1042%2C0%2C1042"/><br/><br/><p>For years, a small group of technology enthusiasts have been applying gentle electrical current to their brains in an effort to gain cognitive benefits, improve sleep, or aid memory. While brain stimulation, also referred to as neuromodulation, can take many forms, <a data-linked-post="2650275280" href="https://spectrum.ieee.org/athome-electric-headband-for-depression-could-go-mainstream" target="_blank">transcranial direct current stimulation</a> (tDCS) emerged as a reasonably safe, affordable choice for at-home experimentation for a range of purposes.<strong></strong></p><p>These devices have often been home-brewed or sold as wellness tools, but in her research into the do-it-yourself tDCS community, <a href="https://medicalethicshealthpolicy.med.upenn.edu/faculty-all/anna-wexler" rel="noopener noreferrer" target="_blank">Anna Wexler</a>, a medical ethicist at the University of Pennsylvania, found that in addition to brain boosting, many practitioners were self-medicating, using electrotherapy to treat symptoms of depression and anxiety. Until recently, there were no medical tDCS devices with U.S. <a href="https://www.fda.gov/" target="_blank">Food and Drug Administration</a> approval.</p><p>In December the FDA approved a tDCS headset produced by <a href="https://www.flowneuroscience.com/" rel="noopener noreferrer" target="_blank">Flow Neuroscience</a> for treatment of major depressive disorder. 
The decision paves the way for the Swedish company to make its device available in the United States via prescription, and millions of people may now have access through traditional health care to a noninvasive, nondrug treatment option for depression that can be self-administered in the home.</p><p>“It’s significant for patients who now have an alternative to pharmacotherapy with its limits, and it’s a big deal to the brain-stimulation community,” says <a href="https://www.ccny.cuny.edu/profiles/marom-bikson" rel="noopener noreferrer" target="_blank">Marom Bikson</a>, who leads the neural engineering group at City College of New York and coauthored <a href="https://www.brainstimjrnl.com/article/S1935-861X(25)00423-1/fulltext" rel="noopener noreferrer" target="_blank">an analysis</a> of the regulatory decision. He also cofounded a company, <a href="https://soterixmedical.com/" rel="noopener noreferrer" target="_blank">Soterix Medical</a>, which produces a tDCS device that has been approved for in-clinic treatment of depression in multiple countries.</p><p>“tDCS is a very safe technology. That’s why we get the kind of approval we get, but it can seem scary and kind of [like] science fiction,” says <a href="https://www.flowneuroscience.com/about/" rel="noopener noreferrer" target="_blank">Erik Rehn</a>, the CTO of Flow. The design philosophy of the headset has focused on safe, at-home use without supervision, he says. Similar to other tDCS devices, reported side effects are typically mild, such as skin irritation near electrode sites on the forehead.</p><h2>tDCS for Depression Treatment</h2><p>Parallel to the DIY movement, researchers have been investigating tDCS therapeutics and its effects on the human body for decades. How exactly does electricity treat depression? As with any case where the brain meets the mind, questions of biology and medicine become philosophical, and clear answers become very difficult. 
But tDCS does seem to help some of the people who use it.</p><p>In Flow’s <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC11750699/" rel="noopener noreferrer" target="_blank">pivotal trial</a>, patients applied a 2-milliampere current in 30-minute sessions for five days a week for three weeks, and then three days a week for seven more weeks. Fifty-eight percent of participants responded to treatment, compared with 38 percent of those who received a simulated form of treatment in the <a href="https://registries.ncats.nih.gov/glossary/sham-comparator-arm/" target="_blank">sham arm</a> of the experiment.</p><p>“We find that tDCS is helpful for some patients on its own and for some as part of a treatment plan, but its effects vary among individuals and we want to understand who it would be most helpful for,” says <a href="https://www.uel.ac.uk/about-uel/staff/cynthia-hyfu" rel="noopener noreferrer" target="_blank">Cynthia Fu</a>, a psychiatry researcher at the University of East London, and a Flow clinical trial site leader. As part of a larger treatment plan, Flow could be used in conjunction with standard treatments, such as talk therapy, lifestyle changes, and importantly, pharmaceutical drugs.</p><p>Perhaps in part because of the many treatment variables, there has been <a href="https://www.theguardian.com/society/2025/jan/11/is-a-brain-stimulation-headset-the-answer-to-depression" rel="noopener noreferrer" target="_blank">mixed evidence</a> for the effectiveness of tDCS depression treatment broadly, and some tDCS researchers have suggested neuroimaging could help personalize care and improve results. Such treatment plans could require testing on expensive equipment such as MRI machines. 
But Rehn says that Flow is committed to access that scales, and is exploring individualized care through other means, such as machine learning.</p><p><a href="https://www.nimh.nih.gov/health/statistics/major-depression" rel="noopener noreferrer" target="_blank">Recent</a> <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC9483000/" rel="noopener noreferrer" target="_blank">estimates</a> suggest that around 8 percent of U.S. adults experience at least one major depressive episode in a given year. That number appears to be trending upward, and depression is more common among young adults. Meanwhile, in a <a href="https://www.cdc.gov/nchs/products/databriefs/db528.htm" rel="noopener noreferrer" target="_blank">2023 survey</a>, 15 percent of U.S. women and 7 percent of U.S. men used antidepressants.</p><p>Other types of “electroceuticals” have been used to treat depression, though they have their drawbacks. Electroconvulsive therapy and transcranial magnetic stimulation are mature therapies, but require repeated in-person visits to a clinic. Deep brain stimulation has exploratory use, but involves surgery to install a <a href="https://spectrum.ieee.org/what-is-neural-implant-neuromodulation-brain-implants-electroceuticals-neuralink-definition-examples" target="_self">neural implant</a>.</p><p>The FDA restricts how treatments can be marketed, but doctors can prescribe an approved device for off-label usage, ideally guided by a body of medical evidence, to treat other conditions. In the case of Flow, the <a href="https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpma/pma.cfm?id=P230024" rel="noopener noreferrer" target="_blank">FDA approval</a> includes wording that suggests use of tDCS as a first-line option, rather than reserving it for cases where other treatments have failed. 
But doctors may prescribe tDCS for depression in a variety of situations, or might even expand applications to manage other mental health issues, such as anxiety.</p><p>Other practical questions include how insurance will cover the device, or what sorts of assistance might be needed to help some depressed patients stick to a treatment schedule. These questions may take months to be answered, and Flow may have competition in the near future.</p><p>In January, a second company, <a href="https://neurolief.com/" target="_blank">Neurolief</a>, <a href="https://www.prnewswire.com/news-releases/neurolief-receives-fda-pma-approval-for-first-at-home-brain-neuromodulation-therapy-for-adults-whose-depression-was-not-adequately-improved-by-antidepressants-302658098.html" rel="noopener noreferrer" target="_blank">announced FDA approval</a> of its own at-home headset device for treatment of depression. Neurolief’s product does not use tDCS but a different method of stimulating the brain, and acceptance may follow a separate track, says Bikson.</p><p>Further complicating matters, although a Flow headset will require a prescription for the foreseeable future in the United States, the device is already for sale to the public without a prescription in the United Kingdom and European Union, and a very similar device is already available in the United States on a direct-to-consumer basis.</p><p>In 2021, Flow Neuroscience acquired <a href="https://www.mobihealthnews.com/news/flow-neuroscience-buys-fellow-brain-stimulation-company-halo" rel="noopener noreferrer" target="_blank">Halo Neuroscience</a>, and under the Halo brand name currently sells a headset using what it calls “<a href="https://www.haloneuroscience.com/halo-flow" rel="noopener noreferrer" target="_blank">identical tDCS hardware</a>.” It is marketed as a wellness device, rather than a medical one, and does not require a prescription. 
The Halo website claims benefits to mood, sleep, and focus, and advertises a full price of around US $600.</p><p>The simultaneous medical approval and consumer availability of twin devices is a striking example of a larger trend, says Wexler. Some patients could be making a choice between Halo and Flow headsets for treatment. “It’s blurring the lines between consumer products and medical devices,” she says.</p><p>Whatever the name on the device or the eventual proportion of tDCS devices sold medically or for wellness, the approval legitimizes the technology and opens a medical avenue of access to it. “I think there’s a huge unmet need,” says Wexler. “We need more effective therapeutic options for depression.”</p><p><em>This article appears in the April 2026 print issue as “A Prescription for At-Home Brain Stimulation.”</em></p>]]></description><pubDate>Thu, 05 Feb 2026 20:21:42 +0000</pubDate><guid>https://spectrum.ieee.org/flow-neuroscience-tdcs-depression-fda</guid><category>Biomedical</category><category>Electroceuticals</category><category>Neuromodulation</category><category>Brain-stimulation</category><category>Tdcs</category><dc:creator>Greg Uyeno</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/person-wearing-a-futuristic-white-headband-device-with-a-sleek-design.jpg?id=63843851&amp;width=980"></media:content></item><item><title>AlphaGenome Deciphers Non-Coding DNA for Gene Regulation</title><link>https://spectrum.ieee.org/alphagenome-ai-gene-regulation</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/model-overview-of-alphagenome-1-megabyte-of-dna-sequence-is-broken-into-chunks-processed-across-devices-the-core-architecture.jpg?id=63769373&width=1200&height=400&coordinates=0%2C417%2C0%2C417"/><br/><br/><p>When <a href="https://spectrum.ieee.org/alphafold-proves-that-ai-can-crack-fundamental-scientific-problems" target="_self">AlphaFold solved the protein-folding problem</a> in 2020, it showed that artificial intelligence could crack one of biology’s deepest mysteries: how a string of amino acids folds itself into a working molecular machine.</p><p>The team at <a href="https://deepmind.google/" rel="noopener noreferrer" target="_blank">Google DeepMind</a> behind that <a href="https://www.nobelprize.org/prizes/chemistry/2024/press-release/" rel="noopener noreferrer" target="_blank">Nobel Prize–winning</a> platform then turned their lens from the structure of proteins to how these molecules function in the body. Applying similar machine learning methods, they first developed <a href="https://www.science.org/doi/10.1126/science.adg7492" rel="noopener noreferrer" target="_blank">AlphaMissense</a>, an AI tool for predicting which changes in protein structure are likely to cause disease. <a href="https://deepmind.google/blog/alphaproteo-generates-novel-proteins-for-biology-and-health-research/" rel="noopener noreferrer" target="_blank">AlphaProteo</a>, a system for designing proteins that bind to specific molecular targets, came next.</p><p>Now the architects of the Alpha platform are pushing beyond proteins into genomics, seeking to decipher how the vast regulatory regions of DNA shape when, where, and how genes are turned on and off.</p><p>Enter <a href="https://github.com/google-deepmind/alphagenome" rel="noopener noreferrer" target="_blank">AlphaGenome</a>. 
Described as a “<a href="https://doi.org/10.1016/j.tig.2025.11.007" rel="noopener noreferrer" target="_blank">Swiss Army knife for exploring non-coding DNA</a>,” the deep-learning tool offers a way to systematically interpret the 98 percent of the genome that does not encode instructions for making proteins but instead orchestrates how those genetic instructions are used inside the cell.</p><p>“This allows us to model intricate processes...with unprecedented precision,” <a href="https://scholar.google.com/citations?user=gojWHbQAAAAJ&hl=en" rel="noopener noreferrer" target="_blank">Žiga Avsec</a>, head of genomics at Google DeepMind, said in a press conference unveiling the new tool. </p><h2>Narrowing the Genomic Search Space<br/></h2><p>AlphaGenome has its limitations. For instance, the tool’s training data draw largely from bulk tissue datasets, curbing its reliability in rare cell types or specific developmental stages, notes <a href="https://www.mskcc.org/research/ski/labs/christina-leslie" target="_blank">Christina Leslie</a>, a computational biologist at Memorial Sloan Kettering Cancer Center, in New York City. “Generalization to new cell types is a huge limitation,” she says. <span><br/></span></p><p><span>It also struggles to capture distant effects when regulatory regions are hundreds of thousands to millions of DNA letters away from their target genes, Leslie points out.<br/></span></p><p><span>Even so, the model </span>is helping scientists to prioritize which genetic variants are most likely to matter, narrowing the search from across the genome to a manageable set of testable hypotheses. <span>“It is the state of the art right now,” Leslie says.</span></p><p>According to DeepMind, thousands of scientists around the world are already using AlphaGenome, which is <a href="https://github.com/google-deepmind/alphagenome" target="_blank">freely available on GitHub</a> for academic research purposes. 
It is being put to work across a range of applications, including pinpointing genetic drivers of cancer and rare diseases, discovering new drug targets, and designing synthetic strands of DNA with tailored regulatory functions.</p><p>“It’s exciting to have things like AlphaGenome come out and perform much better than all the other dedicated algorithms that are exploring various aspects of genome biology,” says <a href="https://wi.mit.edu/people/member/young" target="_blank">Richard Young</a>, a biologist at the Whitehead Institute for Biomedical Research who has collaborated with Google DeepMind on its <a href="https://spectrum.ieee.org/ai-co-scientist" target="_self">AI co-scientist platform</a> but was not involved in AlphaGenome. “It’s a huge accelerator.”</p><h2>High Resolution at Large Genomic Scale</h2><p>The arrival of AlphaGenome marks another step in <a href="https://spectrum.ieee.org/ai-for-science-2" target="_blank">AI’s steady advance into some of biology’s</a> most stubborn and consequential challenges.</p><p>For DeepMind, there is also a clear industrial logic at work. The company’s growing stable of biological models—spanning protein structure, mutation, and generation, and now genomic regulation—amounts to a vertically integrated platform for molecular prediction. That platform, in turn, should help unlock new diagnostic capabilities and therapeutic strategies, according to <a href="https://www.linkedin.com/in/pushmeet-kohli-4838994/?originalSubdomain=uk" target="_blank">Pushmeet Kohli</a>, vice president of science and strategic initiatives at Google DeepMind.</p><p>“All these different models are solving key problems that are relevant for understanding biology,” Kohli says.<br/></p><p>AlphaGenome is the latest—and most expansive—piece of that strategy. Trained on raw DNA, the model predicts 11 types of biological signals that help determine how genes are used inside cells. 
These include whether a gene is turned on or off, where gene activity begins, how genetic messages are edited, how tightly DNA is packed, which regulatory proteins bind to it, and how distant regions of the genome interact with one another.</p><p>Many of these features already have their own specialty AI tools—<a href="https://cell.com/cell/retrieve/pii/S0092867418316295" target="_blank">SpliceAI</a> for splice site prediction, <a href="https://www.biorxiv.org/content/10.1101/2024.12.25.630221v2" target="_blank">ChromBPNet</a> for local chromatin accessibility, <a href="https://www.nature.com/articles/s41588-022-01065-4" target="_blank">Orca</a> for three-dimensional genome architecture. But such tools are typically used in isolation, requiring researchers to stitch together results from multiple sources.</p><p>“AlphaGenome replaces this fragmentation with a more unified framework, which is more convenient and user-friendly—and we hope this will accelerate scientists’ workflows,” says <a href="https://scholar.google.com/citations?user=9PPptRIAAAAJ&hl=en" target="_blank">Natasha Latysheva</a>, a computational geneticist at Google DeepMind.</p><p>And while there have been attempts to capture all manner of regulatory effects in a single model, earlier architectures such as <a href="https://www.nature.com/articles/s41588-024-02053-6" target="_blank">Borzoi</a> and <a href="https://www.nature.com/articles/s41592-021-01252-x" target="_blank">Enformer</a> typically traded fine-scale resolution for breadth of biological coverage.</p><p>AlphaGenome tries to escape that trade-off. The model can ingest up to 1 million DNA letters at a time, preserving long-range regulatory context, while still making predictions at single-base-pair resolution. 
In practical terms, that means it can ask how a change in one nucleotide might reverberate across a vast swath of the genome.</p><h2>Connecting DNA Changes to Disease Biology</h2><p>The new paper presents several demonstrations of this capability.</p><p>In one case, AlphaGenome correctly predicted how a tiny deletion disrupts a splice site in a gene involved in blood-vessel biology, leading to reduced RNA output. In another, it captured how mutations near a cancer-linked gene boosted its activity, helping to drive an aggressive form of leukemia.</p><p>Whether this predictive power generalizes beyond well-studied genes remains an open question, though.</p><p>“This is obviously a potentially valuable tool—but it’s a tool,” says <a href="https://www.stjude.org/people/m/charles-mullighan.html" target="_blank">Charles Mullighan</a>, deputy director of the St. Jude Children’s Research Comprehensive Cancer Center, in Memphis. “It’s not a final point of discovery, but it’s going to be a very important tool for giving insights that then might guide further functional analyses and experiments.”</p><p>One “quirk” of the system, notes Latysheva, is its bias toward false negatives over false positives, meaning it is more likely to miss a genuinely important DNA variant than to incorrectly flag a harmless one. “But the flip side of that is if it does predict a strong effect, it’s actually very accurate,” she says. So, when the model serves up a strong prediction, “you can have a decent amount of confidence that it knows what it’s doing.”</p><p>That confidence proved useful for <a href="https://scholar.google.com/citations?user=WGLx0TYAAAAJ&hl=ja" rel="noopener noreferrer" target="_blank">Y-h. Taguchi</a> and Kenta Kobayashi from Chuo University in Japan when they set out to stress-test a data-driven link between sleep deprivation and specific neuronal cell types. 
Early adopters of AlphaGenome, the bioinformatics researchers used the AI tool as an independent cross-check, confirming that genes implicated by sleep loss were especially active in their neurons of interest—just as their earlier analysis of gene-expression data from brain tissue had predicted.</p><p>“AlphaGenome succeeded in the cross validation,” says Taguchi, who <a href="https://www.mdpi.com/2073-4425/17/1/51" rel="noopener noreferrer" target="_blank">published the results</a> January 1 in the journal <em>Genes</em>. </p><p>That sort of validation underscores AlphaGenome’s role. Like AlphaFold before it, the system is not meant to explain biology in full but to make its most opaque regions easier to explore.</p>]]></description><pubDate>Wed, 04 Feb 2026 15:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/alphagenome-ai-gene-regulation</guid><category>Google-deepmind</category><category>Genomics</category><category>Protein-folding</category><category>Dna</category><category>Ai-tools</category><dc:creator>Elie Dolgin</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/model-overview-of-alphagenome-1-megabyte-of-dna-sequence-is-broken-into-chunks-processed-across-devices-the-core-architecture.jpg?id=63769373&amp;width=980"></media:content></item><item><title>Stretchable OLEDs Just Got a Huge Upgrade</title><link>https://spectrum.ieee.org/stretchable-oleds-wearable-display-drexel</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-bi-axially-stretched-exciplex-assisted-phosphorescent-film-deposited-on-a-small-stretchable-substrate.jpg?id=62303035&width=1200&height=400&coordinates=0%2C417%2C0%2C417"/><br/><br/><p>Wearable displays are catching up with phones and smartwatches. For decades, engineers have sought OLEDs that can bend, twist, and stretch while maintaining bright and stable light. These displays could be integrated into a new class of devices—woven into clothing fabric, for example, to show real-time information, like a runner’s speed or heart rate, without breaking or dimming.</p><p>But engineers have always encountered a trade-off: The more you stretch these materials, the dimmer they become. Now, a group co-led by <a href="https://drexel.edu/engineering/about/faculty-staff/G/gogotsi-yury/" rel="noopener noreferrer" target="_blank">Yury Gogotsi</a>, a materials scientist at Drexel University in Philadelphia, has found a way around the problem by employing a special class of materials called <a href="https://spectrum.ieee.org/why-mxenese-matter" target="_self">MXenes</a>—which Gogotsi helped discover—that maintain brightness while being significantly stretched.</p><p>The team developed an OLED that can stretch to twice its original size while keeping a steady glow. 
It also converts electricity into light more efficiently than any stretchable OLED before it, reaching a record external quantum efficiency of 17 percent—a measure of how many of the electrons injected into a device are converted into emitted photons.</p><h2>The “Perfect Replacement”</h2><p>Gogotsi didn’t have much experience with OLEDs when, about five years ago, he teamed up with <a href="https://www.pnel.snu.ac.kr/professor-intro" rel="noopener noreferrer" target="_blank">Tae-Woo Lee</a>, a materials scientist at Seoul National University, to develop better flexible OLEDs, driven by the ever-increasing use of <a href="https://spectrum.ieee.org/from-foldable-phones-to-stretchy-screens" target="_self">flexible electronics</a> like foldable phones.</p><p>Traditionally, the displays are built from multiple stacked layers. At the base, a cathode supplies electrons that enter the adjacent organic layers, which are designed to conduct this charge efficiently. As the electrons move through these layers, they meet positive charge injected by an indium tin oxide (ITO) film. The moment these charges combine, the organic material releases energy as light, creating the illuminated pixels that make up the image. The entire structure is sealed with a glass layer on top.</p><p>The ITO film—adhered to the glass—serves as the anode, allowing current to pass through the organic layers without blocking the generated light. “But it’s brittle. It’s ceramic, basically,” so it works well for flat surfaces, but can’t be bent, Gogotsi explains. Engineers have attempted flexible OLEDs many times before, but no previous design meaningfully overcame both the flexibility and brightness limitations.</p><p>Gogotsi’s students started by creating a transparent, conducting film out of a MXene, a type of ultrathin and flexible material with metal-like conductivity. The material is unique in its inherent ability to bend because it’s made from many two-dimensional sheets that can slide relative to each other without breaking. 
The film—only 10 nanometers thick—“appeared to be this perfect replacement for ITO,” Gogotsi says. </p><p>Through experimentation, Gogotsi and Lee’s shared team found that a mix of the MXene and silver nanowire would actually stretch the most while maintaining stability. “We were able to double the size, achieving 200 percent stretching without losing performance,” Gogotsi says. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="A bi-axially twisted exciplex-assisted phosphorescent film deposited on a small stretchable substrate." class="rm-shortcode" data-rm-shortcode-id="3e57d12160bbdd4ee5708c032d43b9fe" data-rm-shortcode-name="rebelmouse-image" id="2a744" loading="lazy" src="https://spectrum.ieee.org/media-library/a-bi-axially-twisted-exciplex-assisted-phosphorescent-film-deposited-on-a-small-stretchable-substrate.jpg?id=62303071&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The new material can also be twisted without losing its glow.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Source image: Huanyu Zhou, Hyun-Wook Kim, et al.</small></p><p>And the new MXene film was not only more flexible than ITO but also increased brightness by almost an order of magnitude by making the contact between the topmost light-emitting organic layer and the film more efficient. </p><p>Unlike ITO, the surface of MXenes can be chemically adjusted to make it easier for electrons to move from the electrode into the light-emitting layer. 
This more efficient electron flow significantly increases the brightness of the display, as evidenced by an external quantum efficiency of 17 percent, which the group claims is a record for stretchable OLEDs.</p><p>“Achieving those numbers in intrinsically stretchable OLEDs under substantial stretching is quite significant,” says <a href="https://ee.kaist.ac.kr/en/professor/12225/" target="_blank">Seunghyup Yoo</a>, who runs the Integrated Organic Electronics Laboratory at South Korea’s KAIST. An external quantum efficiency of 20 percent is an important benchmark for this kind of device because it is the upper limit of efficiency dictated by the physical properties of light generation, Yoo explains.</p><p>To increase illumination, the researchers went beyond working with MXene. Lee’s group developed two additional organic layers to add into the middle of their OLED—one that directs positive charges to the light-emitting layer, ensuring that electricity is used more efficiently, and one that recycles wasted energy that would normally be lost, boosting overall brightness.</p><p>Together, the MXene layer and two organic layers allow for a notably bright and stable OLED, even when stretched. Gogotsi thinks the subsequent OLED is “very successful” because it combines both brightness and stretchability, while, historically, engineers have only been able to achieve one or the other. </p><p>“The performance that they are able to achieve in this work is an important advancement,” says <a href="https://pme.uchicago.edu/faculty/sihong-wang" target="_blank">Sihong Wang</a>, a molecular engineer at the University of Chicago who also develops stretchable OLED materials. 
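</p><p>To make the quantum-efficiency numbers above concrete, here is a back-of-envelope EQE calculation. The wavelength, light output, and drive current are purely illustrative assumptions, not measurements from this device:</p>

```python
# External quantum efficiency (EQE) = photons emitted / electrons injected.
# All measurement values here are hypothetical, chosen only to illustrate
# how a ~17 percent EQE figure is arrived at.

h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
q = 1.602e-19        # elementary charge, C

wavelength = 520e-9      # assumed green emission, m
optical_power = 1.0e-3   # assumed light output, W
current = 2.47e-3        # assumed drive current, A

photon_energy = h * c / wavelength             # energy per photon, J
photons_per_s = optical_power / photon_energy  # photons leaving the device
electrons_per_s = current / q                  # electrons injected
eqe = photons_per_s / electrons_per_s
print(f"EQE = {eqe:.1%}")                      # about 17 percent
```

<p>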
Wang also notes that the 200 percent stretchability that Gogotsi’s group attained is more than robust enough for wearable applications.</p><h2>Wearables and Health Care</h2><p>A stretchable OLED that maintains its brightness has uses in many settings, including industrial environments, robotics, wearable clothing and devices, and communications, Gogotsi says, although he’s most excited about its adoption in health-monitoring devices. He sees a near future in which displays for diagnostics and treatment become embedded in clothing or “epidermal electronics,” comparing their function to smartwatches. </p><p>Before these displays can come to market, however, stability issues inherent to all stretchable OLEDs need to be solved, Wang says. Current materials cannot sustain light emission long enough to meet everyday consumer demands. </p><p>Finding housings to protect them is also a problem. “You need a stretchable encapsulation material that can protect the central device without allowing oxygen and moisture to permeate,” Wang says.</p><p>Yoo agrees: He says it’s a tough problem to solve because the best protective layers are rigid and not very stretchable. He notes yet another challenge in the way of commercialization, which is “developing stretchable displays that do not exhibit image distortion.”</p><p>Regardless, Gogotsi is excited about the future of stretchable OLEDs. “We started with computers occupying the room, then moved to our desktops, then to laptops, then we got smartphones and iPads, but still we carry stuff with us,” he says. “Flexible displays can be on the sleeve of your jacket. They can be <a href="https://spectrum.ieee.org/rollable-smartphone" target="_self">rolled into a tube</a> or folded and put in your pocket. 
They can be everywhere.”</p><p><em>This article appears in the March 2026 print issue as “Stretchable OLEDs Just Got a Huge Upgrade.”</em></p>]]></description><pubDate>Wed, 14 Jan 2026 16:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/stretchable-oleds-wearable-display-drexel</guid><category>Oleds</category><category>Materials-science</category><category>Mxenes</category><category>Flexible-displays</category><category>Indium-tin-oxide</category><category>Wearables</category><dc:creator>Perri Thaler</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-bi-axially-stretched-exciplex-assisted-phosphorescent-film-deposited-on-a-small-stretchable-substrate.jpg?id=62303035&amp;width=980"></media:content></item><item><title>Machine-Learning System Monitors Patient Pain During Surgery</title><link>https://spectrum.ieee.org/machine-learning-measure-pain-surgery</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/illustration-of-a-womans-face-where-one-half-is-intact-and-the-other-is-segmented-into-several-triangular-pieces-a-web-of-red.jpg?id=62674391&width=1200&height=400&coordinates=0%2C292%2C0%2C292"/><br/><br/><p><em>This article is part of our exclusive <a href="https://spectrum.ieee.org/collections/journal-watch/" target="_self">IEEE Journal Watch series</a> in partnership with <a href="https://spectrum.ieee.org/tag/ieee-xplore" target="_self">IEEE Xplore</a>.</em></p><p>In the operating room, patients undergoing procedures with local anesthesia, while still conscious, may have difficulty expressing their levels of pain. Some, such as infants or people with dementia, may not be able to communicate these feelings at all. In the search for a better way to monitor patients’ pain, a team of researchers has developed a contactless method that analyzes a combination of patients’ heart rate data and facial expressions to estimate the pain they’re feeling. 
The approach is described in a <a href="https://ieeexplore.ieee.org/document/11249745" rel="noopener noreferrer" target="_blank">study</a> published 14 November in the <a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=8782705" rel="noopener noreferrer" target="_blank"><em>IEEE Open Journal of Engineering in Medicine and Biology</em></a>.</p><p><a href="https://science-careers.htwk-leipzig.de/en/mainnavigation/female-scientists-network/portraits/alumnae#c213060" target="_blank">Bianca Reichard</a>, a researcher at the Institute for Applied Informatics in Leipzig, Germany, notes that camera-based pain monitoring sidesteps the need for patients to wear sensors with wires, such as ECG electrodes and blood pressure cuffs, which could interfere with the delivery of medical care.</p><p>To build their contactless approach, the researchers created a machine-learning algorithm capable of analyzing aspects of pain that can be detected visually by a camera. First, the algorithm analyzes the nuances of a person’s facial expressions to estimate their pain levels. </p><p>The system also uses heart rate data via a technique called <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC9267568/" target="_blank">remote photoplethysmogram</a> (rPPG), which involves shining a light on a person’s skin. The amount of light reflected back can be used to detect changes in blood volume within their vessels. The researchers initially considered 15 different heart-rate variability parameters measured by rPPG to include in their model and selected the top seven that are statistically most relevant to pain prediction, such as heart rate maximums, minimums, and intervals. </p><h2>Pain-Prediction Model Training Datasets</h2><p>The team used two different datasets to train and test their pain-prediction model. 
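</p><p>The rPPG idea described above can be sketched in a few lines of code. This is a toy simulation under stated assumptions (a 30-frame-per-second camera, a simulated 72-beats-per-minute pulse, synthetic noise), not the authors’ pipeline:</p>

```python
import numpy as np

# Toy rPPG sketch: the pulse shows up as a tiny periodic change in how
# much light the skin reflects. We simulate a mean green-channel trace
# from a camera (all values assumed) and recover heart rate from the
# dominant frequency in the plausible pulse band.

fs = 30.0                        # camera frame rate, frames/s (assumed)
t = np.arange(0, 20, 1 / fs)     # 20 seconds of video
pulse_hz = 72 / 60.0             # simulated pulse: 72 beats per minute
trace = 0.01 * np.sin(2 * np.pi * pulse_hz * t)                # blood-volume signal
trace += 0.005 * np.random.default_rng(0).normal(size=t.size)  # sensor noise

trace -= trace.mean()                          # remove the DC offset
spectrum = np.abs(np.fft.rfft(trace))
freqs = np.fft.rfftfreq(trace.size, d=1 / fs)
band = (freqs >= 0.7) & (freqs <= 4.0)         # ~42 to 240 bpm
peak_hz = freqs[band][np.argmax(spectrum[band])]
print(f"estimated heart rate: {60 * peak_hz:.0f} bpm")  # recovers ~72 bpm
```

<p>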
One is a well-established and widely used pain-measurement database called the <a href="https://www.nit.ovgu.de/nit/en/BioVid-p-1358.html" target="_blank">BioVid Heat Pain Database</a>. Researchers <a href="https://www.researchgate.net/publication/259990721_Towards_Pain_Monitoring_Facial_Expression_Head_Pose_a_new_Database_an_Automatic_System_and_Remaining_Challenges" target="_blank">created this dataset in 2013</a> through experiments in which thermodes induced incremental, measurable temperature increases on individuals’ skin. The researchers then captured the participants’ physical responses to the corresponding pain that they felt.</p><p>The second dataset was developed by the researchers for this new work. Twenty-nine patients undergoing heart procedures involving insertion of a catheter were surveyed about their pain levels at five-minute intervals.</p><p>Importantly, most other pain-prediction algorithms have been trained using very short video clips, but Reichard and her team specifically used longer training videos (ranging from 30 minutes to 3 hours) of realistic surgery scenarios to train their model. For instance, the training videos included scenarios where the lighting was not ideal, or where the patient’s face was partially obscured from the camera at times. “This reflects a more realistic clinical situation compared to laboratory datasets,” Reichard explains. </p><p>Tests of their model show that it has a pain-prediction accuracy of about 45 percent. Reichard says she is surprised that the model is so accurate, given the number of disruptions that occurred throughout the raw video footage, such as a patient moving on the operating table or changes in the camera angle. While many previously developed pain-prediction models can achieve higher accuracies, those were trained using short video clips that are “ideal” with no visual obstructions. 
Instead, this research team used less-than-ideal—but more realistic—video footage to train their model.</p><p>What’s more, Reichard notes that the team used a fairly simple statistical machine-learning model. “Using more complex approaches, for example, based on neural networks, would most likely further improve performance,” she says.</p><p>Reichard says she finds this type of research—which could support both patients and medical staff—meaningful and is planning on developing similar contactless systems for measuring patients’ vital signs using radar in medical settings, in future work. </p><p><em>This article appears in the March 2026 print issue as “Machine Learning System Feels Your Pain.”</em></p>]]></description><pubDate>Mon, 12 Jan 2026 16:52:21 +0000</pubDate><guid>https://spectrum.ieee.org/machine-learning-measure-pain-surgery</guid><category>Journal-watch</category><category>Surgery</category><category>Pain</category><category>Machine-learning</category><category>Cameras</category><dc:creator>Michelle Hampson</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/illustration-of-a-womans-face-where-one-half-is-intact-and-the-other-is-segmented-into-several-triangular-pieces-a-web-of-red.jpg?id=62674391&amp;width=980"></media:content></item><item><title>How AI Accelerates PMUT Design for Biomedical Ultrasonic Applications</title><link>https://content.knowledgehub.wiley.com/quanscient-multiphysicsai-for-pmut-design/</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/quanscient-logo-in-bold-blue-purple-letters-on-a-transparent-background.png?id=62697598&width=980"/><br/><br/><p>This whitepaper provides MEMS engineers, biomedical device developers, and multiphysics simulation specialists with a practical AI-accelerated workflow for optimizing piezoelectric micromachined ultrasonic transducers (PMUTs), enabling you to explore complex design trade-offs between sensitivity and bandwidth while achieving validated performance improvements in minutes instead of days using standard cloud infrastructure.</p><p><strong>What you will learn about:</strong></p><ul><li>MultiphysicsAI combines cloud-based FEM simulation with neural surrogates to transform PMUT design from trial-and-error iteration into systematic inverse optimization</li><li>Training on 10,000 randomized geometries produces AI surrogates with 1% mean error and sub-millisecond inference for key performance indicators: transmit sensitivity, center frequency, fractional bandwidth, and electrical impedance</li><li>Pareto front optimization simultaneously increases fractional bandwidth from 65% to 100% and improves sensitivity by 2-3 dB while maintaining 12 MHz center frequency within ±0.2%</li></ul><div><a href="https://content.knowledgehub.wiley.com/quanscient-multiphysicsai-for-pmut-design/" target="_blank">Download this free whitepaper now!</a></div>]]></description><pubDate>Thu, 08 Jan 2026 22:06:42 +0000</pubDate><guid>https://content.knowledgehub.wiley.com/quanscient-multiphysicsai-for-pmut-design/</guid><category>Type-whitepaper</category><category>Mems</category><category>Artificial-intelligence</category><category>Pmut</category><dc:creator>Quanscient</dc:creator><media:content medium="image" type="image/png" url="https://assets.rbl.ms/62697598/origin.png"></media:content></item><item><title>These Hearing Aids Will Tune in to Your 
Brain</title><link>https://spectrum.ieee.org/hearing-aids-biosignals</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/side-by-side-photos-show-a-woman-wearing-specialized-hearing-aids-each-with-a-cord-coming-down-from-it-and-a-close-up-of-the-d.jpg?id=62620525&width=1200&height=400&coordinates=0%2C390%2C0%2C390"/><br/><br/><p><strong>Imagine you’re at a</strong> bustling dinner party filled with laughter, music, and clinking silverware. You’re trying to follow a conversation across the table, but every word feels like it’s wrapped in noise. For most people, these types of party scenarios, where it’s difficult to filter out extraneous sounds and focus on a single source, are an occasional annoyance. For millions with <a href="https://spectrum.ieee.org/tag/hearing-loss" target="_blank">hearing loss</a>, they’re a daily challenge—and not just in busy settings.</p><div class="rm-embed embed-media"><iframe height="110px" id="noa-web-audio-player" src="https://embed-player.newsoveraudio.com/v4?key=q5m19e&id=https://spectrum.ieee.org/hearing-aids-biosignals&bgColor=F5F5F5&color=1b1b1c&playColor=1b1b1c&progressBgColor=F5F5F5&progressBorderColor=bdbbbb&titleColor=1b1b1c&timeColor=1b1b1c&speedColor=1b1b1c&noaLinkColor=556B7D&noaLinkHighlightColor=FF4B00&feedbackButton=true" style="border: none" width="100%"></iframe></div><p>Today’s <a href="https://spectrum.ieee.org/tag/hearing-aids" target="_blank">hearing aids</a> aren’t great at determining which sounds to amplify and which to ignore, and this often leaves users overwhelmed and fatigued. Even the routine act of conversing with a loved one during a car ride can be mentally draining, simply because the hum of the engine and road noises are magnified to create loud and constant background static that blurs speech.</p><p>In recent years, modern hearing aids have made impressive strides. They can, for example, use a technology called adaptive beamforming to focus their microphones in the direction of a talker. 
Noise-reduction settings also help decrease background cacophony, and some devices even use machine-learning-based analysis, trained on uploaded data, to detect certain environments—for example a car or a party—and deploy custom settings.</p><p>That’s why I was initially surprised to find out that today’s state-of-the-art hearing aids aren’t good enough. “It’s like my ears work but my brain is tired,” I remember one elderly man complaining, frustrated with the inadequacy of his cutting-edge noise-suppression hearing aids. At the time, I was a graduate student at the University of Texas at Dallas, surveying individuals with hearing loss. The man’s insight led me to a realization: Mental strain is an unaddressed frontier of hearing technology.</p><p>But what if hearing aids were more than just amplifiers? What if they were listeners too? I envision a new generation of intelligent hearing aids that not only boost sound but also read the wearer’s brain waves and other key physiological markers, enabling them to react accordingly to improve hearing and counter fatigue.</p><p>Until last spring, when I took time off to care for my child, I was a senior audio research scientist at <a href="https://www.harman.com/" target="_blank">Harman International</a>, in Los Angeles. My work combined cognitive neuroscience, auditory prosthetics, and the processing of biosignals, which are measurable physiological cues that reflect our mental and physical state. I’m passionate about developing brain-computer interfaces (<a href="https://spectrum.ieee.org/tag/bci" target="_blank">BCIs</a>) and adaptive signal-processing systems that make life easier for people with hearing loss. And I’m not alone. 
A number of researchers and companies are working to create smart hearing aids, and it’s likely they’ll come on the market within a decade.</p><p>Two technologies in particular are poised to revolutionize hearing aids, offering personalized, fatigue-free listening experiences: electroencephalography (EEG), which tracks brain activity, and pupillometry, which uses eye measurements to gauge cognitive effort. These approaches might even be used to improve consumer audio devices, transforming the way we listen everywhere.</p><h2>Aging Populations in a Noisy World<br/></h2><p>More than <a href="https://www.who.int/news-room/fact-sheets/detail/deafness-and-hearing-loss" rel="noopener noreferrer" target="_blank">430 million people</a> suffer from disabling hearing loss worldwide, including 34 million children, according to the World Health Organization. And the problem will likely get worse due to rising life expectancies and the fact that the world itself seems to be getting louder. By 2050, an estimated <a href="https://pubmed.ncbi.nlm.nih.gov/33714390/" target="_blank">2.5 billion people</a> will suffer some degree of hearing loss and 700 million will require intervention. On top of that, <a href="https://gh.bmj.com/content/7/11/e010501" rel="noopener noreferrer" target="_blank">as many as 1.4 billion of today’s young people</a>—nearly half of those aged 12 to 34—could be at risk of permanent hearing loss from listening to audio devices too loud and for too long.</p><p>Every year, close to a trillion dollars is lost globally due to unaddressed hearing loss, a trend that is also likely getting more pronounced. 
That doesn’t account for the significant emotional and physical toll on the hearing impaired, including isolation, loneliness, depression, shame, anxiety, sleep disturbances, and loss of balance.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A back view of a man's head shows a flexible pattern of lines with electrodes inside that go over his ear and extend toward the front of his face." class="rm-shortcode" data-rm-shortcode-id="12fa3c83e74db09134be9565e28904d6" data-rm-shortcode-name="rebelmouse-image" id="83e0c" loading="lazy" src="https://spectrum.ieee.org/media-library/a-back-view-of-a-man-s-head-shows-a-flexible-pattern-of-lines-with-electrodes-inside-that-go-over-his-ear-and-extend-toward-the.jpg?id=62620547&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Flex-printed electrode arrays, such as these from the Fraunhofer Institute for Digital Media Technology, offer a comfortable option for collecting high-quality EEG signals. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Leona Hofmann/Fraunhofer IDMT</small></p><p>And yet, despite widespread availability, hearing aid adoption remains low. According to a <a href="https://www.thelancet.com/journals/lanhl/article/PIIS2666-7568%2823%2900232-5/fulltext" target="_blank">2024 study</a> published in <a href="https://www.thelancet.com/" target="_blank"><em><em>The Lancet</em></em>,</a> only about 13 percent of American adults with hearing loss regularly wear hearing aids. Key reasons for this deficiency include discomfort, stigma, cost—and, crucially, frustration with the poor performance of hearing aids in noisy environments.</p><p>Historically, hearing technology has come a long way. 
As early as the 13th century, people began using horns of cows and rams as “<a href="https://en.wikipedia.org/wiki/Ear_trumpet" target="_blank">ear trumpets</a>.” Commercial versions made of various materials, including brass and wood, came on the market in the early 19th century. (Beethoven, who famously began losing his hearing in his twenties, used variously shaped ear trumpets, some of which are now on display in a museum in Bonn, Germany.) But these contraptions were so bulky that users had to hold them with their hands or wear them within headbands. To avoid stigma, some even hid hearing aids inside furniture to mask their disability. In 1819, a <a href="https://www.bbc.com/news/blogs-ouch-29896747" target="_blank">special acoustic chair</a> was designed for the king of Portugal, featuring arms ornately carved to look like open lion mouths, which helped transmit sound to the king’s ear via speaking tubes.</p><p>Modern hearing aids came into being after the advent of electronics in the early 20th century. Early devices used vacuum tubes and then transistors to amplify sound, shrinking over time from bulky body-worn boxes to discreet units that fit behind or inside the ear. 
At their core, today’s hearing aids still work on the same principle: A microphone picks up sound, a processor amplifies and shapes it to match the user’s hearing loss, and a tiny speaker delivers the adjusted sound into the ear canal.</p><p>Today’s best-in-class devices, like those from <a href="https://www.oticon.com/solutions/real-hearing-aids" target="_blank">Oticon</a>, <a href="https://www.phonak.com/en-us/hearing-devices/hearing-aids/audeo-lumity" target="_blank">Phonak</a>, and <a href="https://www.starkey.com/hearing-aids/genesis-artificial-intelligence-hearing-aids" target="_blank">Starkey</a>, have pioneered increasingly advanced technologies, including the aforementioned beamforming microphones, frequency lowering to better pick up high-pitched sounds and voices, and machine learning to recognize and adapt to specific environments. For example, the device may reduce amplification in a quiet room to avoid escalating background hums or else increase amplification in a noisy café to make speech more intelligible.<strong> </strong></p><p>Advances in the AI technique of deep learning, which relies on artificial neural networks to automatically recognize patterns, also hold enormous promise. Using context-aware algorithms, this technology can, for example, be used to help distinguish between speech and noise, predict and suppress unwanted clamor in real time, and attempt to clean up speech that is muffled or distorted.</p><p>The problem? As of right now, consumer systems respond only to external acoustic environments and not to the internal cognitive state of the listener—which means they act on imperfect and incomplete information. So, what if hearing aids were more empathetic? What if they could sense when the listener’s brain feels tired or overwhelmed and automatically use that feedback to deploy advanced features?</p><h2>Using EEG to Augment Hearing Aids</h2><p>When it comes to creating intelligent hearing aids, there are two main challenges. 
The first is building convenient, power-efficient wearable devices that accurately detect brain states. The second, perhaps more difficult step is decoding feedback from the brain and using that information to help hearing aids adapt in real time to the listener’s cognitive state and auditory experience.</p><p>Let’s start with <a href="https://spectrum.ieee.org/tag/eeg" target="_blank">EEG</a>. This century-old noninvasive technology uses electrodes placed on the scalp to measure the brain’s electrical activity through voltage fluctuations, which are recorded as “brain waves.”</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A man with headphones sits in a lab in front of computers displaying information. Behind him through a doorway is seen another person sitting in front of a screen, wearing an EEG cap." class="rm-shortcode" data-rm-shortcode-id="19caacab1cb3ed292373b00bcba7dbc3" data-rm-shortcode-name="rebelmouse-image" id="b6cbf" loading="lazy" src="https://spectrum.ieee.org/media-library/a-man-with-headphones-sits-in-a-lab-in-front-of-computers-displaying-information-behind-him-through-a-doorway-is-seen-another-p.jpg?id=62620561&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Brain-computer interfaces allow researchers to accurately determine a listener’s focus in multitalker environments. Here, professor Christopher Smalt works on an attention-decoding system at the MIT Lincoln Laboratory.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">MIT Lincoln Laboratory</small></p><p>Clinically, EEG has long been applied for diagnosing epilepsy and sleep disorders, monitoring brain injuries, assessing hearing ability in infants and impaired individuals, and more. And while standard EEG requires conductive gel and bulky headsets, we now have versions that are far more portable and convenient. 
These breakthroughs have already allowed EEG to migrate from hospitals into the consumer tech space, driving everything from neurofeedback headbands to the BCIs in gaming and wellness apps that allow people to control devices with their minds.</p><p>The <a href="https://uol.de/psychologie/abteilungen/ceegrid" target="_blank">cEEGrid project</a> at Oldenburg University, in Germany, positions lightweight adhesive electrodes around the ear to create a low-profile version. In Denmark, <a href="https://ece.au.dk/en/research/research-centres/center-for-ear-eeg/" target="_blank">Aarhus University’s Center for Ear-EEG</a> also has an ear-based EEG system designed for comfort and portability. While the signal-to-noise ratio is slightly lower than that of head-worn EEG, these ear-based systems have proven sufficiently accurate for gauging attention, listening effort, <a href="https://ieeexplore.ieee.org/document/8006311" target="_blank">hearing thresholds</a>, and <a href="https://doi.org/10.1088/1741-2552/ae00f3" target="_blank">speech tracking</a> in real time.</p><p>For hearing aids, EEG technology can pick up brain-wave patterns that reveal how well a listener is following speech: When listeners are paying attention, their brain rhythms synchronize with the syllabic rhythms of discourse, essentially tracking the speaker’s cadence. By contrast, if the signal becomes weaker or less precise, it suggests the listener is struggling to comprehend and losing focus.</p><p>During my own <a href="https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2022.927872/full" target="_blank">Ph.D. research</a>, I observed firsthand how real-time brain-wave patterns, picked up by EEG, can reflect the quality of a listener’s speech cognition. For example, when participants successfully homed in on a single talker in a crowded room, their neural rhythms aligned nearly perfectly with that speaker’s voice. 
It was as if there were a brain-based spotlight on that speaker! But when background chatter grew louder or the listener’s attention drifted, those patterns waned, revealing the strain of keeping up.</p><p>Today, researchers at <a href="https://ece.au.dk/en/research/research-centres/center-for-ear-eeg/projects/investigation-of-auditory-responses-in-ear-eeg?utm_source=chatgpt.com" target="_blank">Aarhus University</a>, <a href="https://uol.de/en/psychology/neurophysiology/forschung" target="_blank">Oldenburg University</a>, and <a href="https://www.ll.mit.edu/partner-us/available-technologies/end-end-deep-neural-network-auditory-attention-decoding?utm_source=chatgpt.com" target="_blank">MIT</a> are developing attention-decoding algorithms specifically for auditory applications. For example, Oldenburg’s cEEGrid technology has been used to <a href="https://iopscience.iop.org/article/10.1088/1741-2552/aa66dd" target="_blank">successfully identify</a> which of two speakers a listener is trying to hear. In a <a href="https://iopscience.iop.org/article/10.1088/1741-2560/13/6/066004" target="_blank">related study</a>, researchers demonstrated that in-ear EEG can track the attended speech stream in multitalker environments.</p><p>All of this could prove transformational in creating neuroadaptive hearing aids. If a listener’s EEG reveals a drop in speech tracking, the hearing aid could infer increased listening difficulty, even if ambient noise levels have remained constant. For example, if a hearing-impaired car driver can’t focus on a conversation due to mental fatigue caused by background noise, the hearing aid could switch on beamforming to better augment the passenger’s voice, as well as machine-learning settings to deploy sound canceling that blocks the din of the road.</p><p>Of course, there are several hurdles to cross before commercialization becomes possible. 
For one thing, EEG-paired hearing aids will need to handle the fact that neural responses differ from person to person, which means they will likely need to be calibrated individually to capture each user’s unique brain-speech patterns.</p><p>Additionally, EEG signals are themselves notoriously “noisy,” especially in real-world environments. Luckily, we already have algorithms and processing tools for cleaning and organizing these signals so computer models can search for key patterns that predict mental states, including attention drift and fatigue.</p><p>Commercial versions of EEG-paired hearing aids will also need to be small and energy-efficient when it comes to signal processing and real-time computation. And getting them to work reliably, despite head movement and daily activity, will be no small feat. Importantly, companies will need to resolve ethical and regulatory considerations, such as data ownership. To me, these challenges seem surmountable, especially with technology progressing at a rapid clip.</p><h2>A Window to the Brain: Using Our Eyes to Hear</h2><p><span>Now let’s consider a second way of reading brain states: through the listener’s eyes.</span></p><p>When a person has trouble hearing and starts feeling overwhelmed, the body reacts. Heart-rate variability diminishes, indicating stress, and sweating increases. Researchers are investigating how these types of autonomic nervous-system responses can be measured and used to create smart hearing aids. For the purposes of this article, I will focus on a response that seems especially promising—namely, pupil size.</p><p><a href="https://en.wikipedia.org/wiki/Pupillometry" target="_blank">Pupillometry</a> is the measurement of pupil size and how it changes in response to stimuli. We all know that pupils expand or contract depending on light brightness. 
As it turns out, pupil size is also an accurate means of evaluating attention, arousal, mental strain—and, crucially, listening effort.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Three eye illustrations showing pupil size changes due to light and emotional stimuli." class="rm-shortcode" data-rm-shortcode-id="1e09afde9216a40d6eb56b4443439f2c" data-rm-shortcode-name="rebelmouse-image" id="8c8bd" loading="lazy" src="https://spectrum.ieee.org/media-library/three-eye-illustrations-showing-pupil-size-changes-due-to-light-and-emotional-stimuli.png?id=62712190&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Pupil size is determined by both external stimuli, such as light, and internal stimuli, such as fatigue or excitement.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Chris Philpot</small></p><p>In recent years, studies at <a href="https://discovery.ucl.ac.uk/id/eprint/10173304/1/4856.full.pdf" target="_blank">University College London</a> and <a href="https://pubmed.ncbi.nlm.nih.gov/29435963/" target="_blank">Leiden University</a>, in the Netherlands, have demonstrated that pupil dilation is consistently greater in hearing-impaired individuals when processing speech in noisy conditions. Research has also shown pupillometry to be a sensitive, objective correlate of speech intelligibility and mental strain. It could therefore offer a feedback mechanism for user-aware hearing aids that dynamically adjust amplification strategies, directional focus, or noise reduction based not just on the acoustic environment but on how hard the user is working to comprehend speech.</p><p>While more straightforward than EEG, pupillometry presents its own engineering challenges. 
Pupillometry requires a direct line of sight to the pupil, necessitating a stable, front-facing camera-to-eye configuration—which isn’t easy to achieve when a wearer is moving around in real-world settings. On top of that, most pupil-tracking systems require infrared illumination and high-resolution optical cameras, which are too bulky and power intensive for the tiny housings of in-ear or behind-the-ear hearing aids. All this makes it unlikely that standalone hearing aids will include pupil-tracking hardware in the near future.</p><p>A more viable approach may be pairing hearing aids with smart glasses or other wearables that contain the necessary eye-tracking hardware. Products from companies like <a href="https://www.tobii.com/" target="_blank">Tobii</a> and <a href="https://pupil-labs.com/" target="_blank">Pupil Labs</a> already offer real-time pupillometry via lightweight headgear for use in research, behavioral analysis, and assistive technology for people with medical conditions that limit movement but leave eye control intact. Apple’s Vision Pro and other augmented reality or virtual reality platforms also include built-in eye-tracking sensors that could support pupillometry-driven adaptations for audio content.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A woman wears a pair of specialized glasses that have small cameras and infrared illuminators around edges of the glass for eye tracking, as well as a camera and microphone above the nose bridge." 
class="rm-shortcode" data-rm-shortcode-id="61ffe5f11715a80f11deca4c2cf9149d" data-rm-shortcode-name="rebelmouse-image" id="061a4" loading="lazy" src="https://spectrum.ieee.org/media-library/a-woman-wears-a-pair-of-specialized-glasses-that-have-small-cameras-and-infrared-illuminators-around-edges-of-the-glass-for-eye.jpg?id=62620574&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Smart glasses that measure pupil size, such as these made by Tobii, could help determine listening strain. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Tobii</small></p><p>Once pupil data is acquired, the next step will be real-time interpretation. Here, again, is where machine learning can use large datasets to detect patterns signifying increased cognitive load or attentional shifts. For instance, if a listener’s pupils dilate unnaturally during a conversation, indicating strain, the hearing aid could automatically engage a more aggressive noise suppression mode or narrow its directional microphone beam. These types of systems can also learn from contextual features, such as time of day or prior environments, to continuously refine their response strategies.</p><p>While no commercial hearing aid currently integrates pupillometry, adjacent industries are moving quickly. <a href="https://emteqlabs.com/" target="_blank">Emteq Labs</a> is developing “emotion-sensing” glasses that combine facial and eye tracking, along with pupil measurement, to do things like evaluate mental health and capture consumer insights. Ethical controversies aside—just imagine what dystopian governments might do with emotion-reading eyewear!—such devices show that it’s feasible to embed biosignal monitoring in consumer-grade smart glasses.</p><h2>A Future with Empathetic Hearing Aids</h2><p>Back at the dinner party, it remains nearly impossible to participate in conversation. “Why even bother going out?” some ask. 
But that will soon change.</p><p>We’re at the cusp of a paradigm shift in auditory technology, from device-centered to user-centered innovation. In the next five years, we may see hybrid solutions where EEG-enabled earbuds work in tandem with smart glasses. In 10 years, fully integrated biosignal-driven hearing aids could become the standard. And in 50? Perhaps audio systems will evolve into cognitive companions, devices that adjust, advise, and align with our mental state.</p><p>Personalizing hearing-assistance technology isn’t just about improving clarity; it’s also about easing mental fatigue, reducing social isolation, and empowering people to engage confidently with the world. Ultimately, it’s about restoring dignity, connection, and joy. <span class="ieee-end-mark"></span></p><p><em>This article appears in the February 2026 print issue.</em></p>]]></description><pubDate>Wed, 07 Jan 2026 14:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/hearing-aids-biosignals</guid><category>Eye-tracking</category><category>Hearing-aids</category><category>Hearing-loss</category><category>Machine-learning</category><category>Signal-processing</category><category>Smart-glasses</category><dc:creator>Shruthi Raghavendra</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/side-by-side-photos-show-a-woman-wearing-specialized-hearing-aids-each-with-a-cord-coming-down-from-it-and-a-close-up-of-the-d.jpg?id=62620525&amp;width=980"></media:content></item><item><title>Devices Target the Gut to Maintain Weight Loss from GLP-1 Drugs</title><link>https://spectrum.ieee.org/weight-loss-devices</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/diagram-of-the-human-gastrointestinal-tract-showing-a-close-up-view-of-an-endoscopic-device-inserted-through-the-stomach-into-th.png?id=62609606&width=1200&height=400&coordinates=0%2C662%2C0%2C662"/><br/><br/><p><strong>Christina had tried diet</strong><strong>ing</strong> and exercise before. The weight always came off but then crept back on, especially after she gave birth to her son in 2022.</p><div class="rm-embed embed-media"><iframe height="110px" id="noa-web-audio-player" src="https://embed-player.newsoveraudio.com/v4?key=q5m19e&id=https://spectrum.ieee.org/weight-loss-devices&bgColor=F5F5F5&color=1b1b1c&playColor=1b1b1c&progressBgColor=F5F5F5&progressBorderColor=bdbbbb&titleColor=1b1b1c&timeColor=1b1b1c&speedColor=1b1b1c&noaLinkColor=556B7D&noaLinkHighlightColor=FF4B00&feedbackButton=true" style="border: none" width="100%"></iframe></div><p><span>She had hoped that a new class of </span><a href="https://spectrum.ieee.org/tag/weight-loss" target="_blank">weight-loss</a><span> drugs might finally offer something different. Obesity treatments such as </span><a href="https://www.wegovy.com/about-wegovy/how-wegovy-works.html" target="_blank">Wegovy</a><span> and </span><a href="https://zepbound.lilly.com/weight/what-is-zepbound" target="_blank">Zepbound</a><span> had just arrived on the scene, helping people slim down with unprecedented ease. But the price tag of these GLP-1 drugs put them out of reach. Christina’s health insurance wouldn’t cover the cost.</span></p><p>Desperate for another option, Christina enrolled in a clinical trial that guaranteed several months on a blockbuster weight-loss therapy—and then the possibility of something more. 
(Christina, a Texas woman in her early 50s, asked that her last name be withheld to protect her privacy about her weight-loss treatment.)</p><div class="ieee-sidebar-small"><p>This article is part of our special report <a href="https://spectrum.ieee.org/special-reports/top-tech-2026" target="_blank">Top Tech 2026</a>.</p></div><p>That something more wasn’t another injection or pill, but a one-time procedure using a new <a href="https://spectrum.ieee.org/tag/medical-devices" target="_blank">medical device</a>. And instead of targeting the stomach or brain, it focused on the gut itself: rewiring how a part of the upper intestine, known as the <a href="https://my.clevelandclinic.org/health/body/duodenum" target="_blank">duodenum</a>, processes nutrients and regulates metabolism.</p><p>Performed via a minimally invasive endoscopic device, this approach is designed to help people who want to stop taking GLP-1 drugs. The goal is to lock in the benefits without the high costs, weekly jabs, or lingering side effects. And in 2026, the first company to develop such a device is likely to seek clearance to bring it to patients.</p><p>“We’re creating a new therapeutic area,” says <a href="https://www.fractyl.com/team/harith-rajagopalan-md-phd/" target="_blank">Harith Rajagopalan</a>, cofounder and chief executive of that company, <a href="https://www.fractyl.com/" target="_blank">Fractyl Health</a>, based in Burlington, Mass.</p><h2>Resetting Metabolism for Lasting Weight Loss </h2><p>You can think of these systems as a middle ground between drugs and bariatric surgery. The endoscope is a slim, flexible tube equipped with a camera and a guidewire that leads a catheter into the digestive system. Doctors send the tools down the throat so they can view and modify the intestines from the inside—remodeling gut tissue and recalibrating its response to food without a single incision. 
The procedure takes about an hour or so, and patients typically go home the same day.</p><p>To understand how the treatment works, it helps to first understand what goes wrong in the gut during years of unhealthy eating. As diets high in sugar and fat bombard the duodenum, the lining there becomes inflamed and its normal signaling pathways distorted. Mucosal cells in the tissue grow abnormally and propagate these maladaptive changes, locking in a dysfunctional pattern that drives cravings, weight gain, and insulin resistance.</p><p>The Fractyl device overcomes these entrenched changes. It works by deliberately injuring the tissue, using near-boiling water to burn off diseased cells on the intestinal lining. A natural healing process then kicks in, producing a fresh layer of healthy tissue and re-establishing proper metabolic control.</p><p>“You see regrowth at about two weeks, and it continues until the mucosa looks pretty normal,” says <a href="https://medschool.vanderbilt.edu/mpb/person/alan-d-cherrington-phd/" target="_blank">Alan Cherrington</a>, a physiologist at Vanderbilt University School of Medicine who consults for Fractyl.</p><div class="rm-embed embed-media"><div class="flourish-embed flourish-chart" data-src="visualisation/26638897?820658"><script src="https://public.flourish.studio/resources/embed.js"></script><noscript><img alt="chart visualization" src="https://public.flourish.studio/visualisation/26638897/thumbnail" width="100%"/></noscript></div></div><p><a href="https://ir.fractyl.com/news-releases/news-release-details/fractyl-health-announces-groundbreaking-data-remain-1-midpoint" target="_blank">Preliminary results</a> from the clinical trial that Christina joined, termed the <a href="https://clinicaltrials.gov/study/NCT06484114" target="_blank">Remain-1 study</a>, indicate that the procedure is working as intended to stabilize weight after GLP-1 therapy. 
Three months after stopping Zepbound, study participants who underwent the Fractyl procedure generally held their weight steady or continued to lose weight, while those who received a sham treatment saw the number on their scales climb steadily upward.</p><p>The results are “honestly better than I thought they were going to be,” says one of the doctors leading the trial, <a href="https://providers.dartmouth-health.org/2233/shelby-a-sullivan" target="_blank">Shelby Sullivan</a>, a gastroenterologist and obesity-medicine specialist at the Dartmouth Hitchcock Medical Center in Lebanon, N.H.</p><p>Sullivan cautions against drawing firm conclusions, given the small number of participants and short follow-up so far. But anyone watching the field won’t have to wait long for clearer answers. “By six months,” she says, “we absolutely will know if it’s working or not.”</p><h2>Next-Generation Devices for Obesity  </h2><p>If the six-month data demonstrate lasting weight maintenance—full trial readouts are expected in 2026—Fractyl then intends to seek regulatory clearance to market what could become the first device specifically sanctioned for post-GLP-1 weight control.</p><p>But Fractyl is hardly alone in pursuing this therapeutic frontier. <a href="https://www.endogenex.com/" target="_blank">Endogenex</a>, a company based in Plymouth, Minn., is using a flexible, expandable circuit board to apply pulsed electric fields directly to the duodenal wall to burn away the problem cells. 
Meanwhile, <a href="https://en.tecure.com/" target="_blank">TeCure</a>, in South Korea, and <a href="https://aquaendoscopy.com/" target="_blank">Aqua Medical</a>, in Pleasanton, Calif., are using lasers and radiofrequency-heated water vapor, respectively, to achieve a similar remodeling of the gut lining.</p><p>“In the end, it’s different methods to do the same thing,” says <a href="https://physiciandirectory.brighamandwomens.org/details/13872/pichamol-jirapinyo-gastroenterology_hepatology_and_endoscopy-boston-foxborough-jamaica_plain" target="_blank">Pichamol Jirapinyo</a>, a bariatric endoscopist at Brigham and Women’s Hospital in Boston and a cofounder of <a href="https://bariendo.com/" target="_blank">Bariendo</a>, a network of 10 nonsurgical weight-loss clinics across the United States. While ongoing trials may clarify differences in efficacy and safety, Jirapinyo (who consults for Fractyl) expects operational features such as ease of use and procedure time to play a decisive role in determining uptake among practitioners.</p><p>Timing of market entry is critical, too—and Fractyl, now leading the pack, is expected to deliver the first large-scale clinical results. Those outcomes, from the trial that Sullivan is leading, could set the tone for an entire class of new device-based obesity treatments aiming to preserve the gains of GLP-1 drugs, notes Endogenex CEO <a href="https://www.endogenex.com/about-us/" target="_blank">Stacey Pugh</a>. “If they are successful, it’s going to blow this field wide open,” she says.</p><h2>Alternative Post-GLP-1 Devices </h2><p>Not everyone is convinced that resurfacing the duodenum is the way to go. In Europe, the past year saw the arrival of a new weight-loss device called <a href="https://morphicmedical.com/reset/" target="_blank">Reset</a> that—while not explicitly authorized for use in a post-GLP-1 drug setting—introduces a sleevelike liner to the duodenum that physically prevents contact between food and the gut wall. 
That device must be removed within a year, however, offering only a temporary fix.</p><p>Other endoscopic approaches target the stomach: <a href="https://www.hopkinsmedicine.org/health/treatment-tests-and-therapies/endoscopic-sleeve-gastroplasty" target="_blank">One in common use today</a> applies sutures to fold the stomach and shrink its size, while <a href="https://linkinghub.elsevier.com/retrieve/pii/S0016-5085(24)05349-6" target="_blank">another, more experimental method</a> burns off stomach tissue that regulates the secretion of appetite-stimulating hormones.</p><p>These stomach-directed methods may offer a logistical advantage given the relative robustness and accessibility of the stomach, explains <a href="https://www.linkedin.com/in/andrew-c-storm-m-d-93414431/" target="_blank">Andrew Storm</a>, a therapeutic endoscopist at Wake Forest University in Winston-Salem, N.C. “The duodenum is paper thin, as compared to the stomach, which is like a thick neoprene bag,” he says.</p><p>Regulatory clearance for Fractyl would allow the company to directly promote its product for post-GLP-1 weight maintenance—something that <a href="https://www.bostonscientific.com/en-US/home.html" target="_blank">Boston Scientific</a>, maker of the <a href="https://www.bostonscientific.com/en-EU/medical-specialties/bariatric-endoscopy/endoscopic-sleeve-gastroplasty.html" target="_blank">most widely used stomach-suturing device</a>, is not legally permitted to do unless it engages in a new round of clinical trials. And that distinction could give duodenal therapies an edge in marketing. But Storm, who consults for Boston Scientific and has also participated in trials of the Endogenex system, raises concerns about the complexity of duodenal therapy. 
“It just introduces a whole other level of difficulty for the endoscopist that I think will impact scalability,” he says.</p><h2>Holding On to Hard-Won Progress </h2><p>For patients like Christina, the debate over stomach versus duodenum, or one company’s device versus another’s, is largely academic. What matters for her is that the 50 pounds she lost on Zepbound—nearly 20 percent of her body weight—has stayed off so far, a stability that she attributes to the Fractyl device. Because the trial is randomized and blinded, it is possible she actually received the sham procedure. But Christina is fairly confident that she got the real thing.</p><p>Her reasoning comes from small but telling moments, like when her husband was cooking smoked pork burnt ends, sending up the kind of rich aromas that once would have sent her straight to the table. “It smelled really good, but I didn’t have any desire to chow down on it,” Christina says.</p><p>Experiences like Christina’s hint at the tantalizing promise of a lasting solution after drug-assisted weight loss, but medical-device development demands more than anecdotes. With pivotal trial readouts on the horizon, the year ahead could determine whether these devices remain hopeful prototypes or become validated tools in the next era of obesity care. 
<span class="ieee-end-mark"></span></p>]]></description><pubDate>Mon, 29 Dec 2025 13:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/weight-loss-devices</guid><category>Medical-devices</category><category>Weight-loss</category><category>Pharmaceuticals</category><category>Metabolism</category><category>Obesity</category><dc:creator>Elie Dolgin</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/diagram-of-the-human-gastrointestinal-tract-showing-a-close-up-view-of-an-endoscopic-device-inserted-through-the-stomach-into-th.png?id=62609606&amp;width=980"></media:content></item><item><title>Ultrasound Treatment Takes on Cancer’s Toughest Tumors</title><link>https://spectrum.ieee.org/ultrasound-cancer-treatment</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/illustration-of-the-histosonics-device-over-a-patients-abdomen-sending-ultrasound-through-a-water-filled-membrane-into-the-bod.png?id=62599195&width=1200&height=400&coordinates=0%2C1719%2C0%2C1720"/><br/><br/><p><strong>For many years,</strong> doctors and technicians who performed medical <a href="https://spectrum.ieee.org/focused-ultrasound-stimulation-inflammation-diabetes" target="_blank">ultrasound</a> procedures viewed bubbles with wary concern. The phenomenon of cavitation—the formation and collapse of tiny gas bubbles due to changes in pressure—was considered an undesirable and largely uncontrollable side effect. But in 2001, researchers at the University of Michigan began exploring ways to harness the phenomenon for the destruction of cancerous tumors and other problematic tissue.</p><div class="rm-embed embed-media"><iframe height="110px" id="noa-web-audio-player" src="https://embed-player.newsoveraudio.com/v4?key=q5m19e&id=https://spectrum.ieee.org/ultrasound-cancer-treatment&bgColor=F5F5F5&color=1b1b1c&playColor=1b1b1c&progressBgColor=F5F5F5&progressBorderColor=bdbbbb&titleColor=1b1b1c&timeColor=1b1b1c&speedColor=1b1b1c&noaLinkColor=556B7D&noaLinkHighlightColor=FF4B00&feedbackButton=true" style="border: none" width="100%"></iframe></div><p>The trouble was, creating and controlling cavitation generated heat, which harmed healthy tissue beyond the target area. <a href="https://bme.umich.edu/people/xu-zhen/" target="_blank">Zhen Xu</a>, who was working on a Ph.D. 
in biomedical engineering at the time, was bombarding pig heart tissue in a tank of water with ultrasound when she made a breakthrough.</p><div class="ieee-sidebar-small"><p>This article is part of our special report <a href="https://spectrum.ieee.org/special-reports/top-tech-2026" target="_blank">Top Tech 2026</a>.</p></div><p>The key was using extremely powerful ultrasound to produce negative pressure of more than 20 megapascals, delivered in short bursts measured in microseconds—but separated by relatively long gaps, between a millisecond and a full second long. These parameters created bubbles that quickly formed and collapsed, tearing apart nearby cells and turning the tissue into a kind of slurry, while avoiding heat buildup. The result was a form of incisionless surgery, a way to wipe out tumors without scalpels, radiation, or heat.</p><p>“The experiments worked,” says Xu, now a professor at Michigan, “but I also destroyed the ultrasound equipment that I used,” which was the most powerful available at the time. In 2009, she cofounded a company, <a href="https://histosonics.com/" target="_blank">HistoSonics</a>, to commercialize more powerful ultrasound machines, test treatment of a variety of diseases, and make the procedure, called histotripsy, widely available.</p><p>So far, the killer app is fighting <a href="https://spectrum.ieee.org/tag/cancer" target="_blank">cancer</a>. In 2023, HistoSonics’ Edison system received FDA approval for <a href="https://histosonics.com/news/fda-awards-histosonics-clearance-of-its-first-of-a-kind-edison-histotripsy-system-2/" target="_blank">treatment of liver tumors</a>. In 2026, clinicians will conclude a <a href="https://histosonics.com/news/worlds-first-kidney-tumor-treated-using-the-histosonics-edison-histotripsy-system/" target="_blank">pivotal kidney cancer study</a> and apply for regulatory approval. 
They’ll also launch a large-scale pivotal trial for pancreatic cancer, considered one of the deadliest forms of the disease with a five-year survival rate of just <a href="https://pancan.org/press-releases/pancreatic-cancer-diagnoses-and-mortality-rates-climb-five-year-survival-rate-for-pancreatic-cancer-stalls-at-13/" target="_blank">13 percent</a>. An effective treatment for pancreatic cancer would represent a major advance against one of the most lethal malignancies.</p><h2>Histotripsy’s Benefits for Cancer Treatment</h2><p>HistoSonics is not the only developer of histotripsy devices or techniques, but it is first to market with a purpose-built device. “What HistoSonics has developed is a symphony of technologies, which combines physics, biology, and biomedical engineering,” says <a href="https://www.cc.nih.gov/meet-our-doctors/bwood" target="_blank">Bradford Wood</a>, an interventional radiologist at the National Institutes of Health, who is not affiliated with the company. Its engineering effort has spanned multiple disciplines to produce robotic, computer-guided systems that turn physical forces into therapeutic effects.</p><p>Over the past decade, research has confirmed or found other benefits of histotripsy. With precise calibration, fibrous tissue—such as blood vessels—can be spared from damage even in the target zone. And while other noninvasive techniques may leave scar tissue, the liquefied debris created by histotripsy is cleared away by the body’s natural processes.</p><p>In HistoSonics’ early trials for pancreatic cancer, doctors used focused ultrasound pulses to ablate, or destroy, tumors deep within the pancreas. 
“It’s a great achievement for the entire field to show that it is possible to ablate pancreatic tumors and that it’s well tolerated,” says <a href="https://gastro.uw.edu/people/faculty/khokhlova-t" target="_blank">Tatiana Khokhlova</a>, a medical ultrasound researcher at the University of Washington, in Seattle, who has worked on alternative histotripsy techniques.</p><p>Khokhlova says the key to harnessing histotripsy’s benefits “will be combining ablation of the primary tumor in the pancreas with some other therapy.” Combination treatment could fight recurrent cancer and tiny tumors that ultrasound might miss, while also tapping into a surprising benefit.</p><p>Histotripsy generally seems to <a href="https://pubmed.ncbi.nlm.nih.gov/31940590/" target="_blank">stimulate an immune response</a>, helping the body attack cancer cells that weren’t targeted directly by ultrasound. The mechanical destruction of tumors likely leaves behind recognizable traces of cancer proteins that help the immune system learn to identify and destroy similar cells elsewhere in the body, explains Wood. Researchers are now exploring ways to pair histotripsy with immunotherapy to amplify that effect.</p><p>The company’s capacity to explore the treatment’s potential for different conditions will only improve with time, says HistoSonics CEO <a href="https://www.linkedin.com/in/mike-blue-860b9522/" target="_blank">Mike Blue</a>. The company has fresh resources to accelerate R&D: A new ownership group, which includes billionaire Jeff Bezos, <a href="https://www.businesswire.com/news/home/20250807749442/en/HistoSonics-Announces-%242.25B-Acquisition-by-Consortium-of-Top-Tier-Investors" target="_blank">acquired</a> HistoSonics in August 2025 at a valuation of US $2.25 billion.</p><p>Engineers are already testing a new guidance system that uses a form of X-rays rather than ultrasound imaging, which should expand use cases. 
The R&D team is also developing a feedback system that analyzes echoes from the therapeutic ultrasound to detect tissue destruction and integrates that information into the live display, says Blue.</p><p>If those advances pan out, histotripsy could move well beyond the liver, kidney, and pancreas in the fight against cancer. What started as a curiosity about bubbles might soon become a new pillar of noninvasive medicine—a future in which surgeons wield not scalpels, but sound waves.</p><p><em>This article appears in the January 2026 print issue.</em></p>]]></description><pubDate>Mon, 22 Dec 2025 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/ultrasound-cancer-treatment</guid><category>Cancer</category><category>Cancer-therapy</category><category>Focused-ultrasound</category><category>Immune-system</category><category>Pancreas</category><category>Ultrasound</category><dc:creator>Greg Uyeno</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/illustration-of-the-histosonics-device-over-a-patients-abdomen-sending-ultrasound-through-a-water-filled-membrane-into-the-bod.png?id=62599195&amp;width=980"></media:content></item><item><title>The Top 6 Biomedical Stories of 2025</title><link>https://spectrum.ieee.org/top-biomedical-stories-2025</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/silhouette-of-a-woman-s-face-with-a-circuitry-pattern-overlay.jpg?id=62600228&width=1200&height=400&coordinates=0%2C1042%2C0%2C1042"/><br/><br/><p><em>IEEE Spectrum</em>’s most popular biomedical stories of the past year centered both on incorporating new technologies and revamping old ones. While AI is all the rage in most sectors—including biomed, with applications like an in-brain warning system for worsening mental health and a model to estimate heart rate in real time—biomedical news this past year has also focused on legacy technologies. Technologies like Wi-Fi, ultrasound, and lasers have all made comebacks or found new uses in 2025.</p><p>Whether innovation stems from new tech or old, <em>IEEE Spectrum</em> will continue to cover it rigorously in 2026.</p><h2>1. <a href="https://spectrum.ieee.org/deep-brain-stimulation-depression" target="_self">Next-Gen Brain Implants Offer New Hope for Depression</a></h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Blue and gold fibrous texture in the shape of a brain against a dark background." class="rm-shortcode" data-rm-shortcode-id="3417f1e2c60fbb5d61d86ef2622fc11a" data-rm-shortcode-name="rebelmouse-image" id="9ae10" loading="lazy" src="https://spectrum.ieee.org/media-library/blue-and-gold-fibrous-texture-in-the-shape-of-a-brain-against-a-dark-background.png?id=61109682&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">            Georgia Institute of Technology, Icahn School of Medicine at Mt.
Sinai and TeraPixel        </small> </p><p>When <a href="https://med.emory.edu/directory/profile/?u=PRIVAPO" target="_blank">Patricio Riva Posse</a>, a psychiatrist at Emory University School of Medicine, realized that his patient’s brain implants were sending him signals about her worsening depression before she even recognized anything was wrong, he wished he could have taken action sooner. </p><p>That experience led him and colleagues to develop an “automatic alarm system” for signs of changing mental health. The tool monitors brain signals in real time, using implants to record electrical impulses, and AI to analyze the outputs and flag warning signs of relapse. Other research groups across the United States are experimenting with different ways to use these stimulating brain implants to help treat depression, both with and without the help of AI. “There are so many levers we can press here,” neurosurgeon <a href="https://sunnybrook.ca/research/team/member.asp?m=734&page=0" target="_blank">Nir Lipsman</a> says in the <a href="https://spectrum.ieee.org/deep-brain-stimulation-depression" target="_self">article</a>.</p><h2>2. <a href="https://spectrum.ieee.org/graphene-biosensor" target="_self">These Graphene Tattoos Are Actually Biosensors</a></h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="A hand resting on a table has on its fourth finger both a ring and a nearly invisible band of what looks like clear plastic." 
class="rm-shortcode" data-rm-shortcode-id="e9fc74d34f2a8d872b23517b28262d99" data-rm-shortcode-name="rebelmouse-image" id="443b8" loading="lazy" src="https://spectrum.ieee.org/media-library/a-hand-resting-on-a-table-has-on-its-fourth-finger-both-a-ring-and-a-nearly-invisible-band-of-what-looks-like-clear-plastic.jpg?id=56118800&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">            Dmitry Kireev/University of Massachusetts Amherst         </small> </p><p>In <a href="https://www.umass.edu/engineering/about/directory/dmitry-kireev" target="_blank">Dmitry Kireev</a>’s <a href="https://kireevlab.com/" rel="noopener noreferrer" target="_blank">lab</a> at the University of Massachusetts Amherst, researchers are developing imperceptibly thin graphene tattoos capable of monitoring your vital signs and more. “Electronic tattoos could help people track complex medical conditions, including cardiovascular, metabolic, immune system, and neurodegenerative diseases. <a href="https://professional.heart.org/en/science-news/-/media/453448D7D79948B39D5851D1FF2A0CFE.ashx" rel="noopener noreferrer" target="_blank">Almost half</a> of U.S. adults may be in the early stages of one or more of these disorders right now, although they don’t yet know it,” he <a href="https://spectrum.ieee.org/graphene-biosensor" target="_self">wrote</a> in an article for <em>IEEE Spectrum</em>.</p><p>How does it work? Graphene is conductive, strong, and flexible, able to measure features like heart rate and the presence of certain compounds in sweat. For now, the tattoos need to be plugged into a regular electronic circuit, but Kireev hopes that they will soon be integrated into smartwatches, and thus simpler to wear.</p><h2>3. 
<a href="https://spectrum.ieee.org/wi-fi-signal-heartbeat-detection" target="_self">How Wi-Fi Signals Can Be Used to Detect Your Heartbeat</a></h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Over-the-shoulder view of a researcher reviewing line graph data on their laptop" class="rm-shortcode" data-rm-shortcode-id="1b3cd79f287e04caa1dad0a05be8b81b" data-rm-shortcode-name="rebelmouse-image" id="59c8a" loading="lazy" src="https://spectrum.ieee.org/media-library/over-the-shoulder-view-of-researching-reviewing-line-graph-data-on-their-laptop.jpg?id=61696168&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">            Erika Cardema/UC Santa Cruz        </small> </p><p>Wi-Fi can do more than just get you connected to the internet—it can help monitor your heart inexpensively and without requiring constant physical contact. The new approach, called Pulse-Fi, uses an AI model to detect the subtle changes that heartbeats cause in Wi-Fi signals, estimating heart rate in real time from up to 10 feet away. </p><p>The <a href="https://spectrum.ieee.org/wi-fi-signal-heartbeat-detection" target="_self">system</a> is low cost, totaling around US $40, easy to deploy, and doesn’t introduce discomfort. It also works regardless of the user’s posture and in all kinds of environments. <a href="https://campusdirectory.ucsc.edu/cd_detail?uid=obraczka" target="_blank">Katia Obraczka</a>, a computer scientist at the University of California, Santa Cruz, who led the development of Pulse-Fi, says the team plans to commercialize the technology.</p><h2>4.
<a href="https://spectrum.ieee.org/focused-ultrasound-stimulation-inflammation-diabetes" target="_self">Doctors Could Hack the Nervous System With Ultrasound</a></h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Colorful abstract of human silhouette with anatomical overlay and dynamic wave patterns." class="rm-shortcode" data-rm-shortcode-id="c7577b17974389f8b3ec6500ad621631" data-rm-shortcode-name="rebelmouse-image" id="66b52" loading="lazy" src="https://spectrum.ieee.org/media-library/colorful-abstract-of-human-silhouette-with-anatomical-overlay-and-dynamic-wave-patterns.jpg?id=60557881&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">            Shonagh Rae        </small> </p><p><a href="https://feinstein.northwell.edu/institutes-researchers/our-researchers/sangeeta-chavan-phd" target="_blank">Sangeeta S. Chavan</a> and <a href="https://feinstein.northwell.edu/institutes-researchers/our-researchers/stavros-zanos-md-phd" rel="noopener noreferrer" target="_blank">Stavros Zanos</a>, biomedical researchers at the <a href="https://feinstein.northwell.edu/institutes-researchers/bioelectronic-medicine" target="_blank">Institute of Bioelectronic Medicine</a> in New York, hypothesize that ultrasound waves may activate neurons, offering “a precise and safe way to provide healing treatments for a wide range of both acute and chronic maladies,” as they write in an <a href="https://spectrum.ieee.org/focused-ultrasound-stimulation-inflammation-diabetes" target="_self">article</a> for <em>Spectrum</em>. 
Targeted ultrasound could then serve as a treatment for inflammation or diabetes, instead of medication with wide-ranging side effects, they say.</p><p>It works by vibrating a neuron’s membrane and “opening channels that allow ions to flow into the cell, thus indirectly changing the cell’s voltage and causing it to fire,” they write. The authors think that activating specific neurons can help address the root causes of specific illnesses.</p><h2>5. <a href="https://spectrum.ieee.org/optical-brain-imaging" target="_self">Scientists Shine a Laser Through a Human Head</a></h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" rel="float: left;" style="float: left;"> <img alt="Imaging of a brain with a multitude of yellow squiggly lines tracing a path around the entire circumference of the image. On the left, a red square with an arrow faces the brain, and on the right there is a green square on the outside of the brain." class="rm-shortcode" data-rm-shortcode-id="21123bc4551f6406450dab66f7ebdc20" data-rm-shortcode-name="rebelmouse-image" id="70504" loading="lazy" src="https://spectrum.ieee.org/media-library/imaging-of-a-brain-with-a-multitude-of-yellow-squiggly-lines-tracing-a-path-around-the-entire-circumference-of-the-image-on-the.jpg?id=61344149&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">            Extreme Light group/University of Glasgow        </small> </p><p>If a doctor wants to see inside your head, they have to decide whether they want to do so cheaply or deeply—an electroencephalograph is inexpensive but doesn’t penetrate past the outer layers of the brain, while functional magnetic resonance imaging (fMRI) is expensive but can see all the way in. 
Shining a laser through a person’s head seems like the first step toward technology that accomplishes both.</p><p>For many years, this kind of work had seemed impossible because the human head is so good at blocking light, but researchers have now proven that lasers can send photons all the way through. “What was thought impossible, we’ve shown to be possible. And hopefully…that could inspire the next generation of these devices,” project lead <a href="https://www.physics.gla.ac.uk/xtremelight/Jack.html" target="_blank">Jack Radford</a> says in the <a href="https://spectrum.ieee.org/optical-brain-imaging" target="_self">article</a>.</p><h2>6. <a href="https://spectrum.ieee.org/star-autonomous-surgical-robot" target="_self">Robots Are Starting to Make Decisions in the Operating Room</a></h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="two white robotic arms in a room with blue and green light, working above an operating table. A monitor in the background shows footage of the robots suturing" class="rm-shortcode" data-rm-shortcode-id="908ec681f81c240d7fe7df50c5ff9ef3" data-rm-shortcode-name="rebelmouse-image" id="ef9f9" loading="lazy" src="https://spectrum.ieee.org/media-library/two-white-robotic-arms-in-a-room-with-blue-and-green-light-working-above-an-operating-table-a-monitor-in-the-background-shows.png?id=60274875&width=980"/> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">            Jiawei Ge        </small> </p><p>In the not-too-distant future, surgical patients may hear “The robot will see you now,” as the authors of this <a href="https://spectrum.ieee.org/star-autonomous-surgical-robot" target="_self">story</a> suggest.
The three researchers work at the Johns Hopkins University <a href="https://imerse.lcsr.jhu.edu/" target="_blank">robotics lab</a> responsible for developing <a href="https://www.science.org/doi/abs/10.1126/scitranslmed.aad9398" rel="noopener noreferrer" target="_blank">Smart Tissue Autonomous Robot</a> (STAR), which performed the first autonomous soft-tissue surgery in a live animal in 2016.</p><p>While there are certainly challenges remaining in the quest to bring autonomous robots into the operating room—like developing general-purpose robotic controllers and collecting data within strict privacy regulations—the end goal is on the horizon. “A scenario in which patients are routinely greeted by a surgeon and an autonomous robotic assistant is no longer a distant possibility,” the authors write.</p>]]></description><pubDate>Sun, 21 Dec 2025 14:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/top-biomedical-stories-2025</guid><category>Wifi</category><category>Ultrasound</category><category>Lasers</category><category>Surgical-robots</category><category>Medical-devices</category><category>Brain-implants</category><dc:creator>Perri Thaler</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/silhouette-of-a-woman-s-face-with-a-circuitry-pattern-overlay.jpg?id=62600228&amp;width=980"></media:content></item><item><title>Why the Most “Accurate” Glucose Monitors Are Failing Some Users</title><link>https://spectrum.ieee.org/glucose-monitor-accuracy-user-concerns</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/illustration-of-an-arm-with-a-glucose-monitor-against-a-glitchy-line-graph-of-blood-sugar-data.jpg?id=62295175&width=1200&height=400&coordinates=0%2C292%2C0%2C292"/><br/><br/><p>When Dan Heller received his first batch of Dexcom’s latest <a data-linked-post="2650277674" href="https://spectrum.ieee.org/why-noviosenses-ineye-glucose-monitor-might-work-better-than-googles" target="_blank">continuous glucose monitors</a> in early 2023, he decided to run a small experiment: He wore the new biosensor and the previous generation at the same time to see how they compared in measuring his glucose levels. </p><p>The new, seventh-generation model (aptly called the G7) made by San Diego-based health care company <a href="https://www.dexcom.com/" rel="noopener noreferrer" target="_blank">Dexcom</a>, had just begun shipping in the United States. Dexcom claimed the G7 to be the <a href="https://investors.dexcom.com/news/news-details/2022/Dexcom-G7-Receives-FDA-Clearance-The-Most-Accurate-Continuous-Glucose-Monitoring-System-Cleared-in-the-U.S/default.aspx" target="_blank">“most accurate sensor”</a> available to the thousands of people with Type 1 diabetes who use continuous glucose monitors to help manage their blood sugars. But Heller found that its real-world performance wasn’t up to par. In a September 2023 post on his Substack, which is dedicated to covering Type 1 diabetes research and management, he <a href="https://open.substack.com/pub/danheller/p/the-dexcom-g7-vs-g6-which-is-better?r=1q2796&utm_campaign=post&utm_medium=web" rel="noopener noreferrer" target="_blank">wrote about the experience</a> and predicted an increase in adverse events with the G7, drawing on his past experience leading tech and biotech companies. <strong></strong></p><p>In the two years since Heller’s experiment, many other users have reported issues with the device. 
Some complaints regard failed connection and deployment issues, which Dexcom claims to have <a href="https://www.medtechdive.com/news/dexcom-execs-fixed-g7-quality-problems/804383/" rel="noopener noreferrer" target="_blank">now addressed</a>. More concerning are reports of erratic, inaccurate readings. A <a href="https://www.facebook.com/groups/1539517486772872" rel="noopener noreferrer" target="_blank">public Facebook group</a> dedicated to sharing negative experiences with the G7 has grown to thousands of users, and several class action <a href="https://rosenlegal.com/case/dexcom-inc/" rel="noopener noreferrer" target="_blank">lawsuits</a> have been filed against the company, alleging <a href="https://wilshirelawfirm.com/dexcom-class-action-lawsuit/" rel="noopener noreferrer" target="_blank">false advertising</a> and <a href="https://cowperlaw.com/cowper-law-llp-dicello-levitt-file-class-action-against-dexcom-over-misleading-claims-about-g7-cgm/" rel="noopener noreferrer" target="_blank">misleading claims</a> about device accuracy. </p><p>Yet, based on a standard metric in the industry, the G7 is one of the most accurate glucose sensors available. “Accuracy in the performance of our device is our No. 1 priority. We understand this is a lifesaving device for people with Type 1 diabetes,” <a href="https://www.linkedin.com/in/peter-c-simpson-533a083/" rel="noopener noreferrer" target="_blank">Peter Simpson</a>, Dexcom’s senior vice president of innovation and sensor technology, told <em><em>IEEE Spectrum</em></em>. Simpson acknowledged some variability in individual sensors but stood by the accuracy of the devices.</p><p>So why have users faced issues? In part, metrics used in marketing can be misleading compared to real-world performance. Differences in study design, combined with complex biological realities, mean that the accuracy of these biosensors can’t be boiled down to one number—and users are learning this the hard way. 
</p><h2>Dexcom’s Glucose Monitors</h2><p>Continuous glucose monitors (CGMs) typically consist of a small filament inserted under the skin, a transmitter, and a receiver. The filament is coated with an enzyme that generates an electrical signal when it reacts with glucose in the fluid surrounding the body’s cells. That signal is then converted to a digital signal and processed to generate glucose readings every few minutes. Each sensor lasts a week or two before needing to be replaced. <strong></strong></p><p>The technology has come a long way in recent years. In the 2010s, these devices required blood glucose calibrations twice a day and still weren’t reliable enough to dose insulin based on the readings. Now, some insulin pumps use the near-real-time data to <a href="https://spectrum.ieee.org/artificial-pancreas-could-conquer-diabetes" target="_blank">automatically make adjustments</a>. With those improvements has come greater trust in the data users receive—and higher standards. A faulty reading could result in a dangerous dose of insulin. </p><p>The G7 introduced several changes to Dexcom’s earlier designs, including a much smaller footprint, and updated the algorithm used to translate sensor signals into glucose readings for better accuracy, Simpson says. “From a performance perspective, we did demonstrate in a clinical trial that the G7 is significantly more accurate than the G6,” he says. </p><p>So Heller and others were surprised when the new Dexcom sensor seemed to be performing worse. For some batches of sensors, it’s possible that the issue was in part due to an unvalidated change in a component used in a resistive layer of the sensors. The new component showed worse performance, according to a <a href="https://www.fda.gov/inspections-compliance-enforcement-and-criminal-investigations/warning-letters/dexcom-inc-700835-03042025" rel="noopener noreferrer" target="_blank">warning letter</a> issued by the U.S. 
Food and Drug Administration in March 2025, following an audit of two U.S. manufacturing sites. The material has since been removed from all G7 sensors, Simpson says, and the company is continuing to work with the FDA to address concerns. (“The warning letter does not restrict Dexcom’s ability to produce, market, manufacture or distribute products, require recall of any products, nor restrict our ability to seek clearance of new products,” Dexcom added in a statement.)</p><p>“<span>There is a distribution of accuracies that have to do with people’s physiology and also the devices themselves. Even in our clinical studies, we saw some that were really precise and some that had a little bit of inaccuracy to them,” says Simpson. “But in general, our sensor is very accurate.”</span></p><p><span>In late November Abbott—one of Dexcom’s main competitors—<a href="https://www.fda.gov/medical-devices/medical-device-recalls-and-early-alerts/early-alert-glucose-monitor-sensor-issue-abbott-diabetes-care" target="_blank">recalled some of its CGMs</a> due to inaccurate low glucose readings. The recall affects approximately 3 million sensors and was caused by an issue with one of Abbott’s production lines. </span></p><p>The discrepancy between reported accuracy and user experience, however, goes beyond any one company’s manufacturing missteps. </p><h2>Does MARD Matter? </h2><p>The accuracy of CGM systems is frequently measured via “mean absolute relative difference,” or MARD, a percentage that compares the sensor readings to laboratory blood glucose measurements. The lower the MARD, the more accurate the sensor. 
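</p><p>In code, the calculation is simple. The sketch below is an illustration of the metric itself (not any manufacturer’s implementation), computing MARD from paired sensor and laboratory readings:</p>

```python
def mard(cgm_readings, reference_readings):
    """Mean absolute relative difference (MARD), in percent.

    Each continuous-glucose-monitor reading is paired with a laboratory
    blood-glucose reference taken at the same time; MARD is the average
    of the absolute relative errors across all pairs.
    """
    pairs = list(zip(cgm_readings, reference_readings))
    return 100 * sum(abs(cgm - ref) / ref for cgm, ref in pairs) / len(pairs)

# Sensor reads 108 and 95 mg/dL when the lab reference is 100 mg/dL both
# times: errors of 8 percent and 5 percent average to a MARD of 6.5 percent.
print(mard([108, 95], [100, 100]))
```

<p>The result depends entirely on which paired readings go in, which is why the design of the clinical study feeding the calculation matters so much.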
</p><p>This number is often used in advertising and marketing, and it has a historical relevance, says <a href="https://de.linkedin.com/in/manueich/de" target="_blank">Manuel Eichenlaub</a>, a biomedical engineer at the <a href="https://www.ifdt-ulm.de/en/" target="_blank">Institute for Diabetes Technology Ulm</a> in Germany, where he and his colleagues conduct independent CGM performance studies. For years, there was <a href="https://www.liebertpub.com/doi/10.1089/dia.2023.0435" target="_blank">a general belief</a> that a MARD under 10 percent meant a system would be accurate enough to be used for insulin dosing. In 2018, the FDA established a specific set of accuracy <a href="https://www.ecfr.gov/current/title-21/chapter-I/subchapter-H/part-862/subpart-B/section-862.1355" target="_blank">requirements</a> beyond MARD for insulin-guiding glucose monitors, including Dexcom’s. But manufacturers design the clinical trials that determine accuracy metrics, and the way studies are designed can make a big difference. </p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Graph comparing readings from two glucose monitors from 12 AM to 2:24 PM. Blue dots represent the Dexcom G6 and red dots represent the G7. 
" class="rm-shortcode" data-rm-shortcode-id="3f5fb17a1310204cdfe93cec4ab6ae6d" data-rm-shortcode-name="rebelmouse-image" id="62791" loading="lazy" src="https://spectrum.ieee.org/media-library/graph-comparing-readings-from-two-glucose-monitors-from-12-am-to-2-24-pm-blue-dots-represent-the-dexcom-g6-and-red-dots-represe.jpg?id=62299343&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">When Dan Heller wore the Dexcom G6 and G7 at the same time, he says he noticed the G7 readings were more erratic, making it more difficult to properly control his blood sugar.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."> Dan Heller </small></p><p>For instance, blood glucose levels serve as the “ground truth to compare the CGM values against,” says Eichenlaub. But glucose levels vary across blood compartments in the body; blood collected from capillaries with a finger prick fluctuates more and can have glucose levels around 5 to 10 percent higher than venous blood. (Dexcom tests against a gold-standard venous blood analyzer. When users see inaccuracies against home meters that use capillary blood, it could in part be a reflection of the meter’s own inaccuracy, Simpson says, though he acknowledges real inaccuracies in CGMs as well.)</p><p>Additionally, the distribution of sampling isn’t standardized. CGMs are known to be less accurate at the beginning and end of use, or when glucose levels are out of range or changing quickly. That means measured accuracy could be skewed by taking fewer samples right after a meal or late in the CGM’s lifetime. </p><p>According to Simpson, Dexcom’s trial protocol meets the FDA’s expectation and tests the devices in different blood sugar ranges across the life of the sensor. “Within these clinical trials, we do stress the sensors to try and simulate those real-world conditions,” he says. </p><p>Dexcom and other companies advertise a MARD around 8 percent. 
But some independent studies are more demanding and find higher numbers; a <a href="https://journals.sagepub.com/doi/10.1177/19322968251315459" target="_blank">head-to-head study of three popular CGMs</a> that Eichenlaub led found MARD values closer to 10 percent or higher.</p><p>Eichenlaub and other CGM experts believe that more standardization of testing and an extension of the FDA requirements are necessary, so they <a href="https://www.sciencedirect.com/science/article/pii/S0009898125006072" target="_blank">recently proposed comprehensive guidelines</a> on CGM performance testing. In the United States and Europe, a few manufacturers currently dominate the market. But newer players are entering the growing market and, especially in Europe, may not meet the same standards as legacy manufacturers, he says. “Having a standardized way of evaluating the performance of those systems is very important.”</p><p>For users like Heller, though, better accuracy only matters if it yields better diabetes management. “I don’t care about MARD. I want data that is reliably <em>actionable</em>,” Heller says. He encourages engineers working on these devices to think like the patient.
“At some point, there’s quantitative data, but you need qualitative data.” </p><p><em>This article appears in the February 2026 print issue as “For Glucose Monitors, Accuracy Claims Outpace Reality.”</em></p>]]></description><pubDate>Tue, 09 Dec 2025 16:09:28 +0000</pubDate><guid>https://spectrum.ieee.org/glucose-monitor-accuracy-user-concerns</guid><category>Clinical-trials</category><category>Dexcom</category><category>Diabetes</category><category>Glucose-sensors</category><dc:creator>Gwendolyn Rak</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/illustration-of-an-arm-with-a-glucose-monitor-against-a-glitchy-line-graph-of-blood-sugar-data.jpg?id=62295175&amp;width=980"></media:content></item><item><title>The Biggest Causes of Medical Device Recalls</title><link>https://spectrum.ieee.org/medical-device-recalls</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/an-iv-pump-and-drip-apparatus-in-a-hospital-room-with-an-lcd-screen-and-keypad.jpg?id=62219177&width=1200&height=400&coordinates=0%2C754%2C0%2C754"/><br/><br/><p>According to <a href="https://www.fda.gov/" target="_blank">U.S. Food and Drug Administration</a> records, in an average year over 2,500 <a href="https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfres/res.cfm" rel="noopener noreferrer" target="_blank">medical device recalls</a> are issued in the United States. Some of these recalls simply require checking the device for problems, but others require the return or destruction of the device. Once the root cause of a recall is identified, the FDA sorts it into one of 40 categories, plus a catchall of “other”: situations that include labeling mix-ups, problems with expiration dates, and counterfeiting.</p><div class="rm-embed embed-media"><div class="flourish-embed flourish-chart" data-src="visualisation/26242465?602891"><script src="https://public.flourish.studio/resources/embed.js"></script><noscript><img alt="chart visualization" src="https://public.flourish.studio/visualisation/26242465/thumbnail" width="100%"/></noscript></div></div><p><span>What’s shown here is the breakdown of the five biggest problem categories found among the 56,000 entries in the FDA medical-recall database, which stretches back to 2002: device design, process control (meaning an error in the device’s manufacturing process), nonconforming material/component (meaning something does not meet required specifications), software issues, and packaging.</span></p><div class="rm-embed embed-media"><div class="flourish-embed flourish-chart" data-src="visualisation/26242199?602891"><script src="https://public.flourish.studio/resources/embed.js"></script><noscript><img alt="chart visualization" src="https://public.flourish.studio/visualisation/26242199/thumbnail" width="100%"/></noscript></div></div><p><span><a
href="https://spectrum.ieee.org/lean-software-development-2667603125" target="_blank">Software issues</a> are broken down into six root causes, with software design far and away the biggest problem. The other five are, in order: change control; software design changes; software manufacturing or deployment problems; software design issues in the manufacturing process; and software in the “use environment.” That last one includes <a href="https://spectrum.ieee.org/tag/cybersecurity" target="_blank">cybersecurity</a> issues, or problems with supporting software, such as a smartphone app.</span></p><div class="rm-embed embed-media"><div class="flourish-embed flourish-chart" data-src="visualisation/26242080?602891"><script src="https://public.flourish.studio/resources/embed.js"></script><noscript><img alt="chart visualization" src="https://public.flourish.studio/visualisation/26242080/thumbnail" width="100%"/></noscript></div></div><div><p><em>This article appears in the December 2025 print issue as “Medical Device Recalls.”</em></p></div>]]></description><pubDate>Sat, 29 Nov 2025 13:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/medical-device-recalls</guid><category>Type-departments</category><category>Medical-devices</category><category>Software</category><category>Fda</category><category>The-data</category><dc:creator>Stephen Cass</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/an-iv-pump-and-drip-apparatus-in-a-hospital-room-with-an-lcd-screen-and-keypad.jpg?id=62219177&amp;width=980"></media:content></item><item><title>3 Weird Things You Can Turn Into a Memristor</title><link>https://spectrum.ieee.org/memristor-materials</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/conceptual-collage-of-memristor-symbols-filled-with-the-textures-of-mushrooms-honey-and-blood.jpg?id=62243597&width=1200&height=400&coordinates=0%2C292%2C0%2C292"/><br/><br/><p>From the <a href="#honey">honey</a> in your tea to the <a href="#blood">blood</a> in your veins, materials all around you have a hidden talent. Some of these substances, when engineered in specific ways, can act as memristors—electrical components that can “remember” past states. </p><p>Memristors are often used in chips that both perform computations and store data. They are devices that store data as particular levels of resistance. Today, they are constructed as a thin layer of titanium dioxide or similar dielectric material sandwiched between two metal electrodes. Applying enough voltage to the device causes tiny regions in the dielectric layer—where oxygen atoms are missing—to form filaments that bridge the electrodes or otherwise move in a way that makes the layer more conductive. Reversing the voltage undoes the process. Thus, the process essentially gives the memristor a memory of past electrical activity.</p><p>Last month, while exploring the electrical properties of fungi, a group at The Ohio State University found first-hand that some organic memristors have benefits beyond those made with conventional materials. Not only can <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0328965" rel="noopener noreferrer" target="_blank">shiitake act as a memristor</a>, for example, but it may be useful in aerospace or medical applications because the fungus demonstrates high levels of radiation resistance. The project “really mushroomed into something cool,” lead researcher <a href="https://si.osu.edu/john-larocco" rel="noopener noreferrer" target="_blank">John LaRocco</a> says with a smirk.</p><p>Researchers have learned that other unexpected materials may give memristors an edge. 
They may be more flexible than typical memristors or even biodegradable. Here’s how they’ve made memristors from strange materials, and the potential benefits these odd devices could bring.</p><h2>Mushrooms</h2><p>LaRocco and his colleagues were searching for a proxy for brain circuitry to use in electrical stimulation research when they stumbled upon something interesting—shiitake mushrooms can remember electrical states much like <a href="https://spectrum.ieee.org/memristor-random" target="_self">memristors</a> do.</p><p>The group set out to evaluate just how well shiitake can remember electrical states by first cultivating nine samples and maintaining optimal growing conditions, including feeding them a mix of farro, wheat, and hay.</p><p>Once fully matured, the mushrooms were dried and rehydrated to a level that made them moderately conductive. In this state, the fungi’s structure includes conductive pathways that emulate the oxygen vacancies in commercial memristors. The scientists plugged them into circuits and put them through voltage, frequency, and memory tests. The result? Mushroom memristors.</p><p>It may smell “kind of funny,” LaRocco says, but shiitake performs surprisingly well when compared to conventional memristors. Around 90 percent of the time, the fungus maintains ideal memristor-like behavior for signals up to 5.85 kilohertz. While traditional materials can function at frequencies orders of magnitude faster, these numbers are notable for biological materials, he says. </p><p>What fungi lack in performance, they may make up for in other properties. For one, many mushrooms—including shiitake—are highly resistant to radiation and other environmental dangers. 
“They’re growing in logs in <a href="https://spectrum.ieee.org/24-hours-at-fukushima" target="_blank">Fukushima</a> and a lot of very rough parts of the world, so that’s one of the appeals,” LaRocco says.</p><p>Shiitake are also an environmentally friendly option that’s already commercialized. “They’re already cultured in large quantities,” LaRocco explains. “One could simply leverage existing logistics chains” if the industry wanted to commercialize mushroom memristors. The use cases for this product would be niche, he thinks, and would center on the radiation resistance that shiitake boasts. Mushroom <a href="https://spectrum.ieee.org/nvidia-h100-space" target="_self">GPUs</a> are unlikely, LaRocco says, but he sees potential for aerospace and medical applications.</p><h2 class="rm-anchors" id="honey">Honey</h2><p>In 2022, engineers at Washington State University interested in green electronics set out to <a href="https://iopscience.iop.org/article/10.1088/1361-6463/ac585b" rel="noopener noreferrer" target="_blank">study</a> whether honey could serve as a good memristor. “Modern electronics generate 50 million tons of <a href="https://spectrum.ieee.org/e-waste" target="_self">e-waste</a> annually, with only about 20 percent recycled,” says <a href="https://ece.mst.edu/people/faculty-directory/fengzhao/" rel="noopener noreferrer" target="_blank">Feng Zhao</a>, who led the work and is now at Missouri University of Science and Technology. “Honey offers a biodegradable alternative.”</p><p>The researchers first blended commercial honey with water and stored it in a vacuum to remove air bubbles. 
They then spread the mixture on a piece of copper, baked the whole stack at 90 °C for nine hours to stabilize it, and, finally, capped it with circular copper electrodes on top—completing the honey-based memristor sandwich.</p><p>The resulting 2.5-micrometer-thick honey layer acted like the oxide dielectric in conventional memristors: a place for conductive pathways to form and dissolve, changing resistance with voltage. In this setup, when voltage is applied, copper filaments extend through the honey.</p><p>The honey-based memristor was able to switch from low to high resistance in 500 nanoseconds and back to low in 100 nanoseconds, which is comparable to speeds in <a href="https://pubs.aip.org/aip/ape/article/2/4/040901/3323360/Emerging-materials-for-resistive-switching" rel="noopener noreferrer" target="_blank">some non-food-based memristive materials</a>. </p><p>One advantage of honey is that it’s “cheap and widely available, making it an attractive candidate for scalable fabrication,” Zhao says. It’s also “fully biodegradable and dissolves in water, showing zero toxic waste.” In the 2022 paper, though, the researchers note that for a honey-based device to be truly biodegradable, the copper components would need to be replaced with dissolvable metals. They suggest options like magnesium and tungsten, but also write that the performance of memristors made from these metals is still “under investigation.”</p><h2 class="rm-anchors" id="blood">Blood</h2><p>In 2011, just three years after the<a href="https://spectrum.ieee.org/how-we-found-the-missing-memristor" target="_blank"> first memristor</a> was built, a group in India <a href="https://www.inderscienceonline.com/doi/abs/10.1504/IJMEI.2011.039073" rel="noopener noreferrer" target="_blank">wondered</a> whether blood would make a good memristor, seeing it as a potential means of delivering healthcare.</p><p>The experiments were pretty simple. 
The researchers filled a test tube with fresh, type O+ human blood and inserted two conducting wire probes. The wires were connected to a power supply, creating a complete circuit, and voltages of 1, 2, and 3 volts were applied in repeated steps. Then, to test the memristor qualities of blood as it exists in the human body, the researchers set up a “flow mode” that applied voltage to the blood as it flowed from a tube at up to one drop per second.</p><p>The experiments were preliminary and only measured current passing through the blood, but resistance could be set by applying voltage. Crucially, resistance changed by less than 10 percent in the 30-minute period after voltage was applied. In the <em>International Journal of Medical Engineering and Informatics</em>, the scientists wrote that, because of these observations, their contraption “looks like a human blood memristor.”</p><p>They suggested that this knowledge could be useful in treating illness. Sick people may have ion imbalances in certain parts of their bodies—instead of prescribing medication, why not employ a circuit component made of human tissue to solve the problem?</p>
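The switching behavior that all three materials share can be sketched with the textbook linear ion drift memristor model. This is a generic illustration only, not the model used by any of the research groups above, and every parameter value is an assumption chosen for readability rather than a measurement from a mushroom, honey, or blood device:

```python
# Linear ion drift memristor model (illustrative sketch).
# An internal state w in [0, 1] tracks how far conductive filaments
# extend through the dielectric; resistance interpolates between
# R_ON (fully bridged) and R_OFF (no bridge). All values are assumed.

R_ON, R_OFF = 100.0, 16_000.0        # low / high resistance states, in ohms
K = 1e-14 * R_ON / (10e-9) ** 2      # mobility * R_ON / thickness^2 (assumed)

def resistance(w):
    """Device resistance for state w (0 = fully OFF, 1 = fully ON)."""
    return R_ON * w + R_OFF * (1.0 - w)

def step(w, v, dt=1e-4):
    """Advance the state under applied voltage v: ions drift with current."""
    i = v / resistance(w)            # Ohm's law
    return min(max(w + K * i * dt, 0.0), 1.0)

w = 0.1
r_start = resistance(w)

for _ in range(5_000):               # positive pulse: filaments grow,
    w = step(w, +1.0)                # so resistance drops (SET)
r_set = resistance(w)

for _ in range(5_000):               # no applied voltage: the state is
    w = step(w, 0.0)                 # retained -- this is the "memory"
r_hold = resistance(w)

for _ in range(5_000):               # negative pulse reverses the drift,
    w = step(w, -1.0)                # raising resistance again (RESET)
r_reset = resistance(w)

print(r_set < r_start, r_hold == r_set, r_reset > r_set)
# prints: True True True
```

The essential point the sketch captures is that resistance depends on the voltage history, not just the present voltage, and holds its value when the power is removed.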
<p>In recent years, blood-based memristors have been tested by other scientists as a means to treat conditions ranging from <a href="https://www.sciencedirect.com/science/article/pii/S2590006424002308?via%3Dihub" rel="noopener noreferrer" target="_blank">high blood sugar</a> to <a href="https://www.sciencedirect.com/science/article/pii/S2590006425009214?via%3Dihub" rel="noopener noreferrer" target="_blank">nearsightedness</a>.</p>]]></description><pubDate>Thu, 27 Nov 2025 15:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/memristor-materials</guid><category>Memristors</category><category>Blood</category><category>Green-electronics</category><category>Biodegradable-electronics</category><category>Radiation-hardening</category><dc:creator>Perri Thaler</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/conceptual-collage-of-memristor-symbols-filled-with-the-textures-of-mushrooms-honey-and-blood.jpg?id=62243597&amp;width=980"></media:content></item><item><title>Remote Robotics Could Widen Access to Stroke Treatment</title><link>https://spectrum.ieee.org/remote-robotic-stroke-treatment-evt</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/dr-vitor-pereira-remotely-performing-a-surgical-procedure-using-remedy-robotics-n1-system.jpg?id=62225319&width=1200&height=400&coordinates=0%2C262%2C0%2C263"/><br/><br/><p>When treating strokes, every second counts. But for patients in remote areas, it may take hours to receive treatment. </p><p>The standard treatment for a common type of stroke, caused by large clots interrupting blood flow to the brain, is a procedure called endovascular thrombectomy, or EVT. During the procedure, an experienced surgeon pilots catheters through blood vessels to the blockage, entering through a major vessel such as the femoral artery in the groin. This is typically aided by X-ray imaging, which shows the position of blood vessels.</p><p>“Good outcomes are directly associated with early treatment,” says <a href="https://findanexpert.unimelb.edu.au/profile/821474-cameron-williams" target="_blank">Cameron Williams</a>, a neurologist at the University of Melbourne and fellow with the Australian Stroke Alliance. In fact, “time is brain” is <a href="https://www.ahajournals.org/doi/10.1161/01.str.0000196957.55928.ab" target="_blank">a common refrain</a> in stroke treatment. While blood flow is stopped, about 2 million neurons die each minute. Over an hour, that adds up to 3.6 years of typical age-related brain cell loss.</p><p>But in remote places like Darwin, in the north of Australia, this treatment isn’t available. Instead, it could take 6 hours or more and an expensive aeromedical transfer to get a patient to a medical center, says Williams. There are similar geographical challenges to stroke treatment access all over the world. 
Sparing a rural patient hours of transfer time to a hospital with an on-site expert could save their life, prevent disability, or preserve years of their quality of life.</p><p>That’s why there is particular interest in the possibility of emergency stroke treatment <a href="https://spectrum.ieee.org/telemedicine-and-surgery" target="_blank">performed remotely</a> with the help of robotics. Machines placed in smaller population centers could connect patients to expert surgeons miles away, and shave hours off the time to treatment. Two companies have recently demonstrated their remote capabilities. In September, doctors in Toronto completed a series of increasingly distant brain angiograms, the X-ray imaging element of an EVT, eventually performing two angiograms between crosstown hospitals using the N1 system from <a href="https://www.remedyrobotics.com/" target="_blank">Remedy Robotics</a>. And in October, <a href="https://sentante.com/" target="_blank">Sentante</a> equipment facilitated a simulated EVT between a surgeon in Jacksonville, Fla., and a cadaver with artificial blood flow in Dundee, Scotland.</p><p>“All those stories connected is not only proof of concept. It’s coming to realization and implementation that robotic and remote interventions can be performed, and soon will be the reality for many centers in rural areas,” says <a href="https://unityhealth.to/physician-directory/dr-vitor-mendes-pereira/" target="_blank">Vitor Pereira</a>, a neurosurgeon at Unity Health who performed the Toronto procedures.</p><h2>Two Approaches to Remote EVT</h2><p>One challenge of performing these remote procedures is maintaining strong, fast connections at large distances. “Is there a real life need to do this transatlantically? Probably not,” says <a href="https://sentante.com/#tm" target="_blank">Edvardas Satkauskas</a>, CEO of Sentante. “It demonstrates the capabilities. 
Even this distance is feasible.” Although performing a procedure remotely introduces issues related to latency, the pace of EVT—while urgent—is not reliant on instant reactions, says Satkauskas.</p><p>Redundant network links are also an important safeguard against dropped connections. Remedy has taken measures, for instance, to ensure that its robot monitors connection quality and doesn’t make any harmful movements over a poor connection, says <a href="https://www.linkedin.com/in/david-bell-rr" target="_blank">David Bell</a>, the company’s CEO.</p><p>Though both companies are careful about disclosing details of products and research that are still in development, there are notable differences between their approaches.</p><p>“Our device leans heavily on artificial intelligence,” says Bell. Machine learning is incorporated into how the Remedy device manipulates guide wires and creates an informational overlay atop X-ray images for remote physicians, who can control the robot with a laptop and software interface. The long-term goal is for a surgeon to be able to log on to Remedy software at short notice from a central location to interact with Remedy robots in multiple hospitals as needed.</p><p>In contrast, Sentante uses a control console meant to look and feel like the catheters and guide wires that surgeons are accustomed to manipulating in manual EVT, including force feedback that mimics the resistance they would feel in person. </p><p>“It’s very intuitive to use this,” says <a href="https://www.baptistjax.com/doctors/neurosurgeon/dr-ricardo-hanel-md" target="_blank">Ricardo Hanel</a>, a neurosurgeon with Baptist Health in Jacksonville, who was on the piloting end of the Sentante demonstration. Naturalistic feeling in the transatlantic procedure came with a reported latency of around 120 milliseconds. 
Hanel is also on Sentante’s medical advisory board.</p><p>Sentante has not yet implemented AI-assisted movements of its robot, though a plan is in place to capture as much training data as possible, from both images and force measurements. “As we joke, we had to build a sophisticated piece of hardware to become a software company,” says CEO Satkauskas. </p><h2>The Path to Clinical Use</h2><p>Hanel expressed optimism that any control system would be easily learned by surgeons.</p><p>“I think the main limitation for robotics is that you are still dependent on bedside interventionists,” says <a href="https://radiology.medicine.arizona.edu/profile/ahmet-gunkan-md" target="_blank">Ahmet Gunkan</a>, an interventional radiologist at the University of Arizona, who has written about <a href="https://link.springer.com/article/10.1007/s10143-024-03155-9" target="_blank">robots and endovascular interventions</a>. </p><p>Depending on the system, these bedside assistants might be responsible for a variety of tasks related to preparing and communicating with the patient, sterilizing and preparing equipment, loading step-specific parts, and repositioning X-ray or robotic equipment. Both CEOs note that while proper training will be essential, there are ways to reduce the burden on health care providers at the patient site.</p><p>In the case of remote operations, “it was important to us that the robot <a href="https://spectrum.ieee.org/star-autonomous-surgical-robot" target="_blank">could do the entire thing</a>,” says Bell. Remedy’s system has been designed to handle as much of the procedure as possible and to streamline moments when bedside human interaction is necessary. 
For example, since the earlier version used in Toronto, changes have already been made to maintain a clean line of communication between bedside and remote clinicians, facilitated by the Remedy system, says Bell.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A neurovascular surgical team carefully monitors a procedure in an operating room." class="rm-shortcode" data-rm-shortcode-id="5eb9c716265148d7d5909d5567cee4bb" data-rm-shortcode-name="rebelmouse-image" id="81621" loading="lazy" src="https://spectrum.ieee.org/media-library/a-neurovascular-surgical-team-carefully-monitors-a-procedure-in-an-operating-room.jpg?id=62225325&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">A team at St. Michael’s Hospital in Toronto performs the world’s first robot-assisted neurovascular procedure conducted remotely over a network, on 28 August 2025. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Katie Cooper and Kevin Van Paassen/Unity Health Toronto</small></p><p>Though remote EVT is a high priority, systems capable of the procedure may first be approved for <a href="https://spectrum.ieee.org/fiber-optic-probe" target="_blank">other endovascular procedures</a> performed locally. The hope is that precision robotics leads to better patient outcomes, whether the surgeon is in the next room or the next county.</p><p>Remedy has a clinical trial planned in 2026 for on-premise neurointerventions, and has partnered with the <a href="https://www.remedyrobotics.com/articles/remedy-asa-partnership" target="_blank">Australian Stroke Alliance</a> to distribute its N1 system and conduct a future clinical trial for remote procedures. 
Eventually the robot could be used to treat as many as 30 different conditions, says Bell.</p><p>Satkauskas views Sentante’s equipment as a flexible platform for endovascular procedures throughout the body, which could help keep bedside clinicians familiar with the device. The system may go to market in the EU next year for peripheral vascular interventions, which restore blood flow to the limbs, and it has a <a href="https://sentante.com/sentante-stroke-system-receives-fda-breakthrough-device-designation/" target="_blank">breakthrough device designation</a> from the U.S. FDA for remote stroke treatment.</p><p>There are other players in the space. For example, an early telerobotic effort from a company called Corindus is <a href="https://www.fiercebiotech.com/medtech/stryker-siemens-healthineers-team-develop-stroke-treating-robot" target="_blank">still ongoing</a> after the company’s acquisition by Siemens in 2019. And Pereira notes that Xcath has also demonstrated a <a href="https://neuronewsinternational.com/xcath-successfully-performs-first-public-telerobotic-mechanical-thrombectomy-demonstration/" target="_blank">long-distance simulated EVT</a> and looks to perform local robotic EVT with live patients soon.</p><p>“I think it’s an exciting time to be a neurointerventionalist,” says Hanel.</p>]]></description><pubDate>Mon, 24 Nov 2025 14:15:02 +0000</pubDate><guid>https://spectrum.ieee.org/remote-robotic-stroke-treatment-evt</guid><category>Stroke-treatment</category><category>Telerobotics</category><category>Surgical-robots</category><category>Stroke</category><category>Medical-robots</category><dc:creator>Greg Uyeno</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/dr-vitor-pereira-remotely-performing-a-surgical-procedure-using-remedy-robotics-n1-system.jpg?id=62225319&amp;width=980"></media:content></item></channel></rss>