Behind the Headlines
https://blogs.mathworks.com/headlines
Lisa Harvey discusses how MATLAB and Simulink are connected to today's news stories around the world.

Three favorites from TIME Magazine's “Best Innovations of 2023”
https://blogs.mathworks.com/headlines/2024/02/15/three-favorites-from-time-magazines-best-innovations-of-2023/
Posted Thu, 15 Feb 2024


For more than twenty years, TIME editors have compiled an annual list of the most impactful ideas and products. This year, the editors focused on categories such as AI, accessibility, robotics, and sustainability. The list is an amazing collection of inventions and innovations that have the potential to change the way we live.

 

Image credit: TIME Magazine

 

 

The 2023 list didn’t disappoint. It contains products with various levels of complexity, from a design that makes it easier for arthritis sufferers to brush their teeth, to NASA technology that helps pinpoint sources of air pollution with a satellite hovering 22,000 miles above Earth. Some are creative and fun, like Hasbro’s Selfie Series, which creates a custom action figure from a photo.

Here are three of the 2023 winners and how they used MATLAB and Simulink:

Breathing Martian Air

TIME selected MOXIE (Mars Oxygen In-Situ Resource Utilization Experiment), a device that travels the surface of Mars aboard the NASA Perseverance Rover. MOXIE makes oxygen by collecting carbon dioxide from the Martian atmosphere and splitting it into oxygen and carbon monoxide molecules.

This is a critical capability for future manned missions, providing breathable air for the astronauts and oxygen to burn the fuel on the return trip. MOXIE generated a total of 122 grams of oxygen on Mars, a successful proof of concept.

An almost identical engineering twin of MOXIE is used for testing at NASA’s Jet Propulsion Laboratory in Pasadena, California. (Image courtesy of NASA/JPL-Caltech)

MOXIE was designed with Simulink. The MOXIE Simulink model includes electrical circuits, chemistry, fluid dynamics, controls, and sensors. The resulting digital twin was used to simulate operation and compare results against the engineering model of the device on Earth. It was also used to evaluate the results of the actual MOXIE as it operated on Mars.

 

Simulink model for MOXIE.

 

MATLAB provides Simulink with data, including hardware dimensions, atmospheric conditions, chemical constants, control system setpoints such as the desired temperature, and safety limits. Simulink then sends simulation outputs (sensor readings) back to MATLAB for analysis.
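The MOXIE model itself isn't public, but the round trip described here, with MATLAB supplying parameters and Simulink returning simulated sensor signals, follows a standard pattern. Here's a minimal, hypothetical sketch of that workflow; the model name, variable names, and logged signal name are placeholders, not the actual MOXIE digital twin.

% Hypothetical sketch: pass parameters from MATLAB to a Simulink model and
% read simulated "sensor" outputs back for analysis. Names are placeholders.
simIn = Simulink.SimulationInput('moxie_model');          % model under test (placeholder name)
simIn = setVariable(simIn, 'T_setpoint_K', 1073);         % control setpoint, e.g., electrolysis temperature
simIn = setVariable(simIn, 'p_atm_Pa', 650);              % assumed Martian surface pressure
simIn = setVariable(simIn, 'compressor_rpm', 3000);       % assumed hardware operating point

out = sim(simIn);                                         % run the simulation

% Pull a logged "sensor" signal back into MATLAB for analysis
T_sensor = getElement(out.logsout, 'stack_temperature');  % logged signal name is a placeholder
plot(T_sensor.Values.Time, T_sensor.Values.Data)
xlabel('Time (s)'), ylabel('Stack temperature (K)')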

MATLAB also receives data from the real MOXIE on Mars. But neither the real nor the virtual MOXIE directly reports something as simple as how much oxygen it produces or the ratio of carbon dioxide to carbon monoxide. Instead, MATLAB calculates those values from temperature, pressure, and voltage sensor data.
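MOXIE's actual data-reduction code isn't shown here, but as a rough illustration of how a production rate can be derived from electrochemical telemetry rather than measured directly, here is a back-of-the-envelope Faraday's-law calculation. The current, cell count, and resulting rate are illustrative assumptions, not MOXIE telemetry.

% Illustrative only: estimate O2 production from electrolysis current using
% Faraday's law. Numbers are assumptions, not MOXIE telemetry.
F       = 96485;        % Faraday constant, C/mol
M_O2    = 32;           % molar mass of O2, g/mol
n_e     = 4;            % electrons transferred per O2 molecule (2 CO2 -> 2 CO + O2)
I_cell  = 3.0;          % assumed current through the cell stack, A
n_cells = 10;           % assumed number of electrolysis cells in series

rate_g_per_s  = n_cells * I_cell * M_O2 / (n_e * F);   % grams of O2 per second
rate_g_per_hr = 3600 * rate_g_per_s;

fprintf('Estimated O2 production: %.1f g/hr\n', rate_g_per_hr);   % about 9 g/hr for these assumptions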

“To support a human mission to Mars, we have to bring a lot of stuff from Earth, like computers, spacesuits, and habitats. But oxygen? If you can make it there, go for it — you’re way ahead of the game.”
– Jeff Hoffman, MOXIE Deputy Principal Investigator

To learn more about how the team at MIT and NASA designed MOXIE, read this article.

 

Farmerless Farming

Much closer to Earth, and helping farmers plow the earth, Monarch Tractor’s MK-V earned its spot on TIME Magazine’s list in the green energy category. The MK-V is an autonomous, all-electric tractor that can save farmers the cost of fuel and help them reduce labor expenses. And there’s the obvious benefit of reducing greenhouse gas emissions!

The driver-optional Monarch Tractor. (Image credit: Monarch Tractor)

 

Simulink and Model-Based Design helped the team at Monarch synchronize systems, including the sensors, cameras, lighting, and charging. Praveen Penmetsa, Monarch Tractor cofounder and CEO, credits a startup program that provides access to MATLAB® and Simulink® with giving the company a head start: getting its initial vehicles running, testing the architecture on its launch vehicles, and rapidly delivering the first tractors to farmers.

Read this article for more information on how Monarch Tractor designed the MK-V, including how they support over-the-air updates and how AI is helping the tractor complete real-time visual data analysis in the fields.

 

Answers from the Universe

OSIRIS-REx also graced the TIME Magazine innovations list, and deservedly so. The NASA spacecraft traveled to an asteroid, collected a 250-gram sample of rocks and dust, and then delivered the sample to eager scientists back on Earth. Researchers hope the pristine space dirt will reveal clues about the birth of our solar system.

While Mars is over 211 million miles from Earth, OSIRIS-REx traveled almost 4 billion miles. That’s “billion” with a “b.” It landed, with precision, on a moving asteroid in a touch-and-go (TAG) operation that avoided boulders and craters on the asteroid’s surface. The maneuver is even more impressive, considering that due to the rougher-than-expected terrain on Bennu, the original lidar-based TAG approach was not feasible. The mission team pivoted to use a pure vision-based navigation method instead.

An actual image of Bennu (left) and a simulated image of Bennu generated by KXIMP (right)

 

The OSIRIS-REx spacecraft used optical navigation (OpNav) software to set its course for the asteroid Bennu. OpNav techniques use camera images to determine the position of the spacecraft relative to a celestial body, such as a planet or asteroid. Developed in MATLAB®, the KinetX Image Processing software suite (KXIMP) processes images captured with an onboard camera. These images are downlinked to Earth to calculate the inertial camera attitude and the centroids of background stars and celestial bodies in the field of view.
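KXIMP itself is proprietary, but the basic center-finding step it performs, locating star and body centroids in an image, can be illustrated with standard Image Processing Toolbox calls. This is a generic sketch, not the KinetX algorithm; the image file name and threshold are placeholders.

% Generic centroiding sketch (not the KinetX KXIMP code). Locates bright
% blobs in an OpNav-style image and returns their intensity-weighted centers.
img = im2double(imread('opnav_frame.png'));    % placeholder file name
if size(img, 3) == 3
    img = rgb2gray(img);
end

bw    = imbinarize(img, 0.5);                  % simple fixed threshold (assumption)
bw    = bwareaopen(bw, 5);                     % drop single-pixel noise
stats = regionprops(bw, img, 'WeightedCentroid', 'Area');

centroids = vertcat(stats.WeightedCentroid);   % [x y] per detected object, in pixels

imshow(img), hold on
plot(centroids(:,1), centroids(:,2), 'r+')
title('Detected centroids (illustrative)')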

On the OSIRIS-REx mission, the center-finding algorithms were accurate to within 30 centimeters, or about 0.06% of the asteroid’s diameter—significantly outperforming the predicted accuracy of the mission’s navigation Concept of Operations (ConOps).

To learn more about KinetX and OpNav, read this article.

 

2024 Predictions, Anyone?

Please comment on this post with your predictions on what amazing technology and products will make the 2024 list. My money is on autonomous aircraft!

The scary combination of “rapid intensification” and “slower decay” in hurricanes
https://blogs.mathworks.com/headlines/2023/08/30/the-scary-combination-of-rapid-intensification-and-slower-decay-in-hurricanes/
Posted Wed, 30 Aug 2023
The rapid intensification and slower decay in recent hurricanes provide a terrible “one-two” punch, increasing storm surges and wind speeds at landfall while expanding the total area affected by the storms. The day before Hurricane Idalia made landfall in Florida, it traveled through some of the warmest waters on the planet, making it stronger at landfall and allowing it to travel further before losing its damaging strength.

Scientists are working to understand these storms better and have published papers on topics ranging from modeling intensification to improving predictions to finding ways to minimize the loss of property and life from these increasingly frequent, disastrous storms.

Rapid intensification

On Tuesday afternoon, Idalia strengthened further to a Category 2, with sustained winds of 100 mph. Overnight, it rapidly intensified to a Category 3 and then Category 4, with winds of 130 mph early Wednesday.

Idalia’s projected strength and path through early next week. Image credit: NOAA/CBS News

 

“Rapid intensification is associated with a sharp increase in intensity in a short amount of time, and consequently, the threat posed by the storm significantly increases,” said Phil Klotzbach, a research scientist in the Department of Atmospheric Science at Colorado State University.

According to The Washington Post, “In the Atlantic basin, which includes the Gulf of Mexico, 16 of the 20 hurricanes that formed during 2021 and 2022 rapidly intensified. Since 2017, seven rapidly intensifying storms have strengthened to at least a Category 4 (winds of at least 130 mph) before making landfall in the United States, together causing or contributing to at least 3,381 deaths and resulting in at least $496 billion in damage, according to reports compiled by the National Hurricane Center.”

Until recently, rapidly intensifying storms were less common. Tropical storms historically have taken several days to grow into powerful hurricanes, but with human-caused climate change, rapid intensification is becoming a more common occurrence, Allison Wing, an assistant professor of atmospheric science at Florida State University, told CNN.

Researchers from the Image Processing Laboratory at the University of Valencia turned to machine learning to predict a hurricane’s potential for intensification in the Atlantic and Pacific oceans. Their research, Advanced Machine Learning Methods for Major Hurricane Forecasting, was published in Remote Sensing. Their framework seeks to identify the most important cloud structural parameters in GOES imagery and use those structures to flag which storms could evolve into major hurricanes.

 

The hybrid machine learning approach developed in the study. Image credit: Javier Martinez-Amaya, Cristina Radin, and Veronica Nieves.

 

GOES (Geostationary Operational Environmental Satellite) imagery refers to the images captured by the GOES series of satellites. These satellites are operated by the National Oceanic and Atmospheric Administration (NOAA) and continuously monitor weather conditions from a geostationary orbit. GOES imagery includes visible, infrared, and water vapor channels, which are used to observe and track weather patterns, clouds, storms, and other atmospheric phenomena in real time.

Their framework identified the end intensity of major hurricanes more accurately than existing techniques, such as statistical analysis, when rapid intensification occurred. They employed a random forest algorithm. Their MATLAB code for sea-level reconstruction data is available here.
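The paper links its own code; as a generic illustration of the random forest step in MATLAB (Statistics and Machine Learning Toolbox), here's a sketch that trains a TreeBagger classifier. The features and labels below are synthetic stand-ins for the GOES-derived cloud-structure parameters, not the study's data.

% Generic random-forest sketch. Synthetic features stand in for the
% GOES-derived cloud-structure parameters used in the study.
rng(0)
nStorms = 500;
X = [randn(nStorms,1)+20, ...    % e.g., cloud-top temperature proxy
     rand(nStorms,1)*300, ...    % e.g., convective core area proxy
     rand(nStorms,1)*10];        % e.g., asymmetry proxy
y = X(:,2) > 150 & X(:,3) < 5;   % synthetic "becomes a major hurricane" label

mdl = TreeBagger(200, X, y, ...
    'Method', 'classification', ...
    'OOBPrediction', 'on', ...
    'OOBPredictorImportance', 'on');

oobErr = oobError(mdl);
fprintf('Out-of-bag error with all trees: %.3f\n', oobErr(end));

% Permutation importance hints at which structural parameters matter most
disp(mdl.OOBPermutedPredictorDeltaError)

% Classify a new (synthetic) storm
label = predict(mdl, [21, 220, 2]);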

Their study demonstrated that integrating the prominent cloud features of a tropical cyclone, including the anatomy and temperature, in a machine learning approach is suitable as a benchmark for diagnosing a possible transition into a major hurricane.

Slower decay

Back to Idalia: “We’re [going to] see not just the storm surge but potential for damaging winds extending well inland all the way across portions of north Florida, into southern Georgia, into places like Savannah, Hilton Head. We have hurricane warnings in effect for the fast-moving hurricane. It’s going to bring those winds really far inland today and tonight,” Michael Brennan, director of the National Hurricane Center, told “CBS Mornings” on Wednesday.

Not only are storms intensifying before landfall, but the same storms also carry more moisture inland, furthering their destructive results. According to CNN, “A 2020 study published in the journal Nature found storms are moving farther inland than they did five decades ago. Hurricanes, which typically weaken after moving over land, have been raging longer after landfall in recent years. The study concludes that warmer sea surface temperatures are leading to a “slower decay” by increasing moisture that a hurricane carries.”

 

Predicted rainfall from Hurricane Idalia on days one through three. Image credit: NOAA/CBS News.

 

In the study referenced by CNN, “Slower decay of landfalling hurricanes in a warming world,” the researchers analyzed 50 years of intensity data of hurricanes that made landfall in the North Atlantic. They found the slowdown in the decay over time is in direct proportion to a contemporaneous rise in the sea surface temperature.  Specifically, a typical hurricane in the 1960s lost 75% of its intensity in the first 24 hours after landfall. Today, the decay is only about 50% in the same 24-hour period, meaning the devastating effects last longer and can travel further inland.

They used data from hurricanes from 1967 onward, employing the MATLAB function land_or_ocean to determine which data points to include in their study. The team defined landfall as four continuous inland data points, using the first inland data point as the strength at landfall. Land_or_ocean is available on the MathWorks File Exchange. The visualization of the hurricane paths contained in the study is shown below.

 

Hurricanes tracked from 1967 to 1992 (in blue) and 1993 to 2018 (in red).  Image credit: L. Li and P. Chakraborty.

 

MATLAB was also used to determine the decay timescale. The graph below shows the slower decay experienced in storms post-1993, shown in red. The data is available from NOAA.

 

Histogram and probability density of intensity. Image credit: L. Li and P. Chakraborty.
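The study's scripts aren't reproduced in the post. As a minimal illustration of the decay-timescale estimate described above, here's a sketch that fits an exponential decay to post-landfall intensity for one storm. The decay form with a background-intensity term follows the general description in the paper; all of the numbers below are made up for the example.

% Illustrative decay-timescale fit for one storm's post-landfall intensity.
% Data are synthetic; the fitted form V(t) = Vb + (V0-Vb)*exp(-t/tau) is an assumption.
t_hr = 0:6:48;                                   % hours after landfall
V    = [105 82 66 55 47 42 38 36 35];            % synthetic wind speeds, knots

Vb   = 30;                                       % assumed background intensity, knots
V0   = V(1);

% Linearize: log((V - Vb)/(V0 - Vb)) = -t/tau, then fit a slope through the origin
ylog = log((V - Vb) ./ (V0 - Vb));
tau  = -1 / (t_hr(:) \ ylog(:));                 % least-squares slope -> decay timescale, hours

fprintf('Estimated decay timescale: %.1f hours\n', tau);

plot(t_hr, V, 'o', t_hr, Vb + (V0 - Vb)*exp(-t_hr/tau), '-')
xlabel('Hours after landfall'), ylabel('Intensity (kt)')
legend('data (synthetic)', 'fitted decay')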

 

Additional climate studies welcome

This is only a sampling of studies improving our understanding of hurricanes and other climate phenomena. If you are involved in a study that you would like to share, please reach out! I welcome your ideas and guest posts.

Treating Alzheimer’s disease with lights and sounds
https://blogs.mathworks.com/headlines/2023/08/03/treating-alzheimers-disease-with-lights-and-sounds/
Posted Thu, 03 Aug 2023
A Boston-based startup developed a non-invasive treatment for Alzheimer’s disease that slowed patients’ cognitive decline by 83% and functional decline by 84% after just six months. It’s as easy as wearing a specialized set of goggles and a headset for an hour once a day. Does this sound too good to be true? According to the Boston Globe, that’s precisely what the medical device designed by Cognito Therapeutics demonstrated in a recent study.

“We made Alzheimer’s patients look like non-Alzheimer’s patients,” Cognito chief executive Brent Vaughan told the Globe.

 

Cognito Therapeutics’ medical device is designed to treat Alzheimer’s disease patients. Image credit: David L. Ryan/The Boston Globe

 

Alzheimer’s disease is robbing millions of people of their memories and diminishing their cognitive abilities. The Alzheimer’s Association states that 55 million people worldwide live with Alzheimer’s and other dementias.

Alzheimer’s disease is a degenerative brain disease and the most common form of dementia. Dementia is not a specific disease. It’s an overall term that describes a group of symptoms. – The Alzheimer’s Association

Dementia is a devastating disease for patients and their families. Beyond the suffering, a high societal cost is associated with caring for dementia patients. According to The American Journal of Managed Care, “In 2022, the estimated healthcare costs associated with [Alzheimer’s disease] treatment were $321 billion, with costs projected to exceed 1 trillion [dollars] by 2050. These cost-of-care projections are based on direct healthcare costs and are likely underestimated because indirect costs associated with AD treatment are usually not included.”

As the world ages, there is an increased need for treatments for Alzheimer’s disease.

Cognito Therapeutics was founded in 2016 after Professors Ed Boyden and Li-Huei Tsai, both of MIT, discovered how to trigger gamma waves in mice. The gamma waves reduced the amyloid plaques in the mice’s brains. Since amyloid plaques accumulate in Alzheimer’s patients, they focused on applying gamma wave stimulation to human patients.

Gamma Wave Therapy

The technique, known as gamma wave therapy, uses a headset that delivers pulsing lights and sounds at fast, specific frequencies. The therapy stimulates the activity of immune cells inside the brain known as microglia, with the goal of helping the central nervous system clear out specific proteins that can lead to neurological diseases and dementia.

In a study published in PLOS ONE in December 2022, which showed that six months of use of the digital therapeutic could reduce the rate of atrophy in the brain’s white matter, researchers from Cognito Therapeutics used EEGLAB, a MATLAB community toolbox for processing electrophysiological signals.
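The study's EEGLAB pipeline isn't reproduced here; as a simple illustration of quantifying 40 Hz entrainment in an EEG channel, here's a plain MATLAB sketch using Welch's method (Signal Processing Toolbox). The signal, sampling rate, and band limits are assumptions; a real analysis would start from data loaded and preprocessed with EEGLAB.

% Illustrative 40 Hz entrainment check on a synthetic EEG-like signal.
fs  = 250;                                   % sampling rate, Hz (assumption)
t   = 0:1/fs:60;                             % one minute of data
eeg = 0.5*sin(2*pi*40*t) + randn(size(t));   % weak 40 Hz component buried in noise

[pxx, f] = pwelch(eeg, fs*4, fs*2, [], fs);  % 4 s windows, 50% overlap

% Power in a narrow band around the 40 Hz stimulation frequency vs. neighbors
inBand   = f >= 39 & f <= 41;
neighbor = (f >= 35 & f < 39) | (f > 41 & f <= 45);
snr40    = 10*log10(mean(pxx(inBand)) / mean(pxx(neighbor)));

fprintf('40 Hz band power relative to neighboring bands: %.1f dB\n', snr40);
plot(f, 10*log10(pxx)), xlim([0 80])
xlabel('Frequency (Hz)'), ylabel('Power (dB)')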

 

Acute 40Hz combined visual and auditory stimulation entrains cortical and subcortical regions. Image credit: Chan D, Suk H-J, Jackson BL, Milman NP, Stark D, Klerman EB, et al.

 

In the same study, the researchers at Cognito Therapeutics used Statistical Parametric Mapping (SPM), a MATLAB community toolbox, for the MRI analysis of the study participants. The SPM software package is designed to analyze brain imaging data sequences. The sequences can be a series of images from different cohorts or a time series from the same subject. The current release is designed to analyze fMRI, PET, SPECT, EEG, and MEG data.

The anatomical changes are shown in the figure below. This is also from the 2022 paper published in the journal PLOS ONE.

Daily GENUS leads to group-level differences in structural and functional MRI outcomes at Month 3. Image credit: Chan D, Suk H-J, Jackson BL, Milman NP, Stark D, Klerman EB, et al.

 

This study was very promising, but it was limited in scope due to restrictions during the pandemic. Cognito Therapeutics has started a much larger clinical trial, which will enroll approximately 500 patients at 50 sites. This phase 3 trial will focus on patients with mild to moderate disease. The company hopes the results will earn FDA approval by the end of 2025.

The company has also designed a sleeker headset that it hopes to commercialize if its clinical trial is successful.

Image credit: Cognito Therapeutics.

 

Hottest month in over 120,000 years: Hot nights make it even more miserable.
https://blogs.mathworks.com/headlines/2023/07/28/hottest-month-in-over-120000-years-hot-nights-make-it-even-more-miserable/
Posted Fri, 28 Jul 2023
July 2023 is the hottest month ever recorded on Earth. And we still have a few days to go before we flip the calendar to August.

According to Scientific American, “Because July is climatologically the hottest month of the year for the Earth as a whole, that makes July 2023 the hottest month since records have been kept and likely the hottest in 120,000 years, based on evidence of past temperatures found in ancient sediments and layers of ice, as well as on other paleoclimate records.”

 

Global temperature anomalies for July, 1880 to 2023, relative to a 1951 to 1980 baseline. Image credit: Scientific American.

 

Heat is the deadliest form of extreme weather, responsible for more human fatalities than floods, tornadoes, or hurricanes, according to National Weather Service statistics. And the heatwaves of this past month have broken records in the U.S., Mexico, China, Greece, and Canada, among other locations.

Adding to the concern, this temperature increase shows up not only in individual hot days or nights but also in high-temperature events on consecutive days and nights. For example, Phoenix had 10 straight days where the low temperature stubbornly remained above 90 degrees Fahrenheit.

 

Daily high and low temperatures in Phoenix, Arizona, over a 10-day period in July 2023. Image credit: The Guardian.

 

CNN reported, “Now, a new study projects that without steps to rein in heat-trapping gas pollution, as many as three-quarters of summer days across much of the Northern Hemisphere could feature nearly around-the-clock extreme heat by 2100.”

The inability to cool off after a hot summer day is unbearable for many, but it is particularly challenging for vulnerable populations such as the elderly and those in poor health. Heat waves in Europe and India have been linked to thousands of deaths. Last summer, nearly 62,000 people died from heat-related causes in Europe.

Cool Nights Needed to Recover from Daytime Heat

A study published in Nature Communications explains the risks of compound heat extremes. “After experiencing a hot day, people tend to expect a cool night so they can recover from the daytime heat,” said two of the study’s co-authors, Dr. Yang Chen and Dr. Jun Wang. “Compound hot extremes with daytime heat and nighttime heat occurring in close sequence deprive humans of this chance at relief.”

 

Observed changes in summertime hot extremes. Image credit: Wang, J., Chen, Y., Tett, S.F.B. et al. , Nature.

 

The researchers from the Chinese Academy of Meteorological Sciences and the University of Edinburgh explored three types of temperature events: hot days with mild nighttime temperatures, hot nights with mild daytime temperatures, and compound heat events, which occur when both daytime and nighttime temperatures are elevated. They found that compound extreme heat events are on the rise. Compared with 1960, the Northern Hemisphere now sees about five more exceptionally hot days, and these days are approximately 2.7 degrees Fahrenheit (1.5 degrees Celsius) warmer.
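The paper's definitions are more involved, but the basic bookkeeping, classifying each summer day by whether its daytime high, its nighttime low, or both exceed local thresholds, is easy to sketch. The temperatures and the 90th-percentile thresholds below are synthetic assumptions, not the study's data.

% Sketch: classify summer days as day-only, night-only, or compound hot extremes.
% Synthetic data; thresholds here are simple 90th percentiles of the sample.
rng(1)
nDays = 92;                                 % one summer
tmax  = 33 + 3*randn(nDays,1);              % daily maximum temperature, deg C
tmin  = 22 + 2*randn(nDays,1);              % daily minimum (nighttime) temperature, deg C

thrDay   = prctile(tmax, 90);               % hot-day threshold (assumption)
thrNight = prctile(tmin, 90);               % hot-night threshold (assumption)

hotDay   = tmax > thrDay;
hotNight = tmin > thrNight;

nDayOnly   = sum(hotDay  & ~hotNight);
nNightOnly = sum(~hotDay &  hotNight);
nCompound  = sum(hotDay  &  hotNight);      % both daytime and nighttime heat

fprintf('Day-only: %d, night-only: %d, compound: %d of %d days\n', ...
        nDayOnly, nNightOnly, nCompound, nDays);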

The researchers then conducted a series of analyses covering historical changes, mechanisms, quantitative detection and attribution, constrained projections, and future population exposure. The predictions are startling: if greenhouse gas emissions are not curbed, many places in the Northern Hemisphere can expect around 69 days of brutal daytime and nighttime heat per year by 2100, more than eight times the number in 2012.

 

Constrained projections of summertime hot extremes. Image credit: Wang, J., Chen, Y., Tett, S.F.B. et al. , Nature.

 

The Southern United States, Northwest and Southeast Canada, Western and Southern Europe, Mongolia, and Southeast China have already seen the largest increases in compound extreme heat days. How much temperatures continue to rise depends on how much greenhouse gas emissions are reduced. If emissions are left unchecked, compound heat events will become a regular experience for many by the turn of the century.

Data and code availability

Several publicly available data sets were used for the study; see the paper for the full list.

The climate data were analyzed in MATLAB. Scripts are available from the study authors upon request.

NASA’s DART mission successfully slams asteroid
https://blogs.mathworks.com/headlines/2022/09/30/nasas-dart-mission-successfully-slams-asteroid/
Posted Fri, 30 Sep 2022
The Double Asteroid Redirection Test (DART) is a NASA space mission designed to test a planetary defense technique against near-Earth objects (NEOs). Last week, it crashed into Dimorphos, a small asteroid over 11 million km (7 million miles) from Earth. The mission was designed to see if intentionally crashing a spacecraft into an asteroid is an effective way to alter the asteroid’s course.

Dimorphos was nowhere near Earth and posed no threat. In fact, there are no known Earth-threatening NEOs, but this technology could be deployed if one was discovered in the future.

This illustration is of the DART spacecraft and the Italian Space Agency’s (ASI) LICIACube prior to impact at the Didymos binary system. Image Credit: NASA/Johns Hopkins, APL/Steve Gribben.

 

The spacecraft launched in November 2021. On the scheduled day, September 26, 2022, it collided with Dimorphos as planned. DART was traveling at an astounding 14,000 miles per hour when it hit a target that was not visible to its onboard sensors until the last hours of its 10-month-plus journey.

The James Webb Space Telescope took one observation of the impact location before the collision, then several observations over the next few hours. Images captured by Webb’s near-infrared camera show plumes of material appearing as wisps streaming away from the center of the impact.

This animation, a timelapse of images from NASA’s James Webb Space Telescope, covers the time spanning just before impact at 7:14 p.m. EDT, Sept. 26, through 5 hours post-impact. Plumes of material from a compact core appear as wisps streaming away from where the impact took place. An area of rapid, extreme brightening is also visible in the animation. Credits: Science: NASA, ESA, CSA, Cristina Thomas (Northern Arizona University), Ian Wong (NASA-GSFC); Joseph DePasquale (STScI)

Dinosaurs 0, Asteroids 1

NASA estimates that every 2,000 years or so, a meteoroid the size of a football field hits Earth and causes significant damage to the area near the impact. But every few million years, an object large enough to threaten Earth’s civilization comes along. Impact craters on the moon, Earth, and other planetary bodies are evidence of these occurrences.

One such occurrence was the Cretaceous–Paleogene extinction event. According to National Geographic, 76 percent of all species on the planet, including all non-avian dinosaurs, went extinct when an asteroid hit the planet 66 million years ago. That asteroid slammed into the waters off the coast of Mexico at 45,000 mph.

Dimorphos is much smaller than the catastrophic asteroid that struck Earth so long ago. The asteroid that wiped out the dinosaurs was 7.5 miles across, almost 75 times the size of the 530-foot-long Dimorphos. Hitting such a small target millions of miles from Earth required a new type of navigation system: one that removed the possibility of human error during the last critical hours of the mission.

Throwing a DART at a moving target

The spacecraft’s autonomous guidance system, the Small-body Maneuvering Autonomous Real Time Navigation (SMART Nav), guided DART to impact. SMART Nav, developed in MATLAB and C++, is the set of onboard algorithms that, along with the rest of the spacecraft’s guidance and navigation system, independently found Dimorphos and steered the spacecraft directly into it.

DART relied on SMART Nav to direct the spacecraft without any human intervention. This was crucial in the mission’s last hours, when, according to NASA, “a tiny maneuvering error could be the difference between hitting Dimorphos and racing past it at more than 13,000 miles per hour.”

“With SMART Nav, it’s no longer about just keeping the spacecraft in a prescribed orientation or carrying out correction maneuvers. It’s about conducting the entirety of the last four hours of the mission without any human intervention”, said Mark Jensenius, a guidance and navigation control engineer on the SMART Nav team at the Johns Hopkins Applied Physics Laboratory in Laurel, Maryland. “It’s about locating objects in space, selecting the correct asteroid, estimating trajectory corrections and commanding maneuvers on-the-fly to achieve the higher-level directive of ‘hit Dimorphos.’”
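APL hasn't published SMART Nav's internals, so the sketch below is not the DART algorithm. It's a textbook proportional-navigation loop, included only to illustrate the kind of terminal-guidance update an autonomous impactor must compute from relative position and velocity; all numbers are invented.

% Textbook 2-D proportional navigation (NOT the SMART Nav algorithm): command
% acceleration proportional to the rotation rate of the line of sight.
dt = 0.05; N = 3; nSteps = 4000;         % time step (s), navigation constant
r_i = [0; 0];        v_i = [600; 40];    % interceptor position (m) and velocity (m/s)
r_t = [40000; 3000]; v_t = [0; -20];     % target position and velocity (assumed)
missDist = inf;

for k = 1:nSteps
    rel_p = r_t - r_i;                   % relative position
    rel_v = v_t - v_i;                   % relative velocity
    missDist = min(missDist, norm(rel_p));
    if dot(rel_p, rel_v) > 0, break, end % range increasing: closest approach passed

    % Line-of-sight rotation rate and closing speed (2-D scalar forms)
    losRate = (rel_p(1)*rel_v(2) - rel_p(2)*rel_v(1)) / dot(rel_p, rel_p);
    vClose  = -dot(rel_p, rel_v) / norm(rel_p);

    % Commanded acceleration, applied perpendicular to the line of sight
    los   = rel_p / norm(rel_p);
    a_cmd = N * vClose * losRate * [-los(2); los(1)];

    v_i = v_i + a_cmd * dt;              % integrate interceptor state
    r_i = r_i + v_i * dt;
    r_t = r_t + v_t * dt;                % target drifts on its own course
end
fprintf('Closest approach: %.1f m\n', missDist);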

Knocking Dimorphos off its track

According to NASA, “The DART Investigation Team will compare the results of DART’s kinetic impact with Dimorphos to highly detailed computer simulations of kinetic impacts on asteroids. Doing so will evaluate the effectiveness of this mitigation approach and assess how best to apply it to future planetary defense scenarios, as well as how accurate the computer simulations are and how well they reflect the behavior of a real asteroid.”

According to the New York Times, “More detailed study will come years later when Hera, a spacecraft being built by the European Space Agency, arrives to take a close look at the two asteroids, especially the scar made by DART. Scientists estimate that there should be a crater 30 feet to 60 feet wide.”

 

 

Pumpkin toadlets can’t jump
https://blogs.mathworks.com/headlines/2022/08/04/pumpkin-toadlets-cant-jump/
Posted Thu, 04 Aug 2022
What is tiny, bright orange, and really bad at jumping? The answer is a small amphibian found in the mountainous forests of Brazil, aptly called pumpkin toadlets. These tiny frogs are barely a centimeter long at maturity and are (in)famous for their jumping skills.

Image credit: Luiz F. Ribeiro

 

While frogs are typically known for their ability to clear large distances in a single jump, mainly as a means to escape predators, these tiny amphibians seem to lack the aerial skills needed to be a frog. Each jump results in an uncontrolled crash landing, as seen in the video below.

 

Gizmodo reported, “[Pumpkin toadlets are] actually decent at jumping upward; it’s the coming down that’s disastrous. The pumpkin toadlets are simply unable to control their landings. A team of researchers recently looked into the faulty gymnastics of these frogs, and their findings are published  in Science Advances.”

Don’t worry: be hoppy. While it’s hard not to chuckle at the tiny amphibian’s lack of grace, it’s worth noting that these repeated tumbles don’t seem to hurt the frogs.

Researchers looked into the cause of the clumsiness and found it may be related to the toadlets’ size.

Just too small!

“[…] having a body smaller than that of a honey bee comes with some pretty clear functional consequences,” Martha Muñoz, a biologist at Yale University, told The Atlantic. “Pumpkin toadlets have shrimpy legs and arms, and fewer fingers and toes than other frogs; their high surface-area-to-volume ratio makes them easily dry out. The teeny toadlets also have all sorts of issues with their head, which can shrink only so much before it malfunctions.”

 

Image credit: Luiz F. Ribeiro

 

Small heads mean there is very little real estate for all things contained in the skull. Researchers suspect that the frogs’ crash landings may be due to their inability to balance mid-leap due to their miniature stature and even smaller ears.

The researchers made CT scans of the inner ears of 147 frog species. They determined that the pumpkin toadlets’ semicircular ear canals are the smallest ever recorded for adult vertebrates.

The ability to correct mid-air is missing.

The researchers recorded the frogs jumping and analyzed the footage, capturing each jump from the side and from above to determine the toadlet’s orientation throughout the jump. But tracking the orientation of a minuscule projectile as it pirouetted through the air required specialized tools.

To analyze the videos, the recordings were digitized using the DLTdv7 application in MATLAB with an eight-point calibration cube. The researchers digitized two landmarks in both camera views to generate three-dimensional coordinates for the tip of the snout and the tip of the tailbone. Body angle was calculated as the angle formed by a line connecting these two landmarks and the horizontal, which was determined by projecting a point in the y direction.
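DLTdv7 handles the digitizing; the body-angle computation itself is simple once the 3-D coordinates exist. Below is a minimal sketch of that step with made-up coordinates; the exact projection convention in the paper may differ from the generic one used here.

% Sketch: body angle from two digitized landmarks (snout tip and tailbone tip).
% Coordinates are made up; real values would come from DLTdv-digitized video.
snout = [12.4, 3.1, 8.9];      % [x y z] in calibration-cube units
vent  = [11.1, 3.0, 8.2];      % tailbone landmark

body     = snout - vent;                   % vector along the body axis
horizRef = [body(1), body(2), 0];          % same vector projected onto the horizontal plane

% Angle between the body axis and horizontal (positive = snout above tailbone)
bodyAngle = atan2d(body(3), norm(horizRef));
fprintf('Body angle: %.1f degrees above horizontal\n', bodyAngle);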

Their analysis of the videos revealed that changes in the rotation speed of the tiny frogs were the lowest among the group of amphibians in the study. Jumping creatures need high rotation speeds to give them time to orient themselves before they return to the ground. To make matters worse, the team determined that the change in rotation speed was lowest when the toadlets were mid-air. This “suggests that the toadlets find this stage the hardest to track with their insensitive ear canals and end up flopping to the ground instead of landing on their feet,” New Scientist’s Jake Buehler reports.

It turns out that they can jump; they just can’t stick the landing.

While they are lousy at landing, it doesn’t seem to matter. When they land, they tend to remain motionless for an extended time, allowing them to escape predators after all. They match the leaf debris on the forest floor and are hard to find.

Watch the full video below. But for the best effect, “sound on!”

Capturing the image of this black hole required an Earth-sized telescope
https://blogs.mathworks.com/headlines/2022/05/18/capturing-the-image-of-this-black-hole-required-an-earth-sized-telescope/
Posted Wed, 18 May 2022
The Milky Way is a hazy, dare I say milky, band of light seen in the night sky. In 1610, Galileo Galilei used his telescope to show the light emanated from individual stars. In 1920, scientists first argued that the Milky Way does not contain all the stars in the Universe but is instead just one of many galaxies. Fast forward to last week, and humans saw the black hole at the center of our galaxy for the first time, thanks to advancements in science and technology and the global collaboration of researchers that support the Event Horizon Telescope (EHT).

According to Phys.org, “On Thursday, an international team of astronomers gave us the first glimpse of the supermassive black hole at the center of the Milky Way. Dubbed Sagittarius A*, the gravity- and light-sucking monster some 26,000 light-years from Earth has the same mass as four million Suns.”

 

 

The first-ever image of Sagittarius A*, the black hole at the center of the Milky Way galaxy. Image credit: Event Horizon Telescope

 

Per the EHT website, “Although we cannot see the black hole itself, because it is completely dark, glowing gas around it reveals a telltale signature: a dark central region (called a “shadow”) surrounded by a bright ring-like structure. The new view captures light bent by the powerful gravity of the black hole, which is four million times more massive than our Sun.”

Considering that the black hole is roughly 26,000 light-years from Earth, how did the scientists from the EHT capture the image? The answer lies in how the EHT collects data and in the algorithm the scientists created to turn the massive amount of data into the image shared last week. The result is the first-ever visual evidence of the massive object at the center of the Milky Way.

PBS shared, “The image is the first visual evidence of Sgr A*’s existence; the black hole was previously only indirectly confirmed by observations of stars in orbit around it.”

Image credit: NASA via BBC

 

“Until now, we didn’t have the direct picture to prove that this gentle giant in the center of our galaxy is a black hole,” Feryal Özel, an astrophysicist at the University of Arizona, said during a National Science Foundation news conference held on May 12. “It shows a bright ring surrounding the darkness, and the telltale sign of the shadow of the black hole.”

The flashes of light captured in the image are produced when matter, such as planets or debris, is pulled into the black hole’s outer boundary. This boundary is called an event horizon.

 

A telescope the size of Earth

In a great analogy, the EHT team explains that, given the distance from Earth, seeing the black hole in the sky is like spotting a donut on the surface of the moon. EHT uses very-long-baseline interferometry to create the image, with telescopes spread across the globe, effectively turning our planet into an Earth-sized telescope that captures a tremendous amount of data. EHT then reassembles the terabytes into the image with an algorithm called CHIRP (Continuous High-resolution Image Reconstruction using Patch priors).

The measurements taken to construct the image of the black hole came from seven radio telescopes spread around the world that comprise the EHT. An eighth telescope at the South Pole aided in calibrating these measurements. Each telescope can see a small part of the black hole, and together, they form a virtual telescope the size of the world that effectively works together like one giant dish.

EHT map. Image credit:  D. Marrone/UofA

 

The observations were performed at a wavelength of 1.3 millimeters, which is 2,000 to 3,000 times longer than the wavelength of visible light.

Seth Fletcher, Chief Features Editor for Scientific American, described the process: “There’s only a very limited period of time each year when telescopes in Europe, North America, South America, Antarctica can all see the same things in the sky. So they put together [an] elaborate schedule of when Sagittarius A*, for example, is gonna be up over the horizon and visible to what telescopes.

“They just scan black holes for several nights. Then they take all the data in hard drives. Then they physically ship it to two super computer banks, one in Massachusetts, one in Germany, and then they correlate it all into a single data set. And then they search it for common detections where all of its telescopes have seen the same thing.”

Instrumenting the telescopes to work together

In order to stitch together the data, the EHT teams need to ensure the high bandwidth data is collected in the same way at each telescope.

To ensure consistent data collection, each telescope requires the same high-performance instrumentation. The EHT based its instrumentation on digital hardware and technology developed by the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER).

CASPER designs and ships FPGA boards to each telescope site to ensure the telescopes process the observed radio waves in the same way. These FPGAs were designed with Simulink. The end users at each site utilize a highly efficient polyphase filter bank for signal processing. A polyphase filter bank is similar to a fast Fourier transform (FFT) but with improved channel isolation. The FPGAs offer higher throughput and better energy efficiency than a typical CPU or GPU.
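CASPER's designs run on FPGAs, but the polyphase filter bank idea is easy to prototype in MATLAB: a long prototype low-pass filter is applied across overlapping branches of the input, the branches are summed, and an FFT produces the channels. The sketch below is a generic, illustrative channelizer with made-up signal parameters, not the CASPER gateware.

% Generic polyphase filter-bank (PFB) channelizer sketch (Signal Processing Toolbox).
% Compared with a plain FFT, the prototype filter sharpens channel isolation.
M    = 64;                         % number of frequency channels
taps = 8;                          % taps per polyphase branch
h    = fir1(M*taps - 1, 1/M);      % prototype low-pass filter, length M*taps

fs = 1e6;
t  = (0:M*taps*256 - 1)/fs;
x  = exp(2j*pi*123e3*t) + 0.1*randn(size(t));   % test tone plus noise

nFrames = floor(numel(x)/M) - taps + 1;
X = zeros(M, nFrames);
for k = 1:nFrames
    seg = x((k-1)*M + (1:M*taps));                 % M*taps samples per frame
    seg = seg .* h;                                % apply the prototype filter
    X(:, k) = fft(sum(reshape(seg, M, taps), 2));  % sum the branches, then FFT
end

imagesc(10*log10(abs(X).^2 + eps))                 % channel-vs-time "waterfall"
xlabel('Frame'), ylabel('Channel'), title('PFB channelizer output (illustrative)')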

“Simulink was crucial in producing the M87 and recent image of the black hole at the center of the galaxy,” says Dan Werthimer, chief scientist at the Berkeley SETI Research Center. “The Casper FPGA instrumentation ran at all the radio telescopes and collected several petabytes of data used to make these images. Simulink was also used to design and test the beamformer instruments that combined signals from all the antennas in both Chile and Hawaii sites for these images.”

The image exceeded expectations

Per the EHT blog, the image has advanced our understanding of our own galaxy. “We were stunned by how well the size of the ring agreed with predictions from Einstein’s Theory of General Relativity,” said EHT Project Scientist Geoffrey Bower from the Institute of Astronomy and Astrophysics, Academia Sinica, Taipei. “These unprecedented observations have greatly improved our understanding of what happens at the very centre of our galaxy, and offer new insights on how these giant black holes interact with their surroundings.”

The EHT team’s results are being published today in a special issue of The Astrophysical Journal Letters.

 

EHT image of the black hole, Sagittarius A*. The inset images represent different imaging solutions and their associated frequency (histograms). Image credit: The Astrophysical Journal Letters.

 

1960s US Army project unearths Greenland ice sheet’s fragility
https://blogs.mathworks.com/headlines/2022/01/27/1960s-us-army-project-unearths-greenland-ice-sheets-fragility/
Posted Thu, 27 Jan 2022
A decades-old nuclear research project uncovers proof that the Greenland ice sheet has melted before.

During the 1960s, a US Army project called “Project Iceworm” set out to determine if a nuclear weapon facility could be built under the one-mile-thick ice of Greenland. They drilled into the ice sheet at Camp Century, through the entire depth, to determine if this was feasible.

 

Workers building the snow tunnels at the Camp Century research base in 1960. Image Credit: U.S. Army Corps of Engineers

 

Spoiler alert: It wasn’t

The tunnels collapsed.

But that does not mean the experiment was not valuable. Fast forward more than 50 years, and “Project Iceworm” is back in the news. But this time, it has nothing to do with weapons and everything to do with the climate crisis.

As reported by Popular Science, science was part of the cover for Project Iceworm: “In 1966, in the middle of the Cold War, scientists extracted a nearly mile-long core of ice and sediment from Greenland’s ice sheet.”

Since then, the samples from the ice core have been stored at the University of Copenhagen’s ice core repository. The 12 inches of soil from the bottom of the core weren’t studied until recently, when the sample was sent to Dr. Andrew Christ at the University of Vermont. He found ancient plants and fossils in the soil sample.

 

Glacial geomorphologist Andrew Christ (right), with geology student Landon Williamson, holds up the first fossil twig spotted as they washed a sediment sample from Camp Century. Image Credit: Paul Bierman, CC BY-ND

 

Dating determined the samples were less than 1 million years old. This shows that the entire ice sheet has melted before, making it even more vulnerable to human-caused warming. The study was published in PNAS.

If all the ice on Greenland melts

“The study’s findings bolster evidence that Greenland’s ice sheet may have completely melted without the kind of human-caused warming the planet is experiencing now. This could mean that the Greenland ice sheet is really sensitive to changes in the climate,” says lead author Andrew Christ.

Maps of Greenland ice sheet speed and bedrock elevation. Image Credit: Yahoo News

 

Yahoo News stated, “With no ice sheet, sunlight would have warmed the soil enough for tundra vegetation to cover the landscape. The oceans around the globe would have been more than 10 feet higher, and maybe even 20 feet. The land on which Boston, London and Shanghai sit today would have been under the ocean waves.”

“If we continue to warm the planet uncontrollably, we could melt away the Greenland ice sheet and raise the sea level,” Christ told Popular Science. “That would be very bad, because 40 percent of the global population lives within 100 kilometers of the coast. And 600 million people live within about ten feet of sea level rise.”

How MATLAB was used

According to the Washington Post, once the organic sediment was discovered, “The fossils were passed along to plant experts for further analysis, and Christ set about trying to determine when they might have grown.”

Christ used a technique called cosmogenic nuclide dating to calculate the plants’ age. This process estimates the amount of time rocks have been buried by analyzing particles created when materials are exposed to radiation from outer space.

The researchers reported, “Cosmogenic 26Al/10Be minimum total histories were modeled using MATLAB code [...] using the world-average production rate and scaling implemented in the CREp calculator.”

26Al (aluminum-26) and 10Be (beryllium-10) are commonly used for surface exposure dating because these nuclides are produced when cosmic rays strike their parent isotopes, silicon-28 and oxygen-16, respectively. The parent isotopes are common in the Earth’s crust. CREp, from the University of Lorraine, France, is MATLAB code that computes Cosmic Ray Exposure (CRE) ages. Here is a link to the CRE calculator.
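CREp handles exposure ages; the burial-dating idea used at Camp Century can be illustrated with the simpler ratio relationship below. This is a schematic calculation with assumed values (initial production ratio, measured ratio, and approximate standard half-lives), not the study's modeling.

% Schematic 26Al/10Be burial-age estimate (not the study's actual modeling).
% While buried, both nuclides decay, but 26Al decays faster, so the ratio falls.
lambda26 = log(2) / 0.705e6;    % 26Al decay constant, 1/yr (half-life ~0.705 Myr)
lambda10 = log(2) / 1.387e6;    % 10Be decay constant, 1/yr (half-life ~1.387 Myr)

R0 = 6.75;                      % assumed surface production ratio 26Al/10Be
R  = 4.8;                       % assumed measured ratio in the buried sample

% R(t) = R0 * exp(-(lambda26 - lambda10) * t)  =>  solve for burial time t
t_burial = log(R0 / R) / (lambda26 - lambda10);
fprintf('Minimum burial age: %.0f kyr\n', t_burial / 1e3);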

Findings are “scary”

“If we had found a much older age, it would have been impressive, but it might not have been as scary,” Christ told the Washington Post. “Because what we found means the ice sheet melted away and raised sea level within a climate system kind of like ours. That, as a climate scientist, has more gravity.”

C-Band 5G Telecom Delays and Airline Frustration
https://blogs.mathworks.com/headlines/2022/01/22/c-band-5g-telecom-delays-and-airline-frustration/
Posted Sat, 22 Jan 2022
The airlines made their case to the FAA earlier this week, warning that the rollout of 5G service near airports could cause catastrophic disruption to both passenger flights and cargo shipments. “Unless our major hubs are cleared to fly, the vast majority of the traveling and the shipping public will essentially be grounded,” wrote the chief executives of American Airlines, Delta Air Lines, United Airlines, and Southwest Airlines, among others, in a letter first reported by Reuters.

Image Credit: Toronto Star

 

The US is not alone in this struggle. Last October, Canadians living near airports were informed that they will not get full 5G service. According to the Toronto Star, “The government said it is introducing the restrictions because there are concerns about possible interference between those airwaves — which are known as spectrum and carry wireless communications signals — and certain aviation navigation tools.”

5G restrictions near Toronto Pearson International Airport. Image Credit: Toronto Star

 

This post was written by guest bloggers Mike Rudolph, Industry Manager for Aerospace and Defense, Steve Ajemian, Technical Account Manager for our Aerospace and Defense Accounts, and Rick Gentile and Babak Memarzadeh, Product Managers for Radar Toolbox and Sensor Fusion and Tracking Toolbox. Together, this team tracks and supports technical trends throughout the industry to enable engineers and scientists to better collaborate across disciplines such as Radar and 5G.

Analyzing for Interference

According to the FAA, the “deployments of 5G technology in other countries often involve different conditions than those proposed for the U.S., including lower power levels, antennas tilted downward to reduce potential interference to flights, different placement of antennas relative to airfields, and frequencies with a different proximity to frequencies used by aviation equipment.”

Image Credit: FAA

In the US, the concerns about possible interference stem from the region of spectrum within C-band being utilized for 5G communications. The 3.7 – 3.98 GHz band has been purchased by the wireless carriers AT&T and Verizon. This spectrum is in close proximity to the operating frequencies of radar altimeters, which reside in the 4.2 – 4.4 GHz band. Radar altimeters are used by aircraft to measure altitude relative to the local terrain during flight.

While the wireless industry has referenced successful 5G rollouts in other countries, there has not been a comprehensive study of the possible impacts to radar altimeters due to 5G systems operating in such close proximity within C-band.

According to NPR, “[…] former Federal Aviation Administration Administrator Michael Huerta, who served in the role from 2010-2018, points out the 5G towers near airports in other countries are either turned off or operating at low power near airports, with transmitters pointing down toward the ground and away from aircraft.”

“What really needs to happen is the very detailed technical analysis, airport by airport, aircraft type by aircraft type, to determine how real the interference potential actually is,” Huerta told NPR.

 

Defining the Problem Space

The concern of interference from 5G systems near airports is magnified by the fact that 5G base stations utilize directional antennas that can focus transmissions in very specific directions. These base stations also radiate at elevated heights to allow these transmissions to propagate over greater distances.

From the perspective of the radar altimeter, the primary concern with systems operating in close frequency proximity is due to “spurious emissions.” Spurious emissions are undesired out-of-band emissions into nearby frequencies that occur during transmission. These emissions can degrade the performance of the receiver operating nearby (the radar altimeter in this case) due to RF energy leaking into this band, which could prevent an accurate determination of flight altitude. Characterizing the level of interference observed at the radar altimeter is critical to understanding what performance degradation could be experienced.

Lastly, these scenarios involve aircraft taking off and landing, with the position and altitude changing over time. Since the distance relative to the 5G base station will be varying, this adds additional complexity to the analysis.

 

Modeling this scenario in MATLAB

With MATLAB, you can build a high-fidelity model of this scenario and analyze the potential for interference from 5G transmissions to radar altimeters. The scenario can be decomposed into three major parts: 5G waveform generation and transmission, radar altimeter modeling, and scenario modeling.

  • 5G Waveform Generation and Transmission

Using the 5G Toolbox, users can generate a range of standard-compliant 5G waveforms to investigate the out-of-band emissions when transmitting in C-band (3.7 – 3.98 GHz). Custom waveforms can also be created and generated targeting the bands of interest for this type of analysis.
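As a starting point, a standard-compliant downlink waveform can be generated with a few lines of 5G Toolbox code and inspected in the frequency domain. The sketch below uses the toolbox's default downlink carrier configuration and a plain windowed FFT for the spectrum, so the exact bandwidth and numerology are whatever the defaults provide; a real emissions study would configure the carrier for the 3.7 – 3.98 GHz deployment under test.

% Minimal sketch: generate a default 5G NR downlink waveform (5G Toolbox)
% and look at its spectrum, e.g., to inspect out-of-band energy.
cfg = nrDLCarrierConfig;                      % default downlink carrier configuration
[waveform, info] = nrWaveformGenerator(cfg);  % info contents vary by release

x = waveform(:, 1);                           % first transmit antenna
S = 20*log10(abs(fftshift(fft(x .* hann(numel(x))))));   % windowed spectrum, dB

f = linspace(-0.5, 0.5, numel(S));            % normalized frequency (cycles/sample)
plot(f, S - max(S))                           % normalize the peak to 0 dB
xlabel('Normalized frequency'), ylabel('Relative power (dB)')
title('Default NR downlink waveform spectrum (illustrative)')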

5G Waveform Generator App

Phased Array System Toolbox enables users to model a representative antenna array for a 5G base station. These base stations utilize adaptive beam steering to dynamically point antenna beams towards user equipment. Modeling the 5G waveform and the radiated power together can provide a representative interference source to characterize the out-of-band emissions as they propagate over the air.

Phased Array Beamforming

 

  • Radar Altimeter Modeling

Using the Radar Toolbox, you can model the radar altimeter and assess performance with and without 5G interference present. You can also model cases where interferers are present, such as a 5G base station transmitting nearby. You can model the radar altimeter to generate the received in-phase and quadrature (IQ) signals. By applying signal processing algorithms to the received IQ signal, you can assess the effect of interference on the output of the radar altimeter.

  • Scenario Modeling

Aircraft performing a takeoff or landing maneuver can be readily modeled using Radar Toolbox. Modeling the aircraft flying relative to the 5G Base Station enables the characterization of a range of interference conditions experienced over the course of the flight trajectory while accounting for the RF propagation of the 5G transmissions.

High-Fidelity Modeling

Model radar air traffic systems and radar altimeters

With MATLAB, you can model complex wireless and radar scenarios. These models can be used to characterize the effects of interference and inform decision-making on appropriate mitigations that allow these types of systems to coexist. If your team is interested in a demo of these capabilities, please reach out to speak with one of our industry experts.


When the charging station is (only) 93 million miles away
https://blogs.mathworks.com/headlines/2021/07/29/when-the-charging-station-is-only-93-million-miles-away/
Posted Thu, 29 Jul 2021
Lightyear has had a great month. First, the company announced that its prototype car drove over 440 miles (710 km) on a single charge. And that’s on a battery charge of just 60 kWh.

The Lightyear One prototype zipped around the test track with a little help from the big charging station in the sky, a power source that is, on average, 93 million miles (149.6 million km) from Earth.

Lightyear One Prototype on the test track. (Image credit: Lightyear.)

“440 miles is a big deal, but the bigger deal is that the Lightyear One did it [with] a relatively small battery pack,” Motortrend reported. “’So what?’ you might be thinking. Lucid and Tesla both have high-dollar, high-range EVs that can crack the 400 mile mark. That’s true, but the Lightyear One did what Tesla and Lucid can do but with a much smaller battery pack.”

And, as Motortrend noted, Lightyear is targeting delivery to its first customers in 2022. This brings us to the second splash of news: Lightyear has partnered with a manufacturer to build its solar-powered car. According to The Verge, Lightyear is working with Valmet Automotive, a manufacturer that has built cars for Mercedes-Benz, Saab, and Porsche.

Aerodynamics vs. solar panel surface tradeoff

The key to the range of the Lightyear One has been maximizing solar charging while minimizing air drag.

Lightyear One’s aerodynamic shape adds to the car’s high efficiency. It’s long with a slope that resembles the cross-section of an airplane wing. Other decisions, such as adding wheel covers to the rear tires and replacing side mirrors with cameras, squeezed out even more efficiency. Wind tunnel tests showed that Lightyear One broke the record for being the most aerodynamic five-seater electric car to date.

When considering the car’s roof, they needed to find the right balance between a completely flat, wide shape that would maximize sunlight exposure, and one with curves that would reduce wind resistance. By running simulations in Simulink, they were able to scrutinize these kinds of design tweaks and see where the best tradeoffs were to improve efficiency.
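Lightyear's own models aren't public, but the tradeoff described above can be framed as a simple energy balance: drag losses grow with the drag-area product, while solar gain grows with panel area. The sketch below is a rough, first-order estimate with assumed numbers, not Lightyear's simulation.

% First-order energy balance for a solar EV at steady highway speed.
% All numbers are rough assumptions for illustration, not Lightyear data.
rho = 1.2;              % air density, kg/m^3
v   = 100/3.6;          % cruise speed: 100 km/h in m/s
CdA = 0.25;             % assumed drag-area product, m^2
Crr = 0.008; m = 1500; g = 9.81;   % rolling resistance and mass assumptions

P_drag  = 0.5 * rho * CdA * v^3;         % aerodynamic drag power, W
P_roll  = Crr * m * g * v;               % rolling resistance power, W
P_drive = (P_drag + P_roll) / 0.9;       % assume ~90% drivetrain efficiency

A_solar = 5;            % m^2 of cells (assumption)
eff     = 0.22;         % cell efficiency (assumption)
G       = 800;          % W/m^2 irradiance on a sunny day (assumption)
P_solar = A_solar * eff * G;

fprintf('Drive power ~%.0f W, solar input ~%.0f W (%.0f%% offset while driving)\n', ...
        P_drive, P_solar, 100*P_solar/P_drive);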

Image credit: Lightyear

From a student competition to a road-ready car

To see how Lightyear got its start with a student competition team, check out this article.
