<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Behind the Headlines</title>
	<atom:link href="https://blogs.mathworks.com/headlines/feed/" rel="self" type="application/rss+xml" />
	<link>https://blogs.mathworks.com/headlines</link>
	<description>Lisa Harvey discusses how MATLAB and Simulink are connected to today’s news stories around the world.</description>
	<lastBuildDate>Tue, 04 Nov 2025 16:50:46 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.1.1</generator>
	<item>
		<title>Understanding pig emotions to improve animal welfare</title>
		<link>https://blogs.mathworks.com/headlines/2025/10/30/understanding-pig-emotions-to-improve-animal-welfare/?s_tid=feedtopost</link>
					<comments>https://blogs.mathworks.com/headlines/2025/10/30/understanding-pig-emotions-to-improve-animal-welfare/#respond</comments>
		
		<dc:creator><![CDATA[Lisa Harvey]]></dc:creator>
		<pubDate>Thu, 30 Oct 2025 14:06:27 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://blogs.mathworks.com/headlines/?p=4781</guid>

					<description><![CDATA[<div class="overview-image"><img  class="img-responsive" src="https://blogs.mathworks.com/headlines/files/2025/10/Pig-on-farm-1024x683.jpg" onError="this.style.display ='none';" /></div><p>Lately, artificial intelligence (AI) is being sold as the panacea for problem-solving. Can’t decide where to live? Ask AI. Which restaurant should you go to? Ask AI. Need lottery picks? Again, ask... <a class="read-more" href="https://blogs.mathworks.com/headlines/2025/10/30/understanding-pig-emotions-to-improve-animal-welfare/">read more >></a></p>]]></description>
										<content:encoded><![CDATA[<p>Lately, artificial intelligence (AI) is being sold as the panacea for problem-solving. Can’t decide where to live? Ask AI. Which restaurant should you go to? Ask AI. Need lottery picks? Again, ask AI. (Yes, it <a href="https://www.cbsnews.com/detroit/news/wyandotte-woman-wins-powerball-prize-with-chagpt/" target="_blank" rel="noopener">reportedly worked!</a>) There seems to be no shortage of ways AI can improve your life.</p>
<p>Artificial intelligence isn’t just transforming human lives; it’s making a real difference for animals as well. Today, AI-powered facial recognition helps reunite lost pets with their families, while smart monitoring systems keep wildlife and drivers safer at busy road crossings.</p>
<p>AI is also capable of improving the lives of farm animals. An international team of scientists developed an AI algorithm that decodes pig vocalizations to detect emotions, helping farmers identify when pigs are happy or stressed.</p>
<p>&nbsp;</p>
<p><div id="attachment_4784" style="width: 580px" class="wp-caption alignnone"><img aria-describedby="caption-attachment-4784" decoding="async" loading="lazy" class="wp-image-4784" src="https://blogs.mathworks.com/headlines/files/2025/10/Pig-on-farm-1024x683.jpg" alt="A close-up of a pig standing in a grassy outdoor pasture, looking directly at the camera. In the background, other pigs roam freely under a partly cloudy sky." width="570" height="380" /><p id="caption-attachment-4784" class="wp-caption-text">Pigs at a free-range farm.</p></div></p>
<p>&nbsp;</p>
<p>According to <a href="https://www.reuters.com/technology/artificial-intelligence/ai-decodes-oinks-grunts-keep-pigs-happy-2024-10-24/" target="_blank" rel="noopener">Reuters</a>, “European scientists have developed an AI algorithm capable of interpreting pig sounds, aiming to create a tool that can help farmers improve animal welfare.”</p>
<h2>The study</h2>
<p>By recording an impressive 7,414 sounds from 411 pigs as they played, competed for food, or spent time alone, the scientists were able to analyze which oinks and squeals signaled positive emotions, and which ones revealed stress or negative emotions.</p>
<p>First, the calls were annotated with the pig’s experience at the time of the recording, or context. Were they playing or being fed? Were they isolated from the other animals? Next, the contexts were classified by emotional valence at the time a vocalization was produced: either negative, such as isolation or competing for space at the trough, or positive, such as social contact, nursing, or playing.</p>
<p>Then, the researchers extracted acoustic features from each call, including duration and frequency, using Audio Toolbox and Signal Analyzer. They then created a spectrogram in MATLAB for each call.</p>
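<p>The researchers did this extraction in MATLAB, but the core operation is easy to sketch in plain Python: take a discrete Fourier transform of the samples and read off features such as duration and peak frequency. Everything below (the synthetic 800 Hz &#8220;call&#8221;, the sample rate, the function name) is illustrative, not the study&#8217;s code.</p>

```python
import math
import cmath

def peak_frequency(samples, fs):
    """Return the dominant frequency in Hz via a brute-force DFT.

    Illustrative stand-in for the spectral features the researchers
    extracted with MATLAB tooling; not their implementation.
    """
    n = len(samples)
    mags = []
    for k in range(n // 2):  # non-negative frequency bins only
        s = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        mags.append(abs(s))
    k_peak = max(range(len(mags)), key=mags.__getitem__)
    return k_peak * fs / n

fs = 8000  # sample rate in Hz (hypothetical)
f0 = 800   # pitch of the synthetic "call" in Hz
samples = [math.sin(2 * math.pi * f0 * t / fs) for t in range(400)]

duration = len(samples) / fs
print(duration, peak_frequency(samples, fs))  # → 0.05 800.0
```

<p>A real pipeline would window the signal and stack many such spectra into a spectrogram, the image representation that a CNN can consume.</p>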
<p>Using the acoustical data, the researchers tested and compared two classification methods: Permuted Discriminant Function Analysis (pDFA), a traditional statistical approach using the acoustic features, and AI in the form of a convolutional neural network (CNN).</p>
<h2>AI outperforms the traditional statistical approach</h2>
<p>“As an input to the neural network, spectrograms were computed from the pig vocalization audio recordings in MATLAB R2020b,” <a href="https://www.nature.com/articles/s41598-022-07174-8#Sec1">the study</a> explained.</p>
<p>To bring this idea to life, the research team used tools from MathWorks. Assistant Professor Jeppe Have Rasmussen from the University of Copenhagen implemented the technical approach, combining Computer Vision Toolbox for feature detection and Deep Learning Toolbox for training the AI models.</p>
<p>The backbone of the system was a ResNet-50 convolutional neural network, adapted through transfer learning. This choice allowed the researchers to build on a proven architecture while tailoring it to the unique challenge of interpreting pig vocalizations.</p>
<p>Two separate neural networks were trained for distinct tasks:</p>
<ul>
<li><strong>Valence Classification</strong>: Determining whether a spectrogram represented a positive or negative emotional state.</li>
<li><strong>Context Classification</strong>: Identifying the specific context in which the vocalization occurred, such as feeding, isolation, or social interaction.</li>
</ul>
<p>By training these models independently, the team ensured that each network specialized in its respective application, leading to more accurate predictions.</p>
<p><div style="width: 695px" class="wp-caption alignnone"><a href="https://media.springernature.com/lw685/springer-static/image/art%3A10.1038%2Fs41598-022-07174-8/MediaObjects/41598_2022_7174_Fig2_HTML.png" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="size-large" src="https://media.springernature.com/lw685/springer-static/image/art%3A10.1038%2Fs41598-022-07174-8/MediaObjects/41598_2022_7174_Fig2_HTML.png" alt="Two t-SNE plots showing clusters of pig vocalization spectrograms. Panel (a) separates calls by emotional valence: green for positive, red for negative. Panel (b) shows clusters by context, with multiple colors representing conditions like isolation, nursing, castration, and enrichment. A legend identifies each context category." width="685" height="341" /></a><p class="wp-caption-text">Classification of calls to the valence and context of production based on t-SNE. t-SNE embedding of (a) valence and (b) context classifying neural network’s last fully connected layer activations for each spectrogram. Triangles indicate negative valence vocalizations, while circles indicate positive ones. Image credit: Briefer et al. via <a href="https://www.nature.com/articles/s41598-022-07174-8" target="_blank" rel="noopener">Nature</a>.</p></div></p>
<p>&nbsp;</p>
<p>The researchers used t-SNE (t-distributed Stochastic Neighbor Embedding), a technique that reduces complex, high-dimensional data into two dimensions for visualization. In the plots above, each dot represents a vocalization. Panel (a) shows clusters based on emotional valence with green indicating positive and red indicating negative. Panel (b) reveals how calls are grouped by context, such as isolation or nursing. These clusters demonstrate that the AI model successfully learned patterns linking sound features to emotional states and situations.</p>
<p>Their algorithm could identify both the emotion and the pig&#8217;s activity from its calls alone. The neural networks significantly outperformed the pDFA: the CNNs classified valence with 92% accuracy, compared to 62% for the pDFA classifications. The results for context, the pig&#8217;s activity at the time of the call, were also impressive: the neural network was correct 82% of the time, compared to just 19% accuracy for the pDFA.</p>
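<p>Each accuracy figure is simply the fraction of calls classified correctly. A minimal sketch, using a hypothetical 2&#215;2 valence confusion matrix chosen only so the numbers reproduce the CNN&#8217;s 92% figure (the study&#8217;s actual counts differ):</p>

```python
def accuracy(confusion):
    """Overall accuracy = trace / total for a square confusion matrix."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# Hypothetical counts: rows are true valence (positive, negative),
# columns are predicted valence. Chosen to give 92%, not taken
# from the paper.
cnn_valence = [[46, 4],
               [4, 46]]
print(accuracy(cnn_valence))  # → 0.92
```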
<h2>AI to improve animal welfare</h2>
<p>These findings confirm that pig vocalizations are reliable indicators of emotional states, paving the way for automated emotion-monitoring tools to improve animal welfare. Neural network analysis revealed that pigs raised outdoors or in free-range systems produce fewer stress calls than those on conventional farms. This technology could even lead to consumer-facing apps that label farms based on welfare standards.</p>
<p>Beyond transparency, the benefits are clear:</p>
<ul>
<li><strong>Improve animal welfare</strong> by responding quickly to signs of distress.</li>
<li><strong>Reduce stress-related health issues</strong>, lowering veterinary costs and boosting productivity.</li>
<li><strong>Advance precision livestock farming</strong> with AI-driven insights for smarter care.</li>
<li><strong>Support ethical farming practices</strong>, enhancing accountability and consumer trust.</li>
</ul>
<p>&nbsp;</p>
<p>So next time you hear a pig squeal, remember, there might be an international AI team ready to translate its mood!</p>
<p>To read the full research paper, see <a href="https://doi.org/10.1038/s41598-022-07174-8" target="_blank" rel="noopener">doi.org/10.1038/s41598-022-07174-8</a>.</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blogs.mathworks.com/headlines/2025/10/30/understanding-pig-emotions-to-improve-animal-welfare/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Only five people in the world have seen this color</title>
		<link>https://blogs.mathworks.com/headlines/2025/08/11/only-five-people-in-the-world-have-seen-this-color/?s_tid=feedtopost</link>
					<comments>https://blogs.mathworks.com/headlines/2025/08/11/only-five-people-in-the-world-have-seen-this-color/#respond</comments>
		
		<dc:creator><![CDATA[Lisa Harvey]]></dc:creator>
		<pubDate>Mon, 11 Aug 2025 18:39:41 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://blogs.mathworks.com/headlines/?p=4679</guid>

					<description><![CDATA[<div class="overview-image"><img  class="img-responsive" src="https://blogs.mathworks.com/headlines/files/2025/08/olo1.jpg" onError="this.style.display ='none';" /></div><p>Researchers at UC Berkeley have discovered a new color outside the range of human color vision. Only five people in the world have seen this new color. They call the color... <a class="read-more" href="https://blogs.mathworks.com/headlines/2025/08/11/only-five-people-in-the-world-have-seen-this-color/">read more >></a></p>]]></description>
										<content:encoded><![CDATA[<p>Researchers at UC Berkeley have discovered a new color outside the range of human color vision. Only five people in the world have seen this new color. They call the color &#8220;olo&#8221;.</p>
<p>&nbsp;</p>
<p><img decoding="async" loading="lazy" class="wp-image-4691 alignnone" src="https://blogs.mathworks.com/headlines/files/2025/08/olo1.jpg" alt="" width="389" height="222" /></p>
<p>&nbsp;</p>
<p>According to <em><a href="https://www.theatlantic.com/science/archive/2025/04/olo-color-berkeley-teal/682557/" target="_blank" rel="noopener">The Atlantic</a></em>, “The color ‘olo’ can’t be found on a Pantone color chart. It can be experienced only in a cramped 9-by-13 room in Northern California. That small space, in a lab on the UC Berkeley campus, contains a large contraption of lenses and other hardware on a table. To see ‘olo’, you need to scootch up to the table, chomp down on a bite plate, and keep your head as steady as you can.”</p>
<p>&nbsp;</p>
<p><div style="width: 460px" class="wp-caption alignnone"><a href="https://news.berkeley.edu/wp-content/uploads/2025/04/PXL_20250417_173111112-1024x576.jpg" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="" src="https://news.berkeley.edu/wp-content/uploads/2025/04/PXL_20250417_173111112-1024x576.jpg" alt="The foreground shows lab equipment on a table. In the background, a man has his chin on a chinrest and is looking directly at the camera. There is a black cylinder in front of his left eye, which he is looking through. " width="450" height="253" /></a><p class="wp-caption-text">Austin Roorda, a professor of optometry and vision science at UC Berkeley, demonstrates what it looks like to be part of the Oz experiment. Image credit: Austin Roorda, University of California, Berkeley.</p></div></p>
<p>&nbsp;</p>
<p>Why the cramped room and complicated setup, you ask? To see this new color, your retina must be targeted by a laser with precise accuracy. The researchers created a special laser instrument, named Oz, that delivers light to only specific individual cells in your retina.</p>
<p>Here’s the closest approximation of the new color that we can see without the Oz system. It’s blue-green with unprecedented saturation.</p>
<p>&nbsp;</p>
<p><div style="width: 220px" class="wp-caption alignnone"><a href="https://ichef.bbci.co.uk/news/1536/cpsprodpb/951e/live/a3828620-1de2-11f0-80b3-83959215671c.png.webp" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="" src="https://ichef.bbci.co.uk/news/1536/cpsprodpb/951e/live/a3828620-1de2-11f0-80b3-83959215671c.png.webp" alt="A square of solid color in the blue-green range. " width="210" height="210" /></a><p class="wp-caption-text">The closest approximation of the new color. Image credit: BBC.</p></div></p>
<p>&nbsp;</p>
<h1>The Oz System</h1>
<p>Your eyes see color using cone cells—L (long), M (medium), and S (short)—each sensitive to different wavelengths of light. Normally, these cones work together, blending inputs to create every color you’ve ever seen.</p>
<p>First, the researchers mapped a part of the retina to identify each cone cell as an S, M, or L cone. Oz then delivers light cell by cell to individual photoreceptors, using precisely targeted laser pulses to stimulate only the M cones while avoiding the L and S cones.</p>
<p>&nbsp;</p>
<p><div style="width: 460px" class="wp-caption alignnone"><a href="https://news.berkeley.edu/wp-content/uploads/2025/04/fig1D.jpg" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="" src="https://news.berkeley.edu/wp-content/uploads/2025/04/fig1D.jpg" alt="There are six columns separated into sets of two. Each column comprises 5 images in squares. The first two columns show ten images the Oz system wants to convey. The next two columns show the corresponding cone activations needed to " width="450" height="381" /></a><p class="wp-caption-text">The Oz software takes a color image (left column) and calculates which cone cells in the retina need to be activated for a person to see that image (center). It then calculates the pattern of laser microdoses that need to be delivered to the retina to activate those cones (right). Image credit: University of California, Berkeley.</p></div></p>
<p>&nbsp;</p>
<p>This bypassed the usual overlapping input that your brain uses to construct color. By isolating the M cones, the brain receives an input combination that never naturally occurs, resulting in the perception of a color outside our typical visual spectrum.</p>
<p>&#8220;Olo&#8221; is named for its theoretical <a href="https://en.wikipedia.org/wiki/LMS_color_space" target="_blank" rel="noopener">LMS color space</a> coordinates for long-, medium-, shortwave cones: (0, 1, 0), representing the stimulation of only the green (M) cone in the human eye, with no stimulation of the red (L) or blue (S) cones.</p>
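<p>Why can&#8217;t an ordinary display or light source show you &#8220;olo&#8221;? Any physical light excites the overlapping cones together, so producing the response (0, 1, 0) would require negative light intensities. The toy calculation below makes that concrete with a made-up cone-response matrix; the numbers are invented for illustration, not real physiological data.</p>

```python
from fractions import Fraction as F

def solve3(A, b):
    """Solve a 3x3 linear system exactly using Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(A)
    xs = []
    for i in range(3):
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][i] = b[r]
        xs.append(det(Ai) / d)
    return xs

# Hypothetical cone-response matrix: rows are the L, M, S cone
# responses to three physical primaries (columns). The L/M overlap
# mimics the real overlap of L and M sensitivities; values invented.
A = [[F(6, 10), F(4, 10), F(0)],
     [F(4, 10), F(6, 10), F(1, 10)],
     [F(0),     F(1, 10), F(9, 10)]]

# Target: pure M-cone activation, the (0, 1, 0) LMS coordinate of "olo".
intensities = solve3(A, [F(0), F(1), F(0)])
print([float(x) for x in intensities])
# two of the three required primary intensities come out negative,
# i.e. no physical light can produce this cone response
```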
<p>&nbsp;</p>
<p><img decoding="async" loading="lazy" class="alignnone wp-image-4724 " src="https://blogs.mathworks.com/headlines/files/2025/08/new-olo.jpg" alt="A square that is bright blue-green. There are two lines of text. the top line is &quot;olo&quot; and the bottom line is &quot;010&quot;." width="211" height="168" /></p>
<p>&nbsp;</p>
<p>Their research was published in <em><a href="https://www.science.org/doi/full/10.1126/sciadv.adu1052">Science Advances</a></em>.</p>
<h1>Proving “olo” is outside normal vision</h1>
<p>To test whether &#8220;olo&#8221; was truly beyond normal human color vision, researchers had participants compare it to a teal laser and adjust its saturation using white light. The researchers controlled the display on the RGB projector with <a href="https://www.mathworks.com/matlabcentral/fileexchange/76411-psychtoolbox-3" target="_blank" rel="noopener">Psychtoolbox</a>, a MATLAB community toolbox. When participants added white light to desaturate &#8220;olo&#8221;, it matched the laser—confirming &#8220;olo&#8221; exists outside the typical human visual range.</p>
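<p>The matching logic can be mimicked with a toy additive-mixing model: adding white light to a saturated color and renormalizing lowers its saturation. This sketch works in RGB/HSV space for simplicity (the experiment operated on cone stimulation, and all values here are invented):</p>

```python
import colorsys

def desaturate(rgb, white_amount):
    """Additively mix white light into an RGB color, then renormalize."""
    mixed = [c + white_amount for c in rgb]
    peak = max(mixed)
    return [c / peak for c in mixed]

teal = (0.0, 0.8, 0.6)  # a saturated blue-green, stand-in for "olo"
for w in (0.0, 0.3, 0.6):
    r, g, b = desaturate(teal, w)
    sat = colorsys.rgb_to_hsv(r, g, b)[1]
    print(f"white={w:.1f}  saturation={sat:.2f}")
```

<p>Saturation falls monotonically as white is added, which is the knob participants turned until the color matched the teal laser.</p>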
<p>The study advances our understanding of color vision. The researchers hope to use this technique to further research color blindness.</p>
<p>To read the full research paper, see <a href="https://doi.org/10.1126/sciadv.adu1052">10.1126/sciadv.adu1052</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blogs.mathworks.com/headlines/2025/08/11/only-five-people-in-the-world-have-seen-this-color/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Can AI create better wireless chip designs than humans?</title>
		<link>https://blogs.mathworks.com/headlines/2025/04/24/can-ai-create-better-wireless-chip-designs-than-humans/?s_tid=feedtopost</link>
					<comments>https://blogs.mathworks.com/headlines/2025/04/24/can-ai-create-better-wireless-chip-designs-than-humans/#respond</comments>
		
		<dc:creator><![CDATA[Lisa Harvey]]></dc:creator>
		<pubDate>Thu, 24 Apr 2025 14:34:51 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://blogs.mathworks.com/headlines/?p=4613</guid>

					<description><![CDATA[<div class="overview-image"><img  class="img-responsive" src="https://engineering.princeton.edu/wp-content/uploads/2025/01/sengupta-lap-chip-1536x864.jpg" onError="this.style.display ='none';" /></div><p>Can AI really design wireless chips better than humans? According to a newly published study in the journal Nature Communications, the answer is yes. Better, faster, and next to impossible for humans... <a class="read-more" href="https://blogs.mathworks.com/headlines/2025/04/24/can-ai-create-better-wireless-chip-designs-than-humans/">read more >></a></p>]]></description>
										<content:encoded><![CDATA[<p>Can AI <strong><em>really</em></strong> design wireless chips better than humans? According to a newly published study in the journal <a href="https://www.nature.com/articles/s41467-024-54178-1" target="_blank" rel="noopener">Nature Communications</a>, the answer is yes. Better, faster, and next to impossible for humans to understand.</p>
<p>&nbsp;</p>
<p><div style="width: 540px" class="wp-caption aligncenter"><a href="https://engineering.princeton.edu/wp-content/uploads/2025/01/sengupta-lap-chip-1536x864.jpg" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="" src="https://engineering.princeton.edu/wp-content/uploads/2025/01/sengupta-lap-chip-1536x864.jpg" alt="A close-up of a chip designed with the AI described in the study." width="530" height="298" /></a><p class="wp-caption-text">A close-up of a chip designed with the AI described in the study. Image credit: Princeton University.</p></div></p>
<p>&nbsp;</p>
<p>According to a report by Princeton Engineering, “Specialized microchips that manage signals at the cutting edge of wireless technology are astounding works of miniaturization and engineering. They’re also difficult and expensive to design.” These microchips were the focus of the study. The new technique enables the rapid synthesis of complex architectures in minutes, unlike traditional algorithms that take weeks. The study even found that sometimes, this innovative method generates structures that are otherwise impossible to create with existing approaches.</p>
<p>“Classical designs carefully put these circuits and electromagnetic elements together, piece by piece, so that the signal flows in the way we want it to flow in the chip. By changing those structures, we incorporate new properties,” Professor Kaushik Sengupta, lead researcher and co-director of <a href="https://nextg.princeton.edu/" target="_blank" rel="noopener">NextG</a>, Princeton’s industry partnership program for next-generation communications, said. “Before, we had a finite way of doing this, but now the options are much larger.”</p>
<h1>Deep learning</h1>
<p>Researchers from Princeton University and the Indian Institute of Technology Madras employed deep learning-based models using an inverse design approach with arbitrary-shaped electromagnetic structures such as antennas, filters, splitters, and switches. The result was transformative: within minutes, the AI produced designs with novel structures.</p>
<p>A deep convolutional neural network (CNN) was used as an electromagnetic (EM) emulator. The input is the arbitrary geometry with arbitrary port placements, and the output is the predicted multi-port scattering and radiative properties.</p>
<p>&nbsp;</p>
<p><div style="width: 553px" class="wp-caption aligncenter"><a href="https://media.springernature.com/lw685/springer-static/image/art%3A10.1038%2Fs41467-024-54178-1/MediaObjects/41467_2024_54178_Fig1_HTML.png?as=webp" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="" src="https://media.springernature.com/lw685/springer-static/image/art%3A10.1038%2Fs41467-024-54178-1/MediaObjects/41467_2024_54178_Fig1_HTML.png?as=webp" width="543" height="570" /></a><p class="wp-caption-text">a) The proposed approach for chip synthesis with inverse-designed arbitrary-shaped multi-port radiative and non-radiative structures co-designed with circuits. b) Inverse-designed integrated multi-port millimeter-wave passive structures and end-to-end mm-Wave amplifier circuit chip with co-design between multi-port passive and active circuitry. The chips are fabricated in industry standard 90-nm BiCMOS foundry. c) Inverse synthesis of arbitrary multi-port electromagnetic structures with desired scattering and radiating properties, enabled through a deep-learning based forward electromagnetic emulator. The latter takes the image of the structure and predicts accurately its multi-port scattering and radiating properties across frequencies in the space of arbitrary-shaped planar structures. Image credit: Nature.</p></div></p>
<p>&nbsp;</p>
<p><a href="https://www.mathworks.com/products/rftoolbox.html">RF Toolbox</a> and <a href="https://www.mathworks.com/products/antenna.html">Antenna Toolbox</a> were used extensively to design and simulate the electromagnetic properties of the structures and circuits. These simulations generated datasets used to train the deep neural network model to predict the electromagnetic properties of the designs. <a href="https://www.mathworks.com/products/parallel-computing.html">Parallel Computing Toolbox</a> accelerated these RF and EM simulations, with workloads offloaded to the HPC cluster at the university.</p>
<p>The CNN was developed with <a href="https://www.mathworks.com/products/deep-learning.html">Deep Learning Toolbox</a>, which was used to design and train the deep neural networks that predict the scattering and radiative properties of the arbitrarily shaped electromagnetic structures. Together with Parallel Computing Toolbox, it also handled training and inference on the university’s local GPUs.</p>
<p>In the study, the researchers state, “Once trained, the synthesis achieves the target specifications within minutes.”</p>
<h1>AI thinks differently from humans</h1>
<p>Chip designers often spend years perfecting their craft. The thought process is typically linear, based on pre-selected templates of EM structures. The resulting topologies are often handcrafted layouts based on extensive training and experience. The AI doesn’t think in the same linear manner, instead developing arbitrarily shaped structures based on the desired EM characteristics and functionality.</p>
<p>“We are coming up with structures that are complex and look random shaped, and when connected with circuits, they create previously unachievable performance. Humans cannot really understand them, but they can work better,” said Sengupta.</p>
<p>Uday Khankhoje, a co-author and associate professor of electrical engineering at Indian Institute of Technology Madras, said the new technique not only delivers efficiency but promises to unlock new approaches to design challenges that have been beyond the capability of engineers.</p>
<p>“This work presents a compelling vision of the future,” he said. “AI powers not just the acceleration of time-consuming electromagnetic simulations, but also enables exploration into a hitherto unexplored design space and delivers stunning high-performance devices that run counter to the usual rules of thumb and human intuition.”</p>
<p>&nbsp;</p>
<p><div style="width: 583px" class="wp-caption aligncenter"><a href="https://engineering.princeton.edu/wp-content/uploads/2025/01/16X9-SenguptaLab_111224_0020-1536x864.jpg"><img decoding="async" loading="lazy" class="" src="https://engineering.princeton.edu/wp-content/uploads/2025/01/16X9-SenguptaLab_111224_0020-1536x864.jpg" width="573" height="322" /></a><p class="wp-caption-text">An enlarged image of the chip’s circuitry in Sengupta’s lab at Princeton, with Professor Kaushik Sengupta, left, and first author Emir Ali Karahan, a graduate student in electrical and computer engineering. Image credit: Princeton University.</p></div></p>
<p>&nbsp;</p>
<p>The increasing complexity and demands of next-generation wireless systems necessitate new design paradigms for RF and EM structures. The research presented opens new avenues in this direction and will help designers meet stringent requirements on the size and performance of these devices.</p>
<p>According to <a href="https://www.popularmechanics.com/science/a63606123/ai-designed-computer-chips/" target="_blank" rel="noopener">Popular Mechanics</a>, “The right algorithm, they say, could suggest new paradigms in a matter of minutes. From there, engineers could use these paradigms as innovative starting points for their own ideas.”</p>
<h1>Removing the need for human designers?</h1>
<p>Will this AI remove the need for human designers? Not likely, according to the researchers. The goal is to augment human designs with options that have not yet been considered. Still, human oversight is needed to ensure that the AI doesn’t create faulty or inefficient arrangements, and to avoid AI hallucinations that could introduce elements that don’t work at all.</p>
<p>“There are pitfalls that still require human designers to correct,” Sengupta said. “The point is not to replace human designers with tools. The point is to enhance productivity with new tools. The human mind is best utilized to create or invent new things, and the more mundane, utilitarian work can be offloaded to these tools.”</p>
<p>To read the full research paper, see <a href="https://doi.org/10.1038/s41467-024-54178-1" target="_blank" rel="noopener">DOI 10.1038/s41467-024-54178-1.</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://blogs.mathworks.com/headlines/2025/04/24/can-ai-create-better-wireless-chip-designs-than-humans/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Chatty little marmosets call each other by name</title>
		<link>https://blogs.mathworks.com/headlines/2025/02/06/chatty-little-marmosets-call-each-other-by-name/?s_tid=feedtopost</link>
					<comments>https://blogs.mathworks.com/headlines/2025/02/06/chatty-little-marmosets-call-each-other-by-name/#respond</comments>
		
		<dc:creator><![CDATA[Lisa Harvey]]></dc:creator>
		<pubDate>Thu, 06 Feb 2025 21:47:40 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://blogs.mathworks.com/headlines/?p=4568</guid>

					<description><![CDATA[<div class="overview-image"><img  class="img-responsive" src="https://blogs.mathworks.com/headlines/files/2025/02/GettyImages-154534932-1024x680.jpg" onError="this.style.display ='none';" /></div><p>Marmosets, small primates native to South America, are known to be highly social primates. Living in family-based groups of up to 15 individuals, they use a complex system of vocalizations for... <a class="read-more" href="https://blogs.mathworks.com/headlines/2025/02/06/chatty-little-marmosets-call-each-other-by-name/">read more >></a></p>]]></description>
										<content:encoded><![CDATA[<p>Marmosets, small primates native to South America, are known to be highly social primates. Living in family-based groups of up to 15 individuals, they use a complex system of vocalizations for communication. These sounds range from high-pitched calls to trills and whistles that are referred to as “phee” calls. According to <a href="https://www.theguardian.com/science/article/2024/aug/29/marmosets-behaviour-specific-names-study" target="_blank" rel="noopener">The Guardian</a>, this behavior, identified for the first time in non-human primates, aids social cohesion.</p>
<p>A recent study by Hebrew University in Jerusalem showed that marmosets can communicate with one another by name and know when they are being addressed. These adorable primates join a very short list of species exhibiting such behavior. Dolphins, parrots, and elephants are the other species known to use names.</p>
<p>The study was published in the journal <a href="https://www.science.org/doi/10.1126/science.adp3757" target="_blank" rel="noopener">Science</a>.</p>
<p><img decoding="async" loading="lazy" class="alignnone wp-image-4583" src="https://blogs.mathworks.com/headlines/files/2025/02/GettyImages-154534932-1024x680.jpg" alt="Two marmosets sitting on a branch." width="401" height="266" /></p>
<p>The scientists studied ten captive marmosets from three family groups, analyzing the phee calls between different pairs of monkeys.</p>
<p>“The experiment was very simple,” David Omer, lead author and assistant professor at the university’s Safra Center for Brain Sciences (ELSC), told CNN. “We just positioned two marmosets in the same room and positioned a visual barrier between them. When you do this, they spontaneously start to engage in dialogue.”</p>
<p><div style="width: 302px" class="wp-caption alignnone"><a href="https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcSAkA0ufvUtKyWAhzZAwVhznGtpC4S2gLTbksz_mvOdz5mOMubZ85UsMU2KFgSiak0fhs4&amp;usqp=CAU" target="_blank" rel="noopener"><img decoding="async" loading="lazy" src="https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcSAkA0ufvUtKyWAhzZAwVhznGtpC4S2gLTbksz_mvOdz5mOMubZ85UsMU2KFgSiak0fhs4&amp;usqp=CAU" alt="Illustration showing two marmosets in a room, separated by a visual barrier. " width="292" height="173" /></a><p class="wp-caption-text">Image credit: David Omer’s Lab, Hebrew University of Jerusalem.</p></div></p>
<p>“This past decade, quantitative behavioral analysis has advanced as much as neural recording technologies in the neuroscience field”, says Vijay Iyer, principal academic discipline manager for neuroscience at MathWorks. “While I enjoy supporting applications pushing technology boundaries, this report is a delightful reminder that science is about discovery: asking and answering questions that haven’t been addressed before.”</p>
<h2>Analyzing the calls with machine learning</h2>
<p>The team recorded natural conversations between pairs of marmosets, as well as the primates’ interactions with a computer system. The sound data was collected at 96 kHz with a custom MATLAB program. In all, they recorded 54,000 calls.</p>
<p>Here is what a phee call sounds like:</p>
<p><audio class="wp-audio-shortcode" id="audio-4568-2" preload="none" style="width: 100%;" controls="controls"><source type="audio/mpeg" src="https://blogs.mathworks.com/headlines/files/2025/01/phee-calls.mp3?_=2" /><a href="https://blogs.mathworks.com/headlines/files/2025/01/phee-calls.mp3">https://blogs.mathworks.com/headlines/files/2025/01/phee-calls.mp3</a></audio></p>
<p>The sound data was fed into a machine learning system built with MATLAB’s TreeBagger algorithm. TreeBagger trains an ensemble of bagged decision trees (a random forest), which reduces the effects of overfitting and improves generalization.</p>
<p>The team used time-frequency analysis in MATLAB for signal feature extraction. Based on acoustic features alone, the machine learning system could accurately predict which monkey a particular call had been addressed to. The researchers found the marmosets used distinct phee calls for each monkey on the other side of the visual barrier, similar to how humans use names.</p>
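<p>The pipeline described above, spectral features fed to an ensemble of bagged decision trees, can be sketched compactly. The study used MATLAB&#8217;s TreeBagger; below is an illustrative Python analogue using scikit-learn&#8217;s random forest. The synthetic &#8220;calls,&#8221; the tone frequencies, and the band-power features are placeholders, not values or code from the study.</p>

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
FS = 96_000  # Hz, the sampling rate used in the study

def synth_call(addressee):
    """Hypothetical stand-in for a phee call: a noisy tone whose
    dominant frequency depends on which monkey is addressed."""
    base = {"adonis": 7500, "ella": 10500}[addressee]  # placeholder frequencies
    t = np.arange(0, 0.2, 1 / FS)
    f = base + rng.normal(0, 200)
    return np.sin(2 * np.pi * f * t) + 0.3 * rng.standard_normal(t.size)

def spectral_features(x, n_bands=16):
    """Crude time-frequency features: log mean power in frequency bands."""
    power = np.abs(np.fft.rfft(x)) ** 2
    return np.log([band.mean() + 1e-12 for band in np.array_split(power, n_bands)])

labels = np.array(["adonis", "ella"] * 100)
X = np.array([spectral_features(synth_call(lbl)) for lbl in labels])
Xtr, Xte, ytr, yte = train_test_split(X, labels, random_state=0, stratify=labels)

# A random forest of bagged decision trees, the same family as TreeBagger
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(f"held-out accuracy predicting the addressee: {acc:.2f}")
```

<p>With cleanly separated dominant frequencies, the forest recovers the addressee almost perfectly; the real analysis works from far subtler acoustic structure.</p>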
<p><div style="width: 510px" class="wp-caption alignnone"><img decoding="async" loading="lazy" class="" src="https://www.science.org/cms/10.1126/science.adp3757/asset/8e91ed60-fcd2-4719-99c7-ca92e4295122/assets/images/large/science.adp3757-f2.jpg" alt="" width="500" height="352" /><p class="wp-caption-text">A and B: Average classification accuracy of 100 random-forest models trained and tested on calls addressed to the monkeys A) Adonis and B) Ella. The left panel is a confusion matrix, and the center panel shows violin plots. In C and D, the same calls are shown as spectrograms. E through G show results for all of the monkeys in the study. Image credit: David Omer’s Lab, Hebrew University of Jerusalem.</p></div></p>
<h2>Vocal learning</h2>
<p>The team concluded that the marmosets addressed specific individuals with the phee calls, or names. Not only do these tiny primates use names for each other, but they are also more likely to respond when their own name is called. And the names aren’t unique to given pairs: the other members of a family group use the same name when calling a specific family member.</p>
<p>“This is evidence for vocal learning. They learn vocal labels from their family members,” says Omer.</p>
<p><div style="width: 509px" class="wp-caption alignnone"><a href="https://www.science.org/do/10.1126/science.zsdh0r2/full/marmosetmotheranddaughter-1724965347077.png" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="" src="https://www.science.org/do/10.1126/science.zsdh0r2/full/marmosetmotheranddaughter-1724965347077.png" alt="Two marmosets." width="499" height="332" /></a><p class="wp-caption-text">A mother and daughter pair, Bhumi and Belle, from the study. Image credit: David Omer’s Lab, Hebrew University of Jerusalem.</p></div></p>
<p>&nbsp;</p>
<p>The code and data from this study are available <a href="https://zenodo.org/records/12721811" target="_blank" rel="noopener">here</a>. More details on the analysis can be found in the <a href="https://www.science.org/action/downloadSupplement?doi=10.1126%2Fscience.adp3757&amp;file=science.adp3757_sm.pdf" target="_blank" rel="noopener">supplemental information</a>.</p>
<p>To read the full research paper, see <a href="https://doi.org/10.1126/science.adp3757" target="_blank" rel="noopener">DOI: 10.1126/science.adp3757</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blogs.mathworks.com/headlines/2025/02/06/chatty-little-marmosets-call-each-other-by-name/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		<enclosure url="https://blogs.mathworks.com/headlines/files/2025/01/phee-calls.mp3" length="0" type="audio/mpeg" />

			</item>
		<item>
		<title>Finding shelter on the moon, in a cave</title>
		<link>https://blogs.mathworks.com/headlines/2024/11/12/finding-shelter-on-the-moon-in-a-cave/?s_tid=feedtopost</link>
					<comments>https://blogs.mathworks.com/headlines/2024/11/12/finding-shelter-on-the-moon-in-a-cave/#respond</comments>
		
		<dc:creator><![CDATA[Lisa Harvey]]></dc:creator>
		<pubDate>Tue, 12 Nov 2024 09:00:59 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://blogs.mathworks.com/headlines/?p=4505</guid>

					<description><![CDATA[<div class="overview-image"><img  class="img-responsive" src="https://blogs.mathworks.com/headlines/files/2024/10/moon-1024x971.jpg" onError="this.style.display ='none';" /></div><p>The first astronauts since the Apollo era will land on the lunar surface later this decade. If NASA’s Artemis program proceeds as planned, it will eventually establish a permanent presence on the... <a class="read-more" href="https://blogs.mathworks.com/headlines/2024/11/12/finding-shelter-on-the-moon-in-a-cave/">read more >></a></p>]]></description>
										<content:encoded><![CDATA[<p>The first astronauts since the Apollo era will land on the lunar surface later this decade. If NASA’s Artemis program proceeds as planned, it will eventually establish a permanent presence on the moon, near the water-rich south pole.</p>
<p>Keeping astronauts safe for extended time periods on the moon will present multiple challenges: The moon is one of the most extreme and hostile environments in the solar system, featuring huge temperature swings in and out of sunlight, intense moonquakes, and galactic and stellar radiation.</p>
<ul>
<li>Daytime temperatures near the equator can reach 250°F (121°C).</li>
<li>Nighttime temperatures can drop to -208°F (-133°C).</li>
<li>Temperatures in deep craters near the poles can drop below -410°F (-246°C) due to permanent shadows.</li>
<li>Radiation levels are nearly 200 times the levels on Earth’s surface since the moon lacks the atmospheric and magnetic shielding of the Earth.</li>
</ul>
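<p>As a quick sanity check, the paired Fahrenheit and Celsius figures above follow from the standard conversion formula:</p>

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# The (°F, °C) pairs quoted above, rounded to the nearest degree
for fahrenheit, celsius in [(250, 121), (-208, -133), (-410, -246)]:
    assert round(f_to_c(fahrenheit)) == celsius
print("all listed conversions check out")
```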
<blockquote><p>“The lunar surface is hostile to humans and machines,” Tracy Gregg, a planetary volcanologist at the University at Buffalo, told <a href="https://www.nationalgeographic.com/science/article/moon-cave-lava-tube-astronauts" target="_blank" rel="noopener">National Geographic</a>.</p></blockquote>
<p>&nbsp;</p>
<p><img decoding="async" loading="lazy" class="alignnone wp-image-4493" src="https://blogs.mathworks.com/headlines/files/2024/10/moon-1024x971.jpg" alt="" width="420" height="398" /></p>
<p>&nbsp;</p>
<p>Although man-made structures on the lunar surface could provide shelter, it would be helpful if the moon offered some natural defenses.</p>
<blockquote><p>“It&#8217;s about having a ready-made habitat where astronauts can spend extended periods on the moon without contracting cancer,” Paul Byrne, a planetary scientist at Washington University in St. Louis, tells National Geographic.</p></blockquote>
<p>A study published this month in <a href="https://www.nature.com/articles/s41550-024-02302-y.epdf?sharing_token=PN36F5sBrFuvcmjzOjTvOdRgN0jAjWel9jnR3ZoTv0NM3XCoO8qqQAVpZqycEJ41v-Y1i8jue3aIH5QvXVkwNKM1VU1zJ66l13rexlhrvWyR1SrF_fl-osl851IG6owAa0mBcJWiYLUEFsYJHaDQToCmPmK9-zmmu3ISDRCd_-bbjc6mWGjLESX0dlosaOO6ajSxqNLEsAwtOYQWjRSDZUas1FsiA_ba07AU0JoJn7TrUFfzNVCqnuk84kIxwY2jwOTbMazxswZY2EG7dQTfD1SvRhx_8EIghgsSVe7Iq-i1YaKaSRDnUtbCEyWC6qkMLXdTxwM8mBBplC4SNQC9fTR2iT_AQssNgpKcXN-dfjc%3D&amp;tracking_referrer=www.theguardian.com" target="_blank" rel="noopener"><em>Nature Astronomy</em></a> provides the first direct evidence of the existence of such natural shelters. Lorenzo Bruzzone and Leonardo Carrer, researchers from the University of Trento in Italy, found the cave using radar to penetrate the opening of a pit on a rocky plain called Mare Tranquillitatis, near where the Apollo 11 astronauts landed.</p>
<p>&nbsp;</p>
<p><div style="width: 510px" class="wp-caption alignnone"><img decoding="async" loading="lazy" src="https://ichef.bbci.co.uk/news/1024/cpsprodpb/f588/live/1bc5b1c0-42a6-11ef-b83f-157e38fc9c03.jpg.webp" alt="Black and white image of a hole in the surface of the moon taken from directly above with a pronounced shadow inside the hole. " width="500" height="281" /><p class="wp-caption-text">The Mare Tranquillitatis pit examined in this study. Image credit: NASA</p></div></p>
<p>&nbsp;</p>
<h2>NASA finds a massive moon cave that could shelter astronauts</h2>
<p>By analyzing old data from a probe orbiting the moon, researchers discovered that a pit near the Apollo 11 landing site is not just a pit. It is actually the entrance to a long cave, a volcanic tunnel formed by an ancient lava flow, at least 100 m deep.</p>
<p>The data used in the study were obtained by the Lunar Reconnaissance Orbiter (LRO) in 2010. The LRO camera captured images of the pits but could not see inside them. However, a small radar system onboard the LRO, the Mini-RF instrument, could peer inside the pits if the angle was optimal. But because the Mini-RF instrument’s resolution is limited to approximately 15 m by 30 m, the data are only useful for evaluating pits at least 80 m across.</p>
<p>&nbsp;</p>
<p><div id="attachment_4517" style="width: 514px" class="wp-caption alignnone"><img aria-describedby="caption-attachment-4517" decoding="async" loading="lazy" class="wp-image-4517 " src="https://blogs.mathworks.com/headlines/files/2024/10/Radar-finds-cave.jpg" alt="The upper left corner shows a picture of the moon with a dot indicating the Apollo 11 landing site. A rectangular location on this image is enlarged in the upper right hand side and the location of the Mare Tranquillitatis pit is marked on the surface. The lower portion of the image is an illustration what shows how the LRO's radar view can extend into the pit at an angle to show more of the cave structure than what is visible from a photo taken directly above the pit. " width="504" height="500" /><p id="caption-attachment-4517" class="wp-caption-text">Image credit: Lorenzo Bruzzone and Leonardo Carrer, the University of Trento</p></div></p>
<p>&nbsp;</p>
<p>The researchers used the preexisting radar data on the Mare Tranquillitatis pit (MTP). The MTP is around 100 m in diameter and extends an estimated 140 m further in a below-ground tunnel. This makes the MTP the right size for evaluating the LRO radar data.</p>
<p>The 3D radar simulations were performed on the LRO images with RaySAR. The researchers found that a portion of the radar reflections from the MTP could be attributed to a cave below the surface.</p>
<p>RaySAR is an open-source 3D synthetic aperture radar (SAR) simulator developed by the German Aerospace Center. It generates SAR images using a method called <a href="https://www.mathworks.com/help/comm/ref/rfprop.raytracing.html?s_tid=srchtitle_site_search_1_ray%20tracing">ray tracing</a>. RaySAR is written in MATLAB and available on GitHub <a href="https://github.com/StefanJAuer/RaySAR">here</a>.</p>
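<p>The key geometric idea, that an oblique look angle lets radar reach floor that is invisible from directly overhead, can be illustrated without a full SAR simulator. This Python sketch is a 2D simplification, not RaySAR; the pit dimensions are taken loosely from the article, and the 30&#176; look angle is an arbitrary example.</p>

```python
import math

PIT_DEPTH = 100.0        # m, approximate depth of the pit
OPENING = (0.0, 100.0)   # m, x-extent of the opening at the surface
FLOOR = (-140.0, 100.0)  # m, floor extends ~140 m into a conduit (x < 0)

def illuminated_floor(look_angle_deg):
    """X-extent of the floor reached by straight rays that enter the
    opening at the given off-nadir look angle (simple 2D geometry)."""
    shift = PIT_DEPTH * math.tan(math.radians(look_angle_deg))
    return max(OPENING[0] - shift, FLOOR[0]), min(OPENING[1] - shift, FLOOR[1])

print("nadir view of floor:  ", illuminated_floor(0))   # only below the opening
print("oblique view of floor:", illuminated_floor(30))  # reaches under the overhang
```

<p>At nadir, only the floor directly beneath the opening is visible; at 30&#176; off-nadir, the illuminated strip shifts roughly 58 m under the overhang, which is why the oblique Mini-RF echoes carry information about the conduit.</p>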
<p>&nbsp;</p>
<p><div style="width: 611px" class="wp-caption alignnone"><img decoding="async" loading="lazy" class="" src="https://media.springernature.com/m685/springer-static/image/art%3A10.1038%2Fs41550-024-02302-y/MediaObjects/41550_2024_2302_Fig1_HTML.png" alt="Four squares arranged in a 2 by 2 matrix. Each shows a black dot in the center with red dotted lines around the circumference of the dot. " width="601" height="442" /><p class="wp-caption-text">a,b, Mini-RF SAR image of the MTP (a) and its corresponding RGB decomposition (red, double bounce; blue, single scattering; green, volume scattering) (b); the radar look direction is indicated with a white arrow. The MTP overhang and cave conduit radar echoes exhibit single- and double-bounce scattering, respectively. c, DTM from stereo observations. d, 3D radar simulation of the DTM in c. The red dashed circle delineates the edge of the pit. Image credit: Carrer et al., Nature Astronomy.</p></div></p>
<p>&nbsp;</p>
<p>RaySAR generates detailed 3D object models. Specifically, it localizes the 3D positions and surface intersection points related to reflected radar signals.</p>
<p><img decoding="async" loading="lazy" class="" src="https://cdn.zmescience.com/wp-content/uploads/2024/07/models.webp" alt="The boxes in the upper left and lower left show a diagram of the cave opening with the area reached by the radar highlighted. The two boxes on the right hand side show a 3D reconstruction of MTP and the conduit at its base. " width="711" height="524" /></p>
<div>
<p>The newly discovered cave is the first subterranean conduit ever found on the moon, but there are likely many more yet to be discovered.</p>
<blockquote><p>“There are probably hundreds to thousands of caves on the moon in the form of drained lava tubes,” Gregg told National Geographic.</p></blockquote>
<p>Given the available dataset of known lunar pits and the limited sensor resolution of the Mini-RF instrument, the method did not allow the researchers to identify caves other than the MTP. If new radar orbital sensors with higher resolution are deployed in lunar orbit, this investigation could be substantially expanded.</p>
<p>The researchers also explain that this SAR-imagery-based method could be used on Mars, where more than 1,000 cave entrances have already been identified.</p>
<p>One day, a cave on the moon or on Mars may become an astronaut&#8217;s home away from home.</p>
</div>
<div>To read the full research paper, see <a href="https://www.nature.com/articles/s41550-024-02302-y" target="_blank">DOI: 10.1038/s41550-024-02302-y</a></div>
]]></content:encoded>
					
					<wfw:commentRss>https://blogs.mathworks.com/headlines/2024/11/12/finding-shelter-on-the-moon-in-a-cave/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Why bats &#x1f987; act like frogs &#x1f438;</title>
		<link>https://blogs.mathworks.com/headlines/2024/10/31/why-bats-act-like-frogs/?s_tid=feedtopost</link>
					<comments>https://blogs.mathworks.com/headlines/2024/10/31/why-bats-act-like-frogs/#respond</comments>
		
		<dc:creator><![CDATA[Lisa Harvey]]></dc:creator>
		<pubDate>Thu, 31 Oct 2024 13:55:41 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://blogs.mathworks.com/headlines/?p=4355</guid>

					<description><![CDATA[<div class="overview-image"><img  class="img-responsive" src="https://blogs.mathworks.com/headlines/files/2024/10/Greater-Horseshoe-Bats-1024x683.jpg" onError="this.style.display ='none';" /></div><p>Happy Halloween! To celebrate the holiday, I’m turning my attention to an animal long associated with the holiday: bats! No, not the blood-sucking bats of vampire lore: This post focuses on a species... <a class="read-more" href="https://blogs.mathworks.com/headlines/2024/10/31/why-bats-act-like-frogs/">read more >></a></p>]]></description>
										<content:encoded><![CDATA[<p>Happy Halloween! To celebrate the holiday, I’m turning my attention to an animal long associated with the holiday: bats! No, not the blood-sucking bats of vampire lore: This post focuses on a species called the greater horseshoe bat. Their dietary preference is insects.</p>
<p><div id="attachment_4382" style="width: 510px" class="wp-caption alignnone"><img aria-describedby="caption-attachment-4382" decoding="async" loading="lazy" class="wp-image-4382 " src="https://blogs.mathworks.com/headlines/files/2024/10/Greater-Horseshoe-Bats-1024x683.jpg" alt="Two bats looking up towards camera." width="500" height="333" /><p id="caption-attachment-4382" class="wp-caption-text">Two greater horseshoe bats.</p></div></p>
<p>Bats are far from the cutest or cuddliest of animals but make no mistake—they play an important role in our environment. They are proficient pollinators and skillful seed spreaders. They are also one of the best natural insect control methods around, reducing the pesticides farmers need where there are active colonies. A single bat consumes up to 8,000 insects each night, so it’s no surprise that ecologists and biologists want to study these creatures. But mathematicians? Yes, them too!</p>
<p>The greater horseshoe bat is one of the UK’s largest bats, with a wingspan of up to 40 cm (almost 16 inches). It can live up to 30 years and gets its name from its horseshoe-shaped nose. Sadly, the population of greater horseshoe bats is estimated to have declined by 90% in the last century. Finding and protecting their roosts is important, and studying the behavior of the colonies is an important step.</p>
<h2>Combining Biology and Mathematics in Research</h2>
<p>Bats are susceptible to human activity, such as noise and light pollution, and locating their roosts is a necessary step for protecting the population. Understanding the movements of bats dispersing from and returning to their roost would help predict the location of the roost. Determining the extent of their foraging habitat is also important for bat conservation.</p>
<p>A team of researchers studied how bats travel back from their nightly foraging in a pattern that maximizes their hunting time while minimizing their exposure to predators, such as birds of prey. The team included Dr. Fiona Mathews, a professor of environmental biology at the University of Sussex, and Dr. Thomas Woolley, a senior lecturer in applied mathematics at Cardiff University. Pairing mathematics with biology enabled the team to use math to analyze the bats’ behavior. Their research was published in the <a href="https://link.springer.com/article/10.1007/s11538-023-01233-5" target="_blank" rel="noopener">Bulletin of Mathematical Biology</a>.</p>
<p><a href="https://www.theguardian.com/science/2024/jan/12/bats-leapfrog-back-to-roost-to-stay-safe-from-predators-study-finds" target="_blank" rel="noopener">The Guardian</a> reported the team “developed a mathematical model using ‘trajectory data’ that tracked the flight of greater horseshoe bats in Devon to pinpoint how the creatures engage with the nocturnal environment.”</p>
<p>In a <a href="https://www.youtube.com/watch?v=rpp2GF8j6d4" target="_blank" rel="noopener">video</a> describing the work, Professor Mathews spoke about how applying math to study the bats changed the approach. “So normally, I would follow bats around in the landscape and then try to deduce what they are doing afterward. How are they behaving? Whereas working with Thomas, we&#8217;ve kind of done this the opposite way round, where he comes up with some theoretical principles of saying, ‘I think the bats might behave like this.’”</p>
<p>For this study, the team trapped several bats in a harp trap, fitted them with tiny radio transmitters using bat-safe glue, and then released them back into the wild. Radio tracking data was collected for 14 nights, from when the bats left the roost to when they returned.</p>
<p><div style="width: 510px" class="wp-caption alignnone"><a href="https://media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs11538-023-01233-5/MediaObjects/11538_2023_1233_Fig1_HTML.jpg?as=webp" target="_blank" rel="noopener"><img decoding="async" loading="lazy" src="https://media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs11538-023-01233-5/MediaObjects/11538_2023_1233_Fig1_HTML.jpg?as=webp" alt="Two pictures of bats with radio transmitters on their backs. In the left image, the bat is hanging from a rock. The image on the right shows fingers in a white glove holding the bat." width="500" height="316" /></a><p class="wp-caption-text">Greater horseshoe bats with radio transmitters glued to their backs. The radio transmitters have very thin antennae and are highlighted in white. Image credit: Professor Fiona Mathews.</p></div></p>
<h2>BATLAB, um, I mean MATLAB</h2>
<p>The recordings provided a few hundred trajectories of the bats’ locations throughout their flights. Processing the data was key to understanding their travels.</p>
<p><div style="width: 509px" class="wp-caption alignnone"><a href="https://media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs11538-023-01233-5/MediaObjects/11538_2023_1233_Fig2_HTML.png?as=webp" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="" src="https://media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs11538-023-01233-5/MediaObjects/11538_2023_1233_Fig2_HTML.png?as=webp" alt="Two graphs with blue dotted lines plotting distance from roost in X Y coordinates." width="499" height="222" /></a><p class="wp-caption-text">The locations of the same bat over two nights during the survey. The roost has been normalized to be at the origin in each case. The circles represent detections, and the numbers next to them represent the time in hours after sunset when the bat was detected at the given location. Image credit: Woolley et al.</p></div></p>
<p>To better understand the bats&#8217; locations, the team turned to a mathematical measure: mean squared displacement. As Dr. Woolley described in a <a href="https://www.youtube.com/watch?v=jontpz8T3PQ" target="_blank" rel="noopener">video</a>, “The reason we just take the displacement and square is to reduce the two-dimensional information […]. Bats fly in three dimensions, but unfortunately, we can&#8217;t say how high they are. So, we have to flatten it first to two dimensions. Where are you on a map? Then we flatten it to one dimension of just how far away [you are] from your roosts.”</p>
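<p>This flattening reduces each track to a single number per detection, the squared distance from the roost, which is then averaged over bats. A minimal Python sketch (the team&#8217;s actual analysis was in MATLAB, and the random-walk tracks here are synthetic):</p>

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_squared_displacement(tracks):
    """MSD at each time step, averaged over bats.

    tracks: array of shape (n_bats, n_steps, 2) holding x, y map
    positions with the roost normalized to the origin."""
    sq_disp = (tracks ** 2).sum(axis=2)  # squared distance from the roost
    return sq_disp.mean(axis=0)          # average over bats

# Toy data: five bats random-walking away from the roost for 100 steps
tracks = rng.normal(0, 10, size=(5, 100, 2)).cumsum(axis=1)
msd = mean_squared_displacement(tracks)
print(f"MSD grows from {msd[0]:.0f} to {msd[-1]:.0f} m^2")
```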
<p><div style="width: 510px" class="wp-caption alignnone"><a href="https://media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs11538-023-01233-5/MediaObjects/11538_2023_1233_Fig4_HTML.png?as=webp" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="" src="https://media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs11538-023-01233-5/MediaObjects/11538_2023_1233_Fig4_HTML.png?as=webp" alt="Four charts plotting mean square displacement on the X axis, and hours after sunset on the Y axis. " width="500" height="270" /></a><p class="wp-caption-text">a, The mean squared displacement (MSD) for all radio-tracked bats interpolated at Δt=200 s. b, The MSD for all radio-tracked bats interpolated at Δt=1000, 2000, and 3000 s, from left to right, respectively. The red line is the MSD trajectory, and the ribbon represents the mean ± standard error of the squared displacement trajectory data. Image credit: Woolley et al.</p></div></p>
<p>Using MATLAB to analyze their results, the team found that the bats furthest out started returning first. The MATLAB code for the calculations and graphics is available in this <a href="https://github.com/ThomasEWoolley/Bat_motion/tree/main/Matlab" target="_blank" rel="noopener">GitHub repository</a>.</p>
<p>The team described the motion in the paper, “We term this form of motion ‘leap frogging’ because it mirrors the idea that bats on the periphery will choose to fly towards the roost until they are no longer the furthest bat away from the roost. Once the convecting bat is no longer the furthest out from the roost it stops convecting and returns to moving randomly. The new bat that is furthest out starts to convect towards the roost and the process begins again. Note that the furthest out bat becomes the edge of the domain and no randomly moving bat is able to move past it. Over time this form of motion will cause the bat population to tend towards the roost.”</p>
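<p>That verbal rule translates into a very small simulation. The Python sketch below is a loose 1D caricature rather than the paper&#8217;s model: each bat is just a distance from the roost, the furthest bat &#8220;convects&#8221; inward, and the rest move randomly without passing the old edge. The drift and noise values are arbitrary.</p>

```python
import numpy as np

rng = np.random.default_rng(2)

def leapfrog_step(d, drift=5.0, noise=2.0):
    """One time step: the furthest bat convects toward the roost while
    the others move randomly, bounded by the previous edge."""
    d = d.copy()
    edge = d.max()
    far = d.argmax()
    d += rng.normal(0, noise, d.size)  # random motion for everyone
    d[far] = edge - drift              # furthest bat heads for the roost
    return np.clip(d, 0, edge)         # nobody passes the old edge

d = rng.uniform(50, 100, size=10)      # ten bats out foraging
start_mean = d.mean()
for _ in range(200):
    d = leapfrog_step(d)
print(f"mean distance from roost: {start_mean:.0f} m -> {d.mean():.0f} m")
```

<p>Over time, whichever bat is furthest takes its turn convecting inward, and the whole group drifts home, exactly the tendency the authors describe.</p>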
<p><img decoding="async" loading="lazy" class="alignnone wp-image-4418" src="https://blogs.mathworks.com/headlines/files/2024/10/Halloween-Leapfogging-bats-1.gif" alt="" width="550" height="280" /></p>
<p>&nbsp;</p>
<p>The team postulates that the bat that is furthest out has three possible motivations to leapfrog back into the midst of the other bats: A) it lessens its exposure to predators; B) it has the furthest to travel, and moving closer allows more time for foraging; and C) “[&#8230;] if the rest of the bat population has found suitable foraging areas closer to the roost the furthest out bat is wasting energy by flying further.”</p>
<p>But how do the bats know they are the furthest away? A bat at the edge of the group hears other bats, and their corresponding echolocation clicks, from only one direction. When it is among its fellow bats, the clicks come from multiple directions.</p>
<p>No matter the motivation, we now know that these bats act like frogs in two ways: They love to munch on insects and have a propensity to leapfrog their way home.</p>
<p>To read the full research paper, see <a href="https://doi.org/10.1007/s11538-023-01233-5" target="_blank" rel="noopener">DOI: 10.1007/s11538-023-01233-5</a></p>
<p>&nbsp;</p>
<p>If you want more quirky animal trivia for Halloween, check out <a href="https://blogs.mathworks.com/headlines/2022/08/04/pumpkin-toadlets-cant-jump/?s_tid=srchtitle_site_search_1_toadlet" target="_blank" rel="noopener">this post on “Pumpkin” toadlets</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blogs.mathworks.com/headlines/2024/10/31/why-bats-act-like-frogs/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Proving the physics behind warp drives</title>
		<link>https://blogs.mathworks.com/headlines/2024/09/27/proving-the-physics-behind-warp-drives/?s_tid=feedtopost</link>
					<comments>https://blogs.mathworks.com/headlines/2024/09/27/proving-the-physics-behind-warp-drives/#respond</comments>
		
		<dc:creator><![CDATA[Lisa Harvey]]></dc:creator>
		<pubDate>Fri, 27 Sep 2024 15:42:34 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://blogs.mathworks.com/headlines/?p=4214</guid>

					<description><![CDATA[<div class="overview-image"><img  class="img-responsive" src="https://blogs.mathworks.com/headlines/files/2024/09/GettyImages-1400105417-1024x576.jpg" onError="this.style.display ='none';" /></div><p>Star Trek may have introduced the masses to the concept of a warp drive, but early references to this mode of travel date back to the 1930s in Jack Williamson’s science fiction novel, The Cometeers.... <a class="read-more" href="https://blogs.mathworks.com/headlines/2024/09/27/proving-the-physics-behind-warp-drives/">read more >></a></p>]]></description>
										<content:encoded><![CDATA[<p>Star Trek may have introduced the masses to the concept of a warp drive, but early references to this mode of travel date back to the 1930s in Jack Williamson’s science fiction novel, <em>The Cometeers</em>. Warp drives have been sprinkled throughout science fiction, but the “science” part has remained elusive.</p>
<blockquote><p><a href="https://www.popularmechanics.com/space/rockets/a60746821/warp-drive-within-known-physics/" target="_blank" rel="noopener">Popular Mechanics</a> reported, “Researchers have always been intrigued by the concept of warp drives, which was first proposed by Mexican physicist Miguel Alcubierre in 1994.</p>
<p>“According to the theoretical Alcubierre warp drive concept, a spacecraft could appear to travel faster than light by contracting space in front of it and expanding space behind it.”</p></blockquote>
<p><img decoding="async" loading="lazy" class="alignnone wp-image-4229" src="https://blogs.mathworks.com/headlines/files/2024/09/GettyImages-1400105417-1024x576.jpg" alt="A 3D rendering of a hyperspace jump." width="400" height="225" /></p>
<p>&nbsp;</p>
<p>But these early theoretical designs relied on exotic concepts such as negative energy or negative mass. Expanding and contracting spacetime is no easy task!</p>
<p>Now, nearly 90 years after <em>The Cometeers</em> was published in 1936, an international group of scientists and engineers at the <a href="https://appliedphysics.org/warp-drive/" target="_blank" rel="noopener">Advanced Propulsion Laboratory (APL) at Applied Physics</a> announced that they had numerically obtained a new warp drive solution of spacetime that does not involve exotic concepts. The work was published in the journal <a href="https://iopscience.iop.org/article/10.1088/1361-6382/ad26aa" target="_blank" rel="noopener">Classical and Quantum Gravity</a>.</p>
<h2>Warp Drive Solution</h2>
<p>This new warp drive solution operates within known physics. The new solution maintains a constant speed, albeit slower than light.</p>
<blockquote><p>“Prior models required a matter-energy content that was ‘unphysical,’ meaning it had features we don’t see in the regular universe, like negative energy,” lead author Jared Fuchs, a senior scientist at the research firm Applied Physics (AP), explains. “Our approach was to avoid needing this exotic matter by adding positive energy to the solution while keeping as much of the warp effects as possible.”</p></blockquote>
<p><div style="width: 510px" class="wp-caption alignnone"><img decoding="async" loading="lazy" class="" src="https://aijourn.com/wp-content/uploads/2024/05/ConstantVelocityShellAnimation.jpg" alt="warp" width="500" height="281" /><p class="wp-caption-text">Warp bubble. The black lines show the direction of the momentum flow. The gray sphere is the passenger volume. The white region shows higher density and the blue region lower density. Image credit: Applied Physics.</p></div></p>
<p>&nbsp;</p>
<p>Coinciding with the release of their research paper, the team published <a href="https://iopscience.iop.org/article/10.1088/1361-6382/ad2e42/meta" target="_blank" rel="noopener">Analyzing warp drive spacetimes with Warp Factory</a> in the journal Classical and Quantum Gravity. Warp Factory is an open-source codebase that runs in MATLAB, built for any physicist or researcher seeking to test ideas for physical warp drives.</p>
<blockquote><p>&#8220;We developed Warp Factory in MATLAB because it provides ideal capabilities in a single integrated environment and language: powerful parallel numerical computation on both the GPU and CPU, intuitive visualization tools, and ease of use. These features enable Warp Factory to efficiently analyze complex warp drive spacetimes and gain new insights into their physical properties,&#8221; explains Christopher Helmerich of The University of Alabama in Huntsville and the Advanced Propulsion Laboratory at Applied Physics.</p></blockquote>
<h2>WarpFactory Toolbox</h2>
<p>The accompanying toolbox is available on <a href="https://www.mathworks.com/matlabcentral/fileexchange/163836-warpfactory?s_tid=srchtitle_site_search_1_warp%20drive" target="_blank" rel="noopener">MathWorks File Exchange</a>.</p>
<blockquote><p>&#8220;The prospect of actually traveling between the stars via warp bubble sometime in the future is fascinating,&#8221; said Matt Lister, technical writer at MathWorks and an adjunct professor of physics at Purdue University. &#8220;This illustrates the amazing things that can result when the advanced mathematical tools in MATLAB are used to explore fundamental physics concepts like Einstein&#8217;s theory of gravitation.&#8221;</p></blockquote>
<p>Besides simulating warp drive solutions, the toolbox allows researchers to numerically perform general relativity calculations, such as computing spacetime curvature from a spacetime metric, using a GPU-friendly 3D finite-difference solver.</p>
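<p>Warp Factory&#8217;s solver itself is well beyond a blog post, but its core numerical ingredient, the central finite difference a grid-based solver uses to approximate derivatives of the metric, is easy to illustrate. This Python sketch shows only that generic stencil, not Warp Factory code:</p>

```python
import numpy as np

def second_derivative(f, dx):
    """Second-order central finite difference on a uniform grid
    (interior points only); the basic stencil behind grid solvers
    that build curvature from derivatives of a metric."""
    return (f[2:] - 2 * f[1:-1] + f[:-2]) / dx**2

# Sanity check on f(x) = x^2, whose second derivative is exactly 2
x = np.linspace(0.0, 1.0, 101)
dx = x[1] - x[0]
d2 = second_derivative(x ** 2, dx)
print(np.allclose(d2, 2.0))
```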
<blockquote><p>&#8220;The toolbox is user-friendly and contains examples in the form of interactive live scripts,&#8221; says Temo Vekua, physics academic discipline manager at MathWorks. &#8220;You can open it directly in <a href="https://matlab.mathworks.com/open/github/v1?repo=NerdsWithAttitudes/WarpFactory&amp;file=README.md" target="_blank" rel="noopener">MATLAB Online</a>.&#8221;</p></blockquote>
<p>For standard spacetime metrics, similar results can be obtained symbolically using <a href="https://github.com/MatthewErvinChasco/Metric2Ricci" target="_blank" rel="noopener">this toolkit</a> and <a href="https://www.mathworks.com/products/symbolic.html" target="_blank" rel="noopener">Symbolic Math Toolbox</a>, providing benchmark tests for the accuracy of WarpFactory.</p>
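<p>To show what a symbolic benchmark of this kind looks like, here is a sketch using Python's SymPy in place of Symbolic Math Toolbox: it derives the Christoffel symbols of the unit 2-sphere, a standard metric whose answers are known in closed form, so any numerical pipeline can be checked against them. The metric choice and helper function are illustrative, not taken from the toolkits above.</p>

```python
import sympy as sp

theta, phi = sp.symbols('theta phi')
coords = [theta, phi]
# Metric of the unit 2-sphere: a standard test metric with known curvature
g = sp.Matrix([[1, 0], [0, sp.sin(theta)**2]])
g_inv = g.inv()

def christoffel(g, g_inv, coords):
    """Gamma^k_{ij} = (1/2) g^{kl} (d_i g_{lj} + d_j g_{li} - d_l g_{ij})."""
    n = len(coords)
    return [[[sp.simplify(
        sp.Rational(1, 2) * sum(
            g_inv[k, l] * (sp.diff(g[l, j], coords[i])
                           + sp.diff(g[l, i], coords[j])
                           - sp.diff(g[i, j], coords[l]))
            for l in range(n)))
        for j in range(n)] for i in range(n)] for k in range(n)]

Gamma = christoffel(g, g_inv, coords)
# Known closed forms for the 2-sphere:
#   Gamma^theta_{phi phi} = -sin(theta) cos(theta)
#   Gamma^phi_{theta phi} =  cos(theta) / sin(theta)
```

Comparing such exact expressions against the finite-difference output at sample grid points gives a concrete accuracy benchmark.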
<p><div style="width: 490px" class="wp-caption alignnone"><img decoding="async" loading="lazy" src="https://mms.businesswire.com/media/20240408213718/en/2092261/4/WarpBubbleAnimation_watermark_2.gif" alt="Animation showing swirling red and yellow regions with embedded black arrows around a gray sphere in the center." width="480" height="270" /><p class="wp-caption-text">Animation of the stress-energy tensor. Lighter regions in the cross-section correspond to higher energy densities. The blue lines demonstrate the direction of momentum flow within the warp bubble. Image credit: Applied Physics.</p></div></p>
<p>The resulting warp bubble relies on momentum flux rather than on contracting and expanding space. Fuchs explains:</p>
<blockquote><p>&#8220;Contrary to popular discussions, warp doesn’t expand or contract space. Rather, it relies on large amounts of energy moving rapidly around the passenger volume which creates a conveyor belt effect on the inside. This energy motion is like a smoke ring, but on the inside of this ring, it can transport passengers along via gravity. If you make this ring of matter move while you modify this conveyor effect, you have a warp drive that transports passengers along space-time.”</p></blockquote>
<h2>Moving Beyond Science Fiction</h2>
<blockquote><p>“This study changes the conversation about warp drives,” Fuchs <a href="https://www.businesswire.com/news/home/20240506270015/en/New-Study-Achieves-Breakthrough-in-Warp-Drive-Design" target="_blank" rel="noopener"><strong>said in a press statement</strong></a>. “By demonstrating a first-of-its-kind model, we’ve shown that warp drives might not be relegated to science fiction.”</p></blockquote>
<p>To read the full research papers, see Analyzing warp drive spacetimes with Warp Factory, <a href="https://iopscience.iop.org/article/10.1088/1361-6382/ad2e42/meta" target="_blank" rel="noopener">DOI: 10.1088/1361-6382/ad2e42</a>, and Constant velocity physical warp drive solution, <a href="https://iopscience.iop.org/article/10.1088/1361-6382/ad26aa" target="_blank" rel="noopener">DOI: 10.1088/1361-6382/ad26aa</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blogs.mathworks.com/headlines/2024/09/27/proving-the-physics-behind-warp-drives/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>These crows are smarter than you think. Let them count the ways!</title>
		<link>https://blogs.mathworks.com/headlines/2024/08/27/these-crows-are-smarter-than-you-think-let-them-count-the-ways/?s_tid=feedtopost</link>
					<comments>https://blogs.mathworks.com/headlines/2024/08/27/these-crows-are-smarter-than-you-think-let-them-count-the-ways/#respond</comments>
		
		<dc:creator><![CDATA[Lisa Harvey]]></dc:creator>
		<pubDate>Tue, 27 Aug 2024 18:19:14 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://blogs.mathworks.com/headlines/?p=4177</guid>

					<description><![CDATA[<div class="overview-image"><img  class="img-responsive" src="https://blogs.mathworks.com/headlines/files/2024/08/crow-1024x768.jpg" onError="this.style.display ='none';" /></div><p>Carrion crows, a notably brainy bird native to Europe, recently demonstrated their laudable intellectual capabilities in yet another way. While these crows have previously shown the ability to... <a class="read-more" href="https://blogs.mathworks.com/headlines/2024/08/27/these-crows-are-smarter-than-you-think-let-them-count-the-ways/">read more >></a></p>]]></description>
					<content:encoded><![CDATA[<p>Carrion crows, notably brainy birds native to Europe, recently demonstrated their laudable intellectual capabilities in yet another way. While these crows have previously shown the ability to communicate, use tools, and problem-solve, they can now add another skill to their impressive resume: They can count out loud! Well, at least as well as a toddler.</p>
<p><img decoding="async" loading="lazy" class="alignnone  wp-image-4180" src="https://blogs.mathworks.com/headlines/files/2024/08/crow-1024x768.jpg" alt="Black bird standing next to a pile of tangled wires" width="400" height="300" /></p>
<p>&nbsp;</p>
<p><a href="https://www.researchgate.net/profile/Andreas-Nieder/publication/380822270_Crows_count_the_number_of_self-generated_vocalizations/links/66536a840b0d2845745bd015/Crows-count-the-number-of-self-generated-vocalizations.pdf" target="_blank" rel="noopener">In a study</a> published in the <a href="https://www.science.org/doi/10.1126/science.adl0984" target="_blank" rel="noopener">journal Science</a>, researchers from the University of Tübingen in Germany shared the crows’ new-found skill: these clever birds can count out loud up to four, showcasing a form of numerical cognition previously thought to be exclusive to humans.</p>
<p>&#8220;We show that crows have the capacity to count vocally,&#8221; Diana Liao, a neuroscientist at the University of Tübingen and one of the paper&#8217;s authors, <a href="https://www.npr.org/2024/07/18/g-s1-9773/crows-count-out-loud-human-toddlers-animal-intelligence" target="_blank" rel="noopener">told NPR</a>, &#8220;which mirrors this important developmental stage in toddlers.&#8221;</p>
<p>The study revealed that crows use a simplified version of “counting” through a verbal tally system. By associating specific vocalizations with numerical values, the crows demonstrated their capacity to comprehend and respond to auditory cues.</p>
<h2>Learning to count</h2>
<p>Three crows were trained in a darkened chamber with a touchscreen monitor, infrared light barrier, and automated feeder for rewards. They were taught to emit varying numbers of vocalizations (caws) in response to a particular audio recording that was played—a guitar chord indicated one, a cash register noise was paired with two, a drum roll meant three, and a frequency sound corresponded with four caws. They were also provided with visual cues for the numbers.</p>
<p><div style="width: 510px" class="wp-caption alignnone"><img decoding="async" loading="lazy" class="" src="https://www.science.org/cms/asset/8c58b534-39d8-4fef-999a-a4af4dd47e1d/science.adl0984-keyimage.gif" alt="Sections include bird images staring at screens, a set of 6 line graphs, 2 bar graphs, and visual representations with their corresponding numerical value. " width="500" height="385" /><p class="wp-caption-text">Experimental design and performance. (A) Protocol of the vocal production task. In this example, the crow was cued to produce three calls. (B) Visual (colored Arabic numerals) and auditory (distinct sounds indicated by spectrograms) cues instructed the crows to produce a certain number of vocalizations. (C) Behavioral performance of the three crows to visual cues. The dotted horizontal line indicates chance level (1/7). (D) Behavioral performance in regards to auditory cues; layout same as (C). (E) Average performance per cued target number for visual and auditory cues. (F) Widths (standard deviation) of performance functions displayed in (C) and (D). Error bars represent standard error of the mean (SEM). Image credit: Liao, Nieder, et al.</p></div></p>
<p>&nbsp;</p>
<p>The birds exhibited a remarkable degree of accuracy in their counting abilities, with a perfect success rate of 100 percent when counting to one and a respectable 60 percent accuracy when counting to two. However, the number four seemed to pose quite a challenge for the crows, with a lower accuracy rate of 40 percent. And the crows would often throw a toddler-esque tantrum to indicate a preference for lower numerical values.</p>
<p>One of the most intriguing aspects of the study was the observation of pre-planned responses in the crows. MATLAB was used to calculate reaction times and subsequent vocal intervals, with statistical analysis to compare different time intervals.</p>
<p>The birds displayed longer reaction times before producing higher totals of vocalizations, suggesting a deliberate process of mental planning and organization. They paused longer before emitting three caws than they did for a single call. The reaction times are shown below.</p>
<p><div style="width: 511px" class="wp-caption alignnone"><img decoding="async" loading="lazy" class="" src="https://www.science.org/cms/10.1126/science.adl0984/asset/0eb2aee8-aa33-4caa-9c29-0c4d5e2ef6a5/assets/images/large/science.adl0984-f2.jpg" alt="Diam showing four frequency responses at top with corresponding spectrogram underneath. " width="501" height="144" /><p class="wp-caption-text">Relationship between reaction time and number of impending vocalizations. (A) Time intervals for the vocal production sequence. (Top) Timeline of an example trial for four vocalizations. Reaction time extends from the monitor display (left) to vocalization 1 (oscillogram 1). Vocalizations 2 to 4 are followed by the noise of the “enter” peck. (Bottom) Temporally aligned spectrogram of the crow’s produced 4 vocalizations. (B) Session durations of pertinent time intervals [as shown in (A)]. (C) Reaction times to produce the first vocalization after instruction cues for the four target numbers of vocalizations. Image credit: Liao, Nieder, et al.</p></div>&nbsp;</p>
<p>The researchers used MATLAB to fit a linear mixed-effects model to evaluate the accuracy and width of the response distributions. The team also used a supervised machine learning technique in MATLAB, Gaussian support vector machines, to investigate whether the first vocalization in a sequence could predict the impending number of vocalizations. They found that it could.</p>
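<p>A "Gaussian support vector machine" is a support vector classifier with a radial basis function (RBF) kernel. The study's analysis ran in MATLAB; the sketch below reproduces the idea in Python with scikit-learn on synthetic stand-in data. The two acoustic features and their class-dependent means are invented for illustration, not taken from the study.</p>

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical stand-in data: two acoustic features (think duration and
# pitch) of the FIRST call in each sequence, labeled with the total number
# of calls the bird went on to produce. Feature means shift slightly with
# the target number, mimicking a predictive first call.
n_per_class = 40
X, y = [], []
for target in (1, 2, 3, 4):
    feats = rng.normal(loc=[target * 0.5, target * 0.2], scale=0.3,
                       size=(n_per_class, 2))
    X.append(feats)
    y.extend([target] * n_per_class)
X = np.vstack(X)
y = np.array(y)

# "Gaussian SVM" = support vector classifier with an RBF (Gaussian) kernel
clf = SVC(kernel='rbf', gamma='scale')
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

If cross-validated accuracy beats the 25% chance level for four classes, the first call carries information about the impending count, which is the logic of the study's test.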
<p>This research hints at a deeper understanding of how certain bird species communicate and process information. By unraveling the mysteries of avian intelligence, scientists aim to explore the evolutionary origins of cognitive abilities shared by humans and crows, despite the vast differences in brain architecture that have evolved over millions of years.</p>
<p>The study of crows, long known to be intelligent ambassadors for birds, offers a glimpse into the intricate workings of the avian mind. By unlocking the secrets of their numerical abilities, researchers pave the way for a greater appreciation of the diverse forms of intelligence in the natural world.</p>
<p>So, <a href="https://www.youtube.com/watch?v=-oqAU5VxFWs" target="_blank" rel="noopener">Mr. Jones</a>, the <a href="https://en.wikipedia.org/wiki/One_for_Sorrow_(nursery_rhyme)" target="_blank" rel="noopener">nursery rhyme about the counting crows</a> wasn’t all that far off! We can learn quite a bit from crows.</p>
<p>To read the full research paper, see <a href="https://www.science.org/doi/10.1126/science.adl0984" target="_blank" rel="noopener">DOI: 10.1126/science.adl0984</a>.</p>
<p>&nbsp;</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blogs.mathworks.com/headlines/2024/08/27/these-crows-are-smarter-than-you-think-let-them-count-the-ways/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Three favorites from TIME Magazine’s “Best Innovations of 2023”</title>
		<link>https://blogs.mathworks.com/headlines/2024/02/15/three-favorites-from-time-magazines-best-innovations-of-2023/?s_tid=feedtopost</link>
		
		<dc:creator><![CDATA[Lisa Harvey]]></dc:creator>
		<pubDate>Thu, 15 Feb 2024 19:18:47 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://blogs.mathworks.com/headlines/?p=4080</guid>

					<description><![CDATA[<div class="overview-image"><img  class="img-responsive" src="https://blogs.mathworks.com/headlines/files/2024/02/time-cover-1024x240.jpg" onError="this.style.display ='none';" /></div><p>&#160;
Every year for more than twenty years, TIME editors compile a list of the most impactful ideas and products. This year, the editors focused on categories such as AI, accessibility, robotics,... <a class="read-more" href="https://blogs.mathworks.com/headlines/2024/02/15/three-favorites-from-time-magazines-best-innovations-of-2023/">read more >></a></p>]]></description>
										<content:encoded><![CDATA[<p>&nbsp;</p>
<p>Every year for more than twenty years, TIME editors have compiled a list of the most impactful ideas and products. This year, the editors focused on categories such as AI, accessibility, robotics, and sustainability. The list is an amazing collection of inventions and innovations with the potential to change the way we live.</p>
<p>&nbsp;</p>
<p><div id="attachment_4083" style="width: 1034px" class="wp-caption alignnone"><a href="https://time.com/collection/best-inventions-2023/" target="_blank" rel="noopener"><img aria-describedby="caption-attachment-4083" decoding="async" loading="lazy" class="wp-image-4083 size-large" src="https://blogs.mathworks.com/headlines/files/2024/02/time-cover-1024x240.jpg" alt="" width="1024" height="240" /></a><p id="caption-attachment-4083" class="wp-caption-text">Image credit: TIME Magazine</p></div></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>The 2023 list didn’t disappoint. It contains products with various levels of complexity, from a design that makes it easier for arthritis sufferers to brush their teeth, to NASA technology that helps pinpoint sources of air pollution with a satellite hovering 22,000 miles above Earth. Some are creative and fun, like Hasbro’s Selfie Series, which creates a custom action figure from a photo.</p>
<h1>Here are three of the 2023 winners and how they used MATLAB and Simulink:</h1>
<h1><a href="https://time.com/collection/best-inventions-2023/6326415/nasa-moxie/" target="_blank" rel="noopener">Breathing Martian Air</a></h1>
<p><a href="https://time.com/collection/best-inventions-2023/6326415/nasa-moxie/" target="_blank" rel="noopener">TIME selected MOXIE</a> (Mars Oxygen In-Situ Resource Utilization Experiment), a device that travels the surface of Mars aboard the NASA Perseverance Rover. MOXIE makes oxygen by collecting carbon dioxide from the Martian atmosphere and splitting it into oxygen and carbon monoxide molecules.</p>
<p>This is a critical capability for future manned missions, providing breathable air for the astronauts and oxygen to burn the fuel on the return trip. MOXIE successfully generated 122 grams of oxygen, a proof of concept of technology critical for future manned missions to Mars.</p>
<p><div style="width: 414px" class="wp-caption alignnone"><img decoding="async" loading="lazy" class="" src="https://www.mathworks.com/company/mathworks-stories/moxie-converts-mars-co2-to-oxygen/_jcr_content/mainParsys2/columns_320727814_co/18897323-014e-432d-8828-20872c7d83dd/image_copy_copy.adapt.full.medium.jpg/1699632567152.jpg" alt="" width="404" height="376" /><p class="wp-caption-text">An almost identical engineering twin of MOXIE is used for testing in NASA’s Jet Propulsion Laboratory in Pasadena, California lab. (Image courtesy of NASA/JPL-Caltech)</p></div></p>
<p>MOXIE was designed with Simulink. The MOXIE Simulink model includes electrical circuits, chemistry, fluid dynamics, controls, and sensors. The resulting digital twin was used to simulate operation and compare to an engineering model of the device on Earth. It also was used to evaluate the results of the actual MOXIE as it operated on Mars.</p>
<p>&nbsp;</p>
<p><div style="width: 1061px" class="wp-caption alignnone"><img decoding="async" loading="lazy" class="" src="https://www.mathworks.com/company/mathworks-stories/moxie-converts-mars-co2-to-oxygen/_jcr_content/mainParsys2/columns_963905119_co/d2de9858-1875-47d0-b599-1cdb27f41cfa/image.adapt.full.medium.jpg/1699632567377.jpg" width="1051" height="349" /><p class="wp-caption-text">Simulink model for MOXIE.</p></div></p>
<p>&nbsp;</p>
<p>MATLAB provides Simulink with data, including the sizes of pieces of hardware, atmospheric conditions, chemical constants, control system setpoints like the desired temperature, and safety limits. Simulink then sends simulation outputs—sensor readings—back to MATLAB for analysis.</p>
<p>MATLAB also receives data from the real MOXIE on Mars. But neither the real nor the virtual MOXIE directly reports something as simple as how much oxygen it produces or the ratio of carbon dioxide to carbon monoxide. Instead, MATLAB calculates those values from temperature, pressure, and voltage sensor data.</p>
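<p>NASA's exact telemetry calculations aren't spelled out here, but as a back-of-the-envelope illustration of inferring production from electrical data: in a solid oxide electrolyzer like MOXIE's, Faraday's law ties the stack current to the oxygen output, since splitting CO2 transfers four electrons per O2 molecule produced. This Python sketch assumes 100% current efficiency, and the 4 A current is an invented example value, not MOXIE telemetry.</p>

```python
FARADAY = 96485.332   # coulombs per mole of electrons
M_O2 = 31.998         # grams per mole of O2
ELECTRONS_PER_O2 = 4  # CO2 electrolysis transfers 4 e- per O2 molecule

def o2_production_rate(current_amps):
    """Grams of O2 produced per hour at a given electrolysis current,
    assuming 100% current efficiency (Faraday's law)."""
    mol_per_s = current_amps / (ELECTRONS_PER_O2 * FARADAY)
    return mol_per_s * M_O2 * 3600.0

rate = o2_production_rate(4.0)  # hypothetical 4 A stack current
print(f"{rate:.2f} g O2 per hour")
```

Real analysis has to fold in temperature- and pressure-dependent corrections, which is why the sensor data matters, but the electron-counting skeleton is the same.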
<blockquote>
<h2>&#8220;To support a human mission to Mars, we have to bring a lot of stuff from Earth, like computers, spacesuits, and habitats. But oxygen? If you can make it there, go for it — you&#8217;re way ahead of the game.&#8221;<br />
&#8211; Jeff Hoffman, MOXIE Deputy Principal Investigator</h2>
</blockquote>
<p>To learn more about how the team at MIT and NASA designed MOXIE, <a href="https://www.mathworks.com/company/mathworks-stories/moxie-converts-mars-co2-to-oxygen.html?s_tid=srchtitle_customer_stories_1_moxie" target="_blank" rel="noopener">read this article</a>.</p>
<p>&nbsp;</p>
<h1><a href="https://time.com/collection/best-inventions-2023/6324564/monarch-tractor-mk-v/" target="_blank" rel="noopener">Farmerless Farming</a></h1>
<p>A lot closer to Earth and helping farmers plow the earth, TIME Magazine&#8217;s selection of Monarch Tractor’s MK-V is in the green energy category. MK-V is an autonomous, all-electric tractor that can save farmers the cost of fuel and help them reduce labor expenses. And there’s the obvious benefit of reducing greenhouse gas emissions!</p>
<p><div style="width: 563px" class="wp-caption alignnone"><img decoding="async" loading="lazy" class="" src="https://www.mathworks.com/company/mathworks-stories/smart-electric-tractor-uses-ai-for-autonomous-operation/_jcr_content/mainParsys2/columns_963905119/7c1f0c4e-6d05-4d46-96bd-6ad2f58f3859/image.adapt.full.medium.jpg/1699632545917.jpg" width="553" height="415" /><p class="wp-caption-text">The driver-optional Monarch Tractor. (Image credit: Monarch Tractor)</p></div></p>
<p>&nbsp;</p>
<p>Simulink and Model-Based Design helped the team at Monarch synchronize systems, including the sensors, cameras, lighting, and charging. Praveen Penmetsa, Monarch Tractor cofounder and CEO, credits a <a href="https://www.mathworks.com/products/startups.html?s_tid=srchtitle" target="_blank" rel="noopener">program that provides startups</a> with access to MATLAB<sup>®</sup> and Simulink<sup>®</sup> for giving Monarch Tractor a leg up: getting its initial vehicles going, testing the architecture on its launch vehicles, and rapidly delivering the first tractors to farmers.</p>
<p><a href="https://www.mathworks.com/company/mathworks-stories/smart-electric-tractor-uses-ai-for-autonomous-operation.html?s_tid=srchtitle_site_search_2_monarch" target="_blank" rel="noopener">Read this article</a> for more information on how Monarch Tractor designed the MK-V, including how they support over-the-air updates and how AI is helping the tractor complete real-time visual data analysis in the fields.</p>
<p>&nbsp;</p>
<h1><a href="https://time.com/collection/best-inventions-2023/6327649/nasa-osiris-rex/" target="_blank" rel="noopener">Answers from the Universe</a></h1>
<p><a href="https://time.com/collection/best-inventions-2023/6327649/nasa-osiris-rex/" target="_blank" rel="noopener">OSIRIS-REx</a>, the NASA spacecraft that traveled to an asteroid, collected a 250-gram sample of rocks and dust and then delivered the sample to eager scientists back on Earth, also graced the TIME Magazine innovations list and deservedly so! Researchers hope the pristine space dirt will reveal clues about the birth of our solar system.</p>
<p>While Mars is over 211 million miles from Earth, OSIRIS-REx traveled almost 4 billion miles. That’s &#8220;billion&#8221; with a “b.” It landed, with precision, on a moving asteroid in a touch-and-go (TAG) operation that avoided boulders and craters on the asteroid’s surface. The maneuver is even more impressive, considering that due to the rougher-than-expected terrain on Bennu, the original lidar-based TAG approach was not feasible. The mission team pivoted to use a pure vision-based navigation method instead.</p>
<p><div style="width: 735px" class="wp-caption alignnone"><img decoding="async" loading="lazy" class="" src="https://www.mathworks.com/company/technical-articles/developing-optical-navigation-software-for-nasas-new-horizons-osiris-rex-and-lucy-missions/_jcr_content/mainParsys/columns_336766139_co/005b337a-5e70-4cf3-9da9-dd3ea3195042/image.adapt.full.medium.jpg/1704195934198.jpg" width="725" height="323" /><p class="wp-caption-text">An actual image of Bennu (left) and a simulated image of Bennu generated by KXIMP (right)</p></div></p>
<p>&nbsp;</p>
<p>The OSIRIS-Rex spacecraft used optical navigation software (OpNav) to set the course for the asteroid Bennu. OpNav techniques use camera images to determine the position of the spacecraft relative to a celestial body, such as a planet or asteroid. Developed in MATLAB®, the KinetX Image Processing software suite (KXIMP) processes images captured with an onboard camera. These images are downlinked to Earth to calculate the inertial camera attitude and the centroids of background stars and celestial bodies in the field of view.</p>
<p>On the OSIRIS-REx mission, the center-finding algorithms were accurate to within 30 centimeters, or about 0.06% of the asteroid’s diameter—significantly outperforming the predicted accuracy of the mission’s navigation Concept of Operations (ConOps).</p>
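<p>Center-finding in optical navigation can be as simple as a brightness-weighted centroid over the pixels covering the body. KinetX's production algorithms are certainly more sophisticated than this, but the following Python sketch on a synthetic Gaussian "asteroid" shows the basic operation; the frame, its size, and the target position are all invented for illustration.</p>

```python
import numpy as np

def brightness_centroid(image):
    """Center-of-brightness centroid: a simple center-finding technique
    for locating a body in a camera frame."""
    image = np.asarray(image, dtype=float)
    total = image.sum()
    ys, xs = np.indices(image.shape)
    return (xs * image).sum() / total, (ys * image).sum() / total

# Synthetic 24x24 frame: a Gaussian blob centered at pixel (x=12.0, y=8.0)
ys, xs = np.indices((24, 24))
frame = np.exp(-((xs - 12.0) ** 2 + (ys - 8.0) ** 2) / (2 * 3.0 ** 2))
cx, cy = brightness_centroid(frame)
```

Because the centroid is an intensity-weighted average, it resolves the center to a small fraction of a pixel, which is how sub-pixel (and hence centimeter-scale) accuracy becomes possible once the camera geometry is known.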
<p>To learn more about KinetX and OpNav, <a href="https://www.mathworks.com/company/technical-articles/developing-optical-navigation-software-for-nasas-new-horizons-osiris-rex-and-lucy-missions.html?s_tid=srchtitle_site_search_1_optical%20navigation" target="_blank" rel="noopener">read this article</a>.</p>
<p>&nbsp;</p>
<h1>2024 Predictions, Anyone?</h1>
<p>Please comment on this post with your predictions on what amazing technology and products will make the 2024 list. My money is on autonomous aircraft!</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>The scary combination of “rapid intensification” and “slower decay” in hurricanes</title>
		<link>https://blogs.mathworks.com/headlines/2023/08/30/the-scary-combination-of-rapid-intensification-and-slower-decay-in-hurricanes/?s_tid=feedtopost</link>
		
		<dc:creator><![CDATA[Lisa Harvey]]></dc:creator>
		<pubDate>Wed, 30 Aug 2023 21:19:39 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://blogs.mathworks.com/headlines/?p=3857</guid>

					<description><![CDATA[<div class="overview-image"><!-- Featured Image From URL plugin --> <img src="https://www.mdpi.com/remotesensing/remotesensing-15-00119/article_deploy/html/images/remotesensing-15-00119-g002-550.jpg" alt="" style=""></div><p>The rapid intensification and slower decay in recent hurricanes provide a terrible “one-two” punch, increasing storm surges and wind speeds at landfall while expanding the total area affected by the... <a class="read-more" href="https://blogs.mathworks.com/headlines/2023/08/30/the-scary-combination-of-rapid-intensification-and-slower-decay-in-hurricanes/">read more >></a></p>]]></description>
										<content:encoded><![CDATA[<p>The rapid intensification and slower decay in recent hurricanes provide a terrible “one-two” punch, increasing storm surges and wind speeds at landfall while expanding the total area affected by the storms. The day before Hurricane Idalia made landfall in Florida, it traveled through some of the warmest waters on the planet, making it stronger at landfall and allowing it to travel further before losing its damaging strength.</p>
<p>Scientists are working to understand these storms better and have published papers that cover topics ranging from modeling intensification, improving predictions, and finding ways to minimize loss of property and life from these increasingly frequent disastrous storms.</p>
<h1><strong>Rapid intensification</strong></h1>
<p>On Tuesday afternoon, Idalia strengthened further to a Category 2, with sustained winds of 100 mph. Overnight, it rapidly intensified to a Category 3 and then Category 4, with winds of 130 mph early Wednesday.</p>
<p><div style="width: 512px" class="wp-caption alignnone"><a href="https://assets3.cbsnewsstatic.com/hub/i/r/2023/08/30/1deef1ae-90b6-42d5-9c62-66d9923c726f/thumbnail/620x509/8b6f18e52b7896a45a41c9e42c39e8d9/idalia-path.png?v=a8a34dbc23229a263a09ea92e9d5b7dc" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="" src="https://assets3.cbsnewsstatic.com/hub/i/r/2023/08/30/1deef1ae-90b6-42d5-9c62-66d9923c726f/thumbnail/620x509/8b6f18e52b7896a45a41c9e42c39e8d9/idalia-path.png?v=a8a34dbc23229a263a09ea92e9d5b7dc" alt="" width="502" height="412" /></a><p class="wp-caption-text">Idalia’s projected strength and path through early next week. Image credit: NOAA/CBS News</p></div></p>
<p>&nbsp;</p>
<p>“Rapid intensification is associated with a sharp increase in intensity in a short amount of time, and consequently, the threat posed by the storm significantly increases,” said Phil Klotzbach, a research scientist in the Department of Atmospheric Science at Colorado State University.</p>
<p>According to <a href="https://www.washingtonpost.com/weather/2023/08/29/idalia-rapid-intensification-hurricanes-climate/" target="_blank" rel="noopener">The Washington Post</a>, “In the Atlantic basin, which includes the Gulf of Mexico, <a href="https://www.washingtonpost.com/climate-environment/2022/09/29/ian-hurricane-rapid-intensification-climate/?itid=lk_inline_manual_5" target="_blank" rel="noopener">16 of the 20 hurricanes</a> that formed during 2021 and 2022 rapidly intensified. Since 2017, seven rapidly intensifying storms have strengthened to at least a Category 4 (winds of at least 130 mph) before making landfall in the United States, together causing or contributing to at least 3,381 deaths and resulting in at least $496 billion in damage, according to reports compiled by the National Hurricane Center.”</p>
<p>Until recently, rapidly intensifying storms were less common. Tropical storms historically have taken several days to grow into powerful hurricanes, but with human-caused climate change, rapid intensification is becoming a more common occurrence, Allison Wing, an assistant professor of atmospheric science at Florida State University, told <a href="https://www.cnn.com/2023/08/28/us/idalia-rapid-intensification-florida-climate/index.html" target="_blank" rel="noopener">CNN</a>.</p>
<p>Researchers from the Image Processing Laboratory at the University of Valencia turned to machine learning to predict a hurricane’s intensification potential in the Atlantic and Pacific oceans. Their research, <a href="https://www.mdpi.com/2072-4292/15/1/119" target="_blank" rel="noopener">Advanced Machine Learning Methods for Major Hurricane Forecasting</a>, was published in Remote Sensing. Their framework identifies the most important cloud structural parameters in <a href="https://www.star.nesdis.noaa.gov/goes/floater_index.php" target="_blank" rel="noopener">GOES</a> imagery and uses those structures to determine which storms can potentially evolve into major hurricanes.</p>
<p>&nbsp;</p>
<p><div style="width: 530px" class="wp-caption alignnone"><a href="https://www.mdpi.com/remotesensing/remotesensing-15-00119/article_deploy/html/images/remotesensing-15-00119-g002-550.jpg" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="" src="https://www.mdpi.com/remotesensing/remotesensing-15-00119/article_deploy/html/images/remotesensing-15-00119-g002-550.jpg" alt="" width="520" height="376" /></a><p class="wp-caption-text">The hybrid machine learning approach developed in the study. Image credit: Javier Martinez-Amaya, Cristina Radin, and Veronica Nieves.</p></div></p>
<p>&nbsp;</p>
<p>GOES (Geostationary Operational Environmental Satellite) imagery refers to the images captured by the GOES series of satellites. These satellites are operated by the National Oceanic and Atmospheric Administration (NOAA) and continuously monitor weather conditions from a geostationary orbit. GOES imagery includes visible, infrared, and water vapor channels, which are used to observe and track weather patterns, clouds, storms, and other atmospheric phenomena in real time.</p>
<p>Compared with existing techniques such as statistical analysis, their framework more accurately identified the end intensity of major hurricanes when rapid intensification occurred. They employed a random forest algorithm. Their MATLAB code for sea-level reconstruction data is <a href="https://github.com/AI4OCEANS/ML-Sea-Level-Reconstructions" target="_blank" rel="noopener">available here</a>.</p>
<p>Their study demonstrated that integrating the prominent cloud features of a tropical cyclone, including the anatomy and temperature, in a machine learning approach is suitable as a benchmark for diagnosing a possible transition into a major hurricane.</p>
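<p>A random forest classifier of the kind the study employed can be sketched in a few lines. The study's analysis ran in MATLAB on cloud structural parameters extracted from GOES imagery; the Python version below uses synthetic stand-in features (cloud-top temperature, a symmetry score, and a structure-size parameter are invented placeholders) purely to show the workflow.</p>

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in features per storm: [cloud-top temperature (K),
# eye-region symmetry score, cloud-structure size parameter].
# "Major hurricane" cases (label 1) get colder cloud tops and more
# symmetric, larger structure on average.
n = 300
X0 = rng.normal([200.0, 0.4, 50.0], [8.0, 0.15, 10.0], size=(n, 3))  # non-major
X1 = rng.normal([190.0, 0.7, 65.0], [8.0, 0.15, 10.0], size=(n, 3))  # major
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

A practical virtue of random forests here is the built-in feature-importance ranking, which is how one can ask which cloud structural parameters matter most.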
<h1><strong>Slower decay</strong></h1>
<p>Back to Idalia: &#8220;We&#8217;re [going to] see not just the storm surge but potential for damaging winds extending well inland all the way across portions of north Florida, into southern Georgia, into places like Savannah, Hilton Head. We have hurricane warnings in effect for the fast-moving hurricane. It&#8217;s going to bring those winds really far inland today and tonight,&#8221; Michael Brennan, director of the National Hurricane Center, told &#8220;CBS Mornings&#8221; on Wednesday.</p>
<p>Not only are storms intensifying before landfall, but the same storms also carry more moisture inland, furthering their destructive results. According to CNN, “A 2020 study published in the journal Nature found storms are moving farther inland than they did five decades ago. Hurricanes, which typically weaken after moving over land, have been raging longer after landfall in recent years. The study concludes that warmer sea surface temperatures are leading to a “slower decay” by increasing moisture that a hurricane carries.”</p>
<p>&nbsp;</p>
<p><div style="width: 514px" class="wp-caption alignnone"><a href="https://assets3.cbsnewsstatic.com/hub/i/r/2023/08/29/b164879f-fe24-4a1f-8adb-91ed0eadfec2/thumbnail/620x498/39b265b4667affe5d847ce548dfbd10f/noaa-idalia-rainfall-map.png?v=a8a34dbc23229a263a09ea92e9d5b7dc" target="_blank" rel="noopener"><img decoding="async" loading="lazy" class="" src="https://assets3.cbsnewsstatic.com/hub/i/r/2023/08/29/b164879f-fe24-4a1f-8adb-91ed0eadfec2/thumbnail/620x498/39b265b4667affe5d847ce548dfbd10f/noaa-idalia-rainfall-map.png?v=a8a34dbc23229a263a09ea92e9d5b7dc" alt="" width="504" height="405" /></a><p class="wp-caption-text">Predicted rainfall from Hurricane Idalia on days one through three. Image credit: NOAA/CBS News.</p></div></p>
<p>&nbsp;</p>
<p>In the study referenced by CNN, “<a href="https://www.nature.com/articles/s41586-020-2867-7.epdf?sharing_token=NUSR_Yw9mMuUYZQyPT4ERNRgN0jAjWel9jnR3ZoTv0NzEY9D1JrmvVrrZAzzDtl2CDVtbqbizkBOTB6OLMXtO_S60MUHYTolYHeB-jLrvhlI8EJNL5aSfW1eBtSAI-9zpM6jWuhysOasno7cNxxvS_ck2yvys9XY1oy2lp7yg3a66KAJhZP4h0Ex8Lrsad7rZJFjk6kyzOZWWsh7eNtoXg%3D%3D&amp;tracking_referrer=www.cnn.com" target="_blank" rel="noopener">Slower decay of landfalling hurricanes in a warming world</a>,” the researchers analyzed 50 years of intensity data for hurricanes that made landfall in the North Atlantic. They found that the slowdown in decay over time is directly proportional to the contemporaneous rise in sea surface temperature. Specifically, a typical hurricane in the 1960s lost 75% of its intensity in the first 24 hours after landfall; today, a typical hurricane loses only about 50% over the same 24-hour period, meaning the devastating effects last longer and can travel farther inland.</p>
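<p>If intensity is assumed to decay exponentially after landfall, as in the study&#8217;s model, those two figures imply the decay timescale has roughly doubled. A quick back-of-the-envelope check (an illustrative calculation, not the authors&#8217; code, sketched here in Python rather than MATLAB):</p>

```python
import math

def decay_timescale(fraction_remaining, hours):
    """Timescale tau for exponential decay v(t) = v0 * exp(-t / tau)."""
    return -hours / math.log(fraction_remaining)

# 1960s: 75% of intensity lost in the first 24 hours -> 25% remaining
tau_1960s = decay_timescale(0.25, 24)
# Today: only about 50% lost in the same 24 hours -> 50% remaining
tau_today = decay_timescale(0.50, 24)

print(f"1960s decay timescale:  {tau_1960s:.1f} h")
print(f"Recent decay timescale: {tau_today:.1f} h")  # about twice as long
```

<p>A longer timescale means the storm holds on to its intensity longer after landfall, which is exactly the &#8220;slower decay&#8221; the headline describes.</p>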
<p>They applied this analysis to hurricanes from 1967 onward, using the MATLAB function land_or_ocean to determine which data points to include in the study. The team defined landfall as four consecutive inland data points, taking the first inland point as the intensity at landfall. The land_or_ocean function is available on MATLAB Central <a href="https://www.mathworks.com/matlabcentral/fileexchange/45268-land_or_ocean-m?s_tid=srchtitle_site_search_1_or%20ocean" target="_blank" rel="noopener">File Exchange</a>. The hurricane tracks analyzed in the study are visualized below.</p>
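<p>The landfall rule described above, taking the first point in a run of four consecutive inland fixes, can be sketched as follows. This is a minimal illustration in Python, not the study&#8217;s code: the boolean mask stands in for per-point output from land_or_ocean, and the function name is hypothetical.</p>

```python
def landfall_index(is_inland):
    """Return the index of the first point in a run of four consecutive
    inland track points, or None if the storm never makes landfall."""
    run = 0
    for i, inland in enumerate(is_inland):
        run = run + 1 if inland else 0
        if run == 4:
            return i - 3  # start of the four-point inland run
    return None

# Track classified point by point (True = over land), e.g. by land_or_ocean
track = [False, False, True, False, True, True, True, True, True]
print(landfall_index(track))  # -> 4: the brief touch at index 2 doesn't count
```

<p>Requiring four consecutive inland points filters out storms that merely clip a coastline before moving back over open water.</p>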
<p>&nbsp;</p>
<p><div id="attachment_3875" style="width: 510px" class="wp-caption alignnone"><a href="https://www.nature.com/articles/s41586-020-2867-7.epdf?sharing_token=NUSR_Yw9mMuUYZQyPT4ERNRgN0jAjWel9jnR3ZoTv0NzEY9D1JrmvVrrZAzzDtl2CDVtbqbizkBOTB6OLMXtO_S60MUHYTolYHeB-jLrvhlI8EJNL5aSfW1eBtSAI-9zpM6jWuhysOasno7cNxxvS_ck2yvys9XY1oy2lp7yg3a66KAJhZP4h0Ex8Lrsad7rZJFjk6kyzOZWWsh7eNtoXg%3D%3D&amp;tracking_referrer=www.cnn.com" target="_blank" rel="noopener"><img aria-describedby="caption-attachment-3875" decoding="async" loading="lazy" class="wp-image-3875 " src="https://blogs.mathworks.com/headlines/files/2023/08/hurricanes-in-study.jpg" alt="" width="500" height="311" /></a><p id="caption-attachment-3875" class="wp-caption-text">Hurricanes tracked from 1967 to 1992 (in blue) and 1993 to 2018 (in red).  Image credit: L. Li and P. Chakraborty.</p></div></p>
<p>&nbsp;</p>
<p>MATLAB was also used to determine the decay timescale. The graph below shows the slower decay of storms after 1993, shown in red. The sea surface temperature data is available from <a href="https://www.nhc.noaa.gov/sst/" target="_blank" rel="noopener">NOAA</a>.</p>
<p>&nbsp;</p>
<p><div id="attachment_3878" style="width: 499px" class="wp-caption alignnone"><a href="https://www.nature.com/articles/s41586-020-2867-7.epdf?sharing_token=NUSR_Yw9mMuUYZQyPT4ERNRgN0jAjWel9jnR3ZoTv0NzEY9D1JrmvVrrZAzzDtl2CDVtbqbizkBOTB6OLMXtO_S60MUHYTolYHeB-jLrvhlI8EJNL5aSfW1eBtSAI-9zpM6jWuhysOasno7cNxxvS_ck2yvys9XY1oy2lp7yg3a66KAJhZP4h0Ex8Lrsad7rZJFjk6kyzOZWWsh7eNtoXg%3D%3D&amp;tracking_referrer=www.cnn.com" target="_blank" rel="noopener"><img aria-describedby="caption-attachment-3878" decoding="async" loading="lazy" class="wp-image-3878 size-full" src="https://blogs.mathworks.com/headlines/files/2023/08/hurricane-decay.jpg" alt="" width="489" height="286" /></a><p id="caption-attachment-3878" class="wp-caption-text">Histogram and probability density of intensity. Image credit: L. Li and P. Chakraborty.</p></div></p>
<p>&nbsp;</p>
<h1><strong>Additional climate studies welcome</strong></h1>
<p>This is only a sampling of studies improving our understanding of hurricanes and other climate phenomena. If you are involved in a study that you would like to share, please reach out! I welcome your ideas and guest posts.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
