<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>PlanetArduino</title>
	<atom:link href="http://www.planetarduino.org/?feed=rss2" rel="self" type="application/rss+xml" />
	<link>https://www.planetarduino.org</link>
	<description>all about the Arduino platform</description>
	<lastBuildDate>Sun, 10 May 2026 13:30:43 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.5.8</generator>
	<item>
		<title>Logging Ham Radio repeater usage with a Baofeng and Adafruit IO #HamSunday</title>
		<link>https://blog.adafruit.com/2026/05/10/logging-ham-radio-repeater-usage-with-a-baofeng-and-adafruit-io-hamsunday/</link>
		
		<dc:creator><![CDATA[Anne Barela]]></dc:creator>
		<pubDate>Sun, 10 May 2026 13:30:43 +0000</pubDate>
				<category><![CDATA[arduino]]></category>
		<category><![CDATA[baofeng]]></category>
		<category><![CDATA[ESP32-C3]]></category>
		<category><![CDATA[ham radio]]></category>
		<category><![CDATA[Ham Sunday]]></category>
		<category><![CDATA[radio]]></category>
		<guid isPermaLink="false">https://blog.adafruit.com/?p=656388</guid>

					<description><![CDATA[The Whiskey Tango Hotel blog writes: The Austin N5OAK Ham Radio Club Repeater seems to be pretty active.  That shouldn’t surprise us much because the club has a lot of activities and a lot of enthusiastic members.  Still, we wondered just how active and opened the spare parts drawer to see if it held a solution. Turns […]]]></description>
										<content:encoded><![CDATA[<p><img fetchpriority="high" decoding="async" class="alignnone  wp-image-656389 img-responsive" src="https://cdn-blog.adafruit.com/uploads/2026/05/bootsel.png" alt="" width="416" height="278" srcset="https://cdn-blog.adafruit.com/uploads/2026/05/bootsel.png 450w, https://cdn-blog.adafruit.com/uploads/2026/05/bootsel-300x201.png 300w, https://cdn-blog.adafruit.com/uploads/2026/05/bootsel-150x100.png 150w" sizes="(max-width: 416px) 100vw, 416px" /></p>
<p>The Whiskey Tango Hotel blog writes:</p>
<blockquote><p>The Austin <a href="https://n5oak.org/"  rel="noopener">N5OAK Ham Radio Club</a> Repeater seems to be pretty active.  That shouldn&#8217;t surprise us much because the club has a lot of activities and a lot of enthusiastic members.  Still, we wondered just how active and opened the spare parts drawer to see if it held a solution.</p>
<p>Turns out the spare parts drawer did; take a look at the wiring diagram at the top of the page.  We simply used an Espressif ESP32-C3 Dev Board with an input pin wired to the output speaker of a <a href="https://www.amazon.com/Baofeng-UV-5R-136-174-400-480Mhz-1800mAh/dp/B074XPB313"  rel="noopener">Baofeng HT</a> tuned to the repeater frequency.  We monitor the C3 input pin, and if it sees a signal, the repeater is transmitting.  We push that data to <a href="https://io.adafruit.com/"  rel="noopener">Adafruit IO</a> at the top of each hour to <a href="https://io.adafruit.com/ironjungle/dashboards/n5oak-r-usage-by-whiskeytangohotel-dot-com?kiosk=true"  rel="noopener">graph the history</a>.</p></blockquote>
<p>The post lays out the details and has Arduino code. See it <a href="https://www.whiskeytangohotel.com/2026/05/logging-ham-radio-repeater-usage-with.html"  rel="noopener">here</a>.</p>
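<p>The post's actual Arduino code is at the link above. Purely as a rough Python sketch of the same idea (pin names, sampling rate, and class name are our own, not the author's), the core logic – sample the speaker-output pin, accumulate "repeater active" time, and report a duty-cycle figure each hour – could look like:</p>

```python
# Rough sketch (NOT the post's actual Arduino code) of the repeater
# logger's logic: sample an input pin wired to the radio's speaker
# output, count how often it is active, and report a percent-active
# figure once per hour, which would then be pushed to Adafruit IO.

class DutyCycleLogger:
    def __init__(self, sample_period_s=0.1):
        # At 0.1 s per sample, one hour is ~36,000 samples.
        self.sample_period_s = sample_period_s
        self.active_samples = 0
        self.total_samples = 0

    def sample(self, pin_high):
        """Record one reading of the speaker-output pin."""
        self.total_samples += 1
        if pin_high:
            self.active_samples += 1

    def hourly_report(self):
        """Return percent-active for the past hour, then reset the counters."""
        pct = 100.0 * self.active_samples / max(self.total_samples, 1)
        self.active_samples = self.total_samples = 0
        return pct
```

<p>On the real ESP32-C3 the same loop runs in the Arduino sketch, with the hourly percentage published to an Adafruit IO feed for the dashboard to graph.</p>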
<p><img decoding="async" class="alignnone  wp-image-656392 img-responsive" src="https://cdn-blog.adafruit.com/uploads/2026/05/Ham-Sunday.jpg" alt="" width="329" height="216" srcset="https://cdn-blog.adafruit.com/uploads/2026/05/Ham-Sunday.jpg 379w, https://cdn-blog.adafruit.com/uploads/2026/05/Ham-Sunday-300x197.jpg 300w, https://cdn-blog.adafruit.com/uploads/2026/05/Ham-Sunday-150x99.jpg 150w" sizes="(max-width: 329px) 100vw, 329px" /></p>



]]></content:encoded>
					
		

			</item>
		<item>
		<title>UPDATED LEARN GUIDE: Adafruit VCNL4030 Proximity and Lux Sensor #WipperSnapper #AdafruitLearningSystem @Adafruit</title>
		<link>https://blog.adafruit.com/2026/05/08/updated-learn-guide-adafruit-vcnl4030-proximity-and-lux-sensor-wippersnapper-adafruitlearningsystem-adafruit/</link>
		
		<dc:creator><![CDATA[Tyeth]]></dc:creator>
		<pubDate>Fri, 08 May 2026 15:00:00 +0000</pubDate>
				<category><![CDATA[adafruit learning system]]></category>
		<category><![CDATA[adafruit.io]]></category>
		<category><![CDATA[adafruitio]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[Breakout Boards]]></category>
		<category><![CDATA[circuitpython]]></category>
		<category><![CDATA[light sensor]]></category>
		<category><![CDATA[Lux]]></category>
		<category><![CDATA[lux sensor]]></category>
		<category><![CDATA[no code]]></category>
		<category><![CDATA[optics]]></category>
		<category><![CDATA[proximity and lux sensor]]></category>
		<category><![CDATA[proximity sensor]]></category>
		<category><![CDATA[sensors]]></category>
		<category><![CDATA[Stemma QT]]></category>
		<category><![CDATA[VCNL4030]]></category>
		<category><![CDATA[wippersnapper]]></category>
		<guid isPermaLink="false">https://blog.adafruit.com/?p=656786</guid>

					<description><![CDATA[UPDATED GUIDE: Adafruit VCNL4030 Proximity and Lux Sensor The VCNL4030 is a handy two-in-one sensor, with a proximity sensor that works from 0 to 300mm (about 12 inches) and a light sensor with a range of 0.004 to 16,768 lux. We’ve all been there. That thing is close but how close? When you need to measure a small distance with reasonable accuracy, such as the […]]]></description>
										<content:encoded><![CDATA[<p><img decoding="async" class="img-responsive" src="https://cdn-learn.adafruit.com/guides/images/000/004/499/medium800/6491-00.jpg?1773931853" alt="Small rectangular VCNL4030 proximity and light sensor board with four mounting holes, STEMMA QT connectors, and labeled pins: VIN, GND, 3Vo, SDA, SCL, and INT." /></p>
<p><a href="https://learn.adafruit.com/adafruit-vcnl4030-proximity-and-lux-sensor/wippersnapper">UPDATED GUIDE: Adafruit VCNL4030 Proximity and Lux Sensor</a></p>
<blockquote><p>The VCNL4030 is a handy two-in-one sensor, with a proximity sensor that works from<strong> 0 to 300mm</strong> (about 12 inches) and a light sensor with a range of<strong> 0.004 to 16,768 lux</strong>.</p>
<p>We&#8217;ve all been there. That thing is <em>close</em> but <em>how close?</em> When you need to measure a small distance with reasonable accuracy, such as the rough height of a particularly calm bumble bee, the VCNL4030 Proximity Sensor from Vishay can do that for you. If perchance you also needed to measure the amount of light at the same time, perhaps to let the bee know if it&#8217;s time for bed, you&#8217;re in luck! The VCNL4030 can do that too (bumble bee not included, we tried putting it in the anti-static bag but it started buzzing in a threatening manner).</p></blockquote>
<p>A new page has been added for our no-code <a href="https://learn.adafruit.com/quickstart-adafruit-io-wippersnapper">WipperSnapper</a> firmware, allowing easy sending of data to Adafruit IO.</p>
<p>Read more at <a href="https://learn.adafruit.com/adafruit-vcnl4030-proximity-and-lux-sensor/wippersnapper">Adafruit VCNL4030 Proximity and Lux Sensor</a></p>
]]></content:encoded>
					
		

			</item>
		<item>
		<title>How FermiLabs builds championship-level robots with Arduino</title>
		<link>https://blog.arduino.cc/2026/05/08/how-fermilabs-builds-championship-level-robots-with-arduino/</link>
		
		<dc:creator><![CDATA[Arduino Team]]></dc:creator>
		<pubDate>Fri, 08 May 2026 10:55:15 +0000</pubDate>
				<category><![CDATA[arduino]]></category>
		<category><![CDATA[Giga R1 WiFi]]></category>
		<category><![CDATA[RoboCup]]></category>
		<category><![CDATA[RoboCup Junior Europe]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[Robots]]></category>
		<category><![CDATA[UNO Q]]></category>
		<guid isPermaLink="false">https://blog.arduino.cc/?p=42038</guid>

					<description><![CDATA[<p>After-school workshops run by curious, driven students are where some of the most exciting engineering happens in the Arduino community! One of the most compelling examples of this is FermiLabs, the innovation hub at secondary school IIS “E. Fermi – R. Guttuso” in Giarre, Sicily, offering students afternoon lab sessions in robotics, automation, and experimental […]</p>
<p>The post <a href="https://blog.arduino.cc/2026/05/08/how-fermilabs-builds-championship-level-robots-with-arduino/">How FermiLabs builds championship-level robots with Arduino</a> appeared first on <a href="https://blog.arduino.cc/">Arduino Blog</a>.</p>]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-large"><div class="image-post"><img fetchpriority="high" decoding="async" width="1024" height="558" src="https://blog.arduino.cc/wp-content/uploads/2026/05/image-4-1-1024x558.png" alt="" class="wp-image-42045" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/image-4-1-1024x558.png 1024w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-4-1-300x164.png 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-4-1-768x419.png 768w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-4-1.png 1201w" sizes="(max-width: 1024px) 100vw, 1024px" /></div></figure>



<p>After-school workshops run by curious, driven students are where some of the most exciting engineering happens in the Arduino community! One of the most compelling examples of this is <a href="http://fermilabs.it/">FermiLabs</a>, the innovation hub at secondary school IIS “E. Fermi &#8211; R. Guttuso” in Giarre, Sicily, offering students afternoon lab sessions in robotics, automation, and experimental physics. The results speak for themselves: FermiLabs teams have earned multiple podium positions at <a href="https://www.robocupjunior.eu/">RoboCupJunior Europe</a>, one of the most demanding student robotics competitions in the world.</p>



<p>RoboCupJunior Rescue, in particular, challenges teams to <strong>design, build, and program fully autonomous robots capable of navigating disaster scenarios</strong> – from following lines across obstacle-laden terrain to exploring multi-level mazes and assisting simulated victims. For the 2026 season, two FermiLabs teams are pushing the limits of what student-built robots can do, with Arduino at the core of both machines.</p>



<h2 class="wp-block-heading">Team Tachyons: solving the maze with Arduino GIGA R1 WiFi</h2>



<p>The RoboCupJunior Rescue Maze requires a robot to autonomously explore a complex, multi-level labyrinth, identify victims, and deploy rescue kits with precision. <strong>The 2026 rulebook raised the bar significantly with the introduction of “cognitive targets”</strong> – five concentric colored circles that robots must decode in real-time to classify victim types. This shift from simple colored squares to dense visual patterns demands a substantial leap in processing power and sensor integration.</p>



<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe title="Breaking Ground with Arduino: Team Tachyons at RoboCup Junior" width="500" height="281" src="https://www.youtube.com/embed/mcZd5kCd6Ic?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
</div></figure>



<p>Team Tachyons – who showcased their work during Arduino Days 2026 and are led by YouTuber and TEDx speaker <a href="https://www.youtube.com/watch?v=J3CSU6Z9Upk">Etto Fins</a> – met that challenge by centering their robot on the <a href="https://store.arduino.cc/collections/giga/products/giga-r1-wifi">Arduino GIGA R1 WiFi</a>, leveraging the board’s ability to handle complex, multi-threaded tasks with the reliability and low latency that competitive robotics demands.</p>



<p>The robot’s intelligence lives in a custom-designed Arduino shield that acts as its central nervous system. Four dedicated stepper motor drivers deliver sub-millimeter positioning accuracy, while a six-axis IMU (Inertial Measurement Unit), fused with data from six ToF (Time-of-Flight) distance sensors, feeds a PID control loop that keeps the robot precisely centered within each tile – even on ramps and uneven terrain. On top of all this, the software builds a live 3D matrix to map the labyrinth in real-time, allowing the robot to backtrack and optimize its path autonomously.</p>
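<p>The team's firmware isn't shown here, but the centering idea described above can be sketched in a few lines: a PID loop that drives the lateral error to zero, where the error is simply the difference between the left and right ToF distance readings (equal readings mean the robot is centered in the tile). The gains and sensor names below are illustrative, not the team's actual values.</p>

```python
# Minimal PID sketch of the tile-centering loop described above.
# error = left ToF distance - right ToF distance; a centered robot
# reads equal distances on both sides, so the error is zero.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def steering_correction(pid, left_tof_mm, right_tof_mm, dt=0.01):
    # Sign convention (illustrative): positive output steers right,
    # negative steers left, zero means hold course.
    return pid.update(left_tof_mm - right_tof_mm, dt)
```

<p>In the real robot the correction would be folded into the stepper motor commands each control cycle, with the IMU compensating for ramps and uneven terrain.</p>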



<p>The mechanical design is equally thoughtful. Custom silicone wheels, molded in-house with an airless structure, maximize traction while minimizing weight and absorbing shocks. The rescue kit deployment mechanism uses a compliant mechanism and twin springs to fire rescue cubelets at high velocity – and the kits themselves are engineered with the lowest possible coefficient of restitution, so they drop dead in place when they reach a victim rather than bouncing away.</p>



<p>After a successful showing at the regional selections in Catania, Team Tachyons placed second in the Italian Nationals with a new and improved model based on UNO Q 4GB boards… winning the chance to fly to Incheon, South Korea to compete with the best 3,000 robotics students in the world.</p>



<h2 class="wp-block-heading">Team Yellow Radiators: vision-first line following with Arduino UNO Q</h2>



<p>The Rescue Line challenge tasks a fully autonomous robot with following a black line across a modular arena of tiles, overcoming obstacles, debris, and varying terrain – ultimately locating and rescuing simulated victims before navigating to an extraction zone. <strong>Speed, reliability, and real-time visual processing are everything.</strong></p>



<p>Team Yellow Radiators chose to abandon traditional line-following sensors entirely in favor of a vision-first architecture built around <a href="https://www.arduino.cc/product-uno-q">Arduino UNO Q</a>. This let them unify high-level logic and low-level motor control on a single platform, rather than running them on separate boards.&nbsp;</p>



<p>A Python layer running OpenCV processes real-time camera data to identify the line and read intersection markers, while the Arduino side simultaneously handles the high-frequency motor control loop and sensor integration. A custom communication bridge between the Python vision layer and the Arduino language hardware layer makes this seamless two-brain operation possible.</p>
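<p>The team's OpenCV pipeline itself isn't published in this post, but the core measurement any vision line follower needs is easy to sketch without pulling in OpenCV at all: the horizontal centroid of the dark ("line") pixels in an image row, normalized so that 0 means the line is dead center under the camera. The function name and threshold below are our own illustration.</p>

```python
# Stand-in sketch (no OpenCV) of the core line-detection measurement:
# find the centroid of dark pixels in one grayscale scanline and map it
# to [-1, 1], where 0 means the line is centered under the camera.

def line_offset(row, threshold=60):
    """row: sequence of 0-255 grayscale values for one image row."""
    dark = [x for x, v in enumerate(row) if v < threshold]
    if not dark:
        return None  # line lost: caller should start a search behaviour
    centroid = sum(dark) / len(dark)
    mid = (len(row) - 1) / 2
    return (centroid - mid) / mid
```

<p>In the real system OpenCV computes the equivalent over whole frames (e.g. via image moments on a thresholded mask), and the offset is handed across the bridge to the MCU's motor control loop.</p>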



<p>For the competition, the team built a custom web control panel that transforms how the robot is calibrated on-site. Via a local Wi-Fi network, team members can view live camera buffers, toggle between different image masks to debug line detection in real-time, and adjust color calibration or sensor thresholds wirelessly using on-screen sliders – no code re-upload required. The dashboard even allows direct remote function calls to the Arduino core, so specific subsystems like the rescue kit grabber can be tested manually. In the variable lighting conditions of a competition arena, this kind of live debugging capability is a genuine competitive advantage.</p>
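<p>The panel's implementation isn't public, but the pattern it implies is straightforward: a thread-safe store of calibration values that the web handlers write when a slider moves and the vision loop reads once per frame, so thresholds change live without re-uploading code. Everything below (class and parameter names) is hypothetical.</p>

```python
import threading

# Illustrative pattern only (not the team's code): calibration values
# written by a web control panel and read each frame by the vision loop,
# so tuning happens live over Wi-Fi with no code re-upload.

class CalibrationStore:
    def __init__(self, **defaults):
        self._lock = threading.Lock()
        self._values = dict(defaults)

    def set(self, name, value):
        """Called by the web handler when an on-screen slider moves."""
        with self._lock:
            if name not in self._values:
                raise KeyError(name)
            self._values[name] = value

    def snapshot(self):
        """Called once per frame by the vision loop: a consistent copy."""
        with self._lock:
            return dict(self._values)

# Hypothetical parameters for a color-mask line detector.
cal = CalibrationStore(line_threshold=60, hue_min=20, hue_max=40)
```

<p>The snapshot-per-frame design keeps the vision loop free of partial updates even while several sliders are being dragged at once.</p>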



<p>On the AI side, the team deployed a custom-trained YOLO object detection model using the NCNN runtime, optimized for the UNO Q Arm-based Qualcomm Technologies’ SoC. Their next milestone: enabling GPU passthrough to leverage Vulkan acceleration on the onboard Qualcomm Adreno GPU, further reducing inference latency. Development has been eased significantly by the full Debian OS running on the board, letting the team work directly from VS Code via Remote Development – a proper professional workflow on a compact edge device.</p>



<figure class="wp-block-image size-large"><div class="image-post"><img decoding="async" width="1024" height="870" src="https://blog.arduino.cc/wp-content/uploads/2026/05/image-3-1-1024x870.png" alt="" class="wp-image-42047" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/image-3-1-1024x870.png 1024w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-3-1-300x255.png 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-3-1-768x653.png 768w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-3-1.png 1358w" sizes="(max-width: 1024px) 100vw, 1024px" /></div></figure>



<h2 class="wp-block-heading">From Sicily to the world championship</h2>



<p>Both projects illustrate something FermiLabs has made a habit of demonstrating: that with the right tools, a secondary school team can engineer solutions that rival professional-grade systems. Arduino’s role in both robots isn’t incidental – it’s <strong>the platform that makes rapid iteration, hardware control, and connectivity available to students who want to build things that actually work under pressure</strong>.&nbsp;</p>



<p>After multiple successes at the national level in Catania in April, FermiLabs is now gearing up to take two teams to the RoboCupJunior European Championships in Vienna, and two more to the RoboCup Federation Junior World Championships in South Korea. Follow <a href="http://fermilabs.it/">fermilabs.it on LinkedIn</a> to see their progress, or check out their <a href="https://www.isfermiguttuso.edu.it/call-for-partner-robocup-2026/">call for partners</a> to find out how you can support them.</p>



<p><em>Qualcomm branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries. Arduino, GIGA R1, and UNO are trademarks or registered trademarks of Arduino S.r.l.</em></p>
<p>The post <a href="https://blog.arduino.cc/2026/05/08/how-fermilabs-builds-championship-level-robots-with-arduino/">How FermiLabs builds championship-level robots with Arduino</a> appeared first on <a href="https://blog.arduino.cc/">Arduino Blog</a>.</p>
]]></content:encoded>
					
		

			</item>
		<item>
		<title>Real talk: building with Arduino UNO Q</title>
		<link>https://blog.arduino.cc/2026/05/08/real-talk-building-with-arduino-uno-q/</link>
		
		<dc:creator><![CDATA[Arduino Team]]></dc:creator>
		<pubDate>Fri, 08 May 2026 07:40:16 +0000</pubDate>
				<category><![CDATA[UNO Q]]></category>
		<guid isPermaLink="false">https://blog.arduino.cc/?p=42080</guid>

					<description><![CDATA[<p>We’re bringing the maker community behind the scenes with a new live format: Built with Arduino, a candid conversation between our own Andrea Richetta (Senior Product Manager) from Arduino (for Qualcomm Europe) and Rafik from Kamitronix, the creator behind a smart mirror project built entirely on the Arduino® UNO&#x2122; Q board. No polished demos, no […]</p>
<p>The post <a href="https://blog.arduino.cc/2026/05/08/real-talk-building-with-arduino-uno-q/">Real talk: building with Arduino UNO Q</a> appeared first on <a href="https://blog.arduino.cc/">Arduino Blog</a>.</p>]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-large"><div class="image-post"><img fetchpriority="high" decoding="async" width="1024" height="559" src="https://blog.arduino.cc/wp-content/uploads/2026/05/Arduino.cc-Blogpost-Cover1100x600-1024x559.jpg" alt="" class="wp-image-42081" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/Arduino.cc-Blogpost-Cover1100x600-1024x559.jpg 1024w, https://blog.arduino.cc/wp-content/uploads/2026/05/Arduino.cc-Blogpost-Cover1100x600-300x164.jpg 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/Arduino.cc-Blogpost-Cover1100x600-768x419.jpg 768w, https://blog.arduino.cc/wp-content/uploads/2026/05/Arduino.cc-Blogpost-Cover1100x600.jpg 1100w" sizes="(max-width: 1024px) 100vw, 1024px" /></div></figure>



<p>We&#8217;re bringing the maker community behind the scenes with a new live format: <em>Built with Arduino</em>, a candid conversation between our own Andrea Richetta (Senior Product Manager) from Arduino (for Qualcomm Europe) and Rafik from <a href="https://www.instagram.com/kamitronix/">Kamitronix</a>, the creator behind a <a href="https://www.instagram.com/reel/DSaVUVSjChc/?utm_source=ig_web_copy_link">smart mirror project</a> built entirely on the Arduino<sup>®</sup> UNO<sup><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2122.png" alt="&#x2122;" class="wp-smiley" style="height: 1em; max-height: 1em;" /></sup> Q board.</p>



<p>No polished demos, no scripted walkthrough. Just an honest, back-and-forth discussion about what it&#8217;s actually like to prototype with the UNO Q ecosystem: <a href="https://docs.arduino.cc/software/app-lab/bricks/use-bricks/">Bricks</a>, <a href="https://store.arduino.cc/collections/modulino?utm_source=google&amp;utm_medium=cpc&amp;utm_campaign=EU-Pmax&amp;gad_source=1&amp;gad_campaignid=22591755262&amp;gbraid=0AAAAACbEa85-MakvdQDoHiUBKOMljO3ph&amp;gclid=Cj0KCQjw8PDPBhCeARIsAOJwmWWEo_sS5MkOAfrRzyaQc1fQNzMgUbogI6B4EAXVvVkTAmfpA1LbIK8aAisCEALw_wcB">Modulino</a>, <a href="https://docs.arduino.cc/software/app-lab/">App Lab</a> and all.</p>



<p>Over 40 minutes, we&#8217;ll dig into the real architectural choices every UNO Q developer faces: what belongs on the Linux side, what belongs in the real-time MCU, and how the latest updates from Arduino<sup>®</sup> App Lab reduce the friction in between. The final 20 minutes will be open for audience questions.</p>



<p>Three live quizzes will keep the session interactive. Come ready to participate: <strong>May 13th · 4:00 PM CET</strong>.</p>



<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe title="Built with Arduino -  A live chat with Andrea and Kamitronix" width="500" height="281" src="https://www.youtube.com/embed/zU2P9yFzQq0?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
</div></figure>



<p class="has-text-align-center has-small-font-size"><em>Arduino and UNO and the Arduino logo are trademarks or registered trademarks of Arduino S.r.l.</em></p>
<p>The post <a href="https://blog.arduino.cc/2026/05/08/real-talk-building-with-arduino-uno-q/">Real talk: building with Arduino UNO Q</a> appeared first on <a href="https://blog.arduino.cc/">Arduino Blog</a>.</p>
]]></content:encoded>
					
		

			</item>
		<item>
		<title>One board, two brains? Three ways a dual architecture board makes building simpler</title>
		<link>https://blog.arduino.cc/2026/05/07/one-board-two-brains-three-ways-a-dual-architecture-board-makes-building-simpler/</link>
		
		<dc:creator><![CDATA[Arduino Team]]></dc:creator>
		<pubDate>Thu, 07 May 2026 16:09:30 +0000</pubDate>
				<category><![CDATA[App Lab]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[Arduino App Lab]]></category>
		<category><![CDATA[Arduino UNO Q]]></category>
		<category><![CDATA[Featured]]></category>
		<category><![CDATA[UNO Q]]></category>
		<guid isPermaLink="false">https://blog.arduino.cc/?p=42050</guid>

					<description><![CDATA[<p>Most embedded projects don’t stay simple for long. You start with a microcontroller (MCU), reading sensors and controlling outputs. Then you add connectivity, maybe a user interface, maybe even AI. At that point, a single MCU starts to feel limiting. So you introduce a Linux-based system. Now you have flexibility – but also a new layer […]</p>
<p>The post <a href="https://blog.arduino.cc/2026/05/07/one-board-two-brains-three-ways-a-dual-architecture-board-makes-building-simpler/">One board, two brains? Three ways a dual architecture board makes building simpler</a> appeared first on <a href="https://blog.arduino.cc/">Arduino Blog</a>.</p>]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-large"><div class="image-post"><img fetchpriority="high" decoding="async" width="1024" height="683" src="https://blog.arduino.cc/wp-content/uploads/2026/05/DSC6445-1024x683.jpeg" alt="" class="wp-image-42072" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/DSC6445-1024x683.jpeg 1024w, https://blog.arduino.cc/wp-content/uploads/2026/05/DSC6445-300x200.jpeg 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/DSC6445-768x512.jpeg 768w, https://blog.arduino.cc/wp-content/uploads/2026/05/DSC6445-1536x1024.jpeg 1536w, https://blog.arduino.cc/wp-content/uploads/2026/05/DSC6445-2048x1365.jpeg 2048w" sizes="(max-width: 1024px) 100vw, 1024px" /></div></figure>



<p>Most embedded projects don’t stay simple for long. You start with a microcontroller (MCU), reading sensors and controlling outputs. Then you add connectivity, maybe a user interface, maybe even AI. At that point, a single MCU starts to feel limiting. So you introduce a Linux-based system. Now you have flexibility –&nbsp;but also a new layer of complexity: two processors, two toolchains, and a growing amount of glue code just to keep everything in sync.</p>



<p><strong>You want the flexibility of Linux. You need the precision of real-time control</strong>. The <a href="https://www.arduino.cc/product-uno-q">Arduino<sup>®</sup> UNO<sup><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2122.png" alt="&#x2122;" class="wp-smiley" style="height: 1em; max-height: 1em;" /></sup> Q</a> board is designed to bring these two worlds together and make this friction a thing of the past.</p>



<h2 class="wp-block-heading">A dual-brain architecture gives you the best of two worlds</h2>



<p>UNO Q combines two distinct processing environments on a single board.</p>



<p>A Linux-capable microprocessor (MPU) handles high-level workloads such as networking, AI inference, and application logic. Alongside it, a microcontroller manages real-time I/O, deterministic timing, and direct hardware interaction. This separation is intentional.</p>



<p>The MPU runs tasks that benefit from an operating system: multitasking, connectivity stacks, and model execution. The MCU handles tasks where timing and reliability are critical: reading sensors, generating signals, and controlling actuators.</p>



<figure class="wp-block-image size-full"><div class="image-post"><img decoding="async" width="908" height="559" src="https://blog.arduino.cc/wp-content/uploads/2026/05/arduino_flow_chart_1.jpg" alt="" class="wp-image-42051" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/arduino_flow_chart_1.jpg 908w, https://blog.arduino.cc/wp-content/uploads/2026/05/arduino_flow_chart_1-300x185.jpg 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/arduino_flow_chart_1-768x473.jpg 768w" sizes="(max-width: 908px) 100vw, 908px" /></div></figure>



<p>Instead of forcing one processor to do everything, each side does what it’s best at – and the magic happens when the two “talk” to each other through the UNO Q bridge mechanism.&nbsp;</p>



<p>In practice, this means your Python code can interact directly with hardware-level events handled by the microcontroller (such as a button press, change in temperature, movement, etc.), and your MCU can react to high-level decisions made on the Linux side (e.g. updating a web interface, logging data, or triggering an AI-driven response). Without complex setup, <strong>you’re working within a single, coordinated architecture.</strong></p>
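<p>The actual Bridge API is documented in App Lab; purely to illustrate the shape of that interaction (every name below is a toy, not the real API), the Linux-side Python can be thought of as registering handlers for named events that arrive from the MCU:</p>

```python
# Toy illustration of the event-driven pattern the UNO Q bridge enables.
# This is NOT the real App Lab Bridge API -- just the shape of it: the
# Python side registers handlers, and MCU-side events arrive as named
# messages with a payload.

class ToyBridge:
    def __init__(self):
        self._handlers = {}

    def on(self, event):
        """Decorator: register a Python handler for an MCU event name."""
        def register(fn):
            self._handlers[event] = fn
            return fn
        return register

    def dispatch(self, event, payload):
        """Deliver an incoming MCU event to its registered handler."""
        return self._handlers[event](payload)

bridge = ToyBridge()

@bridge.on("button_pressed")
def handle_button(payload):
    # High-level reaction on the Linux side: update a UI, log data,
    # trigger an AI-driven response, etc.
    return f"button {payload['pin']} pressed"
```

<p>The same pattern runs in reverse for decisions flowing from Linux back down to the MCU's real-time loop.</p>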



<h2 class="wp-block-heading"><strong>Arduino</strong><strong><sup>®</sup></strong><strong> App Lab offers a unified application model</strong></h2>



<p>The dual-brain architecture enables a different coding experience – so the real shift is not just in the hardware, but in how you develop for it.</p>



<figure class="wp-block-image size-large"><div class="image-post"><img decoding="async" width="1024" height="576" src="https://blog.arduino.cc/wp-content/uploads/2026/05/image-6-1024x576.png" alt="" class="wp-image-42070" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/image-6-1024x576.png 1024w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-6-300x169.png 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-6-768x432.png 768w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-6-1536x864.png 1536w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-6-2048x1152.png 2048w" sizes="(max-width: 1024px) 100vw, 1024px" /></div></figure>



<p>With Arduino App Lab, the MPU and MCU are exposed as parts of a single application. </p>



<p>Arduino App Lab provides a unified, single-console environment, so there is no need to switch between separate terminals or tools to monitor the two sides. Within this one interface, developers can follow the logging output from both the primary <em>application</em> processor and the <em>real-time</em> microcontroller in parallel, for a complete, time-correlated view of the entire system’s execution flow.</p>



<p>From a developer perspective, this <strong>removes the need to manually manage communication or synchronization between two separate systems.</strong></p>



<p>The best part? If you want to see how Arduino App Lab works behind the scenes, the GitHub repo contains all the source code – no secrets here! <a href="https://github.com/arduino/arduino-app-lab">If you’re curious, just check it out here</a>.</p>



<h2 class="wp-block-heading">Arduino App Lab AI workflows bridge data insight and real-world action</h2>



<p>Edge AI often becomes complex at the integration stage. Running a model is one thing, but connecting it to real-world signals, managing timing, and triggering actions reliably is where things usually break down.</p>



<p>This is exactly where the dual-brain architecture of the UNO Q changes the game. By combining an MPU running Linux with an MCU handling real-time control, you can naturally split AI workflows: the MPU takes care of model execution and orchestration, while the MCU remains the king of deterministic land.&nbsp;</p>



<p>It’s not just about running AI, it’s about making it fit and work reliably inside a real system.</p>



<figure class="wp-block-image size-full"><div class="image-post"><img loading="lazy" decoding="async" width="919" height="308" src="https://blog.arduino.cc/wp-content/uploads/2026/05/arduino_flow_chart_2-1.jpg" alt="" class="wp-image-42052" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/arduino_flow_chart_2-1.jpg 919w, https://blog.arduino.cc/wp-content/uploads/2026/05/arduino_flow_chart_2-1-300x101.jpg 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/arduino_flow_chart_2-1-768x257.jpg 768w" sizes="auto, (max-width: 919px) 100vw, 919px" /></div></figure>



<p>Arduino App Lab acts as the bridge between these two worlds, enabling seamless data exchange and coordinated execution across the MPU and MCU. <a href="https://blog.arduino.cc/2026/03/04/train-and-deploy-your-own-ai-models-in-arduino-app-lab-now-fully-integrated-with-edge-impulse">With the integration of Edge Impulse</a>, the path from model training to deployment becomes much more direct. You can move from data collection to inference without reworking your entire stack.</p>



<p>Now you can build and deploy custom models in a unified flow: start from the Arduino App Lab “Train New Model,” move to Edge Impulse for training and validation, and deploy back to Arduino App Lab –&nbsp;ready to run across the dual-brain system, from insight to action.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full is-resized"><div class="image-post"><img loading="lazy" decoding="async" width="260" height="128" src="https://blog.arduino.cc/wp-content/uploads/2026/05/unnamed-4-1.png" alt="" class="wp-image-42074" style="aspect-ratio:2.0314979855939446;width:504px;height:auto"/></div></figure>
</div>


<p>You can even switch between different models with a simple click of the mouse!</p>



<figure class="wp-block-image size-large"><div class="image-post"><img loading="lazy" decoding="async" width="1024" height="574" src="https://blog.arduino.cc/wp-content/uploads/2026/05/image-7-1-1024x574.png" alt="" class="wp-image-42079" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/image-7-1-1024x574.png 1024w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-7-1-300x168.png 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-7-1-768x431.png 768w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-7-1-1536x861.png 1536w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-7-1.png 2048w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></div></figure>



<p>If you want to explore the full workflow step by step, you can dive deeper into the <a href="https://docs.arduino.cc/software/app-lab/integrations/ai-models/">dedicated article on training and deploying AI models in App Lab</a>, as well as the overview of the expanding UNO Q ecosystem.</p>



<h2 class="wp-block-heading">From architecture to applications</h2>



<p>This dual-brain approach is not just theoretical – you can already see it in action across different types of projects.</p>



<p>From <a href="https://projecthub.arduino.cc/AndreaRichetta/how-to-install-node-red-on-uno-q-using-docker-0d9c78">installing widely available tools like Node-RED</a> to vision-based inspection systems, image processing can run on the Linux side while the microcontroller handles precise triggering and control. This allows you to process complex visual data without sacrificing timing accuracy. You can even process images and short videos with text prompts to generate descriptions or answers, like in this project where <a href="https://projecthub.arduino.cc/marc-edgeimpulse/running-local-llms-and-vlms-on-the-arduino-uno-q-with-yzma-74e288">local LLMs and VLMs run on UNO Q</a>.</p>



<p>In energy monitoring and smart sensing applications –&nbsp;like <a href="https://projecthub.arduino.cc/jumaanji_2004/afa2026-physicalai-accident-response-system-0f4bdf">this accident response system that leverages physical AI</a> – the MCU continuously samples real-world signals, while the MPU aggregates data, runs analytics, and exposes results through services or dashboards.</p>



<figure class="wp-block-image size-large"><div class="image-post"><img loading="lazy" decoding="async" width="1024" height="768" src="https://blog.arduino.cc/wp-content/uploads/2026/05/image-1-1024x768.jpeg" alt="" class="wp-image-42078" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/image-1-1024x768.jpeg 1024w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-1-300x225.jpeg 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-1-385x289.jpeg 385w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-1-768x576.jpeg 768w, https://blog.arduino.cc/wp-content/uploads/2026/05/image-1.jpeg 1280w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></div></figure>



<h2 class="wp-block-heading">Three reasons, one simpler way to build</h2>



<p>When you put it all together, <a href="https://www.arduino.cc/product-uno-q">UNO Q</a> makes building complex systems simpler for three clear reasons.</p>



<p>First, a single, coordinated setup makes your builds more straightforward and efficient. You have two different brains, each one doing what it’s best at.</p>



<p>Second, the unified application model with Arduino App Lab turns two processors into one coherent development experience. You write, monitor, and debug everything from a single environment – no more switching between terminals, no different hardware for different tasks, no more glue code just to keep the two sides talking.</p>



<p>Third, AI workflows actually fit the system. With Edge Impulse, Qualcomm<sup>®</sup> AI Hub, and Hugging Face all able to integrate into the flow, you can go from data collection to a deployed model without rebuilding your stack along the way. The microprocessor runs the inference, the microcontroller handles the signals, and Arduino App Lab keeps them working together using code and Bricks – so edge AI stops being an integration headache and starts being just another part of your application.</p>



<p>The flexibility of Linux, the precision of real-time control, and a development ecosystem that handles every side of your next project without forcing you to jump between platforms: it’s all in a single board, designed to make building simpler from day one.</p>



<p><em>Qualcomm branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries. Arduino and UNO are trademarks or registered trademarks of Arduino S.r.l.</em></p>
<p>The post <a href="https://blog.arduino.cc/2026/05/07/one-board-two-brains-three-ways-a-dual-architecture-board-makes-building-simpler/">One board, two brains? Three ways a dual architecture board makes building simpler</a> appeared first on <a href="https://blog.arduino.cc/">Arduino Blog</a>.</p>
]]></content:encoded>
					
		
		<enclosure url="" length="0" type="" />

			</item>
		<item>
		<title>This toy box does something incredible with AI-generated video</title>
		<link>https://blog.arduino.cc/2026/05/07/this-toy-box-does-something-incredible-with-ai-generated-video/</link>
		
		<dc:creator><![CDATA[Arduino Team]]></dc:creator>
		<pubDate>Thu, 07 May 2026 11:04:05 +0000</pubDate>
				<category><![CDATA[arduino]]></category>
		<category><![CDATA[generative ai]]></category>
		<category><![CDATA[Toy Box]]></category>
		<guid isPermaLink="false">https://blog.arduino.cc/?p=42055</guid>

					<description><![CDATA[<p>AI video generation may be impressive on a technical level, but typing out a prompt doesn’t exactly feel like creative work. Interaction designer Hun Han wondered how he could make that more of a collaborative experience and that led him to develop something pretty incredible: the Hush toy box. Hush is a small, enclosed lightbox […]</p>
<p>The post <a href="https://blog.arduino.cc/2026/05/07/this-toy-box-does-something-incredible-with-ai-generated-video/">This toy box does something incredible with AI-generated video</a> appeared first on <a href="https://blog.arduino.cc/">Arduino Blog</a>.</p>]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-large"><div class="image-post"><img fetchpriority="high" decoding="async" width="1024" height="683" src="https://blog.arduino.cc/wp-content/uploads/2026/05/bHUph2S1E3B3oYLfobSeILvj72Y.jpg-1024x683.avif" alt="" class="wp-image-42056" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/bHUph2S1E3B3oYLfobSeILvj72Y.jpg-1024x683.avif 1024w, https://blog.arduino.cc/wp-content/uploads/2026/05/bHUph2S1E3B3oYLfobSeILvj72Y.jpg-300x200.avif 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/bHUph2S1E3B3oYLfobSeILvj72Y.jpg-768x512.avif 768w, https://blog.arduino.cc/wp-content/uploads/2026/05/bHUph2S1E3B3oYLfobSeILvj72Y.jpg-1536x1024.avif 1536w, https://blog.arduino.cc/wp-content/uploads/2026/05/bHUph2S1E3B3oYLfobSeILvj72Y.jpg.avif 2048w" sizes="(max-width: 1024px) 100vw, 1024px" /></div></figure>



<p>AI video generation may be impressive on a technical level, but typing out a prompt doesn’t exactly feel like creative work. Interaction designer Hun Han wondered how he could make that more of a collaborative experience and that led him to develop something pretty incredible: <a href="https://hunhan.xyz/hush">the Hush toy box</a>.</p>



<p><a href="https://www.creativeapplications.net/member/hush-bringing-inanimate-objects-to-life/">Hush is a small, enclosed lightbox</a> for photography. Users pose inanimate objects — action figures, clay models, plants, and whatever else they can think of — inside the box, then close the lid. After that, the magic happens: Hush snaps a photo of the scene inside the box and feeds it as a prompt to a video generation AI.</p>



<figure class="wp-block-image size-large"><div class="image-post"><img decoding="async" width="1024" height="576" src="https://blog.arduino.cc/wp-content/uploads/2026/05/6FFC9oZUXWcucRPBZl1UnYI.jpg-copy-1024x576.jpg" alt="" class="wp-image-42059" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/6FFC9oZUXWcucRPBZl1UnYI.jpg-copy-1024x576.jpg 1024w, https://blog.arduino.cc/wp-content/uploads/2026/05/6FFC9oZUXWcucRPBZl1UnYI.jpg-copy-300x169.jpg 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/6FFC9oZUXWcucRPBZl1UnYI.jpg-copy-768x432.jpg 768w, https://blog.arduino.cc/wp-content/uploads/2026/05/6FFC9oZUXWcucRPBZl1UnYI.jpg-copy-1536x864.jpg 1536w, https://blog.arduino.cc/wp-content/uploads/2026/05/6FFC9oZUXWcucRPBZl1UnYI.jpg-copy.jpg 2048w" sizes="(max-width: 1024px) 100vw, 1024px" /></div></figure>



<p>The result is often fantastic, as AI models are now at a point where they do a very good job of generating and rendering realistic video. And in this case, that realistic video incorporates the real-world items in the box. Imagine your LEGO minifigs battling a clay dragon that you sculpted. That is exactly the kind of video Hush can produce and you get to be part of the creative process by deciding what to put in the box and how to pose those things within the scene. You also get control over day versus night and the simulated weather in the scene.</p>



<p>Kling v2.5 Turbo does the heavy lifting of video generation and a PC connects to that via the Replicate API. The physical controls, including the weather selection dial and the Hall effect sensor that detects lid closure, connect to the PC through an Arduino. That board also controls the LED strips that illuminate Hush’s interior. The PC snaps a photo of the scene through OpenCV and a webcam. Finally, the rendered video results display on a repurposed iPhone 6, which is visible through a peephole on the top of the box.&nbsp;</p>
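<p>A minimal sketch of how the weather dial might be handled on the Arduino side: the board reads the dial as a 10-bit analog value (0&#8211;1023) and quantizes it into a handful of weather presets before reporting to the PC. The preset names and the 10-bit range are assumptions for illustration, not details taken from the Hush build itself.</p>

```cpp
#include <cstdint>

// Hypothetical weather presets for the dial; the real build's options
// (beyond day/night and weather) are not specified in the article.
enum Weather : uint8_t { CLEAR = 0, RAIN = 1, SNOW = 2, FOG = 3 };

constexpr int kNumPresets = 4;

// Map a raw 10-bit ADC reading (0-1023) into one of kNumPresets
// equal-width bands, clamping out-of-range values first.
Weather weatherFromDial(int adcValue) {
    if (adcValue < 0) adcValue = 0;
    if (adcValue > 1023) adcValue = 1023;
    int band = adcValue * kNumPresets / 1024;  // 0..kNumPresets-1
    return static_cast<Weather>(band);
}
```

<p>On an actual Arduino this function would be fed from <code>analogRead()</code> in the main loop, with the resulting preset sent to the PC over serial.</p>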



<figure class="wp-block-image size-large"><div class="image-post"><img decoding="async" width="1024" height="683" src="https://blog.arduino.cc/wp-content/uploads/2026/05/VO55P8Dg8AGvJCx9T8CqpOWCJls.png-1-1024x683.avif" alt="" class="wp-image-42060" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/VO55P8Dg8AGvJCx9T8CqpOWCJls.png-1-1024x683.avif 1024w, https://blog.arduino.cc/wp-content/uploads/2026/05/VO55P8Dg8AGvJCx9T8CqpOWCJls.png-1-300x200.avif 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/VO55P8Dg8AGvJCx9T8CqpOWCJls.png-1-768x512.avif 768w, https://blog.arduino.cc/wp-content/uploads/2026/05/VO55P8Dg8AGvJCx9T8CqpOWCJls.png-1-1536x1024.avif 1536w, https://blog.arduino.cc/wp-content/uploads/2026/05/VO55P8Dg8AGvJCx9T8CqpOWCJls.png-1.avif 2048w" sizes="(max-width: 1024px) 100vw, 1024px" /></div></figure>



<p>When it comes to whimsy and entertainment, this might just be the best use of AI that we’ve come across. </p>



<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe loading="lazy" title="Hush" width="500" height="281" src="https://www.youtube.com/embed/S7ws2NcmpiA?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
</div></figure>
<p>The post <a href="https://blog.arduino.cc/2026/05/07/this-toy-box-does-something-incredible-with-ai-generated-video/">This toy box does something incredible with AI-generated video</a> appeared first on <a href="https://blog.arduino.cc/">Arduino Blog</a>.</p>
]]></content:encoded>
					
		
		<enclosure url="" length="0" type="" />

			</item>
		<item>
		<title>An Arduino library for the EByte E22-T series LoRa modules</title>
		<link>https://blog.adafruit.com/2026/05/06/an-arduino-library-for-the-ebyte-e22-t-series-lora-modules/</link>
		
		<dc:creator><![CDATA[Anne Barela]]></dc:creator>
		<pubDate>Wed, 06 May 2026 18:58:26 +0000</pubDate>
				<category><![CDATA[arduino]]></category>
		<category><![CDATA[LoRa]]></category>
		<category><![CDATA[radio]]></category>
		<category><![CDATA[Software]]></category>
		<guid isPermaLink="false">https://blog.adafruit.com/?p=656708</guid>

					<description><![CDATA[LoRa-E22T is an Arduino library for EByte E22-T series LoRa modules based on the Semtech SX1262/SX1268 chipsets. It supports the full range of E22-T module variants across the 230 MHz, 400 MHz, and 900 MHz frequency bands, at both 22 dBm and 30 dBm output power levels. The library provides a complete API to configure the module, send and receive data in Transparent or Fixed addressing modes, use […]]]></description>
										<content:encoded><![CDATA[<p><img fetchpriority="high" decoding="async" width="300" height="300" class="alignnone size-full wp-image-656709 img-responsive" src="https://cdn-blog.adafruit.com/uploads/2026/05/z-3.gif" alt="" /></p>
<p dir="auto"><strong>LoRa-E22T</strong> is an Arduino library for <strong>EByte E22-T</strong> series LoRa modules based on the <strong>Semtech SX1262/SX1268</strong> chipsets. It supports the full range of E22-T module variants across the <strong>230 MHz</strong>, <strong>400 MHz</strong>, and <strong>900 MHz</strong> frequency bands, at both <strong>22 dBm</strong> and <strong>30 dBm</strong> output power levels.</p>
<p dir="auto">The library provides a complete API to configure the module, send and receive data in Transparent or Fixed addressing modes, use power-saving Wake-on-Radio (WOR), hardware encryption, and RSSI reporting &#8211; all with consistent, typed error handling.</p>
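<p>To illustrate what Fixed addressing means in practice: EByte modules are commonly documented as expecting the destination address (high and low byte) and channel prepended to the payload when transmitting in fixed mode. The sketch below shows that framing concept only &#8211; it is not the LoRa-E22T API, which wraps this for you, so check the library&#8217;s own examples for the real calls.</p>

```cpp
#include <cstdint>
#include <vector>

// Build a fixed-mode transmission frame as commonly documented for
// EByte modules: ADDH, ADDL, channel, then the payload bytes.
// Illustrative only; verify against the E22-T manual and library docs.
std::vector<uint8_t> buildFixedFrame(uint16_t destAddr, uint8_t channel,
                                     const std::vector<uint8_t>& payload) {
    std::vector<uint8_t> frame;
    frame.reserve(3 + payload.size());
    frame.push_back(static_cast<uint8_t>(destAddr >> 8));    // ADDH
    frame.push_back(static_cast<uint8_t>(destAddr & 0xFF));  // ADDL
    frame.push_back(channel);                                // target channel
    frame.insert(frame.end(), payload.begin(), payload.end());
    return frame;
}
```

<p>In Transparent mode, by contrast, no header is needed: bytes written to the module are broadcast as-is to all modules configured with the same address and channel.</p>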
<p dir="auto">Check out this MIT licensed project on <a href="https://github.com/alkonosst/LoRa-E22T"  rel="noopener">GitHub</a>.</p>
]]></content:encoded>
					
		
		<enclosure url="" length="0" type="" />

			</item>
		<item>
		<title>A mini electronic saxophone</title>
		<link>https://blog.adafruit.com/2026/05/06/a-mini-electronic-saxophone/</link>
		
		<dc:creator><![CDATA[Anne Barela]]></dc:creator>
		<pubDate>Wed, 06 May 2026 18:37:16 +0000</pubDate>
				<category><![CDATA[arduino]]></category>
		<category><![CDATA[Midi]]></category>
		<category><![CDATA[music]]></category>
		<category><![CDATA[musical instruments]]></category>
		<category><![CDATA[saxophone]]></category>
		<category><![CDATA[Teensy]]></category>
		<guid isPermaLink="false">https://blog.adafruit.com/?p=656703</guid>

					<description><![CDATA[The Circuit Sax (v2) by Nigel is a mini electronic saxophone. It’s made with mechanical switches, custom 3D printed keycaps, and breath input. It’s designed to feel as close to a regular saxophone as possible. Almost anything you can play on a real saxophone, you can translate to the Circuit Sax. Features Mechanical key switches […]]]></description>
										<content:encoded><![CDATA[<p><img fetchpriority="high" decoding="async" width="450" height="361" class="alignnone size-full wp-image-656704 img-responsive" src="https://cdn-blog.adafruit.com/uploads/2026/05/blink-5.png" alt="" srcset="https://cdn-blog.adafruit.com/uploads/2026/05/blink-5.png 450w, https://cdn-blog.adafruit.com/uploads/2026/05/blink-5-300x241.png 300w, https://cdn-blog.adafruit.com/uploads/2026/05/blink-5-150x120.png 150w" sizes="(max-width: 450px) 100vw, 450px" /></p>
<p>The Circuit Sax (v2) by Nigel is a mini electronic saxophone. It&#8217;s made with mechanical switches, custom 3D printed keycaps, and breath input. It’s designed to feel as close to a regular saxophone as possible.</p>
<blockquote><p>Almost anything you can play on a real saxophone, you can translate to the Circuit Sax.</p></blockquote>
<div class="markdown-heading" dir="auto">
<h3 class="heading-element" dir="auto" tabindex="-1">Features</h3>
<p><a id="user-content-features" class="anchor" href="https://github.com/Whackalenso/CircuitSax#features" aria-label="Permalink: Features"></a></p></div>
<ul dir="auto">
<li>Mechanical key switches with custom keycaps meant to mimic the feel of a saxophone</li>
<li>Breath input for articulation of notes and dynamic expression (compatible with any alto saxophone mouthpiece)</li>
<li>Motion detection for vibrato</li>
<li>Audio output via 3.5mm audio jack</li>
<li>MIDI output via USB to a computer or phone</li>
</ul>
<p>It uses a Teensy 4 microcontroller with a Teensy 4 audio shield, programmed in the Arduino IDE with the Teensy extensions.</p>
<p>See all the details on <a href="https://github.com/Whackalenso/CircuitSax"  rel="noopener">GitHub</a>.</p>
]]></content:encoded>
					
		
		<enclosure url="" length="0" type="" />

			</item>
		<item>
		<title>Students build a lactose intolerance breath tester with Arduino® Nano&#x2122; board</title>
		<link>https://blog.arduino.cc/2026/05/06/students-build-a-lactose-intolerance-breath-tester-with-arduino-nano-board/</link>
		
		<dc:creator><![CDATA[Arduino Team]]></dc:creator>
		<pubDate>Wed, 06 May 2026 10:49:29 +0000</pubDate>
				<category><![CDATA[arduino]]></category>
		<category><![CDATA[Breath Tester]]></category>
		<category><![CDATA[Lactose Intolerance]]></category>
		<category><![CDATA[Nano]]></category>
		<guid isPermaLink="false">https://blog.arduino.cc/?p=42032</guid>

					<description><![CDATA[<p>What if your students could build a working biomedical prototype from scratch – one that explains human digestion, gas diffusion, sensor calibration, and programming all at once? That’s exactly what happened at ITTS “E. Divini” in San Severino Marche, Italy, where Professor Lorenzo Morresi and his colleagues Professors Battistini and Capri led a group of […]</p>
<p>The post <a href="https://blog.arduino.cc/2026/05/06/students-build-a-lactose-intolerance-breath-tester-with-arduino-nano-board/">Students build a lactose intolerance breath tester with Arduino® Nano&#x2122; board</a> appeared first on <a href="https://blog.arduino.cc/">Arduino Blog</a>.</p>]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-full"><div class="image-post"><img fetchpriority="high" decoding="async" width="900" height="693" src="https://blog.arduino.cc/wp-content/uploads/2026/05/Mask.jpg" alt="" class="wp-image-42034" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/Mask.jpg 900w, https://blog.arduino.cc/wp-content/uploads/2026/05/Mask-300x231.jpg 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/Mask-768x591.jpg 768w" sizes="(max-width: 900px) 100vw, 900px" /></div></figure>



<p>What if your students could build a working biomedical prototype from scratch – one that explains human digestion, gas diffusion, sensor calibration, and programming all at once? That’s exactly what happened at <a href="https://divini.edu.it/">ITTS “E. Divini”</a> in San Severino Marche, Italy, where Professor Lorenzo Morresi and his colleagues Professors Battistini and Capri led a group of fifth-year chemistry and materials students – Noemi Aloi, Corrado Avellino, Michele Bagoi, Alessandro Fiorani, Priya Kaur, and Matteo Zagaglia – to prototype a hydrogen breath test system using an <a href="https://store.arduino.cc/products/arduino-nano">Arduino Nano</a> board. The project was featured in Italy’s Focus Scuola magazine and is a great example of what’s possible when curiosity meets the right tools.</p>



<h2 class="wp-block-heading">The science behind the idea</h2>



<p>Lactose intolerance isn’t a disease – it’s a condition caused by a deficiency of lactase, the enzyme that breaks down lactose into glucose and galactose in the small intestine. When lactase is absent or insufficient, undigested lactose reaches the colon, where gut bacteria ferment it and produce gases, including hydrogen (H<sub>2</sub>). That hydrogen passes into the bloodstream and is eventually exhaled through the lungs.</p>



<p>This is the principle behind the hydrogen breath test, a diagnostic method used in clinical settings: measuring the concentration of hydrogen in exhaled breath after ingesting lactose can help to detect malabsorption. The project team saw this as a perfect intersection of biochemistry, physics, and electronics – and decided to build it.</p>



<h2 class="wp-block-heading">The hardware: simple, accessible, effective</h2>



<p>The prototype uses three main components. A simple Nano board serves as the brain of the system, programmed in the Arduino language (based on C++) to handle sensor input and data output. A MiCS-5524 gas sensor – sensitive to reducing gases including hydrogen – handles detection across a range of 1 to 1,000 ppm. And to make the device practical to use, the sensor was integrated into a standard aerosol mask, so exhaled breath hits the sensitive element directly. The choice of components was deliberate: accessible, affordable, and <strong>replicable by any school with a basic electronics lab</strong>.</p>



<figure class="wp-block-image size-large"><div class="image-post"><img decoding="async" width="1024" height="835" src="https://blog.arduino.cc/wp-content/uploads/2026/05/Sensor-2-1024x835.jpg" alt="" class="wp-image-42037" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/Sensor-2-1024x835.jpg 1024w, https://blog.arduino.cc/wp-content/uploads/2026/05/Sensor-2-300x245.jpg 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/Sensor-2-768x626.jpg 768w, https://blog.arduino.cc/wp-content/uploads/2026/05/Sensor-2-1536x1252.jpg 1536w, https://blog.arduino.cc/wp-content/uploads/2026/05/Sensor-2-2048x1670.jpg 2048w" sizes="(max-width: 1024px) 100vw, 1024px" /></div></figure>



<h2 class="wp-block-heading">Calibration, protocol, and the scientific method</h2>



<p>The team didn’t stop at assembly. Without access to certified gas cylinders for calibration, students worked from the manufacturer’s logarithmic curves to translate raw electrical signals into hydrogen concentrations expressed in parts per million – a real exercise in dealing with the kind of uncertainty and approximation that comes with actual scientific work.</p>
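<p>The shape of that calibration step can be sketched in code. Reducing-gas sensors like the MiCS-5524 are typically characterized by a straight line on a log-log plot, i.e. ppm = a&#183;(Rs/R0)<sup>b</sup>, where Rs is the sensing resistance and R0 the resistance in clean air. The coefficients below are illustrative placeholders, not the values the students read off the manufacturer&#8217;s curves.</p>

```cpp
#include <cmath>

// Convert a sensing-resistance ratio (Rs/R0) into a gas concentration
// in ppm using a power-law fit of the manufacturer's log-log curve.
// The default coefficients a and b are illustrative assumptions only;
// real values must be read from the MiCS-5524 datasheet curves.
double ppmFromRatio(double rsOverR0, double a = 1.0, double b = -2.0) {
    if (rsOverR0 <= 0.0) return 0.0;  // guard against invalid readings
    return a * std::pow(rsOverR0, b);
}
```

<p>With a negative exponent, a falling Rs/R0 ratio (the sensor’s response to more reducing gas) maps to a rising ppm value, which is the behavior the students’ curves encode.</p>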



<p>Test subjects followed a rigorous protocol: 12 hours of fasting, a baseline measurement, a low-residue diet the day prior, ingestion of milk, and breath measurements every 15 minutes for two hours. Data was then processed in Microsoft Excel to visualize the hydrogen curve over time – and the resulting graphs clearly resembled the hydrogen peaks characteristic of a breath test.</p>



<figure class="wp-block-image size-full"><div class="image-post"><img decoding="async" width="960" height="720" src="https://blog.arduino.cc/wp-content/uploads/2026/05/Risultati.jpg" alt="" class="wp-image-42036" srcset="https://blog.arduino.cc/wp-content/uploads/2026/05/Risultati.jpg 960w, https://blog.arduino.cc/wp-content/uploads/2026/05/Risultati-300x225.jpg 300w, https://blog.arduino.cc/wp-content/uploads/2026/05/Risultati-385x289.jpg 385w, https://blog.arduino.cc/wp-content/uploads/2026/05/Risultati-768x576.jpg 768w" sizes="(max-width: 960px) 100vw, 960px" /></div></figure>



<h2 class="wp-block-heading">A powerful teaching tool, not a medical device</h2>



<p>The team is clear about what the prototype is and isn’t. As Professor Morresi puts it, “We demonstrated the feasibility of our idea and its reproducibility by others. This is not a medical device – but it is a powerful teaching tool that brings together coding, physics, and health in a single lab activity.”</p>



<p>The project covers an impressive spread of curriculum topics in one hands-on experience: the physics of gas diffusion through the circulatory system, the biochemistry of enzyme function and fermentation, analog signal processing with a microcontroller, and the analysis of measurement uncertainty. Future iterations of the project aim to add methane (CH<sub>4</sub>) detection, which would make the results even more diagnostically meaningful.</p>



<h2 class="wp-block-heading">Open and replicable – by design</h2>



<p>One of the most generous aspects of this project is that Professor Morresi has made everything available to other schools: lab worksheets, Arduino code, sensor calibration data, and test protocols. The goal is straightforward – to show that in a technical high school, with good guidance and affordable components, students can turn ideas into working technology, and subjects like physics and chemistry stop being abstract concepts and start being tools for understanding the world.</p>



<p>If you’re a teacher looking to bring a genuinely interdisciplinary project into your classroom – one that connects biochemistry, physics, electronics, and data analysis in a way students can actually build – this one is worth exploring! All project materials are available on <a href="https://morresi.wordpress.com/didattica/formazione-on-line/itis-e-divini-a-s-2025-2026-classi-virtuali/breathtest/">Professor Morresi’s dedicated project page</a> (in Italian).</p>



<p><em>Arduino and Nano are trademarks or registered trademarks of Arduino S.r.l.</em></p>
<p>The post <a href="https://blog.arduino.cc/2026/05/06/students-build-a-lactose-intolerance-breath-tester-with-arduino-nano-board/">Students build a lactose intolerance breath tester with Arduino® Nano<img src="https://s.w.org/images/core/emoji/15.0.3/72x72/2122.png" alt="™" class="wp-smiley" style="height: 1em; max-height: 1em;" /> board</a> appeared first on <a href="https://blog.arduino.cc/">Arduino Blog</a>.</p>
]]></content:encoded>
					
		
		<enclosure url="" length="0" type="" />

			</item>
		<item>
		<title>rp2040js is a Raspberry Pi Pico Emulator for the Wokwi Simulation Platform which runs CircuitPython, Arduino and MicroPython</title>
		<link>https://blog.adafruit.com/2026/05/05/rp2040js-is-a-raspberry-pi-pico-emulator-for-the-wokwi-simulation-platform-which-runs-circuitpython-arduino-and-micropython/</link>
		
		<dc:creator><![CDATA[Anne Barela]]></dc:creator>
		<pubDate>Tue, 05 May 2026 19:16:45 +0000</pubDate>
				<category><![CDATA[arduino]]></category>
		<category><![CDATA[circuitpython]]></category>
		<category><![CDATA[emulation]]></category>
		<category><![CDATA[micropython]]></category>
		<category><![CDATA[pico]]></category>
		<category><![CDATA[Raspberry Pi]]></category>
		<category><![CDATA[Wokwi]]></category>
		<guid isPermaLink="false">https://blog.adafruit.com/?p=656535</guid>

					<description><![CDATA[rp2040js is a Raspberry Pi Pico emulator for the Wokwi Simulation Platform. It blinks, runs Arduino code, and even the MicroPython and CircuitPython REPLs. It’s coded nearly entirely in TypeScript. See more on the GitHub page and on Hackaday.]]></description>
										<content:encoded><![CDATA[<p><img fetchpriority="high" decoding="async" width="480" height="241" class="alignnone size-full wp-image-656538 img-responsive" src="https://cdn-blog.adafruit.com/uploads/2026/05/a-3.jpg" alt="" srcset="https://cdn-blog.adafruit.com/uploads/2026/05/a-3.jpg 480w, https://cdn-blog.adafruit.com/uploads/2026/05/a-3-300x151.jpg 300w, https://cdn-blog.adafruit.com/uploads/2026/05/a-3-150x75.jpg 150w" sizes="(max-width: 480px) 100vw, 480px" /></p>
<div class="markdown-heading" dir="auto">
<p class="heading-element" dir="auto" tabindex="-1">rp2040js is a Raspberry Pi Pico emulator for the <a href="https://wokwi.com/"  rel="nofollow noopener">Wokwi Simulation Platform</a>. It blinks, runs Arduino code, and even the MicroPython and CircuitPython REPLs.</p>
<p dir="auto" tabindex="-1">It&#8217;s coded nearly entirely in TypeScript.</p>
<p dir="auto" tabindex="-1">See more on the <a href="https://github.com/wokwi/rp2040js"  rel="noopener">GitHub page</a> and on <a href="https://hackaday.io/project/177082-raspberry-pi-pico-emulator"  rel="noopener">Hackaday</a>.</p>
</div>
]]></content:encoded>
					
		
		<enclosure url="" length="0" type="" />

			</item>
	</channel>
</rss>
