<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Digital Urban</title>
	<atom:link href="http://www.digitalurban.org/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.digitalurban.org/</link>
	<description>Data, Cities, IoT, Writing, Music and Making Things</description>
	<lastBuildDate>Mon, 01 Dec 2025 16:49:45 +0000</lastBuildDate>
	<language>en-GB</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	

<image>
	<url>https://www.digitalurban.org/wp-content/uploads/2012/07/Dulogosm-1.png</url>
	<title>Digital Urban</title>
	<link>https://www.digitalurban.org/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>The Album: Place and Space</title>
		<link>https://www.digitalurban.org/blog/2025/12/01/the-album-place-and-space/</link>
		
		<dc:creator><![CDATA[Andy]]></dc:creator>
		<pubDate>Mon, 01 Dec 2025 16:34:27 +0000</pubDate>
				<category><![CDATA[Music]]></category>
		<guid isPermaLink="false">https://www.digitalurban.org/?p=170079144</guid>

					<description><![CDATA[<p>We are pleased to announce that the Album &#8211; Place and Space is now available across all streaming formats or to purchase via Apple Music/Amazon. Simply search for Digital Urban...</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/12/01/the-album-place-and-space/">The Album: Place and Space</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>We are pleased to announce that the Album &#8211; Place and Space is now available across all streaming formats or to purchase via <a href="https://music.apple.com/us/album/place-and-space/1856448273">Apple Music</a>/<a href="https://www.amazon.co.uk/dp/B0G3TMYGMZ/ref=sr_1_1?crid=WHIFIRLY91OX&amp;dib=eyJ2IjoiMSJ9.z6RNTxC94kcNG1ouAB9cFi-dpXk_5XFZ9jVLyx6VbP-0K585ZRjns28oSgtNC0IEaSVJQ_fmFa9bIU3qiyC_o09WwWwdr9iTm7GITSwm_wa0oZVReXi-D8VsQafwS2poxpF5MBRvVjS_ikSCu-FcodfU3yZWrgZj_IUlLw7xlWWCN9SCEw3KoNWX0rZfAqh9DtuAPniP5BxJEq6cTzdN-uLdr-wsDjfXmAJkQq9Bb9U.0-Go17prYNgmfpr2OnnTselH_2C4AMhNf7Lu0fNVG2E&amp;dib_tag=se&amp;keywords=place+and+space&amp;qid=1764607004&amp;s=dmusic&amp;sprefix=place+and+space%2Cdigital-music%2C83&amp;sr=1-1">Amazon</a>. Simply search for Digital Urban or Place and Space on your platform of choice.</p>
<p><iframe style="border-radius: 12px;" src="https://open.spotify.com/embed/album/6zrN9Mgj3pf7xvF6GXXqsf?utm_source=generator" width="100%" height="352" frameborder="0" allowfullscreen="allowfullscreen" data-testid="embed-iframe"></iframe></p>
<p>Place and Space is themed around our recent jointly authored book &#8216;<a href="https://www.digitalurban.org/blog/2025/10/27/cities-in-the-metaverse-book/">Cities in the Metaverse: </a>Spatial Computing, Digital Twins, Avatars, Economics and Digital Habitation on the New Frontier&#8217; and consists of 10 tracks:</p>
<p class="p1">Digital Habitation</p>
<p class="p1">The New Frontier</p>
<p class="p1">Simulate</p>
<p class="p1">Invisible</p>
<p class="p1">Play the Game of Life</p>
<p class="p1">Mirror Worlds</p>
<p class="p1">Fork the Branch</p>
<p class="p1">Covariance Matrix</p>
<p class="p1">Let me See (in Augmented Reality)</p>
<p>We hope you enjoy it; over the coming weeks we will have a &#8216;making of&#8217; post&#8230;</p>
<p>Andy</p>
<p>&nbsp;</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/12/01/the-album-place-and-space/">The Album: Place and Space</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Stepper Motor Linear Data Gauge</title>
		<link>https://www.digitalurban.org/blog/2025/11/13/stepper-motor-linear-data-gauge/</link>
		
		<dc:creator><![CDATA[Andy]]></dc:creator>
		<pubDate>Thu, 13 Nov 2025 12:01:34 +0000</pubDate>
				<category><![CDATA[Making]]></category>
		<category><![CDATA[Data]]></category>
		<category><![CDATA[gauge]]></category>
		<category><![CDATA[iot]]></category>
		<category><![CDATA[stepper]]></category>
		<category><![CDATA[Weather]]></category>
		<guid isPermaLink="false">https://www.digitalurban.org/?p=170079124</guid>

					<description><![CDATA[<p>The latest upload to the Open Gauges Github is a Linear Gauge, using a timing belt to provide a full 1 metre range for the data visualisation. It uses a...</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/11/13/stepper-motor-linear-data-gauge/">Stepper Motor Linear Data Gauge</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>The latest upload to the <a href="https://github.com/ucl-casa-ce/Open-Gauges/tree/main">Open Gauges Github</a> is a Linear Gauge, using a timing belt to provide</p>
<div id="attachment_170079126" style="width: 210px" class="wp-caption alignright"><img fetchpriority="high" decoding="async" aria-describedby="caption-attachment-170079126" class="wp-image-170079126 size-large" src="https://www.digitalurban.org/wp-content/uploads/2025/11/WindStepperLinear-200x1024.png" alt="Linear Stepper Motor Gauge" width="200" height="1024" srcset="https://www.digitalurban.org/wp-content/uploads/2025/11/WindStepperLinear-200x1024.png 200w, https://www.digitalurban.org/wp-content/uploads/2025/11/WindStepperLinear-300x1536.png 300w, https://www.digitalurban.org/wp-content/uploads/2025/11/WindStepperLinear-400x2048.png 400w, https://www.digitalurban.org/wp-content/uploads/2025/11/WindStepperLinear.png 402w" sizes="(max-width: 200px) 100vw, 200px" /><p id="caption-attachment-170079126" class="wp-caption-text">Linear Stepper Motor Gauge</p></div>
<p>a full 1 metre range for the data visualisation. It uses a stepper motor for precise needle movement and a limit switch for calibration, and it can be adapted to any MQTT data feed and any base mount. The example shown uses a 5cm by 10cm piece of wood, cut to 1 metre length, and indicates wind speed from 0 to 60 mph.</p>
<p class="p1">The design uses a stepper motor (like the 28BYJ-48), which offers high-precision, 360-degree movement without the jitter or limited range of a standard servo. The limit switch allows the gauge to &#8220;home&#8221; itself on startup, ensuring the pointer always starts at a known zero position.</p>
<p class="p1">The main code &#8211; <span class="s1">WindStepperTimerBeltwithLimitSwitch.ino</span> &#8211; in the GitHub repository has a distance calibration number; adjust it for your range.</p>
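<p>The calibration maths behind that number can be sketched as follows. This is illustrative Python, not the actual .ino, and every figure is an assumption: a 28BYJ-48 in half-step mode (4096 steps per revolution) driving a GT2 belt (2 mm pitch) on a 20-tooth pulley, i.e. 40 mm of travel per revolution.</p>

```python
# Sketch of the linear-gauge calibration maths (illustrative Python,
# not the project's .ino). All figures are assumptions for a
# 28BYJ-48 (half-step, 4096 steps/rev) on a 20-tooth GT2 pulley.

STEPS_PER_REV = 4096
MM_PER_REV = 40.0    # 20 teeth x 2 mm GT2 pitch
TRAVEL_MM = 1000.0   # 1 metre scale length
MAX_MPH = 60.0       # full-scale wind speed

def mph_to_steps(mph):
    """Map a wind speed onto a step count from the homed zero position."""
    mph = max(0.0, min(mph, MAX_MPH))   # clamp to the scale
    mm = mph / MAX_MPH * TRAVEL_MM      # linear position along the belt
    return round(mm * STEPS_PER_REV / MM_PER_REV)
```

<p>With these assumed figures, full scale (60 mph) works out at 102,400 steps; swap in your own pulley and scale length to recalibrate.</p>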
<p class="p1"><b>Hardware Components</b></p>
<ul class="ul1">
<li class="li1">Arduino-compatible Board: Any board like an Arduino Uno, Nano, or a NodeMCU.</li>
<li class="li1">Stepper Motor: 28BYJ-48 5V stepper motor.</li>
<li class="li1">Stepper Driver: ULN2003 driver board (which often comes with the 28BYJ-48).</li>
<li class="li1">Limit Switch: A small microswitch to detect the pointer&#8217;s zero position.</li>
<li class="li1">Power Supply: USB.</li>
<li class="li1">Timer Belt <a href="https://www.amazon.co.uk/Timing-Pulley-Tensioner-Torsion-Printer/dp/B0C54ZXM88/ref=sxin_15_pa_sp_search_thematic_sspa?content-id=amzn1.sym.0a6bbb1a-ed2d-4392-adfc-40ed1cfcd8e2%3Aamzn1.sym.0a6bbb1a-ed2d-4392-adfc-40ed1cfcd8e2&amp;crid=L07UDXXKCZFX&amp;cv_ct_cx=timing%2Bbelt%2Bgt2&amp;keywords=timing%2Bbelt%2Bgt2&amp;pd_rd_i=B0C54ZXM88&amp;pd_rd_r=12141bbe-35d0-4c0a-8811-d7f399206de4&amp;pd_rd_w=AVwDb&amp;pd_rd_wg=JG5Mx&amp;pf_rd_p=0a6bbb1a-ed2d-4392-adfc-40ed1cfcd8e2&amp;pf_rd_r=H6T6R7VAGD99FGNPH875&amp;qid=1763028887&amp;sbo=RZvfv%2F%2FHxDF%2BO5021pAnSA%3D%3D&amp;sprefix=timer%2Bbelt%2Bgt2%2Caps%2C99&amp;sr=1-5-ad3222ed-9545-4dc8-8dd8-6b2cb5278509-spons&amp;aref=vwr3X339Nm&amp;sp_csd=d2lkZ2V0TmFtZT1zcF9zZWFyY2hfdGhlbWF0aWM&amp;th=1"><span class="s2">GT2 Timer Belt</span></a></li>
</ul>
<p>It is made to be as simple to build/power as possible, but also adaptable for a number of scenarios.</p>
<p>The GitHub repository provides the mount for the stepper motor, the pointer (which also joins together the timing belt), the limit switch and the end mount for the pulley. These allow the gauge to be adapted to any size required.</p>
<p>At the moment the gauge is sitting on the wall in our lounge and it has become one of our most used gauges. The data updates every minute to show the maximum wind gust and, due to the nature of the stepper motor, it provides a smooth movement, almost replicating the gusts of wind.</p>
<p>&nbsp;</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/11/13/stepper-motor-linear-data-gauge/">Stepper Motor Linear Data Gauge</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Ships Lamp Wind Speed Gauge</title>
		<link>https://www.digitalurban.org/blog/2025/11/12/ships-lamp-wind-speed-gauge/</link>
		
		<dc:creator><![CDATA[Andy]]></dc:creator>
		<pubDate>Wed, 12 Nov 2025 11:39:29 +0000</pubDate>
				<category><![CDATA[Making]]></category>
		<category><![CDATA[Data]]></category>
		<category><![CDATA[iot]]></category>
		<category><![CDATA[micropytho]]></category>
		<category><![CDATA[physical objects]]></category>
		<guid isPermaLink="false">https://www.digitalurban.org/?p=170079118</guid>

					<description><![CDATA[<p>Our MicroPython project turns a strip of NeoPixel LEDs into a “ship&#8217;s lamp&#8221; style wind speed gauge. It connects to an MQTT broker to receive real-time wind speed data and...</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/11/12/ships-lamp-wind-speed-gauge/">Ships Lamp Wind Speed Gauge</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Our MicroPython project turns a strip of NeoPixel LEDs into a “ship&#8217;s lamp&#8221; style wind speed gauge. It connects to an MQTT broker to receive real-time wind speed data and translates it into a flickering, color-coded light.</p>
<p>The light simulates an oil lamp by staying &#8220;steady&#8221; for a random period and then &#8220;flickering&#8221; for a short time by rapidly dimming and brightening &#8211; the flicker speed changes according to the wind speed. The full code and details are available on <a href="https://github.com/ucl-casa-ce/Open-Gauges">GitHub as part of the ever-growing Open Gauges project</a>.</p>
<div id="attachment_170079120" style="width: 1034px" class="wp-caption aligncenter"><img decoding="async" aria-describedby="caption-attachment-170079120" class="wp-image-170079120 size-large" src="https://www.digitalurban.org/wp-content/uploads/2025/11/shipslamp2025-1024x768.jpeg" alt="Ships Lamp" width="1024" height="768" srcset="https://www.digitalurban.org/wp-content/uploads/2025/11/shipslamp2025-1024x768.jpeg 1024w, https://www.digitalurban.org/wp-content/uploads/2025/11/shipslamp2025-300x225.jpeg 300w, https://www.digitalurban.org/wp-content/uploads/2025/11/shipslamp2025-768x576.jpeg 768w, https://www.digitalurban.org/wp-content/uploads/2025/11/shipslamp2025-1536x1152.jpeg 1536w, https://www.digitalurban.org/wp-content/uploads/2025/11/shipslamp2025.jpeg 2016w" sizes="(max-width: 1024px) 100vw, 1024px" /><p id="caption-attachment-170079120" class="wp-caption-text">Ships Lamp</p></div>
<h2>Features</h2>
<ul>
<li><strong>Real-time Data:</strong> Connects to an MQTT broker to subscribe to a wind speed topic.</li>
<li><strong>Weather Map Gradient:</strong> Displays wind speed using an intuitive &#8220;weather map&#8221; color gradient:
<ul>
<li><strong>0 mph:</strong> Off (Black)</li>
<li><strong>1-10 mph:</strong> Solid Green</li>
<li><strong>10-20 mph:</strong> Fades from Green → Yellow</li>
<li><strong>20-30 mph:</strong> Fades from Yellow → Orange</li>
<li><strong>30-40 mph:</strong> Fades from Orange → Red</li>
<li><strong>40+ mph:</strong> Solid Red</li>
</ul>
</li>
<li><strong>Realistic Flicker Effect:</strong> The light doesn&#8217;t just stay solid; it cycles between a &#8220;steady&#8221; phase (10-60s) and a &#8220;flicker&#8221; phase (5-15s) to simulate a real lamp.</li>
<li><strong>Asynchronous &amp; Resilient:</strong> Built using <code>uasyncio</code> and <code>mqtt_as</code>. The <code>mqtt_as</code> library automatically handles and recovers from WiFi or MQTT broker disconnections, re-subscribing to topics as needed.</li>
<li><strong>Hardware Watchdog:</strong> Uses the Pico&#8217;s built-in <code>machine.WDT</code> (Watchdog Timer) to automatically reboot the device <em>only</em> if the main code loop freezes, ensuring high reliability. (This replaces the old 60-minute timer).</li>
<li><strong>Status LEDs:</strong> Provides a heartbeat flash on one LED and a WiFi status indicator on another.</li>
</ul>
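<p>The colour gradient above can be expressed as a small lookup function. This is a plain-Python sketch of that mapping (the function name and exact interpolation are my own; the project&#8217;s MicroPython code may differ):</p>

```python
def wind_to_rgb(mph):
    """Map wind speed (mph) to an (R, G, B) tuple per the gradient:
    0 off, 1-10 green, 10-20 green->yellow, 20-30 yellow->orange,
    30-40 orange->red, 40+ solid red. Interpolation is linear."""
    if mph >= 40:
        return (255, 0, 0)                       # solid red
    if mph >= 30:                                # orange -> red
        t = (mph - 30) / 10
        return (255, int(165 * (1 - t)), 0)
    if mph >= 20:                                # yellow -> orange
        t = (mph - 20) / 10
        return (255, int(255 - 90 * t), 0)
    if mph >= 10:                                # green -> yellow
        t = (mph - 10) / 10
        return (int(255 * t), 255, 0)
    if mph >= 1:
        return (0, 255, 0)                       # solid green
    return (0, 0, 0)                             # off (calm)
```

<p>Each NeoPixel in the strip is then filled with this colour, and the flicker phase simply scales the brightness up and down.</p>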
<h2>Hardware Requirements</h2>
<p><img decoding="async" src="https://github.com/ucl-casa-ce/Open-Gauges/raw/main/Contributed/ShipsLamp/shipslamp.jpeg" alt="MQTT Ships Lamp" /></p>
<ul>
<li><strong>Raspberry Pi Pico W:</strong> (or any WiFi-capable MicroPython board).</li>
<li><strong>NeoPixel LED Strip:</strong> The code is configured for a strip, but any WS2812B/NeoPixel-compatible LEDs will work.</li>
<li><strong>Power Supply:</strong> A sufficient power supply for your LED strip (a strip of 60 LEDs can draw several amps at full brightness).</li>
<li><strong>A ship&#8217;s lamp (old or new).</strong></li>
</ul>
<h3>Default Pinout (Pico W)</h3>
<ul>
<li><strong>NeoPixel Data:</strong> <code>GP15</code></li>
<li><strong>Blue LED (Heartbeat):</strong> <code>blue_led</code> (defined in <code>config.py</code>, often the onboard LED).</li>
<li><strong>WiFi LED:</strong> <code>wifi_led</code> (defined in <code>config.py</code>).</li>
</ul>
<h2>Software &amp; Dependencies</h2>
<p>This project relies on a few key MicroPython libraries that you must have on your Pico:</p>
<ol>
<li><strong><code>neopixel.py</code>:</strong> The standard Adafruit NeoPixel library for MicroPython.</li>
<li><strong><code>mqtt_as.py</code>:</strong> A robust, asynchronous MQTT client. You can find it <a href="https://github.com/peterhinch/micropython-mqtt/blob/master/mqtt_as/mqtt_as.py" target="_blank" rel="noopener noreferrer">here</a>.</li>
<li><strong><code>config.py</code>:</strong> A file you must create to hold your credentials and pin definitions.</li>
</ol>
<h2>Configuration</h2>
<p>You <strong>must</strong> create a <code>config.py</code> file in the root of your Pico&#8217;s filesystem. This file should contain:</p>
<ol>
<li>Your WiFi and MQTT broker credentials.</li>
<li>Definitions for your <code>wifi_led</code> and <code>blue_led</code>.</li>
</ol>
<p>The <code>mqtt_as</code> library expects the <code>config.py</code> to contain a <code>config</code> dictionary.</p>
<p><strong>Example <code>config.py</code>:</strong></p>
<pre class="wp-block-code"><code># config.py
from machine import Pin
from mqtt_as import config  # mqtt_as supplies the base config dictionary

# --- WiFi Configuration ---
config['wifi_led'] = Pin("WL_GPIO0", Pin.OUT) # Onboard LED on Pico W
config['ssid'] = 'YOUR_WIFI_SSID'
config['wifi_pw'] = 'YOUR_WIFI_PASSWORD'

# --- MQTT Configuration ---
# This example is for the open broker mqtt.cetools.org
config['server'] = 'mqtt.cetools.org'
config['port'] = 1884
config['client_id'] = 'pico_ships_lamp' # Or any unique ID

# --- Optional: For Secured Brokers ---
# If your broker requires a username and password, add these lines:
# config['user'] = 'YOUR_MQTT_USER'
# config['password'] = 'YOUR_MQTT_PASSWORD'

# --- Other Hardware ---
# This is for the heartbeat LED
blue_led = Pin(10, Pin.OUT) # Example: an external LED on GP10
</code></pre>
<h2>Running the Project</h2>
<ol>
<li>Upload <code>main.py</code>, <code>neopixel.py</code>, <code>mqtt_as.py</code>, and your <code>config.py</code> to your Raspberry Pi Pico.</li>
<li>Reset the device.</li>
<li>The device will automatically connect to your WiFi and MQTT broker.</li>
<li>It will subscribe to the topic <code>personal/ucfnaps/downhamweather/windSpeed_mph</code>.</li>
<li>As messages are published to that topic, the ship&#8217;s lamp will spring to life!</li>
</ol>
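<p>MQTT payloads arrive as raw bytes, so before the lamp can colour itself it has to decode the message into a number. A minimal sketch of that step (the function name is my own; the guard values are assumptions, chosen so a malformed message leaves the lamp dark rather than crashing the loop):</p>

```python
def parse_wind_payload(payload):
    """Decode an MQTT payload (bytes) into a wind speed in mph.
    Returns 0.0 for malformed or negative readings so the lamp
    fails dark instead of raising inside the async loop."""
    try:
        return max(0.0, float(payload.decode().strip()))
    except (ValueError, UnicodeError):
        return 0.0
```
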
<h2>Customizing</h2>
<ul>
<li><strong>LED Count:</strong> Change the <code>numpix</code> variable at the top of <code>main.py</code> to match your strip.</li>
<li><strong>Data Pin:</strong> Change the <code>15</code> in <code>pixels = Neopixel(numpix, 0, 15, "GRB")</code> to match your data pin.</li>
<li><strong>MQTT Topic:</strong> Change the topic name in the <code>conn_han</code> function to subscribe to your own data source.</li>
</ul>
<p>The post <a href="https://www.digitalurban.org/blog/2025/11/12/ships-lamp-wind-speed-gauge/">Ships Lamp Wind Speed Gauge</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Will AI Push the Human Planner to the Point of Irrelevance?</title>
		<link>https://www.digitalurban.org/blog/2025/11/06/will-ai-push-the-human-planner-to-the-point-of-irrelevance/</link>
		
		<dc:creator><![CDATA[Andy]]></dc:creator>
		<pubDate>Thu, 06 Nov 2025 11:52:28 +0000</pubDate>
				<category><![CDATA[planning]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Planning]]></category>
		<category><![CDATA[writing]]></category>
		<guid isPermaLink="false">https://www.digitalurban.org/?p=170079103</guid>

					<description><![CDATA[<p>The post <a href="https://www.digitalurban.org/blog/2025/11/06/will-ai-push-the-human-planner-to-the-point-of-irrelevance/">Will AI Push the Human Planner to the Point of Irrelevance?</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></description>
										<content:encoded><![CDATA[
		<div id="fws_69ca487ea5a8d"  data-column-margin="default" data-midnight="dark"  class="wpb_row vc_row-fluid vc_row"  style="padding-top: 0px; padding-bottom: 0px; "><div class="row-bg-wrap" data-bg-animation="none" data-bg-animation-delay="" data-bg-overlay="false"><div class="inner-wrap row-bg-layer" ><div class="row-bg viewport-desktop"  style=""></div></div></div><div class="row_col_wrap_12 col span_12 dark left">
	<div  class="vc_col-sm-12 wpb_column column_container vc_column_container col no-extra-padding inherit_tablet inherit_phone "  data-padding-pos="all" data-has-bg-color="false" data-bg-color="" data-bg-opacity="1" data-animation="" data-delay="0" >
		<div class="vc_column-inner" >
			<div class="wpb_wrapper">
				
<div class="wpb_text_column wpb_content_element " >
	<div class="wpb_wrapper">
		<hr class="wp-block-separator has-alpha-channel-opacity" />
<p>The title is, of course, controversial. The question, however, comes from the closing section of a 2020 paper by Wargent, M., Moore, T., &amp; Tomaney, J. (2020), and arguably the answer is looking like Yes, and soon. The impact will be profound, affecting not only the day-to-day professionals on the ground practising the art of planning but also the various planning schools around the country.</p>
<p>The role of Planning in the UK is clearly at a crossroads, in the line of sight of spending cuts and AI, while the concept of Digital Planning finally comes into focus. There is the excellent Digital Task Force, the Connected Places Catapult and others looking at the future of the planning system; indeed, my own department at University College London suggested an &#8216;Online Planning&#8217; system back in 2002. At that point, tech was seen as &#8216;for nerds&#8217;, as the RTPI Magazine wonderfully retitled our work, somehow failing to grasp the importance of digital to the future of the profession.</p>
<p>Fast forward 20 years and Digital Planning is finally a thing, but it is arguably too late: the digital technology planners are racing to embrace is the very technology that will replace them. Such views are perhaps controversial, some would say clickbait, but this seems to be the elephant in the room. At the various Catapult, government, academic and social events, no one really seems to be doing a proper future cast, and it&#8217;s not even the distant future; it&#8217;s merely looking 5 to 10 years out. A future where not only the planner but also the planning schools could be replaced by the technology the sector failed to see coming.</p>
<p>The smoking gun in this is the equally controversial £8.33 million tender for the &#8220;MHCLG Augmented Planning Decisions&#8221; framework.</p>
<p>The tender seeks to develop a planning tool that enables AI-augmented decision-making for planning applications. The initial focus will be on householder developments (as defined in the Town and Country Planning (Development Management Procedure) (England) Order 2015), with a view to expanding into further application types within the &#8216;other&#8217; category (those not classified as Major or Minor), which represent 69% of all planning applications. The objective is to dramatically reduce planning application processing times, initially targeting a reduction from upwards of 8 weeks to circa 4 weeks, with a long-term vision of near-instant decisions for straightforward applications.</p>
<p>It signals the shift from people talking about AI to actively making it part of the system. This framework looks to create a government &#8220;App Store,&#8221; allowing 350+ local councils to instantly procure AI-driven tools to digitise their plans, automate validation, and process applications.</p>
<p>But this move also raises two questions. First, why is the government fumbling with an £8.3 million framework when it could just ask Google, Elon Musk or Microsoft to fix the problem?</p>
<p>And second, the question that really matters to thousands of people on the ground: reading between the lines, it is actually about cost savings across local government. People will of course deny this, but in reality it uses technology to automate the system and thus design out the planner.</p>
<p>It&#8217;s tempting to look at that £8.3 million figure and see it as a sign the government is &#8220;late&#8221; or &#8220;cheap,&#8221; especially when Big Tech firms wield billion-dollar AI models. But this misunderstands the problem. The government will, of course, have talked to the big players, but the problem is more complex. Google&#8217;s AI is &#8220;horizontal&#8221;: it knows a little about everything. UK planning is a &#8220;vertical&#8221; problem, requiring deep, specialist knowledge of a niche, legally complex system. A generalist AI doesn&#8217;t know what a Section 106 agreement is, nor does it care about the specific, contradictory policies of 350 different local councils. Arguably the technology is moving so rapidly that people in our department already say they could build it in a week, and in truth a demo could be built rapidly and at low cost; the real issue is that the UK planning sector is a tiny, unprofitable market.</p>
<p>The problem is critical for national infrastructure but perhaps too small for tech giants to solve. The government must therefore step in to create a market. This tender is an £8.3 million signal to smaller, specialist &#8220;PropTech&#8221; companies: &#8220;If you build the niche tools, we will guarantee you a path to market.&#8221; It also addresses the issue of handing all the UK&#8217;s sensitive planning data to a single tech giant which would be a legal, political, and data-sovereignty challenge.</p>
<p>This framework is therefore perhaps a pragmatic and necessary step to build a specialist, competitive market for the specific tools the system actually needs. But it is also one that risks putting the system&#8217;s own data into a black box, with trust issues around the algorithms, and plunging it back into service agreements with whoever wins the tender. Arguably the focus should be on an open-source system, but behind that will also be the need to commercialise. So the system shoots itself in the foot.</p>
<p><strong>Who Gets Replaced?</strong></p>
<p>Let&#8217;s be brutally honest. The government&#8217;s goal of achieving £45 billion in public sector savings isn&#8217;t just about making planners&#8217; lives easier. It&#8217;s about automation, and automation replaces human tasks. The tools being procured by this framework are aimed squarely at the &#8220;low-hanging fruit&#8221; of the planning system.</p>
<p>The &#8220;on the ground&#8221; roles most at risk are not the senior planners, but the vital administrative and technical staff that support them.</p>
<p><strong>The Validation Officer:</strong> Their job is a manual, checklist-based task: &#8220;Are all 50+ required documents present?&#8221; An AI can do this in 0.2 seconds. This role is the primary target for automation.</p>
<p><strong>The Planning Admin:</strong> Their role involves scanning, redacting, and uploading thousands of public consultation comments. An AI can read, group by theme (e.g., &#8220;Parking: 4,520 objections&#8221;), and summarise 10,000 comments before a human has finished their first coffee.</p>
<p><strong>The Junior Planner / Technician:</strong> A part of their early-career work is the &#8220;science&#8221; of planning: looking up policies, using GIS systems, and cross-referencing a proposal against 500 pages of the Local Plan. An AI, trained on a new, digitised Local Plan, will do this instantly, flagging every breach.</p>
<p>For the people in these roles, AI is not an &#8220;augmenting&#8221; tool; it is a replacement. This will lead to leaner, smaller planning departments, which is precisely the &#8220;cost-saving&#8221; and &#8220;efficiency&#8221; the government is aiming for.</p>
<p><strong>What&#8217;s Left? The Planner as the &#8216;Human-in-the-Loop&#8217;</strong></p>
<p>If AI is automating validation, consultation, and policy-checking, what is the MSc-qualified planner left to do?</p>
<p>The positive view would be: everything. They are finally freed from being a process manager and can become the strategic expert they were trained to be. The planner&#8217;s new role will be to manage the AI&#8217;s output, overrule it, and apply the 20% of human skills that create 80% of the value. The AI can do the &#8220;science,&#8221; but not the &#8220;art.&#8221; Sadly, I don&#8217;t think that&#8217;s actually true: AI is coming for the &#8220;art&#8221; as well, including design and architecture, but that&#8217;s another post.</p>
<p>So will AI push the human planner to the point of irrelevance? I would argue yes, and this post can be revisited in 10 years&#8217; time by the (many) who will disagree. We were right almost 20 years ago when we called for a Digital Planning system, but the speed of AI has caught most of us out, and the government talks about pushing the UK&#8217;s tech sector while also seeing savings out of the corner of its eye.</p>
<p>The implications for planning education are profound. The &#8220;routine&#8221; administrative and technical jobs &#8211; the validation roles, the junior policy-checking &#8211; are the very &#8220;on-ramp&#8221; positions that MSc graduates have relied on for decades to enter the profession, and they will be gone. If AI automates this bottom layer, the profession is &#8220;hollowed out&#8221; from the bottom up. The only point of entry will be at a higher, strategic level. This creates a crisis for universities:</p>
<p>The MSc curriculum must change, fast (and that&#8217;s something it&#8217;s not good at). It can no longer just be about law, theory, and placemaking (the &#8220;art&#8221;). It must now formally integrate data science, digital literacy, and AI ethics (the &#8220;science&#8221;).</p>
<p>The planner&#8217;s role shifts. The graduate of 2027 will be an &#8216;AI-manager&#8217; and &#8216;ethical gatekeeper&#8217;, whose job is to question, interpret, and overrule the AI&#8217;s &#8220;near-instant&#8221; recommendations. Of course, planning sits within a regulatory framework, so arguably alongside AI will come a relaxation of some of the roles of planning committees, allowing a more automated system to go forward; perhaps we are already seeing hints of this.</p>
<p>Future planners will be feeders and checkers of the algorithm &#8211; typing in &#8216;make me a local plan for&#8230;. add in 1000 homes with a mixed development in the least controversial areas&#8217; and checking and tweaking what comes out &#8211; still planning, but different from anything we have known before.</p>
<p>What comes out of AI is currently viewed as &#8216;AI slop&#8217;, but it will only remain slop for a short while. The shift is coming, and it&#8217;s no longer Digital Planning; it&#8217;s Automated, AI-Generated Planning &#8211; one with a higher level of expertise than a human.</p>
<p>Let&#8217;s just hope the output of the £8 million call is not a black-box system linked to a monthly service charge, with an algorithm that people in suits say has been tested but that has the potential to blight our future landscape for years to come. The one thing about models and planning is that they don&#8217;t actually work; life is simply more complex than the data we put in. And in the shake-down in 10 years&#8217; time, that might be where the human wins, and actually the human replaces AI.</p>
<p><em><strong>Note &#8211; this text forms part of a thought piece for the forthcoming co-authored book Digital Cities of Tomorrow.</strong></em></p>
<p>Wargent, M., Moore, T., &amp; Tomaney, J. (2020). Will AI push the human planner to the point of irrelevance? <em>Planning Theory &amp; Practice</em>, 21(4), 652&#8211;658. DOI: 10.1080/14649357.2020.1776014</p>
<p> </p>
	</div>
</div>




			</div> 
		</div>
	</div> 
</div></div><p>The post <a href="https://www.digitalurban.org/blog/2025/11/06/will-ai-push-the-human-planner-to-the-point-of-irrelevance/">Will AI Push the Human Planner to the Point of Irrelevance?</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Just Weather for The Pebble Watch: Making a Data Watch Face</title>
		<link>https://www.digitalurban.org/blog/2025/10/30/just-weather-for-the-pebble-watch-making-a-data-watch-face/</link>
		
		<dc:creator><![CDATA[Andy]]></dc:creator>
		<pubDate>Thu, 30 Oct 2025 12:59:40 +0000</pubDate>
				<category><![CDATA[Apps]]></category>
		<category><![CDATA[copilot]]></category>
		<category><![CDATA[Data]]></category>
		<category><![CDATA[github]]></category>
		<category><![CDATA[Pebble Watch]]></category>
		<category><![CDATA[Weather]]></category>
		<guid isPermaLink="false">https://www.digitalurban.org/?p=170079095</guid>

					<description><![CDATA[<p>The post <a href="https://www.digitalurban.org/blog/2025/10/30/just-weather-for-the-pebble-watch-making-a-data-watch-face/">Just Weather for The Pebble Watch: Making a Data Watch Face</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></description>
										<content:encoded><![CDATA[
		<div id="fws_69ca487eb09f1"  data-column-margin="default" data-midnight="dark"  class="wpb_row vc_row-fluid vc_row"  style="padding-top: 0px; padding-bottom: 0px; "><div class="row-bg-wrap" data-bg-animation="none" data-bg-animation-delay="" data-bg-overlay="false"><div class="inner-wrap row-bg-layer" ><div class="row-bg viewport-desktop"  style=""></div></div></div><div class="row_col_wrap_12 col span_12 dark left">
	<div  class="vc_col-sm-12 wpb_column column_container vc_column_container col no-extra-padding inherit_tablet inherit_phone "  data-padding-pos="all" data-has-bg-color="false" data-bg-color="" data-bg-opacity="1" data-animation="" data-delay="0" >
		<div class="vc_column-inner" >
			<div class="wpb_wrapper">
				
<div class="wpb_text_column wpb_content_element " >
	<div class="wpb_wrapper">
		<p>It&#8217;s an exciting time to be a Pebble fan. After years of being kept alive by the dedicated Rebble community, the Pebble is officially back. The <a href="https://store.repebble.com/">new Pebble 2 Duo watches</a> (the black-and-white model) are officially shipping to the first backers, with the high-resolution color Pebble Time 2 set to follow.</p>
<p>So, I decided to make a watch face for it myself.</p>
<p>&nbsp;</p>
<h2 class="wp-block-heading">Introducing &#8216;Just Weather&#8217;</h2>
<p>I wanted a face that was clean, digital, and gave me all the key data at a glance, formatted to look great on the 144&#215;168 screen of the Pebble 2. I call it <strong>&#8220;Just Weather.&#8221;</strong></p>
<p>It uses the free Open-Meteo API to pull in a ton of useful, hyperlocal data right to your wrist:</p>
<p>&nbsp;</p>
<ul class="wp-block-list">
<li style="list-style-type: none;">
<ul>
<li>Current Location (from your phone&#8217;s GPS)</li>
<li>Temperature</li>
<li>Current Conditions (&#8220;Partly Cloudy,&#8221; &#8220;Rain,&#8221; etc.)</li>
<li>Barometric Pressure &amp; 3-Hour Trend</li>
<li>Wind Speed &amp; Precipitation</li>
<li>&#8230;and of course, the time!</li>
</ul>
</li>
</ul>
<p>&nbsp;</p>
<p>&nbsp;</p>
<h2 class="wp-block-heading">Built with GitHub Copilot</h2>
<p>The best part is that the new Pebble development workflow is incredibly modern. I was able to build this using the <strong>CloudPebble IDE</strong>, which now integrates directly with <strong>VS Code in the browser</strong>.</p>
<p>This meant I could use emerging tools like <strong>GitHub Copilot</strong> to help generate the code and work through the trickiest parts, such as making direct HTTPS requests to the weather API, which (after a lot of testing!) we proved is possible from the phone app.</p>
<p>After getting the data, the final step was tweaking the C code to make sure the layout wasn&#8217;t clipped and all the information fit perfectly on the 144&#215;168 screen. It&#8217;s now compatible with watches in the Pebble family, from the original Pebble Time (color) to the new <strong>Pebble 2 Duo</strong> and the upcoming <strong>Pebble Time 2</strong>.</p>
<p>&nbsp;</p>
<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="727" class="wp-image-170079097" src="https://www.digitalurban.org/wp-content/uploads/2025/10/Screenshot-2025-10-30-at-12.49.33-1024x727.png" alt="Pebble Watch Face" srcset="https://www.digitalurban.org/wp-content/uploads/2025/10/Screenshot-2025-10-30-at-12.49.33-1024x727.png 1024w, https://www.digitalurban.org/wp-content/uploads/2025/10/Screenshot-2025-10-30-at-12.49.33-300x213.png 300w, https://www.digitalurban.org/wp-content/uploads/2025/10/Screenshot-2025-10-30-at-12.49.33-768x545.png 768w, https://www.digitalurban.org/wp-content/uploads/2025/10/Screenshot-2025-10-30-at-12.49.33.png 1440w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /><figcaption class="wp-element-caption">Pebble Watch Face</figcaption></figure>
<p>&nbsp;</p>
<h2 class="wp-block-heading">Available Now</h2>
<p>This project took around six hours, with the main hurdle being that Copilot did not know how to make HTTP requests &#8211; it took me down a lot of rabbit holes, and in the end the answer was a simpler call on the data: XMLHttpRequest. Once this worked, it all fell into place, and it was simply a case of asking Copilot to add the data fields, handle the geocoding, and then take a step back and explain how the code actually works.</p>
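<p>For anyone curious how the phone-side fetch looks, below is a minimal sketch of the approach, assuming the free Open-Meteo <code>current_weather</code> endpoint and the standard PebbleKit JS environment; the helper names and rounding choices are illustrative, not the published &#8216;Just Weather&#8217; source.</p>

```javascript
// PebbleKit JS runs on the phone, so the watch face fetches weather there
// and ships the results to the watch over AppMessage. Sketch only: the
// query parameters and field names assume Open-Meteo's current_weather
// response, and the callback shape is hypothetical.

function buildWeatherUrl(lat, lon) {
  // Open-Meteo needs no API key; current_weather returns temperature,
  // wind speed and a weather code in a single call.
  return 'https://api.open-meteo.com/v1/forecast' +
    '?latitude=' + lat + '&longitude=' + lon +
    '&current_weather=true';
}

function fetchWeather(lat, lon, onResult) {
  // XMLHttpRequest is the simple call that finally worked in the
  // PebbleKit JS sandbox, rather than more modern fetch-style APIs.
  var xhr = new XMLHttpRequest();
  xhr.onload = function () {
    var data = JSON.parse(xhr.responseText).current_weather;
    onResult({
      temperature: Math.round(data.temperature), // degrees C
      windspeed: Math.round(data.windspeed),     // km/h
      code: data.weathercode                     // WMO weather code
    });
  };
  xhr.open('GET', buildWeatherUrl(lat, lon));
  xhr.send();
}
```

<p>On the watch side, the C code then receives these values over AppMessage and draws them into text layers sized for the 144&#215;168 screen.</p>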
<p>If you&#8217;re like me and just want a simple, data-rich weather face, please give it a try&#8230;</p>
<p>&nbsp;</p>
<ul class="wp-block-list">
<li style="list-style-type: none;">
<ul>
<li><strong>Download &#8216;<a href="https://apps.rebble.io/en_US/application/69034d22d004720008412cf1">Just Weather</a>&#8216; from the Rebble Appstore</strong></li>
<li><strong>Check out <a href="https://github.com/digitalurban/just-weather-pebble-watchface">the source code on GitHub</a> for the latest updates &#8211; now includes its own settings page (it&#8217;s become a proper watch face app)&#8230;</strong></li>
</ul>
</li>
</ul>
<p>&nbsp;</p>
<p>&nbsp;</p>
	</div>
</div>




			</div> 
		</div>
	</div> 
</div></div><p>The post <a href="https://www.digitalurban.org/blog/2025/10/30/just-weather-for-the-pebble-watch-making-a-data-watch-face/">Just Weather for The Pebble Watch: Making a Data Watch Face</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Cities in the Metaverse: Spatial Computing, Digital Twins, Avatars, Economics and Digital Habitation on the New Frontier</title>
		<link>https://www.digitalurban.org/blog/2025/10/27/cities-in-the-metaverse-book/</link>
		
		<dc:creator><![CDATA[Andy]]></dc:creator>
		<pubDate>Mon, 27 Oct 2025 11:25:42 +0000</pubDate>
				<category><![CDATA[Books and Papers]]></category>
		<category><![CDATA[cities]]></category>
		<category><![CDATA[metaverse]]></category>
		<guid isPermaLink="false">https://www.digitalurban.org/?p=170079089</guid>

					<description><![CDATA[<p>We are pleased to announce the publication of our new book &#8211; Cities in the Metaverse: Spatial Computing, Digital Twins, Avatars, Economics and Digital Habitation on the New Frontier and...</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/10/27/cities-in-the-metaverse-book/">Cities in the Metaverse: Spatial Computing, Digital Twins, Avatars, Economics and Digital Habitation on the New Frontier</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>We are pleased to announce the publication of our new book &#8211; <strong>Cities in the Metaverse: Spatial Computing, Digital Twins, Avatars, Economics and Digital Habitation on the New Frontier</strong>. You can get 20% off using the code 25ESA4 via <a href="https://www.routledge.com/Cities-in-the-Metaverse-Spatial-Computing-Digital-Twins-Avatars-Economics-and-Digital-Habitation-on-the-New-Frontier/Hudson-Smith-Wilson-Signorelli/p/book/9781032576695">Routledge</a> (it is also available via all good bookshops, Amazon, etc.). The book offers a comprehensive exploration of the intersection between urban environments and digital realms.</p>


<div class="wp-block-image">
<figure class="alignright size-full"><img loading="lazy" decoding="async" width="366" height="525" class="wp-image-170079090" src="https://www.digitalurban.org/wp-content/uploads/2025/10/Screenshot-2025-10-27-at-11.17.59.png" alt="Cities in the Metaverse Book" srcset="https://www.digitalurban.org/wp-content/uploads/2025/10/Screenshot-2025-10-27-at-11.17.59.png 366w, https://www.digitalurban.org/wp-content/uploads/2025/10/Screenshot-2025-10-27-at-11.17.59-209x300.png 209w" sizes="auto, (max-width: 366px) 100vw, 366px" />
<figcaption class="wp-element-caption">Cities in the Metaverse Book</figcaption>
</figure>
</div>


<p>The book examines:</p>



<ul class="wp-block-list">
<li>The size and shape of cities in the Metaverse</li>
<li>Spatial computing and its impact on digital interaction</li>
<li>Digital twins in urban planning and management</li>
<li>Avatars and social dynamics in virtual spaces</li>
<li>Economic models emerging in the Metaverse</li>
<li>The influence of gaming on immersive digital landscapes</li>
<li>The impact of Artificial Intelligence on the Metaverse</li>
<li>Potential futures of digital habitation</li>
</ul>



<p>Drawing from architecture, computer science, urban planning, geography, social studies, and economics, the authors provide a multidisciplinary analysis of how virtual cities are shaping our digital future. Using key case studies, they trace the evolution from early cyberspace concepts to current spatial computing technologies, offering insights into both historical context and future possibilities. The book addresses key questions about the opportunities and challenges presented by metaverse technologies, including issues of accessibility, creativity, and the future of humans and artificial intelligence co-existing, side by side, in digital spaces. It serves as a practical guide, equipping readers with the knowledge they need to navigate the future of urban life in virtual environments with a thought-provoking examination of how we might build, inhabit, and govern cities in the Metaverse.</p>



<p>The authors explore the concept of digital twins, demonstrating how these virtual replicas of physical spaces can revolutionise urban planning and management. They delve into the social aspects of the metaverse, examining how avatars shape our interactions and relationships in digital realms. Economic considerations are central to the book&#8217;s ethos, with an analysis of emerging models that often leverage blockchain technologies. The book addresses the challenges, potential pitfalls and ethical considerations in creating and inhabiting digital cities. At the same time, it takes a step back and examines already abandoned digital worlds, offering lessons from past attempts at creating virtual spaces.</p>



<p>Essential reading for urban planners, geographers, economists, technologists, policymakers, and anyone interested in the future of cities and digital interaction, <em>Cities in the Metaverse</em> provides a balanced, informed perspective on this rapidly evolving field.</p>



<h2 class="wp-block-heading">Authors:</h2>



<p><strong>Andrew Hudson-Smith</strong> is a Professor of Digital Urban Systems at the Centre for Advanced Spatial Analysis (CASA), University College London. He focuses on real-time data, virtual environments and the Internet of Things within the urban environment. He is also an elected Fellow of the Royal Society of Arts, a Fellow of the Academy of Social Sciences and a Fellow of the Royal Geographical Society. Socials: @digitalurban</p>



<p><strong>Duncan Wilson</strong> is a Professor of Connected Environments at the Centre for Advanced Spatial Analysis (CASA), University College London. His research focuses on how emerging technologies, such as connected sensors and cognitive computing, can augment our understanding of the built and natural environment. He has over 25 years of experience in industry and is a Fellow of the Royal Society of Arts. Socials: @djdunc</p>



<p><strong>Valerio Signorelli</strong> is a Lecturer in Connected Environments at the Bartlett Centre for Advanced Spatial Analysis (CASA), University College London. He holds an MSc in Architecture and Urban Design and a PhD in Territorial Design and Government from the Department of Architecture and Urban Studies at the Politecnico di Milano (Italy). He is also a Fellow of the Royal Society of Arts. Socials: @ValeSignorelli</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/10/27/cities-in-the-metaverse-book/">Cities in the Metaverse: Spatial Computing, Digital Twins, Avatars, Economics and Digital Habitation on the New Frontier</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>TIME: An Open Source 3D Printed Clock</title>
		<link>https://www.digitalurban.org/blog/2025/08/05/time-an-open-source-3d-printed-clock/</link>
		
		<dc:creator><![CDATA[Andy]]></dc:creator>
		<pubDate>Tue, 05 Aug 2025 09:33:43 +0000</pubDate>
				<category><![CDATA[3D Modelling]]></category>
		<category><![CDATA[3D Printing]]></category>
		<guid isPermaLink="false">https://www.digitalurban.org/blog/2025/08/05/time-an-open-source-3d-printed-clock/</guid>

					<description><![CDATA[<p>We have released ‘TIME,’ our fully 3D-printed mechanical clock, on Printables. This post provides insight into both its creation and development.</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/08/05/time-an-open-source-3d-printed-clock/">TIME: An Open Source 3D Printed Clock</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<h2 class="wp-block-heading">We have released ‘TIME,’ our fully 3D-printed mechanical clock, on Printables. This post provides insight into both its creation and development.</h2>



<p>Over the years, I have followed various clock makers online, looking at their designs and 3D printed models, and generally spending more time than I would like to admit trying to get a 3D printed clock running. Makers such as the excellent <a href="https://woodenclocks.co.uk/">Brian Law</a> (wooden and 3D printed), <a href="https://www.stevesclocks.com/">Steve&#8217;s Clocks</a> (great insights and designs), <a href="https://engineezy.com/products/the-3d-printed-wall-clock?srsltid=AfmBOoo0TeeDPAo3nD324gp0kzedsIHfIQQR-34iwf9ZqKiXIjJkbbTH">JBV</a> (amazing, complex engineering) and <a href="https://wooden-gear-clocks.com/">Wooden Gear Clocks</a> (a lovely site for premade clocks which come as kits, but still need more skill than it turns out I have in basic cutting and sanding of brass rods) have all inspired me, but they have also driven my need for a simpler, open source design that would be free and easier to build.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>It has taken a year to perfect, mainly as a summer project last year (in between work) and refining it this year to reach a point where other people could make it.</p>
</blockquote>



<p>No 3D printed clock is ever completely &#8216;easy&#8217; to build; they all need a few extra parts, mainly metal rods and bearings to reduce friction. We have, however, limited the parts to a minimum, made sure they are all off-the-shelf, and reduced the need for sanding or cutting to one small part. We have also made all the <a href="https://www.printables.com/model/1375086-time-a-3d-printed-clock">files available on Printables</a>, with GitHub to follow, allowing others to refine and contribute new designs or updates. As time goes along, the clock will evolve &#8211; but for now it&#8217;s ready for its first release, and it&#8217;s called TIME.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/2ca3223f-279d-4a22-9cd6-5df9448a4e25_2400x1621.png" alt="" /></figure>



<p>We have tried and mainly failed with other designs online; as such, we wanted to build our own version, from first principles, not modifying others, but from the ground up, with the following aims:</p>



<ol class="wp-block-list">
<li>
<p>It should be as easy to build and replicate as possible</p>
</li>



<li>
<p>Any additional parts should be easy to source and low-cost</p>
</li>



<li>
<p>Cutting/Sanding should be limited</p>
</li>



<li>
<p>It should have a proper 1-second tick/tock sound &#8211; this was important.</p>
</li>
</ol>



<p>As such, the first thing to learn was how an escapement mechanism works, how it contributes to the timing of a clock, and how the length of the pendulum dictates both the beat and the sound of the clock.</p>



<h3 class="wp-block-heading">First Steps: The Escapement</h3>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>While most people listen to music on their headphones on the way to work, last summer, I started listening to ChatGPT’s newly introduced voice mode to talk me through the history and detailed workings of a clock escapement mechanism. It allowed me to gain enough knowledge to start drawing my own deadbeat escapement.</p>
</blockquote>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/52c2b5c7-60b8-4f8e-8aa1-59b6b9773638_1806x1052.png" alt="" /></figure>



<p>A clock&#8217;s escapement is the heart of its mechanism, ingeniously translating the constant power from the gear train into the precise, rhythmic pulses that create the &#8220;tick-tock.&#8221; It performs two critical jobs: it allows the gears to &#8220;escape&#8221; forward one tooth at a time, regulating the speed of the hands, and it gives the pendulum a tiny push on each swing to overcome friction and keep it moving. The <strong>deadbeat escapement</strong>, perfected by George Graham around 1715, was a major leap in accuracy. Unlike earlier &#8220;recoil&#8221; escapements, where the escape wheel would kick backwards slightly after each tick, the deadbeat&#8217;s pallets are shaped so the teeth land &#8220;dead&#8221; with no recoil. This crucial improvement prevents the escapement from disturbing the pendulum&#8217;s natural, isochronous swing, making the clock significantly more accurate and establishing the design as the standard for precision regulator clocks. My 3D tool of choice for designing mechanisms and enclosures for devices is Autodesk Fusion 360; as such, all I needed was a reference drawing and the thought that getting a working escapement would be a good first step. Over at <a href="https://www.abbeyclock.com/aeb3.html">Abbey Clock</a> there is an excellent guide on Drawing Graham Pallets &#8211; which led to our own slightly modified design and, as pictured below, our first working escapement:</p>



<p>We haven&#8217;t published our prototypes yet, but I can share them if people are interested in either the escapement above or the next stage &#8211; the Pomodoro Timer, below. The escapement mechanism is powered by a weight and counterweight system with a 1 metre pendulum &#8211; this provided a way to perfect the initial &#8216;tick tock&#8217; of the clock, designed to give a steady beat every 0.5 seconds, allowing the escapement to move forward once every second and, more importantly, make a full rotation once a minute.</p>



<h3 class="wp-block-heading">Gear Ratios: The Pomodoro Timer</h3>



<p>The second step was to add an additional gear and gain an understanding of gear ratios. Gear ratios for a clock are all-important as they not only define the number of gears you need, but also define<strong> the relationship between them</strong>, turning the fast-paced energy of the escapement into the slow, readable passage of time.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/765c7800-8baa-4dad-881f-f90faa58576e_2084x1156.png" alt="" /></figure>



<p>At its core, a gear ratio translates speed and torque between rotating shafts. In our clock, we have a known starting speed: our <strong>escape wheel&#8217;s shaft</strong> rotates once every minute (60 seconds). To build a timer that rings a bell every 25 minutes, we needed to create a gear train that would complete one full rotation in that time.</p>



<p>The maths to figure out the required gear ratio is straightforward:</p>



<ul class="wp-block-list">
<li>
<p><strong>Target Rotation Time</strong>: 25 minutes = 25×60=1500 seconds.</p>
</li>



<li>
<p><strong>Source Rotation Time</strong> (Escape Wheel Shaft): 60 seconds.</p>
</li>



<li>
<p><strong>Required Gear Ratio</strong>: Target Time ÷ Source Time = 1500 ÷ 60 = 25, or 25:1 &#8211; as we see later on, it is the ratio which is important and, arguably, easier to understand.</p>
</li>
</ul>



<p>As you can see in our prototype, we achieved this with a <strong>compound gear train</strong> made of two identical stages, which makes the design elegant and easy to replicate. A compound gear is a cluster of two or more gears of different sizes that are fixed together on the same shaft, forcing them to rotate at the same speed. In short, instead of having one massive gear drive a tiny one to get a big gear ratio, a compound gear lets you achieve the same result in stages.</p>



<p>This setup is the key to creating a <strong>gear train</strong> that can achieve a large change in speed, or torque, in a compact space.</p>



<ol class="wp-block-list">
<li>
<p><strong>First Stage</strong>: The shaft from the escapement has an <strong>8-tooth pinion</strong> that drives a <strong>40-tooth gear</strong>. This gives a reduction of 40 ÷ 8 = 5, or 5:1.</p>
</li>



<li>
<p><strong>Second Stage</strong>: Mounted on the same shaft as the first 40-tooth gear, a second <strong>8-tooth pinion</strong> drives the final <strong>40-tooth gear</strong>. This provides another reduction of 40 ÷ 8 = 5, or 5:1.</p>
</li>
</ol>



<p>The total reduction is the product of the individual stages: 5×5=25:1. This ratio perfectly transforms the 60-second rotation of the escapement shaft into the 25-minute rotation needed to trigger the bell. Or at least that&#8217;s how it should be in a perfect world. To be honest, I experimented a little, and my timings were a little out, coming in at approximately 20 minutes per bell ring, with the weight running the timer for an hour. The main point is that I had extended out from the escapement and used compound gears to start using ratios for timings. Of note, ChatGPT was useful for the wider understanding of the clock mechanism and theory, but it would frequently miscalculate gear trains and ratios, so the final design was done with old-fashioned logic.</p>
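<p>The ratio arithmetic above is easy to sanity-check in a few lines; this sketch (a hypothetical helper, using the tooth counts from the post) multiplies the stage reductions and converts the escape shaft&#8217;s one-minute rotation into the timer&#8217;s output period.</p>

```javascript
// Each stage is [pinionTeeth, wheelTeeth]; the reduction of a stage is
// wheel/pinion, and stages multiply because the wheel of one stage
// shares a shaft with the pinion of the next (a compound gear).
function trainRatio(stages) {
  return stages.reduce(function (ratio, stage) {
    return ratio * (stage[1] / stage[0]);
  }, 1);
}

// Pomodoro timer: two identical 8-tooth pinion -> 40-tooth wheel stages.
var ratio = trainRatio([[8, 40], [8, 40]]); // 5 * 5 = 25

// The escape shaft turns once every 60 seconds, so the output turns in:
var outputMinutes = (60 * ratio) / 60;      // 25 minutes
```

<p>Swapping in different tooth counts makes it easy to experiment with other target times before printing anything.</p>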



<h3 class="wp-block-heading">TIME: 3D Printed Clock</h3>



<p>The final clock is simply a case of building out the number of gears, using the same logic as the Pomodoro Timer. I had an escapement rotating once a minute, and I needed the main gear rotating once an hour, so I could attach an hour hand to it &#8211; that’s a ratio of 60:1.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/7805b6ee-952f-4890-a295-16e1f69a886d_1624x1138.png" alt="" /></figure>



<p>Following that logic and a ratio of 60:1 &#8211;</p>



<ul class="wp-block-list">
<li>
<p><strong>Escapement</strong>: The Minute Gear &#8211; The escape wheel has 30 teeth, with a 10-tooth pinion on its shaft, delivering a 1-second &#8220;tick-tock&#8221; rhythm. The escape wheel completes one full rotation every minute.</p>
</li>



<li>
<p><strong>Gear 1</strong>: 40-tooth wheel (driven by the 10-tooth pinion) and 20-tooth pinion.</p>
</li>



<li>
<p><strong>Gear 2</strong>: 60-tooth wheel (driven by the 20-tooth pinion) and 24-tooth pinion.</p>
</li>



<li>
<p><strong>Gear 3</strong>: The Hour Gear- a 120-tooth wheel (driven by the 24-tooth pinion), which completes one rotation per hour and connects to a drive gear (details on the drive gear to follow).</p>
</li>
</ul>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/e8b2ae7d-1a4e-4c6c-8f28-717fa153c18f_1656x1214.png" alt="" /></figure>



<p>The gear ratios provide the necessary speed reduction to convert the escapement&#8217;s motion into hourly rotation. Starting from the escapement:</p>



<ul class="wp-block-list">
<li>
<p>The 10-tooth pinion drives the 40-tooth wheel of Gear 1, creating a 40:10 (or 4:1) reduction ratio—Gear 1 rotates at 1/4 the speed of the escape wheel.</p>
</li>



<li>
<p>Gear 1&#8217;s 20-tooth pinion drives Gear 2&#8217;s 60-tooth wheel, a 60:20 (or 3:1) ratio—Gear 2 rotates at 1/3 the speed of Gear 1.</p>
</li>



<li>
<p>Gear 2&#8217;s 24-tooth pinion drives Gear 3&#8217;s 120-tooth wheel, a 120:24 (or 5:1) ratio—Gear 3 rotates at 1/5 the speed of Gear 2.</p>
</li>
</ul>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>It all suddenly seems complicated &#8211; but if you look at it more simply, the cumulative reduction ratio is 4 × 3 × 5 = 60:1. Since the escape wheel rotates once per minute, Gear 3 rotates once every 60 minutes—or exactly once per hour.</p>
</blockquote>



<p>I could have chosen any ratio for the gears as long as it works out to 60:1.</p>
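<p>As a quick check of the stage-by-stage logic, the sketch below (illustrative names, tooth counts from the list above) walks the escape wheel&#8217;s 60-second rotation through all three reductions.</p>

```javascript
// Tooth counts from the text: each pinion drives the next stage's wheel.
var clockStages = [
  { pinion: 10, wheel: 40 },  // escape shaft -> Gear 1 (4:1)
  { pinion: 20, wheel: 60 },  // Gear 1 -> Gear 2 (3:1)
  { pinion: 24, wheel: 120 }  // Gear 2 -> Gear 3, the hour gear (5:1)
];

// Start from the escape wheel's one-minute rotation and slow it down
// by wheel/pinion at every stage.
var hourGearPeriod = clockStages.reduce(function (period, s) {
  return period * (s.wheel / s.pinion);
}, 60); // seconds

// 60 * 4 * 3 * 5 = 3600 seconds: the final gear turns once per hour.
```
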



<p>With a main gear turning once an hour, upon which an hour hand can be attached, all that is now needed is a Drive Gear to power the clock and a small subset gear on which to put the minute hand.</p>



<h4 class="wp-block-heading">The Drive</h4>



<p>The drive gear serves a dual purpose as a catch mechanism, enabling the clock to be wound up while preventing unintended unwinding. This gear, connected to the drum, includes a ratchet and pawl system. When winding the clock, the ratchet allows the drum to rotate in the winding direction, lifting a weight of 2kg. The pawl engages the ratchet teeth, locking the drum in place to stop it from unwinding backwards once the winding is complete. This ensures the stored energy remains secure, releasing only through the controlled escapement and gear train during operation.</p>



<h4 class="wp-block-heading">The Minute Hand</h4>



<p>Finally, there is a separate gear chain for the minute hand, which fits behind the hour hand; this allows the traditional hour and minute hand configuration (the hour hand fits on a brass rod going through the hour gear, holding everything in place while also allowing it to move freely).</p>



<p>And that’s it! It seems simple when you break it down, although designing and building it from scratch took a little more time than I thought, and I lost count of the iterations it took to get here. The design will continue to be refined—perhaps with the addition of an hourly chime in the near future.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/11e5785d-98b9-45ad-89e8-029a75addc09_1526x1184.png" alt="" /></figure>



<p>For now, we hope you enjoy making the clock. Do let us know if you build one, over at <a href="https://www.printables.com/model/1375086-time-a-3d-printed-clock">Printables</a> (the 3D printed files are incoming, we are just collating the files)…</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/08/05/time-an-open-source-3d-printed-clock/">TIME: An Open Source 3D Printed Clock</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Chasing the Tween: My 25-Year Obsession with Weather Data</title>
		<link>https://www.digitalurban.org/blog/2025/07/25/chasing-the-tween-my-25-year-obsession-with-weather-data/</link>
		
		<dc:creator><![CDATA[Andy]]></dc:creator>
		<pubDate>Fri, 25 Jul 2025 13:04:19 +0000</pubDate>
				<category><![CDATA[Weather]]></category>
		<category><![CDATA[Weather Display]]></category>
		<guid isPermaLink="false">https://www.digitalurban.org/blog/2025/07/25/chasing-the-tween-my-25-year-obsession-with-weather-data/</guid>

					<description><![CDATA[<p>From Adobe Flash through to Vibe Coding and back again to Physical Displays</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/07/25/chasing-the-tween-my-25-year-obsession-with-weather-data/">Chasing the Tween: My 25-Year Obsession with Weather Data</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<h2 class="wp-block-heading">From Adobe Flash through to Vibe Coding and back again to Physical Displays</h2>



<p>For those of us who run a personal weather station (PWS), the obsession isn&#8217;t just about collecting data; it&#8217;s about seeing it and sharing it, and that often means an online data dashboard. At the Connected Environments Lab, based within the Centre for Advanced Spatial Analysis, University College London, we have been running weather stations in Central London since the early 2000s, each one uploading and communicating data every 3 seconds &#8211; that&#8217;s approximately 268,918,537 measures of wind speed, air pressure, temperature and solar intensity, amongst others, now including cloud height and air quality. Getting the data online, however, has never been quite as easy as it should be…</p>



<h3 class="wp-block-heading">The Past: The Golden Age of Tweens and Live Dials</h3>



<p>So, back in the early 2000s, getting a Davis, Oregon Scientific, or La Crosse station online was a rite of passage for anyone interested in environmental data. The challenge wasn&#8217;t just mounting the hardware; it was wrestling with software to display the information. In this era, two names stood out for many hobbyists: Cumulus and Weather Display, and we have used them both extensively over the years.</p>



<p><strong>Weather Display</strong>, the long-standing creation of New Zealand-based developer Brian Hamilton, is a testament to the remarkable longevity of software. First appearing around 1999, it has been continuously developed for over two decades with hundreds of updates, making it a cornerstone of the personal weather station community. Its web component, <strong>Weather Display Live</strong>, became the gold standard for its time. Built entirely on Adobe Flash, it presented a fully featured dashboard of analogue-style gauges and tickers updating in real-time. The wind speed needle would smoothly tween with every gust, the rain gauge would visibly fill, and a ticker tape would scroll with the latest observations, creating a dynamic and immersive view of the weather that was unparalleled in its era. I’ve been chasing that smooth ‘tween’ in dashboards ever since, and, as we will see further into this article, that tween is still the ultimate draw of real-time data.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/6e0f7ac1-9c5c-4146-975e-6040b344b15e_555x414.gif" alt=""/></figure>



<p>By today&#8217;s standards, Weather Display Live perhaps looks cluttered as design languages shift; indeed, we are about to enter the era of &#8216;glass&#8217; with the latest iOS updates, which will bring a whole new look to apps and displays. Despite Adobe Flash requiring a browser plugin, it allowed dashboards that would simply not have been possible with the HTML standards of the time.</p>



<p>For a brief time, Flash burned bright and was the basis of data display on a little desktop device known as the Chumby &#8211; a device ahead of its time, providing real-time data updates via a series of rotating dashboards. As such, it was the perfect device for displaying weather data, although the technical knowledge required escaped me at the time. Luckily, a computer scientist, now <a href="https://profiles.ucl.ac.uk/4441-steven-gray">Professor Steven Gray</a>, had recently joined our lab and kindly wrote a custom script to display the data.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/d52d9de1-6609-4b62-9a21-48e749430e74_400x294.jpeg" alt=""/></figure>



<p>However, Flash’s days were numbered when Steve Jobs famously refused to support it on the iPhone, setting out his reasons in the 2010 ‘Thoughts on Flash’ letter, ushering in a new age of HTML5 and laying the foundations of the web we see today.</p>



<p>Alongside Weather Display was <strong>Cumulus</strong> (now <strong>Cumulus MX</strong>), which carved out a huge following, particularly among Windows users. It was, and is, a robust piece of software for processing PWS data. Its approach to web display was more traditional, using a system of web templates and tags. You could design a basic HTML page, insert special tags like <code>&lt;#temp&gt;</code>, and Cumulus would periodically process this file, replacing the tags with current data and uploading it to your web server. While it lacked the fluid dynamism of Flash, it was reliable and customisable if you knew your way around HTML and CSS. It represented, arguably, the dawn of the template-driven, self-hosted dashboard. Adobe Flash dials were replaced with new plugin-free gauges via a system known as Steel Gauges.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/47099276-d732-43e5-8f17-0765c9781acc_1146x1363.jpeg" alt=""/></figure>



<p>The Steel Gauges are a much-loved part of the Cumulus software legacy. While not part of the official template that came with the software, they were a hugely popular style that the Cumulus community created and shared, made possible by the software&#8217;s flexible templating system. They are a perfect example of a design philosophy called <strong>skeuomorphism</strong> &#8211; one that came and went, but also the one we end this article on, with perhaps the best example of a weather dashboard; more on that later.</p>



<p>Skeuomorphism&#8217;s goal is to make digital items resemble their real-world counterparts. The Steel Gauges aesthetic was heavily influenced by the design trends of the late 2000s and early 2010s, aiming to make a web page look like a physical, high-end piece of hardware. The designs were rich with photorealistic detail: brushed metal or carbon fibre backgrounds, glistening chrome bezels with drop shadows, and precisely rendered needles and tick marks. The goal was to create a sense of tangible reality on the screen.</p>



<p>The implementation was a clever two-part process:</p>



<ol class="wp-block-list">
<li>
<p><strong>The Visuals:</strong> A user would first design the entire dashboard as a single, static background image in a graphics program like Photoshop. This image contained all the &#8220;steel&#8221; dials, gauges, and labels, but with no data.</p>
</li>



<li>
<p><strong>The Data:</strong> They would then use this image as a background in an HTML file. The magic happened by using Cumulus&#8217;s simple but powerful templating engine. Users would strategically place Cumulus&#8217;s special &#8220;web tags&#8221; (e.g., <code>&lt;#temp&gt;</code>, <code>&lt;#windspeed&gt;</code>, <code>&lt;#press&gt;</code>) over the blank areas on the gauges.</p>
</li>
</ol>



<p>When Cumulus processed this file, it would replace the tags with the live weather data. The software would then automatically upload the updated HTML file and the background image to the user&#8217;s web server.</p>
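<p>The tag-replacement step can be sketched in a few lines of Python &#8211; a schematic only, not Cumulus&#8217;s actual implementation, with made-up readings and the tag names from the examples above:</p>

```python
import re

# Hypothetical current readings; Cumulus would read these from the station.
readings = {"temp": "18.4", "windspeed": "12", "press": "1021.3"}

# A fragment of a user's template, with web tags placed over the blank gauges.
template = "Temp: <#temp> C | Wind: <#windspeed> mph | Pressure: <#press> hPa"

def render(template, data):
    # Swap each <#tag> for its current value, as Cumulus does on each cycle.
    return re.sub(r"<#(\w+)>", lambda m: data.get(m.group(1), "n/a"), template)

print(render(template, readings))
```

<p>On each processing cycle the software would run exactly this kind of substitution and then upload the resulting HTML to the web server.</p>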



<p>Today, the Steel Gauges are perhaps a time capsule of a specific design era. They were built for a desktop-first world and were not responsive, often looking cluttered on mobile screens. As web design shifted towards minimalism, flat design, and mobile-first principles, the Steel Gauge aesthetic was largely superseded by clean, abstract, and data-dense interfaces like the Belchertown skin for WeeWX, which we come to next. However, they remain a nostalgic and important step in the evolution of personal weather data visualisation.</p>



<h3 class="wp-block-heading">The Current: Between Open and Closed Systems</h3>



<p>Standing down our Weather Display and Cumulus systems, we moved on to software known as WeeWX, running on a Raspberry Pi and sending data to a variety of online sources. It remains the system we use today. <strong>WeeWX</strong> (often stylised as weewx) is a foundational piece of software in the modern personal weather station (PWS) world, but it&#8217;s fundamentally different from its predecessors. It is best understood not as a single application, but as an open-source software engine for collecting, processing, and displaying weather data.</p>



<p>It was created earlier than we at first thought: Tom Keffer began development in the winter of 2008, and the first official release of WeeWX followed in 2009.</p>



<p>WeeWX is written entirely in Python and designed to run on Linux-based systems. This makes it the go-to choice for hobbyists who want to run their weather station on a low-power, single-board computer like a Raspberry Pi, operating 24/7 with minimal energy consumption, making it a notable move away from our previous power-hungry Windows machines.</p>



<p>The core philosophy is based around modularity and extensibility. It treats the different parts of managing a weather station as separate, interconnected components:</p>



<ol class="wp-block-list">
<li>
<p><strong>The Engine:</strong> At its heart, weewx runs a main loop that communicates directly with the weather station hardware. It polls the station every few seconds for live data (like wind speed) and stores this information in a database (typically SQLite by default, though others are supported).</p>
</li>



<li>
<p><strong>The Reporting Cycle:</strong> At a regular, user-defined interval (the &#8220;archive interval,&#8221; often every 5 minutes), the reporting engine wakes up. It pulls the latest data from the database and uses it to generate reports.</p>
</li>



<li>
<p><strong>Skins and Generators:</strong> This is where the user-facing magic happens. A &#8220;skin&#8221; in weewx is a complete template for a website, containing HTML files, CSS, images, and configuration settings. Using the powerful Cheetah templating engine, weewx fills these templates with data to generate static HTML pages. It can also run &#8220;generators&#8221; that push data to public services like Weather Underground, PWSWeather, and in our case custom MQTT brokers &#8211; this is key as it opens up the new world of Vibe Coding, which we come to next.</p>
</li>



<li>
<p><strong>Extensibility:</strong> The most powerful feature of weewx is its architecture, which is designed to be extended. Users can write their own services in Python to calculate new variables (e.g., custom fire danger indices), add new sensor types, or create entirely new report formats. This makes it a flexible platform for tinkerers and data enthusiasts.</p>
</li>
</ol>
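<p>The engine-and-reporting cycle above can be pictured in miniature. The sketch below is purely schematic &#8211; real weewx has its own database schema and uses the Cheetah engine rather than Python&#8217;s <code>string.Template</code> &#8211; but the shape is the same: pull the newest archive record, fill a skin template, emit a static page:</p>

```python
import sqlite3
from string import Template

# Toy in-memory archive standing in for weewx's SQLite store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE archive (dateTime INTEGER, outTemp REAL, barometer REAL)")
conn.execute("INSERT INTO archive VALUES (1764600000, 9.7, 1018.2)")

# A toy "skin": in weewx this would be a Cheetah template in the skin directory.
skin = Template("<p>Outside: $outTemp C, pressure $barometer hPa</p>")

# Each archive interval, the reporting engine pulls the newest record
# and writes out a fresh static HTML page for upload.
row = conn.execute(
    "SELECT outTemp, barometer FROM archive ORDER BY dateTime DESC LIMIT 1"
).fetchone()
page = skin.substitute(outTemp=row[0], barometer=row[1])
print(page)
```

<p>The real generators do the same dance at every archive interval, which is why the output is a set of static files that any web server can host.</p>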



<p>Out of the box, weewx comes with a simple, functional skin. However, its skinning architecture has been key to community projects. The most used by far is the <strong>Belchertown skin</strong>, created by Pat O&#8217;Brien. First gaining widespread popularity around early 2019, Belchertown provides a clean, modern, mobile-responsive interface with dark mode, interactive charts, and real-time streaming updates via MQTT. It has become the de facto &#8220;face&#8221; of weewx for many users and perfectly embodies the modernist design principles of clarity and function.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/c2962b81-6ed6-4e7d-94f4-e1d967c6d90d_2078x1310.png" alt=""/></figure>



<p>At the polar opposite of weewx are the more proprietary systems run by the weather station hardware providers &#8211; Davis being one key example, with their live data system, WeatherLink.</p>



<p>Davis Instruments is a respected name in the personal weather station market, occupying the premium &#8220;prosumer&#8221; space. Based in California, the company is renowned for producing accurate, durable, and reliable hardware that is arguably considered a gold standard by serious hobbyists, educators, and agricultural professionals.</p>



<p>Their flagship products are the <strong>Vantage Pro2</strong> (their expandable, top-of-the-line model) and the <strong>Vantage Vue</strong> (a more compact, all-in-one unit). For decades, the main challenge for Davis owners was getting the rich data from this excellent hardware onto a computer and the internet &#8211; we have run their Vantage Pro line since the 2000s and now use weewx &#8211; but they do have their own &#8216;out of the box&#8217; system for people who do not want to tinker with a Raspberry Pi.</p>



<p>Originally, connecting a Davis station required a piece of hardware called the <strong>WeatherLink Data Logger</strong>. This was a small module that slotted into the back of the weather station console and attached to a PC via a serial or USB cable. To get data online 24/7, a user had to leave a computer running constantly with the WeatherLink PC software. The data logger itself was a one-time hardware purchase, costing around £150&#8211;£250 &#8211; ours used to plug into our Windows machine and Weather Display.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/e9477f3e-033f-4101-910c-e235f1689a95_832x802.png" alt=""/></figure>



<p>The modern solution is thankfully far more elegant. Davis replaced the data logger with their <strong>WeatherLink Live </strong>unit in 2019. This small, standalone box acted as an internet bridge. It independently listened for the radio signals from the outdoor sensors and sent the data directly to their WeatherLink Cloud. This was the first device to eliminate the need for a console to be connected to a computer &#8211; it basically took the computer out of the workflow.</p>



<p>This was superseded by the <strong>WeatherLink Console (model 6313)</strong> in early 2023, replacing the notably dated previous iteration.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/c9222c82-63e0-4ee7-9c73-3f9074cdac56_1378x1040.png" alt=""/></figure>



<p>Today, WeatherLink refers to the entire cloud-based ecosystem (<code>WeatherLink.com</code> and the mobile app). When you buy and connect a WeatherLink Live device, you get access to this platform. It allows you to view your data from anywhere, see historical charts, and share your station publicly.</p>



<p>The web-based dashboard is, however, slightly ‘clunky’ and arguably badly designed:</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/10cac172-5be4-4dac-84e6-19e93c18ee95_2858x1250.png" alt=""/></figure>



<p>It also operates on a tiered subscription model, and viewing your own data in real time costs:</p>



<ol class="wp-block-list">
<li><p><strong>Basic Tier (Free):</strong> When you purchase your hardware (e.g., a Vantage Vue station + a WeatherLink Live device), you get the Basic tier for free. This includes:</p>
<ul class="wp-block-list">
<li>
<p>Your current, live data updated every minute.</p>
</li>



<li>
<p>A 24-hour chart of your data.</p>
</li>



<li>
<p>The ability to see your data on the app and website.</p>
</li>
</ul>
</li>



<li><p><strong>Pro Tier (Paid):</strong> This is the most common upgrade for enthusiasts. For a recurring fee, it unlocks much deeper access to your own data.</p>
<ul class="wp-block-list">
<li>
<p><strong>Cost:</strong> Approximately <strong>$3.95/month</strong> or <strong>$42.50/year</strong> (with prices subject to change and regional variation).</p>
</li>



<li><p><strong>Features:</strong></p>
<ul class="wp-block-list">
<li>
<p>Access to your full, detailed historical data archive.</p>
</li>



<li>
<p>Advanced, customisable charting tools to analyse your history.</p>
</li>



<li>
<p>More frequent &#8220;live&#8221; data updates on the web (every 2.5 seconds).</p>
</li>



<li>
<p>The ability to download your data archive as a CSV file.</p>
</li>
</ul>
</li>
</ul>
</li>
</ol>



<p>As such, we still have our Davis Vantage Pro plugged into a WeeWX system, which transmits the data through an MQTT server and bypasses any paywall on viewing our own data.</p>
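<p>As a rough illustration of that MQTT route (the topic and field names here are our own stand-ins, not the defaults of any weewx extension), publishing a record comes down to serialising it as JSON and handing it to a client such as paho-mqtt:</p>

```python
import json

def make_payload(record):
    # Serialise one archive record as JSON for the MQTT broker.
    return json.dumps({
        "dateTime": record["dateTime"],
        "outTemp_C": record["outTemp_C"],
        "barometer_hPa": record["barometer_hPa"],
    })

# With paho-mqtt installed and a broker reachable, publishing would be roughly:
#   import paho.mqtt.client as mqtt
#   client = mqtt.Client()
#   client.connect("broker.example.org", 1883)
#   client.publish("weather/loop", make_payload(record), retain=True)

record = {"dateTime": 1764600000, "outTemp_C": 9.7, "barometer_hPa": 1018.2}
print(make_payload(record))
```

<p>Any dashboard subscribed to the same topic then receives each new record the moment it is published, which is what makes the real-time displays later in this post possible.</p>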



<h3 class="wp-block-heading">Towards Personal Templates with Vibe Coding</h3>



<p>This would normally leave us with the Belchertown skin, and while excellent, the itch to edit and innovate is always strong, and with it the desire to pull in increasingly customised data feeds, drawing on multiple sources to gain a better overview of current environmental conditions. This would normally require a good understanding of data structures, application programming interfaces, cascading style sheets, JavaScript, and perhaps a bit of Firebase and databases. All of these are good to know and learn, but if you want a new look for your data and only have an hour free, then Vibe Coding is the current route to take. As in <a href="https://open.substack.com/pub/digitalurban/p/vibe-coding-from-ios-apps-to-an-asteroid?r=u7cv&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=true">our previous substack post</a>, the term &#8216;Vibe Coding&#8217; was defined by Andrej Karpathy, a co-founder of OpenAI, in early 2025. The basic idea is to rely entirely on Large Language Models &#8211; LLMs &#8211; and code using only natural language prompting, rather than writing the code yourself. At the moment we are exploring this new way to develop weather dashboards and are in the process of adding examples to our GitHub.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/97dbcfce-e0bd-45a3-9545-004e2794b0b8_2172x1434.png" alt=""/></figure>



<p>Our first example (see <a href="https://digitalurban.github.io/VibeWeatherDashboards/mixed_dashboard.html">https://digitalurban.github.io/VibeWeatherDashboards/mixed_dashboard.html</a>), pictured above, was created using Gemini 2.5 Pro, which is currently available for free for a month; the Pro version provides sufficient usage to create an unlimited number of dashboards. Vibe Coding is good &#8211; but it&#8217;s not that good. It won&#8217;t create the dashboard above in a single prompt; it takes a number of iterations, starting small and building up. It was built using the Canvas feature, which allows both a view of the code and a live preview, making it quick and easy to make changes. For example, our first prompt into Gemini was &#8216;make a 5-day forecast for &#8216;our area&#8217; using a free weather api and use flat icons&#8217;. This created the 5-day forecast using the Open-Meteo API. We subsequently pointed Gemini to our weewx data feed and asked it to use the same icon style to provide a dashboard covering the key weather indicators of temperature, wind, rain, pressure and solar. Once that was working, with a few edits back and forth, we then added the coloured background, which changes according to the outside temperature, ranging from a dark blue for -10&#160;°C through to a deep red for 35&#160;°C and above.</p>
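<p>That temperature-to-colour background comes down to a simple linear blend between two endpoint colours; the sketch below uses illustrative colour values, not the dashboard&#8217;s exact ones:</p>

```python
def temp_to_hex(temp_c):
    # Linearly blend from a dark blue at -10 C to a deep red at 35 C and above.
    # These endpoint RGB values are illustrative assumptions.
    cold, hot = (10, 20, 80), (140, 10, 20)
    t = max(0.0, min(1.0, (temp_c - (-10)) / (35 - (-10))))  # clamp to [0, 1]
    r, g, b = (round(c + t * (h - c)) for c, h in zip(cold, hot))
    return f"#{r:02x}{g:02x}{b:02x}"

print(temp_to_hex(-10))  # coldest end of the scale
print(temp_to_hex(35))   # hottest end of the scale
```

<p>In the dashboard itself the same idea runs in JavaScript, setting the page background each time a new temperature reading arrives.</p>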



<p>At the moment, we have it running on an old iPad, which we have mounted in a frame, sitting on a bookshelf in the corner of the room:</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/d5d36992-40bc-4b19-bfc3-1e2f02bf3bb5_4032x3024.jpeg" alt=""/></figure>



<p>Vibe coding has also enabled us to explore the creation of AI Weather Landscape dashboards, which take the raw output of the UK Met Office forecast and send it to the Google Vertex AI platform to interpret it as an image of an English Landscape, twice a day.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/04e83eb9-9581-4337-85c1-34f6ea5b0c07_2878x1502.png" alt=""/></figure>



<p>Finally, Vibe Coding has allowed us to experiment with real-time graphing, something that harks back to Cumulus, as their templates integrated a series of graphs as part of a drop-down menu. Our graphing displays data in real-time over a period of 24 hours, allowing trends to be visualised and also the graphs to update as new data points come in.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/5ff88b88-0d27-49ea-8a91-e5ebb001582e_2526x1464.png" alt=""/></figure>



<p>Creating the real-time graphs required setting up Firebase &#8211; something which is not overly self-explanatory. However, using Vibe Coding it took only an hour and a half to go from knowing little to nothing to having a full system up and running &#8211; it is all in the human language prompts and taking things in small steps. Our first prompt was &#8216;we have this data feed (insert mqtt feed link) and want to view it as a real-time graph, probably using Firebase, can you show us how to set it up and provide me with full code to view the data on a webpage&#8217; &#8211; this was 6 months ago using ChatGPT. Things move rapidly in this space, and now there is <a href="https://firebase.google.com/docs/studio">Firebase Studio</a>, which makes it even easier to get up and running.</p>
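<p>For the curious, pushing a reading into a Firebase Realtime Database ultimately boils down to a REST call against the database URL; the project URL and node path below are placeholders, not our actual setup:</p>

```python
import json
from urllib import request

# Placeholder project URL; a real one comes from the Firebase console.
FIREBASE_URL = "https://your-project-id-default-rtdb.firebaseio.com"

def reading_url(node="readings"):
    # Firebase's REST API addresses a database node by appending .json.
    return f"{FIREBASE_URL}/{node}.json"

def push_reading(reading):
    # POST appends the reading under an auto-generated key.
    data = json.dumps(reading).encode()
    req = request.Request(reading_url(), data=data, method="POST")
    req.add_header("Content-Type", "application/json")
    request.urlopen(req)  # live network call; needs a real project and rules

print(reading_url())
```

<p>A small bridge script subscribing to the MQTT feed and calling something like <code>push_reading</code> for each message is essentially what the Vibe-Coded setup produced, with the graphing webpage reading the same node back out.</p>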



<p>Having been using and making weather dashboards for 25 years, it is only in the last 6 months, with the arrival of Vibe Coding, that innovation has speeded up. We have moved from clunky Windows PC interfaces through Flash, skeuomorphism and flat design, and now into hyper-customised dashboards built with Vibe Coding. Yet there is one dashboard that I have kept looking at over the years and so far have been unable to replicate (mainly due to copyright issues, but also the code) &#8211; and that is an online version of the classic Instromet Climatica Weather System.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/0ad7a09e-94b3-4f51-80fb-4ca5b437fae1_2126x996.png" alt=""/></figure>



<p>It is perhaps the ultimate digital-analogue physical display, and one that is likely on every weather data enthusiast&#8217;s wish list. While the physical system remains out of reach for us, the developer over at <a href="https://weather.wilmslowastro.com/test/instromet/climatica/climatica.php">Wilmslow Weather has made an online version</a> (hidden away under his developer section) and it is lovely:</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/08/51199862-05f6-4d50-bc90-c07c3df461fc_2862x1354.png" alt=""/></figure>



<p>Sure, it&#8217;s full-on skeuomorphism &#8211; but the tween on the wind direction and wind speed is one of the best we have seen. So despite having access to the new opportunities of AI, it&#8217;s a physical design from a small <a href="https://instromet.co.uk/climatica-weather-station/">British company, Instromet</a>, that somehow still remains the best, in our view.</p>



<p>We would love to know which dashboards you like &#8211; do tell us in the comments, and if you liked this article, the subscribe link is of course below.</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/07/25/chasing-the-tween-my-25-year-obsession-with-weather-data/">Chasing the Tween: My 25-Year Obsession with Weather Data</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Vibe Coding: From iOS Apps to an Asteroid Clock and 3D Cities</title>
		<link>https://www.digitalurban.org/blog/2025/07/16/vibe-coding-from-ios-apps-to-an-asteroid-clock-and-3d-cities/</link>
		
		<dc:creator><![CDATA[Andy]]></dc:creator>
		<pubDate>Wed, 16 Jul 2025 20:38:17 +0000</pubDate>
				<category><![CDATA[3D Modelling]]></category>
		<category><![CDATA[Posts]]></category>
		<category><![CDATA[Gemini]]></category>
		<category><![CDATA[iOS App]]></category>
		<category><![CDATA[LLM]]></category>
		<category><![CDATA[Vibe Coding]]></category>
		<category><![CDATA[Weather]]></category>
		<guid isPermaLink="false">https://www.digitalurban.org/blog/2025/07/16/vibe-coding-from-ios-apps-to-an-asteroid-clock-and-3d-cities/</guid>

					<description><![CDATA[<p>Back in 2023, I wrote a blog post over on DigitalUrban.org creating an app called Frame-IT. It turns out I was using 'Vibe Coding' and it's changed the way I view the creation of almost everything...</p>
<p>The post <a href="https://www.digitalurban.org/blog/2025/07/16/vibe-coding-from-ios-apps-to-an-asteroid-clock-and-3d-cities/">Vibe Coding: From iOS Apps to an Asteroid Clock and 3D Cities</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<h4 class="wp-block-heading"><strong>Back in 2023, I wrote a blog post entitled Creating an app called Frame-IT. It turns out I was using &#8216;Vibe Coding&#8217; and it&#8217;s changed the way I view the creation of almost everything&#8230;</strong></h4>



<h2 class="wp-block-heading">Everything Changed</h2>



<p>The term &#8216;Vibe Coding&#8217; was defined by Andrej Karpathy, a co-founder of OpenAI, in early 2025. The basic idea is to rely entirely on Large Language Models &#8211; LLMs &#8211; and code using only natural language prompting, rather than writing the code yourself.</p>



<p>Back in 2023, I published &#8216;Frame-IT&#8217;, an iOS app on the Apple Store, written in Swift but coded completely in the then-early version of ChatGPT. As I noted on <a href="https://www.digitalurban.org/">DigitalUrban</a>, every app has a story, from the idea through to storyboarding, finding a design team, and working with in-house computer scientists or outsourcing. All of these steps take time and resources, and often, for the small would-be developer, the obstacles are overwhelming, meaning that the spark of an idea gets lost.</p>



<p>Yet everything has changed; the whole landscape of developing, designing and creating has been turned on its head. It has reached the point where the spark of an idea can be coded, designed, marketed and launched with the help of Artificial Intelligence. Frame-IT is a very simple app, doing a simple thing, but it came from an idea, a want, a desire to do something which, before AI, would have got lost in logistics and in finding the time of computer scientists, and would certainly have been economically unfeasible.</p>



<h3 class="wp-block-heading">The First Idea</h3>



<p>Frame-IT came about while looking at Samsung&#8217;s Frame televisions. Beyond being a normal 4K television, they blend into the background by displaying art, making it look as if it is mounted on a white background and enclosed in a frame &#8211; i.e. like a painting. Samsung does this well and the screens are excellent, but it is limited to artworks, leading to the idea that it would be nice to frame websites &#8211; specifically &#8216;data&#8217;-based websites &#8211; and hang data on walls, data that changes in real time. Of course, any screen could be used, but if you simply show data on a screen or monitor, no one notices. If you hang it in a frame, as if it&#8217;s art, people will notice.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/07/51d58c8b-8cc2-49af-a1cb-46048fa08d49_730x584.webp" alt="" /></figure>



<p>The concept was therefore simple, build an application that puts a picture frame around websites, allow users to select which websites to show and if there is more than one, cycle through them, like a dashboard. This would allow small screens, such as iPads, to act as frames and, via AirPlay, devices such as projectors or general screens to gain a look beyond their norm and display websites and data in the same way as hanging art on the wall.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/07/170669e6-abd6-4279-9228-a053860fff00_1536x1009.png" alt="" /></figure>



<p>The problem was, this was just an idea. To code it would require knowledge of Swift, a library of &#8216;picture frames&#8217; and probably a business case to justify the time of resident computer scientists (I work at University College London, so we have a few around), and to be honest it was such a simple idea I&#8217;m not sure anyone would have listened. So, with the rise of AI, I decided to build it myself, using AI from start to finish &#8211; starting from knowing little about Swift, and only as much about AI as my then Twitter feed was full of and from reading articles on sites such as The Verge.</p>



<p>The app took me two days to build &#8211; two days where I was also working, so mainly doing things in between other tasks, over lunch and a little in the evenings. With the initial learning curve out of the way, I could rebuild it in a day. The first thing I did was sign up to OpenAI to gain access to GPT-4, which was released on March 14th, 2023. This gave me the latest version and therefore the most up-to-date knowledge base, although looking back, version 3 would probably have been fine and I could simply have used the free tier. My first prompt was &#8220;Can you write me Swift code to take a website, centre it and add the image of a picture frame around it? The app should work full screen, in landscape mode&#8221;. I had a sample picture frame (taken from Wikipedia, with the actual image cut out to leave a transparent centre).</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/07/764c8b30-9da0-4793-ae7a-be228d42eb10_1000x1000.jpeg" alt="" /></figure>



<p>Within 60 seconds, GPT gave me a section of code and then stopped; it turns out there was a limit on the number of characters it could respond with. A quick Google search (ironically) showed me that by typing &#8216;continue&#8217;, ChatGPT would continue the code. With this I became a master of cutting and pasting, with the code often taking three continues &#8211; code which, while sometimes in code boxes, was also in plain text. It also allowed me to start to understand the layout and nature of Swift, and within 30 minutes I had my first app running on an iPad, via Xcode, simply by cutting and pasting and pressing &#8216;play&#8217;. Not everything worked out; there were often errors, but errors I could cut and paste back into GPT and it would solve them, most of the time.</p>



<p>Each time a milestone was reached &#8211; a working version, a version with a picture frame and a website showing &#8211; I would save the code, as ChatGPT would sometimes break the next version while I was asking for new features. Features such as &#8216;Add a button to move to the next website&#8217; and &#8216;If there is more than one website, then cycle through them every 60 seconds&#8217; (the websites were hard-coded at this point). Once I had a concept running on my own iPad, I realised it was quite neat and decided to adapt it so others might be able to use it. I added a settings page &#8211; via &#8216;Can you delete all the websites I have added and include a settings page, the settings page should be reached via a button and have the ability to add and delete websites, once added they should be saved so the app remembers them when reopened&#8217;. This is coding, but in the new language &#8211; human language.</p>



<p>The picture frames were created in Image Creator by Bing, a search engine I never thought I would use (being a Mac user), but in that last week I had not been near Google. Bing allowed me to type in a prompt and get images without any user limits. Simple prompts such as &#8220;create me a gold Victorian picture frame, it should be photorealistic with minimum reflections and the centre should be cut out&#8221; worked amazingly well, providing me with an almost limitless range of picture frames to include as part of the assets.</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/07/6635903d-3268-45db-b3ce-13e2ed6eb961_2048x1345-1.png" alt="" /></figure>



<p>Not all the picture frames are from AI, a couple are from open source imagery, it felt wrong to cut out the picture, but the frames have some provenance and they were nice to include.</p>



<p>Marketing requires imagery as well &#8211; so I took a very simple picture of my two test iPads and used the &#8216;PhotoRoom&#8217; AI app to transform it into two iPads on a bench with a concrete wall behind, whereas the reality was far from as glamorous.</p>



<p>The <a href="https://apps.apple.com/us/app/frame-it/id6447362214">app is now available via the Apple Store</a>; it even <a href="https://frame-it.framer.website/">has its own website</a> and logo &#8211; again designed by AI, using <a href="https://looka.com/">Looka</a>, and hosted on <a href="https://framer.com/">Framer</a>. So while it worked, it was a little painful with the limited code lengths of the early version of ChatGPT and the endless cutting, pasting and fixing &#8211; it was quick, but it still took a notable commitment of time. It is also a very, very niche app, but as we will get to, that&#8217;s the point of Vibe Coding.</p>



<h3 class="wp-block-heading">And then Everything Changed Again</h3>



<p>That was 2023. By late 2024, everything had changed again &#8211; not only the workflow and the tools, but also the ability of the LLMs to Vibe Code. No more cutting and pasting small chunks of code: code comes out ready to go, and any errors can be fixed across a complete code base, with notably fewer errors than before. As such, we made another app &#8211; again Vibe Coding, without knowing the term at the time &#8211; MQTTFrame.</p>



<p>The aim was to build on the concept of Frame-IT but allow real-time data integration directly in the app via MQTT (a lightweight, publish-subscribe network protocol for data) and to make it easy to subscribe to any MQTT topic and display live updates.</p>

<div class="pullquote">
<p>It is a niche thing to want to do, but that’s what Vibe Coding is great for &#8211; building those apps that you want, but perhaps few others do.</p>
</div>

<p>Our aim was to allow:</p>



<ul class="wp-block-list">
<li>
<p><strong>Customizable Frames</strong>: Choose from multiple high-quality decorative frames to complement your home or office décor.</p>
</li>



<li>
<p><strong>Clean and Clear Data Display</strong>: Showcase important data such as temperature, pressure, news, and more with clear typography and layout.</p>
</li>



<li>
<p><strong>Fullscreen Mode</strong>: Tap to seamlessly switch to an immersive fullscreen view, maximizing readability and aesthetics.</p>
</li>



<li>
<p><strong>Easy Management</strong>: Add, edit, and reorder topics quickly through an intuitive interface.</p>
</li>



<li>
<p><strong>Instant Updates</strong>: Receive real-time data updates over MQTT, ensuring you’re always in the know.</p>
</li>
</ul>



<p>And that&#8217;s what we did &#8211; but this time in a day, and with everything created in ChatGPT: icons, graphics, code, marketing text, and even a walkthrough of how to get it on the Apple Store (as it&#8217;s an oddly complex and easily forgotten process).</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/07/d4b1fad6-69b7-427f-8106-0d6459dca027_2014x1348.png" alt="" /></figure>



<p>So in one day of Vibe Coding, an app was taken from concept to ready to publish on the Apple Store. If you want to try it, it&#8217;s currently <a href="https://apps.apple.com/gb/app/mqtt-frame/id6743003699">available for free on the Apple Store</a>.</p>



<h3 class="wp-block-heading">Everything Again and Again</h3>



<p>That was the end of 2024. Now, in mid-2025, the term Vibe Coding has come into common use, and Large Language Models are more than capable of making anything from webpages to apps to writing posts (all of this was, however, written by an actual human; any grammatical errors are, of course, intentional). We have now switched from ChatGPT to Google Gemini and use it almost daily to produce apps or webpages that perhaps only I want in my life, such as an Asteroids Clock, made in 30 minutes (temporarily housed at <a href="https://finchamweather.co.uk/astroidsclock.html">https://finchamweather.co.uk/astroidsclock.html</a> &#8211; we will move it to GitHub as soon as we get the chance).</p>


<p>or a Procedural City Maker (<a href="https://finchamweather.co.uk/proceduralcity.html">https://finchamweather.co.uk/proceduralcity.html)</a></p>


<p>or a Weather Dashboard with the forecast rendered as an image of an English landscape, using data from the UK Met Office API:</p>



<figure class="wp-block-image size-large"><img decoding="async" src="https://www.digitalurban.org/wp-content/uploads/2025/07/43df66a8-a90e-4fe8-8edd-279c8d09aaaa_2908x1538.png" alt="" /></figure>



<p>You can view the dashboard live at <a href="https://finchamweather.co.uk/aiweatherimage.html">https://finchamweather.co.uk/aiweatherimage.html</a></p>
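<p>The idea behind the dashboard can be sketched in a few lines: map a forecast record onto descriptive phrases, then assemble them into an image-model prompt. The field names, weather-code mapping, and function name below are all assumptions &#8211; loosely modelled on the Met Office DataHub site-specific feed, not the dashboard's actual code:</p>

```python
# Sketch: turn a (simplified) Met Office-style forecast record into a text
# prompt for an image model. Field names and codes are illustrative
# assumptions, loosely modelled on the DataHub site-specific hourly feed.

WEATHER_CODES = {
    1: "clear sunny skies",
    3: "partly cloudy skies",
    7: "grey overcast cloud",
    12: "light rain",
    15: "heavy rain",
    24: "light snow",
}

def forecast_to_prompt(record: dict) -> str:
    """Map a forecast record onto descriptive phrases for an image prompt."""
    sky = WEATHER_CODES.get(record.get("significantWeatherCode"), "changeable weather")
    temp = record.get("screenTemperature")
    feel = "a warm summer day" if temp is not None and temp >= 18 else "a cool day"
    return (f"An English landscape painting, {sky}, {feel}, "
            f"rolling fields and hedgerows, in the style of an oil painting")

prompt = forecast_to_prompt({"significantWeatherCode": 12, "screenTemperature": 9.5})
```

<p>The resulting string would then be handed to an image-generation model, with the dashboard refreshing the picture as each new forecast arrives.</p>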



<p>All small, niche things that somehow I love, and I am happy that Vibe Coding allowed them to exist &#8211; that&#8217;s the magic: Vibe Coding lets you bring anything in your imagination into existence. It&#8217;s a great time to be making digital things.</p>



<p>If you liked this post, please do subscribe &#8211; it gives us a good reason to write another one&#8230;</p>


<hr class="wp-block-separator has-css-opacity" />
<div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://digitalurban.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM">
<div class="subscription-widget show-subscribe">
<div class="preamble">
<p class="cta-caption">Thanks for reading Digitalurban’s Substack! Subscribe for free to receive new posts and support my work.</p>
</div>
<form class="subscription-widget-subscribe"><input class="email-input" tabindex="-1" name="email" type="email" placeholder="Type your email…" /><input class="button primary" type="submit" value="Subscribe" />
<div class="fake-input-wrapper">
<div class="fake-input"> </div>
<div class="fake-button"> </div>
</div>
</form></div>
</div><p>The post <a href="https://www.digitalurban.org/blog/2025/07/16/vibe-coding-from-ios-apps-to-an-asteroid-clock-and-3d-cities/">Vibe Coding: From iOS Apps to an Asteroid Clock and 3D Cities</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>MQTT Frame &#8211; Beautifully Framed Live Data</title>
		<link>https://www.digitalurban.org/blog/2025/03/19/mqttframe/</link>
		
		<dc:creator><![CDATA[Andy]]></dc:creator>
		<pubDate>Wed, 19 Mar 2025 18:03:40 +0000</pubDate>
				<category><![CDATA[ios App]]></category>
		<category><![CDATA[dashboard]]></category>
		<category><![CDATA[Data]]></category>
		<category><![CDATA[frame]]></category>
		<category><![CDATA[mqtt]]></category>
		<category><![CDATA[realtime]]></category>
		<guid isPermaLink="false">https://www.digitalurban.org/?p=7886</guid>

					<description><![CDATA[<p>The post <a href="https://www.digitalurban.org/blog/2025/03/19/mqttframe/">MQTT Frame &#8211; Beautifully Framed Live Data</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></description>
										<content:encoded><![CDATA[
		<div id="fws_69ca487ec0804"  data-column-margin="default" data-midnight="dark"  class="wpb_row vc_row-fluid vc_row"  style="padding-top: 0px; padding-bottom: 0px; "><div class="row-bg-wrap" data-bg-animation="none" data-bg-animation-delay="" data-bg-overlay="false"><div class="inner-wrap row-bg-layer" ><div class="row-bg viewport-desktop"  style=""></div></div></div><div class="row_col_wrap_12 col span_12 dark left">
	<div  class="vc_col-sm-12 wpb_column column_container vc_column_container col no-extra-padding inherit_tablet inherit_phone "  data-padding-pos="all" data-has-bg-color="false" data-bg-color="" data-bg-opacity="1" data-animation="" data-delay="0" >
		<div class="vc_column-inner" >
			<div class="wpb_wrapper">
				
<div class="wpb_text_column wpb_content_element " >
	<div class="wpb_wrapper">
<p>Transform your iPhone into a beautifully framed, dynamic data display with MQTT Frame. Stay informed about environmental conditions, breaking news, or custom live data feeds, elegantly presented within a selection of artistic frames.</p>
<p>

<a href='https://www.digitalurban.org/blog/2025/03/19/mqttframe/img_5262/'><img loading="lazy" decoding="async" width="473" height="1024" src="https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5262-473x1024.png" class="attachment-large size-large" alt="" srcset="https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5262-473x1024.png 473w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5262-139x300.png 139w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5262-768x1662.png 768w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5262-710x1536.png 710w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5262-947x2048.png 947w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5262.png 1284w" sizes="auto, (max-width: 473px) 100vw, 473px" /></a>
<a href='https://www.digitalurban.org/blog/2025/03/19/mqttframe/img_5261/'><img loading="lazy" decoding="async" width="473" height="1024" src="https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5261-473x1024.png" class="attachment-large size-large" alt="" srcset="https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5261-473x1024.png 473w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5261-139x300.png 139w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5261-768x1662.png 768w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5261-710x1536.png 710w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5261-947x2048.png 947w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5261.png 1284w" sizes="auto, (max-width: 473px) 100vw, 473px" /></a>
<a href='https://www.digitalurban.org/blog/2025/03/19/mqttframe/img_5264/'><img loading="lazy" decoding="async" width="473" height="1024" src="https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5264-473x1024.png" class="attachment-large size-large" alt="" srcset="https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5264-473x1024.png 473w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5264-139x300.png 139w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5264-768x1662.png 768w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5264-710x1536.png 710w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5264-947x2048.png 947w, https://www.digitalurban.org/wp-content/uploads/2025/03/IMG_5264.png 1284w" sizes="auto, (max-width: 473px) 100vw, 473px" /></a>
</p>
<h3 class="wp-block-heading">Key Features:</h3>
<ul>
<li><strong>Dynamic MQTT Integration</strong>: Easily subscribe to any MQTT topic and display live updates.</li>
<li><strong>Customizable Frames</strong>: Choose from multiple high-quality decorative frames to complement your home or office décor.</li>
<li><strong>Clean and Clear Data Display</strong>: Showcase important data such as temperature, pressure, news, and more with clear typography and layout.</li>
<li><strong>Fullscreen Mode</strong>: Tap to seamlessly switch to an immersive fullscreen view, maximizing readability and aesthetics.</li>
<li><strong>Easy Management</strong>: Add, edit, and reorder topics quickly through an intuitive interface.</li>
<li><strong>Instant Updates</strong>: Receive real-time data updates over MQTT, ensuring you’re always in the know.</li>
</ul>
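<p>To feed a display like this, any client simply publishes to the topic the app subscribes to. As an illustration of what travels over the wire, here is a sketch in Python that hand-builds an MQTT 3.1.1 PUBLISH packet (QoS 0); in practice a client library such as paho-mqtt does this for you, and the topic name below is purely an example:</p>

```python
def mqtt_publish_packet(topic: str, payload: str) -> bytes:
    """Build a minimal MQTT 3.1.1 PUBLISH packet (QoS 0, no retain/dup flags)."""
    t = topic.encode("utf-8")
    p = payload.encode("utf-8")
    # Variable header: 2-byte big-endian topic length, then topic, then payload.
    variable = len(t).to_bytes(2, "big") + t + p
    # Encode the "remaining length" using MQTT's variable-length scheme
    # (7 bits per byte, high bit set while more bytes follow).
    remaining = len(variable)
    encoded = b""
    while True:
        byte = remaining % 128
        remaining //= 128
        if remaining:
            encoded += bytes([byte | 0x80])
        else:
            encoded += bytes([byte])
            break
    # 0x30 = PUBLISH packet type with QoS 0 and no flags.
    return bytes([0x30]) + encoded + variable

pkt = mqtt_publish_packet("home/temperature", "21.4")
```

<p>Sending those bytes over a TCP socket to a broker (after the usual CONNECT handshake) is exactly what a full client library does under the hood.</p>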
<h3 class="wp-block-heading">Perfect For:</h3>
<ul>
<li><strong>Home Assistant/Home Automation Enthusiasts</strong>: Display sensor data, home statuses, or environmental conditions.</li>
<li><strong>Personal Use</strong>: Stay updated with the latest news headlines, weather conditions, or any custom data stream.</li>
</ul>
<p>MQTT Frame merges elegant design with powerful real-time MQTT functionality, creating the perfect smart display for any setting.</p>
<h5 class="wp-block-heading">MQTT Frame is currently <a href="https://apps.apple.com/gb/app/mqtt-frame/id6743003699">available for free on the App Store</a>; for support, DM @digitalurban on X &#8211; and bring your data to life in a beautiful new way.</h5>
	</div>
</div>




			</div> 
		</div>
	</div> 
</div></div><p>The post <a href="https://www.digitalurban.org/blog/2025/03/19/mqttframe/">MQTT Frame &#8211; Beautifully Framed Live Data</a> appeared first on <a href="https://www.digitalurban.org">Digital Urban</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
