<?xml version="1.0" encoding="UTF-8" standalone="no"?><rss xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#" xmlns:georss="http://www.georss.org/georss" xmlns:slash="http://purl.org/rss/1.0/modules/slash/" xmlns:sy="http://purl.org/rss/1.0/modules/syndication/" version="2.0"><channel><title>Elio Struyf</title><description>SharePoint Branding &amp; Development Blog</description><link>https://www.eliostruyf.com/</link><language>en-us</language><lastBuildDate>Thu, 09 Apr 2026 22:20:18 GMT</lastBuildDate><atom:link href="https://www.eliostruyf.com/feed.xml" rel="self" type="application/rss+xml"/><item><title>Why GitHub Copilot CLI is my new go-to for heavy lifting</title><link>https://www.eliostruyf.com/github-copilot-cli-heavy-lifting/</link><guid isPermaLink="true">https://www.eliostruyf.com/github-copilot-cli-heavy-lifting/</guid><description>Discover why GitHub Copilot CLI is essential for managing large codebases efficiently, enhancing performance without sacrificing context.</description><pubDate>Fri, 20 Mar 2026 12:26:13 GMT</pubDate><content:encoded>&lt;p&gt;As developers, we are always trying to find the most efficient way to build ship, and debug our code. Recently, I’ve been doing a lot of work in a gigantic monorepo for a customer—think 75+ sub-projects bundled into one repository. With such a massive codebase, I found myself hitting performance issues when using the GitHub Copilot Chat extension in Visual Studio Code. One of the main reasons is that Visual Studio Code extensions share the same resources, so when you are making a lot of changes and the Git integration is busy processing them, it ends up competing for the same resources Copilot Chat needs as well. If you want to do some manual changes at the same time, and require formatting updates on save, it can lead to a very laggy experience.&lt;/p&gt;
&lt;p&gt;Due to these performance issues, I decided to start exploring the GitHub Copilot CLI, which runs in a completely separate process and doesn’t rely on Visual Studio Code’s resources. In this post, I’ll share my experience with the Copilot CLI and how it has become my new go-to for heavy lifting in large codebases.&lt;/p&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;Don’t get me wrong, I love the &lt;a href="https://github.com/features/copilot"&gt;GitHub Copilot Chat&lt;/a&gt; extension in &lt;a href="https://code.visualstudio.com/"&gt;Visual Studio Code&lt;/a&gt;. The ability to quickly reference file selections and ask questions directly in the editor is fantastic, but when you are making repo-wide changes across dozens of sub-projects, things start to break down.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="the-challenge-with-gigantic-monorepos"&gt;The challenge with gigantic monorepos&lt;a aria-hidden="true" tabindex="-1" href="#the-challenge-with-gigantic-monorepos"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;When you ask Copilot Chat to execute complex, multi-file changes in a massive monorepo, it uses your Visual Studio Code instance’s resources. In my experience with this specific customer project, asking for a substantial change would cause my Visual Studio Code instance to freeze and stop responding.&lt;/p&gt;
&lt;p&gt;It takes ages before changes are applied, and you are left staring at a lagging editor. One workaround I found was to disable the Git integration in Visual Studio Code, but that is hardly a sustainable solution, as it breaks GitHub Copilot Chat’s ability to implement changes.&lt;/p&gt;
&lt;p&gt;I needed a way to leverage the power of Copilot’s agents without sacrificing my IDE’s performance.&lt;/p&gt;
&lt;h2 id="the-workflow-shift-running-outside-the-ide"&gt;The workflow shift: running outside the IDE&lt;a aria-hidden="true" tabindex="-1" href="#the-workflow-shift-running-outside-the-ide"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;The biggest advantage of the Copilot CLI is that it runs in its own, completely separate process. It doesn’t use Visual Studio Code’s resources to do the heavy lifting.&lt;/p&gt;
&lt;p&gt;Initially, I hesitated to move to the CLI because I didn’t want to lose the context-awareness of the Chat interface. I loved being able to point to a specific file or code block. However, the CLI now features a brilliant integration with Visual Studio Code. If you have a Visual Studio Code instance open, the CLI can automatically connect to it and leverage the same context-awareness as the Chat extension.&lt;/p&gt;
&lt;p&gt;Here is how my workflow looks in practice:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;I open my project in Visual Studio Code.&lt;/li&gt;
&lt;li&gt;I open &lt;a href="https://ghostty.org/"&gt;Ghostty&lt;/a&gt; (my terminal of choice) in a separate window.&lt;/li&gt;
&lt;li&gt;I run the &lt;code&gt;copilot&lt;/code&gt; command.&lt;/li&gt;
&lt;li&gt;The CLI automatically connects to my active Visual Studio Code instance, and I can start prompting.&lt;/li&gt;
&lt;/ol&gt;
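&lt;p&gt;The steps above can be sketched as a short terminal session. This is only a sketch: it assumes the CLI is installed through npm and that the &lt;code&gt;@github/copilot&lt;/code&gt; package name from GitHub’s documentation still applies, so verify both against the current docs before running it.&lt;/p&gt;

```shell
# One-time install of the GitHub Copilot CLI
# (package name assumed from GitHub's docs; check before running).
npm install -g @github/copilot

# From the repository root, start an interactive session.
# With a VS Code window open on the same folder, the CLI can connect
# to it and reuse the editor's context.
copilot
```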
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/03/copilot-cli-connected.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRnAAAABXRUJQVlA4WAoAAAAQAAAACQAAAgAAQUxQSB8AAAAAgLe3t7e3t7e3gLb//////////7aDu7u7u7u7u7uDAFZQOCAqAAAA0AEAnQEqCgADAAFAJiWcAnQBER7zFIAA/v64ygJcHG7JHUZgNyyLgAAA" data-src="https://www.eliostruyf.com/uploads/2026/03/copilot-cli-connected.webp" alt="GitHub Copilot CLI connected to VS Code" style="width:997px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;GitHub Copilot CLI connected to VS Code&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;aside class="callout callout-info" aria-label="info"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;circle cx="12" cy="12" r="10"&gt;&lt;/circle&gt;&lt;path d="M12 16v-4"&gt;&lt;/path&gt;&lt;path d="M12 8h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;info&lt;/p&gt;&lt;div class="callout-text"&gt;Because Copilot now has its own dedicated process running in my terminal, the performance is much faster than executing the same tasks inside the Visual Studio Code terminal (as the terminal also uses Visual Studio Code’s resources) or Chat pane.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="bridging-the-gap-context-in-the-cli"&gt;Bridging the gap: context in the CLI&lt;a aria-hidden="true" tabindex="-1" href="#bridging-the-gap-context-in-the-cli"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;You might be wondering: &lt;em&gt;“If you are in a separate terminal, how do you pass context to the AI?”&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;The CLI-to-VSCode integration handles this beautifully. Once your CLI session is connected to your editor, you have two great options:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;You can naturally use &lt;code&gt;@&lt;/code&gt; mentions in the CLI prompt to reference specific files.&lt;/li&gt;
&lt;li&gt;You can right-click directly in your Visual Studio Code editor and select &lt;strong&gt;Add file/selection to Copilot CLI&lt;/strong&gt; from the context menu.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;You get the performance of a standalone CLI process with the exact same UX conveniences of the Chat extension.&lt;/p&gt;
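&lt;p&gt;To make the first option concrete, here is what an &lt;code&gt;@&lt;/code&gt;-mention prompt can look like inside a session; both file paths are invented for illustration:&lt;/p&gt;

```shell
# Inside a running `copilot` session, @ mentions scope the prompt
# to specific files (the paths below are made-up examples):
#
#   > Update @packages/shared/src/labels.ts and @apps/web/src/nav.ts
#     to use the new flat label structure
```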
&lt;h2 id="why-the-cli-is-a-game-changer-for-autonomous-tasks"&gt;Why the CLI is a game-changer for autonomous tasks&lt;a aria-hidden="true" tabindex="-1" href="#why-the-cli-is-a-game-changer-for-autonomous-tasks"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Besides the massive performance boost, the CLI has fundamentally changed how I approach larger tasks.&lt;/p&gt;
&lt;h3 id="zero-gui-friction"&gt;Zero GUI friction&lt;a aria-hidden="true" tabindex="-1" href="#zero-gui-friction"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;With the Chat interface, you are constantly managing the GUI. You have to click “Keep” or “Undo” for every proposed change. The CLI excels at autonomous task completion. It just does the work. You don’t suffer from GUI overhead; you simply review the resulting changes in your Git diffs later, which is how we naturally review code anyway.&lt;/p&gt;
&lt;h3 id="running-parallel-tasks-with-fleet"&gt;Running parallel tasks with fleet&lt;a aria-hidden="true" tabindex="-1" href="#running-parallel-tasks-with-fleet"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;This is perhaps the coolest feature of the CLI. In Copilot Chat, if you want multiple things done simultaneously, you have to open multiple chat sessions and prompt each one individually.&lt;/p&gt;
&lt;p&gt;The Copilot CLI introduces the &lt;code&gt;/fleet&lt;/code&gt; command, which allows you to pass a single prompt that the CLI splits into multiple actions, working on them in parallel.&lt;/p&gt;
&lt;p&gt;For example, I can run:&lt;/p&gt;
&lt;div class="expressive-code"&gt;&lt;link rel="stylesheet" href="/_astro/ec.gf7e9.css"&gt;&lt;script type="module" src="/_astro/ec.8zarh.js"&gt;&lt;/script&gt;&lt;figure class="frame is-terminal"&gt;&lt;figcaption class="header"&gt;&lt;span class="title"&gt;&lt;/span&gt;&lt;span class="sr-only"&gt;Terminal window&lt;/span&gt;&lt;/figcaption&gt;&lt;pre data-language="bash"&gt;&lt;code&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;1&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#B392F0;--1:#6F42C1"&gt;/fleet&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;check&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;all&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;my&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;projects&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;to&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;update&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;the&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;labels&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;to&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;the&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;new&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span 
style="--0:#9ECBFF;--1:#032F62"&gt;flat&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;structure&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;/code&gt;&lt;/pre&gt;&lt;div class="copy"&gt;&lt;button title="Copy to clipboard" data-copied="Copied!" data-code="/fleet check all my projects to update the labels to the new flat structure"&gt;&lt;div&gt;&lt;/div&gt;&lt;/button&gt;&lt;/div&gt;&lt;/figure&gt;&lt;/div&gt;
&lt;p&gt;The CLI orchestrates the tasks concurrently, saving an immense amount of time. You can read more about this in the official &lt;a href="https://docs.github.com/en/copilot/concepts/agents/copilot-cli/fleet"&gt;Copilot CLI fleet documentation&lt;/a&gt;.&lt;/p&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/03/copilot-cli-fleet.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRnoAAABXRUJQVlA4WAoAAAAQAAAACQAABAAAQUxQSCwAAAABL6CQbQTIH2oQ87lHIyLiLAYG2jbZHaDhwcGDASzg3w4aIvofk+QE1Ne2F1ZQOCAoAAAA0AEAnQEqCgAFAAFAJiWcAnQBDw4bOIAA/v6JawMWO3eGkhRRRy0IAA==" data-src="https://www.eliostruyf.com/uploads/2026/03/copilot-cli-fleet.webp" alt="Parallel tasks for GitHub Copilot CLI" style="width:1076px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;Parallel tasks for GitHub Copilot CLI&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;h2 id="when-i-still-use-copilot-chat"&gt;When I still use Copilot Chat&lt;a aria-hidden="true" tabindex="-1" href="#when-i-still-use-copilot-chat"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I haven’t abandoned Chat entirely. It still has its place.&lt;/p&gt;
&lt;p&gt;If I simply need to implement a quick single-file utility function, or if I need the AI to quickly explain a block of code I’m looking at, the inline Chat is still my go-to. It is perfect for small tasks. For everything else, such as heavy agent usage, repo-wide refactors, or any time I’m working in a massive monorepo, the CLI is the only way to go.&lt;/p&gt;
&lt;p&gt;Let me know what you think when you try it out!&lt;/p&gt;
&lt;h2 id="resources"&gt;Resources&lt;a aria-hidden="true" tabindex="-1" href="#resources"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/features/copilot"&gt;GitHub Copilot&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://code.visualstudio.com/"&gt;Visual Studio Code&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://ghostty.org/"&gt;Ghostty Terminal&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.github.com/en/copilot/concepts/agents/copilot-cli/fleet"&gt;Copilot CLI Fleet Documentation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/github/copilot-cli-for-beginners"&gt;GitHub Copilot CLI for Beginners (GitHub Repo)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the &lt;a href="https://marketplace.visualstudio.com/items?itemName=eliostruyf.vscode-ghostwriter"&gt;Ghostwriter for VS Code&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item><item><title>Switching from Apple Watch Ultra to Garmin Venu 4</title><link>https://www.eliostruyf.com/apple-watch-ultra-to-garmin-venu-4/</link><guid isPermaLink="true">https://www.eliostruyf.com/apple-watch-ultra-to-garmin-venu-4/</guid><description>Why I traded my Apple Watch Ultra for a Garmin Venu 4 to complete my cycling training data.</description><pubDate>Thu, 19 Mar 2026 15:09:06 GMT</pubDate><content:encoded>&lt;p&gt;As a developer and avid cyclist, I am always curious about how data can improve my day-to-day life and training. For the longest time, my setup was pretty standard: a Garmin Edge on the bike for my workouts and an Apple Watch Ultra on my wrist for everything else.&lt;/p&gt;
&lt;p&gt;But there was a problem: the “data black hole.”&lt;/p&gt;
&lt;p&gt;Because I use &lt;a href="https://join.cc"&gt;Join&lt;/a&gt; alongside Garmin to manage my training schedule, I ran into a major wall. Garmin doesn’t pull metrics like sleep or recovery from Apple Health. My Edge knew exactly how much power I was putting down during a ride, but it knew absolutely nothing about my recovery, sleep, or overall fatigue. Without that 24/7 context, my Training Readiness score couldn’t be calculated properly, and the workouts Join suggested weren’t adapting to my actual physical state.&lt;/p&gt;
&lt;p&gt;That made me decide to change my hardware. This is the story of how and why I switched to the Garmin Venu 4.&lt;/p&gt;
&lt;h2 id="the-choice-why-the-venu-4"&gt;The choice: why the Venu 4?&lt;a aria-hidden="true" tabindex="-1" href="#the-choice-why-the-venu-4"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;A couple of my cycling friends have Garmin watches and raved about their experience. I knew I needed a Garmin wearable to feed that sweet off-bike data into my ecosystem, but I didn’t want a hardcore running watch like a Forerunner or Fenix. The Garmin Edge handles all my cycling navigation and tracking; I just needed a lifestyle watch that tracks health metrics accurately.&lt;/p&gt;
&lt;p&gt;I went with the Garmin Venu 4. It looks great, features a round face (which is more my style), and comes with a reasonable price tag.&lt;/p&gt;
&lt;h2 id="the-first-few-weeks-patience-is-required"&gt;The first few weeks: patience is required&lt;a aria-hidden="true" tabindex="-1" href="#the-first-few-weeks-patience-is-required"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I must admit, the first day wasn’t perfect. Setting up the Venu 4 required four separate software updates back-to-back. Why not just package them into one? It’s a small quirk, but the onboarding experience could have been smoother.&lt;/p&gt;
&lt;p&gt;More importantly, you can’t just strap on a Garmin and expect instant, life-changing insights. Features like Training Readiness, HRV Status, and overall Health Status require about two to three weeks of continuous wear to establish a baseline. Before that, your training status might just stubbornly say “Unproductive.” Pulse Ox data and broader health trends also take time to appear.&lt;/p&gt;
&lt;aside class="callout callout-info" aria-label="info"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;circle cx="12" cy="12" r="10"&gt;&lt;/circle&gt;&lt;path d="M12 16v-4"&gt;&lt;/path&gt;&lt;path d="M12 8h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;info&lt;/p&gt;&lt;div class="callout-text"&gt;You will get some immediate scores, but no trend lines. For things like HRV, the app literally tells you to keep wearing it for three weeks before it can give you the data.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h3 id="the-medication-curveball"&gt;The medication curveball&lt;a aria-hidden="true" tabindex="-1" href="#the-medication-curveball"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Just as I got past that initial baseline period, I had an unexpected physiological shift. I recently stopped taking beta-blockers, which naturally keep your heart rate lower. My resting heart rate increased by about 10-15 beats per minute, and my peak output changed as well.&lt;/p&gt;
&lt;p&gt;The Garmin immediately noticed. Suddenly, my HRV was flagged as “off,” my training status was red, and the watch assumed I was constantly fatigued. It was fascinating to see the algorithms react to my biology, but it meant I had to start that 2-3 week calibration clock all over again.&lt;/p&gt;
&lt;p&gt;By around week four, the Health Status view finally started showing meaningful trends for resting heart rate, HRV, respiration, skin temperature, and pulse oximeter. That was the point where the platform started to feel genuinely useful instead of just “collecting numbers.”&lt;/p&gt;
&lt;h2 id="smartwatch-features-and-the-apple-ecosystem-hangover"&gt;Smartwatch features and the Apple ecosystem hangover&lt;a aria-hidden="true" tabindex="-1" href="#smartwatch-features-and-the-apple-ecosystem-hangover"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Don’t get me wrong, the Apple Watch Ultra is a vastly superior &lt;em&gt;smartwatch&lt;/em&gt;. There are definitely things I miss:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Ecosystem integration:&lt;/strong&gt; I used to walk up to my Mac, and it would seamlessly unlock. Now, I use the fingerprint scanner on my keyboard. It works, and it’s only a second slower, but it’s a slight friction point.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Notification handling:&lt;/strong&gt; Apple clears notifications across all devices once read. Garmin still struggles a bit with this unified approach (a limitation imposed by Apple).&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Screen real estate:&lt;/strong&gt; Because the Venu 4 is round and has a relatively fat bezel, you lose a lot of corner space. When reading long messages, you have to scroll the text right into the middle horizontal strip of the screen to read it properly.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;I’ve also experienced a couple of random freezes where a notification would come in, or I’d start an activity, and the screen would just go black and reboot. It’s not a dealbreaker, but the interface is noticeably less responsive than Apple’s.&lt;/p&gt;
&lt;h2 id="battery-life-changes-behavior"&gt;Battery life changes behavior&lt;a aria-hidden="true" tabindex="-1" href="#battery-life-changes-behavior"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;This is where the Venu 4 shines. The battery life actually changes how you interact with the device.&lt;/p&gt;
&lt;p&gt;With the Apple Watch, I had a strict daily charging ritual. I rarely wore it to bed because I didn’t want to deal with a dead battery the next day, plus I liked keeping electronics out of the bedroom.&lt;/p&gt;
&lt;p&gt;With the Venu 4’s AMOLED display set to “Always On,” I still get a solid week of battery life. In one cycle, it dropped to 58% after about four days and was down to 18% after roughly a week, which lined up with Garmin’s estimate.&lt;/p&gt;
&lt;p&gt;When I turned Always On off to test it, I had 69% battery left after four days, with the watch predicting another eight days of life.&lt;/p&gt;
&lt;p&gt;Because I don’t have to babysit the battery, I sometimes forget I’m even wearing it. I now sleep with it on every single night, which allows me to finally capture that crucial sleep and recovery data. My charging ritual is now just throwing it on the charger once a week while taking a shower after a hard workout.&lt;/p&gt;
&lt;h2 id="nice-to-have"&gt;Nice to have&lt;a aria-hidden="true" tabindex="-1" href="#nice-to-have"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;The built-in flashlight is a genuinely handy extra. I have already used it once when I needed quick light and my phone was not nearby. I probably will not use it every day, but it is a handy backup when your phone is out of reach.&lt;/p&gt;
&lt;h2 id="the-verdict-for-cyclists"&gt;The verdict for cyclists&lt;a aria-hidden="true" tabindex="-1" href="#the-verdict-for-cyclists"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;If you are a cyclist who uses an Edge unit for the real work but relies on an Apple Watch for daily wear, you are missing half the picture.&lt;/p&gt;
&lt;p&gt;The Apple Watch is a brilliant smart assistant, but the Garmin Venu 4 acts as an actual training partner. It bridges the gap between rides, taking the guesswork out of recovery. Once you survive the initial setup and the baseline calibration, having your holistic health data dictate your training readiness is a massive upgrade.&lt;/p&gt;
&lt;p&gt;For my use case, it is the right middle ground: great for cyclists who already rely on an Edge and want better off-bike data, without paying for running-first features from the Forerunner/Fenix line.&lt;/p&gt;
&lt;p&gt;Let me know what you think, or if you’ve made a similar jump!&lt;/p&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the &lt;a href="https://marketplace.visualstudio.com/items?itemName=eliostruyf.vscode-ghostwriter"&gt;Ghostwriter for VS Code&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item><item><title>Control Your Mac with a 3-Button Voice Setup</title><link>https://www.eliostruyf.com/control-mac-3-button-voice-setup/</link><guid isPermaLink="true">https://www.eliostruyf.com/control-mac-3-button-voice-setup/</guid><description>Discover how to control your Mac effortlessly with a 3-button voice setup for a seamless workflow and enhanced productivity.</description><pubDate>Wed, 11 Mar 2026 12:00:00 GMT</pubDate><content:encoded>&lt;p&gt;A while back, I wrote a post called &lt;a href="https://www.eliostruyf.com/stop-typing-start-talking/"&gt;Stop typing, start talking: How voice dictation changed my workflow&lt;/a&gt;, where I shared how I was shifting my workflow to use my voice more and more. At the time, I was still relying heavily on my main keyboard to trigger these voice commands. It was an improvement, certainly, but it wasn’t perfect. 
I still had to wrangle my fingers into odd positions to hit complex shortcut combinations.&lt;/p&gt;
&lt;p&gt;As a developer, I’m always curious about optimizing my physical workspace. I wanted a way to trigger my voice tools without thinking, and without looking down. So, I decided to buy a 3-button keyboard (actually 4, but I only use three).&lt;/p&gt;
&lt;p&gt;In this post, I will share how I use a tiny, three-button keyboard to control my entire machine with my voice.&lt;/p&gt;
&lt;h2 id="the-hardware-three-buttons-to-rule-them-all"&gt;The hardware: three buttons to rule them all&lt;a aria-hidden="true" tabindex="-1" href="#the-hardware-three-buttons-to-rule-them-all"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Nowadays, I use a new programmable keyboard with just three buttons (well, technically four, but I only need three). It is the SinLoon Mini Programmable Mechanical Keyboard.&lt;/p&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/03/mini-keyboard.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRp4AAABXRUJQVlA4IJIAAADwAQCdASoKAA0AAUAmJbACdAEU7Y+bpsAA/vbCaNPtKPtpL0O9HO7AGIBgUfhNhTykPXlMSWZ+xFsAf4/Pj4i1tQ8uH/XRo37jkNnx/Rwi8Q5K08H28fRPWtaeJcYy7ToapZpRh//lppkx/cSMH96op9Gjyh1EESj/iHcUH/mZYYtjd04kn6LcLf9uRLpqC8AAAA==" data-src="https://www.eliostruyf.com/uploads/2026/03/mini-keyboard.webp" alt="My 3-button SinLoon macro keyboard setup" style="width:1200px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;My 3-button SinLoon macro keyboard setup&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;p&gt;The goal is simple: instead of hitting &lt;code&gt;Ctrl+Alt+Shift+Something&lt;/code&gt;, I just press a single chunky button.&lt;/p&gt;
&lt;p&gt;I must admit, the setup for the hardware has one slight catch. The SinLoon keyboard uses a custom app that you have to download via a Google Drive link. That sounds a bit shady, but it is totally fine. The real downside is that the configuration app only works on Windows, while I do all my development work on macOS.&lt;/p&gt;
&lt;p&gt;To map the keys, I plugged the keyboard into a Windows machine, opened the app, and set the shortcuts by clicking the virtual keys in the UI window. Once you send the keybindings to the macro board, they are saved directly on the device. When I plug it back into my Mac, it works flawlessly.&lt;/p&gt;
&lt;h2 id="the-software-translating-buttons-to-actions"&gt;The software: translating buttons to actions&lt;a aria-hidden="true" tabindex="-1" href="#the-software-translating-buttons-to-actions"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I configured the three buttons to trigger the specific tools that power my voice-first workflow:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Button 1:&lt;/strong&gt; Triggers &lt;a href="https://handy.computer/"&gt;Handy&lt;/a&gt;, my go-to tool for transcribing voice to text anywhere. It sends &lt;code&gt;Ctrl+Alt+Shift+R&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Button 2:&lt;/strong&gt; Triggers &lt;a href="https://github.com/estruyf/VoiceSnippets"&gt;VoiceSnippets&lt;/a&gt;, a custom app I built for running commands and text expansion based on spoken trigger words. It sends &lt;code&gt;Ctrl+Alt+Shift+S&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Button 3:&lt;/strong&gt; Acts as my &lt;code&gt;Enter&lt;/code&gt; key.&lt;/li&gt;
&lt;/ol&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/03/voicesnippets.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRnwAAABXRUJQVlA4WAoAAAAQAAAACQAACAAAQUxQSCwAAAABL6CQbQTI3/E+kptGRAT6AAratmGqMqhG4AMYhfEnNgYR/Y/JzCMA/5qkBVZQOCAqAAAA0AEAnQEqCgAJAAFAJiWkAAMXnLvxRAAA/v5LQhuh7jVwzWF+995AbIAA" data-src="https://www.eliostruyf.com/uploads/2026/03/voicesnippets.webp" alt="VoiceSnippets app to controls your computer with trigger words" style="width:3796px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;VoiceSnippets app to controls your computer with trigger words&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;aside class="callout callout-tip" aria-label="tip"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M15 14c.2-1 .7-1.7 1.5-2.5 1-.9 1.5-2.2 1.5-3.5A6 6 0 0 0 6 8c0 1 .2 2.2 1.5 3.5.7.7 1.3 1.5 1.5 2.5"&gt;&lt;/path&gt;&lt;path d="M9 18h6"&gt;&lt;/path&gt;&lt;path d="M10 22h4"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;tip&lt;/p&gt;&lt;div class="callout-text"&gt;By offloading these complex &lt;code&gt;Ctrl+Alt+Shift&lt;/code&gt; shortcuts to a dedicated macro pad, you completely eliminate the awkward finger aerobics usually required to trigger background apps.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="voicesnippets-in-action"&gt;VoiceSnippets in action&lt;a aria-hidden="true" tabindex="-1" href="#voicesnippets-in-action"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;The real magic happens when you pair a single button press with automation. VoiceSnippets listens for specific trigger phrases and executes pre-defined actions.&lt;/p&gt;
&lt;p&gt;Here is how that works in practice. When I want to spin up a project in Visual Studio Code, I press Button 2 and say “Start development”.&lt;/p&gt;
&lt;p&gt;VoiceSnippets interprets that and runs the following configuration:&lt;/p&gt;
&lt;div class="expressive-code"&gt;&lt;link rel="stylesheet" href="/_astro/ec.gf7e9.css"&gt;&lt;script type="module" src="/_astro/ec.8zarh.js"&gt;&lt;/script&gt;&lt;figure class="frame"&gt;&lt;figcaption class="header"&gt;&lt;/figcaption&gt;&lt;pre data-language="json"&gt;&lt;code&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;1&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;{&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;2&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"id"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"492188f5-4ddd-4f12-ba02-37b1a72eaa36"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;,&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;3&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"trigger_word"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"start development"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;,&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;4&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"expansion"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"&amp;#x3C;delay Cmd+Shift+Pms&gt; → Focus terminal view → &amp;#x3C;Enter&gt; → &amp;#x3C;delay 1000ms&gt; → npm run dev → &amp;#x3C;Enter&gt;"&lt;/span&gt;&lt;span 
style="--0:#E1E4E8;--1:#24292E"&gt;,&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;5&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"command_type"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"Workflow"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;,&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;6&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"category"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;null&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;,&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;7&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"aliases"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: [],&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;8&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"workflow_steps"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: [&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;9&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;    &lt;/span&gt;&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;{ &lt;/span&gt;&lt;span 
style="--0:#79B8FF;--1:#005CC5"&gt;"step_type"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"shortcut"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;, &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"value"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"Cmd+Shift+P"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; },&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;10&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;    &lt;/span&gt;&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;{ &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"step_type"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"text"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;, &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"value"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"Focus terminal view"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; },&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;11&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;    &lt;/span&gt;&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;{ &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"step_type"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"key"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;, &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"value"&lt;/span&gt;&lt;span 
style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"Enter"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; },&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;12&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;    &lt;/span&gt;&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;{ &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"step_type"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"delay"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;, &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"value"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"1000"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; },&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;13&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;    &lt;/span&gt;&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;{ &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"step_type"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"text"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;, &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"value"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"npm run dev"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; },&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;14&lt;/div&gt;&lt;/div&gt;&lt;div 
class="code"&gt;&lt;span class="indent"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;    &lt;/span&gt;&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;{ &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"step_type"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"key"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;, &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"value"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"Enter"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; }&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;15&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;  &lt;/span&gt;&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;],&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;16&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"app_filters"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: [&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;17&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;    &lt;/span&gt;&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;{ &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"id"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"com.microsoft.VSCode"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;, &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"name"&lt;/span&gt;&lt;span 
style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"Code"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; }&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;18&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;  &lt;/span&gt;&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;]&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;19&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;}&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;/code&gt;&lt;/pre&gt;&lt;div class="copy"&gt;&lt;button title="Copy to clipboard" data-copied="Copied!" data-code="{&#127;  &amp;#x22;id&amp;#x22;: &amp;#x22;492188f5-4ddd-4f12-ba02-37b1a72eaa36&amp;#x22;,&#127;  &amp;#x22;trigger_word&amp;#x22;: &amp;#x22;start development&amp;#x22;,&#127;  &amp;#x22;expansion&amp;#x22;: &amp;#x22;&lt;delay Cmd+Shift+Pms&gt; → Focus terminal view → &lt;Enter&gt; → &lt;delay 1000ms&gt; → npm run dev → &lt;Enter&gt;&amp;#x22;,&#127;  &amp;#x22;command_type&amp;#x22;: &amp;#x22;Workflow&amp;#x22;,&#127;  &amp;#x22;category&amp;#x22;: null,&#127;  &amp;#x22;aliases&amp;#x22;: [],&#127;  &amp;#x22;workflow_steps&amp;#x22;: [&#127;    { &amp;#x22;step_type&amp;#x22;: &amp;#x22;shortcut&amp;#x22;, &amp;#x22;value&amp;#x22;: &amp;#x22;Cmd+Shift+P&amp;#x22; },&#127;    { &amp;#x22;step_type&amp;#x22;: &amp;#x22;text&amp;#x22;, &amp;#x22;value&amp;#x22;: &amp;#x22;Focus terminal view&amp;#x22; },&#127;    { &amp;#x22;step_type&amp;#x22;: &amp;#x22;key&amp;#x22;, &amp;#x22;value&amp;#x22;: &amp;#x22;Enter&amp;#x22; },&#127;    { &amp;#x22;step_type&amp;#x22;: &amp;#x22;delay&amp;#x22;, &amp;#x22;value&amp;#x22;: &amp;#x22;1000&amp;#x22; },&#127;    { &amp;#x22;step_type&amp;#x22;: 
&amp;#x22;text&amp;#x22;, &amp;#x22;value&amp;#x22;: &amp;#x22;npm run dev&amp;#x22; },&#127;    { &amp;#x22;step_type&amp;#x22;: &amp;#x22;key&amp;#x22;, &amp;#x22;value&amp;#x22;: &amp;#x22;Enter&amp;#x22; }&#127;  ],&#127;  &amp;#x22;app_filters&amp;#x22;: [&#127;    { &amp;#x22;id&amp;#x22;: &amp;#x22;com.microsoft.VSCode&amp;#x22;, &amp;#x22;name&amp;#x22;: &amp;#x22;Code&amp;#x22; }&#127;  ]&#127;}"&gt;&lt;div&gt;&lt;/div&gt;&lt;/button&gt;&lt;/div&gt;&lt;/figure&gt;&lt;/div&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/03/start-development-command.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRoYAAABXRUJQVlA4WAoAAAAQAAAACQAACwAAQUxQSC0AAAABL6CQbQTI3/FekTuNiIhj76AQkhXqJoMfQA5vAkghf6IIIvofI+kM6Ody9wUAVlA4IDIAAAAQAgCdASoKAAwAAUAmJZwAAxamOnC5GWyAAP7+S0IbobsSr3vnkVtiizGZWxisPgAAAA==" data-src="https://www.eliostruyf.com/uploads/2026/03/start-development-command.webp" alt="VoiceSnippet: Start development command configuration" style="width:3796px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;VoiceSnippet: Start development command configuration&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;p&gt;This sequence opens the VS Code command palette, focuses the terminal view, waits a second to ensure the terminal is ready, and then types and runs &lt;code&gt;npm run dev&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;I also use text expansion with variables. For example, checking out a git branch:&lt;/p&gt;
&lt;div class="expressive-code"&gt;&lt;figure class="frame"&gt;&lt;figcaption class="header"&gt;&lt;/figcaption&gt;&lt;pre data-language="json"&gt;&lt;code&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;1&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;{&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;2&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"id"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"474bf19d-938f-4893-bfe2-390990bd7c17"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;,&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;3&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"trigger_word"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"branch {name}"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;,&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;4&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"expansion"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"git checkout {name}"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;,&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;5&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  
&lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"command_type"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"TextExpansion"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;,&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;6&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"category"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;null&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;,&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;7&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"aliases"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: [],&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;8&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"workflow_steps"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;null&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;,&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;9&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;  &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;"app_filters"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: []&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" 
aria-hidden="true"&gt;10&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;}&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;/code&gt;&lt;/pre&gt;&lt;div class="copy"&gt;&lt;button title="Copy to clipboard" data-copied="Copied!" data-code="{&#127;  &amp;#x22;id&amp;#x22;: &amp;#x22;474bf19d-938f-4893-bfe2-390990bd7c17&amp;#x22;,&#127;  &amp;#x22;trigger_word&amp;#x22;: &amp;#x22;branch {name}&amp;#x22;,&#127;  &amp;#x22;expansion&amp;#x22;: &amp;#x22;git checkout {name}&amp;#x22;,&#127;  &amp;#x22;command_type&amp;#x22;: &amp;#x22;TextExpansion&amp;#x22;,&#127;  &amp;#x22;category&amp;#x22;: null,&#127;  &amp;#x22;aliases&amp;#x22;: [],&#127;  &amp;#x22;workflow_steps&amp;#x22;: null,&#127;  &amp;#x22;app_filters&amp;#x22;: []&#127;}"&gt;&lt;div&gt;&lt;/div&gt;&lt;/button&gt;&lt;/div&gt;&lt;/figure&gt;&lt;/div&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/03/voicesnippets-branch-command.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRoQAAABXRUJQVlA4WAoAAAAQAAAACQAACAAAQUxQSCwAAAABL6CQbQTI3/E+kptGRAT6AAratmGqMqhG4AMYhfEnNgYR/Y/JzCMA/5qkBVZQOCAyAAAAkAEAnQEqCgAJAAFAJiWcAAMWnK2AAP7+S0IbobsSt3wkT4rjNuAP338WLF+MJSqAAAA=" data-src="https://www.eliostruyf.com/uploads/2026/03/voicesnippets-branch-command.webp" alt="VoiceSnippet: branch command" style="width:3796px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;VoiceSnippet: branch command&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;p&gt;If I say “branch dev”, VoiceSnippets automatically expands it to &lt;code&gt;git checkout dev&lt;/code&gt;.&lt;/p&gt;
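&lt;p&gt;Variables are what make these expansions scale beyond a single command. As a sketch of what a similar snippet could look like (this &lt;code&gt;commit {message}&lt;/code&gt; trigger is a hypothetical example that reuses the schema shown above, not one of my actual snippets, and the generated &lt;code&gt;id&lt;/code&gt; field is omitted):&lt;/p&gt;
&lt;pre data-language="json"&gt;&lt;code&gt;{
  "trigger_word": "commit {message}",
  "expansion": "git commit -m \"{message}\"",
  "command_type": "TextExpansion",
  "category": null,
  "aliases": [],
  "workflow_steps": null,
  "app_filters": []
}&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Saying “commit fix typo” would then expand to &lt;code&gt;git commit -m "fix typo"&lt;/code&gt;, assuming the expansion template allows quotes around the captured variable.&lt;/p&gt;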
&lt;h2 id="a-real-world-workflow"&gt;A real-world workflow&lt;a aria-hidden="true" tabindex="-1" href="#a-real-world-workflow"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;When I am answering questions or being interviewed by an AI tool like Ghostwriter, the seamless nature of this setup really shines. Here is the exact loop:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;I hold down &lt;strong&gt;Button 1&lt;/strong&gt; (Handy).&lt;/li&gt;
&lt;li&gt;I speak my answer out loud.&lt;/li&gt;
&lt;li&gt;I release &lt;strong&gt;Button 1&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;I tap &lt;strong&gt;Button 3&lt;/strong&gt; (Enter).&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;Handy transcribes the text directly into the prompt box, and the Enter button submits it. I can repeat this process over and over while leaning back in my chair. I do not have to touch my mouse or look down at a full keyboard. I just press the button and talk.&lt;/p&gt;
&lt;h2 id="the-takeaway"&gt;The takeaway&lt;a aria-hidden="true" tabindex="-1" href="#the-takeaway"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;It is remarkably faster to trigger voice actions from dedicated physical buttons than from keyboard shortcuts. Once muscle memory sets in, you never have to look down: you just know the button is exactly where it needs to be, and it will work every single time.&lt;/p&gt;
&lt;p&gt;If you spend a lot of time writing, prompting, or executing repetitive terminal commands, I highly recommend exploring a voice-driven approach. Start using your voice today!&lt;/p&gt;
&lt;p&gt;Let me know what you think.&lt;/p&gt;
&lt;h2 id="resources"&gt;Resources&lt;a aria-hidden="true" tabindex="-1" href="#resources"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://github.com/estruyf/VoiceSnippets"&gt;VoiceSnippets&lt;/a&gt;&lt;/strong&gt; - My custom app for macOS voice automation.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://handy.computer"&gt;Handy&lt;/a&gt;&lt;/strong&gt; - The AI transcription tool I use for dictation.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://www.amazon.de/-/en/dp/B0GC97ZT4M"&gt;SinLoon Mini Programmable Mechanical Keyboard&lt;/a&gt;&lt;/strong&gt; - The hardware powering the 3-button setup.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the &lt;a href="https://marketplace.visualstudio.com/items?itemName=eliostruyf.vscode-ghostwriter"&gt;Ghostwriter for VS Code&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item><item><title>Are We Killing Indie Development with AI?</title><link>https://www.eliostruyf.com/killing-indie-development-with-ai/</link><guid isPermaLink="true">https://www.eliostruyf.com/killing-indie-development-with-ai/</guid><description>Explore the impact of AI on indie development and the need for a moral compass in coding. Are we sacrificing quality for speed?</description><pubDate>Sun, 15 Feb 2026 17:53:29 GMT</pubDate><content:encoded>&lt;p&gt;I get this feeling a lot lately. I wake up with an idea, grab a coffee, open my editor, and thanks to the current generation of AI tools, I can have a working prototype before breakfast.&lt;/p&gt;
&lt;p&gt;The barrier to entry for software development hasn’t just been lowered; it’s effectively been removed. We are in the era of “vibe coding,” where natural language prompts turn into deployed applications in minutes. It is exhilarating. It is powerful.&lt;/p&gt;
&lt;p&gt;But lately, I have started to wonder: &lt;strong&gt;Are we killing indie development with AI?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Don’t get me wrong, I love these tools. I use &lt;a href="https://github.com/features/copilot"&gt;GitHub Copilot&lt;/a&gt; and other LLMs daily. But I believe we have reached a tipping point where &lt;em&gt;the speed of building has outpaced the thinking part&lt;/em&gt;. We are so focused on &lt;em&gt;how fast&lt;/em&gt; we can build that we have stopped asking &lt;em&gt;if&lt;/em&gt; we should build at all.&lt;/p&gt;
&lt;aside class="callout callout-info" aria-label="info"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;circle cx="12" cy="12" r="10"&gt;&lt;/circle&gt;&lt;path d="M12 16v-4"&gt;&lt;/path&gt;&lt;path d="M12 8h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;info&lt;/p&gt;&lt;div class="callout-text"&gt;The speed of building has outpaced the thinking part.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;p&gt;In this post, I want to talk about why we need a new “moral compass” for development in the AI age, and a potential solution to help us get there.&lt;/p&gt;
&lt;aside class="callout callout-important" aria-label="important"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M20 13c0 5-3.5 7.5-7.66 8.95a1 1 0 0 1-.67-.01C7.5 20.5 4 18 4 13V6a1 1 0 0 1 1-1c2 0 4.5-1.2 6.24-2.72a1.17 1.17 0 0 1 1.52 0C14.51 3.81 17 5 19 5a1 1 0 0 1 1 1z"&gt;&lt;/path&gt;&lt;path d="M12 8v4"&gt;&lt;/path&gt;&lt;path d="M12 16h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;important&lt;/p&gt;&lt;div class="callout-text"&gt;I don’t want to do any harm to any developer or their work. This is my own perspective and experience, feel free to disagree or make your own conclusions.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="the-speed-trap"&gt;The speed trap&lt;a aria-hidden="true" tabindex="-1" href="#the-speed-trap"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Five years ago, if you had an idea for a SaaS tool, say, a screenshot editor or a niche time-tracker, you had to sit down and plan. The friction of coding was a natural filter. You had to ask yourself: “Is this worth X hours of my life?”&lt;/p&gt;
&lt;p&gt;Today, that cost is near zero. If you don’t like the screenshot tool you’re paying $15 a year for, you can prompt an AI to build a clone in an afternoon.&lt;/p&gt;
&lt;p&gt;On the surface, this looks like freedom. But look a little deeper. That $15 tool you just cloned? It was likely built by another indie developer. Someone who spent months thinking about edge cases, designing the interface, writing documentation, and supporting users. By cloning it just because you can, you aren’t just saving $15; you are actively devaluing the craft of independent software development and the livelihood of the person behind it.&lt;/p&gt;
&lt;p&gt;If we all just clone everything we use, we completely commoditize the market. We create a sea of “good enough” AI-generated noise where no one can actually sustain a business.&lt;/p&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This screenshot tool is where I started to think about the impact of AI on indie development. I use &lt;a href="https://xnapper.com/"&gt;Xnapper&lt;/a&gt; for years now, got a license for every device I own. Yesterday I saw somebody creating a clone, and mentioning it only took one hour to build. That is when I realized the true impact of AI on indie development and I started thinking about the broader implications.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/02/ai-tool.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRlgAAABXRUJQVlA4IEwAAADQAQCdASoKAAUAAUAmJQBOgCIfOobUAAD+1+L0d4THLYIyj3o/FJbXeyB+CrLpto2sPpMZqddTOcGVC47DMs3meEuOqabGrv1G9TAA" data-src="https://www.eliostruyf.com/uploads/2026/02/ai-tool.webp" alt="The tool is not the issue, it is the mindset" style="width:2816px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;The tool is not the issue, it is the mindset&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;h2 id="the-life-of-an-indie-developer-in-the-ai-age"&gt;The life of an indie developer in the AI age&lt;a aria-hidden="true" tabindex="-1" href="#the-life-of-an-indie-developer-in-the-ai-age"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Let me paint a picture that I think a lot of developers are starting to recognize.&lt;/p&gt;
&lt;p&gt;You spend weeks, maybe months, building something. You think about the problem, you design the interface, you handle the edge cases, you support your users, you write the docs. You pour yourself into it. Then one morning, someone sees your product, opens their AI editor, and builds a “good enough” version in an afternoon. They ship it. Maybe they make it free, maybe they make it open source, maybe they just use it themselves and tell their friends, their community, their followers.&lt;/p&gt;
&lt;p&gt;They did not steal your code. They did not copy your product. They just… rebuilt it. Close enough. Good enough. And now your product has competition that cost someone a few hours of prompting while it cost you months of your life.&lt;/p&gt;
&lt;p&gt;But it does not stop there. A third developer sees that clone and thinks, “I can do this too, but I want it slightly different.” So they prompt their own version. And a fourth. And a fifth. Each one is not a copy in the traditional sense. Nobody is violating a license. Nobody is stealing intellectual property. They are just building their own version that matches their use case.&lt;/p&gt;
&lt;p&gt;It is a lot like art. You create a painting, something original, something you are proud of. Then somebody sees it and recreates it. Not a forgery, just their interpretation. But they have a bigger budget, a larger audience, better distribution. Suddenly their version is the one people see first. Others share that version instead of yours. This is what is happening a lot on social media with AI-generated content. The original creator is overshadowed by the faster, more accessible clone.&lt;/p&gt;
&lt;p&gt;In the art world, we have a word for this erosion: it is called &lt;strong&gt;devaluation&lt;/strong&gt;. In the software world, we are doing it at industrial scale, and we are calling it &lt;strong&gt;innovation&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;I am not saying you should never build something that already exists. Competition is healthy, and sometimes a fresh perspective genuinely improves a category. But there is a difference between thoughtful competition and reflexive duplication. The question every developer should ask themselves is: &lt;strong&gt;“If I know someone can clone my work in an afternoon, is it still worth building?”&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The answer, I believe, is yes, but only for the things that cannot be cloned in an afternoon. The deep domain knowledge. The community around your tool. The years of user feedback baked into every feature. The trust you have earned. Those are the things AI cannot reproduce with a prompt, and I definitely don’t want to discourage people from building those things.&lt;/p&gt;
&lt;p&gt;But you can only build those things if you commit to something long enough for them to develop. And that is the real danger of the current moment: not that AI makes building easy, but that it makes &lt;em&gt;abandoning&lt;/em&gt; easy. Why invest years in one product when you can ship a new one every week?&lt;/p&gt;
&lt;h2 id="i-am-guilty-of-this-too"&gt;I am guilty of this too&lt;a aria-hidden="true" tabindex="-1" href="#i-am-guilty-of-this-too"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I have no room to preach. I am right there in the trenches with you.&lt;/p&gt;
&lt;p&gt;When I built &lt;strong&gt;&lt;a href="https://frontmatter.codes"&gt;Front Matter CMS&lt;/a&gt;&lt;/strong&gt;, it was way before the AI boom. I had to think deeply about the problem because the investment of time was massive. I looked at the market, saw a gap in Visual Studio Code, and built it because nothing else existed.&lt;/p&gt;
&lt;p&gt;Compare that to recently. I built a set of cycling tools (never released by the way) for myself. Did similar tools exist? Absolutely. Were they better? Definitely. But I wanted to see how far I could get with AI. I treated it as a training exercise. In the end, I started paying for a tool called &lt;strong&gt;&lt;a href="https://join.cc"&gt;Join&lt;/a&gt;&lt;/strong&gt;, which does the same thing, because it was better and I could focus on my actual work instead of maintaining a tool that was just “good enough” for me.&lt;/p&gt;
&lt;p&gt;I did the same with &lt;strong&gt;FrameFit&lt;/strong&gt;. I investigated the market a little, didn’t see an exact match, and just started building.&lt;/p&gt;
&lt;p&gt;There is a difference between building for education (learning how AI tools work) and releasing products that dilute the hard work of others. My worry is that we are blurring that line. We are shipping our “training exercises” as products, and it is making the ecosystem messy for everyone.&lt;/p&gt;
&lt;p&gt;And I know this because I have been on both sides of it.&lt;/p&gt;
&lt;h2 id="what-actually-survives"&gt;What actually survives&lt;a aria-hidden="true" tabindex="-1" href="#what-actually-survives"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Here is the thing that made me stop and reflect. I have projects on both sides of this line, and they feel completely different.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;a href="https://demotime.show"&gt;Demo Time&lt;/a&gt;&lt;/strong&gt; is something I have been building for years. Not weeks, not weekends, years. It started because I was a conference speaker who kept running into the same problem: demos failing on stage. Nobody had built a proper solution inside Visual Studio Code, so I did. Over time, it grew because I kept showing up. I used it at conferences, talked to other speakers, iterated based on real feedback from people doing real presentations at events like Microsoft Ignite, GitHub Universe, and OpenAI DevDays. Today it has over 26,000 installations.&lt;/p&gt;
&lt;p&gt;None of that came from code. The code is open source. Anyone can see it, fork it, or rebuild it. Someone could probably vibe-code a basic version in a weekend. But what they cannot replicate is twelve years of conference speaking that taught me what presenters actually need. You would need that experience, or a big company and budget behind you, to even come close. The relationships with the community, the trust that comes from being the person who shows up, year after year, and keeps making the tool better because you genuinely use it yourself. That is not something you can prompt into existence.&lt;/p&gt;
&lt;p&gt;Compare that to FrameFit. I built it, I use it, and it works. But if it disappeared tomorrow, I wouldn’t lose any sleep over it. Demo Time? That is like a child to me. I put my passion into it.&lt;/p&gt;
&lt;p&gt;That contrast taught me something important: &lt;strong&gt;AI cannot commoditize the human context around software.&lt;/strong&gt; Community, trust, domain expertise, showing up consistently over time. These are not features you ship. They are moats you build by caring about something longer than a weekend.&lt;/p&gt;
&lt;p&gt;The developers who will thrive are not the fastest shippers. They are the ones who pair AI speed with human judgment. Who build communities, not just codebases. Who invest in trust, not just features. But that only happens if we slow down enough to think about what we are doing.&lt;/p&gt;
&lt;h2 id="problem-is-much-bigger"&gt;Problem is much bigger&lt;a aria-hidden="true" tabindex="-1" href="#problem-is-much-bigger"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;In the first week of February 2026, the SaaS sector lost nearly $300 billion in market value in just 48 hours. Traders are calling it the “SaaSpocalypse.” Salesforce is down roughly 30% year-to-date. Thomson Reuters dropped 16% in a single day. LegalZoom plummeted 20%.&lt;/p&gt;
&lt;p&gt;The trigger? The launch of agentic AI tools like Claude Cowork and the realization that companies might no longer need to buy software when AI can do the work directly. Why pay per-seat licenses for a CRM, a legal research tool, or an analytics platform when an AI agent can handle 80% of those tasks autonomously?&lt;/p&gt;
&lt;p&gt;This is the same dynamic I described above, but at enterprise scale. Indie developers see someone clone their $15 tool in an afternoon. SaaS companies see their entire business model questioned because a client can now reproduce “good enough” software internally, powered by AI, at a fraction of the cost.&lt;/p&gt;
&lt;p&gt;The investors are not panicking because SaaS products are bad. They are panicking because the moat around those products, the complexity of building and maintaining enterprise software, is eroding fast. The same moat that used to protect indie developers.&lt;/p&gt;
&lt;p&gt;The parallel is hard to ignore. Whether you are a solo developer with a screenshot tool or a billion-dollar company with a CRM platform, the question is the same: what do you offer that an AI cannot reproduce in an afternoon?&lt;/p&gt;
&lt;p&gt;And the answer, I believe, is the same at every scale: deep domain knowledge, trust, community, and the human judgment to keep improving based on real-world use. The companies and developers who survive this moment will be the ones who invested in those things long before the SaaSpocalypse arrived.&lt;/p&gt;
&lt;h2 id="the-enterprise-advantage"&gt;The enterprise advantage&lt;a aria-hidden="true" tabindex="-1" href="#the-enterprise-advantage"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;While smaller indie tools face existential threats from AI clones, larger SaaS platforms have built-in protections that are harder to replicate. The difference comes down to breadth and support.&lt;/p&gt;
&lt;p&gt;A company using Salesforce isn’t just paying for a CRM. They are paying for integration with their email, their reporting tools, their customer support system, and dozens of other modules that work together seamlessly. Yes, someone could use AI to clone the contact management part in an afternoon. But cloning the entire ecosystem? The years of integrations, the training resources, the 24/7 support team that understands your business? That is not a weekend project.&lt;/p&gt;
&lt;p&gt;The same applies to Adobe Creative Suite or Microsoft Office. You could vibe-code a decent text editor or image editor with AI. People will, but you will not replicate the entire suite of interconnected tools that designers and businesses have built their workflows around. The moat is not just the software; it is the ecosystem.&lt;/p&gt;
&lt;p&gt;This dynamic actually makes the indie developer position even more precarious. The large SaaS companies can survive the clone wars because they offer a breadth of features and support that cannot be easily replicated. Indie developers, by definition, cannot. We are specialists, not platforms. We are vulnerable precisely because we lack what makes enterprise software resilient: doing one thing extremely well makes us easy to clone, and doing one thing well is all we have.&lt;/p&gt;
&lt;p&gt;This is why the thinking process matters even more. If you are going to build as an indie developer in this era, it cannot just be “another screenshot tool.” It has to be the screenshot tool with the community, the support, the domain expertise, and the commitment to evolving with your users. You have to be willing to be the long-term partner, not the quick solution.&lt;/p&gt;
&lt;h2 id="the-thinking-process"&gt;The thinking process&lt;a aria-hidden="true" tabindex="-1" href="#the-thinking-process"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;We need to re-introduce friction into our process. Not the old friction of writing boilerplate code. That friction is gone, and good riddance. I am talking about the friction of &lt;em&gt;thinking&lt;/em&gt;. The pause that forces you to examine your intentions before you act on them.&lt;/p&gt;
&lt;p&gt;Before AI, “thinking” was mandatory. The cost of building was high enough that it naturally filtered out bad ideas. Now, that filter is gone, and thinking must be a conscious, deliberate choice. When I have an idea now, I am trying to force myself to pause before I open Visual Studio Code or prompt a new agent.&lt;/p&gt;
&lt;p&gt;I try to run through these four questions:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;What problem does this actually solve?&lt;/strong&gt; (Is it a real pain point, or just a cool feature?)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Does it already exist?&lt;/strong&gt; (Have I actually looked, or am I assuming I’m the first?)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;If it exists, what is my unfair advantage?&lt;/strong&gt; (Why will mine be better? Is it just cheaper because I didn’t verify the edge cases?)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Can I make the existing solution better instead of rebuilding it?&lt;/strong&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;That last one is crucial. If there is an open-source tool that does 80% of what you want, the “old” way was to contribute a Pull Request. The “AI way” often tempts us to just rebuild the whole thing from scratch because it feels faster.&lt;/p&gt;
&lt;p&gt;But “faster” isn’t always “better” for the community. And here is the irony: we could use AI itself for this thinking step. Instead of prompting an LLM to start building, prompt it to research what already exists first. Use AI for the thinking, not just the building.&lt;/p&gt;
&lt;h2 id="introducing-the-product-moral-compass"&gt;Introducing the “Product Moral Compass”&lt;a aria-hidden="true" tabindex="-1" href="#introducing-the-product-moral-compass"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I don’t expect AI platforms that allow you to vibe code to solve this for us. Their business model is predicated on you writing more code (read: prompts), not less. They want you to spin up new projects constantly. They have no incentive to say, “Hey, wait, this already exists.”&lt;/p&gt;
&lt;p&gt;Think about it: when was the last time you saw a developer advocate from one of these platforms demonstrate how to &lt;em&gt;contribute to an existing project&lt;/em&gt; instead of building something new from scratch? Their marketing is all about speed, novelty, and the thrill of creation. Not about responsibility.&lt;/p&gt;
&lt;aside class="callout callout-info" aria-label="info"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;circle cx="12" cy="12" r="10"&gt;&lt;/circle&gt;&lt;path d="M12 16v-4"&gt;&lt;/path&gt;&lt;path d="M12 8h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;info&lt;/p&gt;&lt;div class="callout-text"&gt;Instead of marketing how quickly they can build the next screenshot tool, AI platforms could show how to contribute to an existing one.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;p&gt;So, I started thinking: &lt;strong&gt;What if we used AI to stop us from building with AI?&lt;/strong&gt; You could say that this is a paradox, but I think it is actually a necessary evolution of our responsibility as developers.&lt;/p&gt;
&lt;p&gt;I am exploring the idea of a &lt;strong&gt;Product Moral Compass Agent&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;Imagine a mandatory first step in your “vibe coding” workflow. Before you start generating code, you pitch your idea to this agent. It interviews you, not to judge you, but to make sure you are making an informed decision.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;You:&lt;/strong&gt; “I want to build a Chrome extension that organizes simple bookmarks.”&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Agent:&lt;/strong&gt; “Okay, analyzing… I found 3 highly-rated open-source projects and 5 indie SaaS products that do exactly this. Here are the links, the pricing, and what each one covers. Do you still want to proceed?”&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This agent would act as the “thinking partner” we are skipping. It could:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Perform deep market analysis in seconds.&lt;/li&gt;
&lt;li&gt;Surface existing open-source repos you could contribute to instead.&lt;/li&gt;
&lt;li&gt;Present paid alternatives with pricing, so you can see what $15 a year actually gets you.&lt;/li&gt;
&lt;li&gt;Challenge your unique value proposition with honest questions.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;If you still want to build it after that? Great. Go ahead and start coding. But at least you are making an informed, conscious decision rather than reflexively adding more noise to the world.&lt;/p&gt;
&lt;h2 id="the-challenge"&gt;The challenge&lt;a aria-hidden="true" tabindex="-1" href="#the-challenge"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I am currently building this agent. The first version is available on GitHub: &lt;a href="https://github.com/estruyf/ghostwriter-agents-ai/blob/main/agents/product-moral-compass.ghostwriter.md"&gt;Product Moral Compass Agent&lt;/a&gt;. Yes, I am aware of the irony: I am proposing to build something new to stop people from building new things. But I ran it through my own four questions first, and nothing like it exists yet.&lt;/p&gt;
&lt;p&gt;Once it is ready, I will share it openly so that any developer can use it as part of their workflow. Not as a gatekeeper, but as a guide. A thinking partner that helps you pause, research, and decide before you build.&lt;/p&gt;
&lt;p&gt;In the meantime, here is what you can do right now: the next time you have an idea, spend ten minutes with your favorite AI tool and ask it to find every existing solution first. Check your own bank statements. Are you already paying for a tool that solves this? If so, respect that developer’s work. Look at GitHub. Is there a repo that could use your help instead of your competition?&lt;/p&gt;
&lt;p&gt;The time to learn is right now, but the time to &lt;em&gt;think&lt;/em&gt; is also right now.&lt;/p&gt;
&lt;p&gt;I want you to keep building. I want you to be prolific. But let’s not let the ease of creation destroy the value of what we create.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;I am curious to hear your thoughts.&lt;/strong&gt; Is this gatekeeping, or is it a necessary evolution of our responsibility as developers? Let me know in the comments below.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="resources"&gt;Resources&lt;a aria-hidden="true" tabindex="-1" href="#resources"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.cnbc.com/2026/02/06/ai-anthropic-tools-saas-software-stocks-selloff.html"&gt;AI fears pummel software stocks: Is it ‘illogical’ panic or a SaaS apocalypse? - CNBC&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.bloomberg.com/news/articles/2026-02-04/what-s-behind-the-saaspocalypse-plunge-in-software-stocks"&gt;What’s Behind the ‘SaaSpocalypse’ Plunge in Software Stocks - Bloomberg&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://finance.yahoo.com/news/traders-dump-software-stocks-ai-115502147.html"&gt;‘Get me out’: Traders dump software stocks as AI fears erupt - Yahoo Finance / Bloomberg&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.trendingtopics.eu/the-anthropic-effect-fear-of-ai-agents-trigger-major-saas-stock-sell-off/"&gt;The Anthropic Effect: Fear of AI Agents Trigger Major SaaS Stock Sell-Off - Trending Topics&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.tradingkey.com/analysis/stocks/us-stocks/261581475-ai-saas-selloff-who-really-wins-and-loses"&gt;Is SaaS Dead? The Truth Behind the Software Meltdown - TradingKey&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://searchengineland.com/saas-ai-traffic-drop-469149"&gt;The real story behind the 53% drop in SaaS AI traffic - Search Engine Land&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the &lt;a href="https://marketplace.visualstudio.com/items?itemName=eliostruyf.vscode-ghostwriter"&gt;Ghostwriter for VS Code&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item><item><title>Stop typing, start talking</title><link>https://www.eliostruyf.com/stop-typing-start-talking/</link><guid isPermaLink="true">https://www.eliostruyf.com/stop-typing-start-talking/</guid><description>Discover how voice dictation can transform your workflow and boost productivity. Stop typing and start talking today!</description><pubDate>Fri, 13 Feb 2026 10:12:50 GMT</pubDate><content:encoded>&lt;p&gt;I must admit, the way I interact with my computer has changed drastically over the last few months. As developers, we are used to living on the keyboard. We learn shortcuts, we buy mechanical keyboards, and we pride ourselves on our typing speed.&lt;/p&gt;
&lt;p&gt;But recently, I realized something: I am spending less time writing code and more time writing &lt;em&gt;prompts&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;Whether I am communicating with &lt;a href="https://www.anthropic.com/"&gt;Claude&lt;/a&gt;, &lt;a href="https://blog.google/products/ai/gemini-ai/"&gt;Gemini&lt;/a&gt;, &lt;a href="https://github.com/features/copilot"&gt;GitHub Copilot&lt;/a&gt;, or just replying to messages on LinkedIn, the volume of text I need to produce has gone up. That is why I decided to stop typing and start talking.&lt;/p&gt;
&lt;p&gt;In fact, the draft for this very post was created using my voice. It is a shift that has made my workflow significantly faster, even if it took me a while to get comfortable with it.&lt;/p&gt;
&lt;h2 id="the-awkward-phase"&gt;The awkward phase&lt;a aria-hidden="true" tabindex="-1" href="#the-awkward-phase"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;This isn’t my first attempt at voice control. A couple of years ago, GitHub released &lt;a href="https://githubnext.com/projects/copilot-voice/"&gt;GitHub Copilot Voice&lt;/a&gt;, which allowed you to code by talking.&lt;/p&gt;
&lt;p&gt;It wasn’t for me. Trying to articulate complex code syntax out loud felt unnatural and slower than just typing it out. I gave it a try, but I quickly went back to my keyboard.&lt;/p&gt;
&lt;p&gt;The turning point came recently when &lt;a href="https://andrewconnell.com/"&gt;Andrew Connell&lt;/a&gt; (AC) told me he was using a tool called Wispr Flow to handle dictation. He wasn’t using it to write &lt;code&gt;if&lt;/code&gt; statements; he was using it to dump thoughts into documents, emails, AI prompts, and more.&lt;/p&gt;
&lt;p&gt;I was intrigued. Since December, I decided to make voice my main method of “typing” for long-form text. I started with a generic Whisper wrapper for about a month, and while it was an improvement, I eventually found a tool that fit my workflow perfectly.&lt;/p&gt;
&lt;h2 id="why-handy-won-me-over"&gt;Why Handy won me over&lt;a aria-hidden="true" tabindex="-1" href="#why-handy-won-me-over"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I discovered a tool called &lt;strong&gt;Handy&lt;/strong&gt;, and it has become a daily driver for me.&lt;/p&gt;
&lt;p&gt;The beauty of &lt;a href="https://handy.computer/"&gt;Handy&lt;/a&gt; lies in its simplicity. Use tools to remove friction, not add to it. Handy starts up when my device boots, and it stays out of the way until I need it.&lt;/p&gt;
&lt;p&gt;Here is how that works in practice:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;I press my global hotkey (for me, it’s &lt;code&gt;Option&lt;/code&gt; + &lt;code&gt;R&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt;I start talking.&lt;/li&gt;
&lt;li&gt;I release the keys when I’m done.&lt;/li&gt;
&lt;li&gt;Handy transcribes the audio and pastes the text directly into whatever window is in focus.&lt;/li&gt;
&lt;/ol&gt;
&lt;aside class="callout callout-tip" aria-label="tip"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M15 14c.2-1 .7-1.7 1.5-2.5 1-.9 1.5-2.2 1.5-3.5A6 6 0 0 0 6 8c0 1 .2 2.2 1.5 3.5.7.7 1.3 1.5 1.5 2.5"&gt;&lt;/path&gt;&lt;path d="M9 18h6"&gt;&lt;/path&gt;&lt;path d="M10 22h4"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;tip&lt;/p&gt;&lt;div class="callout-text"&gt;When choosing a productivity tool, look for ones that integrate seamlessly into your OS. If you have to open a dedicated app window to use it, you probably won’t use it.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h3 id="under-the-hood"&gt;Under the hood&lt;a aria-hidden="true" tabindex="-1" href="#under-the-hood"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Handy uses the &lt;strong&gt;Parakeet V3&lt;/strong&gt; model for transcription, though it supports many other models as well. I was pleasantly surprised by the accuracy. English isn’t my native language, so I sometimes worry about an AI understanding my accent. However, it does a really good job.&lt;/p&gt;
&lt;p&gt;It even handles my mother tongue, Dutch, although I don’t use it often.&lt;/p&gt;
&lt;h2 id="the-workflow-shift"&gt;The workflow shift&lt;a aria-hidden="true" tabindex="-1" href="#the-workflow-shift"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Don’t get me wrong, this workflow has a specific time and place.&lt;/p&gt;
&lt;p&gt;I work from home, which is the ideal environment for this. I can talk to my computer without feeling awkward or annoying people around me. I am definitely not going to use this while working from a coffee shop!&lt;/p&gt;
&lt;p&gt;But in the privacy of my home office, it is a game-changer for:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;AI Prompting:&lt;/strong&gt; Explaining a complex problem to an AI is much faster when you treat it like a conversation.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Social Media:&lt;/strong&gt; Replying to comments or scrolling through LinkedIn is effortless when you can just speak your reply.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Drafting Content:&lt;/strong&gt; Like this interview/article process.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="is-this-the-future"&gt;Is this the future?&lt;a aria-hidden="true" tabindex="-1" href="#is-this-the-future"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;As we rely more and more on natural language to interface with technology, voice input feels like the logical next step. It is simply more convenient to say what you want than to type it.&lt;/p&gt;
&lt;p&gt;Will keyboards disappear? No! AI is evolving so rapidly, and new tools are being created every single day. Perhaps in the future, the models will anticipate what we want before we even speak. But for right now, swapping my keyboard for a microphone has made me a lot faster.&lt;/p&gt;
&lt;p&gt;If you find yourself typing endless paragraphs to your AI assistants, give voice dictation a shot. It might just change how you work.&lt;/p&gt;
&lt;h3 id="resources"&gt;Resources&lt;a aria-hidden="true" tabindex="-1" href="#resources"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://handy.computer/"&gt;Handy on GitHub&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://githubnext.com/projects/copilot-voice/"&gt;GitHub Copilot Voice&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://andrewconnell.com/"&gt;Andrew Connell&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the &lt;a href="https://marketplace.visualstudio.com/items?itemName=eliostruyf.vscode-ghostwriter"&gt;Ghostwriter for VS Code&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item><item><title>Is blogging still a thing? Thriving in the AI era</title><link>https://www.eliostruyf.com/blogging-thriving-ai-era/</link><guid isPermaLink="true">https://www.eliostruyf.com/blogging-thriving-ai-era/</guid><description>Explore the future of blogging in the AI era and learn how to thrive with authenticity and strategic content creation.</description><pubDate>Thu, 05 Feb 2026 14:01:59 GMT</pubDate><content:encoded>&lt;p&gt;Let’s address the elephant in the room immediately: &lt;strong&gt;Is blogging still a thing?&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;With the rise of Large Language Models (LLMs) and generative AI, the internet is flooded with content. You might be wondering if it’s worth investing hours into writing a post when a bot can generate a 1,000-word article in seconds, or when users can just ask ChatGPT for an answer instead of visiting your site.&lt;/p&gt;
&lt;p&gt;The short answer is: &lt;strong&gt;Yes, it still matters&lt;/strong&gt;, but the &lt;em&gt;way&lt;/em&gt; we blog, the &lt;em&gt;reason&lt;/em&gt; we blog, and &lt;em&gt;who&lt;/em&gt; is reading our blogs have fundamentally changed.&lt;/p&gt;
&lt;p&gt;I love blogging. It helps me clarify my thoughts and share knowledge. However, my approach has evolved. I’ve stopped fighting the AI wave and started surfing it, using it to amplify my voice rather than replace it. Here is the reality of blogging in 2026 and how you can adapt.&lt;/p&gt;
&lt;h2 id="the-reality-check-where-did-the-humans-go"&gt;The reality check: where did the humans go?&lt;a aria-hidden="true" tabindex="-1" href="#the-reality-check-where-did-the-humans-go"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;If you have looked at your analytics lately, you might have noticed a trend: human traffic is dipping, but “visitor” numbers might be steady or even rising. If you dig deeper, you’ll realize those aren’t people. They are bots.&lt;/p&gt;
&lt;p&gt;Specifically, they are AI crawlers.&lt;/p&gt;
&lt;p&gt;I recently checked my Cloudflare logs, and the activity is undeniable. My site is being crawled heavily by agents like &lt;code&gt;ChatGPT-User&lt;/code&gt; (OpenAI), &lt;code&gt;PetalBot&lt;/code&gt; (Huawei), and &lt;code&gt;Meta-ExternalAgent&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;Here is a snapshot of what that traffic looks like on my end (the last 24 hours):&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;ChatGPT-User:&lt;/strong&gt; 2.22k requests&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;BingBot:&lt;/strong&gt; 499 requests&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Meta-ExternalAgent:&lt;/strong&gt; 498 requests&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/02/crawl-bots.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRogAAABXRUJQVlA4WAoAAAAQAAAACQAABwAAQUxQSCkAAAABLyAkIP6PlPzgRkTEnMFA2ya7AzQ8OHgwgAX8C8JBRP9jkpyAerTtBQBWUDggOAAAALABAJ0BKgoACAABQCYlpAAC52q4vyAA/v4SLKLoIIoHDmbjxhEVs3kK21cZW1bE9VTRd3gAAAAA" data-src="https://www.eliostruyf.com/uploads/2026/02/crawl-bots.webp" alt="Cloudflare - AI Crawl Control" style="width:4116px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;Cloudflare - AI Crawl Control&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;p&gt;If I dig deeper, for the &lt;code&gt;ChatGPT-User&lt;/code&gt; agent, I can see that the most requested pages are my review posts.&lt;/p&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/02/chatgpt-most-crawled.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRoAAAABXRUJQVlA4WAoAAAAQAAAACQAABAAAQUxQSCsAAAABJ6CQbQTIX/I+kJ1GRMTFgEJIVqibDN4k8BVIIX+uGCL6H1NVZ0B8LmkBAFZQOCAuAAAAsAEAnQEqCgAFAAFAJiWkAALnZu1+0AD+/hIqJDv9i8QNh2R/BbcUzOXF1rgAAA==" data-src="https://www.eliostruyf.com/uploads/2026/02/chatgpt-most-crawled.webp" alt="Most crawled pages by ChatGPT" style="width:4116px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;Most crawled pages by ChatGPT&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;aside class="callout callout-info" aria-label="info"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;circle cx="12" cy="12" r="10"&gt;&lt;/circle&gt;&lt;path d="M12 16v-4"&gt;&lt;/path&gt;&lt;path d="M12 8h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;info&lt;/p&gt;&lt;div class="callout-text"&gt;Cloudflare also offers managed &lt;code&gt;robots.txt&lt;/code&gt; and bot blocking features, so you can decide how much of that traffic you want to allow or stop. It is worth noting that some bots have been caught bypassing &lt;code&gt;robots.txt&lt;/code&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h3 id="the-shift-in-consumption"&gt;The shift in consumption&lt;a aria-hidden="true" tabindex="-1" href="#the-shift-in-consumption"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;In the past, a user would search for a problem, click your link, read your tutorial, and maybe leave a comment. Today, an AI model scrapes your content, ingests the knowledge, and summarizes it for the user directly in a chat interface.&lt;/p&gt;
&lt;p&gt;This keeps users from ever visiting your site. It is a harsh reality.&lt;/p&gt;
&lt;p&gt;We saw a brutal example of this recently with &lt;strong&gt;Tailwind CSS&lt;/strong&gt;. Authenticity and community are vital, but for businesses relying on documentation traffic, AI can be devastating. &lt;a href="https://adamwathan.me/"&gt;Adam Wathan&lt;/a&gt;, the creator of Tailwind, noted in a GitHub discussion that traffic to their documentation has dropped &lt;strong&gt;40%&lt;/strong&gt; since early 2023, despite the framework being more popular than ever. This loss of direct traffic contributed to significant layoffs because the docs were the primary funnel for their commercial products. Blogs and product documentation are not the same, but the principle remains: if AI can answer your users’ questions without them visiting your site, your direct relationship with your audience weakens.&lt;/p&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;Read more about it in the &lt;a href="https://github.com/tailwindlabs/tailwindcss.com/pull/2388#issuecomment-3717222957"&gt;Tailwind CSS Discussion&lt;/a&gt;&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="the-problem-with-ai-slop"&gt;The problem with “AI slop”&lt;a aria-hidden="true" tabindex="-1" href="#the-problem-with-ai-slop"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Because AI is eating up traffic, many creators have responded by using AI to churn out mass content to game the system. You have seen these posts. They are soulless.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;They overuse emojis &#128640;✨&lt;/li&gt;
&lt;li&gt;They use dashes and bullet points for everything.&lt;/li&gt;
&lt;li&gt;They adopt weird fads, like writing entire articles in lowercase.&lt;/li&gt;
&lt;li&gt;The prompt was clearly just: &lt;em&gt;“Hey AI, write a post about X.”&lt;/em&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;As a reader, when I encounter this, I stop reading immediately, especially when the text is written entirely in lowercase. My mind cannot focus on it. It breeds distrust. If you didn’t care enough to write it (or at least structure the thought), why should I care enough to read it?&lt;/p&gt;
&lt;aside class="callout callout-important" aria-label="important"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M20 13c0 5-3.5 7.5-7.66 8.95a1 1 0 0 1-.67-.01C7.5 20.5 4 18 4 13V6a1 1 0 0 1 1-1c2 0 4.5-1.2 6.24-2.72a1.17 1.17 0 0 1 1.52 0C14.51 3.81 17 5 19 5a1 1 0 0 1 1 1z"&gt;&lt;/path&gt;&lt;path d="M12 8v4"&gt;&lt;/path&gt;&lt;path d="M12 16h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;important&lt;/p&gt;&lt;div class="callout-text"&gt;The second part is most important. You need to give your thoughts, your content, and your ideas. Make it authentic!&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;p&gt;&lt;strong&gt;Authenticity is your only moat.&lt;/strong&gt; The specific reason my reviews are currently my most-read content is that they are deeply personal, subjective, and based on real-world testing: qualities AI cannot fully fake yet (or maybe it can).&lt;/p&gt;
&lt;h2 id="a-new-workflow-interviewing-myself"&gt;A new workflow: interviewing myself&lt;a aria-hidden="true" tabindex="-1" href="#a-new-workflow-interviewing-myself"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;So, how do we balance efficiency with authenticity? I don’t write entirely manually anymore, but I also never let AI write from scratch.&lt;/p&gt;
&lt;p&gt;My secret weapon is &lt;strong&gt;getting interviewed by AI&lt;/strong&gt; (not so secret anymore).&lt;/p&gt;
&lt;p&gt;I use a tool called &lt;strong&gt;Ghostwriter&lt;/strong&gt;. It started as AI agents, evolved into an Electron app, and now includes a VS Code extension. Instead of staring at a blank page, I initiate an interview session. That way, I get to speak my thoughts out loud, and the AI captures them. The result is my words, my voice, and my opinions, structured by AI.&lt;/p&gt;
&lt;h3 id="how-it-works"&gt;How it works:&lt;a aria-hidden="true" tabindex="-1" href="#how-it-works"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;h4 id="the-interview-process"&gt;The interview process&lt;a aria-hidden="true" tabindex="-1" href="#the-interview-process"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;I tell the AI the topic I want to cover (e.g., “Is blogging still a valid career?”).&lt;/li&gt;
&lt;li&gt;The AI acts as a journalist. It asks me 10 to 50 specific questions depending on the depth required.&lt;/li&gt;
&lt;li&gt;I answer these questions.&lt;/li&gt;
&lt;li&gt;Once the interview is complete, a transcript is available for the writer agent to process.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4 id="the-writing-process"&gt;The writing process&lt;a aria-hidden="true" tabindex="-1" href="#the-writing-process"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h4&gt;
&lt;ul&gt;
&lt;li&gt;Once the interview is complete, I assign the transcript to the writer agent.&lt;/li&gt;
&lt;li&gt;It compiles my answers into a structured blog post.&lt;/li&gt;
&lt;li&gt;From the draft, I can ask for further refinements, like adding an introduction, a conclusion, or formatted code blocks.&lt;/li&gt;
&lt;/ul&gt;
&lt;aside class="callout callout-important" aria-label="important"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M20 13c0 5-3.5 7.5-7.66 8.95a1 1 0 0 1-.67-.01C7.5 20.5 4 18 4 13V6a1 1 0 0 1 1-1c2 0 4.5-1.2 6.24-2.72a1.17 1.17 0 0 1 1.52 0C14.51 3.81 17 5 19 5a1 1 0 0 1 1 1z"&gt;&lt;/path&gt;&lt;path d="M12 8v4"&gt;&lt;/path&gt;&lt;path d="M12 16h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;important&lt;/p&gt;&lt;div class="callout-text"&gt;This distinction is critical: &lt;strong&gt;It is my input, my voice, and my thoughts.&lt;/strong&gt; The AI is simply doing the heavy lifting of structure and grammar. It isn’t researching the topic for me; it’s organizing &lt;em&gt;my&lt;/em&gt; research.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="to-block-or-not-to-block"&gt;To block or not to block?&lt;a aria-hidden="true" tabindex="-1" href="#to-block-or-not-to-block"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Given that bots are scraping our content and lowering our direct traffic, should we block them?&lt;/p&gt;
&lt;p&gt;It depends on your goal.&lt;/p&gt;
&lt;h3 id="scenario-a-the-passion-blogger"&gt;Scenario A: the passion blogger&lt;a aria-hidden="true" tabindex="-1" href="#scenario-a-the-passion-blogger"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;If blogging is a hobby, a personal brand play, or a way to learn (like it is for me), &lt;strong&gt;let the bots scrape.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;I am not going to block AI bots. I believe my content provides value, and if that value reaches a user via ChatGPT or via my website, I am generally okay with that. My goal is to share knowledge. I’ve been blogging since 2010 and I know that blogging is not going to generate a source of income for me directly. It is a way to build my personal brand, share knowledge, and connect with like-minded people.&lt;/p&gt;
&lt;h3 id="scenario-b-the-business"&gt;Scenario B: the business&lt;a aria-hidden="true" tabindex="-1" href="#scenario-b-the-business"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;If your blog is your product, like the Tailwind documentation, you need to be strategic. If AI is answering your customers’ questions without them ever seeing your product pitch, your business model is bleeding.&lt;/p&gt;
&lt;p&gt;In this case, you are not being hostile to progress by blocking bots in your &lt;code&gt;robots.txt&lt;/code&gt;; you are protecting the direct relationship with your audience that keeps your lights on.&lt;/p&gt;
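&lt;p&gt;As a sketch, a &lt;code&gt;robots.txt&lt;/code&gt; that opts out of the most common AI crawlers could look like this. The user-agent names below are the publicly documented ones at the time of writing, and, as noted earlier, some bots may ignore these rules:&lt;/p&gt;

```text
# Illustrative robots.txt: opt out of common AI crawlers
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

# Regular search engine crawlers remain allowed
User-agent: *
Allow: /
```

&lt;p&gt;If you are on Cloudflare, its managed &lt;code&gt;robots.txt&lt;/code&gt; and bot-blocking features can maintain a list like this for you.&lt;/p&gt;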
&lt;h2 id="the-verdict"&gt;The verdict&lt;a aria-hidden="true" tabindex="-1" href="#the-verdict"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Blogging is not dead, but the era of “content farming” is over.&lt;/p&gt;
&lt;p&gt;If you want to blog in 2026:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Make it YOURS.&lt;/strong&gt; Authenticity is the only thing AI cannot replicate.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Go shorter.&lt;/strong&gt; People (and bots) want concise info. You don’t always need a 2,000-word essay.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Use AI as a tool, not a creator.&lt;/strong&gt; Let it interview you. Let it format for you. Do not let it think for you.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Diversify traffic.&lt;/strong&gt; You could build an email list, show up on social, and encourage direct visits. Organic search is less reliable now.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Build a community.&lt;/strong&gt; Engaged readers matter more than raw traffic. Reply, ask questions, and create a place for people to gather.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Build in public.&lt;/strong&gt; Show your work, share decisions, and invite feedback. That kind of transparency is hard to fake with AI.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Check your stats.&lt;/strong&gt; Know who is reading—humans or machines—and decide if you need to protect your content.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Blog to support, not replace, your brand.&lt;/strong&gt; You won’t build a personal brand through blogging alone. Combine it with speaking, building products, or creating tools. My blog supports my conference talks and Demo Time work—it’s part of the ecosystem, not the whole thing.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;I still learn best by writing. I write to clarify my thinking. If that’s why you blog, keep going. The bots can watch, but they can’t replace the human spark.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="resources"&gt;Resources&lt;a aria-hidden="true" tabindex="-1" href="#resources"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=eliostruyf.vscode-ghostwriter"&gt;Ghostwriter for VS Code&lt;/a&gt; - The extension I use for AI interviewing.&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/estruyf/ghostwriter-agents-ai/blob/main/agents/interview.ghostwriter.md"&gt;Ghostwriter Agent Definitions&lt;/a&gt; - The specific “Interview” agent I use.&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.eliostruyf.com/interviewed-ai-write-blog-posts/"&gt;How I get interviewed by AI&lt;/a&gt; - My detailed guide on this workflow.&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/tailwindlabs/tailwindcss.com/pull/2388#issuecomment-3717222957"&gt;Tailwind CSS Discussion&lt;/a&gt; - Commentary from Adam Wathan on the impact of AI on documentation traffic.&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.cloudflare.com/learning/ai/how-to-block-ai-crawlers/"&gt;Cloudflare - How to block AI crawlers&lt;/a&gt; - Guide on blocking AI crawlers with Cloudflare.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the &lt;a href="https://marketplace.visualstudio.com/items?itemName=eliostruyf.vscode-ghostwriter"&gt;Ghostwriter for VS Code&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item><item><title>Ghostwriter for VS Code: your AI interviewer in your editor</title><link>https://www.eliostruyf.com/ghostwriter-code-ai-interviewer-editor/</link><guid isPermaLink="true">https://www.eliostruyf.com/ghostwriter-code-ai-interviewer-editor/</guid><description>I am releasing a new VS Code extension that brings the Ghostwriter interview experience directly into your editor, helping you write authentic content faster.</description><pubDate>Fri, 30 Jan 2026 10:08:05 GMT</pubDate><content:encoded>&lt;p&gt;Last week, I was experimenting with the &lt;a href="https://github.blog/2023-10-18-github-copilot-extensions-public-beta/"&gt;GitHub Copilot SDK&lt;/a&gt; to see how to use it within an Electron app. That experiment led to the creation of the standalone &lt;a href="https://github.com/estruyf/ghostwriter-app"&gt;Ghostwriter App&lt;/a&gt;. But as I was building it, I realized something important.&lt;/p&gt;
&lt;p&gt;Why leave the editor?&lt;/p&gt;
&lt;p&gt;Most of us live in &lt;strong&gt;Visual Studio Code&lt;/strong&gt;. Switching context to a separate app just to draft a blog post feels like friction we don’t need. So, I decided to minimize that friction.&lt;/p&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-vscode.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRsAAAABXRUJQVlA4WAoAAAAQAAAACQAABwAAQUxQSC4AAAABN6CgbRuGP8q+4VhoRESg5wCKIklqLiiIBuIbC1jBvxEcRPQ/NjMfA8D5CBG5VlA4IGwAAADwAQCdASoKAAgAAUAmJZQCdAEPEP56egAA/pq3/orRvd/7vT8KPT2Z2lLfag3bo5Mv7/7zcn4bWE2MyY7D2fwX/45+ynKh/+lTObUa8hYe8GZo51Q4tL21wZiw8niJCPDWXOFhzmODKwDgAAA=" data-src="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-vscode.webp" alt="Ghostwriter for VSCode" style="width:4116px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;Ghostwriter for VSCode&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;p&gt;In this post, I want to introduce you to the &lt;strong&gt;Ghostwriter for VS Code&lt;/strong&gt; extension. It brings the concept of “interview-based writing” directly into your favorite editor, leveraging the GitHub Copilot Chat API without needing the CLI or complex SDK setups.&lt;/p&gt;
&lt;aside class="callout callout-info" aria-label="info"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;circle cx="12" cy="12" r="10"&gt;&lt;/circle&gt;&lt;path d="M12 16v-4"&gt;&lt;/path&gt;&lt;path d="M12 8h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;info&lt;/p&gt;&lt;div class="callout-text"&gt;Go to the VS Code marketplace to install &lt;a href="https://marketplace.visualstudio.com/items?itemName=eliostruyf.vscode-ghostwriter"&gt;Ghostwriter for VS Code&lt;/a&gt;&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="the-problem-with-ai-writing"&gt;The problem with AI writing&lt;a aria-hidden="true" tabindex="-1" href="#the-problem-with-ai-writing"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;We have all seen &lt;em&gt;that&lt;/em&gt; kind of AI content. You know the type: it sounds smart but says nothing useful. That’s what you get when AI writes for you instead of with you.&lt;/p&gt;
&lt;p&gt;The standard workflow usually looks like this:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Open ChatGPT or Copilot.&lt;/li&gt;
&lt;li&gt;Type: “Write me a blog post about React hooks.”&lt;/li&gt;
&lt;li&gt;Get a wall of text that sounds like everyone else.&lt;/li&gt;
&lt;li&gt;Spend an hour rewriting it to sound like you.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;I wanted to flip this script.&lt;/p&gt;
&lt;p&gt;AI-assisted writing works best when it acts as your &lt;strong&gt;interviewer&lt;/strong&gt;, not your ghostwriter.&lt;/p&gt;
&lt;h2 id="enter-ghostwriter-for-vs-code"&gt;Enter Ghostwriter for VS Code&lt;a aria-hidden="true" tabindex="-1" href="#enter-ghostwriter-for-vs-code"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;The core philosophy behind the entire &lt;a href="https://github.com/estruyf/ghostwriter-agents-ai"&gt;Ghostwriter ecosystem&lt;/a&gt; is simple: &lt;strong&gt;You are the expert.&lt;/strong&gt; The AI’s job is just to get the information out of your head and organize it.&lt;/p&gt;
&lt;p&gt;Instead of prompting an AI to generate content from thin air, Ghostwriter facilitates a conversation. It asks you thoughtful questions to draw out your unique insights, experiences, and specific examples. Then, and only then, does it structure those answers into a polished draft.&lt;/p&gt;
&lt;p&gt;And now, you can do this right inside VS Code.&lt;/p&gt;
&lt;h3 id="how-it-works"&gt;How it works&lt;a aria-hidden="true" tabindex="-1" href="#how-it-works"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;I already integrated similar chat-based functionality into &lt;a href="https://frontmatter.codes/"&gt;Front Matter CMS&lt;/a&gt;, so extending this logic to a dedicated extension felt like a natural progression.&lt;/p&gt;
&lt;p&gt;Here is how the workflow looks in practice:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Open the extension:&lt;/strong&gt; You start a new session within VS Code by using the &lt;code&gt;Ghostwriter: Open Ghostwriter&lt;/code&gt; command.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;The interview:&lt;/strong&gt; The extension (powered by GitHub Copilot) starts asking you questions based on the topic you want to cover.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Your answers:&lt;/strong&gt; You answer casually, just like you are talking to a colleague. You share your war stories, your code snippets, and your opinions.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;The writing:&lt;/strong&gt; Once the interview is done, you can use the transcript to generate a draft blog post, article, or documentation.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;The result isn’t a generic AI article; it’s &lt;strong&gt;your&lt;/strong&gt; article, organized by AI. It solves the “blank page” problem without sacrificing authenticity.&lt;/p&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-homepage.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRoQAAABXRUJQVlA4WAoAAAAQAAAACQAABQAAQUxQSCgAAAABJyAkIP6PlPzgRkTEZFAIyQp1z6CPwZsEUsjfK4WI/sfYPgHxdmYuVlA4IDYAAADQAQCdASoKAAYAAUAmJZQCdAEOPANYAAD+/rjTvlGqXztl3JUQSh1995K7ZPjK2UCesRbgAAA=" data-src="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-homepage.webp" alt="Ghostwriter extension homepage" style="width:4116px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;Ghostwriter extension homepage&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;h4 id="the-interview-process"&gt;The interview process&lt;a aria-hidden="true" tabindex="-1" href="#the-interview-process"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h4&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-interview.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRoYAAABXRUJQVlA4WAoAAAAQAAAACQAABQAAQUxQSCgAAAABJyAkIP6PlPzgRkTEZFAIyQp1z6CPwZsEUsjfK4WI/sfYPgHxdmYuVlA4IDgAAADQAQCdASoKAAYAAUAmJQBOgCPw4GTLwAD+/k2ncDmHw6MfoArHzPXz3PVncQRivMGliRhF3ygAAA==" data-src="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-interview.webp" alt="The interview process" style="width:4116px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;The interview process&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;h4 id="writing-the-draft"&gt;Writing the draft&lt;a aria-hidden="true" tabindex="-1" href="#writing-the-draft"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h4&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-draft.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRogAAABXRUJQVlA4WAoAAAAQAAAACQAABQAAQUxQSCgAAAABJyAkIP6PlPzgRkTEZFAIyQp1z6CPwZsEUsjfK4WI/sfYPgHxdmYuVlA4IDoAAADQAQCdASoKAAYAAUAmJQBOgCICis5skAD+/pH64akELjotUOGpTd8Tnp/bSRroY8b63Y6NzStmaAAA" data-src="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-draft.webp" alt="The generated draft" style="width:4116px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;The generated draft&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;h2 id="building-the-extension"&gt;Building the extension&lt;a aria-hidden="true" tabindex="-1" href="#building-the-extension"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Technically, building this was a fun challenge. I wanted to showcase how you can use GitHub Copilot’s capabilities from an extension without actually requiring the heavy lifting of the &lt;a href="https://docs.github.com/en/copilot/github-copilot-in-the-cli/about-github-copilot-in-the-cli"&gt;GitHub Copilot CLI&lt;/a&gt; or the full SDK for every little interaction.&lt;/p&gt;
&lt;p&gt;By tapping into the chat API provided to VS Code extensions, I could create a lightweight wrapper around the content creation workflow. It removes the need to manually manage agent files or prompts; you just use the app.&lt;/p&gt;
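&lt;p&gt;To make that concrete, here is a minimal sketch of the pattern, assuming the VS Code Language Model API (&lt;code&gt;vscode.lm&lt;/code&gt;). The helper and prompt are illustrative, not the actual Ghostwriter source:&lt;/p&gt;

```typescript
// Minimal sketch, assuming the VS Code Language Model API.
// collectText is a small pure helper that drains a streamed
// chat response into a single string.
async function collectText(stream: AsyncIterable<string>): Promise<string> {
  let out = '';
  for await (const chunk of stream) {
    out += chunk;
  }
  return out;
}

// Inside an extension, the call flow looks roughly like this
// (illustrative; `prompt` and `token` come from your extension code):
//
//   const [model] = await vscode.lm.selectChatModels({ vendor: 'copilot' });
//   const messages = [vscode.LanguageModelChatMessage.User(prompt)];
//   const response = await model.sendRequest(messages, {}, token);
//   const answer = await collectText(response.text);
```

&lt;p&gt;The nice side effect of this approach is that the extension ships no AI layer of its own: model access comes from the user’s existing Copilot setup inside VS Code.&lt;/p&gt;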
&lt;p&gt;For those interested in the code, or if you want to see how I handled the conversation flow, the project is open-source.&lt;/p&gt;
&lt;aside class="callout callout-info" aria-label="info"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;circle cx="12" cy="12" r="10"&gt;&lt;/circle&gt;&lt;path d="M12 16v-4"&gt;&lt;/path&gt;&lt;path d="M12 8h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;info&lt;/p&gt;&lt;div class="callout-text"&gt;You can find the source code and contribute to the project on GitHub: &lt;a href="https://github.com/estruyf/ghostwriter-vscode"&gt;Ghostwriter for VS Code&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="why-this-matters"&gt;Why this matters&lt;a aria-hidden="true" tabindex="-1" href="#why-this-matters"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I wrote in more depth about the philosophy of “interview-based writing” in a previous post: &lt;a href="https://www.eliostruyf.com/interviewed-ai-write-blog-posts/"&gt;How I was interviewed by AI to write blog posts&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The gist is this: It is faster than writing everything from scratch, but it is far more genuine than pure AI generation. It captures your voice because the source material is literally &lt;em&gt;your voice&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;If you have valuable knowledge to share but struggle to structure it, this tool offers a middle path.&lt;/p&gt;
&lt;h2 id="try-it-out"&gt;Try it out&lt;a aria-hidden="true" tabindex="-1" href="#try-it-out"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;The extension is part of a growing ecosystem. If you prefer a standalone experience, you can still check out the experimental &lt;a href="https://github.com/estruyf/ghostwriter-app"&gt;Electron App&lt;/a&gt;. But for my daily workflow, staying in VS Code is king.&lt;/p&gt;
&lt;p&gt;You can &lt;a href="https://marketplace.visualstudio.com/items?itemName=eliostruyf.vscode-ghostwriter"&gt;install the Ghostwriter extension from the VS Code marketplace&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;I am keen to hear what you think. Does being “interviewed” help you write better? Give it a spin and let me know.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Happy writing!&lt;/strong&gt;&lt;/p&gt;
&lt;h2 id="resources"&gt;Resources&lt;a aria-hidden="true" tabindex="-1" href="#resources"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Ghostwriter Agents:&lt;/strong&gt; &lt;a href="https://github.com/estruyf/ghostwriter-agents-ai"&gt;github.com/estruyf/ghostwriter-agents-ai&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Ghostwriter App (Electron):&lt;/strong&gt; &lt;a href="https://github.com/estruyf/ghostwriter-app"&gt;github.com/estruyf/ghostwriter-app&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Original Article:&lt;/strong&gt; &lt;a href="https://www.eliostruyf.com/interviewed-ai-write-blog-posts/"&gt;How I was interviewed by AI to write blog posts&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the &lt;a href="https://github.com/estruyf/ghostwriter-app"&gt;Ghostwriter app&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item><item><title>Exploring the GitHub Copilot SDK: Building a Ghostwriter App</title><link>https://www.eliostruyf.com/exploring-github-copilot-sdk-building-ghostwriter-app/</link><guid isPermaLink="true">https://www.eliostruyf.com/exploring-github-copilot-sdk-building-ghostwriter-app/</guid><description>Discover how to build a Ghostwriter app using the GitHub Copilot SDK and streamline your AI integration for technical content creation.</description><pubDate>Mon, 26 Jan 2026 10:10:06 GMT</pubDate><content:encoded>&lt;p&gt;As a developer, I am always curious about how to optimize my workflow, especially when it comes to writing technical content. That was part of the reason I started to create the &lt;a href="https://github.com/estruyf/ghostwriter-agents-ai"&gt;Ghostwriter Agents for AI&lt;/a&gt;. One of the things I tried to take it a step further was to build a “Ghostwriter” application, but I never got around to finishing it. The reason was the AI layer. 
Should I build my own? Use an existing API?&lt;/p&gt;
&lt;p&gt;With the release of the &lt;a href="https://github.blog/news-insights/company-news/build-an-agent-into-any-app-with-the-github-copilot-sdk/"&gt;GitHub Copilot SDK&lt;/a&gt;, I decided to revisit that idea. This article is the story of how I used the SDK to finally build my Ghostwriter app, and the lessons I learned along the way.&lt;/p&gt;
&lt;h2 id="the-challenge-seamless-ai-integration"&gt;The challenge: seamless AI integration&lt;a aria-hidden="true" tabindex="-1" href="#the-challenge-seamless-ai-integration"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;My goal was simple. I wanted an application that could act as an interviewer, asking me questions about a topic, and then use my answers to draft a post.&lt;/p&gt;
&lt;p&gt;I already had the prompts ready in my &lt;a href="https://github.com/estruyf/ghostwriter-agents-ai"&gt;Ghostwriter Agents AI&lt;/a&gt; project:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Interviewer:&lt;/strong&gt; A persona that digs into the topic.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Writer:&lt;/strong&gt; A persona that takes the interview context and generates the article.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;The missing piece was the “glue.” In my previous attempt, managing the AI context and API calls was a headache. I wanted something seamless, where I could just use what is already available in the ecosystem without reinventing the wheel.&lt;/p&gt;
&lt;h2 id="starting-with-astro"&gt;Starting with Astro&lt;a aria-hidden="true" tabindex="-1" href="#starting-with-astro"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I must admit, when I first looked at the Copilot SDK, documentation was a bit scarce. There were some videos, but not many concrete samples. But hey, that’s part of the fun of the job, right? The best way I learn is by starting a new project.&lt;/p&gt;
&lt;p&gt;I spun up a new &lt;a href="https://astro.build/"&gt;Astro&lt;/a&gt; project. My plan was to create a simple website with an API that communicated with the Copilot CLI.&lt;/p&gt;
&lt;p&gt;Here is the cool part: I actually used GitHub Copilot to write the integration code for me. I pointed it to the &lt;a href="https://github.com/github/copilot-sdk/blob/main/docs/getting-started.md"&gt;GitHub Copilot SDK - Getting started&lt;/a&gt; page in the documentation, and it scaffolded the connection almost perfectly. All I had to do was review it to understand how the data flowed.&lt;/p&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-website.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRoYAAABXRUJQVlA4WAoAAAAQAAAACQAACAAAQUxQSCkAAAABLyAkIP6PlPzgRkTEnMFA2ya7AzQ8OHgwgAX8C8JBRP9jkpyA+rbtBQBWUDggNgAAAPABAJ0BKgoACQABQCYljAJ0AQ8DBViJAAD+/loqpNhofflH0vAK+YM6qFGoxnO20Qxj5OnAAA==" data-src="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-website.webp" alt="Ghostwriter as an Astro website" style="width:4116px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;Ghostwriter as an Astro website&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;h2 id="the-pivot-to-electron"&gt;The pivot to Electron&lt;a aria-hidden="true" tabindex="-1" href="#the-pivot-to-electron"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;The Astro prototype worked great for the interview logic, but I hit a snag when I thought about distribution.&lt;/p&gt;
&lt;p&gt;The Copilot SDK relies on communicating with the Copilot CLI. If I hosted this as a web app, the backend wouldn’t have that local CLI connection. I needed this to run locally on the user’s machine.&lt;/p&gt;
&lt;p&gt;I initially thought, “Perfect, I’ll build a Tauri app!” I love working with Tauri, and it seemed like a natural fit. Unfortunately, the SDK doesn’t support Rust just yet. So, I decided to move the project to &lt;strong&gt;Electron&lt;/strong&gt;.&lt;/p&gt;
&lt;aside class="callout callout-tip" aria-label="tip"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M15 14c.2-1 .7-1.7 1.5-2.5 1-.9 1.5-2.2 1.5-3.5A6 6 0 0 0 6 8c0 1 .2 2.2 1.5 3.5.7.7 1.3 1.5 1.5 2.5"&gt;&lt;/path&gt;&lt;path d="M9 18h6"&gt;&lt;/path&gt;&lt;path d="M10 22h4"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;tip&lt;/p&gt;&lt;div class="callout-text"&gt;When writing this blog, I found out that ther is a community project available: &lt;a href="https://github.com/copilot-community-sdk/copilot-sdk-rust"&gt;copilot-sdk (Rust)&lt;/a&gt;&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="solving-the-packaging-puzzle"&gt;Solving the packaging puzzle&lt;a aria-hidden="true" tabindex="-1" href="#solving-the-packaging-puzzle"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Moving to Electron solved the runtime issue, but it introduced a new one: packaging. When I tried to bundle the application, the communication with the Copilot CLI kept failing. It was one of those “it works on my machine” moments that drives you crazy.&lt;/p&gt;
&lt;p&gt;That is when I discovered a feature that saved the project: the &lt;strong&gt;Copilot CLI Server&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;Instead of relying on the default environment integration, you can manually start the Copilot CLI in server mode on a specific port.&lt;/p&gt;
&lt;div class="expressive-code"&gt;&lt;link rel="stylesheet" href="/_astro/ec.gf7e9.css"&gt;&lt;script type="module" src="/_astro/ec.8zarh.js"&gt;&lt;/script&gt;&lt;figure class="frame is-terminal"&gt;&lt;figcaption class="header"&gt;&lt;span class="title"&gt;&lt;/span&gt;&lt;span class="sr-only"&gt;Terminal window&lt;/span&gt;&lt;/figcaption&gt;&lt;pre data-language="bash"&gt;&lt;code&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;1&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#B392F0;--1:#6F42C1"&gt;copilot&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;--server&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;--port&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;4321&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;/code&gt;&lt;/pre&gt;&lt;div class="copy"&gt;&lt;button title="Copy to clipboard" data-copied="Copied!" data-code="copilot --server --port 4321"&gt;&lt;div&gt;&lt;/div&gt;&lt;/button&gt;&lt;/div&gt;&lt;/figure&gt;&lt;/div&gt;
&lt;p&gt;Once that server is running, you can tell the SDK exactly where to look. Here is how I configured the client in my Electron app:&lt;/p&gt;
&lt;div class="expressive-code"&gt;&lt;figure class="frame"&gt;&lt;figcaption class="header"&gt;&lt;/figcaption&gt;&lt;pre data-language="typescript"&gt;&lt;code&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;1&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#F97583;--1:#BF3441"&gt;import&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; { CopilotClient } &lt;/span&gt;&lt;span style="--0:#F97583;--1:#BF3441"&gt;from&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"@github/copilot-sdk"&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;;&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;2&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;
&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;3&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#99A0A6;--1:#616972"&gt;// Connect to the local CLI server&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;4&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#F97583;--1:#BF3441"&gt;const&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;client&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#F97583;--1:#BF3441"&gt;=&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#F97583;--1:#BF3441"&gt;new&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#B392F0;--1:#6F42C1"&gt;CopilotClient&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;({&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;5&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;    &lt;/span&gt;&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;cliUrl: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"localhost:4321"&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;6&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;});&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;7&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;
&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;8&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#99A0A6;--1:#616972"&gt;// Use the client normally to start the interview session&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;9&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#F97583;--1:#BF3441"&gt;const&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;session&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#F97583;--1:#BF3441"&gt;=&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#F97583;--1:#BF3441"&gt;await&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; client.&lt;/span&gt;&lt;span style="--0:#B392F0;--1:#6F42C1"&gt;createSession&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;();&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;/code&gt;&lt;/pre&gt;&lt;div class="copy"&gt;&lt;button title="Copy to clipboard" data-copied="Copied!" data-code="import { CopilotClient } from &amp;#x22;@github/copilot-sdk&amp;#x22;;&#127;&#127;// Connect to the local CLI server&#127;const client = new CopilotClient({&#127;    cliUrl: &amp;#x22;localhost:4321&amp;#x22;&#127;});&#127;&#127;// Use the client normally to start the interview session&#127;const session = await client.createSession();"&gt;&lt;div&gt;&lt;/div&gt;&lt;/button&gt;&lt;/div&gt;&lt;/figure&gt;&lt;/div&gt;
&lt;p&gt;By spawning the CLI server when the Electron app starts, I established a reliable bridge between my UI and the AI.&lt;/p&gt;
&lt;h2 id="the-result"&gt;The result&lt;a aria-hidden="true" tabindex="-1" href="#the-result"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;With the connection fixed, the app now works exactly as I imagined. It runs the “Interviewer” agent to gather my thoughts, passes that context to the “Writer” agent, and generates a draft. All powered by the Copilot account I’m already signed into.&lt;/p&gt;
&lt;p&gt;Don’t get me wrong, it’s still an early version, but the friction of “building the AI” is completely gone. I can focus entirely on refining the prompts and the user experience.&lt;/p&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-app.gif" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR42mNk+M9QDwADhgGAWjR9awAAAABJRU5ErkJggg==" data-src="https://www.eliostruyf.com/uploads/2026/01/ghostwriter-app.gif" alt="The Ghostwriter app" style="width:auto;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;The Ghostwriter app&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;h2 id="conclusion"&gt;Conclusion&lt;a aria-hidden="true" tabindex="-1" href="#conclusion"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;The GitHub Copilot SDK is a game-changer for developers looking to integrate AI into their applications without the overhead of managing authentication, AI, and API complexity. My journey from a stalled project to a working Ghostwriter app in Electron is proof of that.&lt;/p&gt;
&lt;p&gt;What impressed me the most was how smoothly things came together once I understood the core concepts. The SDK abstracts away the messy parts like token management, session handling, streaming responses. It leaves you free to focus on your prompts and user experience.&lt;/p&gt;
&lt;p&gt;If you’re sitting on an idea that needs AI capabilities, the Copilot SDK removes one of the biggest barriers to entry. No need to negotiate with multiple API providers or implement your own LLM infrastructure. Just point to your local CLI, define your prompts, and build.&lt;/p&gt;
&lt;p&gt;The journey wasn’t completely frictionless. The initial documentation gap and the Electron packaging hurdle taught me a thing or two, but those are the kinds of challenges that are good to think out of the box, use your creativity, or GitHub Copilot to get it solved.&lt;/p&gt;
&lt;p&gt;Let me know what you think if you give the SDK a try.&lt;/p&gt;
&lt;h2 id="resources"&gt;Resources&lt;a aria-hidden="true" tabindex="-1" href="#resources"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/github/copilot-sdk"&gt;GitHub Copilot SDK&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/estruyf/ghostwriter-app"&gt;Ghostwriter App Repository&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/estruyf/ghostwriter-agents-ai"&gt;Ghostwriter Agents AI&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the &lt;a href="https://github.com/estruyf/ghostwriter-app"&gt;Ghostwriter app&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item><item><title>VS Code Workspaces for better AI assistant context</title><link>https://www.eliostruyf.com/vcode-workspaces-ai-assistant-context/</link><guid isPermaLink="true">https://www.eliostruyf.com/vcode-workspaces-ai-assistant-context/</guid><description>Enhance your AI coding assistant's context with VS Code Workspaces for seamless multi-project collaboration and improved code coherence.</description><pubDate>Thu, 15 Jan 2026 10:25:20 GMT</pubDate><content:encoded>&lt;p&gt;I get this question a lot: “How can I give my AI coding assistant context across multiple projects?” If you’re working with a separate frontend and backend, or any split repository setup, you’ve probably run into this problem yourself. Your AI assistant only sees the code in the current project, which means you end up repeating yourself, explaining data structures, and essentially acting as a translator between your projects.&lt;/p&gt;
&lt;p&gt;Here’s a quick tip that changed my workflow: use &lt;a href="https://code.visualstudio.com/docs/editor/workspaces"&gt;VS Code Workspaces&lt;/a&gt; as a “virtual mono-repo” for AI context.&lt;/p&gt;
&lt;h2 id="the-problem-with-separate-projects"&gt;The problem with separate projects&lt;a aria-hidden="true" tabindex="-1" href="#the-problem-with-separate-projects"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;When you have your frontend and backend in different folders (or even different repositories), opening them in separate VS Code instances means your AI assistant lacks the full picture. Want to create a dashboard that fetches data from your API? You’ll need to be very detailed about the data structures, the API endpoints, and the implementation details in each project separately.&lt;/p&gt;
&lt;p&gt;I previously avoided workspaces, preferring separate VS Code instances or actual mono-repos. But not every project allows for a mono-repo structure, and that’s where this trick comes in handy.&lt;/p&gt;
&lt;h2 id="the-solution-vs-code-workspaces"&gt;The solution: VS Code Workspaces&lt;a aria-hidden="true" tabindex="-1" href="#the-solution-vs-code-workspaces"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;A VS Code Workspace lets you open multiple folders in a single editor instance. For AI coding assistants, this means they can see both your frontend and backend code simultaneously. Instead of manually specifying data structures, the AI can discover them from your backend and implement matching code in your frontend, all in a single prompt.&lt;/p&gt;
&lt;p&gt;Here’s the practical difference:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Without shared context:&lt;/strong&gt; “Create a dashboard component. The API returns an object with &lt;code&gt;sessionName&lt;/code&gt; (string), &lt;code&gt;description&lt;/code&gt; (string), and &lt;code&gt;managementUrl&lt;/code&gt; (string)…”&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;With a workspace:&lt;/strong&gt; “Create a dashboard that displays the session data from the API”, and the AI figures out the data structure from your backend code.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="how-to-set-it-up"&gt;How to set it up&lt;a aria-hidden="true" tabindex="-1" href="#how-to-set-it-up"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Setting up a workspace takes less than a minute:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Open one of your projects in VS Code&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;File&lt;/strong&gt; in the menu bar&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Add Folder to Workspace&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Choose your second project folder (e.g., your backend if you started with the frontend)&lt;/li&gt;
&lt;li&gt;Both projects now appear in the Explorer sidebar&lt;/li&gt;
&lt;/ol&gt;
&lt;aside class="callout callout-tip" aria-label="tip"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M15 14c.2-1 .7-1.7 1.5-2.5 1-.9 1.5-2.2 1.5-3.5A6 6 0 0 0 6 8c0 1 .2 2.2 1.5 3.5.7.7 1.3 1.5 1.5 2.5"&gt;&lt;/path&gt;&lt;path d="M9 18h6"&gt;&lt;/path&gt;&lt;path d="M10 22h4"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;tip&lt;/p&gt;&lt;div class="callout-text"&gt;Save your workspace as a &lt;code&gt;.code-workspace&lt;/code&gt; file (File &gt; Save Workspace As) so you can quickly reopen it later. The workspace file stores the folder paths and any workspace-specific settings.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="tool-compatibility"&gt;Tool compatibility&lt;a aria-hidden="true" tabindex="-1" href="#tool-compatibility"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I’ve tested this approach with different AI assistants, and here’s what I found:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://github.com/features/copilot"&gt;GitHub Copilot&lt;/a&gt;:&lt;/strong&gt; Works great. Copilot natively understands the workspace context and can reference code across all folders in your workspace.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://claude.ai/"&gt;Claude&lt;/a&gt;:&lt;/strong&gt; Also works, but takes a different approach. Claude tends to use bash scripts to find the backend location rather than directly leveraging the workspace context. Still functional, just less seamless.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="why-this-matters"&gt;Why this matters&lt;a aria-hidden="true" tabindex="-1" href="#why-this-matters"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;The real value here isn’t just convenience. When your AI assistant has full context, it produces more coherent code. It understands the relationship between your frontend and backend, uses consistent naming, and catches type mismatches before they become runtime errors.&lt;/p&gt;
&lt;p&gt;If you’re building full-stack applications and using AI coding assistants, give workspaces a try. It’s a simple change that can make your AI collaboration significantly more effective.&lt;/p&gt;
&lt;h2 id="resources"&gt;Resources&lt;a aria-hidden="true" tabindex="-1" href="#resources"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://code.visualstudio.com/docs/editor/workspaces"&gt;VS Code Workspaces documentation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/features/copilot"&gt;GitHub Copilot&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://code.visualstudio.com/docs/editor/multi-root-workspaces"&gt;Multi-root Workspaces in VS Code&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the &lt;a href="https://github.com/estruyf/ghostwriter-agents-ai"&gt;ghostwriter AI agents&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item><item><title>Why code review and testing matter more than ever with AI</title><link>https://www.eliostruyf.com/code-review-testing-matter-ai/</link><guid isPermaLink="true">https://www.eliostruyf.com/code-review-testing-matter-ai/</guid><description>Explore why code review and testing are essential in the age of AI to ensure quality and reliability in your development workflow.</description><pubDate>Thu, 08 Jan 2026 11:41:14 GMT</pubDate><content:encoded>&lt;p&gt;A couple of days ago, I posted a tweet that sparked more controversy than I expected. The message was simple: “Always review the code AI generates. Here’s an example where it thought it was a good idea to add a hardcoded file path to load an image.”&lt;/p&gt;
&lt;p&gt;The responses? Let’s just say not everyone was on board with my approach.&lt;/p&gt;
&lt;p&gt;“Perhaps it is a better solution to write the code yourself in the first place? Microslop and other companies going all-in on ‘AI’ are soon to see it fail them completely, why waste time on it now?”&lt;/p&gt;
&lt;p&gt;“Perchance, learn to program yourself. Get good, etc.”&lt;/p&gt;
&lt;p&gt;These dismissive comments became the catalyst for this post. Because here’s the thing: those critics are missing the point entirely. This isn’t about whether AI is “good enough” or whether we should use it at all. It’s about how we adapt our workflows to work effectively with AI while maintaining the quality and reliability our users deserve.&lt;/p&gt;
&lt;h2 id="the-hardcoded-path-that-started-it-all"&gt;The hardcoded path that started it all&lt;a aria-hidden="true" tabindex="-1" href="#the-hardcoded-path-that-started-it-all"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Let me tell you about the macOS notch app I built. I wanted a simple utility to show my currently playing music and GitHub Copilot premium request status in the notch area. There’s one small problem: I don’t know Swift. At all.&lt;/p&gt;
&lt;p&gt;So I did what any developer in 2026 would do. I created a project plan and handed it over to &lt;a href="https://github.com/features/copilot"&gt;GitHub Copilot&lt;/a&gt;. Within a couple of hours, I had a working application. It ran perfectly on my Mac Mini. Everything loaded, the interface looked great.&lt;/p&gt;
&lt;p&gt;Later that day, I switched to my MacBook. The app launched, but something was wrong. The images weren’t loading. That’s when I actually looked at the code.&lt;/p&gt;
&lt;div class="expressive-code"&gt;&lt;link rel="stylesheet" href="/_astro/ec.gf7e9.css"&gt;&lt;script type="module" src="/_astro/ec.8zarh.js"&gt;&lt;/script&gt;&lt;figure class="frame"&gt;&lt;figcaption class="header"&gt;&lt;/figcaption&gt;&lt;pre data-language="swift"&gt;&lt;code&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;1&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#F97583;--1:#BF3441"&gt;func&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#B392F0;--1:#6F42C1"&gt;loadCopilotIcon&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;() {&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;2&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;    &lt;/span&gt;&lt;span style="--0:#99A0A6;--1:#616972"&gt;// Multiple attempts to load icon&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;3&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;    &lt;/span&gt;&lt;span style="--0:#F97583;--1:#BF3441"&gt;let&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; path &lt;/span&gt;&lt;span style="--0:#F97583;--1:#BF3441"&gt;=&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"/Users/eliostruyf/Developer/nodejs/DevNotch/assets/$2x32.png"&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;4&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span class="indent"&gt;    &lt;/span&gt;&lt;span style="--0:#99A0A6;--1:#616972"&gt;// ... 
more hardcoded paths&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;5&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;}&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;/code&gt;&lt;/pre&gt;&lt;div class="copy"&gt;&lt;button title="Copy to clipboard" data-copied="Copied!" data-code="func loadCopilotIcon() {&#127;    // Multiple attempts to load icon&#127;    let path = &amp;#x22;/Users/eliostruyf/Developer/nodejs/DevNotch/assets/$2x32.png&amp;#x22;&#127;    // ... more hardcoded paths&#127;}"&gt;&lt;div&gt;&lt;/div&gt;&lt;/button&gt;&lt;/div&gt;&lt;/figure&gt;&lt;/div&gt;
&lt;p&gt;There it was. A hardcoded file path. With my username. From my Mac Mini. The AI had generated code that worked perfectly in one context but failed completely in another.&lt;/p&gt;
&lt;aside class="callout callout-important" aria-label="important"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M20 13c0 5-3.5 7.5-7.66 8.95a1 1 0 0 1-.67-.01C7.5 20.5 4 18 4 13V6a1 1 0 0 1 1-1c2 0 4.5-1.2 6.24-2.72a1.17 1.17 0 0 1 1.52 0C14.51 3.81 17 5 19 5a1 1 0 0 1 1 1z"&gt;&lt;/path&gt;&lt;path d="M12 8v4"&gt;&lt;/path&gt;&lt;path d="M12 16h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;important&lt;/p&gt;&lt;div class="callout-text"&gt;I didn’t review the code because it worked, so I assumed it was all good. This was just for personal use, not production. But that mindset is exactly the problem.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;p&gt;If this had been production code and I hadn’t caught it during development, customers would have discovered the issue for me. And that’s never a good look.&lt;/p&gt;
&lt;h2 id="what-code-review-actually-catches"&gt;What code review actually catches&lt;a aria-hidden="true" tabindex="-1" href="#what-code-review-actually-catches"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Here’s what I advocate at conferences/events/talks and what I failed to do myself: review AI-generated code. Not because AI is inherently bad, but because it doesn’t always understand the full context of what you’re building.&lt;/p&gt;
&lt;p&gt;The hardcoded path issue required code review to catch. There were no tests yet. I discovered it by manually reading the code after the failure. But this brings up an interesting question: what if you don’t have time to review every line? What if you’re moving fast and the code “just works”?&lt;/p&gt;
&lt;p&gt;This is where testing becomes your safety net.&lt;/p&gt;
&lt;h2 id="testing-as-your-ai-safety-net"&gt;Testing as your AI safety net&lt;a aria-hidden="true" tabindex="-1" href="#testing-as-your-ai-safety-net"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I’ll be honest. Even though everyone should review code, especially AI-generated code, people will still blindly accept what’s created because “it just works.” I’ve done it myself. We’re all human, and we all take shortcuts.&lt;/p&gt;
&lt;p&gt;Having good tests in place ensures everything that already works keeps working as intended. Tests don’t get tired. They don’t skip steps. They don’t assume that because something worked yesterday, it’ll work today.&lt;/p&gt;
&lt;p&gt;Here’s my testing approach:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Create use cases first&lt;/li&gt;
&lt;li&gt;Create the solution from those use cases&lt;/li&gt;
&lt;li&gt;Create end-to-end tests from the solution and use cases&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;For E2E testing, I use &lt;a href="https://playwright.dev/"&gt;Playwright&lt;/a&gt;. It gives me several critical capabilities:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Screenshot matching&lt;/strong&gt;: Verify that the UI looks exactly as expected&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;User flow validation&lt;/strong&gt;: Ensure buttons work and trigger the right actions&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;API verification&lt;/strong&gt;: Confirm that backend calls happen correctly&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Change detection&lt;/strong&gt;: Catch unexpected modifications&lt;/li&gt;
&lt;/ul&gt;
&lt;aside class="callout callout-tip" aria-label="tip"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M15 14c.2-1 .7-1.7 1.5-2.5 1-.9 1.5-2.2 1.5-3.5A6 6 0 0 0 6 8c0 1 .2 2.2 1.5 3.5.7.7 1.3 1.5 1.5 2.5"&gt;&lt;/path&gt;&lt;path d="M9 18h6"&gt;&lt;/path&gt;&lt;path d="M10 22h4"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;tip&lt;/p&gt;&lt;div class="callout-text"&gt;When AI implements new features and changes pages or dashboards, if something new doesn’t exist in your screenshots, the test fails. If AI accidentally deletes something (like a button) or creates duplicates, your tests will catch it.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;p&gt;This is particularly powerful with AI-generated code. You’re informed by tests that things changed, even if you didn’t catch it during code review.&lt;/p&gt;
&lt;h2 id="how-my-workflow-has-evolved"&gt;How my workflow has evolved&lt;a aria-hidden="true" tabindex="-1" href="#how-my-workflow-has-evolved"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I spend less time writing code these days. That probably sounds strange coming from someone who loves developing things, but it’s true. Instead, I spend more time:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Reviewing code&lt;/li&gt;
&lt;li&gt;Thinking about what I want AI to create&lt;/li&gt;
&lt;li&gt;Defining use cases and requirements&lt;/li&gt;
&lt;li&gt;Running tests and validating outcomes&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;I still write code myself when I can’t explain to AI how to do certain things, or when I have it so clear in my head that it’s faster to just write it. I still love developing things personally. But AI agents allow my creativity to take the lead, and I can do things much quicker.&lt;/p&gt;
&lt;aside class="callout callout-info" aria-label="info"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;circle cx="12" cy="12" r="10"&gt;&lt;/circle&gt;&lt;path d="M12 16v-4"&gt;&lt;/path&gt;&lt;path d="M12 8h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;info&lt;/p&gt;&lt;div class="callout-text"&gt;I wrote about my evolving relationship with AI in development in my post: &lt;a href="https://www.eliostruyf.com/evolving-relationship-ai-development/"&gt;My evolving relationship with AI in development&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="understanding-the-fear"&gt;Understanding the fear&lt;a aria-hidden="true" tabindex="-1" href="#understanding-the-fear"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Those dismissive responses to my tweet? They didn’t come from a place of technical superiority. They came from fear. And I get it. People are afraid AI will take over jobs. And yes, it will take over things. But that fear is causing some developers to dig in their heels rather than adapt.&lt;/p&gt;
&lt;p&gt;Let me break down what I think is actually happening when developers react negatively to AI:&lt;/p&gt;
&lt;h3 id="layer-1-job-displacement-anxiety"&gt;Layer 1: Job displacement anxiety&lt;a aria-hidden="true" tabindex="-1" href="#layer-1-job-displacement-anxiety"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;This is the most visceral fear. You watch AI generate working code in seconds, and you think: “What does this mean for my career?” This hits junior developers especially hard. They’re wondering if the traditional entry path into development is being automated away before they even get started.&lt;/p&gt;
&lt;h3 id="layer-2-loss-of-craft"&gt;Layer 2: Loss of craft&lt;a aria-hidden="true" tabindex="-1" href="#layer-2-loss-of-craft"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Developers love the puzzle-solving aspect of programming. There’s a worry that relying on AI will atrophy skills. That the profession becomes about prompt engineering rather than actual engineering. There’s an identity component here: if code generation becomes trivial, what makes a developer valuable?&lt;/p&gt;
&lt;h3 id="layer-3-quality-and-understanding-concerns"&gt;Layer 3: Quality and understanding concerns&lt;a aria-hidden="true" tabindex="-1" href="#layer-3-quality-and-understanding-concerns"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;This is more pragmatic. Developers who’ve debugged AI-generated code that looked plausible but was subtly wrong understand this fear. There’s anxiety about teams adopting AI without understanding its limitations, leading to technical debt or security vulnerabilities.&lt;/p&gt;
&lt;h3 id="layer-4-economic-uncertainty"&gt;Layer 4: Economic uncertainty&lt;a aria-hidden="true" tabindex="-1" href="#layer-4-economic-uncertainty"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Productivity gains mean companies might need fewer engineers. The bar for “good enough” code might drop, so non-developers can build what used to require specialists. This is a legitimate business concern, not just individual anxiety.&lt;/p&gt;
&lt;h3 id="layer-5-the-irony"&gt;Layer 5: The irony&lt;a aria-hidden="true" tabindex="-1" href="#layer-5-the-irony"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Here’s what I’ve observed: developers who actually integrate AI into their workflow, using &lt;a href="https://github.com/features/copilot"&gt;Copilot&lt;/a&gt; or &lt;a href="https://www.anthropic.com/claude"&gt;Claude&lt;/a&gt; for tedious bits while focusing on architecture, debugging, and genuinely hard problems, end up LESS afraid, not more. Fear seems highest among those who haven’t found equilibrium with these tools.&lt;/p&gt;
&lt;aside class="callout callout-important" aria-label="important"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M20 13c0 5-3.5 7.5-7.66 8.95a1 1 0 0 1-.67-.01C7.5 20.5 4 18 4 13V6a1 1 0 0 1 1-1c2 0 4.5-1.2 6.24-2.72a1.17 1.17 0 0 1 1.52 0C14.51 3.81 17 5 19 5a1 1 0 0 1 1 1z"&gt;&lt;/path&gt;&lt;path d="M12 8v4"&gt;&lt;/path&gt;&lt;path d="M12 16h.01"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;important&lt;/p&gt;&lt;div class="callout-text"&gt;The ‘just learn to code’ mindset represents staying stuck in old methods while others embrace modern tools. It’s driven by fear rather than practical evaluation.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="the-skills-that-will-matter"&gt;The skills that will matter&lt;a aria-hidden="true" tabindex="-1" href="#the-skills-that-will-matter"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Eventually, AI code will be perfect. Or close enough. But we’ll still need to verify that it produces the expected outcomes. We need to stay in control, and testing gives us that control.&lt;/p&gt;
&lt;p&gt;Here are the skills and mindsets that will become most valuable:&lt;/p&gt;
&lt;h3 id="creativity"&gt;Creativity&lt;a aria-hidden="true" tabindex="-1" href="#creativity"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;This is the core fundamental skill. AI can write code, but it can’t imagine the solution your users actually need. That comes from you.&lt;/p&gt;
&lt;h3 id="systems-thinking-over-syntax-mastery"&gt;Systems thinking over syntax mastery&lt;a aria-hidden="true" tabindex="-1" href="#systems-thinking-over-syntax-mastery"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;You need to understand what AI creates. Development skills, and learning to develop, are still very important right now, but the focus shifts from memorizing syntax to understanding how systems work together.&lt;/p&gt;
&lt;h3 id="taste-and-judgment"&gt;Taste and judgment&lt;a aria-hidden="true" tabindex="-1" href="#taste-and-judgment"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Can you make judgment calls on whether what’s been created fits your current solutions, patterns, and architecture? Does it feel right? Does it align with your team’s standards?&lt;/p&gt;
&lt;h3 id="problem-decomposition"&gt;Problem decomposition&lt;a aria-hidden="true" tabindex="-1" href="#problem-decomposition"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Understanding business problems and formulating them into well-scoped tasks and use cases becomes critical. If you can’t clearly define what you want, AI can’t build it effectively.&lt;/p&gt;
&lt;h3 id="debugging-and-verification"&gt;Debugging and verification&lt;a aria-hidden="true" tabindex="-1" href="#debugging-and-verification"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;AI is getting better, but sometimes it can’t do what humans can. You need full control over the environment your app is running in. You need to understand what’s happening under the hood and figure out how to solve things when they go wrong.&lt;/p&gt;
&lt;h3 id="communication-and-collaboration"&gt;Communication and collaboration&lt;a aria-hidden="true" tabindex="-1" href="#communication-and-collaboration"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;The technical ceiling is rising for everyone, so differentiation happens elsewhere. Developers who can translate between business stakeholders and technical teams, mentor others, write clearly, and present well become force multipliers.&lt;/p&gt;
&lt;p&gt;In my conference experience, the speakers who resonate aren’t always the deepest technical experts. They’re the ones who make complex ideas accessible.&lt;/p&gt;
&lt;h3 id="comfort-with-ambiguity-and-continuous-learning"&gt;Comfort with ambiguity and continuous learning&lt;a aria-hidden="true" tabindex="-1" href="#comfort-with-ambiguity-and-continuous-learning"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;The landscape is shifting fast. Clinging to a fixed skill set is risky. A mindset of “I’ll figure it out” beats “I already know this” every time.&lt;/p&gt;
&lt;h2 id="actionable-advice-for-skeptical-developers"&gt;Actionable advice for skeptical developers&lt;a aria-hidden="true" tabindex="-1" href="#actionable-advice-for-skeptical-developers"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;If you’re feeling anxious or skeptical about AI tools, here’s my advice: pick one small, annoying task you’ve been putting off. Something tedious but not critical. Not your main project, not anything high-stakes. Maybe:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Writing tests for a module you’ve neglected&lt;/li&gt;
&lt;li&gt;Converting old code to a newer pattern&lt;/li&gt;
&lt;li&gt;Drafting documentation&lt;/li&gt;
&lt;li&gt;Scaffolding a small utility script&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The point isn’t to be impressed. It’s to get hands-on experience with what AI does well and where it falls short. Abstract fear thrives on imagination. Practical experience replaces it with calibration.&lt;/p&gt;
&lt;p&gt;Here’s what you’ll find:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;AI gets you 80% of the way surprisingly fast&lt;/li&gt;
&lt;li&gt;You’ll spend time fixing the last 20%, and that’s instructive&lt;/li&gt;
&lt;li&gt;You’ll start developing intuition for when to reach for AI and when to write code yourself&lt;/li&gt;
&lt;li&gt;You’ll discover which prompts yield useful output versus confident nonsense&lt;/li&gt;
&lt;li&gt;You’ll shift from “will this replace me?” to “how do I use this effectively?”&lt;/li&gt;
&lt;/ul&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;Developers who struggle most either dismiss AI entirely without trying, or expect magic and get disillusioned. Developers who adapt fastest treat AI as a capable but flawed collaborator, useful for first drafts, brainstorming, and grunt work, while keeping their own judgment in the loop.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="the-organizational-reality"&gt;The organizational reality&lt;a aria-hidden="true" tabindex="-1" href="#the-organizational-reality"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;When I talk about testing at companies, I actually get no pushback. People already know they need to test. They acknowledge testing is important. They recognize they need to do this for their solutions.&lt;/p&gt;
&lt;p&gt;The real problem isn’t resistance to the idea. It’s taking the time to actually start doing it. It’s the problem of procrastination: “We are going to do this later.” Pushing it down the road instead of implementing it now.&lt;/p&gt;
&lt;p&gt;The companies that have successfully made this shift? Most were already using E2E tests. In the companies I work with, that was because I had implemented them. These companies had a much quicker and easier start when adopting AI in their development lifecycle.&lt;/p&gt;
&lt;p&gt;For them, it wasn’t really a change because everything was already there. Having tests in place BEFORE adopting AI made the transition smooth. The safety net was already built.&lt;/p&gt;
&lt;h2 id="start-embracing-learning-and-using-it"&gt;Start embracing, learning, and using it&lt;a aria-hidden="true" tabindex="-1" href="#start-embracing-learning-and-using-it"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;We should &lt;strong&gt;NOT&lt;/strong&gt; be afraid of AI. We should embrace it, learn to use it, and get used to it. If we stay stuck in old routines, others will overtake us.&lt;/p&gt;
&lt;p&gt;Use your strengths. Use your creativity to come up with solutions. And most importantly, start doing what we’ve always advocated: testing.&lt;/p&gt;
&lt;p&gt;Write use cases. Develop end-to-end tests to validate that your solutions keep working. Know what still works after new features are implemented. Testing isn’t new, but it’s crucial now because we don’t always know what AI is creating or doing.&lt;/p&gt;
&lt;p&gt;Even if people don’t review code as thoroughly as they should, tests catch issues before production. That’s the safety net that lets you move fast with AI without breaking everything.&lt;/p&gt;
&lt;p&gt;Don’t stay stuck. Don’t let fear drive your decisions. Start small, build intuition, and adapt. The future of development isn’t about choosing between human code and AI code. It’s about knowing how to work with both effectively.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="resources"&gt;Resources&lt;a aria-hidden="true" tabindex="-1" href="#resources"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.eliostruyf.com/evolving-relationship-ai-development/"&gt;My evolving relationship with AI in development&lt;/a&gt; - My journey from prompt engineering to agentic AI&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.eliostruyf.com/ai-code-review-journey-copilot-coderabbit-macroscope/"&gt;My AI Code Review Journey: Copilot, CodeRabbit, Macroscope&lt;/a&gt; - Exploring various AI code review tools&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/features/copilot"&gt;GitHub Copilot&lt;/a&gt; - AI pair programmer&lt;/li&gt;
&lt;li&gt;&lt;a href="https://playwright.dev/"&gt;Playwright&lt;/a&gt; - End-to-end testing framework&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.anthropic.com/claude"&gt;Claude&lt;/a&gt; - AI assistant for development tasks&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the &lt;a href="https://github.com/estruyf/ghostwriter-agents-ai"&gt;ghostwriter AI agents&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item><item><title>Reflecting on 2025: Finding Joy and Creating Value</title><link>https://www.eliostruyf.com/reflecting-2025-finding-joy-creating-value/</link><guid isPermaLink="true">https://www.eliostruyf.com/reflecting-2025-finding-joy-creating-value/</guid><description>Reflect on 2025's transformative journey of joy, creativity, and sustainable value creation as we embrace a brighter future together.</description><pubDate>Mon, 22 Dec 2025 16:06:57 GMT</pubDate><content:encoded>&lt;p&gt;As the year comes to a close, it is time to reflect on the past 12 months. 2025 has been a “transformative” year for me, not just in terms of the code I wrote or the talks I gave, but in how I approach my work and life. It was a year where I actively sought to rediscover joy in what I do, embraced a new way of working with AI, and started thinking about a sustainable future for my projects.&lt;/p&gt;
&lt;p&gt;In this post, I want to share the story of my 2025, the highlights, the lessons learned, and a glimpse into what 2026 holds.&lt;/p&gt;
&lt;h2 id="highlights-of-2025"&gt;Highlights of 2025&lt;a aria-hidden="true" tabindex="-1" href="#highlights-of-2025"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Here are some of the key moments from this year:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;13,000 km cycled&lt;/strong&gt;: Completed my biggest cycling year yet, where I set my personal best on the &lt;strong&gt;Mont Ventoux&lt;/strong&gt;, but also my worst time on it six weeks later.&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2025/12/mont-ventoux.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRlgAAABXRUJQVlA4IEwAAADwAQCdASoKAAgAAUAmJZQCdAD0n23yWAAA/vPELIGfZbIldyejMqEvXlJRGWzO7Jk+0xAeMibbbT24AB1M4b0CENc1N/ewmmObAAAA" data-src="https://www.eliostruyf.com/uploads/2025/12/mont-ventoux.webp" alt="Reaching the summit of Mont Ventoux" style="width:1600px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;Reaching the summit of Mont Ventoux&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://demotime.show"&gt;Demo Time&lt;/a&gt;’s explosive growth&lt;/strong&gt;: Reached over &lt;strong&gt;26,000 installs&lt;/strong&gt; and saw it used at major stages like &lt;strong&gt;OpenAI DevDays&lt;/strong&gt;, &lt;strong&gt;Microsoft Ignite&lt;/strong&gt;, and &lt;strong&gt;GitHub Universe&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Launching EngageTime&lt;/strong&gt;: Built and successfully tested a new attendee engagement platform at &lt;strong&gt;CollabDays Belgium&lt;/strong&gt; and &lt;strong&gt;VisugXL&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Speaking across Europe&lt;/strong&gt;: Delivered &lt;strong&gt;15+ sessions&lt;/strong&gt; at conferences including ESPC, Techorama, and the European Collaboration Summit.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Embracing Agentic AI&lt;/strong&gt;: Shifted my workflow to treat AI as a teammate, allowing me to build tools like &lt;strong&gt;FrameFit&lt;/strong&gt; in just 3 hours.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Hugo to Astro migration&lt;/strong&gt;: Successfully migrated my blog to Astro after 6 years with Hugo, improving my developer experience.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Community contributions&lt;/strong&gt;: Published &lt;strong&gt;30 blog posts&lt;/strong&gt; and continued my journey as a Microsoft MVP, GitHub Star, and Google Developer Expert.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="2025-in-numbers"&gt;2025 in Numbers&lt;a aria-hidden="true" tabindex="-1" href="#2025-in-numbers"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;I love looking at the data to see what I’ve been up to. Here is a breakdown of my year in numbers:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;5,378&lt;/strong&gt; GitHub commits&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;41&lt;/strong&gt; New repositories created&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;26,000+&lt;/strong&gt; &lt;a href="https://demotime.show"&gt;Demo Time&lt;/a&gt; installations&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;30&lt;/strong&gt; Blog posts published&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;15+&lt;/strong&gt; Speaking engagements&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;13,000&lt;/strong&gt; Kilometers cycled&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;50,000,000&lt;/strong&gt; badges generated with the &lt;a href="https://visitorbadge.io/"&gt;Visitor Badge&lt;/a&gt; service&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2025/12/github-stats.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRlwAAABXRUJQVlA4IFAAAADQAQCdASoKAA4AAUAmJZwCdAELTaZGiAD+/W7TsXfMJ/9Y8WHRv024MqubFX0N0MK9Y9DdLw9TFpU0Rag3zwxJ7G34iJy2pMBUGBo5ZaOAAA==" data-src="https://www.eliostruyf.com/uploads/2025/12/github-stats.webp" alt="GitHub Stats from 2025" style="width:1179px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;GitHub Stats from 2025&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;h2 id="finding-joy-and-creativity"&gt;Finding joy and creativity&lt;a aria-hidden="true" tabindex="-1" href="#finding-joy-and-creativity"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;At the start of the year, I realized I needed to get to know myself better. I wanted to find a way to get more joy out of my work, and to identify what gives me energy and what drains it. I decided to work with a coach, and it was one of the best decisions I made.&lt;/p&gt;
&lt;p&gt;This journey allowed me to explore my creativity again. It started with small things, like building a simple app to update my desktop background with the daily Microsoft Bing image, just because seeing a beautiful new landscape each day made me smile. It extended to practical tools, like an app to check the number of remaining GitHub Copilot premium calls. These small projects were the spark I needed to reignite my passion for building.&lt;/p&gt;
&lt;h2 id="embracing-ai-as-a-teammate"&gt;Embracing AI as a teammate&lt;a aria-hidden="true" tabindex="-1" href="#embracing-ai-as-a-teammate"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;A huge part of this creative rediscovery was shifting how I work with AI. I moved from simple prompt engineering to an &lt;strong&gt;“agentic-first” workflow&lt;/strong&gt;. I stopped just asking for code snippets and started treating AI as a partner in the development process.&lt;/p&gt;
&lt;p&gt;The Bing wallpaper app, for instance, was purely created with GitHub Copilot and coding agents. I defined what I wanted and the technology stack, and we built it together. This new workflow allowed me to be more ambitious. I built &lt;strong&gt;FrameFit&lt;/strong&gt;, a macOS screenshot utility, in just three hours. It wasn’t about replacing my skills; it was about removing the friction between an idea and its execution.&lt;/p&gt;
&lt;h2 id="getting-interviewed-by-ai"&gt;Getting interviewed by AI&lt;a aria-hidden="true" tabindex="-1" href="#getting-interviewed-by-ai"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;One of the experiments I ran this year was changing how I write. Inspired by &lt;a href="https://bsky.app/profile/danicat83.bsky.social"&gt;Daniela Petruzalek&lt;/a&gt;’s Speedgrapher, I started using AI not just to generate text, but to &lt;strong&gt;interview me&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;I created a set of &lt;a href="https://github.com/estruyf/ghostwriter-agents-ai"&gt;Ghostwriter Agents&lt;/a&gt; that work across GitHub Copilot, VS Code, and Claude. The process is simple: an AI agent interviews me about a topic, asking open-ended questions to capture the context and narrative. Then, another agent analyzes my writing style (my “voice”), and a third agent combines the transcript and voice profile to draft the post.&lt;/p&gt;
&lt;p&gt;The process reshaped my writing: it makes me explain the reasons behind my choices and teases out the narratives that vanish in a blank editor. This post began as an AI-led interview that I later edited and shaped into the final draft.&lt;/p&gt;
&lt;h2 id="from-passion-project-to-sustainable-future"&gt;From passion project to sustainable future&lt;a aria-hidden="true" tabindex="-1" href="#from-passion-project-to-sustainable-future"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;One of the most important lessons I learned this year is about value and freedom. I used to think that making all my tools available for free was the ultimate form of freedom. I realized that wasn’t the case. &lt;strong&gt;Value is creating the freedom.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://demotime.show"&gt;Demo Time&lt;/a&gt; is the perfect example. It started as a tool to solve my own frustration, because “demos are meant to fail.” Today, it’s a staple for speakers worldwide. To ensure I can keep supporting and innovating on it, I announced a plan to build a sustainable ecosystem around it, while keeping the core extension free and open-source. This shift in mindset is crucial for the longevity of the projects I love.&lt;/p&gt;
&lt;h2 id="the-birth-of-engagetime"&gt;The birth of EngageTime&lt;a aria-hidden="true" tabindex="-1" href="#the-birth-of-engagetime"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;This year also saw the birth of a major new project: &lt;a href="https://engagetime.live"&gt;EngageTime&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;As a speaker and organizer since 2012, I’ve always felt something was missing. I wanted a “bird’s-eye view” of my sessions—better feedback, easier connections, and a seamless experience for attendees. In June, I started listing these frustrations, and EngageTime was the answer.&lt;/p&gt;
&lt;p&gt;We tested it live at &lt;strong&gt;CollabDays Belgium&lt;/strong&gt; and &lt;strong&gt;Visug XL&lt;/strong&gt;, and the results were fantastic. Seeing attendees join sessions by simply scanning a QR code—no auth, no app download required—and interacting in real-time was a validation of the vision. It proved that we were solving a real problem.&lt;/p&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2025/12/54897584965_24a98ca86c_o.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRl4AAABXRUJQVlA4IFIAAAAQAgCdASoKAAcAAUAmJYwCdH8AGMJldZMAAP7wTX0fFwfsXS73r/++h2ROdGAwMOqp//D2c5Wj9HKcnrNn++WFfNY4vcSQWr2tCbadRwEY4UAA" data-src="https://www.eliostruyf.com/uploads/2025/12/54897584965_24a98ca86c_o.webp" alt="EngageTime in action at Techorama NL" style="width:1200px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;EngageTime in action at Techorama NL&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;h2 id="looking-forward-to-2026"&gt;Looking forward to 2026&lt;a aria-hidden="true" tabindex="-1" href="#looking-forward-to-2026"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2025/12/fortune-cookie.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRnwAAABXRUJQVlA4IHAAAABQAgCdASoKAAgAAUAmJagCdLoAAs3VVLHrhgAA/vX/g+AiSGbLJCA7+PrBOEAPEedCRAEdFSire9Ui8K33cQ/XGLT5oXa8Zhc/GcNrAGbou0EZY3i8zZzSdc6uQs2cwHHDT95Vy4F+CMmrxljvAAAA" data-src="https://www.eliostruyf.com/uploads/2025/12/fortune-cookie.webp" alt="It doesn’t matter where you go, as long as you do it with a big smile." style="width:1024px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;It doesn’t matter where you go, as long as you do it with a big smile.&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;p&gt;As I look toward 2026, I have some big goals:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Commercializing EngageTime&lt;/strong&gt;: I will start selling EngageTime to events and speakers, with a foundational talk coming up in early January.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Schleck GranFondo&lt;/strong&gt;: My first major personal goal of the year is to ride the Schleck GranFondo in May.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Tour des Grands Cols Alpes&lt;/strong&gt;: Later in the year, I will embark on a 7-day cycling tour with 17,000 meters of elevation. We are riding this in memory of a friend who passed away from cancer this year. He rode it in 2022, and his wife will be joining us. It will be a special and emotional journey. &#128171;&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2025/12/2025-kotk-yves.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRlwAAABXRUJQVlA4IFAAAADwAQCdASoKAAoAAUAmJYgC7AELXbAgRQAA/vqjfG9gL9VCrfOPq7Rdxtlo7UzC4UKvF9sZkwMXHY4WZfidj+hfqjLpflk39sSjS4+e0sAAAA==" data-src="https://www.eliostruyf.com/uploads/2025/12/2025-kotk-yves.webp" alt="Sticker we sell for charity" style="width:1080px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;Sticker we sell for charity&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;p&gt;2025 was the year I rediscovered my rhythm. I’m entering 2026 with a clear sense of direction, a sustainable approach to my work, and a heart full of gratitude.&lt;/p&gt;
&lt;p&gt;Happy New Year! &#127878;&lt;/p&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the &lt;a href="https://github.com/estruyf/ghostwriter-agents-ai"&gt;ghostwriter AI agents&lt;/a&gt;.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item><item><title>Getting interviewed by AI to write better blog posts</title><link>https://www.eliostruyf.com/interviewed-ai-write-blog-posts/</link><guid isPermaLink="true">https://www.eliostruyf.com/interviewed-ai-write-blog-posts/</guid><description>Discover how AI interviews can transform your blog writing process, enhancing creativity and structure for better technical content.</description><pubDate>Sat, 13 Dec 2025 13:04:28 GMT</pubDate><content:encoded>&lt;p&gt;A couple of weeks ago, I saw something in a Google GDE call that completely changed how I think about writing technical content. &lt;a href="https://bsky.app/profile/danicat83.bsky.social"&gt;Daniela Petruzalek&lt;/a&gt; demoed her &lt;a href="https://github.com/danicat/speedgrapher"&gt;Speedgrapher MCP&lt;/a&gt; for &lt;a href="https://github.com/google-gemini/gemini-cli"&gt;Gemini CLI&lt;/a&gt;, and I’ll admit, I was immediately hooked. 
Not because it automated writing (it doesn’t, not really), but because it introduced a conversational layer between “I have an idea” and “here’s a 2,000-word article.”&lt;/p&gt;
&lt;p&gt;The concept is simple: instead of staring at a blank screen trying to organize your thoughts, you let an AI agent interview you about your topic. The AI asks questions, you answer them naturally, and by the end you have a structured transcript that captures not just the facts, but the narrative arc, the pain points, and the technical details that make a blog post actually useful.&lt;/p&gt;
&lt;p&gt;I’ve now written three blog posts using this approach. This article you’re reading right now? It started as an AI interview too. Here’s why this workflow clicked for me, and how I adapted it to work with GitHub Copilot CLI, VS Code, and Claude, not just Gemini CLI.&lt;/p&gt;
&lt;h2 id="the-problem-with-traditional-blog-writing"&gt;The problem with traditional blog writing&lt;a aria-hidden="true" tabindex="-1" href="#the-problem-with-traditional-blog-writing"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Normally when I write a technical article, I start with a vague sense of what might be interesting and start typing. Sometimes that works great. Other times I realize halfway through that I’ve organized the whole thing wrong, or I’ve skipped critical context that seemed obvious in my head but won’t be obvious to readers.&lt;/p&gt;
&lt;p&gt;Don’t get me wrong, I’ve written hundreds of blog posts this way and it’s served me well. But there’s always been this gap between what I know and what ends up on the page. The writing process filters out a lot: the tangential insights, the “oh by the way” details, the thought process behind technical decisions. Those pieces often get lost because I’m focused on the structure and flow of the final article.&lt;/p&gt;
&lt;h2 id="what-makes-the-interview-approach-different"&gt;What makes the interview approach different&lt;a aria-hidden="true" tabindex="-1" href="#what-makes-the-interview-approach-different"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Here’s what I noticed when I first tried Daniela’s Speedgrapher: the AI interviewer doesn’t let you skip ahead. It asks open-ended questions that make you think outside your usual mental framework. When you explain something out loud (or in text) conversationally, you naturally include context you’d otherwise assume was obvious. You mention the rabbit holes you went down. You explain &lt;em&gt;why&lt;/em&gt; you chose one approach over another.&lt;/p&gt;
&lt;p&gt;The interview format creates a structure without forcing rigidity. The AI uses what’s called an “Open-Focused-Closed” questioning model:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Open questions&lt;/strong&gt; explore the topic broadly: “What problem were you trying to solve?”&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Focused questions&lt;/strong&gt; drill into specifics: “Can you share the exact error message?” or “What did that code snippet look like?”&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Closed questions&lt;/strong&gt; confirm understanding: “So the fix was upgrading to version 2.1?”&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;This progression naturally builds a narrative arc. You’re not just dumping information—you’re telling a story with clear cause and effect.&lt;/p&gt;
&lt;h2 id="my-first-experiment-testing-speedgrapher"&gt;My First Experiment: Testing Speedgrapher&lt;a aria-hidden="true" tabindex="-1" href="#my-first-experiment-testing-speedgrapher"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;After seeing Daniela’s demo, I did what any developer would do: I immediately went to the &lt;a href="https://github.com/danicat/speedgrapher"&gt;Speedgrapher repo&lt;/a&gt;, cloned it, built the MCP (Model Context Protocol) server, and configured it for Gemini CLI.&lt;/p&gt;
&lt;p&gt;For context: Gemini CLI supports &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/gemini-cli-custom-slash-commands"&gt;custom slash commands&lt;/a&gt;, which are essentially shortcuts that trigger specific AI behaviors. Speedgrapher uses this to implement its interviewer as a slash command. When you invoke it, the AI switches into interview mode and starts asking questions.&lt;/p&gt;
&lt;p&gt;The first test went really well. So well that I wrote three blog posts with it over the next week. The workflow felt natural: I’d have a rough idea for an article, start the interview, spend 10 minutes answering questions, and end up with a detailed transcript that captured way more nuance than my initial mental outline.&lt;/p&gt;
&lt;h2 id="the-limitation-platform-lock-in"&gt;The Limitation: Platform Lock-In&lt;a aria-hidden="true" tabindex="-1" href="#the-limitation-platform-lock-in"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Here’s the catch: Speedgrapher was built specifically for Gemini CLI’s slash command system. That’s great if you’re using Gemini CLI, but I also use GitHub Copilot CLI, VS Code with Copilot Chat, and Claude regularly. Each tool has its own way of handling AI assistants, and none of them (at the time) supported Gemini’s slash command format.&lt;/p&gt;
&lt;p&gt;I could have just stuck with Gemini CLI for article writing. But I realized the core concept (an AI agent with a specific interviewer prompt) didn’t have to be tied to one platform. The “interview” behavior is just a carefully crafted prompt that guides the AI’s questioning style and output format. If I could package that prompt in a way each platform understood, I could use the same workflow everywhere.&lt;/p&gt;
&lt;h2 id="building-cross-platform-ghostwriter-agents"&gt;Building cross-platform Ghostwriter Agents&lt;a aria-hidden="true" tabindex="-1" href="#building-cross-platform-ghostwriter-agents"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;This is where I got a bit ambitious: I decided to create agent files that would work across GitHub Copilot CLI, VS Code, and Claude from day one. There was no point building for one platform and porting later; I use all three tools regularly depending on context.&lt;/p&gt;
&lt;p&gt;Here’s the elegant part: agents are nothing more than Markdown files with prompts inside them. Each tool reads these files from specific locations:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;GitHub Copilot CLI&lt;/strong&gt;: &lt;code&gt;~/.copilot/agents/&lt;/code&gt; (global)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;VS Code&lt;/strong&gt;: &lt;code&gt;.github/agents/&lt;/code&gt; in your project&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Claude&lt;/strong&gt;: &lt;code&gt;~/.claude/agents/&lt;/code&gt; (global)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The structure is simple. Here’s a simplified version of what an agent file looks like:&lt;/p&gt;
&lt;div class="expressive-code"&gt;&lt;link rel="stylesheet" href="/_astro/ec.gf7e9.css"&gt;&lt;script type="module" src="/_astro/ec.8zarh.js"&gt;&lt;/script&gt;&lt;figure class="frame"&gt;&lt;figcaption class="header"&gt;&lt;/figcaption&gt;&lt;pre data-language="markdown"&gt;&lt;code&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;1&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;---&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;2&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#85E89D;--1:#1E7734"&gt;name&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"ghostwriter-interviewer"&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;3&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#85E89D;--1:#1E7734"&gt;description&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;: &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;"Interviews an author to produce a technical blog post"&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;4&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;---&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;5&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;
&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;6&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;Act as an expert interviewer for a technical blog...&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;7&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt;[Full prompt with instructions, guidelines, and behavioral rules]&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;/code&gt;&lt;/pre&gt;&lt;div class="copy"&gt;&lt;button title="Copy to clipboard" data-copied="Copied!" data-code="---&#127;name: &amp;#x22;ghostwriter-interviewer&amp;#x22;&#127;description: &amp;#x22;Interviews an author to produce a technical blog post&amp;#x22;&#127;---&#127;&#127;Act as an expert interviewer for a technical blog...&#127;[Full prompt with instructions, guidelines, and behavioral rules]"&gt;&lt;div&gt;&lt;/div&gt;&lt;/button&gt;&lt;/div&gt;&lt;/figure&gt;&lt;/div&gt;
&lt;p&gt;That’s it. The YAML frontmatter gives the agent a name and description, and the markdown body contains the prompt that defines how the AI should behave.&lt;/p&gt;
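&lt;p&gt;To make the “just Markdown files” point concrete, here’s a minimal Python sketch of how such a file can be split into its metadata and prompt. This is illustrative only: it’s not the loader any of these tools actually use, and real frontmatter parsing would use a proper YAML library.&lt;/p&gt;

```python
# Minimal sketch: split an agent file into frontmatter metadata and prompt body.
# Illustrative only; Copilot CLI, VS Code, and Claude each have their own loaders.
AGENT_FILE = """---
name: "ghostwriter-interviewer"
description: "Interviews an author to produce a technical blog post"
---

Act as an expert interviewer for a technical blog...
"""

def parse_agent(text: str):
    # Frontmatter sits between the first two '---' markers; the rest is the prompt.
    _, frontmatter, body = text.split("---", 2)
    meta = {}
    for line in frontmatter.strip().splitlines():
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip().strip('"')
    return meta, body.strip()

meta, prompt = parse_agent(AGENT_FILE)
print(meta["name"])  # ghostwriter-interviewer
print(prompt)        # Act as an expert interviewer for a technical blog...
```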
&lt;h2 id="the-agents"&gt;The agents&lt;a aria-hidden="true" tabindex="-1" href="#the-agents"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Speedgrapher consists of various slash commands, but I distilled the core functionality into five agents:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;The interviewer: conducts the interview and gathers raw material&lt;/li&gt;
&lt;li&gt;The voice agent: analyzes your writing style to create a voice profile&lt;/li&gt;
&lt;li&gt;The writer agent: expands the interview transcript into a full article&lt;/li&gt;
&lt;li&gt;The context agent: loads an existing draft interview so you can restart a session or continue where you left off&lt;/li&gt;
&lt;li&gt;The reviewer agent: reviews and refines the draft for clarity and flow&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Here’s a breakdown of the three main agents you’ll use most often:&lt;/p&gt;
&lt;h3 id="1-the-interviewer-agent"&gt;1. The interviewer agent&lt;a aria-hidden="true" tabindex="-1" href="#1-the-interviewer-agent"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Its job is to ask questions and gather raw material. The prompt instructs it to:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Ask exactly one question per turn (no overwhelming multi-part questions)&lt;/li&gt;
&lt;li&gt;Use the Open-Focused-Closed model to progressively narrow in on details&lt;/li&gt;
&lt;li&gt;Request actual artifacts: real code snippets, exact error messages, specific version numbers&lt;/li&gt;
&lt;li&gt;Focus on narrative elements: pain points, breakthroughs, “aha moments”&lt;/li&gt;
&lt;li&gt;Maintain a cozy but professional tone (think knowledgeable peer, not distant expert)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;When you’re done, you tell the agent to stop, and it outputs a complete markdown transcript saved as &lt;code&gt;INTERVIEW.md&lt;/code&gt;.&lt;/p&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2025/12/interview.webp" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,UklGRmwAAABXRUJQVlA4WAoAAAAQAAAACQAAAgAAQUxQSB8AAAAAoM7Ozs7Ozs7OoMj//////////8ii0NDQ0NDQ0NCiAFZQOCAmAAAA0AEAnQEqCgADAAFAJiWkAnQBDvbOLAAA/v5Ob13ZO5+htsJLYAA=" data-src="https://www.eliostruyf.com/uploads/2025/12/interview.webp" alt="The interview with AI" style="width:1323px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;The interview with AI&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;h3 id="2-the-voice-agent-optional-but-recommended"&gt;2. The voice agent (optional but recommended)&lt;a aria-hidden="true" tabindex="-1" href="#2-the-voice-agent-optional-but-recommended"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;The voice agent scans your existing content (blog posts, documentation, whatever you’ve written before) and analyzes your writing style. It produces a detailed profile that includes:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Voice characteristics&lt;/strong&gt;: tone, pacing, formality level, typical sentence structure&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Style rules&lt;/strong&gt;: what you do and don’t do in your writing&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Lexicon&lt;/strong&gt;: your favorite phrases, transitions, and words you avoid&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Structure patterns&lt;/strong&gt;: how you typically start and end articles, your heading style&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This profile gets saved and used by the writer agent in the next step. The result? AI-generated drafts that actually sound like they came from you, not from Generic Tech Blog #47.&lt;/p&gt;
&lt;p&gt;Running the voice agent is optional; you can skip straight to the writer, but I highly recommend doing it at least once. The quality difference is noticeable.&lt;/p&gt;
&lt;div class="caption"&gt;
  &lt;figure class="caption__figure"&gt;
    &lt;a class="lightbox" href="https://www.eliostruyf.com/uploads/2025/12/voice.png" title="Show image"&gt;
      &lt;span class="sr-only"&gt;Show image&lt;/span&gt;
      &lt;img src="data:image/jpeg;data/jpeg;base64,iVBORw0KGgoAAAANSUhEUgAAAAoAAAAFCAYAAAB8ZH1oAAAACXBIWXMAAAsTAAALEwEAmpwYAAAAvklEQVR4nDXDTYrCMBQA4NyhNkl/kpf0vbaZ0aYooqu528AwMKcYKPQ2QleC4No7CLXwxIUffMJT/KGwu1LYnXzdT9j2k8VucnU/AXYn6zfXAj6+BTbduD98cdweF2wjUxvZ04YdrbmwtBgfWOXVIPKyGrr+wHUb71CF2bhmlhnMK2XnRJr7SgEn0vwLA/WI9MnWh6WEhnNDrIuKVe5fH7pEVhkMQpf4lxm8pRrOqbKXVLuLfM/cWWp3U9r9PgFP+D5BGJJ6kwAAAABJRU5ErkJggg==" data-src="https://www.eliostruyf.com/uploads/2025/12/voice.png" alt="The voice agent" style="width:804px;" class="lazyload"&gt;
    &lt;/a&gt;
    &lt;figcaption class="caption__text"&gt;The voice agent&lt;/figcaption&gt;
  &lt;/figure&gt;
&lt;/div&gt;
&lt;h3 id="3-the-writer-agent"&gt;3. The writer agent&lt;a aria-hidden="true" tabindex="-1" href="#3-the-writer-agent"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;This agent takes the interview transcript and (if available) your voice profile, then expands it into a full article. But it’s not just a transcript formatter. The prompt instructs it to:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Add context and definitions for complex terms&lt;/li&gt;
&lt;li&gt;Identify every tool, library, or technology mentioned and add proper markdown links&lt;/li&gt;
&lt;li&gt;Explain the &lt;em&gt;why&lt;/em&gt; behind code examples, not just the syntax&lt;/li&gt;
&lt;li&gt;Maintain narrative flow and the article’s emotional arc&lt;/li&gt;
&lt;li&gt;Follow “cozy web” editorial guidelines (helpful, relatable, narrative-focused)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The output is a markdown document ready for review. In my experience, the drafts are already well-structured—I mostly add screenshots, refine code examples, and adjust a few phrasing choices.&lt;/p&gt;
&lt;h2 id="using-the-agents-in-practice"&gt;Using the Agents in practice&lt;a aria-hidden="true" tabindex="-1" href="#using-the-agents-in-practice"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Each platform has a slightly different invocation method, but they all use the same underlying markdown files:&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;In Claude:&lt;/strong&gt;
Type &lt;code&gt;@agent-ghostwriter-interviewer&lt;/code&gt; to start the interview. The &lt;code&gt;@&lt;/code&gt; symbol lets you reference agents by name.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;In GitHub Copilot CLI:&lt;/strong&gt;
Launch the CLI, select &lt;code&gt;/agent&lt;/code&gt;, then choose the ghostwriter-interviewer from the list.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;In VS Code with Copilot Chat:&lt;/strong&gt;
Open the agent selector dropdown and pick the interviewer agent.&lt;/p&gt;
&lt;p&gt;Once invoked, the experience is remarkably consistent across all three platforms. You answer questions, the AI follows up naturally, and you end with a complete transcript.&lt;/p&gt;
&lt;h2 id="what-ive-learned-after-three-articles"&gt;What I’ve learned after three articles&lt;a aria-hidden="true" tabindex="-1" href="#what-ive-learned-after-three-articles"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;The biggest surprise isn’t the time savings (although it does feel faster) but the quality of the raw material. When an AI interviews you, it asks about things you might not have thought to include. It pushes you to articulate the “why” behind decisions. It captures those tangential insights that usually get edited out.&lt;/p&gt;
&lt;p&gt;Something I’m still working on: making it easier to include screenshots and code samples during the interview process. Right now I add those in the review phase, after the writer agent has produced the draft. It works, but it feels like there’s room for improvement there.&lt;/p&gt;
&lt;p&gt;The other thing I’ve noticed is that the articles feel more complete from the start. There’s less “oh, I forgot to explain this fundamental concept” during editing because the interviewer forced me to explain it during the conversation.&lt;/p&gt;
&lt;h2 id="try-it-yourself"&gt;Try it yourself&lt;a aria-hidden="true" tabindex="-1" href="#try-it-yourself"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;All credit for this idea goes to &lt;a href="https://bsky.app/profile/danicat83.bsky.social"&gt;Daniela Petruzalek&lt;/a&gt; and her &lt;a href="https://github.com/danicat/speedgrapher"&gt;Speedgrapher MCP&lt;/a&gt;. I just adapted the concept to work across multiple platforms.&lt;/p&gt;
&lt;p&gt;I’ve made the ghostwriter agents available at &lt;a href="https://github.com/estruyf/ghostwriter-agents-ai"&gt;github.com/estruyf/ghostwriter-agents-ai&lt;/a&gt;. Installation is straightforward; you can install for all supported platforms at once:&lt;/p&gt;
&lt;div class="expressive-code"&gt;&lt;figure class="frame is-terminal"&gt;&lt;figcaption class="header"&gt;&lt;span class="title"&gt;&lt;/span&gt;&lt;span class="sr-only"&gt;Terminal window&lt;/span&gt;&lt;/figcaption&gt;&lt;pre data-language="bash"&gt;&lt;code&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;1&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#B392F0;--1:#6F42C1"&gt;npx&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;@estruyf/ghostwriter&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;/code&gt;&lt;/pre&gt;&lt;div class="copy"&gt;&lt;button title="Copy to clipboard" data-copied="Copied!" data-code="npx @estruyf/ghostwriter"&gt;&lt;div&gt;&lt;/div&gt;&lt;/button&gt;&lt;/div&gt;&lt;/figure&gt;&lt;/div&gt;
&lt;p&gt;Or target specific platforms:&lt;/p&gt;
&lt;div class="expressive-code"&gt;&lt;figure class="frame is-terminal"&gt;&lt;figcaption class="header"&gt;&lt;span class="title"&gt;&lt;/span&gt;&lt;span class="sr-only"&gt;Terminal window&lt;/span&gt;&lt;/figcaption&gt;&lt;pre data-language="bash"&gt;&lt;code&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;1&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#99A0A6;--1:#616972"&gt;# Install for specific platforms&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;2&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#B392F0;--1:#6F42C1"&gt;npx&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;@estruyf/ghostwriter&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;--vscode&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;3&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#B392F0;--1:#6F42C1"&gt;npx&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;@estruyf/ghostwriter&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#79B8FF;--1:#005CC5"&gt;--copilot&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;div class="ec-line"&gt;&lt;div class="gutter"&gt;&lt;div class="ln" aria-hidden="true"&gt;4&lt;/div&gt;&lt;/div&gt;&lt;div class="code"&gt;&lt;span style="--0:#B392F0;--1:#6F42C1"&gt;npx&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span style="--0:#9ECBFF;--1:#032F62"&gt;@estruyf/ghostwriter&lt;/span&gt;&lt;span style="--0:#E1E4E8;--1:#24292E"&gt; &lt;/span&gt;&lt;span 
style="--0:#79B8FF;--1:#005CC5"&gt;--claude&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;/code&gt;&lt;/pre&gt;&lt;div class="copy"&gt;&lt;button title="Copy to clipboard" data-copied="Copied!" data-code="npx @estruyf/ghostwriter --vscode&#127;npx @estruyf/ghostwriter --copilot&#127;npx @estruyf/ghostwriter --claude"&gt;&lt;div&gt;&lt;/div&gt;&lt;/button&gt;&lt;/div&gt;&lt;/figure&gt;&lt;/div&gt;
&lt;p&gt;The installer copies the agent files to the right locations for each tool.&lt;/p&gt;
&lt;h2 id="whats-next"&gt;What’s next&lt;a aria-hidden="true" tabindex="-1" href="#whats-next"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;p&gt;Right now I’m waiting to see how others use this workflow. With different writing styles and different content types, I’m curious whether the three-agent pipeline holds up across different use cases.&lt;/p&gt;
&lt;p&gt;The screenshot integration is still on my mind. There’s probably a way to handle visual artifacts more elegantly during the interview phase.&lt;/p&gt;
&lt;p&gt;If you write technical articles or blog posts, I’d encourage you to give this a try. It’s a genuinely different experience from traditional writing, not better or worse, just different. You might find, like I did, that having an AI ask you questions unlocks parts of your knowledge that wouldn’t have made it into the article otherwise.&lt;/p&gt;
&lt;p&gt;Let me know what you think. And if you try the ghostwriter agents, I’d love to hear how it works for your writing process.&lt;/p&gt;
&lt;hr&gt;
&lt;aside class="callout callout-note" aria-label="note"&gt;&lt;div class="callout-icon" aria-hidden="true"&gt;&lt;svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"&gt;&lt;path d="M2 6h4"&gt;&lt;/path&gt;&lt;path d="M2 10h4"&gt;&lt;/path&gt;&lt;path d="M2 14h4"&gt;&lt;/path&gt;&lt;path d="M2 18h4"&gt;&lt;/path&gt;&lt;rect width="16" height="20" x="4" y="2" rx="2"&gt;&lt;/rect&gt;&lt;path d="M9.5 8h5"&gt;&lt;/path&gt;&lt;path d="M9.5 12H16"&gt;&lt;/path&gt;&lt;path d="M9.5 16H14"&gt;&lt;/path&gt;&lt;/svg&gt;&lt;/div&gt;&lt;div class="callout-content"&gt;&lt;p class="callout-title"&gt;note&lt;/p&gt;&lt;div class="callout-text"&gt;This article was created using the ghostwriter-agents workflow. I was interviewed by the AI, ran the voice agent to capture my writing style, and used the writer agent to produce this draft. The meta experience of writing &lt;em&gt;about&lt;/em&gt; the tool &lt;em&gt;using&lt;/em&gt; the tool was… appropriately recursive.&lt;/div&gt;&lt;/div&gt;&lt;/aside&gt;
&lt;h2 id="resources"&gt;Resources&lt;a aria-hidden="true" tabindex="-1" href="#resources"&gt;&lt;span class="icon icon-link"&gt;&lt;/span&gt;&lt;/a&gt;&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://bsky.app/profile/danicat83.bsky.social"&gt;Daniela Petruzalek on Bluesky&lt;/a&gt; - Original creator of Speedgrapher&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/danicat/speedgrapher"&gt;Speedgrapher MCP&lt;/a&gt; - Original interview-based writing tool for Gemini CLI&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/google-gemini/gemini-cli"&gt;Gemini CLI&lt;/a&gt; - Google’s command-line interface for Gemini&lt;/li&gt;
&lt;li&gt;&lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/gemini-cli-custom-slash-commands"&gt;Gemini CLI Custom Slash Commands&lt;/a&gt; - Documentation on Gemini’s slash command system&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/estruyf/ghostwriter-agents-ai"&gt;Ghostwriter Agents AI&lt;/a&gt; - Cross-platform agent files for GitHub Copilot, VS Code, and Claude&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.npmjs.com/package/@estruyf/ghostwriter"&gt;@estruyf/ghostwriter on npm&lt;/a&gt; - Installation package for ghostwriter agents&lt;/li&gt;
&lt;/ul&gt;</content:encoded><dc:creator>Elio Struyf</dc:creator><author>Elio Struyf</author></item></channel></rss>