<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>James Governor&#039;s Monkchips</title>
	<atom:link href="https://redmonk.com/jgovernor/feed/" rel="self" type="application/rss+xml" />
	<link>https://redmonk.com/jgovernor</link>
	<description>An industry analyst blog looking at software ecosystems and convergence</description>
	<lastBuildDate>Fri, 14 Nov 2025 17:56:00 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.1</generator>
<site xmlns="com-wordpress:feed-additions:1">140672375</site>	<item>
		<title>On Cursor, Erich Gamma, VS Code forks and the surprising role of the Eclipse Foundation</title>
		<link>https://redmonk.com/jgovernor/on-cursor-vs-code-forks-and-the-surprising-role-of-the-eclipse-foundation/</link>
					<comments>https://redmonk.com/jgovernor/on-cursor-vs-code-forks-and-the-surprising-role-of-the-eclipse-foundation/#comments</comments>
		
		<dc:creator><![CDATA[James Governor]]></dc:creator>
		<pubDate>Fri, 14 Nov 2025 17:47:01 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://redmonk.com/jgovernor/?p=5379</guid>

					<description><![CDATA[I was writing this post today when the news dropped that Cursor has just raised a new round.   &#62; We’re pleased to announce a new round of financing: our Series D of $2.3B at a $29.3B post-money valuation. We’re excited to deepen our work with existing investors, including Accel, Thrive, Andreessen Horowitz, and DST, and]]></description>
										<content:encoded><![CDATA[<p><a href="http://redmonk.com/jgovernor/files/2025/11/workbench-3_2M3-xp.gif"><img fetchpriority="high" decoding="async" class="aligncenter size-full wp-image-5380" src="http://redmonk.com/jgovernor/files/2025/11/workbench-3_2M3-xp.gif" alt="" width="644" height="491" /></a></p>
<p><span style="font-weight: 400;">I was writing this post today when the </span><a href="https://cursor.com/blog/series-d"><span style="font-weight: 400;">news dropped</span></a><span style="font-weight: 400;"> that Cursor has just raised a new round.  </span></p>
<blockquote><p><span style="font-weight: 400;">We’re pleased to announce a new round of financing: our Series D of $2.3B at a $29.3B post-money valuation. We’re excited to deepen our work with existing investors, including Accel, Thrive, Andreessen Horowitz, and DST, and welcome new partners Coatue, NVIDIA, and Google.</span></p></blockquote>
<p><span style="font-weight: 400;">Now I am old enough to remember when a dev tools startup approaching a VC to raise money would have been laughed out of the room. Obviously that was a different time, but raising $2.3bn on a VS Code fork/Claude Sonnet wrapper is quite impressive all the same, no matter how great the developer experience is. Cursor is excellent software, to be fair.</span></p>
<p><span style="font-weight: 400;">So what does The Eclipse Foundation have to do with this? It’s not the first name you think of when considering organisations supporting the current wave of innovation in AI tooling. I mean, wasn’t Eclipse that open source Java IDE from the 1990s and 2000s? Yes it absolutely was, but the Foundation has continued to support the standardisation of open technologies, and to find new niches (like automotive) to fill. [It&#8217;s also worth mentioning, though not the subject of this post, the excellent work the foundation is doing in corralling other open source foundations around security, for example helping them come to terms with the implications of the European <a href="https://redmonk.com/blog/2025/10/09/rmc-cra-eclipsefoundation/">Cyber Resilience Act</a>. Further side note &#8211; if you build software you absolutely have to start paying attention to the CRA.]</span></p>
<p><a href="https://open-vsx.org/"><span style="font-weight: 400;">Open VSX</span></a><span style="font-weight: 400;">, an Eclipse Foundation project, is an open source registry of extensions for VS Code compatible editors such as Cursor and Windsurf and vibe coding tools including Bolt.</span></p>
<p><span style="font-weight: 400;">Microsoft’s position is pretty clear &#8211; you can’t use the VS Code Marketplace to enable or support products outside the Visual Studio family. So while, for example, VSCodium is a community-driven, MIT-licensed distribution of VS Code that third parties can build on, it can’t take advantage of the official marketplace. VSCodium uses the Open VSX registry as its default extension marketplace, as do most other VS Code forks.</span></p>
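<p>For the curious, &#8220;defaulting to Open VSX&#8221; comes down to a small piece of configuration in a VS Code derivative&#8217;s product.json &#8211; a sketch along the lines of what VSCodium ships, using the endpoint URLs documented by the Open VSX project:</p>

```json
{
  "extensionsGallery": {
    "serviceUrl": "https://open-vsx.org/vscode/gallery",
    "itemUrl": "https://open-vsx.org/vscode/item"
  }
}
```

<p>Broadly speaking, swapping these two URLs is what points a fork at Open VSX rather than Microsoft&#8217;s marketplace.</p>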
<p><span style="font-weight: 400;">Microsoft has been fairly benign about forks of VS Code until quite recently, but with the huge success of the Cursor fork of VS Code it has begun to bare its teeth. </span><span style="font-weight: 400;">Dion Almaer wrote about the forking issue, and Microsoft’s approach to it </span><a href="https://ainativedev.io/news/microsofts-going-to-war"><span style="font-weight: 400;">here</span></a><span style="font-weight: 400;">.</span></p>
<blockquote><p><span style="font-weight: 400;">I want healthy competition. I want VSCode to open up more extension points so anyone can build great experiences without having to fork. I want changes to the marketplace rules so it can be more open. I want companies to be able to work together via open source so we can all gain from the rising tide (a la Chromium++).</span></p>
<p><span style="font-weight: 400;">We are going through such an explosion in the world of development thanks to AI and the surfaces where we get to use it.</span></p>
<p><span style="font-weight: 400;">What will be next? Will the companies with chess pieces on the board make moves that can help all? As developers what is our role to play? We can be clear on what we want to see… and we can adopt the tools that tie to our values.</span></p></blockquote>
<p><span style="font-weight: 400;">Open VSX will be part of any pushback against Microsoft&#8217;s efforts to regain control of the situation. Amazon Web Services adopted Open VSX as the default registry for its <a href="https://kiro.dev/">Kiro</a> AI IDE, which is another big win for the project. AWS recently announced it’s </span><a href="https://blogs.eclipse.org/post/mike-milinkovich/aws-invests-strengthening-open-source-infrastructure-eclipse-foundation"><span style="font-weight: 400;">investing to support the Eclipse Foundation accordingly</span></a><span style="font-weight: 400;">. So yes, Open VSX is in an interesting place.</span></p>
<p><span style="font-weight: 400;">As I wrote recently &#8211; </span><a href="https://redmonk.com/jgovernor/ai-disruption-code-editors-are-up-for-grabs/"><span style="font-weight: 400;">Editors are up for Grabs</span></a><span style="font-weight: 400;"> &#8211; and the Eclipse Foundation is supporting this innovation and choice.</span></p>
<p>For those of you that are IT history enjoyers, one significant irony in all of this is that Erich Gamma was one of the original leads on the Eclipse IDE. He also just happens to be the original lead on the VS Code project, which began as Project Monaco, a web-based IDE, in 2011, and he&#8217;s still the key figurehead behind it. So an organisation founded to support a project he helped create 25 years ago is now becoming a counterweight to what he later built at Microsoft &#8211; VS Code, one of the most successful and well-loved code editors of all time. What a towering influence Gamma has been on the world of developer tools. He really is the GOAT.</p>
<p><span style="font-weight: 400;">Disclosure: AWS, The Eclipse Foundation and Microsoft are all clients. </span></p>
]]></content:encoded>
					
					<wfw:commentRss>https://redmonk.com/jgovernor/on-cursor-vs-code-forks-and-the-surprising-role-of-the-eclipse-foundation/feed/</wfw:commentRss>
			<slash:comments>2</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5379</post-id>	</item>
		<item>
		<title>Progressive Delivery, the book, is here</title>
		<link>https://redmonk.com/jgovernor/progressive-delivery-the-book-is-here/</link>
					<comments>https://redmonk.com/jgovernor/progressive-delivery-the-book-is-here/#respond</comments>
		
		<dc:creator><![CDATA[James Governor]]></dc:creator>
		<pubDate>Wed, 12 Nov 2025 15:46:36 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://redmonk.com/jgovernor/?p=5376</guid>

					<description><![CDATA[It&#8217;s been a long time coming but the book has finally shipped. I am super proud to have worked with my co-authors Adam Zimman, Heidi Waterhouse, and Kimberly Harrison to deliver something that I think is going to be really useful to people. I can&#8217;t think of a better publisher than IT Revolution. I mean]]></description>
										<content:encoded><![CDATA[<p><a href="http://redmonk.com/jgovernor/files/2025/11/progressive-delivery.png"><img decoding="async" class="aligncenter size-full wp-image-5377" src="http://redmonk.com/jgovernor/files/2025/11/progressive-delivery.png" alt="" width="619" height="346" srcset="https://redmonk.com/jgovernor/files/2025/11/progressive-delivery.png 619w, https://redmonk.com/jgovernor/files/2025/11/progressive-delivery-300x168.png 300w, https://redmonk.com/jgovernor/files/2025/11/progressive-delivery-480x268.png 480w" sizes="(max-width: 619px) 100vw, 619px" /></a></p>
<p>It&#8217;s been a long time coming but the book has finally shipped. I am super proud to have worked with my co-authors Adam Zimman, Heidi Waterhouse, and Kimberly Harrison to deliver something that I think is going to be really useful to people. I can&#8217;t think of a better publisher than IT Revolution. I mean if you&#8217;re in the same stable as Accelerate and The Phoenix Project, you know you&#8217;re doing something right.</p>
<p>We wanted a strong analytical framework and great case studies and that&#8217;s exactly what we&#8217;ve got. We also wanted to cast a fresh eye on engineering and product management disciplines, putting the user first but really thinking about the business of software delivery and how to improve it. In the book for example we think about the third loop. It&#8217;s not enough just to have dev and ops. You also need the user in the frame. So much of the foundational work in defining modern software delivery was established before we had the world’s computing resources at our fingertips in the cloud. Before we had infrastructure as code as a guiding principle. Before we could really model and accurately codify every single element in our IT estate. Before we had accepted the primacy, even, of engineering and software developers in delivering new digital products. Before we could do real user monitoring. Before we could easily do dark launches using purpose built platforms. Before observability had brought all of the signals together that we would need to build, operate and iterate more effectively. We therefore established a framework based on the world as it is now, and the new requirements and capabilities, based on 4 key tenets &#8211; Abundance, Automation, Alignment and Autonomy. The past is a different country and the new country is progressive delivery.</p>
<p>Good business books need great case studies, and on that front we did really well. Having engineers from Amazon Web Services, GitHub, Sumo Logic, and Nike share knowledge and experience with us about the way they had built systems that put the user first &#8211; systems that would enable them to roll new services out to specific cohorts in controlled ways before broader rollouts, reliably, safely and quickly &#8211; yeah, that was super exciting, and we&#8217;re so glad that we were able to include them in the book.</p>
<blockquote><p>At its core, Progressive Delivery exists to serve a fundamental purpose that can be distilled into a simple yet powerful statement: delivering what users need when they need it at the least cost and risk to everyone involved.</p>
<p>This is perhaps the manifesto of Progressive Delivery—not a description of its methods but a declaration of its ultimate goal. It captures the essence of what we’re trying to achieve when we implement these practices. By keeping this goal at the forefront of our thinking, we create a clear criterion against which all of our technical decisions, organizational structures, and delivery practices can be measured.</p>
<p>When we center our work on this purpose, we naturally align our teams, our technologies, and our processes toward creating value rather than just producing output. It transforms software delivery from the mechanical process of shipping features to a thoughtful practice of providing solutions that genuinely help people accomplish their goals.</p></blockquote>
<p>– <a href="https://progressivedelivery.com/">Progressive Delivery: Build the Right Thing for the Right People at the Right Time</a>.</p>
<p>So Progressive Delivery is partly about developer and team autonomy, supporting alignment with the business. But it’s also very much about user needs. One of the great advantages of abundance is that we can run multiple versions of applications and services, and in some cases let the user decide when they’re ready to migrate to a new service. We need to put the user’s needs first. Now of course, great design and great product management are about bringing the user on a journey, and helping them find and enjoy new ways of working and playing. But jerking them around is not the way to do it.</p>
<p>So yeah, I’m pretty excited about the book. If you’re building digital products and services it will give you some important food for thought, and perhaps make you rethink some of your assumptions. Copies are available at Amazon and even book stores, which feels pretty wild. When you read it please review it too &#8211; that would be super helpful in getting the word out!</p>
<p>If you’d like to discuss Progressive Delivery, or have me or one of my co authors present these ideas at your conference or event please let me know.</p>
<p><a href="http://redmonk.com/jgovernor/files/2025/11/signal-2025-11-12-14-39-08-263-1.jpg"><img decoding="async" class="aligncenter size-large wp-image-5378" src="http://redmonk.com/jgovernor/files/2025/11/signal-2025-11-12-14-39-08-263-1-1024x632.jpg" alt="four happy looking people in a pastiche of the famous picture from the Friends TV show. one guy with long hair, a dark haired woman, a guy in an orange checked shirt, and a women with brightly coloured purplish hair." width="1024" height="632" srcset="https://redmonk.com/jgovernor/files/2025/11/signal-2025-11-12-14-39-08-263-1-1024x632.jpg 1024w, https://redmonk.com/jgovernor/files/2025/11/signal-2025-11-12-14-39-08-263-1-300x185.jpg 300w, https://redmonk.com/jgovernor/files/2025/11/signal-2025-11-12-14-39-08-263-1-768x474.jpg 768w, https://redmonk.com/jgovernor/files/2025/11/signal-2025-11-12-14-39-08-263-1-480x296.jpg 480w, https://redmonk.com/jgovernor/files/2025/11/signal-2025-11-12-14-39-08-263-1-1017x627.jpg 1017w, https://redmonk.com/jgovernor/files/2025/11/signal-2025-11-12-14-39-08-263-1.jpg 1524w" sizes="(max-width: 1024px) 100vw, 1024px" /></a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://redmonk.com/jgovernor/progressive-delivery-the-book-is-here/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5376</post-id>	</item>
		<item>
		<title>VMware Cloud Foundation &#8211; what&#8217;s actually going on?</title>
		<link>https://redmonk.com/jgovernor/vcf-whatsup/</link>
					<comments>https://redmonk.com/jgovernor/vcf-whatsup/#comments</comments>
		
		<dc:creator><![CDATA[James Governor]]></dc:creator>
		<pubDate>Tue, 28 Oct 2025 21:22:09 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://redmonk.com/jgovernor/?p=5375</guid>

					<description><![CDATA[It’s been nearly two years since Broadcom acquired VMware, and a turbulent two years at that. There has been plenty of disruption. Customers and partners have both been fairly vocal about business model changes. But as renewal deals have been signed things have calmed down for now. I recently talked to Prashanth Shenoy, vice president]]></description>
										<content:encoded><![CDATA[<p><iframe class='youtube-player' width='640' height='360' src='https://www.youtube.com/embed/_aqnVs9CgyI?version=3&#038;rel=1&#038;showsearch=0&#038;showinfo=1&#038;iv_load_policy=1&#038;fs=1&#038;hl=en-US&#038;autohide=2&#038;wmode=transparent' allowfullscreen='true' style='border:0;' sandbox='allow-scripts allow-same-origin allow-popups allow-presentation'></iframe></p>
<p><span style="font-weight: 400;">It’s been nearly two years since Broadcom acquired VMware, and a turbulent two years at that. There has been plenty of disruption. Customers and partners have both been fairly vocal about business model changes. But as renewal deals have been signed things have calmed down for now. </span><span style="font-weight: 400;">I recently talked to Prashanth Shenoy, vice president of product marketing in the VMware Cloud Foundation (VCF) Division of Broadcom about what’s going on with the integration, and what we can expect in future. </span></p>
<p><span style="font-weight: 400;">A few things really stood out in this interview:</span></p>
<ol>
<li style="font-weight: 400;" aria-level="1"><span style="font-weight: 400;">Clarity.  The way Shenoy communicates what&#8217;s going on is really admirable. It&#8217;s well worth watching the video and/or reading the entire transcript because of that. His thoughts are well structured and very clear.</span></li>
<li style="font-weight: 400;" aria-level="1"><span style="font-weight: 400;">Continuity. Shenoy repeatedly makes clear that a lot of the changes made since the acquisition were in fact already in play at VMware. The difference is that Broadcom tore off the band-aid: it made a dramatic shift, but the direction of travel was already in place. It&#8217;s not like Broadcom came in and threw away everything that VMware was doing. It just moved more aggressively and with more clarity.</span></li>
<li style="font-weight: 400;" aria-level="1"><span style="font-weight: 400;">Chutzpah. Considering all the sound and fury about the acquisition, it was interesting to have Shenoy straight out argue that VCF lowered prices. As a VMware customer, industry commentator, partner, or competitor you will be surprised to hear that. But Shenoy went there, making the claim that the biggest difference is the transition to subscription-based pricing, and that in some cases there were reductions over what you would have paid for the equivalent VMware offerings. Bold moves and bold claims.</span></li>
</ol>
<p><span style="font-weight: 400;">Shenoy challenged me about the industry at large and the transition to subscription-based pricing &#8211; frankly he’s absolutely right that we live in a subscription rather than a perpetual license economy. These transitions are indeed painful. Adobe for example got in there fairly early &#8211; customers still complain about the transition to a subscription-based model, but Adobe’s growth since the decision shows it was merited.</span></p>
<p><span style="font-weight: 400;">Here&#8217;s his take on &#8220;price cuts&#8221;.</span></p>
<blockquote><p><span style="font-weight: 400;">In fact, when we had our subscription product for VCF, it was $700 per core per year. And we cut it down for $350 per core per year, the list price, right? Which is a 50% reduction. So, the contrary to popular belief that we raised the price, we actually decreased the price. But there was a shift from perpetual to subscription. </span></p></blockquote>
<p><span style="font-weight: 400;">One big theme of the conversation, and indeed the raison d’être for VCF, is portfolio simplification &#8211; he said that before the acquisition VMware had 9,000 SKUs for its cloud infrastructure product. It is absolutely inarguable that the VMware portfolio had become complicated, unwieldy, and messy. It needed to be simplified. There had been so many acquisitions, so many overlaps born of strategic shifts, that it was hard to fully grasp the portfolio. Tanzu for example started as a container platform, before it became a grab bag of technologies.</span></p>
<blockquote><p>So, we went ahead and simplified the product based on what the customer wanted. give me a platform that truly helps me give a strong alternative to public cloud, like a true private cloud platform, right? And that was VCF. So, that’s where we focused our energy and intention. Portfolio simplification.</p></blockquote>
<blockquote><p><span style="font-weight: 400;">A lot of our customers now understand the value that the platform provides. In fact, we’ve seen some major improvements in the total cost of ownership once they go through the Capex hurdle. And we have seen a lot of our customers move their workloads, including the modern workloads, containerized workloads, onto VCF to run their business. So, it’s been a pretty exciting transformation, but with its bumps. We could have done a lot better in terms of communicating why we did this and bringing the customer along. But we were so fast in making the changes that it caused some disruption in the market.</span></p></blockquote>
<p><span style="font-weight: 400;">Certainly true. Change management is hard.</span></p>
<p><span style="font-weight: 400;">One area where VMware is in excellent shape is in data, infrastructure, and operational sovereignty. The market has moved its way decisively. Global geopolitics has pushed sovereignty right to the top of enterprise IT concerns in 2025, certainly in Europe and also to some extent Asia Pac. If private cloud seemed like a luxury that only truly made sense for regulated industries, now it seems more like a luxury not to be hedging with private cloud bets. And that’s of course where VMware has been investing and building. We also spoke a fair bit about personae. Shenoy said VCF caters to three key personae &#8211; the cloud admin, the platform engineer, and the developer who doesn’t even want to think about underlying services. </span></p>
<p><span style="font-weight: 400;">What do these apps look like? Enterprises want to drive value with new AI-based applications, but they’re really concerned about compliance, privacy, and security. </span></p>
<blockquote><p><span style="font-weight: 400;">How can we help customers move at the pace of AI, pace of software, pace of what the developer needs? Those are the kind of customers that we are truly helping to work with to provide them a platform, the private cloud platform, on which they can run their business for both their containerized application as well as the VM application.</span></p></blockquote>
<p><span style="font-weight: 400;">When it comes to AI, one question I had was about GPUs. Don’t we live in a cloud-based GPU world now? What can Broadcom offer customers there? Well &#8211; apparently some enterprises are indeed buying their own GPUs, and Shenoy argues that virtualisation capabilities are more important than ever.</span></p>
<p><span style="font-weight: 400;">So Broadcom has a platform called Private AI Foundation, built jointly with NVIDIA, which is designed to give enterprises the most efficiency from the GPU hardware they buy, with 25 years of innovation behind it &#8211; dynamic resource scheduling, vMotion and so on. Inference and training require high performance and high speed data connectivity.</span></p>
<blockquote><p><span style="font-weight: 400;">One of the key, again, misconceptions in the industry has been like, hey, you can run all these AI workloads on bare metal. If you look at all of the hyperscalers, they don’t necessarily run it on bare metal. They have a Linux kernel that they run on top of. It’s the same thing with us, too. We run it on top of ESX, which is our hypervisor. And all of the capabilities of virtualization that we built in also comes along with that. So, when we have run performance benchmarks with MLPerf, we’ve seen pretty much VCF and vSphere retaining 99% of the performance of a bare metal with all of the other virtualization capabilities that I just talked about.</span></p></blockquote>
<p><span style="font-weight: 400;">For more about the simplification around VCF and Tanzu I again suggest you watch the video or read the transcript &#8211; in particular, how Broadcom decided what was infrastructure and what were application services as containers and VMs collided, given the application and PaaS services in its portfolio. VCF now includes Kubernetes runtime services, but not the higher level services.</span></p>
<p><span style="font-weight: 400;">Before signing off I have to mention customer feelings, which are still running hot. We’ll see over the next few years how things settle down, especially as renewals come up again. The competition, notably IBM, is very focused on competing more head to head with VMware. Hyperscalers too sense opportunity, sovereignty issues notwithstanding. </span></p>
<p><span style="font-weight: 400;">But Broadcom is absolutely crushing it from a revenue and share price perspective. Remaining employees are certainly happy with their options. And the portfolio and messaging are a lot more straightforward. VCF is a marker. It was a good conversation. </span></p>
<p>disclosure: VMware is a customer and sponsored the video. IBM is also a customer.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://redmonk.com/jgovernor/vcf-whatsup/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5375</post-id>	</item>
		<item>
		<title>Why Log Data Management Is a Thing</title>
		<link>https://redmonk.com/jgovernor/why-log-data-management-is-a-thing/</link>
					<comments>https://redmonk.com/jgovernor/why-log-data-management-is-a-thing/#respond</comments>
		
		<dc:creator><![CDATA[James Governor]]></dc:creator>
		<pubDate>Mon, 27 Oct 2025 16:54:48 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://redmonk.com/jgovernor/?p=5373</guid>

					<description><![CDATA[With all the buzz around Observability over the last few years it’s easy to imagine that when it comes to logs, metrics and traces, it’s game over. Just stick all the data you need in a database, or these days a data lakehouse, and start building queries and dashboards. Easy.  Glibness aside, Observability tools vendors]]></description>
										<content:encoded><![CDATA[<p><a href="http://redmonk.com/jgovernor/files/2025/10/ship-log-stock-scaled.jpeg"><img loading="lazy" decoding="async" class="aligncenter size-large wp-image-5374" src="http://redmonk.com/jgovernor/files/2025/10/ship-log-stock-1024x696.jpeg" alt="picture of 3 men on a ship with a cord, measuring ship log. Ship log or chip log is a navigation tool to estimate the speed of the ship throwing a wooden chip overboard at the end of a line marked by knots during a predetermined interval of time" width="1024" height="696" srcset="https://redmonk.com/jgovernor/files/2025/10/ship-log-stock-1024x696.jpeg 1024w, https://redmonk.com/jgovernor/files/2025/10/ship-log-stock-300x204.jpeg 300w, https://redmonk.com/jgovernor/files/2025/10/ship-log-stock-768x522.jpeg 768w, https://redmonk.com/jgovernor/files/2025/10/ship-log-stock-1536x1043.jpeg 1536w, https://redmonk.com/jgovernor/files/2025/10/ship-log-stock-2048x1391.jpeg 2048w, https://redmonk.com/jgovernor/files/2025/10/ship-log-stock-480x326.jpeg 480w, https://redmonk.com/jgovernor/files/2025/10/ship-log-stock-923x627.jpeg 923w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></a></p>
<p><span style="font-weight: 400;">With all the buzz around Observability over the last few years it’s easy to imagine that when it comes to logs, metrics and traces, it’s game over. Just stick all the data you need in a database, or these days a data lakehouse, and start building queries and dashboards. Easy. </span></p>
<p><span style="font-weight: 400;">Glibness aside, Observability tools vendors generally claim they can manage all your metrics and telemetry data in a single coherent store, with consistent access mechanisms. </span><span style="font-weight: 400;">But as telemetry data has exploded, so have costs. This is especially true when we want to correlate the data in terms of business needs &#8211; the problem with things like customer number, user or product ID, or IP address is that they are inherently high cardinality. Columns with many unique values drive up the cost of memory and compute, and those costs get passed on to customers. </span></p>
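<p>A toy example makes the cardinality point concrete (hypothetical field names, not any vendor&#8217;s API): a bounded field like HTTP status contributes a handful of distinct index entries, while a user ID contributes one per user, and index and memory costs scale with that count.</p>

```python
# Hypothetical telemetry: 10,000 request events from 5,000 distinct users.
events = [{"status": 200 if i % 10 else 500, "user_id": f"u{i % 5000}"}
          for i in range(10_000)]

def cardinality(field):
    """Distinct values a field contributes to an index or label set."""
    return len({e[field] for e in events})

print(cardinality("status"))   # 2 distinct statuses -> cheap to index
print(cardinality("user_id"))  # 5,000 distinct users -> 2,500x the entries
```

<p>Any store that builds a series or index entry per distinct value pays for that second number, which is why business identifiers are the expensive ones.</p>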
<p><span style="font-weight: 400;">Where it used to be that folks complained Splunk was expensive, these days we hear the same about Datadog. Long seen as the darling of the APM space rather than a &#8220;legacy player&#8221;, Datadog is now the one tagged as expensive. In 2025 this is a real weakness &#8211; the issue comes up repeatedly in customer conversations. It’s not that people don’t value Datadog &#8211; they rave about the user experience. But costs are a concern. </span></p>
<p><span style="font-weight: 400;">There is a huge opportunity here around cost management, notably in the emerging log data management (LDM) space. Organisations are concerned with costs of storage, and cardinality. </span></p>
<p><span style="font-weight: 400;">So what is Log Data Management and why is it useful? </span></p>
<p><span style="font-weight: 400;">The bottom line is that log management is indeed a data management problem. Data sources continue to fragment with every new platform the organisation uses. Modern Observability is not about instrumentation but data, especially in the open standards world of OpenTelemetry. But we’re not yet living in a world where every piece of your infrastructure is using OTel. There is plenty of telemetry in different systems that needs to be integrated, collated and transformed before it’s truly useful &#8211; similar to ETL in the data warehousing space.</span></p>
<p><span style="font-weight: 400;">So we’re faced with at least two factors that need to be addressed &#8211; cost of storage, and complexity of the data landscape.</span></p>
<p><span style="font-weight: 400;">You might justifiably claim if your tool is primarily used for troubleshooting by developers that you don’t actually need to store all the unique events in your log stream, but a lot of organisations with a strong focus on security and compliance, such as those in regulated industries, do indeed want to store all the telemetry. </span></p>
<p><span style="font-weight: 400;">With log data management you’re concentrating on integrating data from a range of sources, often leaving it in place, but with pipeline routing and data refinery capabilities that allow you to manage all of your log data as cost-effectively as possible.</span></p>
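<p>As a sketch of what &#8220;pipeline routing and data refinery&#8221; can mean in practice (hypothetical code and destination names, not any particular vendor&#8217;s API): each event is redacted on the way through, compliance-relevant sources are routed to full retention, and the rest is sampled down before it reaches a priced-on-ingest backend.</p>

```python
import random

def redact(event):
    """Mask or drop fields that shouldn't leave the pipeline."""
    event = dict(event)
    event.pop("ip_address", None)        # drop PII outright
    if "user_id" in event:
        event["user_id"] = "REDACTED"    # keep the field, mask the value
    return event

def route(event, sample_rate=0.1):
    """Return (destination, redacted event) for a single log event."""
    event = redact(event)
    if event.get("source") == "cloudtrail":   # compliance: keep everything
        return ("retention_store", event)
    if random.random() < sample_rate:         # sample troubleshooting data
        return ("observability_backend", event)
    return ("cheap_object_storage", event)    # everything else, at rest
```

<p>The economics follow directly: the expensive backend only sees the sampled slice, while the full stream lands in cheap storage for replay or audit later.</p>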
<p><span style="font-weight: 400;">The company that arguably best represents this view of the market right now is Cribl. It doesn’t position itself as a replacement for Observability or even log management vendors, but rather as an adjunct to them.</span></p>
<p><span style="font-weight: 400;">Cribl Stream, formerly called LogStream, is about sending data to multiple places, rather than collecting it into one. It can enrich or redact data, and is used by customers to reduce volume for existing ingest platforms. Storing every AWS CloudTrail or Windows XML event gets very expensive quickly. </span></p>
<p><span style="font-weight: 400;">Cribl is taking a similar approach with Cribl Search. Rather than consolidating data in one place before searching, the platform will search at the edge, so that you can search across multiple contexts, with data left in place, using a unified query language. Federated search is </span><i><span style="font-weight: 400;">really</span></i><span style="font-weight: 400;"> hard, but the philosophy of “search-in-place” is a great play for customers that don’t want to buy another centralisation promise. </span></p>
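<p><span style="font-weight: 400;">At its core, the search-in-place philosophy means fanning a single query out to multiple stores, leaving the data where it lives, and merging only the matches. Here is a deliberately tiny sketch of that pattern &#8211; the store names and the string-match "query language" are invented for illustration, and the hard parts of real federated search (schema differences, pushdown, pagination) are elided.</span></p>

```java
import java.util.*;

// Minimal federated "search-in-place" sketch: query each store where it
// lives, merge only matching lines. Store contents are hypothetical.
public class FederatedSearch {
    static final Map<String, List<String>> stores = Map.of(
            "s3-archive", List.of("ERROR payment failed", "INFO ok"),
            "edge-node-1", List.of("ERROR disk full", "DEBUG tick"));

    // Fan the query out in parallel, tag each hit with its source store.
    static List<String> search(String term) {
        return stores.entrySet().parallelStream()
                .flatMap(e -> e.getValue().stream()
                        .filter(line -> line.contains(term))
                        .map(line -> e.getKey() + ": " + line))
                .toList();
    }

    public static void main(String[] args) {
        System.out.println(search("ERROR")); // two hits, one from each store
    }
}
```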
<p><span style="font-weight: 400;">Chronosphere moved from using CrowdStrike as an OEM log provider to launching their own log control product in June 2025. Chronosphere’s platform is all about letting users control the volume of data they are storing, but because they don’t price on ingest users can have more visibility into what they need to keep. When Chronosphere launched their log product, CEO Martin Mao told RedMonk:</span></p>
<blockquote><p><span style="font-weight: 400;">One of the big gaps is you don&#8217;t know which and which sections of your logs to reduce and how to reduce them. And the way we solve that, and this is one of the reasons why we built our own back end, is we actually have to analyze all of how you use the logs in terms of dashboards, alerts and things like that. Then we turn the analytics into suggestions to feed a telemetry pipeline, and you can reduce your data volume. </span></p></blockquote>
<p><span style="font-weight: 400;">Startup </span><a href="https://www.controltheory.com/"><span style="font-weight: 400;">Control Theory</span></a><span style="font-weight: 400;"> was created to give companies more operational and cost control over their logs. Co-founder Bob Quillen <a href="https://redmonk.com/blog/2025/06/30/rmc-bob-quillin/">argues that</a> OTel helped democratize instrumentation and collection of telemetry, but it still led to “fat dumb pipes that dump into a data lake, and then you pay for ingest of that data, indexing it, and retaining it. And we thought, ‘there’s got to be a better way to do this.’” And so he and his co-founders set out to create a control layer to sit on top of telemetry data to better manage it.</span></p>
<p><a href="https://hydrolix.io/"><span style="font-weight: 400;">Hydrolix</span></a><span style="font-weight: 400;">, on the other hand, is tackling the cost problem of logs by delivering an extremely high compression data lake that can stay always hot. Hydrolix’s approach is maybe tangential to the other log data management approaches mentioned here. While other competitors focus on reducing the total volume of logs saved and stored, Hydrolix instead have focused on reining in costs by building their own proprietary compression methodology. A big reason why they can compress so efficiently is that their solution focuses exclusively on logs, not any other type of telemetry. </span></p>
<p><span style="font-weight: 400;">Honeycomb, which positions itself as the best solution for querying and analysing high cardinality data at scale, has responded to the need for pipeline-based log data management approaches with the launch of its Honeycomb Telemetry Pipeline product. Telemetry Pipeline Manager uses the OpenTelemetry Collector to scrape and collect system logs, and the collector supports multiple log formats. Logs can also be refined, eliminating redundant data. The Refiner also enables the identification of potentially important events &#8211; showing, for example, errors or slow requests. The rest of the data is archived, but you can rehydrate full-fidelity logs and traces directly from S3.</span></p>
<p><span style="font-weight: 400;">Datadog also offers an Observability Pipelines product &#8211; customers save on egress costs by sending only valuable logs to a chosen observability vendor, and then routing other logs to long-term storage such as AWS S3, Azure Blob Storage or Google Cloud. As ever Datadog offers plenty of out of the box functionality, in this case more than 150 predefined parsing rules to transform logs into structured formats for querying, using its Grok parser. </span></p>
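<p><span style="font-weight: 400;">The parsing step is worth dwelling on, because turning raw log lines into structured fields is what makes them queryable at all. The sketch below reduces a grok-style rule to a plain Java regex with named groups &#8211; this is emphatically not Datadog's actual Grok syntax or parser, just the underlying idea.</span></p>

```java
import java.util.*;
import java.util.regex.*;

// Illustrative only: one grok-style access-log rule expressed as a
// regex with named capture groups, yielding a structured record.
public class LogParser {
    private static final Pattern ACCESS = Pattern.compile(
            "(?<ip>\\S+) \\S+ \\S+ \\[(?<ts>[^\\]]+)\\] \"(?<method>\\S+) (?<path>\\S+)[^\"]*\" (?<status>\\d{3})");

    // Returns an empty map when the line doesn't match the pattern.
    static Map<String, String> parse(String line) {
        Matcher m = ACCESS.matcher(line);
        if (!m.find()) return Map.of();
        return Map.of("ip", m.group("ip"), "method", m.group("method"),
                "path", m.group("path"), "status", m.group("status"));
    }

    public static void main(String[] args) {
        String line = "203.0.113.7 - - [14/Nov/2025:17:56:00 +0000] \"GET /index.html HTTP/1.1\" 200 512";
        System.out.println(parse(line)); // structured fields, ready to query
    }
}
```

<p><span style="font-weight: 400;">Multiply this by 150-plus predefined rules and many formats, and the value of getting it out of the box is obvious.</span></p>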
<p><span style="font-weight: 400;">Other products to look at include Mezmo (telemetry pipelines and log analysis) &#8211; it actively markets itself around, for example, </span><a href="https://www.mezmo.com/blog/controlling-datadog-costs-with-telemetry-pipelines"><span style="font-weight: 400;">reducing Datadog spend</span></a><span style="font-weight: 400;">. Edge Delta also plays in the </span><a href="https://edgedelta.com/comparison/edge-delta-vs-cribl/"><span style="font-weight: 400;">telemetry pipeline space</span></a><span style="font-weight: 400;">.</span></p>
<p><span style="font-weight: 400;">So telemetry pipelines are definitely part of the solution, but we feel an active and intentional approach to log data management is needed, with a specific focus on, in effect, hierarchical storage, where data can be stored in cheaper blob storage, but also quickly rehydrated and made available for querying. The key point here is that log storage is expensive. Every enterprise or SaaS company we talk to feels that pain. Which is where log data management comes in.</span></p>
<p><span style="font-weight: 400;">Disclosure: Splunk, Cribl, Chronosphere, Control Theory, Honeycomb, AWS, Microsoft (Azure), and Google Cloud are RedMonk clients. </span></p>
]]></content:encoded>
					
					<wfw:commentRss>https://redmonk.com/jgovernor/why-log-data-management-is-a-thing/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5373</post-id>	</item>
		<item>
		<title>Java relevance in the AI era &#8211; agent frameworks emerge.</title>
		<link>https://redmonk.com/jgovernor/java-relevance-in-the-ai-era-agent-frameworks-emerge/</link>
					<comments>https://redmonk.com/jgovernor/java-relevance-in-the-ai-era-agent-frameworks-emerge/#comments</comments>
		
		<dc:creator><![CDATA[James Governor]]></dc:creator>
		<pubDate>Mon, 27 Oct 2025 13:43:04 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://redmonk.com/jgovernor/?p=5370</guid>

					<description><![CDATA[I recently appeared as a guest on the Context Window podcast hosted by IBM’s Anant Jhingran and Ed Anuff. It inspired a couple of posts. This one is about skills relevance &#8211; namely Java and agents. Ed said that he had talked to some people that still flatly refuse to believe that AI works in]]></description>
										<content:encoded><![CDATA[<p><a href="http://redmonk.com/jgovernor/files/2025/10/embabel.jpg"><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-5371" src="http://redmonk.com/jgovernor/files/2025/10/embabel.jpg" alt="" width="800" height="800" srcset="https://redmonk.com/jgovernor/files/2025/10/embabel.jpg 800w, https://redmonk.com/jgovernor/files/2025/10/embabel-300x300.jpg 300w, https://redmonk.com/jgovernor/files/2025/10/embabel-150x150.jpg 150w, https://redmonk.com/jgovernor/files/2025/10/embabel-768x768.jpg 768w, https://redmonk.com/jgovernor/files/2025/10/embabel-480x480.jpg 480w, https://redmonk.com/jgovernor/files/2025/10/embabel-627x627.jpg 627w" sizes="auto, (max-width: 800px) 100vw, 800px" /></a></p>
<p><span style="font-weight: 400;">I recently appeared as a guest on the <a href="https://youtube.com/playlist?list=PLm-EPIkBI3YqXTgboKALGzNmGELWp_oTT&amp;si=ckfriDCyO7wwCCfT">Context Window podcast</a> hosted by IBM’s Anant Jhingran and Ed Anuff. It inspired a couple of posts. This one is about skills relevance &#8211; namely Java and agents.</span></p>
<p><span style="font-weight: 400;">Ed said that he had talked to some people that still flatly refuse to believe that AI works in any context. Perhaps they’ve had poor results, or couldn’t get something to work. My suggestion is to try again. The models and tools are getting better all the time. </span><span style="font-weight: 400;">We also talked about the sense that some people are saying you need to learn an entirely new stack in order to be relevant in the age of AI &#8211; trying to keep up with achingly cool kids. As someone that has tracked programming language adoption and usage for a long time this seems particularly wrong-headed to me.</span></p>
<p>Of course there is probably an underlying factor at work here &#8211; fear. People are naturally a little unnerved about the impact of AI on the software development jobs market. But this is where continued learning is so important. The swiftest path to irrelevance is to refuse to learn new skills and/or refresh the ones you already have.</p>
<p>So what about Java?</p>
<p><span style="font-weight: 400;">Sure new languages and frameworks continue to emerge, and move to dominance. Java is no longer a top three programming language (Python, JavaScript and TypeScript are all ahead). </span><span style="font-weight: 400;">But, and this is the important bit &#8211; that’s not to say your Java skills are not relevant. On the contrary &#8211; it’s highly likely they’re going to come into their own, particularly in enterprise contexts. </span><span style="font-weight: 400;">Don’t think OMG I am a Java developer, but now I need to learn Python because it’s the language of AI. Python may have overtaken Java in terms of the current industry conversation &#8211; it’s the language of machine learning and AI after all. Sure, OpenAI is a huge Python shop. Python is the language of frontier models, and the Python ecosystem of libraries is just an incredible industry asset. </span><span style="font-weight: 400;">But that doesn’t mean your Java skills aren’t relevant for developing apps that <em>use</em> models.</span></p>
<p><span style="font-weight: 400;">Java has incredible antibodies. Its ability to swallow and digest new innovation, to find new niches, is why it&#8217;s lasted so long in this industry and been so successful. Think about big data. For a while, people were saying, &#8220;Oh no, there&#8217;s no innovation in Java.&#8221; Big data came along, and sure enough, we saw frameworks like Hadoop and Spark written in Java and JVM languages. Java has maintained relevance through all of the waves that we&#8217;ve seen over the last couple of decades &#8211; it is the exemplar of a general purpose programming language and runtime. With the distributed systems and cloud revolution, so many of the applications and systems that were built, so much of the infrastructure, was built in Java, new languages like Go and Rust notwithstanding. The idea that somehow Java isn&#8217;t going to play well with AI doesn&#8217;t make any sense.</span></p>
<p>Let&#8217;s look at an interesting example of innovation in the space, and an antibody in his own right. <span style="font-weight: 400;">Rod Johnson founded the Spring project. Millions of developers around the world use Spring and Spring Boot every day. He’s now created </span><a href="https://medium.com/@springrod/embabel-a-new-agent-platform-for-the-jvm-1c83402e0014"><span style="font-weight: 400;">Embabel</span></a><span style="font-weight: 400;">, a strongly-typed agent framework written for the JVM. It&#8217;s designed to bring determinism to your project plan using a model that isn&#8217;t an LLM, before using autonomous agents to generate the code to map to that plan. Not everything is decided by LLM. A</span><span style="font-weight: 400;">ccording to Rod:</span></p>
<blockquote><p><span style="font-weight: 400;">The critical adjacency for building business apps with LLMs is existing business logic and infrastructure. And the critical skill set is building sophisticated business applications. In both these areas, the JVM is far ahead of Python and likely to remain so.</span></p></blockquote>
<p><span style="font-weight: 400;">Honestly, this checks out. Embabel </span><span style="font-weight: 400;">is an enterprise play, and one where Java developers&#8217; skills are on point. Spring has proven itself for business logic, systems that are built to last, event-driven systems, transaction systems and so on. Adjacency is a thing. Rod again:</span></p>
<blockquote><p>If you’re a Spring developer, you’ll find building agents with Embabel to be as natural as building a Spring MVC REST interface.</p></blockquote>
<p><span style="font-weight: 400;">On the LLM side, folks might just be thinking surely Model Context Protocol (MCP) is all you need? The hype might make it seem so, but security concerns around the standard mean the answer is almost certainly not. MCP became an industry standard remarkably quickly &#8211; arguably too quickly. MCP became the new Hello World for every enterprise technology vendor, but there is still a great deal of work to do.</span></p>
<p><span style="font-weight: 400;">Here’s what Rod believes MCP lacks:</span></p>
<ul>
<li style="font-weight: 400;" aria-level="1">
<blockquote><p><span style="font-weight: 400;">Explainability: Why were choices made in solving a problem?</span></p></blockquote>
</li>
<li style="font-weight: 400;" aria-level="1">
<blockquote><p><span style="font-weight: 400;">Discoverability: MCP skirts this important problem. How do we find the right tools at each point, and ensure that models aren’t confused in choosing between them?</span></p></blockquote>
</li>
<li style="font-weight: 400;" aria-level="1">
<blockquote><p><span style="font-weight: 400;">Ability to mix models, so that we are not reliant on God models but can use local, cheaper, private models for many tasks</span></p></blockquote>
</li>
<li style="font-weight: 400;" aria-level="1">
<blockquote><p><span style="font-weight: 400;">Ability to inject guardrails at any point in a flow</span></p></blockquote>
</li>
<li style="font-weight: 400;" aria-level="1">
<blockquote><p><span style="font-weight: 400;">Ability to manage flow execution and introduce greater resilience</span></p></blockquote>
</li>
<li style="font-weight: 400;" aria-level="1">
<blockquote><p><span style="font-weight: 400;">Composability of flows at scale. We’ll soon be seeing not just agents running on one system, but federations of agents.</span></p></blockquote>
</li>
<li style="font-weight: 400;" aria-level="1">
<blockquote><p><span style="font-weight: 400;">Safer integration with sensitive existing systems such as databases, where it is dangerous to allow even the best LLM write access.</span></p></blockquote>
</li>
</ul>
<p><span style="font-weight: 400;">The last point really is absolutely critical. </span></p>
<p><span style="font-weight: 400;">Rod is building in Kotlin, which is an interesting design choice. His argument is that Java can do a better job than Python agent frameworks like <a href="https://www.crewai.com/">crew.ai</a>. The proof will be in adoption, so this is a project I will be tracking with interest.</span></p>
<p>Another open source project to mention is <a href="https://github.com/langchain4j/langchain4j">LangChain4J</a>, supported by vendors including Red Hat and Microsoft. Dmytro Liubarskyi, founder and project lead, <a href="https://devblogs.microsoft.com/java/microsoft-and-langchain4j-a-partnership-for-secure-enterprise-grade-java-ai-applications/#:~:text=Dmytro%20Liubarskyi%2C%20founder%20and%20project,%2C%20scalability%2C%20or%20developer%20experience.">says</a>:</p>
<blockquote><p>“Our goal with LangChain4j has always been to make advanced AI capabilities easily accessible to Java developers — without compromising on security, scalability, or developer experience.&#8221;</p></blockquote>
<p>LangChain4J is designed to allow Java developers to easily work with LLMs, vector stores and agents. It also has a set of Kotlin extensions.</p>
<p>Meanwhile Jetbrains has built a Kotlin-based agent framework called <a href="https://docs.koog.ai/">Koog</a> &#8211; &#8220;an idiomatic, type-safe Kotlin DSL designed specifically for JVM and Kotlin developers&#8221;.</p>
<p>It&#8217;s also worth checking out this <a href="https://www.the-main-thread.com/p/java-langchain4j-ai-enterprise">post by Marcus Eisele</a> for further context.</p>
<blockquote><p>What makes LangChain4j attractive to architects is its alignment with established enterprise patterns. The framework offers unified interfaces for chat models (OpenAI, Anthropic, Azure, Google Gemini), embeddings, and vector stores.</p>
<p>Developers declare @AiService interfaces, similar in feel to REST controllers, and annotate methods with @UserMessage, @SystemMessage, or @Tool to define prompts and expose domain logic. This design keeps interactions type-safe, composable, and predictable.</p>
<p>LangChain4j also moves beyond simple model calls. Tool calling allows LLMs to invoke Java methods directly. This controlled bridging between models and systems turns generative AI into a first-class part of enterprise logic.</p></blockquote>
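<p><span style="font-weight: 400;">To show why this declarative, type-safe style feels natural to Java developers, here is a toy re-creation of the pattern &#8211; an annotated interface backed by a dynamic proxy that turns method calls into prompts. To be clear, this is not LangChain4j's implementation: the annotation, class and placeholder names below are invented for illustration, and the "model" is a pluggable stub.</span></p>

```java
import java.lang.annotation.*;
import java.lang.reflect.*;
import java.util.function.*;

// Toy sketch of the declarative AI-service pattern described above.
// NOT LangChain4j itself: all names here are hypothetical.
public class AiServiceSketch {
    @Retention(RetentionPolicy.RUNTIME)
    @interface UserMessage { String value(); }

    // The developer declares an interface; prompts live in annotations.
    interface Assistant {
        @UserMessage("Summarise: {it}")
        String summarise(String text);
    }

    // A dynamic proxy turns each annotated method call into a prompt
    // and hands it to a pluggable "model" (any String -> String function).
    @SuppressWarnings("unchecked")
    static <T> T create(Class<T> iface, Function<String, String> model) {
        return (T) Proxy.newProxyInstance(iface.getClassLoader(), new Class<?>[]{iface},
                (proxy, method, args) -> {
                    String template = method.getAnnotation(UserMessage.class).value();
                    String prompt = template.replace("{it}", (String) args[0]);
                    return model.apply(prompt);
                });
    }

    public static void main(String[] args) {
        // Stub model: echoes the prompt so we can see the plumbing.
        Assistant a = create(Assistant.class, prompt -> "[model saw] " + prompt);
        System.out.println(a.summarise("logs are expensive"));
        // → [model saw] Summarise: logs are expensive
    }
}
```

<p><span style="font-weight: 400;">The point of the sketch is the ergonomics: calls stay type-safe and testable, and the model behind the interface can be swapped without touching the calling code.</span></p>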
<p>It&#8217;s those Java and JVM antibodies at work. <span style="font-weight: 400;">I mean, sure, by all means learn some Python and/or some TypeScript. TypeScript is the current language of the day in building dev tools. It&#8217;s really exciting how much innovation is happening there, TypeScript is exploding &#8211; it seems partly because AI is generating so much of the code written today. If you&#8217;re building dev tools or modern dev infrastructure, chances are high that you&#8217;re building in TypeScript. But one can&#8217;t learn every new thing that comes along. </span></p>
<p><span style="font-weight: 400;">Of course questions will always remain. For example &#8211; what’s the future for Spring now it’s owned by Broadcom? But there is plenty of innovation out there in Java frameworks such as Quarkus (that team is currently <a href="https://quarkus.io/blog/quarkus-meets-langchain4j/">working on</a> a Langchain4j extension). Oracle remains a solid steward of the core language. IBM and Red Hat continue to invest in Java, and aren’t going to give up on the AI, LLM and agent opportunity lightly. </span></p>
<p>According to my colleague Dr Kate Holterhoff <a href="https://redmonk.com/kholterhoff/2025/09/17/java-25-oracle-is-cool-again/?utm_source=redmonk&amp;utm_medium=email&amp;utm_campaign=redmonk-october-2025-update">Java is cool again</a>. Anthropic and IBM are partnering &#8211; great post by my colleague Stephen O&#8217;Grady about the implications <a href="https://redmonk.com/sogrady/2025/10/08/enterprise-ai-market/?utm_source=redmonk&amp;utm_medium=email&amp;utm_campaign=redmonk-october-2025-update">here</a>&#8211; and I believe that definitely means Java modernisation and integration with LLMs.  IBM&#8217;s Project Bob IDE is explicitly being pitched for Java modernisation, with a focus on enterprise security when using agents. If you’re a Java programmer, you may be in better shape than you thought for building AI-enabled apps, integrating agents into your workflows.</p>
<p>&nbsp;</p>
<p>Bit of a bonus update here. After I posted this on LinkedIn, Tyler Jewell, CEO of Akka, commented that I could have mentioned the company in this post. I think this is fair, given Akka literally pivoted from its historical language and framework roots to focus squarely on AI agent workflows. Akka was originally a framework written in the JVM-based Scala language for building high performance, concurrent distributed systems. Now the platform is positioned as a safe, secure, agentic AI platform. According to Jewell:</p>
<blockquote><p>Agents are unreliable:<br />
&#8211; complexity with agents, memory, orchestration, streaming, endpoints, APIs, tools, integration, stochastic LLMs &#8230; all now running in a distributed system.<br />
&#8211; distrust from unreliable systems, limited agent security protocols, lack of agent identity, transparency and explainability of LLM interactions, inconsistent outputs, and new AI security threats.<br />
&#8211; shadow costs that extend beyond LLM fees as agentic systems require constant maintenance, integration with feedback loops, and continuous governance.</p>
<p>Java and the JVM is well suited to overcoming these complexity, trust, and cost issues.</p></blockquote>
<p>&nbsp;</p>
<p><span style="font-weight: 400;">disclosure statement: IBM, Red Hat, Microsoft, Oracle and Broadcom are all RedMonk clients.  </span></p>
]]></content:encoded>
					
					<wfw:commentRss>https://redmonk.com/jgovernor/java-relevance-in-the-ai-era-agent-frameworks-emerge/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5370</post-id>	</item>
		<item>
		<title>Some people that will make you smarter about the practical uses of AI</title>
		<link>https://redmonk.com/jgovernor/5367/</link>
					<comments>https://redmonk.com/jgovernor/5367/#respond</comments>
		
		<dc:creator><![CDATA[James Governor]]></dc:creator>
		<pubDate>Fri, 24 Oct 2025 13:47:19 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://redmonk.com/jgovernor/?p=5367</guid>

					<description><![CDATA[I recently guested on the Context Window podcast hosted by IBM’s Anant Jhingran and Ed Anuff. It was fun. I usually try to bring some pragmatism to a conversation, and this was no exception. One message I really want to get across to people is that your skills are still relevant. AI is disruptive, but]]></description>
										<content:encoded><![CDATA[<p><a href="http://redmonk.com/jgovernor/files/2025/10/fat-snake-scaled.jpeg"><img loading="lazy" decoding="async" class="aligncenter size-large wp-image-5368" src="http://redmonk.com/jgovernor/files/2025/10/fat-snake-1024x683.jpeg" alt="snake digesting a large meal" width="1024" height="683" srcset="https://redmonk.com/jgovernor/files/2025/10/fat-snake-1024x683.jpeg 1024w, https://redmonk.com/jgovernor/files/2025/10/fat-snake-300x200.jpeg 300w, https://redmonk.com/jgovernor/files/2025/10/fat-snake-768x512.jpeg 768w, https://redmonk.com/jgovernor/files/2025/10/fat-snake-1536x1024.jpeg 1536w, https://redmonk.com/jgovernor/files/2025/10/fat-snake-2048x1365.jpeg 2048w, https://redmonk.com/jgovernor/files/2025/10/fat-snake-480x320.jpeg 480w, https://redmonk.com/jgovernor/files/2025/10/fat-snake-941x627.jpeg 941w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></a></p>
<p><span style="font-weight: 400;">I recently guested on the </span><a href="https://youtube.com/playlist?list=PLm-EPIkBI3YqXTgboKALGzNmGELWp_oTT&amp;si=ckfriDCyO7wwCCfT"><span style="font-weight: 400;">Context Window podcast </span></a><span style="font-weight: 400;">hosted by IBM’s Anant Jhingran and Ed Anuff. It was fun. I usually try to bring some pragmatism to a conversation, and this was no exception. One message I really want to get across to people is that your skills are still relevant. AI is disruptive, but that doesn’t mean it won’t create great opportunities for you.</span></p>
<p><span style="font-weight: 400;">Ed asked me about the challenge of keeping up with everything happening in the industry right now, which reminds us of the JavaScript framework wars &#8211; oh my god I didn&#8217;t read Hacker News this morning, how will I even be able to write code?</span></p>
<p><span style="font-weight: 400;">He asked: “You’re a dev so how do you not lose your mind? What’s the path to sanity?”</span></p>
<p><span style="font-weight: 400;">Now it’s literally my job to know all of that stuff &#8211; OMG there is a new model. A new thing dropped. A widget, a platform, a frontier model, a startup. We&#8217;re in the midst of a builder frenzy.</span></p>
<p><span style="font-weight: 400;">I said that, as ever, people are a big part of the answer. </span><span style="font-weight: 400;">You need some folks to help digest all of this stuff for you (<a href="https://redmonk.com/rstephens/2025/09/25/development-productivity/">snakes swallowing elephants</a>). Yes AI wants to do it (&#8220;let me summarise that for you&#8221;), but it turns out humans are really good at digesting, understanding and summarising stuff for you. Who knew?</span></p>
<p><span style="font-weight: 400;">So yeah, who&#8217;s gonna help you navigate the thicket? You can’t spend all of your time trying to work out what’s really important to your day job, or your side project. </span><span style="font-weight: 400;">There are some amazing summarisers, so I would say listen to them, and learn from them, to help you navigate this incredibly fast moving space, without being worried about that pace of innovation. </span></p>
<p><span style="font-weight: 400;">Don’t try and follow everyone. Don’t spend all your time on X worrying about the latest launch you missed. Don’t be stressed about it. But by all means use AI in your daily work. If you’ve tried it before and been disappointed then I recommend you try it again. Generated code doesn’t all have six or four fingers any more.</span></p>
<p><span style="font-weight: 400;">So here are some people who can make your life better, by doing the work.</span></p>
<p><strong><a href="https://angiejones.tech/blog/">Angie Jones</a> </strong><span style="font-weight: 400;">was originally a Java developer and is now a world class developer relations practitioner, technologist and educator. Today she runs dev rel at Block and she’s helping to build momentum around the company’s </span><a href="https://github.com/block/goose"><span style="font-weight: 400;">Goose AI agent</span></a><span style="font-weight: 400;">. The work Angie and her colleague </span><a href="https://x.com/blackgirlbytes"><span style="font-weight: 400;">Rizel Scarlett</span></a><span style="font-weight: 400;"> are doing is practical, with applications of AI to software development. Angie is also helping to make it abundantly clear that your existing skills are relevant. Her recent post about </span><a href="https://angiejones.tech/how-devrel-is-leading-ai-adoption/"><span style="font-weight: 400;">how DevRel is leading AI adoption</span></a><span style="font-weight: 400;"> is thought-provoking: it turns out that when everything is changing all the time, as it is in AI, teaching and samples and proofs of concept are more important than ever. Her team is doing industry defining work. Dev rel is having a renaissance because AI is all about education. So follow Angie!</span></p>
<p><span style="font-weight: 400;"><strong><a href="https://simonwillison.net/">Simon Willison</a></strong> &#8211; every time I give a talk about AI in any context I tell people to pay attention to him</span><span style="font-weight: 400;">. He is from the Python world, but endlessly curious about seemingly everything AI-related. Brilliant communicator. If there is one thing you read every day, go and check out what Simon Willison has to say. He will help you understand which technologies are ready for prime time and what’s happening across the industry.</span></p>
<p><span style="font-weight: 400;"><strong>Claire Vo</strong> &#8211; CEO and founder of </span><a href="https://www.chatprd.ai/"><span style="font-weight: 400;">ChatPRD</span></a><span style="font-weight: 400;">. Amazing communicator about the value of AI tools from a business perspective. She is building a product management platform using all the latest models and AI tools. Follow Claire, you&#8217;ll get an entertaining, positive view of what works and what doesn’t. So get yourself over to her YouTube channel </span><a href="https://www.youtube.com/@howiaipodcast"><span style="font-weight: 400;">How I AI</span></a><span style="font-weight: 400;">.</span></p>
<p><span style="font-weight: 400;"><strong>Jesse Vincent</strong> &#8211; best known for his work in the Perl community, where he was project lead. Also creator of the K-9 Mail email app for Android, which was acquired by Mozilla and rebranded as Thunderbird for Android. Anyway he’s doing excellent work helping people understand things like Claude Code Skills. Very practical, great writing. Read his blog </span><a href="https://blog.fsck.com/"><span style="font-weight: 400;">Massively Parallel Procrastination</span></a><span style="font-weight: 400;">. This piece, for example, is a good read &#8211; </span><a href="https://blog.fsck.com/2025/10/19/mcps-are-not-like-other-apis/"><span style="font-weight: 400;">When it comes to MCPs, everything we know about API design is wrong</span></a><span style="font-weight: 400;">. </span></p>
<blockquote><p><span style="font-weight: 400;">“This might be a hard lesson to hear, but tools you build for LLMs are going to work much, much better if you think of your end-user as a &#8220;person&#8221; rather than a computer. Build your tools like they&#8217;re a set of scripts you&#8217;re handing to that undertrained kid who just got hired in the NOC. They are going to page you at 2AM when they can&#8217;t figure out what&#8217;s going on or when they misuse the tools in a way they can&#8217;t unwind.</span></p>
<p><span style="font-weight: 400;">Names and method descriptions matter far more than they ever have before.”</span></p></blockquote>
<p><span style="font-weight: 400;"><strong>Richard Seroter</strong> &#8211; he doesn’t just take a Googley view of the world. It’s about education. He takes a practitioner-based view and does a great job of keeping track of the latest developments, models and tools. His </span><a href="https://seroter.com/category/daily-reading-list/"><span style="font-weight: 400;">daily reading list</span></a><span style="font-weight: 400;"> is very useful.</span></p>
<p><span style="font-weight: 400;">And finally </span><strong><a href="https://redmonk.com/rstephens/">Rachel Stephens</a></strong><span style="font-weight: 400;"> and </span><strong><a href="https://redmonk.com/kholterhoff/">Kate Holterhoff</a></strong><span style="font-weight: 400;">, my colleagues &#8211; are doing great work, using the tools and helping people understand them.</span></p>
<p><span style="font-weight: 400;">Your skills are relevant. There are great educators out there that are willing and ready to help you. If you’re struggling, then go on the internet and they will help you. You don’t even need to ask ChatGPT. </span><span style="font-weight: 400;">I think it’s really exciting. To be honest, as this bubble went into super expansion in 2023, I got slightly worried &#8211; I am a developers and practitioners matter guy, and did I really have another revolution in me? The answer is yes I do, and I think you probably do too. It’s reassuring to know there are people ready and willing to help.</span></p>
<p>disclosure statement &#8211; IBM and Google are both clients.</p>
<p>&nbsp;</p>
]]></content:encoded>
					
					<wfw:commentRss>https://redmonk.com/jgovernor/5367/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5367</post-id>	</item>
		<item>
		<title>Some thoughts on &#8220;Agentic DevOps&#8221;, AIOps, and Vibe Coding. With Gene Kim and Nicole Forsgren.</title>
		<link>https://redmonk.com/jgovernor/some-thoughts-on-agentic-devops-aiops-and-vibe-coding-with-gene-kim-and-nicole-forsgren/</link>
					<comments>https://redmonk.com/jgovernor/some-thoughts-on-agentic-devops-aiops-and-vibe-coding-with-gene-kim-and-nicole-forsgren/#respond</comments>
		
		<dc:creator><![CDATA[James Governor]]></dc:creator>
		<pubDate>Fri, 25 Jul 2025 12:44:10 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://redmonk.com/jgovernor/?p=5363</guid>

					<description><![CDATA[A couple of months back Microsoft introduced the term &#8220;Agentic DevOps&#8221; at its Build conference (my full write up about the show is here). I thought it was interesting. I mean obviously, AI is having a profound impact on how we build and manage applications and services. Generative AI is driving the cost of creating]]></description>
										<content:encoded><![CDATA[<p><a href="http://redmonk.com/jgovernor/files/2025/07/agentic-devops.png"><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-5365" src="http://redmonk.com/jgovernor/files/2025/07/agentic-devops.png" alt="" width="512" height="512" srcset="https://redmonk.com/jgovernor/files/2025/07/agentic-devops.png 512w, https://redmonk.com/jgovernor/files/2025/07/agentic-devops-300x300.png 300w, https://redmonk.com/jgovernor/files/2025/07/agentic-devops-150x150.png 150w, https://redmonk.com/jgovernor/files/2025/07/agentic-devops-480x480.png 480w" sizes="auto, (max-width: 512px) 100vw, 512px" /></a></p>
<p>A couple of months back Microsoft introduced the term &#8220;Agentic DevOps&#8221; at its Build conference (my full write up about the show is <a href="https://redmonk.com/jgovernor/2025/06/20/microsoft-build-2025-agents-models-github-and-beast-mode-windows/">here</a>). I thought it was interesting. I mean obviously, AI is having a profound impact on how we build and manage applications and services. Generative AI is driving the cost of creating software to near zero, and potentially increasing development velocity, and yet&#8230; the jury is still out. A lot more software means a lot more technical debt. Writing software is the easy bit &#8211; running and maintaining it &#8211; that&#8217;s where the costs are. And that&#8217;s where DevOps comes in. DevOps was intended to break down distinctions between development and operations, with a focus on automating and streamlining the entire software development lifecycle, from dev and test to deployment and operations. The idea reached a peak with the mantra &#8211; you build it, you run it. But with agents taking on more tasks, questions of responsibility are blurring. It makes sense that Agentic DevOps would be about using agents to take the automation aspects of DevOps to the next level, but in collaboration it also has obvious implications &#8211; because with modern coding practices we&#8217;re now increasingly collaborating with machines, raising a whole set of questions about specification, prompting, asynchronous workflows, and so on.</p>
<p>One of the intriguing aspects of generative AI is just how willing software developers seemingly are to jettison the last few decades of engineering best practices. The talk now is of One Shot, Vibe Coding, You Only Live Once (YOLO). It seems to me that Agentic DevOps is more about FAFO than YOLO. We&#8217;ve seen amazing horror stories emerge lately about just how cavalier folks are becoming in the gen AI era &#8211; oh yeah I&#8217;m giving Claude full access to my production database and asking it to change up some functions in the identity management system. One thing DevOps should never be is cavalier.</p>
<p>But yes &#8211; FAFO. This is the age of experimentation. We&#8217;ll be spinning things up, seeing how they work, improving them rapidly, and iterating. Incident management is one of the DevOps principles where agents can potentially provide huge benefits. If using agents means getting paged less, and not being woken up in the middle of the night, that has to be a good thing, right? And folks are indeed using agents in this way. Event-driven agentic systems will be at the heart of Agentic DevOps. Let the agent analyse performance and watch for anomalies. Let the agent suggest a fix. As we gain confidence, let the agent make the fix, or simply roll back the change that led to performance degradation.</p>
<p>At this point I want to talk a little bit about AIOps, a now somewhat discredited term. AIOps was a term that came out of the monitoring space, and it meant using machine learning to improve the state of the art in some functions &#8211; particularly anomaly detection. While rudimentary automation was expected, AIOps was specifically focused on monitoring and logging. Developers and practitioners didn&#8217;t like the term, and the company that coined the term, Gartner, has recently retired it in favour of &#8220;Event intelligence&#8221;. Some enterprise organisations have literally banned use of the term. So yeah &#8220;AIOps&#8221; is kind of cooked.</p>
<p>And yet&#8230; it seems to me that Agentic DevOps is really what AIOps always should have been, and perhaps would have been, if we&#8217;d had the AI technology available then that we do today. Remember that ChatGPT exploded onto the scene in 2022, and the rate of innovation across the industry has been frankly bonkers since then. We don&#8217;t fully understand the capabilities of new Frontier models as they are rolled out, but they are really really good at some of the things that we&#8217;ve needed for effective DevOps. Creating documentation of code and practices, creating runbooks, generating scripts, monitoring&#8230; monitoring systems for changes, writing plans, writing tests. Collaborating with operators, developers and users in natural languages. More detailed bug reporting (no more need to try and understand what&#8217;s happening in that screenshot, upload it and the system understands already), reading and parsing logs. Centralising and summarising information. And obviously agents don&#8217;t need to sleep, which is kind of handy in DevOps scenarios.</p>
<p>One of the areas I am particularly interested in given my work in Progressive Delivery is how generative AI impacts what AIOps was in the 2016-2024 sense &#8211; which is to say, AI used in Observability use cases. The days of trying to parse logs by hand &#8211; maybe they are indeed behind us. Generative AI is going to fundamentally remake monitoring, logging and tracing. Observability plus AI is a whole new frontier. Imagine not needing to learn a new query language in order to interact with system data.</p>
<p>Anyway &#8211; at Build I was lucky enough to spend some time with Gene Kim and Doctor Nicole Forsgren talking about these issues. Gene literally wrote <a href="https://www.amazon.co.uk/Devops-Handbook-World-Class-Reliability-Organizations/dp/1942788002">the book on DevOps</a> and is currently writing a book about <a href="https://itrevolution.com/product/vibe-coding-book/">Vibe Coding</a> with Steve Yegge, which is sure to be essential reading &#8211; both of them are really good at writing prose that draws you in, to explain complex subjects. Kim is now a Vibe Coding maximalist &#8211; he believes the days of hand-written code are behind us.</p>
<p>Forsgren has made a huge impact on the industry through her work as a co-creator of the widely adopted DORA and SPACE frameworks, and co-author of Accelerate and The DevOps Handbook, alongside Kim. Naturally she is also working on a <a href="https://developerexperiencebook.com/">new book &#8211; about Developer Experience</a>. She&#8217;s a little more pragmatic about people writing code.</p>
<p>Anyway &#8211; I can&#8217;t think of two smarter and more appropriate people to be in conversation about the future of DevOps in the agentic era.</p>
<p>I will end this post with a couple of quotes. First Gene Kim:</p>
<blockquote><p>Agentic DevOps is a significant part of the way we all code now. Because no one should have to type in code by hand anymore. And the person who actually coined that was Dr. Erik Meijer. So famous in this community for his work on Visual Basic, C Sharp. He went on to develop the Hack language at Facebook Meta. And he said, yeah, we are probably the last generation of developers who will write code by hand.</p>
<p>And I just thought that just so spoke to me. And, yeah, so agentic is when you can actually show the LLM agent the output of his work and it can fix it for you.</p>
<p>And, you know, from my perspective, I mean, it’s just utterly transformative. I have an experience where I spent 45 minutes being bossed around by an LLM telling me to type this, type that. And, you know, the aha moment was asking it to run curl by itself. Right? And 45 seconds later, it fixed the issue with the Trello API that I was struggling with in 45 seconds. The same thing with Google Docs. I mean, so it’s just like once you see something like that happening it just becomes so obvious, you know, that there are some things that you just shouldn’t be in the loop. Because the only thing you are now is the bottleneck, copying and pasting from one window to another. So, and it’s not just for developers, it’s for everybody. Technologists, for operations, infrastructure, DevOps, all of that. So, I don’t think we want to say that the only people who benefit from this are just developers because as someone famous said, who broke my build is going to be said even more often.&#8221;</p></blockquote>
<p>And Nicole Forsgren:</p>
<blockquote><p>We’ve really spent a lot of time focusing on writing code and how AI and LLMs and agents, agents can kind of help us bootstrap and amplify and accelerate that, and I think there’s also a ton of opportunity, not just in the inner loop, but all the way through the outer loop. Absolutely. What do we think about the opportunity for AI? Especially now we have agents to improve local test and build, to improve our pull request and code review process, to improve build and integration, to improve things like release. So, for me, I think that’s, when I think about agentic DevOps, that’s kind of what it is. The foundational principles are still really important, especially now that the speed and volume of creation has sped up so much. Now the rest of that software development loop is even more important.</p></blockquote>
<p>You can find the whole transcript of the interview <a href="https://redmonk.com/videos/a-redmonk-conversation-introducing-agentic-devops/">here</a>, and the video is embedded below.</p>
<p><iframe class='youtube-player' width='640' height='360' src='https://www.youtube.com/embed/QTDJrvQwIrs?version=3&#038;rel=1&#038;showsearch=0&#038;showinfo=1&#038;iv_load_policy=1&#038;fs=1&#038;hl=en-US&#038;autohide=2&#038;wmode=transparent' allowfullscreen='true' style='border:0;' sandbox='allow-scripts allow-same-origin allow-popups allow-presentation'></iframe></p>
<p>Microsoft is a client, and sponsored the video.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://redmonk.com/jgovernor/some-thoughts-on-agentic-devops-aiops-and-vibe-coding-with-gene-kim-and-nicole-forsgren/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5363</post-id>	</item>
		<item>
		<title>Chainguard builds a market, everyone else wants in.</title>
		<link>https://redmonk.com/jgovernor/chainguard-builds-a-market-everyone-else-wants-in/</link>
					<comments>https://redmonk.com/jgovernor/chainguard-builds-a-market-everyone-else-wants-in/#respond</comments>
		
		<dc:creator><![CDATA[James Governor]]></dc:creator>
		<pubDate>Fri, 18 Jul 2025 18:22:52 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://redmonk.com/jgovernor/?p=5360</guid>

					<description><![CDATA[talking to Chainguard. dear lord they&#8217;ve apparently established a license to print money. they&#8217;re in very good shape. huge amount of progress in the last 18 months The above quote is from me posting on linkedin a few months back. Chainguard essentially built a market from scratch for Secure Hardened Container Images with guarantees against]]></description>
										<content:encoded><![CDATA[<p><a href="http://redmonk.com/jgovernor/files/2025/07/AdobeStock_1226585929-scaled.jpeg"><img loading="lazy" decoding="async" class="aligncenter size-large wp-image-5361" src="http://redmonk.com/jgovernor/files/2025/07/AdobeStock_1226585929-1024x645.jpeg" alt="image of a yellow container, with a protective lead seal " width="1024" height="645" srcset="https://redmonk.com/jgovernor/files/2025/07/AdobeStock_1226585929-1024x645.jpeg 1024w, https://redmonk.com/jgovernor/files/2025/07/AdobeStock_1226585929-300x189.jpeg 300w, https://redmonk.com/jgovernor/files/2025/07/AdobeStock_1226585929-768x484.jpeg 768w, https://redmonk.com/jgovernor/files/2025/07/AdobeStock_1226585929-1536x967.jpeg 1536w, https://redmonk.com/jgovernor/files/2025/07/AdobeStock_1226585929-2048x1289.jpeg 2048w, https://redmonk.com/jgovernor/files/2025/07/AdobeStock_1226585929-480x302.jpeg 480w, https://redmonk.com/jgovernor/files/2025/07/AdobeStock_1226585929-996x627.jpeg 996w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></a></p>
<blockquote><p>talking to Chainguard. dear lord they&#8217;ve apparently established a license to print money. they&#8217;re in very good shape. huge amount of progress in the last 18 months</p></blockquote>
<p>The above quote is from <a href="https://www.linkedin.com/posts/jamesgovernor_talking-to-chainguard-dear-lord-theyve-activity-7298397561340452864-gqtS?utm_source=share&amp;utm_medium=member_desktop&amp;rcm=ACoAAAABr9QBdHS_yhvAWPb_5xy6L_IJyjICTr4">me posting on linkedin</a> a few months back. Chainguard essentially built a market from scratch for Secure Hardened Container Images with guarantees against Common Vulnerabilities and Exposures (CVEs), hitting the knee in the curve last year, with a huge growth in customer logos, and an associated revenue burst. We have rarely seen competitors coalesce around an opportunity so quickly &#8211; Docker, SUSE, <a href="http://root.io/" target="_blank" rel="noopener noreferrer">Root.io</a>, and RapidFort are all pushing hard to win market share against the newly minted market incumbent. Replicated is pivoting into the space with <a href="https://www.replicated.com/blog/introducing-securebuild">SecureBuild</a>. The latest company to join the fray is perhaps surprising &#8211; Wiz just announced <a href="https://www.wiz.io/blog/introducing-wizos-hardened-near-zero-cve-base-images">WizOS</a>, a hardened Linux distro with its own build pipeline and security model. Cloud security vendor goes after developer build security, an adjacency.</p>
<p>When Chainguard was founded in 2021, it initially focused on securing the software supply chain by improving SBOM (Software Bill of Materials) generation, signing, and verification. The company&#8217;s early offerings were built around software provenance tools &#8211; notably Sigstore and Cosign. The real need, however, was for secure-by-default software, and that&#8217;s the approach it now takes, with its own container-native &#8220;un-distro&#8221; <a href="https://www.chainguard.dev/unchained/introducing-wolfi-the-first-linux-un-distro-designed-for-securing-the-software-supply-chain">Wolfi</a>. Images are minimal and hardened, including only what&#8217;s necessary (for example, they don&#8217;t include a shell, package manager, or unnecessary binaries), reducing the attack surface.</p>
<p>Linux distributions of the pre-cloud era were not designed for the kind of rapid change we see in software development today, with packages being downloaded and used in new software builds on an ongoing basis. Containers changed the game. The need for a more real-time approach to updating images maps to today&#8217;s software delivery lifecycle requirements. Organisations want to feel confident their developers are not using packages with known CVEs. Playing patching whack-a-mole sucks, and it&#8217;s bad for business. That&#8217;s the opportunity.</p>
<p>There&#8217;s going to be even more competition, which should be good for customers, and good for secure software supply chains everywhere. So far the claim from these new market entrants has mostly been that Chainguard is expensive, but they&#8217;re going to need to sharpen their attacks and do better from a product management perspective to really cut through. For example, Chainguard now supports virtual machines as well as containers. That said, platform incumbency is a huge advantage, so you can&#8217;t write off competitors.</p>
<p>If you&#8217;re an organisation using containers, or just shipping regularly, you really need to consider a Hardened Images platform. They can be a core foundation for better, faster, more secure, software delivery. Security is actually shifted left, because developers are using trusted images. Move fast and trust things.</p>
<p>This post certainly isn&#8217;t intended as a thorough comparison of market competitors, but was triggered by the news about WizOS, which, again, surprised me. The market opportunity is very real.</p>
<p>disclosure: Chainguard and Docker are both clients.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://redmonk.com/jgovernor/chainguard-builds-a-market-everyone-else-wants-in/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5360</post-id>	</item>
		<item>
		<title>Giants awaken. Google Cloud GeminiCLI, AWS Kiro, developer experience and the need to ship and keep shipping</title>
		<link>https://redmonk.com/jgovernor/giants-awaken-google-geminicli-aws-kiro-developer-experience-and-the-need-to-ship-and-keep-shipping/</link>
					<comments>https://redmonk.com/jgovernor/giants-awaken-google-geminicli-aws-kiro-developer-experience-and-the-need-to-ship-and-keep-shipping/#respond</comments>
		
		<dc:creator><![CDATA[James Governor]]></dc:creator>
		<pubDate>Fri, 18 Jul 2025 17:39:06 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://redmonk.com/jgovernor/?p=5357</guid>

					<description><![CDATA[During this period of AI-driven cosmological inflation in tech there is a market premium on shipping. You’re either shipping or you’re being left behind. S-curves and exponential growth are the order of the day. Lovable is a unicorn eight months after launch. In June 2025, Replit CEO Amjad Masad announced that his company had crossed]]></description>
										<content:encoded><![CDATA[<p><a href="http://redmonk.com/jgovernor/files/2025/07/bigbang-scaled.png"><img loading="lazy" decoding="async" class="aligncenter size-large wp-image-5358" src="http://redmonk.com/jgovernor/files/2025/07/bigbang-1024x683.png" alt="" width="1024" height="683" srcset="https://redmonk.com/jgovernor/files/2025/07/bigbang-1024x683.png 1024w, https://redmonk.com/jgovernor/files/2025/07/bigbang-300x200.png 300w, https://redmonk.com/jgovernor/files/2025/07/bigbang-768x512.png 768w, https://redmonk.com/jgovernor/files/2025/07/bigbang-1536x1024.png 1536w, https://redmonk.com/jgovernor/files/2025/07/bigbang-2048x1365.png 2048w, https://redmonk.com/jgovernor/files/2025/07/bigbang-480x320.png 480w, https://redmonk.com/jgovernor/files/2025/07/bigbang-941x627.png 941w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></a></p>
<p><span style="font-weight: 400;">During this period of AI-driven cosmological inflation in tech there is a market premium on shipping. You’re either shipping or you’re being left behind. S-curves and exponential growth are the order of the day. Lovable is a unicorn eight months after launch. In June 2025, Replit CEO Amjad Masad announced that his company had crossed $100M ARR, up from just $10M at the end of 2024. </span></p>
<p><span style="font-weight: 400;">Aggression, developer experience and shipping product are everything right now. As an industry analyst it’s hard to keep up with the pace of innovation. As I have said &#8211; </span><a href="https://redmonk.com/jgovernor/2025/02/21/ai-disruption-code-editors-are-up-for-grabs/"><span style="font-weight: 400;">everything is in play</span></a><span style="font-weight: 400;">. My colleague Stephen </span><a href="https://redmonk.com/sogrady/2025/07/09/promiscuity-of-modern-developers/"><span style="font-weight: 400;">concurs.</span></a></p>
<p><span style="font-weight: 400;">Take Google, for example, which is playing the game adroitly, and steering into the curve. </span></p>
<p><span style="font-weight: 400;">AI was always Google&#8217;s market to lose. Of all of the hyperscalers and major tech vendors Google was the most associated with AI, and felt with some justification that it was uniquely ready to benefit from AI. So when ChatGPT dropped and the world suddenly changed, Google ended up on the back foot. OpenAI had built on the Transformer model Google had invented (see the seminal paper <a href="https://arxiv.org/abs/1706.03762">Attention is All You Need</a>) &#8211; productising Google’s invention. The best packager in any tech wave wins, and wins big. </span></p>
<p><span style="font-weight: 400;">Inventing a technology is of course no guarantee of success in tech. Xerox invented modern desktop computing, only to see Apple and Microsoft dominate the space. IBM invented the relational model only to see Oracle dominate the market for SQL databases.</span></p>
<p><span style="font-weight: 400;">So ChatGPT dropping in late 2022 was a burning platform moment for Google. It was Netscape in 1994, or the iPhone in 2007. </span></p>
<p><span style="font-weight: 400;">Google was on the back foot through 2023, but last year it hit its stride in terms of delivering on the promise of AI, and this year it’s in increasingly good shape. </span><span style="font-weight: 400;">Its current push around its capable Gemini frontier model is a case study in big company aggression. At Google Next in April it had its swagger back. With a set of industry partners it launched A2A, an industry standard framework for agent to agent communication. But it also made it very clear that it was adopting the competing Model Context Protocol. But most importantly in terms of swagger, Google was able to lead with Gemini 2.5. A month later, at Google I/O, it had plenty more to say and launch. Two events in the space of just a few weeks, both packed with news.</span></p>
<p><span style="font-weight: 400;">With the </span><a href="https://blog.google/technology/developers/introducing-gemini-cli-open-source-ai-agent/"><span style="font-weight: 400;">launch of GeminiCLI</span></a><span style="font-weight: 400;"> in June 2025, Google put down a marker &#8211; an open source Claude Code competitor, built from the ground up by a small Google team, with a focus on getting to market as quickly as possible. Perhaps just as impressive as the launch itself is the rate of new feature delivery &#8211; for example launching support for cutting and pasting images into the CLI (a favourite feature in coding agents and assistants of my colleague Kate Holterhoff) just a few weeks after the initial launch. </span></p>
<p><span style="font-weight: 400;">Oh yeah, <a href="https://github.com/google-gemini/gemini-cli">GeminiCLI</a> is already at 61k GitHub stars and 5.6k forks since launch last month &#8211; maybe there’s life in open source yet.</span></p>
<p><span style="font-weight: 400;">Game on. And then there’s Google CEO Sundar Pichai casually hiring some of Windsurf&#8217;s senior staff, including CEO Varun Mohan and co-founder Douglas Chen, and licensing a chunk of its IP in a deal worth $2.4bn to further accelerate its own efforts. This after some outlets had reported that OpenAI had already acquired Windsurf. TL;DR &#8211; OpenAI and Meta definitely aren’t going to get <em>all</em> the talent. </span><span style="font-weight: 400;">DeepMind is still an aspirational place to carry out fundamental research in AI. And apparently, it pays well. </span>Google is back in the game. Like I said, swagger.</p>
<p><span style="font-weight: 400;">But what of Amazon Web Services, which has been uncharacteristically uncertain in the face of the AI big bang, shipping enterprise products like Bedrock, but leaving developers cold with some of its other efforts? </span><span style="font-weight: 400;">This week it seems to have finally shaken off its shackles. It launched </span><a href="https://kiro.dev/blog/introducing-kiro/"><span style="font-weight: 400;">Kiro</span></a><span style="font-weight: 400;">, a vibe coding tool, which attempts to bring back a little software engineering rigour alongside the You Only Live Once (YOLO) vibes, with a spec-driven development approach. <a href="https://redmonk.com/kholterhoff/2025/07/14/ai-engineers-and-the-hot-vibe-code-summer/">Vibe coding meets AI Engineering</a>. </span><span style="font-weight: 400;">Kiro&#8217;s reception by developers has been really positive &#8211; so much so that the service has been returning errors, and Amazon has had to reintroduce a waitlist, to throttle adoption. </span></p>
<p><span style="font-weight: 400;">Even Corey Quinn is <a href="https://bsky.app/profile/quinnypig.com/post/3ltwkewvyhs2f">impressed</a>. The RedMonk team has already kicked Kiro&#8217;s tyres a fair bit, and Kate likes it enough to be using it most evenings, and <a href="https://buttondown.com/redmonk/archive/redmonk-june-2025-update/">coined</a> the phrase </span><b><i>Hot Vibe Code Summer </i></b><span style="font-weight: 400;">accordingly. </span></p>
<p><span style="font-weight: 400;">Talking of heat, Deepak Singh, AWS VP DevEx &amp; Agents, lit the fire with this product. Having helped drive AWS to success in the container market he’s now leading efforts here. Intriguingly Kiro isn’t branded AWS or even Amazon, it stands alone &#8211; it is simply Kiro. </span><span style="font-weight: 400;">Amazon is going to markedly increase velocity in building its AI dev tools. Kiro: watch this space. </span></p>
<p><span style="font-weight: 400;">As I <a href="https://www.linkedin.com/posts/jamesgovernor_if-amazon-web-services-aws-plays-its-cards-activity-7351650704136060929-nq1I">said on linkedin</a> this week:</span></p>
<blockquote><p><span style="font-weight: 400;">If Amazon Web Services (AWS) plays its cards right with</span> <span style="font-weight: 400;">Kiro it&#8217;s going to be one of the company&#8217;s fastest growing products ever. People like the tool. If Amazon can get out of the way, play the token game adroitly, and scale the service, it will be in really good shape. One way or another this is the most successful developer launch from AWS we&#8217;ve seen in a long time. The team deserves a lot of credit.</span></p>
<p><span>Selling to enterprises isn&#8217;t enough. You&#8217;ve got to sell to the builders.</span> Kiro<a href="http://kiro.dev/"> </a><span>takes that very much to heart.</span></p></blockquote>
<p><span style="font-weight: 400;">If you’re wondering about my take on Microsoft, in the context of this piece you should read this </span><a href="https://redmonk.com/jgovernor/2025/06/20/microsoft-build-2025-agents-models-github-and-beast-mode-windows/"><span style="font-weight: 400;">post</span></a><span style="font-weight: 400;"> about its Build conference &#8211; the TL;DR is essentially that Microsoft is in decent shape, but really needs a moonshot for its own frontier LLM. </span></p>
<p><span style="font-weight: 400;">To conclude &#8211; the industry giants of the cloud buildout may have taken a while to get their acts together and start successfully addressing AI market opportunities, but we’re certainly seeing interesting moves that map to the needs for aggressive shipping, with a much stronger focus on developer experience in the AI cosmic inflation era.</span></p>
<p>disclosure: AWS, Google Cloud, and Microsoft are all clients.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://redmonk.com/jgovernor/giants-awaken-google-geminicli-aws-kiro-developer-experience-and-the-need-to-ship-and-keep-shipping/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5357</post-id>	</item>
		<item>
		<title>Microsoft Build 2025 &#8211; agents, models, GitHub, and beast mode Windows.</title>
		<link>https://redmonk.com/jgovernor/microsoft-build-2025-agents-models-github-and-beast-mode-windows/</link>
					<comments>https://redmonk.com/jgovernor/microsoft-build-2025-agents-models-github-and-beast-mode-windows/#respond</comments>
		
		<dc:creator><![CDATA[James Governor]]></dc:creator>
		<pubDate>Fri, 20 Jun 2025 17:46:57 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">https://redmonk.com/jgovernor/?p=5354</guid>

					<description><![CDATA[Microsoft did an excellent job at its Build conference last month, showcasing its strengths and ambitions for the AI era. It’s always hard to summarise Build, because the company has such a long history and a number of different developer and ops centers of gravity &#8211; including Microsoft Azure (cloud deployment and management), Developer Division]]></description>
										<content:encoded><![CDATA[<p><a href="http://redmonk.com/jgovernor/files/2025/06/native-support-for-MCP-on-Windows.jpg"><img loading="lazy" decoding="async" class="aligncenter size-large wp-image-5355" src="http://redmonk.com/jgovernor/files/2025/06/native-support-for-MCP-on-Windows-1024x768.jpg" alt="image showing Satya Nadella on stage in front of a slide saying &quot;native support for MCP on Windows&quot;" width="1024" height="768" srcset="https://redmonk.com/jgovernor/files/2025/06/native-support-for-MCP-on-Windows-1024x768.jpg 1024w, https://redmonk.com/jgovernor/files/2025/06/native-support-for-MCP-on-Windows-300x225.jpg 300w, https://redmonk.com/jgovernor/files/2025/06/native-support-for-MCP-on-Windows-768x576.jpg 768w, https://redmonk.com/jgovernor/files/2025/06/native-support-for-MCP-on-Windows-480x360.jpg 480w, https://redmonk.com/jgovernor/files/2025/06/native-support-for-MCP-on-Windows-107x80.jpg 107w, https://redmonk.com/jgovernor/files/2025/06/native-support-for-MCP-on-Windows-836x627.jpg 836w, https://redmonk.com/jgovernor/files/2025/06/native-support-for-MCP-on-Windows.jpg 1179w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></a></p>
<p><span style="font-weight: 400;">Microsoft did an excellent job at its Build conference last month, showcasing its strengths and ambitions for the AI era.</span></p>
<p><span style="font-weight: 400;">It’s always hard to summarise Build, because the company has such a long history and a number of different developer and ops centers of gravity &#8211; including Microsoft Azure (cloud deployment and management), Developer Division &#8211; aka DevDiv &#8211; and GitHub (developer tools and platforms), Power Platform (low code and AI powered business applications), Windows (desktop and server infrastructure). From year to year one division or another will get more attention than others, depending on the vagaries of Big Launches. Then of course there are the cross-cutting concerns &#8211; issues which impact the company across its entire portfolio &#8211; open source and AI are good examples. </span></p>
<p><span style="font-weight: 400;">Everyone gets their moment on the keynote stage after Satya Nadella kicks things off on day one, which can make things feel hurried, where storytelling suffers. What Microsoft wanted us to come away with this year was the emergence of what it is calling The Agentic Web, the idea that AI-based agents are going to fundamentally remake how we use technology to conduct business.</span></p>
<p><span style="font-weight: 400;">While a lot of tech companies are currently proposing AI agent platforms as thinly disguised (sometimes not disguised at all) replacement for humans, Microsoft is keen to thread the needle of agents as platforms to enable humans to get their work done more effectively. Recent and future layoffs may not help that message, but it’s the message nonetheless. </span></p>
<p><span style="font-weight: 400;">After reflecting on the event (for quite a while, apparently), I am ready to tell some stories from Build as I see them. </span></p>
<p><b>The GitHub Embrace</b></p>
<p><span style="font-weight: 400;">Let’s start with DevDiv and GitHub &#8211; a key takeaway was that this was the most integrated Build we’ve seen yet from Microsoft and GitHub. They set the tone from minute one of the conference, introducing Seth Juarez, Microsoft principal program manager, and Kedasha Kerr of GitHub as co-hosts, who presented the winners of this year’s Imagine Cup before handing over to Microsoft CEO Satya Nadella for his keynote. Presentations and demos interwove GitHub seamlessly into the keynote content and the conference content more generally. So the framing was pretty clear &#8211; if GitHub and Microsoft are closer than ever at Build, you can rest assured that they’re closer than ever the rest of the time too. </span></p>
<p><span style="font-weight: 400;">Which is to say &#8211; one important implication of the AI revolution is that the arm’s length relationship between Microsoft and GitHub is going to be… less arm’s length. The integration isn’t just going to be at Build keynotes. As Microsoft gears up for a more competitive, winner take all, more cut-throat era, GitHub is a key asset, and is going to be treated as such.</span></p>
<p><span style="font-weight: 400;">Microsoft has kept GitHub at relative arm’s length since acquiring the company in 2018, which has made a lot of sense (up until now). It’s effectively an independent subsidiary. Microsoft has given GitHub time to grow up, with its own distinct culture, approach to UX and developer experience largely intact, while at the same time lighting a fire under the company from a product delivery perspective, with an infusion of its own people (see Nat Friedman and Thomas Dohmke, both Microsofties, but importantly company founders in their own right &#8211; see Xamarin and HockeyApp). </span></p>
<p><span style="font-weight: 400;">GitHub ships now. Perhaps not at the relentless pace of some startups, but fast enough to get ahead of the market, and respond to threats. The Microsoft-GitHub combination came into its own by leading the current AI wave crashing onto the shores of the industry. It’s worth remembering that GitHub Copilot launched in technical preview in June 2021. OpenAI launched ChatGPT well over a year later, in late November 2022. And now in 2025, with the emergence of AI-native editors like Cursor and Windsurf and a slew of agent platforms, </span><a href="https://redmonk.com/jgovernor/2025/02/21/ai-disruption-code-editors-are-up-for-grabs/"><span style="font-weight: 400;">everything is up for grabs</span></a><span style="font-weight: 400;">. </span></p>
<p><span style="font-weight: 400;">There are some that would argue that Microsoft and GitHub didn’t respond quickly enough to the emerging threats, but I would argue that’s overblown. Things are moving extremely quickly in the industry right now. Responding to threats doesn’t always mean heading them off entirely.</span></p>
<p><span style="font-weight: 400;">Peacetime assumptions are behind us &#8211; competition is fierce, decisions need to be made quickly, and new products and features shipped remorselessly. Let’s just say GitHub is only too aware of this reality. As is the mothership. </span></p>
<p><span style="font-weight: 400;">Just as accessing LLMs with chat to generate code has been supplanted by integrating LLMs directly into editors, there is another center of gravity which arguably makes even more sense for agents &#8211; that center of gravity being the GitHub workflow. Agents are asynchronous &#8211; developers can’t be sitting around waiting for them to finish a job. The kind of long-running reasoning that agents provide is really anathema to developer flow if you’re in an editor or chat interface. But GitHub was built for the kind of asynchronous workflow we’re seeing emerge. Agents and pull requests go together like peanut butter and chocolate. </span></p>
<p><a href="https://github.com/newsroom/press-releases/coding-agent-for-github-copilot"><span style="font-weight: 400;">GitHub Introduces Coding Agent For GitHub Copilot</span></a></p>
<blockquote><p><span style="font-weight: 400;">The agent starts its work when you assign a GitHub issue to Copilot or ask it to start working from Copilot Chat in VS Code. As the agent works, it pushes commits to a draft pull request, and you can track it every step of the way through the agent session logs. Developers can give feedback and ask the agent to iterate through pull request reviews. </span></p>
<p>The agent is expressly designed to preserve your existing security posture, with additional built-in features like branch protections and controlled internet access to ensure safe and policy-compliant development workflows. Plus, the agent’s pull requests require human approval before any CI/CD workflows are run, creating an extra protection control for the build and deployment environment.</p></blockquote>
<p><span style="font-weight: 400;">This human approval is essential. Vibe coding is all very well and good, but someone has to take responsibility for checking in the code. And indeed take the credit for doing so. Developers are the human in the loop, which is just as it should be. Software engineering in general, and enterprise software engineering in particular, is really not amenable to You Only Live Once. </span><span style="font-weight: 400;">So GitHub and Microsoft are positioning the engineer and engineering team as the point of control and quality, as much as curation and overall decision-making.</span></p>
<p><span style="font-weight: 400;">From an industry context perspective it’s worth mentioning Jules, &#8220;an asynchronous coding agent&#8221;, which Google launched the same week. It’s another autonomous agent platform, designed to fit into the GitHub workflow. As Google </span><a href="https://blog.google/technology/google-labs/jules/"><span style="font-weight: 400;">stressed</span></a><span style="font-weight: 400;">:</span></p>
<blockquote><p><span style="font-weight: 400;">GitHub integration: Jules works where you already do, directly inside your GitHub workflow. No context-switching, no extra setup.</span></p></blockquote>
<p><span style="font-weight: 400;">So deep GitHub integration is the new hotness for coding agents. Go where developers are, and enable asynchronous work.</span></p>
<p><span style="font-weight: 400;">In a move that was both offensive and defensive, GitHub also announced that it is open sourcing Copilot Chat in VS Code, shoring up Code’s position as the modern developer’s editor of choice. As Cursor and Windsurf double down on integration and tightly packaged experiences, contributing the VS Code GitHub Copilot Chat extension makes VS Code more appealing from an ecosystem perspective. Open source is still a very useful lever to pull in 2025 &#8211; using the permissive MIT license emphasizes the ecosystem. Microsoft and GitHub also made a commitment to integrate and open source further AI capabilities into VS Code core.</span></p>
<p><span style="font-weight: 400;">Talking of ecosystems, Coding Agent will also be rolled out for JetBrains, Xcode and Eclipse.</span></p>
<p><b>Beast mode Windows </b></p>
<p><span style="font-weight: 400;">Let’s use open source as the transition to the next section &#8211; where we examine Windows &#8211; specifically to see if Microsoft can make the platform more appealing to developers. Apple has owned the modern web developer for the last 15 years or so. Developers may kvetch about Apple Xcode, but they love the shiny, highly performant hardware and OS user experience, and MacBooks continue to dominate.</span></p>
<p><span style="font-weight: 400;">So Microsoft needs to compete on the basis of software, hardware (performance matters!), dev tool integration, and flexibility. That’s a lot. </span></p>
<p><span style="font-weight: 400;">A long standing industry joke is that this is (finally) the year of Linux on the desktop. Ironically enough, Microsoft has been doing its best to build Linux support into the OS with the Windows Subsystem for Linux (WSL), allowing you to run Linux on your Windows machine without needing a virtual machine or dual-boot setup. The WSL experience has been steadily improving since its introduction in 2016. Nine years later Microsoft has finally responded to the very</span><a href="https://github.com/microsoft/WSL/issues/1"><span style="font-weight: 400;"> first issue on the WSL repo</span></a><span style="font-weight: 400;"> &#8211; to open source the code. WSL, which supports most popular distros including Arch Linux, won’t suddenly become every Linux user&#8217;s tool of choice &#8211; but the target isn’t actually Linux, it’s macOS. The open source decision just removes one potential objection. </span></p>
<p><span style="font-weight: 400;">If Microsoft really wants to turn the dial it needs to improve WSL performance and the out-of-the-box experience for drivers and third party apps. The faster, snappier and less hassle it is, the more developers are likely to adopt it. Lag is one of the common complaints about WSL. But the open source move certainly won’t hurt. That said, Apple isn’t standing still &#8211; at its WWDC event it just announced a new Swift-based container runtime framework, enabling developers to create and run Linux container images directly on their Macs. It’s VM-based, but Apple’s hardware performance is such that this is unlikely to be an issue. For a version one feature it looks pretty compelling. </span></p>
<p><span style="font-weight: 400;">Talking of performance, Microsoft is clearly a long way behind Apple when it comes to silicon &#8211; Apple is ahead on both raw performance and energy consumption. Indeed, unlike Apple, Microsoft has to rely on traditional OEM partners for its microprocessors. In order to become more competitive it had to push ahead with support for ARM-based chips. It reached an important milestone in that respect last year with the launch of the Copilot+ form factor, and Microsoft’s first Surface laptops based on Qualcomm Snapdragon X chips. </span></p>
<p><span style="font-weight: 400;">The Copilot+ series was positioned as an AI-first architecture, for example including neural processing units (NPUs) dedicated to AI and machine learning tasks. But optimising for NPUs was always going to be problematic in terms of getting developers excited about what they could do with a Windows machine. At launch Qualcomm got all of the attention because it was the basis of Microsoft’s own machines. Yay to ARM laptops! But in the land of LLMs, GPUs are king. Requiring NPUs is all well and good &#8211; but if you don’t do a great job of supporting Nvidia you’re really not in the game.</span></p>
<p><span style="font-weight: 400;">Which brings us back to Build 2025 and one of the more interesting announcements &#8211; which played directly to the idea of developer choice. Microsoft announced </span><a href="https://learn.microsoft.com/en-us/windows/ai/overview"><span style="font-weight: 400;">Windows AI Foundry,</span></a><span style="font-weight: 400;"> which, through Windows ML, supports inferencing regardless of microprocessor &#8211; AMD, Intel, Nvidia and Qualcomm, whether NPU, GPU or CPU. The platform also offers catalogs like Ollama and the Nvidia NIM microservices packaging, as well as LoRA fine-tuning for Microsoft’s Phi Silica small language model. This could be a big deal. Just as with driver support in Windows back in the day, out of the box model and framework support across all hardware architectures is a classic Microsoft play.</span></p>
<p><span style="font-weight: 400;">Microsoft also announced the private developer preview of a framework for managing Model Context Protocol (MCP) access to Windows applications. It makes perfect sense to adopt MCP &#8211; it has overnight become a de facto industry standard &#8211; but it also pays to be cautious: MCP lacks a strong security model, and Microsoft needs to be very careful about potentially opening Windows to LLM-based security exploits.</span></p>
<p><span style="font-weight: 400;">One missed opportunity here was to show a beast mode gamer rig also used for running AI workloads. A lot of gamers choose Windows because of Xbox and hardware flexibility &#8211; there has to be a subset of folks that would love to upgrade their machine to use for both gaming and LLMs. A single machine for vibe gaming and vibe coding. That remains an open marketing opportunity. </span></p>
<p><b>Low Code is finally a thing</b></p>
<p><span style="font-weight: 400;">Not everything is about hardcore developers though &#8211; the Microsoft Power Platform also got its moment in the sun. Charles Lamanna, CVP of Business and Copilots at Microsoft, always makes a compelling case, but the infusion of AI agents into the platform with Copilot Studio has the potential to finally deliver on the promises of low code. As a long-term low-code skeptic I am increasingly coming round to the idea that AI is the underlying technology that will make a ton of new use cases possible for a broad range of users. Salesforce is in a similar position with its Agentforce platform &#8211; Salesforce admins and developers are going to build a ton of cool extensions for Salesforce customers. Salesforce has a deeply engaged global community, folks with business domain experience, and they&#8217;re ready to be unleashed with Agentforce if the user experience is right. The Salesforce platform is increasingly horizontal &#8211; no longer all about CRM &#8211; and Agentforce is a marker for that. </span></p>
<p><span style="font-weight: 400;">While some in the industry argue that AI will remove the need for packaged applications &#8211; just have a solid data foundation and a bunch of agents instead! &#8211; for companies with solid enterprise application customer bases, and a low-code/AI play, there is going to be plenty of opportunity for upside. And if there’s a question about whether AI will replace low-code, Power Platform provides an answer, and the answer is no &#8211; the centers of gravity will come together and package agents up for business users.</span></p>
<p><span style="font-weight: 400;">So yes, I think we can expect significant further growth from Power Platform &#8211; in a business that already has a lot of momentum picking up new enterprise logos. Microsoft did a solid job of showcasing Copilot Studio. We’re obviously not going to see everyone learning to code just because code assistants are out there; the drag-and-drop application development experience, augmented with natural language commands and prompts, makes a great deal of sense in the AI agent era. </span></p>
<p><span style="font-weight: 400;">So that’s Microsoft’s low code business, but what about infrastructure?</span></p>
<p><b>A quiet year for Azure</b></p>
<p><span style="font-weight: 400;">Azure… was not the belle of the ball at Build this year. Cloud infrastructure was somewhat relegated at Build 2025 &#8211; which is not to say there were no announcements, but mostly Azure was simply the place to run all the apps built with the new tooling.</span></p>
<p><span style="font-weight: 400;">One bit of news that did catch my eye was </span><a href="https://learn.microsoft.com/en-us/azure/ai-foundry/what-is-azure-ai-foundry"><span style="font-weight: 400;">Azure AI Foundry</span></a><span style="font-weight: 400;">. Model management and guardrails will be a core part of any infrastructure cloud, but what really struck me was that Microsoft is effectively describing what I call Progressive Delivery as a core AI pattern (our book on the subject will be </span><a href="https://itrevolution.com/product/progressive-delivery/"><span style="font-weight: 400;">published in November</span></a><span style="font-weight: 400;">).</span></p>
<blockquote><p><span style="font-weight: 400;">Azure AI Foundry will offer a new Model Router, in preview, which will automatically select the best OpenAI model for prompts, leading to higher quality and lower cost outputs. Additionally, automated evaluation, A/B experimentation and tracing in Foundry Observability will support rollback to proven models if new ones underperform, enabling developers to stay on the cutting-edge of model capabilities to deliver cost-effective solutions.</span></p></blockquote>
<p><span style="font-weight: 400;">Stay on the cutting edge, but manage the blast radius. </span></p>
<p><span style="font-weight: 400;">Scott Guthrie, executive vice president of the Microsoft Cloud + AI Group, used his keynote slot on day two to position Cosmos DB as an underlying service that has helped propel OpenAI forward. OpenAI is taking advantage of Microsoft managed data services, not just infrastructure as a service.</span></p>
<p><span style="font-weight: 400;">He also talked up Microsoft’s green credentials, one of the few times sustainability was mentioned at the show. It was good to see (at least some) lip service paid to sustainability there, when other major vendors seem to be jettisoning their commitments.</span></p>
<p><b>Multi-model is not a position of strength</b></p>
<p><span style="font-weight: 400;">Finally &#8211; some thoughts on Large Language Models and industry ecosystems. Here is where the news is not quite so good for Microsoft. Betting on “multi model” is not the winning position.</span></p>
<p><span style="font-weight: 400;">At Build this reality was brought into stark relief in the “CEOs of competitors” section of Nadella’s keynote. Sam Altman of course appeared via a Zoom interview, just a couple of weeks after rumours emerged that OpenAI might be acquiring Windsurf. Whether or not OpenAI does so, with its coming push into consumer devices, led by Jony Ive, it’s going to be competing directly with both Apple and Microsoft. </span></p>
<p><span style="font-weight: 400;">After Sam Altman we had a recorded appearance with Elon Musk, because Microsoft was announcing support for xAI’s Grok model. Whatever you think of Musk, and clearly he has his boosters, he’s a deeply divisive figure, and there were definitely people in the room and online who were not happy to see him featured. Nadella’s values and those of Musk would not seem to be terribly well aligned.</span></p>
<p><span style="font-weight: 400;">If Microsoft is forced to talk up the benefits of multi-model and provide platforms for third parties building frontier models, that’s arguably a problem for the company. Essentially all of the developer-facing tools described above are either running OpenAI or Anthropic.</span></p>
<p><span style="font-weight: 400;">Compare and contrast with Google, which is now puffing out its chest at its own events, trumpeting the capabilities of Gemini. Sure, it supports multi-model in Google Cloud, but it can happily lead with Gemini in demos and in its own code assist products and agents.  </span></p>
<p><span style="font-weight: 400;">Microsoft on the other hand is not in charge of its own destiny when it comes to frontier models, which is ironic for the company that is now marketing the idea of Frontier Firms. </span><a href="https://www.microsoft.com/en-us/worklab/work-trend-index/2025-the-year-the-frontier-firm-is-born"><span style="font-weight: 400;">According to Microsoft research</span></a><span style="font-weight: 400;">:</span></p>
<blockquote><p><span style="font-weight: 400;">We are entering a new reality—one in which AI can reason and solve problems in remarkable ways. This intelligence on tap will rewrite the rules of business and transform knowledge work as we know it. Organizations today must navigate the challenge of preparing for an AI-enhanced future, where AI agents will gain increasing levels of capability over time that humans will need to harness as they redesign their business. Human ambition, creativity, and ingenuity will continue to create new economic value and opportunity as we redefine work and workflows.</span></p>
<p>As a result, a new organizational blueprint is emerging, one that blends machine intelligence with human judgment, building systems that are AI-operated but human-led. Like the Industrial Revolution and the internet era, this transformation will take decades to reach its full promise and involve broad technological, societal, and economic change.</p></blockquote>
<p><span style="font-weight: 400;">Well, in that case it’s probably a good idea to own one of the most powerful AI and LLM companies. Models may be commoditising, but that doesn’t mean you don’t want to own one. If that game moves forward to winning customers with tokens, then you don’t want to be relying on a third party. And it’s increasingly a token-based world. </span></p>
<p><span style="font-weight: 400;">In closing &#8211; Microsoft is in excellent shape, with some great assets, but in my view it either needs to embark on a moonshot to build or foreground its own models, or it should make an era defining acquisition of Anthropic, which offers the best experience for code. Anthropic would be expensive &#8211; certainly north of the $61.5bn valuation of its last funding round &#8211; but Wall Street trusts Nadella, and the stakes are absurdly high. Rumours currently abound that OpenAI is trying to change the terms of its contract with Microsoft, and that OpenAI is under-cutting Microsoft in Copilot sales deals. I don’t really see how this will end well. This last paragraph deserves a post in its own right, so that’s what you’re going to get. I will be following up shortly.  </span></p>
<p><span style="font-weight: 400;">Disclosure: GitHub, Microsoft, Google Cloud, and Salesforce are all clients. </span></p>
]]></content:encoded>
					
					<wfw:commentRss>https://redmonk.com/jgovernor/microsoft-build-2025-agents-models-github-and-beast-mode-windows/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5354</post-id>	</item>
	</channel>
</rss>
