<?xml version="1.0" encoding="UTF-8" standalone="no"?><rss xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:slash="http://purl.org/rss/1.0/modules/slash/" xmlns:sy="http://purl.org/rss/1.0/modules/syndication/" xmlns:wfw="http://wellformedweb.org/CommentAPI/" version="2.0">

<channel>
	<title>Shelly Palmer Digital Living - Daily Radio Report</title>
	<atom:link href="https://shellypalmer.com/category/radio/feed/" rel="self" type="application/rss+xml"/>
	<link>https://shellypalmer.com/category/radio/</link>
	<description>Shelly Palmer's reports on the top stories in technology, media and entertainment.</description>
	<lastBuildDate>Sun, 14 Sep 2025 13:00:33 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	
	<itunes:explicit>no</itunes:explicit><copyright>Copyright 2008 SLP Productions, Inc. All Rights Reserved.</copyright><itunes:image href="http://www.shellypalmermedia.com/video/_images/600x600_iTunesAudio.jpg"/><itunes:keywords>Media,Technology,Entertainment,Shelly Palmer,Shelley Palmer,Advanced Media Ventures,Emmy Awards,Advanced Media Committee</itunes:keywords><itunes:summary>Shelly Palmer's MediaBytes - making sense of the New Media News - five days a week. Audio-only version.</itunes:summary><itunes:subtitle>Shelly Palmer's MediaBytes - making sense of the New Media News - five days a week. Audio-only version.</itunes:subtitle><itunes:category text="Technology"><itunes:category text="Tech News"/></itunes:category><itunes:author>Shelly Palmer</itunes:author><itunes:owner><itunes:email>shelly@palmer.net</itunes:email><itunes:name>Shelly Palmer</itunes:name></itunes:owner><item>
		<title>If You Can’t Tell the Difference, There Is No Difference</title>
		<link>https://shellypalmer.com/2025/09/if-you-tell-the-difference-there-is-no-difference/</link>
					<comments>https://shellypalmer.com/2025/09/if-you-tell-the-difference-there-is-no-difference/#respond</comments>
		
		
		<pubDate>Sun, 14 Sep 2025 04:01:29 +0000</pubDate>
				<category><![CDATA[Advertising & Marketing]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Automotive & Vehicles]]></category>
		<category><![CDATA[Beauty & Fitness]]></category>
		<category><![CDATA[Blog]]></category>
		<category><![CDATA[CES 2014 Self-Guided Tour]]></category>
		<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Finance & Financial Services]]></category>
		<category><![CDATA[Health & Wellness]]></category>
		<category><![CDATA[Law & Government]]></category>
		<category><![CDATA[Management]]></category>
		<category><![CDATA[Media & Entertainment]]></category>
		<category><![CDATA[News & Journalism]]></category>
		<category><![CDATA[Production]]></category>
		<category><![CDATA[Radio]]></category>
		<category><![CDATA[Sports & Athletics]]></category>
		<category><![CDATA[Tech Biz]]></category>
		<category><![CDATA[Techno-politics]]></category>
		<category><![CDATA[Travel & Transportation]]></category>
		<guid isPermaLink="false">https://shellypalmer.com/?p=201870</guid>

					<description><![CDATA[<p>The debate over AI and its role in creative industries often centers on one question: Can AI ever be as creative as humans? While it’s tempting to philosophize about inspiration and ingenuity, this line of inquiry misses a crucial point: <em>If the audience can’t tell the difference between AI-generated and human-generated content, or if they don’t care, for all practical purposes, there is no difference.</em></p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2025/09/if-you-tell-the-difference-there-is-no-difference/">If You Can’t Tell the Difference, There Is No Difference</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></description>
										<content:encoded><![CDATA[<p>The debate over AI and its role in creative industries often centers on one question: Can AI ever be as creative as humans? While it’s tempting to philosophize about inspiration and ingenuity, this line of inquiry misses a crucial point: <em>If the audience can’t tell the difference between AI-generated and human-generated content, or if they don’t care, for all practical purposes, there is no difference.</em></p>
<p>In advertising and marketing, content isn’t created to hang in a museum or win a Pulitzer. It’s created to achieve specific goals: attract attention, sell products, or raise awareness. This distinction between &#8220;required&#8221; content and &#8220;inspired&#8221; content is where much of the confusion begins. Required content (aka production or commercial content) is functional, created under tight deadlines to accomplish a clear objective. Inspired content (aka art), on the other hand, stems from the human need to express emotion; it is mostly unconstrained by time or utility. Comparing these two types of creativity is like comparing a business letter to a love poem: both involve language, but their purposes and expectations are worlds apart.</p>
<h2>For a wide range of production tasks, AI is already &#8220;good enough.&#8221; </h2>
<p>I started my career as a <a href="https://shellypalmer.com/production/music-production/" target="_blank">composer and producer of commercial music</a>. I&#8217;ve worked on literally hundreds of radio and television show themes, and thousands of jingles and underscores. So, I&#8217;ve got a pretty good idea of what clients will accept as finished work product. I can also say (without hesitation) that creative services and production companies are run like factories, where efficiency and productivity are absolutely key to economic success. Clients have budgets – that&#8217;s what you can charge. Your profit is based on your ability to produce an acceptable work product at the highest possible margin.</p>
<h2>Then, There&#8217;s The Audience</h2>
<p>It probably won&#8217;t surprise you to learn that most people cannot reliably sing a simple tune in key or detect subtle off-key notes. Tone deafness has an actual name: &#8220;congenital amusia.&#8221; Add to this the reduction of music and art education in our K-12 system, and you have a recipe for a generation (or two) of audiences that truly cannot discern the difference between best-in-class AI-generated music and journeyman-created production music. Our clients&#8217; tests confirm this: if the audience isn&#8217;t told what it&#8217;s listening to, they have no idea how it was created, don&#8217;t notice, and don&#8217;t care.</p>
<h2>Steve Jobs Proved The Point</h2>
<p>The world of recorded music was irrevocably changed in October 2001 when Apple introduced the iPod. While it is well remembered as a stepping stone to the greatest comeback in American corporate history, the iPod is less well remembered for dealing the final, almost fatal, blow to sonic quality. First, audio files were compressed to fit on the iPod. Called &#8220;lossy compression,&#8221; it reduced sonic quality by about 29X. Next were the 29-cent earbuds (which Apple sold for $29 if purchased separately). Put all that together and you get the world of recorded music as mass marketed by Steve Jobs. You also get the death of sonic quality. The funny thing is, almost nobody noticed. (See: <em><a href="https://shellypalmer.com/2019/12/hi-res-audio-solution-search-problem/" target="_blank">Hi-Res Audio: A Solution in Search of a Problem</a></em> for more details.)</p>
<h2>This Has Nothing To Do With Enjoyment</h2>
<p>Importantly, this has nothing to do with the enjoyment or emotional connection people feel to music. Everyone should sing like no one is listening, and everyone is the world’s foremost expert on the music they like. But production music isn’t about personal expression; it’s a tool. It’s designed to support a story, sell a product, or evoke a mood. AI is capable of doing this now, and it&#8217;s getting better every day.</p>
<h2>AI May Always Have Limitations</h2>
<p>Will AI work for every case? No. Neither will every professional composer. Humans are magical and they do magical things – like create hit songs or timeless artwork or stories that capture our imaginations. That&#8217;s not what we&#8217;re talking about here.</p>
<p>In advertising, AI tools like ChatGPT and MidJourney are already crafting high-performing ad copy and visuals. This past year, A/B tests performed by our clients have shown AI-generated content performing on par with, if not better than, human-created alternatives. Background music for commercials, once the domain of human composers, can now be generated by AI in minutes, meeting technical and emotional requirements at a fraction of the cost. Why would a business pay a premium for human creators if AI can deliver similar or better metrics? You may cite ethical reasons, but in practice, every production function that can be automated will be automated.</p>
<h2>Generative AI is Not Artificial Creativity</h2>
<p>Current generative AI platforms are not capable of originality. They lack the ability to originate truly novel ideas, struggle with emotional depth, and frequently require human oversight to ensure quality and ethical standards. But these shortcomings don’t negate their utility. Production content is not about originality or depth; it’s about efficiency, consistency, and scalability. In this context, AI fits seamlessly into the content factory model, offering significantly faster turnaround times and lower costs than traditional workflows.</p>
<h2>The Right Tool For The Right Job</h2>
<p>Which brings me back to where I started. Artificial creativity is not required to fulfill a growing range of production requirements. Generative AI is already doing a pretty good job of mimicking what we can fairly label &#8220;average human creative output,&#8221; which is neither inspired nor magical nor truly creative. This is extremely bad news for people who do rule-based work in creative fields (such as setting type, sizing images, color grading, mixing, proofing, rudimentary editing, banging out an underscore for a chase scene in xyz musical style, etc.). Said differently, no one should confuse creativity with execution. Most workers in the creative arts do not create; they technically execute someone else&#8217;s creative vision.</p>
<p>To that end, generative AI systems are improving their execution capabilities exponentially. Which means deploying AI content production workflows is now an imperative. After all, if you can’t tell the difference, there is no difference.</p>
<p class="text-6 text-left"><em><strong>Author’s note:</strong> This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models. This article was originally published on December 1, 2024.</em></p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2025/09/if-you-tell-the-difference-there-is-no-difference/">If You Can’t Tell the Difference, There Is No Difference</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://shellypalmer.com/2025/09/if-you-tell-the-difference-there-is-no-difference/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			<dc:creator>shelly@palmer.net (Shelly Palmer)</dc:creator></item>
		<item>
		<title>Dial-up’s Death Knell: What 18,000X Faster Really Means</title>
		<link>https://shellypalmer.com/2025/08/dial-ups-death-knell-what-18000x-faster-really-means/</link>
					<comments>https://shellypalmer.com/2025/08/dial-ups-death-knell-what-18000x-faster-really-means/#respond</comments>
		
		
		<pubDate>Sun, 10 Aug 2025 13:23:07 +0000</pubDate>
				<category><![CDATA[AI]]></category>
		<category><![CDATA[Blog]]></category>
		<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Electronics]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[Media & Entertainment]]></category>
		<category><![CDATA[Radio]]></category>
		<category><![CDATA[Tech Biz]]></category>
		<category><![CDATA[Techno-politics]]></category>
		<guid isPermaLink="false">https://shellypalmer.com/?p=203535</guid>

					<description><![CDATA[<p>On September 30, 2025, AOL will finally turn off dial-up internet service. The AOL Dialer and AOL Shield Browser will go dark the same day. For millions of Americans, that distinctive screech of a modem handshake was the sound of the future arriving. Now the last of the original on-ramps is closing, and the highway it led to is unrecognizable.</p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2025/08/dial-ups-death-knell-what-18000x-faster-really-means/">Dial-up&#8217;s Death Knell: What 18,000X Faster Really Means</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></description>
										<content:encoded><![CDATA[<p>On September 30, 2025, AOL will finally turn off dial-up internet service. The AOL Dialer and AOL Shield Browser will go dark the same day. For millions of Americans, that distinctive screech of a modem handshake was the sound of the future arriving. Now the last of the original on-ramps is closing, and the highway it led to is unrecognizable.</p>
<h2>The Velocity of Data is Increasing and Will Always Increase</h2>
<p>Dial-up maxed out at 56 kilobits per second. Today&#8217;s 1-gigabit fiber service &#8211; a conservative baseline in many markets &#8211; represents roughly 18,000 times faster performance in 34 years. But raw speed misses the point. We didn&#8217;t just get a faster internet, we got a different internet that enabled entirely new categories of human experiences. Every time network performance improved, markets invented applications that were previously impossible. The progression from dial-up to today follows a classic exponential technology curve, and this has significant business implications.</p>
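<p>The arithmetic behind that &#8220;18,000 times&#8221; figure is easy to check. Here&#8217;s a rough sketch; it compares raw line rates and ignores protocol overhead and real-world throughput:</p>

```python
# Dial-up topped out at 56 kilobits per second; a 1-gigabit fiber
# line moves 1,000,000 kilobits per second. The ratio is the
# "times faster" figure quoted above.
dialup_kbps = 56
gigabit_kbps = 1_000_000

speedup = gigabit_kbps / dialup_kbps
print(round(speedup))  # 17857, i.e., roughly 18,000x
```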
<h2>A Predictable Path</h2>
<p>The technical path from screech to symmetry follows predictable patterns. Copper got pushed to its physical limits. Cable operators learned to squeeze gigabits from coaxial networks built for television. Fiber rewrote the baseline entirely, with new standards delivering 25 gigabits over the same glass that once carried megabits. Wireless caught up not just on speed but on reliability (probably even more important than raw speed for real applications).</p>
<p>Meanwhile, satellite internet transformed from a punchline into a credible broadband option through low-earth orbit constellations. Starlink now delivers 100+ megabit speeds with latency in the mid-20-millisecond range &#8211; fast enough for video calls from moving vehicles. You can get 500 Mbps service in most major metros, with many markets offering multi-gigabit tiers.</p>
<p>Most importantly, the increases in speed didn&#8217;t just give us faster internet, they enabled the creation of completely new products and services: Streaming video at scale. Cloud-first software development. Real-time collaboration across continents. Live commerce and interactive entertainment. Remote production workflows. In practice, high-speed internet has enabled the always-online world we live in today.</p>
<h2>The AI Inflection Point</h2>
<p>The half-a-trillion-dollar (and counting) hyperscale data center buildouts we&#8217;re seeing represent the next velocity threshold. These investments will only generate returns if AI inference can reach customers over fat, reliable pipes with minimal latency. Your future AI assistant won&#8217;t feel intelligent if it takes three seconds to respond to voice commands.</p>
<p>For us to finally enjoy augmented reality (AR), we&#8217;re going to need motion-to-photon response times under 20 milliseconds. That&#8217;s not a nice-to-have spec, it&#8217;s table stakes for the category. We can expect latency for comfortable AR experiences to be measured in single-digit milliseconds at the edge.</p>
<p>Then, there&#8217;s the emerging world of AI agents. These systems will coordinate across multiple services, process real-time data streams, and respond to environmental changes instantly. An agent that takes 10 seconds to react to your calendar change or traffic pattern isn&#8217;t useful &#8211; it&#8217;s annoying.</p>
<h2>Think About This</h2>
<p>We need to design for tomorrow&#8217;s velocity, not today&#8217;s average. Our customers won&#8217;t thank us for building applications that &#8220;work fine on gigabit.&#8221; They&#8217;ll abandon us when those applications feel slow compared to competitors who assumed 10-gigabit symmetrical service.</p>
<p>Next, upstream bandwidth and consistent low latency will matter as much as headline download speeds. The shift to cloud-rendered experiences, real-time collaboration, and always-on AI means our users are producing as much data as they consume. Jitter and packet loss will decide whether professional workflows feel instant or unusable.</p>
<p>Lastly, network architecture will become an even more competitive race. The companies winning in AI won&#8217;t just rent more GPUs, they will position compute at the network edge, minimizing data round-trips, and designing for congestion scenarios. If your product strategy depends on AI responsiveness, network topology matters as much as your server specifications.</p>
<h2>What&#8217;s Next?</h2>
<p>The roadmap is surprisingly predictable. Expect 10-gigabit residential service to become commonplace as fiber buildouts accelerate and cable operators deploy new standards. In-home wireless will prioritize reliability over raw speed &#8211; crucial for AR devices and local AI systems. Mobile networks will push compute to the edge to support real-time applications.</p>
<p>My dear friend, Marty Cooper (who placed the first public call from a handheld portable cell phone in 1973) formulated the <a href="https://en.wikipedia.org/wiki/Martin_Cooper_(inventor)" target="_blank">Law of Spectral Efficiency</a> (aka Cooper&#8217;s Law) to help us calculate the increase in capacity. It states, &#8220;The maximum number of voice conversations or equivalent data transactions that can be conducted in all the useful radio spectrum over a given area doubles every 30 months.&#8221;</p>
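<p>Cooper&#8217;s Law compounds quickly. A quick sketch of the math, assuming the 30-month doubling continues to hold:</p>

```python
# Cooper's Law: the usable wireless capacity over a given area
# doubles every 30 months. Project the capacity multiplier over
# an (illustrative) 30-year span.
months_per_doubling = 30
years = 30

doublings = years * 12 / months_per_doubling  # 12.0 doublings
multiplier = 2 ** doublings                   # 4096.0
print(f"{doublings:.0f} doublings -> {multiplier:,.0f}x capacity")
```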
<p>The larger pattern holds: infrastructure providers who can deliver deterministic, low-latency performance will enable the next generation of applications. Those applications will quickly feel essential rather than innovative. And the cycle begins again.</p>
<p>When AOL&#8217;s dial-up service finally goes silent, we&#8217;ll close the chapter on 1990s internet access. But we&#8217;ll also be marking the distance traveled on an exponential curve that shows no signs of flattening. In truth, the sound of that last modem handshake will be the starting gun for whatever comes next.</p>
<p class="text-6 text-left"><em><strong>Author’s note:</strong> This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.</em></p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2025/08/dial-ups-death-knell-what-18000x-faster-really-means/">Dial-up&#8217;s Death Knell: What 18,000X Faster Really Means</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://shellypalmer.com/2025/08/dial-ups-death-knell-what-18000x-faster-really-means/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			<dc:creator>shelly@palmer.net (Shelly Palmer)</dc:creator></item>
		<item>
		<title>The MCP Revolution: Why This Boring Protocol May Change Everything About AI (Updated)</title>
		<link>https://shellypalmer.com/2025/04/the-mcp-revolution-why-this-boring-protocol-may-change-everything-about-ai/</link>
					<comments>https://shellypalmer.com/2025/04/the-mcp-revolution-why-this-boring-protocol-may-change-everything-about-ai/#respond</comments>
		
		
		<pubDate>Sun, 27 Apr 2025 04:01:20 +0000</pubDate>
				<category><![CDATA[Advertising & Marketing]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Automotive & Vehicles]]></category>
		<category><![CDATA[Beauty & Fitness]]></category>
		<category><![CDATA[Blog]]></category>
		<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Finance & Financial Services]]></category>
		<category><![CDATA[Health & Wellness]]></category>
		<category><![CDATA[Law & Government]]></category>
		<category><![CDATA[Management]]></category>
		<category><![CDATA[Media & Entertainment]]></category>
		<category><![CDATA[News & Journalism]]></category>
		<category><![CDATA[Production]]></category>
		<category><![CDATA[Radio]]></category>
		<category><![CDATA[Sports & Athletics]]></category>
		<category><![CDATA[Tech Biz]]></category>
		<category><![CDATA[Techno-politics]]></category>
		<category><![CDATA[Travel & Transportation]]></category>
		<guid isPermaLink="false">https://shellypalmer.com/?p=202734</guid>

					<description><![CDATA[<p>You’re already sick of the words Agent and Agentic, but you know they are the new new thing. You may not have heard the initialism MCP, but you’re going to start hearing about it now. A lot.</p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2025/04/the-mcp-revolution-why-this-boring-protocol-may-change-everything-about-ai/">The MCP Revolution: Why This Boring Protocol May Change Everything About AI (Updated)</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></description>
										<content:encoded><![CDATA[<p>You&#8217;re already sick of the words <em>Agent</em> and <em>Agentic</em>, but you know they are the new new thing. You may not have heard the initialism <em>MCP</em>, but you’re going to start hearing about it now. A lot.</p>
<p>As of April 2025, every leading foundation model builder, including OpenAI, Google, Meta, Microsoft, and Amazon, has committed to adopting the Model Context Protocol (MCP). What began in November 2024 as an initiative from Anthropic has quickly evolved into the new open standard for AI interoperability. This is bigger than it sounds. Much bigger.</p>
<h2>What is MCP?</h2>
<p>MCP is a standardized way for AI systems to talk to each other—and to your data. Instead of every AI provider using their own proprietary connection methods (forcing developers to build custom integrations for each), MCP creates a universal language that any AI can use to access, query, and interact with business tools, repositories, and software. Here&#8217;s what it looks like in practice:</p>
<p><img decoding="async" src="https://media.shellypalmer.com/wp-content/images/2025/03/MCP-flowchart.png" alt="MCP Flowchart - shellypalmer.ai" width="100%" /></p>
<p>The diagram looks simple because the concept is simple. An LLM powers multiple agents. Each agent talks to a dedicated MCP client. Those clients connect through MCP servers to specific services. It&#8217;s the digital equivalent of everyone agreeing to drive on the right side of the road.</p>
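<p>To make the plumbing concrete: MCP messages travel as JSON-RPC 2.0, and a client asks a server to run one of its tools with a <code>tools/call</code> request. The sketch below builds one such message; the tool name <code>query_crm</code> and its arguments are hypothetical, purely for illustration.</p>

```python
import json

# MCP messages are JSON-RPC 2.0 objects. This builds a "tools/call"
# request asking an MCP server to invoke one of the tools it exposes.
# NOTE: "query_crm" and its arguments are hypothetical examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_crm",
        "arguments": {"account": "Acme Corp"},
    },
}

# Serialize for transport (stdio or HTTP, depending on the server).
wire_message = json.dumps(request)
print(wire_message)
```

<p>The server replies with a JSON-RPC result (or error) carrying the tool&#8217;s output, which the MCP client hands back to the agent. Because every service speaks this same envelope, one client implementation can talk to any compliant server.</p>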
<h2>Why You Should Care (Even If You&#8217;re Not a Developer)</h2>
<p>If you&#8217;re thinking &#8220;this sounds like plumbing, why should I care?&#8221; – you&#8217;re exactly right. It is plumbing, but it&#8217;s plumbing that will dramatically change what&#8217;s possible with AI for your business.</p>
<p>Three reasons:</p>
<h3>1. Unified Connections = Faster Development</h3>
<p> Before MCP, if you wanted your AI assistant to connect to Salesforce, then Slack, then your custom database, you needed three different integration methods. Each one required specialized knowledge, unique error handling, and separate maintenance. With MCP, connect once, connect everywhere. Development time just got slashed by 70%.</p>
<h3>2. Standardized Data Exchange = Better Systems</h3>
<p> Not only can systems connect more easily, but they all speak the same language when exchanging information. It&#8217;s like going from a world where every restaurant uses a different ordering system to one where you can say &#8220;I&#8217;ll have the #2&#8221; anywhere and get exactly what you expect. The practical upshot? AI systems that are more reliable, more interoperable, and less likely to break when you need them most.</p>
<h3>3. Unified Context Model = Smarter AI</h3>
<p> The real magic happens with context. MCP standardizes how conversation history and user preferences are maintained across interactions. No more AI assistants that forget what you just told them when they switch tools. This isn&#8217;t just convenient—it&#8217;s the difference between an AI that feels broken and one that feels intelligent.</p>
<h2>From Competitors to Collaborators</h2>
<p>Let’s be clear: OpenAI and Anthropic are fierce competitors. They’re racing to build the most capable AI systems on the planet. Their business models depend on differentiation. So why would OpenAI adopt its rival’s protocol? Because standardization will speed AI adoption. It’s just that simple.</p>
<h2>What This Means For Your Business</h2>
<p>If you’re working on AI agents and agentic systems, MCP’s emergence as a standard has several immediate implications:</p>
<ul>
<li><strong>For the enterprise</strong>: You can build AI systems without fear of vendor lock-in. If ChatGPT doesn’t suit your needs next year, you can swap in Claude or any MCP-compatible model without rebuilding your architecture.</li>
<li><strong>For developers</strong>: Learn one protocol, connect to everything. The MCP ecosystem will expand rapidly now that the big players are on board.</li>
<li><strong>For startups</strong>: The barrier to entry just dropped significantly. You can build specialized services that plug into any MCP-compatible system without asking users to adopt another proprietary platform.</li>
</ul>
<h2>What To Do About It Now</h2>
<p>If you&#8217;re considering AI agents—and you should be—take these steps immediately:</p>
<ul>
<li><strong>Ask vendors about MCP support.</strong> If your AI tools aren’t built to be MCP-compatible, ask why. If the answer isn’t strategic, it’s probably technical debt.</li>
<li><strong>Design for modularity.</strong> Prioritize tools and platforms that separate agents from services. That flexibility will pay off when you want to scale or switch vendors.</li>
<li><strong>Plan for distributed systems.</strong> MCP assumes multiple servers. If your IT team isn’t thinking in terms of distributed orchestration, it’s time to level up.</li>
<li><strong>Pilot with real use cases.</strong> Standard protocols make it easier to test with one agent and one service, then expand. Don’t wait for the “perfect” platform—start with what you have.</li>
<li><strong>Train your teams.</strong> MCP isn’t just for engineers. Product owners, architects, and technical marketers all need to understand what this unlocks—and what it demands.</li>
</ul>
<h2>What&#8217;s Protocol?</h2>
<p>Protocols and standards may seem dull, but they often signal revolutions. MCP is not just another technical footnote—it’s the beginning of AI systems that are interoperable, scalable, and vendor-agnostic. It’s the HTTP moment for intelligent agents.</p>
<p>Ignore it at your own risk.</p>
<p class="text-6 text-left"><em><strong>Author’s note:</strong> This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models. This article was originally posted on March 30, 2025.</em></p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2025/04/the-mcp-revolution-why-this-boring-protocol-may-change-everything-about-ai/">The MCP Revolution: Why This Boring Protocol May Change Everything About AI (Updated)</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://shellypalmer.com/2025/04/the-mcp-revolution-why-this-boring-protocol-may-change-everything-about-ai/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			<dc:creator>shelly@palmer.net (Shelly Palmer)</dc:creator></item>
		<item>
		<title>Things That Have My Attention For 2025</title>
		<link>https://shellypalmer.com/2024/12/things-that-have-my-attention-for-2025/</link>
					<comments>https://shellypalmer.com/2024/12/things-that-have-my-attention-for-2025/#respond</comments>
		
		
		<pubDate>Sun, 15 Dec 2024 16:39:56 +0000</pubDate>
				<category><![CDATA[Advertising & Marketing]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Automotive & Vehicles]]></category>
		<category><![CDATA[Beauty & Fitness]]></category>
		<category><![CDATA[Blog]]></category>
		<category><![CDATA[CES 2014 Self-Guided Tour]]></category>
		<category><![CDATA[Cyber Security]]></category>
		<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Finance & Financial Services]]></category>
		<category><![CDATA[Health & Wellness]]></category>
		<category><![CDATA[Management]]></category>
		<category><![CDATA[Media & Entertainment]]></category>
		<category><![CDATA[Production]]></category>
		<category><![CDATA[Radio]]></category>
		<category><![CDATA[Self Help]]></category>
		<category><![CDATA[Sports & Athletics]]></category>
		<category><![CDATA[Tech Biz]]></category>
		<category><![CDATA[Tech Tips]]></category>
		<category><![CDATA[Techno-politics]]></category>
		<category><![CDATA[Travel & Transportation]]></category>
		<guid isPermaLink="false">https://shellypalmer.com/?p=201942</guid>

					<description><![CDATA[<p>This is the time of year I try to gather my observations and write them down. Most years, I craft a trend report or enumerate my "investable theses," but this year, I simply want to share what’s on my mind. These are not predictions or forecasts—just a collection of the ideas and concepts that have captured my attention. From the accelerating pace of innovation to the profound societal shifts driven by AI, here’s what I’m thinking about as 2024 comes to a close:</p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/12/things-that-have-my-attention-for-2025/">Things That Have My Attention For 2025</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></description>
										<content:encoded><![CDATA[<p>This is the time of year I try to gather my observations and write them down. Most years, I craft a trend report or enumerate my &#8220;investable theses,&#8221; but this year, I simply want to share what’s on my mind. These are not predictions or forecasts—just a collection of the ideas and concepts that have captured my attention. From the accelerating pace of innovation to the profound societal shifts driven by AI, here’s what I’m thinking about as 2024 comes to a close:</p>
<h2>The Great AI Transformation of 2025</h2>
<p>The great AI transformation of 2025 won’t be about technology—it will be about leadership. Technology evolves exponentially, as it always has, but tech alone is not the answer. When you teach someone to complete three hours of work in three minutes, they won’t automatically move to the next task unless they are motivated and properly compensated. This is a leadership challenge, not a technological one. Success in 2025 will belong to those who understand how to lead teams in this era of unprecedented productivity.</p>
<h2>Keeping Up to Date</h2>
<p>The pace of technological change has reached an almost incomprehensible velocity. For over 25 years, I’ve been preaching exponential change, yet I (like just about everyone) find myself struggling to keep up. Staying informed has gone from a strategic imperative to an ongoing battle against the overwhelming flood of information. AI-powered curation tools are no longer luxuries; they are necessities.</p>
<h2>Humans Are No Longer the Sole Writers of History</h2>
<p>AI has become a full co-writer of human history. Generative AI systems now shape culture, create narratives, and influence decisions at a scale we’ve never seen before. This evolution raises profound questions about authorship and authenticity.</p>
<h2>AGI: The New North Star</h2>
<p>Artificial General Intelligence (AGI) has supplanted AI as the ultimate goal for big model builders. The term &#8220;AI&#8221; feels almost quaint as tech giants race to define what a true AGI can achieve.</p>
<h2>Reasoning Engines</h2>
<p>Reasoning engines represent the next frontier in AI development. Unlike traditional LLMs, they aim to provide structured, logical decision-making. Their potential applications, from strategic planning to complex problem-solving, are boundless.</p>
<h2>Agentic Systems</h2>
<p>Everyone I know is working on agentic systems—software capable of autonomous action. Their development will redefine how we think about tools, assistants, and personal AI collaborators.</p>
<h2>Marketing to Bots</h2>
<p>As agentic systems become ubiquitous, marketing must adapt. Each of us will probably have multiple AI personas making decisions on our behalf. Brands will have to learn to target bots, not just humans, with personalized, bot-friendly messaging.</p>
<h2>The Loneliness Epidemic: Synthetic Companionship</h2>
<p>Platforms like character.ai are captivating audiences of all ages (but especially Gen-Alpha and Gen-Z). Synthetic companions are replacing traditional human interaction. Is this a passing fad, or are we witnessing the rise of a new social dynamic? Also, from a brand POV, are we authentically synthetic or synthetically authentic? It&#8217;s a new world.</p>
<h2>The End of Link-Based Search</h2>
<p>AI-driven summaries are putting a lot of stress on traditional, link-based search models. This shift poses existential questions for search engines, publishers, and the broader information ecosystem. </p>
<h2>Talking to Data</h2>
<p>Apps like Google’s NotebookLM and Meta&#8217;s NotebookLlama enable users to converse with their data, offering unprecedented insights and interactions. This transformative approach to information management is reshaping everything from education to enterprise decision-making.</p>
<h2>Productivity &#038; Workflow Innovation</h2>
<p>Simply increasing productivity with an AI stack isn’t enough. To unlock AI’s full potential, businesses must innovate workflows and processes at the same pace and with the same intensity as the technology is evolving.</p>
<h2>Human-AI Co-Worker Teams</h2>
<p>The future of work demands new rules for human-AI collaboration. Teams must redefine roles and expectations to fully integrate AI systems as partners, not tools.</p>
<h2>Creativity vs. Execution</h2>
<p>Humans excel at creativity; AI thrives in execution. Understanding this distinction is essential to assigning work effectively in hybrid human-AI teams. <a href="https://shellypalmer.com/2024/12/if-you-tell-the-difference-there-is-no-difference/" target="_blank">Here&#8217;s what I mean</a>.</p>
<h2>Multimodal Capabilities</h2>
<p>The rise of multimodal AI systems is empowering a new era of &#8220;social production,&#8221; where anyone can describe and produce complex outputs. This democratization of production capability is magical—and disruptive (just don&#8217;t confuse it with creativity).</p>
<h2>Creating a Culture of Continuous Adaptation</h2>
<p>The days of &#8220;change management&#8221; and “digital transformation” are over. Organizations must now build cultures that embrace perpetual change, fostering resilience and adaptability at every level. <a href="https://shellypalmer.com/2024/12/the-ai-native-flywheel-a-framework-for-continuous-adaptation/" target="_blank">We have a nice approach to this</a>.</p>
<h2>How LLMs Work</h2>
<p>A fundamental understanding of LLM technology—pre-training, inference workloads, and post-training—is essential. Mastering these concepts will separate leaders from laggards in 2025. We have a <a href="https://courses.shellypalmer.com/metacademy-generative-ai" target="_blank">course</a> that can help.</p>
<h2>AI’s Impact on Education</h2>
<p>AI has already reshaped teaching and learning at every level of education. The question for 2025 is: what’s next? From personalized tutoring to automated curriculum design, the possibilities are endless.</p>
<h2>AI Training Ethics</h2>
<p>The ethical implications of AI training are becoming impossible to ignore. From copyright law to biases baked into datasets, ensuring alignment between AI systems and societal values is essential for ethical deployment.</p>
<h2>AI Energy Consumption</h2>
<p>The environmental impact of AI cannot be overstated. Training large AI models requires vast amounts of energy, prompting the need for more efficient algorithms and sustainable infrastructure to minimize carbon footprints.</p>
<h2>We Are The Architects of the World We Want to Live in</h2>
<p>The future has never been more exciting or full of potential. This list is woefully incomplete. What’s capturing your attention? I’d love to hear your thoughts—feel free to <a href="https://shellypalmer.com/contact-us/" target="_blank">reach out and share what you’re thinking about</a>. Together, we can explore what’s new, what’s next, and what it means. Let’s shape the future, one idea at a time.</p>
<p class="text-6 text-left"><em><strong>Author’s note:</strong> This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.</em></p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/12/things-that-have-my-attention-for-2025/">Things That Have My Attention For 2025</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://shellypalmer.com/2024/12/things-that-have-my-attention-for-2025/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			<dc:creator>shelly@palmer.net (Shelly Palmer)</dc:creator></item>
		<item>
		<title>How Much Longer Can The Agency/Client Model Survive?</title>
		<link>https://shellypalmer.com/2024/11/how-much-longer-can-the-agency-client-model-survive/</link>
					<comments>https://shellypalmer.com/2024/11/how-much-longer-can-the-agency-client-model-survive/#respond</comments>
		
		
		<pubDate>Sun, 03 Nov 2024 18:15:50 +0000</pubDate>
				<category><![CDATA[Advertising & Marketing]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Automotive & Vehicles]]></category>
		<category><![CDATA[Beauty & Fitness]]></category>
		<category><![CDATA[Blog]]></category>
		<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Finance & Financial Services]]></category>
		<category><![CDATA[Health & Wellness]]></category>
		<category><![CDATA[Law & Government]]></category>
		<category><![CDATA[Management]]></category>
		<category><![CDATA[Media & Entertainment]]></category>
		<category><![CDATA[News & Journalism]]></category>
		<category><![CDATA[Production]]></category>
		<category><![CDATA[Radio]]></category>
		<category><![CDATA[Tech Biz]]></category>
		<category><![CDATA[Techno-politics]]></category>
		<category><![CDATA[Travel & Transportation]]></category>
		<guid isPermaLink="false">https://shellypalmer.com/?p=201765</guid>

					<description><![CDATA[<p>While the FTE-based model is flawed and often exploited, its one saving grace is that it creates predictable billing for both sides. But there’s a new problem. With a single AI-enhanced, well-trained FTE capable of outproducing entire teams, the math changes. Now what?</p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/11/how-much-longer-can-the-agency-client-model-survive/">How Much Longer Can The Agency/Client Model Survive?</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></description>
										<content:encoded><![CDATA[<p>“The agency model is broken.” This refrain is so worn out it’s a cliché. Yet, despite its obvious flaws, the model has survived for decades. Here’s how it works: agencies base their fees on the number of “Full-Time Equivalents” (FTEs) assigned to a client—essentially, headcount. In theory, more complex projects require more FTEs, allowing agencies to scale revenue with staff allocation. More FTEs mean higher fees; simple math, straightforward billing.</p>
<h2>When 1=12</h2>
<p>While the FTE-based model is flawed and often exploited, its one saving grace is that it creates predictable billing for both sides. But there’s a new problem. With a single AI-enhanced, well-trained FTE capable of outproducing entire teams, the math changes. Now what?</p>
<h2>The Impact of AI on Agency Economics</h2>
<p>For agencies, AI-enhanced productivity is quickly creating a serious financial paradox. The FTE-based model, which ties revenue to headcount, is at odds with increased productivity. Said differently, agencies don&#8217;t benefit from doing more with less.</p>
<p>Clients are quick to see these opportunities and expect cost savings, pressuring agencies to pass AI-driven efficiencies through lower fees. Brands, focused on maximizing value, view AI-enabled productivity as a way to achieve more for less. This has created an uncomfortable tension: agencies that don’t evolve risk irrelevance, while those that do must find ways to defend or reinvent their fee structures in a world that needs fewer human resources.</p>
<h2>Outcome Based Agency Fees</h2>
<p>Some agencies are working on transitioning to outcome-based fees (a page out of the performance marketing playbook). In this model, compensation is tied to measurable results like increased sales, higher engagement, or improved brand visibility. This approach directly links agency revenue to performance and better aligns agency efforts with client success.</p>
<p>However, outcome-based fee structures come with their own challenges. Establishing fair and achievable targets requires significant trust and transparency. Both sides need to agree on metrics and understand how factors like market conditions or client-side execution can influence outcomes. </p>
<p>For agencies, this model demands advanced data tracking and analytics capabilities to validate their impact—an area where many clients either lack the necessary infrastructure or are unwilling to share the data needed for accurate attribution.</p>
<h2>Alternative Models</h2>
<p>There are several alternative models being considered by my agency clients, including project-based pricing, value-based pricing, and even subscription-based services. Each of these models or a hybrid model may ultimately win the day. But there is another approach. </p>
<h2>The Future Is Clear, But The Model Is Less So</h2>
<p>WPP, Omnicom, Publicis, IPG, and Dentsu are all racing to integrate AI technologies into their service offerings. They are heavily investing in proprietary AI platforms and are quickly evolving into tech-centric entities. How quickly? Far more quickly than procurement departments are going to evolve their approach to agency selection.</p>
<p>What about the client side? Every one of my Fortune 500 clients has dozens (sometimes hundreds) of AI projects underway. Many marketing organizations are deploying in-house AI tech stacks dedicated to specific business outcomes such as media optimization, creative optimization, and personalization. If this trend continues, will they even need their agencies?</p>
<p>This is not a tech issue, it&#8217;s a leadership issue. Agencies serve an extremely important purpose. They bring fresh, bright talent with an outside perspective and have the capacity to explore and innovate in ways that in-house teams do not. As AI forces us to separate creativity (humans) and execution (machines), new innovative workflows will naturally emerge. Human-AI co-worker teams will attend to these workflows and all of this will require leadership to evolve. </p>
<p>Which leaves us where we began. “The agency model is broken.” A refrain so worn out it’s a cliché. Yet, despite its obvious flaws, the model endures. How long will it survive? Your guess is as good as mine. But I&#8217;m looking forward to writing about the brave business leader who steps up and changes the game.</p>
<p class="text-6 text-left"><em><strong>Author’s note:</strong> This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.</em></p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/11/how-much-longer-can-the-agency-client-model-survive/">How Much Longer Can The Agency/Client Model Survive?</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://shellypalmer.com/2024/11/how-much-longer-can-the-agency-client-model-survive/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			<dc:creator>shelly@palmer.net (Shelly Palmer)</dc:creator></item>
		<item>
		<title>Meta’s New AI Video Tools</title>
		<link>https://shellypalmer.com/2024/10/metas-new-ai-video-tools/</link>
					<comments>https://shellypalmer.com/2024/10/metas-new-ai-video-tools/#respond</comments>
		
		
		<pubDate>Wed, 09 Oct 2024 13:42:15 +0000</pubDate>
				<category><![CDATA[Advertising & Marketing]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Automotive & Vehicles]]></category>
		<category><![CDATA[Beauty & Fitness]]></category>
		<category><![CDATA[Blog]]></category>
		<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Finance & Financial Services]]></category>
		<category><![CDATA[Health & Wellness]]></category>
		<category><![CDATA[Law & Government]]></category>
		<category><![CDATA[Management]]></category>
		<category><![CDATA[Media & Entertainment]]></category>
		<category><![CDATA[News & Journalism]]></category>
		<category><![CDATA[Production]]></category>
		<category><![CDATA[Radio]]></category>
		<category><![CDATA[Sports & Athletics]]></category>
		<category><![CDATA[Tech Biz]]></category>
		<category><![CDATA[Travel & Transportation]]></category>
		<guid isPermaLink="false">https://shellypalmer.com/?p=201637</guid>

					<description><![CDATA[<p>Meta has launched Movie Gen, a new AI-powered video creation tool for small businesses. Movie Gen allows users to generate video content from prompts, edit existing videos, and add synchronized sound effects and music. It will not be available for open developer use but is intended for collaboration with the entertainment industry and content creators.</p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/10/metas-new-ai-video-tools/">Meta&#8217;s New AI Video Tools</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></description>
										<content:encoded><![CDATA[<p>Meta has launched Movie Gen, a new AI-powered video creation tool for small businesses. Movie Gen allows users to generate video content from prompts, edit existing videos, and add synchronized sound effects and music. It will not be available for open developer use but is intended for collaboration with the entertainment industry and content creators.</p>
<p>The big win for businesses is lower production costs – not only direct reduction of non-working media budgets, but also a significant decrease in the workflow timelines and an even more significant reduction in the number of people who will have to be involved.</p>
<p>Movie Gen joins other AI video generation tools such as OpenAI&#8217;s Sora, Google&#8217;s Lumiere, Runway Gen-2, Stability AI&#8217;s Stable Video Diffusion, Pika Labs, and Nvidia&#8217;s Vid2Vid, but it is purpose-built to help users quickly create video ads suitable for Facebook and Instagram.</p>
<p>The value of generative AI video tools is clear. The only question is, &#8220;When will one of these tools suit your needs?&#8221; For many, the answer will be Meta&#8217;s Movie Gen. And, remember, today is the worst these tools will ever be. They will all improve exponentially.</p>
<p class="text-6 text-left"><em><strong>Author’s note:</strong> This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.</em></p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/10/metas-new-ai-video-tools/">Meta&#8217;s New AI Video Tools</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://shellypalmer.com/2024/10/metas-new-ai-video-tools/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			<dc:creator>shelly@palmer.net (Shelly Palmer)</dc:creator></item>
		<item>
		<title>Ed Sheeran Had to Win. But What About AI?</title>
		<link>https://shellypalmer.com/2024/05/ed-sheeran-had-to-win/</link>
					<comments>https://shellypalmer.com/2024/05/ed-sheeran-had-to-win/#respond</comments>
		
		
		<pubDate>Sun, 26 May 2024 04:01:16 +0000</pubDate>
				<category><![CDATA[Advertising & Marketing]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Blog]]></category>
		<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Management]]></category>
		<category><![CDATA[Media & Entertainment]]></category>
		<category><![CDATA[Production]]></category>
		<category><![CDATA[Radio]]></category>
		<category><![CDATA[Techno-politics]]></category>
		<guid isPermaLink="false">https://shellypalmer.com/?p=196663</guid>

					<description><![CDATA[<p>In May 2023 there was a big verdict in the world of copyright law. The decision? Ed Sheeran's 2014 hit, "Thinking Out Loud," did not infringe on Marvin Gaye's and Ed Townsend's 1973 hit, "Let's Get It On." As I will explain in a moment, from a musical standpoint, this was the only possible outcome. The copyright laws were applied as expected, and human composers in the United States are free to compose as they have been doing for centuries. But what about AI? Is it free to compose? Let's explore.</p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/05/ed-sheeran-had-to-win/">Ed Sheeran Had to Win. But What About AI?</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></description>
										<content:encoded><![CDATA[<p>In May 2023 there was a big verdict in the world of copyright law. The decision? Ed Sheeran&#8217;s 2014 hit, &#8220;Thinking Out Loud,&#8221; did not infringe on Marvin Gaye&#8217;s and Ed Townsend&#8217;s 1973 hit, &#8220;Let&#8217;s Get It On.&#8221; As I will explain in a moment, from a musical standpoint, this was the only possible outcome. The copyright laws were applied as expected, and human composers in the United States are free to compose as they have been doing for centuries. But what about AI? Is it free to compose? Let&#8217;s explore.</p>
<h2>My Musical Background</h2>
<p>As many of you know, I spent most of my early professional life as a <a href="https://shellypalmer.com/production/music-production/" rel="noopener" target="_blank">composer/producer</a> of commercial music. Fun fact: I won ASCAP&#8217;s 12th Annual Film &#038; TV Music Award for &#8220;Top Television Series.&#8221; The next year, I won an ASCAP award for &#8220;Most Performed Television Themes.&#8221; I&#8217;ve written a lot of music for TV, so I have a pretty good understanding of copyright law in the context of what will and will not get you sued.</p>
<h2>A Quick Musical Analysis</h2>
<p>The plaintiff in this case claimed that Sheeran&#8217;s song had &#8220;striking similarities&#8221; to Gaye &#038; Townsend&#8217;s work and that they shared &#8220;overt common elements.&#8221; If this were the way copyright law worked, the plaintiffs would have won.</p>
<p>While it&#8217;s true that the groove (meter and rhythm) of the background tracks and the orchestration (instruments and sounds) of both songs are strikingly similar, in the real musical world (and the world covered by copyright law) none of that matters at all. The case was decided based on the only part of a song that is covered by copyright – the melody and lyrics. In this case, the melodies and lyrics of both works are completely different. In fact, they are strikingly different.</p>
<h2>What about the chord progressions? Aren&#8217;t they almost identical?</h2>
<p>To a layperson&#8217;s ear, the answer is a clear yes. However, a paid, professional musicologist on a witness stand could (and did) make the argument that the chord progressions of the verses of both songs are not functionally identical.</p>
<p>The purely academic argument goes like this: The verses of &#8220;Let&#8217;s Get It On&#8221; are built on a progression that you would notate as I &#8211; iii &#8211; IV &#8211; V. The second chord in the progression, known as the three-minor (iii), is (in the key of D), an F#min, which contains the notes F# A C#. The chord progression for &#8220;Thinking Out Loud&#8221; is notated as I &#8211; I6 &#8211; IV &#8211; V. If the song was also in the key of D, the second chord, known as the one-six (I6), would be a D/F# (D major with F# in the bass), which contains the notes F# A D.</p>
<p>In an academic debate regarding functional harmony, you could argue that the second chord in each song serves a different function. In both songs, the I chord functions as the tonic, the IV chord functions as the subdominant, and the V chord functions as the dominant. Here&#8217;s where it gets debatable. In the interpretive world of functional harmony, the chords in question can be said to serve different musical functions. The I6 in Sheeran&#8217;s song functions as a tonic. The iii in Gaye &#038; Townsend&#8217;s song functions as a relative dominant. So, musicologically speaking, even though the chords in both songs sound about the same and even share two of their three notes, they are technically different.</p>
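<p>To make that concrete, here is a minimal Python sketch (mine, not part of the original analysis) comparing the pitch content of the two second chords, both taken in the key of D:</p>

```python
# Hedged sketch: compare the pitch content of the two "second chords"
# discussed above, both in the key of D.
iii_chord = {"F#", "A", "C#"}  # F# minor -- the iii of "Let's Get It On"
i6_chord = {"F#", "A", "D"}    # D/F# (first inversion) -- the I6 of "Thinking Out Loud"

shared = iii_chord & i6_chord  # notes the two chords have in common
print(sorted(shared))          # prints ['A', 'F#']
```

<p>Two of the three notes overlap, which is why the chords sound nearly identical to a lay listener even though a musicologist can assign them different harmonic functions.</p>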
<h2>You Can&#8217;t Own Chords</h2>
<p>For many people (the plaintiffs specifically), this technical argument is nonsense, and since the groove sounds the same and the chords sound the same and the rhythm sounds the same, &#8220;Thinking Out Loud&#8221; must be a rip-off of &#8220;Let&#8217;s Get It On.&#8221; Nope. You can&#8217;t own chord progressions. If you could, there would be no popular music.</p>
<p>For as long as there has been music, there has been the practice of &#8220;contrafact,&#8221; the use of another song&#8217;s chord progression to create a new song. Every 12-bar blues song ever written sits on the chord progression I &#8211; I &#8211; I &#8211; I &#8211; IV &#8211; IV &#8211; I &#8211; I &#8211; V &#8211; IV &#8211; I &#8211; (I or V). Not some 12-bar blues songs. All of them.</p>
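<p>The roman-numeral shorthand can be expanded mechanically. This illustrative Python snippet (my own; <code>chords_in_key</code> is a hypothetical helper, not from the article) renders the 12-bar blues progression as chord roots in any key, which is exactly why the progression belongs to no one:</p>

```python
# Illustrative sketch: expand the 12-bar blues roman numerals into chord roots.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
BLUES = ["I", "I", "I", "I", "IV", "IV", "I", "I", "V", "IV", "I", "I"]  # final bar may also be V
OFFSETS = {"I": 0, "IV": 5, "V": 7}  # semitones above the tonic

def chords_in_key(tonic):
    root = NOTES.index(tonic)
    return [NOTES[(root + OFFSETS[numeral]) % 12] for numeral in BLUES]

print(chords_in_key("D"))  # ['D', 'D', 'D', 'D', 'G', 'G', 'D', 'D', 'A', 'G', 'D', 'D']
```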
<p>This practice is an important part of how new music is made. The chords to George Gershwin&#8217;s &#8220;I Got Rhythm&#8221; are so beloved and so important they are professionally referred to as &#8220;Rhythm Changes,&#8221; and an untold number of songs have been written on top of them. You&#8217;re probably familiar with the theme song from &#8220;The Flintstones,&#8221; but the same chords are used for &#8220;Anthropology,&#8221; &#8220;Dexterity,&#8221; &#8220;Moose the Mooche,&#8221; &#8220;Steeplechase,&#8221; and &#8220;Red Cross,&#8221; to name a few. If Ed Sheeran lost this case, hundreds of composers would owe royalties to the Gershwin estate. In fact, thousands of composers would owe royalties to thousands of other composers who came before them. </p>
<h2>You Can Own Melody &#038; Lyrics</h2>
<p>The Ed Sheeran lawsuit was decided correctly because while you can&#8217;t own chords, or feel, or groove, you can own melody and lyrics. Sheeran&#8217;s melody was unique. His lyrics were unique. Case closed.</p>
<h2>What About Intent?</h2>
<p>All of the musicological analysis goes away when the copyright holder can prove that the creator of the new work intended to infringe, trade off, or profit from the original work. The difference between &#8220;inspired by&#8221; and &#8220;blatantly and intentionally ripped off&#8221; is a triable issue of fact. When you intentionally rip off someone else&#8217;s work, you can be sued and will most likely lose.</p>
<h2>AI-Generated Music</h2>
<p>Which brings us to the biggest copyright issue of our day: music created by AI. At the moment, it is not copyrightable. According to the U.S. Constitution, Article I, Section 8, Clause 8: [The Congress shall have Power . . . ] &#8220;To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries.&#8221; The Constitution grants these rights to people, not machines.</p>
<p>This is where the law is today, but is it where it should be? We&#8217;ve established that all music has derivative parts and original parts. We don&#8217;t let current composers profit from pre-existing chord progressions or rhythms or orchestrations. We do protect melodies and lyrics. </p>
<p>What if I prompt an AI to create a new melody and new lyrics over a backing track with striking similarities to &#8220;Let&#8217;s Get It On&#8221; or &#8220;Thinking Out Loud&#8221;? At the moment, it could not be protected by copyright, but it could easily infringe on either or both of these songs if the melody or lyrics were close.</p>
<h2>There&#8217;s not much you can do</h2>
<p>The copyright office holds hearings once in a while. But there&#8217;s nothing slated for 2024. You can check the official copyright.gov <a href="https://www.copyright.gov/ai/?loclr=blogcop" rel="noopener" target="_blank">Copyright and Artificial Intelligence</a> page. Other than that, I would suggest reaching out to your elected officials and letting them know how you feel.</p>
<p><em><strong>Author’s note:</strong> This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This article was originally published on May 7, 2023. It was updated to include a link to the official copyright office website on May 26, 2024.</em></p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/05/ed-sheeran-had-to-win/">Ed Sheeran Had to Win. But What About AI?</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://shellypalmer.com/2024/05/ed-sheeran-had-to-win/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			<dc:creator>shelly@palmer.net (Shelly Palmer)</dc:creator></item>
		<item>
		<title>Building Your First Synthetic Employee</title>
		<link>https://shellypalmer.com/2024/04/building-your-first-synthetic-employee/</link>
					<comments>https://shellypalmer.com/2024/04/building-your-first-synthetic-employee/#respond</comments>
		
		
		<pubDate>Sun, 14 Apr 2024 15:21:59 +0000</pubDate>
				<category><![CDATA[Advertising & Marketing]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Blog]]></category>
		<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Management]]></category>
		<category><![CDATA[Media & Entertainment]]></category>
		<category><![CDATA[Mobile & Wireless]]></category>
		<category><![CDATA[Production]]></category>
		<category><![CDATA[Radio]]></category>
		<category><![CDATA[STEM]]></category>
		<category><![CDATA[Tech Biz]]></category>
		<category><![CDATA[Techno-politics]]></category>
		<guid isPermaLink="false">https://shellypalmer.com/?p=200625</guid>

					<description><![CDATA[<p>What would it take to build your company's first synthetic employee? What AI models could you use? How much data will it require? What about compute power? Electricity? Technical resources? Security? Legal? Risk? Ethics? Is the idea simply too crazy to think about now? Should you just wait until someone invents artificial general intelligence (AGI)?</p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/04/building-your-first-synthetic-employee/">Building Your First Synthetic Employee</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></description>
										<content:encoded><![CDATA[<p class="text-6 text-left">Illustration created by DALL-E with the prompt &#8220;Create a 16&#215;9 image for this blog post.&#8221;</p>
<p>&nbsp;</p>
<p>What would it take to build your company&#8217;s first synthetic employee? What AI models could you use? How much data will it require? What about compute power? Electricity? Technical resources? Security? Legal? Risk? Ethics? Is the idea simply too crazy to think about now? Should you just wait until someone invents artificial general intelligence (AGI)?</p>
<p>Over the past year, I&#8217;ve had the pleasure of leading several workshops on this very subject for some of our global clients. Some key takeaways:</p>
<ol>
<li>You don&#8217;t need AGI to do this.</li>
<li>Getting everyone aligned on what it will take to turn AI into a competitive advantage is invaluable.</li>
<li>The exercise will help you organize all the disparate AI projects that are already in progress across your organization.</li>
</ol>
<h2>Hallway Handle</h2>
<p>If you work in a large organization, &#8220;our first synthetic employee&#8221; will become the “hallway handle.” (That’s the way people passing in the hallway will refer to your project.) Imagine two colleagues passing each other in the hallway. Without stopping to chat, they begin a very short dialog: “Hey Joe!” “Yo! John.” “What are you working on?” “The synthetic employee project.” “Yeah, I heard about that. That’s awesome!” This kind of alignment and cross-company label can serve as an excellent north star for the project.</p>
<h2>Synthia (Synthetic + Intelligence Advanced)</h2>
<p>Back to the job at hand. Let&#8217;s outline some of the things that must come together to create &#8220;Synthia,&#8221; your company&#8217;s first synthetic employee.</p>
<h2>Core Functions and Capabilities</h2>
<ul>
<li><strong>Advanced Decision Making:</strong> Uses deep learning models to make strategic business decisions based on real-time global economic data, market trends, and internal business analytics.</li>
<li><strong>Autonomous Problem Solving:</strong> Equipped with reinforcement learning algorithms that adapt over time, enabling the AI to solve complex business challenges independently.</li>
<li><strong>Natural Language Processing (NLP):</strong> State-of-the-art NLP capabilities allow Synthia to understand, respond, and generate human-like text for communications, reports, and presentations. This includes multilingual support to communicate and operate globally.</li>
<li><strong>Multimodal LLM: </strong>(Which would probably handle all of the required NLP.) Integrates and interprets various data types &#8212; including text, images, and audio &#8212; to understand and process complex information from diverse inputs, enhancing decision-making and interaction capabilities across different mediums.</li>
<li><strong>Predictive Analytics:</strong> Utilizes machine learning models to predict market changes, customer behavior, and supply chain disruptions.</li>
<li><strong>Personalization Engines</strong>: Tailors marketing strategies and customer interactions based on individual consumer data analysis.</li>
<li><strong>Robotic Process Automation (RPA): </strong>Automates routine and complex tasks across all departments (such as finance, HR, and operations) to improve efficiency and accuracy.</li>
<li><strong>Computer Vision:</strong> Processes and analyzes visual data from the environment to aid in tasks such as quality control, surveillance, and customer interaction in retail settings.</li>
<li><strong>Emotional Intelligence:</strong> Uses affective computing to understand and react to human emotions, enhancing customer service and employee interactions.</li>
<li><strong>Ethical AI Governance:</strong> Monitors and ensures that all AI operations adhere to ethical AI standards and compliance with global regulations.</li>
</ul>
<h2>Data Requirements</h2>
<h3>0th Party Data (Directly from the source)</h3>
<ul>
<li><strong>Internal Business Operations Data:</strong> Real-time access to company databases and sensors for operations, finance, HR, production, etc.</li>
<li><strong>Employee Input:</strong> Direct feedback and inputs from employees via custom apps or internal systems.</li>
</ul>
<h3>1st Party Data (Directly collected from users)</h3>
<ul>
<li><strong>Customer Interaction Data:</strong> Data collected from websites, apps, and physical locations (e.g., stores, kiosks) about customer preferences, behaviors, and feedback.</li>
<li><strong>User-Generated Content:</strong> Insights from customer reviews, social media interactions, and other forms of user-generated content.</li>
</ul>
<h3>2nd Party Data (Partner data)</h3>
<ul>
<li><strong>Strategic Partner Data:</strong> Shared data from partnerships (such as supply chain information, co-developed technology insights, and market analysis).</li>
<li><strong>Industry Benchmarking:</strong> Data shared through alliances or consortiums for benchmarking and best practices.</li>
</ul>
<h3>3rd Party Data (Externally acquired)</h3>
<ul>
<li><strong>Market Data:</strong> Data purchased from market research firms, financial data services, and economic analysts.</li>
<li><strong>Competitive Intelligence:</strong> Data about competitors&#8217; activities, sourced from legal and ethical intelligence services.</li>
<li><strong>Global and Regional Regulations:</strong> Updates on regulations that impact various aspects of business, sourced from regulatory bodies and legal services.</li>
</ul>
<h2>Integration and Security</h2>
<ul>
<li><strong>Hybrid Cloud Infrastructure:</strong> To ensure scalability and security, leveraging both private and public cloud services.</li>
<li><strong>Advanced Cybersecurity Measures:</strong> Use of encryption, anomaly detection, and AI-driven threat intelligence systems to protect sensitive data and operations.</li>
</ul>
<h2>Ethics and Compliance</h2>
<ul>
<li><strong>AI Ethics Board:</strong> Establishment of a board to oversee ethical AI use, ensuring that Synthia operates under strict ethical guidelines to avoid biases and respect privacy.</li>
<li><strong>Regulatory Compliance Monitoring:</strong> Automated systems to keep up with global regulations and ensure compliance.</li>
</ul>
<h2>Should You Wait For AGI?</h2>
<p>Synthia represents a first pass at an additive theory of AI integration. The strategy is to start with tested individual tools and build your synthetic employee as if you were building with Lego bricks. Though primitive by comparison to what&#8217;s coming, a system like this has the potential to dramatically enhance operational efficiency, decision-making speed (and accuracy), and customer satisfaction. Are you ready to start the process? We&#8217;re here to help.</p>
<p class="text-6 text-left"><em><strong>Author’s note:</strong> This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.</em></p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/04/building-your-first-synthetic-employee/">Building Your First Synthetic Employee</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://shellypalmer.com/2024/04/building-your-first-synthetic-employee/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			<dc:creator>shelly@palmer.net (Shelly Palmer)</dc:creator></item>
		<item>
		<title>CES 2024 Day 0</title>
		<link>https://shellypalmer.com/2024/01/ces-2024-day-0/</link>
					<comments>https://shellypalmer.com/2024/01/ces-2024-day-0/#respond</comments>
		
		
		<pubDate>Sun, 07 Jan 2024 14:15:34 +0000</pubDate>
				<category><![CDATA[Advertising & Marketing]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Blockchain]]></category>
		<category><![CDATA[Blog]]></category>
		<category><![CDATA[Cyber Security]]></category>
		<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Electronics]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[Management]]></category>
		<category><![CDATA[Media & Entertainment]]></category>
		<category><![CDATA[Mobile & Wireless]]></category>
		<category><![CDATA[Production]]></category>
		<category><![CDATA[Radio]]></category>
		<category><![CDATA[Tech Biz]]></category>
		<category><![CDATA[Techno-politics]]></category>
		<guid isPermaLink="false">https://shellypalmer.com/?p=200101</guid>

					<description><![CDATA[<p>Technology is meaningless unless it changes the way we behave. So far, I’ve found some super interesting new tech that I will write about after the embargoes are lifted and CES is officially underway.</p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/01/ces-2024-day-0/">CES 2024 Day 0</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></description>
										<content:encoded><![CDATA[<p class="text-6 text-left">Illustration created by DALL-E with the prompt &#8220;A photorealistic image of thousands of people in the central corridor of the Las Vegas Convention Center attending the annual CES show. Aspect Ratio 16&#215;9&#8221;</p>
<p>&nbsp;</p>
<p>CES starts Tuesday morning, but for me Day 0 was yesterday. I always get to Las Vegas a few days early and use the time (before the show officially opens) to walk the show floor and search for technologies that meet my &#8220;that&#8217;s interesting&#8221; threshold. My filter is pretty simple: Technology is meaningless unless it changes the way we behave. So far, I&#8217;ve found some super interesting new tech that I will write about after the embargoes are lifted and CES is officially underway. Until then, I thought it would be helpful to summarize what I&#8217;m expecting to see.</p>
<p>For those of you who don&#8217;t know what CES is, it used to be called the Consumer Electronics Show, but times have changed, so now it&#8217;s just called CES. The Consumer Technology Association (which used to be the Consumer Electronics Association) stages the show. It is, by far, the largest and most important tech show in the US. The stats will be public after the show, but about 140,000&#8211;150,000 people are expected, along with roughly 2 million square feet of exhibit space and approximately 3,500 exhibitors. The show is huge &#8211; which is why we&#8217;ve been offering Executive Briefings and Floor Tours since 1996.</p>
<p>I&#8217;ll be hosting some extraordinary fireside chats which you can learn more about at <a href="https://shellypalmer.com/events/ces-2024-shelly-palmer-events/" rel="noopener" target="_blank">ces.shellypalmer.com</a>. My guests will include <strong>Linda Yaccarino</strong>, CEO of <strong>X</strong>; the one and only <strong>Mark Cuban</strong>; and the legendary <strong>will.i.am</strong>. <a href="https://shellypalmer.com/events/ces-2024-shelly-palmer-events/" rel="noopener" target="_blank">Request your invitations here</a>.</p>
<p>Now, a quick overview of what&#8217;s coming. </p>
<h2>AI</h2>
<p>First, it&#8217;s AI everything &#8211; which probably goes without saying. AI&#8217;s influence is evident across all sectors at CES 2024. From AI-powered grills to laptops with dedicated AI keys, the integration of AI into consumer technology is more pronounced than ever.</p>
<h2>TV</h2>
<p>TV tech continues to be a cornerstone of CES. LG has already announced its 2024 lineup, focusing on AI processing enhancements rather than major panel upgrades. Samsung, known for its trailblazing approach, may have a few surprises in store. Expect AI to play a significant role in optimizing picture and sound quality.</p>
<h2>Auto</h2>
<p>Despite the absence of some major auto players, CES 2024 will still feature significant auto-focused announcements. Honda is set to unveil a new &#8220;global EV series,&#8221; and other major car companies like Mercedes-Benz, Hyundai, and BMW are expected to share their latest developments. Intel and Amazon, as emerging auto suppliers, will also be showcasing new products. And, the Sony/Honda Afeela has its own exhibit space this year.</p>
<h2>Computing</h2>
<p>Better chips and bigger laptops are a trend. With Intel&#8217;s new Meteor Lake chips, a plethora of laptops sporting updated processors will be on display. Expect to see innovative designs and experiments in laptop technology, possibly including larger display sizes, following Apple&#8217;s trend with 14- and 16-inch models.</p>
<h2>Smart Home</h2>
<p>AI is also the buzzword for the smart home this year. Samsung and LG are leading the charge with AI-powered appliances and home robots. The focus will be on how AI can simplify and enhance the smart home experience.</p>
<h2>Gaming</h2>
<p>For gamers, NVIDIA is expected to announce its first RTX 40-series Super cards, and AMD might launch its RX 7600 XT GPU. OLED monitors for PC gaming, with improved refresh rates, will be a highlight. Handheld gaming PCs might also see new entrants, competing with Valve&#8217;s popular Steam Deck.</p>
<h2>Phones</h2>
<p>While major phone announcements are expected post-CES, we might still see glimpses into new phones with Qualcomm&#8217;s Snapdragon 8 Gen 3 chips, emphasizing generative AI.</p>
<h2>Health, Wearables &#038; Experiential</h2>
<p>Lastly, in recent years, CES has become a platform for unconventional wearable ideas. Withings will likely showcase its vision for telehealth&#8217;s future, and smart rings are poised to gain more attention. Over-the-counter hearing aids and AR smart glasses concepts are also anticipated.</p>
<h2>The Center of the Tech Universe</h2>
<p>CES is always my favorite place to be in January. I&#8217;m excited to be here in Las Vegas and equally excited to share what I learn with you. Stay tuned. I&#8217;ll be doing frequent updates here and on social media as well.</p>
<p class="text-6 text-left"><em><strong>Author’s note:</strong> This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.</em></p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2024/01/ces-2024-day-0/">CES 2024 Day 0</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://shellypalmer.com/2024/01/ces-2024-day-0/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			<dc:creator>shelly@palmer.net (Shelly Palmer)</dc:creator></item>
		<item>
		<title>AI Music Generators</title>
		<link>https://shellypalmer.com/2023/11/ai-music-generators/</link>
					<comments>https://shellypalmer.com/2023/11/ai-music-generators/#respond</comments>
		
		
		<pubDate>Sun, 12 Nov 2023 12:53:09 +0000</pubDate>
				<category><![CDATA[Advertising & Marketing]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Blog]]></category>
		<category><![CDATA[Data Science]]></category>
		<category><![CDATA[Media & Entertainment]]></category>
		<category><![CDATA[Production]]></category>
		<category><![CDATA[Radio]]></category>
		<guid isPermaLink="false">https://shellypalmer.com/?p=198995</guid>

					<description><![CDATA[<p>I just started working on a vast music project that is screaming to be done in partnership with an AI co-arranger, co-orchestrator, co-audio engineer, co-musician(s), co-producer, along with fully automated mixing, mastering, and packaging for distribution. Do all of these functions exist in one AI platform? Here's a list of notable AI music generators:</p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2023/11/ai-music-generators/">AI Music Generators</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></description>
										<content:encoded><![CDATA[<p class="text-6 text-left">Illustration created by DALL-E with the prompt &#8220;A 16&#215;9 painting combining all of the functions of generative AI music production (text-to-music) including co-arranger, co-orchestrator, co-audio engineer, co-musician(s), co-producer, along with fully automated mixing, mastering, and packaging for distribution. colorful, beautiful, welcoming, fascinating, bright, dreamy&#8221;</p>
<p>&nbsp;</p>
<p>I just started working on a vast music project that is screaming to be done in partnership with an AI co-arranger, co-orchestrator, co-audio engineer, co-musician(s), co-producer, along with fully automated mixing, mastering, and packaging for distribution. Do all of these functions exist in one AI platform? Not at a level that I would deem capable of producing music on par with work done by human beings using state-of-the-art tools.</p>
<p>That said, I&#8217;ve been paying close attention to several AI music tools over the past few years – carefully monitoring AI&#8217;s infusion into each aspect of the music realization process – the journey music goes through from being heard only in one&#8217;s mind&#8217;s ear to being heard and felt by others.</p>
<p>Because of AI&#8217;s new-found popularity (and some advances in consumer access to the technology), new apps seem to surface daily. So, I&#8217;m asking the hive mind to help me research the best of the best AI music generators &#8211; <a href="/contact-us/" rel="noopener" target="_blank">click here</a> to make a suggestion.</p>
<p>If you&#8217;re wondering what AI music generators are, you can think of them as sophisticated tools that create music using various technologies, including neural networks and machine learning algorithms. These generators can produce unique compositions or emulate existing styles, with some offering instant music generation and others requiring pre-training on music datasets. They offer a range of functionalities to suit both novices and professionals, streamlining the music creation process and fostering innovation in the field. Here&#8217;s an alphabetical list of notable (pardon the pun) AI music generators:</p>
<p><a href="https://www.aimi.fm/" rel="noopener" target="_blank">Aimi</a> offers a unique, interactive music player that allows users to directly influence the composition of music in real-time, providing a personalized music experience. Additionally, Aimi Studio leverages AI for music creation, aiding both novice and experienced creators in transforming their musical ideas into reality with ease and creative liberty. </p>
<p><a href="https://www.aiva.ai/" rel="noopener" target="_blank">AIVA</a> has been recognized for its capability to compose music for a range of media, including games and films. It simplifies the creation process by offering preset styles and allows for modifications to existing tracks, bypassing complex licensing issues.</p>
<p><a href="https://algonaut.audio/" rel="noopener" target="_blank">Algonaut Atlas 2</a> analyzes the user&#8217;s music library to produce new compositions, equipped with a library of sounds and the ability to create MIDI files.</p>
<p><a href="https://amadeuscode.com/app/en" rel="noopener" target="_blank">Amadeus Code</a> is an iOS app that allows for quick melody composition, drawing from chord progressions of famous songs to inspire new creations. It supports gesture-based composition and exports to audio editing software, with payment required for song retention.</p>
<p><a href="https://www.audiocipher.com/" rel="noopener" target="_blank">AudioCipher</a> is a word-to-MIDI music generator that enables users to type in a phrase and instantly create melodies and chord progressions. It helps break through creative blocks by offering a variety of scales, chords, and rhythms to generate unlimited musical variations.</p>
<p><a href="https://www.bandlab.com/songstarter" rel="noopener" target="_blank">BandLab SongStarter</a> is a creative platform for music composition, offering collaboration features and a developer API for integration into other software.</p>
<p><a href="https://www.beatoven.ai/" rel="noopener" target="_blank">Beatoven.AI</a> simplifies the process for content creators to generate royalty-free background music using AI. Users can choose from various genres and moods, make cuts to match content changes, and allow the AI to compose unique tracks, enhancing content creation with personalized music.</p>
<p><a href="https://boomy.com/" rel="noopener" target="_blank">Boomy</a> is a unique tool that allows for quick song creation, offering users the opportunity to earn streaming revenue. It generates complete songs based on selected criteria, with each user receiving a personalized profile for optimized music creation.</p>
<p><a href="https://www.brain.fm/" rel="noopener" target="_blank">Brain.fm</a> provides music designed to improve focus, relaxation, and sleep, using AI to tailor compositions to the listener&#8217;s brainwaves and desired cognitive state.</p>
<p><a href="https://covers.ai/" rel="noopener" target="_blank">Covers.AI</a> is a versatile AI music generator platform, offering an easy-to-use AI voice generator that allows users to create songs effortlessly. Users can simply sing, and the AI handles the technical aspects, transforming their input into a new song in minutes. The platform also enables the creation of AI covers using a wide array of voices from famous personalities and fictional characters.</p>
<p><a href="https://ecrettmusic.com/" rel="noopener" target="_blank">Ecrett Music</a> is designed for ease of use, offering a vast selection of scenes, moods, and genres. It provides a royalty-free music generator, ensuring hassle-free music creation for videos and games.</p>
<p><a href="https://endel.io/" rel="noopener" target="_blank">Endel</a> is a neuroscience-backed technology that generates personalized soundscapes to aid in focus, relaxation, and sleep. Its patented system adapts in real-time to various inputs like the time of day, weather, heart rate, and location, effectively improving focus and reducing stress through its responsive sound environments.</p>
<p><a href="https://openai.com/research/jukebox" rel="noopener" target="_blank">Jukebox</a> from OpenAI is capable of generating lyrics and melodies in various styles, offering a versatile tool for music creation.</p>
<p><a href="https://www.kits.ai/" rel="noopener" target="_blank">Kits.AI</a> offers a diverse palette of AI voices, allowing users to craft demos and vocal harmonies with artist-like precision. It provides a platform for easy conversion of audio into various AI voices, the creation of custom voice models, realistic speech generation, and the ability to split songs into vocals and instrumentals for remixing.</p>
<p><a href="https://www.lemonaide.ai/" rel="noopener" target="_blank">Lemonaide Music</a> features Lemonaide Seeds, a tool for generating melodic ideas, aimed at pushing musicians out of their comfort zones and seeding new hits. It allows users to generate melody and chords or melodies using AI, with the ability to drag and drop generated files into a Digital Audio Workstation (DAW) and influence AI output by selecting keys and moods.</p>
<p><a href="https://www.loudly.com/aimusicstudio" rel="noopener" target="_blank">Loudly</a> allows for the quick creation of custom audio tracks, combining machine learning with music theory for a variety of projects.</p>
<p><a href="https://magenta.tensorflow.org/studio/" rel="noopener" target="_blank">Magenta Studio</a> is an Ableton Live plugin built on Magenta’s open source tools and models, which use cutting-edge machine learning techniques for music generation.</p>
<p><a href="https://melobytes.com/en/app/ai_melobytes_song" rel="noopener" target="_blank">Melobytes</a> is a creative tool for generating music tracks, offering an intuitive interface and compatibility with multiple operating systems.</p>
<p><a href="https://openai.com/research/musenet" rel="noopener" target="_blank">MuseNet</a> by OpenAI is a MIDI generator that creates compositions in various styles. Based on the GPT-2 model, it can be trained to produce music in specific genres and is compatible with standard MIDI players.</p>
<p><a href="https://musicfy.lol/" rel="noopener" target="_blank">Musicfy</a> is an AI-based music creation tool that enables users to create music using their voice and leverages advanced AI technology to enhance musical production. It is designed to save time, streamline collaboration, and align artistic vision, making the music-making process more efficient and inspired.</p>
<p><a href="https://ai.honu.io/papers/musicgen/" rel="noopener" target="_blank">MusicGen</a> is a unique AI music generation tool that utilizes a single-stage transformer Language Model to process multiple streams of music data efficiently. It can generate high-quality music samples based on textual descriptions or melodic features, outperforming standard models in empirical evaluations.</p>
<p><a href="https://www.orbplugins.com/" rel="noopener" target="_blank">ORB Composer</a> assists composers in generating new music across different genres. It offers a user-friendly interface with a range of editing tools and sound libraries.</p>
<p><a href="https://www.audoir.com/" rel="noopener" target="_blank">Quick Lyrics AI</a> generates song lyrics using AI, providing a resource for songwriters to inspire or assist in lyric creation, with attention to copyright considerations.</p>
<p><a href="https://soundraw.io/" rel="noopener" target="_blank">Soundraw</a> combines AI with manual tools to offer a platform for creating and customizing music. It features a song generator accessible to all, with unlimited downloads available via subscription.</p>
<p><a href="https://soundry.ai/" rel="noopener" target="_blank">Soundry AI</a> offers text-to-sample generation for music producers, sound designers, and songwriters. It boasts an AI text-to-sound generator capable of creating studio-quality audio samples, providing unlimited musical variations, faster production than traditional sound design, and royalty-free samples.</p>
<p><a href="https://newsroom.spotify.com/2023-02-22/spotify-debuts-a-new-ai-dj-right-in-your-pocket/" rel="noopener" target="_blank">The DJ</a> offers a personalized music listening experience on Spotify. This AI guide, currently in beta, understands users&#8217; music tastes to such an extent that it can autonomously select songs to play.</p>
<p><a href="https://www.uberduck.ai/" rel="noopener" target="_blank">Uberduck.AI</a> specializes in synthetic singing and rapping vocals, aimed at creative agencies, musicians, and coders. It features an AI rap generator where users can choose beats, generate or write lyrics, select a voice, and create a rap song that can be downloaded as audio or video. The platform is used by DJs and iconic brands worldwide.</p>
<p><a href="https://www.voicify.ai/" rel="noopener" target="_blank">Voicify AI</a> is a leading platform specializing in high-quality AI covers of songs, built on a vast array of over 3,000 AI voice models. The platform allows users to instantly generate AI covers with their favorite voices, making it a go-to choice for experiencing the future of music through AI voice cloning. It boasts regular platform updates, lifetime conversion history access, and secure data handling, serving a community of over one million users.</p>
<p><a href="https://wave-ai.net/" rel="noopener" target="_blank">Wave AI</a> has two interesting tools: LyricStudio designed to help you write song lyrics and MelodyStudio which helps with songwriting.</p>
<p class="text-6 text-left"><em><strong>Author’s note:</strong> This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.</em></p>
<p>The post <a rel="nofollow" href="https://shellypalmer.com/2023/11/ai-music-generators/">AI Music Generators</a> originally appeared here on <a rel="nofollow" href="https://shellypalmer.com">Shelly Palmer</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://shellypalmer.com/2023/11/ai-music-generators/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			<dc:creator>shelly@palmer.net (Shelly Palmer)</dc:creator></item>
	</channel>
</rss>