<?xml version="1.0" encoding="UTF-8" standalone="no"?><rss xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" version="2.0">
  <channel>
    <title>VP Land (FKA Coffee &amp; Celluloid)</title>
    <description>News, trends, and insights on the latest creative technology</description>
    
    <link>https://www.vp-land.com/</link>
    <atom:link href="https://rss.beehiiv.com/feeds/pQ7FTxHZ8E.xml" rel="self"/>
    
    <lastBuildDate>Thu, 14 May 2026 03:15:37 +0000</lastBuildDate>
    <pubDate>Mon, 11 May 2026 05:25:28 +0000</pubDate>
    <atom:published>2026-05-11T05:25:28Z</atom:published>
    <atom:updated>2026-05-14T03:15:37Z</atom:updated>
    
      <category>Film</category>
      <category>Media</category>
      <category>Technology</category>
    <copyright>Creative Commons Attribution-Share Alike</copyright>
    
    <image>
      <url>https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/publication/logo/cdda2bd5-66ce-4e18-9841-f1a2bb3c031e/VP_Land_Avatar_2025_-_10.jpg</url>
      <title>VP Land</title>
      <link>https://www.vp-land.com/</link>
    </image>
    
    <docs>https://www.rssboard.org/rss-specification</docs>
    <generator>beehiiv</generator>
    <language>en-us</language>
    <webMaster>support@beehiiv.com (Beehiiv Support)</webMaster>

      <itunes:explicit>no</itunes:explicit><itunes:image href="https://newterritory.media/wp-content/uploads/2021/01/CC-Podcast-1.png"/><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords><itunes:summary>Coffee and Celluloid is a blog and podcast exploring film, filmmaking, and new media, looking at both the final product and the process of getting there. &#13;
&#13;
This podcast features interviews, Q&amp;As, and conversations with independent filmmakers.&#13;
&#13;
Hosted by Joey Daoud, Cherie Saulter, Carlos Rivera, and Andrew Hevia.</itunes:summary><itunes:subtitle>Freshly brewed podcast on film, photography, and all things visual</itunes:subtitle><itunes:category text="TV &amp; Film"/><itunes:category text="Arts"><itunes:category text="Visual Arts"/></itunes:category><itunes:category text="Society &amp; Culture"><itunes:category text="Places &amp; Travel"/></itunes:category><itunes:category text="Society &amp; Culture"><itunes:category text="Personal Journals"/></itunes:category><itunes:category text="Arts"><itunes:category text="Design"/></itunes:category><itunes:author>Coffee and Celluloid</itunes:author><itunes:owner><itunes:email>podcast@newterritory.media</itunes:email><itunes:name>Coffee and Celluloid</itunes:name></itunes:owner><item>
  <title>Multi-Camera Virtual Projection Stage Goes Tracking-Agnostic at NAB 2026</title>
  <description></description>
      <enclosure length="43522589" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/5d3385a2-a9bc-401e-92a4-6549d5538aa0/VPL_NAB_2026_-_Christie_-_Edited.png"/>
  <link>https://www.vp-land.com/p/multi-camera-virtual-projection-stage-goes-tracking-agnostic-at-nab-2026</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/multi-camera-virtual-projection-stage-goes-tracking-agnostic-at-nab-2026</guid>
  <pubDate>Thu, 07 May 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-05-07T07:00:00Z</atom:published>
    <category><![CDATA[Camera Tracking]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[AR & XR]]></category>
    <category><![CDATA[NAB 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Christie Digital is showing a</span><a class="link" href="https://www.youtube.com/watch?v=qQoLaeklCRg&utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=multi-camera-virtual-projection-stage-goes-tracking-agnostic-at-nab-2026" target="_blank" rel="noopener noreferrer nofollow"> multi-camera virtual projection system</a><span style="color:rgb(29, 31, 37);"> at NAB 2026 that runs multiple camera tracking solutions at once, positioning projection-based backgrounds as an alternative to LED walls for in-camera visual effects.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Key points:</b></span></p><ul><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Christie Sapphire projectors drive the system in place of LED panels.</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Multiple camera tracking solutions run simultaneously, creating an agnostic workflow.</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Vizrt and WePlay Studios are partners on the integration.</span></p></li></ul><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/qQoLaeklCRg" width="100%"></iframe><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">We previewed Christie&#39;s</span><a class="link" href="https://www.vp-land.com/p/nab-preview-virtual-projection?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=multi-camera-virtual-projection-stage-goes-tracking-agnostic-at-nab-2026" target="_blank" rel="noopener noreferrer nofollow"> virtual projection system ahead of NAB</a><span 
style="color:rgb(29, 31, 37);"> alongside Kino Flo, ASSIMILATE Live FX, and Sim-Plates. The NAB 2026 booth demo expands that earlier preview with a working multi-camera tracking workflow and a clearer picture of where projection fits next to existing LED stage pipelines.</span></p><h3 class="heading" style="text-align:left;" id="pulling-focus-on-the-workflow"><span style="color:rgb(29, 31, 37);">Pulling Focus on the Workflow</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Chris Barnett, Virtual Projection Architect at Christie, walked through how the dual-camera tracking workflow lets two cameras pull from the same projected environment without forcing the production into a single tracking vendor.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">According to Barnett, the system supports multiple tracking solutions simultaneously rather than locking productions into one stack. That makes it compatible with existing virtual production pipelines, so teams using their preferred tracking, engine, and rendering tools can plug projection in without rebuilding the workflow. 
The demo runs in partnership with Vizrt and WePlay Studios.</span></p><h3 class="heading" style="text-align:left;" id="color-and-moir"><span style="color:rgb(29, 31, 37);">Color and Moiré</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Because the camera is photographing projected light rather than a grid of pixels,</span><a class="link" href="https://www.christiedigital.com/solutions/virtual-production/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=multi-camera-virtual-projection-stage-goes-tracking-agnostic-at-nab-2026" target="_blank" rel="noopener noreferrer nofollow"> Christie&#39;s virtual projection setup</a><span style="color:rgb(29, 31, 37);"> sidesteps the moiré patterns that can show up when a camera sensor&#39;s pixel grid interacts with an LED wall&#39;s pixel grid.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Barnett also points to color accuracy as a reason productions are evaluating projection. 
Without the discrete sub-pixel structure of an LED panel in frame, the captured image reads closer to a traditional photographed background, which matters for skin tones and fine detail in close-ups.</span></p><h3 class="heading" style="text-align:left;" id="stacking-and-screen-gain"><span style="color:rgb(29, 31, 37);">Stacking and Screen Gain</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The system uses projector stacking together with screen gain as part of how it puts a projected background on camera.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Barnett frames stacking and screen gain as design choices the stage makes to deliver a usable on-camera image, with the specific configuration tuned to the room and the shot rather than fixed across every setup.</span></p><h3 class="heading" style="text-align:left;" id="space-and-projection-types"><span style="color:rgb(29, 31, 37);">Space and Projection Types</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Barnett also covers the space requirements for projection-based stages and how different projection types fit different stage geometries, which is the practical question for any facility weighing whether to add projection alongside or instead of LED.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">That tradeoff matters because projection stages have different throw-distance and ceiling needs than LED volumes. 
The session frames it as a choice productions make based on the stage they have and the look they need, not a wholesale replacement of one technology with the other.</span></p><h3 class="heading" style="text-align:left;" id="paramount-use-cases"><span style="color:rgb(29, 31, 37);">Paramount Use Cases</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Barnett identifies </span><a class="link" href="https://www.christiedigital.com/about/display-technology/advanced-and-premium-large-format-cinema/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=multi-camera-virtual-projection-stage-goes-tracking-agnostic-at-nab-2026" target="_blank" rel="noopener noreferrer nofollow">Paramount</a><span style="color:rgb(29, 31, 37);"> as a customer with use cases for the multi-camera virtual projection system, citing the studio in the chapter on early adoption.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Arc Eye's Modular Mobile Stand Collapses the Dome Scanner Into a Flat-Pack Rig</title>
  <description></description>
      <enclosure length="13985656" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/5b1b9484-6ef5-4243-9ac4-bfc683398e95/VPL_NAB_2026_-_Arc_Eye_-_Edited.png"/>
  <link>https://www.vp-land.com/p/arc-eye-s-modular-mobile-stand-collapses-the-dome-scanner-into-a-flat-pack-rig</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/arc-eye-s-modular-mobile-stand-collapses-the-dome-scanner-into-a-flat-pack-rig</guid>
  <pubDate>Thu, 07 May 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-05-07T07:00:00Z</atom:published>
    <category><![CDATA[Gear]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[NAB 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Arc Eye CEO Christian Cicerone walked us through the company&#39;s</span> <a class="link" href="https://www.arceye.com?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=arc-eye-s-modular-mobile-stand-collapses-the-dome-scanner-into-a-flat-pack-rig" target="_blank" rel="noopener noreferrer nofollow">new modular mobile capture rig</a><span style="color:rgb(29, 31, 37);"> at NAB, a battery-powered alternative to the company&#39;s large dome scanner that breaks into three pieces and flat-packs for travel. Each stand carries six cameras and six lights on 80/20 aluminum extrusion rails, and multiple stands arrange in a circle for full-body or head scans while streaming wirelessly in real time.</span></p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/aefRB4zzPNU" width="100%"></iframe><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">We covered Arc Eye&#39;s</span><a class="link" href="https://www.vp-land.com/p/arc-eye-s-network-connected-camera-system-simplifies-3d-scanning-workflow?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=arc-eye-s-modular-mobile-stand-collapses-the-dome-scanner-into-a-flat-pack-rig" target="_blank" rel="noopener noreferrer nofollow"> network-connected scanning workflow</a><span style="color:rgb(29, 31, 37);"> at NAB 2025, when the company was still anchored to its fixed dome system. 
The new rig pushes that same workflow into a portable form factor.</span></p><h3 class="heading" style="text-align:left;" id="why-the-dome-was-limiting"><span style="color:rgb(29, 31, 37);">Why the Dome Was Limiting</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The original Arc Eye scanner is a large, fixed dome, which works well in a controlled studio but is impractical to ship or set up on location.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">According to Cicerone, the new modular stand was designed to collapse small enough to fit through a standard</span> man door<span style="color:rgb(29, 31, 37);"> and break into three pieces for transport. That makes the system viable for productions that need to scan talent on set rather than fly performers to a dedicated studio.</span></p><h3 class="heading" style="text-align:left;" id="modular-stand-configuration"><span style="color:rgb(29, 31, 37);">Modular Stand Configuration</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Each stand mounts six cameras and six lights on 80/20 aluminum extrusion, and the</span> camera-to-light ratio<span style="color:rgb(29, 31, 37);"> is fully reconfigurable depending on the job.</span></p><ul><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Body or group scans. 10 to 16 stands arranged in a circle.</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Head scans. 3 to 8 stands with the seated subject&#39;s head at the center curve.</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Per-stand mix. 
Configurable from three cameras with twelve lights up to all-camera setups.</span></p></li></ul><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Cicerone said the smallest practical face-scan kit is three stands plus three V-mount batteries, which a single operator can roll in and have running in 10 to 15 minutes.</span></p><h3 class="heading" style="text-align:left;" id="battery-power-and-sync"><span style="color:rgb(29, 31, 37);">Battery Power and Sync</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Power and sync run through Arc Eye&#39;s Wolf Power Sync unit, which uses</span> 6-pin Molex cables<span style="color:rgb(29, 31, 37);"> to deliver both the sync signal and power to the cameras and lights from a V-mount battery.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Combo cables are also available, pairing the power and sync line with a network connection on a single run. The wireless path uses Wi-Fi back to a MacBook Air for live view and capture, and an optional 10 gigabit wired connection from the laptop to the rig handles the fastest data offload when the production can run cable.</span></p><h3 class="heading" style="text-align:left;" id="two-rigs-one-sync-backbone"><span style="color:rgb(29, 31, 37);">Two Rigs, One Sync Backbone</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The mobile setup Cicerone demoed, the Wolf One rig, runs on battery with either wireless or wired control, while a second rig is hardwired only and built around full 10Gb throughput. 
Both rigs share the</span> same Power Sync module<span style="color:rgb(29, 31, 37);"> and cabling, so cameras swap between them without rewiring.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The Wolf One was demoed with Sony Alpha 9 III bodies, which use a global shutter and shoot at 100 to 120 fps. Cicerone described capturing roughly 30 seconds of 6K RAW burst frames at 30 fps as the basis for what he called 4D capture, where the subject is recorded across time rather than a single frozen instant.</span></p><h3 class="heading" style="text-align:left;" id="global-shutter-vs-61-mp-stills"><span style="color:rgb(29, 31, 37);">Global Shutter vs 61MP Stills</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Both rigs can be populated with either the Sony 9 III for moving subjects or</span> a 61MP rolling-shutter body<span style="color:rgb(29, 31, 37);"> when the priority is highest-resolution stills.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">According to Cicerone, the global-shutter cameras suit moving subjects and 4D moments, while the 61MP rolling-shutter option is the choice when the team wants the densest possible still scan. 
He framed the 61MP path as a setup well-suited to feeding 4D Gaussian splatting pipelines.</span></p><h3 class="heading" style="text-align:left;" id="live-capture-workflow"><span style="color:rgb(29, 31, 37);">Live Capture Workflow</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">With the rig connected to a MacBook Air over Wi-Fi, frames stream into the Arc Eye capture app for real-time review during the shoot.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">That live view lets the operator confirm coverage before the talent leaves the volume, which is the same workflow Arc Eye built for the dome system, now running off battery in a footprint a single van can carry.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>SpliceGeist's On-Prem AI Cuts Feature Film Conforms From 45 Hours to 45 Minutes</title>
  <description></description>
      <enclosure length="51217560" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/b4f66c48-ab85-41e3-8ba3-7db7374dbd50/VPL_NAB_2026_-_SpliceGeist_-_Edited.png"/>
  <link>https://www.vp-land.com/p/splicegeist-s-on-prem-ai-cuts-feature-film-conforms-from-45-hours-to-45-minutes</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/splicegeist-s-on-prem-ai-cuts-feature-film-conforms-from-45-hours-to-45-minutes</guid>
  <pubDate>Thu, 07 May 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-05-07T07:00:00Z</atom:published>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Utility AI]]></category>
    <category><![CDATA[NAB 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">At NAB 2026, we sat down with Kit Lubold, President of Mountaintop LLC, to talk through</span><a class="link" href="https://www.splicegeist.com/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=splicegeist-s-on-prem-ai-cuts-feature-film-conforms-from-45-hours-to-45-minutes" target="_blank" rel="noopener noreferrer nofollow"> SpliceGeist</a><span style="color:rgb(29, 31, 37);">, an on-premise AI system handling the technical grunt work of post-production conform and onlining. The pitch is narrow on purpose: stay out of creative decisions, but automate every clip-matching, timecode-fixing, missing-media task that eats finishing hours.</span></p><ul><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>On-prem hardware</b></span><span style="color:rgb(29, 31, 37);"> with optional air-gap or metadata-only cloud sync</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>AAF and XML parsing</b></span><span style="color:rgb(29, 31, 37);"> that flags broken timecode, missing media, and reel-name mismatches</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Frame-by-frame visual matching</b></span><span style="color:rgb(29, 31, 37);"> against the reference QuickTime to validate every shot</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Real example</b></span><span style="color:rgb(29, 31, 37);">: a feature conform that took 40 to 45 hours of human work compressed to roughly 45 minutes</span></p></li></ul><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/0cPti5xhEBM" width="100%"></iframe><h3 
class="heading" style="text-align:left;" id="conform-pain-points"><span style="color:rgb(29, 31, 37);">Conform Pain Points</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Lubold frames SpliceGeist as a sidekick that runs the moment a finishing team is ready to online.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The system targets the failure modes that derail conforms: clips that won&#39;t show in Resolve, timecode that starts at zero, filename mismatches between editorial and camera originals, and audio track counts that don&#39;t line up. According to Lubold, &quot;We stay out of the creative stuff,&quot; meaning anything needing an artist&#39;s eye gets flagged for a human while the automation grinds through validation. That split mirrors the broader pattern across NAB 2026, where we covered</span><a class="link" href="https://www.vp-land.com/p/avid-media-composer-adds-gemini-ai-through-multi-year-google-cloud-partnership?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=splicegeist-s-on-prem-ai-cuts-feature-film-conforms-from-45-hours-to-45-minutes" target="_blank" rel="noopener noreferrer nofollow"> Avid&#39;s Gemini integration</a><span style="color:rgb(29, 31, 37);">.</span></p><h3 class="heading" style="text-align:left;" id="on-prem-and-air-gapped-by-design"><span style="color:rgb(29, 31, 37);">On-Prem and Air-Gapped by Design</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Mountaintop ships SpliceGeist as dedicated hardware so studios can keep media inside the facility.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The box lives in a facility, studio, or offline bay with a portal operators access locally. Customers can fully air-gap the unit, or pair it with a cloud sync that carries only metadata. 
SpliceGeist never handles media off-prem; it only ingests the AAF, XML, CDL, FDL, and LUT files it needs to plan the conform.</span></p><h3 class="heading" style="text-align:left;" id="aaf-parsing-and-the-flag-list"><span style="color:rgb(29, 31, 37);">AAF Parsing and the Flag List</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Onboarding feeds the system enough context to spot what doesn&#39;t belong before a Resolve project opens.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Setup runs in three stages: project identifiers (title, code, client, studio, production company); the technical spine (camera count and models, capture resolution, framing intent, framing chart, master output resolution and color space, HDR container details such as a 2020 wrapper mastered on P3D65); and editorial details, including which NLE was used and whether the conform is an initial pass, partial, drop-in, or fix.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Once the AAF lands, the parser does triage. In one project Lubold cited, a 1,900-clip timeline carried roughly 3,000 effects because the editor had stamped color effects on every clip. SpliceGeist stripped anything not relevant to conform or color, the cruft that &quot;makes Resolve angry,&quot; while preserving retimes and plugin work from tools like Sapphire. 
It surfaced 40 to 50 clips with missing or zeroed timecode and 69 needing validation after reel and clip names had been renamed in editorial.</span></p><h3 class="heading" style="text-align:left;" id="eyes-on-every-frame"><span style="color:rgb(29, 31, 37);">Eyes on Every Frame</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">When metadata fails, SpliceGeist falls back to visual matching against the reference QuickTime.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Plugging in a drive triggers a scan. If the system recognizes the project it starts working; if not, it emails the operator. It works across client drives, LTO restores, NAS, and SAN volumes, with throttling options so it doesn&#39;t constantly ping always-attached storage. After discovery, SpliceGeist builds the Resolve project, imports the AAF, and catalogs what it found. In Lubold&#39;s example, Resolve auto-linked roughly half the shots; the rest had broken reel names, scrambled timecode, or both, after a media-managed dump overwrote duplicates into a single folder.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">For unmatched clips, the system grabs frames and asks whether each one looks like the expected shot, ignoring framing differences. Once enough material is staged, it runs a full frame-by-frame pass against the reference QuickTime, hunting for folders named &quot;picref,&quot; &quot;refqt,&quot; or similar variants and dropping the reference into a viewer attached to the timeline. 
We covered the</span><a class="link" href="https://www.vp-land.com/p/davinci-resolve-21-adds-nine-ai-tools-for-voice-focus-and-face-editing-plus-a-photo-page-for-still-i?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=splicegeist-s-on-prem-ai-cuts-feature-film-conforms-from-45-hours-to-45-minutes" target="_blank" rel="noopener noreferrer nofollow"> DaVinci Resolve 21 AI tools</a><span style="color:rgb(29, 31, 37);"> that sit on the same foundation.</span></p><h3 class="heading" style="text-align:left;" id="hardware-and-the-local-model"><span style="color:rgb(29, 31, 37);">Hardware and the Local Model</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Mountaintop custom-builds each unit and runs a local model trained on Resolve itself.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Pricing is not fixed. SSD and RAM volatility pushes Mountaintop to spec each box per customer, sized to workload complexity. A facility doing minimal conform work may not need a high-end GPU; heavier shops get more horsepower. The local model was trained on Resolve, mixing API calls with learned behavior. 
A larger model in development serves as a &quot;phone-a-friend&quot; when the local one hits something it hasn&#39;t seen, what Lubold described as &quot;a bigger brain in a box&quot; that learns from operator input so the same problem doesn&#39;t get solved twice.</span></p><h3 class="heading" style="text-align:left;" id="feature-conform-results"><span style="color:rgb(29, 31, 37);">Feature Conform Results</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">On a real feature conform, a 40-hour human pass became a 45-minute automated one, and SpliceGeist flagged shots the original team had missed.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The system also confirmed that some raw camera files were permanently gone, casualties of an overwrite during media management. Mountaintop sells SpliceGeist as dedicated hardware sized to each customer&#39;s workload, with no fixed list price. </span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>AWS Brings Generative AI to Broadcast, Filmmaking, and Vibe Coding at NAB 2026</title>
  <description></description>
      <enclosure length="37108690" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/2329b1f7-748c-4982-bf1f-e498a244adcd/VPL_NAB_2026_-_AWS_-_Edited.png"/>
  <link>https://www.vp-land.com/p/aws-brings-generative-ai-to-broadcast-filmmaking-and-vibe-coding-at-nab-2026</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/aws-brings-generative-ai-to-broadcast-filmmaking-and-vibe-coding-at-nab-2026</guid>
  <pubDate>Fri, 01 May 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-05-01T07:00:00Z</atom:published>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Utility AI]]></category>
    <category><![CDATA[Generative AI]]></category>
    <category><![CDATA[NAB 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Amazon Web Services used its NAB 2026 presence to demonstrate </span><a class="link" href="https://aws.amazon.com/media/nab-2026/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=aws-brings-generative-ai-to-broadcast-filmmaking-and-vibe-coding-at-nab-2026" target="_blank" rel="noopener noreferrer nofollow">how generative AI is moving into media workflows</a><span style="color:rgb(29, 31, 37);"> across broadcast, content creation, and software development. The company&#39;s booth on the show floor highlighted three distinct applications: real-time vertical video generation for live broadcasts, a hybrid filmmaking collaboration with Luma AI and the Wonder Project, and an agentic IDE called Kiro that lets non-developers build production tools through vibe coding.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The breadth of the demos reflected AWS&#39;s positioning not as a creative tool company, but as infrastructure that supports creative workflows from capture through distribution. 
Each demonstration targeted a different segment of the media and entertainment industry.</span></p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/aNE6lY5uTMo" width="100%"></iframe><h3 class="heading" style="text-align:left;" id="elemental-inference-and-live-vertic"><span style="color:rgb(29, 31, 37);">Elemental Inference and Live Vertical Video</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">AWS launched </span><a class="link" href="https://aws.amazon.com/elemental-inference/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=aws-brings-generative-ai-to-broadcast-filmmaking-and-vibe-coding-at-nab-2026" target="_blank" rel="noopener noreferrer nofollow">Elemental Inference</a><span style="color:rgb(29, 31, 37);">, a service designed to help broadcasters and streamers generate vertical video from live content in real time. The company pointed to Gen Z consumption patterns: 88% of that demographic watches streaming content on mobile in vertical format, forcing traditional broadcasters to compete with the native formats of TikTok, YouTube Shorts, and Instagram.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">On the show floor, AWS demonstrated this with a basketball court setup in partnership with MLSE (owners of the Toronto Raptors). Attendees could shoot baskets while cameras captured their form; the system then generated real-time biomechanics comparisons to Raptors players and produced a vertical video clip of their shots with sub-10-second latency. 
While the basketball demo was a show floor activation, the underlying capability targets a real production challenge: generating vertical, social-ready content from live horizontal broadcasts without manual editing.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The key differentiator from existing clipping tools, according to AWS, is the live component. VOD clipping and vertical reformatting already exist across several platforms. Generating vertical clips from live feeds with minimal latency is the technical challenge Elemental Inference addresses.</span></p><h3 class="heading" style="text-align:left;" id="hybrid-filmmaking-with-luma-ai-and-"><span style="color:rgb(29, 31, 37);">Hybrid Filmmaking with Luma AI and the Wonder Project</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">AWS announced a </span><a class="link" href="https://www.vp-land.com/p/wonder-project-and-luma-ai-launch-innovative-dreams-for-hybrid-film-ai-production?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=aws-brings-generative-ai-to-broadcast-filmmaking-and-vibe-coding-at-nab-2026" target="_blank" rel="noopener noreferrer nofollow">collaboration with Luma AI and Wonder Project</a><span style="color:rgb(29, 31, 37);">, the production company led by John Irwin (House of David). The partnership introduces what AWS calls &quot;real-time hybrid filmmaking,&quot; combining live performance capture with AI-generated visual effects and LED wall environments.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The concept allows actors to perform inside immersive digital environments rather than working against green screens. AWS describes the workflow as enabling scene creation in minutes rather than the hours or days traditional pre-visualization and virtual production setup require. 
The Wonder Project&#39;s involvement signals that the collaboration is aimed at scripted content production, not just corporate or event video.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">This is one of several recent efforts to bridge generative AI capabilities with traditional filmmaking workflows. The involvement of AWS as infrastructure, Luma AI as the generation layer, and a production company with streaming credits suggests a full-stack approach that goes beyond individual tool demonstrations.</span></p><h3 class="heading" style="text-align:left;" id="kiro-an-agentic-ide-for-media-profe"><span style="color:rgb(29, 31, 37);">Kiro: An Agentic IDE for Media Professionals</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">AWS showed </span><a class="link" href="https://kiro.dev?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=aws-brings-generative-ai-to-broadcast-filmmaking-and-vibe-coding-at-nab-2026" target="_blank" rel="noopener noreferrer nofollow">Kiro</a><span style="color:rgb(29, 31, 37);">, an agentic integrated development environment that allows non-developers to build software tools through natural language prompts. On the NAB show floor, AWS demonstrated a workflow where a non-developer who uses Adobe Premiere used Kiro to build a custom plugin in a single afternoon. The plugin added a side panel in Premiere that exports assets directly to an AWS media lake for content analysis.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The demonstration highlighted a broader trend: as creative workflows become more complex, the ability to build custom integrations without a dedicated engineering team becomes a practical advantage. 
AWS positions Kiro as a way for media professionals to extend the tools they already use rather than waiting for vendors to add features.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">According to AWS, more than half of the demonstrations on the NAB show floor were built using Kiro, with the tool handling code generation for demo infrastructure alongside its use in customer-facing workflows.</span></p><h3 class="heading" style="text-align:left;" id="from-enterprise-to-independent-prod"><span style="color:rgb(29, 31, 37);">From Enterprise to Independent Productions</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">AWS also announced that Fox Corporation signed on as its cloud provider of choice, covering content creation through distribution. But the company emphasized that the same infrastructure supports smaller productions. The combination of Kiro for custom tooling, Bedrock for AI model access, and Elemental for distribution is designed to scale from major studios to independent teams.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">For filmmakers and production teams evaluating cloud infrastructure, the NAB demonstrations showed AWS pushing beyond traditional cloud storage and compute into active participation in creative workflows. The question for potential customers is whether a general-purpose cloud platform can deliver specialized creative tools as effectively as purpose-built solutions.</span></p></div></div>
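The vertical-reformatting problem Elemental Inference targets starts with simple geometry: cutting a 9:16 window out of a 16:9 frame. A minimal sketch of that centered-crop math (illustrative only — a live service like Elemental Inference also has to reframe around the action and hold sub-10-second latency, which this toy function ignores):

```python
def vertical_crop(src_w, src_h, target_ratio=9 / 16):
    """Compute a centered 9:16 crop window inside a horizontal frame.

    Illustrative geometry only -- a production system reframes around
    the subject rather than always cropping dead center.
    """
    crop_w = int(src_h * target_ratio)  # keep full height, narrow the width
    x_offset = (src_w - crop_w) // 2    # center the window horizontally
    return x_offset, 0, crop_w, src_h

# A 1920x1080 broadcast frame yields a 607x1080 vertical window
# (codecs typically want even dimensions, so real pipelines round).
print(vertical_crop(1920, 1080))
```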
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>iodyne Brings Data Center Discipline to Edge Storage With Pro Data and Pro Mini</title>
  <description></description>
      <enclosure length="1180329" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/f60dd5f7-3014-4f85-a426-0b1e6fd93c22/VPL_NAB_2026_-_Iodyne_-_Edited.png"/>
  <link>https://www.vp-land.com/p/iodyne-brings-data-center-discipline-to-edge-storage-with-pro-data-and-pro-mini</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/iodyne-brings-data-center-discipline-to-edge-storage-with-pro-data-and-pro-mini</guid>
  <pubDate>Fri, 01 May 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-05-01T07:00:00Z</atom:published>
    <category><![CDATA[Hardware]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">iodyne is building </span><a class="link" href="https://iodyne.com?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=iodyne-brings-data-center-discipline-to-edge-storage-with-pro-data-and-pro-mini" target="_blank" rel="noopener noreferrer nofollow">storage hardware</a><span style="color:rgb(29, 31, 37);"> designed to bring the security, performance, and reliability of data center infrastructure to fast-moving production environments. At NAB 2026, the company showcased Pro Data, its multi-user direct-attached storage device, and announced Pro Mini, a solid-state portable drive that pairs RAID-6 redundancy with sustained 2-3 GB/s performance in an encrypted, iPhone-sized form factor.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The company&#39;s thesis is straightforward: production teams at the edge, whether on set, on location, or in a temporary post facility, face the same security and reliability requirements as headquarters, but without the infrastructure to support them. 
iodyne&#39;s products aim to close that gap with hardware that handles encryption, redundancy, and multi-container organization without requiring IT support.</span></p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/yX5BARMWA6w" width="100%"></iframe><h3 class="heading" style="text-align:left;" id="pro-mini-solving-the-thermal-proble"><span style="color:rgb(67, 67, 67);">Pro Mini: </span><span style="color:rgb(29, 31, 37);">Solving the Thermal Problem in Portable SSDs</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The </span><a class="link" href="https://iodyne.com/promini/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=iodyne-brings-data-center-discipline-to-edge-storage-with-pro-data-and-pro-mini" target="_blank" rel="noopener noreferrer nofollow">Pro Mini</a><span style="color:rgb(29, 31, 37);"> addresses a fundamental limitation of portable solid-state drives: thermal throttling. When SSDs heat up under sustained workloads, they drop power and slow down. For production teams moving terabytes of footage or working with high-resolution media, that performance cliff makes most portable drives unreliable for anything beyond short transfers.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">iodyne&#39;s solution is the Frore AirJet, a solid-state cooling system that draws cool air in and pushes hot air out without moving parts. The result is sustained transfer speeds between 2,000 and 3,000 MB/s without thermal throttling. 
The drive is encrypted by default, includes a Find My circuit for tracking, and features a digital paper display that shows the drive&#39;s status and customizable labels.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The Pro Mini also runs RAID-6 redundancy internally, making it one of the first encrypted and RAID-rated portable SSDs available. While redundancy is not backup, as iodyne emphasizes, RAID-6 means the drive can tolerate two drive failures without data loss, providing a safety net for production teams who cannot afford downtime.</span></p><h3 class="heading" style="text-align:left;" id="pro-data-multi-user-workstation-sto"><span style="color:rgb(67, 67, 67);">Pro Data: </span><span style="color:rgb(29, 31, 37);">Multi-User Workstation Storage</span></h3><p class="paragraph" style="text-align:left;"><a class="link" href="https://iodyne.com/prodata/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=iodyne-brings-data-center-discipline-to-edge-storage-with-pro-data-and-pro-mini" target="_blank" rel="noopener noreferrer nofollow">Pro Data</a><span style="color:rgb(29, 31, 37);">, already shipping, supports up to four simultaneous users with direct-attached performance. It runs RAID-6 across its internal drives and allows hot-swappable replacement with a single screw. If a drive fails, the device rebuilds itself and continues operating. Even if a second drive fails, work can continue while the device signals degraded status.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The device supports multiple containers, logical partitions that act as separate drives for different projects, workflows, or systems. 
For facilities working across different platforms (Windows, macOS, Linux), a single Pro Data can present different containers formatted for each environment.</span></p><h3 class="heading" style="text-align:left;" id="production-use-cases"><span style="color:rgb(67, 67, 67);">Production Use Cases</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">iodyne outlined several workflows where portable, high-performance encrypted storage solves specific production problems.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Virtual production teams can load LED wall content onto a Pro Data or Pro Mini, run an entire stage session from the portable drive, and unplug it at the end of the day. No IP remains on studio servers overnight, addressing a security concern for productions working with sensitive set designs on shared stages.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">VFX artists working on studio lots can use a Pro Data as an edge cache on their workstation. Rather than upgrading a shared SAN (a six-to-eight-figure investment), artists work from the local storage for EXR files, Unreal Engine projects, or Maya scenes, then sync back to the central SAN at the end of the day. 
The encrypted device meets studio information security requirements, and iodyne has been audited by major studios including Disney and Warner Bros.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">For specialized environments like immersive venues, where teams need to finish content on-site across multiple proprietary systems, the multi-container approach lets production teams carry a single device with partitions for each system rather than multiple dedicated drives.</span></p><h3 class="heading" style="text-align:left;" id="software-updates-and-cross-platform"><span style="color:rgb(67, 67, 67);">Software Updates and Cross-Platform Support</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Because iodyne owns both the hardware and software stack, feature updates and performance improvements are delivered as software updates to existing devices. The company actively collects feedback from customers and partners, building requested features into periodic updates rather than releasing new hardware revisions.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Both Pro Data and Pro Mini are cross-platform, supporting USB connectivity that adapts to the host machine&#39;s capabilities. Older machines receive appropriately scaled performance, while current hardware gets full throughput.</span></p><h3 class="heading" style="text-align:left;" id="pricing-perspective"><span style="color:rgb(67, 67, 67);">Pricing Perspective</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">iodyne positions its products as a four-to-five-figure alternative to six-to-eight-figure SAN upgrades. 
For production teams that need high-performance encrypted storage but cannot justify a full infrastructure overhaul, the portable approach offers a different economic model, one where storage scales with the project rather than the facility.</span></p></div></div>
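The RAID-6 trade-off iodyne is making can be put in numbers: two parity stripes buy tolerance of any two simultaneous drive failures, at the cost of two drives' worth of capacity. A quick sketch of that arithmetic (generic RAID-6 math, not iodyne's specific module counts or capacities):

```python
def raid6_usable_tb(num_drives, drive_tb):
    """Usable capacity of a RAID-6 array: two drives' worth goes to parity."""
    if num_drives < 4:
        raise ValueError("RAID-6 needs at least 4 drives")
    return (num_drives - 2) * drive_tb

def raid6_survives(failed_drives):
    """RAID-6 rebuilds from any combination of up to two failed drives."""
    return failed_drives <= 2

# Example: eight 2 TB modules give 12 TB usable,
# and work continues through two concurrent failures, but not three.
print(raid6_usable_tb(8, 2), raid6_survives(2), raid6_survives(3))
```

As the article notes, redundancy is not backup: RAID-6 protects against hardware failure, not against deletion or the kind of overwrite that destroyed the raw files in the SpliceGeist story.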
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Descript Adds an API and Agentic Automation to Its Text-Based Video Editor</title>
  <description></description>
      <enclosure length="57233071" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/90f9cdec-1c05-4e57-ba98-74458f004482/VPL_NAB_2026_-_Descript_-_Edited.png"/>
  <link>https://www.vp-land.com/p/descript-adds-an-api-and-agentic-automation-to-its-text-based-video-editor</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/descript-adds-an-api-and-agentic-automation-to-its-text-based-video-editor</guid>
  <pubDate>Fri, 01 May 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-05-01T07:00:00Z</atom:published>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Utility Ai]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Descript is pushing its </span><a class="link" href="https://www.descript.com?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=descript-adds-an-api-and-agentic-automation-to-its-text-based-video-editor" target="_blank" rel="noopener noreferrer nofollow">text-based video editor</a><span style="color:rgb(29, 31, 37);"> deeper into professional workflows with a new API, agentic automation through its Underlord AI assistant, and expanded export options for NLE handoffs. Marcello Farrell, Enterprise Sales Engineer at Descript, walked through the updates at NAB 2026.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Key takeaways:</b></span></p><ul><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>New API</b></span><span style="color:rgb(29, 31, 37);"> enables automated ingest, Underlord actions, and workflow triggers from outside the app</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Underlord AI agent</b></span><span style="color:rgb(29, 31, 37);"> handles clip creation, translation, lip-sync localization, and automated editing</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Newsrooms and non-editors</b></span><span style="color:rgb(29, 31, 37);"> are adopting Descript to bypass the traditional video bottleneck</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>XML and AAF exports</b></span><span style="color:rgb(29, 31, 37);"> support handoffs to Premiere, Final Cut, and DaVinci Resolve</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Rooms</b></span><span style="color:rgb(29, 31, 37);"> feature provides remote recording with 
automatic transcription and text-based editing</span></p></li></ul><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/3CpngvZRJus" width="100%"></iframe><h3 class="heading" style="text-align:left;" id="text-based-editing-for-non-editors"><span style="color:rgb(29, 31, 37);">Text-Based Editing for Non-Editors</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Descript&#39;s core pitch: edit audio and video the same way you edit a Word document. The platform transcribes media and lets users cut, rearrange, and refine content by editing the transcript. Farrell said the most common use case at NAB was clip creation from long-form content, where Underlord identifies the strongest moments and generates social-ready clips without manual timeline work.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The approach is resonating with newsrooms where storytellers outnumber editors. Farrell described the video output step as a persistent bottleneck: good journalists who cannot operate a traditional NLE now produce editable content in Descript, and the editing team finishes the last 20% rather than doing 100% of the work.</span></p><h3 class="heading" style="text-align:left;" id="from-rough-cut-to-final-polish"><span style="color:rgb(29, 31, 37);">From Rough Cut to Final Polish</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Descript has supported XML and AAF exports for years, but the workflow is evolving. Enterprise teams are using Descript as a rough cut and paper edit tool, then handing off to specialized editors in Premiere, Final Cut, or DaVinci Resolve for finishing. 
The text-based editing gets the narrative structure in place, and the NLE expert handles color, graphics, and final polish. Farrell said this split is becoming standard for larger teams that previously had editors doing everything from ingest to output.</span></p><h3 class="heading" style="text-align:left;" id="the-api-opens-the-back-door"><span style="color:rgb(29, 31, 37);">The API Opens the Back Door</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The new Descript API, available in early access, lets external systems pull media into Descript, trigger Underlord actions, and export finished content without human interaction. A practical workflow: a recording lands in an S3 bucket, an API call pulls it into Descript, Underlord generates clips and applies edits, and the result is waiting when the operator opens their computer. Farrell framed the API as the groundwork for an agentic AI future where tools like Claude coordinate Descript alongside other applications in a larger production pipeline.</span></p><h3 class="heading" style="text-align:left;" id="rooms-for-remote-recording"><span style="color:rgb(29, 31, 37);">Rooms for Remote Recording</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Descript&#39;s Rooms feature handles remote recording directly inside the editor. Multiple participants across different locations record into the platform, and the audio is immediately transcribed and available for text-based editing. Farrell said this remains a major use case as podcasting and remote interview content continues to grow despite the return of in-studio production.</span></p></div></div>
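The S3-to-finished-clips workflow Farrell described is, at its core, a job description handed to the API. A sketch of what composing such a job could look like — every endpoint, field, and action name below is invented for illustration; Descript's early-access API will have its own schema:

```python
def build_automation_job(s3_uri, underlord_actions, export_format="mp4"):
    """Compose a hands-off job: ingest from S3, run Underlord, export.

    Hypothetical payload -- the field and action names are placeholders,
    not Descript's actual API schema.
    """
    return {
        "source": {"type": "s3", "uri": s3_uri},
        "underlord": [{"action": a} for a in underlord_actions],
        "export": {"format": export_format},
    }

# Overnight automation: a recording lands in S3 after an event,
# and clips are waiting when the operator opens their computer.
job = build_automation_job(
    "s3://recordings/town-hall.mov",
    ["generate_social_clips", "remove_filler_words"],
)
```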
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Frame.io Drive Streams Cloud Media Into Premiere Like a Local Hard Drive</title>
  <description></description>
      <enclosure length="32173672" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/524572a1-e2bd-4870-aff7-4599cfd9188a/VPL_NAB_2026_-_Frame_-_Edited.png"/>
  <link>https://www.vp-land.com/p/frame-io-drive-streams-cloud-media-into-premiere-like-a-local-hard-drive</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/frame-io-drive-streams-cloud-media-into-premiere-like-a-local-hard-drive</guid>
  <pubDate>Wed, 29 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-29T07:00:00Z</atom:published>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Utility Ai]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;">Frame.io Drive mounts cloud-based Frame.io projects as a <a class="link" href="https://frame.io/drive?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=frame-io-drive-streams-cloud-media-into-premiere-like-a-local-hard-drive" target="_blank" rel="noopener noreferrer nofollow">desktop drive</a>, letting editors stream 4K footage directly into Premiere Pro, Photoshop, and other creative tools without downloading files first.</p><p class="paragraph" style="text-align:left;">Announced at NAB 2026 and available to enterprise customers, the new desktop application turns Frame.io from a review-and-approval platform into a full creative management system. Editors working in different locations can share the same source media, open the same project files, and avoid the relinking headaches that come with traditional file transfer workflows.</p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/fJtuArC3n-Q" width="100%"></iframe><h3 class="heading" style="text-align:left;" id="how-frameio-drive-works">How Frame.io Drive Works</h3><p class="paragraph" style="text-align:left;">The application mounts Frame.io projects in macOS Finder or Windows File Explorer. Editors click &quot;Mount Project&quot; and their cloud media appears as if it were on a local hard drive. The system streams only the data needed for playback or editing rather than downloading entire files, which means 4K footage plays back smoothly even on convention center Wi-Fi, as demonstrated during the NAB booth demo.</p><p class="paragraph" style="text-align:left;">JJ Powell, Senior Product Marketing Manager at Frame.io, described the approach: &quot;It starts to act like a high performance external hard drive just connected to your computer. 
But what&#39;s really unique about this is this works really well for distributed teams.&quot;</p><p class="paragraph" style="text-align:left;"><b>Smart caching. </b>Editors can pin specific files to their local machine through Finder or File Explorer, and Frame.io Drive also intelligently caches media as it&#39;s accessed. This enables work on high-resolution files even in low-bandwidth environments.</p><p class="paragraph" style="text-align:left;"><b>File agnostic. </b>The system handles any file type that the host application can read, including RAW footage. Powell noted that while editing RED RAW 8K files from a coffee shop isn&#39;t realistic, 4K editing over standard connections works because the system only streams the data needed at any moment.</p><h3 class="heading" style="text-align:left;" id="single-source-of-truth-across-disci">Single Source of Truth Across Disciplines</h3><p class="paragraph" style="text-align:left;">Frame.io Drive extends beyond video to Photoshop files and other design assets. During the demo, Powell opened a PSD directly from the mounted drive, made color adjustments in Photoshop, and saved back to Frame.io without any file transfer step. The same workflow applies to Premiere project files.</p><p class="paragraph" style="text-align:left;">According to Powell, &quot;If you wanted to pick up this project after I made a couple edits, you don&#39;t need to go to a different system. You just mount Frame to your desktop and you can pick up the project from there.&quot;</p><p class="paragraph" style="text-align:left;">This eliminates the relinking problem that plagues collaborative editing. 
All editors reference the same source media in the same cloud location, so there&#39;s no need to manage proxy files, relink missing media, or coordinate file transfers between team members.</p><h3 class="heading" style="text-align:left;" id="security-and-governance">Security and Governance</h3><p class="paragraph" style="text-align:left;">Frame.io&#39;s existing permission system extends to the desktop. Editors can create share links, apply passcode protection, set forensic watermarks, and enforce DRM directly from Finder. Files never live on the local machine; all content stays in Frame.io&#39;s cloud storage.</p><p class="paragraph" style="text-align:left;">&quot;You get to apply all the security and governance that teams know and love within Frame.io and extend that to the desktop,&quot; Powell said. &quot;I&#39;m never taking these files and putting them on my machine.&quot;</p><h3 class="heading" style="text-align:left;" id="mounted-storage-a-new-tier">Mounted Storage: A New Tier</h3><p class="paragraph" style="text-align:left;">Frame.io Drive uses a new storage type called Mounted Storage, separate from Frame.io&#39;s existing Standard Storage. Mounted Storage provides the streaming architecture for desktop workflows, while Standard Storage handles the traditional review and approval processes.</p><p class="paragraph" style="text-align:left;">The two tiers exist side by side within the same account. Enterprise customers get custom pricing, and every Frame.io account will include a baseline entitlement of Mounted Storage alongside their Standard Storage allocation. Frame.io Drive itself is a free application that replaces the existing Frame.io Transfer tool.</p><h3 class="heading" style="text-align:left;" id="availability">Availability</h3><p class="paragraph" style="text-align:left;">Frame.io Drive and Mounted Storage are available to enterprise customers, with pricing and availability for other plans coming later. 
The application replaces Frame.io Transfer as the single desktop tool for uploads, downloads, and mounted cloud editing.</p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Post Up Processes 90GB of R3D Footage Into Synced Dailies in 75 Seconds, Locally</title>
  <description></description>
      <enclosure length="520769" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/92b86679-d155-40d5-8581-93fc4466e469/VPL_NAB_2026_-_Post_Up_-_Edited.png"/>
  <link>https://www.vp-land.com/p/post-up-processes-90gb-of-r3d-footage-into-synced-dailies-in-75-seconds-locally</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/post-up-processes-90gb-of-r3d-footage-into-synced-dailies-in-75-seconds-locally</guid>
  <pubDate>Wed, 29 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-29T07:00:00Z</atom:published>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Utility Ai]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Post Up is building an </span><a class="link" href="https://www.postup.pro?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=post-up-processes-90gb-of-r3d-footage-into-synced-dailies-in-75-seconds-locally" target="_blank" rel="noopener noreferrer nofollow">automated post-production pipeline</a><span style="color:rgb(29, 31, 37);"> that runs entirely on local hardware without an internet connection. At NAB 2026, Jeremy from Post Up demoed the platform processing 90GB of raw R3D footage from a RED Komodo into synced, colored dailies and editorial proxies on a stock Mac Mini in under 75 seconds.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Key takeaways:</b></span></p><ul><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>320 frames per second rendering</b></span><span style="color:rgb(29, 31, 37);"> on a standard Mac Mini, up from the typical 50fps</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>90GB of R3D footage</b></span><span style="color:rgb(29, 31, 37);"> </span><span style="color:rgb(29, 31, 37);"><b>processed in 75 seconds</b></span><span style="color:rgb(29, 31, 37);">: ingest, audio sync, color, dailies, and proxies</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>No internet required</b></span><span style="color:rgb(29, 31, 37);">: the entire pipeline runs locally on the user&#39;s hardware</span></p></li><li><p class="paragraph" 
style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Agnostic output</b></span><span style="color:rgb(29, 31, 37);">: ALE, Avid timeline, Premiere XML, and direct integrations with Frame.io and other review platforms</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Paramount, Amazon MGM, and HBO</b></span><span style="color:rgb(29, 31, 37);"> among early studio clients using Post Up&#39;s processing nodes</span></p></li></ul><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/TZE-K0B2mVE" width="100%"></iframe><h3 class="heading" style="text-align:left;" id="from-camera-card-to-edit-ready-in-m"><span style="color:rgb(29, 31, 37);">From Camera Card to Edit-Ready in Minutes</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Post Up handles the full set-to-edit pipeline. The operator inserts a camera card, clicks start, and the software runs a standard DIT ingest with checksums. After import, the system automatically syncs audio based on timecode with a waveform verification check. An agentic workflow flags any out-of-sync clips for review, with a future model planned to correct offset issues automatically. For footage without timecode, the waveform sync still works.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The system then applies the selected LUT, burns metadata into the dailies (scene, take, timecode), adds a configurable watermark, and outputs two sets of files. </span><span style="color:rgb(29, 31, 37);"><b>Dailies render as H.264 at 720p </b></span><span style="color:rgb(29, 31, 37);">with the LUT, watermark, and burned metadata for the director and studio. 
</span><span style="color:rgb(29, 31, 37);"><b>Editorial proxies render as ProRes 422 at 1080p</b></span><span style="color:rgb(29, 31, 37);"> uncolored and without watermark, going straight to the assistant editor.</span></p><h3 class="heading" style="text-align:left;" id="local-processing-at-scale"><span style="color:rgb(29, 31, 37);">Local Processing at Scale</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Post Up&#39;s rendering engine accelerates standard hardware from roughly 50 frames per second to 320fps on a Mac Mini. The company also operates two processing nodes in Burbank and Portland that push to 2,500fps on server racks. Jeremy cited a recent project: </span><span style="color:rgb(29, 31, 37);"><b>21.5 hours of footage synced, colored, grouped into Avid, and QC&#39;d in 12 minutes and 40 seconds</b></span><span style="color:rgb(29, 31, 37);"> on those nodes. Studios like Paramount, Amazon MGM, and HBO use the server nodes for large-scale episodic work where security compliance and speed are critical.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The local option targets smaller productions that want the same acceleration without the infrastructure. A music video, documentary, or independent film can run the full pipeline on a laptop or Mac Mini at the camera.</span></p><h3 class="heading" style="text-align:left;" id="agnostic-workflow-integration"><span style="color:rgb(29, 31, 37);">Agnostic Workflow Integration</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Post Up aggregates sound reports, camera reports, camera metadata, and script reports into an ALE file and generates an Avid timeline for the assistant editor. Premiere XML export is also supported. The company&#39;s stated philosophy: do not force filmmakers to change their process. 
Post Up accelerates the existing workflow rather than replacing it.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Dailies can be sent directly to review platforms like Frame.io, and the system generates a dailies report automatically. Jeremy described the current dailies and proxy product as the first step, with master and archival pipeline stages in development.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Ghost Kits Keeps You Connected on Any Shoot With 600-Carrier Smart Switching</title>
  <description></description>
      <enclosure length="47054162" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/a7c5641d-3cd1-4ddf-8dab-db31c4be04e1/VPL_NAB_2026_-_Ghost_Kits_-_Edited.png"/>
  <link>https://www.vp-land.com/p/ghost-kits-keeps-you-connected-on-any-shoot-with-600-carrier-smart-switching</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/ghost-kits-keeps-you-connected-on-any-shoot-with-600-carrier-smart-switching</guid>
  <pubDate>Sun, 26 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-26T07:00:00Z</atom:published>
    <category><![CDATA[Gear]]></category>
    <category><![CDATA[Hardware]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Ghost Kits built a </span><a class="link" href="https://www.ghostkits.com?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=ghost-kits-keeps-you-connected-on-any-shoot-with-600-carrier-smart-switching" target="_blank" rel="noopener noreferrer nofollow">connectivity device</a><span style="color:rgb(29, 31, 37);"> that keeps photographers and content creators online no matter where they are shooting. The Spectrum model connects to 600 cellular carriers across 190 countries and automatically switches between them when signal degrades.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Key takeaways:</b></span></p><ul><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Smart carrier switching keeps uploads alive when stadiums, conferences, or remote locations kill your primary signal</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Bonds WiFi, ethernet, and cellular connections simultaneously for speed or redundancy</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Upcoming Kit to Cloud feature will let you offload SD card footage directly to any cloud destination</span></p></li></ul><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/JndbsFhH5lM" width="100%"></iframe><h3 class="heading" style="text-align:left;" id="smart-switching-not-smart-luck"><span style="color:rgb(29, 31, 37);">Smart Switching, Not Smart Luck</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Casper Hanney, CEO and Co-Founder, and Cyrus Keenan Eason, Head of Engineering, walked through 
the Spectrum model at NAB 2026. The core pitch: you turn it on, it emits a WiFi signal, and it keeps you connected.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The device comes with a preloaded SIM card. No bring-your-own, no contract hunting. You buy data from Ghost Kits at $7.50 per gigabyte with volume pricing that drops as usage scales. Because the company holds contracts with all 600 carriers, the backend handles the carrier juggling without any user intervention.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The real-world scenario they led with: an NFL photographer at SoFi Stadium, where Verizon degrades as tens of thousands of fans arrive. Ghost Kits detects the degradation and switches to T-Mobile or AT&T without dropping the connection.</span></p><h3 class="heading" style="text-align:left;" id="bonding-modes-for-every-situation"><span style="color:rgb(29, 31, 37);">Bonding Modes for Every Situation</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Beyond carrier switching, the device supports network bonding across cellular, WiFi, and ethernet simultaneously. Two modes are available: speed mode sends different data across each connection to maximize bandwidth, while redundancy mode sends the same data across multiple connections to ensure delivery.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Configuration happens through a web portal at a local URL when connected to the device hotspot. No app required. Users can change the SSID, password, cellular bands, and bonding preferences. A separate management portal handles fleet-level device tracking, usage monitoring, and subscription management.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Camera connections work via WiFi, USB-C for data, or ethernet. 
The USB-C port also provides power. Multiple devices can share the hotspot connection.</span></p><h3 class="heading" style="text-align:left;" id="kit-to-cloud-offload-without-a-lapt"><span style="color:rgb(29, 31, 37);">Kit to Cloud: Offload Without a Laptop</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The most interesting upcoming feature is Kit to Cloud, expected as a software-only update in the next six to twelve months. Plug an SD card, hard drive, or non-WiFi camera into the Ghost Kit via USB, and it uploads files directly to any cloud destination: Frame.io, Dropbox, Google Drive, S3, or anything else.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The interface lets users select specific files and destinations. The device reads the external media, offloads selected files, and verifies delivery. No laptop required.</span></p><h3 class="heading" style="text-align:left;" id="mounting-and-pricing"><span style="color:rgb(29, 31, 37);">Mounting and Pricing</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The device is light. Most of the weight comes from the battery. It mounts on tripods, belt clips, backpacks, or directly on camera rigs via a V-mount adapter. Available for purchase and rental through the Ghost Kits website.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Ghost Kits connects to 600 carriers in 190 countries, bonds WiFi/ethernet/cellular simultaneously, and will soon offload SD card footage to any cloud without a laptop.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Google AI Reaches Into Avid, ComfyUI, and Cloud Media Workflows at NAB</title>
  <description></description>
      <enclosure length="12267126" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/10df3c38-f008-4bcd-b95d-e892c8d24c04/VPL_NAB_2026_-_Google_-_Edited.png"/>
  <link>https://www.vp-land.com/p/google-ai-reaches-into-avid-comfyui-and-cloud-media-workflows-at-nab</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/google-ai-reaches-into-avid-comfyui-and-cloud-media-workflows-at-nab</guid>
  <pubDate>Fri, 24 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-24T07:00:00Z</atom:published>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Utility Ai]]></category>
    <category><![CDATA[Generative Ai]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Google laid out its </span><a class="link" href="https://cloud.google.com/solutions/media-entertainment?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=google-ai-reaches-into-avid-comfyui-and-cloud-media-workflows-at-nab" target="_blank" rel="noopener noreferrer nofollow">media and entertainment AI strategy</a><span style="color:rgb(29, 31, 37);"> at NAB 2026 with </span><span style="color:rgb(13, 13, 13);font-family:Roboto, Noto, sans-serif;font-size:15px;">Anshul Kapoor</span><span style="color:rgb(29, 31, 37);">, Director of Media and AI. The approach is straightforward: bring AI to the tools professionals already use, rather than requiring them to learn a new interface.</span></p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/ppMWxjsmk2k" width="100%"></iframe><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Avid integration.</b></span><span style="color:rgb(29, 31, 37);"> Google&#39;s AI models now run inside Avid&#39;s editorial tools, letting editors access capabilities like automated transcription, smart search, and AI-assisted assembly without leaving their timeline. The partnership targets the reality that most professional editors are not going to switch away from their NLE.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>ComfyUI on Google Cloud. </b></span><span style="color:rgb(29, 31, 37);">Media companies running ComfyUI for generative workflows can host the entire pipeline on Google Cloud, accessing Veo, Gemini, and Nano Banana models while keeping all content inside their own VPC. 
Google recommends cloud hosting over local deployment for both compute capacity and security compliance, since content never leaves the customer&#39;s data protection perimeter.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Search as a backbone. </b></span><span style="color:rgb(29, 31, 37);">Google showed a universal search capability that indexes every asset in a media library — video footage, transcripts, metadata — and makes it searchable across the organization. Fox Sports uses the system as both a productivity tool for editors and a revenue-generation tool for sales teams finding specific clips for advertisers.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Agentic workflows ahead. </b></span><span style="color:rgb(29, 31, 37);">Kapoor said the next 12 months will focus on two areas: more capable autonomous agents that can chain multi-step creative tasks, and tighter security controls so media companies can deploy those agents with confidence around their most valuable assets. Partnerships with existing tool vendors will expand, following the Avid and ComfyUI pattern.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Takeaway. </b></span><span style="color:rgb(29, 31, 37);">Google&#39;s positioning is infrastructure, not interface. The company wants to be the AI engine behind the tools filmmakers already trust, with enough security and control that enterprise media companies will actually adopt it.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Bit Part's Bitbox Max Quadruples Radio Power for Remote Camera Control</title>
  <description></description>
      <enclosure length="1141132" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/a0470794-4f3a-4c02-81f3-32d79cc340fa/VPL_NAB_2026_-_Bitbox_-_Edited.png"/>
  <link>https://www.vp-land.com/p/bit-part-s-bitbox-max-quadruples-radio-power-for-remote-camera-control</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/bit-part-s-bitbox-max-quadruples-radio-power-for-remote-camera-control</guid>
  <pubDate>Thu, 23 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-23T07:00:00Z</atom:published>
    <category><![CDATA[Gear]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Bit Part rolled out a suite of updates to its wireless camera control ecosystem at NAB 2026, headlined by the </span><a class="link" href="https://bitpart.com?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=bit-part-s-bitbox-max-quadruples-radio-power-for-remote-camera-control" target="_blank" rel="noopener noreferrer nofollow">Bitbox Max</a><span style="color:rgb(29, 31, 37);"> with a custom radio module that delivers up to 1 watt of transmit power, up from 250 milliwatts in the original unit. The Max fits in the identical housing, and current customers can trade in their units for a board swap.</span></p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/me1uq9BxV6E" width="100%"></iframe><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Bit Control app for Sony. </b></span><span style="color:rgb(29, 31, 37);">Bit Part built its own camera control app using Sony&#39;s SDK, replacing the web UI that Venice operators have dealt with for years. The app sends only kilobits of data instead of megabits, eliminating the refresh lag that plagued Sony&#39;s browser-based interface. It runs on iPad and Mac, and the Mac version supports Stream Deck integration for physical knobs and buttons to adjust ND filters, ISO, and other settings.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>USB-C Mini for compact cameras.</b></span><span style="color:rgb(29, 31, 37);"> A new USB-C version of the Bitbox Mini targets cameras like the FX3 and FX6 that only have USB-C connections. 
One cable provides power and network connectivity.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Roaming mode for live events. </b></span><span style="color:rgb(29, 31, 37);">A new utility mode lets two base stations hand off a camera seamlessly as it moves between locations, designed for scenarios like a camera following players from a stadium tunnel onto the field. The utility also supports frequency locking for coordination with RF managers at live events.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Expansion module and focus. </b></span><span style="color:rgb(29, 31, 37);">The expansion module, still in development, was shown working with Moon Smart Focus for lens control and C Motion handsets for pulling focus, all running over the bitbox radio alongside camera control.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Availability. </b></span><span style="color:rgb(29, 31, 37);">The Bit Control app is in beta. New hardware products are expected to roll out in stages starting late summer 2026. Bit Part operates as a four-person team that also works as DITs on professional sets.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">We</span><a class="link" href="https://www.vp-land.com/p/bitpart-bitbox-at-nab-2024?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=bit-part-s-bitbox-max-quadruples-radio-power-for-remote-camera-control" target="_blank" rel="noopener noreferrer nofollow"> first covered Bit Part</a><span style="color:rgb(29, 31, 37);"> at NAB 2024 when bitbox debuted.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Atomos Expands From On-Set Monitors to Full Production Pipeline With Flanders Acquisition</title>
  <description></description>
      <enclosure length="1350461" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/c42df644-47a8-4646-9d36-cc00d33acba5/VPL_NAB_2026_-_Atomos_-_Edited.png"/>
  <link>https://www.vp-land.com/p/atomos-expands-from-on-set-monitors-to-full-production-pipeline-with-flanders-acquisition</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/atomos-expands-from-on-set-monitors-to-full-production-pipeline-with-flanders-acquisition</guid>
  <pubDate>Wed, 29 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-29T07:00:00Z</atom:published>
    <category><![CDATA[Gear]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Atomos </span><a class="link" href="https://www.atomos.com/2026/04/17/atomos-nab-showcase-2026/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=atomos-expands-from-on-set-monitors-to-full-production-pipeline-with-flanders-acquisition" target="_blank" rel="noopener noreferrer nofollow">used NAB 2026 </a><span style="color:rgb(29, 31, 37);">to launch the </span><span style="color:rgb(29, 31, 37);"><b>Sumo PRO-19 monitor/recorder/switcher</b></span><span style="color:rgb(29, 31, 37);">, </span><span style="color:rgb(29, 31, 37);"><b>wireless production headphones</b></span><span style="color:rgb(29, 31, 37);">, a rack-mount </span><span style="color:rgb(29, 31, 37);"><b>Shogun AV-19</b></span><span style="color:rgb(29, 31, 37);">, an advanced </span><span style="color:rgb(29, 31, 37);"><b>PTZ controller</b></span><span style="color:rgb(29, 31, 37);">, and a </span><span style="color:rgb(29, 31, 37);"><b>CF Express card reader</b></span><span style="color:rgb(29, 31, 37);">. The bigger move: acquiring Flanders Scientific, giving Atomos a presence from 5-inch on-camera monitors through 65-inch reference displays for color grading.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The product blitz signals Atomos pushing beyond its core monitor/recorder business into a full production ecosystem. 
&quot;We now have the ability to support everybody from small 5-inch on-camera monitors all the way up to a 65-inch reference display,&quot; said CEO Peter Barber.</span></p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/vAURDUyFZCc" width="100%"></iframe><h3 class="heading" style="text-align:left;" id="sumo-pro-19-4-k-monitor-recorder-an"><span style="color:rgb(29, 31, 37);">Sumo PRO-19: 4K Monitor, Recorder, and Switcher Combined</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The </span><a class="link" href="https://www.atomos.com/product/sumo-pro-19-19-4k-monitor-recorder-switcher/?country=AU&utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=atomos-expands-from-on-set-monitors-to-full-production-pipeline-with-flanders-acquisition" target="_blank" rel="noopener noreferrer nofollow">Sumo PRO-19</a><span style="color:rgb(29, 31, 37);"> is a complete rebuild of Atomos&#39; popular Sumo line, not an upgrade. It features a 4K screen, CF Express Type B recording, USB external recording, camera-to-cloud connectivity, and a four-input touchscreen switcher. NDI support is built in with no extra license required.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The unit works across on-set, DIT cart, and studio environments. Removable feet and mounting points on all sides support portrait mode and custom rigging. Barber emphasized that the PRO-19 combines all Sumo and Ninja features into one device: &quot;All the features of Sumo and Ninja combined together onto this.&quot;</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">NDI included. All new Ninjas and the Sumo PRO-19 ship with NDI built in at no additional cost. 
The integration enables sending video from on-camera monitors to the Sumo or anywhere on a network.</span></p><h3 class="heading" style="text-align:left;" id="shogun-av-19-rack-mount-production-"><span style="color:rgb(29, 31, 37);">Shogun AV-19: Rack-Mount Production Monitor</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The </span><a class="link" href="https://www.atomos.com/product/shogun-av-19/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=atomos-expands-from-on-set-monitors-to-full-production-pipeline-with-flanders-acquisition" target="_blank" rel="noopener noreferrer nofollow">Shogun AV-19</a><span style="color:rgb(29, 31, 37);"> takes the Sumo PRO-19&#39;s feature set and puts it in a rack-mount enclosure, removing Wi-Fi and the rugged outer shell to reduce cost. It retains Ethernet connectivity, CF Express recording, monitoring tools, and camera-to-cloud features. The unit targets AV racks, broadcast facilities, and portable production cases where a full Sumo isn&#39;t necessary.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Both the Sumo PRO-19 and Shogun AV-19 run Atomos&#39; updated operating system with automatic over-the-air updates, eliminating the manual download-and-install process from previous generations.</span></p><h3 class="heading" style="text-align:left;" id="wireless-headphones-and-cf-express-"><span style="color:rgb(29, 31, 37);">Wireless Headphones and CF Express Reader</span></h3><p class="paragraph" style="text-align:left;"><a class="link" href="https://www.atomos.com/product/studiosonic-air-headphones?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=atomos-expands-from-on-set-monitors-to-full-production-pipeline-with-flanders-acquisition" target="_blank" rel="noopener noreferrer nofollow">StudioSonic AIR headphones</a><span style="color:rgb(29, 31, 37);">. 
Atomos added a wireless version of its production headphones, with active noise reduction and a smaller form factor than the wired originals. The wired pair launched with zero-latency direct connections; the wireless version trades some latency for mobility.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">USB4 CF Express card reader. Built from aircraft-grade aluminum for heat dissipation, the new reader uses USB4 for transfer speeds four times faster than the previous generation. It supports direct camera-to-cloud uploads alongside local transfers.</span></p><h3 class="heading" style="text-align:left;" id="ptz-controller-managing-up-to-254-c"><span style="color:rgb(29, 31, 37);">PTZ Controller: Managing Up to 254 Cameras</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Atomos&#39; advanced </span><a class="link" href="https://www.atomos.com/product-category/cameras/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=atomos-expands-from-on-set-monitors-to-full-production-pipeline-with-flanders-acquisition" target="_blank" rel="noopener noreferrer nofollow">PTZ controller</a><span style="color:rgb(29, 31, 37);"> features a built-in touchscreen for camera selection and control. It handles auto-tracking, manual tracking, keyframes, focus, zoom, and brightness adjustment. The controller outputs video to an external monitor and can manage up to 254 cameras from a single panel.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The system uses standard protocols, so it works with PTZ cameras from other manufacturers. 
Production teams can mix Atomos cameras with existing hardware and control everything from one unit.</span></p><h3 class="heading" style="text-align:left;" id="flanders-scientific-acquisition"><span style="color:rgb(29, 31, 37);">Flanders Scientific Acquisition</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Atomos </span><a class="link" href="https://www.atomos.com/2026/04/07/flanders-scientific-aquisition/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=atomos-expands-from-on-set-monitors-to-full-production-pipeline-with-flanders-acquisition" target="_blank" rel="noopener noreferrer nofollow">acquired Flanders Scientific</a><span style="color:rgb(29, 31, 37);">, maker of the reference monitors used in color grading suites, finishing bays, and broadcast mastering facilities. The entire Flanders team is joining Atomos.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The acquisition gives Atomos coverage across the full production pipeline. According to Barber, features will move between the product lines where it makes sense, but the workflows remain distinct: &quot;Obviously the feature set of an on-set Sumo that does recording and touchscreen and camera control isn&#39;t required in a reference monitor.&quot;</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Reference monitors serve a different purpose than on-set confidence monitors. Studios delivering to Netflix, broadcasters, and other distributors need calibrated reference displays for color-accurate finishing. Flanders&#39; expertise in that space complements Atomos&#39; on-set and field production strength.</span></p><h3 class="heading" style="text-align:left;" id="availability"><span style="color:rgb(29, 31, 37);">Availability</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The Shogun AV-19 is available now. 
The Sumo PRO-19 ships in May. Ecosystem products (headphones, card reader, PTZ controller) are expected in May and June. Flanders Scientific continues operating under its existing brand while integrating with Atomos&#39; development pipeline.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Presaige Scores Your Images and Videos Before You Post Them</title>
  <description></description>
      <enclosure length="1598599" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/b64825af-afe1-4f12-85c4-ae3cfa14361a/VPL_NAB_2026_-_Presaige_-_Edited.png"/>
  <link>https://www.vp-land.com/p/presaige-scores-your-images-and-videos-before-you-post-them</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/presaige-scores-your-images-and-videos-before-you-post-them</guid>
  <pubDate>Thu, 30 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-30T07:00:00Z</atom:published>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Utility Ai]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Presaige </span><a class="link" href="https://www.presaige.ai?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=presaige-scores-your-images-and-videos-before-you-post-them" target="_blank" rel="noopener noreferrer nofollow">analyzes images and videos</a><span style="color:rgb(29, 31, 37);"> pixel by pixel and predicts how likely they are to generate engagement before they go live. Co-founder</span> David Gioiella<span style="color:rgb(29, 31, 37);"> demoed the platform at NAB 2026, showing how brands like NBC and Lyft are using confidence scores to pick thumbnails, order carousels, and iterate on generative AI assets.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Key takeaways:</b></span></p><ul><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>10 million+ image training set</b></span><span style="color:rgb(29, 31, 37);"> correlates pixel-level data with real engagement metrics</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Platform-agnostic scoring</b></span><span style="color:rgb(29, 31, 37);">: no need to select a demographic or distribution platform</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Thumbnail selector</b></span><span style="color:rgb(29, 31, 37);"> picks the best frame from a video automatically</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Carousel ordering</b></span><span style="color:rgb(29, 31, 37);"> ranks images by predicted engagement for optimal sequence</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>NBC Olympics and Lyft</b></span><span style="color:rgb(29, 31, 37);"> among the clients using 
Presaige to optimize deployed creative</span></p></li></ul><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/6tVQGXEMTU8" width="100%"></iframe><h3 class="heading" style="text-align:left;" id="pixel-analysis-not-content-analysis"><span style="color:rgb(29, 31, 37);">Pixel Analysis, Not Content Analysis</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Presaige&#39;s algorithm does not evaluate what an image depicts. It analyzes pixel-level patterns and correlates them with engagement data from a training library of over 10 million images and videos. The scores reflect how similar visual patterns performed when deployed, independent of subject matter, platform, or target audience. Gioiella was explicit: the system does not care about the concept, aesthetics, or golden ratio. It compares pixels against the dataset and returns a confidence score.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">This approach means a user does not need to specify whether the content is for TikTok, Instagram, or broadcast. The algorithm normalizes for platform differences and demographic variation, on the premise that a strong image performs well regardless of context.</span></p><h3 class="heading" style="text-align:left;" id="thumbnail-selection-from-video"><span style="color:rgb(29, 31, 37);">Thumbnail Selection From Video</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The thumbnail selector tool scans every frame of a video and surfaces one to four candidates ranked by predicted engagement. Science YouTuber Jack Gordon, who has been testing the tool, puts his thumbnail options into Presaige and deploys the one with the highest score instead of A/B testing on YouTube. 
For longer videos, creators focus on scoring the first six seconds to confirm the hook shot rates high, since Presaige limits video analysis to 90 seconds due to processing cost.</span></p><h3 class="heading" style="text-align:left;" id="scoring-generative-ai-assets"><span style="color:rgb(29, 31, 37);">Scoring Generative AI Assets</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">For AI-generated imagery, Presaige offers an apples-to-apples comparison workflow. Gioiella showed a Lyft project where three text-to-image generations in Nano Banana Pro produced slightly different results. Instead of subjective selection, the team deployed the image with the highest Presaige score into video generation, scored three video outputs, and moved the top-scoring one into production. Each result comes with a readiness score on a 1-to-10 scale and specific improvement suggestions for users who want to push the score higher.</span></p><h3 class="heading" style="text-align:left;" id="from-carousels-to-broadcast-promos"><span style="color:rgb(29, 31, 37);">From Carousels to Broadcast Promos</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">NBC used Presaige to optimize a figure skating promo for the Olympics that reached nearly 20 million views on social platforms. Gioiella iterated on the edit using Presaige scores until the piece reached a level he was confident deploying. For carousel posts, the platform scores each image and recommends the optimal sequence. 
Presaige is also building integrations with media management platforms like</span> Simeon<span style="color:rgb(29, 31, 37);"> and</span> Freepik<span style="color:rgb(29, 31, 37);">, where the scoring would run inside existing review and approval workflows via API.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Presaige offers a free tier with image scoring, a premium tier that unlocks video analysis, and an enterprise tier with custom pricing and full API access.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Pixera Turns Every LED Wall Into a Live Interactive Environment With One Platform</title>
  <description></description>
      <enclosure length="1603966" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/bf6ca081-ac8d-4645-8b1c-8e7a9c54ce52/VPL_NAB_2026_-_Pixera_-_Edited.png"/>
  <link>https://www.vp-land.com/p/pixera-turns-every-led-wall-into-a-live-interactive-environment-with-one-platform</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/pixera-turns-every-led-wall-into-a-live-interactive-environment-with-one-platform</guid>
  <pubDate>Sat, 25 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-25T07:00:00Z</atom:published>
    <category><![CDATA[Software]]></category>
    <category><![CDATA[Camera Tracking]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Generative Ai]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Pixera is building a </span><a class="link" href="https://pixera.one?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=pixera-turns-every-led-wall-into-a-live-interactive-environment-with-one-platform" target="_blank" rel="noopener noreferrer nofollow">unified platform for real-time content</a><span style="color:rgb(29, 31, 37);"> on LED walls, virtual productions, and live events. At NAB 2026, Connor McGill showed how Pixera handles everything from pixel mapping to interactive content triggering without switching between separate tools.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Key takeaways:</b></span></p><ul><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Single platform manages LED wall content, virtual production stages, and live event environments</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Real-time content triggering responds to physical inputs, sensors, and audience interaction</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Pixar-level render quality on LED volumes for virtual production backgrounds</span></p></li></ul><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/GFmgKza3Fto" width="100%"></iframe><h3 class="heading" style="text-align:left;" id="one-platform-for-every-screen"><span style="color:rgb(29, 31, 37);">One Platform for Every Screen</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Pixera started in virtual production, powering LED volumes that display photorealistic backgrounds for film and TV shoots. 
The platform renders environments in real time, matching the perspective of the camera as it moves. McGill showed a desert environment rendered on an LED wall with parallax-correct tracking.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The scope has expanded. Pixera now handles any LED installation: concert stages, corporate events, broadcast studios, architectural displays. The same engine drives all of them.</span></p><h3 class="heading" style="text-align:left;" id="interactive-content-that-responds-t"><span style="color:rgb(29, 31, 37);">Interactive Content That Responds to the Real World</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The demonstration at NAB focused on interactivity. McGill showed content on an LED wall that responded to a physical ball being thrown at it. The wall detected the impact and triggered visual effects at the exact point of contact.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">This works through sensor integration. Pixera connects to external inputs including motion sensors, cameras, and custom hardware. When a sensor detects an event, Pixera triggers the corresponding content in real time on the LED surface.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The use cases range from concert visuals that react to performer movement to corporate installations where visitors interact with branded content by touching or approaching the display.</span></p><h3 class="heading" style="text-align:left;" id="pixel-mapping-and-content-managemen"><span style="color:rgb(29, 31, 37);">Pixel Mapping and Content Management</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Pixera includes pixel mapping tools that let operators assign content to specific sections of an LED wall or across multiple walls. 
The mapping interface handles irregular shapes, curved surfaces, and multi-panel arrays.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Content management happens in a timeline-based editor. Operators can pre-program sequences, cue live content changes, and blend pre-rendered video with real-time rendered elements. The timeline supports layering, so a background environment, a mid-ground effect, and a foreground interactive element can all run independently.</span></p><h3 class="heading" style="text-align:left;" id="virtual-production-backgrounds"><span style="color:rgb(29, 31, 37);">Virtual Production Backgrounds</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">For film and TV, Pixera renders photorealistic environments on LED volumes. The engine handles camera tracking, color accuracy for on-camera talent, and real-time adjustments to lighting and perspective. The goal is to replace green screen with environments that look correct on camera and provide realistic lighting on set.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Pixera runs on NVIDIA RTX hardware. The rendering pipeline supports Unreal Engine integrations for teams that want to build environments in Unreal and drive them through Pixera.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Pixera unifies LED wall content, virtual production rendering, and interactive event experiences under one real-time platform.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Foundry Brings Griptape AI Agents Into Nuke, Blender, and Maya via MCP</title>
  <description></description>
      <enclosure length="1352799" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/183e9a20-bab1-417e-9396-4d07f6ab9f52/VPL_NAB_2026_-_Foundry_-_Edited.png"/>
  <link>https://www.vp-land.com/p/foundry-brings-griptape-ai-agents-into-nuke-blender-and-maya-via-mcp</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/foundry-brings-griptape-ai-agents-into-nuke-blender-and-maya-via-mcp</guid>
  <pubDate>Fri, 24 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-24T07:00:00Z</atom:published>
    <category><![CDATA[Vad / Unreal Engine]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Ar &amp; Xr]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The acquisition is paying off: Foundry is embedding AI agents directly into artist workflows using Model Context Protocol, and showing live compositing demos at NAB.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Foundry used NAB 2026 to show the first tangible results of its February acquisition of Griptape, an AI orchestration platform. Chief Product Officer Lauren Morris walked through a live demonstration of AI agents running inside Nuke, Blender, and Maya via the Model Context Protocol (MCP), all connected to a centralized project brain that understands context across the pipeline.</span></p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/sTOKiJRe0Nk" width="100%"></iframe><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Agents in the artist&#39;s tool.</b></span><span style="color:rgb(29, 31, 37);"> Instead of routing AI tasks through a separate web interface, Foundry is placing agents inside the applications artists already use. An MCP server exposes Nuke&#39;s node graph to a large language model, which can then execute complex compositing operations through natural-language instructions. Morris showed an agent performing a full cleanup pass on a shot: identifying the problem area, building the node tree, and executing the composite without manual node creation.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Blender and Maya integration. </b></span><span style="color:rgb(29, 31, 37);">Similar MCP servers connect Blender and Maya to the same agent framework. 
The system can chain operations across applications, for example generating a 3D element in Blender and automatically importing it into a Nuke composite.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>CopyCat upgrades. </b></span><span style="color:rgb(29, 31, 37);">Foundry&#39;s existing machine-learning tool inside Nuke, CopyCat, received updates for faster training and more robust rotoscoping. The long-term direction is a unified AI layer where CopyCat handles pixel-level tasks while Griptape agents manage higher-level pipeline orchestration.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Enterprise security.</b></span><span style="color:rgb(29, 31, 37);"> Because Griptape runs on-premises or in a customer&#39;s cloud VPC, media companies retain full control over their content. Foundry emphasized that no assets leave the production environment during AI processing.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>ComfyUI and Google partnership. </b></span><span style="color:rgb(29, 31, 37);">Foundry showed its tools running alongside ComfyUI on Google Cloud, part of a broader industry pattern of bringing AI models to existing artist tools rather than forcing artists into new platforms.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Timeline.</b></span><span style="color:rgb(29, 31, 37);"> The MCP integrations are in early access. Foundry plans wider availability through 2026, with a focus on compositing and rotoscoping workflows first, expanding to layout, lighting, and scene assembly.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>DJI RS5 Adds Built-In Object Tracking, Teflon Arms, and a Vertical Bounce Indicator</title>
  <description></description>
      <enclosure length="1278853" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/90839d16-559e-46ac-a467-1aef47ea1666/VPL_NAB_2026_-_DJI_-_Edited.png"/>
  <link>https://www.vp-land.com/p/dji-rs5-adds-built-in-object-tracking-teflon-arms-and-a-vertical-bounce-indicator</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/dji-rs5-adds-built-in-object-tracking-teflon-arms-and-a-vertical-bounce-indicator</guid>
  <pubDate>Fri, 24 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-24T07:00:00Z</atom:published>
    <category><![CDATA[Gear]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">DJI&#39;s </span><a class="link" href="https://www.dji.com/global/rs-5?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=dji-rs5-adds-built-in-object-tracking-teflon-arms-and-a-vertical-bounce-indicator" target="_blank" rel="noopener noreferrer nofollow">RS5 gimbal</a><span style="color:rgb(29, 31, 37);"> introduces three features aimed at making gimbal work faster for both beginners and experienced operators: Teflon-coated arms with individual adjustment knobs, a built-in tracking module that works without a smartphone, and a vertical bounce indicator that shows whether your walking motion is introducing shake.</span></p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/05xA5rY0yqU" width="100%"></iframe><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Teflon arms and knobs. </b></span><span style="color:rgb(29, 31, 37);">Every axis (pan, tilt, roll) gets its own fine-tuning adjustment knob. Combined with Teflon-coated sliding arms, balancing becomes faster and more intuitive. Experienced operators can balance in under 60 seconds, while newcomers get a more forgiving process that avoids the nudge-and-overshoot loop common with older gimbals.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Vertical bounce indicator. </b></span><span style="color:rgb(29, 31, 37);">An on-screen display shows a real-time readout of vertical bounce while walking. Red means the shake will show up in footage, green means it will not. 
The indicator helps new operators learn proper gimbal walking technique and lets experienced operators test how fast they can move before introducing artifacts.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Built-in tracking module. </b></span><span style="color:rgb(29, 31, 37);">The new tracking module locks onto people, cars, animals, and objects. Operators point the gimbal at a subject, press the trigger, and the RS5 maintains composition. A side screen shows detected subjects. Operators can apply composition offsets (placing the subject camera-left, for example) and adjust tracking sensitivity. The module works entirely on-device with no phone pairing required, unlike DJI&#39;s older RavenEye system.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Gesture control. </b></span><span style="color:rgb(29, 31, 37);">A wave-to-track feature lets operators start tracking themselves hands-free, useful for solo shooters.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Powered rear handle. </b></span><span style="color:rgb(29, 31, 37);">A rear grip with integrated controls ships in the combo kit, providing ergonomic low-angle operation without external wiring.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>No DJI Focus compatibility. </b></span><span style="color:rgb(29, 31, 37);">The RS5 does not currently integrate with DJI&#39;s follow focus system. Third-party solutions from companies like Tilta may work depending on their own integration efforts.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Pricing. </b></span><span style="color:rgb(29, 31, 37);">The base gimbal costs $569. The combo kit at $719 includes the tracking module, powered rear handle grip, and carrying case.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Blackmagic Adds 100G Ports, B4 Mounts, and Immersive Live Output to Cinema Cameras</title>
  <description></description>
      <enclosure length="1241089" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/b4a25787-6cce-44df-90b7-04f6b58d6928/VPL_NAB_2026_-_Blackmagic_Camera_-_Edited.png"/>
  <link>https://www.vp-land.com/p/blackmagic-adds-100g-ports-b4-mounts-and-immersive-live-output-to-cinema-cameras</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/blackmagic-adds-100g-ports-b4-mounts-and-immersive-live-output-to-cinema-cameras</guid>
  <pubDate>Sat, 02 May 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-05-02T07:00:00Z</atom:published>
    <category><![CDATA[Gear]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Blackmagic Design is bridging the gap between cinema and live broadcast with a suite of updates to the URSA Cine 12K platform.</span> Bob Caniglia<span style="color:rgb(29, 31, 37);">, Director of Sales Operations for North America, showed how a 100G port, B4 lens mount support, and a new immersive live capture module turn cinema cameras into live broadcast tools. We</span><a class="link" href="https://www.vp-land.com/p/ursa-cine-immersive-100g-puts-blackmagic-s-dual-8k-immersive-cinema-on-live-broadcast-infrastructure?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=blackmagic-adds-100g-ports-b4-mounts-and-immersive-live-output-to-cinema-cameras" target="_blank" rel="noopener noreferrer nofollow"> previously covered</a><span style="color:rgb(29, 31, 37);"> the URSA Cine Immersive 100G announcement.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Key takeaways:</b></span></p><ul><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>URSA Cine 12K gets a 100G port</b></span><span style="color:rgb(29, 31, 37);"> for faster data offload and live broadcast workflows</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>B4 mount support</b></span><span style="color:rgb(29, 31, 37);"> enables broadcast-style long-range lenses on cinema cameras</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Immersive live module outputs ProRes</b></span><span style="color:rgb(29, 31, 37);"> for both eyes simultaneously through the 100G port</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Future 100G 3D switcher</b></span><span style="color:rgb(29, 31, 37);"> shown under glass, handling both 2D and 3D 
workflows</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Up to 440fps slow motion in HD</b></span><span style="color:rgb(29, 31, 37);"> through the live pipeline</span></p></li></ul><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/lLMFX-1QD2Q" width="100%"></iframe><h2 class="heading" style="text-align:left;" id="cinema-cameras-go-live"><span style="color:rgb(29, 31, 37);">Cinema Cameras Go Live</span></h2><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Blackmagic has watched its cinema cameras being adopted for live event production, from concerts to sports to immersive experiences. The new URSA Cine 12K 100G adds a 100-gigabit port to the back, moves some controls around on the side, and pairs with a new studio viewfinder that connects via USB-C. One of the SDI outputs can become an input, enabling full camera control through an ATEM switcher. The body-only price sits just under $10,000, with the immersive version just under $30,000. Both are expected in June, and the previous 10G version received a price drop of a couple thousand dollars.</span></p><h2 class="heading" style="text-align:left;" id="immersive-goes-real-time"><span style="color:rgb(29, 31, 37);">Immersive Goes Real-Time</span></h2><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The immersive module replaces the standard storage module and creates ProRes files for each eye that exit through the 100G port. Caniglia noted that the output is not raw data because of the real-time processing requirements, but the resolution is project-dependent and tunable for the destination. 
The use case: live immersive content for events like basketball games, where appointment viewing drives demand for the best available technology.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">A future 100G switcher was shown under glass at the booth. It handles 3D content with separate right and left eye capability, something the current ATEM models cannot do. The same switcher also covers standard 2D work, making it a single platform for any broadcast format.</span></p><h2 class="heading" style="text-align:left;" id="faster-offloads-and-slow-motion"><span style="color:rgb(29, 31, 37);">Faster Offloads and Slow Motion</span></h2><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The 100G port also accelerates data wrangling. Moving footage off camera modules that previously used a 10G port now runs ten times faster. Blackmagic created matching 100G reader modules to support the throughput. For live production, the system can deliver up to 440fps in HD by narrowing the sensor readout. A new HyperDeck IP recorder with 100G support can record eight channels simultaneously. Blackmagic is also introducing converters that bridge standard 12G SDI to the 100G ecosystem, so facilities can adopt the new standard incrementally.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Saturation Builds the Budgeting App That Actually Understands How Film Productions Spend Money</title>
  <description></description>
      <enclosure length="508948" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/1f21ffc5-8fc1-4802-be6b-1e3155927f26/VPL_NAB_2026_-_Saturation_-_Edited.png"/>
  <link>https://www.vp-land.com/p/saturation-builds-the-budgeting-app-that-actually-understands-how-film-productions-spend-money</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/saturation-builds-the-budgeting-app-that-actually-understands-how-film-productions-spend-money</guid>
  <pubDate>Thu, 23 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-23T07:00:00Z</atom:published>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Utility Ai]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Saturation is a </span><a class="link" href="https://saturation.io?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=saturation-builds-the-budgeting-app-that-actually-understands-how-film-productions-spend-money" target="_blank" rel="noopener noreferrer nofollow">production budgeting platform</a><span style="color:rgb(29, 31, 37);"> built specifically for the way film and TV productions actually operate. At NAB 2026, co-founder and CEO Jens Jacob showed how Saturation handles the complexity that spreadsheets and generic accounting tools cannot.</span></p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/9XyFhD3bikc" width="100%"></iframe><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Key takeaways:</span></p><ul><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Purpose-built for film production budgets with line-item tracking tied to specific departments, locations, and shooting days</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Real-time collaboration lets producers, line producers, and accountants work in the same document simultaneously</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Template library includes budgets from real productions that users can adapt to their own projects</span></p></li></ul><h3 class="heading" style="text-align:left;" id="why-not-just-use-excel"><span style="color:rgb(67, 67, 67);">Why Not Just Use Excel</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Film production budgets are not standard financial documents. 
They track hundreds of line items across departments, tie spending to specific shooting days and locations, and need to handle contingencies, fringes, tax incentives, and currency conversions all at once. A spreadsheet can hold the numbers but cannot enforce the relationships between them.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Saturation models those relationships. Change the number of shoot days and every department budget updates. Adjust the location and the tax incentive calculation changes. Move a scene from day five to day three and the logistics costs recalculate.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Jacob demonstrated a budget for a commercial production that automatically adjusted crew costs, equipment rentals, and location fees when the shoot schedule changed from five days to four. The software recalculated overtime, travel, and per-diem allocations across every affected department.</span></p><h3 class="heading" style="text-align:left;" id="built-for-the-actual-workflow"><span style="color:rgb(67, 67, 67);">Built for the Actual Workflow</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The platform mirrors how productions actually build budgets. Users start from templates based on real productions: feature films, commercials, music videos, documentaries. Each template includes the line items, category structures, and formula relationships that match that type of production.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">From there, producers customize. Saturation supports top-down budgeting (start with a total and allocate down) and bottom-up budgeting (build from individual line items up to a total). 
Both approaches stay linked, so changes at any level propagate correctly.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Collaboration is real-time. Multiple users can work in the same budget simultaneously. Jacob showed a scenario where a line producer adjusted equipment costs in one tab while the accountant updated fringes in another, with both changes reflected in the master budget immediately.</span></p><h3 class="heading" style="text-align:left;" id="integration-and-export"><span style="color:rgb(67, 67, 67);">Integration and Export</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Saturation integrates with accounting systems and payroll providers used in production. Budgets can be exported as PDFs, CSV files, or formatted reports that match standard industry submission formats.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The platform also handles cost tracking during production. Actual spending gets logged against the budget in real time, showing variances as they happen rather than after the production wraps.</span></p><h3 class="heading" style="text-align:left;" id="who-uses-it"><span style="color:rgb(67, 67, 67);">Who Uses It</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Jacob said the user base spans independent producers managing their first feature film budget, commercial production companies running multiple simultaneous projects, and studios managing large-scale productions. The template system scales from a $50,000 documentary to a $50 million feature.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Saturation is a production budgeting platform that ties line items to shoot schedules, locations, and departments so every change propagates automatically.</span></p></div></div>
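The schedule-linked recalculation described above can be pictured as a toy model. This is a minimal sketch, not Saturation's actual implementation; the LineItem class, rates, and department names are all invented for illustration. Each line item carries a per-day rate tied to the shoot-day count, so changing the schedule re-derives every department total:

```python
# Illustrative sketch (not Saturation's code): budget line items tied to
# shoot days, so changing the schedule re-derives every department total.
from dataclasses import dataclass

@dataclass
class LineItem:
    name: str
    department: str
    daily_rate: float       # cost incurred per shoot day
    flat_cost: float = 0.0  # schedule-independent cost (permits, buyouts)

    def total(self, shoot_days: int) -> float:
        return self.daily_rate * shoot_days + self.flat_cost

def department_totals(items: list[LineItem], shoot_days: int) -> dict[str, float]:
    """Roll line items up into per-department totals for a given schedule."""
    totals: dict[str, float] = {}
    for item in items:
        totals[item.department] = totals.get(item.department, 0.0) + item.total(shoot_days)
    return totals

items = [
    LineItem("Gaffer", "Electric", 800.0),
    LineItem("Camera rental", "Camera", 1200.0),
    LineItem("Location permit", "Locations", 0.0, flat_cost=2500.0),
]

# Cut the schedule from five days to four: per-day costs shrink,
# flat costs stay put, and every affected department updates at once.
five_day = department_totals(items, 5)
four_day = department_totals(items, 4)
```

A spreadsheet would hold the same numbers, but nothing enforces that every daily-rate cell references the shoot-day count; modeling the relationship explicitly is what makes the change propagate.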
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Puget Systems on Why Workstations Are Moving to the Server Room</title>
  <description></description>
      <enclosure length="986089" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/7104b192-0657-4c38-ab8c-426a35f3c7a7/VPL_NAB_2026_-_Puget_Systems_-_Edited.png"/>
  <link>https://www.vp-land.com/p/puget-systems-on-why-workstations-are-moving-to-the-server-room</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/puget-systems-on-why-workstations-are-moving-to-the-server-room</guid>
  <pubDate>Thu, 30 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-30T07:00:00Z</atom:published>
    <category><![CDATA[Hardware]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Puget Systems founder Jon Bach sees a fundamental shift in where creative compute happens. Workstations aren&#39;t disappearing — they&#39;re moving off desks and into server rooms, giving teams full hardware performance without being physically tethered to a tower.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">At NAB 2026, Bach outlined the </span><a class="link" href="https://www.pugetsystems.com/press/press-release-puget-systems-debuts-the-puget-experience-at-nab-2026/?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=puget-systems-on-why-workstations-are-moving-to-the-server-room" target="_blank" rel="noopener noreferrer nofollow">trends reshaping creative hardware</a><span style="color:rgb(29, 31, 37);">: on-prem rack workstations, memory shortages driven by AI demand, and the economics of when to move workloads off the cloud and onto your own hardware.</span></p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/O4l9tmhBfqw" width="100%"></iframe><h3 class="heading" style="text-align:left;" id="workstations-without-the-desk"><span style="color:rgb(29, 31, 37);">Workstations Without the Desk</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">One of Puget&#39;s fastest-growing segments is the rack workstation — the same high-performance hardware, but mounted in a data center or server room instead of sitting on someone&#39;s desk. Teams remote into their dedicated hardware from home, a coffee shop, or another office.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">&quot;You&#39;re not performance compromised. 
You&#39;re not sharing hardware with anyone else. It&#39;s still your hardware, full performance, it&#39;s just not at your desk,&quot; Bach said. The approach also lets teams push hardware harder: rack-mounted systems can run fans at full speed in a dedicated space, which matters now that high-end GPUs draw 600 watts.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Rack workstations serve as a stepping stone between local desktop computing and full cloud virtualization. Teams get the familiarity and predictability of owning their hardware while gaining the flexibility of remote access.</span></p><h3 class="heading" style="text-align:left;" id="the-cloudto-local-economics"><span style="color:rgb(29, 31, 37);">The Cloud-to-Local Economics</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Bach&#39;s framework for deciding between cloud and local hardware is straightforward: start in the cloud, figure out what you actually need, and move to owned hardware once the math works.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">&quot;The cloud is a fantastic place to get started and it&#39;s a fantastic place to do the big stuff that you need really big hardware to do. But it also has performance costs, it has dollar costs, it has speed costs of latency,&quot; Bach said.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The tipping point is typically around a nine-month payback period. Once a team has dialed in their workflow in the cloud and can predict their compute needs, buying equivalent hardware often pays for itself in under a year. 
The challenge is knowing what to buy — which is exactly what the cloud testing phase answers.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">&quot;You really need to know what your needs are and what your workflow is before then. And the cloud is great for that. And the cloud will tell you when it&#39;s time to move because you&#39;ll find your configuration, you&#39;ll find what you want to do and you&#39;ll scale it up and then you&#39;ll get the bill,&quot; Bach said.</span></p><h3 class="heading" style="text-align:left;" id="ai-demand-and-memory-shortages"><span style="color:rgb(29, 31, 37);">AI Demand and Memory Shortages</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">AI workloads are creating supply chain pressure that affects all hardware buyers, not just AI companies. Memory shortages are the most visible symptom, driven by the massive RAM requirements of large language models and AI training pipelines.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Bach noted that shortage situations trigger unpredictable buyer behavior — hoarding inventory, timing purchases differently, and accelerating orders out of fear. 
Those behavioral effects compound the actual supply constraint, making pricing and availability more volatile than the underlying supply numbers would suggest.</span></p><h3 class="heading" style="text-align:left;" id="small-teams-big-results"><span style="color:rgb(29, 31, 37);">Small Teams, Big Results</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Bach highlighted the collaboration with </span><a class="link" href="https://www.vp-land.com/p/we-asked-top-vfx-artists-about-ai-at-the-ves-awards-here-s-what-they-said?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=puget-systems-on-why-workstations-are-moving-to-the-server-room" target="_blank" rel="noopener noreferrer nofollow">Corridor Digital</a><span style="color:rgb(29, 31, 37);"> on Corridor Key, their AI-powered green screen tool, as an example of where creative computing is heading. Corridor built a production-grade tool without being a software company, something Bach attributes to AI lowering the barrier between having an idea and shipping a product.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">&quot;We&#39;ve talked with some Hollywood studios where in order to make a feature film, you have to become this business and you have to have thousands of people on payroll and you have to have an HR department and finance and all of this stuff. And AI helps make everyone more effective so that you can have a small passionate team and still have an end result that&#39;s similar to those larger teams,&quot; Bach said.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The pattern extends beyond VFX. Bach&#39;s view is that the compute continuum runs from phones to local workstations to cloud, and the right mix depends on the specific workflow, budget, and team size.</span></p></div></div>
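Bach's payback framework boils down to simple arithmetic. As a rough sketch (the function and dollar figures below are hypothetical, not Puget's numbers), comparing a one-time hardware cost to the monthly cloud bill it would replace gives the tipping point:

```python
# Hypothetical illustration of the cloud-to-local payback math described
# above; the nine-month threshold is Bach's stated rule of thumb.
def payback_months(hardware_cost: float, monthly_cloud_bill: float) -> float:
    """Months until owned hardware costs less than continued cloud rental."""
    return hardware_cost / monthly_cloud_bill

# Invented example: an $18,000 rack workstation replacing $2,000/month
# of cloud compute reaches payback in nine months.
months = payback_months(18_000, 2_000)
buy_hardware = months <= 9  # at or under ~nine months, buying tends to win
```

The calculation only works once the cloud phase has revealed a stable configuration and a predictable monthly bill, which is the point Bach makes about the cloud telling you when it's time to move.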
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

      <item>
  <title>Canon's CINE-SERVO 40-1200mm Goes Wider, Longer, and Adds Autofocus</title>
  <description></description>
      <enclosure length="1268699" type="image/png" url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/9efefe4e-5d0a-4030-a161-0bdd230349ba/VPL_NAB_2026_-_Canon_-_Edited.png"/>
  <link>https://www.vp-land.com/p/canon-s-cine-servo-40-1200mm-goes-wider-longer-and-adds-autofocus</link>
  <guid isPermaLink="true">https://www.vp-land.com/p/canon-s-cine-servo-40-1200mm-goes-wider-longer-and-adds-autofocus</guid>
  <pubDate>Wed, 29 Apr 2026 07:00:00 +0000</pubDate>
  <atom:published>2026-04-29T07:00:00Z</atom:published>
    <category><![CDATA[Gear]]></category>
    <category><![CDATA[Article]]></category>
    <category><![CDATA[Nab 2026]]></category>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #C0C0C0; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#F1F1F1; }
  .bh__table_header p { color: #2A2A2A; font-family:'Trebuchet MS','Lucida Grande',Tahoma,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Canon&#39;s</span><a class="link" href="https://www.usa.canon.com/shop/p/cine-servo-40-1200mm-t5-0-10-8-pl?utm_source=www.vp-land.com&utm_medium=newsletter&utm_campaign=canon-s-cine-servo-40-1200mm-goes-wider-longer-and-adds-autofocus" target="_blank" rel="noopener noreferrer nofollow"> CINE-SERVO 40-1200mm T5.0-10.8</a><span style="color:rgb(29, 31, 37);"> replaces the popular 50-1000mm with a wider and longer focal range, RF mount autofocus support, and an updated SERVO drive with focus breathing compensation. Ryan Snyder, Senior Product Specialist at Canon USA, walked through the updates at NAB 2026.</span></p><iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="true" class="youtube_embed" frameborder="0" height="100%" src="https://youtube.com/embed/4oDW-h7vsoM" width="100%"></iframe><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);"><b>Key takeaways:</b></span></p><ul><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">40-1200mm range covers wider and longer than the predecessor 50-1000mm</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">RF mount enables full autofocus with eye, face, head, and body detection on Canon Cinema EOS cameras</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Focus breathing compensation built into the updated SERVO drive</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">USB-C power input speeds up servo zoom to a 1-second snap from wide to tele</span></p></li><li><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Available in September in both RF and PL mount</span></p></li></ul><h3 class="heading" 
style="text-align:left;" id="from-wildlife-lens-to-live-event-wo"><span style="color:rgb(29, 31, 37);">From Wildlife Lens to Live Event Workhorse</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The original 50-1000mm launched in 2014, designed for wildlife and documentary filmmakers who needed a portable telephoto that covered Super 35 sensors. But the lens found an unexpected audience in live event multicam production for concerts, events, and houses of worship that wanted a more cinematic look. Snyder said the new 40-1200mm was built to serve that growing demand while staying true to its wildlife and documentary roots.</span></p><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The wider 40mm starting point gives operators more flexibility at the short end, while 1200mm pushes further into telephoto territory than the original&#39;s 1000mm ceiling.</span></p><h3 class="heading" style="text-align:left;" id="autofocus-joins-the-party"><span style="color:rgb(29, 31, 37);">Autofocus Joins the Party</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The RF mount version brings full Canon autofocus to a CINE-SERVO lens for the first time. Paired with Canon Cinema EOS cameras, operators get eye, face, head, and body detection. For live event shooters who previously had to manually track subjects at extreme focal lengths, this removes a significant operational burden. The PL mount version supports all camera systems but drops autofocus capability.</span></p><h3 class="heading" style="text-align:left;" id="servo-drive-gets-faster-and-smarter"><span style="color:rgb(29, 31, 37);">SERVO Drive Gets Faster and Smarter</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">Canon updated the SERVO drive unit with two notable improvements. 
Focus breathing compensation actively adjusts the zoom to counteract focus shift, reducing the already minimal breathing the lens exhibits. Broadcast sports shooters gave Canon feedback that the servo zoom speed on the 50-1000mm wasn&#39;t fast enough, so Canon added a USB-C port that accepts PD power to accelerate the drive. With external power, the lens snaps from wide to tele in about 1 second, behaving more like a box lens.</span></p><h3 class="heading" style="text-align:left;" id="firmware-handles-the-light-loss"><span style="color:rgb(29, 31, 37);">Firmware Handles the Light Loss</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The lens ramps from T5.0 to T10.8 starting around 560mm. A new firmware feature for the Canon</span> C400<span style="color:rgb(29, 31, 37);"> automatically compensates for the light loss at the ramp point by adjusting ISO or gain, so operators can use the full focal length without a visible dip in brightness. The C80 also receives a firmware update adding USB-C camera control for gimbal integration, allowing manual focus, start/stop, and other controls through a USB-C connection to systems like DJI gimbals.</span></p><h3 class="heading" style="text-align:left;" id="pricing-and-availability"><span style="color:rgb(29, 31, 37);">Pricing and Availability</span></h3><p class="paragraph" style="text-align:left;"><span style="color:rgb(29, 31, 37);">The CINE-SERVO 40-1200mm T5.0-10.8 will be available in September in both RF and PL mount. Pricing was not announced at the show.</span></p></div></div>
  ]]></content:encoded>
<author>podcast@newterritory.media (Coffee and Celluloid)</author><itunes:explicit>no</itunes:explicit><itunes:author>Coffee and Celluloid</itunes:author><itunes:keywords>film,photography,art,design,interview,filmmaking,independent</itunes:keywords></item>

  </channel>
</rss>