<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>TurboSquid Blog</title>
	<atom:link href="https://blog.turbosquid.com/feed/" rel="self" type="application/rss+xml" />
	<link>https://blog.turbosquid.com/</link>
	<description>The world’s leading source for quality 3D models.</description>
	<lastBuildDate>Fri, 06 Dec 2024 16:33:11 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
<site xmlns="com-wordpress:feed-additions:1">227686131</site>	<item>
		<title>5 Immersive &#038; 3D Design Trends for 2025</title>
		<link>https://blog.turbosquid.com/2024/12/06/5-immersive-3d-design-trends-for-2025/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=5-immersive-3d-design-trends-for-2025</link>
		
		<dc:creator><![CDATA[Jesse Radonski]]></dc:creator>
		<pubDate>Fri, 06 Dec 2024 16:33:10 +0000</pubDate>
				<category><![CDATA[3D Modeling]]></category>
		<guid isPermaLink="false">https://blog.turbosquid.com/?p=14084</guid>

					<description><![CDATA[<p>One of the incredible things about working at Shutterstock is witnessing trends unfold in real time as contributors continuously enrich the marketplace with fresh creative works. This dynamic flow of information is part of what inspires our team at Shutterstock Studios, where staying ahead of emerging technologies, techniques, and styles isn’t just valuable — it’s essential for delivering award-winning campaigns ... </p>
<div><a href="https://blog.turbosquid.com/2024/12/06/5-immersive-3d-design-trends-for-2025/" class="more-link">Read More</a></div>
<p>The post <a href="https://blog.turbosquid.com/2024/12/06/5-immersive-3d-design-trends-for-2025/">5 Immersive &amp; 3D Design Trends for 2025</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>One of the incredible things about working at Shutterstock is witnessing trends unfold in real time as contributors continuously enrich the marketplace with fresh creative works. This dynamic flow of information is part of what inspires our team at <a href="https://studios.shutterstock.com/portfolio?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=immersive-trends-2025&amp;utm_term=mktg_content&amp;utm_content=studios-portfolio" target="_blank" rel="noreferrer noopener">Shutterstock Studios</a>, where staying ahead of emerging technologies, techniques, and styles isn’t just valuable — it’s essential for delivering award-winning campaigns for brands like Lenovo and Carhartt.</p>



<p>So, with 2025 just around the corner, we thought we&#8217;d share five immersive and 3D design trends we’ve seen that may transform your creative approach in the year ahead. These trends reflect the rapidly evolving landscape of technology and creativity, so whether you’re a seasoned professional or just exploring 3D design, these insights could shape how you work and collaborate.</p>



<h4 class="wp-block-heading"><strong>AI-powered design assistance</strong></h4>



<p>Artificial intelligence has long played the role of a creative sidekick, but looking forward, it&#8217;s clear that we&#8217;re only scratching the surface of its potential. Right now, AI is transitioning from being a helpful assistant to an integral part of the creative process because of its growing ability to complement human creativity with speed, adaptability, and a knack for bridging technical gaps. We’ve all been there — hours spent brainstorming, late nights perfecting, and pushing through to deliver your best work. Those crunches are draining.</p>



<p>Could AI have helped? In many ways, yes. Its ability to handle the time-consuming basics empowers designers to focus on innovation and refinement, elevating the quality of their work. AI also makes 3D design more accessible to those without years of technical expertise. However, this increased accessibility comes with a challenge: an over-reliance on AI-generated default 3D models could lead to watered-down designs devoid of originality. To truly harness AI&#8217;s power, creators must prioritize learning how to use it as a springboard — an enabler of unique and thoughtful design — rather than a shortcut that bypasses creativity.</p>



<p>The minute the AI starts specializing in what you do, everything opens up. For instance, filming a commercial on a virtual production set for a global fast-food brand might involve swapping decor in real time to match the company&#8217;s branding. A generalist AI model would need additional training before it could create 3D models from a company&#8217;s owned IP, but a specialized model can generate the golden arches on decorative drinkware out of the box. If you want to get flashier, it could potentially generate anthropomorphic chicken nuggets dressed as superheroes, too.</p>



<p>AI’s role in iterative design opens doors for rapid prototyping and experimentation. Designers can generate multiple concept variations in minutes, compare them in context, and refine the best version. Just imagine a product designer using AI to create several variations of a new gadget&#8217;s form factor, then visualize each one in a realistic 3D environment to assess its aesthetics, ergonomics, and functionality.</p>



<p>As technology advances, we anticipate seeing AI integrate more deeply into the 3D design pipeline. From adaptive learning algorithms that fine-tune their capabilities based on individual workflows to industry-specific APIs tailored for everything from gaming to e-commerce, the possibilities are vast. Rather than replacing human creativity, AI serves as a powerful collaborator, enhancing a designer’s ability to innovate and streamline their work.</p>


<div class="wp-block-image">
<figure class="aligncenter"><img decoding="async" src="https://lh7-rt.googleusercontent.com/docsz/AD_4nXethKlgNsuxfMK5MilaeiSJBK4dTFWh6EqquelXr-dM0pwd3VSNKR32nHpWMlEdVXKqanxJobVIGtsFYnqa_D01ZhJ8Zb2erCdNDL1SiSMCbK2WpLIpSElXgnRNjgDFktptU4FkDg?key=7Lkjx2kCxYxYak4udNsTDmgz" alt="" /></figure>
</div>


<h4 class="wp-block-heading"><strong>Real-time collaboration in 3D design</strong></h4>



<p>As 3D engines become increasingly sophisticated, the ability for teams to work on the same project in real time is becoming the norm. This innovation offers clear advantages for remote teams, but even in-office designers collaborating across multiple locations can benefit immensely.</p>



<p>Take architecture, for instance. Imagine brainstorming building designs in a shared VR space where team members can simultaneously interact with a 3D model. One person might suggest a design change while another tests the structural integrity of a concept, all within the same virtual environment. This level of immersion and instant feedback accelerates the decision-making process and encourages more dynamic, creative collaboration.</p>



<p>Platforms like <a href="https://www.nvidia.com/en-us/omniverse/" target="_blank" rel="noreferrer noopener">NVIDIA Omniverse</a> and <a href="https://www.autodesk.com/bim-360/" target="_blank" rel="noreferrer noopener">Autodesk BIM 360</a> are already making this vision a reality. Omniverse provides a collaborative ecosystem where architects, engineers, and designers can work in a unified space, like in a <a href="https://www.nvidia.com/en-us/use-cases/ai-for-virtual-factory-solutions/" target="_blank" rel="noreferrer noopener">virtual factory</a>, even using different software tools. Omniverse eliminates the silos that often slow down design workflows. Autodesk BIM 360, on the other hand, focuses on cloud-based collaboration, making it easier for global teams to access, annotate, and revise project files from anywhere.</p>



<p>Unity’s 3D collaboration tools are another standout example. In an <a href="https://youtu.be/klBSeLnT5kE" target="_blank" rel="noreferrer noopener">interview</a>, Volvo’s innovation leader for virtual experiences, Tommy Ghiurau, explains how Unity enabled its engineers and designers to work together in a shared virtual space. They could visualize the complete version of a car prototype, explore different design iterations, and provide feedback in real time. This approach dramatically sped up development cycles and enhanced communication, making it easier for Volvo’s teams to align on complex projects.</p>



<p>The benefits extend beyond product design into areas like asset creation. For example, one team member might focus on crafting detailed text prompts for AI-powered generative tools, producing a starting place for a 3D model. Meanwhile, others can simultaneously refine these assets in a 3D engine, optimizing textures or functions for their specific needs. This kind of parallel work eliminates bottlenecks and allows for a more fluid, efficient process.</p>



<p>Building upon AI-generated 3D models can be a powerful way to save time and enhance creativity, but it’s not without its challenges. Artists would need to navigate issues of accuracy, topology, and creative limitations while also adjusting their workflows. For example, an AI tool might produce a leather jacket but miss the signature stitching used by a specific brand, requiring the 3D artist to update that part of the model manually.</p>



<p>Overall, collaborative workflows could be a win-win for team leads and 3D users. They could more wisely reallocate time while allowing production staff to focus on creativity and solving problems. We all have skills that make us unique members of a team. These workflows do more than streamline your tasks; they allow you to devote more time to trying out new ideas, and that&#8217;s where you’ll find those hidden gems.</p>


<div class="wp-block-image">
<figure class="aligncenter"><img decoding="async" src="https://lh7-rt.googleusercontent.com/docsz/AD_4nXd2kaGVDEL_crCFxOyRVPRxw8YEsSZPUAhnrmg3GCnhx3dWSk6-YvMYIAqbbyh4W08qwEur2-LXMGP8THGMGwUvGSwoVGBqqMLJCwOG7Io5WcoJ7fo9G6Cjrql4risR9U-xoArcdg?key=7Lkjx2kCxYxYak4udNsTDmgz" alt="" /></figure>
</div>


<h4 class="wp-block-heading"><strong>People-first AR &amp; VR design</strong></h4>



<p>We’ve already seen companies like Apple and Meta set the stage for the next era of immersive technology. The Apple Vision Pro emphasizes usability, delivering a seamless and intuitive experience, while the Ray-Ban Meta Wayfarer glasses focus on comfort and everyday wearability. These devices highlight how companies are adapting to meet user needs in both function and form. Looking ahead, the potential to expand these offerings into more immersive and diverse applications is enormous.</p>



<p>Consider the possibility of an app like <a href="https://www.calm.com/" target="_blank" rel="noreferrer noopener">Calm</a> elevating its audio-based relaxation stories by integrating them with VR. Picture winding down with a guided meditation session surrounded by a serene 3D environment such as an ocean shoreline, a quiet forest, or even a minimalist space designed to reduce sensory overload. This type of immersive experience could not only relieve stress but also revolutionize how we approach mental health and wellness technology.</p>



<p>As smart-glasses technology advances, features like real-time obstacle detection could be transformative, especially for individuals with impaired eyesight or mobility challenges. These enhancements, powered by AI trained on 3D objects, could make everyday navigation safer and more inclusive. For example, glasses could provide gentle haptic feedback or audio cues to help users avoid obstacles or navigate complex environments.</p>



<p>The retail space is also embracing the power of immersive 3D and AR tools. Take <a href="https://www.ikea.com/us/en/home-design/" target="_blank" rel="noreferrer noopener">IKEA Kreativ</a>, a web-based app that allows users to visualize furniture in 3D spaces before visiting the showroom. As AR capabilities and spatial computing progress, these tools will become more powerful and interactive. Soon, shoppers may be able to use their smart glasses or AR-enabled devices to visualize how an entire room layout will look in their home. Similarly, virtual fitting rooms could allow users to try on clothing and accessories, making online shopping more accurate and engaging.</p>



<p>By addressing key pain points in areas like accessibility and wellness, what&#8217;s next in immersive and 3D design can make life easier, more inclusive, and exciting. Nobody enjoys feeling overwhelmed with technology — just think of the printer scene from <em>Office Space</em>. Hopefully, as these tools become more integrated into our everyday lives, they&#8217;ll be designed to simplify the experiences that matter most to us.</p>


<div class="wp-block-image">
<figure class="aligncenter"><img decoding="async" src="https://lh7-rt.googleusercontent.com/docsz/AD_4nXeVb8PkukBW4U8QMnZFg9Mk5lz6-jS1YyBxp518d95xrXHstBzTvSRGMIYAVR0KW-FQ_YeqHu3qa5L98btASvKXL1tkjPlmg2a4fI-qNfCCFLHClqee51YfIkHAp0W0tPA0MfSmgQ?key=7Lkjx2kCxYxYak4udNsTDmgz" alt="" /><figcaption class="wp-element-caption"><em>Game-ready forest via <a href="https://www.turbosquid.com/3d-models/redwood-forest-for-game-ready-1428861?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=immersive-trends-2025&amp;utm_term=mktg_content&amp;utm_content=model" target="_blank" rel="noreferrer noopener">TurboSquid</a></em></figcaption></figure>
</div>


<h4 class="wp-block-heading"><strong>3D will be more convenient</strong></h4>



<p>TurboSquid has long been a leader in making 3D assets convenient and accessible. Now, as 3D applications expand across industries, the focus is shifting toward simplifying how these technologies are used within game engines and other types of software.&nbsp;</p>



<p>At Shutterstock, we’re pushing this vision further by enabling creators to quickly connect our generative 3D API to the engine of their choice. Generating and customizing 3D models directly in the environment you’re already familiar with eliminates the need to learn entirely new interfaces and workflows.</p>



<p>Ease of use is particularly important for the next generation of creators. In the past, beginner game designers might have dipped their toes into development using tools like Flash or <a href="https://www.rpgmakerweb.com/" target="_blank" rel="noreferrer noopener">RPG Maker</a> or even gained <a href="https://education.minecraft.net/" target="_blank" rel="noreferrer noopener">early coding experience</a> through Minecraft. But today, as the gaming industry matures, newer creators are diving straight into building 3D worlds within platforms like Roblox and Fortnite. Games such as these are no longer just for playing — they’re sandboxes for creation, fueled by extensive asset libraries and user-friendly tools.</p>



<p>The <a href="https://create.roblox.com/" target="_blank" rel="noreferrer noopener">Roblox Creator Hub</a> and Unreal Editor for Fortnite (<a href="https://dev.epicgames.com/community/fortnite/getting-started/uefn" target="_blank" rel="noreferrer noopener">UEFN</a>) are prime examples. These scaled-down versions of advanced game engines provide aspiring developers with simplified interfaces and a curated selection of pre-made assets, lowering the barrier to entry for creating engaging 3D experiences. Unlike their full-fledged counterparts, such as Unreal Engine 5, these tools offer streamlined workflows designed for specific use cases, enabling faster learning and greater accessibility.</p>



<p>Whether you’re a beginner exploring your first game concept or a professional streamlining your workflow, the convenience of asset libraries, generative tools, and user-friendly engines is changing what’s possible in 3D creation.</p>



<h4 class="wp-block-heading"><strong>Less making, more remixing</strong></h4>



<p>Related to convenience, modern workflows are transforming how 3D designers approach their projects. Instead of spending countless hours creating 3D objects from scratch, designers increasingly turn to existing assets, prioritizing customization over creation. This shift allows them to focus on refining and tailoring models to specific needs rather than reinventing the wheel for every project.</p>



<p>With a plethora of <a href="https://www.turbosquid.com/Search/3D-Models/free?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=immersive-trends-2025&amp;utm_term=mktg_content&amp;utm_content=free-srp" target="_blank" rel="noreferrer noopener">free 3D models</a> available online, designers have plenty of options for where to begin. And while free models sometimes have limitations in quality or modification permissions, they can be an excellent starting place for functional and creative work, especially in prototyping or concept development.</p>



<p>As AI-generated 3D model technology advances, workflows can begin with a text-to-3D description, yielding a basic model in seconds, such as “a modern desk lamp with a brushed metal finish.” This workflow can potentially save hours, if not days, on the initial creation process, allowing designers to allocate more time to the aspects of their work that require a human touch, like perfecting aesthetics, enhancing functionality, or aligning the model with brand guidelines.&nbsp;</p>
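<p>As a purely illustrative sketch of what such a workflow could look like: the endpoint URL, field names, and helper function below are entirely hypothetical, not any real text-to-3D API.</p>

```python
# Hypothetical sketch of a text-to-3D request. The endpoint URL, field
# names, and "glb" format value are illustrative assumptions, not a real API.
import json
import urllib.request


def build_request(prompt: str, fmt: str = "glb") -> urllib.request.Request:
    """Package a text prompt as a POST request to a placeholder endpoint."""
    body = json.dumps({"prompt": prompt, "format": fmt}).encode("utf-8")
    return urllib.request.Request(
        "https://api.example.com/v1/text-to-3d",  # placeholder URL
        data=body,
        headers={"Content-Type": "application/json"},
    )


# Sending this with urllib.request.urlopen(req) would, in this sketch,
# return raw model bytes ready to import into your engine of choice.
req = build_request("a modern desk lamp with a brushed metal finish")
```

<p>The point is less the plumbing than the starting position it buys you: a rough model in seconds, leaving the designer’s time for the refinements that matter.</p>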



<p>Once tools are more sophisticated and accessible, combining pre-existing assets, AI assistance, and skilled human craftsmanship will redefine how 3D content is produced and elevate what designers can achieve within tighter timelines.</p>



<h4 class="wp-block-heading"><strong>Looking to the future</strong></h4>



<p>What’s exciting is how these advancements work together. AI isn’t here to replace human creativity. It’s amplifying it, enabling designers to focus on what they do best. Collaboration tools are breaking down geographic and technical barriers, making it easier than ever for teams to bring their shared visions to life.</p>



<p>As we look to 2025 and beyond, one thing is clear: 3D design is no longer confined to specialists or standalone workflows. It’s a field becoming more accessible, flexible, and essential across industries, from gaming to retail to wellness. For creators, the opportunities are limitless.</p>



<p>Whether you’re a seasoned designer, a team lead looking to streamline processes, or a newcomer exploring 3D for the first time, this is your moment to embrace the future. By leveraging the latest tools and trends, you can work smarter, collaborate better, and create more freely than ever before. The possibilities are waiting — what will you build next?</p>



<hr class="wp-block-separator has-alpha-channel-opacity" />



<p class="has-text-align-center"><strong>Ready to get started?</strong></p>



<p class="has-text-align-center">TurboSquid can help you keep up with the latest trends.</p>



<p class="has-text-align-center"><a href="https://auth.turbosquid.com/users/sign_up?client_id=2c781a9f16cbd4fded77cf7f47db1927b85a5463185769bcb970cfdfe7463a0c&amp;return_to=https%3A%2F%2Fwww%2Eturbosquid%2Ecom%2FLogin%2FKeymaster%2Ecfm%3Fendpoint%3Dauthorize" target="_blank" rel="noreferrer noopener"><strong>SIGN UP TODAY</strong></a></p>
<p>The post <a href="https://blog.turbosquid.com/2024/12/06/5-immersive-3d-design-trends-for-2025/">5 Immersive &amp; 3D Design Trends for 2025</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">14084</post-id>	</item>
		<item>
		<title>Read these TurboSquid tips before buying your next 3D game assets</title>
		<link>https://blog.turbosquid.com/2024/09/27/turbosquid-tips-buying-3d-game-assets/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=turbosquid-tips-buying-3d-game-assets</link>
					<comments>https://blog.turbosquid.com/2024/09/27/turbosquid-tips-buying-3d-game-assets/#comments</comments>
		
		<dc:creator><![CDATA[Jesse Radonski]]></dc:creator>
		<pubDate>Fri, 27 Sep 2024 17:49:38 +0000</pubDate>
				<category><![CDATA[Game-ready models]]></category>
		<category><![CDATA[3D game assets]]></category>
		<category><![CDATA[game dev assets]]></category>
		<category><![CDATA[game-ready]]></category>
		<guid isPermaLink="false">https://blog.turbosquid.com/?p=14064</guid>

					<description><![CDATA[<p>Acquiring the perfect 3D game assets is both an art and a science. Online marketplaces like TurboSquid offer a wealth of possibilities for game developers, but there are also some key things to consider if you want to avoid solvable issues down the line. The good news is finding great assets is easier than you think. Just use this guide, ... </p>
<div><a href="https://blog.turbosquid.com/2024/09/27/turbosquid-tips-buying-3d-game-assets/" class="more-link">Read More</a></div>
<p>The post <a href="https://blog.turbosquid.com/2024/09/27/turbosquid-tips-buying-3d-game-assets/">Read these TurboSquid tips before buying your next 3D game assets</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Acquiring the perfect 3D game assets is both an art and a science. Online marketplaces like TurboSquid offer a wealth of possibilities for game developers, but there are also some key things to consider if you want to avoid solvable issues down the line. The good news is finding great assets is easier than you think. Just use this guide, and you’ll be a seasoned hunter in no time.</p>



<h4 class="wp-block-heading">Before we begin</h4>



<p>First, we recommend checking for visual consistency across the assets. If you’re making a cartoon-inspired 3D action platformer, you’ll want stylized assets rather than photorealistic ones. Ensuring the aesthetic of your environment, characters, and items matches will save you time and spare you from fixing inconsistencies later. Tweaks are sometimes necessary for pre-made game assets; however, a little forward thinking in what you choose now can go a long way later in your dev cycle.</p>



<p>If you plan on selling your game, check each asset’s license to see if it allows for commercial use. TurboSquid has a helpful <a href="https://blog.turbosquid.com/turbosquid-3d-model-license/?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=license" target="_blank" rel="noreferrer noopener">3D model license</a> page that covers multiple uses, including games, for reference. Generally, though, TurboSquid assets come with thorough documentation — found in the product description section — that outlines their intended use and features. If you think you may have trouble implementing the assets, see if the creator provides support or regular updates to ensure the asset remains functional with future engine updates. TurboSquid also has 24/7 live chat support if you have any questions.</p>


<div class="wp-block-image">
<figure data-wp-context="{&quot;imageId&quot;:&quot;69da3936ed018&quot;}" data-wp-interactive="core/image" data-wp-key="69da3936ed018" class="aligncenter wp-lightbox-container"><img fetchpriority="high" decoding="async" width="1600" height="888" data-wp-class--hide="state.isContentHidden" data-wp-class--show="state.isContentVisible" data-wp-init="callbacks.setButtonStyles" data-wp-on--click="actions.showLightbox" data-wp-on--load="callbacks.setButtonStyles" data-wp-on-window--resize="callbacks.setButtonStyles" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.png" alt="TurboSquid product page spotlighting the license area of the page." class="wp-image-14075" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.png 1600w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.png?resize=300,167 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.png?resize=768,426 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.png?resize=1024,568 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.png?resize=1536,852 1536w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.png?resize=100,56 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.png?resize=862,478 862w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.png?resize=1200,666 1200w" sizes="(max-width: 1600px) 100vw, 1600px" /><button
			class="lightbox-trigger"
			type="button"
			aria-haspopup="dialog"
			aria-label="Enlarge"
			data-wp-init="callbacks.initTriggerButton"
			data-wp-on--click="actions.showLightbox"
			data-wp-style--right="state.imageButtonRight"
			data-wp-style--top="state.imageButtonTop"
		>
			<svg xmlns="http://www.w3.org/2000/svg" width="12" height="12" fill="none" viewBox="0 0 12 12">
				<path fill="#fff" d="M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z" />
			</svg>
		</button><figcaption class="wp-element-caption"><a href="https://www.turbosquid.com/3d-models/cartoon-man-rigged-1350163?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=model" target="_blank" rel="noreferrer noopener">Cartoon Man Rigged</a> via <a href="https://www.turbosquid.com/Search/Artists/CartoonFactory?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=artist" target="_blank" rel="noreferrer noopener">CartoonFactory</a></figcaption></figure>
</div>


<h4 class="wp-block-heading">Platform and engine compatibility</h4>



<p>Next, you’ll want to ensure the 3D asset format is compatible with the game engine you’re using. Generally, TurboSquid contributors will call out compatibility, but knowing that in advance will help avoid conversion issues. You can also filter models on TurboSquid by <a href="https://www.turbosquid.com/Search/3D-Models/unitypackage?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=format_search" target="_blank" rel="noreferrer noopener">Unity</a> or <a href="https://www.turbosquid.com/Search/3D-Models/upk?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=format_search" target="_blank" rel="noreferrer noopener">Unreal</a>.</p>



<p>After you’ve assessed engine compatibility, now’s the time to dig into optimization considerations. Are you developing a game for PC, console, mobile, or a combination of the three? You’ll want to ensure your asset is optimized for your target platform, as high-poly models and large textures may degrade performance, especially on lower-end devices. We all know dropped frames are a no-no for gamers. For real-time rendering, look for models that keep polygon counts low without sacrificing too much visual fidelity. Try filtering models on TurboSquid by selecting <strong>Real-Time</strong> and <strong>Low Poly</strong>, or <a href="https://www.turbosquid.com/Search/3D-Models/real-time/low-poly?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=search" target="_blank" rel="noreferrer noopener">start here</a> at your convenience.</p>



<p>Game developers should ensure the model’s textures are properly compressed for the target platforms, since high-resolution textures can inflate the game’s memory usage and hurt performance. Level-of-detail (LOD) models provide different versions of the same model that decrease in detail based on the camera’s distance, which helps maintain performance without spending resources on detail you can’t see from afar. Just because you can walk to a mountain peak in an open-world RPG doesn’t mean you need all its fine details from 20 miles away.</p>
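<p>As a rough, engine-agnostic sketch (the distance thresholds and mesh names below are made up), LOD selection boils down to picking the coarsest mesh whose distance threshold the camera has passed:</p>

```python
# Illustrative LOD selection: choose a mesh based on camera distance.
# Thresholds and mesh names are hypothetical, not tied to any engine.
LOD_LEVELS = [
    (0.0,   "mountain_LOD0"),   # full detail, used up close
    (50.0,  "mountain_LOD1"),   # reduced polygon count
    (200.0, "mountain_LOD2"),   # coarse silhouette for distant views
]


def select_lod(camera_distance: float) -> str:
    """Return the mesh for the furthest threshold the camera has passed."""
    chosen = LOD_LEVELS[0][1]
    for threshold, mesh in LOD_LEVELS:
        if camera_distance >= threshold:
            chosen = mesh
    return chosen
```

<p>Engines handle this switching automatically once LOD meshes are assigned; the payoff is that distant objects render with a fraction of the polygons.</p>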


<div class="wp-block-image">
<figure data-wp-context="{&quot;imageId&quot;:&quot;69da3936ed602&quot;}" data-wp-interactive="core/image" data-wp-key="69da3936ed602" class="aligncenter wp-lightbox-container"><img decoding="async" data-wp-class--hide="state.isContentHidden" data-wp-class--show="state.isContentVisible" data-wp-init="callbacks.setButtonStyles" data-wp-on--click="actions.showLightbox" data-wp-on--load="callbacks.setButtonStyles" data-wp-on-window--resize="callbacks.setButtonStyles" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.jpeg" alt="TurboSquid product page of an Iceland mountain 3D model." class="wp-image-14073" /><button
			class="lightbox-trigger"
			type="button"
			aria-haspopup="dialog"
			aria-label="Enlarge"
			data-wp-init="callbacks.initTriggerButton"
			data-wp-on--click="actions.showLightbox"
			data-wp-style--right="state.imageButtonRight"
			data-wp-style--top="state.imageButtonTop"
		>
			<svg xmlns="http://www.w3.org/2000/svg" width="12" height="12" fill="none" viewBox="0 0 12 12">
				<path fill="#fff" d="M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z" />
			</svg>
		</button><figcaption class="wp-element-caption"><a href="https://www.turbosquid.com/3d-models/mountain-iceland-1646851?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=model" target="_blank" rel="noreferrer noopener">Mountain Iceland</a> via <a href="https://www.turbosquid.com/Search/Artists/Asset-Scan-3d?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=artist" target="_blank" rel="noreferrer noopener">Asset Scan 3d</a></figcaption></figure>
</div>


<h4 class="wp-block-heading">Animating your characters</h4>



<p>You’ll likely require <a href="https://www.turbosquid.com/Search/3D-Models/animated?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=search" target="_blank" rel="noreferrer noopener">animated models</a>, so we recommend checking two things during the buying process: how they&#8217;re rigged and whether that rig is compatible with your game engine. If available, review the contributor’s video of their animation to see whether its movement matches your game’s style. For instance, characters in a colorful cartoon world will generally have exaggerated movements compared to those in a hyper-realistic game.</p>



<p>Of course, not every non-playable character needs to be fully rigged. If a character is fairly stationary, it may not make sense to overcomplicate the rig with things like finger movements and full-body actions, which can cause performance issues.</p>



<p>You can also reduce an asset’s file size and improve its playback performance by checking whether unnecessary keyframes have been removed from the animations. If you plan to apply different animations to your game asset, also make sure the rig is compatible with your engine’s animation retargeting system. Watch the video below if retargeting is new to you.</p>
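<p>As a simplified illustration of keyframe reduction (the function name and tolerance are our own, and real animation tools work per curve and per channel), the idea is to drop any key whose value linear interpolation of its kept neighbors can reproduce:</p>

```python
# Greedy keyframe reduction sketch: keep a key only if removing it would
# change the curve by more than a tolerance. Keyframes are (time, value)
# pairs; names and the default tolerance are illustrative.
def prune_keyframes(keys, tolerance=1e-3):
    if len(keys) <= 2:
        return list(keys)
    kept = [keys[0]]
    for i in range(1, len(keys) - 1):
        (t0, v0), (t1, v1), (t2, v2) = kept[-1], keys[i], keys[i + 1]
        # Value the curve would have at t1 if this key were removed
        predicted = v0 + (v2 - v0) * (t1 - t0) / (t2 - t0)
        if abs(predicted - v1) > tolerance:
            kept.append(keys[i])
    kept.append(keys[-1])
    return kept
```

<p>Keys that sit on a straight line between their neighbors contribute nothing and get dropped, shrinking the file without visibly changing the motion.</p>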



<figure class="wp-block-embed aligncenter is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="x-resp-embed x-is-video x-is-youtube"><iframe title="Retargeting Animations in 5.4 is Finally Easy!!! #unrealengine5" width="742" height="417" src="https://www.youtube.com/embed/iE474cUpR-o?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>
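<p>To make the keyframe clean-up idea concrete, here’s a minimal sketch in generic Python — our own illustration, not tied to any particular engine or DCC tool: a keyframe can be dropped whenever linear interpolation between its surviving neighbors reproduces its value within a small tolerance.</p>

```python
def prune_keyframes(keys, tolerance=1e-4):
    """Drop keyframes that linear interpolation between the surviving
    neighbours can reproduce within `tolerance`.

    `keys` is a list of (time, value) pairs, sorted by time.
    """
    if len(keys) <= 2:
        return list(keys)
    pruned = [keys[0]]
    for i in range(1, len(keys) - 1):
        t0, v0 = pruned[-1]          # last key we decided to keep
        t1, v1 = keys[i]             # candidate for removal
        t2, v2 = keys[i + 1]         # next key
        # Value the curve would take at t1 if this key were removed
        interpolated = v0 + (v2 - v0) * (t1 - t0) / (t2 - t0)
        if abs(interpolated - v1) > tolerance:
            pruned.append(keys[i])   # key carries real information; keep it
    pruned.append(keys[-1])
    return pruned

# A linear ramp followed by a sharp drop: the intermediate ramp keys are redundant.
curve = [(0, 0.0), (1, 1.0), (2, 2.0), (3, 3.0), (4, 0.0)]
print(prune_keyframes(curve))  # → [(0, 0.0), (3, 3.0), (4, 0.0)]
```

<p>Real animation exporters use more sophisticated curve-fitting than this, but the principle is the same: keys that add no visible motion only add file size.</p>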



<h4 class="wp-block-heading">Real-time lighting and shading</h4>



<p>Many real-time game engines use physically based rendering (PBR) workflows to mimic how light interacts with real-world objects, so ensuring your assets are built with the correct materials and textures will help them integrate smoothly into your game scenes. If you’re unfamiliar with PBR workflows, we have <a href="https://blog.turbosquid.com/2023/07/27/an-intro-to-physically-based-rendering-material-workflows-and-metallic-roughness/?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=intro-to-pbr-metallic-roughness" target="_blank" rel="noreferrer noopener">a whole series</a> of blog posts where you can learn more about how this works.</p>
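<p>As a quick illustration, a PBR metallic-roughness material boils down to a few factors and texture maps. The sketch below is a glTF-2.0-style material description written as a Python dict; the material name and texture indices are placeholders of our own, not from any specific asset.</p>

```python
# A glTF-2.0-style metallic-roughness material, written as a Python dict.
# The name and texture indices are illustrative placeholders.
material = {
    "name": "PaintedMetal",
    "pbrMetallicRoughness": {
        "baseColorTexture": {"index": 0},         # albedo / base color map
        "metallicFactor": 1.0,                    # 0.0 = dielectric, 1.0 = metal
        "roughnessFactor": 0.4,                   # 0.0 = mirror-like, 1.0 = fully diffuse
        "metallicRoughnessTexture": {"index": 1}, # per-pixel metallic and roughness values
    },
    "normalTexture": {"index": 2},                # surface detail without extra geometry
}
```

<p>When evaluating an asset, confirming that these maps are present and match the workflow your engine expects (metallic-roughness versus specular-glossiness) saves conversion work later.</p>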



<p>Verify that the model’s shaders are optimized for real-time rendering, and try to avoid overly complex shaders, as they can cause performance bottlenecks, especially on lower-end devices. It’s common for game development to feel like a tug-of-war between features and performance.</p>



<p>Consider using assets that respond to real-time lighting conditions, such as reflections and shadows. Dynamic shadows that change as lights and objects move through the game environment add a nice, immersive touch. If possible, see if the 3D asset has been tested under various lighting scenarios, such as indoors, outdoors, and day/night cycles.</p>



<p><strong>Final tip: </strong>Don’t forget about the platform you’re developing the game for. Real-time lighting and shading performance can vary between platforms, so use assets optimized to suit each platform’s capabilities.</p>


<div class="wp-block-image">
<figure data-wp-context="{&quot;imageId&quot;:&quot;69da3936edd8f&quot;}" data-wp-interactive="core/image" data-wp-key="69da3936edd8f" class="aligncenter wp-lightbox-container"><img decoding="async" width="1600" height="900" data-wp-class--hide="state.isContentHidden" data-wp-class--show="state.isContentVisible" data-wp-init="callbacks.setButtonStyles" data-wp-on--click="actions.showLightbox" data-wp-on--load="callbacks.setButtonStyles" data-wp-on-window--resize="callbacks.setButtonStyles" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.jpeg" alt="TurboSquid product page of a robotic T-Rex." class="wp-image-14074" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.jpeg 1600w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.jpeg?resize=300,169 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.jpeg?resize=768,432 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.jpeg?resize=1024,576 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.jpeg?resize=1536,864 1536w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.jpeg?resize=100,56 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.jpeg?resize=862,485 862w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/09/image.jpeg?resize=1200,675 1200w" sizes="(max-width: 1600px) 100vw, 1600px" /><button
			class="lightbox-trigger"
			type="button"
			aria-haspopup="dialog"
			aria-label="Enlarge"
			data-wp-init="callbacks.initTriggerButton"
			data-wp-on--click="actions.showLightbox"
			data-wp-style--right="state.imageButtonRight"
			data-wp-style--top="state.imageButtonTop"
		>
			<svg xmlns="http://www.w3.org/2000/svg" width="12" height="12" fill="none" viewBox="0 0 12 12">
				<path fill="#fff" d="M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z" />
			</svg>
		</button><figcaption class="wp-element-caption"><a href="https://www.turbosquid.com/3d-models/robotic-t-rex-walking-pose-2235933?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=model" target="_blank" rel="noreferrer noopener">Robotic T-Rex Walking Pose</a> via <a href="https://www.turbosquid.com/Search/Artists/3d_molier-studio?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=3d-game-assets&amp;utm_term=mktg_content&amp;utm_content=artist" target="_blank" rel="noreferrer noopener">3d_molier studio</a></figcaption></figure>
</div>


<h4 class="wp-block-heading">One-up your dev game</h4>



<p>Whether you’re new to game development or just need a fresh reminder of what to look for when shopping for 3D game assets, we hope these tips help. Video games are challenging to create, but knowing where to take shortcuts and where to spend some quality time can ensure your game comes to fruition sooner rather than later.</p>



<hr class="wp-block-separator has-alpha-channel-opacity" />



<p class="has-text-align-center"><strong>Need 3D models for your project?</strong></p>



<p class="has-text-align-center">TurboSquid has 3D game assets for countless real-time uses.</p>



<p class="has-text-align-center"><a href="https://auth.turbosquid.com/users/sign_up?client_id=2c781a9f16cbd4fded77cf7f47db1927b85a5463185769bcb970cfdfe7463a0c&amp;return_to=https%3A%2F%2Fwww%2Eturbosquid%2Ecom%2FLogin%2FKeymaster%2Ecfm%3Fendpoint%3Dauthorize" target="_blank" rel="noreferrer noopener">MAKE A TURBOSQUID ACCOUNT</a></p>
<p>The post <a href="https://blog.turbosquid.com/2024/09/27/turbosquid-tips-buying-3d-game-assets/">Read these TurboSquid tips before buying your next 3D game assets</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blog.turbosquid.com/2024/09/27/turbosquid-tips-buying-3d-game-assets/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">14064</post-id>	</item>
		<item>
		<title>Explore the Kaiju VFX techniques used in Godzilla movies</title>
		<link>https://blog.turbosquid.com/2024/08/29/kaiju-vfx-techniques-used-in-godzilla-movies/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=kaiju-vfx-techniques-used-in-godzilla-movies</link>
		
		<dc:creator><![CDATA[Jesse Radonski]]></dc:creator>
		<pubDate>Thu, 29 Aug 2024 19:00:51 +0000</pubDate>
				<category><![CDATA[3D Modeling]]></category>
		<category><![CDATA[godzilla]]></category>
		<category><![CDATA[Kaiju]]></category>
		<category><![CDATA[VFX]]></category>
		<category><![CDATA[visual effects]]></category>
		<guid isPermaLink="false">https://blog.turbosquid.com/?p=14049</guid>

					<description><![CDATA[<p>Kaiju. It’s a word almost synonymous with Godzilla movies. If you have some years under your belt, chances are the large lizard, or King Kong, was your first taste of the genre. Or if you’re younger, Mighty Morphin Power Rangers might have been more of your jam. Regardless of where you begin on the Kaiju timeline, audiences around the globe ... </p>
<div><a href="https://blog.turbosquid.com/2024/08/29/kaiju-vfx-techniques-used-in-godzilla-movies/" class="more-link">Read More</a></div>
<p>The post <a href="https://blog.turbosquid.com/2024/08/29/kaiju-vfx-techniques-used-in-godzilla-movies/">Explore the Kaiju VFX techniques used in Godzilla movies</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p><strong>Kaiju.</strong> It’s a word almost synonymous with <em>Godzilla</em> movies. If you have some years under your belt, chances are the large lizard, or <em>King Kong</em>, was your first taste of the genre. Or if you’re younger, <em>Mighty Morphin Power Rangers</em> might have been more of your jam.</p>



<p>Regardless of where you begin on the Kaiju timeline, audiences around the globe have enjoyed almost a century’s worth of giant monster FX. And while adolescent audiences may chuckle at the special effects and sets of the classics, there are still great takeaways for the modern-day 3D artist.</p>



<p>In this blog post, we’ll spotlight how some of the genre&#8217;s old-school techniques are more applicable now than ever, and reflect on why they have such an enduring legacy in the world of cinema.</p>



<figure class="wp-block-embed aligncenter is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="x-resp-embed x-is-video x-is-youtube"><iframe loading="lazy" title="EVOLUTION of GODZILLA in Movies (1954 - 2021) Then &amp; Now" width="742" height="417" src="https://www.youtube.com/embed/KxKXQG5MkZQ?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>



<h4 class="wp-block-heading">Epic proportions</h4>



<p>Kaiju movies wouldn’t be what they are if they didn’t have sets built to scale. The original <em>Godzilla</em> movie set was built at a 25-to-1 scale and featured detailed temples, bridges over water, and power lines ready for smashing.</p>



<p>After the atomic animal made its way to the silver screen in 1954, sets slowly evolved in successive Kaiju movies and TV shows with larger water features, taller buildings, mountains, and other landscapes but generally stayed visually consistent whether you were watching a <em>Godzilla</em> movie or <em>Ultraman</em>.</p>



<p>In the 1980s, things got much bigger, especially with <em>The Return of Godzilla</em>, which scaled the creature up to 260 feet. The sets were built for more detailed destruction, and the movie even included a <a href="https://www.imdb.com/title/tt9015178/mediaviewer/rm2096714497/?ref_=ttmi_mi_all_79" target="_blank" rel="noreferrer noopener">robotic version</a> of the behemoth.</p>



<p>We mention this increase in scale because learning about it is essential for 3D artists and animators. Digital water simulation works differently when a monstrous beast is in the scene. Add in another titan and things really start to get interesting … especially when they’re facing off against each other on water.</p>



<p>While making <em>Godzilla vs. Kong</em>, Scanline VFX supervisor Bryan Hirota <a href="https://www.cbr.com/godzilla-vs-kong-scanline-vfxs-bryan-hirota-kaiju-clash-visual-life/" target="_blank" rel="noreferrer noopener">told <em>CBR</em></a>: “You have these two gigantic creatures so you have to maintain their scale, one of which has fur. You have to deal with considerations with their musculature and their skin, but they&#8217;re also on top of a very dynamic battlefield: They&#8217;re on ships traveling through the ocean, and they affect the ships with their weight. If they smash things on the aircraft carriers, the boats sink and they&#8217;re able to go under the water and above the water. You have a bunch of different factors and different simulations that you have to take into account throughout the whole sequence.”</p>



<p>But even if you keep your Kaiju on land, understanding the scale of a virtual set can help with camera placement. Ian Failes from <a href="https://beforesandafters.com/2024/05/02/i-really-feel-like-the-cameras-tell-half-the-story-behind-the-visualization-of-godzilla-x-kong-the-new-empire/" target="_blank" rel="noreferrer noopener"><em>befores &amp; afters</em></a> spoke with <em>Godzilla x Kong: The New Empire</em> visualization supervisor Jeremy Munro about this, highlighting the need to keep some trees at full size while shrinking people and other trees to one-tenth scale to help frame things, at least in the early stages.</p>


<div class="wp-block-image">
<figure class="aligncenter size-large"><img loading="lazy" decoding="async" height="486" width="1024" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/Godzilla-vs-King-Kong.jpeg?w=742" alt="Promotional photograph from the production company, Legendary, showing a scene moments before Godzilla and King Kong begin fighting on a battleship." class="wp-image-14055" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/Godzilla-vs-King-Kong.jpeg 8192w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/Godzilla-vs-King-Kong.jpeg?resize=300,142 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/Godzilla-vs-King-Kong.jpeg?resize=768,365 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/Godzilla-vs-King-Kong.jpeg?resize=1024,486 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/Godzilla-vs-King-Kong.jpeg?resize=1536,729 1536w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/Godzilla-vs-King-Kong.jpeg?resize=2048,972 2048w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/Godzilla-vs-King-Kong.jpeg?resize=100,47 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/Godzilla-vs-King-Kong.jpeg?resize=862,409 862w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/Godzilla-vs-King-Kong.jpeg?resize=1200,570 1200w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /><figcaption class="wp-element-caption"><em>Image courtesy of Legendary</em></figcaption></figure>
</div>


<h4 class="wp-block-heading">Breakneck momentum</h4>



<p>Of course, scale isn’t only about the Kaiju and the set design. It also gives animators a better understanding of how to convey speed, weight, and impact in combat at massive proportions.</p>



<p>Industrial Light &amp; Magic’s Hal Hickel and his team had to solve the challenge of showing battles between <em>Pacific Rim</em>’s giant robot Jaegers and Kaiju that didn’t just read as slow-motion shots of punches and whipping tails.</p>



<p>The solution? Use virtual cameras on body parts. Hickel <a href="https://www.usatoday.com/story/life/movies/2013/07/09/pacific-rim-industrial-light-magic-del-toro-special-effects/2480799/" target="_blank" rel="noreferrer noopener">told <em>USA Today</em></a>, &#8220;That&#8217;s where we were able to convey speed. A fist-cam shot has you riding onboard at 150 miles an hour, as opposed to a wide shot that would show you the action at a slower pace.&#8221;</p>



<p>Its sequel, <em>Pacific Rim: Uprising</em>, ups the ante on the power and destruction of numerous robots and Kaiju in a futuristic Mega Tokyo. A massive city was needed to accommodate that many colossal wrecking machines in a single area, so Double Negative’s production team filmed in South Korea as a stand-in for Tokyo. Production visual effects supervisor Peter Chiang <a href="https://www.vfxvoice.com/pacific-rim-uprising-filming-the-battle-of-tokyo/" target="_blank" rel="noreferrer noopener">explained to <em>VFX Voice</em></a>: “If you think about it, in order for the Jaegers to run down a street, we needed a 70-foot-wide boulevard in order to accommodate just their legs.”</p>



<p>The more giants you have on screen, the more area they’ll need to adequately run, fly, and fall in. Otherwise, it’ll look like they’re sluggishly stepping from place to place. We recommend reading the entire article if you geek out on things like background plates and reference and drone photography as a starting point for creating digital buildings.</p>



<figure class="wp-block-embed aligncenter is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="x-resp-embed x-is-video x-is-youtube"><iframe loading="lazy" title="Pacific Rim: Uprising Movie Clip - Tokyo (2018) | Movieclips Coming Soon" width="742" height="417" src="https://www.youtube.com/embed/UjUICPfoh2c?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>



<h4 class="wp-block-heading">Building beasts on a budget</h4>



<p>Long before Double Negative was digitally assembling Jaegers from the ground up, it teamed up with Tippett Studio on another Kaiju film, this one set in New York City. <em>Cloverfield</em> took audiences by storm, utilizing found-footage camera techniques to showcase destruction evocative of real-life events.</p>



<p>Visual effects supervisor Kevin Blank <a href="https://www.awn.com/vfxworld/cloverfield-reinventing-monster-movie" target="_blank" rel="noreferrer noopener">told <em>AWN</em></a> that even though <em>Cloverfield</em> had a budget of only $25 million, a good portion of that went to VFX. If you haven’t seen the making of this movie, it’s worth reading the article to see how much green screen was used in places like the Brooklyn Bridge sequence. “What was created was basically a 150-foot stretch for the board planks, a few benches, and then lighting fixtures were in place where they would be on the bridge, but the railing, the lamps, and everything is CG,” said Blank. “In New York, we shot helicopter plates on the side of the Brooklyn Bridge to make the environment, but the actual structure of the bridge was 99% visual effects. The only thing that was not was the ground these people were walking on.”</p>



<figure class="wp-block-embed aligncenter is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="x-resp-embed x-is-video x-is-youtube"><iframe loading="lazy" title="Cloverfield (2/9) Movie CLIP - Brooklyn Bridge Collapse (2008) HD" width="742" height="417" src="https://www.youtube.com/embed/dVCki9kwF_4?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>



<p>The visual effects production crew did a remarkable job — and we haven’t even talked about the Kaiju yet. Later in the article, lead creature designer Neville Page explained his process for designing the creature: drawing digitally on a tablet in Adobe Photoshop, working in clay, and sculpting with ZBrush.</p>



<p>Page notes, “For me, one of the most key moments in our collective brainstorming was the choice to make the creature be something that we would empathize with. It is not out there, just killing. It is confused, lost, scared. It&#8217;s a newborn. Having this be a story point (one that the audience does not know), it allowed for some purposeful choices about its anatomy, movement and, yes, motivations.”</p>



<p>Those motivations made for some very frightening scenes from a first-person camera perspective as rubble fell dangerously close to the movie’s main characters. Sure, the Kaiju is a big, scary monster, but the damage it leaves in its wake is downright terrifying.</p>



<h4 class="wp-block-heading">Evolving the giant genre</h4>



<p>Environment and creature VFX look so amazing today, it feels like the sky’s the limit for what’s to come in Kaiju films. The found footage genre was already in its prime by the time <em>Cloverfield</em> entered theaters, so there’s no telling if a new movie-making technique like <a href="https://blog.turbosquid.com/2024/07/26/digital-sets-real-impact-virtual-productions-role-in-a-greener-future/?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=kaiju-vfx-techniques&amp;utm_term=mktg_content&amp;utm_content=digital-sets-virtual-productions-role" target="_blank" rel="noreferrer noopener">virtual production</a> will lead to a brand-new moviegoing experience.</p>



<p>Just look at <a href="https://youtu.be/Q0aCarTjHTI?si=IM5Wsf1wMnaaN_vH" target="_blank" rel="noreferrer noopener">YouTube content creators</a> going from homemade stunt videos to doing something new with the possession-horror genre in <em>Talk to Me</em>. As real-time technology like <a href="https://www.unrealengine.com/" target="_blank" rel="noreferrer noopener">Unreal Engine</a> continues to gain ground with 3D artists and animators, using pre-made assets like those on TurboSquid along with AI-generated objects and sounds could enable small teams to make something we’ve never seen before.</p>



<p>Kaiju movies continue to be popular almost 100 years later, partially because they’ve been used as a way to talk about nuclear warfare and terrorism. But perhaps there’s something in the reptilian part of our brains that thinks it’s pretty cool to see two giant monsters go head-to-head (or fist-to-tail).</p>



<p>Only time will tell where the genre goes next. If it’s your dream to make a Kaiju movie, we hope you’ve found some of this advice useful. Many of the models on TurboSquid are built to a real-world scale, so that’s a great place to begin if you want to start creating a city made for destruction.</p>



<hr class="wp-block-separator has-alpha-channel-opacity" />



<p class="has-text-align-center"><strong>Want to make a Kaiju scene?</strong></p>



<p class="has-text-align-center">TurboSquid has giant <a href="https://www.turbosquid.com/3d-model/monster?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=kaiju-vfx-techniques&amp;utm_term=mktg_content&amp;utm_content=category-search" target="_blank" rel="noreferrer noopener">monsters</a>, <a href="https://www.turbosquid.com/3d-model/city?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=kaiju-vfx-techniques&amp;utm_term=mktg_content&amp;utm_content=category-search" target="_blank" rel="noreferrer noopener">cityscapes</a>, <a href="https://www.turbosquid.com/3d-model/trees?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=kaiju-vfx-techniques&amp;utm_term=mktg_content&amp;utm_content=category-search" target="_blank" rel="noreferrer noopener">trees</a>, and more.</p>



<p class="has-text-align-center"><a href="https://auth.turbosquid.com/users/sign_up?client_id=2c781a9f16cbd4fded77cf7f47db1927b85a5463185769bcb970cfdfe7463a0c&amp;return_to=https%3A%2F%2Fwww%2Eturbosquid%2Ecom%2FLogin%2FKeymaster%2Ecfm%3Fendpoint%3Dauthorize" target="_blank" rel="noreferrer noopener"><strong>CREATE AN ACCOUNT TODAY</strong></a></p>
<p>The post <a href="https://blog.turbosquid.com/2024/08/29/kaiju-vfx-techniques-used-in-godzilla-movies/">Explore the Kaiju VFX techniques used in Godzilla movies</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">14049</post-id>	</item>
		<item>
		<title>SIGGRAPH spotlight: Huge generative 3D launch news and more</title>
		<link>https://blog.turbosquid.com/2024/08/15/siggraph-shutterstock-generative-3d-launch-news/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=siggraph-shutterstock-generative-3d-launch-news</link>
					<comments>https://blog.turbosquid.com/2024/08/15/siggraph-shutterstock-generative-3d-launch-news/#comments</comments>
		
		<dc:creator><![CDATA[Jesse Radonski]]></dc:creator>
		<pubDate>Thu, 15 Aug 2024 15:10:15 +0000</pubDate>
				<category><![CDATA[Events]]></category>
		<category><![CDATA[Alliance for OpenUSD]]></category>
		<category><![CDATA[generative 3D]]></category>
		<category><![CDATA[HP]]></category>
		<category><![CDATA[nvidia]]></category>
		<category><![CDATA[NVIDIA Edify]]></category>
		<category><![CDATA[SIGGRAPH]]></category>
		<category><![CDATA[Trigger XR]]></category>
		<category><![CDATA[WPP]]></category>
		<guid isPermaLink="false">https://blog.turbosquid.com/?p=14032</guid>

					<description><![CDATA[<p>This week at SIGGRAPH, we shared big news about the future of 3D Design. We showcased new technologies for 3D artists to add to their toolbox and our friends at NVIDIA, HP, WPP, and Trigger XR showed off their own cutting edge advancements. If you haven’t had time to catch up on everything coming out of the Mile High City, ... </p>
<div><a href="https://blog.turbosquid.com/2024/08/15/siggraph-shutterstock-generative-3d-launch-news/" class="more-link">Read More</a></div>
<p>The post <a href="https://blog.turbosquid.com/2024/08/15/siggraph-shutterstock-generative-3d-launch-news/">SIGGRAPH spotlight: Huge generative 3D launch news and more</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>This week at <a href="https://s2024.siggraph.org/" target="_blank" rel="noreferrer noopener">SIGGRAPH</a>, we shared big news about the future of 3D design. We showcased new technologies for 3D artists to add to their toolbox, and our friends at <a href="https://www.nvidia.com/en-us/" target="_blank" rel="noreferrer noopener">NVIDIA</a>, HP, WPP, and Trigger XR showed off their own cutting-edge advancements. If you haven’t had time to catch up on everything coming out of the Mile High City, keep reading.</p>



<figure class="wp-block-embed aligncenter is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="x-resp-embed x-is-video x-is-youtube"><iframe loading="lazy" title="Generative 3D API | Shutterstock" width="742" height="417" src="https://www.youtube.com/embed/4zE0fMn15gM?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>



<h4 class="wp-block-heading">Starting the show with a bang</h4>



<p>We kicked off SIGGRAPH by <a href="https://www.shutterstock.com/press/Shutterstock-Launches-First-Ethical-J8znD?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=SFDC_event_240730-SIGGRAPH-2024&amp;utm_term=mktg_content&amp;utm_content=siggraph_recap_blog" target="_blank" rel="noreferrer noopener">announcing the launch</a> of our generative 3D API. Why an API? Because it’s the easiest way for companies to incorporate this powerful tech into both existing workflows and DCC applications, like <a href="https://www.autodesk.com/products/maya/overview" target="_blank" rel="noreferrer noopener">Maya</a> and <a href="https://www.unrealengine.com/en-US" target="_blank" rel="noreferrer noopener">Unreal Engine</a>.</p>



<p>Built on <a href="https://blogs.nvidia.com/blog/edify-3d-generative-ai-custom-fine-tuning/" target="_blank" rel="noreferrer noopener">NVIDIA Edify</a>, this API offers a fast and ethical way to produce realistic 3D models using AI. It&#8217;s available now for large enterprises, with an option open to everyone coming this fall, so be sure to <a href="https://www.shutterstock.com/discover/generative-ai-3d?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=SFDC_event_240730-SIGGRAPH-2024&amp;utm_term=mktg_content&amp;utm_content=siggraph_recap_blog" target="_blank" rel="noreferrer noopener">sign up</a> to be notified when that launches.</p>



<p>Also announced was the early access release of our 360 HDRi generator API, which generates rich and detailed 360° panoramas of natural environments that are great for lighting 3D scenes. We showcased both of these offerings in our talk, <em>Enhancing 3D Pipelines with Generative AI</em>, presented by our very own Dade Orgeron.</p>



<p>He discussed responsible AI practices using Shutterstock&#8217;s TRUST framework:</p>



<ul class="wp-block-list">
<li><strong>T</strong>raining on properly licensed data</li>



<li><strong>R</strong>oyalties that compensate artists fairly</li>



<li><strong>U</strong>plift and promote representation that reflects the world</li>



<li><strong>S</strong>afeguards to control content and protect customers</li>



<li><strong>T</strong>ransparency by design</li>
</ul>



<p>Dade also explored how generative AI appeals to teams of all sizes in various industries, including non-3D industries that have had difficulty incorporating 3D solutions in the past, and how brands like HP and WPP are currently testing it.</p>



<figure class="wp-block-embed aligncenter is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="x-resp-embed x-is-video x-is-youtube"><iframe loading="lazy" title="Shutterstock Launches First Ethical Generative 3D API | SIGGRAPH 2024" width="742" height="417" src="https://www.youtube.com/embed/HSqzlFRYg7A?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>



<h4 class="wp-block-heading">Generative 3D in action</h4>



<p>Speaking of <a href="https://www.wpp.com/en-us/" target="_blank" rel="noreferrer noopener">WPP</a>, we had the pleasure of sitting down with its talented creative technologists to explore how they’ve used the generative 3D API for game development and virtual production.</p>



<p>&#8220;Being able to bring something from an idea in your head to actually seeing it on screen or on paper within, like, a minute is incredible,&#8221; says WPP creative technologist Omotara Edu. &#8220;And I think that leaves a lot of space for you to then be more creative.&#8221; We couldn’t agree more.</p>



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" height="768" width="1024" data-id="14038" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1837.png?w=742" alt="" class="wp-image-14038" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1837.png 3000w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1837.png?resize=300,225 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1837.png?resize=768,576 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1837.png?resize=1024,768 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1837.png?resize=1536,1152 1536w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1837.png?resize=2048,1536 2048w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1837.png?resize=100,75 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1837.png?resize=862,647 862w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1837.png?resize=1200,900 1200w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" height="768" width="1024" data-id="14037" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1832.png?w=742" alt="" class="wp-image-14037" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1832.png 3000w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1832.png?resize=300,225 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1832.png?resize=768,576 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1832.png?resize=1024,768 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1832.png?resize=1536,1152 1536w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1832.png?resize=2048,1536 2048w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1832.png?resize=100,75 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1832.png?resize=862,647 862w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1832.png?resize=1200,900 1200w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" height="768" width="1024" data-id="14036" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1835.png?w=742" alt="" class="wp-image-14036" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1835.png 3000w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1835.png?resize=300,225 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1835.png?resize=768,576 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1835.png?resize=1024,768 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1835.png?resize=1536,1152 1536w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1835.png?resize=2048,1536 2048w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1835.png?resize=100,75 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1835.png?resize=862,647 862w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1835.png?resize=1200,900 1200w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" height="768" width="1024" data-id="14039" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1829.png?w=742" alt="" class="wp-image-14039" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1829.png 3000w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1829.png?resize=300,225 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1829.png?resize=768,576 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1829.png?resize=1024,768 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1829.png?resize=1536,1152 1536w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1829.png?resize=2048,1536 2048w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1829.png?resize=100,75 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1829.png?resize=862,647 862w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/IMG_1829.png?resize=1200,900 1200w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>
</figure>



<h4 class="wp-block-heading">Fun with 3D printing</h4>



<p>Back in March, we had a wonderful time at GTC seeing people generate 3D-printed models with the help of <a href="https://www.hp.com/us-en/home.html" target="_blank" rel="noreferrer noopener">HP</a>. So it was exciting to do it again at SIGGRAPH. Just a quick look at the images above shows how diverse the models can be. We’re partial to cats, but that snowman is pretty awesome, too.</p>



<p>The demo shows the power of using generative 3D for quickly prototyping ideas and bringing them to your fingertips with 3D printing. Can you think of any related use cases? Let us know in the comments.</p>



<h4 class="wp-block-heading">Spatial computing with a side of generative</h4>



<p>While the models were being 3D printed, the folks at <a href="https://www.triggerxr.com/" target="_blank" rel="noreferrer noopener">Trigger XR</a> gave SIGGRAPH guests the opportunity to generate a model and play around with it using an Apple Vision Pro headset. For some, this was their first time trying the spatial computing device, and many immediately grasped its potential, whether for a VR game or a training application.</p>



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" height="1024" width="1024" data-id="14040" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator1.png?w=742" alt="" class="wp-image-14040" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator1.png 1080w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator1.png?resize=150,150 150w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator1.png?resize=300,300 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator1.png?resize=768,768 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator1.png?resize=1024,1024 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator1.png?resize=100,100 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator1.png?resize=862,862 862w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" height="1024" width="1024" data-id="14041" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator2.png?w=742" alt="" class="wp-image-14041" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator2.png 1080w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator2.png?resize=150,150 150w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator2.png?resize=300,300 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator2.png?resize=768,768 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator2.png?resize=1024,1024 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator2.png?resize=100,100 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/08/generator2.png?resize=862,862 862w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>
</figure>



<h4 class="wp-block-heading">Joining the Alliance for OpenUSD</h4>



<p>At this point, you’ve seen examples of generative 3D in various applications. In a perfect world, all 3D files would work interchangeably, but that’s not always the case. That’s why we’re happy to announce that Shutterstock has joined the <a href="https://aousd.org/" target="_blank" rel="noreferrer noopener">Alliance for OpenUSD</a>.</p>



<p>As the popularity of 3D and real-time technologies continues to grow, more and more industries struggle to take full advantage of their many benefits. OpenUSD ensures that 3D assets and data can be easily shared and used across applications, reducing compatibility issues and streamlining workflows.</p>



<p>By joining the Alliance for OpenUSD, which includes members such as NVIDIA, Pixar, WPP, and Autodesk, we’re doing our part to ensure both existing and new 3D users can unlock the full potential of 3D for their industries.</p>



<figure class="wp-block-embed aligncenter is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="x-resp-embed x-is-video x-is-youtube"><iframe loading="lazy" title="Shutterstock Joins the Alliance for OpenUSD | SIGGRAPH 2024" width="742" height="417" src="https://www.youtube.com/embed/W4LV_X4ZwYk?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>



<p>It&#8217;s been a busy week at SIGGRAPH. Are you ready to try out our generative 3D API? It’s available now for enterprise customers, with a self-serve option coming soon.</p>



<p>Visit the <a href="https://www.shutterstock.com/discover/generative-ai-3d?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=SFDC_event_240730-SIGGRAPH-2024&amp;utm_term=mktg_content&amp;utm_content=siggraph_recap_blog" target="_blank" rel="noreferrer noopener">generative 3D landing page</a> to learn more about the API, sign up to be notified of the self-serve launch, and gain access to the developer platform to preview API documentation. Developers and enterprises can also experiment with the capabilities today by visiting <a href="https://build.nvidia.com/shutterstock/edify-3d" target="_blank" rel="noreferrer noopener">build.nvidia.com/shutterstock/edify-3d</a>.</p>



<hr class="wp-block-separator has-alpha-channel-opacity" />



<p class="has-text-align-center"><strong>Need professional-quality 3D models?</strong></p>



<p class="has-text-align-center">TurboSquid has models for just about any industry.</p>



<p class="has-text-align-center"><strong><a href="https://auth.turbosquid.com/users/sign_up?client_id=2c781a9f16cbd4fded77cf7f47db1927b85a5463185769bcb970cfdfe7463a0c&amp;return_to=https%3A%2F%2Fwww%2Eturbosquid%2Ecom%2FLogin%2FKeymaster%2Ecfm%3Fendpoint%3Dauthorize" target="_blank" rel="noreferrer noopener">SIGN UP NOW</a></strong></p>
<p>The post <a href="https://blog.turbosquid.com/2024/08/15/siggraph-shutterstock-generative-3d-launch-news/">SIGGRAPH spotlight: Huge generative 3D launch news and more</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blog.turbosquid.com/2024/08/15/siggraph-shutterstock-generative-3d-launch-news/feed/</wfw:commentRss>
			<slash:comments>4</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">14032</post-id>	</item>
		<item>
		<title>Digital sets, real impact: Virtual production&#8217;s role in a greener future</title>
		<link>https://blog.turbosquid.com/2024/07/26/digital-sets-real-impact-virtual-productions-role-in-a-greener-future/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=digital-sets-real-impact-virtual-productions-role-in-a-greener-future</link>
					<comments>https://blog.turbosquid.com/2024/07/26/digital-sets-real-impact-virtual-productions-role-in-a-greener-future/#comments</comments>
		
		<dc:creator><![CDATA[Jesse Radonski]]></dc:creator>
		<pubDate>Fri, 26 Jul 2024 17:23:14 +0000</pubDate>
				<category><![CDATA[Virtual Production]]></category>
		<category><![CDATA[Shutterstock Studios]]></category>
		<category><![CDATA[Unity]]></category>
		<category><![CDATA[Unreal]]></category>
		<category><![CDATA[virtual production]]></category>
		<guid isPermaLink="false">https://blog.turbosquid.com/?p=14008</guid>

					<description><![CDATA[<p>If you’ve paid attention to the Film—or even TV—landscape over the last decade, you’re probably no stranger to virtual production. Here at TurboSquid, it’s not uncommon to open up Slack and see our team sharing behind-the-scenes videos from shows like The Mandalorian and House of the Dragon, detailing how they pulled off some of the most epic, cinematic moments in ... </p>
<div><a href="https://blog.turbosquid.com/2024/07/26/digital-sets-real-impact-virtual-productions-role-in-a-greener-future/" class="more-link">Read More</a></div>
<p>The post <a href="https://blog.turbosquid.com/2024/07/26/digital-sets-real-impact-virtual-productions-role-in-a-greener-future/">Digital sets, real impact: Virtual production&#8217;s role in a greener future</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>If you’ve paid attention to the Film—or even TV—landscape over the last decade, you’re probably no stranger to <strong>virtual production</strong>. Here at TurboSquid, it’s not uncommon to open up Slack and see our team sharing behind-the-scenes videos from shows like <a href="https://www.youtube.com/watch?v=-gX4N5rDYeQ&amp;t=1s" target="_blank" rel="noreferrer noopener"><em>The Mandalorian</em></a> and <a href="https://youtu.be/BBUhAi_QXpE?si=gj0TwpSjykLgM5mj" target="_blank" rel="noreferrer noopener"><em>House of the Dragon</em></a>, detailing how they pulled off some of the most epic, cinematic moments in their productions.</p>



<p>These shows feature fantastical worlds in diverse and dangerous settings where filming on location can be costly and carbon-intensive. This is where virtual production comes in, and with it, a plethora of sustainability and financial benefits.</p>



<p>In fact, Shutterstock Studios is already using it in the ad world for companies like Reckitt and Procter &amp; Gamble. We’ve seen the advantages firsthand. Now, we want to explore why your organization should consider it too.</p>



<figure class="wp-block-embed aligncenter is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="x-resp-embed x-is-video x-is-youtube"><iframe loading="lazy" title="Lysol Cleaning 2024 Kid Chaos 15 CTA ENG  | Shutterstock" width="742" height="417" src="https://www.youtube.com/embed/umCCLLW3sq8?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>



<h4 class="wp-block-heading">Old vs. new</h4>



<p>Think about some of your favorite classic movies. Chances are, transporting the actors, production teams, and equipment to one or many locations came with a sizable carbon footprint. Then factor in building out sets, re-shoots, and all the other common filmmaking costs, and you can quickly see how a production’s environmental impact goes from green to red.</p>



<p>The use of green and blue screens moved film production in a greener direction (no pun intended). Through post-production, directors could bring a location <em>to</em> their actors on the set. Less transportation hassle and more control over on-set lighting are nice benefits; the trade-off, however, is that actors need to use more of their imagination to immerse themselves in the scene.</p>



<p>More recently, LED wall production has progressed to include high-resolution screens that can display realistic backgrounds and environments, using game engines like <a href="https://www.unrealengine.com/" target="_blank" rel="noreferrer noopener">Unreal Engine</a> or <a href="https://unity.com/" target="_blank" rel="noreferrer noopener">Unity</a> to display the virtual environments in real time. Some of these sets are massive, but others are simply a curved wall with some set dressing in front of it. That’s the magic of virtual production; even a smaller wall can become whatever environment you load into the game engine and project onto it.</p>



<p>Aside from its flexibility, this method also reduces the amount of single-use set waste for a project and provides more control over your environment. For instance, rather than traveling to desert locations, <em>The Mandalorian</em> production team used an impressively large set for season 1 of the Disney+ show, later noting the project <a href="https://variety.com/2020/tv/news/the-mandalorian-variety-sustainability-series-1234783200/" target="_blank" rel="noreferrer noopener">reduced carbon emissions</a> by 30 tons.</p>



<figure class="wp-block-embed aligncenter is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="x-resp-embed x-is-video x-is-youtube"><iframe loading="lazy" title="The Virtual Production of The Mandalorian Season One" width="742" height="417" src="https://www.youtube.com/embed/gUnxzVOs3rk?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>



<h4 class="wp-block-heading">Environmental benefits of virtual production</h4>



<p>Similar carbon reductions are happening in television production, such as when the BBC used remote XR stages for its <a href="https://www.newscaststudio.com/2021/10/12/is-green-virtual-production-a-buzzword/" target="_blank" rel="noreferrer noopener">Tokyo Olympics coverage</a>, which was estimated to reduce crew travel by 30 to 40 percent. But what we like about virtual production is that sometimes, you don’t even need a physical object around to make a set come to life.</p>



<p>Take a look at the variety of <a href="https://www.turbosquid.com/Search/3D-Models/real-time/vase?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=digital-sets-virtual-productions-role&amp;utm_term=mktg_content&amp;utm_content=ts_srp" target="_blank" rel="noreferrer noopener">vases on TurboSquid</a> with the real-time label applied, and you can see the potential for adding any of these to a scene. We <a href="https://blog.turbosquid.com/2022/12/15/12-reasons-to-choose-virtual-production-for-your-next-film/?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=digital-sets-virtual-productions-role&amp;utm_term=mktg_content&amp;utm_content=12-reasons-virtual-production" target="_blank" rel="noreferrer noopener">previously blogged</a> about some other advantages of virtual production if you want to learn more.</p>



<p>Another nice benefit is the energy savings. <a href="https://animationsinstitut.de/en/blog/campus/detail/green-screens-green-pixels-and-green-shooting" target="_blank" rel="noreferrer noopener">A 2022 study</a> compared two projects and found some surprising outcomes. One company used a green screen, offline rendering, and post-production. The other used virtual production, shooting with an LED wall for in-camera effects. The result? The green screen project clocked in at 5,073 kWh vs. the virtual production project’s 1,594 kWh—a 68% reduction.</p>
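<p>The reduction quoted above is straightforward to verify from the study’s reported totals. A quick illustrative calculation (the kWh figures come from the cited 2022 study; the script is just arithmetic for perspective):</p>

```python
# Energy totals reported in the 2022 study cited above (kWh).
green_screen_kwh = 5073   # green screen + offline rendering + post-production
virtual_prod_kwh = 1594   # LED wall with in-camera effects

savings_kwh = green_screen_kwh - virtual_prod_kwh
reduction_pct = savings_kwh / green_screen_kwh * 100

print(f"Energy saved: {savings_kwh} kWh ({reduction_pct:.1f}% reduction)")
# → Energy saved: 3479 kWh (68.6% reduction)
```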



<figure class="wp-block-image size-large"><a href="https://www.turbosquid.com/3d-models/flower-bouquet-of-sunflowers-in-a-vase-118-1762568" target="_blank" rel="noreferrer noopener"><img loading="lazy" decoding="async" height="576" width="1024" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/flower-vase-deckorator4.jpg?w=742" alt="" class="wp-image-14013" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/flower-vase-deckorator4.jpg 1920w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/flower-vase-deckorator4.jpg?resize=300,169 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/flower-vase-deckorator4.jpg?resize=768,432 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/flower-vase-deckorator4.jpg?resize=1024,576 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/flower-vase-deckorator4.jpg?resize=1536,864 1536w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/flower-vase-deckorator4.jpg?resize=100,56 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/flower-vase-deckorator4.jpg?resize=862,485 862w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/flower-vase-deckorator4.jpg?resize=1200,675 1200w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></a><figcaption class="wp-element-caption">3D model by <a href="https://www.turbosquid.com/3d-models/flower-bouquet-of-sunflowers-in-a-vase-118-1762568?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=digital-sets-virtual-productions-role&amp;utm_term=mktg_content&amp;utm_content=model" target="_blank" rel="noreferrer noopener">deckorator4</a></figcaption></figure>



<h4 class="wp-block-heading">What to consider next</h4>



<p>Ready to leverage virtual production’s benefits? Two big hurdles to adapting it to your workflows are the initial investment in the tech and the expertise needed to begin and complete the project. LED walls aren’t cheap; setting up a virtual production studio requires real investment, including hiring people who know what they’re doing.&nbsp;</p>



<p>If you’re experienced with 3D modeling, you might know that game engines are relatively inexpensive for beginners, initially limited only by the power of your computer and the spare time you have to learn the engine. We’ve seen some fantastic opportunities for people with unique skill sets to enter industries they might not otherwise have considered.</p>



<p>With <a href="https://www.fastcompany.com/91044059/whats-driving-the-flood-of-layoffs-in-the-video-game-industry" target="_blank" rel="noreferrer noopener">so many layoffs</a> in the video game industry between 2023 and 2024, game developers with extensive experience using real-time game engines could potentially enter a different entertainment area with only a little training. Much like the limitless worlds projected onto a virtual production wall, a little investment in yourself can open up doors to new places and opportunities.</p>


<div class="wp-block-image">
<figure class="aligncenter size-large is-resized"><a href="https://www.shutterstock.com/g/KinoMasterskaya?&amp;utm_campaign=digital-sets-virtual-productions-role&amp;utm_content=sstk_contributor&amp;utm_medium=blog&amp;utm_source=ts_blog&amp;utm_term=mktg_content" target="_blank" rel="noreferrer noopener"><img loading="lazy" decoding="async" height="576" width="1024" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/shutterstock_2389827065.jpg?w=742" alt="" class="wp-image-14016" style="width:746px;height:auto" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/shutterstock_2389827065.jpg 3840w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/shutterstock_2389827065.jpg?resize=300,169 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/shutterstock_2389827065.jpg?resize=768,432 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/shutterstock_2389827065.jpg?resize=1024,576 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/shutterstock_2389827065.jpg?resize=1536,864 1536w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/shutterstock_2389827065.jpg?resize=2048,1152 2048w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/shutterstock_2389827065.jpg?resize=100,56 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/shutterstock_2389827065.jpg?resize=862,485 862w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/shutterstock_2389827065.jpg?resize=1200,675 1200w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></a><figcaption class="wp-element-caption">Image by <a href="http://www.shutterstock.com/g/KinoMasterskaya?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=digital-sets-virtual-productions-role&amp;utm_term=mktg_content&amp;utm_content=sstk_contributor" target="_blank" rel="noreferrer noopener">KinoMasterskaya</a></figcaption></figure>
</div>


<h4 class="wp-block-heading">What’s next for virtual production</h4>



<p>As fuel prices continue to rise and the technology becomes more broadly available, we expect virtual production’s carbon and cost savings over traditional shoots to grow even wider. If the trend of taxing CO2 emissions continues, then companies already embracing virtual production may have a competitive advantage moving forward.</p>



<p>Companies with a global virtual production presence can also compete on a cultural level, collaborating with teams familiar with their regions, promoting diversity and inclusivity. Shutterstock Studios has already seen the benefits of using the local expertise of its global teams on virtual production sets, switching out country-specific actors and objects to seamlessly integrate an advertisement into its broadcast region.</p>



<p>Virtual production has revolutionized the scale at which movies, TV shows, sports programs, and commercials can be created. But outside of creativity, it’s important to take care of the planet we live on now and for future generations, especially at a commercial scale. By reducing transportation pollutants, on-set waste, and energy consumption, we’re all taking an important step toward a greener future.</p>



<hr class="wp-block-separator has-alpha-channel-opacity" />



<p class="has-text-align-center"><strong>Want to learn more about virtual production?</strong></p>



<p class="has-text-align-center">Shutterstock Studios leverages virtual production capabilities, 3D design expertise, and more.</p>



<p class="has-text-align-center"><a href="https://studios.shutterstock.com/immersive?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=digital-sets-virtual-productions-role&amp;utm_term=mktg_content&amp;utm_content=studios-hp" target="_blank" rel="noreferrer noopener"><strong>SEE HOW TO GET STARTED</strong></a></p>
<p>The post <a href="https://blog.turbosquid.com/2024/07/26/digital-sets-real-impact-virtual-productions-role-in-a-greener-future/">Digital sets, real impact: Virtual production&#8217;s role in a greener future</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blog.turbosquid.com/2024/07/26/digital-sets-real-impact-virtual-productions-role-in-a-greener-future/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">14008</post-id>	</item>
		<item>
		<title>TurboSquid at SIGGRAPH 2024: Get hands-on with generative 3D</title>
		<link>https://blog.turbosquid.com/2024/07/22/turbosquid-at-siggraph-2024-generative-3d/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=turbosquid-at-siggraph-2024-generative-3d</link>
					<comments>https://blog.turbosquid.com/2024/07/22/turbosquid-at-siggraph-2024-generative-3d/#comments</comments>
		
		<dc:creator><![CDATA[Jesse Radonski]]></dc:creator>
		<pubDate>Mon, 22 Jul 2024 17:59:16 +0000</pubDate>
				<category><![CDATA[Events]]></category>
		<category><![CDATA[generative 3D]]></category>
		<category><![CDATA[HP]]></category>
		<category><![CDATA[SIGGRAPH]]></category>
		<category><![CDATA[SIGGRAPH 2024]]></category>
		<category><![CDATA[TriggerXR]]></category>
		<guid isPermaLink="false">https://blog.turbosquid.com/?p=13958</guid>

					<description><![CDATA[<p>We’re headed to Denver for SIGGRAPH 2024, and we’re bringing groundbreaking generative 3D innovations that are set to supercharge your workflow. Curious? Keep reading. Three days for the future of design Visit Shutterstock booth #415 for all the action. Experience the latest in our generative 3D technology at our demo stations, created in collaboration with our friends at NVIDIA. If ... </p>
<div><a href="https://blog.turbosquid.com/2024/07/22/turbosquid-at-siggraph-2024-generative-3d/" class="more-link">Read More</a></div>
<p>The post <a href="https://blog.turbosquid.com/2024/07/22/turbosquid-at-siggraph-2024-generative-3d/">TurboSquid at SIGGRAPH 2024: Get hands-on with generative 3D</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>We’re headed to Denver for <a href="https://s2024.siggraph.org/" target="_blank" rel="noreferrer noopener">SIGGRAPH 2024</a>, and we’re bringing groundbreaking<strong> generative 3D innovations</strong> that are set to <strong>supercharge your workflow.</strong></p>



<p>Curious? Keep reading.</p>


<div class="wp-block-image">
<figure class="aligncenter size-large"><img loading="lazy" decoding="async" height="576" width="1024" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sstk_3d_intro.png?w=742" alt="" class="wp-image-14001" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sstk_3d_intro.png 1920w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sstk_3d_intro.png?resize=300,169 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sstk_3d_intro.png?resize=768,432 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sstk_3d_intro.png?resize=1024,576 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sstk_3d_intro.png?resize=1536,864 1536w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sstk_3d_intro.png?resize=100,56 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sstk_3d_intro.png?resize=862,485 862w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sstk_3d_intro.png?resize=1200,675 1200w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>
</div>


<h4 class="wp-block-heading">Three days for the future of design</h4>



<p>Visit Shutterstock booth #415 for all the action. Experience the latest in our generative 3D technology at our demo stations, created in collaboration with our friends at NVIDIA. If you’re new to this type of tech, imagine having the power to create a 3D model from text or an image in under a minute. We covered this in more detail in our GTC 2024 recap blog post.</p>



<p>In addition to generative 3D, Trigger XR is bringing an Apple Vision Pro to our booth, giving guests an opportunity to generate a model and watch it blend seamlessly with reality on a spatial computing headset. Our booth is also home to a theater. Here, we’ll present some of our latest technologies, ranging from generative 3D to AI in virtual production and game development. <a href="https://d3kqgz5iyf5gxy.cloudfront.net/CRTV+2024/Enterprise/Events/theater+schedule+-+new.pdf?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=SFDC_event_240730-SIGGRAPH-2024&amp;utm_term=mktg_content&amp;utm_content=theater-schedule" target="_blank" rel="noreferrer noopener">Check the schedule</a> and start planning your day.</p>


<div class="wp-block-image">
<figure class="aligncenter size-large"><img loading="lazy" decoding="async" height="576" width="1024" src="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sc_video_games.png?w=742" alt="" class="wp-image-14000" srcset="https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sc_video_games.png 1920w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sc_video_games.png?resize=300,169 300w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sc_video_games.png?resize=768,432 768w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sc_video_games.png?resize=1024,576 1024w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sc_video_games.png?resize=1536,864 1536w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sc_video_games.png?resize=100,56 100w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sc_video_games.png?resize=862,485 862w, https://blog.turbosquid.com/wp-content/uploads/sites/2/2024/07/sc_video_games.png?resize=1200,675 1200w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>
</div>


<h4 class="wp-block-heading">3D pipelines, 3D printing and virtual production&nbsp;</h4>



<p>We’ll have a special talk in the NVIDIA AI theater on Wednesday, July 31, at 9 a.m. MT. <strong>Enhancing 3D Pipelines with Generative AI</strong> will kick off with Dade Orgeron, VP of Innovation for Shutterstock and TurboSquid, as he shares valuable insights into how the next wave of generative technology will improve and enhance 3D pipelines across industries.</p>



<p>In partnership with HP, select models generated by SIGGRAPH guests will be 3D printed. This was a big hit at GTC, and we’re looking forward to seeing creative imaginations take shape. We hear there’ll be some additional giveaways, too.</p>



<p>Is virtual production your thing? Catch a demo of the Apple Vision Pro Immersive 3D experience from <a href="https://studios.shutterstock.com/immersive?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=SFDC_event_240730-SIGGRAPH-2024&amp;utm_term=mktg_content&amp;utm_content=studios" target="_blank" rel="noreferrer noopener">Shutterstock Studios</a> at our booth. This <a href="https://investor.shutterstock.com/news-releases/news-release-details/shutterstock-studios-sweeps-awards-season-15-telly-awards-and?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=SFDC_event_240730-SIGGRAPH-2024&amp;utm_term=mktg_content&amp;utm_content=telly" target="_blank" rel="noreferrer noopener">award-winning team</a> has worked with companies like Procter &amp; Gamble, Amazon, L’Oreal, TikTok, and more. Trust us, it’s gonna be good.</p>


<div class="wp-block-image">
<figure class="aligncenter"><img decoding="async" src="https://lh7-us.googleusercontent.com/docsz/AD_4nXfKSemn1ev_9CqOu_7ZEgZd9PV9VhIRZzXqJeKo93yDUoWJYmKKu7OoytQGNqGyxGt8rYA8eVGdHPUiAEVa9USf6a8jJU2F2-2yxHqGluQQsKEssIXyKF4uBWrB5DaRdDu3s2uWN7InOZDSITjHP0hq8Ci1?key=ajfucmVUFw2ZsherzxqdOQ" alt="" /></figure>
</div>


<h4 class="wp-block-heading">How can I stay updated if I can’t attend live?</h4>



<p>Our team will be posting from SIGGRAPH on Shutterstock’s <a href="https://www.instagram.com/shutterstock/" target="_blank" rel="noreferrer noopener">Instagram</a>. If there&#8217;s something you want to see at the show, leave a comment below, and we’ll share it with the team. Additionally, we’ll recap the event on our blog and in the TurboSquid newsletter.</p>



<p>Even if you can’t attend, you’ll soon be able to take advantage of our new generative 3D service options. We’re also improving the TurboSquid website, so stay tuned for more information.</p>



<hr class="wp-block-separator has-alpha-channel-opacity" />



<p class="has-text-align-center"><strong>Join the API Waitlist</strong></p>



<p class="has-text-align-center">Experiment, integrate, and scale 3D content anywhere, powered by TurboSquid and NVIDIA.</p>



<p class="has-text-align-center"><strong><a href="https://www.shutterstock.com/discover/generative-ai-3d?&amp;utm_source=ts_blog&amp;utm_medium=blog&amp;utm_campaign=SFDC_event_240730-SIGGRAPH-2024&amp;utm_term=mktg_content&amp;utm_content=lp-cta" target="_blank" rel="noreferrer noopener">Sign up now</a></strong></p>



<p>The post <a href="https://blog.turbosquid.com/2024/07/22/turbosquid-at-siggraph-2024-generative-3d/">TurboSquid at SIGGRAPH 2024: Get hands-on with generative 3D</a> appeared first on <a href="https://blog.turbosquid.com">TurboSquid Blog</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blog.turbosquid.com/2024/07/22/turbosquid-at-siggraph-2024-generative-3d/feed/</wfw:commentRss>
			<slash:comments>10</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">13958</post-id>	</item>
	</channel>
</rss>
