Autodesk Media & Entertainment brings new capabilities to enhance creativity and efficiency

October 15, 2024 · 4 min read

From Autodesk News: Diana Colella


Wonder Studio (an Autodesk product)

The Media and Entertainment industry has shown resilience through years of transformation and continues to reinvent how movies, TV, and games are made.

Talented creatives continue to produce remarkable content like Inside Out 2 (the highest-grossing animated film ever) and crossover content that moves from games to TV series, like The Last of Us and Fallout, all while looking for efficiencies in their pipelines.

We understand that content needs to be both compelling and profitable. That’s why we at Autodesk are committed to enhancing the performance and capabilities of our tools and to integrating artificial intelligence (AI) workflows that help automate tedious tasks, giving artists more space for creative iteration and storytelling.

AI is a hot topic in entertainment, perhaps more so than in any other industry. Yet beyond the hype, we need solutions that can augment pipelines without disrupting them, while keeping an eye on reinvention in the future.

Enhancing our core tools with AI

Autodesk is adding AI capabilities to its existing creative tools to help accelerate artist workflows:

  • Getting less noisy renders takes a lot of computing power, so we used AI in Arnold to address this: artists can now quickly denoise images every time they render a scene.
  • Let’s say an artist is working in Flame and wants to slow down a video. Using AI, Flame generates additional in-between frames to produce a much more realistic result.
  • In Maya, you now have the ML Deformer. The algorithm learns how a complex character moves just from the data in a scene, and animators can then pose the character in real time while the tool approximates the complex deformations for them (a rough sketch of the general idea follows this list).
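
Conceptually, an ML deformer trades an expensive rig evaluation for a small model that has been trained on example poses and the corresponding mesh deformations, and is cheap enough to evaluate every frame while an animator works. The sketch below shows that train-then-approximate pattern with plain NumPy and random placeholder data; it is only an illustration of the general idea, not Autodesk’s implementation.

```python
# Illustrative only: approximate expensive rig deformations with a simple
# learned model. The pose parameters and vertex offsets are random
# placeholders standing in for data exported from a scene.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_pose_params, n_vertices = 500, 20, 3000

# Training data: pose parameters (e.g. joint rotations) and the vertex
# offsets the full rig would produce for each of those poses.
poses = rng.normal(size=(n_samples, n_pose_params))
offsets = rng.normal(size=(n_samples, n_vertices * 3))

# Fit a ridge-regression approximator (a real ML deformer would use a
# neural network, but the train/evaluate pattern is the same).
lam = 1e-2
A = poses.T @ poses + lam * np.eye(n_pose_params)
W = np.linalg.solve(A, poses.T @ offsets)        # (n_pose_params, n_vertices * 3)

def approximate_deformation(pose: np.ndarray) -> np.ndarray:
    """Cheap stand-in for the full rig: one matrix multiply per evaluation."""
    return (pose @ W).reshape(n_vertices, 3)

# At animation time the approximation is fast enough to run while the
# animator poses or scrubs the character interactively.
new_pose = rng.normal(size=n_pose_params)
print(approximate_deformation(new_pose).shape)   # (3000, 3)
```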

Moving the industry forward with AI

Today, artists can use Wonder Studio to easily put characters into live-action scenes. At Autodesk University (AU) 2024, we are unveiling a powerful new AI capability in Wonder Studio, Motion Prediction, which estimates character poses even when the view of an actor is obstructed by an object. By anticipating movement, Motion Prediction produces more natural poses with less shaking and noise.
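
At its simplest, predicting through an occlusion means extrapolating the motion observed so far and smoothing the result so the pose does not jitter when tracking drops out and comes back. The toy below shows that idea with constant-velocity prediction and exponential smoothing on 2D joint positions; the data and the filter are placeholders chosen for illustration, not Wonder Studio’s model.

```python
# Illustrative only: fill in joint positions for frames where an actor is
# occluded by extrapolating recent motion, then smooth to reduce jitter.
import numpy as np

def fill_and_smooth(positions, visible, alpha=0.6):
    """positions: (frames, joints, 2) observed joint positions.
    visible:   (frames,) bool, False where the actor is occluded.
    Returns predicted, smoothed positions for every frame."""
    out = positions.astype(float).copy()
    velocity = np.zeros_like(out[0])
    for t in range(1, len(out)):
        if visible[t]:
            velocity = out[t] - out[t - 1]          # update observed motion
        else:
            out[t] = out[t - 1] + velocity          # constant-velocity prediction
        out[t] = alpha * out[t] + (1 - alpha) * out[t - 1]  # damp shaking
    return out

# Tiny example: 10 frames, 2 joints drifting right, occluded for frames 4-6.
frames, joints = 10, 2
track = np.stack([np.arange(frames) * 1.0, np.zeros(frames)], axis=-1)   # (10, 2)
truth = np.stack([track] * joints, axis=1)                               # (10, 2, 2)
observed = truth + np.random.default_rng(1).normal(0, 0.05, truth.shape)
visible = np.array([True] * 4 + [False] * 3 + [True] * 3)
print(fill_and_smooth(observed, visible)[4:7, 0])  # predicted while hidden
```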

Peter France, VFX Artist at Corridor Digital, an independent visual effects (VFX) studio, shared that “a lot of the time, as an independent artist, you want to make really ambitious projects […]. Wonder Studio is going to be an invaluable tool for these kinds of people where they can go after more of those ambitious ideas.”

Autodesk Research is also pushing the boundaries of what AI can do for artists. A good example is Neural Motion Control, a prototype that enables animators to direct a character’s actions using a handful of keyframes and a neural network. This can save animators significant time while maintaining the low-level control they are used to, letting them focus on crafting the character’s unique performance.
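
The premise, filling out motion from a handful of keyframes, can be illustrated with a tiny in-betweening model: given a normalized time and the two bracketing keyframe poses, predict the pose at that time. The PyTorch sketch below trains such a stand-in on synthetic sine-wave “motion” just to show the shape of the problem; it is an assumption-laden toy, not the research prototype.

```python
# Illustrative only: a toy keyframe in-betweening network. Inputs are a
# normalized time t and the two bracketing keyframe poses; the output is
# the predicted pose at t. Synthetic sine-wave data stands in for real clips.
import torch
import torch.nn as nn

pose_dim = 12                      # e.g. a few joint angles
net = nn.Sequential(
    nn.Linear(1 + 2 * pose_dim, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, pose_dim),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def synthetic_batch(batch=256):
    phase = torch.rand(batch, 1) * 6.28
    t = torch.rand(batch, 1)                       # position between the keyframes
    freqs = torch.arange(1, pose_dim + 1).float()
    pose = lambda x: torch.sin(phase + x * freqs)  # (batch, pose_dim)
    key_a, key_b = pose(torch.zeros_like(t)), pose(torch.ones_like(t))
    return torch.cat([t, key_a, key_b], dim=1), pose(t)

for step in range(500):                            # short demo training loop
    inputs, target = synthetic_batch()
    loss = nn.functional.mse_loss(net(inputs), target)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The "animator" supplies two keyframes; the network fills in the middle.
inputs, target = synthetic_batch(batch=1)
print(float(nn.functional.mse_loss(net(inputs), target)))
```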

Increasing efficiency with Autodesk Flow

Autodesk Flow, the industry cloud for Media and Entertainment, helps connect workflows, data, and teams across the entire production lifecycle, from the earliest concept to final delivery. At the heart of Flow is a common data model that underpins the new capabilities we are showcasing at AU.

  • The reimagined user experience in Flow Capture, our camera-to-cloud solution, puts assets front and center, right where they should be. Studios can now find their assets in a central place, organize them with a new drag-and-drop capability, and much more.
  • We are launching our new Flow Graph Engine API, which allows developers to run Bifrost graphs in the cloud and build custom compute solutions for their studio, creating an ecosystem where others can build on the Flow Graph Engine (a hypothetical job-submission sketch follows this list). DigitalFish, a company that develops novel tools for creating digital media and immersive content in media and entertainment, has already put the Flow Graph Engine API to work. They are building an XR workflow that leverages Apple’s iOS and visionOS to bring compute and pre-visualization of effects to the set. The workflow starts with the crew scanning the set to create a digital twin, then using the Flow Graph Engine to complete the mesh. The VFX artist can add 3D assets and VFX simulations that react to the digital twin. The visual effects are simulated in the cloud, and the director can see it all to communicate precisely how they will transform the scene. Now the actors can refine their performances, the cinematographer can compose the best camera angles, and the entire crew can interact with the 3D elements. We also built Autodesk’s Flow Retopology service (available in both Maya and 3ds Max 2025) on the Flow Graph Engine; it enables artists to offload the preparation of complex meshes for animation and rendering to the cloud.
  • We are enhancing artist workflows with new Flow features that connect to our desktop solutions, one of which is Flow Animating in Context for Maya. The creative intent of the editor is the heartbeat of any production, but it’s not visible to artists working in Maya. We will change this by connecting data from Flow Production Tracking to Maya and vice versa. Animators will be able to see the editorial timeline as they work and create better animations faster.
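
Because the Flow Graph Engine API is new, the snippet below only sketches the generic submit/poll/download pattern for running a graph as a cloud job. Every endpoint, field name, and file path in it is hypothetical and stands in for whatever the real API defines; refer to Autodesk’s documentation for actual usage.

```python
# Illustrative only: the typical submit/poll/download pattern for a cloud
# compute job. All URLs, payload fields, and tokens below are HYPOTHETICAL
# placeholders, not the real Flow Graph Engine API.
import time
import requests

BASE_URL = "https://example.invalid/graph-engine/v1"   # placeholder endpoint
HEADERS = {"Authorization": "Bearer <your-token>"}      # placeholder auth

def run_graph_job(graph_path: str, input_mesh_path: str) -> bytes:
    """Submit a Bifrost graph plus an input mesh, wait, and return the output."""
    with open(graph_path, "rb") as graph, open(input_mesh_path, "rb") as mesh:
        job = requests.post(
            f"{BASE_URL}/jobs",
            headers=HEADERS,
            files={"graph": graph, "input": mesh},
        ).json()

    # Poll until the cloud run finishes (a real client would back off and time out).
    while True:
        status = requests.get(f"{BASE_URL}/jobs/{job['id']}", headers=HEADERS).json()
        if status["state"] in ("succeeded", "failed"):
            break
        time.sleep(5)

    if status["state"] == "failed":
        raise RuntimeError(status.get("error", "job failed"))
    return requests.get(f"{BASE_URL}/jobs/{job['id']}/output", headers=HEADERS).content

# e.g. offloading mesh completion for a scanned set, as in the XR workflow above:
# result = run_graph_job("mesh_completion_graph.json", "set_scan.obj")
```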

We want to enable you to be more creative and make pipelines more efficient. We will continue to bring value to the products you use today and add even more value with Autodesk Flow and meaningful AI solutions.

Our industry will continue to transform and reinvent itself and we will be right alongside you.

Register for a digital pass



