Key Takeaways
• Google has partnered with Sphere Entertainment to bring The Wizard of Oz to life as an immersive, AI-enhanced experience at the Las Vegas Sphere.
• The project uses generative AI models—Veo 2, Imagen 3, and Gemini—to upscale, expand, and recompose scenes from the original 1939 film.
• No new dialogue or music has been added; the experience is built entirely from the original film and archival materials.
• Over 1.2 petabytes of data were processed using Google’s AI infrastructure to deliver 16K visuals across the Sphere’s 160,000-square-foot screen.
• The initiative signals a significant milestone in entertainment technology, blending historical preservation with advanced AI innovation.
Nearly a century after its original release, The Wizard of Oz is returning in a revolutionary new format.
Thanks to a first-of-its-kind collaboration between Google and Sphere Entertainment Co., the 1939 classic is being transformed into a generative AI-powered immersive experience that will premiere on August 28, 2025, at the Las Vegas Sphere.
The venue’s massive 160,000-square-foot LED screen—the most advanced display of its kind—is set to host a version of Oz that retains the original content but amplifies it through technology never before used at this scale in cinema.
A Partnership That Pushes Boundaries
Google Cloud and DeepMind, along with Sphere Studios, Magnopus, and Warner Bros. Discovery, are combining archival preservation with machine learning innovation to deliver the project. Leaders on both sides, Sphere Entertainment Executive Chairman and CEO Jim Dolan and Google Cloud CEO Thomas Kurian, say every element has been approached with technical rigor and creative respect.
Generative AI at the Core
The project uses a trio of powerful AI tools—Veo 2, Imagen 3, and Gemini—to intelligently enhance, extend, and adapt footage from the original film.
Key AI Features in Use:
• Super Resolution: Converts grainy 35mm film into crisp 16K ultra-HD visuals.
• AI Outpainting: Seamlessly extends frames to fill the Sphere’s curved screen.
• Performance Generation: Allows characters to stay onscreen longer than traditional editing permits.
• Long-Context Coherence: Ensures scene continuity through extended sequence processing.
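The article doesn't describe how these models work internally, but the outpainting idea, extending a frame beyond its original borders to fill a wider canvas, can be illustrated with a deliberately naive sketch. The `naive_outpaint` helper below is hypothetical and merely mirrors edge pixels; generative systems like Veo 2 synthesize new, context-aware imagery instead.

```python
import numpy as np

def naive_outpaint(frame: np.ndarray, pad: int) -> np.ndarray:
    """Extend a frame on all sides by mirroring its border pixels.

    A trivial, non-generative stand-in for AI outpainting: it only
    reflects existing content, whereas a generative model would
    invent plausible new scenery beyond the original frame.
    """
    # Pad height and width; leave the color channel untouched.
    return np.pad(frame, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")

# A tiny 4x6 RGB "frame" extended by 2 pixels on every side.
frame = np.arange(4 * 6 * 3, dtype=np.uint8).reshape(4, 6, 3)
extended = naive_outpaint(frame, pad=2)
print(extended.shape)  # (8, 10, 3)
```

The original frame survives unchanged in the center of the extended canvas, which mirrors the project's preservation-first constraint: the new pixels surround, rather than replace, the source footage.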
A Preservation-First Philosophy
The development team made an intentional decision to preserve the film’s original integrity—no new dialogue, no added music. According to Buzz Hays, Google Cloud’s global lead for entertainment solutions, every enhancement was derived from extensive archival research, drawing on:
• Original shooting scripts
• Set design blueprints
• Production illustrations
• Still photography
• Camera specifications and focal lengths
These references were used to fine-tune the AI models and preserve visual consistency across the enhanced version.
Behind the Scenes: A Data-Driven Feat
To bring this experience to life, Google deployed a vast technical infrastructure that has processed 1.2 petabytes of data to date. This includes:
• Google’s custom Tensor Processing Units (TPUs)
• Google Kubernetes Engine (GKE)
• AI-optimized multi-cloud processing clusters
Among the Google DeepMind researchers involved are Dr. Steven Hickson and principal scientist Dr. Irfan Essa.
Cultural and Industry Impact
This project marks a critical moment in the evolution of immersive media. By focusing on adaptive expansion—rather than remaking or modernizing the content—Google and Sphere demonstrate how legacy films can be preserved and experienced in new ways without compromising their originality.