How animators and AI researchers made ‘Dear Upstairs Neighbors’
- Google DeepMind debuts ‘Dear Upstairs Neighbors’ animated short at Sundance using novel video-to-video generative workflows.
- Researchers developed fine-tuned Veo and Imagen models to preserve specific artistic styles and character consistency.
- Professional animators used rough 3D and 2D layouts to guide AI generations for frame-by-frame control.
Google DeepMind has unveiled "Dear Upstairs Neighbors," an animated short premiering at the Sundance Film Festival that bridges the gap between traditional storytelling and generative AI. Instead of relying on unpredictable text prompts, the production team—which includes Pixar veterans—prioritized artistic control through specialized research tools. By fine-tuning AI models on custom concept art, the creators ensured that characters like Ada remained consistent across scenes, even handling complex 2D design rules that typically clash with 3D perspectives.
The breakthrough lies in a video-to-video workflow. Animators first created rough layouts in standard tools like Maya or TVPaint, which acted as a visual blueprint. Google’s generative models then transformed these sketches into high-fidelity, expressionist frames that followed the exact timing and motion intended by the artists. This hybrid approach allowed for "dailies" reviews—a standard film production practice—where specific regions of a video could be refined or masked without re-generating the entire shot from scratch.
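The masked-refinement idea can be illustrated with a generic alpha-composite over the frames of a shot; this is a minimal sketch, not DeepMind's actual tooling, and `refine_region` and its arguments are hypothetical names. Only pixels inside the mask take values from the regenerated pass; everything outside it keeps the already-approved footage, which is why the rest of the shot never needs to be re-generated:

```python
import numpy as np

def refine_region(frames, regenerated, mask):
    """Blend a regenerated pass into existing frames, only where mask == 1.

    frames, regenerated: lists of HxWx3 float arrays (the shot, frame by frame)
    mask: HxW array in [0, 1] marking the region flagged in dailies review
    """
    m = mask[..., None]  # broadcast the 2D mask over the color channels
    return [old * (1.0 - m) + new * m for old, new in zip(frames, regenerated)]

# Toy 3-frame, 4x4 "shot": the approved frames are all zeros, the
# regenerated pass is all ones, and the mask covers only the top-left
# 2x2 region that needs another take.
original = [np.zeros((4, 4, 3)) for _ in range(3)]
redo = [np.ones((4, 4, 3)) for _ in range(3)]
mask = np.zeros((4, 4))
mask[:2, :2] = 1.0

out = refine_region(original, redo, mask)
```

With a soft (feathered) mask the same blend produces a gradual seam between the kept and regenerated regions, which is how compositors typically hide the boundary.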
The project highlights a shift toward artist-first AI, where models serve as sophisticated technical assistants rather than autonomous creators. The production concluded with the use of a 4K upscaling model to prepare the film for the big screen, a feature Google plans to bring to its Vertex AI and AI Studio platforms soon. By treating AI as a layer of digital paint, DeepMind demonstrates a future where human intuition drives the machine's creative output.
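As a rough point of reference for the final delivery step, the sketch below upscales a 1080p frame to UHD 4K (3840×2160) with a plain Lanczos resample via Pillow. This is only a classical-filter stand-in for a learned super-resolution model like the one described above; `upscale_to_4k` is a hypothetical name:

```python
from PIL import Image

def upscale_to_4k(frame: Image.Image) -> Image.Image:
    # Classical resampling placeholder for a learned 4K upscaler:
    # double each dimension of a 1920x1080 frame to UHD 4K.
    return frame.resize((3840, 2160), Image.LANCZOS)

frame = Image.new("RGB", (1920, 1080), "gray")  # stand-in for a rendered frame
big = upscale_to_4k(frame)
```

A learned upscaler differs in that it hallucinates plausible detail rather than interpolating existing pixels, which is what makes it suitable for theatrical projection.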