Continuing to fulfill its promise to bring third-party AI models into the Firefly ecosystem, Adobe has added Luma AI’s newest generative video model, Ray3, to the platform.
Adobe Firefly is Luma AI’s first launch partner for Ray3, and the model will be available exclusively on Luma AI’s Dream Machine platform and Adobe Firefly for the next couple of weeks. Luma AI made headlines last November after revamping Dream Machine.
Alongside Firefly’s own generative video model, which launched last October to mixed reviews, Luma AI’s Ray3 enables users to generate video clips from scratch. Ray3 promises “cinematic, high-quality video” clips up to 10 seconds long and is among the very first AI video models to support native high dynamic range (HDR) video generation.
“Built on a new multimodal reasoning system, Ray3 can better understand the user’s creative intent and deliver production-ready assets in a wide variety of complex circumstances, including motion blur, crowds, preserved anatomy, physics simulations, and world exploration,” Adobe writes.
Luma AI’s CEO and co-founder, Amit Jain, says Ray3 was designed with cinematic storytellers and filmmakers in mind.
“Ray3 was built with filmmakers, storytellers, and creators at the center, and this integration with Adobe Firefly makes that vision real,” Jain says. “By combining Adobe’s creative app ecosystem with the intelligence of Ray3, we are giving creators the ability to move from idea to cinematic video in seconds with professional-grade quality and control. This partnership isn’t just about faster workflows; it is about unlocking entirely new ways to imagine and tell stories.”
Inside Firefly, users can quickly generate b-roll or background footage with Ray3 using simple text prompts. Adobe and Luma AI argue that this is a good use case for content creators who need to fill in gaps or add a bit more narrative depth to their videos, particularly for social media content.
Filmmakers can use Ray3 in Firefly Boards to explore visual directions and storyboard before heading out to capture their desired real-world footage.
“It’s easy to generate environments, shot compositions and camera perspectives while storyboarding or planning scenes,” Adobe writes. “Whether you’re imagining a chase through a bustling city or a quiet moment in a remote landscape, Ray3 helps you prototype your vision quickly.”
Everything created using Ray3 inside Firefly can be synced with Adobe Creative Cloud, allowing users to bring generated content into other Adobe apps, such as Premiere Pro, for further refinement and editing.
Like other content generated inside Firefly, videos created with Ray3 include Content Credentials, so viewers can see which AI model was used to make them.
Luma AI’s Ray3 model is now available inside Adobe Firefly. Paid Firefly and Creative Cloud Pro subscribers receive unlimited generations using Ray3 until October 1, after which generative credit usage will apply as usual.
Image credits: Adobe, Luma AI