Adobe just consolidated over 30 AI models into a single app. Google, OpenAI, Runway, and Adobe's own — all accessible from one interface. You can generate with one model, refine with another, compare outputs, and edit the result with Photoshop-grade tools. Without leaving the workspace.
The biggest announcement is custom models, now in public beta. Train Firefly on your own images and it learns your visual style. Upload your portfolio and it reproduces your look.
Video tools are in too. Quick Cut builds a first cut from raw footage automatically. The image editor now has generative fill, remove, expand, upscale, and background removal all in a single workspace.
The interesting one: Design Intelligence
They quietly added something to Illustrator that's worth paying attention to. Firefly learns the visual rules of your brand and generates consistent, on-brand content. That's not just a creative tool. That's a design system with a brain.
The moat is shifting
The moat in creative tools used to be features. Photoshop had layers, Figma had collaboration, Canva had simplicity. Adobe is now arguing the moat is how many models you can access, and how good the editing tools wrapped around them are.
Every standalone AI image generator should be watching this. If Adobe offers 30 models in one place with professional editing on top, the standalone generators become commodities. The editing layer becomes the product.
This is the same pattern we saw with cloud. AWS didn't win because S3 was the best individual service. They won because they had everything in one place.
Adobe is betting the creative tool market works the same way.
Do you think bundling 30 AI models into one platform is the future, or will specialists always beat the all-in-one?