Importing existing projects into Diploi has been possible for some time. The first versions were limited to simple Node.js and Lovable projects and, to be honest, were not very good even at that.
Initially, we used Railpack by Railway to detect languages and technologies. That helped, but the real breakthrough came when we started using AI for the harder parts of the import process. There is still room to improve, but we believe Diploi now handles cases that no other service can.
The challenge
The problem is massive: How do you import and run just about any GitHub repository, ideally with no configuration upfront?
A repository might be:
- a frontend-only project
- a frontend and backend application
- a monorepo
- a project with multiple microservices
- a setup with databases, background workers, or other supporting services
To make this work, Diploi needs to detect technologies, ports, hosts, container start commands, environment variables, and service relationships. It then needs to map all of this into Diploi components that can run the actual containers and make them talk to each other.
One thing that makes this especially interesting is that Diploi runs both development and production workloads. Development environments often work quite differently from production. They may need hot module reloading, editable source code, different commands, or different assumptions than a prebuilt production app.
Why Diploi is well suited for AI-powered imports
There are a few things that make Diploi especially well suited for this task.
First, Diploi has precisely defined components for many popular languages and frameworks. This makes it easier for AI to reason about what a project needs and how it should be mapped.
Second, most of the infrastructure setup is controlled through a single diploi.yaml file. This file defines hosts, ports, environment variables, container commands, and relationships between components. In many ways, this is exactly the kind of structured configuration that AI is good at producing.
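To give a feel for the kind of structured output involved, a component definition might look something like the sketch below. The field names here are illustrative assumptions, not Diploi's actual schema; consult the Diploi documentation for the real format.

```yaml
# Illustrative sketch only -- field names are assumptions,
# not the real diploi.yaml schema.
components:
  - name: frontend
    ports:
      - 3000
    env:
      API_URL: http://backend:8080
    command: npm run dev   # development; production would serve a built artifact
  - name: backend
    ports:
      - 8080
    env:
      DATABASE_URL: postgres://db:5432/app
```

Producing a small, well-defined document like this from messy repository evidence is a constrained generation task, which is far more tractable for AI than open-ended infrastructure scripting.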
We started experimenting with AI-powered imports in March, and the initial tests were promising. By analyzing package files, Docker Compose files, Dockerfiles, READMEs, folder structures, and other project files, AI turned out to be remarkably capable of understanding how a project should run on Diploi.
Our work has mostly been to provide context to AI, add hints for tricky situations, write instructions for our components, and fine-tune the process through automated tests. We also made many improvements to the components themselves so they could handle a wider range of real-world use cases. Our initial benchmark was the list of every public repository that had ever been imported into Diploi.
At the start of April 2026, we released the new import flow to production; even then it was already a massive improvement over the previous version.
Where we are now
This feature is by no means perfect.
Even when the architecture is correctly detected and mapped to Diploi, there may still be extra steps before an application runs 100% correctly. Some projects require migrations, manual setup steps, or secret keys that need to be configured. In some cases, Dockerfiles might need to be modified, which is something we currently do not allow the AI to do.
We also do not support every language, framework, or version yet, but we're constantly improving and expanding what Diploi can run.
Comparison
When we benchmarked this against other services, we could not find another service that could import and run even modestly complex projects in the way described above. Many platforms, such as Netlify and Vercel, are excellent at running frontend applications, but we have not found any that really try to handle full-stack projects, monorepos, development environments, production environments, databases, and service relationships in a single import flow.
| Service | What we found |
|---|---|
| Vercel | Excellent for frontend only, especially Next.js and Lovable-style imports, which run quickly and reliably. Backend is handled through serverless functions rather than containers, so container-based backends and full-stack monorepos with their own services are not really in scope. We could not find clear support for importing arbitrary public repos, or for provisioning databases, in the import flow itself. |
| Netlify | Strong for static sites, Next.js and serverless functions, with an integrated database option (Neon). Frontend-only imports such as Lovable projects worked well in our tests, but we did not find support for importing monorepos, container-based backends, Python apps, or full-stack projects with multiple services. |
| Railway | Has an import flow that detects and launches monorepos, but in our tests the experience was hard to follow: little visible feedback on what was detected, where the deployed site lived, or why a build had failed. More complex repos failed without clear indication of the cause, which made it hard to evaluate how far the import actually goes. |
| Fly.io | Does not really offer a magical import flow in the same sense. In our tests we could not get even a static site to run end-to-end through an import; it appears to assume you are designing for Fly.io specifically rather than handing it an arbitrary repo. |
If we have missed something, let us know.
And if this sounds interesting, try importing one of your own projects into Diploi. We would love to hear how it works for you.