Today my mission was to deploy a Strapi CMS that would power the content for my portfolio website.
I wanted to create the fastest and most efficient development workflow possible. So I looked for a cloud solution that could load and deploy Docker images straight out of the box.
That is when I discovered Render, a cloud platform that builds Docker images directly from a GitHub repository. This allows you to run the exact same versions locally and in production, in my case Strapi v5 and PostgreSQL 16.
After some back and forth with Grok from xAI, while sitting on the toilet, I figured out exactly how to use Render with a Render configuration file. This file defines your server structure in a clean and organised way. It plays a role similar to a Docker Compose file, but is built specifically for Render, so you can create reusable blueprints directly from GitHub.
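To give an idea, a minimal blueprint for a Dockerized Strapi service with a managed PostgreSQL 16 database can look roughly like this. The service names and plan values below are placeholders, so check Render's blueprint reference for the exact keys your setup needs:

```yaml
# Sketch of a render.yaml blueprint; names and plan values are placeholders.
services:
  - type: web
    name: strapi-cms
    runtime: docker          # build the repo's Dockerfile on Render
    plan: starter
    envVars:
      - key: NODE_ENV
        value: production
      - key: DATABASE_URL
        fromDatabase:
          name: strapi-db
          property: connectionString

databases:
  - name: strapi-db
    plan: basic-256mb
    postgresMajorVersion: "16"
```

Commit this next to your Dockerfile and Render picks both up from the same repository.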
Once I understood the workflow, I created a new project and asked my good friend Cursor to generate a full Strapi v5 project with a Dockerfile. Honestly, who still pieces everything together manually when Cursor with Opus 4.5 can create it in seconds? I loaded the files and did not touch Google once.
After Cursor generated the project, I started debugging. I checked the logs and verified that the Docker container was running properly. With a bit of prompting and tweaking, I had a fully working Strapi v5 setup with a Dockerfile and a Render configuration file in under ten minutes.
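For context, a Strapi v5 Dockerfile typically looks roughly like the sketch below. The Node version and npm scripts are assumptions based on a default Strapi project, not my exact file:

```dockerfile
# Sketch of a Strapi v5 Dockerfile; Node version and scripts are assumptions.
FROM node:20-alpine

WORKDIR /opt/app

# Install dependencies first so this layer is cached between builds.
COPY package.json package-lock.json ./
RUN npm ci

# Copy the source and build the Strapi admin panel.
COPY . .
ENV NODE_ENV=production
RUN npm run build

EXPOSE 1337
CMD ["npm", "run", "start"]
```

Locally you can then build and run it with `docker build -t strapi-local .` followed by `docker run --env-file .env -p 1337:1337 strapi-local`.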
Pro tip: Import the documentation of your tech stack into Cursor to give it more context. Think Render docs, Strapi v5 docs, and anything else you use. Feed everything in.
After testing everything locally, I uploaded the blueprint to Render. When the deployment began, a few errors popped up. Normally you would copy the logs into Cursor by hand, but today we have something much better: Render provides an MCP server that you can connect directly to Cursor, so Cursor can fetch deployment logs automatically and understand exactly what went wrong.
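Hooking it up is just a small entry in Cursor's MCP configuration (`~/.cursor/mcp.json`). I am writing the URL and header shape from memory, so verify them against Render's MCP documentation before copying:

```json
{
  "mcpServers": {
    "render": {
      "url": "https://mcp.render.com/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_RENDER_API_KEY>"
      }
    }
  }
}
```

Once Cursor restarts, it can call the Render tools itself instead of you pasting logs around.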
It worked beautifully, and with only a few prompts my production Dockerfile was fixed. MCP servers are absolutely brilliant. Yes, there are some risks, but I am not working with sensitive data right now, so who cares.
With Render running my Strapi v5 CMS, the real work began. I created entries and linked them to my existing Next.js portfolio. I structured everything based on my current site: projects, jobs, tech stacks, all the objects I use, plus the single pages like About and Home.
Then came a new challenge. I needed to build an API integration for my portfolio. Did I really want to manually inspect all endpoints? Of course not. I would not be Tijmen if I did not look for an MCP server. And guess what? There is a Strapi v5 MCP server that you can install on your local Strapi instance and expose to other projects. It only works on localhost, but it was exactly what I needed.
This allowed my portfolio to retrieve all data automatically. The MCP server knew all my entries and pages, giving Cursor full context for building my API integration.
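On the Next.js side, pulling a collection out of Strapi v5's REST API then looks roughly like this sketch. The `projects` collection and the `STRAPI_URL` variable are placeholders for my own setup:

```typescript
// Sketch of a tiny Strapi v5 REST client; collection names and env vars are placeholders.
const STRAPI_URL = process.env.STRAPI_URL ?? "http://localhost:1337";

// Strapi v5 returns documents directly on `data` (no nested `attributes` like v4).
type StrapiResponse<T> = {
  data: T[];
  meta: { pagination?: { page: number; pageSize: number; total: number } };
};

// Build the REST endpoint for a collection, populating relations by default.
function strapiEndpoint(collection: string, populate = "*"): string {
  return `${STRAPI_URL}/api/${collection}?populate=${encodeURIComponent(populate)}`;
}

// Fetch all entries of a collection, e.g. fetchCollection<Project>("projects").
async function fetchCollection<T>(collection: string): Promise<T[]> {
  const res = await fetch(strapiEndpoint(collection));
  if (!res.ok) throw new Error(`Strapi request failed: ${res.status}`);
  const body = (await res.json()) as StrapiResponse<T>;
  return body.data;
}
```

In a Next.js server component you can await this directly; if you are still on Strapi v4, you would unwrap the nested `attributes` object instead.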
Conclusion: If you want to move fast today, use Cursor with MCP servers and choose a cloud platform that integrates with your development environment. You stay in full control as a developer while using modern tools to reach your goals as quickly as possible.
This is my first post but I will definitely write more in the coming months about this learning journey!