DEV Community


Deploy Your App and Add an AI Agent to Ship Tickets, With Your Own API Key

You have a Node.js or Python project on GitHub. It runs locally. You want it live, and you want an AI agent that can pick up a ticket, write the code, and open a PR for you to review.

This guide walks through exactly that, using Open Source Cloud (OSC) as the infrastructure layer. You bring your own Git repo and your own Anthropic or OpenAI API key. OSC handles the hosting and wires the agent loop.

No Dockerfile required.


Prerequisites

  • A Git repo with a Node.js or Python web app (must run on a single port)
  • A GitHub (or GitLab, or any HTTPS git host) personal access token if the repo is private
  • An Anthropic API key (sk-ant-...) or OpenAI API key (sk-...)
  • An OSC account at app.osaas.io

Node.js apps need a start script in package.json. Python apps should expose a server on PORT (OSC injects this as an env var). The platform clones your repo and runs it directly. No build step is required if your app doesn't have one, though npm run build will be executed if the script exists.


Step 1: Deploy as a My App

Log in to app.osaas.io and navigate to My Apps in the sidebar.

Click Create New App. You'll see a modal with four fields:

  • Name: lowercase letters and numbers, no spaces (e.g., mywebapp)
  • Type: select Node.js or Python (Go, .NET, and WASM are also available)
  • Git URL: the HTTPS clone URL of your repo (e.g., https://github.com/you/mywebapp)
  • Git Token: your personal access token, if the repo is private

Leave Git Token blank for public repos. Hit Create.

OSC clones the repo, starts the runtime, and assigns a public URL. The app card shows a "Building..." spinner while the initial build runs. Once it flips to live, you'll see a URL in the format {hash}.apps.osaas.io. That's your app, publicly accessible over HTTPS.

If you push to the default branch and want to pick up the changes, use the Rebuild action on the app card.


Step 2: Verify the deployment

Click the URL on the app card. Your app should respond.

If it doesn't, check the logs. Click the app card to expand it, then open View Logs. The most common issues:

  • App binds to a hardcoded port instead of process.env.PORT (Node.js) or os.environ['PORT'] (Python). OSC injects PORT=8080, so your server must listen on whatever PORT says.
  • Missing dependencies in package.json or requirements.txt. OSC runs npm install or pip install -r requirements.txt automatically, so make sure the file is committed.
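The port issue is usually a one-line fix: derive the port from the environment instead of hardcoding it. A sketch of the pattern (the 8080 fallback is an assumption for local runs; any free port works):

```javascript
// Read the injected port instead of hardcoding one.
// On OSC, process.env.PORT is set to "8080"; locally it is usually unset.
const port = Number(process.env.PORT) || 8080;

// Then pass it to whatever server you use, e.g.:
//   server.listen(port)   (Node http)
//   app.listen(port)      (Express)
// Python equivalent: int(os.environ.get("PORT", 8080))
console.log(`resolved port: ${port}`);
```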

Once the app responds correctly, move on.


Step 3: Configure your AI credentials

Navigate to Dashboard > Agent Tasks in the left sidebar. If you haven't configured agent credentials yet, you'll see a setup card.

The setup card has two dropdowns:

  • Runtime: choose Claude (Anthropic) or Codex (OpenAI)
  • Credential Type: for Claude, choose Anthropic API Key and paste your sk-ant-... key. For Codex, choose OpenAI API Key and paste your sk-... key.

Your key is stored encrypted in a Kubernetes secret scoped to your tenant. OSC never transmits it to third parties beyond the AI provider you chose. Submit the form.

This is a one-time step. The credentials apply to all your apps.


Step 4: Enable Agentic SDLC on your app

Go back to My Apps. The app table now has an Agentic SDLC column. The toggle for your app should be clickable (it was grayed out before you saved credentials).

Click the toggle. An Enable Agentic SDLC confirmation dialog opens; confirm for your app.

OSC provisions a Gitea instance for your account (if one doesn't exist already) and initializes a ticket repository inside it. A setup agent runs once: it inspects your app's source code, creates a CLAUDE.md at the repo root describing your project's conventions, and adds agent definitions under .claude/agents/. This takes a minute or two. The toggle shows "Setting up..." while it runs.

When the status flips to Active, Agentic SDLC is live. From this point, requests you submit through the dashboard drive the agent pipeline.

If your app is on GitHub, you can also configure a GitHub token so the implementation agent opens PRs on your GitHub repo directly. The setup dialog prompts for this if needed.


Step 5: Open a ticket and watch it ship

In My Apps, click your app's name to open its detail page. You'll land on the Requests tab. Click + New request.

The form asks: "What do you want to change or fix?" Write it in plain English, with an optional details field for more context. Something like:

What do you want to change or fix?
Add a /healthz endpoint

Details (optional):
Add a GET /healthz route that returns { "status": "ok" } with HTTP 200.
This endpoint should not require authentication.

Submit the request. Within a minute or two, you'll see activity in the Requests table:

  1. A triage agent reads the request and decides whether to decompose it into sub-tasks. For something this small it will create a single sub-ticket and post a decomposition comment for you to review.
  2. An implementation agent picks up the sub-ticket, clones your app repo, writes the code, and opens a PR. The PR references the ticket number.
  3. A reviewer agent reads the diff, checks that it actually satisfies the acceptance criteria, and posts a verdict comment on the PR.

Your job: review the PR, run your test suite if you have one, and merge if it looks good.

The agents never merge PRs on your behalf. That gate stays with you.


What you own

At the end of this setup, your code stays in your Git repo (GitHub, GitLab, or OSC's built-in Gitea). No code is stored by OSC outside of what you explicitly push. Your AI account does the work, billed directly by Anthropic or OpenAI at their standard rates, with no vendor markup on tokens. OSC's infrastructure runs on standard Kubernetes and open protocols, so there's no proprietary lock-in if you ever want to move.

The Creator plan on OSC costs 15 EUR/month and includes My Apps hosting plus the Agentic SDLC feature. See pricing details at app.osaas.io.


Where to go from here

The My Apps feature supports Node.js, Python, Go, .NET, and WASM. Each runtime uses the same deploy-from-git model.

If you want to connect AI agents to the broader OSC infrastructure (databases, managed services, other deployments), the MCP tools give agents 40+ tools to provision and wire services through a conversation. The Streaming Tech TV+ example in the OSC examples gallery shows what this looks like at scale: a production streaming service, 15,000+ lines of code, 13 OSC services, built in 36 hours with one human directing six AI agents.

But you don't need to start there. A Git repo and an API key are enough.
