<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Anup Singh</title>
    <description>The latest articles on DEV Community by Anup Singh (@anup_singh_ai).</description>
    <link>https://dev.to/anup_singh_ai</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3865110%2F47cfe8e1-8bbe-4934-ab2b-d24ba30a5ee3.png</url>
      <title>DEV Community: Anup Singh</title>
      <link>https://dev.to/anup_singh_ai</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/anup_singh_ai"/>
    <language>en</language>
    <item>
      <title>How to give each user their own AI agent environment - no infra needed</title>
      <dc:creator>Anup Singh</dc:creator>
      <pubDate>Tue, 07 Apr 2026 07:23:31 +0000</pubDate>
      <link>https://dev.to/anup_singh_ai/how-to-give-each-user-their-own-ai-agent-environment-no-infra-needed-100d</link>
      <guid>https://dev.to/anup_singh_ai/how-to-give-each-user-their-own-ai-agent-environment-no-infra-needed-100d</guid>
      <description>&lt;p&gt;If you're building an AI agent, you've probably hit this wall:                                                                                            &lt;/p&gt;

&lt;p&gt;User A needs their own files. User B needs their own database. User C needs their own compute. And none of them should see each other's data.&lt;/p&gt;

&lt;p&gt;So you start building. S3 for storage. A database per user. Docker containers. Isolation logic. Scaling. Pause/resume when users go idle.&lt;/p&gt;

&lt;p&gt;Suddenly it's three months later and you haven't written any agent logic.                                                                                 &lt;/p&gt;

&lt;p&gt;I hit this exact wall building my AI coding agent (100k users). Built the infra. Took months. Then I realized every AI agent startup builds the same thing.&lt;/p&gt;

&lt;p&gt;So I turned it into a product: &lt;a href="https://oncell.ai" rel="noopener noreferrer"&gt;https://oncell.ai&lt;/a&gt;                                                                                                        &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What you get&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One API call. Each user gets their own isolated environment with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;File storage — read, write, list, delete. Persistent across sessions.
&lt;/li&gt;
&lt;li&gt;Database — key-value store. Instant.&lt;/li&gt;
&lt;li&gt;Search — text search across all files in the environment.&lt;/li&gt;
&lt;li&gt;Shell — run any command.
&lt;/li&gt;
&lt;li&gt;Live URL — {id}.cells.oncell.ai serves files from the environment.&lt;/li&gt;
&lt;li&gt;Observability — workflow journal, logs, metrics. Built in.
&lt;/li&gt;
&lt;/ul&gt;
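&lt;p&gt;To make the isolation guarantee above concrete, here's a minimal in-memory sketch (purely an illustration, not OnCell's implementation; the createCell helper and its shape are hypothetical):&lt;/p&gt;

```javascript
// Minimal illustration of per-user isolation: each cell owns its own
// file map and key-value store, and a cell handle only closes over
// its own state.
function createCell(customerId) {
  const files = new Map(); // per-cell file storage
  const kv = new Map();    // per-cell key-value database
  return {
    customerId: customerId,
    store: {
      write: function (path, content) { files.set(path, content); },
      read: function (path) { return files.get(path); },
      list: function () { return Array.from(files.keys()); },
    },
    db: {
      get: function (key) { return kv.get(key); },
      set: function (key, value) { kv.set(key, value); },
    },
  };
}

const cellA = createCell("user-a");
const cellB = createCell("user-b");
cellA.store.write("index.html", "Hello from user A");

console.log(cellA.store.list()); // only A's own files
console.log(cellB.store.list()); // [] -- B never sees A's files
```

&lt;p&gt;Because each handle closes over its own maps, there is simply no code path from one cell's API to another cell's data.&lt;/p&gt;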

&lt;p&gt;&lt;strong&gt;Install&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install @oncell/sdk
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Create an environment with your agent code&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { OnCell } from "@oncell/sdk";

const oncell = new OnCell({ apiKey: process.env.ONCELL_API_KEY });

const cell = await oncell.cells.create({
  customerId: "user-123",
  tier: "starter",
  agent: `
    module.exports = {
      async generate(ctx, params) {
        // This runs INSIDE the user's environment

        // Search existing files for context (local — 0ms)
        const context = ctx.search.query(params.instruction);

        // Call your LLM (only network call)
        const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
          method: "POST",
          headers: {
            "Authorization": "Bearer " + process.env.API_KEY,
            "Content-Type": "application/json",
          },
          body: JSON.stringify({
            model: "google/gemini-2.5-flash",
            messages: [{ role: "user", content: params.instruction }],
          }),
        });
        const data = await res.json();
        const code = data.choices[0].message.content;

        // Write to storage (local — 0ms)
        ctx.store.write("index.html", code);

        // Save to database (local — 0ms)
        const history = ctx.db.get("history") || [];
        history.push({ instruction: params.instruction, ts: Date.now() });
        ctx.db.set("history", history);

        // Log the action
        ctx.journal.step("generate", "Wrote index.html");

        return { code, files: ctx.store.list() };
      }
    };
  `,
  secrets: {
    API_KEY: process.env.OPENROUTER_KEY,
  },
});

console.log(cell.previewUrl);
// https://dev-abc--user-123.cells.oncell.ai
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;That's it. The environment is live. The agent code is loaded. Secrets are injected as process.env — never written to disk, never visible in file listings.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Send requests to the agent&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const result = await oncell.cells.agentRequest("user-123", "generate", {
  instruction: "Build a pricing page with 3 tiers"
});

console.log(result.code);   // full HTML
console.log(result.files);  // ["index.html"]
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;The agent ran inside the user's environment. It searched existing files, called the LLM, wrote the result to storage, and saved conversation history to the database. All locally — no network round-trips for file or database operations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3: View the result&lt;/strong&gt;                     &lt;/p&gt;

&lt;p&gt;Open cell.previewUrl in a browser. The environment serves index.html automatically.                                                                     &lt;/p&gt;

&lt;p&gt;Every user gets their own URL. User A's files never touch User B's environment.&lt;/p&gt;
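&lt;p&gt;The {id}.cells.oncell.ai scheme implies routing by subdomain. A hypothetical sketch of that lookup (the parsing rule here is an assumption, not necessarily OnCell's actual router):&lt;/p&gt;

```javascript
// Hypothetical subdomain routing for preview URLs: extract the cell id
// from a hostname of the form "{id}.cells.oncell.ai".
function cellIdFromHost(hostname) {
  const suffix = ".cells.oncell.ai";
  if (!hostname.endsWith(suffix)) return null;
  return hostname.slice(0, hostname.length - suffix.length);
}

console.log(cellIdFromHost("dev-abc--user-123.cells.oncell.ai")); // "dev-abc--user-123"
console.log(cellIdFromHost("example.com")); // null
```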

&lt;p&gt;&lt;strong&gt;Step 4: Follow-up requests&lt;/strong&gt;                                                                                                                              &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const result2 = await oncell.cells.agentRequest("user-123", "generate", {
  instruction: "Make it dark mode with purple accents"
});
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;The agent reads the existing code from storage, gets conversation history from the database, and applies the edit. State persists across requests.        &lt;/p&gt;
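&lt;p&gt;The read-modify-write history pattern the agent relies on can be shown with a plain in-memory stand-in for ctx.db (the stand-in is hypothetical; in production the platform provides ctx):&lt;/p&gt;

```javascript
// In-memory stand-in for ctx.db, showing how conversation history
// accumulates across requests via read-modify-write.
const storage = new Map();
const db = {
  get: function (key) { return storage.get(key); },
  set: function (key, value) { storage.set(key, value); },
};

function recordInstruction(instruction) {
  const history = db.get("history") || []; // read (empty on first request)
  history.push({ instruction: instruction, ts: Date.now() }); // modify
  db.set("history", history); // write back
  return history;
}

recordInstruction("Build a pricing page with 3 tiers");
recordInstruction("Make it dark mode with purple accents");

console.log(db.get("history").length); // 2
```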

&lt;p&gt;&lt;strong&gt;What the agent has access to&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Your agent code receives a ctx object:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ctx.store.write(path, content)    // file storage
ctx.store.read(path)              // read file
ctx.store.list()                  // list all files
ctx.db.get(key)                   // read from database
ctx.db.set(key, value)            // write to database
ctx.search.query(text)            // search files
ctx.shell(cmd)                    // run shell command
ctx.journal.step(type, msg)       // log workflow step
ctx.stream(data)                  // SSE streaming to client
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;All local to the environment. Zero latency for storage, database, and search operations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What about idle users?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Environments pause automatically after 15 minutes of inactivity:                                                                                        &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Active: your tier rate ($0.10 - $0.50/hr)&lt;/li&gt;
&lt;li&gt;Paused: $0.003/hr (just storage)
&lt;/li&gt;
&lt;li&gt;Resume: 200ms (data cached on NVMe)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You don't manage this. It just happens.                                                                                                                 &lt;/p&gt;
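&lt;p&gt;With the listed rates, the economics of pausing are easy to estimate. A rough sketch, assuming simple per-hour billing at the $0.10/hr starter rate (the billing granularity is an assumption beyond the numbers quoted above):&lt;/p&gt;

```javascript
// Rough monthly cost for one cell, using the rates listed above:
// active at $0.10/hr (starter tier), paused at $0.003/hr.
// Assumes simple per-hour billing -- an illustration, not a quote.
function monthlyCost(activeHoursPerDay, activeRate, pausedRate, days) {
  const activeCost = activeHoursPerDay * activeRate * days;
  const pausedCost = (24 - activeHoursPerDay) * pausedRate * days;
  return Math.round((activeCost + pausedCost) * 100) / 100; // round to cents
}

// A user active 1 hour/day for 30 days:
console.log(monthlyCost(1, 0.10, 0.003, 30)); // 5.07
```

&lt;p&gt;Without the pause, the same month at the active rate would run 24 × 30 × $0.10 = $72 — most of the savings comes from the 23 idle hours a day billed at the storage rate.&lt;/p&gt;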

&lt;p&gt;&lt;strong&gt;What about security?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Each environment runs in its own process with resource limits
&lt;/li&gt;
&lt;li&gt;Secrets injected as environment variables, never on disk
&lt;/li&gt;
&lt;li&gt;File listings never expose agent code&lt;/li&gt;
&lt;li&gt;Users can't see each other's environments&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;What you stop building&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I replaced this:                                                                                                                                          &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;├── Dockerfile
├── docker-compose.yml
├── terraform/
│   ├── ecs.tf
│   ├── efs.tf
│   ├── dynamodb.tf
│   ├── s3.tf
│   ├── iam.tf
│   ├── security-groups.tf
│   └── ...
├── src/
│   ├── isolation/
│   ├── storage/
│   ├── database/
│   ├── scaling/
│   └── pause-resume/
└── 3 months of your life
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;With this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install @oncell/sdk
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Try it&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Sign up at &lt;a href="https://oncell.ai" rel="noopener noreferrer"&gt;https://oncell.ai&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Create an API key
&lt;/li&gt;
&lt;li&gt;Add credits ($5 minimum)&lt;/li&gt;
&lt;li&gt;npm install @oncell/sdk
&lt;/li&gt;
&lt;li&gt;Create your first environment&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Docs: &lt;a href="https://oncell.ai/docs" rel="noopener noreferrer"&gt;https://oncell.ai/docs&lt;/a&gt;&lt;br&gt;
GitHub: &lt;a href="https://github.com/oncellai" rel="noopener noreferrer"&gt;https://github.com/oncellai&lt;/a&gt;&lt;br&gt;
SDK: &lt;a href="https://www.npmjs.com/package/@oncell/sdk" rel="noopener noreferrer"&gt;https://www.npmjs.com/package/@oncell/sdk&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Launching on &lt;a href="https://www.producthunt.com/products/oncell-ai?launch=oncell-ai" rel="noopener noreferrer"&gt;https://www.producthunt.com/products/oncell-ai?launch=oncell-ai&lt;/a&gt;. Would love your feedback.      &lt;/p&gt;

</description>
      <category>ai</category>
      <category>webdev</category>
      <category>javascript</category>
      <category>programming</category>
    </item>
  </channel>
</rss>
