DEV Community

SWAN

Posted on • Originally published at Medium

Write your first AI Agent with TypeScript

Let’s create a simple Math AI Agent. Instead of talking about AI agents in theory, let’s jump straight into building one. No complex setup. No overwhelming abstractions. Just a clean, practical introduction to building your first intelligent agent with the SWAN Framework.

PROJECT SETUP

  1. Create an empty TypeScript project directory and install the dependencies:
npm install --save-dev typescript @types/node
npm install @ssww.one/framework zod
  2. Create a tsconfig.json file:
{
  "compilerOptions": {
    "outDir": "./dist",
    "module": "nodenext",
    "target": "esnext",
    "lib": ["esnext"],
    "types": ["node"],
    "sourceMap": true,
    "declaration": true,
    "declarationMap": true,
    "strict": true,
    "jsx": "react-jsx",
    "isolatedModules": true,
    "noUncheckedSideEffectImports": true,
    "moduleDetection": "force",
    "skipLibCheck": true
  }
}
  3. Create an empty index.ts file:
.
├── index.ts
├── node_modules
├── package.json
├── package-lock.json
└── tsconfig.json
  4. Update your package.json scripts to compile the TypeScript files to JavaScript and run the compiled dist/index.js with Node:
{
  ...
  "scripts": {
    "start": "rm -rf dist && tsc && node dist"
  },
  ...
}
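Note that rm -rf is Unix-only, so this start script fails on Windows. One cross-platform option (my suggestion, not something the framework requires) is the rimraf package (npm install --save-dev rimraf):

```json
{
  "scripts": {
    "start": "rimraf dist && tsc && node dist"
  }
}
```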

WRITE AI AGENTS

  1. Start with a simple agent blueprint in index.ts:
import { AgentTool, OpenAILLM, startAgentCLI, loop } from "@ssww.one/framework";

export async function agent(at: AgentTool) {
  at.print('Hello there, may I help you?', true); // GREETINGS

  await loop(async () => {
    const instruction: string = await at.waitForUserInstruction();
    const response: string = await at.askLLM(`Reply to: ${instruction}`);
    at.print(response, true); // pass true as the 2nd param to close the response
  });
}

startAgentCLI(agent, {
  llm: new OpenAILLM()
});

The code above is a simple agent: it prints a greeting on first run, then loops forever, waiting for a user prompt, asking the LLM for an answer, and printing the response to the user.

When you run npm start, it runs the agent function, prints the greeting, and asks for a prompt:

$ npm start

> project@1.0.0 start
> rm -rf dist && tsc && node dist

◇ injected env (0) from .env // tip: ⌘ multiple files { path: ['.env.local', '.env'] }
◇ injected env (0) from .env // tip: ⌁ auth for agents [www.vestauth.com]
Hello there, may I help you?
? prompt ‣ 

If you respond to the prompt now, you'll get an error, since you haven't configured an API key or an OpenAI model yet.
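If you want a clearer failure when keys are missing, you can add a small guard of your own before starting the agent. This helper is not part of the SWAN framework; the key names match the .env entries used in the next step:

```typescript
// Optional startup guard (not part of the SWAN framework): returns which
// required environment variables are missing, so the CLI can fail fast
// with a clear message instead of a confusing downstream API error.
export function missingEnvKeys(env: Record<string, string | undefined>): string[] {
  return ["CHATGPT_APIKEY", "CHATGPT_MODEL"].filter((key) => !env[key]);
}
```

Call it at the top of index.ts with process.env, and exit with an error message when the returned array is non-empty.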

  2. Add an OpenAI API key and model name

Create a new file .env and then put your API key and model name like this:

CHATGPT_APIKEY="sk-proj-rWWdZnm3f...."
CHATGPT_MODEL="gpt-4.1"

Now your project directory files become:

.
├── .env
├── index.ts
├── node_modules
├── package.json
├── package-lock.json
└── tsconfig.json
  3. Run npm start again; it should now work:
$ npm start

> project@1.0.0 start
> rm -rf dist && tsc && node dist

◇ injected env (0) from .env // tip: ⌘ enable debugging { debug: true }
◇ injected env (0) from .env // tip: ⌁ auth for agents [www.vestauth.com]
Hello there, may I help you?
✔ prompt · Hi, how are you?
Loading...
Reply to: Hi, how are you?
"I'm good, thank you! How are you?"
? prompt ‣ 
  4. Now, back to the Math AI Agent.

Change your index.ts to this simple Math AI Agent function

import { AgentTool, OpenAILLM, startAgentCLI, loop } from "@ssww.one/framework";
import z from "zod";

export async function agent(at: AgentTool) {
  at.print("🧮 I'm your math agent. Ask me anything!", true);

  await loop(async () => {
    const question = await at.waitForUserInstruction();

    const result = await at.askLLM(
      `Is this a math question: "${question}"?`,
      z.object({ isMath: z.boolean() })
    );

    if (result.isMath) {
      const answer = await at.askLLM(`Solve this clearly: ${question}`);
      at.print(answer, true);
    } else {
      at.print("❌ I only handle math questions!", true);
    }
  });
}

startAgentCLI(agent, {
  llm: new OpenAILLM()
});

This agent works in the following steps:

  • Waits for you to type a question
  • Checks if your question is related to math
  • If yes → it solves the problem step-by-step
  • If no → it politely rejects it

Congratulations, you have successfully created your first AI Agent with TypeScript.

BONUS

You can swap the LLM for another provider, such as OpenRouter or Google Gen AI.

  1. For OpenRouter, or any other provider that supports the OpenAI SDK, you just need to add a CHATGPT_ENDPOINT key to the .env file:
CHATGPT_APIKEY="sk-proj-rWWdZnm3f...."
CHATGPT_MODEL="gpt-4.1"
CHATGPT_ENDPOINT="https://openrouter.ai/api/v1"
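One caveat worth checking against OpenRouter's own model list: OpenRouter namespaces model IDs by provider, so the model value usually needs a provider prefix as well, for example:

```
CHATGPT_MODEL="openai/gpt-4.1"
```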
  2. For Google Gen AI, you need to replace the LLM class and the .env values:
  • index.ts
import { GoogleGenAILLM } from "@ssww.one/framework";

// agents function

startAgentCLI(agent, {
  llm: new GoogleGenAILLM()
});
  • .env
GOOGLE_GENAI_APIKEY="egzz222nufi...."
GOOGLE_GENAI_MODEL="gemini-2.5-flash"

For the full documentation, visit https://ssww.one/docs
