Every week, I review competitors' articles and summarize them in a report. It’s repetitive and not exactly my favorite part of the week. I’ve been using a few new AI tools, such as Amp and Opencode, and I thought it would be nice to have a tool that could make things easier. So I built "Herctually," an AI research assistant that automates the boring parts.
We’ll build Herctually in Go, from model setup to giving it powers like web search, reading, and report writing.
But first, let's align on what an agent is. According to Thorsten Ball, an agent is “an LLM with access to tools, giving it the ability to modify something outside the context window”.
How I see it is that an agent is a large language model (LLM) that can take actions, not just talk. It can search, write, read, and execute. Think of it as a chatbot with hands.
We’ll need:
- Go
- Access to a model (via OpenRouter, so we can easily test GPT-4, Claude, etc.)
- A use case — in our case, automating research.
- A few tools — to extend the model’s abilities.
Setting Up the Project
mkdir herctually
cd herctually
go mod init herctually
touch main.go
mkdir agent
touch agent/agent.go
Now, in our main.go file, we would like to set up the structure for running our agent. Some errors may appear initially; they’ll resolve once we add the missing pieces.
// main.go
package main
import (
"bufio"
"context"
"fmt"
"log"
"os"
"github.com/openai/openai-go/v3"
"github.com/openai/openai-go/v3/option"
"herctually/agent"
)
func main() {
apiKey := os.Getenv("OPENROUTER_APIKEY")
if apiKey == "" {
log.Fatal("OPENROUTER_APIKEY environment variable is required")
}
baseURL := os.Getenv("OPENROUTER_BASEURL")
if baseURL == "" {
log.Fatal("OPENROUTER_BASEURL environment variable is required")
}
scanner := bufio.NewScanner(os.Stdin)
getUserMessage := func() (string, bool) {
if !scanner.Scan() {
return "", false
}
return scanner.Text(), true
}
llm := openai.NewClient(
option.WithAPIKey(apiKey),
option.WithBaseURL(baseURL),
)
ag := agent.New(&llm, getUserMessage)
if err := ag.Run(context.Background()); err != nil {
fmt.Printf("Error: %s\n", err)
}
}
Now, let's add the agent code.
// agent/agent.go
package agent
import (
"context"
"fmt"
"github.com/openai/openai-go/v3"
)
type Agent struct {
llm *openai.Client
getUserMessage func() (string, bool)
}
func New(llm *openai.Client, getUserMessage func() (string, bool)) *Agent {
return &Agent{
llm: llm,
getUserMessage: getUserMessage,
}
}
var systemPrompt = `<role>
You are 'Herctually,' a sharp, logical, and strategic research assistant, focused on clarity and precise argumentation.
</role>
<personality>
Polite, confident, analytical. You challenge ideas before you support them. You enjoy exposing weak logic. You’re not here to please, you’re here to clarify.
</personality>
<rules>
- Be concise but deep.
- Separate facts, inferences, and speculation.
- Always surface what others overlook.
- Correct errors ruthlessly, including the user’s.
</rules>
<goal>
Deliver insights with precision, not politeness. Your output should feel like a distilled intelligence briefing from someone who actually thinks.
</goal>`
func (a *Agent) Run(ctx context.Context) error {
conversation := []openai.ChatCompletionMessageParamUnion{
openai.SystemMessage(systemPrompt),
}
fmt.Println("Chat with Herctually (use 'ctrl-c' to quit)")
for {
fmt.Print("\u001b[94mYou\u001b[0m: ")
userInput, ok := a.getUserMessage()
if !ok {
break
}
userMessage := openai.UserMessage(userInput)
conversation = append(conversation, userMessage)
message, err := a.runInference(ctx, conversation)
if err != nil {
return err
}
conversation = append(conversation, message.Choices[0].Message.ToParam())
for _, content := range message.Choices {
fmt.Printf("\u001b[93mHerctually\u001b[0m: %s\n", content.Message.Content)
}
}
return nil
}
func (a *Agent) runInference(ctx context.Context, conversation []openai.ChatCompletionMessageParamUnion) (*openai.ChatCompletion, error) {
message, err := a.llm.Chat.Completions.New(ctx, openai.ChatCompletionNewParams{
Model: openai.ChatModelGPT4o,
MaxCompletionTokens: openai.Int(1024),
Messages: conversation,
})
return message, err
}
Lastly, let's add our environment variables, install our dependencies, and run it.
export OPENROUTER_APIKEY="sk-xxxxxx-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENROUTER_BASEURL="https://openrouter.ai/api/v1"
# Download the dependencies
go mod tidy
# Run it
go run main.go
Then we can just talk to Herctually, like this:
$ go run main.go
Chat with Herctually (use 'ctrl-c' to quit)
You: Hi
Herctually: Greetings. Do you have a specific question or topic you'd like to discuss? I'm here to provide clear and precise insights.
You: What is https://oreoluwabs.com/ about
Herctually: I currently do not have the capability to browse the internet in real-time or access external content. However, I can offer guidance on how to assess a website's legitimacy or purpose.
You:
Now we have a basic agent with a personality: we can ask it questions, and it can respond on a subject matter!
But how do we make her do the things we mentioned: search the web, read, and write reports? The answer is tools. Models allow us to enhance their output by calling tools we define.
Let’s give her the ability to search the web.
Let's make an agent/tools.go file.
// agent/tools.go
package agent
import (
"encoding/json"
"fmt"
"io"
"io/fs"
"net/http"
"os"
"path"
"path/filepath"
"strings"
"github.com/openai/openai-go/v3"
)
type ToolDefinition struct {
Name string `json:"name"`
Description string `json:"description"`
InputSchema openai.FunctionParameters `json:"input_schema"`
Function func(input json.RawMessage) (string, error)
}
A tool requires a Name we can use as an identifier, a Description of what the tool can do, an InputSchema (usually JSON Schema) describing the parameters the tool accepts and requires, and lastly, the Function to run.
We would need to make this update in agent/agent.go. This allows us to specify what tools our agent has access to.
// agent/agent.go
type Agent struct {
llm *openai.Client
getUserMessage func() (string, bool)
tools []ToolDefinition
}
func New(llm *openai.Client, getUserMessage func() (string, bool), tools []ToolDefinition) *Agent {
return &Agent{
llm: llm,
getUserMessage: getUserMessage,
tools: tools,
}
}
func (a *Agent) Run(ctx context.Context) error {
conversation := []openai.ChatCompletionMessageParamUnion{
openai.SystemMessage(systemPrompt),
}
fmt.Println("Chat with Herctually (use 'ctrl-c' to quit)")
readUserInput := true
for {
if readUserInput {
fmt.Print("\u001b[94mYou\u001b[0m: ")
userInput, ok := a.getUserMessage()
if !ok {
break
}
userMessage := openai.UserMessage(userInput)
conversation = append(conversation, userMessage)
}
message, err := a.runInference(ctx, conversation)
if err != nil {
return err
}
conversation = append(conversation, message.Choices[0].Message.ToParam())
toolResults := []openai.ChatCompletionMessageParamUnion{}
toolCalls := message.Choices[0].Message.ToolCalls
for _, call := range toolCalls {
result := a.executeTool(call.ID, call.Function.Name, []byte(call.Function.Arguments))
toolResults = append(toolResults, result)
}
for _, content := range message.Choices {
if content.Message.Content != "" {
fmt.Printf("\u001b[93mHerctually\u001b[0m: %s\n", content.Message.Content)
}
}
if len(toolResults) == 0 {
readUserInput = true
continue
}
readUserInput = false
conversation = append(conversation, toolResults...)
}
return nil
}
func (a *Agent) executeTool(id, name string, input json.RawMessage) openai.ChatCompletionMessageParamUnion {
var toolDef ToolDefinition
var found bool
for _, tool := range a.tools {
if tool.Name == name {
toolDef = tool
found = true
break
}
}
if !found {
return openai.ToolMessage("tool not found", id)
}
fmt.Printf("\u001b[92mtool\u001b[0m: %s(%s)\n", name, input)
response, err := toolDef.Function(input)
if err != nil {
return openai.ToolMessage(err.Error(), id)
}
return openai.ToolMessage(response, id)
}
func (a *Agent) runInference(ctx context.Context, conversation []openai.ChatCompletionMessageParamUnion) (*openai.ChatCompletion, error) {
opentools := []openai.ChatCompletionToolUnionParam{}
for _, tool := range a.tools {
opentools = append(opentools, openai.ChatCompletionToolUnionParam{
OfFunction: &openai.ChatCompletionFunctionToolParam{
Function: openai.FunctionDefinitionParam{
Name: tool.Name,
Description: openai.String(tool.Description),
Parameters: tool.InputSchema,
},
},
})
}
message, err := a.llm.Chat.Completions.New(ctx, openai.ChatCompletionNewParams{
Model: openai.ChatModelGPT4o,
MaxCompletionTokens: openai.Int(1024),
Messages: conversation,
Tools: opentools,
})
return message, err
}
Now we can create our first tool definition, web_search.
// agent/tools.go
import (
"encoding/json"
"fmt"
"io"
"io/fs"
"net/http"
"os"
"path"
"path/filepath"
"strings"
// Add this
"github.com/invopop/jsonschema"
"github.com/openai/openai-go/v3"
)
type SurfTheWebInput struct {
Url string `json:"url" jsonschema_description:"The url to fetch data from."`
}
var SurfTheWebDefinition = ToolDefinition{
Name: "web_search",
Description: "Fetch the contents of a web page at the given URL.",
InputSchema: SurfTheWebInputSchema,
Function: SurfTheWeb,
}
var SurfTheWebInputSchema = GenerateSchema[SurfTheWebInput]()
func SurfTheWeb(input json.RawMessage) (string, error) {
surfWebInput := SurfTheWebInput{}
err := json.Unmarshal(input, &surfWebInput)
if err != nil {
return "", err
}
req, err := http.NewRequest("GET", surfWebInput.Url, nil)
if err != nil {
return "", err
}
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
return "", err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return "", fmt.Errorf("request returned non-200 status: %s", resp.Status)
}
bodyBytes, err := io.ReadAll(resp.Body)
if err != nil {
return "", fmt.Errorf("failed to read response body: %w", err)
}
bodyString := string(bodyBytes)
return bodyString, nil
}
func GenerateSchema[T any]() openai.FunctionParameters {
reflector := jsonschema.Reflector{
AllowAdditionalProperties: false,
DoNotReference: true,
}
var v T
schema := reflector.Reflect(v)
return openai.FunctionParameters{
"type": schema.Type,
"properties": schema.Properties,
"required": schema.Required,
}
}
// main.go
func main() {
// [... previous code ...]
ag := agent.New(&llm, getUserMessage,
[]agent.ToolDefinition{
agent.SurfTheWebDefinition,
})
// [... previous code ...]
}
go mod tidy
Now let's run our project.
$ go run main.go
Chat with Herctually (use 'ctrl-c' to quit)
You: Hi
Herctually: Hello. What's on your mind?
You: What is https://oreoluwabs.com/ about
tool: web_search({"url":"https://oreoluwabs.com/"})
Herctually: The website "Oreoluwa Salami" presents a software developer based in Lagos, Nigeria. With over five years of experience, Oreoluwa is skilled in HTML, CSS, JavaScript, and various frameworks. The focus is on creating responsive and user-friendly web applications, with an emphasis on engaging user experiences. The site highlights their expertise in scalable and accessible web and mobile application development. It includes sections for projects, articles, and contact information, aiming to showcase professional capabilities and invite collaborations. If you need further specific details, check the website directly.
You:
Now that she can search the web, let's allow her to write and read reports.
// agent/tools.go
type EditFileInput struct {
Path string `json:"path" jsonschema_description:"The path to the file"`
OldStr string `json:"old_str" jsonschema_description:"Text to search for - must match exactly and must only have one match exactly"`
NewStr string `json:"new_str" jsonschema_description:"Text to replace old_str with"`
}
var EditFileDefinition = ToolDefinition{
Name: "edit_file",
Description: `Make edits to a text file.
Replaces 'old_str' with 'new_str' in the given file. 'old_str' and 'new_str' MUST be different from each other.
If the file specified by the path doesn't exist, it will be created.`,
InputSchema: EditFileInputSchema,
Function: EditFile,
}
var EditFileInputSchema = GenerateSchema[EditFileInput]()
func EditFile(input json.RawMessage) (string, error) {
editFileInput := EditFileInput{}
err := json.Unmarshal(input, &editFileInput)
if err != nil {
return "", err
}
if editFileInput.Path == "" || editFileInput.OldStr == editFileInput.NewStr {
return "", fmt.Errorf("invalid input parameters")
}
content, err := os.ReadFile(editFileInput.Path)
if err != nil {
if os.IsNotExist(err) && editFileInput.OldStr == "" {
return createNewFile(editFileInput.Path, editFileInput.NewStr)
}
return "", err
}
oldContent := string(content)
newContent := strings.Replace(oldContent, editFileInput.OldStr, editFileInput.NewStr, -1)
if oldContent == newContent && editFileInput.OldStr != "" {
return "", fmt.Errorf("old_str not found in file")
}
err = os.WriteFile(editFileInput.Path, []byte(newContent), 0644)
if err != nil {
return "", err
}
return "Ok", nil
}
func createNewFile(filePath, content string) (string, error) {
dir := path.Dir(filePath)
if dir != "." {
err := os.MkdirAll(dir, 0755)
if err != nil {
return "", fmt.Errorf("failed to create a directory: %w", err)
}
}
err := os.WriteFile(filePath, []byte(content), 0644)
if err != nil {
return "", fmt.Errorf("failed to create file: %w", err)
}
return fmt.Sprintf("Successfully created file %s", filePath), nil
}
type ListFilesInput struct {
Path string `json:"path" jsonschema_description:"Optional relative path to list files from. Defaults to current directory if not provided."`
}
var ListFilesDefinition = ToolDefinition{
Name: "list_files",
Description: "List files and directories at a given path, if no path is provided, list files in the current directory.",
InputSchema: ListFilesInputSchema,
Function: ListFiles,
}
var ListFilesInputSchema = GenerateSchema[ListFilesInput]()
func ListFiles(input json.RawMessage) (string, error) {
listFilesInput := ListFilesInput{}
err := json.Unmarshal(input, &listFilesInput)
if err != nil {
return "", err
}
dir := "."
if listFilesInput.Path != "" {
dir = listFilesInput.Path
}
var files []string
err = filepath.Walk(dir, func(path string, info fs.FileInfo, err error) error {
if err != nil {
return err
}
relPath, err := filepath.Rel(dir, path)
if err != nil {
return err
}
if relPath != "." {
if info.IsDir() {
files = append(files, relPath+"/")
} else {
files = append(files, relPath)
}
}
return nil
})
if err != nil {
return "", err
}
result, err := json.Marshal(files)
if err != nil {
return "", err
}
return string(result), nil
}
var ReadFileDefinition = ToolDefinition{
Name: "read_file",
Description: "Read the contents of a given relative file path.",
InputSchema: ReadFileInputSchema,
Function: ReadFile,
}
type ReadFileInput struct {
Path string `json:"path" jsonschema_description:"The relative path of a file in the working directory."`
}
var ReadFileInputSchema = GenerateSchema[ReadFileInput]()
func ReadFile(input json.RawMessage) (string, error) {
readFileInput := ReadFileInput{}
err := json.Unmarshal(input, &readFileInput)
if err != nil {
return "", err
}
content, err := os.ReadFile(readFileInput.Path)
if err != nil {
return "", err
}
return string(content), nil
}
$ go run main.go
Chat with Herctually (use 'ctrl-c' to quit)
You: Hi name is Ore! How is it going?
Herctually: Hello, Ore! My name is Herctually. I'm here to help you reason through things logically. How can I assist you today?
You: Analyse and write a report on https://twothreads.app/. Write the report in two threads.md file!
tool: web_search({"url":"https://twothreads.app/"})
tool: edit_file({"path":"twothreads.md","old_str":"","new_str":"# Two Threads - AI-Powered Wardrobe Management & Outfit Recommendations\n\n## Overview\n\n**Website:** [Two Threads](https://twothreads.app/)\n\nTwo Threads is an AI-powered application aimed at revolutionizing wardrobe management and outfit recommendations. It addresses the common frustration of \"having nothing to wear\" by providing AI-driven solutions tailored to individual styles, weather conditions, and occasions.\n\n### Key Features\n- **Wardrobe Management:** Allows users to catalog and organize their clothing with photos, tags, and details.\n- **AI Outfit Recommendations:** Provides personalized suggestions that consider the weather, occasion, and user style preferences.\n- **Style Tracking:** Monitors fashion evolution and gives insights to refine tastes.\n- **AI Styling Assistant \"Andi\":** Offers real-time fashion advice and inspiration.\n\n## User Experience\n\nThe platform aims to streamline the decision-making process linked to daily dressing routines. Whether it's cataloging current wardrobe items or getting fresh outfit ideas, the service promises to transform how users engage with their clothing.\n\n### How It Works\n1. **Add Your Wardrobe:** Upload photos and details of clothing items to build a personalized closet.\n2. **Get Recommendations:** Receive tailored outfit suggestions based on input details like mood or occasion.\n3. **Track & Refine:** Log wardrobe usage and give feedback for refined future suggestions.\n4. 
**Style Evolution:** Review fashion habits and adapt wardrobe choices over time.\n\n## Pricing\n\n- **Free:** Allows up to 50 clothing items and provides basic AI recommendations, community support.\n- **Pro:** Priced at $3/month, offering unlimited clothing entries, unlimited outfit suggestions, AI-powered styling assistance, and priority support.\n\n## Conclusion\nTwo Threads is targeting individuals looking to make their wardrobe choices more efficient and stylish. It combines practical workability with a personal styling assistant for users seeking to overcome fashion fatigue."})
Herctually: The report on Two Threads has been written and saved to a file named `twothreads.md`. If you need any more details or further analysis, feel free to ask!
You:
Conclusion
Herctually is now not just a chatbot, but an assistant that can think, search, read, and write. The goal wasn’t to build something huge, but to make the boring weekly research process faster, and, along the way, to understand how agents actually work: a loop between a model, tools, and your input.
From here, you can add memory, better web reading, reporting templates, or whatever you want.