DEV Community

Alexander Ertli
From a Simple Ollama Client to 26k LOC

Here is how it all started:

package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "io"
    "net/http"
    "os"
)

func main() {
    // Get the base URL from an environment variable; default to Ollama's URL if not set.
    baseURL := os.Getenv("OLLAMA_URL")
    if baseURL == "" {
        baseURL = "http://localhost:11434"
    }

    // Prepare the request payload.
    payload := map[string]interface{}{
        "model":  "llama3.2",             // Change this to the model you want to use.
        "prompt": "Why is the sky blue?", // Your prompt.
        "stream": false,                  // Set to false for a complete, non-streaming response.
    }

    // Encode the payload as JSON.
    body, err := json.Marshal(payload)
    if err != nil {
        fmt.Println("Error marshalling payload:", err)
        return
    }

    // Build the full URL for the generate endpoint.
    url := fmt.Sprintf("%s/api/generate", baseURL)

    // Create a new POST request.
    req, err := http.NewRequest("POST", url, bytes.NewBuffer(body))
    if err != nil {
        fmt.Println("Error creating request:", err)
        return
    }
    req.Header.Set("Content-Type", "application/json")

    // Send the request using the default HTTP client.
    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        fmt.Println("Error making HTTP request:", err)
        return
    }
    defer resp.Body.Close()

    // Read the response body. (io.ReadAll replaces the deprecated ioutil.ReadAll.)
    respBody, err := io.ReadAll(resp.Body)
    if err != nil {
        fmt.Println("Error reading response body:", err)
        return
    }

    // Print the response.
    fmt.Println("Response from Ollama:")
    fmt.Println(string(respBody))
}

Now it's grown into 26,106 LOC (according to qlty.sh) across ~20 modules. I'm still evaluating, but chances are that once the runtime-MVP launches, most of it will be refactored into microservices, for the flexibility to cope with whatever the platform ends up being used for.

Sometimes all you need is to start coding and keep writing code that solves one visible problem, like the snippet I shared.

Don't overthink it. Embrace the refactoring that comes with each layer of complexity. And don't fear scrapping a whole codebase and restarting when your design hits a wall. For me, at least, this worked.

If you have a formal background, you may know this as extreme programming, but the term doesn't matter. What mattered to me is that working on my own codebase should feel engaging, not like a boulder I have to push up the hill.

And here's a small favor to ask: if you're interested in participating in the demo launch of an app on top of the contenox runtime-MVP, or in following me for updates, I may drop a link to the challenge soon.

I'm still mulling over whether to launch with a pirate-themed treasure hunt and a small gift card for whoever completes the game via the Telegram bot first. But if you like the direction or have a better idea, let me know in the comments :)

Thanks for making it to the end.
