Adrian
Hello World Has Entered the Chat: Engineering a Scalable Chatroom with gRPC (Part 1 - Foundations)

Do you remember the first time you ran

fmt.Println("Hello, world!")

and nobody answered back? Well, that's a fate no different from Voyager 1's 🛰️ (at least for now 👀): broadcasting a lonely message into the cosmic void, forever waiting for an ACK.

That’s how most programs start: talking into silence.
So, let’s fix that.

In this series, we’ll make “Hello World” talk back, and eventually chat with everyone.


🧭 The journey we’re taking

This isn’t just about slapping together a chat demo; it’s an engineering journey that starts from the basics. All roads lead to Rome, but I’ll share the reasoning behind the system’s engineering decisions so the foundation stays clear, consistent, and able to grow.


🥽 Why gRPC

gRPC fits well when:

  1. You need strong contracts between services (protobuf schemas = no guesswork).
  2. You want type-safe clients across many languages.
  3. You care about efficient binary encoding and low latency.
  4. You plan for microservices communicating over well-defined APIs.

So where does this apply? Anywhere sessions exist: online multiplayer game rooms, online meetings, customer service chat, and any other place where a session has a clear lifetime.

There are plenty of alternatives: Kafka-based event systems, real-time channels in Elixir, or simple WebSockets.

Here, I focus on gRPC because it integrates naturally with Go services and keeps data contracts explicit through protobufs, which helps teams scale.

🧱 Designing for scale from Day 1

Even though this project starts with a small “chat” RPC, we’ll build as if several teams might someday share this codebase.

🪐 Centralized proto system

Instead of burying .proto files inside the server folder, we’ll keep a dedicated /proto module so any other service can import the same contracts without divergence.

Approach TL;DR:

  • Go module - publish the generated Go code in a standalone module; other services import it via go get.
  • Git submodule - embed the proto repo directly in each service.

Both work.

Submodules are language-agnostic but harder to keep in sync, while Go modules fit naturally with Go tooling.

So why not use the best of both worlds?

🧩 Buf as the bridge

Buf doesn’t replace Go modules; it complements them.
It treats .proto definitions like a versioned module across languages, a lightweight go.mod for your protobufs.

Using plain protoc is still perfectly fine; Buf simply adds conveniences like built-in versioning and linting. You push once, and every service can pull the same schema version in Go, Node, Python, or anything else: no submodule syncing, no manual plugin setup.

I chose this because the focus is Go-centric scalability, not polyglot CI gymnastics.

🗺️ Project layout

go-grpc-chat/
├── proto/                  # shared protobufs + codegen config
│   ├── chat/v1/chat.proto
│   ├── buf.yaml
│   ├── buf.gen.yaml
│   └── gen/go/chat/v1/...
│
├── server/                 # gRPC server
│   └── main.go
├── client/                 # interactive CLI client
│   └── main.go
└── go.mod

🧩 Real-world note: each module would normally be its own service, deployed separately, versioned independently.

Here, everything lives together for learning clarity.

🧾 Define the Contracts (proto)

The .proto file defines how the client and server communicate: what messages they exchange and what service methods exist.
proto/chat/v1/chat.proto

syntax = "proto3";
package chat.v1;

import "google/protobuf/timestamp.proto";
option go_package = "./chat/v1;chatv1";

message ChatMessage {
  string sender = 1;                       // who sent it
  string text = 2;                         // message body
  google.protobuf.Timestamp sent_at = 3;   // when it was sent (server or client fills)
}

service ChatService {
  rpc Chat(stream ChatMessage) returns (stream ChatMessage);
}

Keep it small. A single ChatMessage works for now.

Configure Buf

To configure Buf, first install the Buf CLI (see the official installation guide).
Then move into your proto/ folder and initialize the configuration.

cd proto
buf mod init

Now you should see buf.yaml generated in your folder
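For reference, the generated buf.yaml looks roughly like this (a sketch of the v1 layout that buf mod init produces; your generated file and CLI version may differ):

```yaml
# Sketch of a buf.yaml as produced by `buf mod init`; contents may vary by version.
version: v1
breaking:
  use:
    - FILE      # detect breaking changes at the file level
lint:
  use:
    - DEFAULT   # Buf's standard lint rule set
```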

Create a new file buf.gen.yaml to define how and where generated Go code should be placed.
proto/buf.gen.yaml

version: v2
plugins:
  - remote: buf.build/protocolbuffers/go
    out: gen/go
    opt:
      - paths=source_relative
  - remote: buf.build/grpc/go
    out: gen/go
    opt:
      - paths=source_relative

Generate the code

Now that Buf knows what to do, let’s generate the Go code.

This step converts .proto files into .pb.go and _grpc.pb.go files that your Go code will import.
Run the following commands:

buf generate

If everything works, you’ll find the generated files here:

proto/gen/go/chat/v1/chat.pb.go
proto/gen/go/chat/v1/chat_grpc.pb.go

At this point, your proto layer is complete. It’s reusable, versionable, and ready for multiple services to depend on.

🖥️ Server - One Voice, Many Echoes

The server keeps track of connected clients and rebroadcasts every incoming message to every connected client (including the sender).

We’ll store active client streams in a map and protect it with a sync.Mutex.
server/main.go

type chatServer struct {
  chatv1.UnimplementedChatServiceServer
  mu      sync.Mutex
  clients map[string]chatv1.ChatService_ChatServer // id -> stream
}
  • clients holds each active user’s connection (the ChatService_ChatServer stream).

  • mu (a sync.Mutex) protects access to clients, because multiple goroutines might read or write it at the same time. Without the lock, the Go runtime would crash with a fatal “concurrent map writes” error.

Implementing the chat stream

Each client connection gets its own Chat handler running in its own goroutine.
func (s *chatServer) Chat(stream chatv1.ChatService_ChatServer) error {
    // create a unique ID for this client
    id := fmt.Sprintf("%p", stream)

    // safely add client to the list
    s.mu.Lock()
    s.clients[id] = stream
    s.mu.Unlock()

    defer func() {
        // remove the client when they disconnect
        s.mu.Lock()
        delete(s.clients, id)
        s.mu.Unlock()
    }()

    for {
        msg, err := stream.Recv()
        if err == io.EOF {
            return nil // client closed the stream
        }
        if err != nil {
            return err // network or decoding error
        }
        log.Printf("🟢 [%s]: \"%s\"", msg.Sender, msg.Text)
        s.broadcast(msg)
    }
}

Broadcast method:

func (s *chatServer) broadcast(msg *chatv1.ChatMessage) {
    s.mu.Lock()
    defer s.mu.Unlock()
    for _, c := range s.clients {
        // Send errors are ignored here; a dead client is removed
        // when its own Chat handler returns.
        c.Send(msg)
    }
}
  • stream.Recv() blocks until a message arrives, then the handler fans it out to every connected stream.
  • broadcast wraps the send loop in a lock to keep broadcasts atomic and map access safe. Without it, concurrent access to the map could corrupt it.

Start the server

func main() {
    port := ":4000"
    lis, err := net.Listen("tcp", port)
    if err != nil {
        log.Fatalf("Failed to listen: %v", err)
    }

    s := grpc.NewServer()
    chatv1.RegisterChatServiceServer(s, &chatServer{
        clients: make(map[string]chatv1.ChatService_ChatServer),
    })

    log.Println("🚀 Chat gRPC server listening on " + port)
    if err := s.Serve(lis); err != nil {
        log.Fatalf("Failed to serve: %v", err)
    }
}

🗣️ Client - Two Ears, One Mouth

The client needs to listen and speak simultaneously.
It opens a single bidirectional stream, reads from the terminal, and prints incoming broadcasts.
client/main.go

func main() {
    addr := "localhost:4000"
    conn, err := grpc.NewClient(addr, grpc.WithTransportCredentials(insecure.NewCredentials()))
    if err != nil {
        log.Fatalf("Failed to create gRPC client: %s", err)
    }
    defer conn.Close()

    client := chatv1.NewChatServiceClient(conn)
    stream, err := client.Chat(context.Background())
    if err != nil {
        log.Fatalf("Error when starting chat stream: %v", err)
    }
    // main continues in the snippets below...

Because the Chat RPC is bidirectional streaming, we need to open a stream that can send and receive messages simultaneously.

Choose a name

Before joining the conversation, it’s nice to have an identity.

    scanner := bufio.NewScanner(os.Stdin)
    fmt.Println("💬 Connected to chat server!")
    fmt.Print("Enter your name: ")
    scanner.Scan()
    name := scanner.Text()

Listening in the background

The client needs to constantly listen for new messages while still letting the user type. We’ll spin up a goroutine that handles incoming messages concurrently.

    go func() {
        for {
            msg, err := stream.Recv()
            if err != nil {
                return // stream closed or connection lost
            }
            fmt.Printf("%s ✅ [%s]: \"%s\"\n",
                msg.SentAt.AsTime().Format(time.TimeOnly),
                msg.Sender, msg.Text)
            fmt.Print("> ")
        }
    }()
  • Recv() blocks until a message arrives from the server.

Sending messages

Now the main goroutine becomes the “speaker.”

We read from the terminal and send a message through the open stream.

    fmt.Println("Type your message and press Enter (type 'exit' to quit).")
    for {
        fmt.Print("> ")
        scanner.Scan()
        text := strings.TrimSpace(scanner.Text())
        if text == "" {
            continue
        }
        if strings.ToLower(text) == "exit" {
            fmt.Println("👋 Goodbye!")
            stream.CloseSend() // tell the server we're done sending
            break
        }

        if err := stream.Send(&chatv1.ChatMessage{
            Sender: name,
            Text:   text,
            SentAt: timestamppb.Now(),
        }); err != nil {
            log.Fatalf("Failed to send message: %v", err)
        }
    }

Here’s what’s happening:

  • Each message you type becomes a ChatMessage and is sent to the server.
  • The server then broadcasts it to every connected client (including you).
  • Typing exit closes your send stream and ends the program, which tells the server you’ve left.

🕹️ Try it out

Now that everything has been set up, let's say hello to our chat app.

  1. Start the server

    cd server
    go run main.go
    
  2. Open multiple clients

    cd client
    go run main.go
    
  3. Give each a different name, start typing, and watch the messages appear instantly in every window.

    14:40:12 ✅ [Grace]: "Hello World?"
    14:40:13 ✅ [Adrian]: "Hi Grace!"
    

What’s happening under the hood

  • Each client keeps a persistent bidirectional stream open to the server.
  • The server loops forever, relaying each incoming message to every active stream.

🧠 Wrap Up

Every step might look “over-engineered” for a small demo, but it mirrors how real systems evolve.
We didn’t just make chat work, we built the foundation for scalable, session-based communication.

Along the way, we’ve quietly touched on several engineering principles:

  • Concurrency and synchronization - why we needed a sync.Mutex to safely share data between multiple client streams.
  • Persistent streaming connections - how gRPC keeps an open channel instead of repeating HTTP-style requests.
  • Clear API contracts - how Protocol Buffers enforce structure and safety between client and server.
  • Design for growth - why even a small demo benefits from centralized .proto files and modular layout.

Next, we’ll dive into rooms and authentication to give our chat a sense of place and identity.

Our lonely Voyager has finally found a constellation to talk to 🌌.

Source: Part 1 Snapshot
