Concurrency in Go: Goroutines and Channels Explained with Real Examples

Dishon Oketch

If you've been coding in Go for a while, you've probably heard the phrase "Don't communicate by sharing memory; share memory by communicating." Today, we're going to break that down — no fluff, just real code and real use cases.


What Is Concurrency?

Concurrency is the ability to handle multiple tasks at the same time — or at least, appear to. It doesn't mean things literally run in parallel (though they can). It means your program can juggle multiple things without waiting for each one to finish before starting the next.

Think of a chef preparing a meal: while the pasta boils, they chop vegetables and stir the sauce. That's concurrency.

Go makes concurrency a first-class citizen of the language through two core concepts: goroutines and channels.


Goroutines: Lightweight Threads

A goroutine is a function that runs concurrently with other functions. You launch one with the go keyword.

package main

import (
    "fmt"
    "time"
)

func greet(name string) {
    fmt.Printf("Hello, %s!\n", name)
}

func main() {
    go greet("Alice") // runs concurrently
    go greet("Bob")   // runs concurrently
    go greet("Carol") // runs concurrently

    time.Sleep(1 * time.Second) // give goroutines time to finish
}

Output (order may vary):

Hello, Bob!
Hello, Alice!
Hello, Carol!

Notice the order isn't guaranteed — that's concurrency in action.

⚠️ The time.Sleep hack is just for demonstration. In real code, use sync.WaitGroup or channels to coordinate.


WaitGroups: The Right Way to Wait

package main

import (
    "fmt"
    "sync"
)

func greet(name string, wg *sync.WaitGroup) {
    defer wg.Done() // signal that this goroutine is done
    fmt.Printf("Hello, %s!\n", name)
}

func main() {
    var wg sync.WaitGroup

    names := []string{"Alice", "Bob", "Carol"}

    for _, name := range names {
        wg.Add(1) // tell the WaitGroup to expect one more
        go greet(name, &wg)
    }

    wg.Wait() // block until all goroutines call Done()
    fmt.Println("All done!")
}

sync.WaitGroup is your best friend when you want to fire off goroutines and wait for all of them to complete.


Channels: Goroutines Talking to Each Other

A channel is a typed pipe through which goroutines send and receive values. Think of it as a conveyor belt between workers.

ch := make(chan int)      // unbuffered channel of ints
buf := make(chan int, 5)  // buffered channel with capacity 5

Simple Example: Passing a Value

package main

import "fmt"

func square(n int, ch chan int) {
    ch <- n * n // send result to channel
}

func main() {
    ch := make(chan int)

    go square(9, ch)

    result := <-ch // receive from channel (blocks until value arrives)
    fmt.Println("9 squared is:", result)
}

Buffered vs Unbuffered Channels

Unbuffered Channel

Sending blocks until someone receives.

ch := make(chan string)
ch <- "hello" // 🚫 DEADLOCK — no one is receiving yet!

Buffered Channel

Sending only blocks when the buffer is full.

ch := make(chan string, 3)
ch <- "first"   // OK
ch <- "second"  // OK
ch <- "third"   // OK
ch <- "fourth"  // 🚫 BLOCKS — buffer is full

Real Example: Fetching Data Concurrently

Imagine you're fetching user profiles from an API — one at a time is slow. Let's do it concurrently.

package main

import (
    "fmt"
    "sync"
    "time"
)

type UserProfile struct {
    ID   int
    Name string
}

// Simulates an API call
func fetchProfile(id int, ch chan<- UserProfile, wg *sync.WaitGroup) {
    defer wg.Done()
    time.Sleep(100 * time.Millisecond) // simulate network delay

    ch <- UserProfile{
        ID:   id,
        Name: fmt.Sprintf("User_%d", id),
    }
}

func main() {
    ch := make(chan UserProfile, 10)
    var wg sync.WaitGroup

    userIDs := []int{1, 2, 3, 4, 5}

    for _, id := range userIDs {
        wg.Add(1)
        go fetchProfile(id, ch, &wg)
    }

    // Close channel once all goroutines are done
    go func() {
        wg.Wait()
        close(ch)
    }()

    // Collect results
    for profile := range ch {
        fmt.Printf("Fetched: ID=%d, Name=%s\n", profile.ID, profile.Name)
    }
}

This fetches all 5 profiles concurrently instead of sequentially — roughly 5x faster.


Select: Listening on Multiple Channels

select lets a goroutine wait on multiple channel operations at once — like a switch for channels.

package main

import (
    "fmt"
    "time"
)

func main() {
    ch1 := make(chan string)
    ch2 := make(chan string)

    go func() {
        time.Sleep(1 * time.Second)
        ch1 <- "one"
    }()

    go func() {
        time.Sleep(2 * time.Second)
        ch2 <- "two"
    }()

    for i := 0; i < 2; i++ {
        select {
        case msg1 := <-ch1:
            fmt.Println("Received from ch1:", msg1)
        case msg2 := <-ch2:
            fmt.Println("Received from ch2:", msg2)
        }
    }
}

This is especially useful for timeouts:

select {
case result := <-ch:
    fmt.Println("Got result:", result)
case <-time.After(3 * time.Second):
    fmt.Println("Timed out!")
}

Common Pitfalls to Avoid

1. Goroutine Leaks

If a goroutine is blocked on a channel that nobody ever reads from, it stays blocked forever — its stack and everything it references are never reclaimed. That's a goroutine leak.

// BAD: goroutine blocks forever
ch := make(chan int)
go func() {
    ch <- 42 // no one reads this
}()

Always make sure every goroutine has a way to exit.

2. Race Conditions

Two goroutines writing to the same variable without synchronization is a data race — and in Go, a data race means undefined behaviour. Run your tests with go test -race to catch these.

// BAD
counter := 0
go func() { counter++ }()
go func() { counter++ }()

Use sync.Mutex or channels to protect shared state.

var mu sync.Mutex
mu.Lock()
counter++
mu.Unlock()

3. Closing a Closed Channel

Closing an already-closed channel causes a panic. Only close from the sender side, and only once.


Quick Reference

| Concept | Use When |
| --- | --- |
| go func() | You want to run something concurrently |
| sync.WaitGroup | You want to wait for multiple goroutines |
| Unbuffered channel | You need tight synchronization between goroutines |
| Buffered channel | You want to decouple sender and receiver |
| select | You're waiting on multiple channels or implementing timeouts |
| sync.Mutex | You're protecting shared mutable state |

Wrapping Up

Go's concurrency model is elegant once it clicks. The key ideas:

  • Goroutines are cheap — you can spin up thousands
  • Channels are how goroutines communicate safely
  • Don't share memory to communicate — let channels do the talking
  • Watch out for leaks, race conditions, and double closes

The best way to get comfortable is to build something: a web scraper, a concurrent file processor, a worker pool. Pick one and run with it.


Are you using goroutines in your current project? What patterns have you found most useful? Drop a comment below 👇


Tags: #go #golang #concurrency #programming #beginners
