This is the first article in a series of Golang-themed articles.
One of the most in-demand skills in the cloud-native community is the ability to write concurrent programs that leverage all the multi-core processing power today's hardware has to offer. Golang was designed with concurrency in mind.
Digital transformation does not just mean moving to cloud platforms. It is a continuous process undertaken by organizations to constantly optimize their IT spend. And not just organizations - solo businesses and startups care just as much: who does not want to save money?
In this post, we will understand the concepts behind Golang's concurrency, its benefits in the context of cloud architecture, and explore a few patterns for using concurrency in Golang.
Goroutines vs. Threads
Before we move ahead, it is important to understand the difference between threads and goroutines.
Threads are managed by the OS, while goroutines are managed by the Go runtime. Threads are single units of execution within a process - multiple threads belong to a process - and all of these threads share the same resources. Goroutines, on the other hand, are independently scheduled functions that the Go runtime multiplexes onto OS threads, and they are designed to communicate through channels rather than by sharing state directly.
Due to the sharing aspect described above, threads tend to be more prone to deadlocks and race conditions, while goroutines are better protected against them. Since goroutines are managed by the Go runtime, they offer a higher level of abstraction and are easier for developers to work with; threads, on the other hand, offer a lower level of abstraction.
This managed nature of goroutines gives them a great advantage over threads: the Go runtime automatically schedules them onto multiple OS threads. For developers, this reduces the cognitive load of worrying about shared resources, and it also makes goroutines lightweight, so instantiation and context switching are fast.
What should we know to understand Golang concurrency?
Goroutines and channels are fundamental concurrency constructs in the Go programming language, designed to simplify and enhance the development of concurrent and parallel applications. They allow us to achieve concurrency by enabling multiple tasks to be executed concurrently without the need for creating separate threads or managing complex synchronization mechanisms.
Channels provide a safe and structured way for goroutines to communicate and synchronize their actions. They act as pipes for exchanging data between goroutines, ensuring proper synchronization and avoiding race conditions by enforcing a model where data is sent on a channel by one goroutine and received by another in a coordinated manner. Channels are thus used for both data sharing and signaling, allowing goroutines to coordinate their actions and operate in a synchronized manner.
The combination of goroutines and channels enables us to write concurrent programs that are both efficient and comprehensible, fostering easier maintenance and debugging of complex parallel applications.
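Here is a minimal sketch of both constructs working together; the square function and the channel names below are purely illustrative:

```go
package main

import "fmt"

// square reads numbers from the in channel, squares them,
// and sends the results on the out channel.
func square(in <-chan int, out chan<- int) {
	for n := range in {
		out <- n * n
	}
	close(out)
}

func main() {
	in := make(chan int)
	out := make(chan int)

	// Run square concurrently as a goroutine.
	go square(in, out)

	// Send work on the input channel, then close it to signal completion.
	go func() {
		for i := 1; i <= 5; i++ {
			in <- i
		}
		close(in)
	}()

	// Receive results until the output channel is closed.
	for result := range out {
		fmt.Println(result)
	}
}
```

Notice that no locks are needed: the channels both carry the data and coordinate the two goroutines.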
Top 5 Benefits of Goroutines in cloud architecture
Backend applications built with Golang have a greater impact on reliability and resource optimization - in a good way!
Efficient Concurrency
Goroutines are designed to be lightweight and have a lower memory footprint compared to traditional threads. This efficiency makes it easier to handle a large number of concurrent tasks in cloud applications without consuming excessive resources.
Scalability
Cloud architectures often require the ability to scale resources up or down dynamically based on demand. Goroutines can help distribute workloads efficiently, enabling cloud applications to handle increased traffic and workload while reducing the need to provision or manage additional VM instances.
Parallelism
Goroutines allow for easy parallelism by executing tasks concurrently. This improves performance for work that can be divided into smaller subtasks, such as data processing, image manipulation, or network requests. In a cloud environment, this translates to faster response times and optimized resource utilization.
Cost Optimization
Cloud services are billed based on resource usage. By utilizing goroutines, applications can make better use of available resources, optimizing the utilization of CPU cores and memory. This efficiency can result in cost savings in cloud deployments.
Resource Pooling
In cloud architectures, resources like database connections or network sockets need to be managed efficiently. Goroutines are used to manage resource pooling, allowing multiple tasks to share limited resources effectively.
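A common way to model this in Go is a buffered channel that holds the shared resources. The sketch below assumes a hypothetical conn type standing in for something expensive like a database connection:

```go
package main

import (
	"fmt"
	"sync"
)

// conn stands in for an expensive resource such as a database connection.
type conn struct{ id int }

func main() {
	const poolSize = 3

	// A buffered channel acts as the pool: taking a value acquires
	// a resource, sending it back releases it.
	pool := make(chan *conn, poolSize)
	for i := 1; i <= poolSize; i++ {
		pool <- &conn{id: i}
	}

	var wg sync.WaitGroup
	for task := 1; task <= 10; task++ {
		wg.Add(1)
		go func(task int) {
			defer wg.Done()
			c := <-pool                  // acquire a connection (blocks if none are free)
			defer func() { pool <- c }() // release it when done
			fmt.Printf("task %d using connection %d\n", task, c.id)
		}(task)
	}
	wg.Wait()
}
```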
Top 3 Frequently used concurrency patterns in Golang
As a very basic example, if you prefix a function call with the go keyword, it will be executed concurrently as a goroutine; a short snippet follows. After that, let us take a look at some of the more advanced patterns.
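A minimal sketch, where sayHello is just a placeholder function and sync.WaitGroup keeps main from exiting before the goroutines finish:

```go
package main

import (
	"fmt"
	"sync"
)

func sayHello(name string, wg *sync.WaitGroup) {
	defer wg.Done()
	fmt.Println("Hello,", name)
}

func main() {
	var wg sync.WaitGroup
	names := []string{"alice", "bob", "carol"}

	for _, name := range names {
		wg.Add(1)
		// The go keyword launches each call as a separate goroutine.
		go sayHello(name, &wg)
	}

	// Wait for every goroutine to finish before main exits.
	wg.Wait()
}
```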
Producer-Consumer
As the name suggests, the producer function is where input data is generated; the data is passed via a channel to the consumer function. The consumer function processes any data that arrives on the channel concurrently. There can be multiple producers, and a single consumer can take care of all the input data, as in the sketch below.
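A minimal sketch of the pattern, assuming three producers and one consumer; the producer and consumer functions here are illustrative:

```go
package main

import (
	"fmt"
	"sync"
)

// producer generates input data and sends it on the channel.
func producer(id int, out chan<- string, wg *sync.WaitGroup) {
	defer wg.Done()
	for i := 1; i <= 3; i++ {
		out <- fmt.Sprintf("producer %d: item %d", id, i)
	}
}

// consumer processes whatever arrives on the channel.
func consumer(in <-chan string, done chan<- struct{}) {
	for msg := range in {
		fmt.Println("consumed:", msg)
	}
	done <- struct{}{}
}

func main() {
	data := make(chan string)
	done := make(chan struct{})

	// A single consumer handles input from multiple producers.
	go consumer(data, done)

	var wg sync.WaitGroup
	for id := 1; id <= 3; id++ {
		wg.Add(1)
		go producer(id, data, &wg)
	}

	// Close the channel once every producer has finished,
	// then wait for the consumer to drain it.
	wg.Wait()
	close(data)
	<-done
}
```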
Worker Pools
In this pattern, the logic to process the input data is wrapped in a separate function called the “worker” function. The calling function (see the sketch after this list)
divides the input data into multiple batches
creates input channel
calls the worker function with each batch of input data
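Here is a minimal sketch of the pattern; the worker function and the doubling logic inside it are placeholders, and the jobs are fed to the pool over a shared input channel:

```go
package main

import (
	"fmt"
	"sync"
)

// worker processes jobs from the jobs channel and sends
// results on the results channel.
func worker(jobs <-chan int, results chan<- int, wg *sync.WaitGroup) {
	defer wg.Done()
	for job := range jobs {
		results <- job * 2 // placeholder for real processing
	}
}

func main() {
	const numWorkers = 3
	input := []int{1, 2, 3, 4, 5, 6}

	jobs := make(chan int, len(input))
	results := make(chan int, len(input))

	// Start a fixed pool of workers.
	var wg sync.WaitGroup
	for w := 1; w <= numWorkers; w++ {
		wg.Add(1)
		go worker(jobs, results, &wg)
	}

	// Feed the input data to the pool, then close the jobs channel.
	for _, n := range input {
		jobs <- n
	}
	close(jobs)

	// Close results once all workers have finished.
	wg.Wait()
	close(results)

	for r := range results {
		fmt.Println("result:", r)
	}
}
```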
Fan-Out, Fan-In
Similar to the Worker Pools pattern, a worker function is present here as well. The difference is that the calling function waits to aggregate the results from all the workers (fan-in). The calling function (see the sketch after this list)
divides the input data into multiple batches
creates input channel and results channel
passes each batch of data to the worker function using the go keyword to induce concurrent execution
waits for all the workers to return results on results channel
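A minimal sketch of fan-out/fan-in; the worker and merge helpers are illustrative, and the workers here draw from a shared input channel rather than pre-divided batches:

```go
package main

import (
	"fmt"
	"sync"
)

// worker (fan-out): each instance reads from the shared input channel
// and produces results on its own output channel.
func worker(in <-chan int) <-chan int {
	out := make(chan int)
	go func() {
		defer close(out)
		for n := range in {
			out <- n * n // placeholder for real processing
		}
	}()
	return out
}

// merge (fan-in): aggregates results from all worker channels into one.
func merge(channels ...<-chan int) <-chan int {
	out := make(chan int)
	var wg sync.WaitGroup
	for _, ch := range channels {
		wg.Add(1)
		go func(ch <-chan int) {
			defer wg.Done()
			for n := range ch {
				out <- n
			}
		}(ch)
	}
	go func() {
		wg.Wait()
		close(out)
	}()
	return out
}

func main() {
	in := make(chan int)
	go func() {
		for i := 1; i <= 9; i++ {
			in <- i
		}
		close(in)
	}()

	// Fan out to three workers, then fan in their results.
	results := merge(worker(in), worker(in), worker(in))

	sum := 0
	for r := range results {
		sum += r
	}
	fmt.Println("sum of squares:", sum)
}
```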
Refer to this video for more information on Golang Concurrency.
Sumeet N.