DEV Community

anandsunderraman

Running ELK (Elasticsearch, Logstash, Kibana) on Docker

ELK (Elasticsearch, Logstash, Kibana) is a set of software components that are part of the Elastic Stack.

What does ELK do?

In layman's terms, this is what each of them does:

  • Elasticsearch is primarily a data store
  • Logstash is a data-processing pipeline that parses incoming data and stores it in Elasticsearch in a desired format
  • Kibana is the UI that can be used to query / visualize the data that is stored in Elasticsearch

To get an in-depth understanding of what they do and how they work, I would recommend the Beginner's Guide To Elastic Search.

How do I run ELK?

Since each of the above components is a separate piece of software, one way of running them is to head to their installation instructions and run each one separately.

An easier and more convenient way to run them is with Docker.

Most likely, if you find yourself experimenting with this stack, you will want to run all three together. What better way to achieve that than with Docker and docker-compose?

Docker Compose

At the time of writing this post I was experimenting with ELK stack version 6.6. Hence the following docker-compose.yml refers to image versions 6.6.
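The gist is not reproduced here, but a minimal docker-compose.yml along these lines should work. Note that the exact image tags, the single-node discovery setting, and the ELASTICSEARCH_URL variable for Kibana are my assumptions for the 6.6 images, so treat this as a sketch rather than the author's exact file:

```yaml
version: '3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.6.0
    environment:
      # run as a single node so no cluster bootstrapping is needed
      - discovery.type=single-node
    ports:
      - "9200:9200"

  logstash:
    image: docker.elastic.co/logstash/logstash:6.6.0
    volumes:
      # mount the pipeline configuration discussed below
      - ./logstash-conf:/usr/share/logstash/pipeline
    environment:
      # referenced as ${ELASTIC_HOST} in the logstash output section
      - ELASTIC_HOST=elasticsearch:9200
    ports:
      - "5044:5044"
    depends_on:
      - elasticsearch

  kibana:
    image: docker.elastic.co/kibana/kibana:6.6.0
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
```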

If you notice, the above gist references a directory named logstash-conf. This directory contains a Logstash configuration file that dictates how the data should be parsed.

The contents of this file would be:
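The gist itself is not embedded here, but stitching together the input, filter, and output snippets explained below, the complete pipeline file looks like this:

```
input {
    beats {
        port => "5044"
        codec => "json"
    }
}

filter {
  json {
    source => "message"
  }
}

output {
    elasticsearch {
        hosts => "${ELASTIC_HOST}"
        index => "%{[fields][project]}-%{[fields][application]}-%{+YYYY.MM.dd}"
        codec => json
    }
}
```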

What are we configuring in Logstash?

The following section says we will get the input for Logstash via Beats, which is another piece of software in the Elastic stack that I will attempt to explain in another post.

We configure Logstash to obtain data via port 5044, and we expect the data to be in JSON format:

input {
    beats {
        port => "5044"
        codec => "json"
    }
}
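As an illustration, a Beats shipper such as Filebeat would point at this input with a fragment like the following in its filebeat.yml (the localhost address is an assumption about where Logstash is reachable from the shipper):

```yaml
# filebeat.yml fragment: send events to the Logstash beats input above
output.logstash:
  hosts: ["localhost:5044"]
```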

Here we state that we are using the json filter plugin in Logstash to extract JSON data from the message field of our log events. I know this sounds a bit cryptic, but I hope you take a leap of faith with me on this.

filter {
  json {
    source => "message"
  }
}
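To make this less cryptic, here is a rough Python sketch of what the json filter does to an event. This is purely an illustration, not how Logstash is implemented; the function name and event shape are my own:

```python
import json

def apply_json_filter(event, source="message"):
    """A rough mimic of Logstash's json filter: parse the JSON string
    found in the `source` field and merge the resulting keys into the event."""
    merged = dict(event)
    merged.update(json.loads(event[source]))
    return merged

# A log event whose "message" field holds a JSON string
event = {"message": '{"level": "info", "user": "alice"}'}
print(apply_json_filter(event))
```

After the filter runs, `level` and `user` become top-level fields on the event, which is what makes them queryable in Elasticsearch later.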

Finally, we have the output. We pass the data on to Elasticsearch, which stores it in an index whose name is defined by "%{[fields][project]}-%{[fields][application]}-%{+YYYY.MM.dd}".

output {
    elasticsearch {
        hosts => "${ELASTIC_HOST}"
        index => "%{[fields][project]}-%{[fields][application]}-%{+YYYY.MM.dd}"
        codec => json
    }
}
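To see how that index pattern expands, here is a hypothetical Python sketch of the substitution (the `fields` values are made-up examples, not from the post):

```python
from datetime import date

def index_name(fields, day):
    # Stand-in for Logstash's sprintf-style format:
    #   "%{[fields][project]}-%{[fields][application]}-%{+YYYY.MM.dd}"
    return f"{fields['project']}-{fields['application']}-{day.strftime('%Y.%m.%d')}"

# An event with fields.project = "myproject" and fields.application = "myapp",
# shipped on 2019-03-01, lands in a daily index:
print(index_name({"project": "myproject", "application": "myapp"}, date(2019, 3, 1)))
# → myproject-myapp-2019.03.01
```

The date suffix means each day's events land in a fresh index, which makes it easy to expire old data.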

How do I run it?

  • Start Docker on your local machine
  • Run docker-compose up in the directory that contains the docker-compose.yml

How do I navigate to Kibana?

  • Point your browser to http://localhost:5601/
  • Note: this is based on port 5601, which is mapped for the Kibana container in the docker-compose.yml

Conclusion

The main goal of this tutorial was to demonstrate how to get the ELK stack running using Docker and docker-compose.
