# Setting Up Local ELK Stack


As I promised in the previous post, here we'll take a look at how to spin up an ELK stack. Let's go! 🚀


## Elasticsearch

Straight to the point: open Portainer, go to "Containers", and smash the "Add container" button. Then look at the screenshot below and fill in the values for Name, Image, Ports, and Volumes. You may also want to change the Restart Policy to "Unless stopped". One mild difference from last time is that you need to click "Advanced mode" to be able to pull an image from the Elastic registry. In the screenshot it says "Simple mode" because I had already switched to advanced.

*(Screenshot: Elastic create container)*

Then hit the "Deploy the container" button and wait a little.

Almost nothing new here, if you've read my first article about dockerizing the stuff you need. You did read that, right? 🤔

Here I'm not creating any volumes, just because. But you most definitely can: just map one to `/usr/share/elasticsearch/data`.
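If you prefer the command line over Portainer's form, the same container can be sketched as a `docker run`. The image tag and port mapping here are my assumptions, so match them to what you actually see in the screenshot:

```shell
# CLI sketch of the container configured above. The 7.x tag is an
# assumption; use whichever version you pulled from the Elastic registry.
# The volume line is optional (map it if you want data to survive rebuilds).
docker run -d \
  --name elasticsearch \
  --restart unless-stopped \
  -p 9200:9200 \
  -e "discovery.type=single-node" \
  -v elasticsearch_data:/usr/share/elasticsearch/data \
  docker.elastic.co/elasticsearch/elasticsearch:7.14.2
```

`discovery.type=single-node` keeps Elasticsearch from waiting for other cluster nodes, which is exactly what you want locally.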


## Kibana

First create a volume, for example `kibana_data`. Then create the container as in the screenshot below:

*(Screenshot: Kibana create container)*
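In `docker run` terms the Kibana setup looks roughly like this. The tag and the `ELASTICSEARCH_HOSTS` value are assumptions: the URL only resolves if both containers share a Docker network where the Elasticsearch container is reachable by name.

```shell
# CLI sketch of the Kibana container; tag and host URL are assumptions.
# ELASTICSEARCH_HOSTS must point at your Elasticsearch instance.
docker run -d \
  --name kibana \
  --restart unless-stopped \
  -p 5601:5601 \
  -e "ELASTICSEARCH_HOSTS=http://elasticsearch:9200" \
  -v kibana_data:/usr/share/kibana/data \
  docker.elastic.co/kibana/kibana:7.14.2
```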


## Logstash

Man, am I tired of repeating this 😅 Create a volume (`logstash_data`) and compare what you're typing with the screenshot:

*(Screenshot: Logstash create container)*

Deploy and wait.
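For reference, a `docker run` sketch of the same Logstash container. Mounting the single named volume over `/usr/share/logstash` is my reading of the config paths used below; Docker pre-populates an empty named volume with the image's files on first run, which is why the `config/` and `pipeline/` directories show up under `/var/lib/docker/volumes/logstash_data/_data/`.

```shell
# CLI sketch; the tag is an assumption. The named volume is mounted at
# /usr/share/logstash, which matches the config/ and pipeline/ paths
# edited below (Docker copies the image contents into it on first run).
docker run -d \
  --name logstash \
  --restart unless-stopped \
  -p 5044:5044 \
  -v logstash_data:/usr/share/logstash \
  docker.elastic.co/logstash/logstash:7.14.2
```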

Next, configuring.

```shell
sudo nano /var/lib/docker/volumes/logstash_data/_data/config/logstash.yml
```

I didn't know what this was at first, but it's important: it tells Logstash where to ship its own x-pack monitoring data. Point it at your Elasticsearch URL:

```yaml
xpack.monitoring.elasticsearch.hosts: [ "" ]
```

Now the pipeline. I have zero explanation for why the main and pipeline configs need to be separate, but when I was first figuring this out it cost me three hours of swearing, frustration, and eye strain. Maybe it's somewhere in the documentation, but it's buried so well I couldn't find it.

```shell
sudo nano /var/lib/docker/volumes/logstash_data/_data/pipeline/logstash.conf
```

I went with the config below since I don't need Beats or whatever it's called. I just want to POST logs to Logstash. You can always adjust it by referencing the official documentation.

```conf
input {
  http {
    # accept plain HTTP requests on this port
    port => 5044
  }
}

output {
  # echo every event to the container logs, handy for debugging
  stdout {
    codec => json
  }
  # and ship it to Elasticsearch, one index per year
  elasticsearch {
    hosts => [""]
    index => "logstash-%{+YYYY}"
  }
}
```
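Logstash dies with a fairly cryptic parse error when a brace goes missing, so before restarting the container I like a crude balance check. This is a throwaway snippet of mine, not an official tool; it runs against a temp copy of the config here, but you can point `conf` at the real file under `logstash_data` instead.

```shell
# Throwaway sanity check: count { and } in a pipeline config to catch
# the unbalanced-brace errors Logstash reports so cryptically.
conf=$(mktemp)
cat > "$conf" <<'EOF'
input {
  http {
    port => 5044
  }
}
output {
  stdout {
    codec => json
  }
  elasticsearch {
    hosts => [""]
    index => "logstash-%{+YYYY}"
  }
}
EOF
opens=$(grep -o '{' "$conf" | wc -l)
closes=$(grep -o '}' "$conf" | wc -l)
if [ "$opens" -eq "$closes" ]; then
  echo "braces balanced"
else
  echo "unbalanced: $opens open vs $closes close"
fi
```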

## Testing

Now we try it! A simple `curl -XPUT '' -d 'log'` will suffice. Then open Kibana: it's under "Management" / "Stack Management" in the toast menu. Choose "Index Management" below "Data" in the side menu. You should see our "logstash-2021" index, or whatever year you live in. That means it has data!
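For the record, here is the shape of that test request. `localhost:5044` is an assumption based on the pipeline's `port` setting, so adjust the host and port to your own mapping:

```shell
# Hypothetical example: send one log event to the Logstash HTTP input.
# The Content-Type header lets Logstash parse the body as JSON.
curl -XPUT 'http://localhost:5044' \
  -H 'Content-Type: application/json' \
  -d '{"message": "hello from curl"}'
```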

*(Screenshot: Kibana Index Management)*

Go to "Kibana" / "Discover" in the toast menu. And what do we see? Right, the one hit we've just sent!

*(Screenshot: Kibana Discover)*

Here you can filter everything as you'd like 🐧

Hope you've learned something new from this article. Now you're a master of local deployment with Docker and a little help from Portainer! Cheers and happy coding!

Top comments (3)

**hiepxanh**: cannot pass `discovery.type=single-node` on Elasticsearch on the newest version of Portainer

**hiepxanh**: must be a bug from Portainer :D

**c_v_ya**: Interesting, I've rebuilt the container not that long ago and it was OK. Not sure if I had the latest Portainer version or not, though.