Have you heard about Elasticsearch? It's a popular search engine that can perform supercharged searches beyond what a plain SQL query can do. With Elasticsearch, you can search based on synonyms or n-grams, build autocomplete, aggregate data across multiple dimensions in a single query, and more.
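As a small taste of that, here is a sketch of a typo-tolerant (fuzzy) search query you could send to Elasticsearch once it is running. The index name articles and the field title are hypothetical examples for illustration, not part of any default setup:

```shell
# A sketch of an Elasticsearch query that tolerates typos via fuzzy matching.
# The index "articles" and the field "title" are hypothetical examples.
QUERY='{
  "query": {
    "match": {
      "title": {
        "query": "elasticsaerch",
        "fuzziness": "AUTO"
      }
    }
  }
}'
echo "$QUERY"
# With the stack from this post running, you could send it like this:
# curl -u elastic:changeme -H 'Content-Type: application/json' \
#      'http://localhost:9200/articles/_search' -d "$QUERY"
```

Note the deliberately misspelled search term: with "fuzziness": "AUTO", Elasticsearch would still match documents containing "elasticsearch".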
Elastic doesn't provide only a search engine; they also provide a data visualization tool called Kibana
to give you insight into your data in an interesting way. You will probably also need a data collection engine to store data in Elasticsearch automatically from different sources, so they made Logstash
for you.
Together, these three services form a stack abbreviated as ELK.
Since you are reading this post, I assume you are eager to learn more about the ELK stack. I'm not going to tell you everything about Elasticsearch here, but I want to help you get Elasticsearch up and running with ease using Docker-ELK.
Setting Up and Running Docker-ELK
Before we get started, make sure you have docker
and docker-compose
installed on your machine.
You can grab the latest Docker-ELK
repository from its official GitHub page or clone it using the git command:
git clone https://github.com/deviantony/docker-elk.git
After you have cloned the repository, go inside the docker-elk
directory using your favorite CLI and execute this docker-compose
command:
docker-compose up -d
The above command will pull and build the containers (elasticsearch, kibana, logstash). The pull and build will take a long time on the first run, depending on your internet connection.
After those processes are done, check the containers' status by executing:
docker-compose ps
Everything is OK if the output of the above command looks like this:
Name Command State Ports
------------------------------------------------------------------------------------------------------------------------------
docker-elk_elasticsearch_1 /usr/local/bin/docker-entr ... Up 0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp
docker-elk_kibana_1 /usr/local/bin/dumb-init - ... Up 0.0.0.0:5601->5601/tcp
docker-elk_logstash_1 /usr/local/bin/docker-entr ... Up 0.0.0.0:5000->5000/tcp, 5044/tcp, 0.0.0.0:9600->9600/tcp
Accessing Elasticsearch
The Elasticsearch service should be accessible at http://localhost:9200 using an HTTP client like Postman. Authenticate with elastic
as the username and changeme
as the password, and set application/json
as the Content-Type
header.
Here is my result in Postman:
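If you prefer a terminal over Postman, the same request can be made with curl. This is just a convenience sketch; the call is guarded so it prints a short notice instead of failing when the stack isn't up yet:

```shell
# Same request as the Postman one, done with curl.
# Credentials and port are docker-elk's defaults (elastic/changeme on 9200).
OUT=$(curl -s --max-time 5 -u elastic:changeme \
  -H 'Content-Type: application/json' http://localhost:9200 \
  || echo 'Elasticsearch is not reachable yet')
echo "$OUT"
```

When the stack is up, Elasticsearch answers with a JSON document describing the cluster.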
Accessing Kibana
The Kibana app should be accessible at http://localhost:5601 using your favorite browser. If you get something like Kibana server is not ready yet
in the browser, it means you really do have to wait for the server to come up. Wait for about 10-15 minutes, then refresh the page and you will see the Kibana login screen.
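Instead of refreshing by hand, you can poll Kibana's status endpoint from a terminal. This is just a convenience sketch; the attempt count and sleep interval are arbitrary, so raise them for real waiting:

```shell
# Poll Kibana's /api/status endpoint until it answers with HTTP 200.
# Uses docker-elk's default credentials; curl prints 000 when unreachable.
for i in 1 2 3; do
  STATUS=$(curl -s -o /dev/null -w '%{http_code}' --max-time 2 \
    -u elastic:changeme http://localhost:5601/api/status)
  if [ "$STATUS" = "200" ]; then
    echo "Kibana is ready"
    break
  fi
  echo "attempt $i: Kibana not ready yet (HTTP $STATUS)"
  sleep 1
done
```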
You can use the same elastic
and changeme
credentials as username and password to log in.
Accessing Logstash
The Logstash service should be accessible at localhost:5000. I'm still new to the ELK stack, so I will provide an example of using this service later, perhaps in a dedicated blog post.
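That said, in docker-elk's default pipeline (at the time of writing) port 5000 is a TCP input, so as a quick smoke test you can push a line at it with netcat. The JSON payload below is just a made-up example:

```shell
# Push one test event into Logstash's TCP input on port 5000.
# Guarded so the command does not hang or fail when the stack is down.
MSG='{"message": "hello from the blog post", "source": "manual-test"}'
printf '%s\n' "$MSG" | nc -w 2 localhost 5000 \
  || echo 'Logstash is not reachable yet'
echo "sent: $MSG"
```

If the event went through, it should eventually show up in Elasticsearch under Logstash's default index and be visible in Kibana.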
Have fun exploring ELK Stack!
ELK version used: 7.4.1
Top comments (4)
Hey, I'm currently using version 7.9. Any ideas or tutorials on how to use it?
I tried running Kibana, but for several hours I've been getting the response 'Kibana server is not ready yet' on Windows.
Any inputs?
Try to inspect it using
docker-compose logs kibana
and see what's happening.
Thank you dendi, but it's my build error:
any idea?
Maybe an internet connection problem?