
Daniel Albuschat

Request for Comments: Slurping Kubernetes logs

I've investigated many tools and stacks for storing and searching container logs, especially from Kubernetes clusters. In my environment, I have a hefty constraint: the logs, which may contain personal information, e.g. usernames in URIs, must not be stored outside the EU. GDPR and all. That excludes the ready-to-use services like Datadog, Dynatrace, New Relic, etc.

The ELK stack, consisting of Elasticsearch, Logstash and Kibana, seems pretty popular and can be used on-premise. However, while investigating this and other solutions, I found them all to be very complex, and I read comments about them being hard to set up and maintain.

Thinking about it, I wonder whether a very simple approach might yield good results. My idea is simple: just write a small script that fetches logs from containers via `kubectl logs --timestamps=true` once a minute or so. Each consecutive fetch passes `--since-time` with the last received timestamp. The script pushes the results into some database to archive them and make them searchable.
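
To make the idea concrete, here is a minimal sketch of such a poller in Python. The pod name, namespace and `store()` function are placeholders of mine, not part of the original idea; a real version would enumerate pods and write into an actual database.

```python
import subprocess
import time

POD = "my-pod"          # assumption: the pod whose logs we slurp
NAMESPACE = "default"   # assumption: its namespace
INTERVAL = 60           # poll once a minute, as described above


def store(timestamp, message):
    # Placeholder: a real implementation would insert into a database.
    print(timestamp, message)


since_time = None
while True:
    cmd = ["kubectl", "logs", POD, "-n", NAMESPACE, "--timestamps=true"]
    if since_time is not None:
        # Resume from the last timestamp received in the previous fetch.
        cmd.append(f"--since-time={since_time}")
    result = subprocess.run(cmd, capture_output=True, text=True)
    for line in result.stdout.splitlines():
        # --timestamps=true prefixes every line with an RFC3339 timestamp.
        timestamp, _, message = line.partition(" ")
        since_time = timestamp
        # Note: depending on kubectl's boundary semantics, the line at
        # since_time may be fetched again on the next poll, so some
        # deduplication may be needed in practice.
        store(timestamp, message)
    time.sleep(INTERVAL)
```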

I know, this is a "roll-your-own" approach, but it seems pretty simple and also effective. I'd like to ask the dev community for feedback on this idea, and for pointers to alternatives that are not complex and can be used on-premise.

So, what are your thoughts?

Top comments (1)

Daniel Albuschat

Or, instead of polling the logs, one could do the same with `kubectl logs -f` instead and push the log lines "live" into the store...
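
A minimal sketch of that streaming variant, again with a hypothetical pod name and placeholder `store()`:

```python
import subprocess


def store(timestamp, message):
    # Placeholder: a real implementation would insert into a database.
    print(timestamp, message)


# Follow the pod's log and forward each line as soon as it arrives.
proc = subprocess.Popen(
    ["kubectl", "logs", "-f", "--timestamps=true", "my-pod"],
    stdout=subprocess.PIPE,
    text=True,
)
for line in proc.stdout:  # blocks until kubectl emits the next line
    timestamp, _, message = line.rstrip("\n").partition(" ")
    store(timestamp, message)
# If kubectl or the pod restarts, the loop ends; a supervisor would need
# to restart it, perhaps resuming via --since-time as in the polling sketch.
```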