<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Martin Hynar</title>
    <description>The latest articles on DEV Community by Martin Hynar (@martinhynar).</description>
    <link>https://dev.to/martinhynar</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F176768%2Fc0dfd031-83db-435e-bdf7-7f1394e00ecf.jpeg</url>
      <title>DEV Community: Martin Hynar</title>
      <link>https://dev.to/martinhynar</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/martinhynar"/>
    <language>en</language>
    <item>
      <title>Reason: user 'xxx' does not meet 'require'ments for user to be allowed access</title>
      <dc:creator>Martin Hynar</dc:creator>
      <pubDate>Tue, 09 Jun 2020 08:45:52 +0000</pubDate>
      <link>https://dev.to/martinhynar/reason-user-xxx-does-not-meet-require-ments-for-user-to-be-allowed-access-18aj</link>
      <guid>https://dev.to/martinhynar/reason-user-xxx-does-not-meet-require-ments-for-user-to-be-allowed-access-18aj</guid>
      <description>&lt;p&gt;This post is meant as lessons learned from setting up Apache HTTPD server with authentication for some backend resources. The goal was to secure the resources with simple password based authentication and allow given list of users to access them. Accounts have been assembed in dbm file on local filesystem.&lt;/p&gt;

&lt;h3&gt;
  
  
  Configuration that works, but...
&lt;/h3&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;VirtualHost *:80&amp;gt;
    ErrorLog /var/log/httpd/http-error.log
    CustomLog /var/log/httpd/http-access.log combined

    &amp;lt;Location /myresources&amp;gt;
        AuthType basic
        AuthName "Authenticate using username and password"
        AuthBasicProvider dbm

        AuthDBMUserFile "/etc/httpd/authentication.dbm"
        AuthGroupFile "/etc/httpd/groups"
        &amp;lt;RequireAny&amp;gt;
            Require user adam
            Require user bob
            Require user cecil
        &amp;lt;/RequireAny&amp;gt;
    &amp;lt;/Location&amp;gt;
&amp;lt;/VirtualHost&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;So, what we have here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Credentials for user accounts are stored in &lt;code&gt;/etc/httpd/authentication.dbm&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Access to &lt;code&gt;myresources&lt;/code&gt; is allowed only to users &lt;code&gt;adam, bob, cecil&lt;/code&gt;. This is defined using &lt;code&gt;RequireAny&lt;/code&gt;, meaning that if &lt;code&gt;any&lt;/code&gt; of these requirements is matched, the user is granted access.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This will work, but you will see disturbing messages in the error log. They are disturbing because they name users that &lt;em&gt;are allowed&lt;/em&gt; access! At the same time, you will not see any error response in the access log. (Log messages are shortened.)&lt;/p&gt;

&lt;p&gt;&lt;code&gt;/var/log/httpd/http-error.log&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[authz_user:error] [client 10.10.10.10:10000] AH01663: access to /myresources failed, reason: user 'bob' does not meet 'require'ments for user to be allowed access
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;&lt;code&gt;/var/log/httpd/http-access.log&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;10.10.10.10 - bob "POST /myresources HTTP/1.1" 200 102
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;h4&gt;
  
  
  Why is that?
&lt;/h4&gt;

&lt;p&gt;The reason is that the error log contains one message for each &lt;code&gt;Require&lt;/code&gt; directive that was evaluated but not satisfied, and the directives are checked in order. For &lt;em&gt;adam&lt;/em&gt; there won't be any error message, as he is first in the list. For &lt;em&gt;bob&lt;/em&gt; there will be 1 message and for &lt;em&gt;cecil&lt;/em&gt; there will be 2 messages for each request.&lt;/p&gt;

&lt;h3&gt;
  
  
  Configuration that works, but...
&lt;/h3&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;VirtualHost *:80&amp;gt;
    ErrorLog /var/log/httpd/http-error.log
    CustomLog /var/log/httpd/http-access.log

    &amp;lt;Location /myresources&amp;gt;
        AuthType basic
        AuthName "Authenticate using username and password"
        AuthBasicProvider dbm

        AuthDBMUserFile "/etc/httpd/authentication-myresources.dbm"
        AuthGroupFile "/etc/httpd/groups"
        Require valid-user
    &amp;lt;/Location&amp;gt;
&amp;lt;/VirtualHost&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;The difference here is that there is no &lt;code&gt;Require user&lt;/code&gt; list; instead, any valid user is allowed. But the authentication database may contain more users than only those allowed to access &lt;code&gt;myresources&lt;/code&gt;. This is why the configuration points to a different authentication file, and you have to keep the correct list of users in it.&lt;/p&gt;
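
&lt;p&gt;With this approach, granting or revoking access becomes a matter of editing the resource's own dbm file rather than the &lt;code&gt;Require&lt;/code&gt; lists. A sketch, again assuming the &lt;code&gt;htdbm&lt;/code&gt; utility (the user &lt;em&gt;dave&lt;/em&gt; is hypothetical):&lt;/p&gt;

```shell
# Grant access: add the user to the resource's own dbm file
htdbm /etc/httpd/authentication-myresources.dbm dave
# Revoke access: delete the user record (-x) instead of touching httpd.conf
htdbm -x /etc/httpd/authentication-myresources.dbm bob
```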

&lt;p&gt;The latter configuration won't generate &lt;em&gt;false alarms&lt;/em&gt; in the error log, but I don't consider it perfect either.&lt;/p&gt;

</description>
      <category>apache</category>
      <category>httpd</category>
      <category>authentication</category>
    </item>
    <item>
      <title>Monitoring Kafka brokers using Jolokia, Metricbeat and ElasticSearch</title>
      <dc:creator>Martin Hynar</dc:creator>
      <pubDate>Fri, 29 May 2020 13:43:32 +0000</pubDate>
      <link>https://dev.to/martinhynar/monitoring-kafka-brokers-using-jolokia-metricbeat-and-elasticsearch-5678</link>
      <guid>https://dev.to/martinhynar/monitoring-kafka-brokers-using-jolokia-metricbeat-and-elasticsearch-5678</guid>
      <description>&lt;p&gt;If you don't have commercial installation of Kafka that comes packed with monitoring solution, you have to build it on your own. You always need to have at least some tools to find what is going on in your service. Visualizing available metrics is indeed valuable option.&lt;/p&gt;

&lt;p&gt;In this text, I will describe how to get these important metrics out of Kafka and ship them to ElasticSearch, where they can be visualized using Kibana. I will focus mainly on the Kafka and Metricbeat configuration (how to get the metrics) rather than on visualization (make the figures to your own taste).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Component list&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://kafka.apache.org/" rel="noopener noreferrer"&gt;Kafka&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://jolokia.org" rel="noopener noreferrer"&gt;Jolokia&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.elastic.co/beats/metricbeat" rel="noopener noreferrer"&gt;Metricbeat&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.elastic.co/elasticsearch/" rel="noopener noreferrer"&gt;ElasticSearch&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.elastic.co/kibana" rel="noopener noreferrer"&gt;Kibana&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Data pipeline
&lt;/h1&gt;

&lt;p&gt;Kafka advertises runtime metrics using dedicated MBeans that can be made available over the JMX interface. This is exactly the channel we will use here. One of the tools developed to collect metrics from various systems is Metricbeat, which comes with a prepared Kafka module that simply "knows what metrics Kafka provides". Behind the scenes, Metricbeat makes use of Jolokia, which serves as a bridge to JMX and exposes the metrics over HTTP/JSON.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.plantuml.com%2Fplantuml%2Fsvg%2FNSun3i8m44JHtgVOaXI-048e53H4L9HMYebPo2gEZUmDI3azXWxQhwVHj4MHTPc3CQx3RG9jNg8ZdL98au0ETuxQzBIpvCwiMVp0q9wsL30_0fkVQlVaZW55nLyOsOzVg2bNzzTj7UnaGCJ7FAKi2BAoCIt7Q_tp1W00" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.plantuml.com%2Fplantuml%2Fsvg%2FNSun3i8m44JHtgVOaXI-048e53H4L9HMYebPo2gEZUmDI3azXWxQhwVHj4MHTPc3CQx3RG9jNg8ZdL98au0ETuxQzBIpvCwiMVp0q9wsL30_0fkVQlVaZW55nLyOsOzVg2bNzzTj7UnaGCJ7FAKi2BAoCIt7Q_tp1W00" alt="pipeline"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Setting parts up
&lt;/h1&gt;

&lt;h3&gt;
  
  
  Kafka with Jolokia
&lt;/h3&gt;

&lt;blockquote&gt;
&lt;p&gt;Jolokia is a JMX-HTTP bridge giving an alternative to JSR-160 connectors. It is an agent based approach with support for many platforms. In addition to basic JMX operations it enhances JMX remoting with unique features like bulk requests and fine grained security policies.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;In this article, I will not describe the installation of Kafka itself; there is plenty of up-to-date documentation available. I am using the rpm-packaged Kafka distribution from &lt;a href="https://www.confluent.io" rel="noopener noreferrer"&gt;Confluent.io&lt;/a&gt;. Let's focus on Jolokia.&lt;/p&gt;

&lt;p&gt;We will be running Jolokia in agent mode. That is, we'll put the Jolokia agent library on Kafka's class path, configure Jolokia in Kafka's configuration files and start Kafka. This way, Jolokia connects to Kafka's monitoring MBeans and provides the metrics on an HTTP interface.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Download &lt;strong&gt;Jolokia JVM Agent&lt;/strong&gt; - &lt;a href="https://jolokia.org/download.html" rel="noopener noreferrer"&gt;https://jolokia.org/download.html&lt;/a&gt; &lt;/li&gt;
&lt;li&gt;Save it into Kafka's lib folder &lt;code&gt;/usr/share/java/kafka&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;For easier configuration, either rename it or make a symlink &lt;code&gt;/usr/share/java/kafka/jolokia-jvm-agent.jar&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
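
&lt;p&gt;The steps above can be sketched as follows (the Maven Central download URL for the 1.6.2 agent used in this post is my assumption; adjust the version to your needs):&lt;/p&gt;

```shell
# Download the Jolokia JVM agent jar
curl -sLO https://repo1.maven.org/maven2/org/jolokia/jolokia-jvm/1.6.2/jolokia-jvm-1.6.2-agent.jar
# Put it into Kafka's lib folder
sudo mv jolokia-jvm-1.6.2-agent.jar /usr/share/java/kafka/
# Symlink it to a version-independent name for easier configuration
sudo ln -s /usr/share/java/kafka/jolokia-jvm-1.6.2-agent.jar /usr/share/java/kafka/jolokia-jvm-agent.jar
```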

&lt;p&gt;Now we have the agent library in place and need to configure it:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open &lt;code&gt;/usr/bin/kafka-server-start&lt;/code&gt; and add &lt;code&gt;KAFKA_JMX_OPTS&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

export KAFKA_JMX_OPTS="
-javaagent:/usr/share/java/kafka/jolokia-jvm-agent.jar=port=8778,host=localhost \
-Dcom.sun.management.jmxremote=true \
-Dcom.sun.management.jmxremote.authenticate=false \
-Dcom.sun.management.jmxremote.ssl=false \
-Djava.rmi.server.hostname=localhost \
-Dcom.sun.management.jmxremote.host=localhost \
-Dcom.sun.management.jmxremote.port=9999 \
-Dcom.sun.management.jmxremote.rmi.port=9999 \
-Djava.net.preferIPv4Stack=true"



&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;Start Kafka&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;At this point, you should be able to see a listening process on port 8778, and you can try to get some numbers from it.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

&lt;span class="nb"&gt;sudo &lt;/span&gt;netstat &lt;span class="nt"&gt;-ltnp&lt;/span&gt;
curl &lt;span class="nt"&gt;-s&lt;/span&gt; http://localhost:8778/jolokia/version | jq


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
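&lt;p&gt;Beyond the version endpoint, you can already read individual Kafka MBeans through Jolokia's REST interface; for example, the standard broker throughput metric:&lt;/p&gt;

```shell
# Read one broker MBean via Jolokia's read operation
curl -s http://localhost:8778/jolokia/read/kafka.server:type=BrokerTopicMetrics,name=MessagesInPerSec | jq
```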
&lt;h3&gt;
  
  
  Metricbeat
&lt;/h3&gt;

&lt;p&gt;After installing Metricbeat, you will find the default configuration in &lt;code&gt;/etc/metricbeat&lt;/code&gt; with the Kafka module disabled. In Metricbeat, we need to do 2 things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Configure Kafka module for sampling, and&lt;/li&gt;
&lt;li&gt;Configure Metricbeat to send data to ElasticSearch and create Kibana artifacts.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To configure Kafka sampling:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Rename &lt;code&gt;/etc/metricbeat/modules.d/kafka.yml.disabled&lt;/code&gt; to &lt;code&gt;/etc/metricbeat/modules.d/kafka.yml&lt;/code&gt; to activate it.&lt;/li&gt;
&lt;li&gt;Open &lt;code&gt;/etc/metricbeat/modules.d/kafka.yml&lt;/code&gt;. To read all available metrics, the configuration should have the following content:&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;

&lt;span class="c1"&gt;# Module: kafka&lt;/span&gt;
&lt;span class="c1"&gt;# Docs: https://www.elastic.co/guide/en/beats/metricbeat/7.6/metricbeat-module-kafka.html&lt;/span&gt;

&lt;span class="c1"&gt;# Kafka metrics collected using the Kafka protocol&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;module&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;kafka&lt;/span&gt;
  &lt;span class="na"&gt;metricsets&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;partition&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;consumergroup&lt;/span&gt;
  &lt;span class="na"&gt;period&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;10s&lt;/span&gt;
  &lt;span class="na"&gt;hosts&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;localhost:9092"&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;

  &lt;span class="na"&gt;client_id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;metricbeat&lt;/span&gt;

  &lt;span class="c1"&gt;# List of Topics to query metadata for. If empty, all topics will be queried.&lt;/span&gt;
  &lt;span class="c1"&gt;#topics: []&lt;/span&gt;

&lt;span class="c1"&gt;# Metrics collected from a Kafka broker using Jolokia&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;module&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;kafka&lt;/span&gt;
  &lt;span class="na"&gt;metricsets&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;broker&lt;/span&gt;
  &lt;span class="na"&gt;period&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;10s&lt;/span&gt;
  &lt;span class="na"&gt;hosts&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;localhost:8778"&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;For a detailed description of Metricbeat's Kafka module, see the &lt;a href="https://www.elastic.co/guide/en/beats/metricbeat/current/metricbeat-module-kafka.html" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;To configure proper forwarding of metrics to ElasticSearch:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open &lt;code&gt;/etc/metricbeat/metricbeat.yml&lt;/code&gt;, and configure &lt;code&gt;elasticsearch&lt;/code&gt; output&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;

&lt;span class="na"&gt;output.elasticsearch&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;hosts&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;elasticsearch:9200"&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
  &lt;span class="na"&gt;index&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;metricbeat-kafka-%{[agent.version]}-%{+yyyy-MM-dd}"&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;Start Metricbeat service.&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

&lt;span class="nb"&gt;sudo &lt;/span&gt;systemctl start metricbeat


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;At this moment, Metricbeat should start sampling Kafka metrics at a 10-second interval and sending them to ElasticSearch.&lt;/p&gt;
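
&lt;p&gt;One way to verify that data is actually arriving is to ask ElasticSearch directly (assuming the &lt;code&gt;elasticsearch:9200&lt;/code&gt; host from the output configuration above is reachable):&lt;/p&gt;

```shell
# Confirm that the daily metricbeat-kafka-* indices are being created
curl -s 'http://elasticsearch:9200/_cat/indices/metricbeat-kafka-*?v'
# Peek at one sampled document
curl -s 'http://elasticsearch:9200/metricbeat-kafka-*/_search?size=1'
```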

&lt;p&gt;If you have Kibana installed, you can tell Metricbeat to create visualizations and a dashboard for you. For this, you need to configure the &lt;a href="https://www.elastic.co/guide/en/beats/metricbeat/current/setup-kibana-endpoint.html" rel="noopener noreferrer"&gt;Kibana endpoint&lt;/a&gt; parameters - where your Kibana is - and the &lt;a href="https://www.elastic.co/guide/en/beats/metricbeat/current/configuration-dashboards.html" rel="noopener noreferrer"&gt;Kibana dashboard&lt;/a&gt; parameters - where in Kibana you want the artifacts to be created.&lt;/p&gt;

&lt;p&gt;To create the Kibana artifacts, run the &lt;code&gt;metricbeat setup&lt;/code&gt; command.&lt;/p&gt;
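
&lt;p&gt;For example:&lt;/p&gt;

```shell
# Load only the bundled dashboards into Kibana
sudo metricbeat setup --dashboards
# Or run the full setup (index template, dashboards, ...) in one go
sudo metricbeat setup
```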

&lt;p&gt;&lt;strong&gt;Version information&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For this post, I used the following component versions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Kafka - Confluent Community 5.4.1&lt;/li&gt;
&lt;li&gt;Jolokia 1.6.2&lt;/li&gt;
&lt;li&gt;Metricbeat 7.6.2&lt;/li&gt;
&lt;li&gt;ElasticSearch 7.6.0&lt;/li&gt;
&lt;li&gt;Kibana 7.6.0&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>kafka</category>
      <category>jolokia</category>
      <category>metricbeat</category>
      <category>elasticsearch</category>
    </item>
  </channel>
</rss>
