<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Elastic</title>
    <description>The latest articles on DEV Community by Elastic (@elastic).</description>
    <link>https://dev.to/elastic</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Forganization%2Fprofile_image%2F519%2F4dd52f5c-2cf8-4957-85c2-06fa5f712c45.png</url>
      <title>DEV Community: Elastic</title>
      <link>https://dev.to/elastic</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/elastic"/>
    <language>en</language>
    <item>
      <title>What is Context Engineering?</title>
      <dc:creator>Carly Richmond</dc:creator>
      <pubDate>Tue, 09 Sep 2025 14:22:24 +0000</pubDate>
      <link>https://dev.to/elastic/what-is-context-engineering-f56</link>
      <guid>https://dev.to/elastic/what-is-context-engineering-f56</guid>
      <description>&lt;p&gt;With the fast-paces evolving nature of AI, new terms and techniques appear all the time. One of the latest discussions is around context engineering. If you are not sure what context engineering is, why it's important, or what techniques you can use to optimize the context your agentic systems use, read on to find out.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is context engineering?
&lt;/h2&gt;

&lt;p&gt;Context engineering refers to a collection of practices that can be combined to provide the right information to Large Language Models (or LLMs) to help them accomplish the desired task. It's important to ensure that the LLMs we use in agents and MCP tools have the right information sources so that they provide accurate results, and don't hallucinate or fail to give the desired answer. My high school maths teacher always talked about the notion of "rubbish in, rubbish out" in terms of the inputs we provided to our calculations and proofs. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcs9ml22zdfs8s0dh2nam.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcs9ml22zdfs8s0dh2nam.png" alt="ChatGPT Output asking what they author said about Context Engineering" width="800" height="406"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The same goes for LLMs. We can't expect LLMs to provide the answers and automations we need accurately without providing them with the right information. As you can see from the above example leveraging ChatGPT, it can only pick up information on which it was trained, or that is provided in the context via the components discussed in subsequent sections.&lt;/p&gt;

&lt;h2&gt;
  
  
  Components
&lt;/h2&gt;

&lt;p&gt;The below visualization showcases the key components of context that we can use to improve the responses of LLMs invoked by AI agents and called by MCP tools: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fba64t2n3lzyf6s3x53u0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fba64t2n3lzyf6s3x53u0.png" alt="Context Engineering ([credit Philipp Schmid](https://www.philschmid.de/context-engineering))" width="800" height="611"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As &lt;a href="https://github.com/humanlayer/12-factor-agents/blob/main/content/factor-03-own-your-context-window.md" rel="noopener noreferrer"&gt;Dexter Horthy outlines in his 3rd principle of 12-Factor Agents&lt;/a&gt;, it's important to own your context to ensure LLMs generate the best outputs possible.&lt;/p&gt;

&lt;h3&gt;
  
  
  RAG
&lt;/h3&gt;

&lt;p&gt;RAG is an architectural pattern where data sourced from an information retrieval system, such as Elasticsearch, is provided to an LLM to ground and enhance the results that it generates. We've covered RAG in many Elasticsearch Labs blogs, &lt;a href="https://www.elastic.co/search-labs/blog/retrieval-augmented-generation-rag" rel="noopener noreferrer"&gt;including this one&lt;/a&gt; which provides an overview of the construct, and &lt;a href="https://www.elastic.co/search-labs/tutorials/chatbot-tutorial/welcome" rel="noopener noreferrer"&gt;this tutorial&lt;/a&gt; which covers building a simple RAG chatbot with Python, LangChain, React and Elasticsearch. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzc69c4eepaar5ty2neht.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzc69c4eepaar5ty2neht.jpg" alt="RAG Sample Architecture" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Although some suggest that the ever-expanding context windows of newer LLMs mean that "RAG is dead", in practice many find their LLM suffers from &lt;a href="https://www.dbreunig.com/2025/06/22/how-contexts-fail-and-how-to-fix-them.html#context-confusion" rel="noopener noreferrer"&gt;context confusion&lt;/a&gt;, as covered by Drew Breunig. Context confusion refers to the issue where surplus information provided to the LLM leads to a sub-optimal response. RAG helps direct LLMs to the desired result as it addresses common limitations of general LLMs, including:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Lack of specific domain knowledge for jargon-heavy disciplines such as financial services or engineering.&lt;/li&gt;
&lt;li&gt;Newer information or events that have happened after the model has been trained.&lt;/li&gt;
&lt;li&gt;Hallucinations, where the LLM generates incorrect answers.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;RAG typically involves pulling relevant documents from a data store and passing them via the prompt, or through dedicated AI tools invoked by an LLM. A simple example from the &lt;a href="https://www.elastic.co/search-labs/blog/ai-agents-ai-sdk-elasticsearch#flight-information-tool" rel="noopener noreferrer"&gt;AI SDK Travel Planner&lt;/a&gt; leveraging the &lt;a href="https://www.elastic.co/docs/reference/elasticsearch/clients/javascript" rel="noopener noreferrer"&gt;Elasticsearch JavaScript client&lt;/a&gt; is given below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;tool&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nx"&gt;createTool&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ai&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;zod&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;Client&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@elastic/elasticsearch&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;SearchResponseBody&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@elastic/elasticsearch/lib/api/types&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;Flight&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;../model/flight.model&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;upcoming-flight-data&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;Client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Client&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;node&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ELASTIC_ENDPOINT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;auth&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ELASTIC_API_KEY&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;extractFlights&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;SearchResponseBody&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;Flight&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;Flight&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="kc"&gt;undefined&lt;/span&gt;&lt;span class="p"&gt;)[]&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;hit&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;hit&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;_source&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;flightTool&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;createTool&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Get flight information for a given destination from Elasticsearch, both outbound and return journeys&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;parameters&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;object&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;destination&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;describe&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;The destination we are flying to&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="na"&gt;origin&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;describe&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;The origin we are flying from (defaults to London if not specified)&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="p"&gt;}),&lt;/span&gt;
  &lt;span class="na"&gt;execute&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="nf"&gt;function &lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;destination&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;origin&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;responses&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;msearch&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
        &lt;span class="na"&gt;searches&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;index&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;index&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
              &lt;span class="na"&gt;bool&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="na"&gt;must&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
                  &lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="na"&gt;match&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                      &lt;span class="na"&gt;origin&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;origin&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="p"&gt;},&lt;/span&gt;
                  &lt;span class="p"&gt;},&lt;/span&gt;
                  &lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="na"&gt;match&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                      &lt;span class="na"&gt;destination&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;destination&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="p"&gt;},&lt;/span&gt;
                  &lt;span class="p"&gt;},&lt;/span&gt;
                &lt;span class="p"&gt;],&lt;/span&gt;
              &lt;span class="p"&gt;},&lt;/span&gt;
            &lt;span class="p"&gt;},&lt;/span&gt;
          &lt;span class="p"&gt;},&lt;/span&gt;

          &lt;span class="c1"&gt;// Return leg&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;index&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;index&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
              &lt;span class="na"&gt;bool&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="na"&gt;must&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
                  &lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="na"&gt;match&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                      &lt;span class="na"&gt;origin&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;destination&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="p"&gt;},&lt;/span&gt;
                  &lt;span class="p"&gt;},&lt;/span&gt;
                  &lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="na"&gt;match&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                      &lt;span class="na"&gt;destination&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;origin&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="p"&gt;},&lt;/span&gt;
                  &lt;span class="p"&gt;},&lt;/span&gt;
                &lt;span class="p"&gt;],&lt;/span&gt;
              &lt;span class="p"&gt;},&lt;/span&gt;
            &lt;span class="p"&gt;},&lt;/span&gt;
          &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="p"&gt;],&lt;/span&gt;
      &lt;span class="p"&gt;});&lt;/span&gt;

      &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;responses&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;responses&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Unable to obtain flight data&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;

      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;outbound&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;extractFlights&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;responses&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;responses&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nx"&gt;SearchResponseBody&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;Flight&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="na"&gt;inbound&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;extractFlights&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;responses&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;responses&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nx"&gt;SearchResponseBody&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;Flight&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="p"&gt;};&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Unable to obtain flight information&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;location&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;location&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="p"&gt;};&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Retrieving relevant information from sources such as Elasticsearch, and even utilising techniques such as &lt;a href="https://arxiv.org/abs/2305.14239" rel="noopener noreferrer"&gt;LLM summarization&lt;/a&gt; or data aggregation, as my colleague Alex achieved &lt;a href="https://www.elastic.co/search-labs/blog/how-to-build-mcp-server" rel="noopener noreferrer"&gt;when building an MCP server to summarize and query his health data&lt;/a&gt;, can ensure that the LLM has the precise data it needs to provide the answer. This context can then be passed using emerging protocols such as &lt;a href="https://www.elastic.co/search-labs/blog/mcp-current-state" rel="noopener noreferrer"&gt;Model Context Protocol (MCP)&lt;/a&gt; or the &lt;a href="https://developers.googleblog.com/en/a2a-a-new-era-of-agent-interoperability/" rel="noopener noreferrer"&gt;Agent2Agent Protocol (known as A2A)&lt;/a&gt;. &lt;/p&gt;
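As a rough sketch of how retrieved documents become grounding context, the following framework-agnostic TypeScript folds search hits into a prompt. The `RetrievedDoc` type and `buildGroundedPrompt` helper are illustrative assumptions, not APIs from any particular library:

```typescript
// Illustrative sketch: grounding a question with retrieved documents.
// The shapes below are assumptions for demonstration, not library types.
interface RetrievedDoc {
  title: string;
  body: string;
}

function buildGroundedPrompt(question: string, docs: RetrievedDoc[]): string {
  // Each retrieved document becomes a numbered context entry the LLM can draw on.
  const context = docs
    .map((d, i) => `[${i + 1}] ${d.title}: ${d.body}`)
    .join("\n");
  return `Answer using only the context below.\n\nContext:\n${context}\n\nQuestion: ${question}`;
}
```

In a real RAG pipeline the `docs` array would come from a search call (such as the Elasticsearch query shown earlier) rather than being constructed by hand.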

&lt;h3&gt;
  
  
  Prompts
&lt;/h3&gt;

&lt;p&gt;Perhaps considered a more established practice, but still very much a subset of context engineering, prompt engineering refers to the practice of refining and crafting effective inputs (or prompts) to an LLM to produce the result we want. Although commonly structured as simple text, prompts can consist of other media sources such as images and sounds. Sander Schulhoff et al. in &lt;a href="https://arxiv.org/pdf/2406.06608" rel="noopener noreferrer"&gt;their survey of prompt engineering&lt;/a&gt; define the following components of a prompt:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Directive: the instruction or question serving as the main intent of the request.&lt;/li&gt;
&lt;li&gt;Exemplars: demonstrable examples to guide the LLM to accomplish the task.&lt;/li&gt;
&lt;li&gt;Output Formatting: the format in which the output is expected to be returned, such as JSON or unstructured text. This matters because, depending on the data source, the LLM may need to translate the result (for example, from the structured JSON of an Elasticsearch query response into another format) rather than return it directly.&lt;/li&gt;
&lt;li&gt;Style instructions: guidance on how to alter the structure of the output. This is considered a specific type of output formatting.&lt;/li&gt;
&lt;li&gt;Role: the persona the LLM needs to emulate to achieve the task (for example a travel agent).&lt;/li&gt;
&lt;li&gt;Additional information: other useful details needed to complete the task, including context from other sources.&lt;/li&gt;
&lt;/ol&gt;
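To make these components concrete, the sketch below assembles them into a single prompt string. The `PromptParts` shape and `buildPrompt` helper are hypothetical illustrations for this article, not part of any framework:

```typescript
// Hypothetical sketch: assembling the six prompt components described above.
interface PromptParts {
  role: string;              // persona the LLM should emulate
  directive: string;         // the main instruction or question
  exemplars: string[];       // worked examples guiding the model
  outputFormat: string;      // expected output format, e.g. JSON
  styleInstructions: string; // guidance on the structure of the output
  additionalInfo: string;    // extra context, e.g. retrieved documents
}

function buildPrompt(parts: PromptParts): string {
  const examples = parts.exemplars
    .map((e, i) => `Example ${i + 1}: ${e}`)
    .join("\n");
  return [
    `You are ${parts.role}.`,
    parts.directive,
    examples,
    `Respond in ${parts.outputFormat}. ${parts.styleInstructions}`,
    `Additional context: ${parts.additionalInfo}`,
  ].join("\n\n");
}
```

Keeping each component a separate field makes it easy to evaluate variations of one element (say, the exemplars) while holding the others fixed.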

&lt;p&gt;Specifically, the below example showcases each of these elements for a prompt for a travel planning agent:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw8ofa7tkp02ugaj5ynf2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw8ofa7tkp02ugaj5ynf2.png" alt="Example Prompt" width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;All of these elements can be tweaked and evaluated to ensure the optimal result is obtained from the LLM. In addition to these elements, there are numerous techniques that can be used to structure and optimize prompts to get the answer you need. For example, &lt;a href="https://arxiv.org/pdf/2201.11903" rel="noopener noreferrer"&gt;Wei et al. in their 2023 paper&lt;/a&gt; found that standard zero-shot prompts, where we ask an LLM a simple structured question, fare less effectively than chain-of-thought prompting techniques for arithmetic and reasoning tasks. The differences are summarized in the example below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg6nbwgtqv4d3a65ea0cg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg6nbwgtqv4d3a65ea0cg.png" alt="Standard versus Chain-of-Thought Prompting (credit [Wei et al. in their 2023 paper](https://arxiv.org/pdf/2201.11903))" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When considering the format of the prompt to provide, you need to weigh several factors, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The type of task (for example simple recall or translation compared to complex arithmetic reasoning).&lt;/li&gt;
&lt;li&gt;Task complexity and ambiguity. Ambiguous requests may lead to unpredictable results.&lt;/li&gt;
&lt;li&gt;The inputs you are providing as context, along with the format.&lt;/li&gt;
&lt;li&gt;The output required.&lt;/li&gt;
&lt;li&gt;The capabilities of your chosen LLM.&lt;/li&gt;
&lt;li&gt;The persona you would like the LLM to emulate.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Memory
&lt;/h3&gt;

&lt;p&gt;Much like humans, AI applications rely on both short and long-term memory to recall information. Within context engineering:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Short-term memory, often referred to as state or chat history, refers to the messages exchanged in the current conversation between the user and the model. This includes the initial and follow-up questions presented by the user.&lt;/li&gt;
&lt;li&gt;Long-term memory, simply referred to as memory, refers to information shared across conversations. Key examples would be relevant common information or recent prior conversations.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Taking our Travel Planner Agent example, the short-term memory would include the travel dates and location, along with any follow-up messages if the user changes their mind and wants to explore another destination. The long-term memory in this case could contain profile information about the user's travel preferences, along with past trips that could be used to inform the suggestions of what activities to include in a new itinerary (such as wine tasting opportunities for those who have taken part in similar activities on prior vacations).&lt;/p&gt;
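The two kinds of memory for such an agent could be modelled roughly as follows. These types and the `recallContext` helper are illustrative assumptions rather than any framework's API:

```typescript
// Short-term memory: messages exchanged in the current conversation.
interface ChatMessage {
  sender: "user" | "assistant";
  content: string;
}

// Long-term memory: information persisted across conversations.
interface TravellerProfile {
  preferredOrigin: string;       // e.g. "London"
  pastTrips: string[];           // destinations from prior itineraries
  favouriteActivities: string[]; // e.g. "wine tasting"
}

// Context assembly draws on both memory types when invoking the LLM:
// recent turns from the chat history plus durable preferences.
function recallContext(history: ChatMessage[], profile: TravellerProfile): string {
  const recent = history.slice(-5).map((m) => `${m.sender}: ${m.content}`);
  const prefs = `Past trips: ${profile.pastTrips.join(", ")}. Likes: ${profile.favouriteActivities.join(", ")}.`;
  return [prefs, ...recent].join("\n");
}
```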

&lt;p&gt;Most AI frameworks provide the ability to manage both chat history and memory, since the history must be managed so that it fits within the context window alongside the other elements of context. Taking &lt;a href="https://langchain-ai.github.io/langgraphjs/" rel="noopener noreferrer"&gt;LangGraph&lt;/a&gt; as an example, short-term memory is managed as part of agent state using a checkpointer, while long-term memory is persisted to long-term stores:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvrtzkr9jb5zoosvsqz74.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvrtzkr9jb5zoosvsqz74.png" alt="LangGraph Memory Management (credit [LangGraph](https://langchain-ai.github.io/langgraphjs/concepts/memory/))" width="571" height="372"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As we build multi-agent architectures, we also need to be mindful of how memory and context are segregated. When splitting tasks among sub-agents in a larger flow, each agent may need knowledge of the other agents' results to remain in sync. However, over time these additions can result in context window overflow:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8jqkkty722djt92q05f7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8jqkkty722djt92q05f7.png" alt="Multi-Agent Context Overflow (credit [Cognition](https://cognition.ai/blog/dont-build-multi-agents#a-theory-of-building-long-running-agents))" width="800" height="816"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It is important to curate the context stored in both types of memory to ensure relevant and up-to-date context is provided to LLMs. Failure to do so can result in context poisoning. This can be carried out with malicious intent, as we see in prompt injection and data poisoning attacks per the &lt;a href="https://genai.owasp.org/resource/owasp-top-10-for-llm-applications-2025/" rel="noopener noreferrer"&gt;OWASP LLM Application Top 10&lt;/a&gt;. But it can also occur innocently, such as through a build-up of history that distracts the model, or contradictory information that results in clashes. In the &lt;a href="https://storage.googleapis.com/deepmind-media/gemini/gemini_v2_5_report.pdf" rel="noopener noreferrer"&gt;Gemini 2.5 report&lt;/a&gt;, researchers found that a Pokémon-playing Gemini agent showed a tendency to repeat actions from its history instead of forming novel approaches, meaning the growing context became more of a hindrance to solving the problem. For these reasons, practices such as chat history trimming, summarization and relevance filtering of retrieved information should be applied.&lt;/p&gt;
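Chat history trimming, for instance, can be as simple as enforcing a token budget before each model call. The sketch below is a minimal illustration, assuming a rough heuristic of four characters per token; real frameworks expose their own trimming and summarization utilities:

```typescript
interface Message {
  sender: string;
  content: string;
}

// Rough token estimate: ~4 characters per token (a common heuristic,
// not an exact tokenizer).
function estimateTokens(m: Message): number {
  return Math.ceil(m.content.length / 4);
}

// Keep the most recent messages that fit within the token budget,
// so the oldest turns fall out of the context window first.
function trimHistory(history: Message[], budget: number): Message[] {
  const kept: Message[] = [];
  let used = 0;
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = estimateTokens(history[i]);
    if (used + cost > budget) break;
    kept.unshift(history[i]);
    used += cost;
  }
  return kept;
}
```

Trimming keeps recency but discards older detail, which is why it is often paired with summarization: a summary of the dropped turns can be prepended so the model retains the gist without the full transcript.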

&lt;h3&gt;
  
  
  Structured Outputs
&lt;/h3&gt;

&lt;p&gt;As we move to complex AI agent architectures, there is a need to ensure that the outputs emitted by LLMs adhere to a schema or contract that makes them easier to parse and integrate with other systems and workflows.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4qd9bhbgl5z5l48398z7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4qd9bhbgl5z5l48398z7.png" alt="Structured Outputs (credit LangChain)" width="800" height="246"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We are all used to freeform text results, but these formats can be difficult to integrate into dependent systems and agents. Much like designing a set of REST endpoints that adhere not only to best practices such as the &lt;a href="https://www.openapis.org/" rel="noopener noreferrer"&gt;OpenAPI standard&lt;/a&gt; but also to a contract compatible with other components, we need to specify the output format and schema we expect the LLM to return. The below example uses &lt;a href="https://ai-sdk.dev/" rel="noopener noreferrer"&gt;AI SDK&lt;/a&gt; to generate an object adhering to a particular schema:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;generateObject&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ai&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;zod&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;object&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;generateObject&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;model&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;openai/gpt-4.1&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;schemaName&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Travel Itinerary&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;schemaDescription&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Sample travel itinerary for a trip&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;schema&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;object&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="na"&gt;location&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="na"&gt;hotel&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;object&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="na"&gt;roomType&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="na"&gt;amount&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;number&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="na"&gt;checkin&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;iso&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;date&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="na"&gt;checkout&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;iso&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;date&lt;/span&gt;&lt;span class="p"&gt;()}),&lt;/span&gt;
    &lt;span class="na"&gt;flights&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;array&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;object&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="na"&gt;carrier&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="na"&gt;flightNo&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;8&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="na"&gt;origin&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="na"&gt;destination&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="na"&gt;date&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;iso&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;datetime&lt;/span&gt;&lt;span class="p"&gt;()})),&lt;/span&gt;
    &lt;span class="na"&gt;excursions&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;array&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;object&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="na"&gt;amount&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="na"&gt;date&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;iso&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;datetime&lt;/span&gt;&lt;span class="p"&gt;()}))&lt;/span&gt;
  &lt;span class="p"&gt;}),&lt;/span&gt;
  &lt;span class="na"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Generate a travel itinerary based on the specified location&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;JSON structured output for LLMs makes sense: it's common to need to process both structured and unstructured data, just as Elasticsearch does internally. For this reason there is emerging support in some models for generating outputs that adhere to a provided JSON schema, including through the &lt;a href="https://platform.openai.com/docs/guides/structured-outputs?lang=javascript" rel="noopener noreferrer"&gt;Structured Outputs feature available in the OpenAI platform&lt;/a&gt;. When combined with function calling, this allows us to define standard contracts for passing information between tools. However, given that LLMs can still generate JSON with syntax issues, it's important to handle potential errors gracefully when processing results.&lt;/p&gt;
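&lt;p&gt;Even with structured output enabled, defensive parsing is worthwhile. The sketch below is a minimal illustration, assuming a simplified &lt;code&gt;Itinerary&lt;/code&gt; shape rather than the full schema above:&lt;/p&gt;

```typescript
// Illustrative sketch: defensively parse model output that is expected to be JSON.
// The Itinerary shape is a simplified stand-in for the full schema shown earlier.
type Itinerary = { title: string; location: string };

function parseItinerary(raw: string): Itinerary | null {
  try {
    // Models sometimes wrap JSON in markdown fences; strip them before parsing.
    const cleaned = raw.replace(/^`{3}(json)?\s*/i, '').replace(/`{3}\s*$/, '').trim();
    const parsed = JSON.parse(cleaned);
    if (typeof parsed.title === 'string') {
      if (typeof parsed.location === 'string') {
        return { title: parsed.title, location: parsed.location };
      }
    }
    return null; // Parsed, but does not match the expected contract.
  } catch {
    return null; // Syntactically invalid JSON: fail gracefully so the caller can retry.
  }
}
```

&lt;p&gt;Libraries such as &lt;code&gt;zod&lt;/code&gt; can replace the manual type checks here with full schema validation.&lt;/p&gt;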

&lt;h3&gt;
  
  
  Available Tools
&lt;/h3&gt;

&lt;p&gt;The final element of context that we can use within context engineering is the set of tools we provide to LLMs. Tools allow us to perform actions such as booking the trip defined by our itinerary, retrieve data using RAG as discussed previously, or pull in information from other sources.&lt;/p&gt;

&lt;p&gt;We have shown an example of a RAG tool above with our &lt;code&gt;flightTool&lt;/code&gt;, but tools can be used to pull in other sources of information, for example the below weather tool built with &lt;a href="https://ai-sdk.dev/" rel="noopener noreferrer"&gt;AI SDK&lt;/a&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;tool&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nx"&gt;createTool&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ai&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;zod&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;WeatherResponse&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;../model/weather.model&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;weatherTool&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;createTool&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; 
  &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Display the weather for a holiday location&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;parameters&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;object&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;location&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;string&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;describe&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;The location to get the weather for&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="p"&gt;}),&lt;/span&gt;
  &lt;span class="na"&gt;execute&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="nf"&gt;function &lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;location&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// While a historical forecast may be better, this example gets the next 3 days&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`https://api.weatherapi.com/v1/forecast.json?q=&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;location&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;&amp;amp;days=3&amp;amp;key=&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;WEATHER_API_KEY&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="na"&gt;weather&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;WeatherResponse&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; 
        &lt;span class="na"&gt;location&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;location&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
        &lt;span class="na"&gt;condition&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;weather&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;condition&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;text&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
        &lt;span class="na"&gt;condition_image&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;weather&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;condition&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;icon&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;temperature&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;weather&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;temp_c&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="na"&gt;feels_like_temperature&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;weather&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;feelslike_c&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="na"&gt;humidity&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;weather&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;humidity&lt;/span&gt;
      &lt;span class="p"&gt;};&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; 
        &lt;span class="na"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Unable to obtain weather information&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
        &lt;span class="na"&gt;location&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;location&lt;/span&gt;
      &lt;span class="p"&gt;};&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Irrespective of the framework used, a tool comprises:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A description of what the tool does to inform the LLM.&lt;/li&gt;
&lt;li&gt;The parameters expected by the function, along with defined data types. Here we define these using the TypeScript validation library &lt;a href="https://zod.dev/" rel="noopener noreferrer"&gt;&lt;code&gt;zod&lt;/code&gt;&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;The function to be invoked by the LLM when the tool is used.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If an LLM supports tool calling, it can choose to call one or (potentially) many tools to solve the problem. &lt;a href="https://www.elastic.co/search-labs/blog/ai-agents-ai-sdk-elasticsearch#model-choice" rel="noopener noreferrer"&gt;I have discussed my experiences of model choice before while building my own multi-tool AI agent&lt;/a&gt;. When choosing a model, it's important to investigate the level of tool calling support using resources such as the &lt;a href="https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard#/" rel="noopener noreferrer"&gt;Hugging Face Open LLM Leaderboard&lt;/a&gt; or the &lt;a href="https://gorilla.cs.berkeley.edu/leaderboard.html" rel="noopener noreferrer"&gt;Berkeley Function-Calling Leaderboard&lt;/a&gt;. The problem is that, since the LLM decides which tools are relevant to the objective, it can be confused by too many tools and call irrelevant ones, &lt;a href="https://www.dbreunig.com/2025/06/22/how-contexts-fail-and-how-to-fix-them.html#context-confusion" rel="noopener noreferrer"&gt;as discussed by Drew Breunig&lt;/a&gt;. This idea of tool confusion is also discussed in &lt;a href="https://arxiv.org/pdf/2411.15399" rel="noopener noreferrer"&gt;the 2024 paper by Paramanayakam et al.&lt;/a&gt;, where the performance of Llama 3.1 8B improved when provided with fewer tools (19 compared with 46).&lt;/p&gt;

&lt;p&gt;Optimizing the number of tools available is an open area of research. Experiments applying RAG architectures to combat tool confusion, such as &lt;a href="https://arxiv.org/pdf/2505.03275" rel="noopener noreferrer"&gt;retrieving tool descriptions to optimize tool selection in MCP&lt;/a&gt;, suggest that surfacing only the relevant tool descriptions to the LLM improves accuracy.&lt;/p&gt;
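&lt;p&gt;To illustrate the idea behind such retrieval-based tool selection, the sketch below pre-filters tools by scoring their descriptions against the user query. Keyword overlap stands in for the embedding similarity a real system would use, and the &lt;code&gt;ToolInfo&lt;/code&gt; shape is a hypothetical simplification:&lt;/p&gt;

```typescript
// Illustrative sketch, inspired by RAG-MCP-style tool retrieval: score each tool
// description against the query and keep only the top-k tools before calling the
// LLM. Keyword overlap keeps the example self-contained; real systems would use
// embedding similarity over a vector store instead.
type ToolInfo = { name: string; description: string };

function selectTools(query: string, tools: ToolInfo[], topK: number): ToolInfo[] {
  const queryTerms = new Set(query.toLowerCase().split(/\W+/).filter(Boolean));
  return tools
    .map((tool) => {
      // Score = number of description terms that also appear in the query.
      const terms = tool.description.toLowerCase().split(/\W+/);
      const score = terms.filter((term) => queryTerms.has(term)).length;
      return { tool, score };
    })
    .sort((a, b) => b.score - a.score)
    .slice(0, topK)
    .map((scored) => scored.tool);
}
```

&lt;p&gt;The shortlist returned here would then be passed as the tool set for the model call, keeping the context focused on relevant capabilities.&lt;/p&gt;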

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Here we have explained what context engineering is and given an overview of the key components of context. If you are interested in learning more, check out the resources below.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;a href="https://www.philschmid.de/context-engineering" rel="noopener noreferrer"&gt;The New Skill in AI is Not Prompting, It's Context Engineering | Philipp Schmid&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/humanlayer/12-factor-agents?tab=readme-ov-file" rel="noopener noreferrer"&gt;12-Factor Agents - Principles for building reliable LLM applications | Dexter Horthy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.dbreunig.com/2025/06/22/how-contexts-fail-and-how-to-fix-them.html#fn:live" rel="noopener noreferrer"&gt;How Long Contexts Fail | Drew Breunig&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.dbreunig.com/2025/06/26/how-to-fix-your-context.html" rel="noopener noreferrer"&gt;How to Fix Your Context | Drew Breunig&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://arxiv.org/pdf/2406.06608" rel="noopener noreferrer"&gt;The Prompt Report: A Systematic Survey of Prompt Engineering Techniques | Schulhoff et al.&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://arxiv.org/pdf/2201.11903" rel="noopener noreferrer"&gt;Chain-of-Thought Prompting Elicits Reasoning in Large Language Models | Wei et al.&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://langchain-ai.github.io/langgraphjs/concepts/memory/#what-is-memory" rel="noopener noreferrer"&gt;What is Memory? | LangGraph&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://cognition.ai/blog/dont-build-multi-agents#a-theory-of-building-long-running-agents" rel="noopener noreferrer"&gt;Don't Build Multi-Agents | Cognition&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://js.langchain.com/docs/concepts/structured_outputs/" rel="noopener noreferrer"&gt;Structured Outputs | LangChain&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://arxiv.org/pdf/2411.15399" rel="noopener noreferrer"&gt;Less is More: Optimizing Function Calling for LLM Execution on Edge Devices | Paramanayakam et al.&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://arxiv.org/pdf/2505.03275" rel="noopener noreferrer"&gt;RAG-MCP: Mitigating Prompt Bloat in LLM Tool Selection via Retrieval-Augmented Generation | Gan and Sun&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;

</description>
      <category>ai</category>
      <category>elasticsearch</category>
      <category>rag</category>
    </item>
    <item>
      <title>Elasticsearch: 15 years of indexing it all, finding what matters</title>
      <dc:creator>Philipp Krenn</dc:creator>
      <pubDate>Fri, 05 Sep 2025 10:09:13 +0000</pubDate>
      <link>https://dev.to/elastic/elasticsearch-15-years-of-indexing-it-all-finding-what-matters-2epk</link>
      <guid>https://dev.to/elastic/elasticsearch-15-years-of-indexing-it-all-finding-what-matters-2epk</guid>
      <description>&lt;p&gt;Elasticsearch just turned 15-years-old. It all started back in February 2010 with the &lt;a href="https://thedudeabides.com/articles/elasticsearch" rel="noopener noreferrer"&gt;announcement blog post&lt;/a&gt; (featuring the iconic “You Know, for Search” tagline), &lt;a href="https://github.com/elastic/elasticsearch/commit/b3337c312765e51cec7bde5883bbc0a08f56fb65" rel="noopener noreferrer"&gt;first public commit&lt;/a&gt;, and the &lt;a href="https://github.com/elastic/elasticsearch/releases/tag/v0.4.0" rel="noopener noreferrer"&gt;first release&lt;/a&gt;, which happened to be 0.4.0.&lt;/p&gt;

&lt;p&gt;Let’s take a look back at the last 15 years of indexing and searching, and turn to the next 15 years of relevance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff77mct3d4ax5fxnueqm1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff77mct3d4ax5fxnueqm1.png" alt="Release blog post" width="800" height="313"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;code&gt;GET _cat/stats&lt;/code&gt;
&lt;/h2&gt;

&lt;p&gt;Since its launch, Elasticsearch has been &lt;strong&gt;downloaded an average of 3 times per second&lt;/strong&gt;, totaling over 1.45 billion downloads.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://github.com/elastic/elasticsearch" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt; stats are equally impressive: More than &lt;strong&gt;83,000 commits from 2,400 unique authors&lt;/strong&gt;, 38,000 issues, 25,000 forks, and 71,500 stars. And there is &lt;a href="https://github.com/elastic/elasticsearch/graphs/contributors" rel="noopener noreferrer"&gt;no sign of slowing down&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nawwdpoxeddj227oonu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nawwdpoxeddj227oonu.png" alt="History of Elasticsearch GitHub commits over time" width="800" height="721"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;All of this is on top of &lt;strong&gt;countless &lt;a href="https://lucene.apache.org" rel="noopener noreferrer"&gt;Apache Lucene&lt;/a&gt; contributions&lt;/strong&gt;. We’ll get into those for Lucene’s 25th anniversary, which is also this year. In the meantime, you can check out the &lt;a href="https://www.elastic.co/celebrating-lucene" rel="noopener noreferrer"&gt;20th anniversary page&lt;/a&gt; to celebrate one of the top Apache projects.&lt;/p&gt;

&lt;h2&gt;
  
  
  A search (hi)story
&lt;/h2&gt;

&lt;p&gt;There are too many highlights to list them all, but here are 15 releases and features from the last 15 years that got Elasticsearch to where it is today:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Elasticsearch, &lt;a href="https://thedudeabides.com/articles/you-know-for-search-inc" rel="noopener noreferrer"&gt;the company&lt;/a&gt; (2012)&lt;/strong&gt;: The open source project became an open source company, setting the stage for its growth.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.elastic.co/blog/getting-started-with-elk" rel="noopener noreferrer"&gt;ELK Stack&lt;/a&gt; (2013)&lt;/strong&gt;: Elasticsearch joined forces with Logstash and Kibana to form the ELK Stack, which is now synonymous with logging and analytics.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://web.archive.org/web/20140612205549/http://www.elasticsearch.org/blog/1-0-0-released/" rel="noopener noreferrer"&gt;Version 1&lt;/a&gt;&lt;/strong&gt; (2014): The first stable release introduced key features like snapshot/restore, aggregations, circuit breakers, and the _cat API.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Shield and Found&lt;/strong&gt; (2015): &lt;a href="https://web.archive.org/web/20150201061023/http://www.elasticsearch.com/blog/shield-redefining-can-elk/" rel="noopener noreferrer"&gt;Shield&lt;/a&gt; brought security to Elasticsearch clusters in the form of a (paid) plugin. And the &lt;a href="https://www.elastic.co/blog/welcome-found" rel="noopener noreferrer"&gt;acquisition of found.no&lt;/a&gt; brought Elasticsearch to the cloud, evolving into what is now Elastic Cloud. As an anecdote, nobody could find Found — SEO can be hard for some keywords.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.elastic.co/blog/elasticsearch-2-0-0-released" rel="noopener noreferrer"&gt;Version 2&lt;/a&gt;&lt;/strong&gt; (2015): Introduced pipelined aggregations, security hardening with the Java Security Manager, and performance and resilience improvements.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.elastic.co/blog/elasticsearch-5-0-0-released" rel="noopener noreferrer"&gt;Version 5&lt;/a&gt; and the Elastic Stack&lt;/strong&gt; (2016): Skipped two major versions to unify the version numbers across the ELK Stack, which became the Elastic Stack after adding Beats. This version also introduced ingest nodes and the scripting language Painless.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.elastic.co/blog/elasticsearch-6-0-0-released" rel="noopener noreferrer"&gt;Version 6&lt;/a&gt;&lt;/strong&gt; (2017): Brought zero-downtime upgrades, index sorting, and the removal of types to simplify data modeling.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.elastic.co/blog/elasticsearch-7-0-0-released" rel="noopener noreferrer"&gt;Version 7&lt;/a&gt;&lt;/strong&gt; (2019): Changed cluster coordination to the more scalable and resilient Zen2, and introduced single-shard defaults, a bundled JDK, and adaptive replica selection.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.elastic.co/blog/security-for-elasticsearch-is-now-free" rel="noopener noreferrer"&gt;Free security&lt;/a&gt;&lt;/strong&gt; (2019): With the 6.8 and 7.1 releases, core security became free to help everyone secure their cluster.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.elastic.co/blog/introducing-elasticsearch-searchable-snapshots" rel="noopener noreferrer"&gt;ILM, data tiers, and searchable snapshots&lt;/a&gt;&lt;/strong&gt; (2020): Made time-series data more manageable and cost-effective with Index Lifecycle Management (ILM), tiered storage, and searchable snapshots.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.elastic.co/blog/whats-new-elastic-8-0-0" rel="noopener noreferrer"&gt;Version 8&lt;/a&gt;&lt;/strong&gt; (2022): Introduced &lt;a href="https://www.elastic.co/search-labs/blog/introduction-to-vector-search" rel="noopener noreferrer"&gt;native dense vector search&lt;/a&gt; with &lt;a href="https://www.elastic.co/blog/introducing-approximate-nearest-neighbor-search-in-elasticsearch-8-0" rel="noopener noreferrer"&gt;HNSW&lt;/a&gt; and enabled security by default.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.elastic.co/blog/whats-new-elastic-search-8-11-0" rel="noopener noreferrer"&gt;ELSER&lt;/a&gt;&lt;/strong&gt; (2023): Launched Elastic Learned Sparse EncodeR model, bringing sparse vector search for better semantic relevance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.elastic.co/blog/elasticsearch-is-open-source-again" rel="noopener noreferrer"&gt;Open source again&lt;/a&gt;&lt;/strong&gt; (2024): Added AGPL as a licensing option to bring back open source Elasticsearch.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/elastic/start-local" rel="noopener noreferrer"&gt;Start Local&lt;/a&gt;&lt;/strong&gt; (2024): Made it easier than ever to run Elasticsearch and Kibana: &lt;code&gt;curl -fsSL &lt;a href="https://elastic.co/start-local" rel="noopener noreferrer"&gt;https://elastic.co/start-local&lt;/a&gt; | sh&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.elastic.co/search-labs/blog/elasticsearch-logsdb-index-mode" rel="noopener noreferrer"&gt;LogsDB&lt;/a&gt;&lt;/strong&gt; (2024): A new specialized index mode that reduces log storage by up to 65%.&lt;/li&gt;
&lt;/ul&gt;
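
&lt;p&gt;As a taste of that last item, LogsDB is enabled with a single index setting. A minimal sketch, assuming a version that supports LogsDB index mode (the index name &lt;code&gt;logs-demo&lt;/code&gt; is just an example):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PUT logs-demo
{
  "settings": {
    "index": {
      "mode": "logsdb"
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
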

&lt;h2&gt;
  
  
  The future of search is bright
&lt;/h2&gt;

&lt;p&gt;Thanks to the rise of AI capabilities, search is more relevant and interesting than ever. So what is next for Elasticsearch? There’s way too much to name, so we’ll stick to three areas and the challenges they address.&lt;/p&gt;

&lt;h3&gt;
  
  
  Serverless
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;No shards, nodes, or versions.&lt;/strong&gt; &lt;a href="https://www.elastic.co/cloud/serverless" rel="noopener noreferrer"&gt;Elasticsearch Serverless&lt;/a&gt; takes care of the operational issues you might have experienced in the past:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;15 years in, and someone is still setting &lt;code&gt;number_of_shards: 100&lt;/code&gt; for no reason.&lt;/li&gt;
&lt;li&gt;15 years, and we’re still debating &lt;code&gt;refresh_interval: 1s&lt;/code&gt; vs &lt;code&gt;30s&lt;/code&gt; like it’s a life-or-death decision.&lt;/li&gt;
&lt;li&gt;15 years of major versions, minor heart attacks, and the thrill of migrating to the latest version.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You can &lt;a href="https://cloud.elastic.co" rel="noopener noreferrer"&gt;try out Elasticsearch Serverless&lt;/a&gt; today.&lt;/p&gt;

&lt;h3&gt;
  
  
  ES|QL
&lt;/h3&gt;

&lt;p&gt;“Cheers to 15 years of Elasticsearch — where the Query DSL is still the most complex part of your day.” But it doesn’t have to be. The new &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/esql.html" rel="noopener noreferrer"&gt;Elasticsearch Piped Query Language (ES|QL)&lt;/a&gt; brings a much &lt;strong&gt;simpler syntax&lt;/strong&gt; and a significant investment into a &lt;strong&gt;new compute engine&lt;/strong&gt; with performance in mind. While we’re building out more features, you can already use &lt;a href="https://www.elastic.co/docs/explore-analyze/query-filter/languages/esql" rel="noopener noreferrer"&gt;ES|QL&lt;/a&gt; today. Don’t worry; the Query DSL will understand.&lt;/p&gt;
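
&lt;p&gt;To give a flavor of that simpler syntax, here is a small example query; the &lt;code&gt;logs-*&lt;/code&gt; index pattern and field names are illustrative, not from a real dataset:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM logs-*
| WHERE status_code &gt;= 500
| STATS errors = COUNT(*) BY host.name
| SORT errors DESC
| LIMIT 10
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
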

&lt;h3&gt;
  
  
  AI everywhere
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;15 years of query tuning, and we’re still just throwing &lt;code&gt;boost: 10&lt;/code&gt; at the problem.&lt;/li&gt;
&lt;li&gt;15 years of making your logs searchable while you still have no idea what’s happening in production.&lt;/li&gt;
&lt;li&gt;Still the best at finding that one log line… if you remember how you indexed it.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AI is redefining what’s possible — from &lt;strong&gt;turning raw logs into actionable insights&lt;/strong&gt; with the AI Assistant for &lt;a href="https://www.elastic.co/guide/en/observability/current/obs-ai-assistant.html" rel="noopener noreferrer"&gt;observability&lt;/a&gt; and &lt;a href="https://www.elastic.co/guide/en/security/8.17/security-assistant.html" rel="noopener noreferrer"&gt;security&lt;/a&gt;, to more relevant &lt;strong&gt;search with &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/semantic-search.html" rel="noopener noreferrer"&gt;semantic understanding&lt;/a&gt;&lt;/strong&gt; and intelligent &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/re-ranking-overview.html" rel="noopener noreferrer"&gt;re-ranking&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This is only the beginning. More AI-powered features are on the horizon — bringing smarter search, enhanced observability, and stronger security. The future of Elasticsearch isn’t just about finding data; it’s about understanding it. Stay tuned — the best is yet to come.&lt;/p&gt;

&lt;h2&gt;
  
  
  Thanks to all of you
&lt;/h2&gt;

&lt;p&gt;Thanks to all the contributors, users, and customers who, over the last 15 years, have made Elasticsearch what it is today. We couldn’t have done it without you and are grateful for every query you send to Elasticsearch.&lt;/p&gt;

&lt;p&gt;Here’s to the next 15 years. Enjoy!&lt;/p&gt;

</description>
      <category>elasticsearch</category>
      <category>kibana</category>
      <category>opensource</category>
    </item>
    <item>
      <title>NLP and Elastic: Getting started</title>
      <dc:creator>Priscilla Parodi</dc:creator>
      <pubDate>Thu, 02 Jun 2022 21:13:34 +0000</pubDate>
      <link>https://dev.to/elastic/natural-language-processing-15fj</link>
      <guid>https://dev.to/elastic/natural-language-processing-15fj</guid>
      <description>&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; | &lt;a href="https://dev.to/elastic/nlp-handson-4f90"&gt;Next Post: NLP HandsOn&lt;/a&gt; |&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Natural language processing (NLP)&lt;/strong&gt; is the branch of artificial intelligence (AI) that focuses on understanding human language as closely as possible to human interpretation, combining computational linguistics with statistical, machine learning and deep learning models.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwy77qmvnlsb6oyavpdze.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwy77qmvnlsb6oyavpdze.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Some examples of NLP tasks:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Named entity recognition&lt;/strong&gt; is a type of information extraction, identifying words or phrases as entities.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcquav4p0h8gheba0zhwv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcquav4p0h8gheba0zhwv.png" alt="Image description"&gt;&lt;/a&gt;(&lt;a href="https://huggingface.co/dslim/bert-base-NER" rel="noopener noreferrer"&gt;&lt;em&gt;model used&lt;/em&gt;&lt;/a&gt;)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Sentiment analysis&lt;/strong&gt; is a type of text classification, attempting to extract subjective emotions from text.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3oseiqe4zrzrbvpywsa3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3oseiqe4zrzrbvpywsa3.png" alt="Image description"&gt;&lt;/a&gt;(&lt;a href="https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english" rel="noopener noreferrer"&gt;&lt;em&gt;model used&lt;/em&gt;&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;There are more examples of NLP tasks that can be used according to your use case.&lt;/p&gt;

&lt;h2&gt;
  
  
  BERT
&lt;/h2&gt;

&lt;p&gt;In 2018, Google open-sourced a new technique for pre-training NLP called &lt;a href="https://www.youtube.com/watch?v=2lR8Fzays4I" rel="noopener noreferrer"&gt;BERT&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;BERT uses “transfer learning”, which is the method of pre-training linguistic representations. Pre-training refers to how BERT was first trained using unsupervised learning on a large corpus of plain text: the BooksCorpus (800 million words) and English Wikipedia (2,500 million words). Earlier models required manually labeled data.&lt;/p&gt;

&lt;p&gt;BERT was pretrained on two tasks: language modeling (15% of tokens were masked and BERT was trained to predict them from context) and next sentence prediction (BERT was trained to predict if a chosen next sentence was probable or not given the first sentence). With this understanding, BERT can be adapted to many other types of NLP tasks very easily.&lt;/p&gt;

&lt;p&gt;By capturing intent and context, and not just keywords, it is possible to understand text in a way that is much closer to the way humans understand it.&lt;/p&gt;

&lt;h2&gt;
  
  
  NLP with Elastic
&lt;/h2&gt;

&lt;p&gt;To support models that use the same tokenizer as BERT, Elastic supports the PyTorch library, one of the most popular machine learning libraries. PyTorch supports neural network designs such as the Transformer architecture that BERT uses, enabling NLP tasks.&lt;/p&gt;

&lt;p&gt;In general, any trained model that has a supported architecture is deployable in Elasticsearch, including BERT and variants.&lt;/p&gt;

&lt;p&gt;These models are listed by NLP task. &lt;a href="https://www.elastic.co/guide/en/machine-learning/current/ml-nlp-overview.html" rel="noopener noreferrer"&gt;Currently&lt;/a&gt;, these are the tasks supported:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://www.elastic.co/guide/en/machine-learning/current/ml-nlp-extract-info.html" rel="noopener noreferrer"&gt;Extract information&lt;/a&gt;: named entity recognition, fill-mask, and question answering&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.elastic.co/guide/en/machine-learning/current/ml-nlp-classify-text.html" rel="noopener noreferrer"&gt;Classify text&lt;/a&gt;: language identification, text classification, and zero-shot text classification&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.elastic.co/guide/en/machine-learning/current/ml-nlp-search-compare.html" rel="noopener noreferrer"&gt;Search and compare text&lt;/a&gt;: text embedding and text similarity&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As in the cases of &lt;a href="https://dev.to/priscilla_parodi/elastic-data-frame-classification-analysis-1b3f"&gt;classification&lt;/a&gt; and &lt;a href="https://dev.to/priscilla_parodi/elastic-data-frame-regression-analysis-2ge2"&gt;regression&lt;/a&gt;, when a &lt;a href="https://dev.to/priscilla_parodi/trained-models-for-supervised-learning-154n"&gt;trained model&lt;/a&gt; is imported you can use it to make predictions (&lt;a href="https://dev.to/priscilla_parodi/elastic-data-frame-inference-processor-handson-3392"&gt;inference&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: For NLP tasks you must choose and deploy a third-party NLP model. The exception is language identification, for which the trained model &lt;code&gt;lang_ident_model_1&lt;/code&gt; is already provided in the cluster.&lt;/em&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  NLP with Elastic Solutions
&lt;/h2&gt;

&lt;p&gt;There are many possible use cases for adding NLP capabilities to your Elastic project. Here are some examples:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Security&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Spam detection: Text classification capabilities are useful for scanning emails for language that often indicates spam, allowing content to be blocked or deleted and preventing malware emails.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8t4dljvgvf6yapd9ug52.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8t4dljvgvf6yapd9ug52.png" alt="Image description"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PUT spam-detection/_doc/1
{
  "email subject": "Camera - You are awarded a SiPix Digital Camera! Call 09061221066. Delivery within 28 days.",
  "is_spam": true
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Enterprise Search&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Analysis of unstructured text: Entity recognition is useful for structuring text data, adding new field types to your documents and allowing you to analyze more data and obtain even more valuable insights.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fksptdxozxqx09ugsh3vk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fksptdxozxqx09ugsh3vk.png" alt="Image description"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PUT /source-index
{
  "mappings": {
    "properties": {
      "input":    { "type": "text" }
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PUT /new-index
{
  "mappings": {
    "properties": {
      "input":    { "type": "text" },  
      "organization":  { "type": "keyword"  }, 
      "location":   { "type": "keyword"  }     
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
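
&lt;p&gt;To actually populate the extra fields in an index like the one above, one option is an ingest pipeline with an inference processor. A minimal sketch; the pipeline name, the model ID (following eland's double-underscore naming for the Hugging Face NER model mentioned earlier), and the field mapping are assumptions for illustration:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PUT _ingest/pipeline/ner-pipeline
{
  "processors": [
    {
      "inference": {
        "model_id": "dslim__bert-base-ner",
        "target_field": "ml.ner",
        "field_map": {
          "input": "text_field"
        }
      }
    }
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
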



&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Observability&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Service request and incident data: Extracting meaning from operational data, including ticket resolution comments, allows you to not only generate alerts during incidents, but also go further by observing your application, predicting behavior, and having more data to improve ticket resolution time.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhwxuc8kv6g1ur3lbvki9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhwxuc8kv6g1ur3lbvki9.png" alt="Image description"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;...
  "_source": {
    "support_ticket_id": 119237,
    "customer_id": 283823,
    "timestamp": "2021-06-06T17:23:02.770Z",
    "text_field": "Response to the case was fast and problem was solved after first response, did not need to provide any additional info.",
    "ml": {
      "inference": {
        "predicted_value": "positive",
        "prediction_probability": 0.9499962712516151,
        "model_id": "heBERT_sentiment_analysis"
      }
    }
  }
...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  NLP HandsOn
&lt;/h2&gt;

&lt;p&gt;Now, let's proceed with an &lt;strong&gt;&lt;a href="https://dev.to/elastic/nlp-handson-4f90"&gt;end-to-end example&lt;/a&gt;&lt;/strong&gt;! To prepare for the &lt;a href="https://dev.to/elastic/nlp-handson-4f90"&gt;NLP HandsOn&lt;/a&gt;, we will need an Elasticsearch cluster running at least version 8.0 with an ML node. If you haven't created your &lt;a href="https://dev.to/priscilla_parodi/handson-setup-elastic-cloud-4j8p"&gt;Elastic Cloud Trial&lt;/a&gt; yet, now is the time.&lt;/p&gt;

&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; | &lt;a href="https://dev.to/elastic/nlp-handson-4f90"&gt;Next Post: NLP HandsOn&lt;/a&gt; |&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;This post is part of a series that covers &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Artificial Intelligence with a focus on Elastic's (Creators of Elasticsearch, Kibana, Logstash and Beats) Machine Learning solution&lt;/a&gt;, aiming to introduce and exemplify the possibilities and options available, in addition to addressing the context and usability.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>nlp</category>
      <category>tutorial</category>
      <category>elasticsearch</category>
    </item>
    <item>
      <title>NLP HandsOn</title>
      <dc:creator>Priscilla Parodi</dc:creator>
      <pubDate>Thu, 02 Jun 2022 21:13:24 +0000</pubDate>
      <link>https://dev.to/elastic/nlp-handson-4f90</link>
      <guid>https://dev.to/elastic/nlp-handson-4f90</guid>
      <description>&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; | &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Note: This HandsOn assumes that you have already followed the step-by-step Setup of your &lt;a href="https://dev.to/priscilla_parodi/handson-setup-elastic-cloud-4j8p"&gt;Elastic Cloud Trial account&lt;/a&gt;, and also that you have read the blog &lt;a href="https://dev.to/elastic/natural-language-processing-15fj"&gt;NLP and Elastic: Getting started&lt;/a&gt;.&lt;br&gt;
Config: To prepare for the NLP HandsOn, we will need an Elasticsearch cluster running at least version 8.0 with an ML node.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;To start using NLP in your stack, the first thing you need to do is upload a model into your cluster.&lt;/p&gt;

&lt;p&gt;In our &lt;a href="https://www.elastic.co/guide/en/elasticsearch/client/eland/current/overview.html" rel="noopener noreferrer"&gt;eland&lt;/a&gt; library, a Python Elasticsearch client for exploring and analyzing data in Elasticsearch, we have some simple methods and scripts that allow you to upload models from local disk, or to pull models down from the Hugging Face model hub.&lt;/p&gt;

&lt;p&gt;Once models are uploaded into the cluster, you’ll be able to allocate those models to specific ML nodes. Once model allocation is complete, we’re ready for inference.&lt;/p&gt;

&lt;p&gt;Eland can be installed from &lt;a href="https://pypi.org/project/eland" rel="noopener noreferrer"&gt;PyPI&lt;/a&gt; via pip.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Before you go any further, make sure you have Python installed.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can check this by running:&lt;/p&gt;

&lt;p&gt;Unix/macOS&lt;br&gt;
&lt;code&gt;python3 --version&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;You should get some output like:&lt;br&gt;
&lt;code&gt;Python 3.8.8&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Additionally, you’ll need to make sure you have &lt;a href="https://packaging.python.org/en/latest/key_projects/#pip" rel="noopener noreferrer"&gt;pip&lt;/a&gt; available.&lt;/p&gt;

&lt;p&gt;You can check this by running:&lt;/p&gt;

&lt;p&gt;Unix/macOS&lt;br&gt;
&lt;code&gt;python3 -m pip --version&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;You should get some output like:&lt;br&gt;
&lt;code&gt;pip 21.0.1 from …&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;If you installed Python from source, with an installer from &lt;a href="//python.org"&gt;python.org&lt;/a&gt;, or via &lt;a href="https://brew.sh/" rel="noopener noreferrer"&gt;Homebrew&lt;/a&gt;, you should already have pip.&lt;/p&gt;

&lt;p&gt;If you don't have Python and pip installed, &lt;a href="https://packaging.python.org/en/latest/tutorials/installing-packages/" rel="noopener noreferrer"&gt;install it first&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;With that, Eland can be installed from &lt;a href="https://pypi.org/project/eland/" rel="noopener noreferrer"&gt;PyPI&lt;/a&gt; via pip:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ python3 -m pip install eland&lt;/code&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Getting started
&lt;/h2&gt;

&lt;p&gt;To interact with your cluster through the &lt;a href="https://eland.readthedocs.io/en/v8.1.0/reference/index.html" rel="noopener noreferrer"&gt;API&lt;/a&gt;, we will need to use your Elasticsearch cluster endpoint information.&lt;/p&gt;

&lt;p&gt;The endpoint looks like:&lt;br&gt;
&lt;code&gt;https://&amp;lt;user&amp;gt;:&amp;lt;password&amp;gt;@&amp;lt;hostname&amp;gt;:&amp;lt;port&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Open your &lt;a href="https://cloud.elastic.co/deployments" rel="noopener noreferrer"&gt;deployment&lt;/a&gt; settings to find your endpoint information  and click on the gear icon.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe6ynpjou5lw8xx076ibo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe6ynpjou5lw8xx076ibo.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Copy your Elasticsearch endpoint as in the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fir6l9yqmk71ktw0vm9lb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fir6l9yqmk71ktw0vm9lb.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: If you want to try out examples with your own cluster, remember to include your endpoint URLs and authentication details.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Now add the username and password so your request can be authenticated. Your endpoint will look like this:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;https://elastic:123456789@00c1f8.es.uscentral1.gcp.cloud.es.io:9243&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;username:&lt;/strong&gt; &lt;code&gt;elastic&lt;/code&gt; is a built-in superuser that grants full access to cluster management and data indices.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;password:&lt;/strong&gt; If you don't have your password, you will need to reset it and &lt;a href="https://www.elastic.co/guide/en/cloud-enterprise/2.3/ece-password-reset-elastic.html" rel="noopener noreferrer"&gt;generate a new password&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Copy your endpoint, you'll need it later.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In parallel, let's locate the first model to be imported.&lt;/p&gt;

&lt;p&gt;We will import the model from &lt;a href="https://huggingface.co" rel="noopener noreferrer"&gt;Hugging Face&lt;/a&gt;, an AI community for building, training, and deploying open source machine learning models.&lt;/p&gt;

&lt;p&gt;In this demo we will use a sentiment analysis model, but feel free to import the model you want to use. You can read more details about &lt;a href="https://huggingface.co/bhadresh-savani/distilbert-base-uncased-emotion" rel="noopener noreferrer"&gt;this model&lt;/a&gt; on the Hugging Face webpage.&lt;/p&gt;

&lt;p&gt;Copy the model name as in the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fswxdak4ytf3gf7bule28.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fswxdak4ytf3gf7bule28.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that we have all the necessary information &lt;strong&gt;(elasticsearch cluster endpoint information and the name of the model we want to import)&lt;/strong&gt; let's proceed by importing the model:&lt;/p&gt;

&lt;p&gt;Open your terminal and update the following command with your endpoint and model name:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;eland_import_hub_model --url https://&amp;lt;user&amp;gt;:&amp;lt;password&amp;gt;@&amp;lt;hostname&amp;gt;:&amp;lt;port&amp;gt; \
--hub-model-id &amp;lt;model_name&amp;gt; \
--task-type &amp;lt;task_type&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this case we are importing the &lt;code&gt;bhadresh-savani/distilbert-base-uncased-emotion&lt;/code&gt; model to run the &lt;code&gt;text_classification&lt;/code&gt; task.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;In the Hugging Face filters you will be able to see the task of each model. Supported values are fill_mask, ner, question_answering, text_classification, text_embedding, and zero_shot_classification.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxswewkqgmqtbtkah593y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxswewkqgmqtbtkah593y.png" alt="Image description"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;eland_import_hub_model --url https://elastic:&amp;lt;password&amp;gt;@&amp;lt;hostname&amp;gt;:&amp;lt;port&amp;gt; \
--hub-model-id bhadresh-savani/distilbert-base-uncased-emotion \
--task-type text_classification
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You will see the model being loaded directly from the Hugging Face model hub and then imported into Elasticsearch.&lt;/p&gt;

&lt;p&gt;Wait for the process to end.&lt;/p&gt;

&lt;p&gt;Let's check if the model was imported.&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;Machine Learning&lt;/code&gt; in your Kibana menu.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Froba7x9q593t01lyh6v4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Froba7x9q593t01lyh6v4.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Under model management click &lt;code&gt;Trained Models&lt;/code&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx7qnt7a2v27hh4jc4t18.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx7qnt7a2v27hh4jc4t18.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Your model should appear on this list as shown in the image below. If it is not, check whether the previous process produced any error messages.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdh6kvjkr8kk43hnwlsa1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdh6kvjkr8kk43hnwlsa1.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If your model is on this list, it was imported, but you still need to start the deployment. To do this, in the last column under &lt;code&gt;Actions&lt;/code&gt;, click &lt;code&gt;Start deployment&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm6qwqlvv5uw9qz1teddq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm6qwqlvv5uw9qz1teddq.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After deploying, the &lt;code&gt;State&lt;/code&gt; column will have the value &lt;code&gt;started&lt;/code&gt; and under &lt;code&gt;Actions&lt;/code&gt; the &lt;code&gt;Start deployment&lt;/code&gt; option will be disabled, which means the deployment is complete.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi1yhto34cp8myjidoev4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi1yhto34cp8myjidoev4.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's test our model!&lt;/p&gt;

&lt;p&gt;Copy your model ID:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnzbk607jepdmbaaeilr5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnzbk607jepdmbaaeilr5.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In Kibana's menu, click &lt;code&gt;Dev Tools&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flfizr6wogdbj9qzto8xs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flfizr6wogdbj9qzto8xs.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this UI you will have a console to interact with the REST API of Elasticsearch.&lt;/p&gt;

&lt;p&gt;We will use the &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/infer-trained-model-deployment.html#infer-trained-model-deployment-request" rel="noopener noreferrer"&gt;inference API&lt;/a&gt; to evaluate this model.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;POST _ml/trained_models/&amp;lt;model_id&amp;gt;/deployment/_infer
{
  "docs": { "text_field": "&amp;lt;input&amp;gt;"}
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This &lt;code&gt;POST&lt;/code&gt; request contains a &lt;code&gt;docs&lt;/code&gt; object with a field matching your configured trained model input; typically the field name is &lt;code&gt;text_field&lt;/code&gt;. The &lt;code&gt;text_field&lt;/code&gt; value is the input you want to run inference on.&lt;/p&gt;

&lt;p&gt;In our case it will be:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;POST _ml/trained_models/bhadresh-savani__distilbert-base-uncased-emotion/deployment/_infer
{
  "docs": { "text_field": "Elastic is the perfect platform for knowledgebase NLP applications"}
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here the model_id is &lt;code&gt;bhadresh-savani__distilbert-base-uncased-emotion&lt;/code&gt; and the value I am using as a test input is &lt;code&gt;Elastic is the perfect platform for knowledgebase NLP applications&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Click the play button to send the request:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbto5nhhhn6pee4b646fg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbto5nhhhn6pee4b646fg.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this case the predicted sentiment is "joy".&lt;/p&gt;

&lt;p&gt;That's it, the model is working. 🚀&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: You can run more tests to determine if this model works for what you need.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;To retrieve statistics for your model, use the &lt;code&gt;_stats&lt;/code&gt; API:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;GET _ml/trained_models/&amp;lt;model_id&amp;gt;/_stats&lt;/code&gt;&lt;/p&gt;
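&lt;p&gt;As a sketch, the &lt;code&gt;_infer&lt;/code&gt; request body above can also be assembled programmatically before sending it to the cluster. The helpers below are hypothetical illustrations; the response shape assumed in &lt;code&gt;top_prediction&lt;/code&gt; follows the typical classification output and may vary by model:&lt;/p&gt;

```python
# Sketch: build the request body for the trained model deployment _infer endpoint.
# "text_field" is the input field name the model was configured with.

def build_infer_body(texts):
    """Return the docs array wrapper the _infer endpoint expects."""
    return {"docs": [{"text_field": text} for text in texts]}

def top_prediction(inference_doc):
    """Pull the predicted label and its probability from one inference result.

    Assumes the usual classification response shape, e.g.
    {"predicted_value": "joy", "prediction_probability": 0.93}.
    """
    return inference_doc["predicted_value"], inference_doc["prediction_probability"]

body = build_infer_body(
    ["Elastic is the perfect platform for knowledgebase NLP applications"]
)
```

&lt;p&gt;The resulting dictionary would be sent as the JSON body of the &lt;code&gt;POST&lt;/code&gt; request shown earlier.&lt;/p&gt;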

&lt;h2&gt;
  
  
  Let's continue with part 2: how to run this model on data as it is being ingested
&lt;/h2&gt;

&lt;p&gt;To do this, let's start by importing a .csv file into Elasticsearch, so that we can run the model while the data is being imported.&lt;/p&gt;

&lt;p&gt;I think it's interesting to run this analysis on unstructured text, and tweets are a good use case.&lt;/p&gt;

&lt;p&gt;Recently Elon Musk announced his interest in buying Twitter, but before that he was famously active on the platform. As we have a sentiment analysis model, let's proceed with analyzing a sample of Elon's tweets.&lt;/p&gt;

&lt;p&gt;I found this dataset on &lt;a href="https://www.kaggle.com/datasets/kulgen/elon-musks-tweets?resource=download" rel="noopener noreferrer"&gt;Kaggle&lt;/a&gt;, a good website for locating datasets.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: This is not a huge amount of data: 172 KB of tweets between November 16, 2012 and September 29, 2017. But since this is not a research paper, that is not a problem.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Feel free to use whatever data you prefer, or even the Twitter API.&lt;/p&gt;

&lt;p&gt;Let's download this file:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwow2ov4dx39uf5dv1v6h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwow2ov4dx39uf5dv1v6h.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And import it into Elasticsearch.&lt;/p&gt;

&lt;p&gt;There are different ways to do this, but since this is a small .csv file, we can use the &lt;code&gt;Upload a file&lt;/code&gt; integration.&lt;/p&gt;

&lt;p&gt;In the Kibana menu, click &lt;code&gt;Integrations&lt;/code&gt;. You will see a list of the integrations available for collecting data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4ka2xdgnc58lw05io3pn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4ka2xdgnc58lw05io3pn.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Search for &lt;code&gt;Upload a file&lt;/code&gt; as in the image below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx7gsre12r11ikirgdbpf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx7gsre12r11ikirgdbpf.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And then click &lt;code&gt;Select or drag and drop a file&lt;/code&gt; and choose the .csv file you downloaded earlier, in our case &lt;code&gt;data_elonmusk.csv&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;You will see something similar to the image below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxd9e2q37zy997gil6cda.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxd9e2q37zy997gil6cda.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;Override settings&lt;/code&gt; to rename the Tweet column to &lt;code&gt;text_field&lt;/code&gt;. As explained before, there needs to be a field that matches your configured trained model input, which is typically called &lt;code&gt;text_field&lt;/code&gt;. This way, the model can identify the field to be analyzed.&lt;/p&gt;

&lt;p&gt;Rename the Tweet column/field to &lt;code&gt;text_field&lt;/code&gt;. Click &lt;code&gt;Apply&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdsjvrcfrx86olzftw2gt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdsjvrcfrx86olzftw2gt.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After the page loads, click &lt;code&gt;Import&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl40dc6yrilyp7sq0mrez.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl40dc6yrilyp7sq0mrez.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And then click &lt;code&gt;Advanced&lt;/code&gt; to edit the import process settings.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkxuvs4xzkqfde9c3n6an.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkxuvs4xzkqfde9c3n6an.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The import process has several steps:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Processing file&lt;/strong&gt; - Turning the data into NDJSON documents so they can be ingested using the bulk API&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Creating index&lt;/strong&gt; - Creating the index using the settings and mappings objects&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Creating ingest pipeline&lt;/strong&gt; - Creating the ingest pipeline using the ingest pipeline object&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Uploading data&lt;/strong&gt; - Loading data into the new Elasticsearch index&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Creating a data view&lt;/strong&gt; (Index pattern) - Create a Kibana index pattern (if the user has opted to)&lt;/p&gt;
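&lt;p&gt;The first step, turning .csv rows into NDJSON action/source pairs for the bulk API, can be sketched as follows. The index and column names here are illustrative, not the ones the integration generates:&lt;/p&gt;

```python
import csv
import io
import json

def csv_to_bulk_ndjson(csv_text, index_name):
    """Convert CSV rows into the NDJSON action/source line pairs the _bulk API expects."""
    lines = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        lines.append(json.dumps({"index": {"_index": index_name}}))  # action line
        lines.append(json.dumps(row))                                # source line
    return "\n".join(lines) + "\n"  # bulk payloads must end with a newline

bulk_payload = csv_to_bulk_ndjson("Tweet,Time\nhello world,2015-01-01\n", "elon-tweets")
```

&lt;p&gt;The &lt;code&gt;Upload a file&lt;/code&gt; integration does this conversion for you; the sketch only shows the shape of the payload it produces.&lt;/p&gt;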

&lt;p&gt;As you can see, the &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/csv-processor.html#csv-processor" rel="noopener noreferrer"&gt;CSV processor&lt;/a&gt; is being used in the ingest pipeline to import your documents.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgm89yqaqulgrkog509y7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgm89yqaqulgrkog509y7.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Feel free to edit the mapping or ingest pipeline.&lt;/p&gt;

&lt;p&gt;In our case we need to edit the ingest pipeline to add our previously trained and imported model.&lt;/p&gt;

&lt;p&gt;Add the model that will infer the data being ingested into the processor as in the image below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu87kayxj9t342qiqrwem.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu87kayxj9t342qiqrwem.png" alt="Image description"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  {
       "inference": {
       "model_id": "bhadresh-savani__distilbert-base-uncased-emotion"
        }
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After that, add your index name and click Import. If it doesn't work for some reason, repeat the process and check whether you typed something incorrectly.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvyxuv1tfeircaq6pplkl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvyxuv1tfeircaq6pplkl.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: What we are doing is adding your model for &lt;strong&gt;inference&lt;/strong&gt; in the &lt;strong&gt;ingest pipeline&lt;/strong&gt;, it doesn't need to be a .csv. Read more about it &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;When it finishes loading, your screen will look like mine, click &lt;code&gt;View index in Discover&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvysyz999acviwjjli77m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvysyz999acviwjjli77m.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you didn't disable &lt;code&gt;Create data view&lt;/code&gt; when you were importing data you should be able to locate your index by the name you used. Now you can explore your index data. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7y3c4dao2p648jpbak6j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7y3c4dao2p648jpbak6j.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next to the word &lt;code&gt;Documents&lt;/code&gt;, click &lt;code&gt;Field statistics&lt;/code&gt;. At the time of writing this is a beta feature, but it is excellent for exploring your data. As we can see, according to this sentiment analysis model, joy was the predicted sentiment in 70% of the analyzed tweets. The second most common sentiment in Elon's tweets was anger, followed by fear.&lt;/p&gt;

&lt;p&gt;Let's click the Lens button on the right side of the screen to open &lt;a href="https://www.elastic.co/guide/en/kibana/current/lens.html" rel="noopener noreferrer"&gt;Kibana Lens&lt;/a&gt; and explore this data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flp2lijn7t9z60yrgt1h2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flp2lijn7t9z60yrgt1h2.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When the screen loads, click and drag the Time field to explore the data by the date of each tweet.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3s4ouyausor7fjv2hsr4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3s4ouyausor7fjv2hsr4.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the time field is added, some visualization suggestions will appear. I liked one of them, but instead of a 30-day interval I edited it to show an annual view. Also try filtering by prediction probability between 0.90 and 1 to keep only high-confidence predictions. Here you can have fun with whatever analysis you want to run.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F67i77qwdj2ujmf57j6z5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F67i77qwdj2ujmf57j6z5.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Apparently anger increased over time, but joy remains the most common sentiment in Elon's tweets. Fear increased until the beginning of 2016 but decreased in 2017.&lt;/p&gt;

&lt;p&gt;Well, there are several possible interpretations of data. We always need to take into account the model used, its accuracy, the quality of our data, the information we seek, the type of analysis, and our own interpretation, context, and knowledge. But I believe it is now possible to see how useful analyzing language can be.&lt;/p&gt;

&lt;p&gt;For example, try running a classification model on the inference data (which is now a new field) to predict sentiment, in addition to checking for influencers. Also try importing other models and using other datasets.&lt;/p&gt;

&lt;p&gt;I also imported an &lt;a href="https://huggingface.co/dslim/bert-base-NER" rel="noopener noreferrer"&gt;NER&lt;/a&gt; model to identify entities in the same dataset so we can start to correlate text topics (keywords) with sentiment. The year Elon talked about Tesla the most in this dataset was 2015, which coincides with the year with the greatest increase in joy.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwbam6v4cnp7nkcls4n5m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwbam6v4cnp7nkcls4n5m.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This &lt;a href="https://money.cnn.com/2015/02/12/investing/tesla-apple-elon-musk/" rel="noopener noreferrer"&gt;news&lt;/a&gt; is from 2015 and Elon was really positive about Tesla even with the company reporting losses. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flmfgl7j6yni47fq3t6fi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flmfgl7j6yni47fq3t6fi.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Again, these are not necessarily facts. But my goal is to show a little bit of what we can do with NLP analysis and correlation (which &lt;strong&gt;does not imply causation&lt;/strong&gt; 😅).&lt;/p&gt;

&lt;h2&gt;
  
  
  Let's proceed with the last part: how to run this model on an existing index
&lt;/h2&gt;

&lt;p&gt;If your data is already &lt;strong&gt;indexed&lt;/strong&gt; and you want to run the model on it without changing the index content, &lt;strong&gt;this is possible&lt;/strong&gt;. If this is your case, let's proceed with this test.&lt;/p&gt;

&lt;p&gt;In the Kibana menu, click &lt;code&gt;Ingest Pipelines&lt;/code&gt;, then &lt;code&gt;Create pipeline&lt;/code&gt; and &lt;code&gt;New pipeline&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feulb9ea55pmg7uda3ync.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feulb9ea55pmg7uda3ync.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Give your pipeline a &lt;code&gt;name&lt;/code&gt; and click &lt;code&gt;Add a processor&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The first step is to rename the field that will be inferred to &lt;code&gt;text_field&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;To do that, add the Rename processor: in the Field box, enter the name of the field to be renamed, and in the Target field box, enter &lt;code&gt;text_field&lt;/code&gt;. Then click &lt;code&gt;Add&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqyquh4cy4uxfjoodrg7l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqyquh4cy4uxfjoodrg7l.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we will add the Inference processor. Click &lt;code&gt;Add a processor&lt;/code&gt; again, and under Model ID add your model ID, in our case: &lt;code&gt;bhadresh-savani__distilbert-base-uncased-emotion&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;Add&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;Create pipeline&lt;/code&gt; and copy the &lt;code&gt;name&lt;/code&gt; of your pipeline, you will need it later.&lt;/p&gt;
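&lt;p&gt;If you prefer the API over the UI, the same pipeline (a rename processor followed by an inference processor) can be created with a &lt;code&gt;PUT _ingest/pipeline&lt;/code&gt; request. Below is a sketch of the request body; the source field name &lt;code&gt;message&lt;/code&gt; is an assumption and stands in for whichever field holds the text in your index:&lt;/p&gt;

```python
# Sketch of an ingest pipeline body equivalent to the UI steps above.
# "message" is a placeholder for the field that holds your text.

model_id = "bhadresh-savani__distilbert-base-uncased-emotion"

pipeline_body = {
    "processors": [
        # 1. Rename the text field to the input field the model expects.
        {"rename": {"field": "message", "target_field": "text_field"}},
        # 2. Run inference with the imported model.
        {"inference": {"model_id": model_id}},
    ]
}

# This body would be sent as: PUT _ingest/pipeline/your-pipeline-name
```

&lt;p&gt;The processor order matters: the rename must run before the inference processor so that &lt;code&gt;text_field&lt;/code&gt; exists when the model reads it.&lt;/p&gt;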

&lt;p&gt;Now open &lt;code&gt;Dev Tools&lt;/code&gt; and run the following request (adding your source index, dest index and pipeline name):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;POST _reindex
{
  "source": {
    "index": "&amp;lt;your-source-index-name&amp;gt;"
  },
  "dest": {
    "index": "&amp;lt;your-ml-dest-index-name&amp;gt;",
    "pipeline": "&amp;lt;your-pipeline-name&amp;gt;"
  }
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-reindex.html#docs-reindex-filter-source" rel="noopener noreferrer"&gt;This&lt;/a&gt; copies documents from a source to a destination. You can copy all documents to the destination index, or reindex a subset of the documents, you can also use source filtering to reindex a subset of the fields in the original documents.&lt;/p&gt;

&lt;p&gt;This will take some time, wait for the successful response as in the image below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiwq0wug49rf96y34f9mj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiwq0wug49rf96y34f9mj.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For this new index you don't yet have a &lt;a href="https://www.elastic.co/guide/en/kibana/current/data-views.html" rel="noopener noreferrer"&gt;Data View&lt;/a&gt;, which you need in order to access and explore the data in Elasticsearch. To create one, click &lt;code&gt;Stack Management&lt;/code&gt; in the Kibana menu and then click &lt;code&gt;Data Views&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;Create new data view&lt;/code&gt; and then for the Name field add the name of your new index, in my case it is &lt;code&gt;elon-output-ml&lt;/code&gt;. Click &lt;code&gt;Create data view&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Now open &lt;code&gt;Discover&lt;/code&gt; and select the new index.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbnzjexmxzupe0p7omhgd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbnzjexmxzupe0p7omhgd.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That's it, without making changes to your current index you have a new index with the result of this model.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;I hope you enjoy using NLP with the Elastic Stack! Feedback is always welcome.&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;This post is part of a series that covers &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Artificial Intelligence with a focus on Elastic's (Creators of Elasticsearch) Machine Learning solution&lt;/a&gt;, aiming to introduce and exemplify the possibilities and options available, in addition to addressing the context and usability.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>nlp</category>
      <category>tutorial</category>
      <category>elasticsearch</category>
    </item>
    <item>
      <title>Troubleshooting beginner level Elasticsearch errors</title>
      <dc:creator>Lisa Jung</dc:creator>
      <pubDate>Fri, 20 Aug 2021 14:22:25 +0000</pubDate>
      <link>https://dev.to/elastic/troubleshooting-beginner-level-elasticsearch-errors-5522</link>
      <guid>https://dev.to/elastic/troubleshooting-beginner-level-elasticsearch-errors-5522</guid>
      <description>&lt;p&gt;Throughout my blog posts, we have learned about &lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-performing-crud-operations-with-elasticsearch-kibana-1h0n"&gt;CRUD operations&lt;/a&gt;, &lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-understanding-the-relevance-of-your-search-with-elasticsearch-and-kibana-29n6"&gt;fine tuning the relevance of your search&lt;/a&gt;, &lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-running-queries-with-elasticsearch-and-kibana-4kn9"&gt;queries&lt;/a&gt;, &lt;a href="https://dev.to/lisahjung/beginner-s-guide-running-aggregations-with-elasticsearch-and-kibana-16bn"&gt;aggregations&lt;/a&gt;, and &lt;a href="https://dev.to/lisahjung/beginner-s-guide-understanding-mapping-with-elasticsearch-and-kibana-3646"&gt;mapping&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;As you continue your journey with Elasticsearch, you will inevitably encounter some common errors associated with the topics we have covered in the blogs. &lt;/p&gt;

&lt;p&gt;Learn how to troubleshoot these pesky errors so you can get unstuck! &lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisite work
&lt;/h2&gt;

&lt;p&gt;Watch this &lt;a href="https://www.youtube.com/watch?v=CCTgroOcyfM" rel="noopener noreferrer"&gt;video&lt;/a&gt; from time stamp 14:28-21:46. This video will show you how to complete steps 1 and 2.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Set up Elasticsearch and Kibana* &lt;/li&gt;
&lt;li&gt;Add the &lt;a href="https://www.kaggle.com/rmisra/news-category-dataset" rel="noopener noreferrer"&gt;news headline dataset&lt;/a&gt; and &lt;a href="https://www.kaggle.com/rmisra/news-category-dataset" rel="noopener noreferrer"&gt;e-commerce dataset&lt;/a&gt; to Elasticsearch* &lt;/li&gt;
&lt;li&gt;Open the Kibana console (AKA Dev Tools) &lt;/li&gt;
&lt;li&gt;Keep two windows open side by side (this blog and the Kibana console)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We will be sending requests from Kibana to Elasticsearch to learn how to troubleshoot errors! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Notes&lt;/strong&gt;&lt;br&gt;
1) If you would rather download Elasticsearch and Kibana on your own machine, follow the steps outlined in &lt;a href="https://dev.to/elastic/downloading-elasticsearch-and-kibana-macos-linux-and-windows-1mmo"&gt;Downloading Elasticsearch and Kibana(macOS/Linux and Windows)&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;2) The video will show you how to add the news headlines dataset. &lt;/p&gt;

&lt;p&gt;Add the news headlines data first then follow the same steps to add the e-commerce dataset. &lt;/p&gt;

&lt;p&gt;Create an index called &lt;code&gt;ecommerce_original_data&lt;/code&gt; and add the e-commerce data to that index. &lt;/p&gt;
&lt;h2&gt;
  
  
  Additional Resources
&lt;/h2&gt;

&lt;p&gt;Interested in beginner friendly workshops on Elasticsearch and Kibana? Check out my Beginner's Crash Course to Elastic Stack series!&lt;/p&gt;

&lt;p&gt;1) &lt;a href="https://www.youtube.com/watch?v=jzBoSHcmTN0" rel="noopener noreferrer"&gt;Part 6 Workshop Recording&lt;/a&gt;&lt;br&gt;
This blog is a complementary blog to Part 6 of the Beginner's Crash Course to Elastic Stack. If you prefer learning by watching videos instead, check out the recording!&lt;/p&gt;

&lt;p&gt;2) &lt;a href="https://github.com/LisaHJung/Beginners-Crash-Course-to-the-Elastic-Stack-Series" rel="noopener noreferrer"&gt;Beginner's Crash Course to Elastic Stack Table of Contents&lt;/a&gt;&lt;br&gt;
This table of contents includes repos from all workshops in the series. Each repo includes resources shared during the workshop including the video recording, presentation slides, related blogs, Elasticsearch requests and more!&lt;/p&gt;
&lt;h2&gt;
  
  
  Want To Troubleshoot Your Errors? Follow The Clues!
&lt;/h2&gt;

&lt;p&gt;Whenever you perform an action with Elasticsearch and Kibana, Elasticsearch responds with an HTTP status (red box) and a response body (blue box). &lt;/p&gt;

&lt;p&gt;The request below asks Elasticsearch to index a document and assign it an id of 1. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg3hcz5v462pw2ijtyzc3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg3hcz5v462pw2ijtyzc3.png" alt="image" width="800" height="383"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The HTTP status of 201 (red box) indicates that the document has been successfully created. &lt;/p&gt;

&lt;p&gt;The response body (blue box) indicates that a document with an assigned id of 1 has been created in the &lt;code&gt;beginners_crash_course&lt;/code&gt; index. &lt;/p&gt;

&lt;p&gt;As we work with Elasticsearch, we will inevitably encounter error messages like the one below. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuaabnvqh0ee72d8tf4yw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuaabnvqh0ee72d8tf4yw.png" alt="image" width="800" height="677"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When this happens, the HTTP status and the response body will provide valuable clues about why the request failed! &lt;/p&gt;

&lt;p&gt;When you are a beginner, it is hard to even understand what the error message is saying, and often it is hard to know where to start.&lt;/p&gt;

&lt;p&gt;I find that simply seeing different types of error messages and learning how to troubleshoot them makes the troubleshooting process less intimidating. &lt;/p&gt;

&lt;p&gt;To keep things simple, we are going to limit our scope. We will focus on errors you might see while working with requests covered in previous blogs. &lt;/p&gt;

&lt;p&gt;We are going to learn how to interpret these error messages and how we should approach troubleshooting these errors.&lt;/p&gt;
&lt;h3&gt;
  
  
  Common Errors
&lt;/h3&gt;

&lt;p&gt;Here are some common errors that you may encounter as you work with Elasticsearch. &lt;/p&gt;
&lt;h4&gt;
  
  
  Unable to connect
&lt;/h4&gt;

&lt;p&gt;The cluster may be down or it may be a network issue. Check the network status and cluster health to identify the problem. &lt;/p&gt;
&lt;h4&gt;
  
  
  Connection unexpectedly closed
&lt;/h4&gt;

&lt;p&gt;The node may have died or it may be a network issue. Retry your request. &lt;/p&gt;
&lt;h4&gt;
  
  
  5XX Errors
&lt;/h4&gt;

&lt;p&gt;Errors with an HTTP status starting with 5 stem from internal server errors in Elasticsearch. When you see one of these, take a look at the Elasticsearch log to identify the problem. &lt;/p&gt;
&lt;h4&gt;
  
  
  4XX Errors
&lt;/h4&gt;

&lt;p&gt;Errors with an HTTP status starting with 4 stem from client errors. When you see one of these, correct the request before retrying. &lt;/p&gt;

&lt;p&gt;As beginners, we are still familiarizing ourselves with the rules and syntax required to communicate with Elasticsearch. The majority of the error messages we encounter are likely to have been caused by mistakes made while writing our requests (4XX errors). When we see such an error, we need to fix the request before trying it again. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;To strengthen our understanding of the requests we have learned throughout the previous blogs, we will only focus on 4XX errors during this blog.&lt;/strong&gt; &lt;/p&gt;
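&lt;p&gt;The 4XX/5XX distinction above can be captured in a tiny helper, sketched here for illustration:&lt;/p&gt;

```python
def error_category(status):
    """Classify an HTTP status code the way this troubleshooting guide does."""
    if status // 100 == 4:
        return "client error: fix the request before retrying"
    if status // 100 == 5:
        return "server error: check the Elasticsearch log"
    return "not an error"
```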
&lt;h2&gt;
  
  
  Thought Process For Troubleshooting Errors
&lt;/h2&gt;

&lt;p&gt;Whenever we see an error message, it is really helpful to work through a consistent troubleshooting thought process.&lt;/p&gt;

&lt;p&gt;Follow this process to narrow down the cause and find the resources that will help you fix the error.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;What number does the HTTP status start with (4XX? 5XX?)&lt;/li&gt;
&lt;li&gt;What does the response say? Always read the full message!&lt;/li&gt;
&lt;li&gt;Use the &lt;a href="https://www.elastic.co/guide/index.html" rel="noopener noreferrer"&gt;Elasticsearch documentation&lt;/a&gt; as your guide. Compare your request with the example from the documentation. Identify the mistake and make appropriate changes.&lt;/li&gt;
&lt;/ol&gt;
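&lt;p&gt;To make step 2 concrete, here is a simplified sketch of what a 4XX response body looks like (the field names match what Elasticsearch returns, though real responses include additional detail such as a root_cause array):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "error": {
    "type": "index_not_found_exception",
    "reason": "no such index [common_errors]"
  },
  "status": 404
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The "status" field and the "error.reason" string are usually enough to tell you which rule you broke.&lt;/p&gt;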

&lt;p&gt;&lt;strong&gt;At times, you will encounter error messages that are not very helpful. We will go over a couple of these and see how we can troubleshoot these types of errors.&lt;/strong&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Trip Down Memory Lane
&lt;/h2&gt;

&lt;p&gt;In my previous blogs, we learned how to send requests related to the following topics:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-performing-crud-operations-with-elasticsearch-kibana-1h0n"&gt;CRUD operations&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-running-queries-with-elasticsearch-and-kibana-4kn9"&gt;Queries&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://dev.to/lisahjung/beginner-s-guide-running-aggregations-with-elasticsearch-and-kibana-16bn"&gt;Aggregations&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://dev.to/lisahjung/beginner-s-guide-understanding-mapping-with-elasticsearch-and-kibana-3646"&gt;Mapping&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We will revisit each topic and troubleshoot common errors you may encounter as you explore each topic. &lt;/p&gt;
&lt;h2&gt;
  
  
  Errors Associated With CRUD Operations
&lt;/h2&gt;
&lt;h3&gt;
  
  
  Error 1: 404 No such index[x]
&lt;/h3&gt;

&lt;p&gt;In the &lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-performing-crud-operations-with-elasticsearch-kibana-1h0n"&gt;Beginner's guide to performing CRUD operations with Elasticsearch and Kibana&lt;/a&gt;, we learned how to perform CRUD operations. &lt;/p&gt;

&lt;p&gt;Let's say we have sent the following request to retrieve a document with an id of 1 from the &lt;code&gt;common_errors&lt;/code&gt; index.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Request sent:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;common_errors&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 404 error (red box) along with the cause of the error (blue lines) in the response body. &lt;/p&gt;

&lt;p&gt;The HTTP status starts with a 4, meaning that there was a client error with the request sent.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8t4ow0xhiz4zfrhb62el.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8t4ow0xhiz4zfrhb62el.png" alt="image" width="800" height="687"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When you look at the response body, Elasticsearch lists the reason (line 6) as "no such index [common_errors]". &lt;/p&gt;

&lt;p&gt;The two possible explanations for this error are:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The index &lt;code&gt;common_errors&lt;/code&gt; truly does not exist or was deleted&lt;/li&gt;
&lt;li&gt;We do not have the correct index name&lt;/li&gt;
&lt;/ol&gt;
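&lt;p&gt;One quick way to check both possibilities (an optional step, using the standard cat indices API) is to list the indices in your cluster and look for the name you expect:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;GET _cat/indices?v
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;If &lt;code&gt;common_errors&lt;/code&gt; does not appear in the list, the index does not exist; if a similarly spelled name appears, you have a typo in your request.&lt;/p&gt;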
&lt;h3&gt;
  
  
  Cause of Error 1
&lt;/h3&gt;

&lt;p&gt;In our example, the cause of the error is quite clear! We have not created an index called &lt;code&gt;common_errors&lt;/code&gt;, yet we tried to retrieve a document from an index that does not exist. &lt;/p&gt;

&lt;p&gt;Let's create an index called &lt;code&gt;common_errors&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;Name&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="k"&gt;of&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;the&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;Index&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example asks Elasticsearch to create a new index called &lt;code&gt;common_errors&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;common_errors&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the example into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 200 (success) HTTP status (red box) acknowledging that the index &lt;code&gt;common_errors&lt;/code&gt; has been successfully created (blue box). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh0nhqiadeq017q80sq99.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh0nhqiadeq017q80sq99.png" alt="image" width="800" height="231"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Error 2: 405 Incorrect HTTP method for uri, allowed: [x]
&lt;/h3&gt;

&lt;p&gt;Now that we have created the index &lt;code&gt;common_errors&lt;/code&gt;, let's index a document!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Suppose you remember that you can use the HTTP verb PUT to index a document, and you send the following request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;common_errors&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source_of_error&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Using the wrong syntax for PUT or POST indexing request&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 405 error (red box) along with the cause of the error (blue lines) in the response body. &lt;/p&gt;

&lt;p&gt;This HTTP status starts with a 4, meaning that there was a client error with the request sent.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Forawtdcvrht1ltowx80t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Forawtdcvrht1ltowx80t.png" alt="image" width="800" height="231"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you look at the response, Elasticsearch lists the reason as "Incorrect HTTP method for uri... and method: [PUT], allowed:[POST]". &lt;/p&gt;
&lt;h3&gt;
  
  
  Cause of Error 2
&lt;/h3&gt;

&lt;p&gt;This error message suggests that we used the wrong HTTP verb to index this document. &lt;/p&gt;

&lt;p&gt;You can use either the PUT or the POST HTTP verb to index a document. Each verb serves a different purpose and requires a different syntax. &lt;/p&gt;

&lt;p&gt;We learned about the difference between the two verbs in the &lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-performing-crud-operations-with-elasticsearch-kibana-1h0n"&gt;Beginner's guide to performing CRUD operations with Elasticsearch and Kibana&lt;/a&gt; under the &lt;code&gt;Index a document&lt;/code&gt; section.  &lt;/p&gt;

&lt;p&gt;The following are excerpts from the blog. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;When indexing a document, HTTP verb PUT or POST can be used.&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The HTTP verb PUT is used when you want to assign a specific id to your document.&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="k"&gt;of&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;the&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;Index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;you&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;want&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;to&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;assign&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;to&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nb"&gt;document&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nl"&gt;field_name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;value&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Let's compare the syntax to the request we just sent:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;common_errors&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source_of_error&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Using the wrong syntax for PUT or POST indexing request&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You will see that our request uses the HTTP verb PUT. However, it does not include the document id we want to assign to this document. &lt;/p&gt;

&lt;p&gt;If you add the id of the document to the request as seen below, you will see that the request is carried out without a hitch!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Correct example for PUT indexing request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;common_errors&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source_of_error&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Using the wrong syntax for PUT or POST indexing request&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the correct example into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 201 (success) HTTP status (red box) acknowledging that document 1 has been successfully created. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhy28npjyzup11d7hc7zt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhy28npjyzup11d7hc7zt.png" alt="image" width="800" height="369"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The HTTP verb POST is used when you want Elasticsearch to autogenerate an id for the document.&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;If this is the option you wanted, you can fix the error by replacing the verb PUT with POST and omitting the document id after the document endpoint (_doc). &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;Name&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="k"&gt;of&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;the&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;Index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nl"&gt;field_name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;value&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Correct example for POST indexing request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;common_errors&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source_of_error&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Using the wrong syntax for PUT or POST indexing request&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the correct example into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt; &lt;br&gt;
Elasticsearch returns a 201 (success) HTTP status (red box) and autogenerates an id (line 4) for the document that was indexed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3x4nuavv9spqvh38xzqk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3x4nuavv9spqvh38xzqk.png" alt="image" width="800" height="371"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Error 3: 400 Unexpected Character: was expecting a comma to separate Object entries at [Source: ...] line: x
&lt;/h3&gt;

&lt;p&gt;Suppose you wanted to update document 1 by adding the fields "error" and "solution" as seen in the example.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;common_errors&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_update&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;doc&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;error&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;405 Method Not Allowed&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;solution&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Look up the syntax of PUT and POST indexing requests and use the correct syntax.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 400 error (red box) along with the cause of the error in the response body. &lt;/p&gt;

&lt;p&gt;This HTTP error starts with a 4, meaning that there was a client error with the request sent.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft45ox94ypv1mg6pyawvi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft45ox94ypv1mg6pyawvi.png" alt="image" width="800" height="727"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Cause of Error 3
&lt;/h3&gt;

&lt;p&gt;If you look at the response, Elasticsearch lists the error type (line 12) as "json_parse_exception" and the reason (line 13) as "...was expecting comma to separate Object entries at ... line: 4]". &lt;/p&gt;

&lt;p&gt;In Elasticsearch, if you have multiple fields ("error" and "solution") in an object ("doc"), you must separate the fields with a comma. The error message tells us that we need to add a comma between the fields "error" and "solution".&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Add the comma as shown below and send the following request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;common_errors&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_update&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;doc&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;error&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;405 Method Not Allowed&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;solution&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Look up the syntax of PUT and POST indexing requests and use the correct syntax.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
You will see that the document with an id of 1 has been successfully updated. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3dy56barx08t9ibrecba.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3dy56barx08t9ibrecba.png" alt="image" width="800" height="417"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Errors Associated With Sending Queries
&lt;/h2&gt;

&lt;p&gt;In blogs &lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-understanding-the-relevance-of-your-search-with-elasticsearch-and-kibana-29n6"&gt;fine tuning the relevance of your search&lt;/a&gt; and &lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-running-queries-with-elasticsearch-and-kibana-4kn9"&gt;queries&lt;/a&gt;, we learned how to send queries about news headlines in our index.&lt;/p&gt;

&lt;p&gt;As a prerequisite for those blogs, we added the news headlines dataset to an index we named &lt;code&gt;news_headlines&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;Afterwards, we sent various queries to retrieve documents that matched specific criteria. &lt;/p&gt;

&lt;p&gt;Let's go over common errors you may encounter while working with these queries. &lt;/p&gt;
&lt;h3&gt;
  
  
  Error 4: 400 [x] query does not support [y]
&lt;/h3&gt;

&lt;p&gt;Suppose you want to use the range query to pull up news headlines published within a specific date range. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;You have sent the following request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;news_headlines&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;range&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; 
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;gte&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2015-06-20&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;lte&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2015-09-22&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt; &lt;br&gt;
Elasticsearch returns a 400 error (red box) along with the cause of the error in the response body. &lt;/p&gt;

&lt;p&gt;This HTTP status starts with a 4, meaning that there was a client error with the request sent.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmor80f6zy5fp6u3bmlxj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmor80f6zy5fp6u3bmlxj.png" alt="image" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you look at the response, Elasticsearch lists the error type (line 5) as "parsing_exception" and the reason (line 6) as "[range] query does not support [date]". &lt;/p&gt;

&lt;p&gt;This error message is misleading, as the range query should be able to retrieve documents that contain terms within a provided range. It should not matter that we have requested to run a range query against the field "date". &lt;/p&gt;

&lt;p&gt;Let's check out the screenshots from the &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/7.9/query-dsl-range-query.html" rel="noopener noreferrer"&gt;Elastic documentation on the range query&lt;/a&gt; to see what is going on.   &lt;/p&gt;

&lt;p&gt;Pay attention to the syntax of the range query line by line. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Screenshot from the documentation:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzh9ffmc0rejchh67slzr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzh9ffmc0rejchh67slzr.png" alt="image" width="800" height="768"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Compare this syntax to the request we have sent earlier:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;news_headlines&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;range&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; 
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;gte&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2015-06-20&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;lte&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2015-09-22&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Cause of Error 4
&lt;/h3&gt;

&lt;p&gt;The culprit of this error is the range query syntax! &lt;/p&gt;

&lt;p&gt;Our request is missing curly brackets around the inner fields ("gte" and "lte") of the field "date". &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Let's add the curly brackets as shown below and send the following request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;news_headlines&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;range&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;gte&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2015-06-20&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;lte&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2015-09-22&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 200 (success) HTTP status (red box) and retrieves news headlines that were published within the specified date range. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2hlzfv6x3n3w2n8p3lip.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2hlzfv6x3n3w2n8p3lip.png" alt="image" width="800" height="633"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Error 5: 400 Unexpected character...: was expecting double-quote to start field name.
&lt;/h3&gt;

&lt;p&gt;In the &lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-running-queries-with-elasticsearch-and-kibana-4kn9"&gt;queries&lt;/a&gt; blog, we learned about the &lt;code&gt;multi_match&lt;/code&gt; query. This query allows you to search for the same search terms in multiple fields at one time. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Suppose you wanted to search for the phrase "party planning" in the fields "headline" and "short_description" as shown below:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;news_headlines&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;multi_match&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;party planning&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fields&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;headline&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;short_description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;phrase&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 400 error (red box) along with the cause of the error in the response body. &lt;/p&gt;

&lt;p&gt;This HTTP status starts with a 4, meaning that something is not quite right with the request sent.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F171l08lhoo7j6o4fme4p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F171l08lhoo7j6o4fme4p.png" alt="image" width="800" height="561"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you look at the response, Elasticsearch lists the error type (line 5) as "parsing_exception" and the reason (line 6) as "[multi_match] malformed query, expected [END_OBJECT] but found [FIELD_NAME]".&lt;/p&gt;
&lt;h3&gt;
  
  
  Cause of Error 5
&lt;/h3&gt;

&lt;p&gt;This is a vague error message that does not really tell you what went wrong. &lt;/p&gt;

&lt;p&gt;However, we do know that the error is coming from somewhere around line 10, which suggests that the error may have something to do with the "type" parameter (line 11). &lt;/p&gt;

&lt;p&gt;When you check the opening and closing brackets from the outside in, you will realize that the "type" parameter is placed outside of the &lt;code&gt;multi_match&lt;/code&gt; query.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Move the "type" parameter up a line and move the comma from line 10 to line 9 as shown below and send the request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;news_headlines&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;multi_match&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;party planning&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fields&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;headline&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;short_description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;],&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;phrase&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
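&lt;p&gt;One way to catch this class of mistake before sending the request is to check, in code, that every &lt;code&gt;multi_match&lt;/code&gt; parameter sits inside the &lt;code&gt;multi_match&lt;/code&gt; object rather than beside it. The sketch below is a hedged illustration: the helper name and the small parameter list are my own, not an exhaustive list of what &lt;code&gt;multi_match&lt;/code&gt; accepts.&lt;/p&gt;

```python
# Parameters that belong INSIDE the multi_match object.
# (A small illustrative subset, not the full list multi_match accepts.)
MULTI_MATCH_PARAMS = {"query", "fields", "type", "operator"}

def misplaced_multi_match_params(body):
    """Return multi_match parameters that ended up beside the
    multi_match object instead of inside it."""
    query = body.get("query", {})
    if "multi_match" not in query:
        return set()
    return MULTI_MATCH_PARAMS & (set(query) - {"multi_match"})

broken = {
    "query": {
        "multi_match": {"query": "party planning",
                        "fields": ["headline", "short_description"]},
        "type": "phrase",  # wrong level: outside multi_match
    }
}
fixed = {
    "query": {
        "multi_match": {"query": "party planning",
                        "fields": ["headline", "short_description"],
                        "type": "phrase"},
    }
}
print(misplaced_multi_match_params(broken))  # {'type'}
print(misplaced_multi_match_params(fixed))   # set()
```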



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt; &lt;br&gt;
Elasticsearch returns a 200 success response (red box). &lt;/p&gt;

&lt;p&gt;All hits contain the phrase "party planning" in the field "headline", the field "short_description", or both! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqg26hhjpi95t7h9aaqiy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqg26hhjpi95t7h9aaqiy.png" alt="image" width="800" height="744"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Error 6: 400 parsing_exception
&lt;/h3&gt;

&lt;p&gt;When we search for something, we often ask a multi-faceted question. For example, you may want to retrieve entertainment news headlines published on "2018-04-12". &lt;/p&gt;

&lt;p&gt;This question actually requires sending multiple queries in one request:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A query that retrieves documents from the "ENTERTAINMENT" category &lt;/li&gt;
&lt;li&gt;A query that retrieves documents that were published on "2018-04-12"&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Let's say you are most familiar with the &lt;code&gt;match query&lt;/code&gt;, so you write the following request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;news_headlines&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;match&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;category&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ENTERTAINMENT&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2018-04-12&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 400 error (red box) along with the cause of the error in the response body. &lt;/p&gt;

&lt;p&gt;This HTTP status starts with a 4, meaning that something is off with the query syntax.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F28d31edkak5812aun363.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F28d31edkak5812aun363.png" alt="image" width="800" height="556"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you look at the response, Elasticsearch lists the error type (line 5) as "parsing_exception" and the reason (line 6) as "[match] query doesn't support multiple fields, found [category] and [date]". &lt;/p&gt;
&lt;h3&gt;
  
  
  Cause of Error 6
&lt;/h3&gt;

&lt;p&gt;Elasticsearch throws an error because a &lt;code&gt;match query&lt;/code&gt; can search only one field at a time. Our request tried to query two fields ("category" and "date") with a single &lt;code&gt;match query&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Bool Query&lt;/strong&gt;&lt;br&gt;
In the &lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-running-queries-with-elasticsearch-and-kibana-4kn9"&gt;queries&lt;/a&gt; blog, we learned how to combine multiple queries into one request by using the &lt;code&gt;bool query&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;With the &lt;code&gt;bool query&lt;/code&gt;, you can combine multiple queries into one request and you can further specify boolean clauses to narrow down your search results. &lt;/p&gt;

&lt;p&gt;This query offers four clauses that you can choose from:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;must&lt;/li&gt;
&lt;li&gt;must_not&lt;/li&gt;
&lt;li&gt;should&lt;/li&gt;
&lt;li&gt;filter &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You can mix and match any of these clauses to get the relevant search results you want. &lt;/p&gt;
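&lt;p&gt;These clauses can also be assembled programmatically. Below is a minimal Python sketch (the helper name &lt;code&gt;bool_query&lt;/code&gt; is my own) that builds a &lt;code&gt;bool query&lt;/code&gt; body from any mix of the four clauses and rejects misspelled clause names early:&lt;/p&gt;

```python
def bool_query(**clauses):
    """Assemble a bool query body from clause lists.

    Accepts any of: must, must_not, should, filter.
    Each value is a list of individual query objects.
    """
    allowed = {"must", "must_not", "should", "filter"}
    unknown = set(clauses) - allowed
    if unknown:
        # Fail fast in the client rather than with a 400 from Elasticsearch.
        raise ValueError(f"unknown bool clause(s): {unknown}")
    return {"query": {"bool": dict(clauses)}}

# Our use case: two yes/no criteria, both in the filter clause.
body = bool_query(filter=[
    {"match": {"category": "ENTERTAINMENT"}},
    {"match": {"date": "2018-04-12"}},
])
```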

&lt;p&gt;In our use case, we have two queries:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A query that retrieves documents from the "ENTERTAINMENT" category &lt;/li&gt;
&lt;li&gt;A query that retrieves documents that were published on "2018-04-12"&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The news headlines we want could be filtered into a yes or no category:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Is the news headline from the "ENTERTAINMENT" category? &lt;strong&gt;yes&lt;/strong&gt; or no
&lt;/li&gt;
&lt;li&gt;Was the news headline published on "2018-04-12"? &lt;strong&gt;yes&lt;/strong&gt; or no&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Since each document can be filtered into a yes or no category, we can use the filter clause and include two &lt;code&gt;match queries&lt;/code&gt; within it:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;news_headlines&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;bool&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;filter&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;match&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;category&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ENTERTAINMENT&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;match&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2018-04-12&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 200 success HTTP status (red box) and shows the top 10 hits whose "category" field contains the value "ENTERTAINMENT" and whose "date" field contains the value "2018-04-12".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9kysrsp2s84tj6gv1spv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9kysrsp2s84tj6gv1spv.png" alt="image" width="800" height="757"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Errors Associated With Aggregations and Mapping
&lt;/h2&gt;

&lt;p&gt;Suppose you want to get the summary of categories that exist in our dataset. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Since this requires summarizing your data, you decide to send the following aggregations request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;news_headlines&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;by_category&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;terms&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;category&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
By default, Elasticsearch returns both the top 10 search hits and aggregations results. Notice that the top 10 search hits take up lines 16-168. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F43x222w0c477hlnsgtay.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F43x222w0c477hlnsgtay.png" alt="image" width="800" height="745"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Error 7: 400 Aggregation definition for [x], expected a [y]
&lt;/h3&gt;

&lt;p&gt;Let's say you are only interested in aggregations results.&lt;/p&gt;

&lt;p&gt;You remember that you can add a "size" parameter and set it equal to 0 to avoid fetching the hits. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;You send the following request to accomplish this task:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;news_headlines&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;by_category&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;terms&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;category&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt; &lt;br&gt;
Elasticsearch returns a 400 error (red box) along with the cause of the error in the response body. &lt;/p&gt;

&lt;p&gt;This HTTP status starts with a 4, meaning that there was a client error with the request sent.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fftvvedv5sxvaq8a3124f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fftvvedv5sxvaq8a3124f.png" alt="image" width="800" height="573"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you look at the response, Elasticsearch lists the error type (line 5) as "parsing_exception" and the reason (line 6) as "Aggregation definition for [size starts with a [VALUE_NUMBER], expected a [START_OBJECT]". &lt;/p&gt;

&lt;p&gt;Something is off with our aggregations request syntax.  &lt;/p&gt;

&lt;p&gt;Let's take a look at the screenshots from the &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/7.13/search-aggregations.html" rel="noopener noreferrer"&gt;Elastic documentation on aggregations&lt;/a&gt; and see what we missed. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Screenshot from the documentation:&lt;/strong&gt;&lt;br&gt;
Pay close attention to the syntax of the aggregations request.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4om0quwd64azibqo9gal.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4om0quwd64azibqo9gal.png" alt="image" width="800" height="1052"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Cause of Error 7
&lt;/h3&gt;

&lt;p&gt;This error occurs because the "size" parameter was placed in the spot where Elasticsearch expects the name of the aggregation ("my-agg-name"). &lt;/p&gt;

&lt;p&gt;If you scroll down to the &lt;code&gt;Return only aggregation results&lt;/code&gt; section in the documentation, you will see that the "size" parameter is placed outside of the aggregations request as shown below. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Screenshot from the documentation:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyv12ql2osqaiscvlrfve.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyv12ql2osqaiscvlrfve.png" alt="image" width="800" height="1235"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Place the "size" parameter outside of the aggregations request and set it equal to 0 as shown below. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Send the following request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;news_headlines&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;by_category&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;terms&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;category&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
As intended, Elasticsearch does not retrieve the top 10 hits (line 16). &lt;/p&gt;

&lt;p&gt;You can see the aggregations results (an array of categories) without having to scroll through the hits. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl0hwy3r4ikkessbe8h50.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl0hwy3r4ikkessbe8h50.png" alt="image" width="800" height="755"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Error 8: 400 Field [x] of type [y] is not supported for z type of aggregation
&lt;/h3&gt;

&lt;p&gt;The next two errors (errors 8 &amp;amp; 9) are related to the requests we learned in the &lt;a href="https://dev.to/lisahjung/beginner-s-guide-running-aggregations-with-elasticsearch-and-kibana-16bn"&gt;aggregations&lt;/a&gt; and &lt;a href="https://dev.to/lisahjung/beginner-s-guide-understanding-mapping-with-elasticsearch-and-kibana-3646"&gt;mapping&lt;/a&gt; blogs. &lt;/p&gt;

&lt;p&gt;In these blogs, we worked with the e-commerce dataset. &lt;/p&gt;

&lt;p&gt;In the &lt;a href="https://dev.to/lisahjung/beginner-s-guide-running-aggregations-with-elasticsearch-and-kibana-16bn"&gt;aggregations&lt;/a&gt; blog, we added the e-commerce dataset to Elasticsearch and named the index &lt;code&gt;ecommerce_original_data&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;Then, we had to follow the additional steps in the &lt;code&gt;Set up data within Elasticsearch&lt;/code&gt; section of the &lt;a href="https://dev.to/lisahjung/beginner-s-guide-running-aggregations-with-elasticsearch-and-kibana-16bn"&gt;aggregations&lt;/a&gt; blog.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Screenshot from the aggregations blog:&lt;/strong&gt;&lt;br&gt;
To set up data within Elasticsearch, we have implemented the following steps.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frx7zyngavi26nx7bruvs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frx7zyngavi26nx7bruvs.png" alt="image" width="800" height="569"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We never covered why we had to go through these steps. It was all because of the error message we are about to see next!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;From this point on, imagine that you had just added the e-commerce dataset into the &lt;code&gt;ecommerce_original_data&lt;/code&gt; index. We have not completed steps 1 and 2.&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;In the &lt;a href="https://dev.to/lisahjung/beginner-s-guide-running-aggregations-with-elasticsearch-and-kibana-16bn"&gt;aggregations&lt;/a&gt; blog, we learned how to group data into buckets based on a time interval. &lt;/p&gt;

&lt;p&gt;This type of aggregations request is called the &lt;code&gt;date_histogram aggregation&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Suppose we wanted to group our data into 8-hour buckets, so we sent the request below:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_original_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_by_8_hrs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fixed_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;8h&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 400 error (red box) along with the cause of the error in the response body. &lt;/p&gt;

&lt;p&gt;This HTTP status starts with a 4, meaning that there was a client error with the request sent.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkh9ueuzkq6svjol8ehqq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkh9ueuzkq6svjol8ehqq.png" alt="image" width="800" height="753"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you look at the response, Elasticsearch lists the error type (line 5) as "illegal_argument_exception" and the reason (line 6) as "Field [InvoiceDate] of type [keyword] is not supported for aggregation [date_histogram]".&lt;/p&gt;

&lt;p&gt;This error is different from the syntax error messages we have gone over thus far. &lt;/p&gt;

&lt;p&gt;It says that the &lt;strong&gt;field type&lt;/strong&gt; "keyword" is not supported for the &lt;code&gt;date_histogram aggregation&lt;/code&gt;, which suggests that this error may have something to do with the mapping. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Let's check the mapping of the &lt;code&gt;ecommerce_original_data&lt;/code&gt; index:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_original_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
You will see that the field "InvoiceDate" is typed as "keyword" (blue box).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp395cg228bml3szjy5u0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp395cg228bml3szjy5u0.png" alt="image" width="800" height="751"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Cause of Error 8
&lt;/h3&gt;

&lt;p&gt;Let's take a look at a screenshot from the &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/7.13/search-aggregations-bucket-datehistogram-aggregation.html" rel="noopener noreferrer"&gt;Elastic documentation on date_histogram aggregation&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Screenshot from the documentation:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhcylnaiwtqp21yy2gc05.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhcylnaiwtqp21yy2gc05.png" alt="image" width="800" height="994"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The first sentence gives us a valuable clue on why this error occurred!&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;date_histogram aggregation&lt;/code&gt; cannot be performed on a field typed as "keyword". &lt;/p&gt;

&lt;p&gt;To perform a &lt;code&gt;date_histogram aggregation&lt;/code&gt; on the "InvoiceDate" field, the "InvoiceDate" field must be mapped as field type "date". &lt;/p&gt;

&lt;p&gt;But the field "InvoiceDate" already has a mapping. What are we going to do?!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Remember, you cannot change the mapping of an existing field!&lt;/strong&gt;&lt;/p&gt;
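&lt;p&gt;For example, a request like the following (a hypothetical sketch; do not expect it to succeed) tries to change the type of the existing "InvoiceDate" field in place:&lt;/p&gt;

```
PUT ecommerce_original_data/_mapping
{
  "properties": {
    "InvoiceDate": {
      "type": "date"
    }
  }
}
```

&lt;p&gt;Elasticsearch rejects this with an error along the lines of "mapper [InvoiceDate] cannot be changed from type [keyword] to [date]", which is exactly why the reindex workflow is needed.&lt;/p&gt;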

&lt;p&gt;&lt;strong&gt;The only way you can accomplish this is to:&lt;/strong&gt;&lt;br&gt;
Step 1: Create a new index with the desired mapping &lt;br&gt;
Step 2: Reindex data from the original index to the new one&lt;br&gt;
Step 3: Send the &lt;code&gt;date_histogram aggregation&lt;/code&gt; request to the &lt;em&gt;new index&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;This is why we carried out steps 1 and 2 in the &lt;a href="https://dev.to/lisahjung/beginner-s-guide-running-aggregations-with-elasticsearch-and-kibana-16bn"&gt;aggregations&lt;/a&gt; blog.&lt;/p&gt;

&lt;p&gt;Let's go over step 1!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Create a new index (ecommerce_data) with the following mapping&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The following request creates a new index called &lt;code&gt;ecommerce_data&lt;/code&gt; with the desired mapping. Notice that the field "InvoiceDate" is typed as "date" and the "format" of the date has been specified here as well.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;mappings&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;properties&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Country&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;CustomerID&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;long&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;format&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;M/d/yyyy H:m&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceNo&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Quantity&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;long&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;StockCode&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;double&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt; &lt;br&gt;
You will get a 200 (success) HTTP status (red box) acknowledging that the index &lt;code&gt;ecommerce_data&lt;/code&gt; with the desired mapping has been successfully created. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fokyjdra8i42hnya1qrdx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fokyjdra8i42hnya1qrdx.png" alt="image" width="800" height="277"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Let's check the mapping of the &lt;code&gt;ecommerce_data&lt;/code&gt; index by sending the following request:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch retrieves the mapping of the &lt;code&gt;ecommerce_data&lt;/code&gt; index where the field "InvoiceDate" is typed as "date" and the date "format" is specified as "M/d/yyyy H:m". &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4jx179pc74ofqsiyl7as.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4jx179pc74ofqsiyl7as.png" alt="image" width="738" height="1342"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why did we have to specify the "format" of the field "InvoiceDate"?&lt;/strong&gt;&lt;br&gt;
It has to do with the format of the field "InvoiceDate" in our e-commerce dataset. &lt;/p&gt;

&lt;p&gt;We have not put any data into the &lt;code&gt;ecommerce_data&lt;/code&gt; index. However, we still have e-commerce data in the &lt;code&gt;ecommerce_original_data&lt;/code&gt; index. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Let's send the following request to retrieve documents from this index so we can see the date format of the field "InvoiceDate":&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_original_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt; 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj7yhdkpkd8mh8p1velfx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj7yhdkpkd8mh8p1velfx.png" alt="image" width="800" height="365"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The format of the field "InvoiceDate" is "M/d/yyyy H:m".&lt;/p&gt;

&lt;p&gt;By default, Elasticsearch is configured to recognize the ISO 8601 date format (e.g. 2021-07-16T17:12:56.123Z).&lt;/p&gt;

&lt;p&gt;If the date format in your dataset differs from ISO 8601, Elasticsearch will not recognize it and will throw an error.&lt;/p&gt;

&lt;p&gt;To prevent this, we specify the date format of the "InvoiceDate" field ("format": "M/d/yyyy H:m") within the mapping.&lt;/p&gt;

&lt;p&gt;The symbols used in the date format come from the Java &lt;a href="https://docs.oracle.com/javase/8/docs/api/java/time/format/DateTimeFormatter.html" rel="noopener noreferrer"&gt;DateTimeFormatter documentation&lt;/a&gt;.&lt;/p&gt;
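&lt;p&gt;As a rough cross-check of what the pattern accepts (a Python sketch, not part of any Elasticsearch request; the sample values are just for illustration), "M/d/yyyy H:m" allows month, day, hour, and minute values without leading zeros:&lt;/p&gt;

```python
from datetime import datetime

# Python's closest analogue of the Java-style pattern "M/d/yyyy H:m".
# strptime's %m, %d, %H, and %M also accept values without leading zeros.
FORMAT = "%m/%d/%Y %H:%M"

def parse_invoice_date(value: str) -> datetime:
    """Parse an InvoiceDate string such as '12/1/2010 8:26'."""
    return datetime.strptime(value, FORMAT)

print(parse_invoice_date("12/1/2010 8:26"))   # 2010-12-01 08:26:00
print(parse_invoice_date("1/4/2011 10:00"))   # 2011-01-04 10:00:00
```

&lt;p&gt;If a document's "InvoiceDate" does not match the declared pattern, indexing that document fails, which is why the "format" in the mapping must mirror the raw data exactly.&lt;/p&gt;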

&lt;p&gt;We have covered a LOT! Let's recap why we are carrying out these steps in the first place.&lt;/p&gt;

&lt;p&gt;In the &lt;a href="https://dev.to/lisahjung/beginner-s-guide-running-aggregations-with-elasticsearch-and-kibana-16bn"&gt;aggregations&lt;/a&gt; blog, we added the e-commerce dataset to the &lt;code&gt;ecommerce_original_data&lt;/code&gt; index where the field "InvoiceDate" was dynamically typed as "keyword". &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rl12eo1heko264vfxna.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rl12eo1heko264vfxna.png" alt="image" width="800" height="451"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When we tried to run a &lt;code&gt;date_histogram aggregation&lt;/code&gt; on the field "InvoiceDate", Elasticsearch threw an error saying that it can only perform the &lt;code&gt;date_histogram aggregation&lt;/code&gt; on a field typed as "date". &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F60980933%2F126417282-73f5a571-701f-4881-a5c9-f7cb953ccd5e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F60980933%2F126417282-73f5a571-701f-4881-a5c9-f7cb953ccd5e.png" alt="image" width="800" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Since we could not change the mapping of an existing field "InvoiceDate", we had to carry out step 1 where we created a new index called &lt;code&gt;ecommerce_data&lt;/code&gt; with the desired mapping for the field "InvoiceDate". &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F60980933%2F126423539-8da82a73-10bb-46cc-86d7-98ab31bea30b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fuser-images.githubusercontent.com%2F60980933%2F126423539-8da82a73-10bb-46cc-86d7-98ab31bea30b.png" alt="image" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Reindex the data from the original index (source) to the one you just created (destination)&lt;/strong&gt;&lt;br&gt;
At this point, we have a new index called &lt;code&gt;ecommerce_data&lt;/code&gt; with the desired mapping. However, there is no data in this index. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;To correct that, we will send the following request to reindex the data from the &lt;code&gt;ecommerce_original_data&lt;/code&gt; index to the &lt;code&gt;ecommerce_data&lt;/code&gt; index:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;_reindex&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;index&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ecommerce_original_data&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;dest&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;index&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ecommerce_data&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch successfully reindexes the e-commerce dataset from the &lt;code&gt;ecommerce_original_data&lt;/code&gt; index to the &lt;code&gt;ecommerce_data&lt;/code&gt; index. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3rbjbcev2vzevv9tt36e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3rbjbcev2vzevv9tt36e.png" alt="image" width="800" height="710"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3: Send the date_histogram aggregation request to the new index (ecommerce_data)&lt;/strong&gt;&lt;br&gt;
Now that the data has been reindexed to the new index, let’s send the &lt;code&gt;date_histogram aggregation&lt;/code&gt; request we sent earlier. &lt;/p&gt;

&lt;p&gt;The following request is identical to the original except that the index name has been changed to the new index (&lt;code&gt;ecommerce_data&lt;/code&gt;).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_by_8_hrs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fixed_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;8h&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 200 (success) response (red box). It divides the dataset into 8-hour buckets and returns them in the response.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F94vi5htg50nud08dw7sq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F94vi5htg50nud08dw7sq.png" alt="image" width="800" height="743"&gt;&lt;/a&gt;&lt;/p&gt;
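&lt;p&gt;Conceptually, "fixed_interval": "8h" assigns each document to the bucket whose key is its timestamp rounded down to a multiple of eight hours. Here is a minimal Python sketch of that rounding (the function name is ours; date_histogram bucket keys are epoch milliseconds):&lt;/p&gt;

```python
from datetime import datetime, timezone

EIGHT_HOURS_MS = 8 * 60 * 60 * 1000  # fixed_interval "8h" in milliseconds

def bucket_key(ts: datetime) -> int:
    """Round a UTC timestamp down to the start of its 8-hour bucket,
    returned as epoch milliseconds (the form date_histogram keys take)."""
    ms = int(ts.timestamp() * 1000)
    return ms - (ms % EIGHT_HOURS_MS)

ts = datetime(2010, 12, 1, 10, 30, tzinfo=timezone.utc)
key = bucket_key(ts)
print(datetime.fromtimestamp(key / 1000, tz=timezone.utc))  # 2010-12-01 08:00:00+00:00
```

&lt;p&gt;Because the Unix epoch falls on an 8-hour boundary, the buckets start at 00:00, 08:00, and 16:00 UTC each day, so a 10:30 transaction lands in the 08:00 bucket.&lt;/p&gt;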

&lt;p&gt;Let's go over the last error! &lt;/p&gt;
&lt;h3&gt;
  
  
  Error 9: 400 Found two aggregation type definitions in [x]: y and z
&lt;/h3&gt;

&lt;p&gt;One of the cool things about Elasticsearch is that you can build any combination of aggregations to answer more complex questions. &lt;/p&gt;

&lt;p&gt;For example, let's say we want to get the daily revenue and the number of unique customers per day.&lt;/p&gt;

&lt;p&gt;This requires grouping data into daily buckets.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9y8sfpxoozuy072gawq0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9y8sfpxoozuy072gawq0.png" alt="image" width="626" height="352"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Within each bucket, we calculate the daily revenue and the number of unique customers per day. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm640ywqnh67hjpyxyk3v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm640ywqnh67hjpyxyk3v.png" alt="image" width="619" height="353"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Let's say we wrote the following request to accomplish this task:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_per_day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;calendar_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;daily_revenue&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sum&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;script&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;doc['UnitPrice'].value * doc['Quantity'].value&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;number_of_unique_customers_per_day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;cardinality&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;CustomerID&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request above into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns a 400 error (red box) along with the cause of the error in the response body.&lt;/p&gt;

&lt;p&gt;This HTTP error starts with a 4, meaning that there was a client error with the request sent.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwao9j3khwyj4urezt4lh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwao9j3khwyj4urezt4lh.png" alt="image" width="800" height="624"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Cause of Error 9
&lt;/h3&gt;

&lt;p&gt;This error occurs because the structure of the aggregations request is incorrect.&lt;/p&gt;

&lt;p&gt;To accomplish our goal, we first group the data into daily buckets. Within each bucket, we calculate the daily revenue and the number of unique customers.&lt;/p&gt;

&lt;p&gt;Therefore, our request needs an aggregation (pink brackets) nested within an aggregation (blue brackets).&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqfd8qh1y8mb2ebx94tne.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqfd8qh1y8mb2ebx94tne.png" alt="image" width="632" height="350"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The following demonstrates the correct aggregations request structure. Note the "aggs" clause that encloses the "daily_revenue" and "number_of_unique_customers_per_day" sub-aggregations:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_per_day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;calendar_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;daily_revenue&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sum&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;script&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
              &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;doc['UnitPrice'].value * doc['Quantity'].value&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;number_of_unique_customers_per_day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;cardinality&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;CustomerID&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the request into the Kibana console and send it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt; &lt;br&gt;
Elasticsearch returns a 200 (success) HTTP status (red box). &lt;/p&gt;

&lt;p&gt;It groups the dataset into daily buckets. Within each bucket, the number of unique customers per day as well as the daily revenue are calculated.  &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7korj5w0lgbwt4n8m8ap.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7korj5w0lgbwt4n8m8ap.png" alt="image" width="800" height="752"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Congratulations! You have mastered troubleshooting beginner-level Elasticsearch errors! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fms3zps05cp2jtjpde970.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fms3zps05cp2jtjpde970.gif" alt="image" width="478" height="200"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next time you see an error, breathe easy and apply the troubleshooting thought process covered in this blog!&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>elasticsearch</category>
      <category>database</category>
      <category>datascience</category>
    </item>
    <item>
      <title>Understanding mapping with Elasticsearch and Kibana</title>
      <dc:creator>Lisa Jung</dc:creator>
      <pubDate>Fri, 13 Aug 2021 18:20:06 +0000</pubDate>
      <link>https://dev.to/elastic/understanding-mapping-with-elasticsearch-and-kibana-4lda</link>
      <guid>https://dev.to/elastic/understanding-mapping-with-elasticsearch-and-kibana-4lda</guid>
      <description>&lt;p&gt;Have you ever encountered the error “Field type is not supported for [whatever you are trying to do with Elasticsearch]”?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fauiiya7r3nw23soegqn5.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fauiiya7r3nw23soegqn5.gif" alt="image" width="168" height="168"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The most likely culprit of this error is the &lt;code&gt;mapping&lt;/code&gt; of your index!&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Mapping&lt;/code&gt; is the process of defining how a document and its fields are indexed and stored. It defines the type and format of the fields in the documents. As a result, &lt;code&gt;mapping&lt;/code&gt; can significantly affect how Elasticsearch searches and stores data.&lt;/p&gt;

&lt;p&gt;Understanding how &lt;code&gt;mapping&lt;/code&gt; works will help you define &lt;code&gt;mapping&lt;/code&gt; that best serves your use case.&lt;/p&gt;

&lt;p&gt;By the end of this blog, you will be able to define what a &lt;code&gt;mapping&lt;/code&gt; is and customize your own &lt;code&gt;mapping&lt;/code&gt; to make indexing and searching more efficient. &lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisite work
&lt;/h2&gt;

&lt;p&gt;Watch this &lt;a href="https://www.youtube.com/watch?v=CCTgroOcyfM" rel="noopener noreferrer"&gt;video&lt;/a&gt; from timestamp 14:28 to 19:00. It will show you how to complete steps 1 and 2.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Set up Elasticsearch and Kibana* &lt;/li&gt;
&lt;li&gt;Open the Kibana console (AKA Dev Tools)
&lt;/li&gt;
&lt;li&gt;Keep two windows open side by side (this blog and the Kibana console)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We will be sending requests from Kibana to Elasticsearch to learn how &lt;code&gt;mapping&lt;/code&gt; works! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;&lt;br&gt;
If you would rather download Elasticsearch and Kibana on your own machine, follow the steps outlined in &lt;a href="https://dev.to/elastic/downloading-elasticsearch-and-kibana-macos-linux-and-windows-1mmo"&gt;Downloading Elasticsearch and Kibana(macOS/Linux and Windows)&lt;/a&gt;. &lt;/p&gt;
&lt;h2&gt;
  
  
  Additional Resources
&lt;/h2&gt;

&lt;p&gt;Interested in beginner friendly workshops on Elasticsearch and Kibana? Check out my Beginner's Crash Course to Elastic Stack series!&lt;/p&gt;

&lt;p&gt;1) &lt;a href="https://www.youtube.com/watch?v=FQAHDrVwfok&amp;amp;list=PL_mJOmq4zsHZYAyK606y7wjQtC0aoE6Es&amp;amp;index=5&amp;amp;t=598s" rel="noopener noreferrer"&gt;Part 5 Workshop Recording&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This blog is a companion to Part 5 of the Beginner's Crash Course to Elastic Stack. If you prefer learning by watching videos instead, check out the recording!&lt;/p&gt;

&lt;p&gt;2) &lt;a href="https://github.com/LisaHJung/Beginners-Crash-Course-to-the-Elastic-Stack-Series" rel="noopener noreferrer"&gt;Beginner's Crash Course to Elastic Stack Table of Contents&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This table of contents includes repos from all workshops in the series. Each repo includes resources shared during the workshop, including the video recording, presentation slides, related blogs, Elasticsearch requests, and more!&lt;/p&gt;
&lt;h2&gt;
  
  
  What is a Mapping?
&lt;/h2&gt;

&lt;p&gt;Imagine you are building an app that requires you to store and search data. Naturally, you will want to store data using the smallest disk space while maximizing your search performance. &lt;/p&gt;

&lt;p&gt;This is where &lt;code&gt;mapping&lt;/code&gt; comes into play! &lt;/p&gt;

&lt;p&gt;&lt;code&gt;Mapping&lt;/code&gt; defines how a document and its fields are indexed and stored. &lt;/p&gt;

&lt;p&gt;It does that by defining field types. Depending on their type, fields are stored and indexed accordingly. &lt;/p&gt;

&lt;p&gt;Learning how to define your own mapping will help you:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;optimize the performance of Elasticsearch&lt;/li&gt;
&lt;li&gt;save disk space&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Before we delve into mapping, let's review a few concepts!&lt;/p&gt;
&lt;h2&gt;
  
  
  Review from previous blogs
&lt;/h2&gt;

&lt;p&gt;In the previous blogs, we learned how to &lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-running-queries-with-elasticsearch-and-kibana-4kn9"&gt;query&lt;/a&gt; and &lt;a href="https://dev.to/lisahjung/beginner-s-guide-running-aggregations-with-elasticsearch-and-kibana-16bn"&gt;aggregate&lt;/a&gt; data to gain insights.   &lt;/p&gt;

&lt;p&gt;But before we could run queries or aggregations, we had to first add data to Elasticsearch. &lt;/p&gt;

&lt;p&gt;Let’s say we are creating an app for a produce warehouse. We want to store data about produce in Elasticsearch so we can search for it.&lt;/p&gt;

&lt;p&gt;In Elasticsearch, data is stored as documents. A document is a JSON object that contains whatever data you want to store in Elasticsearch. &lt;/p&gt;

&lt;p&gt;Take a look at the following document about a produce item. It is a JSON object that contains a list of fields such as "name", "botanical_name", and "produce_type". &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo68y3y2wau8boh2qngbg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo68y3y2wau8boh2qngbg.png" alt="image" width="800" height="436"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When you take a closer look at the fields, you will see that each field is of a different JSON data type. &lt;/p&gt;

&lt;p&gt;For example, for the field "name", the JSON data type is a string. For the field "quantity", it is an integer. For the field "preferred_vendor", it is a boolean. &lt;/p&gt;
&lt;h3&gt;
  
  
  Indexing a Document
&lt;/h3&gt;

&lt;p&gt;Let’s say we want to index the document above. We would send the following request.  &lt;/p&gt;

&lt;p&gt;For each request we go over, the syntax is included so you can customize it for your own use case. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;Enter&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="k"&gt;of&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;the&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;value&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;But for our tutorial, we will use the example below instead. &lt;/p&gt;

&lt;p&gt;The following example asks Elasticsearch to create a new index called &lt;code&gt;temp_index&lt;/code&gt;, then index the following document into it. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;temp_index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Pineapple&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;botanical_name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Ananas comosus&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;produce_type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Fruit&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;country_of_origin&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_purchased&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2020-06-02T12:15:35&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;quantity&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;3.11&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;a large juicy tropical fruit consisting of aromatic edible yellow flesh surrounded by a tough segmented skin and topped with a tuft of stiff leaves. These pineapples are sourced from New Zealand.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_details&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Tropical Fruit Growers of New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;main_contact&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Hugh Rose&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_location&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Whangarei, New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;preferred_vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the example into the console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch will confirm that this document has been successfully indexed into the &lt;code&gt;temp_index&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg6slzgwtwhjm25zd8c2w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg6slzgwtwhjm25zd8c2w.png" alt="image" width="800" height="695"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;What we have covered so far is a review from previous blogs. What we have not gone over is what actually goes on behind the scenes when you index a document. &lt;/p&gt;

&lt;p&gt;This is where &lt;code&gt;mapping&lt;/code&gt; comes into play!&lt;/p&gt;
&lt;h2&gt;
  
  
  Mapping Explained
&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;Mapping&lt;/code&gt; determines how a document and its fields are indexed and stored by defining the type of each field.  &lt;/p&gt;

&lt;p&gt;The &lt;code&gt;mapping&lt;/code&gt; of an index looks something like the following:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2cpvd2n3wflb4ynswx89.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2cpvd2n3wflb4ynswx89.png" alt="image" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It contains a list of the names and types of fields in an index. Depending on its type, each field is indexed and stored differently in Elasticsearch.  &lt;/p&gt;

&lt;p&gt;Therefore, &lt;code&gt;mapping&lt;/code&gt; plays an important role in how Elasticsearch stores and searches for data. &lt;/p&gt;
&lt;h3&gt;
  
  
  Dynamic Mapping
&lt;/h3&gt;

&lt;p&gt;In the previous request we sent, we created a new index called &lt;code&gt;temp_index&lt;/code&gt;, then indexed a document into it. &lt;/p&gt;

&lt;p&gt;&lt;code&gt;Mapping&lt;/code&gt; determines how a document and its fields should be indexed and stored. But we did not define the &lt;code&gt;mapping&lt;/code&gt; ahead of time. &lt;/p&gt;

&lt;p&gt;So how did this document get indexed?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F25kksi0rwy2hc6xmhtu1.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F25kksi0rwy2hc6xmhtu1.gif" alt="image" width="200" height="183"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When a user does not define the &lt;code&gt;mapping&lt;/code&gt; in advance, Elasticsearch creates or updates the &lt;code&gt;mapping&lt;/code&gt; as needed by default. This is known as &lt;code&gt;dynamic mapping&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;Take a look at the following diagram. It illustrates what happens when a user asks Elasticsearch to create a new index without defining the &lt;code&gt;mapping&lt;/code&gt; ahead of time.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9gi1o02c60dcz321ftu5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9gi1o02c60dcz321ftu5.png" alt="image" width="800" height="438"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With &lt;code&gt;dynamic mapping&lt;/code&gt;, Elasticsearch looks at each field and tries to infer the data type from the field content. Then, it assigns a type to each field and creates a list of field names and types known as &lt;code&gt;mapping&lt;/code&gt;.  &lt;/p&gt;

&lt;p&gt;Depending on the assigned field type, each field is indexed and primed for different types of requests (full-text search, aggregations, sorting). This is why &lt;code&gt;mapping&lt;/code&gt; plays an important role in how Elasticsearch stores and searches for data. &lt;/p&gt;
&lt;h3&gt;
  
  
  View the Mapping
&lt;/h3&gt;

&lt;p&gt;Now that we have indexed a document without defining the mapping in advance, let's take a look at the &lt;code&gt;dynamic mapping&lt;/code&gt; that Elasticsearch has created for us. &lt;/p&gt;

&lt;p&gt;To view the &lt;code&gt;mapping&lt;/code&gt; of an index, you send the following request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example asks Elasticsearch to retrieve the &lt;code&gt;mapping&lt;/code&gt; of the &lt;code&gt;temp_index&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;temp_index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the example into the Kibana console and send the request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns the &lt;code&gt;mapping&lt;/code&gt; of the &lt;code&gt;temp_index&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkhyl9xcuosag9pzzedlz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkhyl9xcuosag9pzzedlz.png" alt="image" width="800" height="1092"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs23ss8an8h4oweppic7i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs23ss8an8h4oweppic7i.png" alt="image" width="800" height="1082"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuip0msikk3xvol508md9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuip0msikk3xvol508md9.png" alt="image" width="800" height="987"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It lists all the fields of the document in alphabetical order, along with the type of each field (text, keyword, long, float, date, boolean, and so on). These are a few of the many field types recognized by Elasticsearch. For the full list of field types, click &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-types.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;!&lt;/p&gt;

&lt;p&gt;At first glance, this &lt;code&gt;mapping&lt;/code&gt; may look complicated. Rest assured, we are going to break it down into bite-sized pieces! &lt;/p&gt;

&lt;p&gt;Depending on what your use case is, the &lt;code&gt;mapping&lt;/code&gt; can be customized to make storage and indexing more efficient.&lt;/p&gt;

&lt;p&gt;The rest of the blog will cover what type of &lt;code&gt;mapping&lt;/code&gt; is best for different types of requests. Then, we will learn how to define our own &lt;code&gt;mapping&lt;/code&gt;!&lt;/p&gt;
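
&lt;p&gt;As a quick preview, you define your own &lt;code&gt;mapping&lt;/code&gt; at index creation time. The following is a minimal sketch; the index name &lt;code&gt;produce_index&lt;/code&gt; and the fields shown are just placeholders for your own: &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PUT produce_index
{
  "mappings": {
    "properties": {
      "name": { "type": "text" },
      "quantity": { "type": "integer" }
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
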
&lt;h3&gt;
  
  
  Indexing Strings
&lt;/h3&gt;

&lt;p&gt;Let's take a look at our document again.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgus3ibdip5v8ekwneovi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgus3ibdip5v8ekwneovi.png" alt="image" width="800" height="468"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Many of these fields, such as "botanical_name", "produce_type", and "country_of_origin" (teal box), contain strings. &lt;/p&gt;

&lt;p&gt;Let's examine the mapping of the string fields "botanical_name", "country_of_origin", and "description" below. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu8nglbt98ys7bpr1c89w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu8nglbt98ys7bpr1c89w.png" alt="image" width="800" height="1528"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You will see that these string fields (orange lines) have been mapped as both &lt;code&gt;text&lt;/code&gt; and &lt;code&gt;keyword&lt;/code&gt; (green lines) by default. &lt;/p&gt;

&lt;p&gt;There are two kinds of string data types:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Text&lt;/li&gt;
&lt;li&gt;Keyword&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;By default, every string field gets mapped twice: as a &lt;code&gt;text&lt;/code&gt; field and as a &lt;code&gt;keyword&lt;/code&gt; multi-field. Each field type is primed for different types of requests. &lt;/p&gt;
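
&lt;p&gt;For example, in the dynamic &lt;code&gt;mapping&lt;/code&gt; we retrieved earlier, a string field such as "description" looks roughly like the following snippet. The &lt;code&gt;ignore_above&lt;/code&gt; setting is a default that Elasticsearch adds to the keyword multi-field: &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"description": {
  "type": "text",
  "fields": {
    "keyword": {
      "type": "keyword",
      "ignore_above": 256
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
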

&lt;p&gt;The &lt;code&gt;text&lt;/code&gt; field type is designed for full-text search. One of its benefits is that you can search for individual terms in a case-insensitive manner. &lt;/p&gt;

&lt;p&gt;The &lt;code&gt;keyword&lt;/code&gt; field type is designed for exact matches, aggregations, and sorting. This field type comes in handy when you are searching for the original, unanalyzed string. &lt;/p&gt;
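
&lt;p&gt;For example, a terms aggregation runs against the &lt;code&gt;keyword&lt;/code&gt; version of a string field. This is a minimal sketch, assuming the &lt;code&gt;temp_index&lt;/code&gt; we indexed earlier: &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;GET temp_index/_search
{
  "size": 0,
  "aggs": {
    "produce_types": {
      "terms": {
        "field": "produce_type.keyword"
      }
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
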

&lt;p&gt;Depending on what type of request you want to run on each field, you can assign the field type as &lt;code&gt;text&lt;/code&gt;, &lt;code&gt;keyword&lt;/code&gt;, or both! &lt;/p&gt;

&lt;p&gt;Let's delve into &lt;code&gt;text&lt;/code&gt; field type first.&lt;/p&gt;
&lt;h4&gt;
  
  
  Text Field Type
&lt;/h4&gt;
&lt;h5&gt;
  
  
  Text Analysis
&lt;/h5&gt;

&lt;p&gt;Ever notice that searching in Elasticsearch is not case sensitive, and that punctuation does not seem to matter? This is because &lt;code&gt;text analysis&lt;/code&gt; occurs when your fields are indexed. &lt;/p&gt;

&lt;p&gt;The diagram below illustrates how the string "These pineapples are sourced from New Zealand." would be analyzed in Elasticsearch. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq5jy7jnwarw3ukpq9b7k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq5jy7jnwarw3ukpq9b7k.png" alt="image" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By default, strings are analyzed when they are indexed. During analysis, the string is broken up into individual words, also known as tokens. The analyzer then lowercases each token and removes punctuation. &lt;/p&gt;
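
&lt;p&gt;You can see this analysis for yourself with the &lt;code&gt;_analyze&lt;/code&gt; API. Sending the request below to the Kibana console returns the lowercase tokens "these", "pineapples", "are", "sourced", "from", "new", and "zealand": &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;GET _analyze
{
  "analyzer": "standard",
  "text": "These pineapples are sourced from New Zealand."
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
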

&lt;p&gt;&lt;strong&gt;Inverted Index&lt;/strong&gt;&lt;br&gt;
Take a look at the following request shown in the diagram. It asks Elasticsearch to index a document with the field "description" and assign the document an id of 1. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsqy8nt79vs5p5oj6wnln.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsqy8nt79vs5p5oj6wnln.png" alt="image" width="800" height="369"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When this request is sent, Elasticsearch will look at the field "description" and see that this field contains a string. Therefore, it will map the field "description" as both &lt;code&gt;text&lt;/code&gt; and &lt;code&gt;keyword&lt;/code&gt; by default. &lt;/p&gt;

&lt;p&gt;When a field is mapped as type &lt;code&gt;text&lt;/code&gt;, the content of the field passes through an analyzer.&lt;/p&gt;

&lt;p&gt;Once the string is analyzed, the individual tokens are stored in a sorted list known as the &lt;code&gt;inverted index&lt;/code&gt; (see table above). Each unique token is stored in the &lt;code&gt;inverted index&lt;/code&gt; with its associated document ID (red numbers in the table above). &lt;/p&gt;

&lt;p&gt;The same process occurs every time you index a new document. &lt;/p&gt;

&lt;p&gt;Let's say we indexed a new document with content identical to document 1, but gave this document an id of 2 (pink box).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhnl65vztjq6k593iore1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhnl65vztjq6k593iore1.png" alt="image" width="800" height="446"&gt;&lt;/a&gt;&lt;br&gt;
The content of the field "description" will go through the same process, where it is split into individual tokens. However, if these tokens already exist in the &lt;code&gt;inverted index&lt;/code&gt;, only the document IDs are updated (red numbers in the table above).&lt;/p&gt;

&lt;p&gt;Look at the diagram below. If we add another document with an id of 3 (pink box) containing one new token (red box), then the new token ("caledonia") is added to the &lt;code&gt;inverted index&lt;/code&gt;. Also, the document IDs are updated for the existing tokens (red numbers in the table below).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5diwk7iguh18j09xvf77.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5diwk7iguh18j09xvf77.png" alt="image" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is what happens in the background when you index fields that are typed as &lt;code&gt;text&lt;/code&gt;.  &lt;/p&gt;

&lt;p&gt;Fields that are assigned the type &lt;code&gt;text&lt;/code&gt; are optimal for full text search.&lt;/p&gt;

&lt;p&gt;Take a look at this diagram below. The client is asking to fetch all documents containing the terms "new" or "zealand". &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu11winyezhirxqzgxq91.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu11winyezhirxqzgxq91.png" alt="image" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When this request is sent, Elasticsearch does not read through every document for the search terms. It goes straight to the &lt;code&gt;inverted index&lt;/code&gt;, looks up the search terms, finds the matching document IDs, and sends the IDs back to the user. &lt;/p&gt;

&lt;p&gt;The &lt;code&gt;inverted index&lt;/code&gt; is the reason Elasticsearch is able to search with speed.&lt;/p&gt;

&lt;p&gt;Moreover, since all the terms in the &lt;code&gt;inverted index&lt;/code&gt; are lowercased and your search queries are also analyzed, you can search in a case-insensitive manner and still get the same results.&lt;/p&gt;
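&lt;p&gt;As a sketch, a full text search against a &lt;code&gt;text&lt;/code&gt; field uses a match query like the one below (the index name &lt;code&gt;produce_index&lt;/code&gt; is assumed for illustration). Because the query string is analyzed the same way as the indexed text, "New Zealand", "new zealand", and "NEW ZEALAND" all find the same documents:&lt;/p&gt;

```
GET produce_index/_search
{
  "query": {
    "match": {
      "description": "New Zealand"
    }
  }
}
```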

&lt;p&gt;Let's do another review. &lt;/p&gt;

&lt;p&gt;Take a look at the diagram below. We send three requests to index three documents. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff60matt9iap215hprefu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff60matt9iap215hprefu.png" alt="image" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Each document has a field called "country" with a string value. By default, the field "country" is mapped twice: as text (teal arrow) and keyword (yellow arrow). &lt;/p&gt;

&lt;p&gt;&lt;code&gt;Text&lt;/code&gt; fields (teal arrow) are analyzed, and the tokens are stored in an &lt;code&gt;inverted index&lt;/code&gt; (teal table). This is great for full text search, but it is not optimized for performing aggregations, sorting, or exact searches. &lt;/p&gt;

&lt;p&gt;For the types of actions mentioned above, Elasticsearch depends on the &lt;code&gt;keyword&lt;/code&gt; field(yellow arrow). &lt;/p&gt;

&lt;p&gt;When a &lt;code&gt;keyword&lt;/code&gt; field (yellow arrow) is created, the content of the field is not analyzed and is not stored in an inverted index. &lt;/p&gt;

&lt;p&gt;Instead, the &lt;code&gt;keyword&lt;/code&gt; field uses a data structure called &lt;code&gt;doc values&lt;/code&gt; (yellow table) to store data.  &lt;/p&gt;
&lt;h4&gt;
  
  
  Keyword Field Type
&lt;/h4&gt;

&lt;p&gt;The &lt;code&gt;keyword&lt;/code&gt; field type is used for aggregations, sorting, and exact searches. These actions look up a document ID to find the values it has in its fields. &lt;/p&gt;

&lt;p&gt;The &lt;code&gt;keyword&lt;/code&gt; field is suited to these actions because it uses a data structure called &lt;code&gt;doc values&lt;/code&gt; to store data. &lt;/p&gt;

&lt;p&gt;Take a look at the diagram below that illustrates &lt;code&gt;doc values&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsaurxgl7tog4npw2mixw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsaurxgl7tog4npw2mixw.png" alt="image" width="800" height="447"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For each document, the document ID along with the field value (the original string) is added to a table. This data structure (&lt;code&gt;doc values&lt;/code&gt;) is designed for actions that require looking up a document ID to find the values it has in its fields. Therefore, fields typed as &lt;code&gt;keyword&lt;/code&gt; are best suited for actions such as aggregations, sorting, and exact searches.  &lt;/p&gt;
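&lt;p&gt;As a sketch of an exact search against a &lt;code&gt;keyword&lt;/code&gt; field (the index and field names are assumed for illustration), a term query matches the stored string exactly, with no analysis:&lt;/p&gt;

```
GET produce_index/_search
{
  "query": {
    "term": {
      "country.keyword": "New Zealand"
    }
  }
}
```

&lt;p&gt;Unlike a match query on a &lt;code&gt;text&lt;/code&gt; field, searching for "new zealand" here would return nothing, because the &lt;code&gt;keyword&lt;/code&gt; value was never lowercased or tokenized.&lt;/p&gt;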

&lt;p&gt;When Elasticsearch dynamically creates a &lt;code&gt;mapping&lt;/code&gt; for you, it does not know what you want to use a string for. As a result, Elasticsearch maps all strings to both field types (&lt;code&gt;text&lt;/code&gt; and &lt;code&gt;keyword&lt;/code&gt;). &lt;/p&gt;

&lt;p&gt;In cases where you do not need both field types, the default setting is wasteful. The &lt;code&gt;text&lt;/code&gt; type requires an &lt;code&gt;inverted index&lt;/code&gt; and the &lt;code&gt;keyword&lt;/code&gt; type requires &lt;code&gt;doc values&lt;/code&gt;, so creating both field types for fields that do not need them will slow down indexing and take up more disk space.  &lt;/p&gt;

&lt;p&gt;This is why we define our own &lt;code&gt;mapping&lt;/code&gt;: it helps us store and search data more efficiently. &lt;/p&gt;

&lt;p&gt;Doing so takes more planning because we need to decide what types of requests we want to run on these fields. The decisions we make will allow us to designate which string fields will:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;only be full text searchable or&lt;/li&gt;
&lt;li&gt;only be used in aggregation or &lt;/li&gt;
&lt;li&gt;be able to support both options &lt;/li&gt;
&lt;/ul&gt;
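&lt;p&gt;As a sketch (the index and field names are illustrative), the three options translate into mapping declarations like these:&lt;/p&gt;

```
PUT example_index
{
  "mappings": {
    "properties": {
      "full_text_only":    { "type": "text" },
      "aggregations_only": { "type": "keyword" },
      "both": {
        "type": "text",
        "fields": {
          "keyword": { "type": "keyword" }
        }
      }
    }
  }
}
```

&lt;p&gt;The third option uses a multi-field: the field is indexed as &lt;code&gt;text&lt;/code&gt;, and a &lt;code&gt;keyword&lt;/code&gt; version is available under the sub-field name &lt;code&gt;both.keyword&lt;/code&gt;.&lt;/p&gt;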
&lt;h3&gt;
  
  
  Mapping Exercise
&lt;/h3&gt;

&lt;p&gt;Now that we understand the field types &lt;code&gt;text&lt;/code&gt; and &lt;code&gt;keyword&lt;/code&gt;, let’s go over how we can optimize our &lt;code&gt;mapping&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;To do so, we are going to do an exercise! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project&lt;/strong&gt;: Build an app for a client who manages a produce warehouse &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;This app must enable users to:&lt;/strong&gt; &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;search for produce name, country of origin and description&lt;/li&gt;
&lt;li&gt;identify top countries of origin with the most frequent purchase history&lt;/li&gt;
&lt;li&gt;sort produce by produce type (fruit or vegetable)&lt;/li&gt;
&lt;li&gt;get a summary of monthly expenses&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Goals&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Figure out the optimal &lt;code&gt;mapping&lt;/code&gt; for desired features&lt;/li&gt;
&lt;li&gt;Create an index with the optimal &lt;code&gt;mapping&lt;/code&gt; and index documents into it&lt;/li&gt;
&lt;li&gt;Learn how you should approach a situation that requires changing the &lt;code&gt;mapping&lt;/code&gt; of an existing field&lt;/li&gt;
&lt;li&gt;Learn how to use the &lt;code&gt;runtime field&lt;/code&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The following is a sample document we will be working with for this exercise. Pay attention to the field names as these field names will come up frequently during this exercise! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Sample data&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Pineapple&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;botanical_name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Ananas comosus&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;produce_type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Fruit&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;country_of_origin&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_purchased&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2020-06-02T12:15:35&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;quantity&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;3.11&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;a large juicy tropical fruit consisting of aromatic edible yellow flesh surrounded by a tough segmented skin and topped with a tuft of stiff leaves.These pineapples are sourced from New Zealand.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_details&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Tropical Fruit Growers of New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;main_contact&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Hugh Rose&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_location&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Whangarei, New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;preferred_vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Plan of Action&lt;/strong&gt;&lt;br&gt;
The figures below outline the plan of action for each feature requested by the client. Explanations for each bullet point are provided below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3lqf13q0nln1w42x01z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3lqf13q0nln1w42x01z.png" alt="image" width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The first feature allows the user to search for produce name, country of origin and description. &lt;/p&gt;

&lt;p&gt;Let's take a look at the sample document shared with you earlier.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Sample data&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Pineapple&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;botanical_name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Ananas comosus&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;produce_type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Fruit&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;country_of_origin&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_purchased&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2020-06-02T12:15:35&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;quantity&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;3.11&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;a large juicy tropical fruit consisting of aromatic edible yellow flesh surrounded by a tough segmented skin and topped with a tuft of stiff leaves.These pineapples are sourced from New Zealand.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_details&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Tropical Fruit Growers of New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;main_contact&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Hugh Rose&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_location&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Whangarei, New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;preferred_vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The first feature involves working with the fields "name", "country_of_origin", and "description" (first bullet point in the figure below). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3lqf13q0nln1w42x01z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3lqf13q0nln1w42x01z.png" alt="image" width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;All of these fields contain strings. By default, these fields will get mapped twice as &lt;code&gt;text&lt;/code&gt; and &lt;code&gt;keyword&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;Let’s see if we need both field types. &lt;/p&gt;

&lt;p&gt;Our client wants to search on these fields. But it is unlikely that users will send search terms written exactly as they appear in our documents. &lt;/p&gt;

&lt;p&gt;So the user should be able to run a search for individual terms in a case-insensitive manner. Therefore, the fields "name", "country_of_origin", and "description" should be full text searchable (second bullet point in the figure below). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3lqf13q0nln1w42x01z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3lqf13q0nln1w42x01z.png" alt="image" width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let’s say our client does not want to run aggregations, sorting, or exact searches on the fields "name" and "description" (third bullet point in the figure above). &lt;/p&gt;

&lt;p&gt;For these fields, we do not need the &lt;code&gt;keyword&lt;/code&gt; type. To avoid mapping these fields twice, we will specify that we only want the &lt;code&gt;text&lt;/code&gt; type for them (blue bullet point in the figure above). &lt;/p&gt;

&lt;p&gt;At this point, we know that the field "country_of_origin" should be full text searchable (&lt;code&gt;text&lt;/code&gt;). But it is unclear whether we will need aggregations, sorting, or exact searches performed on this field (fourth bullet point in the figure above). &lt;/p&gt;

&lt;p&gt;Let’s leave that for now. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1j1kpfy59vvzdgu13fxz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1j1kpfy59vvzdgu13fxz.png" alt="image" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you look at the second feature, the user should be able to identify top countries of origin with the most frequent purchase history.&lt;/p&gt;

&lt;p&gt;This feature involves the field "country_of_origin"(first bullet point in the figure above). &lt;/p&gt;

&lt;p&gt;For this type of request, we need to run a terms aggregation on this field (second bullet point in the figure above), which means that we need a &lt;code&gt;keyword&lt;/code&gt; field.&lt;/p&gt;

&lt;p&gt;Since we need to perform full text search (first feature) and aggregations (second feature) on this field, we will map it twice, as &lt;code&gt;text&lt;/code&gt; and &lt;code&gt;keyword&lt;/code&gt; (third bullet point in the figure above). &lt;/p&gt;
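&lt;p&gt;With a text-plus-keyword multi-field in place, the second feature could be sketched as a terms aggregation over the &lt;code&gt;keyword&lt;/code&gt; sub-field (the index name and the &lt;code&gt;.keyword&lt;/code&gt; sub-field name are assumptions based on the default multi-field convention):&lt;/p&gt;

```
GET produce_index/_search
{
  "size": 0,
  "aggs": {
    "top_countries_of_origin": {
      "terms": {
        "field": "country_of_origin.keyword"
      }
    }
  }
}
```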

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fljxwtjri8ypmidbp6e8v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fljxwtjri8ypmidbp6e8v.png" alt="image" width="800" height="435"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The third feature allows the user to sort by produce type. This feature involves the field "produce_type", which contains a string (first bullet point in the figure above).  &lt;/p&gt;

&lt;p&gt;Since this feature requires sorting (second bullet point in the figure above), we need the &lt;code&gt;keyword&lt;/code&gt; type for this field.  &lt;/p&gt;

&lt;p&gt;Let’s say the client does not want to run a full text search for this field. &lt;/p&gt;

&lt;p&gt;As a result, we will map this field as &lt;code&gt;keyword&lt;/code&gt; type only (third bullet point in the figure above). &lt;/p&gt;
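&lt;p&gt;A sort request for the third feature might then look like this (index name illustrative); since "produce_type" is mapped as &lt;code&gt;keyword&lt;/code&gt; only, we can sort on the field name directly:&lt;/p&gt;

```
GET produce_index/_search
{
  "sort": [
    { "produce_type": { "order": "asc" } }
  ]
}
```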

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkteq05hpg9dwtjkvz4d4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkteq05hpg9dwtjkvz4d4.png" alt="image" width="800" height="430"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The fourth feature allows the user to get a summary of monthly expenses.&lt;/p&gt;

&lt;p&gt;This feature involves the fields "date_purchased", "quantity", and "unit_price" (first bullet point in the figure above).&lt;/p&gt;

&lt;p&gt;So let’s break this down.&lt;/p&gt;

&lt;p&gt;This feature requires splitting the data into monthly buckets. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbeyvn7vdm1xg8rbqoek4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbeyvn7vdm1xg8rbqoek4.png" alt="image" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then, we calculate the monthly expense for each bucket by adding up the total of each document in the bucket. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rqm7d7nunpkfd9vw0u4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rqm7d7nunpkfd9vw0u4.png" alt="image" width="800" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The next diagram shows you the April bucket. The documents of all produce purchased during April are included in this bucket. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr51rowafams5j819ml2w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr51rowafams5j819ml2w.png" alt="image" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's zoom in on one of the documents to look at its fields (JSON object highlighted with pink lines). &lt;/p&gt;

&lt;p&gt;You will notice that our documents do not have a field called "total".&lt;/p&gt;

&lt;p&gt;However, we do have fields called "quantity" and "unit_price" (pink box). &lt;/p&gt;

&lt;p&gt;In order to get the total for each document, we need to multiply the value of the field "quantity" by the value of "unit_price". Then, we add up the totals of all documents in each bucket. This yields the monthly expense.&lt;/p&gt;
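&lt;p&gt;One way to sketch this is a date histogram with a scripted sum (index name illustrative; the exercise's &lt;code&gt;runtime field&lt;/code&gt; goal covers a cleaner alternative to the inline script):&lt;/p&gt;

```
GET produce_index/_search
{
  "size": 0,
  "aggs": {
    "monthly_expense": {
      "date_histogram": {
        "field": "date_purchased",
        "calendar_interval": "month"
      },
      "aggs": {
        "total": {
          "sum": {
            "script": {
              "source": "doc['quantity'].value * doc['unit_price'].value"
            }
          }
        }
      }
    }
  }
}
```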

&lt;p&gt;The following figure shows a summary of the optimal &lt;code&gt;mapping&lt;/code&gt; for all the string fields we have just discussed. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flezzge1fzl617986qvik.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flezzge1fzl617986qvik.png" alt="image" width="800" height="440"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Take a look at the first bullet point in the figure above. It states that for the fourth feature, we need to calculate the total for each produce item so it can be used to calculate the monthly expense.&lt;/p&gt;

&lt;p&gt;The second bullet point in the figure outlines the additional step we are going to take to make our &lt;code&gt;mapping&lt;/code&gt; more efficient. &lt;/p&gt;

&lt;p&gt;The following is the reason why we are disabling the field "botanical_name" and the object "vendor_details" (second bullet point).&lt;/p&gt;

&lt;p&gt;Our document contains multiple fields. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Sample data&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Pineapple&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;botanical_name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Ananas comosus&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;produce_type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Fruit&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;country_of_origin&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_purchased&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2020-06-02T12:15:35&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;quantity&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;3.11&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;a large juicy tropical fruit consisting of aromatic edible yellow flesh surrounded by a tough segmented skin and topped with a tuft of stiff leaves.These pineapples are sourced from New Zealand.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_details&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Tropical Fruit Growers of New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;main_contact&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Hugh Rose&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_location&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Whangarei, New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;preferred_vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After going through each feature, we know that the string field "botanical_name" and the object "vendor_details" will not be used.&lt;/p&gt;

&lt;p&gt;Since we do not need the &lt;code&gt;inverted index&lt;/code&gt; or the &lt;code&gt;doc values&lt;/code&gt; of these fields, we are going to disable these fields. This in turn prevents the &lt;code&gt;inverted index&lt;/code&gt; and the &lt;code&gt;doc values&lt;/code&gt; of these fields from being created.&lt;/p&gt;

&lt;p&gt;Disabling these fields will help us save disk space and minimize the number of fields in the &lt;code&gt;mapping&lt;/code&gt;. &lt;/p&gt;
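&lt;p&gt;A sketch of how those two fields can be disabled in the &lt;code&gt;mapping&lt;/code&gt; (other fields omitted for brevity); with &lt;code&gt;enabled&lt;/code&gt; set to &lt;code&gt;false&lt;/code&gt;, the values are still kept in &lt;code&gt;_source&lt;/code&gt; but are never parsed or indexed:&lt;/p&gt;

```
PUT produce_index
{
  "mappings": {
    "properties": {
      "botanical_name": { "enabled": false },
      "vendor_details": { "enabled": false }
    }
  }
}
```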

&lt;h3&gt;
  
  
  Defining your own mapping
&lt;/h3&gt;

&lt;p&gt;Now that we have thought through the optimal &lt;code&gt;mapping&lt;/code&gt;, let’s define our own! &lt;/p&gt;

&lt;p&gt;Before we get to that, there are a few rules about &lt;code&gt;mapping&lt;/code&gt; you need to know. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Rules&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;If you do not define a &lt;code&gt;mapping&lt;/code&gt; ahead of time, Elasticsearch dynamically creates a &lt;code&gt;mapping&lt;/code&gt; for you.&lt;/li&gt;
&lt;li&gt;If you do decide to define your own &lt;code&gt;mapping&lt;/code&gt;, you can do so at index creation.&lt;/li&gt;
&lt;li&gt;ONE &lt;code&gt;mapping&lt;/code&gt; is defined per index. Once the index has been created, we can only add &lt;em&gt;new&lt;/em&gt; fields to a &lt;code&gt;mapping&lt;/code&gt;. We CANNOT change the &lt;code&gt;mapping&lt;/code&gt; of an &lt;em&gt;existing&lt;/em&gt; field. &lt;/li&gt;
&lt;li&gt;If you must change the type of an existing field, you must create a new index with the desired &lt;code&gt;mapping&lt;/code&gt;, then reindex all documents into the new index. &lt;/li&gt;
&lt;/ol&gt;
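&lt;p&gt;Rule 4 relies on the reindex API. A minimal sketch (the index names here are placeholders) of copying every document from an old index into a new one looks like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;POST _reindex
{
  "source": {
    "index": "old_index"
  },
  "dest": {
    "index": "new_index"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Be sure to create the new index with the desired &lt;code&gt;mapping&lt;/code&gt; before reindexing; otherwise, Elasticsearch will dynamically generate a &lt;code&gt;mapping&lt;/code&gt; for the reindexed documents.&lt;/p&gt;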

&lt;p&gt;We will go over how these rules work as we go over the steps of defining our own &lt;code&gt;mapping&lt;/code&gt;! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Index a sample document into a test index.&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;Name&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="k"&gt;of&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;test&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;value&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The first thing we are going to do is create a new index called &lt;code&gt;test_index&lt;/code&gt; by indexing a sample document into it. &lt;/p&gt;

&lt;p&gt;The document in the following example is the same sample document we have seen earlier. The sample document must contain the fields that you want to define. These fields must also contain values that map closely to the field types you want. &lt;/p&gt;

&lt;p&gt;Copy and paste the following example into the Kibana console and send the request.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;test_index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Pineapple&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;botanical_name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Ananas comosus&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;produce_type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Fruit&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;country_of_origin&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_purchased&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2020-06-02T12:15:35&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;quantity&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;3.11&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;a large juicy tropical fruit consisting of aromatic edible yellow flesh surrounded by a tough segmented skin and topped with a tuft of stiff leaves.These pineapples are sourced from New Zealand.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_details&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Tropical Fruit Growers of New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;main_contact&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Hugh Rose&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_location&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Whangarei, New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;preferred_vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
The &lt;code&gt;test_index&lt;/code&gt; is successfully created. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fir9relcwajlfvsprl1xo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fir9relcwajlfvsprl1xo.png" alt="image" width="415" height="440"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By default, Elasticsearch will create a &lt;code&gt;dynamic mapping&lt;/code&gt; based on the sample document. &lt;/p&gt;

&lt;p&gt;What is the point of step 1?&lt;/p&gt;

&lt;p&gt;Earlier in the blog, we indexed the sample document and viewed the &lt;code&gt;dynamic mapping&lt;/code&gt;. If you remember, the &lt;code&gt;mapping&lt;/code&gt; for our sample document was pretty lengthy. &lt;/p&gt;

&lt;p&gt;Since we do not want to write our optimized &lt;code&gt;mapping&lt;/code&gt; from scratch, we are indexing a sample document so that Elasticsearch will create the &lt;code&gt;dynamic mapping&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;We will use the &lt;code&gt;dynamic mapping&lt;/code&gt; as a template and make changes to it to avoid writing out the whole &lt;code&gt;mapping&lt;/code&gt; for our index!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: View the dynamic mapping&lt;/strong&gt; &lt;br&gt;
&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Name&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;the&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;whose&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;mapping&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;you&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;want&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;to&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;view&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's view the mapping of the &lt;code&gt;test_index&lt;/code&gt; by sending the following request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;test_index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch will display the &lt;code&gt;dynamic mapping&lt;/code&gt; it has created, listing the fields in alphabetical order. The sample document is identical to the one we previously indexed into the &lt;code&gt;temp_index&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F218zpdhuuzwd65q7xafm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F218zpdhuuzwd65q7xafm.png" alt="image" width="800" height="1195"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4xtorujpgspurl82n8ut.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4xtorujpgspurl82n8ut.png" alt="image" width="800" height="1169"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frnucdz4c0w53gimjjvph.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frnucdz4c0w53gimjjvph.png" alt="image" width="800" height="999"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3: Edit the mapping&lt;/strong&gt;&lt;br&gt;
Copy and paste the entire &lt;code&gt;mapping&lt;/code&gt; from step 2 into the Kibana console. Then, make the changes specified below. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8fhj4b12mtwwrrinwvjh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8fhj4b12mtwwrrinwvjh.png" alt="image" width="800" height="715"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From the pasted results, remove the "test_index" along with its opening and closing brackets. Then, edit the &lt;code&gt;mapping&lt;/code&gt; to satisfy the requirements outlined in the diagram below.   &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F71txq0tmi1kxbt0ja6mx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F71txq0tmi1kxbt0ja6mx.png" alt="image" width="800" height="440"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Your optimized &lt;code&gt;mapping&lt;/code&gt; should look like the following: &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8fjn8ljtostx5mk301p1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8fjn8ljtostx5mk301p1.png" alt="image" width="668" height="1728"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You will see that the field "country_of_origin"(2) has been typed as both &lt;code&gt;text&lt;/code&gt; and &lt;code&gt;keyword&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;The fields "description"(3) and "name"(4) have been typed as &lt;code&gt;text&lt;/code&gt; only. &lt;/p&gt;

&lt;p&gt;The field "produce_type"(5) has been typed as &lt;code&gt;keyword&lt;/code&gt; only. &lt;/p&gt;

&lt;p&gt;In the field "botanical_name"(1), a parameter called "enabled" was added and was set to "false". This prevents the &lt;code&gt;inverted index&lt;/code&gt; and &lt;code&gt;doc values&lt;/code&gt; from being created for this field. &lt;/p&gt;

&lt;p&gt;The same has been done for the object "vendor_details"(6). &lt;/p&gt;

&lt;p&gt;The &lt;code&gt;mapping&lt;/code&gt; above defines the optimal &lt;code&gt;mapping&lt;/code&gt; for all of our desired features except for the one marked with a red x in the figure below. &lt;/p&gt;

&lt;p&gt;Don't worry about the unaddressed feature yet; we are saving it for later! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcbj8yxu9oue5clllsumt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcbj8yxu9oue5clllsumt.png" alt="image" width="800" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4: Create a new index with the optimized mapping from step 3.&lt;/strong&gt; &lt;br&gt;
&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;Name&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="k"&gt;of&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;your&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;final&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;index&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;copy&lt;/span&gt; &lt;span class="nx"&gt;and&lt;/span&gt; &lt;span class="nx"&gt;paste&lt;/span&gt; &lt;span class="nx"&gt;your&lt;/span&gt; &lt;span class="nx"&gt;optimized&lt;/span&gt; &lt;span class="nx"&gt;mapping&lt;/span&gt; &lt;span class="nx"&gt;here&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, we will create a new index called &lt;code&gt;produce_index&lt;/code&gt; with the optimized &lt;code&gt;mapping&lt;/code&gt; we just worked on. &lt;/p&gt;

&lt;p&gt;If you still have the &lt;code&gt;mapping&lt;/code&gt; from step 3 in the Kibana console, delete it from the console. &lt;/p&gt;

&lt;p&gt;Then, copy and paste the following example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;produce_index&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;mappings&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;properties&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;botanical_name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;enabled&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;country_of_origin&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fields&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_purchased&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;produce_type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;quantity&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;long&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;float&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_details&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;enabled&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch creates the &lt;code&gt;produce_index&lt;/code&gt; with the optimized &lt;code&gt;mapping&lt;/code&gt; we defined above! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7zfi4329k2enu9f1r1a4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7zfi4329k2enu9f1r1a4.png" alt="image" width="412" height="152"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 5: Check the mapping of the new index to make sure all the fields have been mapped correctly&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Name&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="k"&gt;of&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;test&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's check the mapping of the &lt;code&gt;produce_index&lt;/code&gt; to make sure that all the fields have been mapped correctly.&lt;/p&gt;

&lt;p&gt;Copy and paste the following example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;produce_index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Compared to the dynamic mapping, our optimized &lt;code&gt;mapping&lt;/code&gt; looks much simpler and more concise!  &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feeff2rqq7q1kv9gkrv3c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feeff2rqq7q1kv9gkrv3c.png" alt="image" width="800" height="1949"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The current &lt;code&gt;mapping&lt;/code&gt; satisfies the requirements that are marked with green check marks in the figure below.&lt;br&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffo6qiz0pb8cr45ir1ijy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffo6qiz0pb8cr45ir1ijy.png" alt="image" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Just by defining our own &lt;code&gt;mapping&lt;/code&gt;, we prevented unnecessary &lt;code&gt;inverted index&lt;/code&gt; and &lt;code&gt;doc values&lt;/code&gt; structures from being created, which saves a lot of disk space. &lt;/p&gt;

&lt;p&gt;The type of each field has been customized to serve specific client requests. As a result, we have also optimized the performance of Elasticsearch!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 6: Index your dataset into the new index&lt;/strong&gt;&lt;br&gt;
Now that we have created a new index (&lt;code&gt;produce_index&lt;/code&gt;) with the optimal &lt;code&gt;mapping&lt;/code&gt;, it is time to add data to the &lt;code&gt;produce_index&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;For simplicity's sake, we will index two documents by sending the following requests.  &lt;/p&gt;
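&lt;p&gt;As an aside, when you have more than a handful of documents, the bulk API is the usual way to index them in a single request. A minimal sketch (with abbreviated, illustrative documents) looks like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;POST produce_index/_bulk
{ "index": {} }
{ "name": "Pineapple", "produce_type": "Fruit" }
{ "index": {} }
{ "name": "Mango", "produce_type": "Fruit" }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;For our two documents, individual requests are just as easy, so that is what we will do here.&lt;/p&gt;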

&lt;p&gt;&lt;em&gt;Index the first document&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;produce_index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Pineapple&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;botanical_name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Ananas comosus&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;produce_type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Fruit&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;country_of_origin&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_purchased&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2020-06-02T12:15:35&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;quantity&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;3.11&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;a large juicy tropical fruit consisting of aromatic edible yellow flesh surrounded by a tough segmented skin and topped with a tuft of stiff leaves.These pineapples are sourced from New Zealand.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_details&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Tropical Fruit Growers of New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;main_contact&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Hugh Rose&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_location&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Whangarei, New Zealand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;preferred_vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch successfully indexes the first document. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzn75qepi3z2qach6he2b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzn75qepi3z2qach6he2b.png" alt="image" width="571" height="421"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Index the second document&lt;/em&gt;&lt;br&gt;
The second document has fields almost identical to those of the first, except for an extra field called "organic" set to true!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;produce_index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_doc&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Mango&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;botanical_name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Harum Manis&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;produce_type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Fruit&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;country_of_origin&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Indonesia&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;organic&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_purchased&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2020-05-02T07:15:35&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;quantity&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;1.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Mango Arumanis or Harum Manis is originated from East Java. Arumanis means harum dan manis or fragrant and sweet just like its taste. The ripe Mango Arumanis has dark green skin coated with thin grayish natural wax. The flesh is deep yellow, thick, and soft with little to no fiber. Mango Arumanis is best eaten when ripe.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_details&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Ayra Shezan Trading&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;main_contact&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Suharto&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_location&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Binjai, Indonesia&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;preferred_vendor&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch successfully indexes the second document. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjdxuedr454pcyms9cp8g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjdxuedr454pcyms9cp8g.png" alt="image" width="481" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's see what happens to the &lt;code&gt;mapping&lt;/code&gt; by sending this request below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;produce_index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
The new field ("organic") and its field type (boolean) have been added to the &lt;code&gt;mapping&lt;/code&gt; (orange box). This is in line with the rules of &lt;code&gt;mapping&lt;/code&gt; we discussed earlier since you can add &lt;em&gt;new&lt;/em&gt; fields to the &lt;code&gt;mapping&lt;/code&gt;. We just cannot change the &lt;code&gt;mapping&lt;/code&gt; of an &lt;em&gt;existing&lt;/em&gt; field! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffk2v520lnbwoprh6x7ry.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffk2v520lnbwoprh6x7ry.png" alt="image" width="726" height="1712"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h4&gt;
  
  
  What if you do need to make changes to the existing field type?
&lt;/h4&gt;

&lt;p&gt;Let's say you have added a huge produce dataset to the &lt;code&gt;produce_index&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;Then, your client asks for an additional feature where the user can run a full text search on the field "botanical_name".  &lt;/p&gt;

&lt;p&gt;This is a dilemma since we disabled the field "botanical_name" when we created the &lt;code&gt;produce_index&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhthnuufaa7tc2c0gsmut.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhthnuufaa7tc2c0gsmut.gif" alt="image" width="420" height="468"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;What are we to do?&lt;/p&gt;

&lt;p&gt;Remember, you CANNOT change the &lt;code&gt;mapping&lt;/code&gt; of an &lt;em&gt;existing&lt;/em&gt; field. If you do need to make changes to the existing field, you must create a new index with the desired &lt;code&gt;mapping&lt;/code&gt;, then reindex all documents into the new index. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;STEP 1: Create a new index (produce_v2) with the latest mapping.&lt;/strong&gt;&lt;br&gt;
This step is very similar to what we just did with the &lt;code&gt;test_index&lt;/code&gt; and the &lt;code&gt;produce_index&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;We are just creating a new index (&lt;code&gt;produce_v2&lt;/code&gt;) with a new &lt;code&gt;mapping&lt;/code&gt; where we remove the "enabled" parameter from the field "botanical_name" and change its type to "text". &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;produce_v2&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;mappings&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;properties&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;botanical_name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;country_of_origin&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fields&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ignore_above&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;256&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_purchased&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;name&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;organic&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;boolean&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;produce_type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;quantity&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;long&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;float&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;vendor_details&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;object&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;enabled&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch creates a new index (produce_v2) with the latest &lt;code&gt;mapping&lt;/code&gt;. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzvs72d45pr6kykr6ksae.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzvs72d45pr6kykr6ksae.png" alt="image" width="800" height="205"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Check the &lt;code&gt;mapping&lt;/code&gt; of the &lt;code&gt;produce_v2&lt;/code&gt; index by sending the following request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View the mapping of produce_v2:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;produce_v2&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
You will see that the field "botanical_name" has been typed as &lt;code&gt;text&lt;/code&gt; (orange box). &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feg119a3hmcu4cwmosg5p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feg119a3hmcu4cwmosg5p.png" alt="image" width="640" height="1740"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;STEP 2: Reindex the data from the original index (&lt;code&gt;produce_index&lt;/code&gt;) to the one you just created (&lt;code&gt;produce_v2&lt;/code&gt;).&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We just created a new index (&lt;code&gt;produce_v2&lt;/code&gt;) with the desired &lt;code&gt;mapping&lt;/code&gt;. The next step is to reindex the data from the &lt;code&gt;produce_index&lt;/code&gt; to the &lt;code&gt;produce_v2&lt;/code&gt; index. &lt;/p&gt;

&lt;p&gt;To do so, we send the following request:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;_reindex&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;index&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;produce_index&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;dest&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;index&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;produce_v2&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The field "source" refers to the original index that contains the dataset we want to reindex. The field "dest"(destination) refers to the new index with the optimized &lt;code&gt;mapping&lt;/code&gt;. The example requests for the data to be reindexed to the new index.&lt;/p&gt;

&lt;p&gt;Copy and paste the request above into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
This request reindexes data from the &lt;code&gt;produce_index&lt;/code&gt; to the &lt;code&gt;produce_v2&lt;/code&gt; index. The &lt;code&gt;produce_v2&lt;/code&gt; index can now be used to run the requests that the client has specified. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftpqdhu1r1ipfa5nz3ecp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftpqdhu1r1ipfa5nz3ecp.png" alt="image" width="686" height="543"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h4&gt;
  
  
  Runtime Field
&lt;/h4&gt;

&lt;p&gt;We have one last feature to work on! &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkteq05hpg9dwtjkvz4d4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkteq05hpg9dwtjkvz4d4.png" alt="image" width="800" height="430"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Our client wants to be able to get a summary of monthly expenses.&lt;/p&gt;

&lt;p&gt;This feature requires splitting the data into monthly buckets. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbeyvn7vdm1xg8rbqoek4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbeyvn7vdm1xg8rbqoek4.png" alt="image" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then, we calculate the monthly expense for each bucket by adding up the totals of all documents in the bucket. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rqm7d7nunpkfd9vw0u4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5rqm7d7nunpkfd9vw0u4.png" alt="image" width="800" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The problem is that our documents do not have a field called "total". However, we do have fields called "quantity" and "unit_price". &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr51rowafams5j819ml2w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr51rowafams5j819ml2w.png" alt="image" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In order to get the total for each document, we need to multiply the value of the field "quantity" by the value of "unit_price". Then, we add up the totals of all documents in each bucket. This yields the monthly expense. &lt;/p&gt;
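&lt;p&gt;The arithmetic itself is easy to sketch outside Elasticsearch. The Python below walks two sample documents shaped like ours (the dates are made up for illustration), computes each document's total, and sums per month, which is exactly the bucketing this feature calls for:&lt;/p&gt;

```python
from collections import defaultdict

# Sample documents shaped like ours; dates invented for illustration.
docs = [
    {"date_purchased": "2020-05-17", "quantity": 200, "unit_price": 3.11},
    {"date_purchased": "2020-05-02", "quantity": 500, "unit_price": 1.5},
]

# total per document = quantity * unit_price; then sum per YYYY-MM bucket.
monthly_expense = defaultdict(float)
for doc in docs:
    month = doc["date_purchased"][:7]  # e.g. "2020-05"
    monthly_expense[month] += doc["quantity"] * doc["unit_price"]
# May 2020 bucket: roughly 622.0 + 750.0 = 1372.0
```
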

&lt;p&gt;At this point, you might be thinking we will have to calculate the total and store this in a new field called "total". Then, add the field "total" to the existing documents and run the sum aggregation on this field. &lt;/p&gt;

&lt;p&gt;This can get kinda hairy because it involves adding a new field to every existing document in your index, which means reindexing your data and starting all over again. &lt;/p&gt;

&lt;p&gt;But there is a new feature that helps you get around this issue. It is called the &lt;code&gt;runtime field&lt;/code&gt;! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is &lt;code&gt;runtime&lt;/code&gt;?&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;Runtime&lt;/code&gt; refers to the moment when Elasticsearch is executing your request. During &lt;code&gt;runtime&lt;/code&gt;, you can create a temporary field and calculate a value for it on the fly. That field can then be used by whatever request was sent at runtime. &lt;/p&gt;

&lt;p&gt;This may sound very abstract to you at this point so let’s break this down.&lt;/p&gt;
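&lt;p&gt;As a loose analogy in plain Python (my own illustration, not Elasticsearch code): a runtime field behaves like a property that is computed every time it is accessed, rather than a value stored with the document:&lt;/p&gt;

```python
class Document:
    """A stored document with quantity and unit_price, but no stored total."""

    def __init__(self, quantity, unit_price):
        self.quantity = quantity
        self.unit_price = unit_price

    @property
    def total(self):
        # Computed on access ("at runtime"), never written to storage;
        # analogous to the script of a runtime field.
        return self.quantity * self.unit_price

mango = Document(quantity=500, unit_price=1.5)
```

Like a runtime field, nothing gets reindexed: adding `total` changes no stored data, yet every document can now answer for its total.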

&lt;p&gt;For our last feature, the missing ingredient is the field "total". We must calculate the total by multiplying the values of the fields "quantity" and "unit_price". &lt;/p&gt;

&lt;p&gt;Right now the field "total" does not exist in our dataset so we cannot aggregate on this field. &lt;/p&gt;

&lt;p&gt;What we are going to do is to create a &lt;code&gt;runtime field&lt;/code&gt; called "total" and add it to the &lt;code&gt;mapping&lt;/code&gt; of the existing index. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Create a runtime field and add it to the mapping of the existing index.&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;Enter&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="k"&gt;of&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;runtime&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name-your-runtime-field-here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Specify-field-type-here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;script&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Specify the formula you want executed&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following request asks Elasticsearch to create a &lt;code&gt;runtime field&lt;/code&gt; called "total" and add it to the mapping of the &lt;code&gt;produce_v2&lt;/code&gt; index. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6jhoa0wbl5i7pbjepv5p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6jhoa0wbl5i7pbjepv5p.png" alt="image" width="800" height="294"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;runtime&lt;/code&gt; field (1) is created only when a user runs a request against this field. &lt;/p&gt;

&lt;p&gt;We know that the field "total"(2) will contain decimal points so we will set the field "type" to "double"(3). &lt;/p&gt;

&lt;p&gt;In the "script"(4), we write out the formula to calculate the value for the field "total". The "script" instructs Elasticsearch to do the following(5): For each document, multiply the value of the field "unit_price" by the value of the field "quantity".&lt;/p&gt;

&lt;p&gt;Copy and paste the following example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;produce_v2&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;runtime&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;total&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;double&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;script&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;emit(doc['unit_price'].value* doc['quantity'].value)&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt; &lt;br&gt;
Elasticsearch successfully adds the &lt;code&gt;runtime field&lt;/code&gt; to the mapping. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgifeobu6bl28bd35kdon.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgifeobu6bl28bd35kdon.png" alt="image" width="695" height="143"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Check the mapping:&lt;/strong&gt;&lt;br&gt;
Let's check the mapping of the &lt;code&gt;produce_v2&lt;/code&gt; index by sending the following request.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;produce_v2&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_mapping&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch adds a &lt;code&gt;runtime field&lt;/code&gt; to the mapping (red box).  &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyi1u4cpx7gl65udiuvr3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyi1u4cpx7gl65udiuvr3.png" alt="image" width="800" height="1261"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Note that the &lt;code&gt;runtime field&lt;/code&gt; is not listed under the "properties" object, which contains the fields in our documents. This is because the &lt;code&gt;runtime field&lt;/code&gt; "total" is not indexed! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The &lt;code&gt;runtime field&lt;/code&gt; is only created and calculated at &lt;code&gt;runtime&lt;/code&gt; as you execute your request!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Remember, our indexed documents do not have a field "total". But by adding a &lt;code&gt;runtime field&lt;/code&gt; to our &lt;code&gt;mapping&lt;/code&gt;, we are leaving some instructions for Elasticsearch.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;mapping&lt;/code&gt; tells Elasticsearch that when it receives a request that references the field "total", it should create a temporary field called "total" for each document and calculate its value by running the "script". It then runs the request on the field "total" and sends the results to the user.&lt;/p&gt;
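This per-document evaluation can be sketched in plain Python. This is a conceptual illustration only, with made-up documents; it mirrors the Painless script in the mapping above but is not how Elasticsearch is actually implemented:

```python
# Conceptual sketch only: how the runtime field "total" is evaluated
# per document at query time. The documents here are made up.
docs = [
    {"unit_price": 2.5, "quantity": 4},
    {"unit_price": 1.0, "quantity": 3},
]

def emit_total(doc):
    # Mirrors the script source:
    # emit(doc['unit_price'].value * doc['quantity'].value)
    return doc["unit_price"] * doc["quantity"]

# "total" exists only for the duration of the request; nothing is indexed.
totals = [emit_total(d) for d in docs]
print(totals)  # [10.0, 3.0]
```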

&lt;p&gt;&lt;strong&gt;Step 3: Run a request on the &lt;code&gt;runtime field&lt;/code&gt; to see it perform its magic!&lt;/strong&gt; &lt;br&gt;
Let's put the &lt;code&gt;runtime field&lt;/code&gt; to the test. &lt;/p&gt;

&lt;p&gt;We are going to send the following request to perform a sum aggregation on the &lt;code&gt;runtime field&lt;/code&gt; "total". &lt;/p&gt;

&lt;p&gt;Note that the following request does not aggregate monthly expenses. We are running a simple aggregation request to demonstrate how the &lt;code&gt;runtime field&lt;/code&gt; works!  &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Specify the aggregation type here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following request runs a sum aggregation against the &lt;code&gt;runtime field&lt;/code&gt; "total" over all documents in our index.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;produce_v2&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;total_expense&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sum&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;total&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
When this request is sent, a &lt;code&gt;runtime field&lt;/code&gt; called "total" is created and calculated for documents within the scope of our request (the entire index). Then, the sum aggregation is run on the field "total" over all documents in our index.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvaz7nbc14t5s16uk1ptb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvaz7nbc14t5s16uk1ptb.png" alt="image" width="800" height="944"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What are some of the benefits of using the &lt;code&gt;runtime field&lt;/code&gt;?&lt;/strong&gt;&lt;br&gt;
A &lt;code&gt;runtime field&lt;/code&gt; is only created and calculated when a request that references it is executed. &lt;code&gt;Runtime fields&lt;/code&gt; are not indexed, so they do not take up disk space.  &lt;/p&gt;

&lt;p&gt;We also did not have to reindex in order to add a new field to existing documents. For more information on runtime fields, check out this &lt;a href="https://www.elastic.co/blog/introducing-elasticsearch-runtime-fields" rel="noopener noreferrer"&gt;blog&lt;/a&gt;! &lt;/p&gt;

&lt;p&gt;There you have it. You have now mastered the basics of &lt;code&gt;mapping&lt;/code&gt;! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2glgz9gzhorzsf77izjh.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2glgz9gzhorzsf77izjh.gif" alt="image" width="500" height="218"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Play around with &lt;code&gt;mapping&lt;/code&gt; on your own and come up with the optimal &lt;code&gt;mapping&lt;/code&gt; for your use case! &lt;/p&gt;

</description>
      <category>beginners</category>
      <category>elasticsearch</category>
      <category>database</category>
      <category>datascience</category>
    </item>
    <item>
      <title>Running aggregations with Elasticsearch and Kibana</title>
      <dc:creator>Lisa Jung</dc:creator>
      <pubDate>Thu, 05 Aug 2021 17:59:39 +0000</pubDate>
      <link>https://dev.to/elastic/running-aggregations-with-elasticsearch-and-kibana-lni</link>
      <guid>https://dev.to/elastic/running-aggregations-with-elasticsearch-and-kibana-lni</guid>
      <description>&lt;p&gt;There are two main ways to search in Elasticsearch:&lt;br&gt;
1) &lt;code&gt;Queries&lt;/code&gt; retrieve documents that match the specified criteria. &lt;br&gt;
2) &lt;code&gt;Aggregations&lt;/code&gt; present the summary of your data as metrics, statistics, and other analytics. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flqfqb8irs6ob1b13hxla.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flqfqb8irs6ob1b13hxla.png" alt="image" width="800" height="374"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frq5t03d1e7ciqpj1226j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frq5t03d1e7ciqpj1226j.png" alt="image" width="800" height="385"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In my previous &lt;a href="https://dev.to/lisahjung/beginner-s-guide-to-running-queries-with-elasticsearch-and-kibana-4kn9"&gt;blog&lt;/a&gt;, we learned how to retrieve documents by sending &lt;code&gt;queries&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;This blog will cover how you can summarize your data as metrics, statistics, or other analytics by sending &lt;code&gt;aggregations&lt;/code&gt; requests! &lt;/p&gt;

&lt;p&gt;By the end of this blog, you will be able to run:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;metric aggregations&lt;/li&gt;
&lt;li&gt;bucket aggregations&lt;/li&gt;
&lt;li&gt;combined aggregations&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Prerequisite work
&lt;/h2&gt;

&lt;p&gt;Watch this &lt;a href="https://www.youtube.com/watch?v=CCTgroOcyfM" rel="noopener noreferrer"&gt;video&lt;/a&gt; from time stamp 15:00-21:46. This video will show you how to complete steps 1-3.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Set up Elasticsearch and Kibana* &lt;/li&gt;
&lt;li&gt;Add &lt;a href="https://www.kaggle.com/carrie1/ecommerce-data" rel="noopener noreferrer"&gt;e-commerce dataset&lt;/a&gt; to Elasticsearch*&lt;/li&gt;
&lt;li&gt;Open the Kibana console (AKA Dev Tools)
&lt;/li&gt;
&lt;li&gt;Keep two windows open side by side (this blog and the Kibana console)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We will be sending &lt;code&gt;aggregations&lt;/code&gt; requests from Kibana to Elasticsearch to learn how &lt;code&gt;aggregations&lt;/code&gt; work! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Notes&lt;/strong&gt;&lt;br&gt;
1) If you would rather download Elasticsearch and Kibana on your own machine, follow the steps outlined in &lt;a href="https://dev.to/elastic/downloading-elasticsearch-and-kibana-macos-linux-and-windows-1mmo"&gt;Downloading Elasticsearch and Kibana(macOS/Linux and Windows)&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;2) The video will show you how to add the news headline dataset. Follow the same steps but add the e-commerce dataset linked above. Create an index called &lt;code&gt;ecommerce_original&lt;/code&gt; and add the data to that index. &lt;/p&gt;
&lt;h2&gt;
  
  
  Additional Resources
&lt;/h2&gt;

&lt;p&gt;Interested in beginner friendly workshops on Elasticsearch and Kibana? Check out my Beginner's Crash Course to Elastic Stack series!&lt;/p&gt;

&lt;p&gt;1) &lt;a href="https://www.youtube.com/watch?v=iGKOdep1Iss" rel="noopener noreferrer"&gt;Part 4 Workshop Recording&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This blog is a complementary blog to Part 4 of the Beginner's Crash Course to Elastic Stack. If you prefer learning by watching videos instead, check out the recording!&lt;/p&gt;

&lt;p&gt;2) &lt;a href="https://github.com/LisaHJung/Beginners-Crash-Course-to-the-Elastic-Stack-Series" rel="noopener noreferrer"&gt;Beginner's Crash Course to Elastic Stack Table of Contents&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This table of contents includes repos of all workshops in the series. Each repo includes resources shared during the workshop including the video recording, presentation, related blogs, Elasticsearch requests and more!&lt;/p&gt;
&lt;h2&gt;
  
  
  Set up data within Elasticsearch
&lt;/h2&gt;

&lt;p&gt;Now that you have completed the prerequisite steps, it is time to set up data within Elasticsearch. &lt;/p&gt;

&lt;p&gt;Oftentimes, the dataset is not optimal for running requests in its original state. &lt;/p&gt;

&lt;p&gt;For example, the type of a field may not be recognized by Elasticsearch, or the dataset may contain a value that was accidentally placed in the wrong field. &lt;/p&gt;

&lt;p&gt;These are exactly the problems I ran into while working with this dataset. The following are the requests I sent to Elasticsearch to produce the results included in this blog. &lt;/p&gt;

&lt;p&gt;Copy and paste these requests into the Kibana console (Dev Tools) and run them in the order shown below. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;STEP 1: Create a new index (ecommerce_data) with the following mapping.&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;PUT&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;mappings&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;properties&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Country&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;CustomerID&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;long&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;format&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;M/d/yyyy H:m&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceNo&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Quantity&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;long&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;StockCode&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;keyword&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;double&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;STEP 2: Reindex the data from the original index (source) to the one you just created (destination).&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;_reindex&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;index&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;name of your original index when you added the data to Elasticsearch&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;dest&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;index&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ecommerce_data&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;STEP 3: Remove the negative values from the field "UnitPrice".&lt;/strong&gt;&lt;br&gt;
When you explore the minimum unit price in this dataset, you will see that the minimum unit price value is -11062.06. &lt;/p&gt;

&lt;p&gt;To keep our data simple, I used the delete_by_query API to remove all unit prices less than 0.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_delete_by_query&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;range&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;lte&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;STEP 4: Remove values greater than 500 from the field "UnitPrice".&lt;/strong&gt;&lt;br&gt;
When you explore the maximum unit price in this dataset, you will see that the maximum unit price value is 38,970. &lt;/p&gt;

&lt;p&gt;When the data is manually examined, the majority of unit prices are less than 500. An extreme value like 38,970 would skew the average. &lt;/p&gt;
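A quick back-of-the-envelope check shows how a single extreme value distorts the mean. The "typical" prices below are made up for illustration; only the 38,970 figure comes from the dataset:

```python
# Illustrative only: a few made-up typical unit prices, plus the extreme
# value observed in the dataset.
typical = [2.5, 3.0, 4.25, 1.75]
with_outlier = typical + [38970.0]

avg_typical = sum(typical) / len(typical)
avg_skewed = sum(with_outlier) / len(with_outlier)

print(round(avg_typical, 2))  # 2.88
print(round(avg_skewed, 2))   # 7796.3 -- one outlier dominates the average
```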

&lt;p&gt;To simplify our demo, I used the delete_by_query API to remove all unit prices greater than 500.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;POST&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_delete_by_query&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;range&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;gte&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Get information about documents in an index
&lt;/h2&gt;

&lt;p&gt;Before we can run &lt;code&gt;aggregations&lt;/code&gt; requests, we need to know what information is included in our dataset. &lt;/p&gt;

&lt;p&gt;This will help us figure out what type of questions we could ask and identify the appropriate fields to run &lt;code&gt;aggregations&lt;/code&gt; on to get the answers.&lt;/p&gt;

&lt;p&gt;The following &lt;code&gt;query&lt;/code&gt; will retrieve information about documents in the &lt;code&gt;ecommerce_data&lt;/code&gt; index. This &lt;code&gt;query&lt;/code&gt; is a great way to explore the structure and content of your document. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch displays the number of hits (line 12) and a sample of 10 search results by default (lines 16+). &lt;/p&gt;

&lt;p&gt;The first hit (a document) is shown on lines 17-31. The field "_source" (line 22) lists all the fields (the content) of the document.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ab7sqrvs5v5sf2ar63m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ab7sqrvs5v5sf2ar63m.png" alt="image" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;ecommerce_data&lt;/code&gt; index contains transaction data from a company that operates in multiple countries. &lt;/p&gt;

&lt;p&gt;Each document is a transaction of an item and it contains the following fields:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Description &lt;/li&gt;
&lt;li&gt;Quantity &lt;/li&gt;
&lt;li&gt;InvoiceNo&lt;/li&gt;
&lt;li&gt;CustomerID&lt;/li&gt;
&lt;li&gt;UnitPrice&lt;/li&gt;
&lt;li&gt;Country&lt;/li&gt;
&lt;li&gt;InvoiceDate&lt;/li&gt;
&lt;li&gt;StockCode
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As you can see, running this &lt;code&gt;query&lt;/code&gt; helps us understand what types of questions we could ask about our dataset and which fields we need to &lt;code&gt;aggregate&lt;/code&gt; on to get the answers. &lt;/p&gt;
&lt;h2&gt;
  
  
  Aggregations Requests
&lt;/h2&gt;

&lt;p&gt;The basic syntax of an &lt;code&gt;aggregations&lt;/code&gt; request looks like the following.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Specify the aggregation type here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;There are various types of &lt;code&gt;aggregations&lt;/code&gt; that you can run with Elasticsearch.&lt;/p&gt;

&lt;p&gt;In the prerequisite steps, we added the e-commerce dataset and set up data within Elasticsearch.&lt;/p&gt;

&lt;p&gt;To learn how these different types of &lt;code&gt;aggregations&lt;/code&gt; work, we will pretend that we own an e-commerce app and that we have added our transaction data to the &lt;code&gt;ecommerce_data&lt;/code&gt; index. &lt;/p&gt;

&lt;p&gt;We will be sending various &lt;code&gt;aggregations&lt;/code&gt; requests to get insights about transactions of items sold on our e-commerce app!&lt;/p&gt;

&lt;h2&gt;
  
  
  Metric Aggregations
&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;Metric aggregations&lt;/code&gt; are used to compute numeric values based on your dataset. They can be used to calculate &lt;code&gt;sum&lt;/code&gt;, &lt;code&gt;min&lt;/code&gt;, &lt;code&gt;max&lt;/code&gt;, &lt;code&gt;avg&lt;/code&gt;, unique count (&lt;code&gt;cardinality&lt;/code&gt;), and more.  &lt;/p&gt;

&lt;p&gt;When you are running an e-commerce app, it is important to know how your business is performing. A great way to measure that is to compute these metrics we mentioned above. &lt;/p&gt;

&lt;p&gt;Let's calculate these values!&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Metric aggregations&lt;/code&gt; such as &lt;code&gt;sum&lt;/code&gt;, &lt;code&gt;min&lt;/code&gt;, &lt;code&gt;max&lt;/code&gt;, and &lt;code&gt;avg&lt;/code&gt; can only be performed on fields that contain numeric values. &lt;/p&gt;

&lt;p&gt;Take a look at the image below. This is an example of a document in our index (lines 22-31).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ab7sqrvs5v5sf2ar63m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ab7sqrvs5v5sf2ar63m.png" alt="image" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The fields "Quantity" and "UnitPrice" contain numeric values, so &lt;code&gt;metric aggregations&lt;/code&gt; can be performed on these fields. &lt;/p&gt;
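&lt;p&gt;If you are unsure whether a field is numeric, you can check the index mapping. As a quick sketch (assuming the &lt;code&gt;ecommerce_data&lt;/code&gt; index from the prerequisite steps), the following Kibana console request returns the type of every field in the index:&lt;/p&gt;

```
GET ecommerce_data/_mapping
```

&lt;p&gt;Fields reported with types such as &lt;code&gt;long&lt;/code&gt;, &lt;code&gt;double&lt;/code&gt;, or &lt;code&gt;float&lt;/code&gt; can be used in metric aggregations.&lt;/p&gt;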

&lt;h3&gt;
  
  
  Compute the sum of all unit prices in the index
&lt;/h3&gt;

&lt;p&gt;Let's say we want to &lt;code&gt;sum&lt;/code&gt; up the values of the field "UnitPrice" over all documents in the &lt;code&gt;ecommerce_data&lt;/code&gt; index. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sum&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example follows the &lt;code&gt;aggregations&lt;/code&gt; syntax. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sum_unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sum&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This &lt;code&gt;aggregations&lt;/code&gt; request is named "sum_unit_price". It instructs Elasticsearch to perform a &lt;code&gt;sum&lt;/code&gt; aggregation on the field "UnitPrice" over all documents in the &lt;code&gt;ecommerce_data&lt;/code&gt; index.&lt;/p&gt;

&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
By default, Elasticsearch returns the top 10 hits (lines 16+). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc2la19852m8k3mq7yhfr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc2la19852m8k3mq7yhfr.png" alt="image" width="800" height="1066"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When you collapse the hits (red box, line 10), you will see the results of the &lt;code&gt;aggregations&lt;/code&gt; we named "sum_unit_price" (image below). It displays the &lt;code&gt;sum&lt;/code&gt; of all unit prices present in our index. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ht4bs2cuewzni1w9pkz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ht4bs2cuewzni1w9pkz.png" alt="image" width="800" height="918"&gt;&lt;/a&gt;&lt;/p&gt;
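&lt;p&gt;In JSON form, the &lt;code&gt;aggregations&lt;/code&gt; section of the response looks roughly like this (the &lt;code&gt;value&lt;/code&gt; shown here is illustrative; the actual number depends on your data):&lt;/p&gt;

```json
{
  "aggregations": {
    "sum_unit_price": {
      "value": 1234567.89
    }
  }
}
```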

&lt;p&gt;If the sole purpose of running an &lt;code&gt;aggregations&lt;/code&gt; request is to get the &lt;code&gt;aggregations&lt;/code&gt; results, you can add a &lt;code&gt;size&lt;/code&gt; parameter and set it to 0, as shown below. &lt;/p&gt;

&lt;p&gt;This parameter prevents Elasticsearch from fetching the top 10 hits so that the aggregations results are shown at the top of the response. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Using a size parameter&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sum_unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sum&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
We no longer need to collapse the hits to access the &lt;code&gt;aggregations&lt;/code&gt; results! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg9gtj96g4fo58gv1zw0u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg9gtj96g4fo58gv1zw0u.png" alt="image" width="800" height="1369"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will be setting the &lt;code&gt;size&lt;/code&gt; parameter to 0 in all requests from this point on. &lt;/p&gt;
&lt;h3&gt;
  
  
  Compute the lowest (min) unit price of an item
&lt;/h3&gt;

&lt;p&gt;What if we wanted to calculate the lowest (&lt;code&gt;min&lt;/code&gt;) unit price of an item? &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;min&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example is very similar to the last &lt;code&gt;aggregations&lt;/code&gt; request we sent.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;lowest_unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;min&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The differences are that:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;we are naming the &lt;code&gt;aggregations&lt;/code&gt; as "lowest_unit_price".&lt;/li&gt;
&lt;li&gt;we set the &lt;code&gt;aggregations&lt;/code&gt; type to &lt;code&gt;min&lt;/code&gt; which is short for minimum. &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
The lowest unit price of an item is 1.01. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrh6yj1k6heera7gm5g5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrh6yj1k6heera7gm5g5.png" alt="image" width="800" height="1205"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Compute the highest (max) unit price of an item
&lt;/h3&gt;

&lt;p&gt;What if we were interested in the highest (&lt;code&gt;max&lt;/code&gt;) unit price of an item? &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;max&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example is very similar to the last &lt;code&gt;aggregations&lt;/code&gt; request we sent.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;highest_unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;max&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The differences are that:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;we are naming the &lt;code&gt;aggregations&lt;/code&gt; as "highest_unit_price".&lt;/li&gt;
&lt;li&gt;we set the &lt;code&gt;aggregations&lt;/code&gt; type to &lt;code&gt;max&lt;/code&gt; which is short for maximum. &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
The highest unit price of an item is 498.79. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fals6qpgfo9ycmeypjidi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fals6qpgfo9ycmeypjidi.png" alt="image" width="800" height="1107"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Compute the average unit price of items
&lt;/h3&gt;

&lt;p&gt;What if we wanted to calculate the &lt;code&gt;average&lt;/code&gt; unit price of items?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;avg&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example is very similar to the last &lt;code&gt;aggregations&lt;/code&gt; request we sent.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;average_unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;avg&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The differences are that:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;we are naming the &lt;code&gt;aggregations&lt;/code&gt; as "average_unit_price".&lt;/li&gt;
&lt;li&gt;we set the &lt;code&gt;aggregations&lt;/code&gt; type to &lt;code&gt;avg&lt;/code&gt; which is short for average.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt; &lt;br&gt;
The average unit price of an item is ~4.39.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fem740v5zp28q1a64p38e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fem740v5zp28q1a64p38e.png" alt="image" width="800" height="1055"&gt;&lt;/a&gt;&lt;/p&gt;
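&lt;p&gt;Note that you are not limited to one metric per request. Several metric aggregations can be combined as siblings under &lt;code&gt;aggs&lt;/code&gt;, and each result is returned under its own name. A sketch, reusing the names from the examples above:&lt;/p&gt;

```
GET ecommerce_data/_search
{
  "size": 0,
  "aggs": {
    "lowest_unit_price": {
      "min": { "field": "UnitPrice" }
    },
    "highest_unit_price": {
      "max": { "field": "UnitPrice" }
    },
    "average_unit_price": {
      "avg": { "field": "UnitPrice" }
    }
  }
}
```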
&lt;h3&gt;
  
  
  Compute the count, min, max, avg, sum in one go
&lt;/h3&gt;

&lt;p&gt;Calculating the &lt;code&gt;count&lt;/code&gt;, &lt;code&gt;min&lt;/code&gt;, &lt;code&gt;max&lt;/code&gt;, &lt;code&gt;avg&lt;/code&gt;, and &lt;code&gt;sum&lt;/code&gt; individually can be a tedious task. The &lt;code&gt;stats&lt;/code&gt; aggregation can calculate all of these in one go! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;stats&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example is very similar to the last &lt;code&gt;aggregations&lt;/code&gt; request we sent.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;all_stats_unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;stats&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The differences are that:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;we are naming the &lt;code&gt;aggregations&lt;/code&gt; as "all_stats_unit_price".&lt;/li&gt;
&lt;li&gt;we set the &lt;code&gt;aggregations&lt;/code&gt; type to &lt;code&gt;stats&lt;/code&gt; which is short for statistics.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Let's copy and paste this example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected Response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
The &lt;code&gt;stats aggregation&lt;/code&gt; will yield the values of &lt;code&gt;count&lt;/code&gt; (the number of unit prices the aggregation was performed on), &lt;code&gt;min&lt;/code&gt;, &lt;code&gt;max&lt;/code&gt;, &lt;code&gt;avg&lt;/code&gt;, and &lt;code&gt;sum&lt;/code&gt; (the sum of all unit prices in the index). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzitexssnox901hkrbg22.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzitexssnox901hkrbg22.png" alt="image" width="800" height="1300"&gt;&lt;/a&gt;&lt;/p&gt;
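&lt;p&gt;The &lt;code&gt;aggregations&lt;/code&gt; section of the stats response has the following shape. The &lt;code&gt;min&lt;/code&gt;, &lt;code&gt;max&lt;/code&gt;, and &lt;code&gt;avg&lt;/code&gt; values below match the ones we computed earlier; the &lt;code&gt;count&lt;/code&gt; and &lt;code&gt;sum&lt;/code&gt; values are illustrative and depend on your data:&lt;/p&gt;

```json
{
  "aggregations": {
    "all_stats_unit_price": {
      "count": 400000,
      "min": 1.01,
      "max": 498.79,
      "avg": 4.39,
      "sum": 1756000.0
    }
  }
}
```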
&lt;h3&gt;
  
  
  Cardinality Aggregation
&lt;/h3&gt;

&lt;p&gt;What if you want to know the number of unique customers who have bought from the app? &lt;/p&gt;

&lt;p&gt;You would run the &lt;code&gt;cardinality aggregation&lt;/code&gt;, which computes the count of unique values for a given field! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;cardinality&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example is very similar to the last &lt;code&gt;aggregations&lt;/code&gt; request we sent.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;number_unique_customers&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;cardinality&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;CustomerID&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The differences are that:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;we name the &lt;code&gt;aggregations&lt;/code&gt; "number_unique_customers".&lt;/li&gt;
&lt;li&gt;we set the &lt;code&gt;aggregations&lt;/code&gt; type to &lt;code&gt;cardinality&lt;/code&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We perform &lt;code&gt;cardinality aggregations&lt;/code&gt; on the field "CustomerID" as each customer is given a unique customer ID. By identifying unique customer IDs, we are able to get the number of unique customers in our transaction data.&lt;/p&gt;

&lt;p&gt;Copy and paste the example into the Kibana console and send the request! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt; &lt;br&gt;
There are approximately 4325 unique customers in our dataset. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9n6ykgr05xgl4fbfipyh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9n6ykgr05xgl4fbfipyh.png" alt="image" width="800" height="1102"&gt;&lt;/a&gt;&lt;/p&gt;
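&lt;p&gt;For reference, the relevant portion of the response should resemble the snippet below (metadata such as &lt;code&gt;took&lt;/code&gt; and &lt;code&gt;hits&lt;/code&gt; omitted; the exact value may differ slightly, since the &lt;code&gt;cardinality&lt;/code&gt; aggregation returns an approximate count):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"aggregations": {
  "number_unique_customers": {
    "value": 4325
  }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;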
&lt;h3&gt;
  
  
  Limiting the scope of an &lt;code&gt;aggregation&lt;/code&gt;
&lt;/h3&gt;

&lt;p&gt;In previous examples, &lt;code&gt;aggregations&lt;/code&gt; were performed on all documents in the &lt;code&gt;ecommerce_data&lt;/code&gt; index. &lt;/p&gt;

&lt;p&gt;What if you want to run &lt;code&gt;aggregations&lt;/code&gt; on a subset of the documents? &lt;/p&gt;

&lt;p&gt;For example, our index contains e-commerce data from multiple countries. Let's say you want to calculate the average unit price of items sold in Germany.&lt;/p&gt;

&lt;p&gt;To limit the scope of the &lt;code&gt;aggregations&lt;/code&gt;, you can add a &lt;code&gt;query&lt;/code&gt; clause to the &lt;code&gt;aggregations&lt;/code&gt; request. &lt;/p&gt;

&lt;p&gt;The &lt;code&gt;query&lt;/code&gt; clause defines the subset of documents that &lt;code&gt;aggregations&lt;/code&gt; should be performed on.  &lt;/p&gt;

&lt;p&gt;The syntax of a combined &lt;code&gt;query&lt;/code&gt; and &lt;code&gt;aggregations&lt;/code&gt; request looks like the following. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Enter match or match_phrase here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Enter the name of the field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Enter the value you are looking for&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggregations&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Specify aggregations type here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's take a look at the following example, which calculates the average unit price of items sold in Germany.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt; &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2tgaiweis9d6gq5t7qmu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2tgaiweis9d6gq5t7qmu.png" alt="image" width="800" height="757"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This request instructs Elasticsearch to &lt;code&gt;query&lt;/code&gt; (1) all documents that "match" (2) the value "Germany" in the field "Country" (3).&lt;/p&gt;

&lt;p&gt;Elasticsearch is then instructed to run &lt;code&gt;aggregations&lt;/code&gt; (4) on the queried data. We name the &lt;code&gt;aggregations&lt;/code&gt; "germany_average_unit_price" (5). Then, we tell Elasticsearch to get the average (6) of the values in the field "UnitPrice" (7) over all queried documents. &lt;/p&gt;

&lt;p&gt;This in turn will tell us the average unit price of items sold in Germany.&lt;/p&gt;

&lt;p&gt;Let's copy and paste the following example into the Kibana console and send the request! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;query&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;match&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Country&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Germany&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;germany_average_unit_price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;avg&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
The average unit price of items sold in Germany is ~4.58.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw7s4m1v067w9xthiooo0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw7s4m1v067w9xthiooo0.png" alt="image" width="800" height="899"&gt;&lt;/a&gt;&lt;/p&gt;
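&lt;p&gt;The relevant portion of the response should resemble the following (the exact decimal value may differ depending on your copy of the dataset):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"aggregations": {
  "germany_average_unit_price": {
    "value": 4.58
  }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;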

&lt;p&gt;Combining a &lt;code&gt;query&lt;/code&gt; with an &lt;code&gt;aggregations&lt;/code&gt; request allowed us to perform &lt;code&gt;aggregations&lt;/code&gt; on a subset of documents. &lt;/p&gt;

&lt;p&gt;What if we wanted to perform &lt;code&gt;aggregations&lt;/code&gt; on several subsets of documents? &lt;/p&gt;

&lt;p&gt;This is where &lt;code&gt;bucket aggregations&lt;/code&gt; come into play! &lt;/p&gt;
&lt;h2&gt;
  
  
  Bucket Aggregations
&lt;/h2&gt;

&lt;p&gt;When you want to &lt;code&gt;aggregate&lt;/code&gt; on several subsets of documents, &lt;code&gt;bucket aggregations&lt;/code&gt; will come in handy. &lt;/p&gt;

&lt;p&gt;&lt;code&gt;Bucket aggregations&lt;/code&gt; group documents into several subsets of documents called buckets. All documents in a bucket share a common criterion.  &lt;/p&gt;

&lt;p&gt;The following diagram illustrates a &lt;code&gt;bucket aggregations&lt;/code&gt; request that splits documents into monthly buckets. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0qdxrkmwoo6r4ot50iyy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0qdxrkmwoo6r4ot50iyy.png" alt="image" width="800" height="435"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There are various ways you can group documents into buckets. These are: &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Date_histogram aggregation&lt;/li&gt;
&lt;li&gt;Histogram aggregation&lt;/li&gt;
&lt;li&gt;Range aggregation&lt;/li&gt;
&lt;li&gt;Terms aggregation&lt;/li&gt;
&lt;/ol&gt;
&lt;h3&gt;
  
  
  Date_histogram Aggregation
&lt;/h3&gt;

&lt;p&gt;When you are looking to group data by time interval, the &lt;code&gt;date_histogram aggregation&lt;/code&gt; will prove very useful! &lt;/p&gt;

&lt;p&gt;Our &lt;code&gt;ecommerce_data&lt;/code&gt; index contains transaction data that has been collected over time (from 2010 to 2011). &lt;/p&gt;

&lt;p&gt;If we are looking to get insights about transactions over time, our first instinct should be to run the &lt;code&gt;date_histogram aggregation&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;There are two ways to define a time interval with the &lt;code&gt;date_histogram aggregation&lt;/code&gt;: &lt;code&gt;fixed_interval&lt;/code&gt; and &lt;code&gt;calendar_interval&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Fixed_interval&lt;/strong&gt;&lt;br&gt;
With the &lt;code&gt;fixed_interval&lt;/code&gt;, the interval is always &lt;strong&gt;constant&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Let's say we wanted to create a bucket for every 8-hour interval. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fixed_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Specify the interval here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example is similar to the previous &lt;code&gt;aggregations&lt;/code&gt; request we have sent. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_by_8_hrs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fixed_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;8h&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We name our &lt;code&gt;aggregations&lt;/code&gt; "transactions_by_8_hrs". Then, we set the type of &lt;code&gt;aggregation&lt;/code&gt; to "date_histogram".&lt;/p&gt;

&lt;p&gt;We instruct Elasticsearch to perform this &lt;code&gt;aggregation&lt;/code&gt; on the field "InvoiceDate" and to split the documents into buckets at a "fixed_interval" of 8 hours. &lt;/p&gt;

&lt;p&gt;Let's copy and paste this example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch creates a bucket for every 8 hours ("key_as_string") and shows the number of documents ("doc_count") grouped into each bucket. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwk8vjt945abrcc2qsxwp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwk8vjt945abrcc2qsxwp.png" alt="image" width="800" height="714"&gt;&lt;/a&gt;&lt;/p&gt;
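&lt;p&gt;A truncated version of the response looks like the following (the document counts shown are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"aggregations": {
  "transactions_by_8_hrs": {
    "buckets": [
      {
        "key_as_string": "2010-12-01T08:00:00.000Z",
        "key": 1291190400000,
        "doc_count": 102
      },
      ...
    ]
  }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;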

&lt;p&gt;Another way we can define the time interval is through the &lt;code&gt;calendar_interval&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Calendar_interval&lt;/strong&gt;&lt;br&gt;
With the &lt;code&gt;calendar_interval&lt;/code&gt;, the interval may &lt;strong&gt;vary&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;For example, we could choose a time interval of day, month, or year. But daylight saving time can change the length of specific days, months can have different numbers of days, and leap seconds can be tacked onto a particular year. &lt;/p&gt;

&lt;p&gt;So the length of a day, month, or year can vary! &lt;/p&gt;

&lt;p&gt;A scenario where you might use the &lt;code&gt;calendar_interval&lt;/code&gt; is when you want to calculate the monthly revenue. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;calendar_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Specify the interval here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example is similar to the previous &lt;code&gt;aggregations&lt;/code&gt; request we have sent. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_by_month&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;calendar_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;1M&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This time, we name the &lt;code&gt;aggregations&lt;/code&gt; "transactions_by_month". Then, we set the type of aggregation to "date_histogram".&lt;/p&gt;

&lt;p&gt;We instruct Elasticsearch to perform a &lt;code&gt;date_histogram aggregation&lt;/code&gt; on the field "InvoiceDate" and to split the documents into buckets at a "calendar_interval" of 1 month.&lt;/p&gt;

&lt;p&gt;Copy and paste the example into the Kibana console and send the request! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch creates monthly buckets. The starting date and time of each monthly bucket is included in the field "key_as_string". &lt;/p&gt;

&lt;p&gt;The field "key" shows the date and time represented as a timestamp.&lt;/p&gt;

&lt;p&gt;The field "doc_count" shows the number of documents that fall within the time interval.  &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0du6fafonngmi427cnid.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0du6fafonngmi427cnid.png" alt="image" width="800" height="1021"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Bucket sorting for date_histogram aggregation&lt;/strong&gt;&lt;br&gt;
Take a look at the response above. You will notice that the buckets are sorted in ascending order of dates.&lt;/p&gt;

&lt;p&gt;The field "key" shows the date and time represented as timestamps. &lt;/p&gt;

&lt;p&gt;By default, the &lt;code&gt;date_histogram aggregation&lt;/code&gt; sorts buckets based on the "key" values in ascending order. &lt;/p&gt;

&lt;p&gt;To reverse this order, you can add an &lt;code&gt;order&lt;/code&gt; parameter to the &lt;code&gt;aggregations&lt;/code&gt; as shown below. Then, specify that you want to sort buckets based on the "_key" values in descending ("desc") order.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_by_month&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;calendar_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;1M&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;order&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;_key&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;desc&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
You will see that buckets are now sorted to return the most recent interval first.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmuuif4kz5aw86mmuk673.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmuuif4kz5aw86mmuk673.png" alt="image" width="800" height="1162"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Histogram Aggregation
&lt;/h3&gt;

&lt;p&gt;With the &lt;code&gt;date_histogram aggregation&lt;/code&gt;, we were able to create buckets based on time intervals. &lt;/p&gt;

&lt;p&gt;With the &lt;code&gt;histogram aggregation&lt;/code&gt;, we can create buckets based on any numerical interval. &lt;/p&gt;

&lt;p&gt;For example, let's say we wanted to create buckets based on price intervals that increase in increments of 10. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;Specify&lt;/span&gt; &lt;span class="nx"&gt;the&lt;/span&gt; &lt;span class="nx"&gt;interval&lt;/span&gt; &lt;span class="nx"&gt;here&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example is similar to the last request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_per_price_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The difference is that we name the &lt;code&gt;aggregations&lt;/code&gt; "transactions_per_price_interval".&lt;/p&gt;

&lt;p&gt;We instruct Elasticsearch to run a &lt;code&gt;histogram aggregation&lt;/code&gt; on the field "UnitPrice" and configure the price interval to increase in increments of 10. &lt;/p&gt;

&lt;p&gt;Copy and paste the example into the Kibana console and send the request! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns an array of buckets where each bucket represents a price interval ("key"). &lt;/p&gt;

&lt;p&gt;Each interval increases in increments of 10 in unit price. Each bucket also includes the number of documents it contains ("doc_count"). &lt;/p&gt;
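&lt;p&gt;The bucketing rule behind the &lt;code&gt;histogram aggregation&lt;/code&gt; can be sketched in a few lines of plain Python: each value lands in the bucket whose key is the nearest interval boundary at or below it. This is an illustrative sketch with made-up prices, not the actual Elasticsearch implementation.&lt;/p&gt;

```python
import math
from collections import Counter

def histogram_buckets(values, interval):
    """Group numeric values into fixed-width buckets, keyed by each
    bucket's lower bound, the way a histogram aggregation does."""
    counts = Counter(math.floor(v / interval) * interval for v in values)
    # By default, buckets come back sorted by key in ascending order.
    return [{"key": k, "doc_count": c} for k, c in sorted(counts.items())]

# Hypothetical unit prices, standing in for the UnitPrice field.
prices = [3.5, 7.0, 12.99, 15.0, 27.5, 101.0]
print(histogram_buckets(prices, 10))
# → [{'key': 0, 'doc_count': 2}, {'key': 10, 'doc_count': 2},
#    {'key': 20, 'doc_count': 1}, {'key': 100, 'doc_count': 1}]
```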

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fouwrwi26tism81u6m9oa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fouwrwi26tism81u6m9oa.png" alt="image" width="662" height="653"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the first price interval, there are more than 400,000 transactions for items priced within this interval. &lt;/p&gt;

&lt;p&gt;In the next price interval, there are over 20,000 transactions.&lt;/p&gt;

&lt;p&gt;It seems that the higher the price interval, the lower the number of transactions. Could this be a pattern?!&lt;/p&gt;

&lt;p&gt;This might be something worth exploring if we are looking to improve our sales strategy. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Bucket sorting for histogram aggregation&lt;/strong&gt;&lt;br&gt;
Similar to the &lt;code&gt;date_histogram aggregation&lt;/code&gt;, the &lt;code&gt;histogram aggregation&lt;/code&gt; sorts the buckets based on the "key" values as well. &lt;/p&gt;

&lt;p&gt;By default, the &lt;code&gt;histogram aggregation&lt;/code&gt; sorts buckets based on the "key" values in ascending order. &lt;/p&gt;

&lt;p&gt;But what if we wanted to sort in descending order?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_per_price_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;order&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;_key&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;desc&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;All you have to do is add the &lt;code&gt;order&lt;/code&gt; parameter as shown above. Then, specify that you want to sort by "_key" values in descending ("desc") order.&lt;/p&gt;

&lt;p&gt;This way, the highest price interval is listed first! &lt;/p&gt;
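&lt;p&gt;As a quick local illustration (the buckets below are invented for the example), ordering by "_key": "desc" simply reverses the default ascending key sort:&lt;/p&gt;

```python
# Hypothetical buckets as Elasticsearch would return them by default
# (ascending "key" order).
buckets = [
    {"key": 0, "doc_count": 400000},
    {"key": 10, "doc_count": 20000},
    {"key": 20, "doc_count": 1500},
]

# "order": {"_key": "desc"} is equivalent to sorting by key, descending.
descending = sorted(buckets, key=lambda b: b["key"], reverse=True)
print([b["key"] for b in descending])  # → [20, 10, 0]
```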

&lt;p&gt;Copy and paste the example into the console and send the request! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
You will see that the buckets are now sorted to return the price intervals in descending order. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcuxwc1tcwrpvi51d0lg7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcuxwc1tcwrpvi51d0lg7.png" alt="image" width="800" height="1219"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Range Aggregation
&lt;/h3&gt;

&lt;p&gt;The &lt;code&gt;range aggregation&lt;/code&gt; is similar to the &lt;code&gt;histogram aggregation&lt;/code&gt; in that it can create buckets based on any numerical interval. &lt;/p&gt;

&lt;p&gt;The difference is that the &lt;code&gt;range aggregation&lt;/code&gt; allows you to define intervals of varying sizes so you can customize it to your use case.  &lt;/p&gt;

&lt;p&gt;For example, what if you wanted to know the number of transactions for items in varying price ranges (between $0 and $50, between $50 and $200, and $200 and up)? &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
   &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;range&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ranges&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;to&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;x&lt;/span&gt;
          &lt;span class="p"&gt;},&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;from&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;x&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;to&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;y&lt;/span&gt;
          &lt;span class="p"&gt;},&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;from&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;z&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;]&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following example is similar to the last request. &lt;br&gt;
&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_per_custom_price_ranges&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;range&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;UnitPrice&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ranges&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;to&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;50&lt;/span&gt;
          &lt;span class="p"&gt;},&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;from&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;to&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;
          &lt;span class="p"&gt;},&lt;/span&gt;
          &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;from&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;]&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The difference is that we name the aggregation "transactions_per_custom_price_ranges". &lt;/p&gt;

&lt;p&gt;We run the &lt;code&gt;range aggregation&lt;/code&gt; on the field "UnitPrice". Then, we provide the custom ranges (0 to 50, 50 to 200, and 200 and up). &lt;/p&gt;
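&lt;p&gt;In a &lt;code&gt;range aggregation&lt;/code&gt;, "from" is inclusive and "to" is exclusive. A small Python sketch (with made-up prices; the "key" strings are simplified, not Elasticsearch's exact format) shows how documents fall into the custom ranges:&lt;/p&gt;

```python
def range_buckets(values, ranges):
    """Count values per custom range; "from" is inclusive and "to" is
    exclusive, matching the range aggregation's semantics."""
    buckets = []
    for r in ranges:
        lo = r.get("from", float("-inf"))
        hi = r.get("to", float("inf"))
        count = sum(1 for v in values if lo <= v < hi)
        key = f"{r.get('from', '*')}-{r.get('to', '*')}"
        buckets.append({"key": key, "doc_count": count})
    return buckets

# Hypothetical unit prices, standing in for the UnitPrice field.
prices = [4.99, 49.99, 50.0, 120.0, 200.0, 350.0]
ranges = [{"to": 50}, {"from": 50, "to": 200}, {"from": 200}]
print(range_buckets(prices, ranges))
```

Note how 50.0 falls into the 50-200 bucket rather than the first one, because "to" is exclusive.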

&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns an array of buckets where each bucket represents a customized price interval ("key"). It also includes the number of documents ("doc_count") placed in each bucket. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj9zcfli4w8yh8t09x4j4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj9zcfli4w8yh8t09x4j4.png" alt="image" width="800" height="846"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We see that over 400,000 transactions have occurred for items priced between 0 and 50. &lt;/p&gt;

&lt;p&gt;855 transactions have occurred for items priced between 50 and 200.&lt;/p&gt;

&lt;p&gt;307 transactions have occurred for items priced at 200 and up. &lt;/p&gt;

&lt;p&gt;At this point you might be wondering if you can sort the &lt;code&gt;range aggregation&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Bucket sorting for range aggregation&lt;/strong&gt;&lt;br&gt;
The &lt;code&gt;range aggregation&lt;/code&gt; is sorted based on the input ranges you specify and it cannot be sorted any other way! &lt;/p&gt;
&lt;h3&gt;
  
  
  Terms Aggregation
&lt;/h3&gt;

&lt;p&gt;The &lt;code&gt;terms aggregation&lt;/code&gt; creates a new bucket for every unique term it encounters in the specified field. It is often used to find the most frequently occurring terms in a specified field across documents. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Syntax:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;Enter_name_of_the_index_here&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name your aggregations here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;terms&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Name the field you want to aggregate on here&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;State&lt;/span&gt; &lt;span class="nx"&gt;how&lt;/span&gt; &lt;span class="nx"&gt;many&lt;/span&gt; &lt;span class="nx"&gt;top&lt;/span&gt; &lt;span class="nx"&gt;results&lt;/span&gt; &lt;span class="nx"&gt;you&lt;/span&gt; &lt;span class="nx"&gt;want&lt;/span&gt; &lt;span class="nx"&gt;returned&lt;/span&gt; &lt;span class="nx"&gt;here&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For example, let's say you want to identify the 5 customers with the highest number of transactions (documents). &lt;/p&gt;

&lt;p&gt;Each document is a transaction and each transaction includes a customer ID. &lt;/p&gt;

&lt;p&gt;Therefore, if we find the 5 most frequently occurring customer IDs, we will find our top 5 customers. &lt;/p&gt;
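&lt;p&gt;Conceptually, this is the same as counting term frequencies and keeping the top 5. A minimal Python sketch with invented customer IDs:&lt;/p&gt;

```python
from collections import Counter

# Hypothetical CustomerID values, one per transaction document.
customer_ids = ["C1", "C2", "C1", "C3", "C1", "C2", "C4", "C5", "C6", "C1"]

# A terms aggregation with "size": 5 behaves like most_common(5):
# one bucket per unique term, sorted by doc_count in descending order.
top_5 = [{"key": k, "doc_count": c}
         for k, c in Counter(customer_ids).most_common(5)]
print(top_5[0])  # → {'key': 'C1', 'doc_count': 4}
```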

&lt;p&gt;To do this, we send the following example request. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;top_5_customers&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;terms&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;CustomerID&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We name the &lt;code&gt;aggregations&lt;/code&gt; "top_5_customers". We specify that we want to perform a &lt;code&gt;terms aggregation&lt;/code&gt; on the field "CustomerID". &lt;/p&gt;

&lt;p&gt;Since we only want the top 5 customers, we set the &lt;code&gt;size&lt;/code&gt; parameter within the &lt;code&gt;terms aggregation&lt;/code&gt; to 5. &lt;/p&gt;

&lt;p&gt;Copy and paste the example into the Kibana console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch will return the 5 customer IDs ("key") with the highest number of transactions ("doc_count"). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgtlvu4xm2jdk6rzlugku.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgtlvu4xm2jdk6rzlugku.png" alt="image" width="800" height="995"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Bucket sorting for terms aggregation&lt;/strong&gt;&lt;br&gt;
By default, the &lt;code&gt;terms aggregation&lt;/code&gt; sorts buckets based on the "doc_count"&lt;br&gt;
values in descending order. &lt;/p&gt;

&lt;p&gt;But what if you want to sort the results in ascending order? &lt;/p&gt;

&lt;p&gt;You would send the following request!&lt;br&gt;
&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;5_customers_with_lowest_number_of_transactions&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;terms&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;CustomerID&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;order&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;_count&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;asc&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When we ask Elasticsearch to sort the buckets in ascending order, it will display the customer IDs with the lowest number of documents. In other words, the customers with the lowest number of transactions.&lt;/p&gt;

&lt;p&gt;To reflect this, we name our &lt;code&gt;aggregations&lt;/code&gt; "5_customers_with_lowest_number_of_transactions". Then, we instruct Elasticsearch to perform a "terms" aggregation on the field "CustomerID" and to return 5 buckets ("size": 5). &lt;/p&gt;

&lt;p&gt;Then, we add an "order" parameter to the "terms" aggregation and specify that we want to sort buckets based on the "_count" values in ascending("asc") order! &lt;/p&gt;
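&lt;p&gt;Locally, flipping the order amounts to sorting the same counts in ascending order (again with invented customer IDs):&lt;/p&gt;

```python
from collections import Counter

# Hypothetical CustomerID values, one per transaction document.
customer_ids = ["C1", "C1", "C1", "C2", "C2", "C3"]

# "order": {"_count": "asc"} puts buckets with the fewest documents first.
ascending = [{"key": k, "doc_count": c}
             for k, c in sorted(Counter(customer_ids).items(),
                                key=lambda kv: kv[1])][:5]
print(ascending)
# → [{'key': 'C3', 'doc_count': 1}, {'key': 'C2', 'doc_count': 2},
#    {'key': 'C1', 'doc_count': 3}]
```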

&lt;p&gt;Copy and paste this example into the Kibana console and send the request! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
You will see that the buckets are now sorted in ascending order of "doc_count", showing buckets with the lowest "doc_count" first.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvek622fafejw141te7sl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvek622fafejw141te7sl.png" alt="image" width="800" height="873"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Combined Aggregations
&lt;/h3&gt;

&lt;p&gt;So far, we have run &lt;code&gt;metric aggregations&lt;/code&gt; or &lt;code&gt;bucket aggregations&lt;/code&gt; to answer simple questions. &lt;/p&gt;

&lt;p&gt;There will be times when we will ask more complex questions that require running combinations of these &lt;code&gt;aggregations&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;For example, let's say we wanted to know the sum of revenue per day. &lt;/p&gt;

&lt;p&gt;To get the answer, we need to first split our data into daily buckets (&lt;code&gt;date_histogram aggregation&lt;/code&gt;). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3mcqa7qs9pi7bosopwjr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3mcqa7qs9pi7bosopwjr.png" alt="image" width="800" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Within each bucket, we need to perform &lt;code&gt;metric aggregations&lt;/code&gt; to calculate the daily revenue.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnsb440bhvz9o16yo40dg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnsb440bhvz9o16yo40dg.png" alt="image" width="800" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The combined &lt;code&gt;aggregations&lt;/code&gt; request looks like the following.&lt;/p&gt;
&lt;h4&gt;
  
  
  Calculate the daily revenue
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm4nh1l2zkq2rpcbeodvf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm4nh1l2zkq2rpcbeodvf.png" alt="image" width="800" height="581"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's break this down. &lt;/p&gt;

&lt;p&gt;We let Elasticsearch know that we are sending an aggregations request(1).&lt;/p&gt;

&lt;p&gt;In order to calculate daily revenue, we first need to split documents into daily buckets. So we name this &lt;code&gt;aggregations&lt;/code&gt; request "transactions_per_day"(2). &lt;/p&gt;

&lt;p&gt;Since we are creating buckets based on time intervals, we run a &lt;code&gt;date_histogram aggregation&lt;/code&gt;(3) on the field "InvoiceDate"(4).&lt;/p&gt;

&lt;p&gt;Then, we set the "calendar_interval" to a "day"(5). &lt;/p&gt;

&lt;p&gt;Thus far, we have split the documents into daily buckets. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm4nh1l2zkq2rpcbeodvf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm4nh1l2zkq2rpcbeodvf.png" alt="image" width="800" height="581"&gt;&lt;/a&gt;&lt;br&gt;
Within the "transactions_per_day" aggregations, we create a &lt;code&gt;sub-aggregation&lt;/code&gt;(6) called "daily_revenue"(7). This will calculate the total revenue generated each day.&lt;/p&gt;

&lt;p&gt;You will notice this aggregation looks a little different.&lt;/p&gt;

&lt;p&gt;It uses the "script"(9) to calculate the daily revenue. What is that all about? &lt;/p&gt;

&lt;p&gt;Let’s take a look at the fields of our transaction data again.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fka2eahl2b4opjikx81nf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fka2eahl2b4opjikx81nf.png" alt="image" width="800" height="1366"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When we look at our transaction document (lines 22-31), we see that it lists the "Quantity" (line 24) of an item sold and the "UnitPrice" (line 27) of that item. &lt;/p&gt;

&lt;p&gt;But we do not see the total revenue from this transaction. To get the total revenue, we need to multiply the "Quantity" of the item sold by its "UnitPrice". &lt;/p&gt;

&lt;p&gt;This is where the &lt;code&gt;script&lt;/code&gt;(9) comes in.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff6sgirn12as678e3ng1m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff6sgirn12as678e3ng1m.png" alt="image" width="800" height="582"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A &lt;code&gt;script&lt;/code&gt; is used to compute a value dynamically in Elasticsearch, rather than reading it from a stored field.  &lt;/p&gt;

&lt;p&gt;In our context, it is used to dynamically calculate the total revenue per transaction.&lt;/p&gt;

&lt;p&gt;In the "source" field(10), we instruct Elasticsearch that for each document(doc) in the daily bucket, get the value of the field "UnitPrice" and multiply that by the value of the field "Quantity". &lt;/p&gt;

&lt;p&gt;That will calculate the total revenue of each transaction in our bucket. Then, we tell Elasticsearch to "sum"(8) up the total revenue from all the transactions in our bucket to calculate the daily_revenue(7).&lt;/p&gt;

&lt;p&gt;Copy and paste the following example into the Kibana console and send the request! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_per_day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;calendar_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;daily_revenue&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sum&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;script&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
              &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;doc['UnitPrice'].value * doc['Quantity'].value&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected Response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns an array of daily buckets. &lt;/p&gt;

&lt;p&gt;Within each bucket, it shows the number of documents ("doc_count") as well as the revenue generated that day ("daily_revenue"). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbw375wwkoascclnfj7er.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbw375wwkoascclnfj7er.png" alt="image" width="800" height="1053"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h4&gt;
  
  
  Calculating multiple metrics per bucket
&lt;/h4&gt;

&lt;p&gt;You can also calculate multiple metrics per bucket. &lt;/p&gt;

&lt;p&gt;For example, let's say you wanted to calculate the daily revenue and the number of unique customers per day in one go. To do this, you can add multiple &lt;code&gt;metric aggregations&lt;/code&gt; per bucket as shown below!  &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fftx0opc989wti08ng0bp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fftx0opc989wti08ng0bp.png" alt="image" width="800" height="731"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The request above is almost identical to the last request, except that we added an additional &lt;code&gt;metric aggregation&lt;/code&gt; called "number_of_unique_customers_per_day" (2) to the sub-aggregations (1). &lt;/p&gt;

&lt;p&gt;To calculate this value, we instruct Elasticsearch to perform a &lt;code&gt;cardinality aggregation&lt;/code&gt; (3) on the field "CustomerID" (4). &lt;/p&gt;

&lt;p&gt;Copy and paste the following example into the Kibana console and send the request! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_per_day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;calendar_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;daily_revenue&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sum&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;script&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
              &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;doc['UnitPrice'].value * doc['Quantity'].value&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;number_of_unique_customers_per_day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;cardinality&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;CustomerID&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected Response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
Elasticsearch returns an array of daily buckets. &lt;/p&gt;

&lt;p&gt;Within each bucket, you will see that the "number_of_unique_customers_per_day" and the "daily_revenue" have been calculated for each day!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkuqx1lewiwz1y98zfp3s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkuqx1lewiwz1y98zfp3s.png" alt="image" width="800" height="1150"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;p&gt;&lt;strong&gt;Sorting by the metric value of a sub-aggregation&lt;/strong&gt;&lt;br&gt;
You do not always need to sort by time interval, numerical interval, or doc_count! You can also sort by the metric value of a &lt;code&gt;sub-aggregation&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;Let's take a look at the request below. Within the &lt;code&gt;sub-aggregations&lt;/code&gt;, the metric values "daily_revenue" (3) and "number_of_unique_customers_per_day" (4) are calculated. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr1ah1j9isuvpa27iwu4h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr1ah1j9isuvpa27iwu4h.png" alt="image" width="800" height="780"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's say you wanted to find which day had the highest daily revenue to date!&lt;/p&gt;

&lt;p&gt;All you need to do is add the "order" parameter (1) and sort the buckets by the metric value of "daily_revenue" in descending ("desc") order (2)! &lt;/p&gt;

&lt;p&gt;Copy and paste the following example into the console and send the request!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;GET&lt;/span&gt; &lt;span class="nx"&gt;ecommerce_data&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="nx"&gt;_search&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;size&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;transactions_per_day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;date_histogram&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;InvoiceDate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;calendar_interval&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;order&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;daily_revenue&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;desc&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;aggs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;daily_revenue&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sum&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;script&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
              &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;source&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;doc['UnitPrice'].value * doc['Quantity'].value&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;number_of_unique_customers_per_day&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;cardinality&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;field&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;CustomerID&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Expected response from Elasticsearch:&lt;/strong&gt;&lt;br&gt;
You will see that the response is no longer sorted by date. The buckets are now sorted to return the highest daily revenue first! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2hx5xv7lu0fku6w6qwlt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2hx5xv7lu0fku6w6qwlt.png" alt="image" width="800" height="1095"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Great job! You have mastered the basics of running &lt;code&gt;aggregation&lt;/code&gt; requests with Elasticsearch and Kibana. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F46oeh0n9a8ep51nz9f99.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F46oeh0n9a8ep51nz9f99.gif" alt="image" width="539" height="270"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Try to run various &lt;code&gt;metric aggregations&lt;/code&gt;, &lt;code&gt;bucket aggregations&lt;/code&gt;, and &lt;code&gt;combined aggregations&lt;/code&gt; on your own and see what type of insights you can find! &lt;/p&gt;

</description>
      <category>beginners</category>
      <category>elasticsearch</category>
      <category>datascience</category>
      <category>database</category>
    </item>
    <item>
      <title>Elastic Anomaly Detection - Learning Process and Anomaly Score</title>
      <dc:creator>Priscilla Parodi</dc:creator>
      <pubDate>Mon, 02 Aug 2021 14:39:44 +0000</pubDate>
      <link>https://dev.to/elastic/elastic-anomaly-detection-learning-process-and-anomaly-score-3nl7</link>
      <guid>https://dev.to/elastic/elastic-anomaly-detection-learning-process-and-anomaly-score-3nl7</guid>
      <description>&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; | &lt;a href="https://dev.to/priscilla_parodi/elastic-anomaly-detection-categorization-5cdg"&gt;Next Post: Elastic Anomaly Detection - Categorization&lt;/a&gt; |&lt;/p&gt;

&lt;p&gt;As the name suggests, the algorithm needs to identify anomalies in the data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;But how does the model identify anomalies?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;How do &lt;strong&gt;we&lt;/strong&gt; identify anomalies? &lt;/p&gt;

&lt;p&gt;For example, consider the image below.&lt;/p&gt;

&lt;p&gt;What is abnormal in this image?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7p8u1jgvnezedq5mtqbx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7p8u1jgvnezedq5mtqbx.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;What if I add something to this image? Now, considering the updated image below, what is abnormal?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxuje9mwxjmc1n379t5re.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxuje9mwxjmc1n379t5re.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It was probably easier with the second image: the cat is not a dog, so for most people the cat is the anomaly. In this process, you are identifying patterns.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Identifying patterns&lt;/strong&gt; is an essential part of our learning process, but the answers are not necessarily obvious. You know what a cat is and what a dog is not from the pictures I showed you (I never told you this), but because you learned it during your life.&lt;/p&gt;

&lt;p&gt;We must always remember that the algorithms will only process the data that you choose to share.&lt;/p&gt;

&lt;p&gt;In the case of a child who is still learning the difference between a dog and a cow, for example, we might receive the answer that all animals in the image belong to the same category: animals. This answer is not incorrect; it simply applies different criteria based on similar characteristics observed in the available data.&lt;/p&gt;

&lt;p&gt;If we are seeking a more specific answer, considering all possible details, variables, and behavior, we need to ensure that all data that could contribute to the answer is analyzed over time. The more data we have, the better our understanding will be.&lt;/p&gt;

&lt;p&gt;In the case of a child, for them to identify the cat as 'abnormal', they would need more examples; more data would need to be “analyzed” over time. &lt;strong&gt;The conclusion is the same for the algorithms.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Based on this information, you already know that the question 'What is abnormal?' is answered by taking into account what is considered normal (which can vary), and to determine what is normal, &lt;strong&gt;the algorithm identifies patterns over time.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;There are multiple types of Anomaly Detection analyses available in Elastic's ML solution, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Single Metric analysis&lt;/strong&gt;, for jobs that analyze a single time series;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Multi-Metric analysis&lt;/strong&gt;, to split a single time series into multiple time series;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Population analysis&lt;/strong&gt;, to identify abnormal behaviors in a homogeneous "population" over a period of time;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://dev.to/priscilla_parodi/elastic-anomaly-detection-categorization-5cdg"&gt;&lt;strong&gt;Categorization analysis&lt;/strong&gt;&lt;/a&gt;, which is a machine learning process that tokenizes a text field, clusters similar data together, and classifies it into categories;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The Anomaly Detection feature analyzes the input stream of data, models its behavior using techniques to construct a model that best matches your data, and performs analysis based on the detectors you defined in your job, considering possible rules and dates you want to ignore or disqualify from being modeled.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpx5hjvkc066r639nivt5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpx5hjvkc066r639nivt5.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The blue line in the chart represents the actual data values, while the shaded blue area represents the bounds for the expected values. Initially, the range of expected values is wide due to a limited amount of data in the analyzed time period. Consequently, the model fails to capture the periodicity in the data.&lt;/p&gt;

&lt;p&gt;After processing more data, a model is built with coefficients that result in expected values close to the actual values. This leads to the shaded blue area being close to the blue line. By comparing the values to this area, we can determine if they fall outside of it and monitor the anomaly score to indicate the severity of potential anomalies.&lt;/p&gt;

&lt;h3&gt;
  
  
  Anomaly Score
&lt;/h3&gt;

&lt;p&gt;The &lt;strong&gt;anomaly score&lt;/strong&gt; (severity) is a value from 0 to 100, which indicates the significance of the observed anomaly compared to previously seen anomalies. Highly anomalous values are shown in red.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqxsrhv1xaavfigaayosh.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqxsrhv1xaavfigaayosh.jpg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F064rrp0bs9v32pjn4uzb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F064rrp0bs9v32pjn4uzb.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In order to provide a sensible view of the results, an anomaly score is calculated for each &lt;strong&gt;bucket time interval&lt;/strong&gt; (we use the concept of a bucket to divide up a continuous stream of data into batches, between 10 minutes and 1 hour, for processing).&lt;/p&gt;
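&lt;p&gt;Elastic's actual scoring is based on probabilistic modeling of the data, not a fixed formula. Purely to build intuition, here is a toy Python sketch of the general idea: measure how far a bucket's value falls from what the model expects, and map that distance onto a 0-100 severity scale.&lt;/p&gt;

```python
import statistics

# Hypothetical historical values of one metric, one value per bucket
history = [10.0, 11.0, 9.5, 10.5, 10.0, 9.8, 10.2]

mean = statistics.mean(history)
stdev = statistics.pstdev(history)

def toy_severity(value, mean, stdev, max_z=10.0):
    """Map distance from the mean (in standard deviations) onto a
    0-100 scale. A toy stand-in for the real anomaly score, which is
    derived from the probability of the observation under the model."""
    z = abs(value - mean) / stdev if stdev else 0.0
    return min(z / max_z, 1.0) * 100.0

print(toy_severity(10.1, mean, stdev))  # low score: within normal range
print(toy_severity(25.0, mean, stdev))  # high score: far outside the bounds
```

&lt;p&gt;In the real feature, the expected bounds and the score come from the model's probability estimates, and the score is normalized against previously seen anomalies rather than a fixed cap.&lt;/p&gt;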

&lt;p&gt;When you review your machine learning results, there is a &lt;code&gt;multi_bucket_impact&lt;/code&gt; property that indicates how strongly the final anomaly score is influenced by multi-bucket analysis; anomalies with medium or high impact on multiple buckets are represented with a cross symbol instead of a circle.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk3oxpfwo22imt26014cg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk3oxpfwo22imt26014cg.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; | &lt;a href="https://dev.to/priscilla_parodi/elastic-anomaly-detection-categorization-5cdg"&gt;Next Post: Elastic Anomaly Detection - Categorization&lt;/a&gt; |&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;This post is part of a series that covers &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Artificial Intelligence with a focus on Elastic's (Creators of Elasticsearch) Machine Learning solution&lt;/a&gt;, aiming to introduce and exemplify the possibilities and options available, in addition to addressing the context and usability.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>machinelearning</category>
      <category>elasticsearch</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Elastic Anomaly Detection - Categorization</title>
      <dc:creator>Priscilla Parodi</dc:creator>
      <pubDate>Mon, 02 Aug 2021 14:39:32 +0000</pubDate>
      <link>https://dev.to/elastic/elastic-anomaly-detection-categorization-5cdg</link>
      <guid>https://dev.to/elastic/elastic-anomaly-detection-categorization-5cdg</guid>
      <description>&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; | &lt;a href="https://dev.to/priscilla_parodi/elastic-anomaly-detection-and-data-visualizer-handson-3c2j"&gt;Next Post: Elastic Anomaly Detection and Data Visualizer HandsOn&lt;/a&gt;|&lt;/p&gt;

&lt;p&gt;For categorization analysis, the learning process &lt;a href="https://dev.to/priscilla_parodi/elastic-anomaly-detection-learning-process-and-anomaly-score-3nl7"&gt;is the same&lt;/a&gt;, but there are other steps to process the text.&lt;/p&gt;

&lt;p&gt;The input data must be a text field, typically one containing repeated elements such as log messages: categorization is not natural language processing &lt;a href="https://en.wikipedia.org/wiki/Natural_language_processing" rel="noopener noreferrer"&gt;(NLP)&lt;/a&gt;, and it works best on machine-written messages.&lt;/p&gt;

&lt;p&gt;When you create a categorization anomaly detection job, the machine learning model processes the input text into different categories, identifying patterns over time, as you can see in this example:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Input text&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Log message:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Jul 20 15:02:19 localhost sshd[8903]: Invalid user admin from 58.218.92.41 port 26062
Jul 20 15:02:19 localhost sshd[8903]: input_userauth_request: invalid user admin [preauth]
Jul 20 15:02:20 localhost sshd[8903]: Connection closed by 58.218.92.41 port 26062 [preauth]
Jul 20 17:10:23 localhost sshd[2074]: Received disconnect from 41.43.112.199 port 41805:11: disconnected by user
Jul 20 17:10:23 localhost sshd[2074]: Disconnected from 41.43.112.199 port 26062
Jul 20 17:10:23 localhost sshd[2072]: pam_unix (sshd:session): session closed for user ec2-user
Jul 20 19:14:55 localhost sshd[8944]: pam_unix (sshd:session): session closed for user ec2-user by (uid=0)
Jul 20 19:17:22 localhost runner: pam_unix(runuser-1:session): session closed for user ec2-user 
Jul 20 19:17:22 localhost runner: pam_unix(runuser-1:session): session opened for user ec2-user by (uid=0)
Jul 20 19:17:23 localhost runner: pam_unix(runuser-1:session): session closed for user ec2-user 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step 1 - Remove mutable text&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Mutable parts of the text are discarded so that values that change constantly, such as dates and times, do not produce spurious categories or anomalies where there is no relevance.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;localhost sshd: Invalid user from port
localhost sshd: input_userauth_request: invalid user [preauth]
localhost sshd: Connection closed by port [preauth]
localhost sshd: Received disconnect from port disconnected by user
localhost sshd: Disconnected from port
localhost sshd: pam_unix session: session closed for user ec2-user
localhost sshd[8944]: pam_unix session: session closed for user ec2-user by (uid=0)
localhost runner: pam_unix session: session closed for user ec2-user 
localhost runner: pam_unix session: session opened for user ec2-user by (uid=0)
localhost runner: pam_unix session: session closed for user ec2-user 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
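&lt;p&gt;Step 1 can be loosely approximated with a few regular expressions. This is only a sketch, not the actual categorization tokenizer; for instance, it keeps the user name &lt;code&gt;admin&lt;/code&gt;, which the real categorizer also treats as mutable text:&lt;/p&gt;

```python
import re

# Patterns for obviously mutable tokens: timestamps, IPs, pids, numbers.
MUTABLE = (
    (re.compile(r"^[A-Z][a-z]{2} +\d+ \d{2}:\d{2}:\d{2} "), ""),  # syslog timestamp
    (re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b"), ""),             # IPv4 address
    (re.compile(r"\[\d+\]"), ""),                                  # pid, e.g. [8903]
    (re.compile(r"\b\d+\b"), ""),                                  # bare numbers (ports)
)

def strip_mutable(line):
    """Remove mutable tokens and collapse the leftover whitespace."""
    for pattern, repl in MUTABLE:
        line = pattern.sub(repl, line)
    return re.sub(r"\s+", " ", line).strip()

msg = "Jul 20 15:02:19 localhost sshd[8903]: Invalid user admin from 58.218.92.41 port 26062"
print(strip_mutable(msg))
```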



&lt;p&gt;&lt;strong&gt;Step 2 - Cluster similar messages together&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A category can correspond to a single line or to several lines that belong to the same task, for example, and that follow the same pattern.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;-&amp;gt;mlcategory:1&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;localhost sshd: Invalid user from port&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;-&amp;gt;mlcategory:2&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;localhost sshd: input_userauth_request: invalid user [preauth]&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;-&amp;gt;mlcategory:3&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;localhost sshd: Connection closed by port [preauth]&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;-&amp;gt;mlcategory:4&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;localhost sshd: Received disconnect from port disconnected by user&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;-&amp;gt;mlcategory:5&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;localhost sshd: Disconnected from port&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;-&amp;gt;mlcategory:6&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;localhost sshd: pam_unix session: session closed for user ec2-user&lt;br&gt;
localhost sshd[8944]: pam_unix session: session closed for user ec2-user by (uid=0)&lt;br&gt;
localhost runner: pam_unix session: session closed for user ec2-user &lt;br&gt;
localhost runner: pam_unix session: session opened for user ec2-user by (uid=0)&lt;br&gt;
localhost runner: pam_unix session: session closed for user ec2-user&lt;/code&gt;&lt;/p&gt;
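&lt;p&gt;Step 2 can be sketched as grouping the stripped messages and assigning an incrementing category id to each distinct template. The real categorizer clusters by similarity rather than exact equality, so treat this as a simplification:&lt;/p&gt;

```python
def categorize(messages):
    """Assign an incrementing category id to each distinct message template."""
    categories = {}    # maps template to category id
    assignments = []   # category id per input message, in order
    for template in messages:
        if template not in categories:
            categories[template] = len(categories) + 1
        assignments.append(categories[template])
    return categories, assignments

templates = [
    "localhost sshd: Invalid user from port",
    "localhost sshd: input_userauth_request: invalid user [preauth]",
    "localhost sshd: Invalid user from port",
]
categories, assignments = categorize(templates)
```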

&lt;p&gt;&lt;strong&gt;Step 3 - Count per time bucket&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;By counting the occurrences of each category per time bucket, the behavior of each cluster can be more easily checked for anomalies.&lt;/p&gt;

&lt;p&gt;In the image below you can see an example of how each ML category behaves over time, ready for further time-bucket analysis:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2n0fblbgzsda6nnvv3sv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2n0fblbgzsda6nnvv3sv.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As an example, at a specific time bucket, we could see an mlcategory:1 followed by an mlcategory:4, twice:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;mlcategory:1 -&amp;gt; mlcategory:4 -&amp;gt; mlcategory:1 -&amp;gt; mlcategory:4&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;For reference, we could call this bucket 1, the next one bucket 2, and so on.&lt;/p&gt;
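&lt;p&gt;Step 3 then reduces to counting how often each category occurs in each bucket; the observations below echo the hypothetical sequence just described:&lt;/p&gt;

```python
from collections import Counter

# (bucket id, ml category) observations: bucket 1 sees the
# category 1 -> category 4 sequence twice, bucket 2 sees category 1 once.
observations = [(1, 1), (1, 4), (1, 1), (1, 4), (2, 1)]

# Count how often each category occurs in each time bucket; these per-bucket
# counts are what the model then scores for anomalies.
per_bucket = Counter(observations)
```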

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzml0mskmfvr95y8dudjy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzml0mskmfvr95y8dudjy.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; | &lt;a href="https://dev.to/priscilla_parodi/elastic-anomaly-detection-and-data-visualizer-handson-3c2j"&gt;Next Post: Elastic Anomaly Detection and Data Visualizer HandsOn&lt;/a&gt;|&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;This post is part of a series that covers &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Artificial Intelligence with a focus on Elastic's (Creators of Elasticsearch) Machine Learning solution&lt;/a&gt;, aiming to introduce and exemplify the possibilities and options available, in addition to addressing the context and usability.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>machinelearning</category>
      <category>elasticsearch</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Elastic Anomaly Detection and Data Visualizer HandsOn</title>
      <dc:creator>Priscilla Parodi</dc:creator>
      <pubDate>Mon, 02 Aug 2021 14:39:16 +0000</pubDate>
      <link>https://dev.to/elastic/elastic-anomaly-detection-and-data-visualizer-handson-3c2j</link>
      <guid>https://dev.to/elastic/elastic-anomaly-detection-and-data-visualizer-handson-3c2j</guid>
      <description>&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; | &lt;a href="https://dev.to/priscilla_parodi/elastic-data-frame-outlier-detection-16ni"&gt;Next Post: Elastic Data Frame - Outlier Detection&lt;/a&gt; |&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Note: This HandsOn assumes that you have already followed the step-by-step Setup of your Elastic Cloud account and added the Samples available there to replicate the analysis mentioned here. If not, please, &lt;a href="https://dev.to/priscilla_parodi/handson-setup-elastic-cloud-4j8p"&gt;follow the steps mentioned there&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;We've added a &lt;a href="https://dev.to/priscilla_parodi/handson-setup-elastic-cloud-4j8p"&gt;Sample&lt;/a&gt; containing data you don't know anything about.&lt;/p&gt;

&lt;p&gt;As mentioned here, it is very &lt;strong&gt;important&lt;/strong&gt; to know the data and type of data we have in order to know what &lt;strong&gt;kind of analysis we can do.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;So, before proceeding with Anomaly Detection we can, for example, use &lt;strong&gt;Data Visualizer&lt;/strong&gt; &lt;code&gt;Kibana&amp;gt;Machine Learning&amp;gt;Data visualizer&lt;/code&gt; to &lt;strong&gt;understand more&lt;/strong&gt; about the available data. &lt;/p&gt;

&lt;p&gt;In a real use case you might prefer to change some fields, mappings, or wrong/empty data, but here we will use the sample data exactly as it is.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxrr9gk0fd0gt17emd6x7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxrr9gk0fd0gt17emd6x7.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's select the Index Pattern &lt;code&gt;[eCommerce] Orders&lt;/code&gt;. When the page loads you usually won't see any data, because the initial time interval (15 minutes) is too short for this dataset, which is not continuously updated. To access all available data, click &lt;code&gt;Use full kibana_sample_data_ecommerce_data&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7zmjks6vvtg7h3123rc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7zmjks6vvtg7h3123rc.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we have some important information: the data type (text/keyword/geo_point/number...), the percentage of documents containing each field, distinct values, distributions, and the possibility to visualize data in a graph via &lt;code&gt;Actions&lt;/code&gt;, something we won't cover at this point. This can help us with the most important part before analyzing data: What answer don't we have? What is this data not telling us that would be important to know, based on the needs that I/my company have?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi5v5kxvavanrxltc0qn0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi5v5kxvavanrxltc0qn0.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's pretend we are an ecommerce owner who wants to analyze our data. The first thing we notice is that we only have one month of data, &lt;code&gt;May 26, 2021 -&amp;gt; Jun 26, 2021&lt;/code&gt;, which may not be enough for a complete analysis, because holidays and events can momentarily change customer behavior.&lt;/p&gt;

&lt;p&gt;Maybe it's our first month with the company. We could do a full analysis after a few years, possibly mapping and adding rules to skip holidays and momentary events, but for now we want an analysis that is useful at this point, even with clearly limited possibilities.&lt;/p&gt;

&lt;p&gt;Something that makes sense is to understand who our customers are: in just one month we already have a total of 4675 events with 3321 distinct values for &lt;code&gt;customer_full_name.keyword&lt;/code&gt;, which means we have a good number of unique customers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0atcakxl1feiexvt3m5o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0atcakxl1feiexvt3m5o.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Another thing that stands out is that these customers come from different continents and countries: we are a global ecommerce company. But how does this &lt;strong&gt;population&lt;/strong&gt; spend their money? Do they &lt;strong&gt;behave similarly&lt;/strong&gt;?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5d3ienbf63bo7mit67xb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5d3ienbf63bo7mit67xb.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's run an Anomaly Detection analysis: &lt;code&gt;Machine Learning&amp;gt;Anomaly Detection&amp;gt;Create job&lt;/code&gt; and select &lt;code&gt;[eCommerce] Orders&lt;/code&gt; then, click &lt;code&gt;Use a Wizard&amp;gt;Population&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Again, let’s use the full sample, click on &lt;code&gt;Use full kibana_sample_data_ecommerce_data&amp;gt;Next&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;To define a population we need a relationship between the data: not necessarily the same value for every document, but at least one common field that characterizes the data as belonging to the same population.&lt;/p&gt;

&lt;p&gt;For the Population field, let's add some location data: the &lt;code&gt;geoip.region_name&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Now, considering a population made up of groups from different regions, we want to know how these people spend their money. So, as the metric for identifying abnormal behavior, let's add the sum of &lt;code&gt;taxful_total_price&lt;/code&gt;, to check whether there is abnormal behavior in the total amount spent over time.&lt;/p&gt;
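&lt;p&gt;The same population job can also be created through the anomaly detection APIs instead of the wizard. The sketch below builds the request body you would PUT to &lt;code&gt;_ml/anomaly_detectors/pop_price_region&lt;/code&gt;; the field names come from the eCommerce sample, and the &lt;code&gt;order_date&lt;/code&gt; time field is an assumption you should check against your own mapping:&lt;/p&gt;

```python
import json

# Request body for: PUT _ml/anomaly_detectors/pop_price_region
# Field names (taxful_total_price, geoip.region_name, order_date) are taken
# from the [eCommerce] Orders sample; verify them against your own index.
job_config = {
    "description": "Sum of taxful_total_price over region population",
    "analysis_config": {
        "bucket_span": "3h",
        "detectors": [
            {
                "function": "sum",
                "field_name": "taxful_total_price",
                "over_field_name": "geoip.region_name",  # defines the population
            }
        ],
        "influencers": ["geoip.region_name"],
    },
    "data_description": {"time_field": "order_date"},
}

print(json.dumps(job_config, indent=2))
```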

&lt;p&gt;Your screen should look like this image:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4d75uvfw52g1vhhf1vkj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4d75uvfw52g1vhhf1vkj.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the left side we can also set the Bucket Span. The default is usually 15 minutes, but you can click &lt;code&gt;Estimate bucket span&lt;/code&gt; to automatically set a good interval for time series analysis based on your data. In my case the estimated span was 3h.&lt;/p&gt;

&lt;p&gt;On the right side you will see the &lt;code&gt;Influencers&lt;/code&gt;, where you can track the influence of other fields on the result; as you can see, the region will already be there.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe6a3yagdrnniz0ir5nkh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe6a3yagdrnniz0ir5nkh.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click Next and then define a name for the Job ID; I used &lt;code&gt;pop_price_region&lt;/code&gt;. At this time we are not going to add additional or advanced settings. After that, click Next and your webpage should look like this image:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc0d8zycoo74jgmarm9hw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc0d8zycoo74jgmarm9hw.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click Next one more time if your webpage looks like this, otherwise check the error message. Finally, click Create Job. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fej0tn6jcy87wo1vhv5cf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fej0tn6jcy87wo1vhv5cf.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After loading, click View Results. In this case we don't want the job running in real time; if you did, you could start it that way.&lt;/p&gt;

&lt;p&gt;A new page will load and, as you can see, we don't have any Anomaly Score &amp;gt; 75, which means there are no high-severity events, but we do have two anomalies &amp;gt; 50, shown in orange.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9n65ghu5sgjrih2816fr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9n65ghu5sgjrih2816fr.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The two events with &lt;code&gt;severity&amp;gt;50&lt;/code&gt;, which take into account the population rather than a single metric, came from New York on June 17th, 2021 (Current: $998.88 / Typical: $117.59, 8x higher, Probability: 0.00148...) and from Cairo Governorate on June 21st, 2021 (Current: $885.97 / Typical: $118.45, 7x higher, Probability: 0.00234). Although this detailed information is useful, remember that the severity value is a &lt;strong&gt;normalized value&lt;/strong&gt; from 0-100 that considers the behavior of the whole population over the analyzed period. A single data point is not necessarily relevant on its own; another purchase could also be 7x higher than typical yet receive a lower score.&lt;/p&gt;
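&lt;p&gt;The "8x" and "7x" figures are simply the ratio between the actual and typical values reported for each anomaly:&lt;/p&gt;

```python
def times_higher(actual, typical):
    """Ratio of the observed value to the modeled typical value."""
    return actual / typical

# Values quoted from the two orange anomalies.
new_york = times_higher(998.88, 117.59)   # about 8.5x
cairo = times_higher(885.97, 118.45)      # about 7.5x
```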

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fetoltwcts99c4fym3lds.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fetoltwcts99c4fym3lds.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you want to leave this job running, as mentioned above, click &lt;code&gt;Machine Learning&amp;gt;Anomaly Detection&amp;gt;(your job)&amp;gt;Start datafeed&lt;/code&gt;. Set the start date, and for the end time select &lt;code&gt;no end time&lt;/code&gt; to search in real time; then click Start.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgw0a8qer5oox8ef538u1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgw0a8qer5oox8ef538u1.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can also &lt;a href="https://www.elastic.co/guide/en/machine-learning/7.13/ml-configuring-alerts.html" rel="noopener noreferrer"&gt;create alerts&lt;/a&gt; based on severity and connect to services like Email, &lt;em&gt;IBM Resilient, Jira, Microsoft Teams, Slack&lt;/em&gt; or even write to an index or create a &lt;em&gt;Webhook&lt;/em&gt; connector. There are also &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/ml-apis.html" rel="noopener noreferrer"&gt;APIs&lt;/a&gt; to perform machine learning anomaly detection activities.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg22h806b0c8h9bewpm2u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg22h806b0c8h9bewpm2u.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; | &lt;a href="https://dev.to/priscilla_parodi/elastic-data-frame-outlier-detection-16ni"&gt;Next Post: Elastic Data Frame - Outlier Detection&lt;/a&gt; |&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;This post is part of a series that covers &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Artificial Intelligence with a focus on Elastic's (Creators of Elasticsearch) Machine Learning solution&lt;/a&gt;, aiming to introduce and exemplify the possibilities and options available, in addition to addressing the context and usability.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>machinelearning</category>
      <category>elasticsearch</category>
      <category>tutorial</category>
      <category>beginners</category>
    </item>
    <item>
      <title>HandsOn Setup - Elastic Cloud</title>
      <dc:creator>Priscilla Parodi</dc:creator>
      <pubDate>Mon, 02 Aug 2021 14:39:00 +0000</pubDate>
      <link>https://dev.to/elastic/handson-setup-elastic-cloud-4j8p</link>
      <guid>https://dev.to/elastic/handson-setup-elastic-cloud-4j8p</guid>
      <description>&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; |&lt;/p&gt;

&lt;p&gt;For HandsOn posts we will use Elastic Cloud. If you don't use Elastic Cloud yet, you can access a 30-day trial via &lt;a href="https://ela.st/latam-community"&gt;this link&lt;/a&gt;. Just add your email to start your free trial, as shown in the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HmqMi_O9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7t21s89n4rac5dyl5u5u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HmqMi_O9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7t21s89n4rac5dyl5u5u.png" alt="Alt Text" width="800" height="329"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After creating your account you will not have any deployment available; to create a new one, click on Create Deployment as in the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JwMdJzUt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mt5pxurjjjbg1vomwz52.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JwMdJzUt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mt5pxurjjjbg1vomwz52.png" alt="Alt Text" width="800" height="172"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Feel free to choose the settings you prefer, but before creating your deployment you need to customize it to add an ML node.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QhcXxS6J--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uv9qaazl3hizr081hlau.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QhcXxS6J--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uv9qaazl3hizr081hlau.png" alt="Alt Text" width="800" height="460"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--FCFiS8Mb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tjrl6p93cyd1ev6bhals.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FCFiS8Mb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tjrl6p93cyd1ev6bhals.png" alt="Alt Text" width="800" height="217"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then you can finally create your deployment and open Kibana. At this point don't worry about the settings beyond that.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hYDJSktD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r2gj6ujprlgin4lyyq0j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hYDJSktD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r2gj6ujprlgin4lyyq0j.png" alt="Alt Text" width="800" height="377"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When Kibana opens you will see a message like the one in image 1 below, suggesting that you start adding data. To replicate the data analysis demonstrated in this series, you will need to add all the available samples.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--iIopDPWG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dkyrk0ra5ir4ymghkjia.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--iIopDPWG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dkyrk0ra5ir4ymghkjia.png" alt="Alt Text" width="800" height="988"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3blQ8Jgs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3afo2dr4fxb2qexkvf1e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3blQ8Jgs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3afo2dr4fxb2qexkvf1e.png" alt="Alt Text" width="800" height="306"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That's it. Now you can proceed with the data analysis examples.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lgmKuMts--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vp2anvl2mix8optzmrch.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lgmKuMts--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vp2anvl2mix8optzmrch.png" alt="Alt Text" width="800" height="753"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://dev.to/priscilla_parodi/elastic-anomaly-detection-and-data-visualizer-handson-3c2j"&gt;Elastic Anomaly Detection and Data Visualizer HandsOn&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://dev.to/priscilla_parodi/elastic-data-frame-classification-analysis-handson-4ilb"&gt;Elastic Data Frame - Classification Analysis HandsOn&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://dev.to/priscilla_parodi/elastic-data-frame-inference-processor-handson-3392"&gt;Elastic Data Frame - Inference Processor HandsOn&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;This post is part of a series that covers &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Artificial Intelligence with a focus on Elastic's (Creators of Elasticsearch, Kibana, Logstash and Beats) Machine Learning solution&lt;/a&gt;, aiming to introduce and exemplify the possibilities and options available, in addition to addressing the context and usability.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>beginners</category>
      <category>tutorial</category>
      <category>elasticsearch</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Elastic Data Frame - Outlier Detection</title>
      <dc:creator>Priscilla Parodi</dc:creator>
      <pubDate>Mon, 02 Aug 2021 14:38:34 +0000</pubDate>
      <link>https://dev.to/elastic/elastic-data-frame-outlier-detection-16ni</link>
      <guid>https://dev.to/elastic/elastic-data-frame-outlier-detection-16ni</guid>
      <description>&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; | &lt;a href="https://dev.to/priscilla_parodi/elastic-data-frame-regression-analysis-2ge2"&gt;Next Post: Elastic Data Frame - Regression Analysis&lt;/a&gt; |&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Unlike the &lt;a href="https://dev.to/priscilla_parodi/elastic-anomaly-detection-learning-process-and-anomaly-score-3nl7"&gt;Anomaly Detection&lt;/a&gt; models, this is a multi-variate analysis: it enables a better understanding of complex behaviors that are described by many features. For this analysis there are three models with different algorithms and learning types (Outlier, &lt;a href="https://dev.to/priscilla_parodi/elastic-data-frame-regression-analysis-2ge2"&gt;Regression&lt;/a&gt; and &lt;a href="https://dev.to/priscilla_parodi/elastic-data-frame-classification-analysis-1b3f"&gt;Classification&lt;/a&gt;), and in this post we'll talk about Outlier Detection.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Outlier detection&lt;/strong&gt; identifies unusual data points in the dataset (Unsupervised ML).&lt;/p&gt;

&lt;p&gt;In time series modeling and population anomaly detection, we also look for outliers, but based on how far a metric deviates from the learned normal model.&lt;/p&gt;

&lt;p&gt;With Outlier Detection we look at clusters of data and evaluate density and distance using multi-variate analysis. We are not interested in tracking the evolution of the dataset over time, as we do in population anomaly detection, and there are no time buckets.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SRNjE1cj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cl024oq275dcoie8yya7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SRNjE1cj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cl024oq275dcoie8yya7.png" alt="Alt Text" width="800" height="718"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Evaluation of Outlier Detection
&lt;/h3&gt;

&lt;p&gt;Outliers may denote errors or unusual behavior. In the Elastic Stack, we use an ensemble of four different distance- and density-based outlier detection methods. Based on this approach, a metric called the local outlier factor is computed for each data point: the higher the local outlier factor, the more outlying the data point.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ssxA61tl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5bzxtmxc2eokycb2y4rb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ssxA61tl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5bzxtmxc2eokycb2y4rb.png" alt="Alt Text" width="800" height="611"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;| &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Menu&lt;/a&gt; | &lt;a href="https://dev.to/priscilla_parodi/elastic-data-frame-regression-analysis-2ge2"&gt;Next Post: Elastic Data Frame - Regression Analysis&lt;/a&gt; |&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;This post is part of a series that covers &lt;a href="https://dev.to/priscilla_parodi/a-guide-to-machine-learning-ai-with-the-elastic-stack-1dkl"&gt;Artificial Intelligence with a focus on Elastic's (Creators of Elasticsearch) Machine Learning solution&lt;/a&gt;, aiming to introduce and exemplify the possibilities and options available, in addition to addressing the context and usability.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>machinelearning</category>
      <category>elasticsearch</category>
      <category>beginners</category>
    </item>
  </channel>
</rss>
