<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Dustin Brown</title>
    <description>The latest articles on DEV Community by Dustin Brown (@willcode).</description>
    <link>https://dev.to/willcode</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F93512%2Ff8f939c1-dac1-4a82-a842-b6a9dc415a6b.png</url>
      <title>DEV Community: Dustin Brown</title>
      <link>https://dev.to/willcode</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/willcode"/>
    <language>en</language>
    <item>
      <title>Introducing the ChatGPT Weather Plugin: Bringing Real-Time Weather Data to Your Conversations!</title>
      <dc:creator>Dustin Brown</dc:creator>
      <pubDate>Wed, 21 Jun 2023 17:05:29 +0000</pubDate>
      <link>https://dev.to/willcode/introducing-the-chatgpt-weather-plugin-bringing-real-time-weather-data-to-your-conversations-1h8o</link>
      <guid>https://dev.to/willcode/introducing-the-chatgpt-weather-plugin-bringing-real-time-weather-data-to-your-conversations-1h8o</guid>
      <description>&lt;p&gt;Are you tired of constantly switching between your chat application and weather websites just to check the forecast? We have the perfect solution for you! Introducing the ChatGPT Weather Plugin, a revolutionary addition to ChatGPT that seamlessly integrates real-time weather data into your conversations. With this plugin, you can now easily gather current weather conditions and forecasts for any location without leaving the chat interface.&lt;/p&gt;

&lt;p&gt;All of the code for this working ChatGPT Plugin is available on GitHub:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;chatgpt-plugin-express-vue3-vite-starter&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://github.com/willCode2Surf/chatgpt-plugin-express-vue3-vite-starter"&gt;https://github.com/willCode2Surf/chatgpt-plugin-express-vue3-vite-starter&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Endless Possibilities with ChatGPT Plugins
&lt;/h2&gt;

&lt;p&gt;The ChatGPT Weather Plugin is a powerful tool that demonstrates what can be achieved with GPT-4. By combining the capabilities of ChatGPT with real-world APIs, this plugin takes the basic TODO starter application to a whole new level by giving it real context.&lt;/p&gt;
&lt;h2&gt;
  
  
  Rapid Development with GitHub
&lt;/h2&gt;

&lt;p&gt;To bring this exciting plugin to life, we used the lightning-fast development tooling of Vite together with Express.js, starting from an existing repository template.&lt;/p&gt;
&lt;h2&gt;
  
  
  Getting Started with the ChatGPT Weather Plugin
&lt;/h2&gt;

&lt;p&gt;For this to work properly you will need the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; ChatGPT Plugin Developer Access; apply here if you do not have it yet: &lt;a href="https://openai.com/blog/chatgpt-plugins"&gt;chatgpt-plugins&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt; OpenWeather API access for their Weather API. The free tier's 1,000 calls per day is plenty. You will need an account and an API key. &lt;a href="https://openweathermap.org/api"&gt;openweather&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt; Google Maps API access for their GeoLocation. You will need an account and an API key. &lt;a href="https://developers.google.com/maps/documentation/geolocation/get-api-key"&gt;Google Maps&lt;/a&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This allowed us to quickly prototype a fully functional solution that can be easily customized and integrated with your existing ChatGPT application.&lt;/p&gt;
&lt;h2&gt;
  
  
  Once you have the prerequisites in place, follow these steps to set up the ChatGPT Weather Plugin:
&lt;/h2&gt;
&lt;h3&gt;
  
  
  Clone the repo into your development space
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git clone git@github.com:willCode2Surf/chatgpt-plugin-express-vue3-vite-starter.git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  Open the project with VS Code/Terminal and install dependencies
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd vue3-vite-express-js-boilerplate
npm install
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  Add .env file to the project
&lt;/h3&gt;

&lt;p&gt;You will need a .env file containing the two variables used by the application:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;OPEN_WEATHER_KEY=YOUR_WEATHERKEY
MAPS_KEY=YOUR_MAPS_KEY
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
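&lt;p&gt;As a sketch of how these variables might be consumed (the variable names are taken from the .env above; the starter's actual config file may be laid out differently), a small config module could look like this:&lt;/p&gt;

```javascript
// config/index.js — illustrative sketch, not the starter's actual code.
// Reads the two required variables from process.env and fails fast if
// either is missing, so a bad deployment surfaces immediately.
const required = ['OPEN_WEATHER_KEY', 'MAPS_KEY'];

function loadConfig(env = process.env) {
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
  return {
    openWeatherKey: env.OPEN_WEATHER_KEY,
    mapsKey: env.MAPS_KEY,
  };
}

module.exports = { loadConfig };
```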



&lt;h3&gt;
  
  
  Running the application stack
&lt;/h3&gt;

&lt;p&gt;If you want to run the Node application standalone, without Docker, we are ready. Running npm run start does a couple of things: it builds the src and public directories into the dist folder that is used for manifest validation.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm run start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you want to use Docker, it is just as easy. There is a known issue when using Husky with Docker. Since Husky and Prettier are geared toward developer experience, we can remove them when building the Docker image: just delete this line from the scripts section of package.json.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"prepare": "husky install"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once that has been removed from package.json, we can build the Docker image.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker build -t chatgptweatherplugin:dev .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the build completes, we can run the container image with the following docker command (the variables from our .env file must be passed to Docker at runtime as well):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -d -p 6909:6909 chatgptweatherplugin:dev -e OPEN_WEATHER_KEY=YOUR_WEATHERKEY -e MAPS_KEY=YOUR_MAPS_KEY
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  Usage
&lt;/h1&gt;

&lt;p&gt;Set up your plugin in the ChatGPT Plugin UI.&lt;br&gt;
When prompted for the URL of the plugin you created, enter:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;http://localhost:6909
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  Project Design
&lt;/h1&gt;

&lt;p&gt;This application is intended to run headless, as an API. We use the existing boilerplate to host the required OpenAPI files and the ChatGPT manifest.&lt;/p&gt;

&lt;h3&gt;
  
  
  Classes
&lt;/h3&gt;

&lt;p&gt;In the folder /classes you will find a couple of helpers that complete the requests in a clean, async manner.&lt;/p&gt;

&lt;h3&gt;
  
  
  Configuration
&lt;/h3&gt;

&lt;p&gt;In the directory /config you will find the configuration details that are bound to your environment variables for use throughout the application.&lt;/p&gt;

&lt;h3&gt;
  
  
  API Routes
&lt;/h3&gt;

&lt;p&gt;In the directory /express-routes you will find the endpoints that ChatGPT will communicate with and that your OpenAPI file will define.&lt;/p&gt;
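&lt;p&gt;To illustrate the shape of such an endpoint, here is a sketch of a weather handler; the makeWeatherHandler name and the injected fetchWeather function are illustrative, not the repo's actual code:&lt;/p&gt;

```javascript
// Illustrative sketch only: the starter's real route lives in /express-routes
// and may be shaped differently. The handler expects { location: "lat,lon" }
// in the request body, matching the getWeatherRequest schema in openapi.yaml.
function makeWeatherHandler(fetchWeather) {
  return async function getWeather(req, res) {
    const { location } = req.body || {};
    if (!location) {
      // Reject requests missing the required field.
      return res.status(400).json({ error: 'location is required' });
    }
    // fetchWeather is an injected dependency (e.g. a wrapper around the
    // OpenWeather API) so the handler stays easy to unit-test.
    const weather = await fetchWeather(location);
    return res.json({ weather });
  };
}
```

Because the handler only touches req.body and the status/json methods on res, it can be exercised with plain mock objects before wiring it into the Express router.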

&lt;h3&gt;
  
  
  Public Resources
&lt;/h3&gt;

&lt;p&gt;There are a couple of very specific files in here that are necessary for the ChatGPT Plugin to work properly.&lt;/p&gt;

&lt;p&gt;/public/.well-known/ai-plugin.json provides the manifest file that lets ChatGPT understand the context it is working within. Files referenced in this manifest (logo, openapi.yaml) are also found here.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "schema_version": "v1",
  "name_for_human": "WEATHER Plugin (no authorizations)",
  "name_for_model": "location",
  "description_for_human": "Plugin for gathering current weather conditions and forecasts for a given location.",
  "description_for_model": "Plugin for gathering current weather conditions and forecasts for a given location.",
  "auth": {
    "type": "none"
  },
  "api": {
    "type": "openapi",
    "url": "http://localhost:6909/openapi.yaml",
    "is_user_authenticated": false
  },
  "logo_url": "http://localhost:6909/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;/public/openapi.yaml defines the constraints around the API endpoints that ChatGPT can interact with.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;openapi: 3.0.1
info:
  title: Weather ChatGPT Plugin
  description: A plugin that allows the user to request current conditions and forecasts.
  version: "v1"
servers:
  - url: http://localhost:6909
paths:
  /routes/weather:
    post:
      operationId: getWeather
      summary: Get weather for a provided location
      requestBody:
        required: true
        content:
          application/json:
            schema:
              $ref: "#/components/schemas/getWeatherRequest"
      responses:
        "200":
          description: OK
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/getWeatherResponse"
  /routes/forecast:
    post:
      operationId: getForecast
      summary: Get forecast for a provided location
      requestBody:
        required: true
        content:
          application/json:
            schema:
              $ref: "#/components/schemas/getForecastRequest"
      responses:
        "200":
          description: OK
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/getForecastResponse"
components:
  schemas:
    getWeatherRequest:
      type: object
      required:
        - location
      properties:
        location:
          type: string
          description: The location's latitude and longitude to gather real time information for.
    getForecastRequest:
      type: object
      required:
        - location
      properties:
        location:
          type: string
          description: The location's latitude and longitude to gather forecast information for.
    getWeatherResponse:
      type: object
      properties:
        weather:
          type: array
          items:
            type: string
          description: The list of current weather statements.
    getForecastResponse:
      type: object
      properties:
        weather:
          type: array
          items:
            type: string
          description: The list of current forecast statements.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
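&lt;p&gt;As an illustration of the contract above, a request body matching the getWeatherRequest schema can be built and sanity-checked like this (the commented-out fetch call assumes the server from npm run start is listening on localhost:6909; the coordinates are hypothetical):&lt;/p&gt;

```javascript
// Build a request body matching the getWeatherRequest schema and check its
// shape before sending. The location value here is a hypothetical example.
const getWeatherRequest = { location: '33.67,-101.82' };

function isValidGetWeatherRequest(body) {
  return body !== null && typeof body === 'object' && typeof body.location === 'string';
}

// With the server running (npm run start), the call itself would look like:
// await fetch('http://localhost:6909/routes/weather', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(getWeatherRequest),
// });
```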



&lt;h1&gt;
  
  
  What's Next and Feedback
&lt;/h1&gt;

&lt;p&gt;Let me know if there are any other neat ideas for starter kits around ChatGPT plugins that you would like to see.&lt;/p&gt;

</description>
      <category>chatgpt</category>
      <category>chatgptplugin</category>
      <category>plugin</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Getting to know Redis Enterprise TimeSeries Module</title>
      <dc:creator>Dustin Brown</dc:creator>
      <pubDate>Mon, 05 Apr 2021 12:39:13 +0000</pubDate>
      <link>https://dev.to/willcode/getting-to-know-redis-enterprise-timeseries-module-version-2fn7</link>
      <guid>https://dev.to/willcode/getting-to-know-redis-enterprise-timeseries-module-version-2fn7</guid>
      <description>&lt;p&gt;In this post we will cover all the details and considerations taken to make for a successful deployment of a product or service leveraging the Redis Enterprise TimeSeries Module.&lt;/p&gt;

&lt;p&gt;We will walk through:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Identifying a problem area that TimeSeries can fix&lt;/li&gt;
&lt;li&gt;Laying out the path to success by

&lt;ul&gt;
&lt;li&gt;Setting Up the proper instance in Azure&lt;/li&gt;
&lt;li&gt;Understanding the TimeSeries Module features&lt;/li&gt;
&lt;li&gt;Key/Data Topology&lt;/li&gt;
&lt;li&gt;Planning for creating data sets that are downsampled for quick aggregation&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  Where TimeSeries Module Fits In
&lt;/h2&gt;

&lt;p&gt;If you are leveraging the TimeSeries Module, you are likely engaged in some level of analytics.  Analytics is the power food for intelligent decision-making.  Where the TimeSeries Module can streamline operations in your enterprise will depend on existing infrastructure, pipelines, and legacy processes.  Never fear: working this technology into existing stacks is well-supported across various programming languages (Java, Go, Python, JavaScript/TypeScript, Rust, and Ruby).&lt;/p&gt;

&lt;p&gt;For this scenario we will look at replacing some long-standing processes that carry fixed and ongoing development costs.  We will use Node with &lt;br&gt;
&lt;a href="https://www.npmjs.com/package/redistimeseries-js" rel="noopener noreferrer"&gt;RedisTimeSeries-JS&lt;/a&gt;, and we will prop up our Redis TimeSeries instance in &lt;a href="https://redislabs.com/cloud-partners/microsoft-azure/" rel="noopener noreferrer"&gt;Azure&lt;/a&gt;.&lt;/p&gt;
&lt;h2&gt;
  
  
  Success with Redis Enterprise TimeSeries Module
&lt;/h2&gt;


&lt;h3&gt;
  
  
  Getting started in Azure
&lt;/h3&gt;

&lt;p&gt;Since the first post in this series Redis Labs has fully launched Redis Enterprise in Azure!  &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;In the Azure portal, search for Azure Cache for Redis.&lt;/li&gt;
&lt;li&gt;Click + Add.&lt;/li&gt;
&lt;li&gt;Complete the initial screen with your needed details, select the Enterprise plan that fits your immediate needs.&lt;/li&gt;
&lt;li&gt;Select appropriate public/private endpoint.&lt;/li&gt;
&lt;li&gt;On the advanced screen we will select the modules that we need to use in this instance of Redis Enterprise.&lt;/li&gt;
&lt;li&gt;Review and create the instance!&lt;/li&gt;
&lt;li&gt;Once the instance has been created, configure your &lt;a href="https://redislabs.com/redis-enterprise/redis-insight/" rel="noopener noreferrer"&gt;RedisInsight&lt;/a&gt; so that you can see the new cluster.&lt;/li&gt;
&lt;/ol&gt;


&lt;h3&gt;
  
  
  Understanding the TimeSeries Module Features
&lt;/h3&gt;

&lt;p&gt;The Redis Enterprise platform brings with it the standard expectation of a tried-and-true tool in any modern stack but with scale in mind.  It is important to cover the features that can be leveraged and combined within the TimeSeries Module so early planning can take these into consideration.&lt;/p&gt;

&lt;p&gt;There are a few key concepts that we need to expand on so that you can get the most out of the implementation.  Let’s cover the concepts that need to be understood before ingesting data.  (We will cover consumption in depth in the next post.)&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Keys, which are not a new concept, are the way to sink data into the instance.  There are some attributes of keys that let you fine-tune data policies.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;retention, give your data a TTL&lt;/li&gt;
&lt;li&gt;labels, give context and additional attributes to your data&lt;/li&gt;
&lt;li&gt;duplicate policy, determine what happens when you have data bumping into each other&lt;/li&gt;
&lt;li&gt;compression policy&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Aggregations, of which there are many out of the box.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AVG&lt;/li&gt;
&lt;li&gt;SUM&lt;/li&gt;
&lt;li&gt;MIN&lt;/li&gt;
&lt;li&gt;MAX&lt;/li&gt;
&lt;li&gt;RANGE&lt;/li&gt;
&lt;li&gt;FIRST&lt;/li&gt;
&lt;li&gt;LAST&lt;/li&gt;
&lt;li&gt;STD.P&lt;/li&gt;
&lt;li&gt;STD.S&lt;/li&gt;
&lt;li&gt;VAR.P&lt;/li&gt;
&lt;li&gt;VAR.S&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Rules, make the data you are storing reactive to data ingest.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;are applied to a specific key.&lt;/li&gt;
&lt;li&gt;are predefined aggregation procedures

&lt;ul&gt;
&lt;li&gt;a destination key for the reactive data must be present&lt;/li&gt;
&lt;li&gt;a predefined aggregation option must be provided&lt;/li&gt;
&lt;li&gt;the time interval for the sampled data must be provided&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Labels, the metadata option to give your data more context.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Label/Value collection that represents broader context of the underlying data&lt;/li&gt;
&lt;li&gt;valuable at the digest/consumption layer to provide aggregation across keys by other similar characteristics&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Ingest&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;


&lt;h3&gt;
  
  
  Key/Data Topology
&lt;/h3&gt;

&lt;p&gt;If migrating from another system or starting from the ground up, there will be some data points that need to be mapped to appropriate key structures.  Data points can be ingested as singular readings but are usually accompanied by additional data that gives a full picture of what just occurred.  Data ingest could come in at extremely high sampling rates (health devices, IoT sensors, weather apparatus) for the same system providing the data point, and we need to plan on how to handle this at scale so that we can surface downsampled data concisely without impacting performance.&lt;/p&gt;

&lt;p&gt;Let's take a look at a sample log reading for a weather sensor.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  time: 1617293819263,
  temp: 55.425,
  wind: 13,
  windDirection: 13,
  windChill: 51,
  precipitation: 0,
  dewPoint: 20,
  humidity: .24,
  visibility: 10,
  barometer: 30.52,
  lat: 33.67,
  lon: 101.82,
  elevation: 3281,
  city: 2232,
  deviceId: 47732234
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This sample log has many constant attributes, and some interesting data points that can vary with every reading.  If the scientific equipment produces a sample each second, then over a period of 24 hours we will receive 86,400 readings.  If we had 100 devices in a particular region providing realtime weather data, we could easily ingest close to 9M logs.  To perform some level of analytics (either visually or with additional ETL/ML pipelines) we need to downsample the footprint.  &lt;/p&gt;

&lt;p&gt;Traditionally, one might sink this log reading into Cosmos or Mongo and aggregate on demand while handling the downsampling in code.  This is problematic, does not scale well, and is destined to fail at some point.  The better approach is to sink this data into Redis Enterprise using the TimeSeries Module.  The first step in planning for a successful TimeSeries implementation is flattening your logs by identifying both reading data and metadata.  &lt;/p&gt;

&lt;p&gt;With our sample log we can build a pattern for key definitions that we can sink readings into, define the metadata we will attach to those keys, and resolve a key path.&lt;/p&gt;

&lt;p&gt;Let's evaluate again with some context.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  time: 1617293819263,  // timestamp   
  temp: 55.425,         // reading
  wind: 13,             // reading
  windDirection: 13,    // reading
  windChill: 51,        // reading
  precipitation: 0,     // reading
  dewPoint: 20,         // reading
  humidity: .24,        // reading
  visibility: 10,       // reading
  barometer: 30.52,     // reading
  lat: 33.67,           // meta-data
  lon: 101.82,          // meta-data
  elevation: 3281,      // meta-data
  city: 2232,           // meta-data
  deviceId: 47732234    // identification
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With this context we can outline the keys we will need to represent each attribute in our log.  If it is a reading, it needs a key.  We will leverage the identifier combined with the readings to create the following keys based on this pattern.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight console"&gt;&lt;code&gt;&lt;span class="go"&gt;sensors:{deviceId}:{attribute}
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With this topology defined we need to create keys for the attributes. &lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Attribute&lt;/th&gt;
&lt;th&gt;Key Definition&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;temp&lt;/td&gt;
&lt;td&gt;sensors:47732234:temp&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;wind&lt;/td&gt;
&lt;td&gt;sensors:47732234:wind&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;windDirection&lt;/td&gt;
&lt;td&gt;sensors:47732234:windDirection&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;windChill&lt;/td&gt;
&lt;td&gt;sensors:47732234:windChill&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;precipitation&lt;/td&gt;
&lt;td&gt;sensors:47732234:precipitation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;dewPoint&lt;/td&gt;
&lt;td&gt;sensors:47732234:dewPoint&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;humidity&lt;/td&gt;
&lt;td&gt;sensors:47732234:humidity&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;visibility&lt;/td&gt;
&lt;td&gt;sensors:47732234:visibility&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;barometer&lt;/td&gt;
&lt;td&gt;sensors:47732234:barometer&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
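&lt;p&gt;Expanding this table programmatically is straightforward; here is a small sketch of the sensors:{deviceId}:{attribute} pattern (the attribute names are taken from the sample log):&lt;/p&gt;

```javascript
// Expand the sensors:{deviceId}:{attribute} pattern for every reading
// attribute from the sample log, yielding the key definitions in the table.
const readingAttributes = [
  'temp', 'wind', 'windDirection', 'windChill', 'precipitation',
  'dewPoint', 'humidity', 'visibility', 'barometer',
];

function keysForDevice(deviceId) {
  return readingAttributes.map((attribute) => `sensors:${deviceId}:${attribute}`);
}
```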

&lt;p&gt;Before we create the keys, we need to identify the meta-data that we are going to associate with them, as well as the TTL for series data and our duplicate policy.  TTL defaults to 0 (no expiry), but in our case we want to keep the data for 30 days and take the last reading if there is contention on a series.&lt;/p&gt;

&lt;p&gt;Based on our sample log we need to add the following labels to our keys.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Attribute&lt;/th&gt;
&lt;th&gt;Label&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;lat&lt;/td&gt;
&lt;td&gt;lat 33.67&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;lon&lt;/td&gt;
&lt;td&gt;lon 101.82&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;elevation&lt;/td&gt;
&lt;td&gt;elevation 3281&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;city&lt;/td&gt;
&lt;td&gt;city 2232&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Now let's create the keys.  You can always create the keys from the CLI in RedisInsight like so.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight console"&gt;&lt;code&gt;&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.CREATE sensors:47732234:temp RETENTION 2592000000 DUPLICATE_POLICY LAST LABELS lat 33.67 lon 101.82 elevation 3281 city 2232
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This works, but to repeat it for many sensors across many attributes we should lean on some code to streamline the generation.  Using the redistimeseries-js library we can handle this programmatically.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// set up our Redis Enterprise instance&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;RedisTimeSeries&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;redistimeseries-js&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;  &lt;span class="nx"&gt;Aggregation&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;redistimeseries-js&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;options&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;host&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;{ENTER-YOUR-REDIS-ENTERPRISE-INSTANCE}&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;port&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;10000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;password&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;{ENTER-YOUR-PASSWORD}&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;rtsClient&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;RedisTimeSeries&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;options&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;retentionTime&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;86400000&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;30&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// key generation&lt;/span&gt;
 &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;labels&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;lat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;33.67&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;lon&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;101.82&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;elevation&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;3281&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;city&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;2232&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;rtsClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`sensors:47732234:temp`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;duplicatePolicy&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;LAST&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;labels&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;labels&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
              &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;retention&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;retentionTime&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
              &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
              &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
              &lt;span class="p"&gt;})&lt;/span&gt;
              &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
              &lt;span class="p"&gt;});&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We can validate that our key has been created by using the RedisInsight CLI command TS.INFO.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight console"&gt;&lt;code&gt;&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.INFO sensors:47732234:temp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fniqeczff6adzqv4sn53k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fniqeczff6adzqv4sn53k.png" alt="TS.INFO"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  Making Data Reactive with rules
&lt;/h3&gt;

&lt;p&gt;Aggregation out of the Redis Enterprise TimeSeries Module is very fast, but it can be optimized even further if you know what kind of data summaries your data scientists, analysts, or systems need to make correct decisions.  For our single device producing real-time weather information, we have determined that there are a few intervals that will be extracted regularly.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;5 minutes&lt;/li&gt;
&lt;li&gt;1 hour&lt;/li&gt;
&lt;li&gt;1 day&lt;/li&gt;
&lt;/ul&gt;
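&lt;p&gt;Each of these intervals must be expressed in milliseconds when the rules are created; the bucket sizes used in the TS.CREATERULE commands can be derived like this:&lt;/p&gt;

```javascript
// Convert the downsampling intervals to the millisecond bucket sizes
// expected by TS.CREATERULE.
const MS_PER_MINUTE = 60 * 1000;
const intervals = {
  fiveMinutes: 5 * MS_PER_MINUTE,   // 300000
  oneHour: 60 * MS_PER_MINUTE,      // 3600000
  oneDay: 24 * 60 * MS_PER_MINUTE,  // 86400000
};

// The 30-day retention used throughout the TS.CREATE examples:
const retentionMs = 30 * 24 * 60 * MS_PER_MINUTE; // 2592000000
```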

&lt;p&gt;This is where we can define rules that react to data as it is ingested and downsample it using the built-in aggregation techniques, providing a concise representation of many raw logs as a real-time data point.&lt;/p&gt;

&lt;p&gt;NOTE: there are a couple of things about rules that need to be understood upfront.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;rules need an aggregation type.&lt;/li&gt;
&lt;li&gt;rules need a time interval to retain samples for.&lt;/li&gt;
&lt;li&gt;rules need downstream keys to sink the reactive data into.&lt;/li&gt;
&lt;li&gt;rules will not resample destination-key data if a rule is added after ingesting has begun; that is why we define them before ingesting data.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For this post we are going to focus on creating rules for the temp attribute.  We can create our 3 destination keys and 3 rules from the CLI as well.  We will encode in the key structure the aggregation type we are scoping to and the interval it covers.&lt;/p&gt;

&lt;p&gt;Destination Keys&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight console"&gt;&lt;code&gt;&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.CREATE sensors:47732234:temp:avg:5min RETENTION 2592000000 DUPLICATE_POLICY LAST LABELS lat 33.67 lon 101.82 elevation 3281 city 2232
&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.CREATE sensors:47732234:temp:avg:1hr RETENTION 2592000000 DUPLICATE_POLICY LAST LABELS lat 33.67 lon 101.82 elevation 3281 city 2232
&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.CREATE sensors:47732234:temp:avg:1day RETENTION 2592000000 DUPLICATE_POLICY LAST LABELS lat 33.67 lon 101.82 elevation 3281 city 2232
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Rules&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight console"&gt;&lt;code&gt;&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.CREATERULE sensors:47732234:temp sensors:47732234:temp:avg:5min AGGREGATION AVG 300000
&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.CREATERULE sensors:47732234:temp sensors:47732234:temp:avg:1hr AGGREGATION AVG 3600000
&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.CREATERULE sensors:47732234:temp sensors:47732234:temp:avg:1day AGGREGATION AVG 86400000
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This can also be completed via code!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;rtsClient&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createRule&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`sensors:47732234:temp`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;`pumps:47732234:temp:avg:5min`&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;aggregation&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;Aggregation&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;300000&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;}).&lt;/span&gt;&lt;span class="nf"&gt;then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
      &lt;span class="p"&gt;});&lt;/span&gt;
 &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;rtsClient&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createRule&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`sensors:47732234:temp`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;`pumps:47732234:temp:avg:1hr`&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;aggregation&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;Aggregation&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;3600000&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;}).&lt;/span&gt;&lt;span class="nf"&gt;then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
      &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;rtsClient&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createRule&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`sensors:47732234:temp`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;`pumps:47732234:temp:avg:1day`&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;aggregation&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;Aggregation&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;86400000&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;}).&lt;/span&gt;&lt;span class="nf"&gt;then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
      &lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
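&lt;p&gt;Since the three createRule calls differ only in the destination key and the bucket size, they can be driven from a small table. The sketch below builds the rule specs up front; it assumes the same rtsClient and Aggregation from the earlier setup, so the actual send calls are shown only as comments:&lt;/p&gt;

```javascript
// Millisecond durations for the bucket sizes, written out so the magic
// numbers in the CLI commands above are easy to verify.
const MINUTE = 60 * 1000;
const HOUR = 60 * MINUTE; // 3600000
const DAY = 24 * HOUR;    // 86400000

const SOURCE_KEY = 'sensors:47732234:temp';

// [interval label used in the destination key, bucket size in ms]
const intervals = [
  ['5min', 5 * MINUTE], // 300000
  ['1hr', HOUR],
  ['1day', DAY],
];

// One rule spec per interval; destination keys follow the
// sensors:<id>:<attribute>:<aggregation>:<interval> convention.
const ruleSpecs = intervals.map(([label, bucketMs]) => ({
  source: SOURCE_KEY,
  dest: `${SOURCE_KEY}:avg:${label}`,
  bucketMs,
}));

// With a connected client, each spec becomes one call:
// await rtsClient.createRule(spec.source, spec.dest)
//   .aggregation(Aggregation.AVG, spec.bucketMs)
//   .send();
console.log(ruleSpecs);
```

&lt;p&gt;This also makes it trivial to add a MIN or MAX family of rules later by mapping over a second list of aggregation types.&lt;/p&gt;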



&lt;p&gt;We can use TS.INFO to inspect our new keys just the same, and running it against the base key will now show the rules attached.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight console"&gt;&lt;code&gt;&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.INFO sensors:47732234:temp
&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.INFO sensors:47732234:temp:avg:5min
&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.INFO sensors:47732234:temp:avg:1hr
&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.INFO sensors:47732234:temp:avg:1day
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzs3c8ijcl8fwjrq1w7p7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzs3c8ijcl8fwjrq1w7p7.png" alt="TS.INFO.WITH.RULES"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;1 Day&lt;/th&gt;
&lt;th&gt;1 Hour&lt;/th&gt;
&lt;th&gt;5 Min&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fshi45nz3y2zd23c8jld3.png" alt="TS.INFO.WITH.RULES"&gt;&lt;/td&gt;
&lt;td&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fukrzanjvh0f58ncxp1wm.png" alt="TS.INFO.WITH.RULES"&gt;&lt;/td&gt;
&lt;td&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxmf90zpo9xvcvnphkdsc.png" alt="TS.INFO.WITH.RULES"&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Now that we have all of our keys and rules created, we can look into ingesting some sample data. &lt;/p&gt;




&lt;h3&gt;
  
  
  Ingest
&lt;/h3&gt;

&lt;p&gt;For the purposes of this post, we will continue to focus on ingesting a single attribute from our sample log.  As data readings come in from the system, an API on the edge would parse the log and update the TimeSeries instance.  For demo purposes we are going to simply ingest about a day's worth of sample data for the temp attribute.  &lt;/p&gt;
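&lt;p&gt;To make that edge parsing step concrete, here is a minimal sketch. The log format shown is hypothetical (the real device format will differ); the point is flattening one attribute out of a raw reading into the key, timestamp, and value that ingest needs:&lt;/p&gt;

```javascript
// Hypothetical raw log line from the weather device; the real format
// will differ, but the flattening step looks the same.
const rawLog = JSON.stringify({
  device: 47732234,
  ts: 1609459260000, // epoch milliseconds
  readings: { temp: 61.4, humidity: 0.44, windSpeed: 7.2 },
});

// Pull the single attribute we are tracking out of the raw reading.
const parseTempSample = (line) => {
  const log = JSON.parse(line);
  return {
    key: `sensors:${log.device}:temp`, // matches the base key convention
    timestamp: log.ts,
    value: log.readings.temp,
  };
};

const sample = parseTempSample(rawLog);
// sample would then feed a TS.ADD via the client, e.g.:
// await rtsClient.add(sample.key, sample.timestamp, sample.value).send();
console.log(sample);
```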

&lt;p&gt;Using the base code and the redistimeseries-js module that we set up earlier, we can add some functionality to simulate this process.  You can tweak the setInterval callback to ingest different attributes or intervals. I ingested Jan 1 12AM through Jan 2 3:40AM with one-minute interval logs.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;moment&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;moment&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="c1"&gt;// add this to the project&lt;/span&gt;

&lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;startTime&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;moment&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2021-01-01&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;insertTS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;value&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;labels&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
   &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;rtsClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;value&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;labels&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;labels&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;send&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;err&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;err&lt;/span&gt;&lt;span class="p"&gt;);});&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;randomTemp&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;min&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;max&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nb"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;random&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;max&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="nx"&gt;min&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="nx"&gt;min&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;labels&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;lat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;33.67&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;lon&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;101.82&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;elevation&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;3281&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;city&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;2232&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="nf"&gt;setInterval&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;timeStamp&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;startTime&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;minutes&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;⏰&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;timeStamp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;LLLL&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;  &lt;span class="nb"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;timeStamp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;format&lt;/span&gt;&lt;span class="p"&gt;()));&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;insertTS&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;sensors:47732234:temp&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="nb"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;timeStamp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;format&lt;/span&gt;&lt;span class="p"&gt;()),&lt;/span&gt; &lt;span class="nf"&gt;randomTemp&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;45&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;77&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="nx"&gt;labels&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
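&lt;p&gt;One caveat: the setInterval above keeps firing until you kill the process. If you prefer a run that stops itself, a bounded loop over the same window is a simple alternative. Sketched here without a live client, with plain Date math standing in for moment:&lt;/p&gt;

```javascript
// Generate one sample per minute from Jan 1 12AM through Jan 2 3:40AM,
// i.e. 24h + 3h40m = 1660 samples, then stop.
const START = Date.parse('2021-01-01T00:00:00Z');
const MINUTE_MS = 60 * 1000;
const TOTAL_MINUTES = 24 * 60 + 3 * 60 + 40; // 1660

const randomTemp = (min, max) => Math.random() * (max - min) + min;

const samples = [];
for (let i = 1; i <= TOTAL_MINUTES; i += 1) {
  samples.push({
    key: 'sensors:47732234:temp',
    timestamp: START + i * MINUTE_MS,
    value: randomTemp(45, 77),
  });
}

// Each sample would then go through the insertTS helper above, e.g.:
// for (const s of samples) await insertTS(s.key, s.timestamp, s.value, labels);
console.log(samples.length); // 1660
```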



&lt;p&gt;Once ingest is complete we can return to RedisInsight and use TS.INFO to retrieve our base key details, as well as take a peek at the destination keys that hold the groomed data in real time!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight console"&gt;&lt;code&gt;&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.INFO sensors:47732234:temp
&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.INFO sensors:47732234:temp:avg:5min
&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.INFO sensors:47732234:temp:avg:1hr
&lt;span class="gp"&gt;&amp;gt;&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; TS.INFO sensors:47732234:temp:avg:1day
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3kgtbqw76wr1huxu9ge3.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3kgtbqw76wr1huxu9ge3.PNG" alt="TS.INFO.WITH.RULES"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;1 Day - 1 Sample&lt;/th&gt;
&lt;th&gt;1 Hour - 27 Samples&lt;/th&gt;
&lt;th&gt;5 Min - 332 Samples&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk4si3x48oz5ie7myevmc.png" alt="TS.INFO.WITH.RULES"&gt;&lt;/td&gt;
&lt;td&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faxg866j0824daqqvi7f3.png" alt="TS.INFO.WITH.RULES"&gt;&lt;/td&gt;
&lt;td&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F972a4bygihxvcbasd1lx.PNG" alt="TS.INFO.WITH.RULES"&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;A quick look over the base key's entries count (highlighted) and the destination keys gives us good context.  All of the downsampled data is accurately represented in the time buckets we allocated.&lt;/p&gt;
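&lt;p&gt;The sample counts in the table line up with simple bucket arithmetic: each destination key reports completed buckets, so the expected count is the ingest window divided by the bucket size, floored. A quick sketch:&lt;/p&gt;

```javascript
// Ingest window: Jan 1 12AM through Jan 2 3:40AM at one-minute resolution.
const totalMinutes = 24 * 60 + 3 * 60 + 40; // 1660 raw samples

// Completed buckets per destination key = floor(window / bucket size).
const bucketCount = (bucketMinutes) => Math.floor(totalMinutes / bucketMinutes);

console.log(bucketCount(5));    // 332 five-minute averages
console.log(bucketCount(60));   // 27 hourly averages
console.log(bucketCount(1440)); // 1 daily average
```

&lt;p&gt;Those match the 332, 27, and 1 sample counts shown in the screenshots above.&lt;/p&gt;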




&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;Awesome! Hopefully at this point you have a grasp on the following concepts:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Setting up an instance of Redis Enterprise in Azure with the TimeSeries Module &lt;/li&gt;
&lt;li&gt;Data Topology and flattening logs to linear attributes as keys&lt;/li&gt;
&lt;li&gt;Rules for reactive data and destination keys&lt;/li&gt;
&lt;li&gt;Using RedisInsight and some basic commands for checking series state (TS.INFO, TS.CREATE, TS.CREATERULE)&lt;/li&gt;
&lt;li&gt;How to programmatically interact with the TimeSeries instance to ingest and process data at scale&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In the next post we will cover the exciting parts that really show off the power of extracting and working with the data in the Redis Enterprise TimeSeries instance.  What to look forward to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Querying Data Out&lt;/li&gt;
&lt;li&gt;Exploring the destination key samples in detail&lt;/li&gt;
&lt;li&gt;Filtering with Labels&lt;/li&gt;
&lt;li&gt;API Code for composing complex objects across multiple keys&lt;/li&gt;
&lt;li&gt;Visualizing with Grafana&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>redis</category>
      <category>timeseries</category>
      <category>azure</category>
      <category>database</category>
    </item>
    <item>
      <title>Get excited for Redis Enterprise on Azure</title>
      <dc:creator>Dustin Brown</dc:creator>
      <pubDate>Mon, 01 Mar 2021 18:07:31 +0000</pubDate>
      <link>https://dev.to/willcode/get-excited-for-redis-enterprise-on-azure-bf3</link>
      <guid>https://dev.to/willcode/get-excited-for-redis-enterprise-on-azure-bf3</guid>
<description>&lt;p&gt;Everyone I have worked with loves Redis.  It is a highly effective tool used as a database, cache, pub/sub broker, and much more.  It is highly optimized and has enjoyed success in its space for over 10 years.  Redis's ease of use and intense scalability make it a must-use in my opinion.  So how could Redis Labs improve Redis?  By taking on a strategic partnership with Microsoft Azure and rolling out the Redis Enterprise tier!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://azure.microsoft.com/en-us/blog/microsoft-and-redis-labs-collaborate-to-give-developers-new-azure-cache-for-redis-capabilities/"&gt;Redis Labs and Microsoft Azure Partnership&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Redis Enterprise Better on Azure
&lt;/h2&gt;

&lt;p&gt;Redis Labs thinks latency is the new downtime, and they mean business with Redis Enterprise!  Redis Enterprise is about to exit preview and go mainstream on Azure for the masses.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;High Availability (99.99%)&lt;/li&gt;
&lt;li&gt;Geographically Distributed&lt;/li&gt;
&lt;li&gt;Module Extension

&lt;ul&gt;
&lt;li&gt;RedisTimeSeries&lt;/li&gt;
&lt;li&gt;RediSearch&lt;/li&gt;
&lt;li&gt;RedisBloom&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;Redis on Flash (RoF) over Azure NVMe&lt;/li&gt;
&lt;li&gt;Scaling

&lt;ul&gt;
&lt;li&gt;up to 15TB&lt;/li&gt;
&lt;li&gt;50K to 500K connections&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;Secure

&lt;ul&gt;
&lt;li&gt;Azure Virtual Network, Private Link&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Usage
&lt;/h2&gt;

&lt;p&gt;During the preview we have made heavy use of the TimeSeries Module to replace homegrown processes.  Subsequent posts and code samples will detail our work with the TimeSeries Module, but the other modules can be leveraged in conjunction with it or independently.  Our use case needs to support hundreds of thousands of IoT sensors providing real-time data. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://redislabs.com/modules/redis-timeseries/"&gt;Redis TimeSeries Module&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://oss.redislabs.com/redisearch/"&gt;Redis Search&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://redislabs.com/modules/redis-bloom/"&gt;Redis Bloom&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;p&gt;The developer ecosystem around Redis is fantastic.  Redis also provides some tools that you should already have installed. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://redislabs.com/redis-enterprise/redis-insight/"&gt;RedisInsight&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Node resources&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/danitseitlin/redis-modules-sdk/"&gt;Redis Modules (TypeScript)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.npmjs.com/package/redistimeseries-js"&gt;RedisTimeSeries-JS&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Planning&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://redislabs.com/modules/redis-timeseries/time-series-sizing-calculator/"&gt;Redis TimeSeries Sizing Calculator&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;As Redis Enterprise becomes generally available, you owe it to yourself and your organization to see whether it improves your current Redis topology, or whether one of the new Redis Enterprise modules can solve some business problems.&lt;/p&gt;

&lt;p&gt;In the next post we will dive into TimeSeries and showcase concepts including data topology and downsampling, covering the steps needed to plan successfully for leveraging the Redis TimeSeries module at scale!&lt;/p&gt;

&lt;p&gt;Thank You!&lt;/p&gt;

</description>
      <category>redis</category>
      <category>timeseries</category>
      <category>azure</category>
      <category>database</category>
    </item>
  </channel>
</rss>
