<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Lorna Jane Mitchell</title>
    <description>The latest articles on DEV Community by Lorna Jane Mitchell (@lornajane).</description>
    <link>https://dev.to/lornajane</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F343386%2F81561ca5-90e1-41c5-a67a-b7188a632627.png</url>
      <title>DEV Community: Lorna Jane Mitchell</title>
      <link>https://dev.to/lornajane</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/lornajane"/>
    <language>en</language>
    <item>
      <title>Get to know the Aiven API with Postman</title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Tue, 22 Nov 2022 16:54:00 +0000</pubDate>
      <link>https://dev.to/lornajane/get-to-know-the-aiven-api-with-postman-365j</link>
      <guid>https://dev.to/lornajane/get-to-know-the-aiven-api-with-postman-365j</guid>
      <description>&lt;p&gt;Choosing cloud providers with APIs is a smart move; having an API available means you can build any additional integrations that your application requires, in addition to what's provided as standard. The downside is getting to know someone else's API, which can be a slow and painful process if good developer resources aren't available.&lt;/p&gt;

&lt;p&gt;At Aiven we'd love you to get to know our API, so we created a &lt;a href="https://www.postman.com/aiven-apis/workspace/aiven/documentation/21112408-1f6306ef-982e-49f8-bdae-4d9fdadbd6cd"&gt;Postman collection&lt;/a&gt; to help you get started. If you haven't used &lt;a href="https://www.postman.com/"&gt;Postman&lt;/a&gt;, it's a friendly interface for humans to work with APIs. We use it ourselves, so we created something you could use too. In this post, you'll use Postman to take your first steps with the Aiven API.&lt;/p&gt;

&lt;h2&gt;
  
  
  Before you start
&lt;/h2&gt;

&lt;p&gt;You can use Postman either in your web browser or as a desktop application. Whichever you choose, &lt;a href="https://www.postman.com/postman-account/"&gt;sign up for a free account&lt;/a&gt;, or sign in if you have one already.&lt;/p&gt;

&lt;p&gt;Navigate to the &lt;a href="https://www.postman.com/aiven-apis/workspace/aiven/documentation/21112408-1f6306ef-982e-49f8-bdae-4d9fdadbd6cd"&gt;collection&lt;/a&gt;. To get your own copy of the collection to work with, right-click on the collection and choose "Create a fork".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--r8ghhG_e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8df7okvmhjb2h39qzuip.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--r8ghhG_e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8df7okvmhjb2h39qzuip.png" alt="Fork the Postman collection so you can edit it" width="880" height="625"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You're prompted to give the collection a name and to choose which workspace to fork it into. "My workspace" is a good choice if you're not sure.&lt;/p&gt;

&lt;h2&gt;
  
  
  Get your head in the clouds
&lt;/h2&gt;

&lt;p&gt;The first API call we'll make is simple: ask the Aiven API for a list of all the clouds that you can deploy your Aiven services to.&lt;/p&gt;

&lt;p&gt;Expand your collection in the left-hand bar and click on "Clouds". The request you want is called "List available cloud platforms", and it doesn't need any editing or changes. Go ahead and click the "Send" button over on the right-hand side.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KZz00xti--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lvk8zu0yg0yf2c5pbv0s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KZz00xti--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lvk8zu0yg0yf2c5pbv0s.png" alt="List of clouds" width="880" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Well done - you've made your first API call! The results display in the lower half of the screen; Postman autodetects that the response is JSON and formats it nicely so that we mere mortals can read it. Have a scroll down and pick a location that sounds like a fun place to host your database.&lt;/p&gt;
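&lt;p&gt;Behind the scenes, Postman is simply sending an ordinary HTTP request. As a sketch, this is roughly what goes over the wire (as the collection shows, this particular request succeeds before you've added any credentials):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;GET https://api.aiven.io/v1/clouds
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;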

&lt;p&gt;This is fun, but it'll be more fun when you're working with your own databases in the cloud, so the next step is to identify yourself to the API when you make a request.&lt;/p&gt;

&lt;h2&gt;
  
  
  Send credentials with your API request
&lt;/h2&gt;

&lt;p&gt;When you sign in to a website, you probably use a username and password (Aiven also supports SSO and allows you to add 2FA, in case you're interested). For APIs, we usually use an authentication token instead. This is good practice because you can create a new token for each application; each token can then be independently rotated or revoked without affecting any of the other applications.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;If you don't already have an Aiven account, &lt;a href="https://console.aiven.io/signup"&gt;sign up for a free trial&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Visit &lt;a href="https://console.aiven.io/profile/auth"&gt;your Aiven profile page&lt;/a&gt; and scroll down to "Authentication tokens" to create a new token.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enter a description so that you will know which token this is, and set the expiry in hours for your token (leave it blank for a token that does not expire). Click "Generate token".&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Copy the token, ready to set it in the Postman collection.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Back in Postman, click the collection menu, and go to "Edit". On the "Variables" tab, paste your token value in under "Current value" for the "authToken" variable. Press the "Save" button on the right-hand side above the tabs.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OWeoS5Bi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/660jdz3ov0kp9m4gwn46.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OWeoS5Bi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/660jdz3ov0kp9m4gwn46.png" alt="Paste your API token as a collection variable" width="880" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;All the requests in the collection are set up to use this variable, so by setting it here, it's used for all the API calls. Speaking of which, shall we make another one?&lt;/p&gt;
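&lt;p&gt;If you ever need to make the same calls outside Postman, the token travels in a request header. A hedged curl sketch (this assumes the &lt;code&gt;aivenv1&lt;/code&gt; authorization scheme described in the Aiven API docs, with your token stored in the &lt;code&gt;TOKEN&lt;/code&gt; environment variable):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -H "Authorization: aivenv1 $TOKEN" https://api.aiven.io/v1/project
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;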

&lt;h2&gt;
  
  
  Create a cloud database
&lt;/h2&gt;

&lt;p&gt;Aiven offers a variety of different databases and other data &lt;em&gt;things&lt;/em&gt;, like &lt;a href="https://kafka.apache.org/"&gt;Apache Kafka&lt;/a&gt; (technically not a database) and &lt;a href="https://grafana.com/"&gt;Grafana&lt;/a&gt; (&lt;em&gt;definitely&lt;/em&gt; not a database), and we call them all "services".&lt;/p&gt;

&lt;p&gt;In the "Services" folder within the collection, select the request called "Create a service". You can create whichever service type you like, but the example values create a Redis, which is a great place to start.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Under the "Params" section, update the path variable &lt;code&gt;project&lt;/code&gt; with a project name that exists in your account that you want to use.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;This is a &lt;code&gt;POST&lt;/code&gt; request, so most of the interesting parts are in the body data; this can be found on the "Body" tab of the request. Have a look at the data, you might like to set a nicer value for &lt;code&gt;service_name&lt;/code&gt;, for example.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
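&lt;p&gt;As a rough sketch, the body boils down to a handful of fields; the cloud, plan and service name below are illustrative placeholder values, not requirements:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "cloud": "google-europe-west1",
  "plan": "hobbyist",
  "service_name": "my-shiny-redis",
  "service_type": "redis"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;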

&lt;p&gt;When you are happy with the values, click "Send". The response should have status 200 and show information about your new service.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---3cRvz3Z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uo7h1lp86lknp0op5n1r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---3cRvz3Z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uo7h1lp86lknp0op5n1r.png" alt="Create a service by API" width="880" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The right status code and some JSON is a good outcome, but it's not very exciting, is it? Head back to your Aiven console and you should see a new Redis service.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---3cRvz3Z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uo7h1lp86lknp0op5n1r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---3cRvz3Z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uo7h1lp86lknp0op5n1r.png" alt="You have a new Redis service" width="880" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A whole new Redis to play with! If you click on the service, you'll see an overview page with lots of configuration options and the connection information you need to use it. Check out our &lt;a href="https://docs.aiven.io/docs/products/redis/howto/connect-redis-cli.html"&gt;docs on connecting to Redis&lt;/a&gt; using the &lt;code&gt;redis-cli&lt;/code&gt; tool.&lt;/p&gt;

&lt;h2&gt;
  
  
  Power off the services when you don't need them
&lt;/h2&gt;

&lt;p&gt;One big concern about working with cloud services is how easy it is to start services that then hang around, running up the bills. So, once you've finished playing with your cloud Redis service, let's remove it. It only takes another Postman call to create it again, after all.&lt;/p&gt;

&lt;p&gt;Look in the "Services" folder of the collection again, and this time select "Delete a service".&lt;/p&gt;

&lt;p&gt;On the "Params" tab, enter both the project and service name that you used. Send the request - and again we're looking for a response status of 200 OK.&lt;/p&gt;

&lt;p&gt;Check back on your Aiven services list: as if by magic, your service is gone.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Tip: You can also use the "Update service configuration" request and set the &lt;code&gt;powered&lt;/code&gt; value to true or false to power off your services when you're not using them. This is especially handy for development platforms that only need to run while you are working, which is hopefully not all hours of the day. Please note that this &lt;a href="https://docs.aiven.io/docs/platform/concepts/service-power-cycle.html"&gt;fully removes the service&lt;/a&gt;, and powering it on restores it from a backup. This can cause settings to reset.&lt;/p&gt;
&lt;/blockquote&gt;
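&lt;p&gt;As a sketch, that power-off update is a small body sent to the service endpoint (the project and service names here are placeholders):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PUT https://api.aiven.io/v1/project/my-project/service/my-shiny-redis

{"powered": false}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;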

&lt;h2&gt;
  
  
  Postman, Aiven and APIs
&lt;/h2&gt;

&lt;p&gt;Whether you use Postman for your day-to-day tasks, or use it as an excellent way to get to know a new API, the Aiven Postman collection gives you a way to get started quickly. &lt;/p&gt;

&lt;p&gt;Learn more about our API by checking out a &lt;a href="https://aiven.io/blog/your-first-aiven-api-call"&gt;detailed blog post about using Aiven's API from cURL and Postman&lt;/a&gt;, and the &lt;a href="https://api.aiven.io/doc/"&gt;Aiven API reference docs&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The Postman collection has just a handful of endpoints. &lt;a href="https://twitter.com/aiven_io"&gt;Tell us&lt;/a&gt; what we should add there next! And let us know what you build, we love to see it (and we accept guest blog posts if it's something &lt;em&gt;really&lt;/em&gt; cool).&lt;/p&gt;

</description>
      <category>database</category>
      <category>api</category>
      <category>developer</category>
    </item>
    <item>
      <title>Add Aiven database magic to your Laravel project</title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Mon, 01 Nov 2021 11:37:05 +0000</pubDate>
      <link>https://dev.to/lornajane/add-aiven-database-magic-to-your-laravel-project-mn4</link>
      <guid>https://dev.to/lornajane/add-aiven-database-magic-to-your-laravel-project-mn4</guid>
      <description>&lt;p&gt;Our mission at Aiven is to make developers' lives better. Today I'd like to share our new tool that makes Laravel developers' lives better by adding some helpful functionality to their projects.&lt;/p&gt;

&lt;p&gt;Laravel is the most popular PHP framework in use today, and is usually backed by either MySQL or PostgreSQL databases (Aiven offers both). Many developer platforms use one database for development and another for specific sets of test data. You might also need to connect elsewhere to debug a specific problem. However, switching between databases can be cumbersome.&lt;/p&gt;

&lt;p&gt;Luckily we have a solution for you! The &lt;a href="https://github.com/aiven/aiven-laravel/"&gt;&lt;code&gt;aiven-laravel&lt;/code&gt;&lt;/a&gt; package makes it painless to reconfigure your database connection. It adds commands to the &lt;code&gt;artisan&lt;/code&gt; command line utility that you can use to configure your application to point to any of your Aiven database services. It also adds some helpers to power off any databases that are not in use, so you can save that money for a more productive purpose.&lt;/p&gt;

&lt;h2&gt;
  
  
  Set up the Aiven-Laravel package
&lt;/h2&gt;

&lt;p&gt;If you're new to Aiven, you can &lt;a href="https://console.aiven.io/signup"&gt;sign up for a free trial&lt;/a&gt; to get started. If you're new to Laravel, it has a nice &lt;a href="https://laravel.com/docs/8.x#installation-via-composer"&gt;starter project&lt;/a&gt; you can use to begin.&lt;/p&gt;

&lt;p&gt;Add &lt;code&gt;aiven-laravel&lt;/code&gt; to your existing (or newly created) project with Composer; the project page has the &lt;a href="https://github.com/aiven/aiven-laravel/#getting-started"&gt;full installation instructions&lt;/a&gt;. You will need to generate an &lt;a href="https://developer.aiven.io/docs/platform/howto/create_authentication_token"&gt;Aiven access token&lt;/a&gt; to connect your Aiven account, and then you are ready to Aiven your Laravel ... (is Aiven a verb? It is now).&lt;/p&gt;

&lt;h2&gt;
  
  
  Configure Laravel to use an Aiven service
&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;aiven-laravel&lt;/code&gt; makes setting the configuration easier whether you're running MySQL or PostgreSQL on Aiven (or indeed Redis or OpenSearch, which are also common and sound combinations for Laravel). The default configuration for Laravel is to use a &lt;code&gt;.env&lt;/code&gt; file and to list the database host, port, and other details separately. However, it also supports use of the &lt;code&gt;DATABASE_URL&lt;/code&gt; environment variable to hold an entire connection string. This can be a more convenient way to work with database connections. Set the single string in your &lt;code&gt;.env&lt;/code&gt; file for development, and configure just one environment variable on the cloud platform you deploy to.&lt;/p&gt;
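&lt;p&gt;For example, a single connection string in &lt;code&gt;.env&lt;/code&gt; might look something like this (the host, port and credentials are placeholders - use the values from your own service):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;DATABASE_URL=mysql://avnadmin:mypassword@dev-db-myproject.aivencloud.com:12691/defaultdb
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;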

&lt;p&gt;To check which databases are currently in your Aiven account, and their types, use this command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;php artisan aiven:list
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Note:&lt;/em&gt; If you installed the &lt;code&gt;aiven/aiven-laravel&lt;/code&gt; package without reading the instructions and didn't configure it yet, it will prompt you with the environment variables you need to set. Because, yes, this package was built by an engineer who doesn't read setup instructions either.&lt;/p&gt;

&lt;p&gt;Let's say you have a MySQL service called &lt;code&gt;dev-db&lt;/code&gt;. You can get the configuration that you need to connect to it from Laravel with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;php artisan aiven:getconfig --service dev-db
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Take the output of the command and put it wherever you need your config to go! If you're developing locally, this is &lt;code&gt;.env&lt;/code&gt; by default, but if you use multiple environment config files or are managing your environment differently, you can use these values there instead.&lt;/p&gt;

&lt;h2&gt;
  
  
  Avoid cloud bill surprises
&lt;/h2&gt;

&lt;p&gt;One thing that stops people from modernizing their setup and moving to the cloud, especially for development, is how easy it is to accidentally misconfigure something and get a large bill. It happens too often, and it's definitely something that all developers should be cautious of. Aiven's &lt;strong&gt;pricing includes data transfer&lt;/strong&gt; so the price shown is what you'll pay if you leave the database running full time - this by itself reduces the risks hugely.&lt;/p&gt;

&lt;p&gt;Another way to keep costs under control is to turn things off when you are not using them, and this applies particularly to development platforms. Aiven gives the ability to power off your database when you're not using it, and power it back on unchanged - and we've added support to do that from &lt;code&gt;artisan&lt;/code&gt; with the Aiven-Laravel package.&lt;/p&gt;

&lt;p&gt;When you sit down to do some development work and need your database:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;php artisan aiven:powerup --service dev-db
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;[your amazing things happen here]&lt;/p&gt;

&lt;p&gt;When you have finished for the day, stop the meter from running by powering the service down again until next time:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;php artisan aiven:powerdown --service dev-db
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For databases that don't need to be always on, such as the test platforms for a client that you don't use every day, this approach makes the cloud a very attractive place to keep your databases for low-friction access when you need them.&lt;/p&gt;

&lt;h2&gt;
  
  
  Aiven-Laravel and you
&lt;/h2&gt;

&lt;p&gt;The Aiven-Laravel package is a new release. We know that plenty of PHP developers use our platform already, and we would love to hear from you if you find the tool useful (or not!). What would you add? Is there anyone you think should try out the tool? Issues, stars and pull requests are more than welcome on the &lt;a href="https://github.com/aiven/aiven-laravel/"&gt;GitHub repository&lt;/a&gt; - we're excited!&lt;/p&gt;

</description>
      <category>php</category>
      <category>laravel</category>
      <category>database</category>
      <category>aiven</category>
    </item>
    <item>
      <title>M3 for Metrics</title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Tue, 17 Aug 2021 11:45:51 +0000</pubDate>
      <link>https://dev.to/lornajane/m3-for-metrics-3m11</link>
      <guid>https://dev.to/lornajane/m3-for-metrics-3m11</guid>
      <description>&lt;p&gt;One key use case for Aiven for M3 is for handling the monitoring data from other systems. In this post you will do just that, by setting up a PostgreSQL service and collecting metrics from it using M3. To finish the picture, Grafana will be used to inspect the data you collect.&lt;/p&gt;

&lt;h2&gt;
  
  
  Start with something to monitor
&lt;/h2&gt;

&lt;p&gt;To begin with, you need a PostgreSQL service. It doesn't matter what you choose here since this service isn't going to do any important work, it's merely an exhibition piece to collect metrics from (sorry, PostgreSQL!).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3WHTiPM_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/agd69mr6t7jqqdymni41.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3WHTiPM_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/agd69mr6t7jqqdymni41.png" alt="Screenshot of creating a PostgreSQL service in the Aiven console"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With a PostgreSQL service running, you can start to collect metrics from it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Collect metrics with Aiven for M3
&lt;/h2&gt;

&lt;p&gt;From the PostgreSQL service overview page, configure the metrics integration with a new M3 service:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Choose &lt;strong&gt;Manage integrations&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Locate the one named "Send service metrics to..." (not the one for receiving), and choose &lt;strong&gt;Use integration&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;When prompted, create a new M3DB service, and configure it to your liking. To minimize latency, I usually use the same cloud region as the thing I'm monitoring.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The metrics collected from PostgreSQL are now being sent to M3.&lt;/p&gt;

&lt;h2&gt;
  
  
  Visualize the data with Grafana
&lt;/h2&gt;

&lt;p&gt;The next step is to add the dashboard integration so you can see the data you are collecting. From the M3 service overview page, let's set up a Grafana dashboard:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Choose &lt;strong&gt;Manage integrations&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Dashboard&lt;/strong&gt; and then &lt;strong&gt;Use integration&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Choose a new Grafana service, and again, configure the service to suit your needs.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Once created, you can access Grafana by opening its Service URI in a new browser tab, and logging in with the user and password listed on the service overview page. To get you started, Aiven includes a default dashboard that shows some of the most common metrics.&lt;/p&gt;

&lt;p&gt;To access the pre-built dashboard:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;From the left hand menu, go to the Dashboards section (4 squares) and choose &lt;strong&gt;Manage&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click on the dashboard called "Aiven PostgreSQL - [database name] - Resources"&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You now have a clear view of your PostgreSQL database health.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vKId9-Nv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wdiry5hticipf5j911s6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vKId9-Nv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wdiry5hticipf5j911s6.png" alt="Screenshot of the default Grafana PostgreSQL panel"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Click the dashboard settings, and choose "Save As" to use this dashboard as a template for your own that you can make changes to. The default one gets reset to its original state at intervals.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The &lt;a href="https://developer.aiven.io/docs/products/postgresql/reference/pg-metrics"&gt;list of PostgreSQL metrics exposed to Grafana&lt;/a&gt; gives more information about the fields that are available.&lt;/p&gt;

&lt;h2&gt;
  
  
  Integrations in action
&lt;/h2&gt;

&lt;p&gt;The integrations are now in place, and you can see both the incoming data and the dashboard integration from the M3 service overview page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1XVOYj5d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bzmswvky0s92jkdgnn3r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1XVOYj5d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bzmswvky0s92jkdgnn3r.png" alt="Aiven Console showing that this service receives metrics and has a dashboard integration"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Following the same approach, you can add metrics integrations to all of your other Aiven services, and send data to the M3 database you created. You can also write metrics from non-Aiven services to M3 to bring all the data you need into one place, and to visualize it with Grafana.&lt;/p&gt;

&lt;h2&gt;
  
  
  Further Reading
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Bring metrics from other services into M3 with our &lt;a href="https://help.aiven.io/en/articles/5161440-writing-metrics-to-m3-using-telegraf"&gt;article on using Telegraf with M3&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Build better dashboards with these &lt;a href="https://grafana.com/docs/grafana/latest/best-practices/best-practices-for-creating-dashboards/"&gt;tips from Grafana&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Learn more about &lt;a href="https://grafana.com/docs/grafana/latest/best-practices/best-practices-for-creating-dashboards/"&gt;Aiven service integrations&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>data</category>
      <category>monitoring</category>
      <category>servers</category>
    </item>
    <item>
      <title>A Developer's First Impression of M3DB </title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Tue, 27 Jul 2021 21:05:48 +0000</pubDate>
      <link>https://dev.to/lornajane/a-developer-s-first-impression-of-m3db-2ibg</link>
      <guid>https://dev.to/lornajane/a-developer-s-first-impression-of-m3db-2ibg</guid>
      <description>&lt;p&gt;One of the newest products on the Aiven platform is our M3 offering. As a distributed time series database it's ideal for storing the ever-growing number of data points that we collect in and about modern applications.&lt;/p&gt;

&lt;p&gt;What immediately strikes me about M3 is how it is designed to fit in alongside the other tools in a likely architecture. It has interfaces to match the ones we already know and love from &lt;a href="https://prometheus.io/"&gt;Prometheus&lt;/a&gt; and &lt;a href="https://github.com/influxdata/influxdb"&gt;InfluxDB&lt;/a&gt;. The integration with &lt;a href="https://grafana.com/"&gt;Grafana&lt;/a&gt; is also seamless.&lt;/p&gt;

&lt;p&gt;It's as if M3 is designed to painlessly upgrade a single component of our existing data platform.&lt;/p&gt;

&lt;h2&gt;
  
  
  M3 speaks Prometheus
&lt;/h2&gt;

&lt;p&gt;Prometheus is commonly used for server monitoring. If your platform is already sending data to Prometheus, then changing the data store to M3 will be painless. A server agent such as Telegraf can send data to M3 via its Prometheus write endpoints just as easily as it sends data to Prometheus itself.&lt;/p&gt;
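&lt;p&gt;For instance, pointing Telegraf at M3's Prometheus remote write endpoint can be sketched in a few lines of configuration. This is a hedged example: the service URI, port and credentials are placeholders, and it assumes a Telegraf version that supports the &lt;code&gt;prometheusremotewrite&lt;/code&gt; data format:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[[outputs.http]]
  # placeholder URL - use the service URI from your Aiven console
  url = "https://my-m3-service.aivencloud.com:24947/api/v1/prom/remote/write"
  username = "avnadmin"
  password = "PASSWORD"
  data_format = "prometheusremotewrite"

  [outputs.http.headers]
    Content-Type = "application/x-protobuf"
    Content-Encoding = "snappy"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;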

&lt;p&gt;There's nothing shiny here, because the intentional reuse of protocols does not make good headlines. It does, however, make for excellent integrations and successful migration projects - and I know which I prefer in the tools I choose!&lt;/p&gt;

&lt;p&gt;As our platforms scale and grow, so do our monitoring data requirements. Since we outgrew our previous solution, &lt;a href="https://aiven.io/case-studies/open-source-tools-provide-observability-for-aiven"&gt;Aiven now uses M3 in an observability setup&lt;/a&gt;. We think this platform could be very useful for other organisations on a similar journey.&lt;/p&gt;

&lt;h2&gt;
  
  
  M3 speaks InfluxDB
&lt;/h2&gt;

&lt;p&gt;Whether you are replacing an existing InfluxDB for your time series data or metrics, or adopting M3 and making use of one of the existing tools that integrates with InfluxDB, M3 is here for you. The compatible wire protocol enables both seamless migrations of existing platforms and easy adoption of M3 as a time series database platform in a new application.&lt;/p&gt;

&lt;p&gt;InfluxDB is a mature platform and there are several libraries and integrations available. Since those will also work for M3, the learning curve for pushing data into the application is pretty flat. I found this makes M3 with all its incredible features and scalability a surprisingly approachable offering.&lt;/p&gt;
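&lt;p&gt;As a small illustration, writing a data point over the InfluxDB-compatible protocol can be sketched with curl; the URI, port and credentials below are placeholders, and the &lt;code&gt;/api/v1/influxdb/write&lt;/code&gt; path is the InfluxDB endpoint described in the M3 docs:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -X POST "https://avnadmin:PASSWORD@my-m3-service.aivencloud.com:24947/api/v1/influxdb/write" \
  --data-binary "cpu_load,host=server01 value=0.64"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;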

&lt;h2&gt;
  
  
  Grafanical ease of integration
&lt;/h2&gt;

&lt;p&gt;The thing about time series data is that there's rather a lot of it. That's even more true if you're building at a scale that needs M3. Luckily, the Aiven platform makes it very simple to link the M3 service with a Grafana dashboard using the Prometheus read API that M3 has. This brings your data into Grafana, alongside any other data sources that you're already using, and lets you start visualising what's going on in your application.&lt;/p&gt;

&lt;p&gt;M3 builds on all these existing integrations and adds some magical scalability into the mix. This makes M3 a very exciting prospect for developers, and it's a platform I'll be building on again in future.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Not using Aiven services yet? Sign up now for your free trial at &lt;a href="https://console.aiven.io/signup?utm_source=devto"&gt;https://console.aiven.io/signup&lt;/a&gt;!&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>database</category>
      <category>ops</category>
      <category>data</category>
      <category>metrics</category>
    </item>
    <item>
      <title>Identifying Time Series Data </title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Fri, 16 Jul 2021 16:16:54 +0000</pubDate>
      <link>https://dev.to/lornajane/identifying-time-series-data-5d14</link>
      <guid>https://dev.to/lornajane/identifying-time-series-data-5d14</guid>
      <description>&lt;p&gt;One key use case for Aiven for M3 is for handling the monitoring data from other systems. In this post you will do just that, by setting up a PostgreSQL service and collecting metrics from it using M3. To finish the picture, Grafana is used to inspect the data you collect.&lt;/p&gt;

&lt;h2&gt;
  
  
  Start with something to monitor
&lt;/h2&gt;

&lt;p&gt;Start with a PostgreSQL service. It doesn't matter what you choose here since this service isn't going to do any important work, it's merely an exhibition piece to collect metrics from (sorry, PostgreSQL!).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1azecouJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ciq2g0e8bot3suae5076.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1azecouJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ciq2g0e8bot3suae5076.png" alt="Screenshot of creating a PostgreSQL service in the Aiven console"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With a PostgreSQL service running, you can start to collect metrics from it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Collect metrics with Aiven for M3
&lt;/h2&gt;

&lt;p&gt;From the PostgreSQL service overview page, configure the metrics integration with a new M3 service:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Choose "Manage integrations"&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Locate the integration for sending metrics (not the receiving one), and choose "Use integration"&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;When prompted, create a new M3DB service, and configure it to your liking. I usually use the same cloud region as the thing I'm monitoring.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With PostgreSQL now having its metrics collected and sent to M3, your next step is to add the dashboard integration so you can see the data.&lt;/p&gt;

&lt;h2&gt;
  
  
  Visualize the data with Grafana
&lt;/h2&gt;

&lt;p&gt;From the M3 service overview page, let's set up a Grafana dashboard:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Choose "Manage integrations"&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select "Dashboard" and then "Use integration"&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Choose a new Grafana service, and again, configure the service to suit your needs.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once created, you can access Grafana by opening its Service URI in a new browser tab, and logging in with the User and Password listed on the service overview page. To get you started, Aiven includes a default dashboard that shows some of the most common metrics.&lt;/p&gt;

&lt;p&gt;To access the pre-built dashboard:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;From the left-hand menu, go to the Dashboards section (4 squares) and choose "Manage"&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click on the dashboard called "Aiven PostgreSQL - [database name] - Resources"&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You now have a clear view of your PostgreSQL database health.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--FGop5TIb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/re4hzyyrvb4lvuz3ygul.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FGop5TIb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/re4hzyyrvb4lvuz3ygul.png" alt="Screenshot of the default Grafana PostgreSQL panel"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Click the dashboard settings and choose "Save As" to use this dashboard as a template for your own that you can make changes to. The default one gets reset to its original state at intervals.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Integrations in action
&lt;/h2&gt;

&lt;p&gt;The integrations are now in place, and you can see both the incoming metrics and the dashboard integration from the M3 service overview page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fH37xI9s--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mp3i4wvj9gonfk3b2zyw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fH37xI9s--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mp3i4wvj9gonfk3b2zyw.png" alt="Aiven Console showing that this service receives metrics and has a dashboard integration"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Following the same approach, you can add metrics integrations to all of your other Aiven services and send data to the M3 database you created. You can also write metrics from non-Aiven services to M3 to bring all the data you need into one place, and to visualize it with Grafana.&lt;/p&gt;

&lt;h2&gt;
  
  
  Further Reading
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Bring metrics from other services into M3 with our &lt;a href="https://help.aiven.io/en/articles/5161440-writing-metrics-to-m3-using-telegraf"&gt;article on using Telegraf with M3&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Build better dashboards with these &lt;a href="https://grafana.com/docs/grafana/latest/best-practices/best-practices-for-creating-dashboards/"&gt;tips from Grafana&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Learn more about &lt;a href="https://help.aiven.io"&gt;Aiven service integrations&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>data</category>
      <category>events</category>
      <category>timeseries</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Tips for Designing Apache Kafka Message Payloads</title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Thu, 29 Apr 2021 09:42:21 +0000</pubDate>
      <link>https://dev.to/lornajane/tips-for-designing-apache-kafka-message-payloads-5alk</link>
      <guid>https://dev.to/lornajane/tips-for-designing-apache-kafka-message-payloads-5alk</guid>
      <description>&lt;p&gt;Event-driven systems are increasingly our future, and that's one reason why so many developers are adding Apache Kafka to their tech stacks. Getting the different components in your systems talking nicely to one another relies on a rather mundane but crucial detail: a good data structure in the message payloads. This article picks out some of the best advice we have for getting your Apache Kafka data payloads well designed from the very beginning of your project.&lt;/p&gt;

&lt;h2&gt;
  
  
  Use all the features of Apache Kafka records
&lt;/h2&gt;

&lt;p&gt;The events that we stream with Kafka can support headers as well as keys and the main body of the payload. The most scalable systems use all these features appropriately.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fimages.ctfassets.net%2Fq3u27v6lpki8%2F6DEJhvIPyF84skyZZ9UCOk%2F16c2809e0ddc15f73ad90227d5e8acd0%2FInline-image.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fimages.ctfassets.net%2Fq3u27v6lpki8%2F6DEJhvIPyF84skyZZ9UCOk%2F16c2809e0ddc15f73ad90227d5e8acd0%2FInline-image.png" alt="diagram showing the header, key and value as boxes inside the payload"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use the header for metadata about the payload, such as the &lt;a href="https://opentelemetry.io" rel="noopener noreferrer"&gt;OpenTelemetry&lt;/a&gt; trace IDs. It can also be useful to duplicate some of the fields from the payload itself, if they are used for routing or filtering the data. In secure systems, intermediate components may not have access to the whole payload, so putting the data in the header can expose just the appropriate fields there. Also consider that, for larger payloads, the overhead of deserializing can be non-trivial. Being able to access just a couple of fields while keeping the system moving can help performance, too.&lt;/p&gt;

&lt;p&gt;The keys in Apache Kafka typically do get more attention than the headers, but we should still make sure we are using them as a force for good. When a producer sends data to Kafka, it specifies which topic it should be sent to. The key usually defines which partition is used. If the key isn't set, then the data will be spread evenly across the partitions using a round-robin approach. For a lot of unrelated events in a stream, this makes good use of your resources.&lt;/p&gt;

&lt;p&gt;If the key you're using doesn't vary much, your events can get bunched into a small number of partitions (rather than spread out). When this happens, try adding more fields to give more granular partition routing. Keep in mind that the contents of each partition will be processed in order, so it still makes sense to keep logical groupings of data.&lt;/p&gt;

&lt;p&gt;For example, consider a collection of imaginary factories where all the machines can send events. Mostly they send &lt;code&gt;sensor_reading&lt;/code&gt; events, but they can also send &lt;code&gt;alarm&lt;/code&gt; events, which are like a paper jam in the printer but on a factory scale! Using a key like this will give us a LOT of data on one partition:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"sensor_reading"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So we could add another field to the key for these readings, maybe to group them by factory location:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"sensor_reading"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"factory_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;44891&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Combining the type and factory in the key ensures that records of the same event type and the same factory will be processed in the order they were received. When it comes to designing the payloads, thinking about these aspects early on in the implementation process can help avoid performance bottlenecks later.&lt;/p&gt;
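The effect of the key on partition spread can be sketched in a few lines of Python. This is a simplified stand-in (Kafka's default partitioner actually murmur2-hashes the serialised key bytes; the CRC32 here just illustrates the same hash-then-modulo idea):

```python
import json
import zlib

NUM_PARTITIONS = 6

def partition_for(key: dict) -> int:
    """Simplified stand-in for Kafka's default partitioner: hash the
    serialised key, then take it modulo the partition count."""
    raw = json.dumps(key, sort_keys=True).encode("utf-8")
    return zlib.crc32(raw) % NUM_PARTITIONS

# A low-cardinality key: every sensor_reading lands on one partition.
only_type = {partition_for({"type": "sensor_reading"}) for _ in range(100)}

# Adding factory_id spreads the same events across partitions, while
# readings from any single factory still stay ordered on one partition.
with_factory = {partition_for({"type": "sensor_reading", "factory_id": f})
                for f in range(100)}

print(f"partitions used: {len(only_type)} vs {len(with_factory)}")
```

Running this shows the type-only key funnelling everything onto a single partition, while the combined key spreads the load.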

&lt;h2&gt;
  
  
  Data structures: nested data or simple layout?
&lt;/h2&gt;

&lt;p&gt;No matter how certain I am that this payload will only ever contain a collection of things, I always use an object structure rather than making the data an array at the top level. Sometimes, it just leaves a rather lonely fieldname with a collection to take care of. But when things change and I &lt;em&gt;do&lt;/em&gt; need to add an extra field, this "one weird trick" makes me very grateful.&lt;/p&gt;

&lt;p&gt;Make no mistake, it's not foresight. It's the scars of the first API I ever shipped having to move to v1.1 within a week of launch for precisely this reason. Learn from my mistakes!&lt;/p&gt;
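A minimal sketch of the difference, reusing fieldnames from the order example below (the version labels are illustrative):

```python
# Array at the top level: adding a sibling field later is a breaking
# change for every consumer that expects a bare list.
payload_v1 = [
    {"part": "hinge_cup_sg7", "quantity": 18},
]

# Object at the top level: the collection gets a (briefly lonely)
# fieldname, but new fields can slot in later without breaking anyone.
payload_v1_1 = {
    "bom": [
        {"part": "hinge_cup_sg7", "quantity": 18},
    ],
    "stores_request_id": 10004352789,  # added after launch, non-breaking
}
```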

&lt;p&gt;In general, it's also helpful to group related fields together; once a payload has 30 fields sorted alphabetically, you will wish you had done something differently! Here's an example showing what I mean:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"stores_request_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;10004352789&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"parent_order"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"order_ref"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;777289&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"agent"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Mr Thing (1185)"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"bom"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"part"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"hinge_cup_sg7"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"quantity"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;18&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"part"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"worktop_kit_sm"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"quantity"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"part"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"softcls_norm2"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"quantity"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;9&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Using the &lt;code&gt;parent_order&lt;/code&gt; object to keep the order ref, its responsible person, and any other related data together makes for an easily navigable structure, more so than having those fields scattered across the payload. It also avoids having to artificially group the fields using a prefix. Don't be afraid to introduce extra levels of data nesting to keep your data logically organised.&lt;/p&gt;

&lt;p&gt;How &lt;em&gt;much&lt;/em&gt; data to include is another tricky subject. With most Kafka platforms limiting payloads to 1MB, it's important to choose carefully. For text-based data, 1MB holds quite a lot of information, and even more if a binary format such as Avro or Protobuf is used (more on those in a moment). As a general rule of thumb, if you are trying to send a file in a Kafka payload, you are probably doing it wrong!&lt;/p&gt;
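A producer-side guard for this is cheap to add. This sketch assumes JSON payloads and uses 1MB as a stand-in for the broker's limit (the real cap is the broker's `message.max.bytes` setting, so check yours):

```python
import json

# Stand-in for the broker's limit; Kafka's message.max.bytes defaults
# to roughly 1MB, but the actual value is broker configuration.
MAX_PAYLOAD_BYTES = 1_000_000

def encoded_size(payload: dict) -> int:
    """Size of the payload as compact JSON, in bytes."""
    return len(json.dumps(payload, separators=(",", ":")).encode("utf-8"))

def check_payload(payload: dict) -> dict:
    """Refuse payloads that would exceed the broker's limit."""
    size = encoded_size(payload)
    if size > MAX_PAYLOAD_BYTES:
        raise ValueError(
            f"payload is {size} bytes; send a link to the resource instead"
        )
    return payload
```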

&lt;p&gt;These design tradeoffs are nothing new and I rely mostly on the prior art in the API/webhooks space to inform my decisions. For example, &lt;a href="https://en.wikipedia.org/wiki/Hypermedia" rel="noopener noreferrer"&gt;hypermedia&lt;/a&gt; is the practice of including links to resources rather than the whole resource. Publishing messages that will cause every subscriber to make follow-on calls is a good way to create load problems for your server but hypermedia can be a useful middle ground, especially where the linked resources are cacheable.&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Formats: JSON, Avro ... these are not real words
&lt;/h2&gt;

&lt;p&gt;Wading through the jargon of data formats is a mission by itself, but I'd like to give some special mentions to my favourites!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;JSON: Keep it simple&lt;/strong&gt; JSON formats are very easy to understand, write, read and debug. They can use a JSON Schema to ensure they fulfil an expected data structure, but you can equally well go freeform for prototyping and iterating quickly. For small data payloads, I often start here and never travel any further. However, JSON is fairly large in size for the amount of data it transmits, and it also has a rather relaxed relationship with data types. In applications where either or both of these issues cause a problem, then I move on from JSON and choose something a bit more advanced.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Avro: Small and schema-driven&lt;/strong&gt; &lt;a href="http://avro.apache.org/" rel="noopener noreferrer"&gt;Apache Avro&lt;/a&gt; is a serialisation system that keeps the data tidy and small, which is ideal for Kafka records. The data structure is described with a schema (example below) and messages can only be created if they conform to the requirements of the schema. The producer takes the data and the schema, produces a message that goes to the Kafka broker, and registers the schema with a schema registry. The consumers do the same in reverse: take the message, ask the schema registry for the schema, and assemble the full data structure. Avro has a strong respect for data types, requires that all payloads conform to the schema, and since data such as fieldnames is encoded in the schema rather than repeated in every payload, the overall payload size is reduced.&lt;/p&gt;

&lt;p&gt;Here's an example Avro schema:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"namespace"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"io.aiven.example"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"record"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"MachineSensor"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"fields"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"machine"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"doc"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"The machine whose sensor this is"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"sensor"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"doc"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Which sensor was read"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"value"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"float"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"doc"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Sensor reading"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"units"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"doc"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Measurement units"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
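To see why keeping fieldnames in the schema rather than in every payload shrinks the messages, here's a stdlib-only illustration. This is emphatically not Avro's real wire format, just the same idea: both sides share the schema, so only the values travel.

```python
import json
import struct

# A record matching the MachineSensor schema above.
record = {"machine": "press-7", "sensor": "temp_c",
          "value": 88.5, "units": "celsius"}

# JSON repeats every fieldname in every single message.
json_bytes = json.dumps(record, separators=(",", ":")).encode("utf-8")

# A schema-driven encoding sends only the values; field names, order
# and types live in the schema that producer and consumer both share.
FIELDS = [("machine", "s"), ("sensor", "s"), ("value", "f"), ("units", "s")]

def encode(rec: dict) -> bytes:
    out = b""
    for name, kind in FIELDS:
        if kind == "s":
            raw = rec[name].encode("utf-8")
            out += struct.pack(">H", len(raw)) + raw  # length-prefixed string
        else:
            out += struct.pack(">f", rec[name])       # 4-byte float
    return out

print(len(json_bytes), len(encode(record)))  # schema-driven form is smaller
```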



&lt;p&gt;There are other alternatives, notably &lt;a href="https://developers.google.com/protocol-buffers" rel="noopener noreferrer"&gt;Protocol Buffers, known as Protobuf&lt;/a&gt;. It achieves similar goals, but works by generating code to use in your own application, and it's available for fewer tech stacks. If it's available for yours, it's worth a look.&lt;/p&gt;

&lt;h2&gt;
  
  
  A note on timestamps
&lt;/h2&gt;

&lt;p&gt;Kafka adds a publish timestamp to every record. However, it can also be useful to include your own timestamps in some situations, such as when the data is gathered at a different time from when it is published, or when a retry implementation is needed. And since Apache Kafka allows additional consumers to reprocess records later, a timestamp can give a handy insight into progress through an existing data set.&lt;/p&gt;

&lt;p&gt;If I could make rules, I'd make rules about timestamp formats! The only acceptable formats are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Seconds since the epoch &lt;code&gt;1615910306&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;ISO 8601 format &lt;code&gt;2021-05-11T10:58:26Z&lt;/code&gt;, including timezone information; I should not have to know where on the planet, on which day of the year, this payload was created.&lt;/li&gt;
&lt;/ul&gt;
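Both formats are easy to produce with Python's standard library; the values in the comments are illustrative:

```python
from datetime import datetime, timezone

now = datetime.now(timezone.utc)

epoch_seconds = int(now.timestamp())          # e.g. 1615910306
iso_8601 = now.isoformat(timespec="seconds")  # e.g. 2021-05-11T10:58:26+00:00

# Both formats name the same instant, and both carry the timezone
# (epoch seconds are UTC by definition).
parsed = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
assert parsed == datetime.fromisoformat(iso_8601)
```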

&lt;h2&gt;
  
  
  Design with intention
&lt;/h2&gt;

&lt;p&gt;With the size limitations on the payloads supported by Apache Kafka, it's important to only include fields that can justify their own inclusion. When the consumers of the data are known, it's easier to plan for their context and likely use cases. When they're not, that's a more difficult assignment but the tips shared here will hopefully set you on a road to success.&lt;/p&gt;

&lt;h2&gt;
  
  
  Further reading
&lt;/h2&gt;

&lt;p&gt;If you found this post useful, how about one of these resources to read next?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://help.aiven.io/en/articles/489572-getting-started-with-aiven-for-apache-kafka" rel="noopener noreferrer"&gt;Getting Started with Apache Kafka on Aiven&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://help.aiven.io/en/articles/2302613-using-schema-registry-with-aiven-kafka" rel="noopener noreferrer"&gt;Apache Kafka and the schema registry&lt;/a&gt; (with Java code examples)&lt;/li&gt;
&lt;li&gt;The &lt;a href="http://avro.apache.org/" rel="noopener noreferrer"&gt;Apache Avro&lt;/a&gt; project&lt;/li&gt;
&lt;li&gt;Another open standards project, &lt;a href="https://opentelemetry.io/docs/" rel="noopener noreferrer"&gt;OpenTelemetry&lt;/a&gt;, for finding out more about adding tracing to your Kafka applications&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>data</category>
      <category>opensource</category>
      <category>kafka</category>
    </item>
    <item>
      <title>Manage Database Infrastructure with Aiven's API</title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Tue, 30 Mar 2021 16:17:52 +0000</pubDate>
      <link>https://dev.to/lornajane/manage-database-infrastructure-with-aiven-s-api-5h1n</link>
      <guid>https://dev.to/lornajane/manage-database-infrastructure-with-aiven-s-api-5h1n</guid>
      <description>&lt;p&gt;Perhaps you recognise Aiven as the multicloud platform for all your open source data needs, with the lovely web interface. And perhaps, if you didn't already, you recognise us that way now :) But Aiven is more than just a pretty web interface, we also have an API for all your integration/automation needs. In this post, we'll take a look at using the API and what you can do with it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Meet the API
&lt;/h2&gt;

&lt;p&gt;Aiven offers an HTTP API with token authentication and JSON-formatted data. A great place to start is the &lt;a href="https://api.aiven.io/doc/"&gt;API reference documentation&lt;/a&gt; which gives you the details of all the endpoints. The API supports pretty much everything you can do with the &lt;a href="https://console.aiven.io"&gt;web console&lt;/a&gt; ... because the web console uses the API too!&lt;/p&gt;

&lt;h2&gt;
  
  
  Get your auth token
&lt;/h2&gt;

&lt;p&gt;To access the API, you'll need an access token. To obtain this, log into your Aiven account and go to &lt;a href="https://console.aiven.io/profile/auth"&gt;your profile page&lt;/a&gt;; then under "Authentication Tokens" click "Generate Token". This prompts for a token description (for your own reference) and some expiry configuration. It's probably a good idea to set the token to expire after a while (for example, a week is 168 hours) but of course the choice is yours.&lt;/p&gt;

&lt;p&gt;Copy the new token to somewhere safe; you won't be able to access it again.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If you lose your token, revoke it and generate a new one at any time. It's good practice to rotate tokens from time to time. Note to self: revoke the token used in these examples when this post is published!&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  First steps: using cURL
&lt;/h2&gt;

&lt;p&gt;We'll be using the terminal for this section, so set the token you copied before as an environment variable:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export AIVEN_API_TOKEN=[paste]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Starting simple, try getting a &lt;a href="https://api.aiven.io/doc/#operation/ProjectList"&gt;list of projects&lt;/a&gt; using the API. Here's a &lt;a href="https://curl.se/"&gt;&lt;code&gt;cURL&lt;/code&gt;&lt;/a&gt; command showing this in action.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -H "Authorization: aivenv1 $AIVEN_API_TOKEN" https://api.aiven.io/v1/project
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I use this call quite often, because many endpoints take the project as a parameter. If reading raw JSON isn't your thing, then one option is to use a tool called &lt;a href="https://stedolan.github.io/jq/"&gt;jq&lt;/a&gt; to make things more readable. Pipe your curl output to it, like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -s -H "Authorization: aivenv1 $AIVEN_API_TOKEN" https://api.aiven.io/v1/project | jq ".project_membership"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This returns the fields we're most interested in: the project names and our role for each.&lt;/p&gt;
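If you'd rather make the call from code, the same request is a few lines of Python using only the standard library. This is a sketch: the endpoint and the `aivenv1` auth scheme come from the cURL example above, and the token handling mirrors the environment variable we set earlier:

```python
import json
import os
import urllib.request

API_BASE = "https://api.aiven.io/v1"

def project_list_request(token: str) -> urllib.request.Request:
    """Build the GET /project call with Aiven's aivenv1 auth scheme."""
    return urllib.request.Request(
        f"{API_BASE}/project",
        headers={"Authorization": f"aivenv1 {token}"},
    )

# Usage (needs a valid token in AIVEN_API_TOKEN):
# with urllib.request.urlopen(
#         project_list_request(os.environ["AIVEN_API_TOKEN"])) as resp:
#     print(json.load(resp))
```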

&lt;h2&gt;
  
  
  API calls with Postman
&lt;/h2&gt;

&lt;p&gt;If you prefer a GUI application for trying out API calls, then that seems pretty reasonable to me! There are lots of options but for this example I'll be using my personal favourite, &lt;a href="https://postman.com"&gt;Postman&lt;/a&gt;. It's available as a cross-platform desktop app, or it can be used from the browser.&lt;/p&gt;

&lt;p&gt;✨ Magic alert ✨ Since Aiven offers an OpenAPI description of the API, we can use this to get a ready-made set of collections to use in Postman!&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Download the OpenAPI description file from the API documentation itself, using the "Download" button.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ZXfYskpq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xd3sn7p22txwla78px9g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ZXfYskpq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xd3sn7p22txwla78px9g.png" alt="download button on API documentation"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Import the OpenAPI file into Postman. This creates a new collection in the left hand bar called "Aiven".&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ejQZP-pO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2lhw2ohr1b86d2ptspaj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ejQZP-pO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2lhw2ohr1b86d2ptspaj.png" alt="postman collection is now imported"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;On the left side of the window, right-click your new Aiven collection and select Edit. On the "Authorization" tab, on the Type drop-down, select Bearer Token. Copy your API token into the field on the right, or as shown in the screenshot, use &lt;a href="https://learning.postman.com/docs/sending-requests/variables/"&gt;Postman's variables feature&lt;/a&gt; to store this separately.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--P0FzuD6T--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qj72rg7v6xsm1oa64joq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--P0FzuD6T--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qj72rg7v6xsm1oa64joq.png" alt='set collection-level authorization option to be "bearer token"'&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You're all set! Check things are working by looking in the "Project" folder in the collection and running the "List projects" request. Postman helpfully detects the JSON response and formats the output nicely for us.&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating Aiven services from the API
&lt;/h2&gt;

&lt;p&gt;We can do more than just fetch some data fields from the API; let's go ahead and create some services.&lt;/p&gt;

&lt;p&gt;By default, Postman organises requests by URL segment, so we need to go digging at this point! Under "project" and then "{project}", it's the "Services" folder we want. And there: "Create a service" is the request to open.&lt;/p&gt;

&lt;p&gt;On the "Params" tab, enter the project name that you will be using.&lt;/p&gt;

&lt;p&gt;Check the Authorization tab has the type set to "Inherit auth from parent". This will pick up on the token or variable we configured on the collection earlier.&lt;/p&gt;

&lt;p&gt;Creating a service requires a specific JSON structure to be sent to the API, but on the "Body" tab Postman already has the outline, and we can add the values we want to use. This endpoint allows quite a lot of configuration, but checking the &lt;a href="https://api.aiven.io/doc/#operation/ServiceCreate"&gt;API docs for the Create Service endpoint&lt;/a&gt; shows that not all fields are required. In fact, this payload is enough to get a service going:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"plan"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"business-4"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"service_name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"data-and-clouds"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"service_type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"pg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"cloud"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"google-europe-north1"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If all goes well, you will get a response with a 200 status and some data about your new service! The service takes a few minutes to become responsive, but essentially that's all there is to it.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Wondering which other clouds you could choose? Try the &lt;a href="https://api.aiven.io/doc/#operation/ListClouds"&gt;list clouds endpoint&lt;/a&gt; and take your pick!&lt;/p&gt;
&lt;/blockquote&gt;
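If you later want to make the same call from code, the payload above is easy to build programmatically. Here's a hedged Go sketch: the struct fields mirror the JSON body shown above, and the code only marshals the payload; actually sending it means a POST to the Create Service endpoint with the same bearer token as before.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// servicePayload mirrors the minimal create-service body shown above.
type servicePayload struct {
	Plan        string `json:"plan"`
	ServiceName string `json:"service_name"`
	ServiceType string `json:"service_type"`
	Cloud       string `json:"cloud"`
}

// buildCreateServiceBody marshals the example payload from this post.
func buildCreateServiceBody() ([]byte, error) {
	return json.Marshal(servicePayload{
		Plan:        "business-4",
		ServiceName: "data-and-clouds",
		ServiceType: "pg",
		Cloud:       "google-europe-north1",
	})
}

func main() {
	body, err := buildCreateServiceBody()
	if err != nil {
		panic(err)
	}
	// POST this body to /v1/project/{project}/service with your bearer token.
	fmt.Println(string(body))
}
```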

&lt;h2&gt;
  
  
  Whatever next?
&lt;/h2&gt;

&lt;p&gt;We'd love to hear how you use our API in your workflow!&lt;/p&gt;

&lt;p&gt;Depending on what you're interested in, try some of these handy links next:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;a href="https://api.aiven.io/doc"&gt;Aiven API reference&lt;/a&gt; for more information on any of the endpoints not covered here&lt;/li&gt;
&lt;li&gt;We also have a &lt;a href="https://github.aiven.io/aiven-client"&gt;command-line tool&lt;/a&gt;, an excellent addition to the developer's toolbox when working with Aiven&lt;/li&gt;
&lt;li&gt;For infrastructure integration, try the &lt;a href="https://registry.terraform.io/providers/aiven/aiven/latest/docs"&gt;Aiven Terraform provider&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Check out &lt;a href="https://postman.com"&gt;Postman&lt;/a&gt;, the HTTP client used in the examples in this post.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>database</category>
      <category>api</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Go and PostgreSQL with pgx and squirrel libraries</title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Fri, 26 Feb 2021 00:00:00 +0000</pubDate>
      <link>https://dev.to/lornajane/go-and-postgresql-with-pgx-and-squirrel-libraries-25jo</link>
      <guid>https://dev.to/lornajane/go-and-postgresql-with-pgx-and-squirrel-libraries-25jo</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe0wsvzqxax4rqmokmx2o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe0wsvzqxax4rqmokmx2o.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Applications and databases go together like milk and cookies; on the menu today is a PostgreSQL-flavored cookie with a splash of refreshing Go milk. I've picked two of my favorite Go libraries to show how to connect to and work with your Aiven PostgreSQL service. The libraries are &lt;a href="https://github.com/jackc/pgx" rel="noopener noreferrer"&gt;pgx&lt;/a&gt; which is a nice PostgreSQL-specific alternative to &lt;code&gt;database/sql&lt;/code&gt; and &lt;a href="https://github.com/Masterminds/squirrel" rel="noopener noreferrer"&gt;Squirrel&lt;/a&gt; because I love the name (it also has some quite cool features).&lt;/p&gt;

&lt;h2&gt;
  
  
  Start with a database
&lt;/h2&gt;

&lt;p&gt;It would make sense to already have a PostgreSQL database handy, ideally with some data in it. If you don't already have an Aiven account, then &lt;a href="https://console.aiven.io/signup" rel="noopener noreferrer"&gt;sign up&lt;/a&gt; and start a PostgreSQL service. If you already have some data to use, great! I'm using some open data from the Kepler space mission; you can follow along with the recent blog post to set this up yourself.&lt;/p&gt;

&lt;p&gt;In the Aiven Console, copy the connection string for your PostgreSQL service - or, if you are hosting your database elsewhere, copy the equivalent &lt;code&gt;postgres://....&lt;/code&gt; connection string.&lt;/p&gt;

&lt;h2&gt;
  
  
  Connecting to PostgreSQL from Go
&lt;/h2&gt;

&lt;p&gt;Go has built-in database support in its &lt;code&gt;database/sql&lt;/code&gt; library, but it's quite generic since it has to be able to cope with so many different database platforms. For applications that are specifically connecting to PostgreSQL, it makes sense to use a specialist library.&lt;/p&gt;

&lt;p&gt;For this example I chose &lt;code&gt;jackc/pgx&lt;/code&gt; which is PostgreSQL-specific and has some nice features around understanding PostgreSQL data types in addition to improved performance. The overall pattern isn't radically different from other applications using &lt;code&gt;database/sql&lt;/code&gt;, which makes it feel quite familiar.&lt;/p&gt;

&lt;p&gt;Set the connection string you copied earlier as the &lt;code&gt;DATABASE_URL&lt;/code&gt; environment variable. Then initialise your Go application with &lt;code&gt;go mod init pgfun&lt;/code&gt;; &lt;code&gt;pgx&lt;/code&gt; uses Go modules.&lt;/p&gt;

&lt;p&gt;When you have everything set up, try the code example below to connect to a database and run one query (my database has the exoplanets data in it):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;
&lt;span class="k"&gt;package&lt;/span&gt; &lt;span class="n"&gt;main&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="s"&gt;"context"&lt;/span&gt;
    &lt;span class="s"&gt;"fmt"&lt;/span&gt;
    &lt;span class="s"&gt;"os"&lt;/span&gt;

    &lt;span class="s"&gt;"github.com/jackc/pgx/v4/pgxpool"&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;func&lt;/span&gt; &lt;span class="n"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;dbpool&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;dberr&lt;/span&gt; &lt;span class="o"&gt;:=&lt;/span&gt; &lt;span class="n"&gt;pgxpool&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Connect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Background&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Getenv&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"DATABASE_URL"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;dberr&lt;/span&gt; &lt;span class="o"&gt;!=&lt;/span&gt; &lt;span class="no"&gt;nil&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nb"&gt;panic&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Unable to connect to database"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;defer&lt;/span&gt; &lt;span class="n"&gt;dbpool&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Close&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="n"&gt;sql&lt;/span&gt; &lt;span class="o"&gt;:=&lt;/span&gt; &lt;span class="s"&gt;"SELECT kepler_name, koi_score "&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;
        &lt;span class="s"&gt;"FROM cumulative "&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;
        &lt;span class="s"&gt;"WHERE kepler_name IS NOT NULL AND koi_pdisposition = 'CANDIDATE' "&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;
        &lt;span class="s"&gt;"ORDER BY koi_score LIMIT 5"&lt;/span&gt;
    &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sqlerr&lt;/span&gt; &lt;span class="o"&gt;:=&lt;/span&gt; &lt;span class="n"&gt;dbpool&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Query&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Background&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;sql&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;sqlerr&lt;/span&gt; &lt;span class="o"&gt;!=&lt;/span&gt; &lt;span class="no"&gt;nil&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nb"&gt;panic&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;fmt&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Sprintf&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"QueryRow failed: %v"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sqlerr&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Next&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;var&lt;/span&gt; &lt;span class="n"&gt;planet_name&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt;
        &lt;span class="k"&gt;var&lt;/span&gt; &lt;span class="n"&gt;score&lt;/span&gt; &lt;span class="kt"&gt;float64&lt;/span&gt;
        &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Scan&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;planet_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;score&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;fmt&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Printf&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"%s&lt;/span&gt;&lt;span class="se"&gt;\t&lt;/span&gt;&lt;span class="s"&gt;%.2f&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;planet_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;score&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the &lt;code&gt;main()&lt;/code&gt; function, the code first connects to the database using the &lt;a href="https://help.aiven.io/en/articles/964730-postgresql-connection-pooling" rel="noopener noreferrer"&gt;connection pool&lt;/a&gt; features of &lt;code&gt;pgx&lt;/code&gt;. As a general rule, pooling connections for PostgreSQL is a wise move, which is why it's used here.&lt;/p&gt;

&lt;p&gt;Then a simple SQL statement selects the exoplanets that the Kepler mission assigned a "CANDIDATE" status, along with how confident the team was that each one is a real planet. The statement has a row limit (the dataset has ~2k rows without the limit), so the final section, which iterates over the rows and reads the data into variables, only shows the five lowest-confidence-scoring planets.&lt;/p&gt;

&lt;p&gt;I love that the names are here - if you're interested, you can even look up the planets in the &lt;a href="https://exoplanets.nasa.gov/discovery/exoplanet-catalog/?" rel="noopener noreferrer"&gt;Exoplanet Catalogue&lt;/a&gt;. I spent way too much time browsing that; science is ace!&lt;/p&gt;

&lt;p&gt;So to recap, we have one PostgreSQL connection string, one Go database library and a hardcoded SQL query. What's next?&lt;/p&gt;

&lt;h2&gt;
  
  
  Fluent Interfaces with Squirrel
&lt;/h2&gt;

&lt;p&gt;While the &lt;code&gt;pgx&lt;/code&gt; library is a PostgreSQL-specific alternative to &lt;code&gt;database/sql&lt;/code&gt;, it also has a compatible interface. This is ideal if you want to switch an existing application over, but it also means that other tools designed to play nicely with &lt;code&gt;database/sql&lt;/code&gt; can work with &lt;code&gt;pgx&lt;/code&gt; too. That's good news for me because I want to find something more elegant than my hardcoded SQL string.&lt;/p&gt;

&lt;p&gt;Squirrel!&lt;/p&gt;

&lt;p&gt;No, I didn't get distracted; it's an SQL library, although it is a bit of a shiny toy. It offers a fluent interface for building SQL queries, and SQL lends itself well to being thought about in this way.&lt;/p&gt;

&lt;p&gt;Rebuilding our existing SQL query in this interface produces something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;    &lt;span class="n"&gt;planets&lt;/span&gt; &lt;span class="o"&gt;:=&lt;/span&gt; &lt;span class="n"&gt;psql&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Select&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"kepler_name"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"koi_score"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;From&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"cumulative"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;Where&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sq&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;NotEq&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="s"&gt;"kepler_name"&lt;/span&gt;&lt;span class="o"&gt;:&lt;/span&gt; &lt;span class="no"&gt;nil&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;Where&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sq&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Eq&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="s"&gt;"koi_pdisposition"&lt;/span&gt;&lt;span class="o"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"CANDIDATE"&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;OrderBy&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"koi_score"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;Limit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="m"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It's a bit more manageable, and I can imagine building this in a real application much more easily than concatenating bits of SQL string together. It's still possible to inspect the SQL it built as a string by calling &lt;code&gt;planets.ToSql()&lt;/code&gt;, which is a nice touch.&lt;/p&gt;

&lt;p&gt;Switching &lt;code&gt;pgx&lt;/code&gt; over to the "pretend to be &lt;code&gt;database/sql&lt;/code&gt;" mode needed a little refactoring, and a spoonful of secret sauce to make Squirrel play nicely with PostgreSQL, so here's the whole runnable example at once. It expects the database connection string to be in &lt;code&gt;DATABASE_URL&lt;/code&gt;, as in the previous example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight go"&gt;&lt;code&gt;&lt;span class="k"&gt;package&lt;/span&gt; &lt;span class="n"&gt;main&lt;/span&gt;

&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="s"&gt;"database/sql"&lt;/span&gt;
    &lt;span class="s"&gt;"fmt"&lt;/span&gt;
    &lt;span class="s"&gt;"os"&lt;/span&gt;

    &lt;span class="n"&gt;sq&lt;/span&gt; &lt;span class="s"&gt;"github.com/Masterminds/squirrel"&lt;/span&gt;
    &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="s"&gt;"github.com/jackc/pgx/v4/stdlib"&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;func&lt;/span&gt; &lt;span class="n"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;db&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;err&lt;/span&gt; &lt;span class="o"&gt;:=&lt;/span&gt; &lt;span class="n"&gt;sql&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"pgx"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Getenv&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"DATABASE_URL"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;err&lt;/span&gt; &lt;span class="o"&gt;!=&lt;/span&gt; &lt;span class="no"&gt;nil&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nb"&gt;panic&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Unable to connect to database"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;defer&lt;/span&gt; &lt;span class="n"&gt;db&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Close&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c"&gt;// a little magic to tell squirrel it's postgres&lt;/span&gt;
    &lt;span class="n"&gt;psql&lt;/span&gt; &lt;span class="o"&gt;:=&lt;/span&gt; &lt;span class="n"&gt;sq&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;StatementBuilder&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PlaceholderFormat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sq&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Dollar&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;planets&lt;/span&gt; &lt;span class="o"&gt;:=&lt;/span&gt; &lt;span class="n"&gt;psql&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Select&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"kepler_name"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"koi_score"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;From&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"cumulative"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;Where&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sq&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;NotEq&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="s"&gt;"kepler_name"&lt;/span&gt;&lt;span class="o"&gt;:&lt;/span&gt; &lt;span class="no"&gt;nil&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;Where&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sq&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Eq&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="s"&gt;"koi_pdisposition"&lt;/span&gt;&lt;span class="o"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"CANDIDATE"&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;OrderBy&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"koi_score"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;Limit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="m"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sqlerr&lt;/span&gt; &lt;span class="o"&gt;:=&lt;/span&gt; &lt;span class="n"&gt;planets&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RunWith&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;db&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Query&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;sqlerr&lt;/span&gt; &lt;span class="o"&gt;!=&lt;/span&gt; &lt;span class="no"&gt;nil&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nb"&gt;panic&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;fmt&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Sprintf&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"QueryRow failed: %v"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sqlerr&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Next&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;var&lt;/span&gt; &lt;span class="n"&gt;planet_name&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt;
        &lt;span class="k"&gt;var&lt;/span&gt; &lt;span class="n"&gt;score&lt;/span&gt; &lt;span class="kt"&gt;float64&lt;/span&gt;
        &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Scan&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;planet_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;score&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;fmt&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Printf&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"%s&lt;/span&gt;&lt;span class="se"&gt;\t&lt;/span&gt;&lt;span class="s"&gt;%.2f&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;planet_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;score&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note that the &lt;code&gt;pgx&lt;/code&gt; import is different this time, to get the &lt;code&gt;database/sql&lt;/code&gt;-compatible driver that Squirrel expects in its &lt;code&gt;RunWith()&lt;/code&gt; method. The connection step changes a little as well, so don't try to amend the previous example; this one stands alone.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;psql&lt;/code&gt; variable holds a reconfigured Squirrel statement builder, which is needed because PostgreSQL uses numbered placeholders (&lt;code&gt;$1&lt;/code&gt;, &lt;code&gt;$2&lt;/code&gt;, ...) rather than &lt;code&gt;?&lt;/code&gt;. Skipping this step causes a syntax error near the &lt;code&gt;LIMIT&lt;/code&gt; clause. Hopefully, now that I've written this down, my valued readers can avoid this problem, and I may even remember it next time too!&lt;/p&gt;

&lt;h2&gt;
  
  
  Fun with Go and PostgreSQL
&lt;/h2&gt;

&lt;p&gt;This was a lightweight introduction to show you a couple of my favourite libraries for Go and PostgreSQL applications - and of course with an open data set to play with, the fun is multiplied :)&lt;/p&gt;

&lt;p&gt;Here are some related links and further reading; reach out to us on &lt;a href="https://twitter.com/aiven_io" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt; if you have questions. We are always pleased to chat.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://help.aiven.io/en/articles/489573-getting-started-with-aiven-postgresql" rel="noopener noreferrer"&gt;Getting Started with Aiven PostgreSQL&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Importing Exoplanets into your PostgreSQL&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/aiven/aiven-examples" rel="noopener noreferrer"&gt;Code examples for Aiven connections&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Postgres 13 is Available&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>go</category>
      <category>postgres</category>
    </item>
    <item>
      <title>Discover exoplanets with PostgreSQL sample data</title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Wed, 17 Feb 2021 00:00:00 +0000</pubDate>
      <link>https://dev.to/lornajane/discover-exoplanets-with-postgresql-sample-data-10if</link>
      <guid>https://dev.to/lornajane/discover-exoplanets-with-postgresql-sample-data-10if</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mNbozFnV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8t018f89j8awj5bbwn5g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mNbozFnV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8t018f89j8awj5bbwn5g.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;PostgreSQL is, and remains, one of our most popular and fastest-growing storage platforms. Other storage technologies come and go, but modern Postgres is a solid choice for so many applications. In this article, we'll take a look at how to get started with Aiven for PostgreSQL, populate it with data, and start playing.&lt;/p&gt;

&lt;p&gt;When you spin up your first Aiven for PostgreSQL service, you'll want to take some time to play with the features ... but there's a problem. Your shiny new database is empty.&lt;/p&gt;

&lt;p&gt;Finding and using some open datasets is a great way to fill this gap. One option is to try the &lt;a href="https://www.kaggle.com/"&gt;Kaggle&lt;/a&gt; platform. It's a place to find open data, advice about data science, and some competitions you can participate in to hone your skills. There's quite a selection of datasets to choose from, but today I'll be using the &lt;a href="https://www.kaggle.com/nasa/kepler-exoplanet-search-results"&gt;exoplanets data&lt;/a&gt; from the Kepler mission. You'll need a (free) account to log in and download the data. Go ahead and extract the zip file; I'm using &lt;code&gt;cumulative.csv&lt;/code&gt; for the example in this post.&lt;/p&gt;

&lt;h2&gt;
  
  
  Get Started with Aiven
&lt;/h2&gt;

&lt;p&gt;If you are not already an Aiven user, you can &lt;a href="https://console.aiven.io/signup"&gt;sign up for an Aiven account&lt;/a&gt; to follow the steps in this post - we'll wait right here!&lt;/p&gt;

&lt;p&gt;We will also be using the &lt;a href="https://github.com/aiven/aiven-client"&gt;Aiven CLI&lt;/a&gt;. This tool requires Python 3.6 or later, and can be installed from PyPI:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install aiven-client
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You also need to authenticate your Aiven account against the CLI tool. Run the following command, substituting your own details:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;avn user login &amp;lt;email@example.com&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You now have everything you need to create an Aiven database in the cloud.&lt;/p&gt;

&lt;h2&gt;
  
  
  Create PostgreSQL Service
&lt;/h2&gt;

&lt;p&gt;A good first step is to create a project to keep the services in. All it needs is a name:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;avn project create exoplanets
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Aiven offers many options when creating services, but to get us going quickly, I'll use the newest PostgreSQL available and the smallest plan, called "hobbyist". One of the most fun things, though, is being able to choose any cloud platform you like! Take a moment to check the list and copy the &lt;code&gt;CLOUD_NAME&lt;/code&gt; field of your favorite:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;avn cloud list
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I chose &lt;code&gt;google-europe-west1&lt;/code&gt; for my example. Here is the command to create the PostgreSQL service (remember to swap in the cloud of your choice):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;avn service create -t pg -p hobbyist --cloud google-europe-west1 pg-exoplanets
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The node takes a few minutes to be ready, but the Aiven CLI has a handy "wait" command that doesn't return until the service is ready to talk to us. This matters less when running the commands by hand as I have here, but it's super useful when your CI system is spinning up the data platforms by itself!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;avn service wait pg-exoplanets
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When the command returns, my PostgreSQL cluster is ready to use. Let's create a database to hold the sample data; the command below creates one named "exoplanets":&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;avn service database-create --dbname exoplanets pg-exoplanets
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now I have a shiny new database... but it's all sad and empty. Let's look at the sample data and get it imported.&lt;/p&gt;

&lt;h2&gt;
  
  
  Adding CSV Data to PostgreSQL
&lt;/h2&gt;

&lt;p&gt;PostgreSQL has built-in support for importing CSV data into an existing table, but I don't have the table structure, just a CSV. Luckily there's a tool for that: &lt;a href="https://github.com/catherinedevlin/ddl-generator"&gt;ddlgenerator&lt;/a&gt; is another Python command-line tool.&lt;/p&gt;

&lt;p&gt;Here's how to install the &lt;code&gt;ddlgenerator&lt;/code&gt; tool and then generate the &lt;code&gt;CREATE TABLE&lt;/code&gt; statement from &lt;code&gt;cumulative.csv&lt;/code&gt; that I downloaded earlier:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install ddlgenerator
ddlgenerator postgres cumulative.csv &amp;gt; create.sql
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Have a look inside the file and you will see the structure that tells PostgreSQL how to hold the data. The &lt;code&gt;avn service cli&lt;/code&gt; command gives us a &lt;code&gt;psql&lt;/code&gt; prompt on the new database:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;avn service cli pg-exoplanets
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;From within &lt;code&gt;psql&lt;/code&gt; I can connect to the "exoplanets" database, and then run the SQL file to create the table structure:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;\c exoplanets
\i create.sql
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Adding the final piece to the puzzle, and still from the &lt;code&gt;psql&lt;/code&gt; prompt, the next command brings in the CSV data:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;\copy cumulative from data/cumulative.csv csv header
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Nice work! The &lt;code&gt;cumulative&lt;/code&gt; table should now have some data for you to play with!&lt;/p&gt;

&lt;h2&gt;
  
  
  Dreaming of Exoplanets
&lt;/h2&gt;

&lt;p&gt;Now you have a database full of measurements of exoplanets taken by the Kepler Space Telescope. If you're not already familiar with the project, the &lt;a href="https://www.nasa.gov/mission_pages/kepler/overview/index.html"&gt;NASA mission page&lt;/a&gt; is worth a read. The mission went into a second phase when one of the controls failed, which serves to remind us that engineering systems we can see and touch, or at least ssh into, is a much easier gig than operating in space!&lt;/p&gt;

&lt;p&gt;You can explore the dataset, which describes observations and compares the Kepler assessment of each exoplanet with its official status in the pre-existing literature. For example, try this to see the false-positives identified by Kepler:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="n"&gt;kepler_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;koi_pdisposition&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;cumulative&lt;/span&gt; &lt;span class="k"&gt;where&lt;/span&gt; &lt;span class="n"&gt;koi_disposition&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'CONFIRMED'&lt;/span&gt; &lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="n"&gt;koi_pdisposition&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'FALSE POSITIVE'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can also connect this data to other tools to use the dataset further. Either grab the connection details from the web console, or use &lt;a href="https://stedolan.github.io/jq/"&gt;jq&lt;/a&gt; with &lt;code&gt;avn&lt;/code&gt; for a one-liner:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;avn service get pg-exoplanets --json | jq ".service_uri"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  What's Next?
&lt;/h2&gt;

&lt;p&gt;Good cloud experimentation practice suggests that if you've finished with your exoplanets database, you can delete it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;avn service terminate pg-exoplanets
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For even more fun and learning, how about one of these resources:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://www.kaggle.com/datasets"&gt;Kaggle Open Datasets&lt;/a&gt; in case you don't fancy exoplanets, there are some excellent alternatives here&lt;/li&gt;
&lt;li&gt;In our documentation you can find &lt;a href="https://help.aiven.io/en/articles/4358591-postgresql-migration-to-aiven"&gt;instructions for migrating your existing PostgreSQL to Aiven&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;More about the &lt;a href="https://github.com/aiven/aiven-client#aiven-client-"&gt;Aiven CLI, &lt;code&gt;avn&lt;/code&gt;&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Need more PostgreSQL? Check out our &lt;a href="https://aiven.io/blog/an-introduction-to-postgresql"&gt;introduction to PostgreSQL&lt;/a&gt; post&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>database</category>
      <category>data</category>
      <category>postgres</category>
      <category>science</category>
    </item>
    <item>
      <title>Kafka v. RabbitMQ - a comparison</title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Mon, 01 Feb 2021 00:00:00 +0000</pubDate>
      <link>https://dev.to/lornajane/kafka-v-rabbitmq-a-comparison-4eo2</link>
      <guid>https://dev.to/lornajane/kafka-v-rabbitmq-a-comparison-4eo2</guid>
      <description>&lt;h1&gt;
  
  
  Kafka v. RabbitMQ - a comparison
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fp7whr59qvx77igbqdsnl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fp7whr59qvx77igbqdsnl.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Today's post is all about queues and how to choose a solution that fits your application's requirements. We'll go over the key design features of RabbitMQ and Kafka and outline how they process queues differently. Read on, and we'll help you decide which is a better match for you!&lt;/p&gt;

&lt;p&gt;There are many roads that can lead to the moment you decide you need a queue. Queues are an excellent way to loosely couple many different components and allow them to exchange data without detailed knowledge about one another. Using a queue is also an excellent way to distribute work between multiple nodes to perform asynchronous tasks.&lt;/p&gt;

&lt;p&gt;Queues come in different flavours, and success is more likely when you can use the queue that best fits the shape of your use case. There's some overlap between the use cases, but in general it can be summarised as choosing between a job queue and a message queue.&lt;/p&gt;

&lt;h2&gt;
  
  
  Processed and Forgotten: RabbitMQ
&lt;/h2&gt;

&lt;p&gt;A job queue such as &lt;a href="https://www.rabbitmq.com/" rel="noopener noreferrer"&gt;RabbitMQ&lt;/a&gt; is a good choice when work is being delegated to an asynchronous endpoint, such as a serverless function. The classic example is resizing an image. When a user uploads a new image, the application needs to produce a thumbnail or some custom sizes for that image, but the user shouldn't have to wait for that work to be completed before getting on with what they were doing. So we can put the request onto a queue and carry on with generic placeholder images, until the resizing is complete.&lt;/p&gt;

&lt;p&gt;RabbitMQ is a popular message broker and is a good fit for those job-shaped application requirements. It's an open source tool, and here at Aiven we're big fans of open source tools. RabbitMQ supports multiple protocols, has predefined exchange types, and offers flexible, configurable routing. When you work with a job queue, the message broker transports the messages to the place where they are processed. The job gets processed once (technically "at least once"), and then it is completed and removed.&lt;/p&gt;

&lt;h2&gt;
  
  
  Event-Driven Application: Apache Kafka
&lt;/h2&gt;

&lt;p&gt;In contrast with the RabbitMQ model, message queues can also be more of a data bus in architecture terms, a conduit for communicating the events throughout the application. A message (called a "record" in our favourite message queue tool, &lt;a href="https://kafka.apache.org/" rel="noopener noreferrer"&gt;Kafka&lt;/a&gt;) is put onto the bus and then any interested consumers, now or in the future, can access and consume the message. The message persists so that other consumers can also access the data, either at the time that the data is added ("produced" in Kafka terminology) to the message bus, or later if we decide we want to revisit the data for additional analysis.&lt;/p&gt;

&lt;p&gt;We commonly see Apache Kafka used in event-driven applications where data must flow between multiple components in the application. Using this distributed message bus model gives a great deal of scalability, and it's not a coincidence that the roots of the open source Apache Kafka tool are in the software stack of LinkedIn, a company with a lot of data and many components consuming it. Kafka is a distributed log of past events, which gives the advantage that every past change is also always still available, so you can build features based on the events, or simply have the peace of mind that the data will always be available for inspection or audit if needed.&lt;/p&gt;
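The difference between the two models can be sketched with plain Python data structures. This is an analogy only, not real RabbitMQ or Kafka client code:

```python
from collections import deque

# Job-queue style (RabbitMQ-like): a job is delivered to one worker
# and removed from the queue once it has been processed.
jobs = deque(["resize image 1", "resize image 2"])
done = jobs.popleft()
assert done not in jobs  # the completed job is gone

# Log style (Kafka-like): records are appended and retained, and each
# consumer simply tracks its own offset into the shared log, so the
# same record stays available to every consumer, now or later.
log = ["user signed up", "image uploaded", "image resized"]
offsets = {"thumbnailer": 0, "audit": 0}
record_for_thumbnailer = log[offsets["thumbnailer"]]
offsets["thumbnailer"] += 1
record_for_audit = log[offsets["audit"]]  # same record, still there
assert record_for_thumbnailer == record_for_audit == "user signed up"
print("both consumers read:", record_for_audit)
```

Notice that consuming from the log never shrinks it; only each consumer's offset moves, which is what makes replay and audit possible.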

&lt;p&gt;Kafka is quite approachable as a technology: you can either install it yourself or take up the free trial available on the Aiven platform to get started. That's an ideal way to get to know the technology, with a friendly web interface to get you on the right track and a selection of connectors that can be added easily. Kafka can also be scaled up to handle colossal workloads, and we see some very large deployments achieving great performance on our platform.&lt;/p&gt;

&lt;h2&gt;
  
  
  More Resources
&lt;/h2&gt;

&lt;p&gt;Thinking about the shape of the data requirements that you have will ensure you pick a queue that works for you. If it's a task on a task list, then try RabbitMQ. But for data that needs to flow around your application and drive multiple integrations, Kafka is probably your best bet. If you'd like to know more, then some of these links may be useful:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://aiven.io/blog/an-introduction-to-apache-kafka" rel="noopener noreferrer"&gt;An introduction to Apache Kafka&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://help.aiven.io/en/articles/489572-getting-started-with-aiven-kafka" rel="noopener noreferrer"&gt;Getting Started with Aiven Kafka&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/aiven/aiven-client" rel="noopener noreferrer"&gt;&lt;code&gt;avn&lt;/code&gt;, the Aiven CLI tool&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://help.aiven.io/en/articles/3325560-kafka-tools" rel="noopener noreferrer"&gt;Learn about tools for Kafka&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Of course, we'd love you to &lt;a href="https://aiven.io/kafka?utm_source=devto&amp;amp;utm_medium=organic&amp;amp;utm_content=post" rel="noopener noreferrer"&gt;try Kafka on the Aiven platform&lt;/a&gt; if you read this far and think you have Kafka-shaped requirements! Sign up and let us know what you build.&lt;/p&gt;

</description>
      <category>kafka</category>
      <category>data</category>
      <category>cloud</category>
      <category>eventstreaming</category>
    </item>
    <item>
      <title>How to Send WhatsApp Messages With Laravel</title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Mon, 26 Oct 2020 14:31:25 +0000</pubDate>
      <link>https://dev.to/vonagedev/how-to-send-whatsapp-messages-with-laravel-1fj7</link>
      <guid>https://dev.to/vonagedev/how-to-send-whatsapp-messages-with-laravel-1fj7</guid>
      <description>&lt;p&gt;Take two things that we love: WhatsApp messages and Laravel framework. Mix them together. What do you get? Fun with phones!&lt;/p&gt;

&lt;p&gt;This tutorial shows you how to create a Laravel application that sends and responds to WhatsApp messages.&lt;/p&gt;

&lt;p&gt;You will need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A PHP development platform (Laravel 8 requires PHP 7.3 or later)&lt;/li&gt;
&lt;li&gt;WhatsApp on your phone&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Vonage API Account
&lt;/h2&gt;

&lt;p&gt;To complete this tutorial, you will need a &lt;a href="http://developer.nexmo.com/ed?c=blog_text&amp;amp;ct=2020-09-29-send-whatsapp-messages-with-laravel-8-dr" rel="noopener noreferrer"&gt;Vonage API account&lt;/a&gt;. If you don’t have one already, you can &lt;a href="http://developer.nexmo.com/ed?c=blog_text&amp;amp;ct=2020-09-29-send-whatsapp-messages-with-laravel-8-dr" rel="noopener noreferrer"&gt;sign up today&lt;/a&gt; and start building with free credit.&lt;/p&gt;

&lt;p&gt;Once you have an account, you can find your API Key and API Secret at the top of the &lt;a href="http://developer.nexmo.com/ed?c=blog_text&amp;amp;ct=2020-09-29-send-whatsapp-messages-with-laravel-8-dr" rel="noopener noreferrer"&gt;Vonage API Dashboard&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Get the Code Running
&lt;/h2&gt;

&lt;p&gt;The project is on GitHub at &lt;a href="https://github.com/nexmo-community/laravel-messages" rel="noopener noreferrer"&gt;nexmo-community/laravel-messages&lt;/a&gt;, so go ahead and clone that repository to your computer.&lt;/p&gt;

&lt;p&gt;Run the command &lt;code&gt;composer install&lt;/code&gt; to get the dependencies needed for this project.&lt;/p&gt;

&lt;p&gt;This application uses the &lt;code&gt;phpdotenv&lt;/code&gt; library to manage its configuration on a dev platform. Copy the file &lt;code&gt;.env.example&lt;/code&gt; to &lt;code&gt;.env&lt;/code&gt; and edit the file as needed. In particular, you should update the &lt;code&gt;NEXMO_API_KEY&lt;/code&gt; and &lt;code&gt;NEXMO_API_SECRET&lt;/code&gt; lines at the end of the file to connect your Vonage account (we used to be called Nexmo, and old habits die hard!)&lt;/p&gt;

&lt;p&gt;The application is ready to go! Start it with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;php artisan serve
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;By default, this will run your application on port 8000. Check that you have a Laravel homepage on &lt;a href="http://localhost:8000" rel="noopener noreferrer"&gt;http://localhost:8000&lt;/a&gt; before moving on.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.nexmo.com%2Fwp-content%2Fuploads%2F2020%2F09%2Flaravel.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.nexmo.com%2Fwp-content%2Fuploads%2F2020%2F09%2Flaravel.png" alt="Webpage showing the Laravel one-word greeting for a new app" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Set Up Ngrok
&lt;/h2&gt;

&lt;p&gt;Since the application needs to be able to receive incoming webhooks for two-way messaging communication, we need a way to allow public URLs to access the dev platform. I usually use &lt;a href="https://ngrok.com" rel="noopener noreferrer"&gt;Ngrok&lt;/a&gt; for this; it’s an excellent tool.&lt;/p&gt;

&lt;p&gt;Start an Ngrok tunnel to port 8000 (or whatever port your application is running on):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ngrok http 8000
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This starts an in-terminal console, so it looks something like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.nexmo.com%2Fwp-content%2Fuploads%2F2020%2F09%2Fngrok-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.nexmo.com%2Fwp-content%2Fuploads%2F2020%2F09%2Fngrok-1.png" alt="Screenshot of the ngrok console after startup" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Copy the “Forwarding” https URL. We will need this shortly to configure the Vonage Messages API sandbox. You can also check everything is wired up by requesting this URL in your browser and seeing the same Laravel landing page as before.&lt;/p&gt;

&lt;h2&gt;
  
  
  Configure the Messages Sandbox
&lt;/h2&gt;

&lt;p&gt;To get incoming WhatsApp messages directed to your application, we need to do a little bit of configuration in the &lt;a href="https://dashboard.nexmo.com" rel="noopener noreferrer"&gt;dashboard&lt;/a&gt;. Under “Messages and Dispatch,” click on “Sandbox”. We will use the Messages API Sandbox for the demo today, but if you have a WhatsApp Business account, you could use the same approach to message any user without the Sandbox or whitelisting process.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.nexmo.com%2Fwp-content%2Fuploads%2F2020%2F09%2Fsandbox-qr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.nexmo.com%2Fwp-content%2Fuploads%2F2020%2F09%2Fsandbox-qr.png" alt="Screenshot of dashboard screen for whitelisting a number to the messages sandbox" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Add your phone number to the Sandbox by scanning the QR code or messaging the magic words to the number shown. I am not sure why I enjoy the magic words “auth method” so much, but it really seems like magic!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.nexmo.com%2Fwp-content%2Fuploads%2F2020%2F09%2Fsandbox-webhooks.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.nexmo.com%2Fwp-content%2Fuploads%2F2020%2F09%2Fsandbox-webhooks.png" alt="Screenshot of dashboard screen for configuring webhooks for messages sandbox" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Time to configure the webhooks, and you will need the URL copied from the ngrok console earlier.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Inbound should be &lt;code&gt;[url you copied earlier]/webhooks/inbound&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Status should be &lt;code&gt;[url you copied earlier]/webhooks/status&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Don’t forget to press “Save” here! That seems to trip me up more often than it ought to.&lt;/p&gt;

&lt;h2&gt;
  
  
  Inside the Code
&lt;/h2&gt;

&lt;p&gt;Let’s look at the various routes set up in this application and how they interact with the Messages API.&lt;/p&gt;

&lt;h3&gt;
  
  
  Sending From Laravel
&lt;/h3&gt;

&lt;p&gt;The best place to start with this application is at &lt;a href="http://localhost:8000/messages" rel="noopener noreferrer"&gt;http://localhost:8000/messages&lt;/a&gt; where you can add your phone number and send yourself a message.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Note that the number needs to be in the international (E.164) format without a leading +. So for US numbers, start with 1 and then add the full number with area code. For the UK, start with 44 and then add the whole number without the leading 0.&lt;/p&gt;
&lt;/blockquote&gt;
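The number-formatting rule in that note can be sketched as a tiny helper. This is illustrative Python only, not part of the Laravel app, and the function name is made up for this example:

```python
def to_e164_without_plus(number: str, country_code: str) -> str:
    """Strip spaces and any leading +, drop a leading trunk 0, then prefix the country code."""
    digits = number.replace(" ", "").lstrip("+")
    if digits.startswith(country_code):
        # Already in international format without the +
        return digits
    return country_code + digits.lstrip("0")

print(to_e164_without_plus("07700 900123", "44"))    # UK number -> 447700900123
print(to_e164_without_plus("+1 415 555 0100", "1"))  # US number -> 14155550100
```

Real phone-number handling has many more edge cases (a library such as libphonenumber covers them), but this captures the rule the form expects.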

&lt;p&gt;Here’s what happens when that form gets submitted:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// in routes/web.php

Route::post('/message', function(Request $request) {
    // TODO: validate incoming params first!

    $url = "https://messages-sandbox.nexmo.com/v0.1/messages";
    $params = ["to" =&amp;gt; ["type" =&amp;gt; "whatsapp", "number" =&amp;gt; $request-&amp;gt;input('number')],
        "from" =&amp;gt; ["type" =&amp;gt; "whatsapp", "number" =&amp;gt; "14157386170"],
        "message" =&amp;gt; [
            "content" =&amp;gt; [
                "type" =&amp;gt; "text",
                "text" =&amp;gt; "Hello from Vonage and Laravel :) Please reply to this message with a number between 1 and 100"
            ]
        ]
    ];
    $headers = ["Authorization" =&amp;gt; "Basic " . base64_encode(env('NEXMO_API_KEY') . ":" . env('NEXMO_API_SECRET'))];

    $client = new \GuzzleHttp\Client();
    $response = $client-&amp;gt;request('POST', $url, ["headers" =&amp;gt; $headers, "json" =&amp;gt; $params]);
    $data = $response-&amp;gt;getBody();
    Log::Info($data);

    return view('thanks');
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This route uses Guzzle to send a POST request to the Messages Sandbox, and if the credentials are correct and the supplied phone number is whitelisted to the application, it will send the message you can see defined here. The message prompts the user to reply, so let’s see that code next.&lt;/p&gt;
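The Authorization header built in that route is standard HTTP Basic auth: the API key and secret are joined with a colon and Base64-encoded. Here is the same construction in isolation, as a Python sketch with made-up credentials, just to show the encoding:

```python
import base64

def basic_auth_header(api_key: str, api_secret: str) -> str:
    # Base64-encode "key:secret", matching PHP's base64_encode() in the route
    token = base64.b64encode(f"{api_key}:{api_secret}".encode()).decode()
    return f"Basic {token}"

header = basic_auth_header("my-api-key", "my-api-secret")
print(header)
# Decoding the token recovers the original key:secret pair:
print(base64.b64decode(header.split()[1]).decode())  # -> my-api-key:my-api-secret
```

Because Base64 is trivially reversible, treat this header as a secret: only send it over HTTPS and never log it.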

&lt;h3&gt;
  
  
  Receiving and Responding to WhatsApp Messages
&lt;/h3&gt;

&lt;p&gt;With the webhooks configured in the dashboard and the Ngrok tunnel running, the local development application can receive incoming WhatsApp messages. If you already replied to the challenge to supply a number, then you know what happens next: &lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn3weqd4n5mfl7oao2qrw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn3weqd4n5mfl7oao2qrw.png" alt="🙂" width="72" height="72"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here’s the code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// in routes/web.php

Route::post('/webhooks/inbound', function(Request $request) {
    $data = $request-&amp;gt;all();

    $text = $data['message']['content']['text'];
    $number = intval($text);
    Log::Info($number);
    if($number &amp;gt; 0) {
        $random = rand(1, 8);
        Log::Info($random);
        $respond_number = $number * $random;
        Log::Info($respond_number);
        $url = "https://messages-sandbox.nexmo.com/v0.1/messages";
        $params = ["to" =&amp;gt; ["type" =&amp;gt; "whatsapp", "number" =&amp;gt; $data['from']['number']],
            "from" =&amp;gt; ["type" =&amp;gt; "whatsapp", "number" =&amp;gt; "14157386170"],
            "message" =&amp;gt; [
                "content" =&amp;gt; [
                    "type" =&amp;gt; "text",
                    "text" =&amp;gt; "The answer is " . $respond_number . ", we multiplied by " . $random . "."
                ]
            ]
        ];
        $headers = ["Authorization" =&amp;gt; "Basic " . base64_encode(env('NEXMO_API_KEY') . ":" . env('NEXMO_API_SECRET'))];

        $client = new \GuzzleHttp\Client();
        $response = $client-&amp;gt;request('POST', $url, ["headers" =&amp;gt; $headers, "json" =&amp;gt; $params]);
        $data = $response-&amp;gt;getBody();
    }
    Log::Info($data);
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This example probably isn’t exactly what you’d want your application to do, but it includes the moving parts you might need! The incoming webhook arrives, and we grab the data and try to read the message content as an integer. We do a little fun maths operation with a randomly generated number and send a reply to the user exactly as we did in the first code example.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;You could also use the &lt;a href="https://laravel.com/docs/8.x/http-client" rel="noopener noreferrer"&gt;Laravel HTTP Client&lt;/a&gt; in place of Guzzle if you’re more familiar with that. The Laravel HTTP Client is a wrapper for Guzzle.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Status Updates
&lt;/h3&gt;

&lt;p&gt;Remember the status webhook we configured earlier? Events about the messages, such as when they are submitted (sent from Vonage to WhatsApp), delivered (arrived on the user’s device), and read (the user opened the message), are delivered to this endpoint. This application doesn’t do a lot with them, but it is great to have access and be able to respond to them.&lt;/p&gt;

&lt;p&gt;The status endpoints are also pretty handy for debugging. The route in the example application looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// in routes/web.php

Route::post('/webhooks/status', function(Request $request) {
    $data = $request-&amp;gt;all();
    Log::Info($data);
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So it just logs each event as it happens. Logging is incredibly valuable when debugging issues! By default, you can find these logs in &lt;code&gt;storage/logs/laravel.log&lt;/code&gt; and it is well worth keeping an eye on what is happening in that file as you develop an application like this one!&lt;/p&gt;

&lt;h2&gt;
  
  
  Fun With Phones
&lt;/h2&gt;

&lt;p&gt;If you didn’t try it already, then go ahead and enjoy a WhatsApp chat with your application.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.nexmo.com%2Fwp-content%2Fuploads%2F2020%2F09%2Flaravel-chat.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.nexmo.com%2Fwp-content%2Fuploads%2F2020%2F09%2Flaravel-chat.png" alt="Screenshot of the messages between user and application in whatsapp on a mobile phone" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Today’s example is simple but hopefully serves to get you started with WhatsApp and Laravel. We would love to know what you build, so let us know, and of course, always reach out to us if you have questions!&lt;/p&gt;

&lt;h2&gt;
  
  
  More Resources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;GitHub repo: &lt;a href="https://github.com/nexmo-community/laravel-messages" rel="noopener noreferrer"&gt;https://github.com/nexmo-community/laravel-messages&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;The day we built this on our Twitch Stream: &lt;a href="https://youtu.be/aV4IW3v-CTw" rel="noopener noreferrer"&gt;https://youtu.be/aV4IW3v-CTw&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Documentation for Messages Sandbox: &lt;a href="https://developer.nexmo.com/messages/concepts/messages-api-sandbox" rel="noopener noreferrer"&gt;https://developer.nexmo.com/messages/concepts/messages-api-sandbox&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;The &lt;code&gt;nexmo-laravel&lt;/code&gt; library for integrating your Laravel application with other Vonage APIs &lt;a href="https://github.com/Nexmo/nexmo-laravel" rel="noopener noreferrer"&gt;https://github.com/Nexmo/nexmo-laravel&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The post &lt;a href="https://www.nexmo.com/blog/2020/10/26/send-whatsapp-messages-with-laravel-dr" rel="noopener noreferrer"&gt;How to Send WhatsApp Messages With Laravel&lt;/a&gt; appeared first on &lt;a href="https://www.nexmo.com" rel="noopener noreferrer"&gt;Vonage Developer Blog&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>developer</category>
      <category>tutorial</category>
      <category>laravel</category>
      <category>messagesapi</category>
    </item>
    <item>
      <title>Surviving Hacktoberfest: A Guide for Maintainers</title>
      <dc:creator>Lorna Jane Mitchell</dc:creator>
      <pubDate>Tue, 13 Oct 2020 13:28:53 +0000</pubDate>
      <link>https://dev.to/lornajane/surviving-hacktoberfest-a-guide-for-maintainers-428k</link>
      <guid>https://dev.to/lornajane/surviving-hacktoberfest-a-guide-for-maintainers-428k</guid>
      <description>&lt;p&gt;Vonage is thrilled to be a Hacktoberfest 2020 partner. We’re &lt;a href="https://youtu.be/zYJpYMCy6PA"&gt;no strangers to open source&lt;/a&gt;, with our libraries, code snippets, and demos all on GitHub. To fully immerse yourself in the festivities, be sure to check out our &lt;a href="https://developer.nexmo.com/hacktoberfest"&gt;Hacktoberfest page&lt;/a&gt; for details on all that we have planned!&lt;/p&gt;

&lt;p&gt;Happy Hacktoberfest to one and all! Contributors, I hope you are having a wonderful time learning new skills and discovering projects where you can make a difference. Maintainers, today’s post is just for you. Hopefully, the rules changes mean a better quality of pull requests for your projects, but it can still be a lot to handle. I’m a long-time maintainer, and today I’d like to share some tips that I hope will help you through.&lt;/p&gt;

&lt;h2&gt;
  
  
  Let’s Talk Priorities
&lt;/h2&gt;

&lt;p&gt;Open source project maintainers are mostly doing this work in their “spare” time, and it’s a lot. It’s not just for Hacktober, and a lot of the work can be rather unseen. This Hacktoberfest, especially with global events around us, it’s important to consider your priorities and to keep sight of them!&lt;/p&gt;

&lt;p&gt;I’d suggest that a good order of priorities is you, then your project, and then its contributors. A project is nothing without maintainers, and many are teams of one. The aims and goals of the project are another important priority; there’s no pressure to expand the scope or pivot the project just because a pull request arrived to do that. The joy of open source is that people can use their own forks as the basis of a new project if they don’t like the way you run things! And finally, the contributors: Hacktoberfest brings a lot of new contributors, and we want to raise them to be contributors, not spoilt children. So if they need to read the project guidelines before contributing, then say so rather than doing a lot of re-work yourself.&lt;/p&gt;

&lt;h2&gt;
  
  
  Handling Notification Fatigue
&lt;/h2&gt;

&lt;p&gt;Hacktoberfest is now opt-in, but if you are “in”, then it’s easy to feel overwhelmed, especially on a high profile or already-busy project. Key to coping is to manage your notifications. And no, an email rule to just file or delete everything with your project name in it is not the answer!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WKAnE_i5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.nexmo.com/wp-content/uploads/2020/10/gmail-github-filter.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WKAnE_i5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.nexmo.com/wp-content/uploads/2020/10/gmail-github-filter.png" alt="Add a filter to file all incoming mail with the word GitHub in"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Take some time with your GitHub notifications setup to make sure you are being notified when you want to be, and not being notified with too much that isn’t relevant. You can also route different notifications for different orgs to different email addresses, which can be very useful.&lt;/p&gt;

&lt;h3&gt;
  
  
  Configure and Route Emails
&lt;/h3&gt;

&lt;p&gt;GitHub has excellent help documentation, so I won’t repeat their content here, but I’ll direct you to the places I find the most useful!&lt;br&gt;&lt;br&gt;
First, you can link multiple email addresses to one GitHub account, which is useful if you do some work projects with your GitHub account, or use a different email address for a particular open source project. Take a look at the &lt;a href="https://docs.github.com/en/free-pro-team@latest/github/getting-started-with-github/verifying-your-email-address"&gt;documentation for verifying additional email addresses&lt;/a&gt; on GitHub.&lt;/p&gt;

&lt;p&gt;Next, get the right notifications going to the right email address. It’s under “custom routing” in the notifications configuration, and of course, there is &lt;a href="https://docs.github.com/en/free-pro-team@latest/github/managing-subscriptions-and-notifications-on-github/configuring-notifications#choosing-where-your-organizations-email-notifications-are-sent"&gt;excellent documentation on routing emails on GitHub&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;Watch and Unwatch&lt;/h3&gt;

&lt;p&gt;The ability to “Watch” a repo is very valuable. If there’s a project whose activity you want to keep track of, click the “Watch” button at the top of the repo and choose “Watching” to receive all of its notifications.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--A2PLktqt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.nexmo.com/wp-content/uploads/2020/10/github-watch-settings.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--A2PLktqt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.nexmo.com/wp-content/uploads/2020/10/github-watch-settings.png" alt="screenshot showing the GitHub watch button and options: not watching, releases only, not watching, ignoring"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Perhaps more valuable is the ability to “unwatch” a project! Because I do some of the GitHub maintenance at work, I have access to many repositories, and by default, if you have access, you are subscribed to the notifications! As you can imagine, this can get pretty noisy, so the same “Watch” button offers some other options. The default is “Not Watching”, meaning you’ll only get notifications on your own issues/PRs or when you’re mentioned, and you can set it to “Ignoring” if you’re getting notifications you don’t want at all.&lt;/p&gt;
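&lt;p&gt;If you prefer the command line, the same watch settings are exposed through the GitHub REST API. Here’s a sketch using the &lt;code&gt;gh&lt;/code&gt; CLI (&lt;code&gt;OWNER/REPO&lt;/code&gt; is a placeholder for a repository of your own):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Watch a repository: equivalent to choosing "Watching" in the UI
gh api -X PUT repos/OWNER/REPO/subscription -F subscribed=true

# Silence a repository entirely: equivalent to "Ignoring"
gh api -X PUT repos/OWNER/REPO/subscription -F ignored=true

# Remove your subscription, returning to the default "Not Watching"
gh api -X DELETE repos/OWNER/REPO/subscription
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;This can be handy if you’ve been granted access to a long list of repositories and want to unwatch them in a loop rather than clicking through each one.&lt;/p&gt;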

&lt;h3&gt;Subscribe and Unsubscribe&lt;/h3&gt;

&lt;p&gt;At the level of an individual issue or pull request, you can also get notifications without having to comment on the discussion to make yourself “participating”. Look for the button on the right-hand side labelled “Subscribe” under “Notifications”. There’s an opposite option here too: suppose you’ve chimed in on something that you’re no longer interested in getting notifications for. In that case, you can “Unsubscribe” from just that one issue or pull request without having to unsubscribe from the whole repository.&lt;/p&gt;
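&lt;p&gt;These per-thread subscriptions are also available through the GitHub REST API. A sketch with the &lt;code&gt;gh&lt;/code&gt; CLI (thread IDs come from your notifications list; &lt;code&gt;12345&lt;/code&gt; below is a placeholder):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# List your current notification threads, including their IDs
gh api notifications

# Subscribe to a single issue or pull request thread
gh api -X PUT notifications/threads/12345/subscription -F ignored=false

# Unsubscribe (mute) a single thread without unwatching the repository
gh api -X DELETE notifications/threads/12345/subscription
&lt;/code&gt;&lt;/pre&gt;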

&lt;h2&gt;Move Quickly&lt;/h2&gt;

&lt;p&gt;If a pull request isn’t useful or doesn’t meet the project goals, don’t be afraid to reject it. The &lt;a href="https://hacktoberfest.digitalocean.com/faq"&gt;Hacktoberfest FAQ&lt;/a&gt; is your friend here. Always be friendly, and respond quickly if you can; checking in every few days keeps things manageable. If a pull request could be made acceptable, for example because it currently fails the build but can be corrected, offer some feedback to your new contributor explaining what would make the pull request ready to merge. If it’s a change you don’t want (emoji to decorate your &lt;code&gt;README&lt;/code&gt; seems a popular contribution), then say so and close it.&lt;/p&gt;

&lt;p&gt;Open source isn’t always a welcoming place, and we operate in public, so bystanders get a good impression of our projects by the way we interact with people. Take the time to thank people for their input! Even if the pull request isn’t worth the time it took you to read it, a simple “This doesn’t seem useful to the project, why not check the issue list for ideas?” is much more welcoming than closing with no other communication or explanation.&lt;/p&gt;

&lt;h2&gt;Thank You&lt;/h2&gt;

&lt;p&gt;On that note of thanking contributors, I’d like to close by thanking you, the maintainer. It’s a common misconception that open source projects are maintained by some amazing, distant, heroic figure. In fact, those of us who give our time and energy in this way are real people with real lives.&lt;/p&gt;

&lt;p&gt;Thank you for all that you do. Open source changes the world, and in your own way, you’re making that happen.&lt;/p&gt;

&lt;p&gt;The post &lt;a href="https://www.nexmo.com/blog/2020/10/13/surviving-hacktoberfest-a-guide-for-maintainers"&gt;Surviving Hacktoberfest: A Guide for Maintainers&lt;/a&gt; appeared first on &lt;a href="https://www.nexmo.com"&gt;Vonage Developer Blog&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>developer</category>
      <category>uncategorised</category>
    </item>
  </channel>
</rss>
