<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Ivan Porollo</title>
    <description>The latest articles on DEV Community by Ivan Porollo (@iporollo).</description>
    <link>https://dev.to/iporollo</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F805977%2Fe1f4967a-7a35-4c6e-9bfc-e81231ef2c41.jpeg</url>
      <title>DEV Community: Ivan Porollo</title>
      <link>https://dev.to/iporollo</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/iporollo"/>
    <language>en</language>
    <item>
      <title>Custom Emails with Supertokens, Resend, and React Email</title>
      <dc:creator>Ivan Porollo</dc:creator>
      <pubDate>Sun, 23 Jun 2024 21:05:59 +0000</pubDate>
      <link>https://dev.to/iporollo/custom-emails-with-supertokens-resend-and-react-email-2mi1</link>
      <guid>https://dev.to/iporollo/custom-emails-with-supertokens-resend-and-react-email-2mi1</guid>
      <description>&lt;p&gt;At &lt;a href="https://cerebralvalley.ai"&gt;Cerebral Valley&lt;/a&gt; we use &lt;a href="https://supertokens.com/"&gt;Supertokens&lt;/a&gt; for authentication into our platform.&lt;/p&gt;

&lt;p&gt;Supertokens comes with a default email template / design that is sent to users upon account creation, email confirmation, and other actions. &lt;/p&gt;

&lt;p&gt;We wanted to customize emails sent out from Supertokens to our users to keep the brand aesthetic, so we used &lt;a href="https://resend.com/"&gt;Resend&lt;/a&gt; and &lt;a href="https://react.email/"&gt;React Email&lt;/a&gt; to do so. &lt;/p&gt;

&lt;p&gt;I wrote up this post to give step-by-step instructions on how to combine the three technologies and to showcase how simple they are to use. &lt;/p&gt;

&lt;h2&gt;
  
  
  Pre-requisites
&lt;/h2&gt;

&lt;p&gt;Your project will need to be using &lt;a href="https://supertokens.com/"&gt;Supertokens&lt;/a&gt; as the method of authentication.&lt;/p&gt;

&lt;p&gt;In this walkthrough, I am working out of a TypeScript project with an Express backend.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resend setup
&lt;/h2&gt;

&lt;p&gt;First, you will need to create a &lt;a href="https://resend.com/"&gt;Resend&lt;/a&gt; account. Resend is a newer platform for sending emails programmatically. Think &lt;a href="https://sendgrid.com/"&gt;SendGrid&lt;/a&gt;, but more modern and easier to use. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://resend.com/signup"&gt;Sign up&lt;/a&gt; and go through the onboarding flow to get an API key. By default, the first key that is created is called "Onboarding"&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F25dm2n5j2aohgblheer0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F25dm2n5j2aohgblheer0.png" alt="Resend API Keys" width="800" height="435"&gt;&lt;/a&gt;&lt;br&gt;
Add your API key to your &lt;code&gt;.env&lt;/code&gt; file, something like&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;RESEND_API_KEY="&amp;lt;your_api_key&amp;gt;"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
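&lt;p&gt;As a sanity check, you can fail fast at startup when the key is missing, so a misconfigured environment surfaces immediately rather than at send time. A minimal sketch (the helper name is illustrative, not part of any library):&lt;/p&gt;

```typescript
// Minimal sketch: read a required environment variable, throwing at
// startup if it is missing. The helper name is illustrative.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(name + ' is not set');
  }
  return value;
}

// Usage at startup, e.g.:
// const resendApiKey = requireEnv('RESEND_API_KEY');
```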



&lt;p&gt;Then, add the domain you want to send your emails from. &lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7bxxz6qf6rdthsf1moqs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7bxxz6qf6rdthsf1moqs.png" alt="Add domain in Resend" width="800" height="434"&gt;&lt;/a&gt;&lt;br&gt;
You will have to add a few DNS records to your domain, which Resend will walk you through. &lt;/p&gt;
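&lt;p&gt;For reference, these are standard email-authentication records. Illustratively, they look something like the following; the exact host names and values are shown in your Resend dashboard and will differ per account:&lt;/p&gt;

```
# Illustrative only -- copy the actual records from the Resend dashboard
MX   send.yourdomain.com               feedback-smtp.us-east-1.amazonses.com
TXT  send.yourdomain.com               "v=spf1 include:amazonses.com ~all"
TXT  resend._domainkey.yourdomain.com  "p=..."   # DKIM public key
```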

&lt;p&gt;Next, create a file in your project called &lt;code&gt;smtp.ts&lt;/code&gt; with the following contents&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const smtpSettings = {
  host: 'smtp.resend.com',
  authUsername: 'resend',
  password: process.env.RESEND_API_KEY,
  port: 465,
  from: {
    name: '&amp;lt;your_email_sender_name&amp;gt;',
    email: '&amp;lt;your_email_account&amp;gt;',
  },
  secure: true,
};

export { smtpSettings };

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  React Email setup
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://react.email/"&gt;React Email&lt;/a&gt; is a library built by the founder of Resend. Since emails support HTML, the library allows you to customize emails with React components and then compiles them down to HTML before sending. &lt;/p&gt;

&lt;p&gt;To install React Email, run&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;yarn add react @react-email/components @react-email/render
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a file named &lt;code&gt;Email.tsx&lt;/code&gt; for your React Email components. The file contents would be something like this&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import * as React from 'react';
import { render } from '@react-email/render';
import {
  Body,
  Button,
  Container,
  Head,
  Heading,
  Hr,
  Html,
  Img,
  Link,
  Preview,
  Section,
} from '@react-email/components';

const emailSubject = 'Email Subject';

const EmailHtml = (content: string): string =&amp;gt; {
  return render(&amp;lt;Email content={content} /&amp;gt;);
};

// Component

interface EmailProps {
  content: string;
}

function Email(props: EmailProps): JSX.Element {
  const { content } = props;
  return (
    &amp;lt;Html&amp;gt;
      &amp;lt;Head /&amp;gt;
      &amp;lt;Preview&amp;gt;{'What the user sees in the preview'}&amp;lt;/Preview&amp;gt;
      &amp;lt;Body style={main}&amp;gt;
        &amp;lt;Container style={container}&amp;gt;
          &amp;lt;Img
            src={`https://yourlogo.com/logo.png`}
            width="42"
            height="42"
            alt="Logo"
            style={logo}
          /&amp;gt;
          &amp;lt;Heading style={heading}&amp;gt;Click the button below&amp;lt;/Heading&amp;gt;
          &amp;lt;Section style={buttonContainer}&amp;gt;
            &amp;lt;Button style={button} href={content}&amp;gt;
              My button
            &amp;lt;/Button&amp;gt;
          &amp;lt;/Section&amp;gt;
        &amp;lt;/Container&amp;gt;
      &amp;lt;/Body&amp;gt;
    &amp;lt;/Html&amp;gt;
  );
}

// Styling

const logo = {
  borderRadius: 21,
  width: 42,
  height: 42,
};

const main = {
  backgroundColor: '#090909',
  fontFamily:
    '-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,Oxygen-Sans,Ubuntu,Cantarell,"Helvetica Neue",sans-serif',
  // ui-sans-serif, system-ui, sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Noto Color Emoji"
};

const container = {
  margin: '0 auto',
  padding: '20px 0 48px',
  maxWidth: '560px',
};

const heading = {
  fontSize: '24px',
  letterSpacing: '-0.5px',
  lineHeight: '1.3',
  fontWeight: '400',
  color: '#fff',
  padding: '17px 0 0',
};

const paragraph = {
  margin: '0 0 15px',
  fontSize: '15px',
  lineHeight: '1.4',
  color: '#3c4149',
};

const buttonContainer = {
  padding: '27px 0 27px',
};

const button = {
  backgroundColor: '#fff',
  borderRadius: '3px',
  fontWeight: '600',
  color: '#000',
  fontSize: '15px',
  textDecoration: 'none',
  textAlign: 'center' as const,
  display: 'block',
  padding: '11px 23px',
};

const reportLink = {
  fontSize: '14px',
  color: '#b4becc',
};

const hr = {
  borderColor: '#dfe1e4',
  margin: '42px 0 26px',
};

export { EmailHtml, emailSubject };

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note that the code above renders a button that links to the URL passed in as &lt;code&gt;content&lt;/code&gt;. Your email content may be different.&lt;/p&gt;

&lt;h2&gt;
  
  
  Supertokens Config
&lt;/h2&gt;

&lt;p&gt;Supertokens allows you to use your own domain / SMTP server (&lt;a href="https://supertokens.com/docs/thirdpartypasswordless/email-delivery/about#method-2-use-your-own-domain--smtp-server"&gt;link to docs&lt;/a&gt;). We take advantage of this method to plug in Resend and the created React component.&lt;/p&gt;

&lt;p&gt;In your Supertokens config, override the smtp settings as described in their docs &lt;a href="https://supertokens.com/docs/thirdpartypasswordless/email-delivery/smtp/change-email-content"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Don't forget to import your &lt;code&gt;smtpSettings&lt;/code&gt; and your &lt;code&gt;Email&lt;/code&gt; component.&lt;/p&gt;

&lt;p&gt;Your Supertokens config would look something like this&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import supertokens from "supertokens-node";
import Passwordless from "supertokens-node/recipe/passwordless";
import Session from "supertokens-node/recipe/session";
import { SMTPService } from "supertokens-node/recipe/passwordless/emaildelivery";
import EmailVerification from "supertokens-node/recipe/emailverification"
import { SMTPService as EmailVerificationSMTPService } from "supertokens-node/recipe/emailverification/emaildelivery";
import { smtpSettings } from './smtp';
import { EmailHtml, emailSubject } from './Email';

supertokens.init({
    appInfo: {
        apiDomain: "...",
        appName: "...",
        websiteDomain: "..."
    },
    recipeList: [
       Passwordless.init({
         emailDelivery: {
          service: new SMTPService({
            smtpSettings,
            override: (originalImplementation): any =&amp;gt; {
              return {
                ...originalImplementation,
                getContent: async function (input): Promise&amp;lt;any&amp;gt; {
                  const {
                    isFirstFactor,
                    codeLifetime, // amount of time the code is alive for (in MS)
                    email,
                    urlWithLinkCode, // magic link
                  } = input;

                  // Branch on isFirstFactor here if you want a different email
                  // for second-factor (MFA) sign-ins; this example uses the
                  // same template in both cases.
                  return {
                    body: EmailHtml(urlWithLinkCode),
                    isHtml: true,
                    subject: emailSubject,
                    toEmail: email,
                  };
                },
              };
            },
          }),
        },
      }),

      Session.init()
    ]
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the config above, we override the email delivery service with our own definition of the &lt;code&gt;SMTPService&lt;/code&gt;, where we pass in the previously defined &lt;code&gt;smtpSettings&lt;/code&gt; and a new &lt;code&gt;getContent&lt;/code&gt; function. In the new &lt;code&gt;getContent&lt;/code&gt;, we use the rendered React component as the body of the email. Since the component is compiled down to an HTML string, it can be sent directly.&lt;/p&gt;
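&lt;p&gt;The override pattern itself is just object spreading: keep every method from the original implementation and swap out the one you care about. A stripped-down sketch (the types and values here are illustrative, not the real Supertokens interfaces):&lt;/p&gt;

```typescript
// Illustrative sketch of the override pattern; not real Supertokens types.
interface EmailContent {
  body: string;
  isHtml: boolean;
  subject: string;
  toEmail: string;
}

interface ContentInput {
  email: string;
  urlWithLinkCode: string;
}

// Stand-in for the implementation Supertokens provides by default.
const originalImplementation = {
  getContent: function (input: ContentInput): EmailContent {
    return { body: input.urlWithLinkCode, isHtml: false, subject: 'default', toEmail: input.email };
  },
};

// Spread keeps all original methods; the new getContent shadows the old one.
const overridden = {
  ...originalImplementation,
  getContent: function (input: ContentInput): EmailContent {
    return {
      body: 'rendered HTML would go here, e.g. EmailHtml(input.urlWithLinkCode)',
      isHtml: true,
      subject: 'Email Subject',
      toEmail: input.email,
    };
  },
};
```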

&lt;p&gt;With those changes, you can start the server and run through the authentication flow. When you receive the email from Supertokens, you should see your new component design as the body of the email. &lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;If you're using Supertokens for authentication, I highly recommend using Resend and React Email to customize your emails. The setup is simple and the developer experience is pleasant. &lt;/p&gt;

&lt;p&gt;If you have any questions, send me a DM in our community &lt;a href="https://cerebralvalley.ai/slack"&gt;Slack&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Check out what we're building at &lt;a href="https://cerebralvalley.ai"&gt;Cerebral Valley&lt;/a&gt;. &lt;/p&gt;

</description>
    </item>
    <item>
      <title>Reference Data Stack for Data-Driven Startups</title>
      <dc:creator>Ivan Porollo</dc:creator>
      <pubDate>Thu, 03 Mar 2022 21:15:20 +0000</pubDate>
      <link>https://dev.to/iporollo/reference-data-stack-for-data-driven-startups-2opl</link>
      <guid>https://dev.to/iporollo/reference-data-stack-for-data-driven-startups-2opl</guid>
      <description>&lt;p&gt;At &lt;a href="https://monosi.dev"&gt;Monosi&lt;/a&gt;, we recently set up our internal data stack to fully enable data collection and metrics tracking. I figured a post like this may be useful for earlier stage companies that are in the process of setting up their own pipelines as they start tracking metrics and becoming more data driven.&lt;/p&gt;

&lt;p&gt;This post is a high level overview of the technologies we are using and why we chose them. If you’re interested in a more technical overview or tutorial with actual Terraform files to spin up a replica stack, comment below and I can write up a more detailed post.&lt;/p&gt;

&lt;h2&gt;
  
  
  Technologies Used
&lt;/h2&gt;

&lt;p&gt;Monosi is an &lt;a href="https://monosi.dev"&gt;open source data observability platform&lt;/a&gt;, therefore our stack is composed mostly of open source technologies. Our current stack looks like:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgzv99rbkbh5rugzalvq8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgzv99rbkbh5rugzalvq8.png" alt="Monosi Stack Overview" width="800" height="316"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Technologies include &lt;a href="https://snowplowanalytics.com/"&gt;Snowplow&lt;/a&gt;, &lt;a href="https://airbyte.com/"&gt;Airbyte&lt;/a&gt;, &lt;a href="https://www.postgresql.org/"&gt;PostgreSQL&lt;/a&gt;, &lt;a href="https://www.getdbt.com/"&gt;dbt&lt;/a&gt;, &lt;a href="https://www.snowflake.com/"&gt;Snowflake&lt;/a&gt;, &lt;a href="https://www.metabase.com/"&gt;Metabase&lt;/a&gt;, and &lt;a href="https://monosi.dev"&gt;Monosi&lt;/a&gt; itself. All of it is hosted on AWS running in a VPC. &lt;/p&gt;

&lt;h2&gt;
  
  
  Data Collection / Extraction / Ingestion
&lt;/h2&gt;

&lt;p&gt;There are two ways we collect data and ingest it into our pipeline.&lt;/p&gt;

&lt;p&gt;First, we have data coming in from various third party tools such as Google Analytics, MailChimp, Github, Slack, etc. For extraction we use &lt;a href="https://airbyte.com/"&gt;Airbyte&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc8vs43exz1dfwdirr82z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc8vs43exz1dfwdirr82z.png" alt="Extraction with Airbyte" width="800" height="394"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We chose Airbyte because of their open source self-hosted offering and their connector flexibility. There are other open source tools in the market like &lt;a href="https://www.stitchdata.com/"&gt;StitchData&lt;/a&gt; or &lt;a href="https://meltano.com/"&gt;Meltano&lt;/a&gt;, but they have limitations as discussed in &lt;a href="https://airbyte.com/blog/airbyte-vs-singer-why-airbyte-is-not-built-on-top-of-singer"&gt;this post&lt;/a&gt; by the Airbyte team. If you don’t want to self-host extraction, consider &lt;a href="https://fivetran.com/"&gt;Fivetran&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjig570ngvqeif1diyvu2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjig570ngvqeif1diyvu2.png" alt="Snowplow event collection" width="800" height="331"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We also have telemetry set up on our Monosi product, collected through &lt;a href="https://snowplowanalytics.com/"&gt;Snowplow&lt;/a&gt;. As with Airbyte, we chose Snowplow because of its open source offering and its scalable event ingestion framework. There are other open source options to consider, including &lt;a href="https://jitsu.com/"&gt;Jitsu&lt;/a&gt; and &lt;a href="https://www.rudderstack.com/"&gt;RudderStack&lt;/a&gt;, or closed source options like &lt;a href="https://segment.com/"&gt;Segment&lt;/a&gt;. Since we started building our product with just a CLI offering, we didn’t need a full CDP solution, so we chose Snowplow.&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Storage
&lt;/h2&gt;

&lt;p&gt;To store our collected data, we use a Snowflake data warehouse for all final data collection and a PostgreSQL database as an event destination. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fckf1adpz9f3z747x1ayt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fckf1adpz9f3z747x1ayt.png" alt="Airbyte to Snowflake" width="800" height="335"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;All data from third party tools extracted with Airbyte is piped into Snowflake. We chose Snowflake because it’s easy to get started with (there’s a free trial), has an intuitive UI, and we already use it internally for testing. Since Monosi supports a Snowflake integration, it also makes development and testing easier.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqheay3n915rq2k1spz20.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqheay3n915rq2k1spz20.png" alt="Snowplow to PostgreSQL" width="800" height="347"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;All events collected with Snowplow are piped into a PostgreSQL database. This is kept separate because we wanted a dedicated database to act as storage for product event collection, and it was quick to set up with Snowplow.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F40xaz99xb42cik8x6n7v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F40xaz99xb42cik8x6n7v.png" alt="Postgres to Snowflake and dbt" width="800" height="379"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The setup also enables us to sync all of our event data to the data warehouse &lt;a href="https://airbyte.com/recipes/postgresql-database-to-snowflake"&gt;with Airbyte&lt;/a&gt; and perform necessary transformations on the data with &lt;a href="https://www.getdbt.com/"&gt;dbt cloud&lt;/a&gt;. &lt;/p&gt;

&lt;h2&gt;
  
  
  Data Visualization / Analysis
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffc7gfc20atq14c1v119f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffc7gfc20atq14c1v119f.png" alt="Snowflake to Metabase" width="800" height="339"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To analyze the data stored in Snowflake and Postgres, we use &lt;a href="https://metabase.com"&gt;Metabase&lt;/a&gt;. We chose Metabase because of its open source offering and easy-to-use interface. Other open source tools like &lt;a href="https://www.lightdash.com/"&gt;Lightdash&lt;/a&gt; and &lt;a href="https://superset.apache.org/"&gt;Superset&lt;/a&gt; exist, which we may add to the stack as our data team grows. &lt;/p&gt;

&lt;h2&gt;
  
  
  Data Reliability
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsp7o5rl1oe72insiabk2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsp7o5rl1oe72insiabk2.png" alt="Snowflake and Monosi" width="800" height="368"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As a data-driven company, we need to make sure that our data is flowing reliably from extraction to visualization. It is important to have our metrics up to date and accurately defined at all times. Hence, when a part of the data stack stops working or changes (e.g. event ingestion, data extraction, schema updates), we need to know what happened and why. &lt;/p&gt;

&lt;p&gt;This is where our own tool, &lt;a href="https://monosi.dev"&gt;Monosi&lt;/a&gt;, comes in. To ensure data reliability within our stack, we’ve deployed our own internal instance of Monosi. With Monosi, we get alerted if anomalies occur in our data and can perform root cause analysis.&lt;/p&gt;

&lt;h2&gt;
  
  
  Other Tooling
&lt;/h2&gt;

&lt;p&gt;There are other tools that we will likely adopt in the future but haven’t yet needed. Specifically, one category that is popular in modern data stacks is Reverse ETL (&lt;a href="https://hightouch.io/"&gt;Hightouch&lt;/a&gt;, &lt;a href="https://www.getcensus.com/"&gt;Census&lt;/a&gt;, or &lt;a href="https://www.grouparoo.com/"&gt;Grouparoo&lt;/a&gt;). We currently don’t have a use case for piping data back into third party tools, but one will likely come up in the future.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;Thanks for reading the overview of what our startup data stack looks like! If you have any questions about the setup or would like to see a more in depth post with actual Terraform files that handle both service creation and networking, leave a comment below!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>How to monitor Segment event data with Monosi</title>
      <dc:creator>Ivan Porollo</dc:creator>
      <pubDate>Fri, 18 Feb 2022 17:03:59 +0000</pubDate>
      <link>https://dev.to/iporollo/how-to-monitor-segment-event-data-with-monosi-343n</link>
      <guid>https://dev.to/iporollo/how-to-monitor-segment-event-data-with-monosi-343n</guid>
      <description>&lt;p&gt;&lt;a href="https://segment.com/"&gt;Segment&lt;/a&gt; is a popular CDP (Customer Data Platform) used by over 20,000 companies. One of the primary use cases for Segment is to collect event or clickstream data and pipe it into a destination of choice.&lt;/p&gt;

&lt;p&gt;With any event collection system, anomalies are bound to occur in the ingestion pipeline and in the data itself. General observability of such pipelines is important, and it is possible with tools like &lt;a href="https://www.datadoghq.com/blog/monitor-segment-datadog/"&gt;Datadog&lt;/a&gt;, which collect metadata around event delivery such as latency and delivery success vs. failure. &lt;/p&gt;

&lt;p&gt;Unfortunately, these tools fall short when it comes to monitoring the actual data (and the related metadata) collected in the events. &lt;/p&gt;

&lt;p&gt;Segment events generally hold vital information standardized by a set schema and instrumented in code. Schemas defined in code are occasionally incorrect or drift from the intended standard. This quickly becomes a data quality issue that standard observability tools cannot detect.&lt;/p&gt;
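&lt;p&gt;To make that concrete: suppose your standard says every &lt;code&gt;page&lt;/code&gt; event carries an &lt;code&gt;anonymous_id&lt;/code&gt; and a &lt;code&gt;received_at&lt;/code&gt; timestamp. An instrumentation bug that drops or renames a field still delivers events successfully, so delivery-level monitoring stays green. A hypothetical sketch of the kind of field-level check that catches it (the field names are illustrative, not a Segment or Monosi API):&lt;/p&gt;

```typescript
// Hypothetical field-level check; not Segment or Monosi code.
function validatePageEvent(event: any): string[] {
  const problems: string[] = [];
  if (typeof event.anonymous_id !== 'string') {
    problems.push('anonymous_id missing or not a string');
  }
  if (typeof event.received_at !== 'string') {
    problems.push('received_at missing or not a string');
  }
  return problems;
}

// An event instrumented with the wrong field name still delivers fine,
// but fails the data-level check:
const badEvent = { anonymousId: 'abc-123', received_at: '2022-02-18T17:00:00Z' };
// validatePageEvent(badEvent) reports the anonymous_id problem.
```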

&lt;p&gt;This is where &lt;strong&gt;data&lt;/strong&gt; observability tooling comes to the rescue. Monosi is an &lt;a href="https://monosi.dev"&gt;open source data observability&lt;/a&gt; and monitoring platform for data teams (see &lt;a href="https://github.com/monosidev/monosi"&gt;Monosi Github&lt;/a&gt;). It is used to quickly set up monitors on a data store to run checks for data quality issues and alert on detected anomalies.&lt;/p&gt;

&lt;p&gt;This post will walk you through setting up Segment event data monitoring with the Monosi platform, ensuring the data quality of your events in less than &lt;strong&gt;10 minutes&lt;/strong&gt;. &lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;For this tutorial, you will need a &lt;a href="https://app.segment.com/"&gt;Segment business account&lt;/a&gt; with an event source set up. If you don't have a business plan, Segment offers a 14 day free trial. If you don't have an event source, learn more on how to start collecting events with Segment &lt;a href="https://segment.com/docs/connections/sources/"&gt;here&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;You will also need one of your Segment destinations to be Postgres or Snowflake to work with Monosi. If you don't have either, I recommend creating a free &lt;a href="https://segment.com/docs/connections/storage/catalog/snowflake/"&gt;Snowflake account&lt;/a&gt; and creating a &lt;a href="https://segment.com/docs/connections/storage/catalog/snowflake/"&gt;Snowflake destination in Segment&lt;/a&gt;. For the purposes of this tutorial, we will be using Snowflake, but Postgres works as well. &lt;/p&gt;

&lt;p&gt;Finally, you will need Docker to install and run Monosi. If you're not familiar with Docker, follow the tutorial &lt;a href="https://docker-curriculum.com/#introduction"&gt;here&lt;/a&gt;. Make sure you have Docker installed and ready to use.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installing Monosi
&lt;/h2&gt;

&lt;p&gt;Monosi provides a &lt;a href="https://hub.docker.com/r/monosi/monosi"&gt;Docker image&lt;/a&gt; to run the web interface and simplify deployment. To install and run Monosi through Docker, run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -p 3000:3000 monosi/monosi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Note: Monosi also has a CLI version distributed through &lt;a href="https://pypi.org/project/monosi/"&gt;pypi&lt;/a&gt;. More information on how to run Monosi through the CLI can be found &lt;a href="https://docs.monosi.dev/introduction/getting-started/cli-as-code"&gt;here&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Connecting to the Segment Event Data Store
&lt;/h2&gt;

&lt;p&gt;Since Segment is writing data to a data store destination (in this case Snowflake), we need to give Monosi the appropriate connection details. &lt;/p&gt;

&lt;p&gt;Set up a connection to the data store in the UI by navigating to &lt;code&gt;http://localhost:3000/settings/sources&lt;/code&gt;. Click the &lt;code&gt;Create a Data Source&lt;/code&gt; button and fill out the connection details. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F92pxb34114xa3ji776cr.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F92pxb34114xa3ji776cr.gif" alt="Monosi snowflake connection setup" width="800" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Integrating with Slack
&lt;/h2&gt;

&lt;p&gt;Monosi sends alerts on detected anomalies to Slack. Set up a Slack integration by navigating to &lt;code&gt;http://localhost:3000/settings/integrations&lt;/code&gt; and creating a new integration with a Slack &lt;a href="https://api.slack.com/messaging/webhooks"&gt;channel webhook&lt;/a&gt;. More information can be found &lt;a href="https://docs.monosi.dev/user-guide/web-app#integrations"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nrx7296eb4k50idbknc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nrx7296eb4k50idbknc.png" alt="Monosi integrations page" width="800" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating a Monitor
&lt;/h2&gt;

&lt;p&gt;With the Segment event data source and the Slack integration connected, we can now create a monitor. Navigate to the Monosi page &lt;code&gt;http://localhost:3000/monitors&lt;/code&gt; and select the &lt;a href="https://docs.monosi.dev/user-guide/cli/monitors/table-health"&gt;table health monitor&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9hjw9148onlhmmjpu9pv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9hjw9148onlhmmjpu9pv.png" alt="Monosi monitor creation page" width="800" height="441"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Fill out the form with a monitor name, description, and an interval for the monitor to run on (in minutes).&lt;/p&gt;

&lt;p&gt;Then, select your created data source. For the purposes of this tutorial we are using Snowflake, therefore the inputs for a Segment &lt;code&gt;Page&lt;/code&gt; event monitor would be:&lt;/p&gt;

&lt;p&gt;Table name: &lt;code&gt;SEGMENT_WAREHOUSE&lt;/code&gt;.&lt;code&gt;SEGMENT_EVENTS&lt;/code&gt;.&lt;code&gt;PAGES&lt;/code&gt;&lt;br&gt;
Timestamp column: &lt;code&gt;RECEIVED_AT&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Save the monitor and it will appear in the monitors index view. This monitor will run until deletion. If any anomalies in the data are detected, it will send them to the connected Slack channel.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;🎉 Congratulations, you've just set up and scheduled a data monitor on your Segment event data. You can now add more monitors to other event tables in your database. Find more information on how to use Monosi &lt;a href="https://docs.monosi.dev/"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If you have any questions, join our &lt;a href="https://monosi.dev/slack"&gt;Slack community&lt;/a&gt; or open an issue in our repository on &lt;a href="https://github.com/monosidev/monosi"&gt;Github&lt;/a&gt;. If you want to see more posts like this, &lt;a href="https://monosi.dev/community.html"&gt;subscribe to our newsletter&lt;/a&gt;!&lt;/p&gt;

</description>
      <category>opensource</category>
      <category>database</category>
      <category>monitoring</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>How to Monitor Supabase with Monosi</title>
      <dc:creator>Ivan Porollo</dc:creator>
      <pubDate>Tue, 15 Feb 2022 22:01:08 +0000</pubDate>
      <link>https://dev.to/iporollo/how-to-monitor-supabase-with-monosi-32e0</link>
      <guid>https://dev.to/iporollo/how-to-monitor-supabase-with-monosi-32e0</guid>
      <description>&lt;p&gt;Monosi is an &lt;a href="https://monosi.dev"&gt;open source data observability&lt;/a&gt; and monitoring platform for data teams (see &lt;a href="https://github.com/monosidev/monosi"&gt;Monosi Github&lt;/a&gt;). It is used to quickly set up monitors on a data store to run checks for data quality issues and alert on detected anomalies.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://supabase.com/"&gt;Supabase&lt;/a&gt; is an open source Firebase alternative. It lets you easily create a backend with a Postgres database, storage, APIs and more.&lt;/p&gt;

&lt;p&gt;In this article, we will show you how to set up data observability for Supabase using Monosi. With this setup, you will be able to detect data anomalies that occur in your Supabase instance.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;For this tutorial, you will need a &lt;a href="https://app.supabase.io/"&gt;Supabase account&lt;/a&gt; with a populated database. If you don't have an account, it's quick to get started with their platform by navigating to the &lt;a href="https://app.supabase.io/"&gt;Supabase app&lt;/a&gt; and following their &lt;a href="https://supabase.com/docs/"&gt;getting started documentation&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;You will also need Docker to install and run Monosi. If you're not familiar with Docker, follow the tutorial &lt;a href="https://docker-curriculum.com/#introduction"&gt;here&lt;/a&gt;. Make sure Docker is installed and ready to use.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installing Monosi
&lt;/h2&gt;

&lt;p&gt;Monosi provides a &lt;a href="https://hub.docker.com/r/monosi/monosi"&gt;Docker image&lt;/a&gt; to run the web interface and simplify deployment. To install and run Monosi through Docker, run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -p 3000:3000 monosi/monosi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Note: Monosi also has a CLI version distributed through &lt;a href="https://pypi.org/project/monosi/"&gt;pypi&lt;/a&gt;. More information on how to run Monosi through the CLI can be found &lt;a href="https://docs.monosi.dev/introduction/getting-started/cli-as-code"&gt;here&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Connecting to Supabase
&lt;/h2&gt;

&lt;p&gt;Since Monosi supports PostgreSQL, it can connect to and monitor any Supabase instance.&lt;/p&gt;

&lt;p&gt;To get your database credentials from Supabase, navigate to the Settings &amp;gt; Database page and find the Connection info section.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgylauxh2zi891tyem6ns.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgylauxh2zi891tyem6ns.png" alt="Supabase connection string" width="800" height="411"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the Supabase connection details in hand, navigate to the Monosi datasources page at &lt;code&gt;http://localhost:3000/settings/sources&lt;/code&gt;. Fill out the Postgres connection details with your respective Supabase instance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F19n4vam1646p67e8jhcm.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F19n4vam1646p67e8jhcm.gif" alt="Monosi PostgreSQL setup" width="800" height="441"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Integrating with Slack
&lt;/h2&gt;

&lt;p&gt;Monosi sends alerts on detected anomalies to Slack. Set up a Slack integration by navigating to &lt;code&gt;http://localhost:3000/settings/integrations&lt;/code&gt; and creating a new integration with a Slack &lt;a href="https://api.slack.com/messaging/webhooks"&gt;channel webhook&lt;/a&gt;. More information can be found &lt;a href="https://docs.monosi.dev/user-guide/web-app#integrations"&gt;here&lt;/a&gt;.&lt;/p&gt;
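
&lt;p&gt;&lt;em&gt;Slack incoming webhooks are plain HTTP POSTs of a small JSON body, which makes them easy to smoke-test before wiring them into Monosi. A stdlib-only Python sketch (the helper names here are illustrative, not part of Monosi):&lt;/em&gt;&lt;/p&gt;

```python
import json
import urllib.request

def build_alert_payload(monitor_name: str, message: str) -> bytes:
    """Build the JSON body that a Slack incoming webhook expects."""
    return json.dumps({"text": f"[{monitor_name}] {message}"}).encode("utf-8")

def send_slack_alert(webhook_url: str, monitor_name: str, message: str) -> int:
    """POST an alert to a Slack incoming webhook; returns the HTTP status."""
    req = urllib.request.Request(
        webhook_url,
        data=build_alert_payload(monitor_name, message),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # Slack responds 200 with body "ok" on success

# Build a payload locally (no network needed to inspect it):
payload = build_alert_payload("pages table health", "row count anomaly detected")
```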

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nrx7296eb4k50idbknc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nrx7296eb4k50idbknc.png" alt="Monosi integrations page" width="800" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating a Monitor
&lt;/h2&gt;

&lt;p&gt;With the Supabase data source and Slack integration connected, we can now create a monitor. Navigate to the Monosi page &lt;code&gt;http://localhost:3000/monitors&lt;/code&gt; and select the &lt;a href="https://docs.monosi.dev/user-guide/cli/monitors/table-health"&gt;table health monitor&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9hjw9148onlhmmjpu9pv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9hjw9148onlhmmjpu9pv.png" alt="Monosi monitor creation page" width="800" height="441"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Fill out the form with a monitor name, description, and an interval for the monitor to run on (in minutes).&lt;/p&gt;

&lt;p&gt;Then, select the Supabase data source and enter the name of the table you want monitored. Ensure that the table has a timestamp column, and enter that column's name in the timestamp field.&lt;/p&gt;
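
&lt;p&gt;&lt;em&gt;The equivalent monitor as code in the Monosi CLI would look roughly like this (a sketch: the &lt;code&gt;profiles&lt;/code&gt; table and &lt;code&gt;created_at&lt;/code&gt; column are placeholders for your own schema):&lt;/em&gt;&lt;/p&gt;

```yaml
monosi:
  monitors:
  - name: public.profiles - Table Health
    description: Monitoring the health of a Supabase table
    type: table
    table: postgres.public.profiles
    timestamp_field: created_at
```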

&lt;p&gt;Save the monitor and it will appear in the monitors index view. The monitor will run on its schedule until it is deleted. If any anomalies are detected, Monosi will send an alert to the connected Slack channel.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;🎉 Congratulations, you've just set up and scheduled a data monitor on your Supabase instance. You can now add more monitors to other tables in your database. Find more information on how to use Monosi &lt;a href="https://docs.monosi.dev/"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If you have any questions, join our &lt;a href="https://monosi.dev/slack"&gt;Slack community&lt;/a&gt; or open an issue in our repository on &lt;a href="https://github.com/monosidev/monosi"&gt;GitHub&lt;/a&gt;. If you want to see more posts like this, &lt;a href="https://monosi.dev/community.html"&gt;subscribe to our newsletter&lt;/a&gt;!&lt;/p&gt;

</description>
      <category>postgres</category>
      <category>python</category>
      <category>opensource</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Setting up data monitoring for PostgreSQL</title>
      <dc:creator>Ivan Porollo</dc:creator>
      <pubDate>Tue, 08 Feb 2022 23:26:30 +0000</pubDate>
      <link>https://dev.to/iporollo/setting-up-data-monitoring-for-postgresql-280e</link>
      <guid>https://dev.to/iporollo/setting-up-data-monitoring-for-postgresql-280e</guid>
      <description>&lt;p&gt;Data quality and reliability are still a source of headaches for data organizations today. &lt;a href="https://monosi.dev"&gt;Monosi&lt;/a&gt; exists to resolve the issues that teams face.&lt;/p&gt;

&lt;p&gt;Monosi is an &lt;a href="https://monosi.dev"&gt;open source data observability&lt;/a&gt; and monitoring platform for data teams (see &lt;a href="https://github.com/monosidev/monosi"&gt;Monosi Github&lt;/a&gt;). It is used to quickly set up monitors on a data store to run checks for data quality issues and alert on detected anomalies.&lt;/p&gt;

&lt;p&gt;This article will walk you through how to get started monitoring a PostgreSQL database in &lt;strong&gt;less than 10 minutes&lt;/strong&gt; with Monosi. &lt;/p&gt;

&lt;h2&gt;
  
  
Prerequisites
&lt;/h2&gt;

&lt;p&gt;For this tutorial, we are going to use &lt;a href="https://www.postgresql.org/"&gt;PostgreSQL&lt;/a&gt; as our database to monitor. We are going to use a public Postgres instance that is available online from &lt;a href="https://rnacentral.org/help/public-database"&gt;RNAcentral&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installing Monosi (CLI)
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;There are two ways of using Monosi - a CLI and a web interface. We will first run through the CLI workflow, then showcase the web UI.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Monosi's CLI is a Python package that is distributed through &lt;a href="https://pypi.org/project/monosi/"&gt;pypi&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: Monosi CLI is only compatible with Python 3.6 and higher.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;To install it, open your terminal and run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install monosi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Check that it's been installed by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;monosi --version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Configuring a data source
&lt;/h2&gt;

&lt;p&gt;With Monosi installed, we need to configure a data source to monitor. Monosi reads connection details from the &lt;code&gt;~/.monosi/workspaces.yml&lt;/code&gt; file, so let's create it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir ~/.monosi
touch ~/.monosi/workspaces.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Edit the &lt;code&gt;workspaces.yml&lt;/code&gt; file in your editor of choice and fill it out with the public database information. This is what the file will look like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;default:
    sources:
        default:
            type: postgres
            user: reader
            password: NWDMCE5xdipIjRrp
            host: hh-pgsql-public.ebi.ac.uk
            port: 5432
            database: pfmegrnargs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can find more information on setting up the &lt;code&gt;workspaces.yml&lt;/code&gt; file &lt;a href="https://docs.monosi.dev/user-guide/cli/data-sources"&gt;here&lt;/a&gt;.&lt;/p&gt;
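
&lt;p&gt;&lt;em&gt;A quick way to catch typos before running Monosi is to check the source block for the required connection fields. This stdlib-only Python sketch is not part of Monosi; it simply mirrors the fields used above:&lt;/em&gt;&lt;/p&gt;

```python
# Sanity-check a Monosi postgres source before running the CLI.
# Not part of Monosi itself -- a stdlib-only sketch mirroring the config above.
REQUIRED_POSTGRES_FIELDS = {"type", "user", "password", "host", "port", "database"}

def missing_fields(source: dict) -> set:
    """Return required connection fields that are absent or empty."""
    return {field for field in REQUIRED_POSTGRES_FIELDS if not source.get(field)}

source = {
    "type": "postgres",
    "user": "reader",
    "password": "NWDMCE5xdipIjRrp",
    "host": "hh-pgsql-public.ebi.ac.uk",
    "port": 5432,
    "database": "pfmegrnargs",
}

print(missing_fields(source))  # empty set: the config above is complete
```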

&lt;h2&gt;
  
  
  Creating a Monosi project
&lt;/h2&gt;

&lt;p&gt;Navigate to the directory where you want your Monosi project to live and create a project repository by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;monosi init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the directory you should now see a folder called &lt;code&gt;monosi-repo&lt;/code&gt; (you can rename this if you want). Navigate into the folder by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd monosi-repo 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the &lt;code&gt;monosi-repo&lt;/code&gt; directory, you should see a &lt;code&gt;monosi_project.yml&lt;/code&gt; file. &lt;/p&gt;

&lt;p&gt;This file configures which connection to use and your monitor paths, as well as other metadata. More information on setting up the &lt;code&gt;monosi_project.yml&lt;/code&gt; file can be found &lt;a href="https://docs.monosi.dev/user-guide/cli/monitors"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;For the purposes of this tutorial, we don't need to edit the file. &lt;/p&gt;

&lt;h2&gt;
  
  
  Creating a monitor
&lt;/h2&gt;

&lt;p&gt;Monosi automatically creates a folder called &lt;code&gt;monitors&lt;/code&gt; in the &lt;code&gt;monosi-repo&lt;/code&gt; directory. In that folder, an example custom SQL monitor is defined to show how the syntax works.  &lt;/p&gt;

&lt;p&gt;The example public database is populated with 40+ tables. To take a look at how the schema is structured, click &lt;a href="https://rnacentral.org/help/public-database"&gt;here&lt;/a&gt;. For the purposes of this example, we will create a monitor on one of the tables, specifically &lt;code&gt;auth_user&lt;/code&gt;. To do so, run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;touch ./monitors/auth_user.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Edit the &lt;code&gt;auth_user.yml&lt;/code&gt; file in your editor of choice and paste the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;monosi:
  monitors:
  - name: pfmegrnargs.rnacen.auth_user - Table Health
    description: Monitoring the health of the auth_user table
    type: table
    table: pfmegrnargs.rnacen.auth_user
    timestamp_field: last_login
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This creates a monitor as code on the &lt;code&gt;auth_user&lt;/code&gt; table. It monitors for &lt;a href="https://docs.monosi.dev/user-guide/monitors/table-health"&gt;table health metrics&lt;/a&gt;.&lt;/p&gt;
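
&lt;p&gt;&lt;em&gt;To build intuition for what a table health monitor flags: it tracks metrics such as row count and freshness over time and alerts on outliers. A toy z-score illustration in Python (not Monosi's actual implementation):&lt;/em&gt;&lt;/p&gt;

```python
import statistics

def is_anomalous(history, value, threshold=3.0):
    """Flag `value` if it deviates from the historical mean by more than
    `threshold` population standard deviations (a simple z-score test)."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return value != mean
    return abs(value - mean) / stdev > threshold

# Daily row counts for a table, then a sudden drop:
row_counts = [1000, 1010, 990, 1005, 995]
print(is_anomalous(row_counts, 998))  # an ordinary day
print(is_anomalous(row_counts, 200))  # looks like an ingestion failure
```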

&lt;h2&gt;
  
  
  Run the monitors
&lt;/h2&gt;

&lt;p&gt;Start the monitor by running the following command in the &lt;code&gt;monosi-repo&lt;/code&gt; directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;monosi run
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It takes a few seconds to run. The resulting output should be similar to: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbpnbro641pwnbi9e747k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbpnbro641pwnbi9e747k.png" alt="monitors-output" width="800" height="344"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🎉 Congratulations, you just ran your first Monosi monitor! From the output of the run, you should see that Monosi detected no anomalies in the public &lt;code&gt;auth_user&lt;/code&gt; data.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installing Monosi (Web Interface)
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;Note: At the moment, the web interface is decoupled from the CLI, so any monitors that are created in the CLI will not be synced.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;With the release of &lt;code&gt;v0.0.3&lt;/code&gt;, Monosi has a web user interface. For a full walkthrough of the UI, watch the video &lt;a href="https://youtu.be/MdbMDphpgUI"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Monosi provides a Docker image to run the web application. If you're not familiar with Docker, follow the tutorial &lt;a href="https://docker-curriculum.com/#introduction"&gt;here&lt;/a&gt;. Make sure Docker is installed and ready to use by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker ps
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To get the Monosi UI up and running with Docker, run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -p 3000:3000 monosi/monosi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Navigate to &lt;code&gt;http://localhost:3000&lt;/code&gt; and you will see the Monosi UI. &lt;/p&gt;

&lt;p&gt;Set up a connection to the public PostgreSQL instance in the UI by navigating to &lt;code&gt;http://localhost:3000/settings/sources&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fylbu62hs7r6dor99msfo.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fylbu62hs7r6dor99msfo.gif" alt="monosi postgres setup web ui" width="800" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Fill out the database connection form with the following values:&lt;/p&gt;

&lt;p&gt;Name for Data Source: &lt;code&gt;RNAcentral Public Database&lt;/code&gt;&lt;br&gt;
User: &lt;code&gt;reader&lt;/code&gt;&lt;br&gt;
Password: &lt;code&gt;NWDMCE5xdipIjRrp&lt;/code&gt;&lt;br&gt;
Host: &lt;code&gt;hh-pgsql-public.ebi.ac.uk&lt;/code&gt;&lt;br&gt;
Port: &lt;code&gt;5432&lt;/code&gt;&lt;br&gt;
Database: &lt;code&gt;pfmegrnargs&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;After setting up the connection, create a monitor in the UI by navigating to &lt;code&gt;http://localhost:3000/monitors&lt;/code&gt; and clicking the Create Monitor button. Fill out the form with the following information:&lt;/p&gt;

&lt;p&gt;Name: &lt;code&gt;auth_user table health&lt;/code&gt;&lt;br&gt;
Description: &lt;code&gt;auth_user table health monitor&lt;/code&gt;&lt;br&gt;
Check every: &lt;code&gt;720&lt;/code&gt; minutes&lt;br&gt;
Monitor type: &lt;code&gt;Table Health&lt;/code&gt;&lt;br&gt;
Data Source: &lt;code&gt;RNAcentral Public Database&lt;/code&gt;&lt;br&gt;
Table: &lt;code&gt;pfmegrnargs.rnacen.auth_user&lt;/code&gt;&lt;br&gt;
Timestamp Field: &lt;code&gt;last_login&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Hit save and your new monitor will appear in the monitors table.&lt;/p&gt;

&lt;p&gt;🎉 Congratulations, you've just set up and scheduled a data monitor! This process will run indefinitely until you delete the monitor. To get alerts on detected anomalies, set up a Slack connection at &lt;code&gt;http://localhost:3000/settings/integrations&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;Now that you’ve worked through an example using a public PostgreSQL instance, you can further extend this to your own data store. For more information, get started &lt;a href="https://docs.monosi.dev"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If you have any questions, join our &lt;a href="https://monosi.dev/slack"&gt;Slack&lt;/a&gt; community or open an issue in our repository on &lt;a href="https://github.com/monosidev/monosi"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>postgres</category>
      <category>python</category>
      <category>opensource</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Setting up data monitoring for Snowflake</title>
      <dc:creator>Ivan Porollo</dc:creator>
      <pubDate>Mon, 31 Jan 2022 08:51:46 +0000</pubDate>
      <link>https://dev.to/iporollo/setting-up-data-monitoring-for-snowflake-3b5p</link>
      <guid>https://dev.to/iporollo/setting-up-data-monitoring-for-snowflake-3b5p</guid>
      <description>&lt;p&gt;&lt;em&gt;Updated Feb. 4th, 2022&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Data quality and reliability are still a source of headaches for data organizations today. &lt;a href="https://monosi.dev"&gt;Monosi&lt;/a&gt; exists to resolve the issues that teams face.&lt;/p&gt;

&lt;p&gt;Monosi is an &lt;a href="https://monosi.dev"&gt;open source data observability&lt;/a&gt; and monitoring platform for data teams (see &lt;a href="https://github.com/monosidev/monosi"&gt;Monosi Github&lt;/a&gt;). It is used to quickly set up monitors on a data store to run checks for data quality issues and alert on detected anomalies.&lt;/p&gt;

&lt;p&gt;This article will walk you through how to get started monitoring a data warehouse in &lt;strong&gt;less than 10 minutes&lt;/strong&gt; with Monosi. &lt;/p&gt;

&lt;h2&gt;
  
  
Prerequisites
&lt;/h2&gt;

&lt;p&gt;For this tutorial, we are going to use &lt;a href="https://www.snowflake.com/"&gt;Snowflake&lt;/a&gt; as our data warehouse. &lt;/p&gt;

&lt;p&gt;If you don't have a Snowflake account, it's easy to &lt;a href="https://signup.snowflake.com"&gt;create one&lt;/a&gt;. Go ahead and create an account - select the &lt;strong&gt;standard&lt;/strong&gt; edition and your preferred cloud provider (for this example we will be using AWS - US West). &lt;/p&gt;

&lt;p&gt;After signing up, you should receive an email with account details to login. With the account details in place, you are ready to start with Monosi. &lt;/p&gt;

&lt;h2&gt;
  
  
  Installing Monosi
&lt;/h2&gt;

&lt;p&gt;Monosi is a Python package that is distributed through &lt;a href="https://pypi.org/project/monosi/"&gt;pypi&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: Monosi is only compatible with Python 3.6 and higher.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;To install it, open your terminal and run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install monosi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Check that it's been installed by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;monosi --version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Configuring a data source
&lt;/h2&gt;

&lt;p&gt;With Monosi installed, we need to configure a data source to monitor. Monosi reads connection details from the &lt;code&gt;~/.monosi/workspaces.yml&lt;/code&gt; file, so let's create it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir ~/.monosi
touch ~/.monosi/workspaces.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Edit the &lt;code&gt;workspaces.yml&lt;/code&gt; file in your editor of choice and fill it out with your information (specifically &lt;code&gt;&amp;lt;user-name&amp;gt;&lt;/code&gt;, &lt;code&gt;&amp;lt;password&amp;gt;&lt;/code&gt;, &lt;code&gt;&amp;lt;account-name&amp;gt;&lt;/code&gt;). This is what the file will look like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;default:
    sources:
        default:
            type: snowflake
            user: &amp;lt;user-name&amp;gt;
            password: &amp;lt;password&amp;gt;
            account: &amp;lt;account-name&amp;gt;
            warehouse: COMPUTE_WH
            database: SNOWFLAKE_SAMPLE_DATA
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For the purposes of this tutorial, we will be using the &lt;code&gt;SNOWFLAKE_SAMPLE_DATA&lt;/code&gt; database. This database is provided out of the box by Snowflake. &lt;/p&gt;

&lt;p&gt;You can find more information on setting up the &lt;code&gt;workspaces.yml&lt;/code&gt; file &lt;a href="https://docs.monosi.dev/user-guide/data-sources"&gt;here&lt;/a&gt;.&lt;/p&gt;
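
&lt;p&gt;&lt;em&gt;The &lt;code&gt;account&lt;/code&gt; value is your Snowflake account identifier: everything in your account URL before &lt;code&gt;.snowflakecomputing.com&lt;/code&gt;. A small stdlib Python sketch (illustrative, not part of Monosi; the account name below is made up) for extracting it:&lt;/em&gt;&lt;/p&gt;

```python
from urllib.parse import urlparse

def account_from_url(url: str) -> str:
    """Extract the Snowflake account identifier from an account URL, e.g.
    https://xy12345.us-west-2.snowflakecomputing.com -> xy12345.us-west-2
    (xy12345 is a made-up example account)."""
    host = urlparse(url).hostname or url
    suffix = ".snowflakecomputing.com"
    return host[: -len(suffix)] if host.endswith(suffix) else host

print(account_from_url("https://xy12345.us-west-2.snowflakecomputing.com"))
```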

&lt;h2&gt;
  
  
  Creating a Monosi project
&lt;/h2&gt;

&lt;p&gt;Navigate to the directory where you want your Monosi project to live and create a project repository by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;monosi init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the directory you should now see a folder called &lt;code&gt;monosi-repo&lt;/code&gt; (you can rename this if you want). Navigate into the folder by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd monosi-repo 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the &lt;code&gt;monosi-repo&lt;/code&gt; directory, you should see a &lt;code&gt;monosi_project.yml&lt;/code&gt; file. &lt;/p&gt;

&lt;p&gt;This file configures which connection to use and your monitor paths, as well as other metadata. More information on setting up the &lt;code&gt;monosi_project.yml&lt;/code&gt; file can be found &lt;a href="https://docs.monosi.dev/user-guide/cli/monitors"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;For the purposes of this tutorial, we don't need to edit the file. &lt;/p&gt;

&lt;h2&gt;
  
  
  Creating a monitor
&lt;/h2&gt;

&lt;p&gt;Monosi automatically creates a folder called &lt;code&gt;monitors&lt;/code&gt; in the &lt;code&gt;monosi-repo&lt;/code&gt; directory. In that folder, an example custom SQL monitor is defined to show how the syntax works.  &lt;/p&gt;

&lt;p&gt;As mentioned, Snowflake provides some example data in their &lt;code&gt;SNOWFLAKE_SAMPLE_DATA&lt;/code&gt; database. We can create a monitor for the orders table:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;touch ./monitors/orders.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Edit the &lt;code&gt;orders.yml&lt;/code&gt; file in your editor of choice and paste the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;monosi:
  monitors:
  - table: SNOWFLAKE_SAMPLE_DATA.TPCH_SF100.orders
    timestamp_field: o_orderdate
    type: table
    days_ago: -10000
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This creates a monitor as code on the &lt;code&gt;orders&lt;/code&gt; table. It monitors for &lt;a href="https://docs.monosi.dev/user-guide/monitors/table-health"&gt;table health metrics&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Run the monitors
&lt;/h2&gt;

&lt;p&gt;Start the monitor by running the following command in the &lt;code&gt;monosi-repo&lt;/code&gt; directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;monosi run
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It takes a few seconds to run. The resulting output should be similar to: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flonx7pm57kgjgglnvoun.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flonx7pm57kgjgglnvoun.png" alt="orders monitor output" width="800" height="366"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🎉 Congratulations, you just ran your first Monosi monitor! From the output of the run, you should see that Monosi detected several anomalies in the example data.&lt;/p&gt;

&lt;h2&gt;
  
  
  Scheduling monitors
&lt;/h2&gt;

&lt;p&gt;With the release of &lt;code&gt;v0.0.3&lt;/code&gt;, we have added a user interface that supports the scheduling of monitors. We've also created a video walkthrough of the UI &lt;a href="https://youtu.be/MdbMDphpgUI"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Monosi provides a Docker image to run the application. Make sure Docker is installed and ready to use by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker ps
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To get the Monosi UI up and running with Docker, run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -p 3000:3000 monosi/monosi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Navigate to &lt;code&gt;http://localhost:3000&lt;/code&gt; and you will see the Monosi UI. &lt;/p&gt;

&lt;p&gt;Set up a connection to your Snowflake account in the UI by navigating to &lt;code&gt;http://localhost:3000/settings/sources&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;After setting up the connection, create a monitor in the UI by navigating to &lt;code&gt;http://localhost:3000/monitors&lt;/code&gt; and clicking the Create Monitor button. Fill out the form with the following information:&lt;/p&gt;

&lt;p&gt;Name: &lt;code&gt;Orders Monitor&lt;/code&gt;&lt;br&gt;
Check every: &lt;code&gt;720&lt;/code&gt; minutes&lt;br&gt;
Monitor type: &lt;code&gt;Table Health&lt;/code&gt;&lt;br&gt;
Data Source: &lt;code&gt;&amp;lt;Your Snowflake Datasource Name&amp;gt;&lt;/code&gt;&lt;br&gt;
Table: &lt;code&gt;SNOWFLAKE_SAMPLE_DATA.TPCH_SF100.orders&lt;/code&gt;&lt;br&gt;
Timestamp Field: &lt;code&gt;o_orderdate&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Hit save and your new monitor will appear in the monitors table.&lt;/p&gt;

&lt;p&gt;🎉 Congratulations, you just scheduled a data monitor! This process will run indefinitely until you delete the monitor. To get alerts on detected anomalies, set up a Slack connection at &lt;code&gt;http://localhost:3000/settings/integrations&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;Now that you’ve worked through an example using Snowflake's provided data, you can further extend this to your own data with the Monosi &lt;a href="https://docs.monosi.dev/user-guide/cli#profile"&gt;profiler&lt;/a&gt; and &lt;a href="https://docs.monosi.dev/user-guide/monitors/custom-sql"&gt;custom SQL monitors&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If you have any questions, join our &lt;a href="https://monosi.dev/slack"&gt;Slack&lt;/a&gt; community or open an issue in our repository on &lt;a href="https://github.com/monosidev/monosi"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>database</category>
      <category>opensource</category>
      <category>tutorial</category>
      <category>python</category>
    </item>
  </channel>
</rss>
