<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Akash Deep Sharma</title>
    <description>The latest articles on DEV Community by Akash Deep Sharma (@akash3106).</description>
    <link>https://dev.to/akash3106</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F536953%2F7001378f-b118-4478-8c8d-2c4a9506fd5b.jpeg</url>
      <title>DEV Community: Akash Deep Sharma</title>
      <link>https://dev.to/akash3106</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/akash3106"/>
    <language>en</language>
    <item>
      <title>Let's See How We Can Use ADF to Load Data from Any Source to Snowflake</title>
      <dc:creator>Akash Deep Sharma</dc:creator>
      <pubDate>Thu, 24 Mar 2022 18:57:19 +0000</pubDate>
      <link>https://dev.to/akash3106/lets-see-how-we-can-use-adf-to-load-data-from-any-source-to-snowflake--1855</link>
      <guid>https://dev.to/akash3106/lets-see-how-we-can-use-adf-to-load-data-from-any-source-to-snowflake--1855</guid>
      <description>&lt;p&gt;&lt;strong&gt;Azure Data Factory Mainly Follow Two Approaches&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Direct Copy Approach&lt;/li&gt;
&lt;li&gt;Stage Copy Approach&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Direct Copy Approach&lt;/strong&gt; &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When the source is Azure Blob Storage or another Azure storage service, we can use the Direct Copy approach. In this case we do not need to configure a staging area; just passing the source dataset and the sink dataset is enough to implement the pipeline.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Stage Copy Approach&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If we want to load data into Snowflake from a source other than an Azure storage service, we use the Stage Copy approach. For that we need to configure a staging area (an interim Azure Blob Storage location) and pass the source dataset as well as the sink dataset; ADF first copies the data to the staging store and then loads it from there into Snowflake.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7o4e497z9m81cl1hsiew.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7o4e497z9m81cl1hsiew.png" alt="Stage Copy Approach"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Now let's see how to load data from MySQL to Snowflake using ADF.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Steps&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Set up an Azure account → sign in to your account → go to the Azure portal.&lt;/li&gt;
&lt;li&gt;In Azure, create a new Azure Data Factory instance.&lt;/li&gt;
&lt;li&gt;Set up a Snowflake account → create a user → create a database and schema.&lt;/li&gt;
&lt;li&gt;In ADF Studio, go to the Manage section and create a new linked service to connect MySQL with ADF.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faftvhcutrjby4egs9b27.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faftvhcutrjby4egs9b27.png" alt="MYSQL Linking Services"&gt;&lt;/a&gt;&lt;/p&gt;
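
&lt;p&gt;As a rough sketch, the MySQL linked service created above corresponds to JSON like the following (the service name and connection details here are illustrative placeholders, not values from this walkthrough):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;{
  "name": "MySqlLinkedService",
  "properties": {
    "type": "MySql",
    "typeProperties": {
      "connectionString": "Server=myserver.example.com;Port=3306;Database=mydb;UID=myuser;SSLMode=1;UseSystemTrustStore=0",
      "password": { "type": "SecureString", "value": "mypassword" }
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;In practice the password is usually stored in Azure Key Vault rather than inline as a SecureString.&lt;/p&gt;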

&lt;ol start="5"&gt;
&lt;li&gt;After that, create a new linked service for Snowflake.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq5e6rq8l6sgty028vbmx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq5e6rq8l6sgty028vbmx.png" alt="Snowflake Linking Services"&gt;&lt;/a&gt;&lt;/p&gt;
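
&lt;p&gt;The Snowflake linked service follows the same pattern; a sketch of its JSON definition (again with placeholder account, database, and warehouse names) looks roughly like this:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;{
  "name": "SnowflakeLinkedService",
  "properties": {
    "type": "Snowflake",
    "typeProperties": {
      "connectionString": "jdbc:snowflake://myaccount.snowflakecomputing.com/?user=myuser&amp;amp;db=demo_db&amp;amp;warehouse=demo_wh",
      "password": { "type": "SecureString", "value": "mypassword" }
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;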

&lt;ol start="6"&gt;
&lt;li&gt;Now go to the Author section in ADF, click the dataset icon, and create a new dataset for the source table.&lt;/li&gt;
&lt;li&gt;After creating the source dataset, create a dataset for the destination. This requires a table in your Snowflake account, so create the table in Snowflake first, then create a new dataset in ADF for that Snowflake table.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgf1mm2vogrerjiszr1up.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgf1mm2vogrerjiszr1up.png" alt="Snowflake Dataset"&gt;&lt;/a&gt;&lt;/p&gt;
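
&lt;p&gt;For reference, the Snowflake objects backing the sink dataset can be created with plain SQL in a Snowflake worksheet; all names and columns below are illustrative:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;-- Run in Snowflake before creating the ADF sink dataset
CREATE DATABASE IF NOT EXISTS demo_db;
CREATE SCHEMA IF NOT EXISTS demo_db.demo_schema;
CREATE TABLE IF NOT EXISTS demo_db.demo_schema.customers (
  id        INTEGER,
  name      VARCHAR,
  email     VARCHAR,
  loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);
&lt;/code&gt;&lt;/pre&gt;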

&lt;ol start="8"&gt;
&lt;li&gt;Once the datasets are ready, we can create the pipeline. Go to the pipeline section and create a new pipeline; in the Activities pane, add the Copy Data activity.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx83xlgsv3nh7poqxmciy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx83xlgsv3nh7poqxmciy.png" alt="Creating a PipeLine"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="9"&gt;
&lt;li&gt;Now click the Copy Data activity and configure its Source and Sink tabs.&lt;/li&gt;
&lt;li&gt;Run your Azure Data Factory pipeline to load the data into Snowflake. In Snowflake's query history you should see the statements issued through the ODBC driver; this is how you know ADF has successfully landed the data in Snowflake.&lt;/li&gt;
&lt;/ol&gt;
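
&lt;p&gt;Putting it together, the Copy activity the pipeline runs corresponds to JSON roughly like the following. Dataset and linked service names are placeholders; because MySQL is not an Azure storage service, staging is enabled, as described earlier:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;{
  "name": "CopyMySqlToSnowflake",
  "type": "Copy",
  "inputs":  [ { "referenceName": "MySqlSourceDataset",   "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "SnowflakeSinkDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "MySqlSource" },
    "sink":   { "type": "SnowflakeSink" },
    "enableStaging": true,
    "stagingSettings": {
      "linkedServiceName": { "referenceName": "AzureBlobStagingLS", "type": "LinkedServiceReference" },
      "path": "adf-staging"
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;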

</description>
      <category>snowflake</category>
      <category>azure</category>
      <category>adf</category>
    </item>
  </channel>
</rss>
