Akash Deep Sharma
Let's See How We Can Use ADF To Load Data from Any Source to Snowflake

Azure Data Factory mainly follows two approaches:

  1. Direct Copy Approach
  2. Stage Copy Approach

  • Direct Copy Approach

When the source is an Azure storage service such as Azure Blob Storage (or when the destination is something other than Snowflake), we can use the Direct Copy Approach. We do not need to configure a staging area; just passing a source dataset and a sink dataset is enough to implement the pipeline.
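To make the distinction concrete, here is a minimal sketch of a direct-copy activity written with the azure-mgmt-datafactory Python SDK. The dataset names are hypothetical placeholders, and this assumes the source data already sits in Azure Blob Storage in a supported format such as delimited text:

```python
# Sketch only: a Copy activity with no staging configured (direct copy).
# "ds_blob_source" and "ds_snowflake_sink" are hypothetical dataset names.
from azure.mgmt.datafactory.models import (
    CopyActivity, DatasetReference, DelimitedTextSource, SnowflakeSink,
)

direct_copy = CopyActivity(
    name="DirectCopyToSnowflake",
    inputs=[DatasetReference(reference_name="ds_blob_source", type="DatasetReference")],
    outputs=[DatasetReference(reference_name="ds_snowflake_sink", type="DatasetReference")],
    source=DelimitedTextSource(),  # CSV data already in Azure Blob Storage
    sink=SnowflakeSink(),          # loads straight into Snowflake
    enable_staging=False,          # direct copy: no interim staging area
)
```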

  • Stage Copy Approach

If we want to load data into Snowflake from sources other than Azure storage services (MySQL, for example), we use the Stage Copy Approach. For that we need to configure a staging area in addition to passing the source dataset and the sink dataset; a sketch of the staged variant follows the diagram below.

Stage Copy Approach
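For comparison, here is a hedged sketch of the staged variant. Structurally, the only difference is that staging is enabled and pointed at a Blob Storage linked service; all names here ("ds_mysql_source", "ls_staging_blob", the staging path) are placeholders:

```python
# Sketch only: the same Copy activity, but routed through a Blob staging area.
from azure.mgmt.datafactory.models import (
    CopyActivity, DatasetReference, LinkedServiceReference,
    MySqlSource, SnowflakeSink, StagingSettings,
)

staged_copy = CopyActivity(
    name="StagedCopyToSnowflake",
    inputs=[DatasetReference(reference_name="ds_mysql_source", type="DatasetReference")],
    outputs=[DatasetReference(reference_name="ds_snowflake_sink", type="DatasetReference")],
    source=MySqlSource(),
    sink=SnowflakeSink(),
    enable_staging=True,  # staged copy: ADF stages the data in Blob first
    staging_settings=StagingSettings(
        linked_service_name=LinkedServiceReference(
            reference_name="ls_staging_blob", type="LinkedServiceReference"
        ),
        path="adf-staging",  # staging container (and optional folder)
    ),
)
```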

Now let's see how to load data from MySQL to Snowflake using ADF.

**Steps:**

  1. Set up an Azure account → sign in to your account → go to the Azure portal.
  2. In the Azure portal, create a new Azure Data Factory instance.
  3. Set up a Snowflake account → create a user → create a database and schema.
  4. In ADF Studio, go to the Manage section and create a new linked service to connect MySQL with ADF (a scripted sketch follows the screenshot below).

MySQL Linked Service
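If you prefer to script this step instead of clicking through the Studio UI, a hedged sketch with the azure-mgmt-datafactory SDK might look like this; the subscription, resource group, factory name, and connection string are all placeholders:

```python
# Sketch only: create the MySQL linked service programmatically.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    LinkedServiceResource, MySqlLinkedService, SecureString,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

mysql_ls = LinkedServiceResource(
    properties=MySqlLinkedService(
        connection_string=SecureString(
            value="Server=<host>;Port=3306;Database=<db>;UID=<user>;PWD=<password>"
        )
    )
)
client.linked_services.create_or_update(
    "<resource-group>", "<factory-name>", "ls_mysql", mysql_ls
)
```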

  5. After that, create a new linked service for Snowflake (again, a scripted sketch follows the screenshot below).

Snowflake Linked Service
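The Snowflake linked service can be scripted the same way. The connection-string format below follows the ADF Snowflake connector's documented pattern; the account, warehouse, and credentials are placeholders, and `client` is the management client from the previous sketch:

```python
# Sketch only: create the Snowflake linked service programmatically.
from azure.mgmt.datafactory.models import (
    LinkedServiceResource, SecureString, SnowflakeLinkedService,
)

snowflake_ls = LinkedServiceResource(
    properties=SnowflakeLinkedService(
        connection_string=(
            "jdbc:snowflake://<account>.snowflakecomputing.com/"
            "?user=<user>&db=<database>&warehouse=<warehouse>&role=<role>"
        ),
        password=SecureString(value="<password>"),
    )
)
client.linked_services.create_or_update(
    "<resource-group>", "<factory-name>", "ls_snowflake", snowflake_ls
)
```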

  6. Now go to the Author section in ADF, click the dataset icon, and create a new dataset from the source table.
  7. After creating the source dataset, create a dataset for the destination. For this we need a table in our Snowflake account, so create the table in Snowflake first, then create a new dataset in ADF for that Snowflake table (a scripted sketch follows the screenshot below).

Snowflake Dataset
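Scripted equivalents of the two datasets might look like this; the table and schema names are hypothetical, and `client` is the management client from the linked-service sketches:

```python
# Sketch only: one dataset over the MySQL source table, one over the
# Snowflake target table.
from azure.mgmt.datafactory.models import (
    DatasetResource, LinkedServiceReference, MySqlTableDataset, SnowflakeDataset,
)

mysql_ds = DatasetResource(
    properties=MySqlTableDataset(
        linked_service_name=LinkedServiceReference(
            reference_name="ls_mysql", type="LinkedServiceReference"
        ),
        table_name="customers",  # hypothetical source table
    )
)
snowflake_ds = DatasetResource(
    properties=SnowflakeDataset(
        linked_service_name=LinkedServiceReference(
            reference_name="ls_snowflake", type="LinkedServiceReference"
        ),
        schema_type_properties_schema="PUBLIC",  # target schema in Snowflake
        table="CUSTOMERS",  # target table; create it in Snowflake beforehand
    )
)
client.datasets.create_or_update(
    "<resource-group>", "<factory-name>", "ds_mysql_source", mysql_ds
)
client.datasets.create_or_update(
    "<resource-group>", "<factory-name>", "ds_snowflake_sink", snowflake_ds
)
```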

  8. With the datasets created, we can move on to the pipeline. Go to the pipeline section and create a new pipeline; in the Activities pane you will find the Copy Data activity (a scripted sketch follows the screenshot below).

Creating a Pipeline
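Wiring it together, the pipeline is just a container for the Copy Data activity. This sketch reuses the `staged_copy` activity and the `client` from the earlier snippets:

```python
# Sketch only: a pipeline holding the single Copy Data activity.
from azure.mgmt.datafactory.models import PipelineResource

pipeline = PipelineResource(activities=[staged_copy])
client.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "pl_mysql_to_snowflake", pipeline
)
```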

  9. Now click the Copy Data activity and configure the Source and Sink tabs.
  10. Run your Azure Data Factory pipeline to load the data into Snowflake. In Snowflake's query history you should see the load statements issued through the ODBC driver; that is how you know ADF has successfully landed data in Snowflake. (A sketch for triggering and monitoring the run from code follows.)
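If you trigger the run from code rather than the Studio UI, you can poll the run status like this (resource names are the same placeholders as above); once it succeeds, check Snowflake's query history for the load statements:

```python
# Sketch only: trigger the pipeline and poll until it finishes.
import time

run = client.pipelines.create_run(
    "<resource-group>", "<factory-name>", "pl_mysql_to_snowflake"
)
while True:
    status = client.pipeline_runs.get(
        "<resource-group>", "<factory-name>", run.run_id
    ).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(15)  # poll every 15 seconds
print(f"Pipeline run finished with status: {status}")
```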
