DEV Community

Krishna Tangudu


Part 1: Zero-Copy Sharing – Salesforce to Snowflake for Analytics

The ETL Nightmare Ends

Salesforce holds your richest customer data—opportunities, accounts, contacts—but getting it into Snowflake for analytics usually means painful ETL pipelines, schema drift, and maintenance hell. Zero-copy data sharing changes everything: Salesforce publishes secure shares that land in Snowflake as native objects, queryable instantly without duplication or added latency.

We are currently implementing Salesforce Sales Cloud. In this first part of a two-part series, I'll walk through the exact OAuth setup, common integration traps (network policies! URL underscores!), and best practices for analytics teams.

Architecture Overview

Zero-copy sharing flips traditional pipelines: Salesforce Data Cloud becomes the secure publisher, Snowflake the consumer.

| Layer | Owner | Contains |
| --- | --- | --- |
| Source | Salesforce Sales Cloud | Raw CRM objects (OPPORTUNITY, ACCOUNT, CONTACT) |
| Share | Salesforce Data Cloud | Secure data share with selected objects |
| Consumer DB | Snowflake | Read-only shared objects like SFDC.OPPORTUNITY |
| Analytics layer | Snowflake | Views, models, dashboards built on shared data |

Salesforce Data Cloud handles object sharing; Snowflake analysts query familiar objects immediately. No Fivetran, no Airflow, no storage costs for duplicates.

Step-by-Step Setup

Before you begin:

Make sure your Snowflake admin adds the Salesforce Data 360 IP addresses to the allowlist.

Supported Regions for Snowflake Integration

In Snowflake
Step 1: Snowflake Security Integration

CREATE OR REPLACE SECURITY INTEGRATION sf_integration
  TYPE = OAUTH
  OAUTH_CLIENT = CUSTOM
  OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
  OAUTH_REDIRECT_URI = 'https://test.salesforce.com/services/cdp/SnowflakeOAuthCallback'
  ENABLED = TRUE
  OAUTH_ISSUE_REFRESH_TOKENS = TRUE;

Prod tip: Swap test.salesforce.com for login.salesforce.com in production.
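Before handing credentials over to the Salesforce team, it helps to sanity-check the integration you just created. Inspecting it shows the OAuth endpoints Snowflake generated, which are useful when debugging token errors later:

```sql
-- Output includes OAUTH_AUTHORIZATION_ENDPOINT and OAUTH_TOKEN_ENDPOINT,
-- plus the redirect URI and refresh-token settings configured above.
DESC SECURITY INTEGRATION sf_integration;
```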

Step 2: Snowflake Integration User (not required if the user creating the data target in Salesforce already has a Snowflake login, either via Azure SSO or as a native Snowflake user)

CREATE OR REPLACE USER sf_integration_user
  LOGIN_NAME = 'sf_integration_user'
  PASSWORD = '<secure-password>'
  DEFAULT_ROLE = 'sf_integration_role'
  EMAIL = 'your-email@company.com';
Grant minimal privileges: USAGE on warehouse, SELECT on consumer database.

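A minimal grant set might look like the following sketch. The warehouse name `analytics_wh` is a placeholder for your own, and `sf_shared` assumes the shared database name used later in this post:

```sql
-- Role and warehouse names are hypothetical: adjust to your environment.
CREATE ROLE IF NOT EXISTS sf_integration_role;
GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE sf_integration_role;
-- A database created from a share uses IMPORTED PRIVILEGES for read access.
GRANT IMPORTED PRIVILEGES ON DATABASE sf_shared TO ROLE sf_integration_role;
GRANT ROLE sf_integration_role TO USER sf_integration_user;
```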

Step 3: Client Secrets & Salesforce Config

SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('sf_integration');

Copy both client ID and secret to Salesforce Data Cloud.

Step 4: Salesforce Data Target
Salesforce team configures:

Account URL: https://my-account-us-east-1.snowflakecomputing.com
Critical: if your Snowflake URL has underscores (my_account_us-east-1), convert to hyphens (my-account-us-east-1) in the Account URL field.

Authentication: OAuth (using secrets from step 3)

Step 5: Publish to Share
The Salesforce team should create a data stream and a data lake object, then add this object to the created Data Share Target.

[Screenshot: Data Stream (Address)]

[Screenshot: Data Lake Object]

Salesforce publishes share → Snowflake receives:


CREATE DATABASE sf_shared FROM SHARE <ShareName>;

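Once the share arrives, you can confirm it and open it up to analysts. `analyst_role` below is a placeholder for whatever role your BI users hold:

```sql
-- Inbound shares from Salesforce appear with kind = INBOUND.
SHOW SHARES;
-- Shared databases take IMPORTED PRIVILEGES rather than per-object grants.
GRANT IMPORTED PRIVILEGES ON DATABASE sf_shared TO ROLE analyst_role;
```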

Integration Issues & Fixes
80% of failures happen here. Lessons from our implementation:

Network Policies Block Everything
Snowflake network policies can silently block the Salesforce Data Cloud IP ranges.
Check: SHOW NETWORK POLICIES;
Fix: add the Salesforce Data Cloud IPs to the allowed list
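A sketch of the fix; the policy name and IP ranges are placeholders, so pull the real Data Cloud ranges from Salesforce's documentation:

```sql
SHOW NETWORK POLICIES;
DESC NETWORK POLICY my_account_policy;     -- view the current ALLOWED_IP_LIST
-- SET replaces the whole list, so include your existing ranges too.
ALTER NETWORK POLICY my_account_policy SET
  ALLOWED_IP_LIST = ('203.0.113.0/24',     -- your existing ranges
                     '198.51.100.0/24');   -- Salesforce Data Cloud ranges
```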

Account URL Underscores → Hyphens
Salesforce rejects Snowflake account URLs containing underscores (_). Always transform them to hyphens:
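The transformation is a simple underscore-to-hyphen swap; you can even do it in Snowflake itself:

```sql
SELECT REPLACE('my_account_us-east-1', '_', '-');
-- → my-account-us-east-1
```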

Check Permissions
Verify the privileges of the user establishing the connection.

Note: This setup is only required one time.

Best Practices for Analytics Teams

Layered Architecture

SF_SHARED (raw, read-only)
↓
SF_ANALYTICS (views, materialized)
↓
BI tools (Power BI, Tableau)
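For example, the analytics layer can expose a curated view over the raw share. The column names here are illustrative, not the actual Data Cloud schema:

```sql
CREATE SCHEMA IF NOT EXISTS sf_analytics.semantic;
-- Business users query this view, never the raw share directly.
CREATE OR REPLACE VIEW sf_analytics.semantic.open_opportunities AS
SELECT opportunity_id, account_id, stage, amount, close_date
FROM sf_shared.sfdc.opportunity
WHERE is_closed = FALSE;
```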

Governance

  • Expose only semantic views: give business users access to curated views, not USAGE on the raw share
  • Create a weekly alert for schema drift: Salesforce owns object changes (new fields, deletions), and drift breaks downstream views
  • Materialize frequent aggregates, for example:
CREATE MATERIALIZED VIEW analytics.opp_pipeline_summary AS
SELECT stage, COUNT(*) as count, AVG(amount) as avg_amount
FROM sf_shared.sfdc.opportunity
GROUP BY stage;
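One way to feed that weekly drift alert is to snapshot the shared schema and diff it over time. A minimal starting point:

```sql
-- Snapshot the shared object's columns; persist the result and compare
-- week over week to catch new, renamed, or dropped fields.
SELECT column_name, data_type
FROM sf_shared.information_schema.columns
WHERE table_schema = 'SFDC' AND table_name = 'OPPORTUNITY'
ORDER BY ordinal_position;
```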
Cost Control

Use multi-cluster warehouses for concurrent BI queries
Monitor: QUERY_HISTORY() for expensive shared table scans
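A quick way to surface those expensive scans; the RESULT_LIMIT and the ILIKE filter on the shared database name are adjustable:

```sql
SELECT query_text,
       total_elapsed_time / 1000 AS elapsed_s,   -- column is in milliseconds
       bytes_scanned
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 1000))
WHERE query_text ILIKE '%sf_shared%'
ORDER BY bytes_scanned DESC
LIMIT 10;
```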

Takeaway:
0 TB duplicated storage, 30% faster pipeline deployment, 100% schema fidelity.

Quick Wins Checklist

  • [ ] URL underscores → hyphens
  • [ ] Network policy allows Salesforce IPs
  • [ ] Test OAuth token refresh before prod
  • [ ] Raw share → semantic views (don't expose raw)

What's Next (Part 2 Teaser)
This gets Salesforce data into Snowflake seamlessly. In Part 2, we reverse direction: Snowflake Customer 360 data flows back to sales reps in Salesforce via data federation.

Quick Start Your Integration

✅ Fork: https://github.com/LALITHASWAROOPK/salesforce-snowflake-zero-copy

Part 2 → Customer 360 (Snowflake → Salesforce)

Try this yourself: Share your biggest Salesforce-Snowflake integration pain point in the comments!
