Sagara

Snowflake DCM Projects : Building DEV/PROD Environments with Declarative IaC Using Template Variables

This is an English version of the original Japanese article:
https://dev.classmethod.jp/articles/snowflake-dcm-projects-preview/

Snowflake has released a new feature called DCM Projects. It allows you to declaratively define Snowflake objects using DEFINE statements and deploy them through a Plan → Deploy workflow — essentially an Infrastructure-as-Code (IaC) feature native to Snowflake.

https://docs.snowflake.com/en/release-notes/2026/other/2026-03-20-dcm-projects

https://docs.snowflake.com/en/user-guide/dcm-projects/dcm-projects-overview

Previously, IaC management of Snowflake objects required external tools such as Terraform or schemachange. With DCM Projects, you can now manage objects declaratively and natively within Snowflake.

In this article, I verified a workflow in which a single DCM project folder (a set of definition files) defines two targets and deploys to separate DCM project objects for DEV and PROD by switching template variables. I'll walk through the steps and share the results.

Note: DCM Projects is in Preview as of March 20, 2026. Specifications may change before GA. Also, the official docs recommend separating DEV/PROD environments by using different accounts. The single-account setup in this article is purely for verification purposes — if you operate within a single account, careful access management is required.

Feature Overview

DCM Projects (Database Change Management Projects) is a feature that lets you define the desired state of Snowflake objects in code and manage them declaratively.

Key characteristics include:

  • Declarative definitions: Define the desired state of objects using DEFINE statements. Dependencies are automatically resolved, so you don't need to worry about the order of declarations
  • Plan → Deploy workflow: Review diffs with PLAN before deploying, then apply changes with DEPLOY. It follows the same concept as Terraform's plan → apply
  • Jinja2 templates: Supports variable substitution, conditional branching, loops, and macros. You can expand different parameters per environment from a single set of definition files
  • Supported objects: Database, Schema, Table, Dynamic Table, (Secure) View, Internal Stage, Warehouse, Role / Database Role, Grant, Data Metric Function, Task, SQL Function, Tag, Authentication Policy, and more. However, only a subset of Snowflake objects is supported — for example, File Format is not included
  • Pipeline management: REFRESH ALL for bulk-refreshing Dynamic Tables, and TEST ALL for bulk-running data quality tests
  • Management interfaces: Operable from Snowsight, Snowflake CLI (v3.16+), SQL, and Cortex Code CLI

https://docs.snowflake.com/en/user-guide/dcm-projects/dcm-projects-supported-entities
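To make the template-expansion idea concrete, here is a toy Python sketch of `{{ var }}` substitution — a minimal stand-in for illustration only, not Snowflake's actual Jinja2 renderer (which also supports conditionals, loops, and macros):

```python
import re

# Toy stand-in for DCM's Jinja2 rendering: substitute {{ var }} placeholders
# with values from a per-environment configuration dictionary.
def render(template: str, variables: dict) -> str:
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(variables[m.group(1)]), template)

definition = "DEFINE WAREHOUSE {{ wh_name }} WITH WAREHOUSE_SIZE = '{{ wh_size }}';"

# The same definition file expands differently per configuration:
dev  = render(definition, {"wh_name": "DEV_WH",  "wh_size": "X-SMALL"})
prod = render(definition, {"wh_name": "PROD_WH", "wh_size": "SMALL"})

print(dev)   # DEFINE WAREHOUSE DEV_WH WITH WAREHOUSE_SIZE = 'X-SMALL';
print(prod)  # DEFINE WAREHOUSE PROD_WH WITH WAREHOUSE_SIZE = 'SMALL';
```

This is the core mechanism behind the DEV/PROD split used throughout this article: one set of definition files, different rendered SQL per target.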

Limitations

The following are the main limitations confirmed as of March 20, 2026, based on the official docs.

General

  • As a Preview feature, specifications may change
  • Only a subset of Snowflake objects is supported
  • Changes during deploy are subject to CREATE OR ALTER constraints. For some objects (e.g., Tables), partial changes may be applied on failure
  • Maximum of 1,000 source files / 10,000 rendered object definitions. Exceeding these limits may cause performance degradation or execution failures
  • During Preview, changesets may not fully capture all granular changes
  • _snow is a reserved identifier
  • Sensitive information should not be placed in template variables

Object-specific limitations

  • Database / Schema / Table / View / Dynamic Table: Rename is not supported
  • Table: Column rename, incompatible type changes, adding Search Optimization, and adding tags / masking policies / row access policies to column definitions are not supported. Column order changes are also not supported
  • Dynamic Table: INITIALIZE is immutable. Body changes or refresh mode changes may require re-initialization / full refresh. Column order changes and rename are not supported
  • View: Rename and column order changes are not supported
  • Internal Stage: Only internal stages are supported, and encryption type is immutable
  • Warehouse: INITIALLY_SUSPENDED is immutable
  • Role / Database Role: Application Roles are not supported
  • Grant: APPLICATION ROLE grants / CALLER grants are not supported
  • Tag: PROPAGATE is not supported

Jinja2 templates

  • import, extends, and include syntax are not supported

Prerequisites

  • Snowflake: AWS US West (Oregon) region, Enterprise Edition (The feature was not enabled in an AWS Tokyo region environment, so a trial account was created in AWS US West.)
  • Feature status: Preview as of March 20, 2026
  • Required privileges: CREATE DCM PROJECT ON SCHEMA privilege. Additionally, the project owner needs sufficient privileges to deploy all objects defined in the project
  • Project owner role: Since this article includes role creation via DEFINE ROLE, ACCOUNTADMIN (which has CREATE ROLE privilege) is used
  • DCM project object location: DCM project objects are schema-level objects, so a database / schema to host them is required (this is a separate concept from the deployment target database)
  • Operating environment: Snowsight Workspace

Preparation

Creating a Workspace

Create a Workspace in Snowsight.

Navigate to "Projects" → "Workspaces" from the left menu, click the Workspace name at the top left, select "Private Workspace", enter a name, and create it.

If the Workspace is created as shown below, you're good to go.

Creating a Database / Schema for DCM Project Objects

DCM project objects are schema-level objects and require a database and schema to host them. In this article, we create a dedicated DCM_ADMIN.PROJECTS database / schema and place both the DEV and PROD DCM project objects there.

USE ROLE ACCOUNTADMIN;
USE SECONDARY ROLES NONE;

-- For hosting DCM project objects
CREATE DATABASE IF NOT EXISTS DCM_ADMIN;
CREATE SCHEMA IF NOT EXISTS DCM_ADMIN.PROJECTS;

Note: The official DCM Projects docs recommend running USE SECONDARY ROLES NONE; because secondary roles are considered during Plan / Deploy execution, and you want to avoid depending on privileges beyond the project owner role. This article also uses USE SECONDARY ROLES NONE; during preparation and deploy-related SQL execution.

After running the queries, confirm that the DCM_ADMIN database and PROJECTS schema have been created.

Hands-On Walkthrough

Let's build DEV/PROD environments using DCM Projects!

1. Creating a DCM Project Folder

Create a single DCM project folder in the Snowsight Workspace. Within this project folder, define two targets (DEV / PROD) in manifest.yml and deploy the same set of definition files to each environment.

In the created Workspace, click the "+" button and select "DCM Project".

Check "Define default target environment" and configure as shown below. Select the database and schema created earlier, and set the Target name to "DEV".

The folders and files are automatically generated as shown below. The manifest.yml also contains the role, database, and schema information configured earlier.

Note: In DCM Projects, the DCM project folder (the set of definition files consisting of manifest.yml + sources/definitions/ in the Workspace) and the DCM project object (a Snowflake schema-level object that is the target of deployment) are separate concepts. A single project folder can have multiple targets, each deploying to a different DCM project object. If you deploy different configurations to a single DCM project object, objects no longer included in the rendered result may become drop targets. Therefore, when managing multiple environments like DEV/PROD, specify a separate DCM project object for each environment.
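The "drop target" behavior described above can be modeled as a simple set difference between the rendered desired state and what the project object currently tracks — a toy sketch of declarative planning, not the actual DCM planner:

```python
# Toy model of declarative planning: compare the rendered desired state with
# what the DCM project object currently tracks. Objects that disappear from
# the rendered result become DROP candidates -- which is why DEV and PROD
# should each deploy to their own DCM project object.
def plan(deployed: set[str], desired: set[str]) -> dict[str, set[str]]:
    return {
        "create": desired - deployed,          # new in the rendered result
        "drop": deployed - desired,            # no longer rendered
        "keep_or_alter": deployed & desired,   # present in both
    }

# Suppose the DEV config was deployed first, and then the PROD config is
# rendered against the SAME project object: every DEV_* object now looks
# like a drop target.
deployed = {"DEV_DB", "DEV_WH", "DEV_EXPLORER_ROLE"}
desired  = {"PROD_DB", "PROD_WH", "PROD_EXPLORER_ROLE"}

result = plan(deployed, desired)
print(sorted(result["drop"]))  # all three DEV objects
```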

2. Deleting the Default .sql Files

Since we have a specific configuration for this verification, delete the default .sql files that were generated.

For reference, let's review what each file contains before deleting them.

sources/definitions/examples.sql

This file contains the most common syntax for defining objects using DCM Projects. The key point is defining each object's settings with define.

sources/definitions/jinja_demo.sql

Since DCM Projects supports Jinja2 templates, you can programmatically define objects using for and if. This file demonstrates creating multiple team tables by looping through the teams values defined in manifest.yml.

sources/macros/grants_macro.sql

Since Jinja2 templates are available, you can also create macros like in dbt. This macro creates DEVELOPER and USAGE roles for a given team and grants privileges to each.

Folder Structure After Deleting Default .sql Files

It should look like the image below. The macros folder was also deleted since it's not needed for this exercise.

Note: tmp.sql is a file where I wrote the queries to create the database / schema earlier.

3. Configuring manifest.yml (DEV/PROD Targets)

To separate DEV/PROD environments at the database level within the same account, configure manifest.yml as follows:

manifest_version: 2
type: DCM_PROJECT

default_target: DEV

targets:
  DEV:
    account_identifier: MY_ACCOUNT
    project_name: DCM_ADMIN.PROJECTS.MY_DCM_PROJECT_DEV
    project_owner: ACCOUNTADMIN
    templating_config: DEV
  PROD:
    account_identifier: MY_ACCOUNT
    project_name: DCM_ADMIN.PROJECTS.MY_DCM_PROJECT_PROD
    project_owner: ACCOUNTADMIN
    templating_config: PROD

templating:
  defaults:
    wh_auto_suspend: 60
  configurations:
    DEV:
      db_name: DEV_DB
      wh_name: DEV_WH
      wh_size: X-SMALL
      role_name: DEV_EXPLORER_ROLE
    PROD:
      db_name: PROD_DB
      wh_name: PROD_WH
      wh_size: SMALL
      role_name: PROD_EXPLORER_ROLE

Key points:

  • Under targets, specify account_identifier (same value since it's the same account), project_name (fully qualified name of the DCM project object), project_owner (owner role during deployment), and templating_config (configuration name to use)
  • Define default values in templating.defaults and environment-specific values in templating.configurations
  • Variable resolution order is three stages: "defaults → configuration → runtime override"
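The three-stage resolution order can be illustrated with Python's `collections.ChainMap`, where earlier maps shadow later ones — a conceptual sketch of the precedence, not DCM's actual resolver:

```python
from collections import ChainMap

# Sketch of the three-stage resolution order: a runtime override wins over
# the named configuration, which wins over templating.defaults.
defaults      = {"wh_auto_suspend": 60}
configuration = {"db_name": "DEV_DB", "wh_size": "X-SMALL"}  # templating.configurations.DEV
runtime       = {"wh_size": "SMALL"}                         # hypothetical deploy-time override

resolved = ChainMap(runtime, configuration, defaults)  # first map wins on conflicts

print(resolved["wh_size"])          # SMALL   (runtime override)
print(resolved["db_name"])          # DEV_DB  (configuration)
print(resolved["wh_auto_suspend"])  # 60      (default)
```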

Note: Since this article includes role creation via DEFINE ROLE, both DEV and PROD use project_owner: ACCOUNTADMIN to satisfy the CREATE ROLE privilege requirement. This is to simplify the verification and is not a recommendation to routinely use ACCOUNTADMIN.

The official docs' CI/CD best practices advise against using general administrator roles (like ACCOUNTADMIN) as the project owner in production. In practice, consider creating dedicated deployment roles like DCM_DEV_DEPLOYER / DCM_PROD_DEPLOYER or service users for each environment.

Also, by default, the role that executes the deploy (project owner) holds OWNERSHIP of the deployed objects. In this article's setup, all objects after deploy are owned by ACCOUNTADMIN. In practice, consider a setup where a dedicated deployment role is the owner, and plan your GRANT OWNERSHIP design accordingly. Note that if the project owner transfers OWNERSHIP to another role it doesn't hold, it may become unable to continue managing that object in subsequent deploys.

4. Defining Databases, Schemas, and Warehouses

Now let's start defining the objects.

Create definition files for databases, schemas, and warehouses under the sources/definitions/ directory.

Since the Jinja2 template variable {{ db_name }} is used, it automatically expands to DEV_DB for the DEV configuration and PROD_DB for the PROD configuration.

sources/definitions/databases.sql

DEFINE DATABASE {{ db_name }};

sources/definitions/schemas.sql

DEFINE SCHEMA {{ db_name }}.RAW;

DEFINE SCHEMA {{ db_name }}.STG;

DEFINE SCHEMA {{ db_name }}.MART;

The warehouse size is also parameterized with {{ wh_size }} from manifest.yml, so DEV gets X-SMALL and PROD gets SMALL. {{ wh_auto_suspend }} uses the shared value defined in templating.defaults.

sources/definitions/warehouses.sql

DEFINE WAREHOUSE {{ wh_name }}
  WITH
    WAREHOUSE_SIZE = '{{ wh_size }}'
    AUTO_SUSPEND = {{ wh_auto_suspend }}
    AUTO_RESUME = TRUE
    INITIALLY_SUSPENDED = TRUE;

5. Defining Roles and GRANTs

Create a data exploration role for each environment and grant the necessary privileges. With the template variable {{ role_name }}, DEV_EXPLORER_ROLE is created for DEV and PROD_EXPLORER_ROLE for PROD.

In DCM Projects, you can write both DEFINE statements and GRANT statements in the same file.

sources/definitions/roles.sql

DEFINE ROLE {{ role_name }};

-- USAGE privilege on database
GRANT USAGE ON DATABASE {{ db_name }} TO ROLE {{ role_name }};

-- USAGE privilege on each schema
GRANT USAGE ON SCHEMA {{ db_name }}.RAW TO ROLE {{ role_name }};
GRANT USAGE ON SCHEMA {{ db_name }}.STG TO ROLE {{ role_name }};
GRANT USAGE ON SCHEMA {{ db_name }}.MART TO ROLE {{ role_name }};

-- RAW schema: SELECT privilege on tables (read-only)
GRANT SELECT ON ALL TABLES IN SCHEMA {{ db_name }}.RAW TO ROLE {{ role_name }};

-- STG/MART schemas: SELECT privilege on Dynamic Tables (read-only)
GRANT SELECT ON ALL DYNAMIC TABLES IN SCHEMA {{ db_name }}.STG TO ROLE {{ role_name }};
GRANT SELECT ON ALL DYNAMIC TABLES IN SCHEMA {{ db_name }}.MART TO ROLE {{ role_name }};

-- USAGE privilege on warehouse
GRANT USAGE ON WAREHOUSE {{ wh_name }} TO ROLE {{ role_name }};

-- Role hierarchy to SYSADMIN
GRANT ROLE {{ role_name }} TO ROLE SYSADMIN;

Note: DCM Projects tracks only GRANTs deployed through DCM Projects. If you remove a GRANT statement from the definition files, that GRANT will become a revoke target on the next Deploy.

6. Defining Tables

Define tables in the RAW schema to store e-commerce order data.

sources/definitions/raw_tables.sql

DEFINE TABLE {{ db_name }}.RAW.CUSTOMERS (
  CUSTOMER_ID INT,
  CUSTOMER_NAME VARCHAR(100),
  EMAIL VARCHAR(200),
  REGION VARCHAR(50),
  CREATED_AT TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

DEFINE TABLE {{ db_name }}.RAW.ORDERS (
  ORDER_ID INT,
  CUSTOMER_ID INT,
  ORDER_DATE DATE,
  ORDER_STATUS VARCHAR(20),
  TOTAL_AMOUNT DECIMAL(10,2),
  CREATED_AT TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

DEFINE TABLE {{ db_name }}.RAW.ORDER_ITEMS (
  ORDER_ITEM_ID INT,
  ORDER_ID INT,
  PRODUCT_NAME VARCHAR(200),
  QUANTITY INT,
  UNIT_PRICE DECIMAL(10,2),
  CREATED_AT TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

7. Defining an Internal Stage

Define an Internal Stage for data ingestion. Directory table settings can also be configured.

sources/definitions/stages.sql

DEFINE STAGE {{ db_name }}.RAW.DATA_STAGE
  DIRECTORY = (ENABLE = TRUE);

8. Defining Dynamic Tables

Define Dynamic Tables in the STG and MART schemas to build a RAW → STG → MART pipeline.

In this three-layer architecture, the STG layer performs data cleansing (TRIM, LOWER, UPPER, etc.) and the MART layer performs aggregation and joins. STG layer Dynamic Tables are set to TARGET_LAG = DOWNSTREAM, and MART layer tables to TARGET_LAG = '1 hour'.
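The TARGET_LAG = DOWNSTREAM idea — STG tables carry no schedule of their own and refresh when a downstream MART table with a concrete lag refreshes — can be sketched as a tiny dependency walk (a toy model; Snowflake's scheduler is far more sophisticated):

```python
# Toy model of TARGET_LAG = DOWNSTREAM: refreshing a MART table first pulls
# in its upstream STG dependencies, so data flows RAW -> STG -> MART.
# Illustrative only -- not Snowflake's actual refresh scheduler.
upstreams = {
    "MART_CUSTOMER_ORDERS": ["STG_CUSTOMERS", "STG_ORDERS"],
    "MART_PRODUCT_SALES": ["STG_ORDER_ITEMS"],
    "STG_CUSTOMERS": [], "STG_ORDERS": [], "STG_ORDER_ITEMS": [],
}

def refresh(table: str, order: list[str]) -> list[str]:
    for dep in upstreams[table]:  # refresh upstream dependencies first
        refresh(dep, order)
    if table not in order:        # refresh each table at most once
        order.append(table)
    return order

print(refresh("MART_CUSTOMER_ORDERS", []))
# ['STG_CUSTOMERS', 'STG_ORDERS', 'MART_CUSTOMER_ORDERS']
```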

sources/definitions/stg_dynamic_tables.sql

DEFINE DYNAMIC TABLE {{ db_name }}.STG.STG_CUSTOMERS
  WAREHOUSE = {{ wh_name }}
  TARGET_LAG = DOWNSTREAM
  AS
    SELECT
      CUSTOMER_ID,
      TRIM(CUSTOMER_NAME) AS CUSTOMER_NAME,
      LOWER(EMAIL) AS EMAIL,
      UPPER(REGION) AS REGION,
      CREATED_AT
    FROM {{ db_name }}.RAW.CUSTOMERS;

DEFINE DYNAMIC TABLE {{ db_name }}.STG.STG_ORDERS
  WAREHOUSE = {{ wh_name }}
  TARGET_LAG = DOWNSTREAM
  AS
    SELECT
      O.ORDER_ID,
      O.CUSTOMER_ID,
      O.ORDER_DATE,
      UPPER(O.ORDER_STATUS) AS ORDER_STATUS,
      O.TOTAL_AMOUNT,
      O.CREATED_AT
    FROM {{ db_name }}.RAW.ORDERS O;

DEFINE DYNAMIC TABLE {{ db_name }}.STG.STG_ORDER_ITEMS
  WAREHOUSE = {{ wh_name }}
  TARGET_LAG = DOWNSTREAM
  AS
    SELECT
      OI.ORDER_ITEM_ID,
      OI.ORDER_ID,
      TRIM(OI.PRODUCT_NAME) AS PRODUCT_NAME,
      OI.QUANTITY,
      OI.UNIT_PRICE,
      OI.QUANTITY * OI.UNIT_PRICE AS LINE_TOTAL,
      OI.CREATED_AT
    FROM {{ db_name }}.RAW.ORDER_ITEMS OI;

sources/definitions/mart_dynamic_tables.sql

DEFINE DYNAMIC TABLE {{ db_name }}.MART.MART_CUSTOMER_ORDERS
  WAREHOUSE = {{ wh_name }}
  TARGET_LAG = '1 hour'
  AS
    SELECT
      C.CUSTOMER_ID,
      C.CUSTOMER_NAME,
      C.EMAIL,
      C.REGION,
      COUNT(O.ORDER_ID) AS TOTAL_ORDERS,
      SUM(O.TOTAL_AMOUNT) AS TOTAL_SPENT,
      MIN(O.ORDER_DATE) AS FIRST_ORDER_DATE,
      MAX(O.ORDER_DATE) AS LAST_ORDER_DATE
    FROM {{ db_name }}.STG.STG_CUSTOMERS C
    LEFT JOIN {{ db_name }}.STG.STG_ORDERS O
      ON C.CUSTOMER_ID = O.CUSTOMER_ID
    GROUP BY C.CUSTOMER_ID, C.CUSTOMER_NAME, C.EMAIL, C.REGION;

DEFINE DYNAMIC TABLE {{ db_name }}.MART.MART_PRODUCT_SALES
  WAREHOUSE = {{ wh_name }}
  TARGET_LAG = '1 hour'
  AS
    SELECT
      OI.PRODUCT_NAME,
      COUNT(DISTINCT OI.ORDER_ID) AS ORDER_COUNT,
      SUM(OI.QUANTITY) AS TOTAL_QUANTITY,
      SUM(OI.LINE_TOTAL) AS TOTAL_REVENUE,
      AVG(OI.UNIT_PRICE) AS AVG_UNIT_PRICE
    FROM {{ db_name }}.STG.STG_ORDER_ITEMS OI
    GROUP BY OI.PRODUCT_NAME;

9. Defining Tasks (Data Ingestion via COPY INTO)

Define tasks using DEFINE TASK to load CSV files uploaded to the Internal Stage into each table.

Each task is configured with a CRON schedule to run daily at 2:00 AM (JST). With PURGE = TRUE specified, files on the stage are automatically deleted after a successful COPY INTO.

Note that tasks are created in a suspended state when deployed.

Note: File Format is not included in DCM Projects' supported objects (as of March 20, 2026), so it cannot be defined with DEFINE FILE FORMAT. In this article, format options are specified inline in the FILE_FORMAT of the COPY INTO statement to avoid a separate File Format object creation step. For objects not supported by DEFINE, you can supplement them with separate SQL scripts (EXECUTE IMMEDIATE FROM or snow sql) after DCM PLAN/DEPLOY.

sources/definitions/tasks.sql

-- COPY INTO task for CUSTOMERS table
DEFINE TASK {{ db_name }}.RAW.LOAD_CUSTOMERS
  WAREHOUSE = {{ wh_name }}
  SCHEDULE = 'USING CRON 0 2 * * * Asia/Tokyo'
  AS
    COPY INTO {{ db_name }}.RAW.CUSTOMERS
    FROM @{{ db_name }}.RAW.DATA_STAGE/customers
    FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1 NULL_IF = ('NULL', 'null', ''))
    ON_ERROR = 'CONTINUE'
    PURGE = TRUE;

-- COPY INTO task for ORDERS table
DEFINE TASK {{ db_name }}.RAW.LOAD_ORDERS
  WAREHOUSE = {{ wh_name }}
  SCHEDULE = 'USING CRON 0 2 * * * Asia/Tokyo'
  AS
    COPY INTO {{ db_name }}.RAW.ORDERS
    FROM @{{ db_name }}.RAW.DATA_STAGE/orders
    FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1 NULL_IF = ('NULL', 'null', ''))
    ON_ERROR = 'CONTINUE'
    PURGE = TRUE;

-- COPY INTO task for ORDER_ITEMS table
DEFINE TASK {{ db_name }}.RAW.LOAD_ORDER_ITEMS
  WAREHOUSE = {{ wh_name }}
  SCHEDULE = 'USING CRON 0 2 * * * Asia/Tokyo'
  AS
    COPY INTO {{ db_name }}.RAW.ORDER_ITEMS
    FROM @{{ db_name }}.RAW.DATA_STAGE/order_items
    FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1 NULL_IF = ('NULL', 'null', ''))
    ON_ERROR = 'CONTINUE'
    PURGE = TRUE;

10. Plan & Deploy to the DEV Target

Now that the definition files are ready, let's run Plan against the DEV target first.

In the Snowsight Workspace DCM Project screen, switch the target to "DEV", ensure "Plan" is selected, and click the play button.

A list of objects to be created is displayed. Clicking on each object shows the detailed configuration values.

By the way, clicking Summarize in the upper right provides a summary of what changes will be made.

If everything looks good, click the "Deploy" button. The official docs describe the "Alias" as follows: "Think of the deployment alias like a commit message for your code change." So I entered "first deploy".

When the deployment completes successfully, it displays as shown below.

After deployment, verify that the objects have been created. (Below is an example using queries.)

USE DATABASE DEV_DB;
SHOW SCHEMAS;
SHOW TABLES IN SCHEMA RAW;
SHOW TASKS IN SCHEMA RAW;
SHOW DYNAMIC TABLES IN SCHEMA STG;
SHOW DYNAMIC TABLES IN SCHEMA MART;

-- Verify roles and grants
SHOW ROLES LIKE 'DEV_EXPLORER_ROLE';
SHOW GRANTS TO ROLE DEV_EXPLORER_ROLE;

11. Loading Sample Data and Verifying Operation

Preparing Sample CSVs

Load sample data into the DEV environment to verify that the COPY INTO tasks and Dynamic Table pipeline work correctly.

First, prepare the following three CSV files locally.

customers.csv

CUSTOMER_ID,CUSTOMER_NAME,EMAIL,REGION,CREATED_AT
1,Taro Tanaka,tanaka@example.com,tokyo,2025-12-01 10:00:00
2,Hanako Sato,sato@example.com,osaka,2025-12-05 11:30:00
3,Ichiro Suzuki,suzuki@example.com,tokyo,2025-12-10 09:15:00
4,Misaki Takahashi,takahashi@example.com,fukuoka,2025-12-15 14:00:00
5,Kenta Ito,ito@example.com,osaka,2025-12-20 16:45:00
6,Yumi Watanabe,watanabe@example.com,nagoya,2026-01-03 08:30:00
7,Takuya Yamamoto,yamamoto@example.com,tokyo,2026-01-08 13:00:00
8,Sakura Nakamura,nakamura@example.com,sapporo,2026-01-12 10:20:00
9,Daisuke Kobayashi,kobayashi@example.com,fukuoka,2026-01-18 15:10:00
10,Ai Kato,kato@example.com,tokyo,2026-01-25 11:00:00

orders.csv

ORDER_ID,CUSTOMER_ID,ORDER_DATE,ORDER_STATUS,TOTAL_AMOUNT,CREATED_AT
1001,1,2026-01-15,completed,15000.00,2026-01-15 10:30:00
1002,2,2026-01-18,completed,8500.00,2026-01-18 14:00:00
1003,1,2026-01-22,completed,23000.00,2026-01-22 09:45:00
1004,3,2026-02-01,completed,4200.00,2026-02-01 11:20:00
1005,4,2026-02-05,shipped,31500.00,2026-02-05 16:00:00
1006,5,2026-02-10,completed,9800.00,2026-02-10 13:30:00
1007,2,2026-02-14,completed,12000.00,2026-02-14 10:15:00
1008,6,2026-02-20,shipped,7600.00,2026-02-20 15:45:00
1009,7,2026-03-01,completed,18500.00,2026-03-01 08:00:00
1010,8,2026-03-05,pending,5400.00,2026-03-05 12:30:00
1011,3,2026-03-08,completed,11200.00,2026-03-08 14:20:00
1012,9,2026-03-10,shipped,26800.00,2026-03-10 09:00:00
1013,10,2026-03-12,completed,3900.00,2026-03-12 11:45:00
1014,1,2026-03-15,pending,14700.00,2026-03-15 16:30:00
1015,5,2026-03-18,completed,8900.00,2026-03-18 10:00:00

order_items.csv

ORDER_ITEM_ID,ORDER_ID,PRODUCT_NAME,QUANTITY,UNIT_PRICE,CREATED_AT
1,1001,Wireless Earbuds,1,8000.00,2026-01-15 10:30:00
2,1001,Smartphone Case,2,3500.00,2026-01-15 10:30:00
3,1002,USB Cable,5,1700.00,2026-01-18 14:00:00
4,1003,Monitor Arm,1,15000.00,2026-01-22 09:45:00
5,1003,Mouse Pad,2,4000.00,2026-01-22 09:45:00
6,1004,Keyboard Cover,3,1400.00,2026-02-01 11:20:00
7,1005,4K Monitor,1,29000.00,2026-02-05 16:00:00
8,1005,HDMI Cable,1,2500.00,2026-02-05 16:00:00
9,1006,Wireless Mouse,2,4900.00,2026-02-10 13:30:00
10,1007,Webcam,1,12000.00,2026-02-14 10:15:00
11,1008,USB Hub,2,3800.00,2026-02-20 15:45:00
12,1009,Mechanical Keyboard,1,18500.00,2026-03-01 08:00:00
13,1010,Mouse Pad,1,4000.00,2026-03-05 12:30:00
14,1010,Keyboard Cover,1,1400.00,2026-03-05 12:30:00
15,1011,Wireless Earbuds,1,8000.00,2026-03-08 14:20:00
16,1011,USB Cable,2,1600.00,2026-03-08 14:20:00
17,1012,4K Monitor,1,29000.00,2026-03-10 09:00:00
18,1013,Smartphone Case,1,3900.00,2026-03-12 11:45:00
19,1014,Webcam,1,12000.00,2026-03-15 16:30:00
20,1014,USB Cable,3,900.00,2026-03-15 16:30:00
21,1015,Wireless Mouse,1,4900.00,2026-03-18 10:00:00
22,1015,Mouse Pad,1,4000.00,2026-03-18 10:00:00

Uploading CSV Files

Upload the prepared CSV files to the internal stage from Snowsight.

Since the task definitions specify sub-paths like FROM @{{ db_name }}.RAW.DATA_STAGE/customers, separate the upload path for each file:

  • customers.csv → Path: /customers/
  • orders.csv → Path: /orders/
  • order_items.csv → Path: /order_items/

After uploading, verify that the files are placed on the stage. You should see three CSV files in their respective sub-paths.

LIST @DEV_DB.RAW.DATA_STAGE;

Resuming and Manually Executing Tasks

Tasks are created in a suspended state when deployed. First, check their status with SHOW TASKS.

SHOW TASKS IN SCHEMA DEV_DB.RAW;

You can confirm that the state column shows suspended.

To enable the tasks, run ALTER TASK ... RESUME. Since these are three independent tasks (no parent-child relationship), resume each one individually.

ALTER TASK DEV_DB.RAW.LOAD_CUSTOMERS RESUME;
ALTER TASK DEV_DB.RAW.LOAD_ORDERS RESUME;
ALTER TASK DEV_DB.RAW.LOAD_ORDER_ITEMS RESUME;

Run SHOW TASKS again and confirm the state column shows started.

SHOW TASKS IN SCHEMA DEV_DB.RAW;

The task schedule is daily at 2:00 AM (JST), but for verification purposes, manually execute them with EXECUTE TASK.

EXECUTE TASK DEV_DB.RAW.LOAD_CUSTOMERS;
EXECUTE TASK DEV_DB.RAW.LOAD_ORDERS;
EXECUTE TASK DEV_DB.RAW.LOAD_ORDER_ITEMS;

Verify that data has been loaded into each table.

SELECT COUNT(*) FROM DEV_DB.RAW.CUSTOMERS;  -- 10 rows
SELECT COUNT(*) FROM DEV_DB.RAW.ORDERS;     -- 15 rows
SELECT COUNT(*) FROM DEV_DB.RAW.ORDER_ITEMS; -- 22 rows

Note: As verified in the bonus section below, the official docs state that when definition changes are deployed for a task that is already started, Snowflake temporarily suspends that task (or its root task), applies the change, and then resumes it. This means you don't need to manually suspend tasks even when modifying task definitions through DCM. This is fantastic!

Bulk-Refreshing Dynamic Tables with REFRESH ALL

After executing COPY INTO, you could wait for Dynamic Tables to auto-refresh based on TARGET_LAG, but DCM Projects' REFRESH ALL command lets you immediately bulk-refresh all Dynamic Tables under the project.

https://docs.snowflake.com/en/user-guide/dcm-projects/dcm-projects-pipelines

When executed via SQL, it looks like this. REFRESH ALL triggers a refresh for all Dynamic Tables managed by the project and their upstream Dynamic Tables. The result is returned in JSON format, showing inserted_rows, deleted_rows, and data_timestamp (a timestamp indicating data freshness) for each Dynamic Table.

EXECUTE DCM PROJECT DCM_ADMIN.PROJECTS.MY_DCM_PROJECT_DEV
  REFRESH ALL;
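If you want to post-process the JSON result programmatically, it might look like the following — note that the payload shape here is a hypothetical illustration built around the fields mentioned above (inserted_rows, deleted_rows, data_timestamp), not a verbatim Snowflake response:

```python
import json

# Hypothetical REFRESH ALL result payload. The field names follow the ones
# the docs mention (inserted_rows, deleted_rows, data_timestamp); the overall
# structure is an assumption made for illustration.
raw = json.dumps({
    "refreshed": [
        {"name": "DEV_DB.STG.STG_ORDERS", "inserted_rows": 15,
         "deleted_rows": 0, "data_timestamp": "2026-03-21T06:38:00Z"},
        {"name": "DEV_DB.MART.MART_PRODUCT_SALES", "inserted_rows": 9,
         "deleted_rows": 0, "data_timestamp": "2026-03-21T06:38:05Z"},
    ]
})

# List only the Dynamic Tables whose refresh actually changed rows.
result = json.loads(raw)
changed = [t["name"] for t in result["refreshed"]
           if t["inserted_rows"] or t["deleted_rows"]]
print(changed)
```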

Verify that data has been reflected in the MART layer Dynamic Tables. Customer order summaries and product sales summaries should display correctly.

SELECT * FROM DEV_DB.MART.MART_CUSTOMER_ORDERS ORDER BY TOTAL_SPENT DESC;

SELECT * FROM DEV_DB.MART.MART_PRODUCT_SALES ORDER BY TOTAL_REVENUE DESC;

12. Plan & Deploy to the PROD Target

Next, deploy to the PROD target in the same manner. Since the PROD target specifies MY_DCM_PROJECT_PROD as a separate DCM project object in manifest.yml, the DEV environment is not affected.

First, run the following query to create the PROD DCM project object.

USE ROLE ACCOUNTADMIN;

CREATE DCM PROJECT IF NOT EXISTS DCM_ADMIN.PROJECTS.MY_DCM_PROJECT_PROD
    COMMENT = 'DCM project for PROD';

Next, in the Workspace DCM Project screen, switch the target to "PROD", ensure "Plan" is selected, and click the play button. (If the PROD DCM project object is not recognized and an error occurs, refreshing the browser page should resolve it.)

Similar to DEV, the objects are displayed as new creation diffs against PROD_DB. Confirm that the warehouse size is SMALL.

If everything looks good, click the "Deploy" button to execute the deployment.

After the deployment completes successfully, verify the PROD environment objects. Confirm that PROD_DB has the same schema structure and objects as DEV_DB, the warehouse PROD_WH size is SMALL, and PROD_EXPLORER_ROLE has been granted privileges on each object.

USE DATABASE PROD_DB;
SHOW SCHEMAS;
SHOW TABLES IN SCHEMA RAW;
SHOW TASKS IN SCHEMA RAW;
SHOW DYNAMIC TABLES IN SCHEMA STG;
SHOW DYNAMIC TABLES IN SCHEMA MART;
SHOW WAREHOUSES LIKE 'PROD_WH';

-- Verify roles and grants
SHOW ROLES LIKE 'PROD_EXPLORER_ROLE';
SHOW GRANTS TO ROLE PROD_EXPLORER_ROLE;

This confirms that using a single DCM project folder (set of definition files), we can deploy the same configuration to separate DCM project objects for DEV and PROD by switching template variables!

Note: Task enablement and Dynamic Table refresh for the PROD environment need to be done separately.

Bonus: What Happens When You Modify or Remove Objects from Definition Files

In DCM Projects, removing a DEFINE statement from definition files is expected to make that object a drop target on the next Deploy. Let's verify this behavior.

Change #1: Commenting Out the Warehouse Definition

Comment out the contents of sources/definitions/warehouses.sql.

Change #2: Modifying a Task Definition

Change just one task to run daily at 3:00 AM instead of 2:00 AM.

Running Plan

Run Plan against the DEV target.

The Plan result is shown below. DEV_WH is displayed as a DROP target, and the task shows as an ALTER target with clear visibility into which configuration values will change.

Running Deploy

Proceed with the Deploy. (As a bonus note, the Alias also works fine with Japanese characters.)

After Deploy, verify whether the warehouse was actually deleted. As shown below, nothing is returned, confirming it has been dropped.

SHOW WAREHOUSES LIKE 'DEV_WH';

For the task, we can confirm that the configuration value has been updated and the state remains started. This is fantastic!

Conclusion

I tried building DEV/PROD environments within a single Snowflake account using Snowflake DCM Projects.

What stood out most from hands-on experience was how incredibly simple it is to separate DEV/PROD environments using template variables. Simply defining db_name and wh_size in manifest.yml's templating.configurations lets you deploy environments with different parameters from the same set of definition files. (If you're familiar with dbt, you'll find the onboarding very smooth.)

Another great aspect is that the Plan → Deploy safety workflow is natively built in. If you're accustomed to Terraform's plan → apply, you can use it with the same mindset. The integration of REFRESH ALL for Dynamic Tables and TEST ALL for data quality tests as DCM Project features is also very convenient for data pipeline management.

On the other hand, it's important to note that some objects like File Format are not yet supported by DCM management, and as a Preview feature, there are various limitations including rename restrictions (as of March 20, 2026).

That said, this is a feature I'm very excited to see evolve with future updates!!
