HatmanStack
How to Connect Google Forms to Snowflake Using Cloud Run

Every major cloud provider has tools to collect survey data within their own ecosystem. But what if you need form responses to land in a data warehouse on a different platform? That takes a little
wiring.

In this tutorial, you'll build a pipeline that automatically sends Google Forms responses to a Snowflake table. The architecture uses four components:

  • Google Forms — collects the data
  • Apps Script — triggers on each form submission
  • Google Cloud Run — runs a Node.js service that connects to Snowflake
  • Snowflake Node Connector — inserts the data using parameterized queries

By the end, every form submission will automatically appear in your Snowflake warehouse within a couple of minutes.

Prerequisites

  • A Google Cloud account with billing enabled
  • A Snowflake account (a free trial works)
  • Basic familiarity with Node.js and Docker

Step 1: Set Up the Google Form and Apps Script

Create a Google Form and navigate to the Responses tab. Click the Google Sheets icon to create a linked spreadsheet — this is where form responses land before we forward them to Snowflake.

Open the linked spreadsheet, then go to Extensions → Apps Script. Create a function that runs on each form submission:

  function onFormSubmit() {
    var ss = SpreadsheetApp.getActiveSpreadsheet();
    var sheet = ss.getSheets()[0];

    // Sort by timestamp (column A), newest first, so the latest
    // response always lands in row 2
    var range = sheet.getRange("A2:E");
    range.sort({column: 1, ascending: false});

    var url = ""; // We'll fill this in after deploying Cloud Run
    var options = {
      "method": "get",
      "contentType": "application/json"
    };
    UrlFetchApp.fetch(url, options);
  }

This script sorts responses by timestamp (newest first), then makes an HTTP GET request to our Cloud Run service. Leave the url empty for now — we'll fill it in after deploying the connector. To make the function run on every submission, add an installable trigger: in the Apps Script editor, open Triggers → Add Trigger, select your function, and set the event type to On form submit.

Step 2: Create a Google Cloud Service Account

Service accounts let applications authenticate with Google APIs without using a personal login. Our Cloud Run service needs one to read from the Google Sheet.

  1. In Google Cloud Console, go to IAM & Admin → Service Accounts → Create Service Account
  2. Save the generated email address — you'll need it later to share the Google Sheet
  3. Go to the Keys tab → Add Key → JSON. This downloads a credential file to your machine
  4. Enable the Google Sheets API in your project (APIs & Services → Enable APIs)

Keep the downloaded JSON file — we'll include it in our Cloud Run deployment as creds.json.
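The console steps above can also be scripted. Here's a hedged sketch using the gcloud CLI — the service account name sheets-reader and <your-project-id> are placeholders, not values from this tutorial:

```shell
# Create the service account (the name is an example)
gcloud iam service-accounts create sheets-reader \
    --project=<your-project-id>

# Generate and download a JSON key as creds.json
gcloud iam service-accounts keys create creds.json \
    --iam-account=sheets-reader@<your-project-id>.iam.gserviceaccount.com

# Enable the Google Sheets API
gcloud services enable sheets.googleapis.com --project=<your-project-id>
```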

Step 3: Prepare Your Snowflake Table

Before the connector can insert data, the destination table must exist. In Snowflake, run:

  CREATE DATABASE IF NOT EXISTS DEMO_DB;
  CREATE SCHEMA IF NOT EXISTS DEMO_DB.PUBLIC;
  CREATE TABLE IF NOT EXISTS DEMO_DB.PUBLIC.SHEETS (
    TS STRING,
    NAME STRING,
    DAYS STRING,
    DIET STRING,
    PAY STRING
  );

Adjust the column names and types to match your form's fields.

Step 4: Build the Node.js Connector

The connector is a small Express server that reads the latest form entry from Google Sheets and inserts it into Snowflake. Create three files:

  index.js

  const path = require('path');
  const { google } = require('googleapis');
  const sheets = google.sheets('v4');
  const snow = require('snowflake-sdk');
  const express = require('express');
  const app = express();

  const getInvite = async () => {
    const auth = new google.auth.GoogleAuth({
      keyFile: path.join(__dirname, 'creds.json'),
      // Read-only access is all this service needs
      scopes: ['https://www.googleapis.com/auth/spreadsheets.readonly'],
    });
    google.options({ auth });

    // The sheet is sorted newest-first, so row 2 is the latest response
    const res = await sheets.spreadsheets.values.get({
      spreadsheetId: '<your-sheets-id>', // From the sheet URL
      range: 'A2:E2',
    });

    const row = res.data.values;
    if (!row || row.length === 0) {
      console.log('No data found.');
      return;
    }

    // In production, read these from environment variables or
    // Secret Manager rather than hardcoding them
    const connection = snow.createConnection({
      account: '<locator>.<region>.<cloud>', // e.g., xh45729.us-east-2.aws
      username: '<your-username>',
      password: '<your-password>',
      warehouse: 'COMPUTE_WH',
      database: 'DEMO_DB',
      schema: 'PUBLIC',
      role: 'ACCOUNTADMIN'
    });

    // connect() is asynchronous — wait for the callback before executing
    connection.connect((err, conn) => {
      if (err) {
        console.error('Unable to connect to Snowflake: ' + err.message);
        return;
      }

      // Parameterized query prevents SQL injection — never concatenate
      // user-supplied values directly into SQL strings
      conn.execute({
        sqlText: 'INSERT INTO DEMO_DB.PUBLIC.SHEETS (TS, NAME, DAYS, DIET, PAY) VALUES (?, ?, ?, ?, ?)',
        binds: row[0],
        complete: (err) => {
          if (err) console.error('Insert failed: ' + err.message);
        }
      });
    });
  };

  app.get('/', (req, res) => {
    getInvite().catch((err) => console.error(err));
    res.send('Adding Data');
  });

  const port = process.env.PORT || 8080;
  app.listen(port, () => {
    console.log(`Listening on port ${port}`);
  });

Your Google Sheet ID is the long string in the sheet's URL between /d/ and /edit:

https://docs.google.com/spreadsheets/d/<sheets-id>/edit#gid=0
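If you'd rather extract it programmatically, a small helper (plain Node; the function name is mine, not part of the connector) does the job:

```javascript
// Pull the spreadsheet ID out of a full Google Sheets URL.
// The ID is the path segment that follows "/d/".
function sheetIdFromUrl(url) {
  const match = url.match(/\/d\/([a-zA-Z0-9_-]+)/);
  if (!match) {
    throw new Error('Not a Google Sheets URL: ' + url);
  }
  return match[1];
}

console.log(sheetIdFromUrl(
  'https://docs.google.com/spreadsheets/d/1AbC-dEf_123/edit#gid=0'
)); // → 1AbC-dEf_123
```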

The Snowflake account identifier (<locator>.<region>.<cloud>) is shown in the bottom-left of your Snowflake console and appears in your login URL. For example: xh45729.us-east-2.aws. If you're unsure, check the account identifier docs at https://docs.snowflake.com/en/user-guide/admin-account-identifier — the format varies by deployment.

Notice the parameterized query with ? placeholders and the binds array. This is important: parameterized queries prevent SQL injection by letting the database driver handle escaping. Never build SQL strings by concatenating user input directly.
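To make the risk concrete, here's a standalone sketch (plain Node, no database required) contrasting the two approaches — the table and submitted value are made up for illustration:

```javascript
// A value an attacker might type into the form
const name = "Bob'); DROP TABLE SHEETS; --";

// DANGEROUS: concatenation lets the value rewrite the statement
const unsafe = "INSERT INTO SHEETS (NAME) VALUES ('" + name + "')";
console.log(unsafe.includes('DROP TABLE')); // → true

// SAFE: with a placeholder, the driver sends the value as data;
// the SQL text itself never changes, no matter what was submitted
const stmt = {
  sqlText: 'INSERT INTO SHEETS (NAME) VALUES (?)',
  binds: [name],
};
console.log(stmt.sqlText.includes('DROP TABLE')); // → false
```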

  Dockerfile

  FROM node:20-slim

  WORKDIR /usr/src/app

  # Copy dependency manifests first — Docker caches this layer
  # so npm install only re-runs when dependencies change
  COPY package*.json ./

  RUN npm install --omit=dev

  COPY . .

  EXPOSE 8080
  CMD ["node", "index.js"]
  package.json

  {
    "name": "node-sheets-to-snow",
    "version": "2.0.0",
    "description": "Google Sheets to Snowflake connector",
    "main": "index.js",
    "scripts": {
      "start": "node index.js"
    },
    "engines": {
      "node": ">=20.0.0"
    },
    "dependencies": {
      "express": "^4.21.0",
      "googleapis": "^144.0.0",
      "snowflake-sdk": "^2.0.2"
    }
  }

Step 5: Deploy to Google Cloud Run

Place all four files (index.js, Dockerfile, package.json, creds.json) in a directory. Open Cloud Shell from the Google Cloud Console (the terminal icon in the top right), upload the files, and deploy:

  cd your-project-directory
  gcloud run deploy --source .

Cloud Run will ask for a service name and region, and whether to allow unauthenticated invocations — answer yes, since the Apps Script trigger calls the URL without credentials. It then builds the container, deploys it, and returns a URL. Copy that URL.
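Before wiring it into Apps Script, you can smoke-test the service from Cloud Shell (the URL below is a placeholder for the one gcloud printed):

```shell
curl https://<your-service>-<hash>-<region>.a.run.app/
# The service responds with: Adding Data
```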

Step 6: Wire Everything Together

Two final connections:

  1. Share the Google Sheet with your service account. Click the Share button on the spreadsheet and add the service account email from Step 2. This gives the Cloud Run service permission to read form responses.
  2. Add the Cloud Run URL to your Apps Script. Go back to Extensions → Apps Script and paste the URL into the url variable in your function.

Testing It

Submit a response through your Google Form. Within one to two minutes, the data should appear in your Snowflake table. The slight delay comes from Cloud Run's cold start — the container spins down when idle and takes a moment to restart on the first request.

You can verify by running in Snowflake:

  SELECT * FROM DEMO_DB.PUBLIC.SHEETS ORDER BY TS DESC LIMIT 5;

The complete source code is available at https://github.com/HatmanStack/snow-node-sheets-gpc.
