Still manually exporting CSVs from GA4 and Search Console?
TL;DR: Two TypeScript files + GitHub Actions = fully automated weekly data export. Service Account setup takes 15 minutes, code is copy-paste ready.
## Architecture

```text
GitHub Actions (weekly cron)
├── search_console_tracker.ts → search-console-latest.csv
└── ga4_tracker.ts            → ga4-pages-latest.csv
                                ga4-acquisition-latest.csv
```
## Prerequisites (15 min one-time setup)

- Create a Google Cloud project
- Enable the Search Console API and the Google Analytics Data API
- Create a Service Account → download a JSON key
- Add the Service Account's email to Search Console (permission: "Restricted") and to the GA4 property ("Viewer")
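If you prefer the terminal over the Cloud Console, the Cloud-side steps above can be sketched with `gcloud`. This is a sketch, not the only way: `PROJECT_ID` and the account name `analytics-tracker` are placeholders you'd substitute.

```shell
# Enable both APIs on your project (PROJECT_ID is a placeholder).
gcloud services enable searchconsole.googleapis.com analyticsdata.googleapis.com \
  --project "$PROJECT_ID"

# Create the Service Account ("analytics-tracker" is an arbitrary name).
gcloud iam service-accounts create analytics-tracker \
  --display-name "Analytics Tracker" \
  --project "$PROJECT_ID"

# Download a JSON key for it — this file becomes your GitHub secret.
gcloud iam service-accounts keys create key.json \
  --iam-account "analytics-tracker@${PROJECT_ID}.iam.gserviceaccount.com"
```

Granting the account access in Search Console and GA4 still happens in their respective UIs; there is no `gcloud` shortcut for that part.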
## Search Console Tracker

```typescript
import { google } from "googleapis";
import { writeFileSync, mkdirSync, existsSync } from "fs";

const SERVICE_ACCOUNT_JSON = process.env.GOOGLE_SERVICE_ACCOUNT_JSON ?? "";
const SITE_URL = process.env.SEARCH_CONSOLE_SITE_URL ?? "";
const OUTPUT_DIR = process.env.ANALYTICS_OUTPUT_DIR || "./analytics";

function getClient() {
  const credentials = JSON.parse(SERVICE_ACCOUNT_JSON);
  const auth = new google.auth.GoogleAuth({
    credentials,
    scopes: ["https://www.googleapis.com/auth/webmasters.readonly"],
  });
  return google.searchconsole({ version: "v1", auth });
}

async function main() {
  const client = getClient();

  // Search Console data lags by a few days, so end the window 3 days ago.
  const now = new Date();
  const endDate = new Date(now.getTime() - 3 * 86400000);
  const startDate = new Date(endDate.getTime() - 28 * 86400000);
  const fmt = (d: Date) => d.toISOString().split("T")[0];

  const res = await client.searchanalytics.query({
    siteUrl: SITE_URL,
    requestBody: {
      startDate: fmt(startDate),
      endDate: fmt(endDate),
      dimensions: ["query"],
      rowLimit: 500,
    },
  });

  const rows = res.data.rows ?? [];
  if (!existsSync(OUTPUT_DIR)) mkdirSync(OUTPUT_DIR, { recursive: true });

  // Quote the query field and escape embedded quotes so commas in
  // search terms don't break the CSV.
  const csv = [
    "Query,Clicks,Impressions,CTR,Position",
    ...rows.map(
      (r) =>
        `"${(r.keys?.[0] ?? "").replace(/"/g, '""')}",${r.clicks},${r.impressions},${((r.ctr ?? 0) * 100).toFixed(2)}%,${(r.position ?? 0).toFixed(1)}`
    ),
  ].join("\n");

  writeFileSync(`${OUTPUT_DIR}/search-console-latest.csv`, csv, "utf-8");
  console.log(`Saved ${rows.length} queries`);
}

// Exit non-zero on failure so the CI step is marked as failed.
main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```
## GA4 Tracker

```typescript
import { google } from "googleapis";
import { writeFileSync, mkdirSync, existsSync } from "fs";

const SERVICE_ACCOUNT_JSON = process.env.GOOGLE_SERVICE_ACCOUNT_JSON ?? "";
const PROPERTY_ID = process.env.GA4_PROPERTY_ID ?? "";
const OUTPUT_DIR = process.env.ANALYTICS_OUTPUT_DIR || "./analytics";

function getClient() {
  const credentials = JSON.parse(SERVICE_ACCOUNT_JSON);
  const auth = new google.auth.GoogleAuth({
    credentials,
    scopes: ["https://www.googleapis.com/auth/analytics.readonly"],
  });
  return google.analyticsdata({ version: "v1beta", auth });
}

async function main() {
  const client = getClient();
  if (!existsSync(OUTPUT_DIR)) mkdirSync(OUTPUT_DIR, { recursive: true });

  const res = await client.properties.runReport({
    property: `properties/${PROPERTY_ID}`,
    requestBody: {
      dateRanges: [{ startDate: "28daysAgo", endDate: "yesterday" }],
      dimensions: [{ name: "pagePath" }],
      metrics: [
        { name: "screenPageViews" },
        { name: "averageSessionDuration" },
        { name: "bounceRate" },
      ],
      orderBys: [{ metric: { metricName: "screenPageViews" }, desc: true }],
      limit: 100,
    },
  });

  const rows = res.data.rows ?? [];
  const csv = [
    "Page path,Views,Avg session duration (sec),Bounce rate",
    ...rows.map((r) => {
      const path = r.dimensionValues?.[0]?.value ?? "";
      const views = r.metricValues?.[0]?.value ?? "0";
      const time = parseFloat(r.metricValues?.[1]?.value ?? "0").toFixed(1);
      // bounceRate is returned as a fraction (0-1); convert to a percentage.
      const bounce = (parseFloat(r.metricValues?.[2]?.value ?? "0") * 100).toFixed(1);
      return `"${path}",${views},${time},${bounce}%`;
    }),
  ].join("\n");

  writeFileSync(`${OUTPUT_DIR}/ga4-pages-latest.csv`, csv, "utf-8");
  console.log(`Pages: ${rows.length} rows`);
}

// Exit non-zero on failure so the CI step is marked as failed.
main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```
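The architecture tree also lists `ga4-acquisition-latest.csv`, which the tracker above doesn't yet produce. One way to cover it is a second `runReport` call appended inside `main()`. This is a sketch of that fragment, reusing the tracker's `client`, `PROPERTY_ID`, and `OUTPUT_DIR`; `sessionDefaultChannelGroup`, `sessions`, and `totalUsers` are standard GA4 Data API names, but the exact CSV shape here is my assumption, not the author's.

```typescript
// Second report: traffic by acquisition channel (append inside main()).
const acq = await client.properties.runReport({
  property: `properties/${PROPERTY_ID}`,
  requestBody: {
    dateRanges: [{ startDate: "28daysAgo", endDate: "yesterday" }],
    dimensions: [{ name: "sessionDefaultChannelGroup" }],
    metrics: [{ name: "sessions" }, { name: "totalUsers" }],
    orderBys: [{ metric: { metricName: "sessions" }, desc: true }],
  },
});

const acqRows = acq.data.rows ?? [];
const acqCsv = [
  "Channel,Sessions,Users",
  ...acqRows.map(
    (r) =>
      `"${r.dimensionValues?.[0]?.value ?? ""}",${r.metricValues?.[0]?.value ?? "0"},${r.metricValues?.[1]?.value ?? "0"}`
  ),
].join("\n");

writeFileSync(`${OUTPUT_DIR}/ga4-acquisition-latest.csv`, acqCsv, "utf-8");
console.log(`Acquisition: ${acqRows.length} rows`);
```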
## GitHub Actions

```yaml
name: Analytics Tracker

on:
  schedule:
    - cron: '0 1 * * 1' # Every Monday, 01:00 UTC
  workflow_dispatch:

permissions:
  contents: write

jobs:
  track:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with: { node-version: '20' }
      - run: npm ci || npm install
      - run: mkdir -p analytics-output
      - name: Search Console
        env:
          GOOGLE_SERVICE_ACCOUNT_JSON: ${{ secrets.GOOGLE_SERVICE_ACCOUNT_JSON }}
          SEARCH_CONSOLE_SITE_URL: https://your-site.com
          ANALYTICS_OUTPUT_DIR: ./analytics-output
        run: npx tsx search_console_tracker.ts
      - name: GA4
        env:
          GOOGLE_SERVICE_ACCOUNT_JSON: ${{ secrets.GOOGLE_SERVICE_ACCOUNT_JSON }}
          GA4_PROPERTY_ID: ${{ secrets.GA4_PROPERTY_ID }}
          ANALYTICS_OUTPUT_DIR: ./analytics-output
        run: npx tsx ga4_tracker.ts
      - name: Commit results
        run: |
          git config user.name "bot"
          git config user.email "bot@example.com"
          git add analytics-output/*.csv
          git diff --cached --quiet || git commit -m "data: update analytics"
          git push
```
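The workflow reads two repository secrets. One way to create them is the `gh` CLI, assuming you have the `key.json` downloaded during setup; the property ID value here is a placeholder:

```shell
# Store the full Service Account key JSON as a repository secret.
gh secret set GOOGLE_SERVICE_ACCOUNT_JSON < key.json

# Store your GA4 property ID (numeric, from GA4 Admin > Property settings).
gh secret set GA4_PROPERTY_ID --body "123456789"
```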
## Dependencies

```shell
npm install googleapis tsx
```
## Bonus: Feed It to AI

Once the CSVs auto-update, you can ask Claude Code things like:

- "Read search-console-latest.csv and list keywords with high clicks but ranking below position 10."
- "Find pages with high page views but low engagement time."

Automated data + AI analysis = SEO on autopilot.
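If you'd rather have that first question answered deterministically instead of via a prompt, a small filter over the tracker's CSV does it. This is a sketch: the parser matches the exact header and quoting that `search_console_tracker.ts` writes, and the `minClicks` threshold of 10 is an arbitrary choice.

```typescript
interface QueryRow {
  query: string;
  clicks: number;
  impressions: number;
  position: number;
}

// Parse the tracker's CSV (header: Query,Clicks,Impressions,CTR,Position).
function parseQueryCsv(csv: string): QueryRow[] {
  return csv
    .trim()
    .split("\n")
    .slice(1) // skip the header row
    .map((line) => {
      // The query field is quoted (with "" escaping); the rest is numeric.
      const m = line.match(/^"((?:[^"]|"")*)",(\d+),(\d+),([\d.]+)%,([\d.]+)$/);
      if (!m) throw new Error(`Unparseable row: ${line}`);
      return {
        query: m[1].replace(/""/g, '"'),
        clicks: Number(m[2]),
        impressions: Number(m[3]),
        position: Number(m[5]),
      };
    });
}

// Keywords already earning clicks but ranking outside the top 10 —
// the classic "striking distance" candidates for on-page improvements.
function strikingDistance(rows: QueryRow[], minClicks = 10): QueryRow[] {
  return rows.filter((r) => r.clicks >= minClicks && r.position > 10);
}
```

Run it against the latest export with `parseQueryCsv(readFileSync("./analytics/search-console-latest.csv", "utf-8"))` and log the result.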
## Summary

| Step | Time |
|---|---|
| Service Account setup | 15 min (one-time) |
| Copy-paste the code | 2 min |
| GitHub Actions runs weekly | 0 min |

No more manual exports.
## Related Links

- Indie Dev Monetization Roadmap — not just tool reviews; it also tackles "how do you actually make money?"
- LaunchKit — SaaS Starter Kit — Next.js + Supabase + Stripe, production-ready