Architecture Overview
YouTube doesn't expose subscriber email addresses through their API — but many creators list their contact email publicly on their channel page. Here's how I architected an n8n workflow to extract those emails and automate outreach:
1. Daily Cron Trigger
↓
2. YouTube Data API → Fetch recent subscribers
↓
3. Split array → Process each subscriber individually
↓
4. Google Sheets lookup → Check for duplicates
↓
5. IF subscriber exists → Skip to next
↓ (new subscriber)
6. Apify YouTube Scraper → Extract email from channel page
↓
7. IF email found → Send Gmail welcome + log to Sheets
↓ (no email)
8. Log subscriber without email → Prevent re-scraping
This workflow processes subscribers in batches of 1 to avoid rate limits, uses conditional logic to handle missing data gracefully, and maintains a Google Sheet as the source of truth for processed subscribers.
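The branching in steps 4–8 can be sketched as a single decision function. This is a hypothetical helper for illustration: in the real workflow the logic is spread across n8n IF nodes, and the scrape step is an async Apify call rather than a plain callback.

```javascript
// Sketch of the per-subscriber branching (steps 4-8). `scrapeEmail` stands
// in for the Apify scrape; in n8n this logic lives in separate IF nodes.
function processSubscriber(sub, existingIds, scrapeEmail) {
  const channelId = sub.subscriberSnippet?.channelId;
  if (!channelId) return { action: "skip", reason: "missing-data" };    // deleted/private accounts
  if (existingIds.has(channelId)) return { action: "skip", reason: "duplicate" }; // step 5
  const email = scrapeEmail(channelId);                                 // step 6: Apify scrape
  if (email) return { action: "send-welcome", channelId, email };       // step 7
  return { action: "log-no-email", channelId };                         // step 8: prevent re-scraping
}
```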
YouTube Data API Integration
The core data source is the YouTube Data API v3 subscriptions endpoint. You'll need OAuth2 credentials with the youtube.readonly scope.
Authentication setup:
- Create a project in Google Cloud Console
- Enable YouTube Data API v3
- Create OAuth2 credentials (Web application type)
- Add authorized redirect URI: https://your-n8n-instance.com/rest/oauth2-credential/callback
- Copy Client ID and Client Secret to n8n credential manager
API request structure:
GET https://www.googleapis.com/youtube/v3/subscriptions
?part=snippet
&myRecentSubscribers=true
&maxResults=50
Headers: {
"Authorization": "Bearer {oauth_token}"
}
Response format:
{
"items": [
{
"subscriberSnippet": {
"title": "Channel Name",
"channelId": "UCxxxxxxxxxxxxx",
"description": "Channel description"
}
}
]
}
Critical parameters:
- myRecentSubscribers=true filters to the newest subscribers (important for daily processing)
- maxResults=50 is the API limit per request
- The response includes channelId, which becomes your unique identifier
Rate limits: 10,000 quota units per day. A single subscriptions.list call costs 1 unit and returns up to 50 subscribers, so a daily fetch of 50 new subscribers uses just 1 unit.
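The request above can be assembled as a plain URL. This is a sketch for clarity; the n8n YouTube node builds this request for you, and the OAuth bearer token comes from the credential manager.

```javascript
// Build the subscriptions.list URL from step 2 (sketch; the n8n node
// normally assembles this). The API caps maxResults at 50 per request.
function buildSubscriptionsUrl({ maxResults = 50 } = {}) {
  const params = new URLSearchParams({
    part: "snippet",
    myRecentSubscribers: "true",                  // newest subscribers only
    maxResults: String(Math.min(maxResults, 50)), // enforce the API cap
  });
  return `https://www.googleapis.com/youtube/v3/subscriptions?${params}`;
}
```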
Apify Web Scraping Integration
Apify's YouTube Email Scraper actor handles the complex part: navigating to a channel's About page and extracting contact information.
Authentication:
- Sign up at apify.com
- Navigate to Settings → Integrations → API Token
- Copy token to n8n Apify credential
Actor configuration:
{
"url": "https://www.youtube.com/channel/{channelId}",
"maxConcurrency": 1
}
Response structure:
{
"email": ["contact@example.com"],
"channelName": "Creator Name",
"subscriberCount": 1500
}
Cost considerations:
- Each scrape consumes ~0.001 compute units
- Free tier includes 5 compute units/month (~5,000 scrapes)
- Paid plans start at $49/month for 100 compute units
Error handling:
- The actor returns an empty email array when no email is found
- Timeout is set to 60 seconds per channel
- Failed runs throw errors caught by the n8n error workflow
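Because the actor returns email as an array (possibly empty), it's worth normalizing each result item before branching on it. A minimal sketch, assuming the response shape shown above:

```javascript
// Normalize one Apify result item (step 6): `email` is always an array,
// and may be empty or missing when no address was found on the channel.
function firstEmail(item) {
  const emails = Array.isArray(item?.email) ? item.email : [];
  return emails.length > 0 ? emails[0] : null; // null drives the "no email" branch
}
```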
Google Sheets Duplicate Detection
The workflow uses Google Sheets as a simple database to prevent processing the same subscriber twice.
Sheet structure:
| Subscriber name | Subscriber id | Email |
|----------------|------------------------|------------------------|
| John Creator | UCxxxxxxxxxxxxx | john@example.com |
| Jane Vlogger | UCyyyyyyyyyyyyyyy | |
Lookup query:
// n8n expression in Google Sheets node
{{ $('SplitInBatches').item.json.subscriberSnippet.channelId }}
This queries the "Subscriber id" column. If the lookup returns a matching row (n8n includes its row_number), the subscriber was already processed.
Why this approach:
- Avoids re-scraping channels daily (saves Apify credits)
- Prevents sending duplicate welcome emails
- Provides a queryable database of all subscribers
- No database setup required (Google Sheets is free and familiar)
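The duplicate check amounts to a set-membership test against the sheet. A sketch of steps 4–5, using the "Subscriber id" column name from the layout above:

```javascript
// Steps 4-5: given rows already in the sheet, keep only API items whose
// channelId has not been processed. Items with missing snippets are dropped.
function filterNewSubscribers(apiItems, sheetRows) {
  const seen = new Set(sheetRows.map(r => r["Subscriber id"]));
  return apiItems.filter(item => {
    const id = item.subscriberSnippet?.channelId;
    return id !== undefined && !seen.has(id);
  });
}
```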
Gmail API Configuration
Welcome emails are sent via the Gmail API using OAuth2 authentication.
OAuth2 setup:
- Same Google Cloud project as YouTube API
- Enable Gmail API
- Scopes required: gmail.send
- Use the same OAuth2 credential type in n8n
Email parameters:
{
"to": "{{ $('Apify').item.json.email[0] }}",
"subject": "Thanks for subscribing! 🎉",
"message": "<html>...</html>"
}
Rate limits:
- 500 emails/day for standard Gmail accounts
- 2,000 emails/day for Google Workspace accounts
- No per-minute limits, but bursts >100/min may trigger warnings
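Under the hood, Gmail's users.messages.send endpoint expects a raw RFC 2822 message, base64url-encoded. The n8n Gmail node does this internally; the sketch below shows the shape, assuming an ASCII subject (a subject with emoji, like the one above, would additionally need RFC 2047 encoding).

```javascript
// Build the `raw` payload for Gmail users.messages.send: an RFC 2822
// message, base64url-encoded. Sketch only; assumes an ASCII subject.
function buildRawMessage({ to, subject, html }) {
  const mime = [
    `To: ${to}`,
    `Subject: ${subject}`,
    "MIME-Version: 1.0",
    'Content-Type: text/html; charset="UTF-8"',
    "",            // blank line separates headers from body
    html,
  ].join("\r\n");
  return Buffer.from(mime).toString("base64url");
}
```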
Implementation Gotchas
Missing subscriber data:
The YouTube API occasionally returns subscribers with incomplete subscriberSnippet data (deleted accounts, privacy settings). Always check for null:
{{ $json.subscriberSnippet?.channelId || 'unknown' }}
Apify timeout handling:
Some channels load slowly. Set actor timeout to 60-90 seconds and wrap in try/catch:
// In n8n error workflow
IF error.message includes "timeout" → Log to separate sheet
Email array indexing:
Apify returns emails as an array even for single results. Always use email[0] to grab the first:
{{ $json.email[0] }}
Google Sheets append performance:
Appending rows one at a time is slow. For high-volume channels (>100 subscribers/day), batch writes using the "Append/Update" operation with row ranges.
OAuth token expiration:
YouTube and Gmail OAuth tokens expire after 1 hour. n8n handles refresh automatically, but if the workflow fails after days of inactivity, manually re-authenticate credentials.
Duplicate detection edge case:
If someone unsubscribes then re-subscribes, they'll be logged twice. To prevent this, add a "Last checked" timestamp column and filter by date.
Prerequisites
Required accounts:
- n8n instance (self-hosted or n8n Cloud)
- Google Cloud Console account
- Apify account (free tier works)
- Gmail account
API credentials to configure:
- YouTube Data API v3 OAuth2
- Google Sheets OAuth2
- Gmail OAuth2
- Apify API key
Cost estimate:
- n8n Cloud: $20/month starter plan
- Apify free tier: 5,000 scrapes/month
- YouTube/Gmail APIs: Free (within quotas)
- Total: $0-20/month depending on n8n hosting
Documentation links:
- YouTube Data API reference
- Apify YouTube Scraper actor
- n8n Google OAuth setup
- Gmail API send reference
Get the Complete n8n Workflow Configuration
This tutorial covers the API integration architecture and core concepts. For the complete n8n workflow JSON, detailed node configurations with all parameters, and a video walkthrough of the setup process, check out the full implementation guide.