This is a submission for the GitHub Copilot CLI Challenge
What I Built
EmailToGSheets is an automated email processing system that reads emails from an IMAP server, extracts structured data using AI, and syncs it to Google Sheets in near real-time. It's designed for organizations that need to convert email communications into actionable spreadsheet data, with automatic SLA (Service Level Agreement) tracking built in.
Key Features:
- Smart Email Processing: Automatically fetches and filters emails from specific senders using IMAP (see the sketch after this list)
- AI-Powered Analysis: Integrates Hugging Face AI to analyze and categorize email content into business categories (Bugs, Operation Support, Meeting, Maintenance Support)
- SLA Tracking: Automatically calculates response times between customer emails and helpdesk replies with 24-hour SLA monitoring
- Google Sheets Integration: Direct synchronization to Google Sheets with automatic OAuth2 token management and batch updates
- Smart Ticket System: Automatic ticket ID generation with date-based sequencing and thread detection
- Automated Scheduling: Built-in cron scheduler for regular processing (every 30 minutes)
- Fallback System: Manual CSV processing mode when IMAP is blocked by firewalls
- Docker Deployment: Containerized application with health checks and persistent volumes
- Token Management: Automatic OAuth2 token refresh with dedicated CLI tool for health checks
- VPS Deployment: CI/CD pipeline with GitHub Actions for automated deployment
- Smart Caching: Email cache system to avoid reprocessing and optimize performance
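To make the IMAP step concrete, here is a minimal sketch using the `imap` package. The host, credentials, and sender address are placeholders, and this is not the repo's actual EmailReader:

```javascript
// Minimal IMAP header-fetch sketch -- placeholder config, not the repo's EmailReader
const Imap = require('imap');

const imap = new Imap({
  user: process.env.IMAP_USER,
  password: process.env.IMAP_PASSWORD,
  host: 'imap.example.com', // placeholder host
  port: 993,
  tls: true
});

imap.once('ready', () => {
  imap.openBox('INBOX', true, (err) => {
    if (err) throw err;
    // Only unread mail from one sender (placeholder address)
    imap.search(['UNSEEN', ['FROM', 'helpdesk@example.com']], (err, uids) => {
      if (err || uids.length === 0) return imap.end();
      const fetch = imap.fetch(uids, {
        bodies: 'HEADER.FIELDS (FROM SUBJECT DATE MESSAGE-ID IN-REPLY-TO REFERENCES)'
      });
      fetch.on('message', (msg) => {
        msg.on('body', (stream) => {
          let header = '';
          stream.on('data', (chunk) => (header += chunk.toString('utf8')));
          stream.once('end', () => console.log(Imap.parseHeader(header)));
        });
      });
      fetch.once('end', () => imap.end());
    });
  });
});

imap.once('error', (err) => console.error('IMAP error:', err.message));
imap.connect();
```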
The Problem It Solves:
Many organizations receive structured data through email (support requests, forms, notifications) but need this data in spreadsheets for analysis, tracking, or integration with other systems. Manually copying email data to spreadsheets is time-consuming and error-prone. EmailToGSheets automates this entire workflow, ensuring data is always up-to-date, properly categorized, and includes SLA metrics for support team performance tracking.
Demo
Repository: wira-works/emailtogsheets
Architecture Overview:
```
Email Server (IMAP)
        ↓
EmailReader + AI Analyzer + SLA Calculator
        ↓
CSV Writer + Cache
        ↓
Google Sheets API (Batch Updates)
        ↓
Live Spreadsheet with SLA Tracking
```
Quick Start Commands:
```bash
# Local Operations
npm start              # Run email processor
npm run schedule       # Start scheduler (30-min interval)
npm run token:check    # Check OAuth token status
npm run token:refresh  # Force token refresh

# Health Monitoring
npm run health         # Complete system dashboard

# VPS Deployment
npm run sync:vps       # Bidirectional sync with VPS
npm run deploy:setup   # Interactive deployment wizard
```
Key Components:
- EmailReader (`lib/emailReader.js`): IMAP connection, email filtering, and SLA calculation
- AIAnalyzer (`lib/aiAnalyzer.js`): AI-powered content extraction with fallback pattern matching
- GoogleSheetsIntegration (`lib/googleSheets.js`): OAuth2 authentication, batch updates, and SLA column management
- EmailCache (`lib/emailCache.js`): Ticket metadata tracking and SLA calculation caching
- OAuth2TokenManager (`lib/oauth2TokenManager.js`): Automatic token refresh system
- Scheduler (`scheduler.js`): Cron-based automation with configurable intervals (see the sketch below)
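The Scheduler boils down to a node-cron loop. A minimal sketch, where `processEmails` is a stand-in for the real pipeline entry point rather than the repo's actual scheduler.js:

```javascript
// Sketch of a 30-minute cron loop; processEmails() is a stand-in function
const cron = require('node-cron');

async function processEmails() {
  console.log(`[${new Date().toISOString()}] Processing emails...`);
  // real pipeline: fetch via IMAP -> AI analysis -> cache/CSV -> Google Sheets
}

// '*/30 * * * *' fires at minute 0 and 30 of every hour
cron.schedule('*/30 * * * *', () => {
  processEmails().catch((err) => console.error('Run failed:', err.message));
});

console.log('Scheduler started (30-minute interval)');
```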
Production Features:
- Automated token refresh every 30 minutes (sketched after this list)
- Email caching to prevent duplicate processing
- SLA tracking with 24-hour threshold monitoring
- AI email categorization with 3-tier fallback strategy
- Comprehensive logging system
- Health check dashboard
- Backup and disaster recovery scripts
- SSH-based VPS synchronization
- Docker deployment with GitHub Container Registry (GHCR)
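Here is roughly what that refresh check looks like, assuming credentials.json and token.json in the shape produced by Google's Node.js quickstart (a sketch, not the repo's actual OAuth2TokenManager):

```javascript
// Token refresh sketch -- assumes the standard Google quickstart file layout,
// not the repo's exact files
const fs = require('fs');
const { google } = require('googleapis');

const { client_id, client_secret, redirect_uris } =
  JSON.parse(fs.readFileSync('credentials.json', 'utf8')).installed;
const oauth2Client = new google.auth.OAuth2(client_id, client_secret, redirect_uris[0]);
oauth2Client.setCredentials(JSON.parse(fs.readFileSync('token.json', 'utf8')));

async function ensureFreshToken() {
  const expiry = oauth2Client.credentials.expiry_date || 0;
  if (expiry - Date.now() > 5 * 60 * 1000) return; // still valid for > 5 minutes
  // getAccessToken() uses the stored refresh_token when the access token is stale
  await oauth2Client.getAccessToken();
  fs.writeFileSync('token.json', JSON.stringify(oauth2Client.credentials, null, 2));
  console.log('OAuth2 token refreshed');
}

ensureFreshToken().catch((err) => console.error('Refresh failed:', err.message));
```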
My Experience with GitHub Copilot CLI
GitHub Copilot CLI transformed how I built this project. It wasn't just about writing code in the IDE—the gh copilot command became my go-to tool for generating complex shell commands, understanding Docker operations, and automating deployment workflows.
1. Docker Compose with Health Checks & Resource Limits
What I asked GitHub Copilot CLI:
gh copilot suggest "create docker compose for nodejs app with health checks, resource limits, and persistent volumes for logs output and data"
What Copilot CLI generated:
```yaml
services:
  email-scheduler:
    image: email-to-gsheets:latest
    container_name: email-scheduler-app
    restart: unless-stopped
    volumes:
      - ./data:/app/data
      - ./output:/app/output
      - ./logs:/app/logs
      - ./token.json:/app/token.json
      - ./credentials.json:/app/credentials.json
    deploy:
      resources:
        limits:
          cpus: '1'
          memory: 512M
        reservations:
          cpus: '0.5'
          memory: 256M
    healthcheck:
      test: ["CMD", "node", "-e", "require('fs').accessSync('./token.json'); console.log('healthy')"]
      interval: 60s
      timeout: 10s
      retries: 3
      start_period: 30s
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "5"
```
Impact: Copilot CLI suggested production-grade patterns I hadn't considered - resource reservations, proper logging limits, and a clever health check that verifies the OAuth token file exists. This saved me hours of Docker documentation reading.
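If the health check ever needs to do more than confirm the token file exists, that one-liner can grow into a small script. A sketch (the one-hour staleness threshold is my own arbitrary choice, not something Copilot generated):

```javascript
// healthcheck.js -- exit 0 = healthy, exit 1 = unhealthy (Docker convention)
const fs = require('fs');

try {
  const stat = fs.statSync('./token.json');
  const ageMinutes = (Date.now() - stat.mtimeMs) / 60000;
  // If the refresh loop is alive, token.json gets rewritten regularly;
  // a stale file suggests the scheduler has stalled
  if (ageMinutes > 60) {
    console.error(`token.json is stale (${Math.round(ageMinutes)} minutes old)`);
    process.exit(1);
  }
  console.log('healthy');
} catch (err) {
  console.error('token.json missing or unreadable:', err.message);
  process.exit(1);
}
```

The compose `test` line would then point at the script: `test: ["CMD", "node", "healthcheck.js"]`.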
2. VPS File Synchronization with SSH
What I asked GitHub Copilot CLI:
gh copilot suggest "bidirectional file sync between local and remote server using scp with custom port 22"
What Copilot CLI generated:
```bash
# Sync local to VPS
scp -P 22 -r "$LOCAL_PATH/output/" "$VPS_USER@$VPS_HOST:$VPS_PATH/output/"
scp -P 22 -r "$LOCAL_PATH/data/" "$VPS_USER@$VPS_HOST:$VPS_PATH/data/"
scp -P 22 "$LOCAL_PATH/token.json" "$VPS_USER@$VPS_HOST:$VPS_PATH/token.json"

# Sync VPS to local
scp -P 22 -r "$VPS_USER@$VPS_HOST:$VPS_PATH/output/" "$LOCAL_PATH/output/"
scp -P 22 -r "$VPS_USER@$VPS_HOST:$VPS_PATH/data/" "$LOCAL_PATH/data/"
```
Impact: Perfect SSH file transfer syntax with custom ports on the first try. Copilot CLI understood I needed bidirectional sync and structured the script accordingly.
3. Docker Container Health Monitoring
What I asked GitHub Copilot CLI:
gh copilot suggest "check if docker container is running and show its status with custom format"
What Copilot CLI generated:
```bash
docker ps --filter 'name=email-scheduler' --format 'table {{.Names}}\t{{.Status}}\t{{.Mounts}}'
```
Then I needed to understand the cleanup command:
gh copilot explain "docker image prune -f --filter 'until=72h'"
Copilot CLI explained:
This command removes unused Docker images created more than 72 hours ago. The `-f` flag forces removal without a confirmation prompt, and `--filter 'until=72h'` restricts pruning to images older than that cutoff.
Impact: Learned proper Docker filtering syntax and automated cleanup for production environments.
4. AI-Powered Email Categorization Logic
What I asked GitHub Copilot (IDE):
```javascript
// Comment: Create a function to categorize emails using Hugging Face API with fallback to pattern matching
```
What Copilot generated:
```javascript
async analyzeEmailContent(subject, textBody) {
  try {
    const emailContent = `Subject: ${subject}\n\nContent: ${textBody}`.substring(0, 1000);
    let analysis;
    try {
      // Try AI analysis first
      analysis = await this.generateAnalysis(emailContent, subject);
    } catch (aiError) {
      console.warn(`⚠️ AI Analysis failed, using enhanced pattern matching`);
      // Fallback to pattern-based analysis
      const detailedAnalysis = this.getDetailedAnalysis(subject, textBody);
      analysis = detailedAnalysis.analysis;
    }
    return analysis;
  } catch (error) {
    // Final fallback to simple keyword-based analysis
    return this.fallbackAnalysis(subject, textBody);
  }
}

fallbackCategorization(content) {
  const contentLower = content.toLowerCase();
  // MAINTENANCE SUPPORT - Technical activities
  if (
    contentLower.includes('defect') ||
    contentLower.includes('fixing') ||
    (contentLower.includes('migrasi') && contentLower.includes('dokumen')) ||
    contentLower.includes('migration') ||
    (contentLower.includes('inject') && contentLower.includes('data'))
  ) {
    return 'Maintenance Support';
  }
  // ... more categories
}
```
Impact: Copilot suggested a three-tier fallback strategy (AI → Pattern → Keywords) and even included Indonesian language patterns for my business context.
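For context, `generateAnalysis` isn't shown above. A zero-shot classification request to the Hugging Face Inference API with axios might look like this sketch (the endpoint, model, and `HF_TOKEN` variable are illustrative assumptions, not the repo's actual configuration):

```javascript
// Illustrative zero-shot classification call -- model and env var are assumptions
const axios = require('axios');

const LABELS = ['Bugs', 'Operation Support', 'Meeting', 'Maintenance Support'];

async function classifyEmail(subject, textBody) {
  const input = `Subject: ${subject}\n\n${textBody}`.substring(0, 1000);
  const res = await axios.post(
    'https://api-inference.huggingface.co/models/facebook/bart-large-mnli',
    { inputs: input, parameters: { candidate_labels: LABELS } },
    { headers: { Authorization: `Bearer ${process.env.HF_TOKEN}` }, timeout: 15000 }
  );
  // The API returns candidate labels sorted by descending confidence score
  return res.data.labels[0];
}
```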
5. SLA Calculation with Email Threading
What I asked GitHub Copilot (IDE):
```javascript
// Calculate SLA response time between customer email and helpdesk reply, considering email threading with In-Reply-To headers
```
What Copilot generated:
```javascript
calculateSLA() {
  console.log('\n⏱️ Calculating SLA (Response Time)...');
  const customerEmails = this.emails.filter(email => !email.isHelpdeskReply);
  const helpdeskReplies = this.emails.filter(email => email.isHelpdeskReply);

  customerEmails.forEach(customerEmail => {
    // Find helpdesk reply using RFC 5322 headers
    const reply = helpdeskReplies.find(helpdeskEmail => {
      const referencesCustomerEmail =
        helpdeskEmail.inReplyTo === customerEmail.messageId ||
        (helpdeskEmail.references && helpdeskEmail.references.includes(customerEmail.messageId));
      const subjectMatch =
        (helpdeskEmail.subject.toLowerCase().includes('re:') ||
          helpdeskEmail.subject.toLowerCase().includes('bls:')) &&
        helpdeskEmail.subject.toLowerCase().includes(
          customerEmail.subject.toLowerCase().replace('re:', '').replace('bls:', '').trim()
        );
      return referencesCustomerEmail || subjectMatch;
    });

    if (reply) {
      const diffHours = (new Date(reply.date) - new Date(customerEmail.date)) / (1000 * 60 * 60);
      customerEmail.responseTimeHours = diffHours.toFixed(2);
      customerEmail.slaStatus = diffHours <= 24 ? 'Met (< 24h)' : 'Exceeded';
    }
  });
}
```
Impact: Copilot taught me about RFC 5322 email headers (In-Reply-To, References) and suggested bilingual subject matching (English "Re:" + Indonesian "Bls:") which was critical for my use case.
6. Google Sheets Batch Updates
What I asked GitHub Copilot (IDE):
```javascript
// Batch update specific columns in Google Sheets for SLA data without overwriting the entire row
```
What Copilot generated:
```javascript
async updateSLAColumnsInSheet(slaUpdates, headers) {
  const requests = [];
  const slaIndices = this.getSLAColumnIndices(headers);

  for (const update of slaUpdates) {
    const rowIndex = update.rowIndex;
    const colStartIndex = slaIndices.replyDate;
    const colEndIndex = slaIndices.slaStatus;

    // Convert column index to A1 notation
    const startCol = String.fromCharCode(65 + colStartIndex);
    const endCol = String.fromCharCode(65 + colEndIndex);
    const range = `${this.sheetName}!${startCol}${rowIndex}:${endCol}${rowIndex}`;

    const slaValues = [[
      update.replyDate,
      update.helpdeskRepliedBy,
      update.responseTime,
      update.responseTimeHours,
      update.slaStatus
    ]];
    requests.push({ range, values: slaValues });
  }

  // Batch update all at once
  await this.sheets.spreadsheets.values.batchUpdate({
    spreadsheetId: this.spreadsheetId,
    resource: {
      data: requests,
      valueInputOption: 'RAW'
    }
  });
}
```
Impact: Copilot suggested batch updates instead of individual API calls (10x faster) and generated the A1 notation conversion formula that I would have struggled with.
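One caveat I'd flag in the generated code: `String.fromCharCode(65 + colIndex)` only covers columns A through Z. A general converter for wider sheets would look something like this (my own addition, not part of the generated snippet):

```javascript
// Zero-based column index -> A1 column letters: 0 -> 'A', 25 -> 'Z', 26 -> 'AA'
function columnToA1(index) {
  let col = '';
  let n = index + 1; // bijective base-26 works in 1-based terms
  while (n > 0) {
    const rem = (n - 1) % 26;
    col = String.fromCharCode(65 + rem) + col;
    n = Math.floor((n - 1) / 26);
  }
  return col;
}
```

With that helper, `startCol` becomes `columnToA1(colStartIndex)` and the range stays valid past column Z.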
7. Deployment Script with Automatic Rollback
What I asked GitHub Copilot CLI:
gh copilot suggest "bash script to pull docker image from ghcr, update docker-compose, start container, and rollback on failure"
What Copilot CLI generated:
```bash
# Backup current config
cp docker-compose.yml docker-compose.yml.backup

# Pull new image
docker pull "${IMAGE_NAME}:${TAG}"

# Update docker-compose.yml
sed -i "s|image: .*|image: ${IMAGE_NAME}:${TAG}|g" docker-compose.yml

# Start new container with error handling
if docker-compose up -d; then
  log_success "Container started"
  rm -f docker-compose.yml.backup
else
  log_error "Failed - Rolling back"
  mv docker-compose.yml.backup docker-compose.yml
  docker-compose up -d
  exit 1
fi

# Clean up old images (>72 hours)
docker image prune -f --filter "until=72h"
```
Impact: Production-grade deployment pattern with automatic rollback that prevented several outages during testing. The cleanup command keeps disk space under control.
8. Understanding Complex Shell Commands
Throughout the project, I frequently used gh copilot explain to understand commands:
Example 1:
gh copilot explain "ssh -q -p 22 user@host 'docker ps -a --filter name=app' 2>/dev/null"
Copilot CLI explained:
Connects to remote host via SSH on port 22 in quiet mode, runs docker ps to list all containers filtered by name, and suppresses error output.
Example 2:
gh copilot explain "sed -i 's|image: .*|image: new-image:tag|g' docker-compose.yml"
Copilot CLI explained:
In-place edit of docker-compose.yml, replacing the entire image line with new image tag using pipe delimiters.
Impact: Instead of googling syntax, I got instant, context-aware explanations right in the terminal.
Overall Development Impact
| Area | Without Copilot | With Copilot | Time Saved |
|---|---|---|---|
| AI Email Categorization | 6 hours | 2 hours | 4 hours |
| SLA Calculation Logic | 8 hours | 2 hours | 6 hours |
| Google Sheets Integration | 5 hours | 2 hours | 3 hours |
| Docker & Deployment | 6 hours | 1 hour | 5 hours |
| Bash Scripts & Automation | 4 hours | 1 hour | 3 hours |
| TOTAL | 29 hours | 8 hours | 🎯 21 hours (72% faster) |
Key Takeaways
- GitHub Copilot CLI for Commands: `gh copilot suggest` became my shell command generator, eliminating constant documentation lookups and Stack Overflow searches
- GitHub Copilot IDE for Logic: Perfect for complex business logic like SLA calculations, AI integration, and API interactions
- Learning Accelerator: Taught me RFC 5322 email headers, Docker best practices, A1 notation, and production deployment patterns
- Context-Aware Suggestions: Understood my business context (Indonesian language, SLA tracking, email threading) and suggested appropriate solutions
- Production Quality: Suggested error handling, fallback strategies, resource limits, and rollback mechanisms I wouldn't have thought of
Would I build this without Copilot? Probably not. The combination of email protocol handling, AI integration, SLA calculations, and VPS deployment would have felt overwhelming. Copilot made it approachable by breaking down complexity into manageable, well-structured code.
Most Valuable Copilot CLI Commands:
- `gh copilot suggest` - Generated 90% of my shell scripts
- `gh copilot explain` - Explained complex commands instantly, saving me from context-switching to browser documentation
Most Surprising Moment: When Copilot suggested using RFC 5322 email headers for thread detection - I had no idea these headers existed, but they solved my biggest problem perfectly.
Project Stats:
- Lines of Code: ~3,000 (JavaScript + shell scripts)
- Docker: Fully containerized
- Status: Working Prototype (Non-Production)
- Development Time: 8 hours (vs 29 hours estimated without Copilot)
- Key Dependencies: googleapis, node-cron, imap, axios, huggingface
Technologies: Node.js, JavaScript, Docker, Shell Scripting, Google APIs, IMAP, OAuth2, GitHub Actions, VPS Deployment, AI/ML (Hugging Face)
