How we built DotShare Auth: a Next.js 16 OAuth portal that handles token exchange, automatic refresh, and edge-level rate limiting for a VS Code extension that publishes to LinkedIn, X, Facebook, and Reddit.
Table of Contents
- The Problem
- Architecture Overview
- The Auth Server
- Token Lifecycle in the Extension
- Edge Rate Limiting with Upstash Redis
- Lessons Learned
1. The Problem
DotShare is a VS Code extension that lets developers publish content directly to social platforms. To post on behalf of a user, it needs OAuth tokens from LinkedIn, X (Twitter), Facebook, and Reddit.
The naive approach? Ask the user to paste tokens manually. That's awful UX.
The real problems we had to solve:
- Each platform has a different OAuth flow (PKCE, Basic Auth, state params)
- App secrets (CLIENT_SECRET, APP_SECRET) can't live in the extension — it's client-side code anyone can inspect
- Tokens expire — X tokens last ~2 hours, Facebook tokens last ~60 days
- We needed zero manual re-authentication after the initial connect
Our solution: a dedicated Next.js 16 auth server that acts as a secure proxy between VS Code and the OAuth providers.
2. Architecture Overview
VS Code Extension
→ opens browser to https://dotshare-auth-server.vercel.app/auth/{platform}
→ user clicks "Authenticate" (one click, no credentials to enter)
→ redirected to platform OAuth page
→ platform redirects back to /auth/{platform}/callback
→ server exchanges code for token using .env secrets
→ browser redirects to vscode://freerave.dotshare/auth?platform=x&access_token=...&refresh_token=...&expires_in=7200
→ VS Code extension receives token automatically
→ TokenManager stores token + expiry in SecretStorage
→ before every API call: TokenManager checks expiry → refreshes if needed
Key design decisions:
- Auth server is stateless — no database, no user accounts
- Tokens are never stored server-side — passed directly via deep link
- All OAuth secrets live in server .env only
- The extension handles all token lifecycle (expiry tracking, refresh)
3. The Auth Server
Single Source of Truth
Instead of scattering platform config across multiple files, everything lives in one place:
// lib/platforms.ts
export interface PlatformConfig {
name: string;
icon: string;
description?: string;
scopes: string[];
authUrl: string;
envKey: string; // NEXT_PUBLIC_* variable name
titleGradientTo?: string; // gradient color for the title
}
export type PlatformKey = 'linkedin' | 'x' | 'facebook' | 'reddit';
export const PLATFORMS: Record<string, PlatformConfig> = {
linkedin: {
name: 'LinkedIn',
icon: 'in',
description: 'Professional network',
scopes: ['openid', 'profile', 'email', 'w_member_social'],
authUrl: 'https://www.linkedin.com/oauth/v2/authorization',
envKey: 'NEXT_PUBLIC_LINKEDIN_CLIENT_ID',
titleGradientTo: '#4da3ff',
},
x: {
name: 'X (Twitter)',
icon: '𝕏',
description: 'Public discourse',
scopes: ['tweet.read', 'tweet.write', 'users.read', 'offline.access'],
authUrl: 'https://twitter.com/i/oauth2/authorize',
envKey: 'NEXT_PUBLIC_X_CLIENT_ID',
},
facebook: {
name: 'Facebook',
icon: 'f',
description: 'Social network',
scopes: ['pages_manage_posts', 'pages_read_engagement', 'publish_to_groups'],
authUrl: 'https://www.facebook.com/v18.0/dialog/oauth',
envKey: 'NEXT_PUBLIC_FACEBOOK_APP_ID',
titleGradientTo: '#42a5f5',
},
reddit: {
name: 'Reddit',
icon: 'r/',
description: 'Discussion forums',
scopes: ['submit', 'read', 'identity'],
authUrl: 'https://www.reddit.com/api/v1/authorize',
envKey: 'NEXT_PUBLIC_REDDIT_CLIENT_ID',
titleGradientTo: '#ff7043',
},
};
Adding a new platform? One file. The home page, auth pages, and callback pages all read from PLATFORMS automatically.
Eliminating Duplicate UI Code
The original codebase had 4 identical auth pages and 4 identical callback pages — ~600 lines of copy-pasted React. We collapsed them into shared components:
// Before: 150 lines per platform
// facebook/page.tsx — full component with all the logic
// After: 1 line per platform
// facebook/page.tsx
'use client';
import { AuthPage } from '@/components/AuthPage';
export default function FacebookAuth() {
return <AuthPage platform="facebook" />;
}
The useOAuthInit hook handles all the OAuth initiation logic — PKCE for X, state for X and Reddit, duration=permanent for Reddit:
// hooks/useOAuthInit.ts
const PKCE_PLATFORMS: PlatformKey[] = ['x'];
const STATEFUL_PLATFORMS: PlatformKey[] = ['x', 'reddit'];
export function useOAuthInit(platform: PlatformKey) {
const [loading, setLoading] = useState(false);
const [error, setError] = useState('');
const handleAuth = async () => {
const config = PLATFORMS[platform];
// Note: Next.js only inlines NEXT_PUBLIC_* vars that are referenced statically,
// so a dynamic lookup like this relies on the value being available at runtime
const clientId = process.env[config.envKey as keyof typeof process.env];
if (!clientId) {
setError(`Missing ${config.envKey} in .env`);
return;
}
setLoading(true);
const authUrl = new URL(config.authUrl);
authUrl.searchParams.set('client_id', clientId);
authUrl.searchParams.set('redirect_uri', `${window.location.origin}/auth/${platform}/callback`);
authUrl.searchParams.set('response_type', 'code');
authUrl.searchParams.set('scope', config.scopes.join(' '));
if (STATEFUL_PLATFORMS.includes(platform)) {
const state = generateState();
sessionStorage.setItem(`${platform}_state`, state);
authUrl.searchParams.set('state', state);
}
if (PKCE_PLATFORMS.includes(platform)) {
const codeVerifier = generateCodeVerifier();
const codeChallenge = await generateCodeChallenge(codeVerifier);
sessionStorage.setItem('x_code_verifier', codeVerifier);
authUrl.searchParams.set('code_challenge', codeChallenge);
authUrl.searchParams.set('code_challenge_method', 'S256');
}
if (platform === 'reddit') {
authUrl.searchParams.set('duration', 'permanent');
}
window.location.href = authUrl.toString();
};
return { handleAuth, loading, error };
}
Token Refresh Endpoints
Not all platforms support token refresh the same way:
| Platform | Support | Endpoint |
|---|---|---|
| X | ✅ Standard refresh | POST /api/auth/x/refresh |
| Reddit | ✅ Standard refresh | POST /api/auth/reddit/refresh |
| Facebook | ⚠️ Token extension (60 days) | POST /api/auth/facebook/extend |
| LinkedIn | ❌ Enterprise only | — |
X refresh (public client — no secret needed, just client ID):
// app/api/auth/x/refresh/route.ts
export async function POST(req: NextRequest) {
const { refreshToken } = await req.json();
const clientId = process.env.X_CLIENT_ID;
const params = new URLSearchParams({
grant_type: 'refresh_token',
refresh_token: refreshToken,
client_id: clientId!,
});
const basicAuth = Buffer.from(`${clientId}:`).toString('base64');
const response = await fetch('https://api.twitter.com/2/oauth2/token', {
method: 'POST',
headers: {
'Content-Type': 'application/x-www-form-urlencoded',
'Authorization': `Basic ${basicAuth}`,
},
body: params.toString(),
});
const data = await response.json();
return NextResponse.json(data);
}
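On the extension side, calling this route is a plain POST. Here's a sketch; refreshXToken is a hypothetical helper (not DotShare's actual code), and the response shape assumes X's standard OAuth 2.0 token response:

```typescript
const AUTH_SERVER_URL = 'https://dotshare-auth-server.vercel.app';

interface RefreshResponse {
  access_token: string;
  refresh_token?: string; // X rotates refresh tokens on each use
  expires_in?: number;    // seconds until expiry (~7200 for X)
}

// Hypothetical helper: trade a refresh token for a fresh access token
async function refreshXToken(refreshToken: string): Promise<RefreshResponse> {
  const res = await fetch(`${AUTH_SERVER_URL}/api/auth/x/refresh`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ refreshToken }), // key matches the route's req.json() destructuring
  });
  if (!res.ok) throw new Error(`Refresh failed with status ${res.status}`);
  return res.json() as Promise<RefreshResponse>;
}
```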
Facebook token extension (short-lived → 60-day long-lived):
// app/api/auth/facebook/extend/route.ts
export async function POST(req: NextRequest) {
const { accessToken } = await req.json();
const params = new URLSearchParams({
grant_type: 'fb_exchange_token',
client_id: process.env.FACEBOOK_APP_ID!,
client_secret: process.env.FACEBOOK_APP_SECRET!,
fb_exchange_token: accessToken,
});
const response = await fetch(
`https://graph.facebook.com/v18.0/oauth/access_token?${params.toString()}`,
{ method: 'GET' }
);
const data = await response.json();
// data.expires_in ≈ 5184000 seconds = 60 days
return NextResponse.json(data);
}
Deep Links with Expiry Info
The callback pages now include expires_in and refresh_token in the deep link back to VS Code:
// hooks/useOAuthCallback.ts (inside the callback handler)
setTimeout(() => {
const params = new URLSearchParams({
platform,
access_token: data.access_token,
...(REFRESHABLE_PLATFORMS.includes(platform) && data.refresh_token
? { refresh_token: data.refresh_token }
: {}),
...(data.expires_in ? { expires_in: String(data.expires_in) } : {}),
});
window.location.href = `vscode://freerave.dotshare/auth?${params.toString()}`;
}, 1500);
The resulting deep link looks like:
vscode://freerave.dotshare/auth
?platform=x
&access_token=AAA...
&refresh_token=BBB...
&expires_in=7200
4. Token Lifecycle in the Extension
This is where the magic happens. The extension needs to ensure every API call uses a valid token — without ever asking the user to re-authenticate.
TokenManager
// src/services/TokenManager.ts
export const AUTH_SERVER_URL = 'https://dotshare-auth-server.vercel.app';
const REFRESH_BUFFER_MS = 5 * 60 * 1000; // Refresh 5 minutes before expiry
export class TokenManager {
private static _context: vscode.ExtensionContext;
static init(context: vscode.ExtensionContext): void {
this._context = context;
}
// Called from URI handler after OAuth callback
static async storeToken(
platform: RefreshablePlatform,
accessToken: string,
refreshToken?: string,
expiresIn?: number
): Promise<void> {
// store access + refresh token in SecretStorage
// store expires_at = Date.now() + expiresIn * 1000
}
// Called before every API call
static async getValidToken(platform: RefreshablePlatform): Promise<string> {
const expiring = await this.isExpiringSoon(platform);
if (expiring) {
try {
await this.refresh(platform); // auto-refresh silently
} catch (error) {
Logger.warn(`refresh failed, using existing token`, error);
}
}
return await this._context.secrets.get(tokenKey[platform]) || '';
}
// 429 handling via shared post() helper
private static async post<T>(url: string, data: unknown): Promise<{ data: T }> {
try {
return await axios.post<T>(url, data);
} catch (err) {
if (axios.isAxiosError(err) && err.response?.status === 429) {
const retryAfter = err.response.headers['retry-after'] ?? '60';
throw new Error(`Rate limited. Retry after ${retryAfter}s`);
}
throw err;
}
}
}
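The storeToken and isExpiringSoon internals are elided above, but the expiry math they imply is small enough to sketch as pure functions (an illustration of the logic around REFRESH_BUFFER_MS, not the actual implementation):

```typescript
const REFRESH_BUFFER_MS = 5 * 60 * 1000; // refresh 5 minutes before expiry

// expires_at as described in storeToken(): Date.now() + expiresIn * 1000
function computeExpiresAt(expiresInSeconds: number, now = Date.now()): number {
  return now + expiresInSeconds * 1000;
}

// True when the token is inside the refresh buffer (or already expired)
function isExpiringSoon(expiresAt: number | undefined, now = Date.now()): boolean {
  if (expiresAt === undefined) return false; // no expiry recorded → treat as long-lived
  return now >= expiresAt - REFRESH_BUFFER_MS;
}
```

With X's expires_in of 7200 seconds, getValidToken() would trigger a silent refresh once fewer than five minutes remain, instead of letting a call fail at the two-hour mark.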
URI Handler
The VS Code URI handler receives the deep link and stores everything:
// src/extension.ts — URI handler
const uriHandler = vscode.window.registerUriHandler({
async handleUri(uri: vscode.Uri) {
if (uri.path !== '/auth') return;
const params = new URLSearchParams(uri.query);
const platform = params.get('platform');
const accessToken = params.get('access_token');
const refreshToken = params.get('refresh_token');
const expiresIn = params.get('expires_in');
switch (platform) {
case 'linkedin':
await context.secrets.store('linkedinToken', accessToken!);
break;
case 'x':
await TokenManager.storeToken(
'x', accessToken!,
refreshToken ?? undefined,
expiresIn ? Number(expiresIn) : undefined
);
break;
case 'facebook':
await TokenManager.storeToken(
'facebook', accessToken!,
undefined,
expiresIn ? Number(expiresIn) : undefined
);
break;
case 'reddit':
await TokenManager.storeToken(
'reddit', accessToken!,
refreshToken ?? undefined,
expiresIn ? Number(expiresIn) : undefined
);
break;
}
vscode.window.showInformationMessage(`✓ ${platform} connected!`);
}
});
Using Tokens in Platform Posters
Before, platform files used whatever token was passed in. Now they call getValidToken() themselves:
// src/platforms/x.ts
export async function shareToX(
_accessToken: string, // ignored — TokenManager handles this
_accessSecret: string,
tweetData: TweetData
): Promise<string> {
// getValidToken() checks expiry and refreshes if needed — all silently
const accessToken = await TokenManager.getValidToken('x');
if (!accessToken) throw new Error('X: not authenticated');
// ... rest of posting logic
}
Scheduler Integration
Even scheduled posts (posted hours later) get fresh tokens:
// src/extension.ts — credentialsGetter for Scheduler
const credentialsGetter = async () => ({
linkedinToken: await context.secrets.get('linkedinToken') || '',
xAccessToken: await TokenManager.getValidToken('x'), // auto-refreshes
facebookToken: await TokenManager.getValidToken('facebook'), // auto-extends
redditAccessToken: await TokenManager.getValidToken('reddit'), // auto-refreshes
// ...
});
5. Edge Rate Limiting with Upstash Redis
Why In-Memory Rate Limiting Fails on Serverless
On Vercel, every request can hit a different serverless instance. An in-memory Map resets on every cold start — meaning your rate limiter is completely ineffective under any real load.
Request 1 → Instance A (Map: {ip: 1})
Request 2 → Instance B (Map: {ip: 1}) ← starts fresh, doesn't know about Request 1
Request 3 → Instance C (Map: {ip: 1}) ← same problem
The solution: a shared Redis store that all instances read from.
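To make the shared-store idea concrete, here is a simplified sliding-window counter written against an abstract store interface. It sketches the algorithm family behind Ratelimit.slidingWindow (not Upstash's actual code); swap MapStore for Redis and every instance sees the same counts:

```typescript
// Minimal counter store interface. In production this is Redis, shared by all instances.
interface CounterStore {
  incr(key: string): number; // atomic increment, returns the new count
  get(key: string): number;
}

// In-memory store: fine for a demo, useless on serverless (each instance gets its own Map)
class MapStore implements CounterStore {
  private counts = new Map<string, number>();
  incr(key: string): number {
    const next = (this.counts.get(key) ?? 0) + 1;
    this.counts.set(key, next);
    return next;
  }
  get(key: string): number {
    return this.counts.get(key) ?? 0;
  }
}

// Sliding window: weight the previous window's count by how much of it still overlaps now
function allow(store: CounterStore, id: string, limit: number, windowMs: number, now: number): boolean {
  const window = Math.floor(now / windowMs);
  const elapsed = (now % windowMs) / windowMs; // fraction of current window elapsed
  const estimate =
    store.get(`${id}:${window - 1}`) * (1 - elapsed) + store.get(`${id}:${window}`);
  if (estimate >= limit) return false;
  store.incr(`${id}:${window}`);
  return true;
}
```

The weighted estimate smooths the boundary between windows, so a burst at the edge of one minute can't double the effective limit.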
proxy.ts (Next.js 16)
Note: In Next.js 16, middleware.ts is deprecated and renamed to proxy.ts. The function export also changes from middleware to proxy.
// proxy.ts
import { NextRequest, NextResponse } from 'next/server';
import { Ratelimit } from '@upstash/ratelimit';
import { Redis } from '@upstash/redis';
import { ipAddress } from '@vercel/functions';
const redis = Redis.fromEnv();
const authInitLimiter = new Ratelimit({
redis,
limiter: Ratelimit.slidingWindow(5, '1 m'),
analytics: true,
prefix: 'rl:auth_init',
});
const tokenExchangeLimiter = new Ratelimit({
redis,
limiter: Ratelimit.slidingWindow(10, '1 m'),
analytics: true,
prefix: 'rl:token_exchange',
});
const tokenRefreshLimiter = new Ratelimit({
redis,
limiter: Ratelimit.slidingWindow(10, '1 m'),
analytics: true,
prefix: 'rl:token_refresh',
});
function getLimiter(pathname: string) {
if (/^\/api\/auth\/[^/]+(\/refresh|\/extend)$/.test(pathname)) {
return tokenRefreshLimiter;
}
if (/^\/api\/auth\/[^/]+$/.test(pathname)) {
return tokenExchangeLimiter;
}
if (/^\/auth\/[^/]+$/.test(pathname)) {
return authInitLimiter;
}
return null;
}
export async function proxy(request: NextRequest) {
const pathname = request.nextUrl.pathname;
const limiter = getLimiter(pathname);
if (!limiter) return NextResponse.next();
const ip =
ipAddress(request) ??
request.headers.get('x-forwarded-for')?.split(',')[0].trim() ??
'127.0.0.1';
const { success, limit, remaining, reset } = await limiter.limit(ip);
if (!success) {
// Browser hitting /auth/* → redirect gracefully instead of JSON error
if (pathname.startsWith('/auth/')) {
const url = request.nextUrl.clone();
url.pathname = '/';
url.searchParams.set('error', 'Too many requests. Please try again later.');
return NextResponse.redirect(url);
}
// API routes → JSON 429 with Retry-After
return NextResponse.json(
{ error: 'Too many requests. Please try again later.' },
{
status: 429,
headers: {
'X-RateLimit-Limit': String(limit),
'X-RateLimit-Remaining': String(remaining),
'X-RateLimit-Reset': String(reset),
'Retry-After': String(Math.ceil((reset - Date.now()) / 1000)),
},
}
);
}
const response = NextResponse.next();
response.headers.set('X-RateLimit-Limit', String(limit));
response.headers.set('X-RateLimit-Remaining', String(remaining));
response.headers.set('X-RateLimit-Reset', String(reset));
return response;
}
export const config = {
matcher: ['/auth/:path*', '/api/auth/:path*'],
};
Rate Limit Tiers
| Route | Limit | Window | Reason |
|---|---|---|---|
| /auth/* | 5 req | 1 min | Browser redirects only |
| /api/auth/* | 10 req | 1 min | Initial token exchange |
| /api/auth/*/refresh | 10 req | 1 min | Extension refresh calls |
The Extension Handles 429s Gracefully
// src/services/TokenManager.ts
private static async post<T>(url: string, data: unknown): Promise<{ data: T }> {
try {
return await axios.post<T>(url, data);
} catch (err) {
if (axios.isAxiosError(err) && err.response?.status === 429) {
const retryAfter = err.response.headers['retry-after'] ?? '60';
throw new Error(`Rate limited. Retry after ${retryAfter}s`);
}
throw err;
}
}
If a refresh is rate limited, getValidToken() catches the error, logs a warning, and returns the existing token. The user never sees a crash.
6. Lessons Learned
1. Stateless auth servers are beautiful
No database, no user sessions, no GDPR headaches. The server does one thing: exchange OAuth codes for tokens securely. Everything else is the client's problem.
2. In-memory rate limiting on serverless = placebo
Don't waste time on it. If you're on a serverless platform, you need a shared store. Upstash Redis is free for small projects and takes 10 minutes to set up.
3. Architecture over prompts
The biggest win wasn't any single feature — it was establishing lib/platforms.ts as the single source of truth early. Adding a new platform went from "edit 7 files" to "edit 1 file."
4. expires_in belongs in the deep link
The extension can't know when a token expires unless the auth server tells it. One extra query param saved us from building a polling mechanism.
5. Next.js 16 renames middleware to proxy
If you're upgrading: middleware.ts → proxy.ts, export function middleware → export function proxy. The codemod handles it: npx @next/codemod@canary middleware-to-proxy .
The Full Picture
dotshare-auth-server/
proxy.ts ← Edge rate limiting (Upstash Redis)
src/
app/
auth/[platform]/
page.tsx ← <AuthPage platform="x" /> (1 line)
callback/page.tsx ← <CallbackPage platform="x" /> (1 line)
api/auth/
x/route.ts ← token exchange
x/refresh/route.ts ← token refresh
facebook/extend/route.ts ← 60-day token extension
reddit/refresh/route.ts ← token refresh
components/
AuthPage.tsx ← shared auth UI
CallbackPage.tsx ← shared callback UI
hooks/
useOAuthInit.ts ← PKCE, state, redirect logic
useOAuthCallback.ts ← exchange, deep link logic
lib/
platforms.ts ← single source of truth
DotShare (VS Code Extension)/
src/
extension.ts ← URI handler → TokenManager.storeToken()
services/
TokenManager.ts ← store, getValidToken, refresh, clear
platforms/
x.ts ← getValidToken('x') before every post
reddit.ts ← getValidToken('reddit') before every post
Resources
- DotShare Auth Server
- DotShare VS Code Extension
- Upstash Ratelimit
- Next.js 16 Proxy (formerly Middleware)
- VS Code SecretStorage API
Built with Next.js 16, TypeScript, Upstash Redis, and VS Code Extension API.
Part of the DotShare project — a VS Code extension for publishing developer content across social platforms.