Bhupendra Lute

Static Docs are Dead: How I Built a Global Documentation Engine with Gemini AI and Lingo.dev

Documentation is the silent killer of productivity. I’ve been there: you push a groundbreaking feature, but the README still points to the v1.0 architecture. Or worse, you have a global user base, but your documentation is a monolingual island.

That’s why I built PolyDocs during the Lingo.dev Hackathon #2.

PolyDocs is an automated, AI-driven documentation synchronization platform. It doesn't just store files; it listens to your repository, analyzes your code using LLMs, localizes it into multiple languages, and delivers it via Pull Requests—all without you touching a single Markdown file.

In this deep dive, I’ll take you through the architecture I designed, the technical pitfalls I hit, and the specific logic that makes PolyDocs a reality.


🏗️ The Architecture: A Symphony of Services

PolyDocs isn't just a script; it’s an ecosystem. To handle the complexity of Git operations, AI generation, and real-time dashboarding, I designed a multi-layered architecture from the ground up.

The Macro View

I built the system as a TypeScript Monorepo, ensuring type safety from the API routes down to the shared dashboard types.
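To illustrate what "shared dashboard types" means in practice, a shared workspace package might look like the sketch below. The package path, type names, and fields are hypothetical, not the actual PolyDocs source:

```typescript
// packages/shared/src/index.ts — hypothetical shared-types package,
// consumed by both the API routes and the dashboard.
export type BuildStatus = 'queued' | 'running' | 'succeeded' | 'failed';

export interface BuildLog {
  repo: string;          // "owner/name" of the repository
  status: BuildStatus;   // current stage of the documentation build
  locales: string[];     // target languages, e.g. ['es', 'fr', 'ja']
  startedAt: string;     // ISO-8601 timestamp
}
```

Because both sides import the same types, a backend field rename becomes a frontend compile error instead of a runtime bug.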


🧠 Deep Dive: How the Features Work

1. The Intelligent Webhook Ingestor

I didn't want a simple endpoint. I needed a defensive one. I implemented a custom parser to keep the rawBody for cryptographic signature verification, ensuring requests actually come from GitHub.

// backend/src/index.ts - Custom parser for security
app.use(
  express.json({
    verify: (req: any, res, buf) => {
      req.rawBody = buf; // Preserve the raw body for webhook signature verification
    },
  })
);
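With the raw body preserved, the signature check itself can be sketched as below. This is a minimal standalone version of GitHub's `X-Hub-Signature-256` verification, not necessarily the exact code PolyDocs uses (Octokit's webhook helpers can do this for you):

```typescript
import crypto from 'node:crypto';

// Verify GitHub's X-Hub-Signature-256 header against the raw request body.
export function verifyGitHubSignature(
  rawBody: Buffer,
  signatureHeader: string, // e.g. "sha256=ab12…"
  secret: string,
): boolean {
  const expected =
    'sha256=' + crypto.createHmac('sha256', secret).update(rawBody).digest('hex');
  const a = Buffer.from(expected);
  const b = Buffer.from(signatureHeader);
  // timingSafeEqual throws on length mismatch, so guard the lengths first.
  return a.length === b.length && crypto.timingSafeEqual(a, b);
}
```

The constant-time comparison matters: a plain `===` on the hex strings would leak timing information an attacker could exploit to forge signatures byte by byte.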

2. The Gemini + Lingo.dev Pipeline

This is the heart of PolyDocs. I didn't want to just "translate." I wanted to re-contextualize.

The Workflow:

  1. Context Aggregation: I pull the top 10 most relevant code files to feed the LLM.
  2. AI Synthesis: Gemini 2.0 Flash generates a technical README based on real code intent.
  3. Native Localization: Lingo.dev transforms that README into Spanish, French, and Japanese.

Workflow
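The three steps above can be sketched as a small orchestration function. The `generate` and `localize` callbacks stand in for the Gemini and Lingo.dev calls; the function names and locale codes here are illustrative assumptions, not the actual PolyDocs API surface:

```typescript
// Hypothetical pipeline orchestration — callback names are illustrative.
type LocaleDocs = Record<string, string>;

async function buildDocs(
  contextFiles: string[],                                     // step 1: aggregated code context
  generate: (context: string) => Promise<string>,             // step 2: e.g. a Gemini call
  localize: (md: string, locale: string) => Promise<string>,  // step 3: e.g. Lingo.dev
  locales: string[] = ['es', 'fr', 'ja'],
): Promise<LocaleDocs> {
  const context = contextFiles.join('\n\n');
  const english = await generate(context);
  const out: LocaleDocs = { en: english };
  for (const locale of locales) {
    out[locale] = await localize(english, locale);
  }
  return out;
}
```

Keeping generation and localization as injected callbacks makes the pipeline trivial to unit-test with stubs before wiring in the real LLM clients.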

I wrote the compiler logic to ensure it stays within LLM token limits while maintaining high accuracy:

// backend/src/services/compiler.ts - Selective Context
const codeFiles = treeData.tree
  .filter((file: any) => {
    return file.path.endsWith('.ts') || file.path.endsWith('.js');
  })
  .slice(0, 10); // Cap the context window at the first 10 matching files
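To make the token limit concrete, here is one way to budget file contents against a rough characters-per-token heuristic. The ~4 chars/token ratio and the function name are my assumptions for illustration, not the PolyDocs source:

```typescript
// Rough token budgeting: ~4 characters per token is a common heuristic
// for English text and code; adjust for your model's tokenizer.
function fitToTokenBudget(
  files: { path: string; content: string }[],
  maxTokens = 30_000,
): string {
  const maxChars = maxTokens * 4;
  let used = 0;
  const parts: string[] = [];
  for (const f of files) {
    const chunk = `// ${f.path}\n${f.content}\n`;
    if (used + chunk.length > maxChars) break; // stop before exceeding the budget
    parts.push(chunk);
    used += chunk.length;
  }
  return parts.join('\n');
}
```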

🛠️ Technical Challenges & Solutions

Building this project solo meant I had to be my own architect, dev, and ops team. Here are the biggest hurdles I overcame:

Challenge 1: The Monorepo Docker Trap 📦

In a monorepo, the backend and frontend share a root package-lock.json. Initially, my Docker builds failed because they couldn't see the root dependencies.

My Solution: I designed root-context builds using multi-stage Dockerfiles. I selectively installed workspace dependencies, which kept the final images lean but the build context complete.

# backend/Dockerfile - Workspace Aware
COPY package*.json ./
COPY backend/package*.json ./backend/
RUN npm install -w @polydocs/backend

Challenge 2: The Cross-Platform Binary War ⚔️

During development on Windows, I hit a massive roadblock: `Error: Cannot find module @rollup/rollup-linux-x64-musl`. Because my lockfile was generated on Windows, the Docker container (running Alpine Linux) couldn't resolve the native binary it needed.

The Creative Fix:
I modified the Dockerfile to explicitly force-install the Linux-specific binary, effectively overriding the lockfile's platform restriction for the production environment.

# Forcing the platform-specific dependency during build
RUN npm install -w @polydocs/frontend && \
    npm install @rollup/rollup-linux-x64-musl

🎨 Design Philosophy: A Premium Feel

I wanted PolyDocs to feel like a high-end SaaS product. I used GSAP (GreenSock) for micro-interactions to make the dashboard feel alive.

I implemented a "staggered" load for build logs and repository cards to create a premium flow:

// frontend/src/pages/Dashboard.tsx
useEffect(() => {
  if (!loading) {
    gsap.from('.stagger-item', {
      y: 20,
      opacity: 0,
      stagger: 0.05,
      ease: 'power2.out',
    });
  }
}, [loading]);

🚀 Conclusion

Documentation should never be a chore. It should be a byproduct of great engineering. By combining Gemini AI with the Lingo.dev localization engine, I've created a system that removes the manual overhead of the documentation lifecycle.


GitHub Repository: PolyDocs
