Maneesh Thakur

RepoReady — Instant AI-powered onboarding for any GitHub, GitLab, or Bitbucket repo

This is a submission for the DEV Weekend Challenge: Community

The Community

Every developer has felt it: you click a link to a new repository, the file tree opens, and you just... stare. Where do you start? What even is this thing?

The open source contributor community has a deeply broken onboarding experience. READMEs are outdated within months of being written. Architecture docs don't exist. Setup instructions assume you already know the stack. First-time contributors give up before they ever submit a PR.

I've been there. You've been there. Every developer you know has been there.

This project is for that community — for every dev who has ever opened a codebase and immediately felt lost.

What I Built

RepoReady — paste any public (or private) repo URL → get a structured AI-generated onboarding guide in seconds.

No installation. No backend. No data stored. Just open the page and go.

The guide covers five sections automatically extracted from the repo:

  • 💡 What the project does — plain English, no jargon
  • 🧩 Tech stack — every language, framework, and dependency explained
  • 🗂️ Folder structure — what each directory is actually for
  • 🚀 How to get started — setup steps pulled straight from the code
  • 🏁 Where to contribute — suggested entry points for first-timers

After analysis, you can:

  • 📄 Export as Markdown — download a clean .md file, commit it alongside your codebase
  • 🔗 Share — copy a URL pre-filled with the repo to send teammates
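The share feature can be sketched in a few lines — note the `repo` query-parameter name and the `buildShareLink` helper are hypothetical illustrations, not necessarily what app.js actually uses:

```javascript
// Hypothetical sketch of the "Share" action: encode the analyzed repo URL
// into a query parameter so the page can pre-fill it on load.
// The parameter name "repo" is an assumption for illustration.
function buildShareLink(pageUrl, repoUrl) {
  const url = new URL(pageUrl);
  url.searchParams.set("repo", repoUrl); // percent-encodes the repo URL
  return url.toString();
}
```

On page load, the inverse is a `new URL(location.href).searchParams.get("repo")` check to pre-fill the input.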

Demo

https://maneesh-relanto.github.io/DevTo-CommunityChallenge-RepoReady/


Code

https://github.com/Maneesh-Relanto/DevTo-CommunityChallenge-RepoReady

How I Built It

Architecture: zero-backend, client-side only

The entire app is a static HTML/CSS/JS project. No server, no API keys stored server-side, no database. Everything runs in the browser.

User Input (Repo URL + AI API Key)
        │
        ▼
┌──────────────────────────────┐
│  Detect platform from URL    │
│  github.com / gitlab.com /   │
│  bitbucket.org               │
└──────────────────────────────┘
        │
        ▼
┌──────────────────────────────┐
│  Platform API call           │
│  • Repo metadata             │
│  • README (first 4000 chars) │
│  • File tree (up to 120)     │
│  • Package manifest          │
│    (package.json / Cargo.toml│
│     / pyproject.toml / etc.) │
└──────────────────────────────┘
        │
        ▼
┌──────────────────────────────┐
│  Structured prompt → AI      │
│  Returns JSON with 5 keys    │
└──────────────────────────────┘
        │
        ▼
┌──────────────────────────────┐
│  Render onboarding guide     │
│  + Markdown export           │
│  + Shareable link            │
└──────────────────────────────┘
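The first box in the diagram — platform detection — can be sketched like this (the function name is illustrative, not necessarily the actual export of js/git.js):

```javascript
// Detect which Git platform a repo URL belongs to, based on its hostname.
// Unsupported hosts fail fast with a clear error.
function detectPlatform(repoUrl) {
  const { hostname } = new URL(repoUrl);
  if (hostname.endsWith("github.com")) return "github";
  if (hostname.endsWith("gitlab.com")) return "gitlab";
  if (hostname.endsWith("bitbucket.org")) return "bitbucket";
  throw new Error(`Unsupported host: ${hostname}`);
}
```

Using `new URL()` instead of string matching means query strings, paths, and `www.` prefixes can't confuse the check.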

File structure

repoready/
├── index.html      # HTML shell + layout
├── package.json    # ESM declaration + npm test script
├── css/
│   └── styles.css  # Design tokens, animations
├── js/
│   ├── app.js      # Main entry — wires everything
│   ├── ai.js       # All 7 AI provider adapters
│   ├── git.js      # GitHub, GitLab, Bitbucket fetchers
│   ├── ui.js       # DOM rendering, export, history
│   └── config.js   # Centralised constants
├── tests/
│   ├── git.test.js     # 17 unit tests
│   ├── ai.test.js      # 16 unit tests
│   └── history.test.js # 12 unit tests
└── README.md

Multi-provider AI

Seven providers, switchable at runtime — user brings their own key (Ollama is fully local, no key needed):

  • Claude (Anthropic): claude-sonnet-4-6, claude-opus-4-6, claude-haiku-4-5
  • OpenAI: gpt-4o, gpt-4o-mini, gpt-4-turbo
  • Google Gemini: gemini-2.0-flash, gemini-2.0-flash-lite, gemini-1.5-pro
  • Mistral AI: mistral-large-latest, mistral-small-latest, codestral-latest
  • xAI Grok: grok-3, grok-3-fast, grok-2-1212
  • Groq 🆓: llama-3.3-70b-versatile, llama-3.1-8b-instant, mixtral-8x7b
  • Ollama 🏠: llama3.2, mistral, codellama, phi4 (runs locally)
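A runtime-switchable provider setup can be as simple as a registry keyed by provider name. This is a minimal sketch, not the actual ai.js structure; the three endpoints shown are the providers' documented chat endpoints as of writing, but treat them as assumptions:

```javascript
// Illustrative provider registry — real ai.js adapters will differ.
const PROVIDERS = {
  openai: {
    endpoint: "https://api.openai.com/v1/chat/completions",
    defaultModel: "gpt-4o-mini",
  },
  groq: {
    // Groq exposes an OpenAI-compatible API surface.
    endpoint: "https://api.groq.com/openai/v1/chat/completions",
    defaultModel: "llama-3.3-70b-versatile",
  },
  ollama: {
    endpoint: "http://localhost:11434/api/chat", // local, no key needed
    defaultModel: "llama3.2",
  },
};

// Resolve a provider name (and optional model override) to a concrete target.
function resolveProvider(name, model) {
  const p = PROVIDERS[name];
  if (!p) throw new Error(`Unknown provider: ${name}`);
  return { endpoint: p.endpoint, model: model || p.defaultModel };
}
```

Because Groq and Ollama both speak an OpenAI-compatible dialect, a registry like this keeps per-provider code to a minimum.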

Multi-platform Git support

GitHub uses the REST API (/repos/{owner}/{repo}/git/trees/HEAD?recursive=1) to get the full recursive file tree in a single call.
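That single call looks roughly like this — helper names are illustrative, not the actual exports of js/git.js:

```javascript
// Build the GitHub tree-endpoint URL quoted above.
function gitHubTreeUrl(owner, repo) {
  return `https://api.github.com/repos/${owner}/${repo}/git/trees/HEAD?recursive=1`;
}

// Keep only files (type "blob"), capped at 120 entries as described earlier.
function filePaths(tree, limit = 120) {
  return tree
    .filter((entry) => entry.type === "blob")
    .map((entry) => entry.path)
    .slice(0, limit);
}

// One fetch, one recursive tree (error handling trimmed for brevity).
async function fetchGitHubTree(owner, repo) {
  const res = await fetch(gitHubTreeUrl(owner, repo));
  if (!res.ok) throw new Error(`GitHub API returned ${res.status}`);
  return filePaths((await res.json()).tree);
}
```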

GitLab uses the projects API + a separate /languages endpoint to correctly identify the primary language (GitLab doesn't expose this in the main project object).
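Picking the primary language from that endpoint is a one-liner over the percentage map GitLab returns (shape like `{"Ruby": 66.7, "JavaScript": 22.1}`); the helper names here are a sketch:

```javascript
// Given GitLab's { language: percentage } map, return the dominant language.
function primaryLanguage(languages) {
  const entries = Object.entries(languages);
  if (entries.length === 0) return null;
  return entries.reduce((best, cur) => (cur[1] > best[1] ? cur : best))[0];
}

// GitLab's v4 API accepts a URL-encoded "namespace/project" path as the ID.
async function fetchGitLabPrimaryLanguage(projectPath) {
  const id = encodeURIComponent(projectPath); // e.g. "gitlab-org/gitlab"
  const res = await fetch(`https://gitlab.com/api/v4/projects/${id}/languages`);
  if (!res.ok) throw new Error(`GitLab API returned ${res.status}`);
  return primaryLanguage(await res.json());
}
```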

Bitbucket required extra work — its /src endpoint is not recursive, so I fetch the root directory and then fan out parallel requests for each first-level subdirectory to build a meaningful tree.

Prompt engineering

The prompt is structured to return strict JSON — no markdown fences, no free text — which makes parsing reliable across all seven AI providers. I specifically extract and inject the package manifest (e.g. package.json, Cargo.toml, pyproject.toml) into the prompt so the AI has concrete dependency information rather than guessing from filenames.
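A prompt builder in that style might look like the sketch below. The five JSON key names are hypothetical, chosen only to mirror the five guide sections; the actual prompt in the repo will differ:

```javascript
// Illustrative prompt builder: demand one raw JSON object with fixed keys,
// and inject the README excerpt, file tree, and manifest as concrete context.
function buildPrompt({ name, readme, tree, manifest }) {
  return [
    `Analyze the repository "${name}" and respond with a single raw JSON object`,
    `with exactly these keys: "overview", "techStack", "folderStructure",`,
    `"gettingStarted", "contributing". No markdown fences, no extra text.`,
    ``,
    `README (truncated):\n${readme.slice(0, 4000)}`,
    `File tree:\n${tree.slice(0, 120).join("\n")}`,
    manifest ? `Package manifest:\n${manifest}` : ``,
  ].join("\n");
}
```

Listing the exact keys and banning fences up front is what makes the downstream parser's job tractable.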

Challenges

ES modules over file://
The project uses native ES modules, which browsers block when opened directly from the filesystem (CORS restriction). This bit me during early testing — index.html opened from a double-click was completely silent. The fix is trivial (python -m http.server), but catching it and documenting it clearly matters for contributors.

Bitbucket's flat file tree
Unlike GitHub and GitLab, Bitbucket's /src endpoint returns only the immediate directory listing rather than a recursive tree. I worked around this by fanning out parallel requests to each root-level subdirectory — capped at 20 concurrent requests to stay polite — which gives a meaningful two-level tree in most cases.
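The fan-out reduces to a generic "map with a concurrency cap" helper plus one pass over the root listing. This is a sketch under assumptions — the helper names are mine, and I'm treating `HEAD` as an acceptable ref for Bitbucket's /src endpoint, which may need adjusting:

```javascript
// Run an async fn over items with at most `limit` requests in flight.
async function mapWithLimit(items, limit, fn) {
  const results = [];
  let next = 0;
  async function worker() {
    while (next < items.length) {
      const idx = next++; // single-threaded JS: no race on this counter
      results[idx] = await fn(items[idx]);
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, worker)
  );
  return results;
}

// Root listing, then one capped request per first-level directory.
async function fetchBitbucketTree(workspace, repo) {
  const base = `https://api.bitbucket.org/2.0/repositories/${workspace}/${repo}/src/HEAD`;
  const root = (await (await fetch(`${base}/`)).json()).values;
  const dirs = root.filter((e) => e.type === "commit_directory");
  const subtrees = await mapWithLimit(dirs, 20, async (d) =>
    (await (await fetch(`${base}/${d.path}/`)).json()).values
  );
  return root.concat(subtrees.flat());
}
```

The cap matters: a repo with 40 root directories would otherwise fire 40 simultaneous requests at Bitbucket's API.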

AI response consistency
Seven different AI providers all produce slightly different formatting, even when instructed to return raw JSON. The parser uses a greedy /{[\s\S]*}/ regex to grab everything from the first { to the last }, which handles models that wrap JSON in markdown code fences despite explicit instructions not to.

Keeping it zero-dependency
The temptation to reach for a markdown renderer, a UI library, or a bundler is real. I deliberately resisted — the whole project loads from a single <script type="module"> tag. No npm install, no build step, no node_modules. This constraint forced cleaner separation of concerns and makes the project trivially deployable to any static host.

Try It Yourself

  1. Open https://maneesh-relanto.github.io/DevTo-CommunityChallenge-RepoReady/
  2. Pick an AI provider, paste your API key
  3. Enter any repo URL — try one of these to start:
    • https://github.com/expressjs/express
    • https://github.com/django/django
    • https://github.com/rust-lang/rust
  4. Click Analyze →

Source: https://github.com/Maneesh-Relanto/repoready
