Nithin Bharadwaj

**How to Build an Effective Monorepo: Complete Setup Guide for Modern Development Teams**


A single repository holds all our code. This idea, a monorepo, is surprisingly powerful. It means our website's frontend, our backend API, our shared design components, and our utility libraries all live under one roof. No more juggling a dozen separate repositories, wondering which version of a shared library another project uses. Everything is here, together.

The initial challenge is the tooling. Running npm install at the root often leads to a massive node_modules and confusion about how packages find each other. This is where workspace configuration comes in.

We define our structure in the root package.json. This file tells our package manager, "These folders are individual packages." When we run an install, it links them together internally.

```jsonc
// package.json at the very top of our monorepo
{
  "name": "our-mono-repo",
  "private": true,
  "workspaces": ["packages/*", "apps/*"],
  "scripts": {
    "build": "turbo run build",
    "dev": "turbo run dev --parallel"
  }
}
```

Inside packages/ui-button, the package.json can now directly reference its sibling.

```json
{
  "name": "@our-company/ui-button",
  "version": "1.0.0",
  "dependencies": {
    "@our-company/design-tokens": "workspace:*",
    "react": "^18.0.0"
  }
}
```

That `workspace:*` is key. It says, "Use the local version from this monorepo, not something from an external registry." (The `workspace:` protocol is supported by pnpm and Yarn; npm workspaces achieve the same linking with ordinary version ranges.) The package manager creates a symlink, making it feel like a published package even though it's just in the next folder. This immediate feedback is vital for development.

Dependencies can become messy quickly. We might find ten different minor versions of the same library scattered across packages. To handle this, we can hoist common dependencies to the root. Many modern tools do this automatically. We can also enforce consistency.

In the root package.json, we can force a specific version for a sub-dependency everywhere. Yarn calls this field `resolutions`; npm's equivalent is `overrides`, and pnpm uses `pnpm.overrides`.

```json
{
  "resolutions": {
    "lodash": "4.17.21"
  }
}
```

This is a blunt but sometimes necessary tool. A better approach is using a tool such as syncpack that can analyze and unify versions across the entire project graph, suggesting fixes before they become problems.
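To see the idea behind such a check, here is a minimal sketch: a pure function that takes the parsed package.json objects of all workspaces and reports any dependency declared with different version ranges in different packages. (Dedicated tools like syncpack do this, and much more, in real projects.)

```javascript
// Report dependencies whose version ranges differ across workspace packages.
// `packages` is an array of parsed package.json objects.
function findVersionMismatches(packages) {
  const seen = {}; // dependency name -> Set of declared version ranges

  for (const pkg of packages) {
    const deps = { ...(pkg.dependencies || {}), ...(pkg.devDependencies || {}) };
    for (const [name, range] of Object.entries(deps)) {
      (seen[name] = seen[name] || new Set()).add(range);
    }
  }

  // Keep only dependencies declared with more than one distinct range
  return Object.fromEntries(
    Object.entries(seen)
      .filter(([, ranges]) => ranges.size > 1)
      .map(([name, ranges]) => [name, [...ranges]])
  );
}

module.exports = { findVersionMismatches };
```

Running this over the output of the workspace-discovery step flags drift early, before `resolutions` or `overrides` become necessary.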

Once the structure is set, building everything becomes the next hurdle. We don't want to rebuild our entire UI library just because we fixed a typo in a documentation app. This is where build optimization and caching change the game.

Tools like Turborepo use a concept called a pipeline (newer versions call it `tasks`). We define the relationships between tasks. They then build a graph of our monorepo and execute tasks in the correct order, in parallel, and crucially, they skip anything already built.

A configuration file defines these relationships.

```jsonc
// turbo.json
{
  "pipeline": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": [".next/**", "!.next/cache/**"]
    },
    "test": {
      "dependsOn": ["build"],
      "outputs": []
    },
    "lint": {
      "outputs": []
    },
    "dev": {
      "cache": false
    }
  }
}
```

Here, "dependsOn": ["^build"] means "before you build this package, build all its dependencies." If ui-button depends on design-tokens, Turbo builds design-tokens first. The caching is content-aware. If nothing in design-tokens has changed since the last build, Turbo restores the entire dist folder from cache in milliseconds.

We can implement a simpler version of this idea ourselves to understand the mechanism. It involves tracking file changes and package dependencies.

```javascript
// A conceptual script to find what needs building
const fs = require('fs').promises;
const path = require('path');
const crypto = require('crypto');

async function fileExists(p) {
  return fs.access(p).then(() => true, () => false);
}

async function getPackageGraph(rootDir) {
  const graph = {};
  // Expand 'packages/*' and 'apps/*' by listing each parent directory
  const workspaceDirs = ['packages', 'apps'];

  for (const dir of workspaceDirs) {
    const parent = path.join(rootDir, dir);
    if (!(await fileExists(parent))) continue;

    for (const entry of await fs.readdir(parent)) {
      const fullPath = path.join(parent, entry);
      const pkgJsonPath = path.join(fullPath, 'package.json');

      if (await fileExists(pkgJsonPath)) {
        const pkg = JSON.parse(await fs.readFile(pkgJsonPath, 'utf8'));
        graph[pkg.name] = {
          dir: fullPath,
          deps: Object.keys(pkg.dependencies || {})
            .concat(Object.keys(pkg.devDependencies || {}))
            .filter(d => d.startsWith('@our-company/')),
          hash: await computeHash(fullPath) // Hash of source files
        };
      }
    }
  }
  return graph;
}

// Recursively collect source files, skipping build output and dependencies
async function getAllSourceFiles(dir) {
  const files = [];
  for (const entry of await fs.readdir(dir, { withFileTypes: true })) {
    if (entry.name === 'node_modules' || entry.name === 'dist') continue;
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) files.push(...(await getAllSourceFiles(full)));
    else files.push(full);
  }
  return files;
}

async function computeHash(pkgDir) {
  const files = await getAllSourceFiles(pkgDir);
  const hasher = crypto.createHash('md5');
  for (const file of files.sort()) {
    hasher.update(await fs.readFile(file));
  }
  return hasher.digest('hex');
}
```

By comparing the current hash of a package's source files to a stored one, we know if it changed. If the hash is the same, we skip the build. This is the core of incremental builds.
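The comparison step itself can be sketched as a pure function over that graph: given the hashes stored from the last run and the freshly computed ones, return every package that changed, plus everything that (transitively) depends on a changed package. The graph shape follows the conceptual script above.

```javascript
// Decide what to rebuild, given stored hashes from the previous run.
// graph: { name: { deps: [...], hash: '...' } }, storedHashes: { name: 'oldhash' }
function getStalePackages(graph, storedHashes) {
  const stale = new Set(
    Object.keys(graph).filter(name => graph[name].hash !== storedHashes[name])
  );

  // Propagate: any package depending on a stale package is stale too
  let grew = true;
  while (grew) {
    grew = false;
    for (const [name, info] of Object.entries(graph)) {
      if (!stale.has(name) && info.deps.some(d => stale.has(d))) {
        stale.add(name);
        grew = true;
      }
    }
  }
  return [...stale];
}

module.exports = { getStalePackages };
```

Everything outside the returned set can be restored from cache instead of rebuilt, which is exactly the behavior Turborepo provides out of the box.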

Coordinating versions across dozens of packages is complex. We can't just bump everything manually. We need a system. This is where changesets help. A developer makes a change and runs a command to describe it.

```bash
npx changeset
```

This prompts: "What kind of change is this? (patch, minor, major)". It then asks for a description. This generates a small markdown file in a .changeset folder.

```markdown
---
"@our-company/ui-button": patch
"@our-company/docs-app": minor
---

Updated the Button component's hover state. The docs app now includes the new example.
```

Later, when we're ready to release, we run `npx changeset version`. This command reads all these little files, bumps the versions in the relevant package.json files according to semantic versioning rules, updates inter-dependent package.json files (so docs-app now depends on ui-button@1.0.1), and generates a consolidated changelog. Finally, `npx changeset publish` publishes all updated packages to npm.
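In CI, this flow is commonly automated with the official changesets GitHub Action, which opens a "Version Packages" PR and publishes when it merges. A minimal sketch, assuming a standard setup (workflow name and the `NPM_TOKEN` secret are placeholders for your own configuration):

```yaml
# .github/workflows/release.yml — minimal sketch
name: Release
on:
  push:
    branches: [main]
jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - uses: changesets/action@v1
        with:
          publish: npx changeset publish
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
```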

Sharing code effectively is the monorepo's main promise. Beyond workspace:* links, TypeScript helps immensely. We can use project references to ensure our whole codebase is type-safe together.

In the root tsconfig.json, we set up references to all our packages.

```json
{
  "files": [],
  "references": [
    { "path": "./packages/design-tokens" },
    { "path": "./packages/ui-button" },
    { "path": "./apps/web-app" }
  ]
}
```

The empty `files` array means the root project compiles nothing itself; it only orchestrates the referenced packages.

Inside each package's tsconfig.json, we set composite: true and declaration: true.

```json
{
  "compilerOptions": {
    "composite": true,
    "declaration": true,
    "outDir": "./dist",
    "rootDir": "./src"
  },
  "references": [{ "path": "../design-tokens" }]
}
```

Now, when we run `tsc --build` from the root, TypeScript understands the dependency graph. It builds design-tokens first, then ui-button (which relies on the types from design-tokens), then web-app. It's fast and guarantees type consistency across package boundaries.

Testing needs a strategy too. We want to run tests for packages affected by a change. We can use the same dependency graph from our build tool. We also create shared test utilities in a dedicated package, like @our-company/test-utils, to avoid copying the same mocking logic everywhere.
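As a tiny illustration of what such a shared package might export (the names here are hypothetical, not from any particular library), a common pattern is a fetch stub factory that every package's tests can reuse instead of redefining:

```javascript
// packages/test-utils/src/index.js — hypothetical shared helper.
// Returns a fetch-like function that resolves canned JSON for known URLs.
function createMockFetch(routes) {
  return async (url) => {
    if (!(url in routes)) {
      throw new Error(`Unmocked URL: ${url}`);
    }
    return { ok: true, status: 200, json: async () => routes[url] };
  };
}

module.exports = { createMockFetch };
```

Any package can then depend on @our-company/test-utils via `workspace:*` and import the helper in its test suite.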

A simple test runner script can illustrate the principle of running tests based on changes.

```javascript
// scripts/test-changed.js
const { execSync } = require('child_process');
const path = require('path');
// Assumes the graph builder from the earlier conceptual script is exported as a module
const { getPackageGraph } = require('./package-graph');

// Map files changed since the base ref to the workspace packages containing them
function getChangedPackages(graph, baseRef) {
  const changedFiles = execSync(`git diff --name-only ${baseRef}`, { encoding: 'utf8' })
    .split('\n')
    .filter(Boolean);
  return Object.keys(graph).filter(name => {
    const relDir = path.relative(process.cwd(), graph[name].dir);
    return changedFiles.some(f => f.startsWith(relDir + path.sep));
  });
}

// Collect every package that directly or transitively depends on pkgName
function findDependents(graph, pkgName) {
  const dependents = new Set();
  const visit = target => {
    for (const [name, info] of Object.entries(graph)) {
      if (info.deps.includes(target) && !dependents.has(name)) {
        dependents.add(name);
        visit(name); // dependents of a dependent are also affected
      }
    }
  };
  visit(pkgName);
  return dependents;
}

async function runTests() {
  // Get packages changed since the main branch
  const graph = await getPackageGraph(process.cwd());
  const changedPackages = getChangedPackages(graph, 'origin/main');

  if (changedPackages.length === 0) {
    console.log('No package changes detected.');
    return;
  }

  // Test the changed packages plus everything that depends on them
  const testTargets = new Set(changedPackages);
  changedPackages.forEach(pkgName => {
    findDependents(graph, pkgName).forEach(dep => testTargets.add(dep));
  });

  console.log(`Running tests for: ${[...testTargets].join(', ')}`);

  // Use a tool like Turborepo, or run npm scripts directly
  for (const pkg of testTargets) {
    try {
      execSync(`npm run test --workspace=${pkg}`, { stdio: 'inherit' });
    } catch {
      console.error(`Tests failed in ${pkg}`);
      process.exit(1);
    }
  }
}

runTests();
```

Documentation often becomes fragmented. In a monorepo, we can generate it from the source. For TypeScript projects, a tool like TypeDoc can be configured to parse all packages and output a single, inter-linked API website. For React components, a single Storybook instance can import and display stories from every UI package in the monorepo, creating a unified design system catalog.
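As one concrete option, TypeDoc's "packages" entry point strategy stitches per-package documentation into a single site. A minimal sketch of a root typedoc.json (the paths are assumptions based on the layout above):

```json
{
  "entryPointStrategy": "packages",
  "entryPoints": ["packages/*"],
  "out": "docs/api"
}
```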

The configuration for this is often at the root. A root .storybook/main.js might look like this:

```javascript
// .storybook/main.js
const path = require('path');

module.exports = {
  stories: [
    // Load stories from any package
    '../packages/**/*.stories.@(js|jsx|ts|tsx|mdx)',
    '../apps/**/*.stories.@(js|jsx|ts|tsx|mdx)',
  ],
  addons: ['@storybook/addon-essentials'],
  framework: '@storybook/react-webpack5',
  // Important: tell webpack to resolve our internal packages
  webpackFinal: async (config) => {
    config.resolve = {
      ...config.resolve,
      // This ensures imports like '@our-company/ui-button' resolve correctly
      modules: [...(config.resolve.modules || []), path.resolve(__dirname, '../node_modules')],
    };
    return config;
  },
};
```

Finally, the development experience itself can be streamlined. Running npm run dev at the root should ideally start all the necessary development servers. With the right tooling, this command can launch the API server, the frontend app, and the Storybook docs, all watching for changes in their own and their dependent packages. This live, interconnected environment is where the monorepo truly shines, turning a collection of packages into a single, cohesive development workspace.

The journey from a scattered collection of repos to a streamlined monorepo is about choosing the right techniques for your team's size and needs. It starts with workspaces and ends with a fully automated system for building, testing, versioning, and documenting. The initial setup requires thought, but the payoff is a codebase that is simpler to manage, faster to develop in, and more consistent across all its parts.
