Wilson Xu

Building A Zero-Config Development Environment Scanner With Node.js

How to build a CLI tool that automatically detects every tool, runtime, and configuration in a developer's environment -- without any setup.

Every developer has been there: you clone a repo, run npm install, and hit a wall of cryptic errors. Wrong Node version. Missing Python binding. Outdated Git. The README says "requires Node 18+" but doesn't mention the implicit dependency on sharp, which needs libvips, which needs... you get the idea.

What if a single command could scan your entire development environment, detect every installed tool, check version compatibility, and tell you exactly what's missing -- all without any configuration file?

In this article, we'll build devscan, a zero-config CLI tool that introspects a developer's machine and a project's requirements simultaneously. We'll cover child process management, intelligent version parsing, heuristic project analysis, and building a terminal UI that developers actually want to use.

The Problem With Development Environment Setup

Modern development stacks are deep. A typical React project might depend on:

  • Node.js (specific version range)
  • npm/yarn/pnpm (with lockfile format implications)
  • Git (for hooks and submodules)
  • Platform tools (Xcode CLI on macOS, build-essential on Linux)
  • Optional accelerators (Rust for SWC, Python for node-gyp bindings)

Tools like nvm, Docker, and devcontainers solve parts of this, but they're opt-in. The developer who just wants to contribute a one-line fix shouldn't need to install Docker to discover they have the wrong Node version.

Our scanner takes a different approach: detect everything, assume nothing, configure nothing.

Architecture Overview

The scanner has three layers:

  1. Detector Layer — Finds installed tools by probing the filesystem and running version commands
  2. Analyzer Layer — Reads project files to determine what's actually required
  3. Reporter Layer — Compares detected vs. required and renders the results
┌─────────────────────────────────┐
│         Reporter Layer          │
│   (Terminal UI / JSON output)   │
├─────────────────────────────────┤
│         Analyzer Layer          │
│  (Project requirement parser)   │
├─────────────────────────────────┤
│         Detector Layer          │
│    (System tool enumeration)    │
└─────────────────────────────────┘

Let's build each layer.

Layer 1: The Detector

The detector needs to find tools without knowing where they are. We can't just check /usr/local/bin -- tools might be installed via Homebrew, Snap, Chocolatey, asdf, mise, or compiled from source into ~/.local/bin.

Safe Command Execution

First, we need a utility that runs a command and returns the output without throwing if the command doesn't exist:

// lib/exec.js
const { execFile } = require("child_process");
const { promisify } = require("util");

const execFileAsync = promisify(execFile);

async function safeExec(command, args = [], options = {}) {
  try {
    const { stdout, stderr } = await execFileAsync(command, args, {
      timeout: 5000,
      maxBuffer: 1024 * 1024,
      env: { ...process.env, LANG: "en_US.UTF-8" },
      ...options,
    });
    return { ok: true, stdout: stdout.trim(), stderr: stderr.trim() };
  } catch (err) {
    if (err.code === "ENOENT") {
      return { ok: false, error: "not_found" };
    }
    if (err.killed) {
      return { ok: false, error: "timeout" };
    }
    // Some tools write version info to stderr (looking at you, Java)
    if (err.stderr) {
      return { ok: true, stdout: err.stderr.trim(), stderr: "" };
    }
    return { ok: false, error: err.message };
  }
}

module.exports = { safeExec };

Notice the LANG environment variable override. Without it, some tools output localized text that breaks our version parsers. The 5-second timeout prevents hanging on tools that prompt for input (like ssh without arguments).

Version Extraction

Version strings come in wildly different formats. Node prints v20.11.0, Python prints Python 3.12.1, rustc prints rustc 1.75.0 (82e1608df 2023-12-21), and Java prints a multi-line paragraph. We need a universal parser:

// lib/version.js
const SEMVER_REGEX = /(\d+\.\d+(?:\.\d+)?(?:-[\w.]+)?)/;

function extractVersion(raw) {
  if (!raw) return null;

  // Try the first line first -- most tools put version there
  const firstLine = raw.split("\n")[0];
  const match = firstLine.match(SEMVER_REGEX);

  if (match) return match[1];

  // Fall back to scanning all lines (e.g., Java's verbose output)
  const fullMatch = raw.match(SEMVER_REGEX);
  return fullMatch ? fullMatch[1] : null;
}

function compareVersions(a, b) {
  const partsA = a.split(".").map(Number);
  const partsB = b.split(".").map(Number);

  for (let i = 0; i < Math.max(partsA.length, partsB.length); i++) {
    const numA = partsA[i] || 0;
    const numB = partsB[i] || 0;
    if (numA !== numB) return numA - numB;
  }
  return 0;
}

function satisfiesRange(version, range) {
  if (!range || !version) return null; // unknown

  // Handle common range formats: ">=18.0.0", "^20.0.0", "~18.17.0", "18.x"
  const cleaned = range.replace(/^[v=\s]+/, "");

  if (cleaned.startsWith(">=")) {
    const min = cleaned.slice(2);
    return compareVersions(version, min) >= 0;
  }
  if (cleaned.startsWith("^")) {
    const base = cleaned.slice(1);
    const baseParts = base.split(".").map(Number);
    const verParts = version.split(".").map(Number);
    // Major must match; real semver also pins the minor for ^0.x,
    // which this simplified check skips
    return (
      verParts[0] === baseParts[0] && compareVersions(version, base) >= 0
    );
  }
  if (cleaned.startsWith("~")) {
    const base = cleaned.slice(1);
    const baseParts = base.split(".").map(Number);
    const verParts = version.split(".").map(Number);
    // Tilde pins major and minor; only the patch may float upward
    return (
      verParts[0] === baseParts[0] &&
      verParts[1] === baseParts[1] &&
      compareVersions(version, base) >= 0
    );
  }
  if (cleaned.includes(".x")) {
    const prefix = cleaned.replace(/\.x.*/, "");
    // Require the trailing dot so "18.x" doesn't match "181.0.0"
    return version.startsWith(prefix + ".");
  }
  // Bare version: treat it as a minimum rather than an exact pin
  return compareVersions(version, cleaned) >= 0;
}

module.exports = { extractVersion, compareVersions, satisfiesRange };

Tool Definitions

Each tool we detect needs a definition: what command to run, how to parse the output, and what the tool is for. Instead of hardcoding these, we use a declarative registry:

// lib/tools.js
const TOOL_REGISTRY = [
  {
    id: "node",
    name: "Node.js",
    category: "runtime",
    detect: { command: "node", args: ["--version"] },
    homepage: "https://nodejs.org",
  },
  {
    id: "npm",
    name: "npm",
    category: "package-manager",
    detect: { command: "npm", args: ["--version"] },
  },
  {
    id: "yarn",
    name: "Yarn",
    category: "package-manager",
    detect: { command: "yarn", args: ["--version"] },
  },
  {
    id: "pnpm",
    name: "pnpm",
    category: "package-manager",
    detect: { command: "pnpm", args: ["--version"] },
  },
  {
    id: "bun",
    name: "Bun",
    category: "runtime",
    detect: { command: "bun", args: ["--version"] },
  },
  {
    id: "deno",
    name: "Deno",
    category: "runtime",
    detect: { command: "deno", args: ["--version"] },
    // Deno outputs multiple lines; we only want the first
    parseVersion: (raw) => {
      const match = raw.match(/deno\s+(\d+\.\d+\.\d+)/);
      return match ? match[1] : null;
    },
  },
  {
    id: "git",
    name: "Git",
    category: "vcs",
    detect: { command: "git", args: ["--version"] },
  },
  {
    id: "python3",
    name: "Python 3",
    category: "runtime",
    detect: { command: "python3", args: ["--version"] },
  },
  {
    id: "python",
    name: "Python",
    category: "runtime",
    detect: { command: "python", args: ["--version"] },
  },
  {
    id: "ruby",
    name: "Ruby",
    category: "runtime",
    detect: { command: "ruby", args: ["--version"] },
  },
  {
    id: "go",
    name: "Go",
    category: "runtime",
    detect: { command: "go", args: ["version"] },
  },
  {
    id: "rustc",
    name: "Rust",
    category: "runtime",
    detect: { command: "rustc", args: ["--version"] },
  },
  {
    id: "cargo",
    name: "Cargo",
    category: "package-manager",
    detect: { command: "cargo", args: ["--version"] },
  },
  {
    id: "docker",
    name: "Docker",
    category: "container",
    detect: { command: "docker", args: ["--version"] },
  },
  {
    id: "docker-compose",
    name: "Docker Compose",
    category: "container",
    detect: { command: "docker", args: ["compose", "version"] },
  },
  {
    id: "kubectl",
    name: "kubectl",
    category: "container",
    // --short was removed in kubectl 1.28; client output is short by default
    detect: { command: "kubectl", args: ["version", "--client"] },
  },
  {
    id: "terraform",
    name: "Terraform",
    category: "infra",
    detect: { command: "terraform", args: ["--version"] },
  },
  {
    id: "aws",
    name: "AWS CLI",
    category: "cloud",
    detect: { command: "aws", args: ["--version"] },
  },
  {
    id: "gcloud",
    name: "Google Cloud CLI",
    category: "cloud",
    detect: { command: "gcloud", args: ["--version"] },
  },
  {
    id: "java",
    name: "Java",
    category: "runtime",
    detect: { command: "java", args: ["-version"] },
    // Java prints to stderr, which our safeExec handles
  },
  {
    id: "make",
    name: "Make",
    category: "build",
    detect: { command: "make", args: ["--version"] },
  },
  {
    id: "cmake",
    name: "CMake",
    category: "build",
    detect: { command: "cmake", args: ["--version"] },
  },
  {
    id: "gcc",
    name: "GCC",
    category: "compiler",
    detect: { command: "gcc", args: ["--version"] },
  },
  {
    id: "clang",
    name: "Clang",
    category: "compiler",
    detect: { command: "clang", args: ["--version"] },
  },
];

module.exports = { TOOL_REGISTRY };

The Detection Engine

Now we wire it all together. The key insight is running all detections in parallel -- we're I/O bound, not CPU bound, so checking 24 tools takes roughly the same time as checking one:

// lib/detector.js
const { safeExec } = require("./exec");
const { extractVersion } = require("./version");
const { TOOL_REGISTRY } = require("./tools");

async function detectAll() {
  const results = await Promise.all(
    TOOL_REGISTRY.map(async (tool) => {
      const { command, args } = tool.detect;
      const result = await safeExec(command, args);

      if (!result.ok) {
        return { ...tool, installed: false, version: null };
      }

      const raw = result.stdout || result.stderr;
      const version = tool.parseVersion
        ? tool.parseVersion(raw)
        : extractVersion(raw);

      return {
        ...tool,
        installed: true,
        version,
        raw: raw.split("\n")[0], // Keep first line for debugging
      };
    })
  );

  return results;
}

// Group results by category for display
function groupByCategory(results) {
  const groups = {};
  for (const result of results) {
    const cat = result.category || "other";
    if (!groups[cat]) groups[cat] = [];
    groups[cat].push(result);
  }
  return groups;
}

module.exports = { detectAll, groupByCategory };

Layer 2: The Analyzer

The analyzer reads project files to figure out what the project actually needs. This is where "zero-config" gets interesting -- we're inferring requirements from files that already exist.

Reading Project Signals

Every project leaves breadcrumbs. A package.json tells us about Node.js requirements. A Dockerfile tells us about container needs. A .python-version file tells us about Python. We read all of them:

// lib/analyzer.js
const fs = require("fs");
const path = require("path");

function analyzeProject(dir = process.cwd()) {
  const requirements = [];
  const signals = [];

  // --- Node.js ecosystem ---
  const pkgPath = path.join(dir, "package.json");
  if (fs.existsSync(pkgPath)) {
    try {
      const pkg = JSON.parse(fs.readFileSync(pkgPath, "utf8"));

      signals.push({ file: "package.json", type: "node-project" });

      // Engine constraints
      if (pkg.engines?.node) {
        requirements.push({
          tool: "node",
          range: pkg.engines.node,
          source: "package.json engines.node",
        });
      }
      if (pkg.engines?.npm) {
        requirements.push({
          tool: "npm",
          range: pkg.engines.npm,
          source: "package.json engines.npm",
        });
      }

      // Detect package manager from lockfiles
      if (fs.existsSync(path.join(dir, "pnpm-lock.yaml"))) {
        requirements.push({
          tool: "pnpm",
          range: null,
          source: "pnpm-lock.yaml exists",
        });
        signals.push({ file: "pnpm-lock.yaml", type: "pnpm-project" });
      } else if (fs.existsSync(path.join(dir, "yarn.lock"))) {
        requirements.push({
          tool: "yarn",
          range: null,
          source: "yarn.lock exists",
        });
        signals.push({ file: "yarn.lock", type: "yarn-project" });
      } else if (fs.existsSync(path.join(dir, "bun.lockb"))) {
        requirements.push({
          tool: "bun",
          range: null,
          source: "bun.lockb exists",
        });
        signals.push({ file: "bun.lockb", type: "bun-project" });
      } else {
        requirements.push({
          tool: "npm",
          range: null,
          source: "package.json (default)",
        });
      }

      // packageManager field (corepack)
      if (pkg.packageManager) {
        const [name, version] = pkg.packageManager.split("@");
        requirements.push({
          tool: name,
          range: `>=${version}`,
          source: `package.json packageManager`,
        });
        signals.push({
          file: "package.json",
          type: "corepack",
          detail: pkg.packageManager,
        });
      }

      // Check for native dependencies that need compilers
      const allDeps = {
        ...pkg.dependencies,
        ...pkg.devDependencies,
      };
      const nativeDeps = [
        "sharp",
        "bcrypt",
        "canvas",
        "sqlite3",
        "better-sqlite3",
        "node-sass",
        "fsevents",
        "cpu-features",
      ];
      const foundNative = Object.keys(allDeps || {}).filter((d) =>
        nativeDeps.includes(d)
      );
      if (foundNative.length > 0) {
        signals.push({
          file: "package.json",
          type: "native-deps",
          detail: foundNative.join(", "),
        });
        requirements.push({
          tool: "python3",
          range: null,
          source: `native deps: ${foundNative.join(", ")} (node-gyp)`,
          optional: true,
        });
      }
    } catch (e) {
      signals.push({
        file: "package.json",
        type: "parse-error",
        detail: e.message,
      });
    }
  }

  // --- .nvmrc / .node-version ---
  for (const f of [".nvmrc", ".node-version"]) {
    const fp = path.join(dir, f);
    if (fs.existsSync(fp)) {
      const content = fs.readFileSync(fp, "utf8").trim().replace(/^v/, "");
      requirements.push({
        tool: "node",
        range: `>=${content}`,
        source: f,
      });
      signals.push({ file: f, type: "node-version-file" });
    }
  }

  // --- Python ---
  for (const f of [".python-version", "runtime.txt"]) {
    const fp = path.join(dir, f);
    if (fs.existsSync(fp)) {
      const content = fs.readFileSync(fp, "utf8").trim();
      const version = content.replace(/^python-/, "");
      requirements.push({
        tool: "python3",
        range: `>=${version}`,
        source: f,
      });
      signals.push({ file: f, type: "python-project" });
    }
  }

  if (
    fs.existsSync(path.join(dir, "requirements.txt")) ||
    fs.existsSync(path.join(dir, "Pipfile")) ||
    fs.existsSync(path.join(dir, "pyproject.toml"))
  ) {
    const found = ["requirements.txt", "Pipfile", "pyproject.toml"].find((f) =>
      fs.existsSync(path.join(dir, f))
    );
    requirements.push({
      tool: "python3",
      range: null,
      source: `${found} exists`,
    });
    signals.push({ file: found, type: "python-project" });
  }

  // --- Ruby ---
  if (fs.existsSync(path.join(dir, "Gemfile"))) {
    requirements.push({ tool: "ruby", range: null, source: "Gemfile exists" });
    signals.push({ file: "Gemfile", type: "ruby-project" });

    const rvPath = path.join(dir, ".ruby-version");
    if (fs.existsSync(rvPath)) {
      const ver = fs.readFileSync(rvPath, "utf8").trim();
      requirements.push({
        tool: "ruby",
        range: `>=${ver}`,
        source: ".ruby-version",
      });
    }
  }

  // --- Go ---
  const goModPath = path.join(dir, "go.mod");
  if (fs.existsSync(goModPath)) {
    const content = fs.readFileSync(goModPath, "utf8");
    const goMatch = content.match(/^go\s+(\d+\.\d+)/m);
    if (goMatch) {
      requirements.push({
        tool: "go",
        range: `>=${goMatch[1]}`,
        source: "go.mod",
      });
    }
    signals.push({ file: "go.mod", type: "go-project" });
  }

  // --- Rust ---
  if (fs.existsSync(path.join(dir, "Cargo.toml"))) {
    requirements.push({
      tool: "rustc",
      range: null,
      source: "Cargo.toml exists",
    });
    requirements.push({
      tool: "cargo",
      range: null,
      source: "Cargo.toml exists",
    });
    signals.push({ file: "Cargo.toml", type: "rust-project" });

    const toolchainPath = path.join(dir, "rust-toolchain.toml");
    if (fs.existsSync(toolchainPath)) {
      const tc = fs.readFileSync(toolchainPath, "utf8");
      const chMatch = tc.match(/channel\s*=\s*"(\d+\.\d+(?:\.\d+)?)"/);
      if (chMatch) {
        requirements.push({
          tool: "rustc",
          range: `>=${chMatch[1]}`,
          source: "rust-toolchain.toml",
        });
      }
    }
  }

  // --- Docker ---
  if (
    fs.existsSync(path.join(dir, "Dockerfile")) ||
    fs.existsSync(path.join(dir, "docker-compose.yml")) ||
    fs.existsSync(path.join(dir, "docker-compose.yaml")) ||
    fs.existsSync(path.join(dir, "compose.yml")) ||
    fs.existsSync(path.join(dir, "compose.yaml"))
  ) {
    requirements.push({
      tool: "docker",
      range: null,
      source: "Docker files found",
    });
    signals.push({ file: "Dockerfile/compose", type: "docker-project" });

    if (
      fs.existsSync(path.join(dir, "docker-compose.yml")) ||
      fs.existsSync(path.join(dir, "docker-compose.yaml")) ||
      fs.existsSync(path.join(dir, "compose.yml"))
    ) {
      requirements.push({
        tool: "docker-compose",
        range: null,
        source: "Compose file found",
      });
    }
  }

  // --- Terraform ---
  const tfFiles = fs.readdirSync(dir).filter((f) => f.endsWith(".tf"));
  if (tfFiles.length > 0) {
    requirements.push({
      tool: "terraform",
      range: null,
      source: "*.tf files found",
    });
    signals.push({ file: tfFiles[0], type: "terraform-project" });
  }

  // --- Git (always required) ---
  if (fs.existsSync(path.join(dir, ".git"))) {
    requirements.push({
      tool: "git",
      range: ">=2.0.0",
      source: ".git directory",
    });
    signals.push({ file: ".git", type: "git-repo" });
  }

  // --- Makefile ---
  if (fs.existsSync(path.join(dir, "Makefile"))) {
    requirements.push({
      tool: "make",
      range: null,
      source: "Makefile exists",
    });
    signals.push({ file: "Makefile", type: "make-project" });
  }

  return { requirements, signals };
}

module.exports = { analyzeProject };

This is the core insight of the tool: we don't need a config file because every project already has one -- it's the combination of lockfiles, version files, manifests, and toolchain configs that already exist.

Layer 3: The Reporter

Now we need to present results clearly. We'll build a terminal reporter that uses ANSI colors (no dependencies required) and a JSON mode for CI pipelines.

ANSI Terminal Formatting

Instead of pulling in a color library like chalk or kleur, we use ANSI escape codes directly. This keeps our tool truly zero-dependency:

// lib/format.js
const isColorSupported =
  process.stdout.isTTY &&
  process.env.TERM !== "dumb" &&
  !process.env.NO_COLOR;

const c = isColorSupported
  ? {
      reset: "\x1b[0m",
      bold: "\x1b[1m",
      dim: "\x1b[2m",
      red: "\x1b[31m",
      green: "\x1b[32m",
      yellow: "\x1b[33m",
      blue: "\x1b[34m",
      cyan: "\x1b[36m",
      gray: "\x1b[90m",
    }
  : Object.fromEntries(
      [
        "reset",
        "bold",
        "dim",
        "red",
        "green",
        "yellow",
        "blue",
        "cyan",
        "gray",
      ].map((k) => [k, ""])
    );

const STATUS = {
  ok: `${c.green}OK${c.reset}`,
  missing: `${c.red}MISSING${c.reset}`,
  outdated: `${c.yellow}OUTDATED${c.reset}`,
  found: `${c.blue}FOUND${c.reset}`,
  unknown: `${c.gray}???${c.reset}`,
};

module.exports = { c, STATUS };

The Report Renderer

// lib/reporter.js
const { c, STATUS } = require("./format");
const { satisfiesRange } = require("./version");

const CATEGORY_LABELS = {
  runtime: "Runtimes",
  "package-manager": "Package Managers",
  vcs: "Version Control",
  container: "Containers & Orchestration",
  cloud: "Cloud CLIs",
  infra: "Infrastructure",
  build: "Build Tools",
  compiler: "Compilers",
  other: "Other",
};

function renderReport(detected, analysis) {
  const { requirements, signals } = analysis;
  const lines = [];

  // Header
  lines.push("");
  lines.push(`${c.bold}${c.cyan}devscan${c.reset} Environment Report`);
  lines.push(`${c.gray}${"─".repeat(50)}${c.reset}`);

  // Project signals
  if (signals.length > 0) {
    lines.push("");
    lines.push(`${c.bold}Project Signals${c.reset}`);
    for (const sig of signals) {
      lines.push(
        `  ${c.dim}${sig.file}${c.reset} ${c.gray}→${c.reset} ${sig.type}${sig.detail ? ` (${sig.detail})` : ""}`
      );
    }
  }

  // Requirements check
  if (requirements.length > 0) {
    lines.push("");
    lines.push(`${c.bold}Requirements${c.reset}`);

    let allSatisfied = true;

    for (const req of requirements) {
      const tool = detected.find((d) => d.id === req.tool);
      let status;
      let detail;

      if (!tool || !tool.installed) {
        status = req.optional ? STATUS.unknown : STATUS.missing;
        detail = req.range ? `needs ${req.range}` : "not installed";
        if (!req.optional) allSatisfied = false;
      } else if (req.range && tool.version) {
        const ok = satisfiesRange(tool.version, req.range);
        if (ok === true) {
          status = STATUS.ok;
          detail = `${tool.version} ${c.dim}(${req.range})${c.reset}`;
        } else if (ok === false) {
          status = STATUS.outdated;
          detail = `${tool.version} ${c.dim}needs ${req.range}${c.reset}`;
          allSatisfied = false;
        } else {
          status = STATUS.unknown;
          detail = `${tool.version} ${c.dim}(can't verify range)${c.reset}`;
        }
      } else {
        status = STATUS.ok;
        detail = tool.version || "installed";
      }

      const optional = req.optional
        ? ` ${c.dim}(optional)${c.reset}`
        : "";
      lines.push(
        `  [${status}] ${c.bold}${tool?.name || req.tool}${c.reset} ${detail}${optional}`
      );
      lines.push(`         ${c.dim}${req.source}${c.reset}`);
    }

    lines.push("");
    lines.push(
      allSatisfied
        ? `  ${c.green}${c.bold}All requirements satisfied.${c.reset}`
        : `  ${c.red}${c.bold}Some requirements are not met.${c.reset}`
    );
  }

  // Full inventory (non-required tools that are installed)
  const requiredIds = new Set(requirements.map((r) => r.tool));
  const extras = detected.filter(
    (d) => d.installed && !requiredIds.has(d.id)
  );

  if (extras.length > 0) {
    lines.push("");
    lines.push(`${c.bold}Also Installed${c.reset}`);
    for (const tool of extras) {
      lines.push(
        `  ${c.dim}[${STATUS.found}]${c.reset} ${tool.name} ${c.dim}${tool.version || ""}${c.reset}`
      );
    }
  }

  lines.push("");
  return lines.join("\n");
}

function renderJSON(detected, analysis) {
  const { requirements, signals } = analysis;

  const result = {
    timestamp: new Date().toISOString(),
    signals,
    requirements: requirements.map((req) => {
      const tool = detected.find((d) => d.id === req.tool);
      return {
        tool: req.tool,
        required: req.range,
        installed: tool?.installed || false,
        version: tool?.version || null,
        satisfied: tool?.installed
          ? req.range
            ? satisfiesRange(tool.version, req.range)
            : true
          : false,
        source: req.source,
        optional: req.optional || false,
      };
    }),
    inventory: detected
      .filter((d) => d.installed)
      .map((d) => ({
        id: d.id,
        name: d.name,
        version: d.version,
        category: d.category,
      })),
  };

  return JSON.stringify(result, null, 2);
}

module.exports = { renderReport, renderJSON };

Wiring It All Together

The CLI entry point ties the layers together with argument parsing (again, zero dependencies):

#!/usr/bin/env node
// bin/devscan.js

const { detectAll } = require("../lib/detector");
const { analyzeProject } = require("../lib/analyzer");
const { renderReport, renderJSON } = require("../lib/reporter");

async function main() {
  const args = process.argv.slice(2);
  const jsonMode = args.includes("--json");
  const dir = args.find((a) => !a.startsWith("-")) || process.cwd();

  if (args.includes("--help") || args.includes("-h")) {
    console.log(`
Usage: devscan [directory] [options]

Options:
  --json    Output as JSON (for CI pipelines)
  --help    Show this message

Examples:
  devscan                  Scan current directory
  devscan ./my-project     Scan a specific project
  devscan --json | jq .    Pipe JSON output to jq
`);
    process.exit(0);
  }

  // Tool detection is async; analyzeProject is synchronous, so wrap it
  // in a resolved promise to keep the Promise.all shape uniform
  const [detected, analysis] = await Promise.all([
    detectAll(),
    Promise.resolve(analyzeProject(dir)),
  ]);

  if (jsonMode) {
    console.log(renderJSON(detected, analysis));
  } else {
    console.log(renderReport(detected, analysis));
  }

  // Exit with code 1 if any non-optional requirement is unmet
  const hasFailure = analysis.requirements.some((req) => {
    if (req.optional) return false;
    const tool = detected.find((d) => d.id === req.tool);
    if (!tool || !tool.installed) return true;
    if (req.range && tool.version) {
      const { satisfiesRange } = require("../lib/version");
      return satisfiesRange(tool.version, req.range) === false;
    }
    return false;
  });

  process.exit(hasFailure ? 1 : 0);
}

main().catch((err) => {
  console.error(`devscan: ${err.message}`);
  process.exit(2);
});

Using It In CI

The --json flag and non-zero exit code make devscan perfect for CI pipelines. Here's a GitHub Actions example:

# .github/workflows/env-check.yml
name: Environment Check
on: [pull_request]

jobs:
  check-env:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: "20"
      - run: npx devscan --json > env-report.json
      - name: Check requirements
        run: |
          node -e "
            const report = require('./env-report.json');
            const failures = report.requirements.filter(r => !r.satisfied && !r.optional);
            if (failures.length > 0) {
              console.log('Missing requirements:');
              failures.forEach(f => console.log('  -', f.tool, f.required || '(any version)'));
              process.exit(1);
            }
            console.log('All requirements satisfied');
          "

Advanced: Detecting Version Managers

Sophisticated developers use version managers like nvm, asdf, mise, or volta. Our scanner can detect these and warn when a project specifies a version that isn't installed in the version manager:

// lib/version-managers.js
const { safeExec } = require("./exec");
const path = require("path");
const fs = require("fs");
const os = require("os");

async function detectVersionManagers() {
  const managers = [];

  // nvm
  const nvmDir = process.env.NVM_DIR || path.join(os.homedir(), ".nvm");
  if (fs.existsSync(nvmDir)) {
    // List installed Node versions
    const versionsDir = path.join(nvmDir, "versions", "node");
    let versions = [];
    if (fs.existsSync(versionsDir)) {
      versions = fs
        .readdirSync(versionsDir)
        .filter((d) => d.startsWith("v"))
        .map((d) => d.slice(1));
    }
    managers.push({
      id: "nvm",
      name: "nvm",
      manages: "node",
      installedVersions: versions,
    });
  }

  // mise (formerly rtx)
  const miseResult = await safeExec("mise", ["ls", "--json"]);
  if (miseResult.ok) {
    try {
      const miseData = JSON.parse(miseResult.stdout);
      managers.push({
        id: "mise",
        name: "mise",
        manages: "multiple",
        tools: Object.keys(miseData),
      });
    } catch {}
  }

  // asdf
  const asdfResult = await safeExec("asdf", ["list"]);
  if (asdfResult.ok) {
    managers.push({
      id: "asdf",
      name: "asdf",
      manages: "multiple",
      raw: asdfResult.stdout,
    });
  }

  // volta
  const voltaResult = await safeExec("volta", ["list", "--format=plain"]);
  if (voltaResult.ok) {
    managers.push({
      id: "volta",
      name: "Volta",
      manages: "node",
      raw: voltaResult.stdout,
    });
  }

  return managers;
}

module.exports = { detectVersionManagers };

Advanced: Monorepo Support

Real-world projects are often monorepos. We can recursively scan workspace directories:

// lib/monorepo.js
const fs = require("fs");
const path = require("path");

function findWorkspaces(dir) {
  const pkgPath = path.join(dir, "package.json");
  if (!fs.existsSync(pkgPath)) return [];

  const pkg = JSON.parse(fs.readFileSync(pkgPath, "utf8"));
  const patterns = pkg.workspaces?.packages || pkg.workspaces || [];

  if (!Array.isArray(patterns) || patterns.length === 0) {
    // Check for pnpm workspaces
    const pnpmWs = path.join(dir, "pnpm-workspace.yaml");
    if (fs.existsSync(pnpmWs)) {
      const content = fs.readFileSync(pnpmWs, "utf8");
      const match = content.match(/-\s*['"]?([^'":\n]+)/g);
      if (match) {
        return match.map((m) => m.replace(/^-\s*['"]?/, "").replace(/['"]$/, ""));
      }
    }
    return [];
  }

  return patterns;
}

function resolveWorkspaceDirs(dir, patterns) {
  const dirs = [];
  for (const pattern of patterns) {
    // Simple glob: "packages/*"
    const base = pattern.replace(/\/\*$/, "");
    const fullBase = path.join(dir, base);
    if (fs.existsSync(fullBase) && fs.statSync(fullBase).isDirectory()) {
      if (pattern.endsWith("/*")) {
        const children = fs.readdirSync(fullBase).filter((f) => {
          const fp = path.join(fullBase, f);
          return (
            fs.statSync(fp).isDirectory() &&
            fs.existsSync(path.join(fp, "package.json"))
          );
        });
        dirs.push(...children.map((c) => path.join(fullBase, c)));
      } else {
        dirs.push(fullBase);
      }
    }
  }
  return dirs;
}

module.exports = { findWorkspaces, resolveWorkspaceDirs };

Performance: Keeping It Under 500ms

Speed matters for developer tools. Nobody wants to wait 3 seconds for an environment check. Here's how we keep execution time under 500ms:

  1. Parallel execution: All execFile calls run concurrently via Promise.all
  2. Timeouts: Each command gets 5 seconds max (prevents hangs from gcloud or aws)
  3. No unnecessary work: We only read files that exist (no try/catch on readFileSync)
  4. No dependencies: Zero require overhead from node_modules
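Point 1 is worth proving to yourself. This sketch fakes 24 I/O-bound probes (a `setTimeout` stands in for each `execFile` call) and times the sequential loop against `Promise.all`:

```javascript
// Each fake probe "waits on I/O" for ~50ms, like a real execFile call
const probe = () =>
  new Promise((resolve) => setTimeout(() => resolve("1.0.0"), 50));

async function benchmark() {
  const probes = Array.from({ length: 24 }, () => probe);

  let start = Date.now();
  for (const p of probes) await p(); // one at a time
  const sequential = Date.now() - start;

  start = Date.now();
  await Promise.all(probes.map((p) => p())); // all at once
  const parallel = Date.now() - start;

  return { sequential, parallel };
}

benchmark().then(({ sequential, parallel }) =>
  // Expect roughly 24 * 50ms sequentially vs ~50ms in parallel
  console.log(`sequential ~${sequential}ms, parallel ~${parallel}ms`)
);
```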

Benchmarking on a MacBook Pro M2 with 18 tools installed:

$ time devscan --json > /dev/null
real    0m0.287s
user    0m0.089s
sys     0m0.041s

287 milliseconds to check 24 tools. The bottleneck is always the slowest tool to respond (usually docker --version or gcloud --version).

Testing

Since our tool relies on external commands, we need to mock them for reliable tests. Here's a pattern using dependency injection:

// test/detector.test.js
const assert = require("assert");
const { extractVersion, satisfiesRange } = require("../lib/version");

// Unit tests for version parsing
const tests = [
  ["v20.11.0", "20.11.0"],
  ["Python 3.12.1", "3.12.1"],
  ["rustc 1.75.0 (82e1608df 2023-12-21)", "1.75.0"],
  ["go version go1.21.5 darwin/arm64", "1.21.5"],
  ["git version 2.43.0", "2.43.0"],
  ["docker version 24.0.7, build afdd53b", "24.0.7"],
  ["GNU Make 3.81", "3.81"],
  ["yarn 4.0.2", "4.0.2"],
];

for (const [input, expected] of tests) {
  const result = extractVersion(input);
  assert.strictEqual(result, expected, `extractVersion("${input}") = ${result}, expected ${expected}`);
}

// Range satisfaction tests
assert.strictEqual(satisfiesRange("20.11.0", ">=18.0.0"), true);
assert.strictEqual(satisfiesRange("16.0.0", ">=18.0.0"), false);
assert.strictEqual(satisfiesRange("20.11.0", "^20.0.0"), true);
assert.strictEqual(satisfiesRange("21.0.0", "^20.0.0"), false);
assert.strictEqual(satisfiesRange("18.17.1", "18.x"), true);

console.log(`All ${tests.length + 5} tests passed.`);

Publishing As An npm Package

The package.json for devscan:

{
  "name": "devscan",
  "version": "1.0.0",
  "description": "Zero-config development environment scanner",
  "bin": {
    "devscan": "./bin/devscan.js"
  },
  "files": ["bin/", "lib/"],
  "keywords": [
    "developer-tools",
    "environment",
    "scanner",
    "cli",
    "devops",
    "zero-config"
  ],
  "engines": {
    "node": ">=16.0.0"
  },
  "license": "MIT"
}

Zero dependencies. Works with Node 16+. The entire package is under 20KB.

What We've Built And What Comes Next

We built a tool that:

  • Detects 24 development tools in under 300ms
  • Infers project requirements from existing config files -- no .devscan.yaml needed
  • Compares detected vs. required with clear pass/fail output
  • Outputs JSON for CI integration
  • Has zero dependencies -- ships as pure Node.js

From here, you could extend it with:

  • Auto-fix suggestions: "Run nvm use 20 to switch Node versions"
  • Team environment diffing: "Your teammate has Node 18 but you have Node 20"
  • VS Code extension: Show environment status in the status bar
  • Pre-commit hook: Block commits if the environment doesn't match
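The first of those ideas is only a small function away. Here's a hypothetical `suggestFix` helper -- the command strings are illustrative suggestions, not part of devscan -- that turns an unmet requirement into a copy-pasteable hint:

```javascript
// Hypothetical helper: turn an unmet requirement into an actionable hint.
// The hint strings below are illustrative, not guaranteed fixes.
function suggestFix(req, tool) {
  if (!tool || !tool.installed) {
    const hints = {
      pnpm: "corepack enable pnpm",
      yarn: "corepack enable yarn",
      node: "install Node.js from https://nodejs.org or via a version manager",
    };
    return hints[req.tool] || `install ${req.tool} (${req.source})`;
  }
  // Installed but outdated: suggest a version-manager upgrade where we know one
  const min = (req.range || "").replace(/^[>=^~\s]+/, "");
  if (req.tool === "node" && min) return `nvm install ${min}`;
  return `upgrade ${req.tool} to satisfy ${req.range}`;
}

console.log(
  suggestFix(
    { tool: "node", range: ">=20.0.0" },
    { installed: true, version: "18.2.0" }
  )
);
// nvm install 20.0.0
```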

The zero-config philosophy works because modern projects already declare their requirements -- they just scatter them across a dozen different files. All we did was teach a tool to read all of them at once.

The code for devscan is available as an npm package. Run npx devscan in any project directory to try it out.


Wilson Xu is a developer tools engineer who builds CLI tools and writes about the Node.js ecosystem. Find his work on GitHub at @chengyixu.
