Wilson Xu

Building a GraphQL Explorer CLI — Query Any API from Your Terminal

Testing GraphQL APIs shouldn't require firing up a browser. GraphQL Playground, Insomnia, and Altair are excellent tools, but they pull you out of your terminal workflow. If you're already SSH'd into a server, running a CI pipeline, or simply prefer the command line, you need something faster.

In this tutorial, we'll build gql-explorer, a Node.js CLI that introspects any GraphQL endpoint and lets you explore schemas interactively, run queries with autocomplete, manage authentication headers, and bookmark your favorite queries — all without leaving the terminal.

The Problem with Browser-Based GraphQL Tools

Every GraphQL developer knows the dance: write code in your editor, switch to Playground to test a query, copy the result back, tweak the query, switch again. Browser-based tools are powerful but introduce context switching that breaks flow.

There are deeper issues too. Browser tools don't integrate with shell pipelines. You can't pipe a GraphQL response into jq, feed it to another script, or run queries in a loop. They don't work over SSH. They can't be scripted for CI/CD. And they eat RAM — a Chromium tab running Playground consumes 200-400MB for what is essentially a fancy curl wrapper.

A CLI solves all of these problems. Let's build one.

Project Setup

Initialize the project and install dependencies:

mkdir gql-explorer && cd gql-explorer
npm init -y
npm install graphql graphql-tag node-fetch@2 chalk@4 inquirer@8 ora@5 cli-table3 \
  conf@10 commander

We pin chalk, inquirer, ora, and conf to their last CommonJS-friendly major versions because this tutorial uses require(); their newer releases are ESM-only. node-fetch is pinned to v2 for the same reason.

Here's what each dependency does:

  • graphql + graphql-tag: Parse and validate GraphQL queries
  • node-fetch: Make HTTP requests to GraphQL endpoints
  • chalk: Syntax highlighting for JSON output
  • inquirer: Interactive prompts and autocomplete
  • ora: Spinner for loading states
  • cli-table3: Pretty-print schema tables
  • conf: Persist saved queries and settings
  • commander: CLI argument parsing

Create the entry point:

touch index.js
chmod +x index.js

Core Architecture

The CLI has four main modules: the HTTP client, the schema introspector, the query runner, and the interactive explorer. Let's start with the foundation.

#!/usr/bin/env node
// index.js

const { Command } = require('commander');
const chalk = require('chalk');
const Conf = require('conf');

const config = new Conf({ projectName: 'gql-explorer' });
const program = new Command();

program
  .name('gql-explorer')
  .description('Query any GraphQL API from your terminal')
  .version('1.0.0');

program
  .command('introspect <endpoint>')
  .description('Introspect a GraphQL endpoint and display its schema')
  .option('-H, --header <headers...>', 'HTTP headers (key:value)')
  .action(introspectCommand);

program
  .command('query <endpoint>')
  .description('Run a GraphQL query')
  .option('-q, --query <query>', 'GraphQL query string')
  .option('-f, --file <file>', 'Read query from file')
  .option('-v, --variables <json>', 'Query variables as JSON')
  .option('-H, --header <headers...>', 'HTTP headers')
  .option('--save <name>', 'Save this query as a bookmark')
  .option('--no-timing', 'Hide response timing')
  .action(queryCommand);

program
  .command('explore <endpoint>')
  .description('Interactive schema explorer with autocomplete')
  .option('-H, --header <headers...>', 'HTTP headers')
  .action(exploreCommand);

program
  .command('saved')
  .description('List and run saved queries')
  .action(savedCommand);

program
  .command('history')
  .description('Show query history')
  .option('-n, --count <number>', 'Number of entries', '20')
  .action(historyCommand);

program.parse();

The GraphQL HTTP Client

The client handles all communication with GraphQL endpoints. It supports custom headers, measures response times, and tracks response sizes.

// lib/client.js

const fetch = require('node-fetch');
const chalk = require('chalk');

class GraphQLClient {
  constructor(endpoint, headers = {}) {
    this.endpoint = endpoint;
    this.headers = {
      'Content-Type': 'application/json',
      ...headers,
    };
  }

  parseHeaders(headerArgs) {
    if (!headerArgs) return {};
    const headers = {};
    for (const h of headerArgs) {
      const colonIndex = h.indexOf(':');
      if (colonIndex === -1) {
        console.error(chalk.red(`Invalid header format: ${h}`));
        console.error(chalk.gray('Use key:value format, e.g., Authorization:Bearer token123'));
        continue;
      }
      const key = h.substring(0, colonIndex).trim();
      const value = h.substring(colonIndex + 1).trim();
      headers[key] = value;
    }
    return headers;
  }

  async execute(query, variables = {}) {
    const startTime = process.hrtime.bigint();

    const response = await fetch(this.endpoint, {
      method: 'POST',
      headers: this.headers,
      body: JSON.stringify({ query, variables }),
    });

    const endTime = process.hrtime.bigint();
    const durationMs = Number(endTime - startTime) / 1_000_000;

    const text = await response.text();
    const sizeBytes = Buffer.byteLength(text, 'utf8');

    let data;
    try {
      data = JSON.parse(text);
    } catch {
      throw new Error(`Invalid JSON response: ${text.substring(0, 200)}`);
    }

    return {
      data,
      status: response.status,
      timing: {
        durationMs: Math.round(durationMs * 100) / 100,
        sizeBytes,
        sizeFormatted: formatBytes(sizeBytes),
      },
    };
  }
}

function formatBytes(bytes) {
  if (bytes < 1024) return `${bytes} B`;
  if (bytes < 1024 * 1024) return `${(bytes / 1024).toFixed(1)} KB`;
  return `${(bytes / (1024 * 1024)).toFixed(1)} MB`;
}

module.exports = { GraphQLClient, formatBytes };

The process.hrtime.bigint() approach gives nanosecond precision for timing. This is far more accurate than Date.now(), which only resolves to milliseconds and can be affected by system clock adjustments.
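To see the pattern in isolation, here is the same measurement technique outside of execute() — a minimal sketch, with a busy loop standing in for the network round-trip:

```javascript
// process.hrtime.bigint() is a monotonic nanosecond counter, so the
// difference between two readings is a reliable duration even if the
// system clock is adjusted mid-measurement.
const start = process.hrtime.bigint();
for (let i = 0; i < 1_000_000; i++); // stand-in for the network round-trip
const end = process.hrtime.bigint();
const durationMs = Number(end - start) / 1_000_000;
console.log(`${durationMs.toFixed(3)}ms`); // sub-millisecond resolution
```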

Schema Introspection

GraphQL's self-documenting nature is its superpower. Every compliant endpoint responds to the introspection query, giving us the full schema. Let's build the introspector.

// lib/introspect.js

const { GraphQLClient } = require('./client');
const Table = require('cli-table3');
const chalk = require('chalk');
const ora = require('ora');

const INTROSPECTION_QUERY = `
  query IntrospectionQuery {
    __schema {
      queryType { name }
      mutationType { name }
      subscriptionType { name }
      types {
        kind
        name
        description
        fields(includeDeprecated: true) {
          name
          description
          isDeprecated
          deprecationReason
          args {
            name
            type {
              kind
              name
              ofType {
                kind
                name
                ofType {
                  kind
                  name
                }
              }
            }
            defaultValue
          }
          type {
            kind
            name
            ofType {
              kind
              name
              ofType {
                kind
                name
              }
            }
          }
        }
        inputFields {
          name
          type {
            kind
            name
            ofType { kind name }
          }
          defaultValue
        }
        enumValues(includeDeprecated: true) {
          name
          description
          isDeprecated
        }
      }
    }
  }
`;

async function introspect(endpoint, headers = {}) {
  const spinner = ora('Introspecting schema...').start();
  const client = new GraphQLClient(endpoint, headers);

  try {
    const { data, timing } = await client.execute(INTROSPECTION_QUERY);

    if (data.errors) {
      spinner.fail('Introspection failed');
      console.error(chalk.red(JSON.stringify(data.errors, null, 2)));
      return null;
    }

    spinner.succeed(
      `Schema loaded in ${chalk.cyan(timing.durationMs + 'ms')} (${timing.sizeFormatted})`
    );

    return data.data.__schema;
  } catch (error) {
    spinner.fail(`Connection failed: ${error.message}`);
    return null;
  }
}

function resolveType(typeObj) {
  if (!typeObj) return 'Unknown';
  if (typeObj.kind === 'NON_NULL') {
    return `${resolveType(typeObj.ofType)}!`;
  }
  if (typeObj.kind === 'LIST') {
    return `[${resolveType(typeObj.ofType)}]`;
  }
  return typeObj.name || 'Unknown';
}

function displaySchema(schema) {
  const userTypes = schema.types.filter(
    (t) => !t.name.startsWith('__') && !['String', 'Int', 'Float', 'Boolean', 'ID'].includes(t.name)
  );

  // Summary
  const queries = schema.types.find((t) => t.name === schema.queryType?.name);
  const mutations = schema.types.find((t) => t.name === schema.mutationType?.name);

  console.log('\n' + chalk.bold.underline('Schema Summary'));
  console.log(`  Queries:       ${chalk.green(queries?.fields?.length || 0)}`);
  console.log(`  Mutations:     ${chalk.yellow(mutations?.fields?.length || 0)}`);
  console.log(`  Custom Types:  ${chalk.cyan(userTypes.length)}`);

  // Queries table
  if (queries?.fields?.length) {
    console.log('\n' + chalk.bold.green('Queries'));
    const table = new Table({
      head: [chalk.white('Name'), chalk.white('Return Type'), chalk.white('Args')],
      colWidths: [30, 25, 40],
      wordWrap: true,
    });

    for (const field of queries.fields) {
      const args = field.args
        .map((a) => `${a.name}: ${resolveType(a.type)}`)
        .join(', ');
      table.push([
        field.isDeprecated ? chalk.strikethrough.gray(field.name) : field.name,
        resolveType(field.type),
        args || chalk.gray('none'),
      ]);
    }
    console.log(table.toString());
  }

  // Mutations table
  if (mutations?.fields?.length) {
    console.log('\n' + chalk.bold.yellow('Mutations'));
    const table = new Table({
      head: [chalk.white('Name'), chalk.white('Return Type'), chalk.white('Args')],
      colWidths: [30, 25, 40],
      wordWrap: true,
    });

    for (const field of mutations.fields) {
      const args = field.args
        .map((a) => `${a.name}: ${resolveType(a.type)}`)
        .join(', ');
      table.push([field.name, resolveType(field.type), args || chalk.gray('none')]);
    }
    console.log(table.toString());
  }

  // Types
  console.log('\n' + chalk.bold.cyan('Types'));
  const typeTable = new Table({
    head: [chalk.white('Name'), chalk.white('Kind'), chalk.white('Fields')],
    colWidths: [30, 15, 50],
    wordWrap: true,
  });

  for (const type of userTypes.slice(0, 30)) {
    const fieldList = type.fields
      ? type.fields.map((f) => f.name).join(', ')
      : type.enumValues
      ? type.enumValues.map((e) => e.name).join(' | ')
      : chalk.gray('');
    typeTable.push([type.name, type.kind, fieldList]);
  }
  console.log(typeTable.toString());
}

module.exports = { introspect, displaySchema, resolveType, INTROSPECTION_QUERY };

Running gql-explorer introspect https://countries.trevorblades.com/graphql would output a neatly formatted table showing all available queries, mutations, and types with their fields and arguments.
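One detail worth calling out: resolveType renders GraphQL's nested type wrappers back into the familiar SDL notation. A non-null list of non-null strings, for instance, arrives from the introspection query as three nested objects:

```javascript
// resolveType, copied from lib/introspect.js above
function resolveType(typeObj) {
  if (!typeObj) return 'Unknown';
  if (typeObj.kind === 'NON_NULL') return `${resolveType(typeObj.ofType)}!`;
  if (typeObj.kind === 'LIST') return `[${resolveType(typeObj.ofType)}]`;
  return typeObj.name || 'Unknown';
}

// How [String!]! arrives in the introspection result: wrappers nest outside-in
const wrapped = {
  kind: 'NON_NULL',
  ofType: {
    kind: 'LIST',
    ofType: { kind: 'NON_NULL', ofType: { kind: 'SCALAR', name: 'String' } },
  },
};
console.log(resolveType(wrapped)); // → [String!]!
```

This is also why the introspection query nests ofType three levels deep — that covers the common wrapper depths without recursing forever.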

Interactive Query Builder with Autocomplete

This is where the CLI becomes genuinely powerful. The interactive explorer uses schema introspection to provide field-level autocomplete as you build queries.

// lib/explorer.js

const inquirer = require('inquirer');
const chalk = require('chalk');
const { introspect, resolveType } = require('./introspect');
const { GraphQLClient } = require('./client');
const { highlightJson } = require('./highlight');

async function explore(endpoint, headers = {}) {
  const schema = await introspect(endpoint, headers);
  if (!schema) return;

  const client = new GraphQLClient(endpoint, headers);
  const typeMap = buildTypeMap(schema);

  console.log(chalk.bold('\nInteractive GraphQL Explorer'));
  console.log(chalk.gray('Navigate the schema and build queries interactively.\n'));

  let running = true;
  while (running) {
    const { action } = await inquirer.prompt([
      {
        type: 'list',
        name: 'action',
        message: 'What would you like to do?',
        choices: [
          { name: 'Browse queries', value: 'queries' },
          { name: 'Browse mutations', value: 'mutations' },
          { name: 'Explore a type', value: 'type' },
          { name: 'Run a raw query', value: 'raw' },
          { name: 'Build a query interactively', value: 'build' },
          { name: 'Exit', value: 'exit' },
        ],
      },
    ]);

    switch (action) {
      case 'queries':
        await browseOperations(schema, typeMap, 'query', client);
        break;
      case 'mutations':
        await browseOperations(schema, typeMap, 'mutation', client);
        break;
      case 'type':
        await exploreType(typeMap);
        break;
      case 'raw':
        await runRawQuery(client);
        break;
      case 'build':
        await buildQuery(schema, typeMap, client);
        break;
      case 'exit':
        running = false;
        break;
    }
  }
}

function buildTypeMap(schema) {
  const map = {};
  for (const type of schema.types) {
    map[type.name] = type;
  }
  return map;
}

async function buildQuery(schema, typeMap, client) {
  const queryType = typeMap[schema.queryType?.name];
  if (!queryType?.fields) {
    console.log(chalk.yellow('No queries available.'));
    return;
  }

  const { selectedField } = await inquirer.prompt([
    {
      type: 'list',
      name: 'selectedField',
      message: 'Select a query:',
      choices: queryType.fields.map((f) => ({
        name: `${f.name}: ${resolveType(f.type)}`,
        value: f,
      })),
    },
  ]);

  // Build arguments
  let argsString = '';
  if (selectedField.args.length > 0) {
    const argValues = {};
    for (const arg of selectedField.args) {
      const typeName = resolveType(arg.type);
      const isRequired = typeName.endsWith('!');

      const { value } = await inquirer.prompt([
        {
          type: 'input',
          name: 'value',
          message: `${arg.name} (${typeName}):`,
        },
      ]);

      if (value) {
        argValues[arg.name] = inferArgValue(value, typeName);
      } else if (isRequired) {
        console.log(chalk.red(`${arg.name} is required!`));
        return;
      }
    }

    if (Object.keys(argValues).length > 0) {
      const argParts = Object.entries(argValues).map(([k, v]) => `${k}: ${v}`);
      argsString = `(${argParts.join(', ')})`;
    }
  }

  // Select fields to return
  const returnType = unwrapType(selectedField.type);
  const returnTypeDef = typeMap[returnType];

  let fieldsString = '';
  if (returnTypeDef?.fields) {
    const { selectedFields } = await inquirer.prompt([
      {
        type: 'checkbox',
        name: 'selectedFields',
        message: 'Select fields to return:',
        choices: returnTypeDef.fields.map((f) => ({
          name: `${f.name}: ${resolveType(f.type)}`,
          value: f.name,
          checked: ['id', 'name', 'title'].includes(f.name),
        })),
      },
    ]);
    fieldsString = selectedFields.length > 0
      ? `{ ${selectedFields.join(' ')} }`
      : '{ id }';
  }

  const query = `query {\n  ${selectedField.name}${argsString} ${fieldsString}\n}`;
  console.log('\n' + chalk.bold('Generated Query:'));
  console.log(chalk.cyan(query) + '\n');

  const { shouldRun } = await inquirer.prompt([
    { type: 'confirm', name: 'shouldRun', message: 'Execute this query?', default: true },
  ]);

  if (shouldRun) {
    const { data, timing } = await client.execute(query);
    console.log(highlightJson(data));
    printTiming(timing);
  }
}

function inferArgValue(value, typeName) {
  // Strip non-null/list wrappers to get the base type name
  const baseType = typeName.replace(/[!\[\]]/g, '');
  // Int, Float, and Boolean literals are unquoted in GraphQL;
  // everything else (ID, String, etc.) gets quoted
  if (['Int', 'Float', 'Boolean'].includes(baseType)) return value;
  return `"${value}"`;
}

function unwrapType(typeObj) {
  if (!typeObj) return null;
  if (typeObj.name) return typeObj.name;
  return unwrapType(typeObj.ofType);
}

module.exports = { explore };

The interactive builder walks you through each step: pick a query, fill in arguments (with type hints), select which fields you want back, and execute. No memorizing schema structures or guessing field names.

Pretty-Printed JSON with Syntax Highlighting

Raw JSON output is hard to read. Let's add syntax highlighting that makes responses scannable at a glance.

// lib/highlight.js

const chalk = require('chalk');

function highlightJson(obj, indent = 2) {
  const raw = JSON.stringify(obj, null, indent);
  return colorizeJson(raw);
}

function colorizeJson(json) {
  return json
    // String values (but not keys)
    .replace(/: "(.*?)"/g, `: ${chalk.green('"$1"')}`)
    // Keys
    .replace(/"(\w+)":/g, `${chalk.cyan('"$1"')}:`)
    // Numbers
    .replace(/: (\d+\.?\d*)/g, `: ${chalk.yellow('$1')}`)
    // Booleans
    .replace(/: (true|false)/g, (_, val) => `: ${chalk.magenta(val)}`)
    // Null
    .replace(/: (null)/g, `: ${chalk.gray('null')}`);
}

function printTiming(timing) {
  console.log(
    chalk.gray(`\n⏱  ${timing.durationMs}ms | ${timing.sizeFormatted}`)
  );
}

module.exports = { highlightJson, colorizeJson, printTiming };

This approach colors keys in cyan, strings in green, numbers in yellow, booleans in magenta, and null in gray. The result is immediately scannable — your eyes can jump to the data you need without parsing walls of monochrome text.
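A quick sanity check on this regex-based approach — sketched dependency-free here with raw ANSI escapes instead of chalk: since the colorizer only injects escape codes, stripping them should return the original JSON byte-for-byte.

```javascript
// Minimal stand-in for colorizeJson using raw ANSI escapes instead of chalk
const GREEN = '\x1b[32m', CYAN = '\x1b[36m', YELLOW = '\x1b[33m', RESET = '\x1b[0m';

function colorize(json) {
  return json
    .replace(/: "(.*?)"/g, (_, s) => `: ${GREEN}"${s}"${RESET}`)      // string values
    .replace(/"(\w+)":/g, (_, k) => `${CYAN}"${k}"${RESET}:`)         // keys
    .replace(/: (\d+\.?\d*)/g, (_, n) => `: ${YELLOW}${n}${RESET}`);  // numbers
}

const raw = JSON.stringify({ name: 'Ada', age: 36 }, null, 2);
const stripped = colorize(raw).replace(/\x1b\[[0-9;]*m/g, '');
console.log(stripped === raw); // → true: colorizing is purely additive
```

Note the ordering matters: string values are colored before keys, so the key regex never fires inside an already-colored value.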

Variable Interpolation and Saved Queries

Real-world queries use variables. Our CLI supports both inline JSON variables and variable files, plus the ability to save and recall frequently-used queries.

// lib/queries.js

const Conf = require('conf');
const chalk = require('chalk');
const Table = require('cli-table3');
const fs = require('fs');
const path = require('path');

const config = new Conf({ projectName: 'gql-explorer' });

function saveQuery(name, query, variables = {}, endpoint = '') {
  const saved = config.get('savedQueries', {});
  saved[name] = {
    query,
    variables,
    endpoint,
    savedAt: new Date().toISOString(),
  };
  config.set('savedQueries', saved);
  console.log(chalk.green(`Query saved as "${name}"`));
}

function getSavedQuery(name) {
  const saved = config.get('savedQueries', {});
  return saved[name] || null;
}

function listSavedQueries() {
  const saved = config.get('savedQueries', {});
  const entries = Object.entries(saved);

  if (entries.length === 0) {
    console.log(chalk.gray('No saved queries. Use --save <name> when running a query.'));
    return;
  }

  const table = new Table({
    head: [chalk.white('Name'), chalk.white('Endpoint'), chalk.white('Saved At')],
    colWidths: [25, 40, 25],
  });

  for (const [name, data] of entries) {
    table.push([name, data.endpoint || chalk.gray(''), data.savedAt.split('T')[0]]);
  }

  console.log(table.toString());
}

function addToHistory(query, endpoint, variables, timing) {
  const history = config.get('queryHistory', []);
  history.unshift({
    query: query.substring(0, 200),
    endpoint,
    variables,
    timing,
    timestamp: new Date().toISOString(),
  });
  // Keep last 100 entries
  config.set('queryHistory', history.slice(0, 100));
}

function getHistory(count = 20) {
  const history = config.get('queryHistory', []);
  return history.slice(0, count);
}

function displayHistory(count) {
  const entries = getHistory(count);

  if (entries.length === 0) {
    console.log(chalk.gray('No query history yet.'));
    return;
  }

  const table = new Table({
    head: [
      chalk.white('#'),
      chalk.white('Query'),
      chalk.white('Endpoint'),
      chalk.white('Time'),
      chalk.white('When'),
    ],
    colWidths: [5, 40, 30, 10, 20],
    wordWrap: true,
  });

  entries.forEach((entry, i) => {
    table.push([
      i + 1,
      entry.query.substring(0, 35) + (entry.query.length > 35 ? '...' : ''),
      new URL(entry.endpoint).hostname,
      `${entry.timing?.durationMs || '?'}ms`,
      timeAgo(entry.timestamp),
    ]);
  });

  console.log(table.toString());
}

function timeAgo(timestamp) {
  const seconds = Math.floor((Date.now() - new Date(timestamp)) / 1000);
  if (seconds < 60) return 'just now';
  if (seconds < 3600) return `${Math.floor(seconds / 60)}m ago`;
  if (seconds < 86400) return `${Math.floor(seconds / 3600)}h ago`;
  return `${Math.floor(seconds / 86400)}d ago`;
}

function interpolateVariables(query, variables) {
  // Replace $varName patterns with literal values for inline usage.
  // A replacer function is used so `$`-sequences inside values can't be
  // misread as special replacement patterns by String.replace.
  let result = query;
  for (const [key, value] of Object.entries(variables)) {
    const pattern = new RegExp(`\\$${key}\\b`, 'g');
    result = result.replace(pattern, () => JSON.stringify(value));
  }
  return result;
}

module.exports = {
  saveQuery,
  getSavedQuery,
  listSavedQueries,
  addToHistory,
  displayHistory,
  interpolateVariables,
};

Usage examples:

# Run a query with variables
gql-explorer query https://api.example.com/graphql \
  -q 'query($id: ID!) { user(id: $id) { name email } }' \
  -v '{"id": "123"}'

# Save a query for later
gql-explorer query https://api.example.com/graphql \
  -q '{ users { id name } }' \
  --save list-users

# View saved queries
gql-explorer saved
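Under the hood, --save stores the variables alongside the query, and the interpolation helper inlines them when you want a standalone query string. A self-contained sketch of that helper (using a replacer function so `$`-sequences in values can't be misread as replacement patterns):

```javascript
// Inline $varName placeholders with JSON-encoded values
function interpolateVariables(query, variables) {
  let result = query;
  for (const [key, value] of Object.entries(variables)) {
    const pattern = new RegExp(`\\$${key}\\b`, 'g');
    result = result.replace(pattern, () => JSON.stringify(value));
  }
  return result;
}

const q = 'query { user(id: $id, limit: $limit) { name } }';
console.log(interpolateVariables(q, { id: 'u_42', limit: 5 }));
// → query { user(id: "u_42", limit: 5) { name } }
```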

Header Management for Authentication

Most production GraphQL APIs require authentication. The CLI supports multiple header formats and persistent header profiles.

// lib/headers.js

const Conf = require('conf');
const chalk = require('chalk');

const config = new Conf({ projectName: 'gql-explorer' });

function parseHeaders(headerArgs) {
  if (!headerArgs) return {};
  const headers = {};

  for (const h of headerArgs) {
    const colonIdx = h.indexOf(':');
    if (colonIdx === -1) {
      console.error(chalk.red(`Invalid header: "${h}". Use key:value format.`));
      continue;
    }
    headers[h.substring(0, colonIdx).trim()] = h.substring(colonIdx + 1).trim();
  }

  return headers;
}

function saveHeaderProfile(name, headers) {
  const profiles = config.get('headerProfiles', {});
  profiles[name] = headers;
  config.set('headerProfiles', profiles);
  console.log(chalk.green(`Header profile "${name}" saved.`));
}

function getHeaderProfile(name) {
  const profiles = config.get('headerProfiles', {});
  return profiles[name] || null;
}

function mergeHeaders(...headerSources) {
  return Object.assign({}, ...headerSources);
}

module.exports = { parseHeaders, saveHeaderProfile, getHeaderProfile, mergeHeaders };

Common authentication patterns:

# Bearer token
gql-explorer query https://api.example.com/graphql \
  -H "Authorization:Bearer eyJhbGciOiJIUzI1NiIs..." \
  -q '{ me { name email } }'

# API key in custom header
gql-explorer query https://api.example.com/graphql \
  -H "X-API-Key:sk_live_abc123" \
  -q '{ products { id name price } }'

# Multiple headers
gql-explorer query https://api.example.com/graphql \
  -H "Authorization:Bearer token123" \
  -H "X-Request-ID:req_456" \
  -q '{ orders { id status } }'
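When a saved header profile is combined with ad-hoc -H flags, mergeHeaders applies plain Object.assign semantics: later sources win on key collisions, so a per-command flag overrides the stored profile. A small sketch (the profile contents are made up for illustration):

```javascript
// Later sources win on collisions — CLI flags override the saved profile
function mergeHeaders(...headerSources) {
  return Object.assign({}, ...headerSources);
}

const savedProfile = { Authorization: 'Bearer stale-token', 'X-Team': 'platform' };
const cliFlags = { Authorization: 'Bearer fresh-token' };

const merged = mergeHeaders(savedProfile, cliFlags);
console.log(merged);
// → { Authorization: 'Bearer fresh-token', 'X-Team': 'platform' }
```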

Wiring Up the Commands

Now let's connect everything in the main entry point:

// index.js (complete command handlers)

const { GraphQLClient } = require('./lib/client');
const { introspect, displaySchema } = require('./lib/introspect');
const { explore } = require('./lib/explorer');
const { highlightJson, printTiming } = require('./lib/highlight');
const { parseHeaders } = require('./lib/headers');
const {
  saveQuery,
  addToHistory,
  displayHistory,
  listSavedQueries,
  getSavedQuery,
  interpolateVariables,
} = require('./lib/queries');

async function introspectCommand(endpoint, options) {
  const headers = parseHeaders(options.header);
  const schema = await introspect(endpoint, headers);
  if (schema) {
    displaySchema(schema);
  }
}

async function queryCommand(endpoint, options) {
  const headers = parseHeaders(options.header);
  const client = new GraphQLClient(endpoint, headers);

  let query = options.query;
  if (options.file) {
    const fs = require('fs');
    query = fs.readFileSync(options.file, 'utf-8');
  }

  if (!query) {
    console.error(chalk.red('Provide a query with -q or -f'));
    process.exit(1);
  }

  let variables = {};
  if (options.variables) {
    try {
      variables = JSON.parse(options.variables);
    } catch {
      console.error(chalk.red('Invalid JSON in --variables'));
      process.exit(1);
    }
  }

  const ora = require('ora');
  const spinner = ora('Executing query...').start();

  try {
    const { data, timing } = await client.execute(query, variables);
    spinner.stop();

    if (data.errors) {
      console.error(chalk.red.bold('GraphQL Errors:'));
      for (const err of data.errors) {
        console.error(chalk.red(`  • ${err.message}`));
        if (err.locations) {
          for (const loc of err.locations) {
            console.error(chalk.gray(`    at line ${loc.line}, column ${loc.column}`));
          }
        }
      }
    }

    if (data.data) {
      console.log(highlightJson(data.data));
    }

    if (options.timing) {
      printTiming(timing);
    }

    // Save query if requested
    if (options.save) {
      saveQuery(options.save, query, variables, endpoint);
    }

    // Add to history
    addToHistory(query, endpoint, variables, timing);
  } catch (error) {
    spinner.fail(`Query failed: ${error.message}`);
  }
}

async function exploreCommand(endpoint, options) {
  const headers = parseHeaders(options.header);
  await explore(endpoint, headers);
}

async function savedCommand() {
  listSavedQueries();
}

async function historyCommand(options) {
  displayHistory(parseInt(options.count, 10));
}

Shell Pipeline Integration

One of the CLI's biggest advantages over browser tools is pipeline integration. The output is valid JSON, so it works seamlessly with jq, grep, and other Unix tools.

# Extract just user names
gql-explorer query https://countries.trevorblades.com/graphql \
  -q '{ countries { name code } }' 2>/dev/null | jq '.countries[].name'

# Count results
gql-explorer query https://api.example.com/graphql \
  -q '{ products { id } }' 2>/dev/null | jq '.products | length'

# Feed into another script
gql-explorer query https://api.example.com/graphql \
  -q '{ users { email } }' 2>/dev/null | jq -r '.users[].email' | while read email; do
  echo "Processing $email"
done

# Save a schema snapshot to file
gql-explorer query https://api.example.com/graphql \
  -q '{ __schema { types { name } } }' > schema-backup.json

To make this work properly, we ensure all non-data output (spinners, timing, errors) goes to stderr, while clean JSON goes to stdout:

// Detect if stdout is piped
const isPiped = !process.stdout.isTTY;

if (isPiped) {
  // Raw JSON only — no colors, no spinners
  process.stdout.write(JSON.stringify(data.data, null, 2));
} else {
  // Full pretty output with colors
  console.log(highlightJson(data.data));
  printTiming(timing);
}

Performance: Query Timing and Response Stats

Every query automatically captures performance metrics. This is invaluable for optimizing API calls.

// lib/perf.js

const chalk = require('chalk');
const Table = require('cli-table3');
const Conf = require('conf');

const config = new Conf({ projectName: 'gql-explorer' });

function recordMetric(endpoint, query, timing) {
  const metrics = config.get('metrics', []);
  metrics.push({
    endpoint,
    queryHash: simpleHash(query),
    durationMs: timing.durationMs,
    sizeBytes: timing.sizeBytes,
    timestamp: Date.now(),
  });
  config.set('metrics', metrics.slice(-500));
}

function showStats(endpoint) {
  const metrics = config.get('metrics', []);
  const filtered = endpoint
    ? metrics.filter((m) => m.endpoint === endpoint)
    : metrics;

  if (filtered.length === 0) {
    console.log(chalk.gray('No metrics recorded yet.'));
    return;
  }

  const durations = filtered.map((m) => m.durationMs);
  const sizes = filtered.map((m) => m.sizeBytes);

  const table = new Table();
  table.push(
    { 'Total Queries': chalk.cyan(filtered.length) },
    { 'Avg Response Time': chalk.yellow(`${avg(durations).toFixed(1)}ms`) },
    { 'P95 Response Time': chalk.yellow(`${percentile(durations, 95).toFixed(1)}ms`) },
    { 'Max Response Time': chalk.red(`${Math.max(...durations).toFixed(1)}ms`) },
    { 'Avg Response Size': chalk.green(formatBytes(avg(sizes))) },
    { 'Total Data Transfer': chalk.green(formatBytes(sum(sizes))) }
  );

  console.log(chalk.bold('\nPerformance Stats'));
  console.log(table.toString());
}

function avg(arr) {
  return arr.reduce((a, b) => a + b, 0) / arr.length;
}

function sum(arr) {
  return arr.reduce((a, b) => a + b, 0);
}

function percentile(arr, p) {
  const sorted = [...arr].sort((a, b) => a - b);
  const index = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[index];
}

function simpleHash(str) {
  let hash = 0;
  for (const char of str) {
    hash = (hash << 5) - hash + char.charCodeAt(0);
    hash |= 0;
  }
  return hash.toString(36);
}

function formatBytes(bytes) {
  if (bytes < 1024) return `${Math.round(bytes)} B`;
  if (bytes < 1048576) return `${(bytes / 1024).toFixed(1)} KB`;
  return `${(bytes / 1048576).toFixed(1)} MB`;
}

module.exports = { recordMetric, showStats };

The P95 metric is particularly useful — it tells you that 95% of your queries complete faster than that threshold. If your P95 is 500ms but your average is 50ms, you have outlier queries that need investigation.
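The nearest-rank percentile used in showStats makes that concrete. With one slow outlier in ten samples, the median stays in the 40s while P95 lands squarely on the outlier:

```javascript
// percentile, as in lib/perf.js: nearest-rank method over a sorted copy
function percentile(arr, p) {
  const sorted = [...arr].sort((a, b) => a - b);
  const index = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[index];
}

const durations = [40, 45, 46, 47, 48, 49, 50, 51, 52, 500]; // one outlier
console.log(percentile(durations, 50)); // → 48
console.log(percentile(durations, 95)); // → 500 — the outlier dominates P95
```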

Error Handling Done Right

GraphQL errors are structured differently from REST errors. A response can have both data and errors. Our CLI handles this gracefully:

function handleResponse(response) {
  const { data, status, timing } = response;

  // HTTP-level errors
  if (status >= 400) {
    console.error(chalk.red(`HTTP ${status}: ${require('http').STATUS_CODES[status] || 'Error'}`));
    if (status === 401) {
      console.error(chalk.yellow('Hint: Use -H "Authorization:Bearer <token>" to authenticate'));
    }
    return;
  }

  // GraphQL-level errors (can coexist with data)
  if (data.errors) {
    console.error(chalk.red.bold('\nGraphQL Errors:'));
    for (const error of data.errors) {
      console.error(chalk.red(`  • ${error.message}`));

      // Show path for partial errors
      if (error.path) {
        console.error(chalk.gray(`    Path: ${error.path.join('.')}`));
      }

      // Show validation details
      if (error.extensions?.code) {
        console.error(chalk.gray(`    Code: ${error.extensions.code}`));
      }
    }
  }

  // Partial data (errors + data)
  if (data.data) {
    if (data.errors) {
      console.log(chalk.yellow('\nPartial data returned:'));
    }
    console.log(highlightJson(data.data));
  }

  printTiming(timing);
}

This handles the three scenarios: clean data, complete failure (errors only), and partial success (data + errors). The path display for partial errors is especially helpful — it tells you exactly which field in your query caused the problem.

Publishing as an npm Package

Make the CLI globally installable:

{
  "name": "gql-explorer",
  "version": "1.0.0",
  "description": "Query any GraphQL API from your terminal",
  "bin": {
    "gql-explorer": "./index.js"
  },
  "keywords": ["graphql", "cli", "terminal", "api", "explorer"],
  "files": ["index.js", "lib/"],
  "engines": {
    "node": ">=16.0.0"
  }
}
npm link    # For local development
npm publish # When ready to share

After publishing, anyone can install it with npm install -g gql-explorer and start querying GraphQL APIs immediately.

Real-World Usage Examples

Exploring the GitHub GraphQL API:

# Introspect GitHub's schema (massive — 400+ types)
gql-explorer introspect https://api.github.com/graphql \
  -H "Authorization:Bearer ghp_xxxxxxxxxxxx"

# Find your recent repos
gql-explorer query https://api.github.com/graphql \
  -H "Authorization:Bearer ghp_xxxxxxxxxxxx" \
  -q '{ viewer { repositories(first: 5, orderBy: {field: UPDATED_AT, direction: DESC}) { nodes { name stargazerCount updatedAt } } } }'

# Save it for repeated use
gql-explorer query https://api.github.com/graphql \
  -H "Authorization:Bearer ghp_xxxxxxxxxxxx" \
  -q '{ viewer { login name bio } }' \
  --save github-me

Querying a public countries API:

# No auth needed
gql-explorer introspect https://countries.trevorblades.com/graphql

# Get all countries in Europe
gql-explorer query https://countries.trevorblades.com/graphql \
  -q '{ continent(code: "EU") { countries { name capital currency } } }'

CI/CD integration:

#!/bin/bash
# health-check.sh — verify GraphQL API is responsive

RESPONSE=$(gql-explorer query "$GRAPHQL_ENDPOINT" \
  -H "Authorization:Bearer $API_TOKEN" \
  -q '{ __typename }' 2>/dev/null)

if echo "$RESPONSE" | jq -e '.data.__typename' > /dev/null 2>&1; then
  echo "API is healthy"
  exit 0
else
  echo "API health check failed"
  exit 1
fi

What's Next

This CLI covers the core workflow, but there's room to grow. Consider adding WebSocket support for GraphQL subscriptions — libraries like graphql-ws make this straightforward. Query caching with TTLs could speed up repeated introspection calls. A .gqlrc config file per project would let teams share endpoint configurations and header profiles.

You could also add query validation against the introspected schema before sending the request, saving a round-trip when you have a typo in a field name. The graphql package's validate function handles this in just a few lines.

The point isn't to replace Playground entirely. It's to give you a tool that lives where you already work — in the terminal — and integrates with the Unix philosophy of composable, pipeable commands. When you need to quickly test a query, check an API's schema, or automate GraphQL calls in a script, reaching for a CLI is simply faster.

The complete source code is available for reference and extension. Clone it, add the features your workflow needs, and stop context-switching to the browser for every query.
