sonpiaz
hidrix-tools: 18 MCP tools that give AI agents web search, social media scraping, and content analysis

I open-sourced hidrix-tools yesterday — an MCP server that gives AI coding agents the ability to search the web, scrape social media, and analyze content.

The problem

If you use Claude Code, Cursor, pi, or any other CLI agent, it can read files and write code. But ask it to search X/Twitter, pull Reddit threads, or scrape a Facebook group? It can't.

I kept copy-pasting research into conversations. It was slow and broke the flow.

What I built

An MCP tool server with 18 tools. Plug it into any agent that supports MCP — no vendor lock-in.

Your AI Agent                    hidrix-tools                     Internet
┌──────────┐     MCP protocol    ┌──────────────┐                ┌─────────┐
│ Claude   │ ◄─────────────────► │ web_search   │ ◄────────────► │ Brave   │
│ Cursor   │                     │ x_search     │                │ X/Twitter│
│ pi       │                     │ reddit_search│                │ Reddit  │
│ Codex    │                     │ facebook_    │                │ Facebook│
│ ...      │                     │   scraper    │                │ YouTube │
└──────────┘                     └──────────────┘                └─────────┘

Tools included

Search: web_search (Brave), x_search, x_thread_reader, x_user_posts, reddit_search, reddit_thread_reader, reddit_subreddit_top, youtube_search, tiktok_search

Scrape: web_fetch (URL → markdown), facebook_scraper (groups, pages, keyword search, Meta Ad Library)

LinkedIn: linkedin_search, linkedin_profile (no login needed)

Analyze: content_scorer (engagement ranking + time-decay), content_analyzer (topic clusters, patterns, trends)

Storage: SQLite — dedup, run history, query saved posts
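The time-decay half of content_scorer can be sketched like this. This is my own sketch of the idea, not the actual hidrix-tools formula; the engagement weights and the 24-hour half-life are assumptions:

```typescript
// Sketch: engagement score with exponential time decay.
// The weights (1/2/3) and the 24h half-life are illustrative assumptions,
// not the real content_scorer constants.
interface Post {
  likes: number;
  reposts: number;
  replies: number;
  postedAt: Date;
}

function scorePost(post: Post, now = new Date(), halfLifeHours = 24): number {
  const engagement = post.likes + 2 * post.reposts + 3 * post.replies;
  const ageHours = (now.getTime() - post.postedAt.getTime()) / 3_600_000;
  // The score halves for every halfLifeHours of age, so fresh posts rank first.
  return engagement * Math.pow(0.5, ageHours / halfLifeHours);
}
```

Ranking is then just sorting by scorePost descending: an older viral post decays below a fresh, moderately engaged one.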

What I shipped in the last 24 hours

11 commits since the initial release:

  • LinkedIn tools — search posts and profiles without login
  • Facebook scraper pipeline — groups, pages, search, Meta Ads
  • YouTube transcripts + channel tools
  • SQLite storage with dedup
  • Follow system — track people/keywords across platforms
  • Knowledge compiler — auto-builds an Obsidian wiki from collected data
  • Business intelligence — competitor tracking, market signals

The feature I use most: Knowledge compiler

My agent follows 10+ people on X. Every day it:

  1. Collects their new posts
  2. Stores them in SQLite (dedup)
  3. Extracts key insights
  4. Compiles wiki articles in my Obsidian vault

I wake up to an updated knowledge base every morning. Zero manual work.

Not a demo — just a cron job + MCP tools.
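The daily loop is simple enough to sketch. The function names below (fetchNewPosts, writeWikiPage) are placeholders standing in for the real MCP tool calls, and the real pipeline tracks seen IDs in SQLite rather than an in-memory Set:

```typescript
// Sketch of the daily collect → dedup → compile loop.
// fetchNewPosts and writeWikiPage are placeholders, not hidrix-tools APIs.
type CollectedPost = { id: string; author: string; text: string };

function dedupe(seen: Set<string>, incoming: CollectedPost[]): CollectedPost[] {
  // Keep only posts whose ID hasn't been stored before.
  const fresh = incoming.filter((p) => !seen.has(p.id));
  for (const p of fresh) seen.add(p.id);
  return fresh;
}

async function dailyRun(
  follows: string[],
  seen: Set<string>,
  fetchNewPosts: (user: string) => Promise<CollectedPost[]>,
  writeWikiPage: (user: string, posts: CollectedPost[]) => Promise<void>,
): Promise<number> {
  let stored = 0;
  for (const user of follows) {
    const posts = dedupe(seen, await fetchNewPosts(user));
    if (posts.length > 0) await writeWikiPage(user, posts); // one wiki page per person
    stored += posts.length;
  }
  return stored;
}
```

A cron entry invokes something like this once a day; everything else is just the MCP tools doing the fetching and writing.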

Install (3 commands)

git clone https://github.com/sonpiaz/hidrix-tools.git ~/.hidrix-tools
cd ~/.hidrix-tools && bun install && cp .env.example .env
# Add your API keys to .env

Connect to Claude Code

{
  "mcpServers": {
    "hidrix-tools": {
      "command": "bun",
      "args": ["run", "~/.hidrix-tools/server.ts"]
    }
  }
}

Works with Claude Code, Cursor, OpenClaw, pi, Codex, Hermes — any MCP client.

Adding your own tool

Tools are auto-discovered. Copy the template, edit, restart:

cp -r tools/_template tools/my-tool
# Edit tools/my-tool/index.ts
bun run server.ts
# [hidrix-tools] ✓ my_tool
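A tool module might look something like this. This is a hypothetical shape, not the actual hidrix-tools template; the real export signature in tools/_template may differ:

```typescript
// Hypothetical shape of tools/my-tool/index.ts — the actual template
// in hidrix-tools may use a different export signature.
export const myTool = {
  name: "my_tool",
  description: "Echo the input back, uppercased.",
  inputSchema: {
    type: "object",
    properties: { text: { type: "string" } },
    required: ["text"],
  },
  async handler({ text }: { text: string }) {
    // MCP tool results are typically a list of content parts.
    return { content: [{ type: "text", text: text.toUpperCase() }] };
  },
};
```

Auto-discovery then only has to scan tools/*/index.ts, read each export's name and inputSchema, and register the handler with the MCP server.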

Links

GitHub: https://github.com/sonpiaz/hidrix-tools

If you're building with AI agents and tired of copy-pasting research, give it a try. Feedback welcome.
