I got tired of Googling prospects before sales calls.
Every time I'd prep for an outreach email, I'd spend 20-30 minutes on the same routine. Open their website. Scroll through the About page. Check their LinkedIn. Try to figure out what they sell, how they sell it, and where AI could save them time. Then I'd write a cold email that sounded like every other cold email.
So I built a thing. A prospect research agent that takes a company URL and spits out a 1-page briefing in about 90 seconds. I use it for my consulting business, but you could use it for any B2B outreach.
The stack is embarrassingly simple. Firecrawl for web scraping. Claude for analysis. That's it.
Setting It Up
First, install the Firecrawl MCP server. If you're running Claude Code, it's one config entry:
```json
{
  "mcpServers": {
    "firecrawl": {
      "command": "npx",
      "args": ["-y", "firecrawl-mcp"]
    }
  }
}
```
Set your FIRECRAWL_API_KEY as an environment variable. Free tier gives you 500 credits/month. I've never gone over 200.
The Prompt
I wrote a system prompt that tells Claude what to extract. Nothing fancy:
```
You're a prospect research analyst. Given a company website, produce a 1-page briefing with:

1. COMPANY SNAPSHOT: What they do, who they sell to, approximate size
2. MANUAL PROCESSES: What looks like it's still done by hand (look for: contact forms, manual quoting, PDF catalogs, no API integrations)
3. AUTOMATION OPPORTUNITIES: 3-5 specific things AI could handle
4. ESTIMATED SAVINGS: Hours/week and rough dollar value per opportunity
5. CONVERSATION STARTERS: 2-3 specific observations to open a sales call

Use the company's own language. Reference specific pages you found.
```
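Wiring that prompt to Claude looks roughly like this, assuming the Anthropic Python SDK. The function names and the model string are mine, not from my actual setup, and the system prompt is abbreviated:

```python
# Sketch of the analysis step, assuming the Anthropic Python SDK.
# `build_messages`, `request_briefing`, and the model string are
# illustrative names/assumptions, not a fixed implementation.
SYSTEM_PROMPT = (
    "You're a prospect research analyst. Given a company website, "
    "produce a 1-page briefing with: company snapshot, manual processes, "
    "automation opportunities, estimated savings, conversation starters. "
    "Use the company's own language. Reference specific pages you found."
)

def build_messages(pages):
    """Fold scraped pages ({url: markdown}) into one user message."""
    body = "\n\n".join(f"## {url}\n{md}" for url, md in pages.items())
    return [{"role": "user", "content": f"Company website content:\n\n{body}"}]

def request_briefing(client, pages, model="claude-sonnet-4-20250514"):
    # client is an anthropic.Anthropic() instance.
    return client.messages.create(
        model=model,
        max_tokens=1500,
        system=SYSTEM_PROMPT,
        messages=build_messages(pages),
    )
```

Passing the client in keeps the prompt-assembly part testable without touching the network.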
Running It
I point it at a URL and let Firecrawl crawl the site. It grabs the homepage, about page, services/products pages, and any documentation it can find. Usually 5-15 pages depending on the site.
```
# Inside Claude Code, I run:
firecrawl_scrape({ url: "https://example.com", formats: ["markdown"] })
```
Then Claude reads through everything and builds the briefing.
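If you'd rather skip the MCP server, the same scrape maps onto Firecrawl's REST API. A stdlib-only sketch; the endpoint path and response shape follow the v1 docs as I read them, so treat both as assumptions:

```python
import json
import urllib.request

# Assumed v1 endpoint; check Firecrawl's current API reference.
FIRECRAWL_API = "https://api.firecrawl.dev/v1/scrape"

def build_scrape_payload(url):
    # Mirrors the MCP call above: markdown output only.
    return {"url": url, "formats": ["markdown"]}

def scrape(url, api_key):
    req = urllib.request.Request(
        FIRECRAWL_API,
        data=json.dumps(build_scrape_payload(url)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        # v1 responses nest content under "data" (assumption).
        return json.loads(resp.read())["data"]["markdown"]
```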
A Real Example
I ran this on a staffing company in New Jersey last week. 45 employees, $8M revenue (per the Inc 5000 badge on their homepage; love when they make it easy).
The agent found:
- Job applications still go through a PDF form you email back. No ATS visible.
- Client intake is a contact form that goes to a generic inbox.
- They post to 4 job boards manually (I could see the "also posted on" tags).
- No chatbot. Their FAQ page had 47 questions, all static HTML.
Estimated savings: 15-20 hours/week on intake and posting alone. At $25/hr admin cost, that's $1,500-2,000/month. A voice AI agent handling the intake calls would save another 10 hours.
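The savings math is simple enough to sanity-check, assuming roughly 4 working weeks per month:

```python
def monthly_savings(hours_per_week, hourly_cost, weeks_per_month=4):
    """Rough dollar value of time saved, as quoted in the briefing."""
    return hours_per_week * hourly_cost * weeks_per_month

low = monthly_savings(15, 25)   # 1500
high = monthly_savings(20, 25)  # 2000
```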
I sent them a cold email with 3 of those findings in the first paragraph. Got a reply in 4 hours. That never happens with generic outreach.
Iterating on the Output
The first version had problems. The briefing would hallucinate company size when it couldn't find revenue data. "Estimated $15M revenue" based on nothing. I added a hard rule: if you can't verify a number from the site itself, write "not found" instead of guessing. Fixed 90% of the accuracy issues.
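The prompt rule does most of the work, but you can belt-and-suspenders it with a post-check. This sketch is my addition, not part of the original pipeline: it flags money figures in the briefing that never appear in the scraped pages.

```python
import re

# Naive money-figure matcher: "$8M", "$25/hr" -> "$25", "$1,500", etc.
MONEY = re.compile(r"\$\s?\d[\d,.]*\s?(?:[MBK]|million|billion)?", re.IGNORECASE)

def unverified_figures(briefing, source_text):
    """Return money figures in the briefing that don't appear in the scraped pages."""
    verified = set(MONEY.findall(source_text))
    return [m for m in MONEY.findall(briefing) if m not in verified]
```

Anything this returns is a candidate hallucination to replace with "not found".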
I also added tech stack detection. Firecrawl picks up meta tags, script sources, and headers. If I can see they're running Shopify or HubSpot or a custom PHP site from 2014, that tells me a lot about their technical appetite. A company on Shopify is different from a company running a custom-built CMS. The first one will say yes to integrations. The second one will need convincing.
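Detection can be as crude as string matching on the raw HTML. The marker strings below are common public fingerprints, not an exhaustive or official list:

```python
# Naive tech fingerprinting from scraped HTML. Markers are well-known
# public fingerprints; extend the dict for whatever stacks you care about.
FINGERPRINTS = {
    "Shopify": ["cdn.shopify.com", "Shopify.theme"],
    "HubSpot": ["js.hs-scripts.com", "js.hsforms.net"],
    "WordPress": ["wp-content/", "wp-includes/"],
    "Webflow": ["website-files.com", "data-wf-site"],
}

def detect_stack(html):
    return [name for name, markers in FINGERPRINTS.items()
            if any(m in html for m in markers)]
```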
The other thing I changed: I made the briefing reference specific URLs. Instead of "their careers page could use improvement," it says "careers page at /join-us has a downloadable PDF application form with no online submission." Makes the cold email feel like you've done your homework. Because you have. Well, your agent has.
Scaling It
I ran this on 30 companies over the past two weeks. Each briefing costs about $0.03 in API calls (Firecrawl scrape + Claude analysis). Thirty briefings for under a dollar. Compare that to paying a VA $5-10 per prospect research report.
The time savings are stupid. 30 companies at 25 minutes each would be 12.5 hours of research. The agent did it in about 45 minutes total (90 seconds per company, but I run them in batches).
I'm now plugging this into my outreach pipeline. Scrape a lead list, run each URL through the agent, generate personalized first paragraphs, send. The whole thing from "I have 50 leads" to "50 personalized emails ready to review" takes about 20 minutes.
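The batching skeleton is just a thread pool over the scrape step, then one briefing per site. The `scrape` and `brief` callables stand in for the Firecrawl and Claude calls described above; they're injected so the pipeline logic stands alone:

```python
from concurrent.futures import ThreadPoolExecutor

def run_pipeline(urls, scrape, brief, max_workers=5):
    """Scrape each lead in parallel, then generate a briefing per site.

    scrape: url -> markdown (e.g. the Firecrawl call)
    brief:  markdown -> briefing text (e.g. the Claude call)
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        pages = list(pool.map(scrape, urls))  # preserves input order
    return {url: brief(text) for url, text in zip(urls, pages)}
```

Keeping `max_workers` low avoids hammering your Firecrawl credit budget all at once.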
The Whole Thing Took 3 Hours to Build
Three hours of prompting and testing. That's it. Now every prospect email I send has specific, verified observations about their business. Response rates went from around 5% to closer to 20%.
The code is minimal. The value is in the prompt and knowing what to look for. Most B2B sites leak information about their operations if you know where to read. FAQ pages show you pain points. Career pages show you what roles they can't fill. Contact forms show you how manual their intake is.
If you sell B2B anything, build one of these. It's the highest-ROI afternoon I've spent this year.
I'm building AI systems like this for companies. If you want to see the other agents I run across 5 businesses: negodiuk.ai