Om Shree

Enterprise Search Just Got a Protocol Upgrade: Inside Pureinsights Discovery 2.8

The Shreesozo Dispatch | MCP & Agentic AI | April 2026

The Problem Nobody Was Fixing Fast Enough

Enterprise search and AI agents have been living in parallel universes.

On one side: search platforms indexing PubMed, SharePoint, internal wikis, Oracle databases, and file shares. On the other: AI agents capable of reasoning, planning, and executing tasks. The problem was that agents couldn't reach the search layer. Every connection required a custom integration — its own connector code, its own auth logic, its own maintenance burden.

This wasn't a minor inconvenience. It was the core bottleneck blocking AI agents from being genuinely useful in enterprise settings. An agent that can't query your knowledge base is an agent operating blind.

Pureinsights Discovery 2.8, released this week, takes a direct run at that problem.


What Discovery 2.8 Actually Ships

The headline feature is native MCP support inside QueryFlow — Pureinsights' API builder and query orchestration layer.

Model Context Protocol, introduced by Anthropic in November 2024 and since adopted by OpenAI, Google DeepMind, Microsoft, and AWS, is the open standard for connecting AI agents to external tools and data without building custom integrations for each pairing. Before MCP, every time an AI system needed to talk to a new tool, someone had to write a connector. Now there's one protocol. If both sides speak it, they talk.
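To make "one protocol" concrete: MCP messages are JSON-RPC 2.0, and an agent invokes a tool with a `tools/call` request. The sketch below builds one such request with the standard library; the tool name `search_pubmed` and its arguments are hypothetical, chosen only to match this article's examples.

```python
import json

# A minimal MCP "tools/call" request as it travels over the wire.
# MCP framing is JSON-RPC 2.0; "search_pubmed" and its arguments
# are invented names for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_pubmed",
        "arguments": {"query": "CRISPR off-target effects", "limit": 5},
    },
}

wire = json.dumps(request)
print(wire)
```

Any MCP-compliant server can interpret this shape, which is exactly why a second connector for a second agent never needs to be written.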

What Discovery 2.8 does is let developers expose their search entrypoints as custom MCP servers. Any MCP-compatible agent can then call those entrypoints directly — no glue code, no brittle API wrappers sitting in the middle. The MCP support isn't layered on top of QueryFlow as an afterthought; it runs through the same pipeline infrastructure that Discovery already uses for query orchestration.
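The server side of that arrangement can be sketched as a tool registry plus a dispatcher. Everything here is hypothetical — the tool name, the registry, and the stand-in search function are invented for illustration; a real deployment would expose the entrypoint through an MCP SDK over stdio or HTTP rather than an in-process call.

```python
# Hypothetical sketch: registering a search entrypoint as an MCP-style
# tool and dispatching a "tools/call" payload to it.

TOOLS = {}

def tool(name):
    """Register a function under a tool name, MCP-server style."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("discovery_search")
def discovery_search(query: str, top_k: int = 3):
    # Stand-in for a call to a QueryFlow search entrypoint.
    corpus = [
        "sharepoint: onboarding guide",
        "oracle: claims table",
        "pubmed: trial results",
    ]
    return [doc for doc in corpus if query.lower() in doc][:top_k]

def handle_tools_call(params):
    """Route a tools/call params object to the registered handler."""
    fn = TOOLS[params["name"]]
    return fn(**params.get("arguments", {}))

result = handle_tools_call(
    {"name": "discovery_search", "arguments": {"query": "oracle"}}
)
print(result)
```

The point of the pattern is that the agent never sees `discovery_search`'s internals — only the tool name and its argument schema.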

Kamran Khan, CEO of Pureinsights, put it plainly in the release: "With MCP support, our customers can now connect Discovery directly into the agentic AI workflows and tools they're already building."

Beyond MCP, the release ships several connectors that close real gaps in enterprise data access:

SharePoint Online — Full crawling of sites, subsites, lists, list items, files, and attachments. Microsoft's document ecosystem, directly inside Discovery ingestion pipelines. SharePoint sits at the center of knowledge management for thousands of enterprises and has historically been one of the harder silos to crack for AI retrieval.

OracleDB — Native Oracle Database support via JDBC. Connect to Oracle, execute SQL, retrieve table data for ingestion.

SMB — Crawl network file shares via a new Filesystem component.

LDAP — Query enterprise directory services, retrieve users and groups.

The pattern across all four is consistent: each connector removes another category of "unreachable" data from the equation.

The Schedules API rounds out the release. It lets teams trigger data ingestion seeds using cron expressions, so pipelines run on a defined schedule instead of requiring someone to manually kick them off. For teams running real-time knowledge pipelines, automated ingestion isn't a nice-to-have — it's the baseline.
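A cron-driven scheduler boils down to checking each minute whether the current time matches an expression. The toy matcher below handles only `*` and plain numbers (real cron also supports ranges, steps, and lists, and counts Sunday as weekday 0 where Python counts Monday as 0) — it is a sketch of the evaluation, not the Schedules API itself.

```python
from datetime import datetime

def cron_matches(expr: str, when: datetime) -> bool:
    """Check a 5-field cron expression (minute hour day month weekday)
    against a timestamp. Supports only '*' and plain numbers — a toy
    version of what a scheduler evaluates once per minute."""
    fields = expr.split()
    actual = [when.minute, when.hour, when.day, when.month, when.weekday()]
    for spec, value in zip(fields, actual):
        if spec != "*" and int(spec) != value:
            return False
    return True

# "Run the ingestion seed at 02:00 every day" (hypothetical schedule).
print(cron_matches("0 2 * * *", datetime(2026, 4, 10, 2, 0)))    # True
print(cron_matches("0 2 * * *", datetime(2026, 4, 10, 14, 30)))  # False
```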


Why This Release Lands at a Meaningful Moment

MCP's growth trajectory over the past 16 months is hard to argue with.

Anthropic launched the protocol in November 2024, when SDK downloads ran at roughly 2 million per month. By March 2026, that number had grown to 97 million. The milestones in between tell the story: OpenAI adopted it in April 2025, Microsoft Copilot Studio in July 2025, AWS Bedrock in November 2025. The ecosystem now includes over 5,800 community-built servers. The Linux Foundation took governance of the protocol in December 2025, which is the kind of institutional move that turns "interesting standard" into "durable infrastructure."

Enterprise search specifically has been one of the slower categories to adopt agentic patterns. Most search platforms were built to serve human users — not to be called programmatically by agents operating inside larger pipelines. MCP changes that by giving search tools a standardized way to expose themselves to the agent layer.

The efficiency argument is straightforward. Before MCP, connecting an AI agent to 10 internal tools meant building and maintaining 10 separate integrations. With MCP, each tool exposes one server that works across compliant agents — the math moves from multiplicative to additive. That's why CIOs are now paying attention to a protocol specification, which is not something that happens often.
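The multiplicative-to-additive shift is easy to verify with arithmetic. Taking a hypothetical shop with 4 agents and 10 internal tools:

```python
# Integration count before vs. after a shared protocol.
# Point-to-point: every (agent, tool) pair needs its own connector.
# MCP: each agent implements one client, each tool one server.
agents, tools = 4, 10

point_to_point = agents * tools  # bespoke connector per pairing
with_mcp = agents + tools        # one protocol implementation per side

print(point_to_point)  # 40
print(with_mcp)        # 14
```

Adding an eleventh tool under MCP costs one new server; under point-to-point it costs four new connectors, one per agent.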

Pureinsights is one of the first enterprise search vendors to ship native MCP support. In a market where timing relative to protocol adoption tends to compound, that positioning matters.


What This Looks Like in Practice

Consider a common enterprise scenario: a research team needs an AI assistant that can pull from PubMed, an internal SharePoint repository, and a proprietary Oracle database — and synthesize answers across all three.

Before Discovery 2.8, that meant custom integration work for each source, different authentication schemes per system, and ongoing maintenance as each platform updates independently.

With MCP support in QueryFlow, the developer exposes each search entrypoint as an MCP server. The agent calls those servers directly using the standard protocol. SharePoint data is crawled and indexed through the new connector. Oracle tables are queried via JDBC. Ingestion runs automatically on cron via the Schedules API. The pipeline doesn't need a human operator watching it. The agent doesn't need bespoke code to reach any of it.
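The scenario above can be sketched as an agent fanning one question out to three sources and merging the evidence. The server stubs below are pure stand-ins for the PubMed, SharePoint, and Oracle entrypoints — in a real pipeline each would be an MCP server reached over the wire protocol, not a local function.

```python
# Hypothetical sketch: one agent, three MCP-style sources, one merged
# answer. All server functions are illustrative stubs.

def pubmed_server(query):
    return [f"pubmed hit for '{query}'"]

def sharepoint_server(query):
    return [f"sharepoint doc matching '{query}'"]

def oracle_server(query):
    return [f"oracle row matching '{query}'"]

SERVERS = {
    "pubmed": pubmed_server,
    "sharepoint": sharepoint_server,
    "oracle": oracle_server,
}

def answer(query):
    # The call shape is identical for every source —
    # that uniformity is what the protocol buys you.
    evidence = []
    for name, call in SERVERS.items():
        evidence.extend(call(query))
    return evidence

print(answer("drug interaction study"))
```

Notice that `answer` contains no per-source logic; swapping a source means changing the registry, not the agent.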

That's what "low-code agentic pipeline" actually means when it ships in a real product — not a marketing slide, but a working architecture where the protocol handles the connection layer and the developer focuses on the logic.

The Pureinsights Discovery Platform is already used across financial services, government, retail, and media. Those aren't sectors known for tolerating fragile integrations. Shipping MCP as a first-class capability rather than an add-on signals that this is meant to hold up in production, not just in demos.


The Broader Signal — and the Open Question

Discovery 2.8 is a product release, but it reflects something larger happening across the enterprise software landscape. MCP is moving from developer tooling into production infrastructure. When a cloud-native search platform ships native MCP support as a first-class capability, it signals that the protocol has crossed a meaningful threshold.

The remaining challenge is the one that follows every fast-moving protocol: security. Researchers have flagged prompt injection risks, tool poisoning vectors, and access control gaps as areas that need serious attention before MCP is ready for the most sensitive enterprise data. Pureinsights operates in financial services and government — sectors where those concerns aren't theoretical. How they address those security questions in future releases will determine how deep into regulated environments Discovery can go.

Anthropic's own roadmap includes OAuth 2.1 with enterprise identity provider integration targeting Q2 2026. That should help. But for teams deploying MCP-connected systems today, governance and access control need to be explicitly designed in, not assumed.

For now, Discovery 2.8 puts a concrete product behind an idea that has mostly lived in architecture diagrams: enterprise search as an active participant in agentic AI workflows, not a static database sitting behind a wall.

If you're building agentic pipelines on top of enterprise data, this release is worth a close look. Full details are available at pureinsights.com.


Published by Om Shree | Shreesozo — The Shreesozo Dispatch covers MCP, agentic AI, and developer tools for builders who don't have time for hype.

