NeuroLink AI
Running AI in the Browser: NeuroLink's Client-Side SDK for Web Apps

The landscape of AI development is rapidly expanding, moving beyond traditional server-side inference to embrace the power of the client. Running AI directly in the browser offers exciting possibilities for enhanced user experiences, improved privacy, and reduced infrastructure costs. However, it also introduces unique challenges, particularly around bundle size, efficient execution, and secure API key management.

Enter NeuroLink, Juspay's universal AI SDK for TypeScript. While NeuroLink is a comprehensive platform designed for both client and server environments, its client-side SDK is specifically engineered to bring robust AI capabilities directly to your web applications. This article explores how NeuroLink addresses the intricacies of client-side AI, allowing developers to integrate powerful language models and tools into their browser-based projects seamlessly.

Why Client-Side AI for Web Applications?

Before diving into NeuroLink's specifics, let's consider the compelling reasons to run AI in the browser:

  1. Reduced Latency and Real-time Feedback: Processing AI tasks directly on the user's device eliminates network roundtrips, leading to instantaneous responses. This is critical for interactive applications like real-time chat, content generation, and intelligent UIs.
  2. Enhanced Privacy and Data Security: Sensitive user data can remain on the client, never leaving the user's browser. This local processing significantly improves privacy posture and simplifies compliance with data protection regulations.
  3. Lower Infrastructure Costs: Offloading AI inference to the client reduces the computational burden on your backend servers, potentially leading to substantial cost savings on cloud resources.
  4. Offline Functionality: For certain models or tasks, client-side execution can enable AI features even when the user is offline, providing a more resilient and consistent experience.
  5. Personalization at Scale: Each user's browser becomes a personalized AI engine, capable of tailoring experiences based on local data and preferences without constant server communication.

NeuroLink's Client-Side SDK: A Powerful Foundation

NeuroLink's client-side SDK is built from the ground up for web environments, offering a suite of tools and integrations that make browser-based AI development a breeze.

At its core, the SDK provides a type-safe HTTP client (createClient) for interacting with NeuroLink APIs, whether hosted on your own infrastructure or through a managed service. This client handles everything from request/response serialization to automatic retries and middleware management.
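The automatic-retry behavior mentioned above can be approximated with a small wrapper. The sketch below is illustrative only; the SDK's actual retry policy and middleware chain are internal to createClient and may differ:

```typescript
// Illustrative exponential-backoff retry wrapper, similar in spirit to what a
// resilient HTTP client does internally (the real policy may differ).
export async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts - 1) {
        // Back off exponentially between attempts: 100 ms, 200 ms, 400 ms, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}
```

Wrapping a flaky fetch in `withRetries(() => fetch(url))` smooths over transient network errors without any changes at the call site.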

For modern web frameworks, NeuroLink offers first-class integrations:

  • React Hooks: A rich set of React hooks like useChat, useAgent, useWorkflow, useVoice, useStream, and useTools simplify the integration of AI functionalities into React applications. These hooks manage state, handle streaming, and provide intuitive interfaces for building AI-powered UIs.
  • Vercel AI SDK Compatibility: For those already using the popular Vercel AI SDK, NeuroLink provides a LanguageModelV1 adapter (createNeuroLinkProvider). This allows NeuroLink models to be used interchangeably with generateText, streamText, and other AI SDK functions, providing flexibility and leveraging an existing ecosystem.
import { createClient } from "@juspay/neurolink/client";

const client = createClient({
  baseUrl: "https://api.neurolink.example.com",
  // Inlined at build time by your bundler; never ship a provider secret to the browser
  apiKey: process.env.NEUROLINK_API_KEY,
});

// Generate text
const result = await client.generate({
  input: { text: "Explain TCP in two sentences" },
  provider: "openai",
  model: "gpt-4o",
});

console.log(result.data.content);

For React developers:

import { NeuroLinkProvider, useChat } from "@juspay/neurolink/client";

function App() {
  return (
    <NeuroLinkProvider
      config={{
        baseUrl: "https://api.neurolink.example.com",
        apiKey: process.env.NEUROLINK_API_KEY,
      }}
    >
      <ChatComponent />
    </NeuroLinkProvider>
  );
}

function ChatComponent() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } =
    useChat({
      agentId: "my-agent",
    });

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
        <button disabled={isLoading}>Send</button>
      </form>
    </div>
  );
}

Bundle Size and Tree Shaking: Optimizing for the Web

One of the primary concerns with integrating complex SDKs into web applications is the impact on bundle size, which directly affects load times and user experience. NeuroLink is designed with this in mind.

The SDK leverages modern bundling techniques to ensure that only the necessary code is included in your client-side applications. The scripts/build-browser.mjs script, for instance, uses esbuild to create optimized browser bundles. A crucial aspect of this is stubbing out Node.js-specific modules and dependencies. Many internal NeuroLink components rely on Node.js APIs (like fs, path, crypto) or server-only npm packages. During the browser build process, these are replaced with lightweight, browser-compatible stubs or polyfills, or simply removed if not required by the client-side functionality.

// Excerpt from scripts/build-browser.mjs
const nodeBuiltins = [
  'fs','fs/promises','path','crypto','os','events','http','https','net','tls',
  // ... and many more Node.js specific modules
];

const npmStubs = [
  'sharp','canvas','ffmpeg-static','pdf-parse','exceljs','adm-zip',
  // ... and many server-only npm packages
];

// In the esbuild configuration, these are marked as external or resolved to noop stubs.
// This ensures they don't get bundled into the client-side code.
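Conceptually, the build maps each of those module names to an empty stub. The fragment below sketches that idea as esbuild options; it is not the actual contents of scripts/build-browser.mjs, and the stub path is hypothetical:

```typescript
// Sketch only: alias every Node built-in and server-only package to an empty
// stub module so the browser bundle never pulls them in.
// nodeBuiltins and npmStubs are the arrays shown in the excerpt above;
// "./src/stubs/empty.ts" is a hypothetical one-line empty module.
const browserBuildOptions = {
  entryPoints: ["src/client.ts"],
  bundle: true,
  platform: "browser" as const,
  format: "esm" as const,
  treeShaking: true,
  alias: Object.fromEntries(
    [...nodeBuiltins, ...npmStubs].map((name) => [name, "./src/stubs/empty.ts"]),
  ),
  outfile: "dist/neurolink.browser.js",
};
```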

This aggressive tree-shaking and stubbing strategy ensures that the client-side bundle remains as small as possible, minimizing overhead and maximizing performance for web users.

Proxy Patterns for API Keys: Keeping Secrets Safe

Directly embedding API keys in client-side code is a significant security risk. NeuroLink facilitates secure interaction with AI services by encouraging and supporting proxy patterns for API key management. Instead of making direct calls to AI provider APIs from the browser with exposed keys, the NeuroLink client SDK is designed to communicate with your own backend (which then securely proxies requests to the AI providers using its own, secret API keys).

This can be achieved by setting baseUrl to your own API endpoint:

const client = createClient({
  // Your backend acts as a secure proxy
  baseUrl: "https://your-backend.com/neurolink-proxy",
  // API key for your backend, which then uses its own keys for AI providers
  apiKey: "your-backend-api-key-if-any",
});

This approach not only protects your sensitive credentials but also allows you to implement custom logic, rate limiting, logging, and caching on your backend, providing a robust and secure AI integration layer. OAuth2 client credentials and JWT token management are also supported, enabling more sophisticated authentication flows for enterprise applications.

Streaming in React/Vue/Svelte: Real-time AI Experiences

Modern AI applications thrive on real-time interaction, and streaming is a cornerstone of this experience. NeuroLink's client SDK offers comprehensive streaming capabilities, crucial for applications built with frameworks like React, Vue, or Svelte.

The SDK supports three primary streaming mechanisms:

  1. Callback-Based Streaming (HTTP Client): The client.stream() method allows you to define callbacks (onText, onToolCall, onDone, onError, etc.) that are triggered as chunks of AI responses arrive. This is often the simplest way to integrate streaming into any JavaScript framework.

    const outputEl = document.getElementById("output")!; // target element in your page

    await client.stream(
      { input: { text: "Explain quantum computing" }, provider: "openai" },
      {
        onText: (text) => { outputEl.textContent += text; }, // append each chunk to the UI
        onDone: (result) => console.log("Usage:", result.usage),
      },
    );
    
  2. Server-Sent Events (SSE): For long-lived, unidirectional streaming from the server to the client, the createSSEClient provides a dedicated, auto-reconnecting SSE client. This is ideal for scenarios where the server pushes updates (e.g., agent progress, ongoing generation).

  3. WebSockets: For bidirectional, real-time communication, the createWebSocketClient enables full-duplex interactions, perfect for interactive AI agents that require constant back-and-forth messaging.

The React hooks, such as useChat and useStream, abstract away much of this complexity, providing ready-to-use solutions for building streaming UIs that automatically update as AI generates content.
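Under the hood, SSE responses arrive as newline-delimited event blocks. The parser below illustrates that wire format; it is a teaching sketch, not NeuroLink's actual createSSEClient implementation:

```typescript
// Parse a chunk of Server-Sent Events text into { event, data } records.
// Blocks are separated by a blank line; fields are "event:" and "data:" lines.
export function parseSSE(chunk: string): { event: string; data: string }[] {
  const events: { event: string; data: string }[] = [];
  for (const block of chunk.split("\n\n")) {
    let event = "message"; // SSE default event type
    const data: string[] = [];
    for (const line of block.split("\n")) {
      if (line.startsWith("event:")) event = line.slice(6).trim();
      else if (line.startsWith("data:")) data.push(line.slice(5).trimStart());
    }
    // Multiple data: lines in one block join with newlines per the SSE spec
    if (data.length) events.push({ event, data: data.join("\n") });
  }
  return events;
}
```

A dedicated SSE client adds buffering of partial blocks across network chunks and automatic reconnection on top of this framing.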

Client-Side AI and Edge Computing

Client-side AI is a natural complement to edge computing strategies. By performing inference directly in the browser, you effectively push computation to the "edge" of the network – the user's device. This distributed approach reduces reliance on centralized cloud resources, minimizes data transfer, and can lead to more resilient and scalable applications.

NeuroLink's design philosophy aligns with this trend, providing the tools necessary to build hybrid AI architectures where some tasks run on powerful cloud GPUs, while others, particularly those requiring low latency or high privacy, execute efficiently in the browser or on nearby edge devices.

Conclusion

The ability to run powerful AI models and tools directly within the browser opens up a new frontier for web application development. NeuroLink's Client-Side SDK for TypeScript provides a robust, type-safe, and highly optimized solution to navigate this landscape. By carefully managing bundle size, facilitating secure API key handling, and offering flexible streaming and framework integrations, NeuroLink empowers developers to create intelligent, responsive, and private AI-powered web experiences.

Whether you're building a real-time AI chat application, an intelligent content editor, or a personalized recommendation engine, NeuroLink's client SDK offers the foundation you need to bring your AI vision to the web.


NeuroLink — The Universal AI SDK for TypeScript
