The New Vercel AI SDK: Your Own Chatbot in a Flash

Introduction

Empowering your applications with Artificial Intelligence (AI) and Machine Learning (ML) capabilities no longer requires a Ph.D. or years of specialized experience. With the advent of robust and user-friendly tools like the Vercel AI SDK, it's now possible to build smart, conversational UIs effortlessly, regardless of your AI proficiency level.

Vercel AI SDK abstracts away the complexities associated with integrating AI models, allowing you to focus on creating engaging user experiences. It offers a seamless pathway to AI functionalities, from setting up a fresh project using popular frameworks like Next.js and Tailwind CSS to crafting immersive chat interfaces with just a few lines of code. There are also options to easily integrate LLM frameworks like LangChain to make your bot more powerful.

Ready to supercharge your application with AI? Let's embark on this exciting journey and explore the potential of Vercel AI SDK in unlocking a whole new world of possibilities for your application.

Create Next App

Before we dive into the Vercel AI SDK specifics, let's create a new project using Next.js and Tailwind CSS. Next.js is a popular framework for building React applications, and Tailwind CSS is a utility-first CSS framework that is highly flexible and customizable.

You can create a new Next.js application using the create-next-app utility. To start, enter the following command in your terminal:

npx create-next-app@latest

After running this command, you'll be prompted with a series of questions to configure your new application.

First, you'll need to name your application. Enter a suitable name when prompted.

The next prompts configure your application: you'll be asked whether to use TypeScript, ESLint, Tailwind CSS, a src/ directory, and the App Router. Make sure to answer 'Yes' to TypeScript, Tailwind CSS, and the App Router, since the rest of this guide relies on them.

For the purposes of this guide, you can answer 'No' when asked whether to use a src/ directory.

Once you've completed the setup process, you'll have a new Next.js application that includes Tailwind CSS. This gives you a solid starting point to begin working with the Vercel AI SDK.
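
If you'd rather skip the prompts entirely, recent versions of create-next-app also accept flags for the same options. Here's a sketch; the project name my-ai-chatbot is just an example, and flag availability depends on your create-next-app version:

npx create-next-app@latest my-ai-chatbot --typescript --eslint --tailwind --app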

The Vercel AI SDK

The Vercel AI SDK is a newly-released open-source library aimed at assisting developers in creating engaging, conversational user interfaces that integrate seamlessly with streaming and chat functions. This library is built with JavaScript and TypeScript in mind and is compatible with popular frontend frameworks such as React/Next.js and Svelte/SvelteKit. Future support for Nuxt/Vue is also anticipated.

To install the Vercel AI SDK, all you need to do is run the following command in your terminal:

npm install ai
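
The route handler we'll write shortly also imports its OpenAI client from the openai-edge package, so install that alongside the SDK as well:

npm install openai-edge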

The entire source code and various example applications are available on GitHub here (also, how did they manage to get that npm namespace!?).

Environment Variables

In many projects, especially those that involve external services like APIs or databases, it's often necessary to use sensitive data such as API keys, database URIs, or secret keys. These should not be hard-coded directly into your codebase, particularly if it's a public repository, as this can pose significant security risks.

That's where environment variables come in. They allow developers to inject these values into the application, keeping them secure and out of the version-controlled codebase.

In Next.js, as in many other Node.js-based projects, environment variables are typically defined in a .env file at the root of your project.

To get an OpenAI API key, head over to the OpenAI website, log in (or sign up), then navigate to your account's API keys page and create a new secret key.

Here's an example of what your .env file might look like for a project using the Vercel AI SDK:

OPENAI_API_KEY=your_openai_api_key

In the above example, OPENAI_API_KEY is the environment variable that is set to the value your_openai_api_key. In your actual project, you would replace your_openai_api_key with your real OpenAI API key.

It's crucial to remember that the .env file contains sensitive information and should not be committed to your public version control system. To ensure this file is ignored, add .env to your .gitignore file:

# .gitignore
.env

The Chat Streaming API

To set up the chat API, dive into the app/api/chat folder (or create it) and create your route.ts. The App Router enables directory-based routing: if you want a new page you put page.tsx in a folder, and if you need an API route it's route.ts.
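
For reference, here's roughly where the files from this guide end up (a sketch; your layout may differ slightly depending on the options you chose):

app/
  api/
    chat/
      route.ts    <- the streaming chat API (this section)
  page.tsx        <- the chat UI (covered in the next section)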

Enter the code below:

import { OpenAIStream, StreamingTextResponse } from 'ai'
import { Configuration, OpenAIApi } from 'openai-edge'

const config = new Configuration({
  apiKey: process.env.OPENAI_API_KEY
})
const openai = new OpenAIApi(config)

export const runtime = 'edge'

export async function POST(req: Request) {
  const { messages } = await req.json()

  const response = await openai.createChatCompletion({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages
  })

  const stream = OpenAIStream(response)

  return new StreamingTextResponse(stream)
}

So what is going on here? First, we import the necessary modules: OpenAIStream and StreamingTextResponse from the Vercel AI SDK, and Configuration and OpenAIApi from the 'openai-edge' package. These modules set the foundation for utilizing OpenAI's capabilities in our application.

Next, we create an OpenAI API client using our OpenAI API key, stored securely as an environment variable. This client serves as our gateway to the powerful language models offered by OpenAI.

After that, we set the runtime to 'edge', telling Next.js to run this route on the Edge runtime, which helps reduce latency and improve performance.

The magic happens within the POST function, which is an asynchronous function handling POST requests.

Upon receiving a request, it extracts the 'messages' from the request's body. These 'messages' are then fed into the OpenAI API client's createChatCompletion method, which generates a dynamic conversation with the language model specified (in this case, 'gpt-3.5-turbo').

The response from the OpenAI API client is then converted into a stream using OpenAIStream. This stream, which essentially represents a dynamic conversation with the AI, is then returned to the client as a StreamingTextResponse.

In essence, this code shows how easily you can set up and manage an AI-powered conversation within your application using the Vercel AI SDK and OpenAI. It's an elegant blend of AI and application development, putting conversational AI at your fingertips.
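
If you want to sanity-check the route before building any UI, you can call it directly. Here's a rough sketch using curl, assuming the dev server is running at http://localhost:3000; the body is simply the messages array the route destructures from req.json():

curl http://localhost:3000/api/chat \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Say hello!"}]}'

You should see the model's reply streamed back as plain text.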

The Simple UI

Tailwind CSS means that setting up the UI is very simple! In the app directory, jump into page.tsx, remove the boilerplate code, and enter the following:

'use client'

import { useChat } from 'ai/react'

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat()

  return (
    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
      {messages.length > 0
        ? messages.map(m => (
            <div key={m.id} className="whitespace-pre-wrap">
              {m.role === 'user' ? 'User: ' : 'AI: '}
              {m.content}
            </div>
          ))
        : null}

      <form onSubmit={handleSubmit}>
        <input
          className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={handleInputChange}
        />
      </form>
    </div>
  )
}

This code showcases a simple chat interface built with the Vercel AI SDK in a React application. The useChat hook from the Vercel AI SDK is used, which provides chat state variables (messages and input) and functions (handleInputChange and handleSubmit) that manage the chat flow.

The UI displays the chat history, where each message has a role ('user' or 'assistant', rendered here as 'User: ' or 'AI: ') and content. Users type their messages into a form field that updates the input state as they go; on submission, the message is sent to the API route and the model's reply streams into the conversation.
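
One detail worth noting: by default useChat posts the conversation to /api/chat, which is exactly where we created our route. If your route ever lives somewhere else, you can point the hook at it via its api option, something like:

const { messages, input, handleInputChange, handleSubmit } = useChat({
  api: '/api/chat' // the default; change this if your route lives at a different path
})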

Give it a crack!

Now that all your hard work is done, it's time to chat! Simply pop in:

npm run dev

This spins up a local development server (at http://localhost:3000 by default) for you to test out. You'll see that you're able to chat with the bot pretty easily.

Note the streamed response, which is an absolute dream: when all this AI stuff first kicked off, getting something like this working meant wrangling polling, WebSockets, or hand-rolled readable streams across many lines of code.
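
For comparison, here's roughly what consuming that streamed response by hand would look like with plain fetch and a ReadableStream reader; just a sketch, since useChat already does all of this (and more) for you:

// Inside some async function: POST the conversation and read the reply as it streams in
const res = await fetch('/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ messages: [{ role: 'user', content: 'Hello!' }] })
})

const reader = res.body!.getReader()
const decoder = new TextDecoder()
let reply = ''
while (true) {
  const { done, value } = await reader.read()
  if (done) break
  reply += decoder.decode(value, { stream: true }) // append each chunk as it arrives
}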


Now you can go ahead and deploy to Vercel with a few clicks and show all your friends your very own chatbot!

Empowering Conversations with AI

And just like that, you've leveled up your application by integrating the power of artificial intelligence! Using the Vercel AI SDK, you've breathed life into your user interfaces, enabling them to converse intelligently and in real-time with your users.

But this is just the tip of the AI iceberg. The Vercel AI SDK works with a range of AI providers and tools, including OpenAI, Hugging Face, and LangChain. Each of these offers unique capabilities and strengths, waiting for you to harness them.

Why stop at GPT-3.5? How about trying out one of the open-source models hosted on Hugging Face, or wiring up more advanced chains with LangChain? The choice is yours, and the possibilities are endless.
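
Swapping models can be as simple as changing one line in the route handler. For example, if your OpenAI account has API access to GPT-4 (an assumption; availability depends on your account), you could change the createChatCompletion call to:

const response = await openai.createChatCompletion({
  model: 'gpt-4', // swapped from 'gpt-3.5-turbo'; requires GPT-4 API access
  stream: true,
  messages
})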

So, dive in, start exploring these different AI models, and watch as your application evolves from simply interactive to intelligently conversational. Go ahead, surprise yourself with what you and AI can achieve together. AI has never been this accessible and fun to play with!

Remember, the future of AI is now. And with the Vercel AI SDK, you are not just ready for it; you're driving it. Happy coding!

If you'd like to know more about AI and making it work for your business, check out my startup Erudii. I'm always looking for people to collaborate with!

References:
If you get stuck anywhere, feel free to reach out. The Vercel documentation is excellent, and the GitHub repo is also easy to clone if you don't have time to code from scratch and just want to get stuck into customising.

The repo also includes a chat-with-functions route, a fantastic new feature covered in another blog post.
