Implementing Vercel's AI SDK as a component
Requirements:
- Knowledge of Next.js v13+ (App directory)
- React/TypeScript (preferred)
- Intermediate-level JavaScript knowledge
Documentation can be found here
In this video, I showcase how to implement Vercel's AI SDK in your web app as a component. To do this with Vercel's SDK, you will need to use Next.js.
I bootstrapped a fresh project, but you can also add the dependencies manually if you have an existing project that is not already using Next.js.
Getting Started
Once you have your project laid out, install the dependencies necessary to implement the SDK.
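A minimal sketch of that step, assuming the current package names (the core SDK ships as `ai` on npm, and the examples below also use the official `openai` client):

```bash
# Install the Vercel AI SDK and the official OpenAI client
npm install ai openai
```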
Then, create your server-side API route at app/api/chat/route.ts (or route.js if you are not using TypeScript). This file holds all the code necessary for your server-side API call.
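Below is a minimal sketch of what that route handler can look like, assuming the `ai` package's `OpenAIStream`/`StreamingTextResponse` helpers and the `openai` v4 client; the exact imports vary between SDK versions, so treat this as illustrative rather than as the exact code from the video:

```ts
// app/api/chat/route.ts
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

// The key is read from .env.local; never hard-code it.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Optional: run on the Edge runtime for lower-latency streaming.
export const runtime = 'edge';

export async function POST(req: Request) {
  // The useChat hook on the client sends the conversation as { messages }.
  const { messages } = await req.json();

  // Request a streamed chat completion from the cheaper gpt-3.5-turbo model.
  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages,
  });

  // Pipe the OpenAI stream back to the browser as a streaming response.
  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}
```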
Excellent! Now we just need to create the component to showcase the new LLM on our site.
At the root level of your app, in a components directory, or wherever you prefer, create a component, Chat.tsx, which will handle your client-side logic.
Implement your client-side functionality, and there you have it: a component that renders the JSX for a form. Inside the form, users can type a message and click Send. Submitting that input to the API is handled by the useChat hook the SDK provides.
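Here is a minimal sketch of such a component, assuming the `ai/react` entry point from the same SDK era (newer releases moved the hook to `@ai-sdk/react`):

```tsx
// Chat.tsx
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  // useChat POSTs to /api/chat by default and manages the message history,
  // the controlled input, and the streaming response for us.
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {/* Render the running conversation */}
      {messages.map((m) => (
        <div key={m.id}>
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.content}
        </div>
      ))}

      {/* The form users type into; submitting calls the API route */}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
        />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}
```

Render `<Chat />` on any page (for example, in app/page.tsx) and the form is live.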
Essentially, we are just adding an interface into which we can stream responses from OpenAI's ChatGPT via an API call.
Keep in mind, this does cost money, which is why I use the gpt-3.5-turbo model. User input is converted into tokens and charged to your account via the API key you added in the server-side API route.
Another FYI
Add your .env.local file to your .gitignore if you are using a GitHub repository for your project. This keeps your API key out of version control, so it cannot be stolen and used to run up charges on your credit card.
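For reference, a sketch of the two files involved; the variable name matches the one assumed in the route-handler sketch above, and projects bootstrapped with create-next-app typically ignore `.env*.local` out of the box:

```bash
# .env.local  (never commit this file)
OPENAI_API_KEY=sk-your-key-here

# .gitignore  (add this line if it is not already present)
.env*.local
```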
In production, you will need to add your environment variables to whatever service you deploy with. Vercel makes deployment exceptionally easy, which you can check out here.