By now you've probably seen one of those demos of AI apps so fast that you don't even need to stream the responses. Building a full-stack AI app used to be complicated enough to cost startups millions of dollars and teams of talented engineers. Fortunately, with the tools we have access to today, not only can you build one by yourself, you can build one in a weekend.
Here's what you'll use:
- Supabase (start for free)
- A JavaScript framework for the frontend (just pick any you like, there are billions)
- Llama 3 via Groq Cloud (they have insanely fast chips for running AI models)
Now that you know what you'll use, it's time to get started.
First step to building an AI app
Drink some water and touch some grass. Seriously, you've been in front of your laptop for too long and it's not healthy. Please go outside.
Setting up a Supabase project
Now that you're back from outside (or, more likely, just ignored that part), you're ready to set up a Supabase project. The managed version is a few clicks on supabase.com. Self-hosting is a bit more involved, but if you want to go down that route, please read this guide.
Getting Groq
Sign up on Groq Cloud. You might have to wait for API access, but that may have changed by the time you read this.
Calling Supabase
Assuming you have some kind of frontend project already set up with npm, you'll want to install Supabase with this command:
npm i @supabase/supabase-js
You'll then be able to create a Supabase client with this code:
import { createClient } from '@supabase/supabase-js'

// You'll find both of these under Project Settings -> API in the Supabase dashboard
const supabaseUrl = 'YOUR_URL'
const supabaseKey = 'YOUR_KEY'

const supabase = createClient(supabaseUrl, supabaseKey)
Now, you have access to Supabase in your project.
Calling Llama 3 From the Edge
We'll use a Supabase Edge Function to call Llama 3 on the Groq API. Edge Functions are a super easy way to run custom server-side code in a Supabase project.
You'll need to install the Supabase CLI; the steps differ depending on your OS, so check out this guide for details and come right back here.
Run supabase init
in your project folder to initialize Supabase, then run supabase functions new groq-api
which will create a function called 'groq-api'.
Open up the index.ts file (it lives at supabase/functions/groq-api/index.ts) and add this code:
import { serve } from 'https://deno.land/std@0.177.0/http/server.ts'

export const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
}

serve(async (req) => {
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders })
  }
})
This is important because we want to call our function from a browser: before the real request, the browser sends a preflight OPTIONS request, and the function needs to answer it with an ok and the CORS headers.
Now add this under the if statement:
const GROQ_API_KEY = 'GROQ_API_KEY'

const response = await fetch('https://api.groq.com/openai/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${GROQ_API_KEY}`
  },
  body: JSON.stringify({
    "messages": [
      {
        "role": "user",
        "content": "Hello Llama!"
      }
    ],
    "model": "llama3-8b-8192",
    "temperature": 1,
    "max_tokens": 1024,
    "top_p": 1,
    "stream": false,
    "stop": null
  })
})
This will call the Groq API without streaming the response and it will pass a message from the user that just says, "Hello Llama!"
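For reference, the chat completions endpoint responds with an OpenAI-style JSON body. Trimmed down, it looks roughly like this (the reply text is just an example), which is why we'll read choices[0].message.content in a moment:
{
  "model": "llama3-8b-8192",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ]
}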
Now let's allow the user to send their own message. Add this right after the OPTIONS check (the preflight request has no JSON body, so don't parse it any earlier):
const { message } = await req.json()
That will destructure message from the request JSON; also swap the hard-coded "Hello Llama!" in the request body for message so the user's text is what actually gets sent to the model. Then parse the Groq response and return the completion as follows:
const data = await response.json()
const completion = data.choices[0].message.content

return new Response(JSON.stringify({ "completion": completion }), {
  headers: { ...corsHeaders, 'Content-Type': 'application/json' },
  status: 200,
})
Your code should now look like this:
import { serve } from 'https://deno.land/std@0.177.0/http/server.ts'

export const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
}

serve(async (req) => {
  // This is needed if you're planning to invoke your function from a browser.
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders })
  }

  // Parse the user's message after the preflight check (an OPTIONS request has no body)
  const { message } = await req.json()

  const GROQ_API_KEY = 'GROQ_API_KEY' // replace with your key; better yet, read it from an environment variable (see the notes below)

  const response = await fetch('https://api.groq.com/openai/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${GROQ_API_KEY}`
    },
    body: JSON.stringify({
      "messages": [
        {
          "role": "user",
          "content": message
        }
      ],
      "model": "llama3-8b-8192",
      "temperature": 1,
      "max_tokens": 1024,
      "top_p": 1,
      "stream": false,
      "stop": null
    })
  })

  const data = await response.json()
  const completion = data.choices[0].message.content

  return new Response(JSON.stringify({ "completion": completion }), {
    headers: { ...corsHeaders, 'Content-Type': 'application/json' },
    status: 200,
  })
})
Deploy your function by running supabase functions deploy groq-api
from your project folder.
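If you want to sanity-check the deployed function before touching the frontend, you can hit it with curl. This sketch assumes JWT verification is on (the default), so you pass your project's anon key; YOUR_PROJECT_REF and YOUR_ANON_KEY are placeholders for your own values:
curl -i --location --request POST 'https://YOUR_PROJECT_REF.supabase.co/functions/v1/groq-api' \
  --header 'Authorization: Bearer YOUR_ANON_KEY' \
  --header 'Content-Type: application/json' \
  --data '{"message": "Hello Llama!"}'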
Now all you need to do is use your function in your frontend project. Import the Supabase client you created earlier and invoke the function:
import { supabase } from "/path/to/supabase"
const { data, error } = await supabase.functions.invoke('groq-api', {
body: { message: 'The message value from the user' }
})
All done! Depending on your framework you'll handle variables and state differently, but the part above is framework agnostic.
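Here's a minimal sketch of the wiring in plain TypeScript, assuming a text input with id message-input and an element with id output on your page (both ids and the askLlama helper are just for illustration):
import { supabase } from '/path/to/supabase'

const input = document.querySelector<HTMLInputElement>('#message-input')!
const output = document.querySelector('#output')!

// Hypothetical helper: send the user's text to the edge function and show the reply
async function askLlama() {
  const { data, error } = await supabase.functions.invoke('groq-api', {
    body: { message: input.value },
  })
  if (error) {
    output.textContent = 'Something went wrong, please try again.'
    return
  }
  // The edge function responds with { completion: "..." }
  output.textContent = data.completion
}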
A few more things to note:
- We didn't implement chat history; you could store that in your Supabase DB and pull it dynamically depending on who the user is (see the sketch after this list)
- Use environment variables instead of hard-coding API keys when possible (there's a sketch of that after this list too)
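On that second point: Supabase Edge Functions can read secrets through Deno.env.get. A minimal sketch, assuming you store the key as a secret named GROQ_API_KEY:
supabase secrets set GROQ_API_KEY=your-actual-key
Then, in index.ts, replace the hard-coded constant with:
const GROQ_API_KEY = Deno.env.get('GROQ_API_KEY') ?? ''
And for chat history, one rough approach is a messages table keyed by user that you insert into after every exchange and read back before the next request. The table and column names here (messages, user_id, role, content, created_at) are assumptions, not part of this tutorial's setup:
// Save the user's message and the model's reply
await supabase.from('messages').insert([
  { user_id: userId, role: 'user', content: message },
  { user_id: userId, role: 'assistant', content: completion },
])

// Later, pull that user's history in order to build the next prompt
const { data: history } = await supabase
  .from('messages')
  .select('role, content')
  .eq('user_id', userId)
  .order('created_at', { ascending: true })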
For now, you've got a great start. Time to scale it to millions.