Originally published at hafiz.dev
Laravel just shipped something big. On February 5, 2026, Taylor Otwell dropped the Laravel AI SDK, a first-party package for building AI-powered features directly in your Laravel apps. And honestly? I've been waiting for this one.
I've built multiple AI-powered products over the past couple of years: StudyLab (AI quiz generation from PDFs), ReplyGenius (Chrome extension for AI-powered review replies), and PromptOptimizer (AI prompt enhancement tool). For each one, I had to piece together my own AI integration layer. Different packages, different approaches, lots of boilerplate.
So when I saw `composer require laravel/ai` trending on X with 23K views on Taylor's tweet alone, I immediately dug into the docs. Here's what I found, what actually changed, and whether you should jump in right now.
What Is the Laravel AI SDK?
It's a unified API for working with AI providers like OpenAI, Anthropic, Gemini, xAI, and others. One package, one interface, multiple providers. Think of it like how Laravel's filesystem abstraction lets you swap between S3 and local storage. Same idea, but for AI.
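The filesystem comparison can be made concrete. Here's the driver pattern in plain PHP — a sketch only, with a hypothetical `TextProvider` interface and fake drivers, not the SDK's actual classes:

```php
<?php

// Hypothetical sketch of the driver pattern behind "one interface,
// multiple providers". TextProvider, OpenAiDriver, and AnthropicDriver
// are illustrative names, not the SDK's real classes.
interface TextProvider
{
    public function generate(string $prompt): string;
}

class OpenAiDriver implements TextProvider
{
    public function generate(string $prompt): string
    {
        return "openai: {$prompt}"; // a real driver would call the OpenAI API
    }
}

class AnthropicDriver implements TextProvider
{
    public function generate(string $prompt): string
    {
        return "anthropic: {$prompt}"; // a real driver would call Anthropic
    }
}

// Calling code depends only on the interface, so swapping providers is a
// one-line change -- the same trick Storage uses for S3 vs local disks.
function summarize(TextProvider $ai, string $text): string
{
    return $ai->generate("Summarize: {$text}");
}

echo summarize(new OpenAiDriver, 'hello'), "\n";
echo summarize(new AnthropicDriver, 'hello'), "\n";
```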
The SDK covers a lot more than just text generation:
| Feature | Supported Providers |
|---|---|
| Text / Agents | OpenAI, Anthropic, Gemini, Groq, xAI |
| Image Generation | OpenAI, Gemini, xAI |
| Text-to-Speech | OpenAI, ElevenLabs |
| Speech-to-Text | OpenAI, ElevenLabs |
| Embeddings | OpenAI, Gemini, Cohere, Jina |
| Reranking | Cohere, Jina |
| File Storage | OpenAI, Anthropic, Gemini |
That's not just a wrapper around ChatGPT. This covers agents, images, audio, transcription, embeddings, vector stores, and RAG. Basically everything you need to build AI-native applications.
The Evolution: How We Got Here
Here's what makes this interesting for me. I've literally used every approach Laravel developers have tried for AI integration. Let me walk you through the journey.
Phase 1: The openai-php/client Days
When I first built StudyLab's quiz generation, I started with the openai-php/client package. It worked, but it was pretty raw. You were basically writing API calls with a nicer syntax:
```php
use OpenAI;

$client = OpenAI::client(config('services.openai.key'));

$response = $client->chat()->create([
    'model' => 'gpt-4',
    'messages' => [
        ['role' => 'system', 'content' => 'You are a quiz generator...'],
        ['role' => 'user', 'content' => $pdfContent],
    ],
    'response_format' => ['type' => 'json_object'],
]);

$quizData = json_decode($response->choices[0]->message->content, true);
// Hope the JSON is valid...
// Hope the structure matches what you expected...
// Handle errors manually...
```
It got the job done. But you were locked into OpenAI. Switching providers meant rewriting everything. And if you wanted structured output? You'd parse JSON from a string and pray it was valid.
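For reference, "parse and pray" usually turned into a defensive validation layer like this — a simplified sketch, with illustrative validation rules (in production this lived inside a much larger service class):

```php
<?php

// Simplified sketch of the manual JSON validation the old approach forced
// on you. The expected structure here is illustrative.
function parseQuizJson(string $raw): array
{
    // JSON_THROW_ON_ERROR: throw instead of silently returning null.
    $data = json_decode($raw, true, 512, JSON_THROW_ON_ERROR);

    if (!isset($data['questions']) || !is_array($data['questions'])) {
        throw new UnexpectedValueException('Missing "questions" array');
    }

    foreach ($data['questions'] as $i => $q) {
        foreach (['question', 'options', 'correct_answer'] as $key) {
            if (!array_key_exists($key, $q)) {
                throw new UnexpectedValueException("Question {$i} is missing \"{$key}\"");
            }
        }
    }

    return $data['questions'];
}

$questions = parseQuizJson(
    '{"questions":[{"question":"2+2?","options":["3","4"],"correct_answer":1}]}'
);
```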
Phase 2: The openai-php/laravel Community Package
The openai-php/laravel package added Laravel-specific niceties like a facade and config file. Made things a bit cleaner:
```php
use OpenAI\Laravel\Facades\OpenAI;

$response = OpenAI::chat()->create([
    'model' => 'gpt-4',
    'messages' => [
        ['role' => 'system', 'content' => $systemPrompt],
        ['role' => 'user', 'content' => $userMessage],
    ],
]);
```
Better developer experience. Still OpenAI-only though. When I wanted to try Anthropic's Claude for ReplyGenius (better at nuanced writing), I had to install a completely different package. Two AI packages, two APIs, two sets of error handling.
Phase 3: Prism PHP
Then I discovered Prism PHP. This was the turning point. Prism gave you a unified interface across providers:
```php
use Prism\Prism\Prism;

$response = Prism::text()
    ->using('anthropic', 'claude-sonnet-4-5-20250514')
    ->withSystemPrompt('You are a quiz generator...')
    ->withPrompt($userMessage)
    ->asJson()
    ->generate();
```
One API, swap providers with a single line. I started using Prism for every new project. It felt like how AI integration in Laravel should work.
And here's the fun part: the Laravel AI SDK actually uses Prism under the hood. If you check the Packagist listing, you'll see `prism-php/prism: ^0.99.0` as a dependency. So the Laravel team didn't reinvent the wheel. They built on top of what was already working and added the Laravel magic on top.
What the SDK Actually Changes
So if Prism was already good, why does the SDK matter? Because it goes further. Way further.
Agents as First-Class Citizens
This is the biggest shift. Instead of writing AI logic inside controllers or services, you create dedicated Agent classes:
```shell
php artisan make:agent QuizGenerator --structured
```
```php
<?php

namespace App\Ai\Agents;

use Illuminate\Contracts\JsonSchema\JsonSchema;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\HasStructuredOutput;
use Laravel\Ai\Contracts\HasTools;
use Laravel\Ai\Promptable;

class QuizGenerator implements Agent, HasStructuredOutput, HasTools
{
    use Promptable;

    public function instructions(): string
    {
        return 'You are an educational quiz generator. Create multiple-choice questions from the provided content. Each question should test understanding, not just memorization.';
    }

    public function schema(JsonSchema $schema): array
    {
        return [
            'questions' => $schema->array(items: [
                'question' => $schema->string()->required(),
                'options' => $schema->array(items: $schema->string())->required(),
                'correct_answer' => $schema->integer()->required(),
                'explanation' => $schema->string()->required(),
            ])->required(),
        ];
    }

    public function tools(): iterable
    {
        return [];
    }
}
```
Then you just use it:
```php
$response = (new QuizGenerator)->prompt(
    'Generate 5 questions from this content: ' . $pdfContent
);

// $response['questions'] is already structured. No JSON parsing.
// No "hope the format is right" situation.
return $response['questions'];
```
Compare that to the old approach, where I had a 200-line service class in StudyLab handling prompt building, API calls, JSON parsing, validation, and error recovery. The Agent pattern encapsulates all of that cleanly.
Built-in Conversation Memory
If you're building a chatbot or any conversational AI feature, this is huge. The SDK handles conversation persistence out of the box:
```php
use Laravel\Ai\Concerns\RemembersConversations;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\Conversational;
use Laravel\Ai\Promptable;

class SupportBot implements Agent, Conversational
{
    use Promptable, RemembersConversations;

    public function instructions(): string
    {
        return 'You are a helpful support assistant...';
    }
}

// Start a conversation
$response = (new SupportBot)->forUser($user)->prompt('Hi, I need help');
$conversationId = $response->conversationId;

// Continue later
$response = (new SupportBot)
    ->continue($conversationId, as: $user)
    ->prompt('Can you explain that more?');
```
No more manually storing messages in a custom table. No more loading conversation history and passing it back. The SDK creates `agent_conversations` and `agent_conversation_messages` tables and handles everything.
I built something similar manually for StudyLab's AI tutor feature. It took me about two days to get conversation tracking, message limits, and context management working properly. With the SDK, it's a trait.
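To appreciate what the trait replaces, here's the shape of the manual approach — a simplified sketch with an in-memory store standing in for the database table (real versions also trimmed history to fit the context window):

```php
<?php

// Simplified sketch of manual conversation tracking: load history,
// append the new message, re-send everything, store the reply.
class ConversationStore
{
    /** @var array<string, array<int, array{role: string, content: string}>> */
    private array $conversations = []; // real version: a database table

    public function append(string $id, string $role, string $content): void
    {
        $this->conversations[$id][] = ['role' => $role, 'content' => $content];
    }

    /** Full message list to send to the provider on every turn. */
    public function history(string $id): array
    {
        return $this->conversations[$id] ?? [];
    }
}

$store = new ConversationStore;
$store->append('conv-1', 'user', 'Hi, I need help');
$store->append('conv-1', 'assistant', 'Sure, what is the issue?');
$store->append('conv-1', 'user', 'Can you explain that more?');

// Every request re-sends the whole history -- the bookkeeping the
// RemembersConversations trait now does for you.
$payload = $store->history('conv-1');
```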
Streaming and Queuing
Two features I'm really excited about. Streaming lets you return AI responses as Server-Sent Events directly from a route:
```php
Route::get('/coach', function () {
    return (new SalesCoach)->stream('Analyze this transcript...');
});
```
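If you haven't worked with SSE before: the wire format is just text frames of `data:` lines separated by blank lines, which the browser's `EventSource` API parses. A minimal framing sketch (`sseFrame` is an illustrative helper, not part of the SDK):

```php
<?php

// Minimal sketch of Server-Sent Events framing: each chunk becomes a
// "data: ..." frame terminated by a blank line.
function sseFrame(string $chunk): string
{
    // Multi-line chunks need one "data:" prefix per line.
    $lines = explode("\n", $chunk);

    return 'data: ' . implode("\ndata: ", $lines) . "\n\n";
}

echo sseFrame('Hello');      // "data: Hello\n\n"
echo sseFrame("two\nlines"); // "data: two\ndata: lines\n\n"
```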
And queuing pushes AI work to the background, which is critical for anything production-grade:
```php
Route::post('/generate', function (Request $request) {
    return (new QuizGenerator)
        ->queue($request->input('content'))
        ->then(function (AgentResponse $response) {
            // Process when done
        })
        ->catch(function (Throwable $e) {
            // Handle failure
        });
});
```
This plays perfectly with Laravel's queue system. If you're already running Horizon or Supervisor for background jobs, your AI tasks slot right into the same infrastructure.
Smart Failover
AI providers go down. Rate limits happen. The SDK handles this gracefully:
```php
$response = (new QuizGenerator)->prompt(
    'Generate questions...',
    provider: ['openai', 'anthropic'],
);
```
Pass an array of providers and the SDK automatically fails over to the next one if the primary is unavailable. I've had OpenAI rate limit me during peak hours on StudyLab. My workaround was a try/catch with a manual fallback. This is much cleaner.
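For comparison, that manual fallback looked roughly like this — a simplified sketch where callables stand in for real provider clients:

```php
<?php

// Simplified sketch of manual provider failover: try each provider in
// order, return the first success, rethrow if all fail.
// The callables stand in for real provider clients.
function promptWithFailover(array $providers, string $prompt): string
{
    $lastError = null;

    foreach ($providers as $provider) {
        try {
            return $provider($prompt);
        } catch (RuntimeException $e) {
            $lastError = $e; // e.g. a 429 rate limit -- try the next one
        }
    }

    throw $lastError ?? new RuntimeException('No providers configured');
}

$openai = function (string $p): string {
    throw new RuntimeException('429 Too Many Requests'); // simulate a rate limit
};
$anthropic = fn (string $p): string => "claude: {$p}";

// Falls through to the second provider.
$answer = promptWithFailover([$openai, $anthropic], 'Generate questions...');
```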
Testing Without Burning API Credits
This might be my favorite feature for day-to-day development. You can fake agent responses in tests:
```php
use App\Ai\Agents\QuizGenerator;

QuizGenerator::fake([
    'questions' => [
        ['question' => 'Test?', 'options' => ['A', 'B'], 'correct_answer' => 0, 'explanation' => 'Because...'],
    ],
]);

$response = (new QuizGenerator)->prompt('Generate quiz...');
// Works with structured output, no API call made
```
Before this, I was either mocking HTTP responses (messy) or burning real API credits in tests (expensive). Having first party testing support is a big quality of life improvement.
What You Should Know Before Jumping In
Let me be real about a few things.
It requires PHP 8.4 and Laravel 12. This isn't backwards compatible. If you're on Laravel 10 or 11, you'll need to upgrade first. For new projects, that's fine. For existing ones, check the upgrade guide before planning your migration.
It's v0.1.2 right now. The SDK was released this week. It works, it's documented, but it's early. I wouldn't rip out a working Prism or openai-php integration from a production app just to switch. For new projects or new AI features? Absolutely use it.
Vector stores need PostgreSQL with pgvector. If you're using MySQL (like I do for most projects), the embeddings and RAG features won't work out of the box. You'll need PostgreSQL for vector similarity search. Something to consider if you're planning to build search or document retrieval features.
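If you do want to plan for PostgreSQL, the pgvector side is straightforward. A minimal schema sketch — table and column names are illustrative, and the 1536 dimension assumes OpenAI's text-embedding-3-small model:

```sql
-- pgvector setup: enable the extension, store embeddings as a vector
-- column, and query by cosine distance (the <=> operator).
CREATE EXTENSION IF NOT EXISTS vector;

CREATE TABLE document_chunks (
    id bigserial PRIMARY KEY,
    content text NOT NULL,
    embedding vector(1536) NOT NULL
);

-- Nearest neighbours to a query embedding:
SELECT id, content
FROM document_chunks
ORDER BY embedding <=> :query_embedding
LIMIT 5;
```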
The Artisan commands are nice. `php artisan make:agent` and `php artisan make:tool` scaffold everything you need, similar to how `make:controller` and `make:model` work. It feels native.
My Honest Take: Should You Use It?
Here's how I'd break it down.
Use it now if you're starting a new Laravel 12 project with AI features. There's no reason to reach for a third-party package anymore. The SDK covers text, images, audio, embeddings, and agents. It's well-documented and backed by the Laravel team.
Wait a bit if you have a working production app using Prism or openai-php. Your current setup is fine. When you add your next AI feature, consider building it with the SDK alongside your existing code. Migrate gradually.
Get excited if you've been building AI features with manual HTTP calls and custom service classes. The jump from raw API calls to the SDK's Agent pattern is massive. It's the difference between writing raw SQL and using Eloquent.
For me personally, I'm going to use it for my next project. I've been through the full evolution (openai-php → community package → Prism → now this), and the SDK feels like the natural endpoint. It does what Laravel does best: takes something complex and makes it feel obvious.
Common Mistakes to Avoid
Don't create monolithic agents. Keep each agent focused on one task. A QuizGenerator agent, a ContentSummarizer agent, a SupportBot agent. Not one giant AIHelper class that does everything.
Don't skip structured output. If you expect JSON back from an AI, use `HasStructuredOutput` with a proper schema. Don't parse raw text responses. The schema validation alone will save you hours of debugging.
Don't forget about timeouts. AI calls can be slow, especially with large prompts. Pass a timeout parameter when needed:
```php
$response = (new QuizGenerator)->prompt(
    $longContent,
    timeout: 120,
);
```
Don't ignore the queue option. For anything user-facing where the AI response takes more than a couple of seconds, use `->queue()` instead of `->prompt()`. Your users will thank you.
FAQ
Can I use the Laravel AI SDK with Laravel 11?
No. The SDK requires PHP 8.4 and Laravel 12. You'll need to upgrade first. Check the official upgrade guide for details.
Does the Laravel AI SDK replace Prism PHP?
Not exactly. The SDK actually uses Prism under the hood as a dependency. It builds on top of Prism and adds Laravel-specific features like Artisan commands, the Agent pattern, conversation persistence, streaming responses, and testing utilities.
Can I use the Laravel AI SDK with MySQL?
Yes, for most features (agents, text generation, images, audio). But vector stores and embedding queries require PostgreSQL with the pgvector extension. Standard text-based AI features work with any database.
Is the Laravel AI SDK production-ready?
It's v0.1.2 and was released on February 5, 2026. The docs are comprehensive and the API is clean, but it's early. For new projects, it's a solid choice. For existing production apps, I'd introduce it gradually alongside your current setup rather than doing a full migration.
How does the Laravel AI SDK compare to building with raw HTTP calls?
Night and day. You go from manually constructing API payloads, parsing JSON responses, handling errors, and managing conversation state to using structured Agent classes with built-in tools, testing support, and failover. Similar to the jump from raw SQL to Eloquent.
What's Next
Taylor Otwell and Josh Cirre are doing a live walkthrough on February 9, building a real app with the SDK. Worth watching if you want to see it in action before writing any code.
The AI section in the Laravel docs now has three pages: AI SDK, Boost, and MCP. All three work together to make Laravel arguably the best PHP framework for AI-powered applications.
If you're building a SaaS product with AI features, or adding AI capabilities to an existing Laravel API, this SDK is going to be the standard way to do it. I'm excited to see what the community builds with it.
Need help adding AI features to your Laravel project? Get in touch.