Roberto B.
Building Your First AI Agent in PHP with Symfony’s New AI Component and Ollama

Learn how to build an AI agent in PHP using Symfony’s new AI Component and Ollama. This hands-on guide walks you through the installation, configuration, and running of a simple real-world example that rewrites and improves Markdown content locally with Gemma 3. Perfect for developers exploring how AI fits into the Symfony ecosystem.

In my previous article, I showed how a PHP developer can build a simple AI agent using Neuron AI and Ollama.

This time, let’s explore something new and exciting: the Symfony AI Component, a new experimental package that brings AI capabilities directly into the Symfony ecosystem.

What is the Symfony AI component?

The Symfony AI component provides a clean, framework-agnostic interface to interact with AI models from different providers (like Ollama, OpenAI, and others).

It’s designed to feel natural for Symfony developers, utilizing familiar concepts such as factories, messages, and bridges for various AI platforms.

Installation

At the moment, the component is still under development, so you'll need to allow development versions in Composer by setting "minimum-stability" to "dev" in your composer.json file:

{
    "minimum-stability": "dev"
}
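If you want Composer to still prefer stable releases for all your other dependencies, you can combine this setting with "prefer-stable":

```json
{
    "minimum-stability": "dev",
    "prefer-stable": true
}
```

With this combination, Composer only falls back to a dev version when no stable release satisfies a constraint.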

Then install the Symfony AI component using Composer:

composer require symfony/ai-agent

Example: using the Symfony AI component with Ollama

Let’s look at a real example that reviews and rewrites an article in Markdown format using Gemma 3 through Ollama.

Here’s the complete code:

<?php

use Symfony\AI\Platform\Bridge\Ollama\PlatformFactory;
use Symfony\AI\Platform\Message\Message;
use Symfony\AI\Platform\Message\MessageBag;
use Symfony\Component\HttpClient\HttpClient;

require "./vendor/autoload.php";

$platform = PlatformFactory::create(
    'http://localhost:11434',
    HttpClient::create()
);

$articleMarkdownFile = './article-1.md';
$articleMarkdown = file_get_contents($articleMarkdownFile);

$messages = new MessageBag(
    Message::forSystem('Review and rewrite the following article in clear, grammatically correct, and professional English. Improve readability, correct any errors, and ensure the tone remains neutral and coherent. Do not change the meaning of the text. Output only the corrected and polished version of the article.'),
    Message::ofUser("This is the article: " . $articleMarkdown),
);

try {
    $result = $platform->invoke("gemma3:4b", $messages);
    echo $result->getResult()->getContent() . PHP_EOL;
} catch (InvalidArgumentException $e) {
    echo $e->getMessage();
}

How it works

Let’s break this down step by step:

Step 1) Import and autoload dependencies

require "./vendor/autoload.php";

This ensures all Symfony AI and HTTP client classes are available.

Step 2) Create the AI platform object

$platform = PlatformFactory::create(
    'http://localhost:11434',
    HttpClient::create()
);

The PlatformFactory creates a bridge to Ollama, which runs locally.
You can replace the URL with another endpoint if you’re using a remote Ollama instance.
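For example, a common pattern (not specific to the Symfony AI component) is to make the endpoint configurable through an environment variable, falling back to the local default. The `OLLAMA_URL` variable name here is just an illustration, not something the component itself reads:

```php
<?php

use Symfony\AI\Platform\Bridge\Ollama\PlatformFactory;
use Symfony\Component\HttpClient\HttpClient;

require "./vendor/autoload.php";

// Read the Ollama endpoint from the environment, defaulting to localhost.
$ollamaUrl = getenv('OLLAMA_URL') ?: 'http://localhost:11434';

$platform = PlatformFactory::create($ollamaUrl, HttpClient::create());
```

This lets the same script talk to a local instance during development and a remote one in other environments without code changes.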

Step 3) Prepare your input data

$articleMarkdownFile = './article-1.md';
$articleMarkdown = file_get_contents($articleMarkdownFile);

The script reads a local Markdown article to be processed.
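Since file_get_contents() returns false on failure, it's worth guarding against a missing file before sending anything to the model. A minimal sketch:

```php
<?php

$articleMarkdownFile = './article-1.md';

// Bail out early if the article file is missing, instead of sending
// a boolean false to the model.
if (!is_file($articleMarkdownFile)) {
    fwrite(STDERR, "Article file not found: {$articleMarkdownFile}" . PHP_EOL);
    exit(1);
}

$articleMarkdown = file_get_contents($articleMarkdownFile);
```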

Step 4) Define messages for the AI model

$messages = new MessageBag(
    Message::forSystem('Review and rewrite...'),
    Message::ofUser(
        "This is the article: " . $articleMarkdown),
);

The system message (forSystem()) tells the model what task to perform, while the user message (ofUser()) provides the actual article text.

Step 5) Invoke the model

$result = $platform->invoke("gemma3:4b", $messages);

Here we call the Gemma 3 model (gemma3:4b) through Ollama.
The AI processes the messages and returns a polished version of the article.
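The same call works with any model you have pulled locally in Ollama; for example, you could try a smaller Gemma variant (you'd need to pull it first with ollama pull):

```php
// Any locally pulled Ollama model name can be passed as the first argument.
$result = $platform->invoke("gemma3:1b", $messages);
```

This makes it easy to trade quality for speed without changing the rest of the script.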

Step 6) Retrieve and display the result

echo $result->getResult()->getContent();

The reviewed article is printed directly to your console.
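Instead of only printing it, you could also keep a copy of the output on disk next to the original. This is a small sketch; the .reviewed.md filename is just an example:

```php
<?php

// Assuming $result holds the value returned by $platform->invoke().
$reviewed = $result->getResult()->getContent();

// Print the polished article and save a copy alongside the original.
echo $reviewed . PHP_EOL;
file_put_contents('./article-1.reviewed.md', $reviewed);
```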

Running it

Make sure you have the Ollama service running locally with the Gemma 3 model available:

ollama pull gemma3:4b
ollama serve

Then execute your PHP script:

php ai-reviewer.php

You should see your improved article printed in the terminal.

Conclusion

The Symfony AI Component is still in its early stages, but it already offers a clean and extensible way to interact with AI models directly from PHP.

For developers familiar with Symfony’s structure and philosophy, it feels natural and powerful, especially when combined with local tools like Ollama.

This opens the door for Symfony apps to integrate AI-driven workflows such as:

  • Content rewriting or summarization
  • Code review and documentation generation
  • Natural language queries for data
  • Chatbots or assistants built into Symfony projects
