Roberto B.

Building Your First AI Agent in PHP with Symfony’s New AI Components and Ollama

Learn how to build an AI agent in PHP using Symfony’s new AI Component and Ollama. This hands-on guide walks you through the installation, configuration, and running of a simple real-world example that rewrites and improves Markdown content locally using Gemma 3. Perfect for developers exploring how AI fits into the Symfony ecosystem.

In my previous article, I showed how a PHP developer can build a simple AI agent using Neuron AI and Ollama.

This time, let’s explore something new and exciting: the Symfony AI Components, a set of new PHP/Symfony packages that bring AI capabilities directly into the Symfony ecosystem.

This article has been updated to use the latest version of the Symfony AI components, tagged as 0.1.

What are the Symfony AI components?

The Symfony AI components provide a clean, framework-agnostic interface to interact with AI models from different providers (like Ollama, OpenAI, and others).

They are designed to feel natural for Symfony developers, relying on familiar concepts such as factories, messages, and bridges for the various AI platforms.

Installation

Symfony AI is now available as a tagged 0.1 release, introducing a set of dedicated packages focused on specific responsibilities.
In this article, we utilize the agent component, which provides a unified abstraction over multiple AI providers, including OpenAI, Ollama, Anthropic, Gemini, and others.

Install the agent component

The Agent component is the high-level entry point you call from your application code: it combines a platform and a model into a single, reusable object.

Install it using Composer:

composer require symfony/ai-agent

Install a platform implementation

Next, install at least one concrete platform implementation. For example, to use Ollama:

composer require symfony/ai-ollama-platform

You can install additional platform packages for the other providers you want to use (OpenAI, Anthropic, etc.). For example, to use Google Gemini instead of Ollama, install the corresponding platform package:

composer require symfony/ai-gemini-platform

This keeps your code flexible, making it easy to swap providers and models later.
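As a rough sketch of what that swap looks like: typically only the platform-creation line changes. The Ollama lines below match the example used throughout this article; the Gemini bridge namespace and factory arguments are assumptions, so check the installed bridge's documentation before relying on them.

```php
<?php
// The Ollama factory call matches this article's example.
// The Gemini lines are a hypothetical sketch, commented out.

use Symfony\AI\Platform\Bridge\Ollama\PlatformFactory as OllamaFactory;
use Symfony\Component\HttpClient\HttpClient;

// Ollama (local):
$platform = OllamaFactory::create('http://localhost:11434', HttpClient::create());

// Gemini (hosted) -- assumed class name and signature, verify in your version:
// use Symfony\AI\Platform\Bridge\Gemini\PlatformFactory as GeminiFactory;
// $platform = GeminiFactory::create($_ENV['GEMINI_API_KEY'], HttpClient::create());
```

Because the `Agent` only depends on the platform abstraction, the rest of your code stays untouched when the provider changes.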

Example: using the Symfony AI Agent component with Ollama

Let’s look at a real example that reviews and rewrites an article in Markdown format using Gemma 3 through Ollama.

Here’s the complete code:

<?php

require "./vendor/autoload.php";

use Symfony\AI\Agent\Agent;
use Symfony\AI\Platform\Bridge\Ollama\PlatformFactory;
use Symfony\AI\Platform\Message\Message;
use Symfony\AI\Platform\Message\MessageBag;
use Symfony\Component\HttpClient\HttpClient;

class MyAgent
{
    private Agent $agent;

    public function __construct(
        string $model,
        string $ollamaBaseUrl = "http://localhost:11434",
    ) {
        $platform = PlatformFactory::create(
            $ollamaBaseUrl,
            HttpClient::create(),
        );

        $this->agent = new Agent($platform, $model);
    }

    public function reviewArticle(string $articleFilePath): string
    {
        if (!file_exists($articleFilePath)) {
            throw new \InvalidArgumentException(
                sprintf('Article file "%s" does not exist.', $articleFilePath),
            );
        }

        $messages = new MessageBag(
            Message::forSystem(
                "Review and rewrite the following article in clear, grammatically correct, and professional English. " .
                    "Improve readability, correct any errors, and ensure the tone remains neutral and coherent. " .
                    "Do not change the meaning of the text. Output only the corrected and polished version of the article. As input, I will provide the whole article; I will not include any introduction, question, or request.",
            ),
            Message::ofUser((string) file_get_contents($articleFilePath)),
        );

        $result = $this->agent->call($messages);

        return $result->getContent();
    }
}

$agent = new MyAgent(model: "gemma3:270m");

echo $agent->reviewArticle("./article-1.md");


Understanding the PHP AI Agent source code

This script shows how to build a simple AI Agent using the Symfony AI Agent component to review and rewrite an article. The goal is to encapsulate all AI-related logic in a reusable class while keeping the usage straightforward.

Let’s break the code down into its key parts.

Step 1) Bootstrapping and dependencies

require "./vendor/autoload.php";

The Composer autoloader is required to make all Symfony AI classes available. This is standard for any Composer-based PHP project.

Next, we import the classes used throughout the script:

use Symfony\AI\Agent\Agent;
use Symfony\AI\Platform\Bridge\Ollama\PlatformFactory;
use Symfony\AI\Platform\Message\Message;
use Symfony\AI\Platform\Message\MessageBag;
use Symfony\Component\HttpClient\HttpClient;

These classes cover:

  • the Agent abstraction
  • the Ollama platform bridge
  • message handling (system and user messages)
  • the HTTP client used to communicate with Ollama

Step 2) Encapsulating logic in the MyAgent class

The MyAgent class acts as a reusable AI worker. It hides all the setup details and exposes a simple API for reviewing articles.

class MyAgent
{
    private Agent $agent;

The $agent property stores an instance of Symfony\AI\Agent\Agent, which is configured once and reused.

Step 3) Initializing the AI platform

public function __construct(
    string $model,
    string $ollamaBaseUrl = "http://localhost:11434",
) {
    $platform = PlatformFactory::create(
        $ollamaBaseUrl,
        HttpClient::create(),
    );

    $this->agent = new Agent($platform, $model);
}

This constructor is responsible for all initialization:

  • Platform creation: PlatformFactory::create() sets up the Ollama platform and points it to the local Ollama server.
  • HTTP client: Symfony’s HTTP client is used to send requests to the AI model.
  • Agent setup: the Agent is created by combining the platform with a specific model (for example, gemma3:270m).

Once constructed, the agent is ready to handle requests.
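Because the base URL is a constructor parameter, pointing the same class at a remote Ollama host requires no code changes (the IP address below is illustrative):

```php
<?php
// Default: talk to Ollama on localhost:11434.
$local = new MyAgent(model: 'gemma3:270m');

// Hypothetical: an Ollama instance running on another machine on your network.
$remote = new MyAgent(
    model: 'gemma3:270m',
    ollamaBaseUrl: 'http://192.168.1.50:11434',
);
```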

Step 4) Preparing the messages for the AI model

$messages = new MessageBag(
    Message::forSystem(
        "Review and rewrite the following article in clear, grammatically correct, and professional English. ..."
    ),
    Message::ofUser((string) file_get_contents($articleFilePath)),
);

The interaction with the model is defined using a MessageBag:

  • System message: defines the role and behavior of the AI. In this case, it instructs the model to review and rewrite the article without changing its meaning or adding extra commentary.
  • User message: contains the full contents of the article, loaded directly from a Markdown file.

This clear separation mirrors how modern chat-based LLMs expect input.

Step 5) Calling the Agent

$result = $this->agent->call($messages);

The call() method sends the prepared messages to the configured model via the platform. The Agent handles the request lifecycle and returns a result object.
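The example lets any failure bubble up to the caller. In a real application you may want to guard this call; a minimal sketch (the component's specific exception classes are not covered here, so this catches broadly):

```php
<?php
try {
    $result = $this->agent->call($messages);
    return $result->getContent();
} catch (\Throwable $e) {
    // Hypothetical handling: wrap and rethrow with context.
    // Alternatives: log the error, or fall back to returning the original text.
    throw new \RuntimeException('AI review failed: ' . $e->getMessage(), 0, $e);
}
```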

Step 6) Retrieving the generated content

return $result->getContent();

The AI-generated text is extracted from the result and returned as a plain string, ready to be displayed, stored, or further processed.
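Since reviewArticle() returns a plain string, persisting the result is a one-liner (the file names below are illustrative):

```php
<?php
$agent = new MyAgent(model: 'gemma3:270m');

// Write the polished version next to the original.
$reviewed = $agent->reviewArticle('./article-1.md');
file_put_contents('./article-1.reviewed.md', $reviewed);
```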

Step 7) Using the Agent

$agent = new MyAgent(model: "gemma3:270m");
echo $agent->reviewArticle("./article-1.md");

Finally, the agent is instantiated with a specific model and used to review an article file. Switching models or articles only requires changing the input parameters, not the internal logic.
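Because the agent is configured once in the constructor, the same instance can process any number of files. A short sketch (the directory layout and glob pattern are illustrative):

```php
<?php
$agent = new MyAgent(model: 'gemma3:270m');

// Reuse one configured agent across many articles.
foreach (glob('./articles/*.md') as $path) {
    file_put_contents($path . '.reviewed', $agent->reviewArticle($path));
}
```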

Why this approach works well

  • Separation of concerns: all AI logic lives in one class
  • Reusability: the same agent can process multiple articles
  • Flexibility: switching models or platforms is trivial
  • Future-proof: aligns with the Agent-based design of Symfony AI

This pattern provides a solid foundation for building more advanced AI-powered workflows in Symfony applications.

Running it

Make sure you have the Ollama service running locally with the Gemma 3 model available:

ollama pull gemma3:270m
ollama serve

Then execute your PHP script:

php ai-reviewer.php

You should see your improved article printed in the terminal.

Conclusion

The Symfony AI components are still in their early stages, but they already offer a clean and extensible way to interact with AI models directly from PHP.

For developers familiar with Symfony’s structure and philosophy, it feels natural and powerful, especially when combined with local tools like Ollama.

This opens the door for Symfony apps to integrate AI-driven workflows, such as:

  • Content rewriting or summarization
  • Code review and documentation generation
  • Natural language queries for data
  • Chatbots or assistants built into Symfony projects
