Calling OpenAI from a PHP framework the same way you query a database

Amaury

Temma is a PHP MVC framework designed to be easy to pick up and use. It sits between micro-frameworks (too bare) and full-stack ones (too heavy), and tries to get out of your way as much as possible.

One of its core concepts is the datasource: a unified way to declare and access any external connection, whether it's a database, a cache, a message queue, or an API. In Temma 2.16.0, OpenAI joins that list.

Most PHP tutorials on OpenAI integration involve installing an SDK, writing a service class, injecting it manually, and wiring everything together. It works, but it's a lot of plumbing for what is ultimately a remote call.

In Temma, OpenAI is a datasource. The same way you declare a MySQL connection, you declare an OpenAI connection. One line in your config, and you're done.

Configuration

In etc/temma.php:

<?php

return [
    'application' => [
        'dataSources' => [
            'db'     => 'mysql://user:passwd@localhost/mybase',
            'openai' => 'openai://chat/gpt-4o/sk-proj-xxxxxxxxxxxxx',
        ],
    ],
];

That's it. OpenAI sits alongside your database, your Redis cache, your S3 bucket. Same pattern, same config file.
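Since `etc/temma.php` is plain PHP, nothing stops you from pulling the API key out of the environment instead of committing it to the repository. A minimal sketch, assuming the DSN is an ordinary string as in the example above (`OPENAI_API_KEY` is an illustrative variable name, not something Temma requires):

```php
<?php

return [
    'application' => [
        'dataSources' => [
            'db'     => 'mysql://user:passwd@localhost/mybase',
            // keep the API key out of version control
            'openai' => 'openai://chat/gpt-4o/' . getenv('OPENAI_API_KEY'),
        ],
    ],
];
```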

Basic usage

In any controller, the connection is available immediately:

$openai = $this->openai;

// Simple prompt
$response = $openai['What is the capital of France?'];

// Or using the read() method
$response = $openai->read('Translate to French: Hello world');

// With a fallback value in case of error
$response = $openai->read(
    'Translate to French: Hello world',
    'Bonjour le monde'
);

System prompt and options

You can pass a system prompt and control temperature directly in the call:

$response = $openai->read(
    'Summarize this article in 3 bullet points: ' . $articleBody,
    null,
    [
        'system'      => 'You are a concise technical writer.',
        'temperature' => 0.3,
    ]
);

Multi-turn conversations

Temma's OpenAI datasource handles conversation history natively:

$response = $openai->read('And the capital of Italy?', null, [
    'system'   => 'You are a geography assistant.',
    'messages' => [
        ['role' => 'user', 'content' => 'What is the capital of France?'],
        ['role' => 'assistant', 'content' => 'The capital of France is Paris.'],
    ],
]);
// $response contains "The capital of Italy is Rome."
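Keeping that history up to date between turns is plain array bookkeeping. A minimal sketch (the `addTurn()` helper is illustrative, not part of Temma's API):

```php
<?php

// Append one exchange (user prompt + assistant reply) to a conversation
// history array, in the shape expected by the 'messages' option.
function addTurn(array $history, string $prompt, string $reply): array
{
    $history[] = ['role' => 'user', 'content' => $prompt];
    $history[] = ['role' => 'assistant', 'content' => $reply];
    return $history;
}
```

After each call, append the prompt and the reply before the next `read()`, and pass the accumulated array in the `'messages'` option.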

A practical example: auto-tagging articles

Here is a realistic use case: a controller action that generates tags for an article automatically.

<?php

use \Temma\Attributes\View   as TμView;
use \Temma\Attributes\Method as TμMethod;

#[TμView('~Json')] // use JSON view on all actions
class Article extends \Temma\Web\Controller {
    // create a DAO on the 'article' table
    protected $_temmaAutoDao = true;

    // POST /article/tag/1
    #[TμMethod('POST')] // accept POST requests only
    public function tag(int $id) {
        // get the article
        $article = $this->_dao->get($id);
        if (!$article)
            return $this->_httpError(404);
        // fetch the tags from OpenAI
        $tags = $this->openai->read(
            'Generate 3 comma-separated tags for this article: ' . $article['body'],
            null,
            [
                'system'      => 'You are a tagging assistant. Reply with exactly 3 tags, comma-separated, lowercase, no punctuation.',
                'temperature' => 0.2,
            ]
        );
        // update the article
        $this->_dao->update($id, ['tags' => $tags]);
        // return a JSON stream `{"tags": "tag1,tag2,tag3"}`
        $this['tags'] = $tags;
    }
}

No service class. No manual injection. The OpenAI connection is available in the controller exactly like the database connection, because from Temma's perspective, they are the same kind of thing: a datasource.
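In practice, a model's reply may need a little normalization before being stored, even with a strict system prompt. A minimal sketch in plain PHP (the `normalizeTags()` function is illustrative, not part of Temma):

```php
<?php

// Split a comma-separated reply into a clean array of tags:
// lowercased, trimmed, deduplicated, and capped at $max entries.
function normalizeTags(string $reply, int $max = 3): array
{
    $tags = array_map('trim', explode(',', mb_strtolower($reply)));
    $tags = array_values(array_unique(array_filter($tags)));
    return array_slice($tags, 0, $max);
}
```

In the controller above, you would run the OpenAI reply through such a function before the `$this->_dao->update()` call, then store the result however your schema expects (joined string, JSON column, join table).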

Why this matters

The datasource abstraction is one of Temma's core ideas. Whether you're talking to MySQL, Redis, S3, Slack, or OpenAI, the connection is declared in one place and accessed the same way throughout your code. Adding AI capabilities to an existing project becomes a config change and a few lines of code, not an architectural decision.

Temma is open source (MIT), has been in production since 2007, and the OpenAI datasource landed in the just-released 2.16.0. Full docs at temma.net.
