
Caching database queries is one of the most effective ways to improve application performance. While tools like Redis and Memcached are the gold standard for real-time caching, sometimes you want a simple, file-based approach — and maybe even store those cached files in the cloud for durability and scalability.
In this article, I’ll show you how to:
- Implement a file-based cache for MySQL queries in PHP
- Upload those cached JSON files to DigitalOcean Spaces
- Serve cached data efficiently to clients
Step 1: Local File-Based Cache in PHP
We start with a simple caching mechanism that saves query results into a JSON file:
<?php
$cacheFile = "cache/users.json";
$cacheTime = 300; // 5 minutes

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $cacheTime) {
    // Cache hit: serve the stored result without touching the database
    $data = json_decode(file_get_contents($cacheFile), true);
} else {
    // Cache miss (or expired): query MySQL and refresh the cache file
    $mysqli = new mysqli("localhost", "user", "pass", "dbname");
    $res = $mysqli->query("SELECT * FROM users");
    $data = $res->fetch_all(MYSQLI_ASSOC);

    if (!is_dir("cache")) {
        mkdir("cache", 0755, true); // make sure the cache directory exists
    }
    file_put_contents($cacheFile, json_encode($data));
}

header("Content-Type: application/json");
echo json_encode($data);
?>
This ensures that repeated requests within 5 minutes don’t hit the database — they just read from the cached file.
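The example above hard-codes a single cache file for a single query. If you cache several different queries, you'll want a deterministic filename per query. Here's a minimal sketch; the `cacheKeyFor` helper is my own addition, not part of the article's code:

```php
<?php
// Derive a deterministic cache path from the SQL text and its parameters,
// so each distinct query maps to its own cache file.
function cacheKeyFor(string $sql, array $params = []): string
{
    return 'cache/' . sha1($sql . '|' . serialize($params)) . '.json';
}
```

Because the same query always maps to the same file, the TTL check from Step 1 works unchanged per query.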
Step 2: Add DigitalOcean Spaces
DigitalOcean Spaces is an object storage service (similar to AWS S3). We’ll use the AWS SDK for PHP to upload our cached JSON file to Spaces.
Install AWS SDK
composer require aws/aws-sdk-php
Configure Spaces
You’ll need:
- Space name (bucket name)
- Region (e.g., nyc3)
- Endpoint (e.g., https://nyc3.digitaloceanspaces.com)
- Access key & secret key
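Hard-coding keys is fine for a demo, but in practice you'd read them from the environment rather than committing them to source control. A small sketch; the `SPACES_*` variable names are placeholders of my own choosing:

```php
<?php
// Read Spaces configuration from environment variables,
// falling back to sensible defaults for region and endpoint.
$region    = getenv('SPACES_REGION') ?: 'nyc3';
$endpoint  = getenv('SPACES_ENDPOINT') ?: "https://{$region}.digitaloceanspaces.com";
$accessKey = getenv('SPACES_KEY');
$secretKey = getenv('SPACES_SECRET');
```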
Updated Server Code
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

// Spaces config
$spaceName = "my-space";
$region    = "nyc3";
$endpoint  = "https://$region.digitaloceanspaces.com";
$accessKey = "YOUR_ACCESS_KEY";
$secretKey = "YOUR_SECRET_KEY";

// Create an S3 client pointed at the Spaces endpoint
$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => $region,
    'endpoint'    => $endpoint,
    'credentials' => [
        'key'    => $accessKey,
        'secret' => $secretKey,
    ],
]);

// Cache logic
$cacheFile = "cache/users.json";
$cacheTime = 300; // 5 minutes

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $cacheTime) {
    $data = json_decode(file_get_contents($cacheFile), true);
} else {
    $mysqli = new mysqli("localhost", "user", "pass", "dbname");
    $res = $mysqli->query("SELECT * FROM users");
    $data = $res->fetch_all(MYSQLI_ASSOC);

    // Save locally
    file_put_contents($cacheFile, json_encode($data));

    // Upload to Spaces
    try {
        $s3->putObject([
            'Bucket'      => $spaceName,
            'Key'         => "users.json",
            'Body'        => json_encode($data),
            'ACL'         => 'public-read',
            'ContentType' => 'application/json'
        ]);
    } catch (AwsException $e) {
        // Don't fail the request if the upload fails; the local cache still works
        error_log("Spaces upload failed: " . $e->getMessage());
    }
}

header("Content-Type: application/json");
echo json_encode($data);
?>
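One gap in the flow above: if the users table changes before the 5-minute TTL expires, clients keep seeing stale data. You may want to invalidate the cache explicitly from whatever code path writes to the table. A minimal sketch, assuming the `$s3`, `$spaceName`, and `$cacheFile` variables from the code above; the `invalidateUsersCache` helper is hypothetical, not part of the article's code:

```php
<?php
// Remove both copies of the cached result: the local JSON file and the
// object stored in Spaces. The next request will re-query MySQL and
// repopulate both.
function invalidateUsersCache($s3, string $spaceName, string $cacheFile): void
{
    if (file_exists($cacheFile)) {
        unlink($cacheFile);
    }
    try {
        $s3->deleteObject([
            'Bucket' => $spaceName,
            'Key'    => 'users.json',
        ]);
    } catch (\Exception $e) {
        error_log("Spaces delete failed: " . $e->getMessage());
    }
}
```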
Step 3: Client-Side Consumption
Clients can now fetch the cached data from either:
- Your PHP endpoint (server.php)
- Spaces directly (https://my-space.nyc3.digitaloceanspaces.com/users.json), or its CDN endpoint if you've enabled the CDN for the Space
Example (JavaScript Client)
<script>
fetch("https://my-space.nyc3.digitaloceanspaces.com/users.json")
  .then(res => res.json())
  .then(data => {
    data.forEach(user => {
      console.log(`ID: ${user.id}, Name: ${user.name}`);
    });
  });
</script>
Benefits of This Approach
- Performance: Reduce database load with cached query results.
- Durability: Cached files are safely stored in Spaces.
- Scalability: Multiple clients can fetch directly from Spaces CDN.
- Flexibility: Use local cache for speed, Spaces for persistence.
Conclusion
While Redis or Memcached are better for real-time caching, combining file-based caching with DigitalOcean Spaces gives you a simple, scalable solution for storing and serving cached query results. It’s perfect for scenarios where durability and CDN delivery matter more than microsecond response times.