For some reason in my current work, I need to crunch data from a non-optimal SQL query and email the result as a CSV. All I have, though, is a PHP Laravel system. That brings some limitations:
- Memory: we cannot use Laravel's DB layer to fetch everything, because of:
  - SQL complexity: I need to execute the query in one go; pagination offers no benefit
  - Result set size: I cannot fetch the results all at once
- Email limitations: the CSV may be huge. I cannot send it as an attachment, even compressed.
For the memory limitations we can use a console command that generates the file and sends the email. Queued jobs are a no-go because the data crunching takes too long.
That leaves us with the following options:
Option 1: Trigger the command in the background using `&`:

```php
shell_exec("php artisan data:crunch > /dev/null 2>&1 &");
```

As you can see, we avoid using the `Artisan` facade because we need to run the command in the background. To do that, the command passed to `shell_exec` must be terminated with `&`, and its output must be redirected; otherwise `shell_exec` blocks until the command finishes.
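If you want to keep the command's output for troubleshooting instead of discarding it, a variant is to redirect it to a log file. This is a sketch; the `logs/crunch.log` path is an assumption, not something the original setup defines:

```php
// Append stdout and stderr to a log file (hypothetical path) instead of
// /dev/null. The redirection still detaches the pipe, so the parent PHP
// process is not blocked waiting for the command to finish.
shell_exec('php artisan data:crunch >> ' . storage_path('logs/crunch.log') . ' 2>&1 &');
```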
Option 2: Use cron via Laravel's scheduler:

```php
$schedule->command('data:crunch')->dailyAt('...')->runInBackground();
```

In this case, as you can see, in `./app/Console/Kernel.php` we utilize the `runInBackground` method.
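For context, the scheduler entry lives in the `schedule` method of the console kernel. A minimal sketch follows; the `02:00` run time is a placeholder assumption (the original leaves it unspecified):

```php
// app/Console/Kernel.php
namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        // runInBackground() lets the scheduler move on to other
        // scheduled commands while this long export keeps running.
        $schedule->command('data:crunch')
            ->dailyAt('02:00') // placeholder time
            ->runInBackground();
    }
}
```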
The command itself should use PDO directly instead of Laravel's DB layer. For this, use the following code:
```php
namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;

class MyCommand extends Command
{
    // Command definitions ($signature, $description) are omitted

    public function handle()
    {
        // Complex SQL
        $sql = "...";

        $pdo = DB::getPdo();
        $stmt = $pdo->prepare($sql);
        $stmt->execute();

        while ($item = $stmt->fetch(\PDO::FETCH_NUM)) {
            // Generate the CSV here
        }
    }
}
```
For multiple DB connections, use the following (replace `my_connection` with the appropriate connection defined in `./config/database.php`):

```php
DB::connection('my_connection')->getPdo();
```
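One caveat about the row-by-row fetch: with the MySQL driver, PDO buffers the entire result set on the client by default, which would defeat the memory savings. A sketch, assuming MySQL is the underlying driver, that disables buffering so rows are streamed from the server:

```php
$pdo = DB::connection('my_connection')->getPdo();

// MySQL-specific: disable client-side result buffering so rows are
// streamed from the server one at a time instead of being loaded
// into memory all at once.
$pdo->setAttribute(\PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->prepare($sql);
$stmt->execute();

while ($item = $stmt->fetch(\PDO::FETCH_NUM)) {
    // process one row at a time
}
```

Note that while a statement is unbuffered, the same connection cannot run other queries until the result set is fully consumed.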
For the CSV you'll need enough disk space, and this algorithm needs to be followed:

- Save the file under `storage_path()` using PHP's `fopen()` and `fputcsv()`:
```php
$fp = fopen(storage_path('mycsv.csv'), 'w');

while ($item = $stmt->fetch(\PDO::FETCH_NUM)) {
    fputcsv($fp, $item);
    fflush($fp);
}

fclose($fp);
```
Due to the memory limitations we need to build the file line by line instead of in one go, as Laravel's API would. `fflush()` ensures each line is written to disk in case of unexpected termination.
- Then upload it to one of:
  - an S3 bucket hosted as a static website or served via a CDN
  - Azure Blob Storage
  - a separate (S)FTP server
  - any web-accessible storage
- Finally, send the email to the recipient, placing the URL of the file inside it.

The idea is that we need a URL from which the end user can download the file. Therefore, we upload the file to a storage service that can provide a publicly available URL, and we send that URL in the email. The reason is that mailboxes cannot handle huge files. Also, if a CDN is used, the files can be served via signed URLs where possible.
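If the S3 route is taken, Laravel's filesystem abstraction can handle both the upload and the signed URL. A minimal sketch, assuming an `s3` disk is configured in `./config/filesystems.php`; the object path and expiry are assumptions:

```php
use Illuminate\Support\Facades\Storage;

// Stream the local CSV up to the configured S3 disk; passing a
// resource avoids loading the whole file into memory.
$local = fopen(storage_path('mycsv.csv'), 'r');
Storage::disk('s3')->put('exports/mycsv.csv', $local);
fclose($local);

// Generate a time-limited signed URL so the file is not
// permanently public. This URL goes into the email body.
$url = Storage::disk('s3')->temporaryUrl(
    'exports/mycsv.csv',
    now()->addHours(24)
);
```

`temporaryUrl()` only works on drivers that support it (S3 does); for the other storage options you'd build the public URL yourself.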