Locking a command is basically "double run prevention".
Assume you have an hourly email command that sends e-mails to users one by one. If the command doesn't finish before the next hour starts, you may send multiple e-mails to some users. Or a colleague might start the same command while yours is still running.
// RUN A:
$users = User::where('hourly_email_sent', false)->get();
// The query above finds 10,000 users.
// After one hour, the command has processed 8,000 users; 2,000 remain.
// If you start the command again now, the remaining 2,000 users will each
// receive two e-mails, because two copies of the command run at the same time:
// RUN B:
$users = User::where('hourly_email_sent', false)->get();
// The query above finds 2,000+ users.
// RUN A and RUN B can contain the same users!
I generally prefer to put a value into the cache and check for it before actually running the command. Here's an example:
<?php

use Illuminate\Support\Facades\Cache;

$cacheLockKey = 'command_name';

// Another instance is still running; exit with a non-zero code.
if (Cache::has($cacheLockKey)) {
    $this->output->writeln('Another process is already running. Exiting...');
    return 1;
}

// Mark the command as running. The key's existence is the lock;
// the timestamp value is only informational.
Cache::put($cacheLockKey, date('Y-m-d H:i:s'));

// do some logic

// Release the lock so the next run can start.
Cache::forget($cacheLockKey);

return 0;
This simple check will prevent you from running a command twice by mistake. It's easy, isn't it?
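For context, here is how the same check might sit inside a complete Artisan command. This is only a sketch: the class name, signature, and description (SendHourlyEmails, emails:hourly) are made up for illustration, and only the cache-lock pattern comes from the snippet above.

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Cache;

class SendHourlyEmails extends Command
{
    // Hypothetical signature and description, just for illustration.
    protected $signature = 'emails:hourly';
    protected $description = 'Send the hourly e-mail to users';

    public function handle()
    {
        $cacheLockKey = 'emails_hourly';

        if (Cache::has($cacheLockKey)) {
            $this->output->writeln('Another process is already running. Exiting...');
            return 1;
        }

        Cache::put($cacheLockKey, date('Y-m-d H:i:s'));

        // ... send the e-mails one by one ...

        Cache::forget($cacheLockKey);

        return 0;
    }
}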
Downsides of putting the lock into cache
You can use the database or file storage too. I haven't had any trouble with the cache myself, but of course things can go wrong:
If the cache gets flushed, your lock key will be removed along with everything else. You have to protect your "lock" until the command has really finished.
If you have multiple servers, you have to make sure all of them use the same cache store (see the sketch below); otherwise each server will only see its own lock.
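As a minimal sketch, pointing every application server at one shared Redis instance could look like this in .env. The host name is a placeholder, and note that recent Laravel releases renamed CACHE_DRIVER to CACHE_STORE:

# .env on every application server
CACHE_DRIVER=redis                  # named CACHE_STORE in newer Laravel releases
REDIS_HOST=redis.example.internal   # placeholder: your shared Redis host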
Failure Case
If the command fails for any reason, you have to make sure the cache key is still removed. Otherwise the command won't start again.
You can handle this situation manually, but if the command runs from a cronjob, the following runs will keep exiting early because the stale lock is still in place.
How much this failure case matters of course depends on what you expect from the job. One way to guard against it is shown below.
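Here is a minimal sketch of one way to guard against a stale lock, assuming the logic runs inside the command's handle() method: release the key in a finally block so it is removed even when an exception is thrown, and give the lock a TTL as a safety net. In recent Laravel versions the third argument to Cache::put is a lifetime in seconds; the one-hour value below is just an example.

<?php

use Illuminate\Support\Facades\Cache;

$cacheLockKey = 'command_name';

if (Cache::has($cacheLockKey)) {
    $this->output->writeln('Another process is already running. Exiting...');
    return 1;
}

// TTL as a safety net: even if the process dies hard (e.g. the server
// reboots), the lock expires on its own after an hour. Pick a value
// safely above the command's worst expected runtime.
Cache::put($cacheLockKey, date('Y-m-d H:i:s'), 3600);

try {
    // do some logic
} finally {
    // Runs on success *and* on exceptions, so the lock is always released.
    Cache::forget($cacheLockKey);
}

return 0;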