
How to process a heavy request in WordPress

Amirition
WordPress and Python developer, thinking too much about life.

Requests in WordPress and PHP have a limited time to run. I recently hit this limit while processing a huge file: what I wanted to do didn't fit in a single request, so I did a little research.
At first I thought of cron jobs, and that turned out to be the right direction, but it needs a little tweaking.

The Problem

I had a custom post type (CPT) and a CSV file containing the information for every post. The file had about 1,000 rows and 40 columns, which is far too much for a single request.
On top of that, one column was an array of remote image URLs that had to be uploaded to my website. When I tested a naive import, only 3 posts made it in, which was not the result I expected.
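For the image column, each remote URL has to be downloaded into the media library. A minimal sketch of that step using WordPress's `media_sideload_image()` (the function and the admin includes are real WordPress APIs; `$post_id` and `$image_urls` are illustrative placeholders for the values coming out of the CSV):

```php
<?php
// These admin includes are required for media_sideload_image()
// when running outside wp-admin (e.g. inside a cron or AJAX task).
require_once ABSPATH . 'wp-admin/includes/media.php';
require_once ABSPATH . 'wp-admin/includes/file.php';
require_once ABSPATH . 'wp-admin/includes/image.php';

// $post_id and $image_urls are illustrative; in my case the URLs
// came from one column of the CSV row.
foreach ( $image_urls as $url ) {
    // 'id' makes the function return the attachment ID instead of HTML.
    $attachment_id = media_sideload_image( $url, $post_id, null, 'id' );
    if ( is_wp_error( $attachment_id ) ) {
        error_log( 'Failed to sideload ' . $url . ': ' . $attachment_id->get_error_message() );
    }
}
```

Downloading 40 columns' worth of remote images is exactly the kind of slow, network-bound work that blows past a request's time limit, which is why it has to move out of the import request.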

Solution 1: Background Processing

GitHub: deliciousbrains / wp-background-processing

WordPress background processing class


This plugin can run your tasks in the background. The full tutorial for using its classes is on their GitHub page.

What this plugin does is simple: it saves all the data your tasks need in the database, one row per task, then runs a cron job every 5 minutes that grabs the next row and processes it. It sounds very simple, yet it's powerful and comes with a lot of features.
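As a rough sketch of how this looks in code: the class, `$action` property, and the `task()` / `push_to_queue()` / `save()` / `dispatch()` methods come from the plugin's README; the `CSV_Import_Process` name, the post type, and the row-handling logic are my own illustrative assumptions:

```php
<?php
class CSV_Import_Process extends WP_Background_Process {

    // Prefix used for the queue's option and cron hook names.
    protected $action = 'csv_import';

    /**
     * Handle a single queue item — here, one CSV row.
     * Return false to remove the item from the queue,
     * or return the item itself to re-queue it.
     */
    protected function task( $item ) {
        // $item is one parsed CSV row (illustrative fields).
        wp_insert_post( array(
            'post_type'   => 'my_cpt',
            'post_title'  => $item['title'],
            'post_status' => 'publish',
        ) );
        return false;
    }

    protected function complete() {
        parent::complete();
        // All rows processed — e.g. send yourself a notification here.
    }
}

// Queue every CSV row, then kick off the background process.
$process = new CSV_Import_Process();
foreach ( $rows as $row ) {
    $process->push_to_queue( $row );
}
$process->save()->dispatch();
```

Each `task()` call handles one small unit of work, so no single request ever has to import the whole file.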

Solution 2: Batch Processing

GitHub: gdarko / wp-batch-processing

🚈 Easily process large batches of data in WordPress. Provide the data, setup the processing procedure, run the batch processor from the admin dashboard. Profit.


This plugin also helps you run large jobs. It saves the data in the database too, but on top of that it has a user interface: you can go to its admin page, start the batch, and watch how many of the items have been processed.


But the actual difference is that this plugin does not use cron at all. It processes the batch through chained AJAX calls: one call processes an item and reports its success or failure back to the interface, then the next call fires, and so on until the last row in the database is done. One thing to keep in mind: you can't leave the page, or the process will be paused.

My Code

So, enough theory; let's see some code. I went with the batch processing plugin.

As the tutorial on the GitHub page says, you create your own subclass of the plugin's batch class; my point here is to show how I handled my file.
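Here's a sketch of how I wired the CSV into the plugin. The `WP_Batch` / `WP_Batch_Item` classes, the `setup()` / `process()` methods, and the `wp_batch_processing_init` registration hook come from the plugin's README; the file path, column layout, post type, and field names are illustrative assumptions:

```php
<?php
class CSV_Import_Batch extends WP_Batch {

    public $id    = 'csv_import_batch';
    public $title = 'CSV Post Import';

    /**
     * Runs once, up front: read the CSV and push one item per row
     * into the plugin's database queue.
     */
    public function setup() {
        $handle = fopen( '/path/to/posts.csv', 'r' ); // illustrative path
        $header = fgetcsv( $handle );
        $i      = 0;
        while ( ( $row = fgetcsv( $handle ) ) !== false ) {
            $data = array_combine( $header, $row );
            $this->push( new WP_Batch_Item( ++$i, $data ) );
        }
        fclose( $handle );
    }

    /**
     * Runs once per item, each in its own AJAX call.
     */
    public function process( $item ) {
        wp_insert_post( array(
            'post_type'   => 'my_cpt',
            'post_title'  => $item->get_value( 'title' ),
            'post_status' => 'publish',
        ) );
        return true; // returning a WP_Error would mark this item as failed
    }
}

// Register the batch so it shows up on the plugin's admin page.
add_action( 'wp_batch_processing_init', function () {
    WP_Batch_Processor::get_instance()->register( new CSV_Import_Batch() );
}, 10, 0 );
```

With this in place, the admin page lists the batch and you start it from there; each row is imported in its own small AJAX request.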

Further Problem: One of the major problems I faced was that my file was so large that even the batch processor couldn't handle the first step of the work: saving all the items into the database. If you've faced the same problem, what did you do to overcome it?
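For what it's worth, one workaround I'd consider (my own sketch, not a feature of either plugin): don't push full rows into the database during setup at all. Instead, split the CSV into small chunk files first, queue only the chunk file paths, and let each task read its own chunk from disk:

```php
<?php
/**
 * Split a big CSV into small chunk files, repeating the header
 * in each one, so no single request has to read the whole file.
 * All names and the chunk size here are illustrative.
 */
function split_csv( $source, $dest_dir, $rows_per_chunk = 100 ) {
    $in     = fopen( $source, 'r' );
    $header = fgetcsv( $in );
    $chunk  = 0;
    $count  = 0;
    $out    = null;

    while ( ( $row = fgetcsv( $in ) ) !== false ) {
        if ( $count % $rows_per_chunk === 0 ) {
            if ( $out ) {
                fclose( $out );
            }
            $out = fopen( sprintf( '%s/chunk-%03d.csv', $dest_dir, ++$chunk ), 'w' );
            fputcsv( $out, $header ); // every chunk stays a valid CSV
        }
        fputcsv( $out, $row );
        $count++;
    }

    if ( $out ) {
        fclose( $out );
    }
    fclose( $in );

    return $chunk; // number of chunk files written
}
```

Queueing ~10 chunk paths instead of ~1,000 full rows keeps the setup step itself within a single request's limits.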
