Thanks for this post, it's helpful. I'm building a plugin that needs to be able to process potentially thousands of posts.
I've actually used the same batch processing plugin, and I hit the same problem you did: the initial setup method timed out my environment. I was pulling in 50K IDs and ordering them by date, but it was just too much for my test site to handle.
One way around this that I can see is to reduce the initial setup to query only a chunk of the items you want to process, by checking for the absence of a particular post meta key. The meta would only be added to processed items, so subsequent runs of the query would skip anything already processed.
It has its downsides, such as needing to re-run the batch repeatedly, but it might serve as a workaround.
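To illustrate, here's a minimal sketch of that chunked approach, assuming WordPress. The meta key name `_my_plugin_processed` and the function name are just examples I made up, not anything from a real plugin:

```php
<?php
// Process one chunk of unprocessed posts per call.
// Sketch only: assumes WordPress is loaded; the meta key
// '_my_plugin_processed' is a hypothetical example.
function my_plugin_process_next_chunk( $chunk_size = 100 ) {
	// Fetch only posts that have NOT been processed yet.
	$ids = get_posts( array(
		'post_type'      => 'post',
		'posts_per_page' => $chunk_size,
		'fields'         => 'ids', // IDs only, to keep the query light
		'meta_query'     => array(
			array(
				'key'     => '_my_plugin_processed',
				'compare' => 'NOT EXISTS',
			),
		),
	) );

	foreach ( $ids as $id ) {
		// ... do the actual per-post work on $id here ...

		// Mark the post so the next run skips it.
		update_post_meta( $id, '_my_plugin_processed', 1 );
	}

	// Number processed this run; 0 means the batch is done.
	return count( $ids );
}
```

You'd call this repeatedly (e.g. from a cron event or an AJAX loop) until it returns 0. Note that `NOT EXISTS` meta queries can themselves get slow on very large postmeta tables, so it's worth testing at your real data size.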
I also found this option, actionscheduler.org/, but I've not really figured out how to implement it yet.