
How to run a long-running query without interruption?

Barney (sovietspy2) ・ 1 min read

I have a query like this:

insert into table1 (field1, field2) select field1, field2 from table2

It's running on a huge table with millions of rows. It might take several hours or days, but that wouldn't be a problem. The problem is that my client (DBeaver) keeps dropping the connection, so my query fails. How would you handle this problem?
Thanks

Discussion

Brandin Chiu (brandinchiu)

So, a couple of things:

If an insert is taking that long, then you probably have something to clean up in your indices.

I'd take a look at those first.
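One common pattern, as a rough sketch (assuming PostgreSQL syntax; the index name table1_field2_idx is an assumption, not from the original post), is to drop secondary indexes on the target table before the bulk insert and rebuild them afterwards, since maintaining them row by row is often what makes a huge insert crawl:

-- drop the secondary index before the bulk load (index name is assumed)
drop index if exists table1_field2_idx;

insert into table1 (field1, field2)
select field1, field2
from table2;

-- rebuild the index once, after all rows are in
create index table1_field2_idx on table1 (field2);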

As for queries where processing time is an issue, you can solve some of these problems by running them locally on the database server, as opposed to over a network connection.

So your application would log the job as needing to be completed and put it in a processing queue.

A service on the database server would read jobs out of the queue and process them locally.

Then, you can either have a separate job constantly check if the job is done, or wait for the original job to notify something or someone once it's completed.
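A minimal sketch of what that queue could look like, assuming PostgreSQL; the job_queue table and its columns are illustrative, not a fixed API:

create table job_queue (
    id          bigserial primary key,
    sql_text    text not null,
    status      text not null default 'pending',   -- pending | running | done | failed
    created_at  timestamptz not null default now(),
    finished_at timestamptz
);

-- the application enqueues the work instead of running it over its own connection
insert into job_queue (sql_text)
values ('insert into table1 (field1, field2) select field1, field2 from table2');

A small worker on the database host (a cron job or a systemd service) would then pick up pending rows, execute sql_text over a local connection, and mark the job done or failed. Because the work runs locally, a flaky client connection no longer matters.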

Barney (sovietspy2), Author

I went with this solution and it worked out well.

Brandin Chiu (brandinchiu)

Glad it worked out :)

PaintedSky (paintedsky_ca)

Fairly simplistic solution, but would it be possible to break the query up into batches of X rows at a time? Say, run a query for the first 100,000 rows (or however many you can manage before the connection drops), then run it again offset by that amount, and so on?
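A rough sketch of that batching idea, assuming table2 has a stable ordering key (the id column here is an assumption) so each batch picks up where the last one stopped:

-- first batch
insert into table1 (field1, field2)
select field1, field2
from table2
order by id
limit 100000 offset 0;

-- next batches: offset 100000, then 200000, and so on until no rows are returned

With millions of rows, paging on the key itself (where id > last_copied_id ... limit 100000) tends to stay fast, whereas large offsets get slower the further in they go.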

Nicholas Drone (ndrone)

Write a stored procedure to do the work for you, and have it store the results in a temporary table.
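A minimal sketch of that idea, assuming PostgreSQL 11+ (which supports procedures); the procedure name and the staging table are illustrative:

-- procedure that copies the rows into a staging table (names are assumptions)
create procedure copy_into_staging()
language plpgsql
as $$
begin
    create table if not exists table1_staging (like table1 including all);
    insert into table1_staging (field1, field2)
    select field1, field2
    from table2;
end;
$$;

-- call it from a session on the database host (e.g. inside screen or tmux),
-- so a dropped desktop-client connection doesn't cancel the work
call copy_into_staging();

Note that the call itself still needs a live session, so running it from the server (or via a scheduler there) is what actually removes the dependency on the desktop client.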