Fairly simplistic solution, but would it be possible to break the query into chunks of X rows at a time? Say, run a query for the first 100,000 rows (or however many you can fetch before the connection drops), then run it again offset by that amount, and so on?
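A minimal sketch of that LIMIT/OFFSET approach, using SQLite and a made-up table name (`big_table`), a small in-memory dataset, and a chunk size of 100 in place of 100,000:

```python
import sqlite3

# Hypothetical setup: an in-memory table standing in for the large source table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big_table (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO big_table (payload) VALUES (?)",
    [(f"row-{i}",) for i in range(250)],
)
conn.commit()

CHUNK_SIZE = 100  # in practice: however many rows fit before the connection drops


def fetch_in_chunks(connection, chunk_size):
    """Yield successive batches of rows using LIMIT/OFFSET pagination."""
    offset = 0
    while True:
        rows = connection.execute(
            "SELECT id, payload FROM big_table ORDER BY id LIMIT ? OFFSET ?",
            (chunk_size, offset),
        ).fetchall()
        if not rows:
            break
        yield rows
        offset += chunk_size


total = 0
for batch in fetch_in_chunks(conn, CHUNK_SIZE):
    total += len(batch)

print(total)  # all 250 rows retrieved across three separate queries
```

One caveat: OFFSET forces the database to scan and discard all the skipped rows, so deep pages get progressively slower. Keyset pagination (`WHERE id > last_seen_id ORDER BY id LIMIT n`) avoids that and also stays consistent if rows are inserted between queries.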