darwinpasco25

How to Insert Thousands of Rows from an Excel / CSV File into a Database with a Single Database Call

Most of the procedures I see programmers use when uploading data from an Excel or CSV file boil down to this:

  1. Loop through the file row by row
  2. Submit each row to the database.

This method not only takes hours to finish but also eats up a lot of server resources, so these uploads are usually scheduled during off-peak hours.

But there is a way to do this without taking so much time and so many resources. You can actually insert tens of thousands of records from a file in a single database call from your application.

This is done using the following procedure.
From the app (a code sketch follows the list):

  1. Read the Excel / CSV file into a DataTable
  2. Serialize the DataTable to a JSON string
  3. Submit the JSON string as a parameter (JSON / NVARCHAR data type) to a stored procedure
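
Here is a minimal sketch of the app side, assuming a C# / .NET app (the DataTable step suggests .NET) with the Newtonsoft.Json and MySqlConnector packages. The connection string, stored procedure name (sp_insert_sales), and column names are placeholders, not the ones from the sample repo linked below.

```csharp
// Minimal sketch: read a CSV into a DataTable, serialize it to JSON,
// and send the whole payload to a stored procedure in one call.
// sp_insert_sales, the connection string, and the column layout are
// placeholders; adjust them to your own schema.
using System.Data;
using System.IO;
using MySqlConnector;
using Newtonsoft.Json;

class JsonBulkUpload
{
    static DataTable ReadCsv(string path)
    {
        var table = new DataTable();
        using var reader = new StreamReader(path);

        // First line is assumed to hold the column headers
        foreach (var header in reader.ReadLine().Split(','))
            table.Columns.Add(header.Trim());

        // Remaining lines become rows (naive split; use a CSV library
        // if your data contains quoted commas)
        string line;
        while ((line = reader.ReadLine()) != null)
            table.Rows.Add(line.Split(','));

        return table;
    }

    static void Main()
    {
        DataTable sales = ReadCsv("sales.csv");

        // Newtonsoft serializes a DataTable as a JSON array of row objects
        string json = JsonConvert.SerializeObject(sales);

        using var connection = new MySqlConnection(
            "Server=localhost;Database=salesdb;Uid=app;Pwd=secret;");
        connection.Open();

        // One round trip: the entire file travels as a single JSON parameter
        using var command = new MySqlCommand("sp_insert_sales", connection)
        {
            CommandType = CommandType.StoredProcedure
        };
        command.Parameters.AddWithValue("p_json", json);
        command.ExecuteNonQuery();
    }
}
```

The key point is that the whole file reaches the server as one parameter in one ExecuteNonQuery call, instead of one round trip per row.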

In the MySQL / SQL Server / PostgreSQL stored procedure (a MySQL sketch follows the list):

  1. Read the JSON string as a table using the JSON_TABLE (MySQL), OPENJSON (SQL Server), or json_to_recordset (PostgreSQL) table function
  2. Insert the resulting rowset into the target table in the database.
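
And a minimal sketch of the database side, using MySQL 8.0+ since the sample database below is MySQL. Again, the procedure, table, and column names (sp_insert_sales, sales, order_id, product, quantity) are placeholders and not taken from the sample repo.

```sql
DELIMITER //

-- Placeholder procedure: adjust the table and column names to your schema
CREATE PROCEDURE sp_insert_sales (IN p_json JSON)
BEGIN
    -- JSON_TABLE expands the JSON array into a relational rowset,
    -- so the whole payload is inserted with one set-based statement
    INSERT INTO sales (order_id, product, quantity)
    SELECT jt.order_id, jt.product, jt.quantity
    FROM JSON_TABLE(
        p_json,
        '$[*]' COLUMNS (
            order_id INT          PATH '$.order_id',
            product  VARCHAR(100) PATH '$.product',
            quantity INT          PATH '$.quantity'
        )
    ) AS jt;
END //

DELIMITER ;
```

On SQL Server the same idea uses OPENJSON with a WITH clause, and on PostgreSQL json_to_recordset; only the table-function syntax changes, while the single INSERT ... SELECT stays the same.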

Depending on the data transformations you do before actually inserting the data, the whole operation can take as little as one second for 10,000 rows or more. I regularly insert / update 50,000 rows from an Excel file into a SQL Server database on RDS, and it finishes in 1-2 seconds, network latency included. The most I have done is 200,000 rows, during peak hours, without any problem, and the users didn't even notice.

Sample Code
https://github.com/darwinpasco25/JsonFileUploadApp

Sample Excel file with 10,000 rows
https://github.com/darwinpasco25/JsonFileUploadApp/blob/main/JsonFileUploadApp/JsonFileUploadApp/sales.xlsx

Sample MySQL database
https://github.com/darwinpasco25/JsonFileUploadDB
