Jose Manuel Jurado Diaz shares some customer notes:
Today, I’ve been working on a service request where our customer wants to improve the performance of a bulk insert process. I’d like to share my experience working on it.
Our customer mentioned that inserting 100,000 rows into a database in Business Critical takes 14 seconds. I was able to reproduce this timing with a single thread against a table with 20 columns.
A lot of this advice also applies to on-premises SQL Server: it comes down to using bulk inserts and picking good batch sizes. It’s similar to what we’d do with SQL Server Integration Services or any other ETL/ELT process, just tailored to Python.
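To make the batch-size idea concrete, here’s a minimal Python sketch using pyodbc’s fast_executemany, which is one common way to turn a chatty row-by-row insert into batched bulk sends. The connection string, the dbo.SalesStaging table, its column names, and the 10,000-row batch size are all placeholder assumptions for illustration, not the customer’s actual schema or settings.

```python
# A minimal sketch, assuming pyodbc and a hypothetical dbo.SalesStaging table.
# Connection string, table, columns, and batch size are placeholders.
import pyodbc

CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:yourserver.database.windows.net,1433;"
    "Database=yourdb;Uid=youruser;Pwd=yourpassword;"
    "Encrypt=yes;TrustServerCertificate=no;"
)

BATCH_SIZE = 10_000  # tune this: too small means many round trips, too large strains memory and the log


def bulk_insert(rows):
    """Insert an iterable of tuples in batches using fast_executemany."""
    with pyodbc.connect(CONN_STR, autocommit=False) as conn:
        cursor = conn.cursor()
        # fast_executemany sends each batch of parameters in a single bulk
        # operation instead of one round trip per row
        cursor.fast_executemany = True
        sql = "INSERT INTO dbo.SalesStaging (col1, col2, col3) VALUES (?, ?, ?)"

        batch = []
        for row in rows:
            batch.append(row)
            if len(batch) >= BATCH_SIZE:
                cursor.executemany(sql, batch)
                conn.commit()  # commit per batch to keep transactions short
                batch.clear()
        if batch:  # flush the final partial batch
            cursor.executemany(sql, batch)
            conn.commit()


if __name__ == "__main__":
    sample = ((i, f"name{i}", i * 1.5) for i in range(100_000))
    bulk_insert(sample)
```

The interesting knob here is BATCH_SIZE: the right value depends on row width, network latency, and the target tier, so it’s worth measuring a few sizes rather than assuming one number fits every table.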