BulkCopy sounds great on paper, but have you actually tried it?
I did. I took a 104,000-record table from SQL Server 2000 on one server and used BulkCopy to import those records into a pre-existing table in SQL Server 2005 on another server.
But first I tried the same operation with DTS as a benchmark, and that took 21 seconds. (Note that I'm doing this on an older computer, a 700 MHz box with, I think, 512 MB of RAM. The remote server is an old dog, too.)
Then I wanted to see what it was like in code without bulk copy. I thought I would at least leverage some of the new features in ADO.NET 2.0 for this. I grabbed the same 104,000 records into a DataReader and used DataTable.Load (new in ADO.NET 2.0) to pull them into a DataTable. Then, rather than let a DataAdapter.Update insert them one at a time (puh-lease!), I leveraged the new batch capabilities. SQL Server did not like 1,000 records at a time (too many parameters in one execute command), so I set UpdateBatchSize to 100. I also used the beautifully simple little System.Diagnostics.Stopwatch class to time the test. It ran for quite a while; when it finally finished, the time was 41 minutes! If the lights started dimming in Burlington, you know why!
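For anyone who wants to see the shape of that code, here is a minimal sketch of the batch-update approach. It is not my exact code: the server names, connection strings, and the Customers table are placeholders, and I'm letting a SqlCommandBuilder generate the insert command for brevity.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;

class BatchInsertTest
{
    // Placeholder connection strings; substitute your own servers.
    const string sourceConnString =
        "Data Source=OldServer;Initial Catalog=MyDb;Integrated Security=SSPI";
    const string targetConnString =
        "Data Source=NewServer;Initial Catalog=MyDb;Integrated Security=SSPI";

    static void Main()
    {
        Stopwatch watch = Stopwatch.StartNew();

        // Grab the source rows into a DataReader, then pull them into a
        // DataTable with DataTable.Load (new in ADO.NET 2.0).
        DataTable table = new DataTable("Customers");
        using (SqlConnection source = new SqlConnection(sourceConnString))
        using (SqlCommand cmd = new SqlCommand("SELECT * FROM Customers", source))
        {
            source.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                table.Load(reader);
            }
        }

        // Load marks the rows Unchanged; flag them as Added so that
        // Update will generate an INSERT for every row.
        foreach (DataRow row in table.Rows)
        {
            row.SetAdded();
        }

        using (SqlConnection target = new SqlConnection(targetConnString))
        {
            target.Open();
            SqlDataAdapter adapter =
                new SqlDataAdapter("SELECT * FROM Customers", target);
            SqlCommandBuilder builder = new SqlCommandBuilder(adapter);
            adapter.InsertCommand = builder.GetInsertCommand();
            // Batched commands can't return rows per statement.
            adapter.InsertCommand.UpdatedRowSource = UpdateRowSource.None;
            // 1,000 rows per batch meant too many parameters in one
            // execute command; 100 worked.
            adapter.UpdateBatchSize = 100;
            adapter.Update(table);
        }

        watch.Stop();
        Console.WriteLine("Batch insert took {0}", watch.Elapsed);
    }
}
```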
Before this test, though, I did the same operation using BulkCopy. Again, I pulled the 104,000 records from the remote server (on the same hardwired network) into a DataReader, passed that DataReader into a SqlBulkCopy object, and then called its WriteToServer method.
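Here's a sketch of the BulkCopy version, using the same placeholder servers and Customers table as above:

```csharp
using System;
using System.Data.SqlClient;
using System.Diagnostics;

class BulkCopyTest
{
    // Same placeholder connection strings as the batch-update sketch.
    const string sourceConnString =
        "Data Source=OldServer;Initial Catalog=MyDb;Integrated Security=SSPI";
    const string targetConnString =
        "Data Source=NewServer;Initial Catalog=MyDb;Integrated Security=SSPI";

    static void Main()
    {
        Stopwatch watch = Stopwatch.StartNew();

        using (SqlConnection source = new SqlConnection(sourceConnString))
        using (SqlCommand cmd = new SqlCommand("SELECT * FROM Customers", source))
        {
            source.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(targetConnString))
            {
                bulkCopy.DestinationTableName = "Customers";
                // Stream the rows straight from the reader into the
                // destination table.
                bulkCopy.WriteToServer(reader);
            }
        }

        watch.Stop();
        Console.WriteLine("BulkCopy took {0}", watch.Elapsed);
    }
}
```

Notice there's no DataTable in the middle at all; the rows stream straight from the DataReader into the destination table.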
So remember, DTS took 21 seconds. How long did the BulkCopy operation take in ADO.NET 2.0?
Let’s have a few guesses and then I’ll tell you. Heh heh heh.
OKAY – it was 23 seconds! Basically the same as DTS, and roughly 107 times faster than the batch updating (41 minutes is about 2,460 seconds, and 2,460 / 23 comes out to about 107). And even that 41-minute run would have been impossible with ADO.NET 1.1, where there is no batch updating; inserting one row at a time would have been slower still.