Bulk Insert from Generic List into SQL Server with minimum lines of code

Entity Framework and even classic ADO.NET are very slow when you need to perform a large number of inserts into the database.

Calling INSERT INTO 1,000 times is not efficient, because for every INSERT operation SQL Server also needs to write to the transaction log.
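For contrast, the slow row-by-row approach looks roughly like this (a minimal sketch; the Person table, Id column, and "SomeConnectionString" name are placeholders):

```csharp
using System.Collections.Generic;
using System.Configuration;
using System.Data.SqlClient;

var people = new List<Person> { new Person { Id = 1 }, new Person { Id = 2 } };

using (var connection = new SqlConnection(
    ConfigurationManager.ConnectionStrings["SomeConnectionString"].ConnectionString))
{
    connection.Open();
    foreach (var person in people)
    {
        // One round trip and one transaction-log write per row.
        using (var command = new SqlCommand(
            "INSERT INTO dbo.Person (Id) VALUES (@Id)", connection))
        {
            command.Parameters.AddWithValue("@Id", person.Id);
            command.ExecuteNonQuery();
        }
    }
}
```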

A nice workaround for this is to use the BULK INSERT statement or the SqlBulkCopy class in C#.

However, the SqlBulkCopy method WriteToServer does not accept List<T>, IEnumerable<T>, or IList<T> as a parameter.

Implement an extension method AsDataTable for IEnumerable<T>

A nice workaround is to implement an AsDataTable extension method for IEnumerable<T>, because WriteToServer does accept a DataTable as a parameter.

Here is a simple implementation to do this:

 using System;
 using System.Collections.Generic;
 using System.ComponentModel;
 using System.Data;

 public static class IEnumerableExtensions
 {
     public static DataTable AsDataTable<T>(this IEnumerable<T> data)
     {
         // One column per public property of T; unwrap Nullable<T> because
         // DataTable represents nullability with DBNull, not Nullable<T>.
         PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));
         var table = new DataTable();
         foreach (PropertyDescriptor prop in properties)
             table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);

         // One row per item; null property values are stored as DBNull.
         foreach (T item in data)
         {
             DataRow row = table.NewRow();
             foreach (PropertyDescriptor prop in properties)
                 row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
             table.Rows.Add(row);
         }
         return table;
     }
 }

After this, we can use the extension method to bulk insert data from a List<T> into SQL Server.

 var listPerson = new List<Person>
 {
     new Person { Id = 1 },
     new Person { Id = 2 }
 };

 using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["SomeConnectionString"].ConnectionString))
 {
     connection.Open();
     SqlTransaction transaction = connection.BeginTransaction();

     using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
     {
         bulkCopy.BatchSize = 100;
         bulkCopy.DestinationTableName = "dbo.Person";
         try
         {
             bulkCopy.WriteToServer(listPerson.AsDataTable());
             transaction.Commit();
         }
         catch (Exception)
         {
             // Don't close the connection here; the using block disposes it.
             // Committing after a rollback (or on a closed connection) would throw.
             transaction.Rollback();
         }
     }
 }
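By default, SqlBulkCopy matches DataTable columns to destination columns by ordinal position. If the table's column order or names differ from the properties of T, you can map columns explicitly before calling WriteToServer (a sketch continuing the example above; the FullName column is a hypothetical name):

```csharp
using System.Data.SqlClient;

using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
{
    bulkCopy.DestinationTableName = "dbo.Person";
    // Match by name instead of ordinal, so schema and property order can drift apart safely.
    bulkCopy.ColumnMappings.Add("Id", "Id");
    bulkCopy.ColumnMappings.Add("Name", "FullName"); // property Name -> column FullName (hypothetical)
    bulkCopy.WriteToServer(listPerson.AsDataTable());
}
```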

The optimal BatchSize depends on the number of items you need to insert, the row size, and network bandwidth and latency.

After doing some testing, I can recommend a BatchSize of 100 when you need to insert a few thousand items into the database.

The best way to find optimal BatchSize value is to experiment yourself.
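One way to experiment is to time the same insert with a few candidate batch sizes (a sketch; the candidate sizes and the dbo.Person table are illustrative, and you would want to clear the table between runs):

```csharp
using System;
using System.Diagnostics;
using System.Data.SqlClient;

foreach (var batchSize in new[] { 100, 500, 1000, 5000 })
{
    var stopwatch = Stopwatch.StartNew();
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.BatchSize = batchSize;
        bulkCopy.DestinationTableName = "dbo.Person";
        bulkCopy.WriteToServer(listPerson.AsDataTable());
    }
    stopwatch.Stop();
    Console.WriteLine("BatchSize {0}: {1} ms", batchSize, stopwatch.ElapsedMilliseconds);
}
```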

If you like this article, don’t forget to subscribe to this blog so you don’t miss upcoming blog posts.

9 Comments on Bulk Insert from Generic List into SQL Server with minimum lines of code

  1. Stephen Yeadon
    October 17, 2014 at 12:56 pm

    Nice extension method. I have used the sql bulk copy class many times in the past but my implementations have always been more clunky and required some understanding of the types in question. I like this generic approach.

  2. swapnil dane
    March 27, 2015 at 1:29 pm

    Excellent article. I am SQL Server developer and new to
    .NET programming. I was looking for something to load List into SQL Server
    tables. This worked perfectly for me. Thank you..!!

  3. swapnil dane
    March 27, 2015 at 1:33 pm

    Can I extend this to load JSON feed into sql server table? I really hope that Microsoft will provide some native support for JSON.

    • Radenko Zec
      March 27, 2015 at 2:45 pm

      You can store JSON in varchar(max) column as plain text but you cannot query it. If you need to store JSON files in Db and query it you can look at Azure DocumentDb.

  4. Danilo R
    November 24, 2015 at 4:00 pm

    Hi, I know post is a little bit old, but I wanted to comment on this if someone finds this useful. (As I did!)

    The code snippets are pretty handy. However, calling “transaction.Commit();” after calling the lines in the “catch (Exception)” block, will throw another exception since the connection has already been closed.

    That being said. I would remove “connection.Close();” from the “catch (Exception)” block in order to avoid this. PS: Since we have a “using” block, it will close the connection in any case.

  5. Aaron Merritt
    March 11, 2016 at 4:03 pm

    I hate to revive an old topic but I can’t find the answer to the problem I have from using this.. This method results in literally 34x the speed of which it was saving before on my end which is really awesome.

    My problem though is that when it saves the list of string values into SQL, they’re stored in an integer format and when I read the values back into the string, the values don’t convert back to a string..

    I’m certain that I’m just an idiot but can anyone shed light on the matter for me?

  6. Carlos Neal
    September 5, 2017 at 11:43 pm

    awesome post! Thanks helped me a lot 🙂
