Bulk Insert from Generic List into SQL Server with minimum lines of code

Entity Framework and even classic ADO.NET are very slow when you need to perform a large number of inserts into the database.

Calling INSERT INTO 1,000 times is not efficient, because SQL Server also has to write to the transaction log for every INSERT operation.
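
To make the contrast concrete, here is the row-by-row pattern to avoid (a minimal sketch; the connectionString value and the dbo.Person table are assumptions used for illustration):

 // Anti-pattern: one round trip and one transaction-log write per row.
 using (var connection = new SqlConnection(connectionString))
 {
     connection.Open();
     foreach (var person in listPerson)
     {
         using (var command = new SqlCommand(
             "INSERT INTO dbo.Person (Id) VALUES (@Id)", connection))
         {
             command.Parameters.AddWithValue("@Id", person.Id);
             command.ExecuteNonQuery(); // executes once per item
         }
     }
 }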

A nice workaround for this is to use the T-SQL BULK INSERT statement or the SqlBulkCopy class in C#.

However, the SqlBulkCopy method WriteToServer does not accept List<T>, IList<T>, or IEnumerable<T> as a parameter.

Implement an AsDataTable extension method for IEnumerable<T>

The workaround is to implement an AsDataTable extension method on IEnumerable<T>, because WriteToServer does accept a DataTable as a parameter.

Here is a simple implementation:

 using System;
 using System.Collections.Generic;
 using System.ComponentModel;
 using System.Data;

 public static class IEnumerableExtensions
 {
     public static DataTable AsDataTable<T>(this IEnumerable<T> data)
     {
         // Build one column per public property of T, unwrapping Nullable<T>
         // so the DataTable column gets the underlying type.
         PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));
         var table = new DataTable();
         foreach (PropertyDescriptor prop in properties)
             table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);

         // Copy each item into a row, mapping null values to DBNull.Value.
         foreach (T item in data)
         {
             DataRow row = table.NewRow();
             foreach (PropertyDescriptor prop in properties)
                 row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
             table.Rows.Add(row);
         }
         return table;
     }
 }

After this, we can use the extension method to bulk insert data from a List<T> into SQL Server:

 var listPerson = new List<Person>
 {
     new Person { Id = 1 },
     new Person { Id = 2 }
 };

 using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["SomeConnectionString"].ConnectionString))
 {
     connection.Open();
     SqlTransaction transaction = connection.BeginTransaction();

     using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
     {
         bulkCopy.BatchSize = 100;
         bulkCopy.DestinationTableName = "dbo.Person";
         try
         {
             bulkCopy.WriteToServer(listPerson.AsDataTable());
             transaction.Commit();
         }
         catch (Exception)
         {
             // Roll back on failure and rethrow. Do not call connection.Close()
             // here; the using block disposes the connection either way.
             transaction.Rollback();
             throw;
         }
     }
 }

The optimal BatchSize depends on the number of items you need to insert, the row size, and the network bandwidth and latency.

After doing some testing, I can recommend a BatchSize of 100 when you need to insert a few thousand items into the database.

The best way to find the optimal BatchSize value is to experiment yourself.
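
If you want to measure this yourself, here is a rough sketch (the connectionString value, the dbo.Person table, and the listPerson data from above are assumed) that times the same bulk insert with several candidate batch sizes:

 // Rough benchmark: time the same bulk insert with different batch sizes.
 // Clear the destination table between runs to keep the numbers comparable.
 foreach (var batchSize in new[] { 50, 100, 500, 1000, 5000 })
 {
     var stopwatch = System.Diagnostics.Stopwatch.StartNew();

     using (var connection = new SqlConnection(connectionString))
     {
         connection.Open();
         using (var bulkCopy = new SqlBulkCopy(connection))
         {
             bulkCopy.BatchSize = batchSize;
             bulkCopy.DestinationTableName = "dbo.Person";
             bulkCopy.WriteToServer(listPerson.AsDataTable());
         }
     }

     stopwatch.Stop();
     Console.WriteLine($"BatchSize {batchSize}: {stopwatch.ElapsedMilliseconds} ms");
 }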

If you like this article, don’t forget to subscribe to this blog so you don’t miss upcoming posts.

14 Comments on Bulk Insert from Generic List into SQL Server with minimum lines of code

  1. Stephen Yeadon
    October 17, 2014 at 12:56 pm

    Nice extension method. I have used the SqlBulkCopy class many times in the past, but my implementations have always been more clunky and required some understanding of the types in question. I like this generic approach.

  2. swapnil dane
    March 27, 2015 at 1:29 pm

    Excellent article. I am a SQL Server developer and new to .NET programming. I was looking for something to load a List into SQL Server tables. This worked perfectly for me. Thank you!!

  3. swapnil dane
    March 27, 2015 at 1:33 pm

    Can I extend this to load a JSON feed into a SQL Server table? I really hope that Microsoft will provide some native support for JSON.

    • Radenko Zec
      March 27, 2015 at 2:45 pm

      You can store JSON in a varchar(max) column as plain text, but you cannot query it. If you need to store JSON files in the database and query them, you can look at Azure DocumentDb.

  4. Danilo R
    November 24, 2015 at 4:00 pm

    Hi, I know this post is a little bit old, but I wanted to comment on it in case someone finds this useful. (As I did!)

    The code snippets are pretty handy. However, calling “transaction.Commit();” after the lines in the “catch (Exception)” block have run will throw another exception, since the connection has already been closed.

    That being said, I would remove “connection.Close();” from the “catch (Exception)” block in order to avoid this. PS: Since we have a “using” block, it will close the connection in any case.

  5. Aaron Merritt
    March 11, 2016 at 4:03 pm

    I hate to revive an old topic, but I can’t find the answer to the problem I have from using this. This method results in literally 34x the speed it was saving at before on my end, which is really awesome.

    My problem though is that when it saves the list of string values into SQL, they’re stored in an integer format, and when I read the values back into the string, the values don’t convert back to a string.

    I’m certain that I’m just an idiot but can anyone shed light on the matter for me?

  6. Carlos Neal
    September 5, 2017 at 11:43 pm

    Awesome post! Thanks, helped me a lot 🙂

  7. James
    September 21, 2018 at 9:47 am

    This article is so interesting. It helps me a lot. But how can I implement it in Oracle?

  8. James
    September 24, 2018 at 8:25 am

    Hi Radenko, how can we implement it in Oracle?

  9. Jaime
    December 7, 2018 at 3:04 pm

    Some users will need to add column mappings:

    bulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping("Column1", "Column1"));
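
    A minimal sketch of mapping every column by name in one loop, assuming the property names of T match the destination table’s column names:

    // Add a same-name mapping for each column of the DataTable built by
    // AsDataTable(); property names are assumed to match the table's columns.
    foreach (DataColumn column in table.Columns)
    {
        bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
    }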

  10. Ben
    August 20, 2019 at 9:39 pm

    Hi, and many thanks for your magnificent extension method. It would be really helpful if you could explain step by step what you have done in the extension method, and why.

  11. M@
    July 17, 2020 at 6:27 pm

    Great idea and exactly what I needed. However, I have a suggested improvement.

    I almost gave up on making it work because I kept getting the error “Cannot access destination table [SchemaName.TableName]”. I finally realized where I was going wrong and that this will only work for tables located in the default “dbo” schema.

    The problem is that in the DestinationTableName, you are wrapping the table name in brackets (which is a good step to deal with special characters in names), but it doesn’t work if I also include a schema name (MySchema.MyTable) because it ends up making the table name [MySchema.MyTable] when it should be [MySchema].[MyTable].

    I fixed this by adding a Schema parameter (defaulting to “dbo”) and changing the line to:

    bulkCopy.DestinationTableName = $"[{Schema}].[{TableName}]";

    This ensures that the fully qualified name is always used.
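
    A minimal sketch of what such a schema-aware helper could look like (the method and parameter names here are illustrative):

    // Hypothetical helper: schema defaults to "dbo" so existing callers keep working.
    static void BulkInsert(SqlConnection connection, DataTable table,
                           string tableName, string schema = "dbo")
    {
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            // Bracket schema and table separately:
            // [MySchema].[MyTable], not [MySchema.MyTable].
            bulkCopy.DestinationTableName = $"[{schema}].[{tableName}]";
            bulkCopy.WriteToServer(table);
        }
    }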

    I hope this helps someone who is having the same problem.

