Efficiently Loading Large Datasets: C# and SqlBulkCopy for Bulk Inserts in SQL Server

2024-07-27

Inserting large amounts of data into SQL Server row by row with individual INSERT statements is slow and inefficient: every row pays for its own round trip to the server and its own statement overhead. This is where bulk insert techniques come into play.
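
For comparison, a row-by-row load might look something like the sketch below; the connection string, MyTable name, and columns are placeholders, and every iteration issues a separate INSERT round trip.

using System;
using System.Collections.Generic;
using System.Data.SqlClient;

public class RowByRowInsertExample
{
    public static void Main(string[] args)
    {
        // Placeholder connection string; adjust to your environment
        string connectionString = "Data Source=localhost;Initial Catalog=MyDatabase;Integrated Security=True";

        // Sample data to load
        var rows = new List<(int Id, string Name)> { (1, "Alice"), (2, "Bob"), (3, "Charlie") };

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();

            foreach (var row in rows)
            {
                // One INSERT statement (and one server round trip) per row
                using (SqlCommand command = new SqlCommand(
                    "INSERT INTO MyTable (ID, Name) VALUES (@id, @name)", connection))
                {
                    command.Parameters.AddWithValue("@id", row.Id);
                    command.Parameters.AddWithValue("@name", row.Name);
                    command.ExecuteNonQuery();
                }
            }
        }

        Console.WriteLine("Row-by-row insert completed.");
    }
}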

The Solution: SqlBulkCopy Class

The .NET Framework provides the SqlBulkCopy class in the System.Data.SqlClient namespace (on modern .NET, the same class is available from the Microsoft.Data.SqlClient package). This class allows you to perform bulk inserts into SQL Server tables significantly faster than traditional row-by-row insertion.

Steps Involved:

  1. Establish a Connection:

    • Create a connection string containing the details of your SQL Server instance.
    • Use the SqlConnection class to establish a connection to the database.
  2. Create a SqlBulkCopy Object:

    • Instantiate SqlBulkCopy with the open connection and set its DestinationTableName property to the target table.
  3. Define Table Mapping (Optional):

    • Add entries to the ColumnMappings collection if the source column names or order differ from the destination table.
  4. Set Batch Size (Optional):

    • Set the BatchSize property to control how many rows are sent to the server in each batch.
  5. Write Data to SQL Server:

    • Call WriteToServer, passing a DataTable, an array of DataRow objects, or an IDataReader as the data source.
  6. Close the Connection:

    • Dispose of the SqlBulkCopy object and close the connection; wrapping both in using blocks handles this automatically.

Code Example:

using System;
using System.Data;
using System.Data.SqlClient;

public class BulkInsertExample
{
    public static void Main(string[] args)
    {
        // Connection string (replace with your details)
        string connectionString = "Data Source=localhost;Initial Catalog=MyDatabase;Integrated Security=True";

        // Create a DataTable with columns matching the destination table
        DataTable data = new DataTable();
        data.Columns.Add("ID", typeof(int));
        data.Columns.Add("Name", typeof(string));

        // Add some sample data rows
        data.Rows.Add(1, "Alice");
        data.Rows.Add(2, "Bob");
        data.Rows.Add(3, "Charlie");

        // Create a SqlBulkCopy object
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();

            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "MyTable"; // Replace with your table name

                // Optional: Map source columns if names don't match
                bulkCopy.ColumnMappings.Add("ID", "ID");
                bulkCopy.ColumnMappings.Add("Name", "Name");

                // Optional: Set batch size (experiment for best performance)
                bulkCopy.BatchSize = 1000;

                // Write the data to SQL Server
                bulkCopy.WriteToServer(data);
            }
        }

        Console.WriteLine("Bulk insert completed successfully!");
    }
}

Additional Considerations:

  • Error Handling: Wrap the WriteToServer call in a try/catch block so that exceptions raised during the bulk insert are caught and reported (see the sketch after this list).
  • Transactions: Use a database transaction when the bulk insert involves multiple inserts that need to succeed or fail as a whole; SqlBulkCopy can participate in an external SqlTransaction (also shown below).
  • Performance Optimization: Experiment with different batch sizes and other SqlBulkCopy options to find the most efficient configuration for your specific dataset and hardware.
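
Here is a rough sketch of the first two points, reusing the placeholder MyDatabase/MyTable setup from the example above: the bulk copy joins an explicit SqlTransaction, and the write is wrapped in a try/catch so a failure rolls the whole load back.

using System;
using System.Data;
using System.Data.SqlClient;

public class BulkInsertWithTransactionExample
{
    public static void Main(string[] args)
    {
        // Connection string (replace with your details)
        string connectionString = "Data Source=localhost;Initial Catalog=MyDatabase;Integrated Security=True";

        // Build the same kind of DataTable used in the main example
        DataTable data = new DataTable();
        data.Columns.Add("ID", typeof(int));
        data.Columns.Add("Name", typeof(string));
        data.Rows.Add(1, "Alice");
        data.Rows.Add(2, "Bob");

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Start an explicit transaction so the whole load commits or rolls back together
            using (SqlTransaction transaction = connection.BeginTransaction())
            {
                try
                {
                    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
                    {
                        bulkCopy.DestinationTableName = "MyTable"; // Replace with your table name
                        bulkCopy.BatchSize = 1000;
                        bulkCopy.WriteToServer(data);
                    }

                    transaction.Commit();
                    Console.WriteLine("Bulk insert committed.");
                }
                catch (Exception ex)
                {
                    // Undo any rows written before the failure
                    transaction.Rollback();
                    Console.WriteLine("Bulk insert failed and was rolled back: " + ex.Message);
                }
            }
        }
    }
}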



Bulk Insert from a DataTable:

using System;
using System.Data;
using System.Data.SqlClient;

public class BulkInsertExample
{
    public static void Main(string[] args)
    {
        // Connection string (replace with your details)
        string connectionString = "Data Source=localhost;Initial Catalog=MyDatabase;Integrated Security=True";

        // Create a DataTable with columns matching the destination table
        DataTable data = new DataTable();
        data.Columns.Add("ID", typeof(int));
        data.Columns.Add("Name", typeof(string));

        // Add some sample data rows
        data.Rows.Add(1, "Alice");
        data.Rows.Add(2, "Bob");
        data.Rows.Add(3, "Charlie");

        // Create a SqlBulkCopy object
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();

            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "MyTable"; // Replace with your table name

                // Write the data to SQL Server
                bulkCopy.WriteToServer(data);
            }
        }

        Console.WriteLine("Bulk insert completed successfully!");
    }
}

Explanation:

  • Creates a DataTable with columns matching the destination table structure.
  • Adds sample data rows.
  • Establishes a connection to the SQL Server database using a connection string.
  • Sets the DestinationTableName property to specify the target table.
  • Uses WriteToServer method to insert data from the DataTable into the target table.
  • Closes the connection and disposes of the SqlBulkCopy object automatically via the using blocks for proper resource management.

Bulk Insert from IDataReader (Custom Implementation):

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public class BulkInsertExample
{
    public static void Main(string[] args)
    {
        // Connection string (replace with your details)
        string connectionString = "Data Source=localhost;Initial Catalog=MyDatabase;Integrated Security=True";

        // Sample data (replace with your data source)
        List<Product> products = new List<Product>()
        {
            new Product(1, "Product 1", 10.99m),
            new Product(2, "Product 2", 15.50m),
            new Product(3, "Product 3", 22.75m)
        };

        // Create a custom IDataReader implementation
        MyCustomDataReader reader = new MyCustomDataReader(products);

        // Create a SqlBulkCopy object
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();

            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "Products"; // Replace with your table name

                // Optional: Map source columns if names don't match
                bulkCopy.ColumnMappings.Add("ProductID", "ID");
                bulkCopy.ColumnMappings.Add("ProductName", "Name");
                bulkCopy.ColumnMappings.Add("Price", "Price");

                // Write the data to SQL Server
                bulkCopy.WriteToServer(reader);
            }
        }

        Console.WriteLine("Bulk insert completed successfully!");
    }

    public class Product
    {
        public int ProductID { get; set; }
        public string ProductName { get; set; }
        public decimal Price { get; set; }

        public Product(int productID, string productName, decimal price)
        {
            ProductID = productID;
            ProductName = productName;
            Price = price;
        }
    }

    public class MyCustomDataReader : IDataReader
    {
        private static readonly string[] ColumnNames = { "ProductID", "ProductName", "Price" };

        private readonly List<Product> products;
        private int currentIndex = -1;
        private bool isClosed;

        public MyCustomDataReader(List<Product> products)
        {
            this.products = products;
        }

        public object this[int ordinal] => GetValue(ordinal);

        public object this[string name] => GetValue(GetOrdinal(name));

        public int Depth => 0;

        public bool HasRows => products != null && products.Count > 0;

        public bool IsClosed => isClosed;

        // -1 is the conventional value for a reader that does not affect rows
        public int RecordsAffected => -1;

        public void Close() => isClosed = true;

        public void Dispose() => Close();

        // Members SqlBulkCopy uses to pull data: FieldCount, GetOrdinal, GetValue and Read
        public int FieldCount => ColumnNames.Length;

        public string GetName(int ordinal) => ColumnNames[ordinal];

        public int GetOrdinal(string name) => Array.IndexOf(ColumnNames, name);

        // Advance to the next product; returns false when the list is exhausted
        public bool Read() => ++currentIndex < products.Count;

        public object GetValue(int ordinal)
        {
            Product current = products[currentIndex];
            switch (ordinal)
            {
                case 0: return current.ProductID;
                case 1: return current.ProductName;
                case 2: return current.Price;
                default: throw new IndexOutOfRangeException();
            }
        }

        public bool IsDBNull(int ordinal) => GetValue(ordinal) == null;

        // The remaining IDataReader/IDataRecord members are not needed in this example
        public DataTable GetSchemaTable() => null;
        public bool NextResult() => false;
        public bool GetBoolean(int i) => (bool)GetValue(i);
        public byte GetByte(int i) => (byte)GetValue(i);
        public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferOffset, int length) => throw new NotSupportedException();
        public char GetChar(int i) => (char)GetValue(i);
        public long GetChars(int i, long fieldOffset, char[] buffer, int bufferOffset, int length) => throw new NotSupportedException();
        public IDataReader GetData(int i) => throw new NotSupportedException();
        public string GetDataTypeName(int i) => GetFieldType(i).Name;
        public DateTime GetDateTime(int i) => (DateTime)GetValue(i);
        public decimal GetDecimal(int i) => (decimal)GetValue(i);
        public double GetDouble(int i) => (double)GetValue(i);
        public Type GetFieldType(int i) => i == 0 ? typeof(int) : i == 1 ? typeof(string) : typeof(decimal);
        public float GetFloat(int i) => (float)GetValue(i);
        public Guid GetGuid(int i) => (Guid)GetValue(i);
        public short GetInt16(int i) => (short)GetValue(i);
        public int GetInt32(int i) => (int)GetValue(i);
        public long GetInt64(int i) => (long)GetValue(i);
        public string GetString(int i) => (string)GetValue(i);
        public int GetValues(object[] values) => throw new NotSupportedException();
    }
}

Alternative Approaches:

SqlBulkCopy is not the only way to load data in bulk. Depending on your requirements, one of the following tools may be a better fit.

SQL Server Integration Services (SSIS):

  • Pros:
    • Powerful ETL (Extract, Transform, Load) tool for complex data integration scenarios.
    • Graphical user interface allows for visual design of data flows.
    • Supports various data sources and destinations, including flat files, databases, and more.
    • Can handle data transformations and cleansing before inserting into the target table.
  • Cons:
    • Steeper learning curve compared to SqlBulkCopy.
    • Requires additional configuration and setup in SQL Server Management Studio (SSMS).
    • Might be overkill for simple bulk insert operations.

BCP Utility:

  • Pros:
    • Command-line tool included with SQL Server for high-performance bulk data import and export.
    • Offers granular control over the import process through command-line options.
    • Can be scripted for automation (a sample command follows this list).
  • Cons:
    • Command-line interface requires some technical knowledge.
    • Less user-friendly than SqlBulkCopy or SSIS.
    • Limited data transformation capabilities.
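
As a rough illustration, a character-mode import of a comma-delimited file over a trusted connection might look like the command below; the database, table, and file names are placeholders.

bcp MyDatabase.dbo.MyTable in products.csv -S localhost -T -c -t ","

Here -S names the server, -T uses Windows authentication, -c imports character data, and -t sets the field terminator.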

Third-Party Libraries:

  • Pros:
    • Some libraries offer additional features beyond SqlBulkCopy, such as parallel processing or advanced error handling.
    • Can simplify integration with specific data sources or frameworks.
  • Cons:
    • Introduce external dependencies.
    • Might require additional licensing costs.
    • May not be as well-maintained or supported compared to Microsoft's built-in tools.

Choosing the Right Method:

The best method for you depends on the specific requirements of your bulk insert operation:

  • If you need a simple, high-performance solution for basic bulk inserts, SqlBulkCopy is an excellent choice.
  • If your data needs complex transformations or you require a visual workflow for designing the data flow, SSIS is a powerful tool to consider.
  • If you need the raw power and control of a command-line tool, or already have scripts built around it, BCP might be a good fit.
  • For specific data sources or frameworks, a third-party library might offer additional benefits.

c# sql sql-server


