Using BulkInsert to Import Data in Kentico


From product data to blog posts, populating applications with data is part of many Kentico projects, especially when a website is being redesigned. Kentico provides several options for getting this data into the site, but sometimes companies have to develop a custom solution to meet their specific needs. In this blog, I'll show you how you can leverage the BulkInsert API to simplify your imports and speed up your application.

Developers are often tasked with bringing data into their Kentico applications. This data can be historical information, product details, or any other type of content that will be used within the site. While Kentico provides a great API for developers to use, importing a large amount of data can impact the performance of the site if there are too many API calls at once.

To help with this issue, the BulkInsert API provides a way to import a large number of records in a single SQL execution. This functionality reduces the amount of code that developers must write while providing a streamlined process for importing the data. Let me show you just how easy it can be.

A Completely Realistic Scenario

For my scenario, say I need to import a lot of exoplanet data into a custom table. If you don’t know, exoplanets are celestial bodies found outside our solar system, which, of course, is extremely important to my site where I sell coffee. I downloaded the data from NASA and imported it into a new database for my demo. I now need to create the functionality to import it on a regular basis.

Creating a Scheduled Task

The first step is to create a scheduled task for my import process. The task itself is pretty standard, with the exception of two parameters that are passed in the TaskData field for use inside the Execute method.

[assembly: RegisterCustomClass("Custom.BulkImportDemoTask", typeof(Custom.BulkImportDemoTask))]

namespace Custom
{
    public class BulkImportDemoTask : ITask
    {
        /// <summary>
        /// Executes the task.
        /// </summary>
        /// <param name="ti">Info object representing the scheduled task</param>
        public string Execute(TaskInfo ti)
        {
            try
            {
                string[] taskparams = ti.TaskData.Split(',');
                bool blnCleanupData = ValidationHelper.GetBoolean(taskparams[0], false);
                bool blnOptimized = ValidationHelper.GetBoolean(taskparams[1], false);

                return "";
            }
            catch (Exception ex)
            {
                EventLogProvider.LogException("BulkImportDemoTask", "EXECUTE", ex);
                return ex.Message;
            }
        }
    }
}
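In my demo, the blnCleanupData flag is meant to let the task clear out existing records before a fresh import, and blnOptimized will decide which import path to take later on. The task above doesn't show the cleanup itself, so here is a minimal sketch of one way it could look, assuming the custom table's class name is custom.exoplanets and using the CustomTableItemProvider API:

// Hypothetical cleanup step, run only when the first TaskData parameter is true
if (blnCleanupData)
{
    // Get every existing item in the custom table and delete it
    var existingItems = CustomTableItemProvider.GetItems("custom.exoplanets");
    foreach (CustomTableItem item in existingItems)
    {
        item.Delete();
    }
}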

Getting the Import Data

Once the task is created, the next step is to get the exoplanet data I want to import. In my demo, this data lives in a single table in another database. I use the GeneralConnection API to connect to that database (using a different connection string) and pull the data into a DataSet.

// Get import data
// In this demo, I'm getting it from another database by specifying a different connection string
GeneralConnection cnExoplanets = ConnectionHelper.GetConnectionByName("ExoplanetConnectionString");
QueryParameters qp = new QueryParameters("SELECT * FROM Exoplanets", null, QueryTypeEnum.SQLQuery, false);
DataSet dsImport = cnExoplanets.ExecuteQuery(qp);

Your data may be in a file, an RSS feed, or any other repository. Regardless of where it is, using the Kentico APIs to get the data has a lot of benefits, including debugging capabilities and error reporting.
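For example, if the source were a flat file instead of a database, you could build the DataTable yourself before handing it off to the import code. Here is a rough sketch that assumes a hypothetical CSV file with name, star_name, and discovered columns (the path and layout are made up for illustration):

// Sketch: load a hypothetical CSV file (header row plus comma-separated values) into a DataTable
DataTable dtImport = new DataTable();
dtImport.Columns.Add("name", typeof(string));
dtImport.Columns.Add("star_name", typeof(string));
dtImport.Columns.Add("discovered", typeof(int));

// Skip the header row and add each line as a new row
foreach (string line in File.ReadLines(@"C:\Import\exoplanets.csv").Skip(1))
{
    string[] fields = line.Split(',');
    dtImport.Rows.Add(fields[0], fields[1], ValidationHelper.GetInteger(fields[2], 1900));
}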

Importing Single Records

With the data in a usable object, I am ready to implement my import functionality. To start, I use the API to import each item individually. While this method works fine, it could potentially cause performance issues, especially if I were importing 100,000+ records.

// This section uses the API to loop through each record and perform an INSERT

// Prepares the code name (class name) of the custom table to which the data records will be added
string customTableClassName = "custom.exoplanets";

// Gets the custom table
DataClassInfo customTable = DataClassInfoProvider.GetDataClassInfo(customTableClassName);
if (customTable != null)
{
    // Loop through each record and perform an insert
    foreach (DataRow dr in dsImport.Tables[0].Rows)
    {
        // Creates a new custom table item
        CustomTableItem newCustomTableItem = CustomTableItem.New(customTableClassName);

        // Sets the values for the fields of the custom table
        newCustomTableItem.SetValue("PlanetName", ValidationHelper.GetString(dr["name"], ""));
        newCustomTableItem.SetValue("StarName", ValidationHelper.GetString(dr["star_name"], ""));
        newCustomTableItem.SetValue("YearDiscovered", ValidationHelper.GetInteger(dr["discovered"], 1900));

        // Save the new custom table record into the database
        newCustomTableItem.Insert();
    }
}

You can see in the code that for each item in my DataSet, I’m using the CustomTableItem API to import the new row. This means there would be a separate SQL call for each row and a lot of overhead as the task executes.

Import Using BulkInsert

To streamline the above functionality, I convert the code to use the BulkInsert API. First, I create a BulkInsertSettings object to hold the column mappings between my source data and the destination custom table.

// First, create a BulkInsertSettings object to hold the field mappings for the source and destination tables
BulkInsertSettings settings = new BulkInsertSettings
{
    Mappings = new Dictionary<string, string>
    {
        {"name", "PlanetName"},
        {"star_name", "StarName"},
        {"discovered", "YearDiscovered"}
    }
};

The last piece is to call the BulkInsert method of the ConnectionHelper, passing in the import data and the BulkInsertSettings mappings. This performs the import as a single BULK INSERT SQL operation.

// Use the BulkInsert API to insert all of the items at once
ConnectionHelper.BulkInsert(dsImport.Tables[0], "custom_exoplanets", settings);
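Tying this back to the scheduled task, the blnOptimized parameter from TaskData is a natural switch between the two approaches. That wiring isn't shown above, so treat the following as a sketch of how the Execute method might choose a path:

// Sketch: choose the import path based on the second TaskData parameter
if (blnOptimized)
{
    // Single BULK INSERT for the whole DataTable
    ConnectionHelper.BulkInsert(dsImport.Tables[0], "custom_exoplanets", settings);
}
else
{
    // Fall back to the row-by-row CustomTableItem inserts shown earlier
}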

Moving Forward

As you can see above, the BulkInsert API can greatly reduce the complexity of your code and simplify your import process. Because there is a single API call and a single SQL transaction, this functionality can significantly improve performance and enable you to import large amounts of data efficiently. Always be thinking about how your custom code can affect the overall site and the performance of other systems. Good luck!

Get the code

This blog is intended for informational purposes only and provides an example of one of the many ways to accomplish the described task. Always consult Kentico Documentation for the best practices and additional examples that may be more effective in your specific situation.


Bryan Soltis

Hello. I am a Technical Evangelist here at Kentico and will be helping the technical community by providing guidance and best practices for all areas of the product. I might also do some karaoke. We'll see how the night goes...