Blasty
-
7/2/2010 3:01:58 PM
SQLImport comments
Ok, so I tried to use the SQLImport utility provided with Kentico, and while I appreciate the effort to provide a tool of this nature, in its current state it is not usable for me. I'd like to describe the issues I had with it, in the hope that you can improve the tool in future releases.
The version I used came with the current Kentico 5.5 install.
Issues:
1. Transaction size. The import utility wraps everything up into one huge transaction; depending on the amount of data being imported, this can be a hard-drive nightmare for the database. After importing about 11,000 items we had a very large transaction log going, and we had to abort our initial import of 48,000 items.
2. Transaction handling. While transactional safety is a good thing, the SQLImport utility fails to stop an import when something fails. It doesn't appear to roll the transaction back until the very end of the import, even when a failure at the very beginning already guarantees the transaction can never be committed. This can waste, and has wasted, an amazing amount of time.
3. Error reporting. On a small 100-item import I got a notification of the reason for the failure, but the large import just popped up a standard .NET exception dialog. This could definitely use better error handling, considering you're dealing with potentially bad data from a customer's database.
4. Threading. I'm taking a wild guess that a web programmer created this Windows Forms application, considering some of the issues. The import process apparently doesn't run on a separate thread, since the UI locks up completely and there is no progress bar or any notification of how the import is going. To see how far it had gotten, I had to get what I call "ghetto updates" by running NOLOCK count statements against the custom.[doc] table the items were importing into.
5. No saved mappings. At first I was going to stagger the import via the SELECT statement I supplied and only do 1,000 items at a time. While this worked, it would have been really nice to be able to save my document mappings when choosing the new-import option. Every time I used the utility I had to remap the same document to the same fields.
6. No merge functionality at all. My database was filled with test data from developing the site, and the import utility was simply going to insert the new data on top of what was already there. An option to delete all existing documents would have been nice; deleting 6,000 test documents via CMS Desk consumed quite a bit of time.
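For anyone hitting the same UI lock-up described in item 4, the "ghetto update" workaround is just a dirty-read count against the target table. A minimal sketch of the kind of statement I ran; the table name is specific to my site, so substitute your own coupled-data table:

```sql
-- Dirty-read count: WITH (NOLOCK) keeps the query from blocking
-- behind the import's long-running open transaction.
SELECT COUNT(*)
FROM custom.[doc] WITH (NOLOCK);
```

Since NOLOCK reads uncommitted rows, the number is approximate and will vanish if the transaction rolls back, but it is good enough to tell whether the import is still making progress.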
I eventually discovered the reason the utility was failing on import and unwinding everything: the field I chose for the document name was blank on some rows. Unfortunately, even though something as fundamental as the document name was invalid, the application still spent another 5 hours attempting to import data that was guaranteed to be rolled back.
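Given that a blank document name was what sank the big import, a pre-flight check against the source data would have saved those 5 hours. A rough sketch, assuming a hypothetical source table and column (SourceDocs and DocName are placeholders for whatever table and field you map to the document name):

```sql
-- Find rows whose mapped document-name column is NULL, empty, or
-- whitespace-only; these are the rows the import would choke on.
SELECT *
FROM SourceDocs
WHERE DocName IS NULL
   OR LTRIM(RTRIM(DocName)) = '';
```

Running something like this before pointing SQLImport at the data at least lets you fix or exclude the bad rows up front instead of finding out at rollback time.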
So, in summary: I'm glad you made a utility like this for people, it just needs some clean-up. In the meantime I created a console application that uses your TreeEngine API to do node.Insert calls. I would love something similar to a LINQ DataContext's InsertAllOnSubmit, so you could interact with the database in a more efficient manner.
I'll save my comments on the SQL statements being generated for another thread.