OutOfMemoryException while updating documents in a loop.

Breck Caine asked on September 17, 2021 13:25

I have a file of updates for a field on COM_SKU, keyed by SKUNAME. For each record I get the document with GetDocuments:

    DocumentQuery docs = DocumentHelper.GetDocuments("CMSProduct.Transmission")
        .Path("/Products/", PathTypeEnum.Children)
        .Where("skuname = '" + parts[0] + "'");

Then I loop through the TreeNodes (there should only be one document):

foreach (TreeNode doc in docs)

And I set the value and do an update:

doc["oenumbers"] = parts[1].Trim();
doc.Update();

I have tried calling Dispose() on each object at the end of the loop, but memory still runs away. I can do the updates directly in the database, but I was trying to use the preferred approach. Without the Update(), the process runs fine.

Correct Answer

Sean Wright answered on September 20, 2021 20:13

I am using a console application

What console application?

I will process these in smaller batches for now

I always recommend doing bulk updates in batches.

I would also recommend using the CMSActionContext (as detailed here) to limit what side-effect operations occur when you are programmatically updating many documents.
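As a rough sketch, assuming your update loop from above (which flags you disable depends on your project - the ones below are just common examples):

    // Requires CMS.Base (CMSActionContext) and CMS.DocumentEngine (TreeNode).
    // Rough sketch: turn off common side effects while doing the bulk update.
    using (new CMSActionContext
    {
        LogEvents = false,           // skip event log entries
        LogSynchronization = false,  // skip creating staging tasks
        CreateSearchTask = false,    // skip smart search indexing tasks
        TouchParent = false          // don't update the parent document
    })
    {
        foreach (TreeNode doc in docs)
        {
            doc["oenumbers"] = parts[1].Trim();
            doc.Update();
        }
    }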

Often when my team has to do bulk updates like this, we create an .aspx page in the CMS\CMSPages folder (checking for a specific role on Page_Load) and then build out a simple UI that allows us to specify the number of items to update per batch, and the minimum DocumentId / NodeId for starting the next batch.

When we query for documents that are being updated, we ensure only documents with an Id >= the Id submitted with the form are processed, limiting the result set to the batch size.

When the batch completes, we display the results on the page, including the Max Id of all the pages processed, so we know where to pick up for the next batch.
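A rough sketch of what one batch looks like, using your page type from above (lastProcessedId and batchSize are illustrative names for the values submitted from the form):

    // Requires CMS.DocumentEngine and System.Linq.
    // Fetch one batch, ordered by NodeID, starting from the ID carried
    // over from the previous batch, and materialize it before updating.
    var batch = DocumentHelper.GetDocuments("CMSProduct.Transmission")
        .Path("/Products/", PathTypeEnum.Children)
        .Where("NodeID >= " + lastProcessedId)
        .OrderBy("NodeID")
        .TopN(batchSize)
        .ToList();

    foreach (TreeNode doc in batch)
    {
        // ... apply the field updates for this document ...
        doc.Update();
    }

    // Display batch.Max(d => d.NodeID) so the next batch knows where to start.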


Recent Answers


Sean Wright answered on September 18, 2021 07:21

It's good that you are trying to use the Kentico API to update these docs instead of doing it directly in the database - there's a lot of internal processing that happens when using the API and you could end up with data in an inconsistent state if you try updating in the database directly.

  1. Where is this code executing that is updating these Pages?

  2. Do you have any global events defined in your application?

  3. If you are using global events, you might need to handle recursion with recursion control (described later in that article).

  4. How many documents are you bringing into memory in your query?

  5. Try forcing the collection to materialize with the .TypedResult property (or .ToList()) at the end of your query (see the sketch after this list) - this is always a good practice.

  6. Do you update docs successfully through the API in any other parts of your app? (I'm assuming updating Pages in the Pages module of the administration app works without issue)
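For item 5, a minimal sketch using the query from your question (I swapped the string Where for WhereEquals, which also avoids building the SQL condition by hand):

    // Requires CMS.DocumentEngine, System.Collections.Generic and System.Linq.
    // Materialize the result set up front, then update the documents.
    List<TreeNode> docs = DocumentHelper.GetDocuments("CMSProduct.Transmission")
        .Path("/Products/", PathTypeEnum.Children)
        .WhereEquals("SKUName", parts[0])
        .ToList();

    foreach (TreeNode doc in docs)
    {
        doc["oenumbers"] = parts[1].Trim();
        doc.Update();
    }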


Breck Caine answered on September 20, 2021 19:56

I am using a console application to do a one-time mass update. There are about 3000 documents to update. I did try the ToList approach. That seemed to let it run longer before hitting memory issues, but it still eventually crashed. I will process these in smaller batches for now and look for some general ASPX memory solutions.


Breck Caine answered on September 22, 2021 10:24

The console application is just an ASP.NET C# console app. I think this is an environment issue on my side. I do not get the same issue when I run the code in a different environment. The machine where I do get the issue is a Hyper-V virtual machine, so maybe that is somehow part of the problem.

