MVC site too slow? Caching to the rescue!


When was the last time you desperately needed a site to load and it was taking forever? How long did you wait? Five seconds, thirty, or even a whole minute? According to Google's aggregated data, 53% of visitors will leave if a page takes more than three seconds to load. Is your MVC site fast enough?

Using the MVC development model is already a performance improvement compared to older WebForms. However, we all know how it goes with larger, feature-rich sites. As you add more features, you start querying Kentico's API more often, and over time the database starts sweating. Sure, you can throw more hardware at any performance issue that surfaces (at least until you run out of budget), but today I'll show you smarter ways to decrease server processing time. I'll explain the caching strategies you can leverage in MVC, and give you insight into .NET caching and the Kentico specifics that need to be taken into account.

Client vs Server Caching

The fastest site is the one that does not have to download anything at all. Static resources such as images, JavaScript files, stylesheets, and others are downloaded during the first page request a visitor makes against your server. These resources typically do not change frequently, so browsers store them locally. Client caching is easy to configure in Kentico's administration interface, so if you are interested in this part, see our documentation.

Server-side caching is much more interesting. Let's say there is a page on your website with lots of text data loaded from a database. Once a request arrives for this page, the server has to process everything. First, it requests the data from the database, waits for the response, processes it according to your implementation (that includes other controllers, actions, partial views and, of course, additional database calls), and finally outputs HTML. The most expensive operation here is the round trip to the database. Basically, the more requests for data there are, the longer the request processing takes.

Caching on Server

So, how can we skip the database trips and still have the data available? Once the database responds with the dataset for the first request, we can store all the data within the application memory of the MVC live site. The .NET Framework helps you with the rest: storage, naming, and memory management are handled for you automatically. This is what we call Data Caching. It's not just for database calls; it can encapsulate any piece of code with a prolonged processing time, such as third-party service calls or complicated calculations.

With cached datasets in memory, requests are already processed much faster. However, in certain cases, the output of a controller action, such as the Index action of a Home controller, is identical for many different requests. For example, when multiple anonymous visitors request the same homepage, they all get identical responses. But without further optimization, your controller processes each request all the way through, only to respond with identical output. ASP.NET comes with Output Caching support to handle these cases. It enables you to store the generated output of pages, controls, and HTTP responses in memory.
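To illustrate, here is a minimal sketch of what that could look like for a hypothetical Home controller. The duration and settings are purely illustrative, not taken from a real project:

[OutputCache(Duration = 600, VaryByParam = "none", Location = OutputCacheLocation.Server)]
public ActionResult Index()
{
    // The rendered homepage is kept in server memory for 10 minutes;
    // repeated anonymous requests are served from cache without executing this action again.
    return View();
}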

Data Caching

If you have ever used caching in Kentico, you will be pleased to know that the API has not really changed. If it's your first time looking at caching, don't worry. Kentico's API gives you CacheHelper, which is used as a wrapper around expensive code. The helper takes care of executing the code the first time and later on simply returns the cached data.

private AddressInfo GetLastUsedAddress()
{
    return CacheHelper.Cache(
        () => AddressInfoProvider.GetAddresses(ShoppingCart.ShoppingCartCustomerID)
                                 .OrderByDescending("AddressLastModified")
                                 .TopN(1)
                                 .FirstOrDefault(),
        new CacheSettings(10, "GetLastUsedAddressForCustomer", ShoppingCart.ShoppingCartCustomerID)
    );
}

In the example above, you can see that the logic loading the customer's address is moved into a CacheHelper.Cache method call. The result of the logic is stored in memory according to the configuration in the CacheSettings object. In this case, it's cached for 10 minutes under the key "GetLastUsedAddressForCustomer|{id}". The {id} placeholder is filled with a real customer ID. When the same code gets executed multiple times for the same customer, it uses the data from memory instead of asking the database through Kentico's API.

Output Caching

Everything that happens inside a controller action can be stored in the output cache using a simple filter attribute. The output of an action can vary by a number of parameters, mostly based on the request. For example, if a visitor to a medical clinic website asks for the details of doctor "Jim Perry" or doctor "Serena Bing", both of these requests go to the same controller, DoctorController, and the same action, Detail. However, the parameter of the Detail action (Guid nodeGuid) determines which doctor's detail to display. Using the OutputCache filter attribute, it's possible to add the value of such a parameter to the cache key. That instructs the caching mechanism to store a different version of the controller action's output whenever the parameter value changes.

[OutputCache(Duration = 3600, VaryByParam = "nodeGuid", Location = OutputCacheLocation.Server)]
public ActionResult Detail(Guid nodeGuid, string nodeAlias)
{
    var doctor = DoctorRepository.GetDoctor(nodeGuid);

    if (doctor == null)
    {
        return HttpNotFound();
    }

    return View(doctor);
}

Output cache should be used whenever possible. In general, it provides the highest performance boost, as it caches the fully rendered page (including views) and saves not only the data-gathering time but also the processing time needed to render and output the data.


Cache Keys and Dependencies

As you can see, both of these approaches use cache keys. A cache key is a unique identifier of cached data in server memory, so when you want to use the data again, you have a way of finding it. Think of it as a pointer to a data address. In both cases, the keys are also composed of runtime values such as IDs and GUIDs.

OK, this is great, but what if some of the data becomes outdated? What if the underlying data changes before the cached item expires? How do I ensure the website stays up-to-date?

Dependencies and Data Caching

Let's investigate cache invalidation and dependencies. Most of the time, cached data needs to be removed early because the underlying data was changed or deleted. These actions do not happen on the live site, but in the administration interface. For data caching, the mechanism for actively removing items from the cache is mostly automated and synchronized between the live site and the CMS back end via the web farm mechanism. Look at how a dependency is added through the CacheSettings object.

private AddressInfo GetLastUsedAddress()
{
    return CacheHelper.Cache(
        () => AddressInfoProvider.GetAddresses(ShoppingCart.ShoppingCartCustomerID)
                                 .OrderByDescending("AddressLastModified")
                                 .TopN(1)
                                 .FirstOrDefault(),
        new CacheSettings(10, "GetLastUsedAddressForCustomer", ShoppingCart.ShoppingCartCustomerID)
        {
            // Added cache dependency
            CacheDependency = CacheHelper.GetCacheDependency(new[] { "cms.address|all" })
        }
    );
}

Cache dependencies use a system of dummy keys. Whenever an item is manipulated in the back end, the system touches the corresponding dummy keys. If we changed an address with ID 15, Kentico would touch the following dummy keys and remove all cache items that depend on them:

cms.address|all
cms.address|byid|15
cms.address|byname|address_15

By adjusting the caching code, we made the cached dataset dependent on the `cms.address|all` dummy key. Therefore, whenever any address in the system changes, the cached data is automatically removed.
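If you cache data for a single, known address instead, the dependency can be narrowed down to the byid dummy key listed above, so that only changes to that particular address clear the item. Here is a sketch, assuming a hypothetical helper method and the standard provider call for fetching one address:

private AddressInfo GetAddress(int addressId)
{
    return CacheHelper.Cache(
        () => AddressInfoProvider.GetAddressInfo(addressId),
        new CacheSettings(10, "GetAddressDetail", addressId)
        {
            // Only changes to this particular address invalidate the cached item
            CacheDependency = CacheHelper.GetCacheDependency(new[] { "cms.address|byid|" + addressId })
        }
    );
}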

Dependencies and Output Caching

With output caching, you are storing the result of a whole controller action, not each particular data entry. The cache dependencies and the removal process work very similarly to data caching, though. The dummy cache key is added to the HttpContext.Response object; the system picks up the cache settings and creates the cache item as the response travels back to the client.

[OutputCache(Duration = 3600, VaryByParam = "nodeGuid", Location = OutputCacheLocation.Server)]
public ActionResult Detail(Guid nodeGuid, string nodeAlias)
{
    var doctor = DoctorRepository.GetDoctor(nodeGuid);

    if (doctor == null)
    {
        return HttpNotFound();
    }

    // Added cache dependency
    string dependencyCacheKey = "nodes|mvcsite|mvcsite.doctor|all";
    CacheHelper.EnsureDummyKey(dependencyCacheKey);
    HttpContext.Response.AddCacheItemDependency(dependencyCacheKey);

    return View(doctor);
}

Manually Removing Cache Items

Although the process around storing and removing cached data is very robust and handles almost everything automatically, in certain cases you may need to delete cache items manually. If that's what you're after, you can use one of two approaches:

  • Delete by cache dependency key
    The API allows you to trigger the cache dependency dummy key and remove all cache items that have a dependency on it.
  • Delete by cache item name
    Selectively removes cache items by their key.

Kentico's API provides a method for each of these use cases: CacheHelper.TouchKey and CacheHelper.Remove, respectively.
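For illustration, here is a minimal sketch of both approaches, reusing the keys from the earlier examples. The customer ID 15 is made up, and the exact overloads may differ slightly between Kentico versions:

// Touches the dummy key and removes every cache item that depends on it
CacheHelper.TouchKey("cms.address|all");

// Removes a single cache item by its name, in this case the address cached for customer 15
CacheHelper.Remove("GetLastUsedAddressForCustomer|15");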

When to Use Caching

The caching strategy for your project needs to be well thought out. A simple rule of thumb is to use both approaches and combine them to achieve the best performance.

Data cache is a good fit for almost all data-gathering implementations, and it is a must for frequently requested sets of data. By storing datasets in memory, it not only saves time when processing requests but also spares your data sources from executing the same repetitive operations over and over, saving their computation power and your money.

The output cache, on the other hand, should be used for controllers that display relatively static results, such as articles, lists of information, tiles with products, and others. If you stop for a moment and think about your project, I'm sure you'll find many similar examples.

By leveraging cache dependencies on both output and data caching, specific cache items are wiped out whenever content changes. That ensures your website stays up-to-date for all visitors even though the data is served from cache most of the time.

In the end, there is no site that could justify not using cache at all. From my experience, I've seen critical performance issues, such as requests taking more than eight seconds to resolve, on an MVC site lacking any caching implementation. Remember how long a visitor is happy to wait for a page to load? After eight seconds, your visitor and your conversion goals are gone.

In this particular case, the problem was caused by a client insisting on always-fresh data. Even though it was an internal company website, the loading times were annoying a great number of people.

Use Cache and Dependencies Whenever Possible

Try not to fall into this trap and aim to cache as much as you can. If not the whole output, then at least the data items. If not for a whole day, then at least for five minutes. Cache dependencies will take care of the rest and keep your site current.

You can find more information about caching in our documentation. Check your site now using Google's PageSpeed Insights before you make any changes. Then implement caching as I proposed in this article and measure again. I guarantee your loading times will drop by more than 50%, and if not, I'll buy you a beer!

Additional Resources

Of course, this is not everything there is to caching. If you're interested, feel free to check these interesting links:


Michal Samuhel

I am a solution architect at Kentico. I help with all possible and impossible issues: development, deployment, and others. I'm kind of a wild card here. I like reading new information, and sometimes I even publish a line or two.