Caching in ASP.NET Core: Improving Application Performance
Read on: my website / Read time: 6 minutes
Caching is one of the simplest techniques to significantly improve your application's performance. It's the process of temporarily storing data in a faster access location. You will typically cache the results of expensive operations or frequently accessed data.
Caching allows subsequent requests for the same data to be served from the cache instead of fetching the data from its source.
ASP.NET Core offers several types of caches, such as IMemoryCache, IDistributedCache, and the upcoming HybridCache (.NET 9).
In this newsletter, we will explore how to implement caching in ASP.NET Core applications.
How Caching Improves Application Performance
Caching improves your application's performance by reducing latency and server load while enhancing scalability and user experience.
- Faster data retrieval: Cached data can be accessed much faster than retrieving it from the source (like a database or an API). Caches are typically stored in memory (RAM).
- Fewer database queries: Caching frequently accessed data reduces the number of database queries. This reduces the load on the database server.
- Lower CPU usage: Rendering web pages or processing API responses can consume significant CPU resources. Caching the results reduces the need for repetitive CPU-intensive tasks.
- Handling increased traffic: By reducing the load on backend systems, caching allows your application to handle more concurrent users and requests.
- Distributed caching: Distributed cache solutions like Redis enable scaling the cache across multiple servers, further improving performance and resilience.
In a recent project I worked on, we used Redis to scale to more than 1,000,000 users. We only had one SQL Server instance with a read-replica for reporting. The power of caching, eh?
Caching Abstractions in ASP.NET Core
ASP.NET Core provides two primary abstractions for working with caches:
- IMemoryCache: Stores data in the memory of the web server. It's simple to use but not suitable for distributed scenarios.
- IDistributedCache: Offers a more robust solution for distributed applications. It allows you to store cached data in a distributed cache like Redis.
We have to register these services with DI to use them. AddDistributedMemoryCache will configure an in-memory implementation of IDistributedCache, which isn't actually distributed.
services.AddMemoryCache();
services.AddDistributedMemoryCache();
Here's how you can use IMemoryCache. We first check if the cached value is present and return it directly if it's there. Otherwise, we fetch the value from the database and cache it for subsequent requests.
app.MapGet(
    "products/{id}",
    (int id, IMemoryCache cache, AppDbContext context) =>
    {
        if (!cache.TryGetValue(id, out Product? product))
        {
            product = context.Products.Find(id);

            if (product is null)
            {
                return Results.NotFound();
            }

            var cacheEntryOptions = new MemoryCacheEntryOptions()
                .SetAbsoluteExpiration(TimeSpan.FromMinutes(10))
                .SetSlidingExpiration(TimeSpan.FromMinutes(2));

            cache.Set(id, product, cacheEntryOptions);
        }

        return Results.Ok(product);
    });
Cache expiration is another important topic. We want to remove cache entries that aren't being used or have become stale. You can pass in MemoryCacheEntryOptions to configure cache expiration. For example, we can set the AbsoluteExpiration and SlidingExpiration values to control when the cache entry will expire.
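As an illustration, the two expiration modes can also be combined with a priority and a post-eviction callback to observe when entries leave the cache. This is a minimal sketch; the Console.WriteLine stands in for whatever logging you use:

```csharp
var options = new MemoryCacheEntryOptions()
    // The entry expires 10 minutes after creation, no matter what...
    .SetAbsoluteExpiration(TimeSpan.FromMinutes(10))
    // ...or after 2 minutes without being accessed, whichever comes first.
    .SetSlidingExpiration(TimeSpan.FromMinutes(2))
    // Prefer keeping this entry when memory pressure forces evictions.
    .SetPriority(CacheItemPriority.High)
    .RegisterPostEvictionCallback((key, value, reason, state) =>
    {
        // Hypothetical logging; plug in your own logger here.
        Console.WriteLine($"Cache entry '{key}' was evicted: {reason}");
    });

cache.Set(id, product, options);
```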
Cache-Aside Pattern
The cache-aside pattern is the most common caching strategy. Here's how it works:
- Check the cache: Look for the requested data in the cache.
- Fetch from source (if cache miss): If the data isn't in the cache, fetch it from the source.
- Update the cache: Store the fetched data in the cache for subsequent requests.
Here's how you can implement the cache-aside pattern as an extension method for IDistributedCache :
using System.Text.Json;
using Microsoft.Extensions.Caching.Distributed;

public static class DistributedCacheExtensions
{
    public static DistributedCacheEntryOptions DefaultExpiration => new()
    {
        AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(2)
    };

    public static async Task<T> GetOrCreateAsync<T>(
        this IDistributedCache cache,
        string key,
        Func<Task<T>> factory,
        DistributedCacheEntryOptions? cacheOptions = null)
    {
        var cachedData = await cache.GetStringAsync(key);

        if (cachedData is not null)
        {
            return JsonSerializer.Deserialize<T>(cachedData)!;
        }

        var data = await factory();

        await cache.SetStringAsync(
            key,
            JsonSerializer.Serialize(data),
            cacheOptions ?? DefaultExpiration);

        return data;
    }
}
We're using JsonSerializer to manage serialization to and from a JSON string. The SetStringAsync method also accepts a DistributedCacheEntryOptions argument to control cache expiration.
Here's how we would use this extension method:
app.MapGet(
    "products/{id}",
    async (int id, IDistributedCache cache, AppDbContext context) =>
    {
        var product = await cache.GetOrCreateAsync($"products-{id}", async () =>
        {
            var productFromDb = await context.Products.FindAsync(id);

            return productFromDb;
        });

        return Results.Ok(product);
    });
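One thing to keep in mind with cache-aside is invalidation: when the source data changes, the cached copy becomes stale. A sketch of evicting the entry when a product is updated; the PUT endpoint and the Name/Price properties on Product are assumptions for illustration:

```csharp
app.MapPut(
    "products/{id}",
    async (int id, Product updated, IDistributedCache cache, AppDbContext context) =>
    {
        var product = await context.Products.FindAsync(id);

        if (product is null)
        {
            return Results.NotFound();
        }

        product.Name = updated.Name;
        product.Price = updated.Price;
        await context.SaveChangesAsync();

        // Evict the stale entry so the next read repopulates the cache.
        await cache.RemoveAsync($"products-{id}");

        return Results.NoContent();
    });
```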
Pros and Cons of In-Memory Caching
Pros:
- Extremely fast
- Simple to implement
- No external dependencies
Cons:
- Cache data is lost if the server restarts
- Limited to the memory (RAM) of a single server
- Cache data is not shared across multiple instances of your application
Distributed Caching With Redis
Redis is a popular in-memory data store often used as a high-performance distributed cache. To use Redis in your ASP.NET Core application, you can use the StackExchange.Redis library.
However, there's also the Microsoft.Extensions.Caching.StackExchangeRedis library, which integrates Redis with IDistributedCache.
Install-Package Microsoft.Extensions.Caching.StackExchangeRedis
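If you prefer the .NET CLI over the Package Manager Console, the equivalent command is:

```
dotnet add package Microsoft.Extensions.Caching.StackExchangeRedis
```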
Here's how you can configure it with DI by providing a connection string to Redis:
string connectionString = builder.Configuration.GetConnectionString("Redis");

builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = connectionString;
});
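For completeness, the "Redis" connection string above would live in appsettings.json. A minimal example, where the host and port are placeholders for your Redis instance:

```json
{
  "ConnectionStrings": {
    "Redis": "localhost:6379"
  }
}
```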
An alternative approach is to register an IConnectionMultiplexer as a service and then use it in the ConnectionMultiplexerFactory delegate.
string connectionString = builder.Configuration.GetConnectionString("Redis");

IConnectionMultiplexer connectionMultiplexer =
    ConnectionMultiplexer.Connect(connectionString);

builder.Services.AddSingleton(connectionMultiplexer);

builder.Services.AddStackExchangeRedisCache(options =>
{
    options.ConnectionMultiplexerFactory =
        () => Task.FromResult(connectionMultiplexer);
});
Now, when you inject IDistributedCache, it will use Redis under the hood.
Cache Stampede and HybridCache
The in-memory cache implementations in ASP.NET Core are susceptible to race conditions, which can cause a cache stampede. A cache stampede happens when concurrent requests encounter a cache miss and try to fetch the data from the source. This can overload your application and negate the benefits of caching.
Locking is one solution to the cache stampede problem. .NET offers many options for locking and concurrency control. The most commonly used primitives are the lock statement and the Semaphore (or SemaphoreSlim) class. Since you can't await inside a lock statement, SemaphoreSlim is the right choice for asynchronous code.
Here's how we could use SemaphoreSlim to introduce locking before fetching data:
public static class DistributedCacheExtensions
{
    private static readonly SemaphoreSlim Semaphore = new SemaphoreSlim(1, 1);

    public static async Task<T> GetOrCreateAsync<T>(...)
    {
        // Cache lookup omitted for brevity (same as before)...

        await Semaphore.WaitAsync();

        try
        {
            // After acquiring the semaphore, check the cache again
            // in case another request already populated it.

            var data = await factory();

            await cache.SetStringAsync(
                key,
                JsonSerializer.Serialize(data),
                cacheOptions ?? DefaultExpiration);

            return data;
        }
        finally
        {
            Semaphore.Release();
        }
    }
}
The previous implementation has a lock contention issue since all requests have to wait for the semaphore. A much better solution would be locking based on the key value.
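A sketch of per-key locking, holding one SemaphoreSlim per cache key in a ConcurrentDictionary so requests for different keys never block each other. The cleanup strategy is simplified for illustration; in production you'd also want to remove semaphores for keys that are no longer used:

```csharp
using System.Collections.Concurrent;
using System.Text.Json;
using Microsoft.Extensions.Caching.Distributed;

public static class DistributedCacheExtensions
{
    public static DistributedCacheEntryOptions DefaultExpiration => new()
    {
        AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(2)
    };

    // One semaphore per cache key.
    private static readonly ConcurrentDictionary<string, SemaphoreSlim> Semaphores = new();

    public static async Task<T> GetOrCreateAsync<T>(
        this IDistributedCache cache,
        string key,
        Func<Task<T>> factory,
        DistributedCacheEntryOptions? cacheOptions = null)
    {
        var cachedData = await cache.GetStringAsync(key);

        if (cachedData is not null)
        {
            return JsonSerializer.Deserialize<T>(cachedData)!;
        }

        var semaphore = Semaphores.GetOrAdd(key, _ => new SemaphoreSlim(1, 1));

        await semaphore.WaitAsync();

        try
        {
            // Double-check: another request may have populated
            // the cache while we were waiting on the semaphore.
            cachedData = await cache.GetStringAsync(key);

            if (cachedData is not null)
            {
                return JsonSerializer.Deserialize<T>(cachedData)!;
            }

            var data = await factory();

            await cache.SetStringAsync(
                key,
                JsonSerializer.Serialize(data),
                cacheOptions ?? DefaultExpiration);

            return data;
        }
        finally
        {
            semaphore.Release();
        }
    }
}
```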
.NET 9 introduces a new caching abstraction called HybridCache , which aims to solve the shortcomings of IDistributedCache . Learn more about this in the Hybrid cache documentation.
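At the time of writing, HybridCache ships as the Microsoft.Extensions.Caching.Hybrid preview package, so the API may still change. A rough sketch of what the endpoint above could look like with it:

```csharp
// NuGet: Microsoft.Extensions.Caching.Hybrid (preview at the time of writing)
builder.Services.AddHybridCache();

app.MapGet(
    "products/{id}",
    async (int id, HybridCache cache, AppDbContext context) =>
    {
        // HybridCache has stampede protection built in, so concurrent
        // misses for the same key run the factory only once.
        var product = await cache.GetOrCreateAsync(
            $"products-{id}",
            async token => await context.Products.FindAsync(
                new object[] { id }, token));

        return Results.Ok(product);
    });
```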
Summary
Caching is a powerful technique for improving web application performance. ASP.NET Core's caching abstractions make it easy to implement various caching strategies.
We can choose between IMemoryCache for in-memory caching and IDistributedCache for distributed caching.
Here are a few guidelines to wrap up this week's issue:
- Use IMemoryCache for simple, in-memory caching
- Implement the cache-aside pattern to minimize database hits
- Consider Redis as a high-performance distributed cache implementation
- Use IDistributedCache for sharing cached data across multiple applications
That's all for today.
See you next week.
P.S. Whenever you're ready, there are 2 ways I can help you:
1. Modular Monolith Architecture: This in-depth course will transform the way you build modern systems. You will learn the best practices for applying the Modular Monolith architecture in a real-world scenario. Join 600+ engineers
2. Pragmatic Clean Architecture: This comprehensive course will teach you the system I use to ship production-ready applications using Clean Architecture. Learn how to apply the best practices of modern software architecture. Join 2,750+ engineers
Check out Tech World with Milan for insights into a beautiful world of Software Engineering. Join readers from Microsoft, Google, Meta, Amazon, and more.