This blog is presented as a part of C# Advent 2024. Follow the link to check out the rest of the excellent C# and .NET content, with two blogs coming out per day between December 1 and 25.
Caching plays a crucial role in performance in web applications by reducing strain on backend systems. The new HybridCache library, available in preview in .NET 9, simplifies caching and addresses some limitations of previous approaches, making it easier for developers to implement efficient, reliable caching solutions.
Why a New Cache Library?
Existing caching options, like IMemoryCache and IDistributedCache, serve different purposes but come with challenges. IMemoryCache is fast but limited to in-process memory, which can lead to issues like cache inconsistency or data loss on application restarts. IDistributedCache solves these issues with external storage solutions like Redis, but it adds latency, serialization complexity, and dependency on external systems.
HybridCache combines the strengths of both approaches, providing:
- Seamless in-process and distributed caching.
- Automatic serialization and deserialization.
- Stampede protection to handle multiple simultaneous requests for the same data.
Getting Started with HybridCache
Here are the steps to integrate HybridCache into your .NET 9 ASP.NET Core projects:
1. Add the Package
Include the Microsoft.Extensions.Caching.Hybrid NuGet package. NOTE: Once released, you will be able to add the package without the --prerelease flag.
dotnet add package Microsoft.Extensions.Caching.Hybrid --prerelease
2. Configure the Cache
Register HybridCache in Program.cs with the AddHybridCache method, where you can also set options like default expiration.
builder.Services.AddHybridCache();
3. Use the Cache
Replace existing caching logic with a HybridCache call:
var data = await hybridCache.GetOrCreateAsync(key, async cancellationToken =>
{
    return await FetchExpensiveDataAsync(cancellationToken);
});
Instead of managing separate Get and Set calls, you use the GetOrCreateAsync method, which checks for cached data and executes a provided callback to populate the cache if needed.
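In practice you would inject HybridCache into a service via constructor injection. A minimal sketch of that pattern; ProductService, GetProductFromDbAsync, and the Product record are hypothetical names for illustration:

```csharp
using Microsoft.Extensions.Caching.Hybrid;

public record Product(int Id, string Name);

public class ProductService
{
    private readonly HybridCache _cache;

    public ProductService(HybridCache cache) => _cache = cache;

    public async Task<Product?> GetProductAsync(int id, CancellationToken token = default)
    {
        // The factory delegate only runs on a cache miss; the result is
        // stored in the in-process tier and, if configured, the distributed tier.
        return await _cache.GetOrCreateAsync(
            $"product:{id}",
            async ct => await GetProductFromDbAsync(id, ct),
            cancellationToken: token);
    }

    // Stand-in for a real database query.
    private Task<Product?> GetProductFromDbAsync(int id, CancellationToken ct)
        => Task.FromResult<Product?>(new Product(id, "Sample"));
}
```

Using the cache key as a prefix plus the entity id (as in `product:{id}`) keeps entries distinct per item while remaining easy to reason about.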
Features of HybridCache
Unified API
HybridCache simplifies caching by providing a single API that works seamlessly for both in-process and out-of-process scenarios. It’s designed to replace IDistributedCache and IMemoryCache without requiring major changes to your code, making it easy to integrate or upgrade your caching strategy.
When an IDistributedCache implementation is configured, HybridCache leverages it as a secondary cache layer, giving you the best of both worlds: the speed of in-memory caching combined with the reliability and persistence of distributed caching.
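For example, registering a Redis-backed IDistributedCache alongside HybridCache enables this two-tier behavior. A hedged sketch, assuming the Microsoft.Extensions.Caching.StackExchangeRedis package and a Redis instance reachable at the configured address:

```csharp
// Program.cs
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379"; // assumed connection string
});

// HybridCache detects the registered IDistributedCache and uses it
// as the secondary (out-of-process) tier automatically.
builder.Services.AddHybridCache();
```

No further wiring is required: reads check the in-memory tier first, fall back to Redis, and only invoke your factory method when both tiers miss.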
Stampede Protection
One of the challenges with traditional caching is the risk of a “cache stampede,” where multiple requests hit the system simultaneously when a cached item expires or is missing. This can overwhelm your backend, as each request triggers the same expensive operation to regenerate the data.
HybridCache addresses this by coordinating requests. Only the first caller executes the factory method to repopulate the cache, while others simply wait for the result. This approach reduces unnecessary load and keeps your system running smoothly, even under heavy traffic.
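The effect can be illustrated with concurrent requests for the same missing key. A minimal sketch; the counter and delay are illustrative, not part of the API:

```csharp
int factoryRuns = 0;

// Ten concurrent requests for the same missing key...
var tasks = Enumerable.Range(0, 10).Select(_ =>
    hybridCache.GetOrCreateAsync("expensive-report", async ct =>
    {
        Interlocked.Increment(ref factoryRuns);
        await Task.Delay(500, ct); // simulate an expensive operation
        return "report-data";
    }).AsTask());

await Task.WhenAll(tasks);

// ...but only the first caller runs the factory; the others
// await the in-flight result instead of regenerating it.
Console.WriteLine(factoryRuns); // expected: 1
```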
Configurable Serialization
When you register the HybridCache service during startup, you can customize serialization, including type-specific or generalized serializers, such as protobuf or XML. By default, the library handles string and byte[] internally, and uses System.Text.Json for everything else.
builder.Services.AddHybridCache(options =>
{
    options.DefaultEntryOptions = new HybridCacheEntryOptions
    {
        Expiration = TimeSpan.FromSeconds(10),
        LocalCacheExpiration = TimeSpan.FromSeconds(5)
    };
}).AddSerializer<SomeProtobufMessage, GoogleProtobufSerializer<SomeProtobufMessage>>();
Looking Ahead
HybridCache is in preview as part of the .NET 9 release, with general availability expected in a future minor release.
Contact Trailhead to get started implementing HybridCache or to improve the caching in your .NET applications. Our experience in implementing efficient, scalable caching solutions ensures your applications will perform optimally, saving you time and effort while avoiding the most common pitfalls.