Using Redis Cache
Redis is a popular choice for distributed, in-memory caching.
To prepare your Redis server for use with PostSharp caching:
Configure the eviction policy to volatile-lru or volatile-random. See https://redis.io/topics/lru-cache#eviction-policies for details. Eviction policies other than volatile-lru and volatile-random are not supported.
Configure key-space notifications to include the AKE events. See https://redis.io/topics/notifications#configuration for details.
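The two settings above correspond to the following redis.conf directives (equivalently, they can be applied at runtime with CONFIG SET). This is a sketch; adapt it to your server's existing configuration:

```
# Evict only keys that have an expiration set, as required by PostSharp caching.
# volatile-random is the other supported choice.
maxmemory-policy volatile-lru

# Emit key-space (K) and key-event (E) notifications for all event classes (A).
notify-keyspace-events AKE
```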
To set up PostSharp to use Redis for caching:
Add a reference to the PostSharp.Patterns.Caching.Redis package.
Create an instance of StackExchange.Redis.ConnectionMultiplexer.
Create an instance of the RedisCachingBackend class using the RedisCachingBackend.Create(IConnectionMultiplexer, RedisCachingBackendConfiguration) factory method and assign the instance to CachingServices.DefaultBackend.
The caching backend has to be set before any cached method is called for the first time.
string connectionConfiguration = "localhost";
ConnectionMultiplexer connection = ConnectionMultiplexer.Connect( connectionConfiguration );
RedisCachingBackendConfiguration redisCachingConfiguration = new RedisCachingBackendConfiguration();
CachingServices.DefaultBackend = RedisCachingBackend.Create( connection, redisCachingConfiguration );
For higher performance, you can add an additional, in-process layer of caching between your application and the remote Redis server. To enable the local cache, set the RedisCachingBackendConfiguration.IsLocallyCached property to true.
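For instance, enabling the local cache only requires setting one additional property on the configuration object before creating the backend. The following sketch assumes the same connection setup as in the earlier example:

```csharp
ConnectionMultiplexer connection = ConnectionMultiplexer.Connect( "localhost" );

RedisCachingBackendConfiguration redisCachingConfiguration = new RedisCachingBackendConfiguration
{
    // Adds an in-process cache layer in front of the remote Redis server.
    IsLocallyCached = true
};

CachingServices.DefaultBackend = RedisCachingBackend.Create( connection, redisCachingConfiguration );
```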
The benefit of local caching is lower latency between the application and the Redis server, and lower CPU load because objects are deserialized less often. The drawback is that local caches are synchronized asynchronously, so different application instances may see different values of a cache item for a few milliseconds. However, the application instance that initiated the change always has a consistent view of the cache.
Support for dependencies is disabled by default with the Redis caching backend because it has an important performance and deployment impact. From a performance point of view, the cache dependencies need to be stored in Redis (therefore consuming memory) and handled in a transactional way (therefore consuming processing power). As for deployment, the problem is that the cache GC process, which cleans up dependencies when cache items are expired from the cache, needs to run continuously, even when the application is not running.
If you choose to enable dependencies with Redis, you need to make sure that at least one instance of the cache GC process is running. It is legal to run several instances of this process, but since all instances compete to process the same messages, it is better to run only a small number of instances (ideally one).
To use dependencies with the Redis caching backend:
Make sure that at least one instance of the RedisCacheDependencyGarbageCollector class is alive at any moment (whether or not the application is running). If several instances of your application use the same Redis server, a single instance of the RedisCacheDependencyGarbageCollector class is sufficient. You may package the RedisCacheDependencyGarbageCollector into a separate application or cloud service.
In case of an outage of the service running the GC process, execute the PerformFullCollectionAsync(RedisCachingBackend, CancellationToken) method.
Set the RedisCachingBackendConfiguration.SupportsDependencies property to true.
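Putting the steps above together, a minimal sketch might look as follows. The exact factory signature of RedisCacheDependencyGarbageCollector is an assumption here; consult the API reference for the overload that matches your version:

```csharp
// In the application: enable dependency support before creating the backend.
RedisCachingBackendConfiguration redisCachingConfiguration = new RedisCachingBackendConfiguration
{
    SupportsDependencies = true
};
CachingServices.DefaultBackend = RedisCachingBackend.Create( connection, redisCachingConfiguration );

// In a separate, always-running service: keep one GC instance alive.
// (Factory call shown for illustration; the actual signature may differ.)
RedisCacheDependencyGarbageCollector collector =
    RedisCacheDependencyGarbageCollector.Create( connection, redisCachingConfiguration );

// After an outage of the GC service, run a full collection once to
// clean up dependencies of items that expired while the GC was down.
await collector.PerformFullCollectionAsync();
```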