Why .NET's memory cache is kinda flawed

Jun 29, 2022
54,216 views

The first 100 of you to use coupon code SUMMER2022 get 20% off my courses at dometrain.com
Become a Patron and get source code access: / nickchapsas
Hello everybody, I'm Nick, and in this video I will take a look at the options we have in .NET for in-memory caching and explain why the default memory cache might be causing problems for your application. We will also take a look at a great alternative that solves that problem.
Give LazyCache a star on GitHub: github.com/alastairtree/LazyC...
Don't forget to comment, like and subscribe :)
Social Media:
Follow me on GitHub: bit.ly/ChapsasGitHub
Follow me on Twitter: bit.ly/ChapsasTwitter
Connect on LinkedIn: bit.ly/ChapsasLinkedIn
Keep coding merch: keepcoding.shop
#csharp #dotnet #caching

Comments
  • Thank you for the info and the reference to LazyCache. If you think about it, the 'standard' MemoryCache way of working behaves the same way as a distributed cache, so if you intend to scale up and move from an in-memory to a distributed cache you might get fewer 'surprises'.

    1 year ago
  • This is really useful! I discovered LazyCache by watching this video and I wish I found it years ago. Thanks Nick!

    @keesdewit1982 1 year ago
  • "Cache rules everything around me". 😄 Cool Easter Egg.

    @TaureanKing83 1 year ago
  • I'm using LazyCache in my project as well. It helps me deal with concurrency issues in asynchronous situations with ease.

    @yoanashih761 1 year ago
  • Great content as usual! I didn't know about this concurrency issue.

    @rodrigoflorex 1 year ago
  • The pitfalls of optimistic concurrency. You can solve this issue with MemoryCache using extensions: you just have to evict the entry if the result produces an exception. Great to see you bringing this up.

    @urbanelemental3308 1 year ago
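
A minimal sketch of that evict-on-failure idea, assuming a hypothetical service that holds an IMemoryCache (`_cache`) and a downstream `_weatherService` (the `WeatherResponse` type and method names are illustrative): if the factory throws, the entry is removed so the next caller retries instead of observing a cached failure.

```csharp
public async Task<WeatherResponse> GetWeatherAsync(string city)
{
    try
    {
        return await _cache.GetOrCreateAsync(city, entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10);
            return _weatherService.GetCurrentWeatherAsync(city);
        });
    }
    catch
    {
        // Evict whatever may have been stored for this key so a failure is never served from the cache.
        _cache.Remove(city);
        throw;
    }
}
```
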
    • Catching exceptions is a costly operation; it's faster to just use LazyCache or your own locking mechanism for the factory method.

      @jfpinero 1 year ago
  • Useful video as always!

    @muhamedkarajic 1 year ago
  • I currently do something like this by using ConcurrentDictionary. I don't use this pattern a lot, and I'm hesitant to add another dependency, but I'll have a look.

    @astralpowers 1 year ago
    • Yeah I am planning to make a video on the Lazy solution when I talk about the ConcurrentDictionary problem explicitly. I think Microsoft used to use it in their own code too but they might have removed it.

      @nickchapsas 1 year ago
    • @@nickchapsas Was about to mention this as well. I am using `ConcurrentDictionary` and then `await MyDict.GetOrAdd("key", (key) => new(Task.Run(MyFunction, cancellationToken))).Value` to make sure a task is only ever kicked off by the first thread. Works very well for me.

      @Arkensor 1 year ago
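
Written out, that pattern looks roughly like this (the class and method names here are illustrative, not from the video). ConcurrentDictionary.GetOrAdd may build a few throw-away Lazy wrappers under contention, but only one is ever stored and handed to every caller, and Lazy runs its factory at most once, so the underlying task is only started by the first caller.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class LazyTaskCache<TValue>
{
    private readonly ConcurrentDictionary<string, Lazy<Task<TValue>>> _entries = new();

    // Every caller for the same key receives the single stored Lazy,
    // so the expensive work runs exactly once per key.
    public Task<TValue> GetOrAddAsync(string key, Func<string, Task<TValue>> factory) =>
        _entries.GetOrAdd(key, k => new Lazy<Task<TValue>>(() => factory(k))).Value;
}
```
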
  • MemoryCache actually uses a ConcurrentDictionary, so you're right to say it's the same thing.

    @BillyBraga 1 year ago
  • Very sad that LazyCache does not implement the existing IMemoryCache interface so that it could simply override the implementation. :(

    @duszekmestre 1 year ago
  • 6:37 WOW! I noticed that Wu Tang Clan reference 👏

    @fbsouza 1 year ago
  • Thank you for sharing it.

    @DungBui-yo3tz 1 year ago
  • Hey Nick, any plans for SignalR or any real-time programming videos in the future?

    @rade6063 1 year ago
  • Thanks a lot for the video 🤓👌

    @wilmararias2083 1 year ago
  • The reason behind it is that locking the factory call is dangerous, as it can cause a deadlock if the factory call somehow calls a method that locks the same semaphore. Callbacks are always dangerous to lock around.

    @dmitrykim3096 7 months ago
  • Awesome vid Nick, but I think I spotted a typo: * "Caches ruins everything around me"

    @flyingmadpakke 1 year ago
  • Thanks!

    @brettedwards8513 1 year ago
  • Wow. I never knew this was an issue with ConcurrentDictionary and MemoryCache (which uses the ConcurrentDictionary). I had so many issues with caching on a project about 2 years ago with MemoryCache. I was going crazy trying to understand the problem and this never occurred to me. I also had issues on a component that used ConcurrentDictionary. This is all good stuff to play with. I no longer work for that company, but I do some consultant work for them. If they ever ask me to return to that project, I now know what I'll be testing to fix those problems. Thanks for the knowledge.

    @pqsk 1 year ago
  • Thank you. Great video, as always. It's weird that LazyCache developed its own interface instead of using Microsoft.Extensions.Caching.Abstractions. That's a deal breaker for me, as I try to keep away from anything that cannot be replaced using DI. I would like to know if an alternative solution exists.

    @makp0 1 year ago
    • Using Lazy as the cache value will also solve the problem

      @nickchapsas 1 year ago
    • @@nickchapsas that's still not deterministic, you can still have different Lazy instances returned if multiple threads call GetOrAdd at the same time.

      @alexbagnolini6225 1 year ago
    • @@alexbagnolini6225 the factory method will be executed only once

      @nickchapsas 1 year ago
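
For reference, the "Lazy as the cache value" idea looks roughly like this (service and type names are illustrative). Note the caveat being debated above: IMemoryCache.GetOrCreate is itself a check-then-add, so two concurrent callers can still each build their own Lazy; the window is just much smaller than when awaiting the factory directly.

```csharp
public Task<WeatherResponse> GetCachedWeatherAsync(string city)
{
    var lazy = _cache.GetOrCreate(city, entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10);
        // Store the Lazy wrapper, not the result, so the expensive call
        // only starts when the first caller awaits it.
        return new Lazy<Task<WeatherResponse>>(() => _weatherService.GetCurrentWeatherAsync(city));
    });

    return lazy!.Value;
}
```
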
  • I'd like to see the DI video that Nick mentioned in the video; does anyone know which video it is?

    @rafaspimenta 1 year ago
  • Hi Nick, can you do a video about Autofac?

    @robertotumini395 1 year ago
  • Hey Nick, any opinion on Akavache?

    @Tof__ 1 year ago
  • I trust AsyncLazy from the MS package more tbh. Create it in the factory instead of the final value

    @mbalaganskiy 1 year ago
  • For async operations that I need to be atomic, I use an "async lock" that I made. It's basically an IDisposable mutex wrapper, which I can use like using(await MyUtilsFacade.Lock("some_non_interned_text_key")) { }. I found this particularly useful because I don't have to create lambdas, I don't get closure problems, and I can move values to and from the locked region without worrying about crazy behavior. It automatically creates/deletes the mutexes by key and frees the key strings passed when they're no longer in use, so I don't get "interned guid strings" bloating the application's RAM over long periods of time. (That's also a problem I faced: the normal "lock" keyword wasn't working well because not only can't I await within lock, the strings passed to it were sometimes different references even when they were equal, so I used string.Intern and that caused my application to bloat over time because interned strings are apparently never freed.)

    @figloalds 1 year ago
    • Does it use a SemaphoreSlim inside?

      @alexbagnolini6225 1 year ago
    • @@alexbagnolini6225 Yes, it does. The mechanism uses an atomic dictionary to track LockEmitters that await the semaphore, and returns an IDisposable that releases the lock when disposed.

      @figloalds 1 year ago
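
A minimal sketch of that kind of keyed async lock built on SemaphoreSlim (simplified: unlike the commenter's version, it never cleans up unused semaphores):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public static class KeyedAsyncLock
{
    private static readonly ConcurrentDictionary<string, SemaphoreSlim> Semaphores = new();

    // Await this in a using statement; the lock is released when the returned value is disposed.
    public static async Task<IDisposable> LockAsync(string key)
    {
        var semaphore = Semaphores.GetOrAdd(key, _ => new SemaphoreSlim(1, 1));
        await semaphore.WaitAsync();
        return new Releaser(semaphore);
    }

    private sealed class Releaser : IDisposable
    {
        private readonly SemaphoreSlim _semaphore;
        public Releaser(SemaphoreSlim semaphore) => _semaphore = semaphore;
        public void Dispose() => _semaphore.Release();
    }
}

// Usage:
// using (await KeyedAsyncLock.LockAsync("weather:London"))
// {
//     // await freely inside the critical section
// }
```
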
  • What camera do you use to record your videos?

    @juanmarquezr 1 year ago
  • What's the equivalent of "GetOrCreateAsync" (IMemoryCache) for IDistributedCache?

    @AdisonCavani 1 year ago
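
IDistributedCache does not ship with an equivalent; a common hand-rolled sketch looks like this (the extension name, JSON serialization, and the ten-minute expiry are arbitrary choices here, and note it has the same check-then-create race discussed in the video):

```csharp
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

public static class DistributedCacheExtensions
{
    public static async Task<T?> GetOrCreateAsync<T>(
        this IDistributedCache cache, string key, Func<Task<T>> factory)
    {
        // Try the cache first.
        var cached = await cache.GetStringAsync(key);
        if (cached is not null)
        {
            return JsonSerializer.Deserialize<T>(cached);
        }

        // Miss: produce the value and store it with an absolute expiry.
        var value = await factory();
        await cache.SetStringAsync(key, JsonSerializer.Serialize(value),
            new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
            });
        return value;
    }
}
```
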
  • When you changed the parameter of GetCurrentWeatherAsync from city to entry.Key.ToString()! you said it was to avoid a closure. Do you have a video going into detail about how closures impact the app and when/how to avoid them?

    @Ree1BigChris 1 year ago
    • I do actually: kzhead.info/sun/m5eGo9J6iaOwlpE/bejne.html

      @nickchapsas 1 year ago
  • For the HTTP scenario, would it be better to use the response cache or the LazyCache package?

    @matheossg 1 year ago
    • You can use both; it depends on what you are trying to do: minimize data fetching or minimize API-layer code (which could be calling other APIs through HTTP).

      @jfpinero 1 year ago
  • Dropping a like for the Wu Tang reference. Also for the useful lesson. Dope.

    @Marcometalizao 1 year ago
  • Hey, Nick. When will you do a CV review?

    @user-zk5ym9ut1j 1 year ago
    • I’m planning the next livestream. Hopefully soon

      @nickchapsas 1 year ago
  • I wish I were better at this; I'm a flawed man. I want to learn caching, but simpler. I wonder if there are videos that can do that. All I want to learn is how to cache some data from a DB for some time for each user, so that when they refresh, the app doesn't go to the DB to get the same data again. Response caching or memory caching, I don't know when to use which correctly.

    @bilbobaggins8953 1 year ago
  • I wonder how Proto Actor would work for this. Granted, it is a much more complicated solution for this "simple" problem.

    @dovh49 1 year ago
  • In my app I need to be able to invalidate items in the cache manually from code, not only based on duration. Is that possible with LazyCache?

    @Martin-kj1od 1 year ago
    • Yes, you can.

      @jfpinero 1 year ago
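
Concretely, LazyCache's IAppCache exposes Remove, so a duration-based entry can still be evicted early from code (the key name here is illustrative):

```csharp
// Evict a specific entry whenever the underlying data changes.
_appCache.Remove("weather:London");
```
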
  • I don't think that verbosity in the DI is really necessary? You can just use AddSingleton(); AddSingleton(); EDIT: No - I had missed that the CachedWeatherService takes an IWeatherService, not a WeatherService.

    @0shii 1 year ago
    • Nope, you have to. If you don't, you will create a circular dependency and your app won't even start.

      @nickchapsas 1 year ago
    • Works in my test .NET Core 3.1 app. I believe the second call to AddSingleton should overwrite the registration of the WeatherService as a provider for IWeatherService, so it only remains in the dictionary as a provider for classes which request a concrete WeatherService.

      @0shii 1 year ago
    • @@0shii There is no overwrite happening. The Add methods add on top of the previous registration and resolve as either an enumerable or the latest registered one. It doesn't work in .NET 6, and there is no way this behavior changed since .NET Core 3.1. Make sure you are injecting IWeatherService into the cached one, not WeatherService.

      @nickchapsas 1 year ago
    • Yep, I see it now, you're right - I had thought you were just injecting WeatherService into the CachedWeatherService, not IWeatherService. Apologies!

      @0shii 1 year ago
    • @@ShiyalaKohny There is. Testability

      @nickchapsas 1 year ago
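
For readers following along, the registration being discussed looks roughly like this. The type names follow the thread above, the CachedWeatherService constructor shape is assumed, and the factory-lambda form is just one way to express the decorator (the video may register it differently): the concrete WeatherService is registered under its own type, and resolving IWeatherService never resolves IWeatherService again, so there is no circular dependency.

```csharp
using Microsoft.Extensions.Caching.Memory;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddMemoryCache();
// Register the concrete service under its own type...
builder.Services.AddSingleton<WeatherService>();
// ...and expose the caching decorator as the interface.
builder.Services.AddSingleton<IWeatherService>(sp =>
    new CachedWeatherService(
        sp.GetRequiredService<WeatherService>(),   // concrete inner service
        sp.GetRequiredService<IMemoryCache>()));
```
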
  • Every time I see caching code implementing these cross-cutting patterns I long for PostSharp.Caching! Damn licence fees.

    @nedgrady9301 1 year ago
  • Also, MemoryCache doesn't have a pre-eviction callback (a callback before the cache entry expires) so that the cache is automatically renewed, let's say every hour. The "old" .NET Framework MemoryCache had this feature!

    @impeRAtoR28161621 1 year ago
    • You should use RegisterPostEvictionCallback; you can re-add the entry like that if needed and refresh the cache entry. Yes, you have a millisecond or so with an empty cache entry, but it doesn't really matter for most use cases. Don't forget to add a CancellationTokenSource so it causes the eviction to happen; otherwise it will only trigger when a user hits the entry.

      @Masterrunescapeer 1 year ago
    • @@Masterrunescapeer That is a post-eviction callback. So imagine it needs 30 seconds to populate the cache; users will have to wait. With the "old" MemoryCache everything was done "behind the scenes" since there was a "pre" eviction callback. I don't know the exact name, but it is part of some cache policy class.

      @impeRAtoR28161621 1 year ago
    • @@impeRAtoR28161621 Yes, I got around it since post-eviction tells you what the value was: set the value back to it, then get the new value and overwrite. A hacky solution, but it works fine. For the majority of cases it doesn't really matter; the super long-running stuff you can give much longer absolute timers, or it shouldn't expire at all.

      @Masterrunescapeer 1 year ago
    • @@Masterrunescapeer I didn't know you get the old value there, which you can set again. I used another cache key which expires a little bit earlier than the real one, and inside its post-eviction callback I populate the real long-running cache.

      @impeRAtoR28161621 1 year ago
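
A sketch of the workaround described in this thread: an expiration token forces the eviction to fire on time, and the post-eviction callback re-adds the old value so readers rarely hit an empty entry. This is the simplest "hacky" version; real code would also kick off a background recomputation and overwrite the re-added value.

```csharp
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Primitives;

static void SetWithAutoRefresh(IMemoryCache cache, string key, object value, TimeSpan lifetime)
{
    // Without an active change token, expiration is only noticed when the entry is next touched.
    var cts = new CancellationTokenSource(lifetime);

    var options = new MemoryCacheEntryOptions()
        .AddExpirationToken(new CancellationChangeToken(cts.Token))
        .RegisterPostEvictionCallback((k, oldValue, reason, _) =>
        {
            cts.Dispose();
            if (reason == EvictionReason.TokenExpired && oldValue is not null)
            {
                // Put the old value straight back so readers never see a cold cache,
                // then recompute and overwrite it in the background.
                SetWithAutoRefresh(cache, (string)k, oldValue, lifetime);
            }
        });

    cache.Set(key, value, options);
}
```
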
  • I didn't really understand what behaviour we expected to see with the built-in memory cache example, or how it was different from what actually happened. It feels like you went over that part a little too fast.

    @tinypanther27 1 year ago
    • The factory method can be executed multiple times by multiple threads

      @nickchapsas 1 year ago
    • The expected behaviour is that the first thread sets the cached value and then all subsequent threads print the same value from the cache. What actually happened is that the threads are running in parallel, meaning the for loop is not sequential, so each thread tries to write the value to the cache, but the value each thread has is kinda random. For example, thread 1 increments the value to 1, but immediately thread 2 increments the value to 2, so thread 1 prints the value 2, while meanwhile thread 3 has already read the value as 1, so it prints 1... So the thing that is printed is kinda random. That is how I understand it. Maybe I'm wrong.

      @unskeptable 1 year ago
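
That reading matches the video's demo, which boils down to something like the following reconstruction (not the exact code from the video): several tasks race GetOrCreateAsync for the same key, the factory runs more than once, and different tasks can print different numbers.

```csharp
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions());
var factoryRuns = 0;

var tasks = Enumerable.Range(0, 10).Select(_ => Task.Run(async () =>
{
    var number = await cache.GetOrCreateAsync("number", async entry =>
    {
        await Task.Delay(50);                          // simulate a slow data source
        return Interlocked.Increment(ref factoryRuns); // counts how often the factory ran
    });
    Console.WriteLine(number);
}));

await Task.WhenAll(tasks);
Console.WriteLine($"Factory executions: {factoryRuns}"); // usually greater than 1 with plain IMemoryCache
```
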
  • I don't exactly get when this becomes a problem in the context of a cache. Can anyone provide an example? Obviously having an extra package is an unwanted thing, so I need to understand when exactly I should care about this.

    @youseff1015 1 year ago
    • It becomes a problem when you've coded your system in a way that assumes that in multithreaded scenarios, the factory method is executed only once per evaluation.

      @nickchapsas 1 year ago
    • The problem is known as Cache Stampede, see here en.wikipedia.org/wiki/Cache_stampede

      @jodydonetti 1 year ago
    • I can give an example. Let's say that you cache a very expensive call to some API, maybe AWS gigafreez S3. Now each call costs you $1, and you cache it for 10 minutes. You would expect the API cost to be $1 per 10 minutes, but if you have 9001 concurrent users and each request races the cache for the value, the 'next' request to your API will ask the cache for the value before the first one has completed the request to the source, so you may call the underlying resource multiple times, maybe hundreds of times, before the cache is set. In this case you waste lots of money even with the cache.

      @IceQub3 1 year ago
    • @@IceQub3 Ohhh, that makes sense, nice example. Thank you.

      @youseff1015 1 year ago
    • @@nickchapsas If the CachedWeatherService was a singleton, would this still be an issue?

      @nenzax2701 1 year ago
  • "... get the money! Dollar dollar bill yaaaalll!" 😅

    @keenanduplessis3023 1 year ago
  • Hi Nick

    @Anequit 1 year ago
  • Is it just that LazyCache locks the factory method internally?

    @cavalfou 1 year ago
    • It uses Lazy, which solves this problem.

      @nickchapsas 1 year ago
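
In usage terms LazyCache looks much like the IMemoryCache code, but because the factory is wrapped in a lazily-initialized value internally, concurrent callers for the same key share a single in-flight task. A sketch, with illustrative service and type names:

```csharp
using System.Threading.Tasks;
using LazyCache;

public class LazyCachedWeatherService
{
    private readonly IWeatherService _weatherService;
    private readonly IAppCache _appCache;

    public LazyCachedWeatherService(IWeatherService weatherService, IAppCache appCache)
    {
        _weatherService = weatherService;
        _appCache = appCache;
    }

    public Task<WeatherResponse> GetCurrentWeatherAsync(string city) =>
        // The factory delegate is only ever started once per key, even under concurrency.
        _appCache.GetOrAddAsync(city, () => _weatherService.GetCurrentWeatherAsync(city));
}
```
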
  • Did you accidentally leak your API key at the beginning of the video?

    @Firebreak_2 1 year ago
    • Nah these keys are invalidated after the video so it doesn't matter

      @nickchapsas 1 year ago
    • You probably should just use the secrets API by default. Hard-coding tokens in strings is a bad habit, and it's also a hard habit to break. Also, it's good to always show off the best habits for people learning from you (especially since sometimes people will watch old videos/read old articles while working on new features).

      @Zshazz 1 year ago
  • Which IDE is this?

    @GauravKumar-sr6bt 1 year ago
    • JetBrains Rider

      @nickchapsas 1 year ago
  • Whaaaaat?!!! The concurrent dictionary also suffers from this?? Oh god..

    @fdhsdrdark 1 year ago
  • Even though MemoryCache uses a ConcurrentDictionary internally, the method GetOrCreateAsync behaves like it's not thread safe because it actually isn't: sadly, it's an extension method implemented as two operations, TryGetValue and then CreateEntry, instead of one atomic one. And this is pretty annoying, because every other method is thread safe, while this, the most important one, isn't. The ConcurrentDictionary, however, has a method GetOrAdd, which is actually thread safe. Pretty sure ConcurrentDictionary would work just fine with its GetOrAdd method (no async, unfortunately), returning the same results as LazyCache. The only thing is that the delegate "_ => Interlocked.Increment(..)" can indeed run a few times in a concurrent environment before getting into the cache, which may in theory lead to all results being 2 or 3 instead of 1. But it would still be consistent.

    @michaelkarpenko3978 1 year ago
  • Lol Wutang

    @LittleRainGames 8 months ago
  • "Weather doesn't change really fast unless you are in London.." 😂😂 You should go back to Greece, awesome culture, food and weather😂✌️

    @Dustyy01 1 year ago
  • Dollar dollar bill y'all

    @ovidiufelixb 1 year ago
  • What's the history behind always using "69" as a value in all your videos?

    @DerekWelton 1 year ago
    • What 69?

      @nickchapsas 1 year ago
    • @@nickchapsas nice

      @hannesvanniekerk2256 1 year ago
  • It's all fun and games until your company decides to build its own caching solution...

    @Esgarpen 14 days ago
  • Not sure why you are providing a solution to a poorly implemented method.

    @T___Brown 1 year ago