The KL Divergence : Data Science Basics
May 25, 2024
38,907 views
understanding how to measure the difference between two distributions
Proof that KL Divergence is non-negative : • Jensen's Inequality : ...
My Patreon : www.patreon.com/user?u=49277905
0:00 How to Learn Math
1:57 Motivation for P(x) / Q(x)
7:21 Motivation for Log
11:43 Motivation for Leading P(x)
15:59 Application to Data Science
Wow... 😳 I've never seen a more brilliant, easy, and intuitive explanation of KL-div 😳👏👏👏👏👏 Big thanks, good man! ❤️
Glad you liked it!
I agree,
Your bottom-up (instead of top-down) approach that you mentioned at the beginning of the video would be really great to see for all kinds of different concepts!
Great idea!
I don't think I'm ever going to forget this. Thanks so much.
I am a research scientist. You provide a clear and concise treatment of KL-Divergence. The best I have seen to date. Thanks.
That was the best description of why we use log that I have ever seen. Good work, man.
I'm in the middle of a $2,500 course, BUT → KZhead → your video... 👏🏻👏🏻👏🏻👏🏻👏🏻 Thank you for starting with the "why", and appealing to my brain's desire to understand, not just do.
This is mind-blowing... I love the way you go from the problem to the solution. It's a clever way to understand this KL divergence.
thanks!
That was great. I have struggled to understand certain aspects of KL Divergence, and this is a great way to think about it without getting bogged down in symbology.
Glad it was helpful!
Amazing. The pace you have explained, the approach...everything is just top-notch.
Blew my mind. I wanted to understand what KL divergence is in order to follow the recent Gen AI papers and couldn't. This video helped me a lot.
This video is absolutely mind-blowing! The way it breaks down such a complex concept into an intuitive understanding is truly remarkable. Thank you!
Thank you for this, the best explanation of KL divergence that I have seen. Love how you approach it building gradually, really inspiring for how to learn math.
Wow. This is the best explanation of KL-divergence I've ever heard. There's so much over-complicated stuff out there, but yours is absolutely genius.
Glad it was helpful!
Let's celebrate a new video on this amazing channel!!! Love your work!
🎉
Wahh... I am studying for a computer science master's degree. Your video really helps me a lot! Please keep on doing such great work for us!
The best explanation I've ever seen about KL divergence ❤
I think your channel and teaching style are brilliant. I wish I had known about this channel when I was doing my undergrad.
Great explanation, this is the first time I'm learning about KL divergence and it was very easy to grasp because of the way you taught it
Best math teacher ever. You so clearly explained the design and thinking process behind how the algorithm comes out. Many videos just explain the formula, which left me confused about why we should do it this way... Thank you!
You've really made my day with ur explanation. Thank you so much :D
In the 'Motivation for log,' you said that taking a simple average is not the right way to go, and then you try to find a function that makes f(4) and f(1/4) have opposite signs. That means you are trying to make two very different distributions have the smallest distance possible (canceling each other out), which is contradictory to what we expected. We expected them to be large.
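The concern above can be checked numerically. In a minimal Python sketch (the two toy distributions below are made up for illustration), a plain average of log(p/q) does let positive and negative terms cancel, while the KL form weights each log-ratio by p(x), which is exactly what keeps the sum non-negative (Gibbs' inequality):

```python
import math

# Two very different toy distributions over three outcomes (made-up values).
p = [0.8, 0.1, 0.1]
q = [0.2, 0.4, 0.4]

# Plain average of log(p/q): the log(4) term and the two log(1/4) terms
# partially cancel, so the result is actually negative here even though
# p and q are very different.
plain_avg = sum(math.log(pi / qi) for pi, qi in zip(p, q)) / len(p)

# KL divergence: each log-ratio is weighted by p(x). Gibbs' inequality
# guarantees this is >= 0, with equality only when p == q, so the
# cancellation the comment worries about cannot drive it to zero.
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

print(plain_avg)  # negative: cancellation happens without the p(x) weight
print(kl)         # positive: the p(x) weighting prevents it
```

So the opposite signs of f(4) and f(1/4) are not contradictory on their own; it is the leading p(x) weighting (the next section of the video) that stops different distributions from cancelling out to a small distance.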
That was great! Not just dumping the formula on you but walking you through its logic with simple steps. Loved it! ❤
Great work! I've been a fan of your material for some time, and in this video you have truly mastered your craft.
Wow, thank you!
The thing you said in the first minute is something I've been saying for a while now. As students, we aren't told what problem drove scientists or engineers to construct new formulas or ways of thinking.
I love how you approach the KL divergence!
mad respect for Ritvik from Ritwik for acing the subtle art of intuitive explanation:)) If only professors could master the same art.
Outstanding. Really helping me through this info retrieval course!
I found out this professor is very good at explaining every tough concept! respect and many appreciations!
Superb... I believe this is the best explanation I have ever come across for KL Divergence. Thanks a tonne.
Even without knowing some of the fundamentals, after listening to your explanation everything made a lot of sense, and I felt I understood the concept well. I am willing to watch your videos more often.
Every time I have a math question, your channel is my first choice! Amazing ✅ Thanks a million 🎉
Thank you as always for sharing your brilliant teachings, Ritvik. Could you please do a video on the Gram-Schmidt process and how orthonormal basis matrices are relevant to data science?
Fantastically clearly explained, congrats.
Your videos are great, just keep going. I've watched you for a few years already.
Excellent way to explain it. Makes maths sound logical and approachable 🎉
I recently got interested in learning machine learning and stumbled upon Stable Diffusion, the current state-of-the-art open-source image generation AI. That's where I encountered the KL divergence. The more I try to understand it, the more complicated concepts and formulas are thrown at me. I managed to find some videos that explain how to derive it, but none of them explained why the hell the logarithm is present in it, for god's sake! And here you are, explaining every missing detail from other videos and blog posts in a way that a person who knows very little about the subject can understand, in a very satisfying and easy-to-follow way. Hats off to you, sir. I wish every teacher were like you.
Thanks, and godspeed on your journey through machine learning!
One of the BEST tutorials for sure
Another amazing video! Please keep them coming!
Amazing video, love the format!
This is the perfect math video. Love it. Shared with all my readers.
That was one of the best explanations I have ever heard! Great job and many thanks!
Thanks!!
The comments didn't lie you actually explained this so well. I watched the ads all the way through btw.
Excellent intuitive explanation!
It was the easiest explanation I’ve ever seen.
Amazing teaching. It helps a lot with my covariate shift detection project. Thanks.
Glad it was helpful!
Thanks, exactly the explanation I have been looking for!
Thank you for the great explanation! I totally agree that math is not given from above, but invented by people. And showing how the invention can be done is the best way to teach the new concepts. Thanks a lot!
Thank you so much for this explanation and also got a new insight about the log :)
Happy to help!
Bro is a legend
This was awesome. Really helpful to think through it backwards and “redevelop” our own function
dude, the explanation is so good, you rock!
Glad it helped!
amazing explanation. not many can do this. well done.
I have never seen complex math explained this well. Thank you very much!
very nice explanation. Thanks for the work.
Taking the MITx Stats class, but I find that you explain the concepts so much better!
Glad to hear!
Awesome explanation. Thanks for this video.
Glad it was helpful!
This was incredibly illustrative!
Thank you! This is the best explanation of KL divergence which I've seen.
Glad it was helpful!
Thank you. As usual, great and very intuitive explanation.
No problem !
This is an amazing explanation, thanks!
Glad it was helpful!
That was amazing. Thank you so much!
Thank you SO much! God bless you Sir, keep up the great work 😊
You are very welcome
Great stuff... Learning a way to teach maths to my kid... A constructivist method... While learning about stats... I really appreciate your work.
Glad it was helpful!
Amazing explanation!
Great video. In my opinion, two pieces are missing for it to be perfect. First, if you could just calculate the sum of log(p(x)/q(x)) and show us what's wrong with that number, exactly as you did with the simple p(x)/q(x), showing why it isn't a good solution. Second, in the last slide, if you could give numbers. You talk about quantifying something that is visually clear, but leaving out the numbers is kind of a missed opportunity to explain how it works :) Again, thank you a lot for the explanation. Great work.
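Following up on the "give numbers" request above, here is a short Python sketch that puts concrete values on the last-slide idea (the distributions below are made up for illustration, not the ones from the video): KL comes out small for a pair of similar distributions and much larger for a pair of very different ones.

```python
import math

def kl(p, q):
    """Discrete KL divergence D(p || q) in nats (simple helper for illustration)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

p       = [0.5, 0.3, 0.2]
q_close = [0.45, 0.35, 0.2]   # similar to p
q_far   = [0.1, 0.1, 0.8]     # very different from p

print(kl(p, q_close))  # small: the distributions nearly overlap
print(kl(p, q_far))    # much larger: lots of divergence
```

Running this gives roughly 0.006 for the close pair and about 0.86 for the far pair, which is the kind of quantification the comment is asking for.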
thank you for the clear explanation.
thank you for the best expanation on this topic
Thank you very much for your valuable videos!!
Glad you like them!
Thank you so much for the explanation. it was really helpful👍👍
Thanks!
It would be interesting to have a video on how you study to understand a topic, what resources you use and the materials you look for
Math should all be taught this way, and to go one step further, we should teach people how to make sense of math themselves in the long run. Thanks for the explanation of KL divergence though ;)
Awesome! Very intuitive
Very good explanation
You just got a subscriber. Thank You! 😊
You are great in explaining this! Thanks!
Glad it was helpful!
Wonderful man. Thank you so much.
Thanks a lot for sharing the underlying motivation behind the K-L divergence! I really needed such deep insights! JAJAKALLAH...
You're so welcome!
Best video I've seen in a while!
Thanks!
Thanks for the lecture, your work is always so intuitive.
You are very welcome
God level explanation thank you!!!
Great explanation
very very very very well explained, Thanks
thank you. this really helped !!
You are way better than my school's professor. Thank you.
Wow. Just wow! This is brilliant🤩
Thanks!
Thank you very much! Besides the "nominal" categories in your example, I am wondering if this can also be used with "ordinal" categories. For example, if I make a questionnaire from "dislike" to "like very much" and collect poll responses from 2 groups, can I use the KL divergence to calculate the difference between these 2 groups? And is there an even better way to describe this difference, for example, that group 2 shows "significantly" higher interest than group 1?
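On the ordinal question above: KL can be computed on the two poll distributions, but it compares probabilities category by category and ignores the ordering of the scale, so it cannot express "group 2 leans higher". A distance that uses the ordering, such as the 1D earth mover's (Wasserstein) distance between the cumulative distributions, captures that. A minimal sketch with made-up 5-point Likert data:

```python
import math

# Hypothetical poll results on a 5-point scale ("dislike" ... "like very much"),
# already normalized to probabilities.
group1 = [0.30, 0.25, 0.20, 0.15, 0.10]
group2 = [0.10, 0.15, 0.20, 0.25, 0.30]

# KL divergence treats the five answers as unordered labels: shuffling the
# categories (the same way in both groups) would not change this number.
kl = sum(p * math.log(p / q) for p, q in zip(group1, group2))

# 1D earth mover's distance: sum of absolute CDF differences. This grows
# when probability mass sits further apart along the ordered scale, so it
# reflects "group 2 leans toward the 'like' end".
acc1 = acc2 = emd = 0.0
for p, q in zip(group1, group2):
    acc1 += p
    acc2 += q
    emd += abs(acc1 - acc2)

print(kl)   # positive: the distributions differ
print(emd)  # order-aware measure of how far apart they sit on the scale
```

For the "significantly higher" part, that is a hypothesis-testing question rather than a divergence question; an ordinal test such as Mann-Whitney U on the raw responses would be the usual tool there.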
Thank you :) for valuable content
Great explanation 👏
Glad you think so!
great explanation!
Glad you think so!
Super clear !
Really well explained, thank you.
Amazing explanation
Glad you liked it
I am a master's student in data science and machine learning, and I have to tell you that this is the best explanation one can get for concepts like this... Hope you make more videos on these types of concepts.
Wow, thanks!
Great job
I think this would be a good prerequisite for Variational Inference and SVI, if you're planning to do that vid.
Aaaaaaaaaaaaaa!!!!! This video is absolutely mind-blowing! Thank you for your great contribution.
Universities should fire their math professors and get you to teach their classes. Well done!
That was really awesome
How can this guy only have 8,000 views on such a good video... Very nice way of explaining!
Wow, thank you!
Great video!
Glad you enjoyed it!