Regularization Part 1: Ridge (L2) Regression

May 4, 2024
1,027,618 views

Ridge Regression is a neat little way to ensure you don't overfit your training data - essentially, you are desensitizing your model to the training data. It can also help you solve unsolvable equations, and if that isn't bad to the bone, I don't know what is.
This StatQuest follows up on the StatQuests on:
Bias and Variance
• Machine Learning Funda...
Linear Models Part 1: Linear Regression
• Linear Regression, Cle...
Linear Models Part 1.5: Multiple Regression
• Multiple Regression, C...
Linear Models Part 2: t-Tests and ANOVA
• Using Linear Models fo...
Linear Models Part 3: Design Matrices
• StatQuest: Linear Mode...
Cross Validation:
• Machine Learning Funda...
For a complete index of all the StatQuest videos, check out:
statquest.org/video-index/
If you'd like to support StatQuest, please consider...
Buying The StatQuest Illustrated Guide to Machine Learning!!!
PDF - statquest.gumroad.com/l/wvtmc
Paperback - www.amazon.com/dp/B09ZCKR4H6
Kindle eBook - www.amazon.com/dp/B09ZG79HXC
Patreon: / statquest
...or...
KZhead Membership: / @statquest
...a cool StatQuest t-shirt or sweatshirt:
shop.spreadshirt.com/statques...
...buying one or two of my songs (or go large and get a whole album!)
joshuastarmer.bandcamp.com/
...or just donating to StatQuest!
www.paypal.me/statquest
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
/ joshuastarmer
0:00 Awesome song and introduction
1:25 Ridge Regression main ideas
4:15 Ridge Regression details
10:21 Ridge Regression for discrete variables
13:24 Ridge Regression for Logistic Regression
14:12 Ridge Regression for fancy models
15:34 Ridge Regression when you don't have much data
19:15 Summary of concepts
Correction:
13:39 I meant to say "Negative Log-Likelihood" instead of "Likelihood".
#statquest #regularization

Comments
  • Correction: 13:39 I meant to put "Negative Log-Likelihood" instead of "Likelihood". A lot of people ask about 15:34 and how we are supposed to do cross validation with only one data point. At that point I was just trying to keep the example simple, and if, in practice, you don't have enough data for cross validation, then you can't fit a line with ridge regression. However, much more common is that you might have 500 variables but only 400 observations - in this case you have enough data for cross validation and can fit a line with Ridge Regression, but since there are more variables than observations, you can't do ordinary least squares. ALSO, a lot of people ask why lambda can't be negative. Remember, the goal of lambda is not to give us the optimal fit, but to prevent overfitting. If a positive value for lambda does not improve the situation, then the optimal value for lambda (discovered via cross validation) will be 0, and the line will fit no worse than the Ordinary Least Squares line. Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/

    @statquest · 4 years ago
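The "500 variables but only 400 observations" point in the pinned comment can be sketched numerically. A minimal numpy sketch (my own illustration, not from the video): with more variables than observations, X^T X is singular, so ordinary least squares has no unique solution, but the ridge estimate (X^T X + lambda*I)^(-1) X^T y exists for any lambda > 0.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_vars = 40, 50                   # more variables than observations
X = rng.normal(size=(n_obs, n_vars))
y = rng.normal(size=n_obs)

XtX = X.T @ X
# X^T X is 50x50 but has rank at most 40, so it cannot be inverted
# and ordinary least squares has no unique solution
print(np.linalg.matrix_rank(XtX))        # 40

# adding lambda to the diagonal makes the system solvable
lam = 1.0
ridge_beta = np.linalg.solve(XtX + lam * np.eye(n_vars), X.T @ y)
print(ridge_beta.shape)                  # (50,)
```

The data here are random placeholders; the point is only that the penalized system is invertible while the unpenalized one is not.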
    • @VINAY MALLU To repeat what I wrote in the comment you replied to: A lot of people ask why lambda can't be negative. Remember, the goal of lambda is not to give us the optimal fit, but to prevent overfitting. If a positive value for lambda does not improve the situation, then the optimal value for lambda (discovered via cross validation) will be 0, and the line will fit no worse than the Ordinary Least Squares Line.

      @statquest · 3 years ago
    • @VINAY MALLU The larger the dataset, the less likely you are to overfit the data. So in some sense, regularization becomes less important. However, Lasso (L1) regularization is still helpful for removing extra variables regardless of the size of the dataset. And even with very large datasets, ML algorithms that depend on weak learners benefit from regularization.

      @statquest · 3 years ago
    • @@statquest Coming back to Vinay's question: In the counterexample he gives, a negative lambda would not achieve a better fit to the training data, but prevent overfitting (in that case, overfitting to a more shallow slope). I really liked the video and found most of it very intuitive, but the fact that ridge regression favours a more shallow slope is not. With a large set of predictors, it's easy to see that enforcing sparsity may provide better out-of-sample predictions in practice. But with a single predictor, the prior assumption that 'the observed data tend to overestimate the influence of the predictor' seems no more justified than its opposite would be. In other words: under OLS assumptions, the distribution of OLS fitted slopes will be symmetrically centered on the 'true' slope. But the example was really helpful to understand that ridge regression doesn't work that way and instead biases the fit towards the intercept.

      @sfz82 · 3 years ago
    • Is there an intuitive explanation for why the intercept beta 0 is not included in the regularization process?

      @cosworthpower5147 · 2 years ago
    • @@cosworthpower5147 The goal is to reduce sensitivity to the variables. The y-axis intercept does not depend on any of the variables, so there's no reason to shrink it. Instead, as the other parameters go to 0, the intercept goes to the mean y-axis value.

      @statquest · 2 years ago
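The reply above can be checked with a small numpy sketch (my own illustration, not from the video; the data are made up): if only the slope is penalized, then as lambda gets huge the slope shrinks to 0 and the fitted intercept settles at the mean y-axis value.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])   # mean(y) = 3.04

def ridge_line(x, y, lam):
    # minimize sum((y - (b0 + b1*x))^2) + lam * b1^2
    xc, yc = x - x.mean(), y - y.mean()
    slope = (xc @ yc) / (xc @ xc + lam)    # only the slope is penalized
    intercept = y.mean() - slope * x.mean()  # intercept left unpenalized
    return intercept, slope

b0, b1 = ridge_line(x, y, lam=1e9)         # an extreme penalty
print(round(b1, 6), round(b0, 3))          # 0.0 3.04
```

With lam=0 the same formula recovers the ordinary least squares line, so the intercept is only pulled toward mean(y) to the extent that the slope is shrunk.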
  • I am a machine learning engineer at a large, global tech company with a decade of experience in industry and a computer science graduate student. Your channel has helped me immensely in learning new concepts for work and job interviews, and your videos are so enjoyable to watch. They make learning feel effortless! Thank you so much!!

    @scubashar · 3 years ago
    • Wow! Thank you very much! :)

      @statquest · 3 years ago
    • can you give me a job plz?

      @VainCape · 3 years ago
    • @Son Of Rabat , some people (like me) might have skipped the "simple stuff" to jump right into the complex stuff because it gives better results. For example, I was introduced to ML by working with image classification and object detection right away, where deep learning is king. I studied backpropagation, gradient descent, etc, but never heard of Ridge Regression, for example, until recently. Now I'm trying to collect the pieces I left behind. (I also always sucked with the theoretical parts. As long as the evaluation metrics were good, it was fine... And it kind of worked for me, for some time. I'm now trying to change that, and deepen my theoretical knowledge.)

      @LucasPossatti · 2 years ago
    • Today, I also work for a global tech company (as a Data Scientist). Not for a decade though. 😅

      @LucasPossatti · 2 years ago
    • @@LucasPossatti same for me. I work as DS at large tech company, but still learn a lot from SQ

      @joshsherfey · 2 years ago
  • After watching dozens of StatQuest videos, I finally know when to say 'BAM!'

    @ryzary · 3 years ago
    • Bam! :)

      @statquest · 3 years ago
    • 🤣🤣🤣🤣🤣🤣🤣🤣

      @akshaydeshmukh4916 · 3 years ago
    • plz tell me when to say BAM, I'm still unable to understand 😭

      @nailashah6918 · 2 years ago
    • I had to build a ML model to help me predict the proper times to say ‘BAM!’

      @loayzag91 · 2 years ago
    • BAM!

      @ashishshrivastava8864 · 2 years ago
  • This channel is by far the best at explaining mathematical concepts related to machine learning. I'm in a machine learning class at my university and go to every class lecture. I leave not having understood an hour and fifteen minutes of lecture. I immediately pull up this channel and watch a video on the same concept and "BAM". It makes sense.

    @bradleyrivers9489 · 2 months ago
    • BAM! :)

      @statquest · 2 months ago
  • Only Statquest can make someone emotional while learning statistics. The ease with which the concepts flow flawlessly into my brain makes me teary. Thank you so much 🥺❣

    @pritamfeb13 · 6 days ago
    • Thanks!

      @statquest · 6 days ago
  • Explaining things at this complexity at this level of simplicity is a real skill! Awesome channel!

    @ardakosar3826 · 3 years ago
    • Thank you! :)

      @statquest · 3 years ago
  • I have no words to express how good this lecture is.

    @EvaPev · 5 months ago
    • Thanks!

      @statquest · 5 months ago
  • YOU ARE THOUSANDS OF TIMES BETTER THAN MY PROF...CLEAR & SIMPLE. THANKSSSSS

    @SpL-mu5zu · 4 years ago
    • Thanks! :)

      @statquest · 4 years ago
  • Professors in general teach Ridge Regression with many complicated equations and notations. You made this topic very clear and easy to understand. Thank u very much again.

    @lucaspenna6009 · 4 years ago
    • Thanks! :)

      @statquest · 4 years ago
  • I don't know how my stat teacher can make something this easy to understand seem that complicated. Every time I can't understand what he's talking about in class, I know that I have to turn to StatQuest. Thank you for what you're doing.

    @NaggieNag · 3 years ago
    • BAM! :)

      @statquest · 3 years ago
  • This is my first video and I am so impressed by how you explain things!!! It is like my buddy from college explaining it to me in plain words. You rock StatQuest, I am a follower from now on!! Thank you

    @andersonarroyo7238 · 3 years ago
    • Awesome! Thank you!

      @statquest · 3 years ago
  • The way you go through the logic step by step makes you a good teacher. In many of my research occasions they just say "adjust your alpha higher or lower until you don't overfit / underfit" but I don't even know what am I looking at. Bless you.

    @elliotyip9844 · 1 year ago
    • Happy to help! :)

      @statquest · 1 year ago
  • You just spoon feed my brain with your clear explanation, thanks man!

    @petax004 · 5 years ago
    • BAM!

      @benagin7970 · 3 years ago
  • You have that ability to explain difficult topics in a very simple way, this is amazing! Thank you so much

    @republic2033 · 3 years ago
    • Thank you! :)

      @statquest · 3 years ago
  • Thank you, Josh, for another fun StatQuest! I really enjoyed learning the use and benefits of Ridge Regression!

    @tymothylim6550 · 3 years ago
    • BAM!

      @statquest · 3 years ago
  • Man, love the sarcasm in your voice and the concise / crisp explanation of your concepts! DOUBLE BAMMMM!

    @tejbirsinghbhatia3090 · 3 years ago
    • Glad you liked it!

      @statquest · 3 years ago
  • I love you Stat quest. Your videos are better than any other stats resource I have come across, and I am actually understanding things now, which will help me do my job better. Please never stop making these excellent videos...

    @taltastic2 · 4 years ago
    • Thank you so much! And thank you for your support! I hope to make videos for the rest of my days (which I hope are many!). :)

      @statquest · 4 years ago
  • your explanations are insane... they're so easy to understand and literally capture the essence of the topic without being overly complicated! i've bingewatched so many of your videos ever since chancing upon your channel last night - i specially love the little jingles you add in at the start of your videos, they really add such a fun and personal touch~ thank you so so soo much, your channel has really helped me immensely!!!

    @charissapoh1159 · 2 years ago
    • Wow, thank you!

      @statquest · 2 years ago
    • Yes same for me!

      @johnbainbridge1931 · 9 months ago
  • Incredibly clear explanation. I'm using your Machine Learning videos to study for my midterm for sure. It's so nice to know that these concepts aren't above my head after all.

    @DragomirJtac · 5 years ago
    • Nice!! Good luck on your mid terms!

      @statquest · 5 years ago
  • I've spent so much time trying to read and understand what EXACTLY is ridge regression. This video made it much easier to understand. Thank you so much for simplifying this complex concept!

    @Nicole-se7zj · 2 years ago
    • Bam! :)

      @statquest · 2 years ago
  • Amazing video, I have read many articles and watched many videos to understand the idea behind Ridge & Lasso Regression and finally you explained in the most simplest way, many thanks for your effort.

    @daliakamal5621 · 2 years ago
    • Glad it was helpful!

      @statquest · 2 years ago
  • I love your videos. They make it so easy to follow and understand complicated concepts and procedures! Thanks for sharing all of the brilliant ideas!

    @meichendong3434 · 4 years ago
    • Awesome! Thank you! :)

      @statquest · 4 years ago
  • I am brand-new to statistics, and I'm in school to be a data scientist. so many times, I lose the plot watching lectures from my professors who have the Curse of Knowledge. I end up spending hours watching your videos and they help so much, I just don't even have words! I've recommended your channel to all my classmates--and I mentioned it so much, my professor is considering adding your channel to recommended materials for next semester! you are a shining light of joy in a jargon-filled sea of confusion.

    @user-fo4lr4ku7f · 1 year ago
    • Thank you so much and good luck with your coursework! :)

      @statquest · 1 year ago
    • I study data science too at a uni and his videos are helping me stay afloat in my statistical learning course. Not all heroes wear capes and he's truly one of them!

      @Dreadheadezz · 1 year ago
    • @Linda Wallberg @Josh Sherfey @Lucas Possatti I don't see why we even use lambda, it doesn't seem to change anything 🤔. I'd understand if it were a value between 0 and 1, but not any value >= 0. Can someone please explain? Multiplying slope² by lambda (a scalar) should only scale it in the parallel direction, right? We basically just take some smaller arbitrary slope (introduce bias) and that's all.

      @shivanit148 · 11 months ago
    • @@shivanit148 No, we don't take an arbitrary smaller slope. We find the one slope that minimizes the SSR + penalty.

      @statquest · 11 months ago
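To illustrate the reply above with a toy example of my own (not from the video): the ridge slope is the unique minimizer of SSR + lambda × slope², which a brute-force search over candidate slopes finds, and which matches the closed-form answer.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.1, 1.1, 1.9, 3.1])
lam = 2.0

def cost(slope):
    intercept = y.mean() - slope * x.mean()   # intercept is not penalized
    residuals = y - (intercept + slope * x)
    return np.sum(residuals**2) + lam * slope**2   # SSR + penalty

# brute-force search over candidate slopes
slopes = np.linspace(-2.0, 2.0, 100001)
best = slopes[np.argmin([cost(s) for s in slopes])]

# closed form for this simple case: sum(xc*yc) / (sum(xc^2) + lambda)
xc, yc = x - x.mean(), y - y.mean()
closed = (xc @ yc) / (xc @ xc + lam)
print(round(best, 3), round(closed, 3))   # 0.7 0.7
```

So the penalized slope is not arbitrary; it is the single value where the combined cost bottoms out.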
  • I am glad that I found this video! I had trouble understanding regularized regression and this video made it too simple to understand.

    @akhilmahajan1417 · 5 years ago
  • So far the best Video i ever saw for regression ... thanks Josh !!

    @viniths7683 · 5 years ago
  • I have a big data economics exam tomorrow and you literally just saved my life. I don't always understand what my professor is trying to explain, but you did it super clearly. Actual life saver

    @TheGoldenFluzzleBuff · 5 years ago
    • How was the exam?

      @huzefaghadiyali5886 · 2 years ago
    • I’m more concerned that this “literally saved their life”.

      @jamesdelaney9599 · 10 months ago
  • I really appreciate your videos! Keep up the good work.

    @monicakulkarni3319 · 4 years ago
  • Didn't even realize this StatQuest video is super long until you mentioned it; truly enjoy your way of explaining, thanks))))))))

    @winghho9 · 5 years ago
    • Hooray! I'm glad you liked it. :)

      @statquest · 5 years ago
  • Whenever I feel some concept in ML or DS is not easily understood, I come to this channel, because you explain it in a simple way with good examples.

    @malini76 · 2 years ago
    • Thank you!

      @statquest · 2 years ago
  • Just Brilliant!! Josh Starmer - You are a genius!

    @sam271183 · 5 years ago
    • Thank you! :)

      @statquest · 5 years ago
  • I came to know about this channel 2 hours ago. Simple and Outstanding explanation. My aim is to watch each and every video. Loving your style of teaching. From India.

    @anamfatima5489 · 3 years ago
    • Thank you very much! :)

      @statquest · 3 years ago
  • Thanks for this awesome explanation. This is the first time I really understood how ridge regression works.

    @yousufali_28 · 5 years ago
    • Hooray!!!! :)

      @statquest · 5 years ago
  • Level of simplicity on this channel is just BAM!!!

    @RaviYadav-nj8zh · 3 years ago
    • Thank you! :)

      @statquest · 3 years ago
  • Your channel deserves more recognition. Keep up the good work!

    @jobandeepsingh1929 · 4 years ago
    • Thank you! :)

      @statquest · 4 years ago
  • This is incredibly helpful!! I will be watching many of your videos to supplement my stats/data science studies :) Thank you!

    @seetarajpara7626 · 3 years ago
    • Glad it was helpful!

      @statquest · 3 years ago
  • Firstly, I'd like to thank you for explaining these concepts in such a crystal clear manner; this is one of the best videos I have ever watched. Second, I request that you please make some videos on backpropagation and some of the more tedious concepts of ML. Once again, thank you.

    @omprakash007 · 5 years ago
  • I looked at 3-4 videos before this, but this one was the best in terms of explanation and was very easily understood. Thanks!

    @snehabag4820 · 1 year ago
    • Glad it was helpful!

      @statquest · 1 year ago
  • Josh, I have been practicing data science for the last 4 years and have used Ridge regression as well. But now I am feeling embarrassed after watching this explanation, because before the video I only had half-baked knowledge. You deserve a lot of accolades my friend :)

    @JT2751257 · 4 years ago
    • Awesome! I'm glad the videos are helpful. :)

      @statquest · 4 years ago
  • You made learning this complicated topic (for me) a lot more fun than reading a textbook or listening to my own lecturer. Very entertaining too... Well done!

    @monazaizan947 · 6 months ago
    • Glad it was helpful!

      @statquest · 6 months ago
  • I've taken 4 machine learning courses and always wondered what ridge regression was, because I've heard it several times, but I was never taught it. I never realized it was just adding the regularization parameter! Awesome! Thank you so much.

    @SomeOfOthers · 5 years ago
    • Hooray! I'm glad the video helped clear up a long standing mystery. As you've noticed, a lot of machine learning is about giving old things new names - which makes it a lot easier to understand than we might think at first.

      @statquest · 5 years ago
  • You are absolutely amazing and the videos are so insanely useful! If these videos were available 5 years ago, I would have skipped all my stat classes! : )

    @senzhuang9408 · 5 years ago
    • Thank you so much! :)

      @statquest · 5 years ago
  • BAM! The concepts are presented in the clearest way ever.

    @kaimueric9390 · 4 years ago
    • Thank you! :)

      @statquest · 4 years ago
  • Thank you so much. You made this so much easier to understand than my professor. Really appreciate it

    @spencerprice1676 · 5 years ago
    • You're welcome! I'm glad to hear that the video was helpful. :)

      @statquest · 5 years ago
  • Wow! Such a simple yet detailed exposition!

    @balajiadithya1292 · 2 years ago
    • Thank you!

      @statquest · 2 years ago
  • Thank you, Josh, you made the ML and stats easy and enjoyable. Hands down better than most stat profs.

    @skylarj720 · 1 year ago
    • Thank you very much! :)

      @statquest · 1 year ago
  • The lecture was at a whole different level.....thank you for such amazing content dear Josh

    @hrushikeshkulkarni7353 · 1 year ago
    • Thanks!

      @statquest · 1 year ago
  • Thanks Josh! You’re absolutely the best 💪🏻

    @PedroRibeiro-zs5go · 3 years ago
    • Thank you very much! :)

      @statquest · 3 years ago
  • Thank you very much!! I already subscribed to the channel after the first video I saw (bias and variance). You deliver the academic content in a very fun, easy to understand way, full of pictures (which helps a lot, because many people, including me, find it hard to understand new concepts without solid examples). Please continue making more content like this!!

    @datle5585 · 1 year ago
    • Thank you!

      @statquest · 1 year ago
  • StatQuest - you are awesome! You’re my go-to source to learn stats when my textbooks fail me.

    @trmohr · 4 years ago
    • Hooray!!! :)

      @statquest · 4 years ago
  • Your channel is a god send!

    @juhipathak8433 · 5 years ago
    • Thank you! :)

      @statquest · 5 years ago
    • That's so true!

      @karthik-ex4dm · 5 years ago
    • then type quadruple bam

      @Shubhamkumar-ng1pm · 4 years ago
  • Clearly explained is an understatement, it is the saturated BAM!!!

    @Tapsthequant · 3 months ago
    • Thank you very much! :)

      @statquest · 3 months ago
  • Never stop teaching sir... U r the best

    @arpitmishra8439 · 5 years ago
  • Probably the most sensible explanation available on youtube..and yes...BAM!! ;)

    @tusharpatil96 · 3 years ago
    • Wow, thanks!

      @statquest · 3 years ago
  • Josh, even though I have just started Machine Learning and Data Science in my French Engineering "Grande Ecole", watching your videos just replaced most of the teachers I had met in my life. Great BAM my friend and thank you, just keep it up! You got a rare gift

    @macilguiddir3680 · 5 years ago
    • Thank you so much! I'm so happy to hear that my videos are helpful! :)

      @statquest · 5 years ago
    • StatQuest with Josh Starmer Even French people rely on you and are looking forward to studying your next videos ;)

      @macilguiddir3680 · 5 years ago
    • Hooray!

      @statquest · 5 years ago
    • lol you must have really bad profs then, which school is it?

      @luisakrawczyk8319 · 5 years ago
  • I just keep coming back to you Josh! Thanks for your clear explanation.

    @Anthestudios · 1 year ago
    • Glad to hear it!

      @statquest · 1 year ago
  • Excellent as always! Thank you for sharing!

    @yulinliu850 · 5 years ago
    • Hooray! :)

      @statquest · 5 years ago
  • Mega BAM!!!! Thank you I can't wait to learn the next lesson

    @longkhuong8382 · 5 years ago
    • Hooray!!!! :) The next one, on Lasso Regression, should come out in the next week or so.

      @statquest · 5 years ago
    • Yeah!, It's great. Thank you

      @longkhuong8382 · 5 years ago
  • This is so cool, it's almost like magic.

    @shashankupadhyay821 · 3 years ago
    • YES! :)

      @statquest · 3 years ago
  • People like you make the world a better place… thanks for being you...

    @yassersayed6109 · 5 years ago
    • Thank you!!! :)

      @statquest · 5 years ago
  • Your videos are always with so much fun! Thank you.

    @tedwong9301 · 4 years ago
    • Thank you! :)

      @statquest · 4 years ago
  • absolutely amazing, thank you sir!

    @aliozcankures7864 · 2 years ago
    • Glad you liked it!

      @statquest · 2 years ago
  • Thank you for this video, it's so helpful! I can't believe it only has 500 views. Please consider a Patreon account so that people can thank you for your work!

    @kslm2687 · 5 years ago
    • Thank you! I'll look into the patreon account. In the mean time you can support my channel through my bandcamp site - even if you don't like the songs, you can buy an album and that will support me. joshuastarmer.bandcamp.com/

      @statquest · 5 years ago
  • Can't say how much I love you!! God please make sure this channel is always here❤

    @runxingjiao1979 · 1 year ago
    • Thank you! :)

      @statquest · 1 year ago
  • It's insane, I keep coming back to this channel to brush up on material. I am finally graduating this summer, but I know for sure I will be coming back here just to hear "small Bam!" and "Bamm" lol

    @habeshadigitalnomad137 · 1 month ago
    • Congratulations! BAM! :)

      @statquest · 1 month ago
  • My lecturer explained this by just putting the equation in front of us on the slides. The maths is easy, but I didn't understand the point or intuition behind adding a penalty. Now I do. Thank you.

    @tommcnally3231 · 4 years ago
    • I'm glad the video was helpful. :)

      @statquest · 4 years ago
  • wow. seriously better explained than lectures from my professor in the data science department

    @aliciachen9750 · 4 years ago
    • Thanks! :)

      @statquest · 4 years ago
    • i know, right?

      @hgflame · 4 years ago
  • I came here to learn about ridge regression only to realize it's L2 regularization. Aside from this, StatQuest is simply amazing. I use it to brush up on theory before interviews.

    @OttoFazzl · 4 years ago
    • It's true - I'm not sure why we call it Ridge Regression and not L2. Or the other way around. And, on top of that, why not pick a name that is easy to remember, like "Squared Regularization".

      @statquest · 4 years ago
  • Greetings from Ukraine, Josh! I'd like to say thanks to you, because even though we are in a difficult situation here, your videos on machine learning techniques always help me comprehend topics in this field... I am grateful to you! Thank you so much!!!

    @youknowwhatlol6628 · 3 months ago
    • Wow! I can't imagine trying to learn ML in your situation, but I'm happy that I can help in some way.

      @statquest · 3 months ago
  • These videos are awesome! Somehow, listening to the video, I feel it comes from/for someone with a background in stats, rather than a typical computer science machine learning video.

    @programminginterviewprep1808 · 5 years ago
    • Interesting. My background is both computer science and statistics - but I did biostatistics for years before I did machine learning, so that might explain it.

      @statquest · 5 years ago
  • I was listening with extreme focus and you suddenly threw "Airspeed of Swallow" at me. I died XDDDDDDDDDDDD

    @vspecky6681 · 4 years ago
    • Awesome! :)

      @statquest · 4 years ago
    • what do you mean, African or European Swallow

      @oldcowbb · 3 years ago
  • best video about Ridge ever !!!!! very clear and precise!

    @luizelias2560 · 3 years ago
    • Wow, thanks!

      @statquest · 3 years ago
  • Thank you so much for this video, I love you! I hope it helps for the exam tomorrow

    @mrsmith33712 · 4 years ago
    • Good luck! Let me know how it goes!

      @statquest · 4 years ago
  • Quadruple bam!!!! For your explanation

    @akashdesarda5787 · 5 years ago
    • Hooray! I'm glad you like it! :)

      @statquest · 5 years ago
  • Love from India. Wish me good luck interview in less than days.

    @kadhirn4792 · 4 years ago
    • Thank you and good luck with your interviews. Let me know how they go. :)

      @statquest · 4 years ago
    • @@statquest narrator: they never did let StatQuest know...

      @Whoasked777 · 3 years ago
    • @@Whoasked777 Totally! I hope they went well.

      @statquest · 3 years ago
  • Great explanation Josh. Thank you so much! Helped me a lot!

    @george480 · 4 years ago
    • Hooray! I'm glad the video was helpful. :)

      @statquest · 4 years ago
  • This really helped me! Thank you so much for making these videos.

    @MayanPatel1 · 4 years ago
    • Thank you! :)

      @statquest · 4 years ago
  • Josh, you're a true hero with your explanations. Thanks a bunch! I have one question though. In the video (in the graph at 19:20 for example) you show that a ridge regression would fit real world data better, as it shrinks the beta (the graph shows that in the real world this beta is also smaller, due to most green points (= real world data) being positioned below the red line (= training data)). However, would ridge regression still be better if, for example, most of the green dots were above the red line? Because with ridge regression we would shrink the beta, while the real world beta in reality has an even higher slope than the slope of the red line (thus, in this case, ridge would lead to an increase in both variance and bias for real world data?)

    @zeerot · 5 years ago
    • This is a great question - the key is that when lambda = 0, you get the exact same result as least squares - so Ridge Regression cannot do worse than Least Squares, it can only do better. In the case you mention, sure, if all of the green dots are above the red dots, neither Least Squares nor Ridge Regression will do well - but Ridge Regression will do no worse than Least Squares.

      @statquest · 5 years ago
    • Thank you for posting this question. One thousand comments on this video, all well deserved praise as this video and the whole channel are awesome. Yet only you asked this obvious question. Makes me wonder how many people actually bothered to understand the whole point of Ridge Regression.

      @CyberSinke · 2 years ago
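The "lambda = 0 gives exactly least squares" point in that reply can be checked directly. A numpy sketch of my own (not from the video; the data are made up): the closed-form ridge coefficients at lambda = 0 match ordinary least squares, which is why a lambda search that includes 0 can never end up worse than least squares.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=30)

def ridge(X, y, lam):
    # closed-form ridge: (X^T X + lambda*I)^(-1) X^T y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

ols = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.allclose(ridge(X, y, 0.0), ols))                        # True
# any positive lambda shrinks the coefficients relative to least squares
print(np.linalg.norm(ridge(X, y, 10.0)) < np.linalg.norm(ols))   # True
```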
  • How would you do cross validation for the example @ 10:16 to determine lambda? For example, would you then take 10 random samples of 2 (out of 8) data points and try different lambda's (for example lambda 1-20) for each _individual_ sample? And then determine which value of lambda in all those 10 samples gives the lowest variance?

    @Tntpker · 5 years ago
    • That's the idea. In practice, there are usually many more samples, so you're not just picking 2 samples at a time, but that's the idea.

      @statquest · 5 years ago
    • @@statquest Thanks!

      @Tntpker · 5 years ago
    • How to calculate that variance then?

      @dadipsaus332 · 4 years ago
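The cross-validation idea discussed in this thread can be sketched as follows (my own illustration, not from the video; the data and lambda grid are made up): for each candidate lambda, average the held-out squared error across the folds, then keep the lambda with the lowest average.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(24, 4))
y = X @ np.array([2.0, 0.0, -1.0, 0.5]) + rng.normal(scale=0.5, size=24)

def ridge(X, y, lam):
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def cv_error(X, y, lam, n_folds=4):
    # average held-out squared error over the folds
    folds = np.array_split(np.arange(len(y)), n_folds)
    errors = []
    for test_idx in folds:
        train_idx = np.setdiff1d(np.arange(len(y)), test_idx)
        beta = ridge(X[train_idx], y[train_idx], lam)
        errors.append(np.mean((y[test_idx] - X[test_idx] @ beta) ** 2))
    return np.mean(errors)

lambdas = [0.0, 0.01, 0.1, 1.0, 10.0]   # 0 corresponds to plain least squares
best_lam = min(lambdas, key=lambda lam: cv_error(X, y, lam))
print(best_lam)   # whichever lambda had the lowest held-out error
```

In practice you would shuffle the data before splitting and use a finer lambda grid; the structure of the search stays the same.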
  • You are literally a LIFE SAVER!! Thank you sosososo much

    @ameliaschricker2527 · 2 years ago
    • Thanks!

      @statquest · 2 years ago
  • This is not StatQuest.. this is Machine learning slayer! Damn! Another awesome video. Bravo bravo!

    @999Stergios · 5 years ago
    • Thank you so much! I really appreciate it. :)

      @statquest · 5 years ago
  • Small question: Does ridge regression only decrease sensitivity? What if, instead of this example, our test set was above the red line? Normally we'd need to increase sensitivity?

    @iefe65 · 5 years ago
    • This will be taken care of... if you are taking a random sample ... don't worry

      @vishaltyagi2983 · 3 months ago
  • Great explanation as always. There is something it's not convincing me about this type of regression. The ridge regression assumes that the training data are always overestimating the slope. Isn't possible that the training data are underestimating the slope instead?

    @fmetaller · 5 years ago
    • If the training data underestimate the slope, then shrinking it will not improve the fit during cross validation. In this case the best value for lambda will be zero. So ridge regression can’t make things worse. Does this make sense?

      @statquest@statquest5 жыл бұрын
    • @@statquest yes it's clear. Thank you for your explanation.

      @fmetaller@fmetaller5 жыл бұрын
    • I also had same question. Thankfully, I found your comment!

      @akhilmahajan1417@akhilmahajan14175 жыл бұрын
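Josh's answer can be checked with a tiny numeric example. Below, the (made-up) training points give a fitted slope of 1, while the (made-up) validation point is consistent with a steeper slope; shrinking the slope only raises the validation error, so the grid search picks lambda = 0 and ridge falls back to ordinary least squares. The closed-form fit assumes simple ridge with an unpenalized intercept.

```python
# Made-up example: the training slope (1.0) underestimates the "true" slope,
# so shrinking it further can only hurt -- validation error is smallest at lambda = 0.

def ridge_fit(xs, ys, lam):
    """Closed-form simple ridge: minimize SSR + lam * slope^2 (intercept unpenalized)."""
    n = len(xs)
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    sxx = sum((x - x_bar) ** 2 for x in xs)
    slope = sxy / (sxx + lam)
    return slope, y_bar - slope * x_bar

train_x, train_y = [1.0, 2.0], [1.0, 2.0]   # fitted slope = 1
val_x, val_y = 3.0, 4.0                     # consistent with a steeper slope

def val_error(lam):
    slope, intercept = ridge_fit(train_x, train_y, lam)
    return (val_y - (intercept + slope * val_x)) ** 2

lambdas = [0.0, 0.5, 1.0, 2.0, 5.0]
best_lam = min(lambdas, key=val_error)
print("best lambda:", best_lam)   # prints "best lambda: 0.0"
```

So, as the replies above say, ridge can't make things worse: when shrinking doesn't help, cross validation simply hands back the least squares line.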
  • Thank you Josh for a great video as always!

    @heteromodal@heteromodal3 жыл бұрын
    • My pleasure!

      @statquest@statquest3 жыл бұрын
  • So awesome!!! Many complicated things are put simply. You're great! :D Thank you.

    @LetWorkTogether@LetWorkTogether4 жыл бұрын
    • Thanks! :)

      @statquest@statquest4 жыл бұрын
  • This reminds me of L2 regularization of weights in neural networks.

    @herp_derpingson@herp_derpingson5 жыл бұрын
    • Yes! This is the exact same thing, only applied to Regression. I think it appeared first in the regression context, but I'm not sure.

      @statquest@statquest5 жыл бұрын
  • So from what I understand, ridge regression keeps the slope from getting too big, right? This adds bias but reduces variance a lot, so overall it's better. But what if my true model has a slope that is actually bigger (steeper) than what I got using my training data? In that case, wouldn't you be making the model worse by using regularization? In other words, why are we "desensitizing" when we don't know what the underlying model is? What if the sensitivity in the actual model is higher?

    @lazypunk794@lazypunk7945 жыл бұрын
    • I have this exact same doubt! I guess we use trial and error and see whether the model improves; if it doesn't, the only way forward is to either use a more complex function or get more training data.

      @sidsr@sidsr5 жыл бұрын
    • @@sidsr oh okay.. but still, regularization works pretty much every time, right?

      @lazypunk794@lazypunk7945 жыл бұрын
    • I think once you test all possible values of lambda, the one that gives you the smallest test error will be the best one. So if the true model is steeper (and assuming the test error gives you an approximation of the true error), lambda will reduce to zero.

      @meinizizheng9867@meinizizheng98674 жыл бұрын
    • by trial and error, your model will get its best performance when lambda = 0, which means "no regularizer used".

      @-long-@-long-4 жыл бұрын
  • This Vid is simply life-saving

    @timshao13@timshao13 Жыл бұрын
    • bam! :)

      @statquest@statquest Жыл бұрын
  • EXCELLENT. Really well done!

    @massimobuonaiuto8753@massimobuonaiuto87535 жыл бұрын
    • Thank you so much! :)

      @statquest@statquest5 жыл бұрын
  • you are my sunshine,my only sunshine , you make me happy when f**king math puzzled me!

    @1pompeya170@1pompeya1704 жыл бұрын
  • Who's watching this the day before their machine learning finals?

    @nathanx.675@nathanx.6754 жыл бұрын
    • me lol

      @kelgamel2203@kelgamel22033 жыл бұрын
    • Meeee

      @tinashemwaniki@tinashemwaniki3 жыл бұрын
    • I have my midterm tomorrow

      @AP-pm9qy@AP-pm9qy3 жыл бұрын
  • TRIPLE BAM! You are amazing! Thank you so much for all your videos!

    @sofiyavyshnya6723@sofiyavyshnya67233 жыл бұрын
    • Glad you like them!

      @statquest@statquest3 жыл бұрын
  • this series is really dope and deep

    @well....7751@well....77514 жыл бұрын
    • Thanks! :)

      @statquest@statquest4 жыл бұрын
  • How do you prove that the slope goes to 0 as lambda increases, as claimed at 9:42?

    @hzyTMU@hzyTMU4 жыл бұрын
    • I have the same question

      @badoiuecristian@badoiuecristian4 жыл бұрын
    • when lambda tends to infinity, SSE becomes negligible compared to lambda * slope^2, hence the slope has to go to 0

      @chandankumar-jo7rf@chandankumar-jo7rf4 жыл бұрын
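For this thread, the algebra is short. With a single centered predictor (generic notation, not taken from the video), the ridge objective is a convex quadratic in the slope m, so setting the derivative to zero gives a closed form whose limit is immediate:

```latex
f(m) = \sum_{i=1}^{n} \bigl(y_i - m x_i\bigr)^2 + \lambda m^2,
\qquad
f'(m) = -2\sum_{i=1}^{n} x_i \bigl(y_i - m x_i\bigr) + 2\lambda m = 0
\;\Longrightarrow\;
\hat{m}(\lambda) = \frac{\sum_{i} x_i y_i}{\sum_{i} x_i^2 + \lambda}.
```

As lambda goes to infinity the denominator grows without bound while the numerator stays fixed, so the fitted slope goes to 0 — which matches the reply above: the squared-residual term becomes negligible next to lambda * slope^2, so the penalty dominates and drives the slope toward zero.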
  • How do we get the new line at 3:40? We calculated 1.69 and 0.74; what did we do with them to get the new line?

    @MrChryssy1@MrChryssy15 жыл бұрын
    • In practice, ridge regression starts with the least squares estimates for the slope and intercept. Then it changes the slope a little bit to see if the sum of the squared residuals plus lambda times the squared slope gets smaller. If so, it keeps the new value. Then it changes the slope a little bit again and checks the same thing. It repeats those steps over and over until the sum of the squared residuals plus lambda times the squared slope no longer gets smaller. Does that make sense?

      @statquest@statquest5 жыл бұрын
    • @@statquest Hi Josh, the slope that you are referring to is just one of our parameters that we want to minimize right? For a higher order fitting, can it be any other parameter apart from slope?

      @utkarshkulshrestha2026@utkarshkulshrestha20265 жыл бұрын
    • @@utkarshkulshrestha2026 Least Squares will work to minimize the sum of the squared residuals using all of the parameters and the ridge regression will be applied to all parameters except for the intercept. Thus, for all parameters other than the intercept, we try to minimize the sum of the squared residuals plus the ridge regression penalty. Usually reducing the parameter values will increase the sum of the squared residuals a little bit and decrease the ridge regression penalty a lot. Does that make sense?

      @statquest@statquest5 жыл бұрын
    • @@statquest Yes, this was very clear. Thank you..!!

      @utkarshkulshrestha2026@utkarshkulshrestha20265 жыл бұрын
    • @@statquest I mean the calculation^^That is what I am not quite sure about

      @MrChryssy1@MrChryssy15 жыл бұрын
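The procedure Josh describes in this thread can be written down directly: start from the least-squares estimates, nudge the slope a tiny amount in each direction, keep any nudge that lowers SSR + lambda * slope^2, and stop when no nudge helps. A rough sketch follows; in practice solvers use the closed form or gradients, and the data, lambda, and step size here are made up for illustration.

```python
# Iterative "nudge the slope" search: repeatedly try slope +/- step and keep
# whichever candidate lowers SSR + lambda * slope^2, stopping when neither helps.

def cost(slope, intercept, xs, ys, lam):
    """Sum of squared residuals plus the ridge penalty on the slope."""
    ssr = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    return ssr + lam * slope ** 2

def ridge_by_search(xs, ys, lam, step=1e-4, max_iter=200_000):
    # Start from the ordinary least-squares slope estimate.
    n = len(xs)
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
            sum((x - x_bar) ** 2 for x in xs)
    for _ in range(max_iter):
        # For any fixed slope, the best (unpenalized) intercept is y_bar - slope * x_bar.
        intercept = y_bar - slope * x_bar
        best = cost(slope, intercept, xs, ys, lam)
        improved = False
        for candidate in (slope - step, slope + step):
            c_int = y_bar - candidate * x_bar
            c = cost(candidate, c_int, xs, ys, lam)
            if c < best:
                slope, best, improved = candidate, c, True
        if not improved:          # no nudge lowers the cost: we're done
            break
    return slope, y_bar - slope * x_bar

xs, ys = [1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]
slope, intercept = ridge_by_search(xs, ys, lam=2.0)
print(slope, intercept)
```

Because the penalized cost is a convex quadratic in the slope, this simple search lands (to within one step) on the same answer as the closed-form solution, slope = Sxy / (Sxx + lambda).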
  • Your content is SOOOO good, thank you!

    @akarshnagaraj6155@akarshnagaraj61553 жыл бұрын
    • Thank you very much! :)

      @statquest@statquest3 жыл бұрын
  • Wow, you are my personal Lifesaver. Didnt understand the concepts of Ridge Regression in any other source

    @MrFalingdown@MrFalingdown2 жыл бұрын
    • Glad I could help! :)

      @statquest@statquest2 жыл бұрын
  • 14:57 An African or European swallow?

    @gramble10@gramble104 жыл бұрын
    • A most excellent question sir! :)

      @statquest@statquest4 жыл бұрын
  • Love from 🇵🇰 Pakistan.

    @usamanavid2044@usamanavid20444 жыл бұрын
    • Thank you! :)

      @statquest@statquest4 жыл бұрын
  • hello sir, i just wanted to tell you that you are the teacher! thank you for your diamond-cut clarification

    @vinodr9655@vinodr96554 жыл бұрын
    • Thank you very much! :)

      @statquest@statquest4 жыл бұрын
  • this is the best content I have ever seen on machine learning. Triple bam!

    @Shubhamkumar-ng1pm@Shubhamkumar-ng1pm4 жыл бұрын
    • Thank you very much! :)

      @statquest@statquest4 жыл бұрын