A pretty reason why Gaussian + Gaussian = Gaussian

May 15, 2024
760,107 views

A visual trick to compute the sum of two normally distributed variables.
3b1b mailing list: 3blue1brown.substack.com/
Help fund future projects: / 3blue1brown
Special thanks to these supporters: www.3blue1brown.com/lessons/g...
For the technically curious who want to go deeper, here's a proof of the central limit theorem using moment generating functions:
www.cs.toronto.edu/~yuvalf/CL...
And here's a nice discussion of methods using entropy:
mathoverflow.net/questions/18...
Relevant previous videos
Central limit theorem
• But what is the Centra...
Why π is there, and the Herschel-Maxwell derivation
• Why π is in the normal...
Convolutions and adding random variables
• Convolutions | Why X+Y...
Time stamps
0:00 - Recap on where we are
2:10 - What direct calculation would look like
3:38 - The visual trick
8:27 - How this fits into the Central Limit Theorem
12:30 - Mailing list
Thanks to these viewers for their contributions to translations
German: lprecord, qoheniac
Spanish: Pablo Asenjo Navas-Parejo
Vietnamese: Duy Tran
------------------
These animations are largely made using a custom Python library, manim. See the FAQ comments here:
www.3blue1brown.com/faq#manim
github.com/3b1b/manim
github.com/ManimCommunity/manim/
You can find code for specific videos and projects here:
github.com/3b1b/videos/
Music by Vincent Rubinetti.
www.vincentrubinetti.com/
Download the music on Bandcamp:
vincerubinetti.bandcamp.com/a...
Stream the music on Spotify:
open.spotify.com/album/1dVyjw...
------------------
3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube: if you want to stay posted on new videos, subscribe: 3b1b.co/subscribe
Various social media stuffs:
Website: www.3blue1brown.com
Twitter: / 3blue1brown
Reddit: / 3blue1brown
Instagram: / 3blue1brown
Patreon: / 3blue1brown
Facebook: / 3blue1brown

Comments
  • I made a video covering a proof of the central limit theorem, that is, answering why there is a "central limit" at all. It's currently posted for early viewing on Patreon: www.patreon.com/posts/draft-video-on-i-87894319 I think the video has room for improvement, and decided to put it on a shelf for a bit while working on other projects before turning back to it. In the meantime, though, if you are curious about why all finite-variance distributions tend towards some universal shape, it offers an answer.
    Also, you may be interested to know that a Gaussian is not the only distribution with the property described in this video, where convolving it with itself gives a (rescaled) version of the original distribution. The relevant search term here is "stable distributions", though all the others have infinite variance, so they don't fit the criteria of the CLT. Often when the CLT doesn't apply, it's because the independence assumption doesn't hold, but another way it can break is if you're starting with one of these infinite-variance cases.

    @3blue1brown · 10 months ago
    • please make a post there about the complete software stack you're using to make your videos!

      @csehszlovakze · 10 months ago
    • 9:08 the second part has "transformatoin" at the top

      @official-obama · 10 months ago
    • It's kinda hidden, but for people who prefer RSS to mailing lists, it's at /feed

      @mskiptr · 10 months ago
    • Are the functions at 0:10 stable distributions? When you started talking about rotational symmetry I was expecting you to bring up a visual graph of one of those functions convolved with itself and explain why it doesn't have the special property, but instead 5:55 only shows trivial examples, and my curiosity about this question remained unanswered. Is it because the functions from 0:10 are stable distributions? If not, why weren't they shown, when they would have been much more interesting demonstrations of the Gaussian's specialness than trivial examples?

      @voidify3 · 10 months ago
    • Grant, could you please make a video on when the discrete can be approximated by the continuous? For example, in this series you showed that discrete random variables added together approach a continuous normal distribution, and you did discrete and continuous convolutions. But what is the error formula one would get by assuming, say, that d6 dice are continuous-valued, getting your continuous convolution answer, and then taking discrete samples of that answer to match the actual discrete nature of d6 dice? I find it much easier to integrate a 'nice' function than to simplify a discrete Σ sum.

      @mydroid2791 · 10 months ago
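The self-convolution property described in the pinned comment is easy to check numerically. Here is a minimal sketch with NumPy; the grid spacing, range, and tolerance are arbitrary choices, not anything from the video:

```python
import numpy as np

# Sample a standard Gaussian density on a grid.
dx = 0.01
x = np.arange(-10, 10, dx)
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# Convolve the density with itself (Riemann-sum approximation of the integral).
conv = np.convolve(f, f, mode="same") * dx

# The result should match a Gaussian with variance 1 + 1 = 2,
# i.e. a copy of the original stretched horizontally by sqrt(2).
g = np.exp(-x**2 / 4) / np.sqrt(4 * np.pi)
print(np.max(np.abs(conv - g)))  # small (~1e-3), just grid/offset error
```

The comparison curve has variance 2 because variances of independent variables add under convolution, which is exactly the rescaling the comment describes.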
  • I have to laugh. "Why the normal distribution?" was one of the questions that motivated me to get my M.S. Stat a couple of decades ago. I'm loving this series - it adds so much clarity to what I recall learning.

    @justinahole336 · 10 months ago
  • After all the cliffhangers, it's nice to get this series all wrapped up so neatly.

    @jasonremy1627 · 10 months ago
    • Wrapped up? He didn't prove the central limit theorem at all, which is supposedly what this was all about. This video itself barely adds anything to the previous ones.
      Moment generating functions are really not all that complicated - it's high school stuff, really. And it gives a much clearer intuition for why a Gaussian is the limit in the central limit theorem: it's the unique probability distribution that has a mean and a standard deviation but no higher moments. In other words, it's the simplest* distribution: the one that can be described by the least information. Anything else, like skew or asymmetry, is "averaged out". Sadly, Grant is so obsessed with representing things visually that he brushes over alternatives that are at times far clearer and more powerful ways of understanding this.
      * [Technically the simplest would be a point distribution where a single outcome has probability 1 and everything else probability 0, but that hardly counts as a distribution. And anyway, it's just a special type of Gaussian with width 0.]
      EDIT: I got mixed up; replace "moment" with "cumulant" above to correct it. The intuition is the same.

      @QuantumHistorian · 10 months ago
    • @@QuantumHistorian This series is an excellent demonstration of the idea of limits, not just in that the videos are all about the central limit theorem, but also in that he's tending towards the proof of the central limit theorem without ever actually reaching it.

      @Redingold · 10 months ago
    • @@QuantumHistorian This series of videos is clearly not meant to give a fully technical answer but rather an intuitive view of why it's true. I also agree that the visual "trick" here does not seem to simplify the work much, given that the integral is already easy to compute using the trigonometric change of variables that arises naturally, but maybe I'm biased by my own experience.

      @lelouch1722 · 10 months ago
    • You can probably argue that the purpose of the cliffhanger is to encourage the viewer to ponder upon a new solution. That's very much the format of his videos. 3Blue1Brown will never tell the viewer the answer but rather allow open-ended interpretation.

      @KingDuken · 10 months ago
    • @@KingDuken That's not even remotely true. He starts with hints, but he almost always gives the full solution at the end. Look at the recent video on chords, or older ones on the chessboard puzzle or the Basel problem.

      @QuantumHistorian · 10 months ago
  • That made for a great lunch 😁. In your last video you described the Gaussian as an "attractive point in the space of all functions" and I LOVED that phrasing - it really made it make sense. However, I don't do enough real math to have realized that could be the foundation of a proof. That's pretty cool.

    @AlphaPhoenixChannel · 10 months ago
    • Agreed! :) I'm at work eating my lunch and people around me sometimes ask, "Oh are you in school?" and I'm like, "Nope, just an engineer like you that likes learning the math that was never taught!"

      @KingDuken · 10 months ago
    • The legend is here! 🙏🏽🛐

      @nisargbhavsar25 · 10 months ago
    • I also had lunch to this video

      @MeanSoybean · 10 months ago
  • I was studying statistics just now and saw this drop

    @capitaopacoca8454 · 10 months ago
    • For real. Happened twice now. With binomial and this.

      @arvind-venkat · 10 months ago
    • They are watching..... There will come a time when they will order us....

      @baidurjyasarkar8854 · 10 months ago
    • I guess Grant calculated the time of day with the highest probability that the world population would study statistics and then release the video at that time, lol

      @THEMATT222 · 10 months ago
    • Lucky you

      @gaggy7448 · 10 months ago
    • Same

      @benbockelman6125 · 10 months ago
  • I love that this series actually started with the Borwein integrals video. Like, here's a very curious sequence of integrals and here's an interesting concept to explain it, and then five videos later we've dug so deep into convolutions that we got an intuitive explanation for one of the most important theorems in all of math. It's all interrelated!

    @johnchessant3012 · 10 months ago
  • Grant, this has been an absolute masterclass and I genuinely believe it has been your best work so far. Your visualisations have been top notch, and it has brought concept space applied to mathematics to a level not seen before, all publicly accessible through YouTube. You are making mathematics a better field for the entire world. Thanks for your hard work!

    @novakonstant · 9 months ago
  • I like this related explanation: Let X and Y be independent normal random variables, and write S = X+Y for their sum. You can think of S as the dot product of the 2-d vectors (X,Y) and (1,1). As Grant said, the key aspect of the normal random variables is that if you draw a pair of them, the result is rotationally symmetric. Now the dot product is *also* rotationally symmetric (the dot product between two vectors only depends on their lengths and angle). So the distribution of S would be the same if we rotated (1,1) to any other vector with length sqrt2; in particular, to (sqrt2,0). But (X,Y) dotted with (sqrt2,0) is just sqrt2 X, so we see that S is distributed as (sqrt2 times) a normal random variable.

    @RyanODonnellTeaching · 10 months ago
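This dot-product argument is easy to sanity-check with a quick Monte Carlo sketch (the sample size and seed here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Draw independent standard normals X and Y.
x = rng.standard_normal(n)
y = rng.standard_normal(n)

s = x + y                                      # the sum S = X + Y
scaled = np.sqrt(2) * rng.standard_normal(n)   # sqrt(2) times a fresh normal

# Compare summary statistics of the two samples.
print(s.mean(), s.std())            # both near 0 and sqrt(2) ≈ 1.414
print(scaled.mean(), scaled.std())
```

The two printed pairs of statistics should agree to about two decimal places, consistent with S being distributed as sqrt(2) times a standard normal.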
  • Thank you for always making sure to show the "roadmap" before diving into the details! Knowing the broad outline beforehand really makes things easier to follow, and it's something that a lot of other explanatory videos/articles don't bother to do.

    @AzureLazuline · 10 months ago
  • One level of brilliance is simply to be brilliant. Another level is to be able to explain and teach. Yet another level of brilliance is to be able to clearly visualize & present the advanced concepts. Wow. No words.

    @avip2u · 10 months ago
  • Now there are so many great explanations on this channel that they really make one understand it completely.

    @fatitankeris6327 · 10 months ago
  • I want to talk about a strange area of probability, where random variables no longer commute: random matrices.
    You can define the expectation of a random matrix to be the expectation of its trace, which essentially is the distribution of its eigenvalues. It turns out there's a new kind of central limit theorem, known as the "free central limit theorem". This theorem says that if you have "freely independent" random matrices, then the mean's eigenvalue distribution tends towards not a normal distribution, but a semicircular distribution.
    In this probability theory (known as free probability theory), a free convolution exists, which essentially gives the distribution of eigenvalues of X+Y. It turns out the semicircle distribution convolved with itself is another semicircle, much like a normal distribution in classical probability.

    @diffusegd · 10 months ago
    • Is this what we call the "Wigner semicircle law"?

      @SluisaStoffelen-os5oc · 9 months ago
  • Thank you for bringing us amazing math content Grant! The world needs it! Enjoying my afternoon coffee while watching this one! :)

    @cyancoyote7366 · 10 months ago
  • Thank you very much for your hard work; the result is so pleasing. I discovered your channel with the neural network series and I've been enjoying your videos ever since. You rekindled in me the taste for mathematics. Greetings and best regards from France

    @SquallEstel · 10 months ago
  • Please oh please do a video on the Kalman filter, given how indescribably important it is to our modern existence. The result that the convolution of two Gaussians is a Gaussian is at the heart of the Kalman filter's magic.

    @xyzct · 10 months ago
    • Yes…. So much yes to this, would intersect so many core bits of interest perfectly

      @geekswithfeet9137 · 10 months ago
  • What incredible content. I think like once a year I revisit the same list of statistics-oriented content. Between Grant, Richard McElreath and Josh Starmer, you really get your bases covered on great stats content.

    @zakwhite5159 · 10 months ago
  • Congratulations on finally wrapping up this pseudo-series. They’re some of my favorite videos you’ve done!

    @Indecisiveness-1553 · 10 months ago
  • Last time, just after I completed IFFT, you dropped a video on continuous convolution. Yesterday, I finished studying Bivariate Normal distribution and you dropped this. Perfect timing for me!

    @whitewalker608 · 10 months ago
  • It's been 7 years since I took calculus but this is a great way to revisit those concepts. Thank you!

    @user-gv3xt5we1j · 10 months ago
  • Honestly one of the best series on YouTube

    @stick-Iink · 10 months ago
  • Wonderful video! The feeling I got (in high school) when I proved something by symmetry always cheered up my day! Usually these are the most elegant approaches, and the simplest in intuition. Much respect ❤

    @genuine8879 · 2 months ago
  • This is what I needed; I was working on my project on the central limit theorem in various scenarios.

    @abhinandanangra · 10 months ago
  • Always enjoy your videos. It's nice to watch them and make some connections I might've missed from my time in school.

    @jeffmannlein9772 · 9 months ago
  • I was wondering about this topic for a while because I didn't quite get this concept intuitively. And then 3blue1brown dropped this!!

    @SilasHaslam · 10 months ago
  • My god, he drops a video relevant to the topic I'm taking literally right after I finish it

    @thanosauce9128 · 10 months ago
  • Wow, it's already in the playlist... thank you. I wanted to study this for so long

    @div.6763 · 10 months ago
  • My abstract brain would have loved showing that Gaussians are their own convolution via the Fourier transform: since a convolution in coordinate space is multiplication in momentum space (spot the physicist), and since the FT of a Gaussian is a Gaussian, and the product of two Gaussians is a Gaussian, the convolution of two Gaussians must also be a Gaussian. But this is an incredibly satisfying explanation. I'm not left wanting, and after being in the field for nearly a decade, I'm glad to see a frequent concept intuited so cleanly, without the need for arcane notation. ❤

    @estrheagen4160 · 10 months ago
  • I would love to see you extend this series on Gaussian distributions and the CLT to the case where there is correlation and/or dependency.

    @joaodirk · 10 months ago
  • @3Blue1Brown could you please do a series of videos on time series analysis? I think we need a visual and intuitive explanation for a lot of things there! Thank you 😊

    @ahmedkamelkamelo7433 · 10 months ago
  • After having received my Bachelor's in Math this past December, I now just realized why we get that sqrt(2) when finding the convolution. The geometric visualization is extremely easy to understand! (I'm sure I derived this back in first year, but I must have forgotten lol)

    @Mavhawk64 · 10 months ago
  • Thanks so much for this--it makes it really clear. And the 3-dimensional model is really a lot more like a bell! (Although I know that actual bells have a somewhat different shape.)
    I've been using the concept of combining Gaussian (and uniform) distributions for a while now in my (Scala) library called Number. It keeps track of the error bounds on variables. If keeping track of relative bounds, it's easy: for multiplication, just add the relative bounds together; for functions like e^x or x^p, only very slightly more complex. But for addition, we need to convert to absolute bounds and use the convolution that you've been describing.

    @RobinHillyard · 10 months ago
  • Man, I just love your videos. Even though I'm way past the time of having genuine will and ability to learn abstract mathematics (living in a war-torn hell doesn't really help), they still give me a sad and lovely nostalgia for the things I love. I'm just really glad I learned about your channel and watched it grow without losing any of the great things that made it simply extraordinary.

    @Truth4thetrue · 10 months ago
  • This video is a joyous moment in maths communications, as all your videos are.

    @EPMTUNES · 10 months ago
  • Awesome video as always! I don't think I've seen you do it yet, but I would love to see you tackle explaining how and why the RSA encryption algorithm works.

    @asseenontv247 · 10 months ago
  • This question popped back into my head yesterday, so good timing.

    @MrMctastics · 10 months ago
  • You finally made me understand why the CLT works, thanks ❤

    @chinchao · 10 months ago
  • A mailing list! Awesome. I loved Tom Scott doing it and now you too? Amazing!

    @deltaeins1580 · 10 months ago
  • You are partly the reason I am in love with statistics. Thank you. ❤

    @lorenzoplaserrano8734 · 10 months ago
  • After taking AP Stats in my high school senior year, I'm glad this series tied up some loose ends of that course. Thanks for all the amazing insight!
    By the way, I was wondering if you could possibly do a video based on a problem I solved and want to confirm my answers on. It goes like this: You have a line segment of any arbitrary length (it doesn't matter). If you cut it in two random places, what is the probability that the three new segments form a triangle without any excess length left over?
    Again, I believe I know the answer, but I still feel the need to have my results confirmed. I'm also curious if there is any extra insight that can be provided based on problems such as this one. Again, thanks for making this series, and I can't wait to hear what more spicy knowledge you have in store for us!

    @thegreatsibro9569 · 9 months ago
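For anyone wanting to check their answer to this classic stick-cutting puzzle empirically, here is a small Monte Carlo sketch; the well-known analytic answer for two uniform cuts is 1/4:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Cut a unit stick at two uniformly random points.
cuts = np.sort(rng.random((n, 2)), axis=1)
a = cuts[:, 0]
b = cuts[:, 1] - cuts[:, 0]
c = 1 - cuts[:, 1]

# The three lengths form a triangle iff each is shorter than the sum of
# the other two, i.e. iff every piece is shorter than 1/2.
ok = (a < 0.5) & (b < 0.5) & (c < 0.5)
print(ok.mean())  # ≈ 0.25
```

The "every piece shorter than 1/2" condition is just the triangle inequality rewritten using the fact that the three pieces sum to 1.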
  • Simply amazing. That is a very simple yet very rich explanation for the central role played by the normal distribution, and the visuals are amazing as usual! The much more technical way I've always envisioned this is to say that the normal distribution is in some sense "the fixed point of the Fourier transform", and to see the Central Limit Theorem as some kind of "convergence to the fixed point" result through the Fourier transforms. I wonder if the rotational symmetry, which is the key property you use here, can be linked to this "fixed point of Fourier" thing?

    @mathemelo · 7 months ago
  • So great seeing this video finally come out just as I finished statistics

    @torkelholm6577 · 10 months ago
  • Binomials with the same p are stable under convolution, and Poisson distributions as well; the normal distribution is not unique in that regard. Even Cauchy distributions are stable, without having any moments or satisfying the CLT. If I had to pick an intuitive reason why the normal distribution shows up in the CLT, I enjoy the fact that the normal's cumulants are all zero from the third on, and that a standardized iid sum's cumulants hence all tend to those of the standard normal distribution whenever they exist. Also, not all standardized sums converge in distribution to a normal distribution; the limit can be a Gumbel distribution, for example.

    @XxRiseagainstfanxX · 10 months ago
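The Poisson part of this claim is quick to verify numerically: a Poisson(2) plus an independent Poisson(3) should be indistinguishable from a Poisson(5). A sketch with arbitrary rates and sample size:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Sum of independent Poisson variables with rates 2 and 3...
s = rng.poisson(2.0, n) + rng.poisson(3.0, n)
# ...versus a single Poisson with the summed rate 5.
p = rng.poisson(5.0, n)

# Compare the empirical probability mass functions on a few values.
for k in range(10):
    print(k, (s == k).mean(), (p == k).mean())
```

The two empirical PMF columns should agree to within Monte Carlo noise, and the sum's mean and variance should both be near 5, as a Poisson requires.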
  • WONDERFUL THANKS FOR INSPIRING AN ENTIRE GENERATION TO GET AND UNDERSTAND THE TRUE BEAUTY OF MATHEMATICS

    @r4fa3l59 · 10 months ago
  • I've been watching for a while now, Idk why I haven't subscribed till now, but I love your videos. I've always found it fascinating that there is an awesome maths channel with a logo that has relatively the same shape as one of my eyes :) (the brown spot is even in the right place too)

    @SaplingDatree · 9 months ago
  • My friend... I can't thank you enough for the "Essence of linear algebra" videos

    @nkkk6801 · 10 months ago
  • Was waiting for the continuation of the series ❤

    @kirilchi · 10 months ago
  • When you upload a video I feel happy, because I learn a new concept.

    @mathanimation7563 · 10 months ago
  • Humanity will always be grateful for your superbly amazing, impactful, and meaningful work. I'm confident your viewers are the best candidates to improve our entire world. It's inspiring to see how your efforts can enhance our understanding of the world and empower people to engage with sophisticated ideas. With your powerful content, you hold the impressive potential to inspire and educate countless individuals, fostering a deeper appreciation for math and its importance in our lives. Such efforts unquestionably play a crucial role in advancing our society as a whole. Thanks a million, Sir 3Blue1Brown. You are genuinely enhancing our world with the most insightful visual content currently available. Please continue for good.

    @sentinelaenow4576 · 10 months ago
  • Very very cool. Never learnt convolutions that way!

    @klam77 · 10 months ago
  • I can't wait to see what you're up to on the new channel. Take care!

    @Systox25 · 10 months ago
  • Thank you for the shoutout at the end! -Daksha

    @monku1521 · 10 months ago
  • "Love your videos" is an understatement. Speaking of distributions, any chance 3b1b fans can get a video on optimal transport??

    @shaiguitar · 9 months ago
  • "But what is the Fourier Transform? A visual introduction": In that video you showed that the "centre of weight" (hypotenuse max peak) reaches its peak on the right side, the x (real) axis, whenever the input sine wave frequency is the same as the rotating frequency. But that only happens if the input sine wave is in phase with the rotation frequency and the rotation starts exactly at x=0 and y=1 on the complex plane. ONLY then does the vector/hypotenuse max peak line up perfectly with the x axis. In reality we have to continuously plot the vector/hypotenuse on a separate graph to get the information we want, because on the complex plane the vector/hypotenuse max peak can point in any direction or fall in any quadrant, depending on the phase difference between the rotation and the input sine wave signal.

    @steffanjansenvanvuuren3257 · 8 months ago
  • This channel is one of the most popular channels in the field of advanced maths ❤❤

    @Vikrampratapmaurya · 10 months ago
  • After these videos on convolution, it would be cool to see you do a series on the convolution of filters, and a video on the complex-plane math used to design filters would be cool as well. I'm in that spot where I know the z-plane math works but I don't have a full intuition for why.

    @morgan0 · 10 months ago
  • No one believed that math could be soooooooo beautiful before your channel was created

    @satyakiguha415 · 10 months ago
  • Hey Grant, I've been a big follower of your videos. Could you please make a detailed series covering all the topics in combinatorics, statistics and probability?

    @dakshnarula8036 · a month ago
  • I've forgotten pretty much everything I learned in college, but one thing I kind of sort of remember is that one way to convolve two functions is to take their Laplace transforms, multiply them, and then invert the result. Convolution in the time domain is multiplication in the frequency domain, basically.

    @salchipapa5843 · 10 months ago
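That "multiply in the frequency domain" identity is the convolution theorem, and its discrete version is easy to verify with NumPy (the two input signals below are arbitrary):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([0.5, -1.0, 4.0, 2.0])

# Direct (linear) convolution.
direct = np.convolve(a, b)

# Same result via the FFT: zero-pad both signals to the full output
# length, multiply the transforms pointwise, and transform back.
n = len(a) + len(b) - 1
via_fft = np.fft.irfft(np.fft.rfft(a, n) * np.fft.rfft(b, n), n)

print(np.allclose(direct, via_fft))  # True
```

The zero-padding matters: without it, the FFT route computes a circular convolution rather than the linear one.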
  • Reminds me of the old days of programming digital image processing, where we used a speed-up trick of repeatedly applying a box function to approximate a Gaussian filter. It was really fast, and no floating-point math was required.

    @LiborTinka · 10 months ago
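The box-filter trick works because each pass is a convolution with a uniform kernel, so by the reasoning in this series the repeated result rapidly approaches a Gaussian. A rough sketch (the kernel width and number of passes are arbitrary):

```python
import numpy as np

# A normalized box (uniform) kernel.
box = np.ones(9) / 9

# Convolving the box with itself repeatedly: four box passes in total.
kernel = box
for _ in range(3):
    kernel = np.convolve(kernel, box)

# Compare against a Gaussian with the same mean and variance.
x = np.arange(len(kernel))
mean = (kernel * x).sum()
var = (kernel * (x - mean) ** 2).sum()
gauss = np.exp(-(x - mean) ** 2 / (2 * var))
gauss /= gauss.sum()
print(np.abs(kernel - gauss).max())  # already small after 4 passes
```

This is the same convergence the video's CLT discussion describes, just applied to a filter kernel instead of a probability distribution.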
  • This is a very elegant explanation of what makes the normal curve so special, but it still seems a little [puts on sunglasses] ...convoluted.

    @vigilantcosmicpenguin8721 · 10 months ago
  • I'd love to see a video on deconvolution, and its applications. One noteworthy one is basic processing of an image from a telescope. The aperture (typically circular) applies a convolution of a rectangle function to the incoming light. Convolving the resulting image with the inverse of the rect function will remove the distortions caused by the aperture. One strategy on smaller telescopes (especially using film instead of digital sensors) to avoid this is to put a filter on the aperture whose opacity follows a Gaussian, clearest in the center and darkest at the edge. This minimizes the distortions of the image coming through the telescope and avoids the need to process it afterward.

    @strehlow · 7 months ago
  • The animation at 7:17 about rotating your radius r to be perpendicularly aligned with the background x-y Cartesian grid is super. Once again the animation provides a very immediate, visual, and physically informed intuition: a feeling that if you rotate it to align with the grid, you'll preserve the area and simplify your computation. Just a small detail, but these animations are great; thank you very much!

    @christopherli7463 · 10 months ago
    • It's almost like the feeling in linear algebra when you change to a natural (eigen) basis to decouple your vectors/directions, and then the computation just proceeds orthogonally along the individual axes, not interfering with each other and making the computation much more literally straightforward. So, like, rotation for a better coordinate system. This was a cool video, thanks!

      @christopherli7463 · 10 months ago
  • Can't wait for the step 1 explanation. Because this is what I expected from the title.

    @pal181 · 10 months ago
    • The step 1 explanation is the thing that I have been waiting for from this series... The series has been making a point about distributions approaching a normal distribution, and then the finale (or I think this is supposed to be the finale) skips the whole reasoning as to why they approach it in the first place. I hope he will be making a video about it.

      @HoxTop · 10 months ago
    • @@HoxTop same

      @pal181 · 10 months ago
  • I love it sooooooo much!!! Can you please also do a video on Principal Component Analysis/Regression?

    @chiyosa7041 · 9 months ago
  • I have really loved math since the day I started watching your videos, not gonna lie!

    @ezxalidosman · 10 months ago
  • The entropy explanation is really interesting and makes a lot of sense. As far as I can tell, what it is saying is this: noticing that convolving many different distributions leads to a Gaussian distribution is the same as noticing that repeatedly sampling the microstate of an equilibrium (maximal-entropy) system, which is the same as sampling N independent atomic distributions (or approximately independent, or not, depending on your system), will, for large N, always correspond to the same value of a macrostate variable.

    @Reda-Ou · 10 months ago
  • No way, I was just wishing for a video about this from you like 2 weeks ago

    @EragonShadeslayer · 10 months ago
  • Love the videos! They have “re-sparked” my interest in math

    @Me-0063 · 10 months ago
  • Great video as always. Thanks a lot! Could you please make a video on manifolds, or Lie groups and Lie algebras?

    @majdwardeh3698 · 10 months ago
  • I haven't watched it yet, but thank you for this video; none of my university teachers ever explained this when I was studying probability!

    @otakultur5624 · 10 months ago
    • Because they never apply it.

      @raymondfrye5017 · 10 months ago
  • Impressive work

    @JulianCrypto · 9 months ago
  • Please also make a video on logistic regression - specifically how the sigmoid function implies probability. I think this would be an interesting topic! Thanks!!

    @TonyWangYQ · 10 months ago
  • Aw, you left the final answer to the question you posed a video ago as an exercise for the viewer. That would take me a day!

    @joshuascholar3220 · 3 months ago
  • Golly, I'd love a little on entropy and its application here. Almost essential, even.

    @alwayshere6956 · 10 months ago
  • The majority of your videos go over my head 😅 as I'm not a good student. But I come here and watch your every video because of your presentation. Thank you 😊

    @fuwadhasan7553 · 9 months ago
  • Very nice! Have you thought about making a video on the concentration of measure phenomenon in higher dimensions?

    @simplyshocked9908 · 3 months ago
  • That the convolution of two Gaussians is a Gaussian makes me think of some sort of metric (or pseudo-metric) space of integrable probability functions with finite variance, modded out by equivalence under linear transformations of the dependent/independent variables, and then a contraction mapping theorem on it. Then the CLT would be a sort of "global" contraction mapping theorem. Wonder if that's provable or even makes sense; gonna go tinker around!

    @jacoblojewski8729 · 10 months ago
  • At 5:44, being super-clear and specific: the properties that imply a 2D Gaussian are (i) being a function of x and y only through r, and (ii) independence, expressed as the functional equation g(r) = f(x)f(y). You mention independence earlier, and it's on the screen in the upper right, but I think it's worth emphasizing that it's essential to the derivation.

    @coreyyanofsky · 10 months ago
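For reference, a compact sketch of how that functional equation forces a Gaussian, assuming f is positive and differentiable (this is essentially the Herschel-Maxwell derivation linked in the description):

```latex
\text{Let } \varphi = \ln f, \text{ so } \varphi(x) + \varphi(y) = \ln g(r)
\text{ with } r = \sqrt{x^2 + y^2}.
\text{Differentiating in } x:\quad
\varphi'(x) = \frac{x}{r}\,(\ln g)'(r)
\;\Longrightarrow\;
\frac{\varphi'(x)}{x} = \frac{(\ln g)'(r)}{r}.
\text{The left side depends only on } x, \text{ while } r \text{ also varies with } y,
\text{ so both sides equal a constant } -2c:
\varphi'(x) = -2cx
\;\Longrightarrow\;
f(x) = A\,e^{-cx^2}.
```

Normalization and finite variance then pin down A > 0 and c > 0, giving the Gaussian family.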
  • Now this is what I call a series finale!

    @sherifffruitfly · 10 months ago
  • My statistics and calculus professors love your videos.

    @567kkd · 9 months ago
  • When I learned that the area under the Gaussian curve and Γ(1/2) are the same and equal to sqrt(π) I was blown away. It was like seeing an interesting cameo in my favourite movie.

    @mylonoceda · 10 months ago
  • Was waiting for this video for a long time

    @adwaitpandey2526 · 10 months ago
  • Here's a little idea that I figured out while thinking about catalysts in my high school chemistry class. There is a mysterious fact that's taught just for rote memorization in chemistry: catalysts lower the activation energy, but they don't shift equilibria. This has broadly been explained as: if catalysts could shift equilibria, then it would be possible to add and remove catalysts from a reaction chamber, shift the equilibrium back and forth, and essentially build a perpetual motion machine from which you could generate power. This fact was mysterious to me until I realized that the distribution of energies in molecules bouncing around a reaction chamber approaches the normal distribution. The amounts of the reactants and products are then determined only by the relative differences in energy and the temperature, not the ease of transition. This would not be true for any other distribution I can think of.

    @michalchik · 10 months ago
  • That's all great, but will you ever make a video about the Kalman filter?

    @paniczgodek · 9 months ago
  • Astounding quality, as always

    @jeanw4287 · 10 months ago
  • Please make a playlist of this topic!! I wasn't able to watch your videos for some time, for some reason, so it's all jumbled up!!

    @tanmayshukla7339 · 10 months ago
  • Freshman me would thank you a lot. "Why normal?" was the biggest unanswered question throughout my stats undergrad.

    @rayhanlahdji · 9 months ago
  • Grant was and is my source of inspiration to master mathematics, and to become linguistically accurate! One of my heroes ❤.

    @ShivamSharma-ob8ix · 10 months ago
  • FYI, there is a small typo at 9:10 in the challenge problem: "The transformatoin of the line..." Thank you for visualizing this connection!

    @lanog40 · 10 months ago
  • Clicked on the video knowing it would be over my head. Was not disappointed.

    @FloydMaxwell · 10 months ago
  • 3b1b upload, yessss!!! Gonna watch it later though

    @Djenzh · 10 months ago
  • A video on optimal transport would be great.

    @vigneshwarankannan4999 · 10 months ago
  • I did a whole stats degree and never fully got this. Thank you again, Grant!

    @lachlanperrier2851 · 10 months ago
  • More generally, linear transformations of Gaussian-distributed random vectors are also Gaussian random vectors. This is one of the main reasons why Kalman filtering works. BTW, convolution is also a bilinear transformation on L^p spaces.

    @musicarroll · 10 months ago
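A Monte Carlo spot-check of this fact (the matrix A, the dimensions, and the tolerances are arbitrary choices): applying a linear map to standard Gaussian vectors should give Gaussian vectors with covariance A Aᵀ.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Standard Gaussian random vectors in R^3.
X = rng.standard_normal((n, 3))

# An arbitrary linear map A: R^3 -> R^2, applied to every sample.
A = np.array([[1.0, 2.0, -1.0],
              [0.5, 0.0, 3.0]])
Y = X @ A.T

# Theory: Y is Gaussian with mean 0 and covariance A @ A.T.
print(np.cov(Y.T))   # should approximate A @ A.T
print(A @ A.T)
```

As a quick normality proxy, the first coordinate of Y (variance 6 for this A) should put about 68.27% of its mass within one standard deviation, matching a Gaussian.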
  • For graphing functions, do you use special software, or do you use Python and the relevant libraries?

    @ohaies1914 · 9 months ago
  • Am I the only one who does not find pleasure in statistical functions, and prefers topics that talk about deterministic functions and definite equations?

    @omargaber3122 · 10 months ago
    • This isn't Statistics, it's Probability. There are no random processes at all in the video, everything Grant talked about in this entire series is entirely deterministic.

      @rafaelschipiura9865 · 10 months ago
  • Please make a video like this about the Student's t-distribution!

    @henriqnuchoa · 8 months ago
  • Great vid as always

    @mahadlodhi · 10 months ago
  • Could you do a video on Sturm-Liouville theory and generalized Fourier series?

    @markgross9582 · 8 months ago
  • Your videos are really informative. Is there any way that I could dub it in my language and publish it on my channel?

    @stallinbhandari3019 · 9 months ago