Mastering Dynamic Programming - How to solve any interview problem (Part 1)
🎬 Mastering Dynamic Programming: An Introduction 🎬
Are you ready to unravel the secrets of dynamic programming? 🤔 Dive into the world of efficient problem-solving with this comprehensive introduction to dynamic programming. Whether you're a budding programmer or an algorithm aficionado, this video is your gateway to understanding the magic behind DP.
🔑 Key Takeaways:
📌 Demystifying the concept of dynamic programming.
📌 Understanding the core principles behind dynamic programming.
📌 Unleashing the power of recursion and memoization.
📌 Step-by-step breakdown of dynamic programming problem-solving.
Dynamic programming is like a puzzle-solving technique, and this video is your ultimate guide to fitting the pieces together. Get ready to elevate your coding skills and witness the art of optimization in action.
🚀 If you found this video helpful, don't forget to like, share, and subscribe for more tech tutorials!
Check out part 2: • Mastering Dynamic Prog...
🌐 SiteGround: the hosting solution I like (affiliate link): www.siteground.com/index.htm?...
📖 Introduction to Algorithms, one of the key books about algorithms (affiliate link): www.amazon.com/Introduction-A...
🔗 Connect with me:
Support me on patreon: / techwithnikola
LinkedIn: / nikola-stojiljkovic-67...
Join my discord: / discord
Visit my blog: techwithnikola.com
Follow me on Instagram: / techwithnikola
Follow me on Twitter: / techwithnikola
Timecodes
00:00 - Intro to DP
01:40 - Problem: Fibonacci
04:44 - Memoization
06:22 - Bottom-Up Approach
07:20 - Dependency order of subproblems
07:52 - Problem: Minimum Coins
13:39 - Problem: Coins - How Many Ways
15:22 - Problem: Maze
18:55 - Key Takeaways
Wow, “recursion + memoization” is the most eloquent way I’ve ever heard it defined. Total crystallizing moment for me, even though I’ve used dynamic programming solutions before.
Thank you for the comment. I'm glad that you've found that reasoning useful :)
How was that not obvious, given you’ve done it before?
@@Pseudo___ What are you expecting them to say? You're just insulting them while showing off your own ignorance about how learning works. Way to be a pointless buzzkill about someone's a-ha moment. Many explanations _don't_ reduce DP to such a simple, concise rule of thumb. So if a learner of DP didn't get lucky enough to benefit from one or more accessible explanations of the subject by educators, then _of course_ they will require time and experience before they'll be able to synthesize such an explanation for themselves. And by far, the most likely time for a simple explanation to finally "click" with any learner is _after_ they have already successfully developed a good _intuition_ for the subject. This person vocalized the joy of that final "click" - which is the glory of education itself. That's just awesome. It certainly does not warrant your unsolicited negativity.
@@tantalus_complex nice comment
I think the main problem is that recursion is too slow. Tabulation combines memoization with plain iteration very well, and although it runs in the same time complexity as recursive + memo solutions, it generally takes less time and memory, since recursive calls are expensive both time-wise and in stack memory.
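The tabulation idea described above can be sketched for Fibonacci (a minimal illustration, assuming the 1-indexed convention F(1) = F(2) = 1; the function name is mine, not from the video):

```python
def fib_tab(n: int) -> int:
    # Bottom-up table: each entry depends only on the two before it,
    # so no recursive calls (and no call-stack overhead) are needed.
    if n <= 2:
        return 1
    fib = [0] * (n + 1)
    fib[1] = fib[2] = 1
    for i in range(3, n + 1):
        fib[i] = fib[i - 1] + fib[i - 2]
    return fib[n]
```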
This has to be one of the best dynamic programming videos out there. Props! Something that I feel could have been mentioned though is that a lot of the times it is also useful to reframe the problem which can make the solution a lot more intuitive as well. Whilst I understand that the point of the last maze problem is to teach DP effectively (it is a good and intuitive example for how a bigger problem can be reframed in terms of sub problems!), a useful observation is that the distance which the rabbit needs to move in an N×M grid is always going to be N-1 rights and M-1 downs. The number of ways is just the total possible arrangements of that many rights and downs, which is given by (N+M-2)! / [(N-1)!(M-1)!]. DP is thus not required for the last problem at all!
Thanks a lot for the comment. Yeah, you are totally right. I was having back-and-forth thoughts about whether I should mention this, and decided against it because my focus was on explaining the intuition behind DP. Similarly, it is possible to compute N-th Fibonacci number in O(log(N)) time complexity and O(1) memory complexity. In retrospect, I should have mentioned both without affecting the explanation of the DP approach. I will remember this for future videos. Thanks again for the feedback. These types of comments are very useful.
I guess the maze example could be made interesting by adding obstacles, such that counting the solutions without enumerating them becomes a lot harder.
Can you explain where you got this formula?
@@saveriov.p.7725 The number of combinations of r unordered elements out of a total of n is: n! / (r! * (n-r)!)
I think I understand it after rewatching at least 20 times. Putting that knowledge into code is a whole other story. I'm going to search for some more dynamic programming problems
I think that’s normal if you really want to understand it. I think it took me many many attempts as well, and the only way to truly get good at it is to solve many problems.
Can I ask something
@@DrZainabYassin You don't need permission to ask something. Just go ahead and ask!
Here are some classic dynamic programming problems.

Part I:
- Fibonacci Numbers. Problem: compute the nth Fibonacci number. Approach: use memoization or bottom-up tabulation to store and reuse previously computed Fibonacci numbers.
- Longest Common Subsequence (LCS). Problem: given two sequences, find the length of the longest subsequence present in both of them. Approach: build a table storing the lengths of the LCS for different subproblems.
- Longest Increasing Subsequence (LIS). Problem: given an unsorted array of integers, find the length of the longest increasing subsequence. Approach: build a table storing the lengths of the LIS for different subproblems.
- Knapsack Problem. Problem: given a set of items, each with a weight and a value, determine the maximum value obtainable by selecting a subset of items whose total weight does not exceed a given limit. Approach: create a table storing the maximum value for different subproblems.
- Coin Change Problem. Problem: given a set of coin denominations and a total amount, find the number of ways to make the amount using any combination of coins. Approach: build a table storing the number of ways to make change for different subproblems.
- Edit Distance. Problem: given two strings, find the minimum number of operations (insertion, deletion, and substitution) required to convert one string into the other. Approach: build a table storing the minimum edit distance for different subproblems.
- Matrix Chain Multiplication. Problem: given a sequence of matrices, find the most efficient way to multiply them. Approach: build a table storing the minimum number of scalar multiplications needed for different subproblems.
- Subset Sum. Problem: given a set of non-negative integers, determine if there is a subset that sums to a given target. Approach: build a table storing whether a subset sum is possible for different subproblems.
- Rod Cutting Problem. Problem: given a rod of length n and a table of prices for rod pieces of various lengths, find the maximum value obtainable by cutting the rod and selling the pieces. Approach: build a table storing the maximum value for different subproblems.
- Maximum Subarray Sum. Problem: given an array of integers, find the contiguous subarray with the largest sum. Approach: keep track of the maximum subarray sum ending at each position in the array.

Part II:
- Palindrome Partitioning. Problem: given a string, partition it into palindromic substrings using as few cuts as possible. Approach: build a table storing the minimum number of cuts needed to partition substrings into palindromes.
- Word Break Problem. Problem: given a string and a dictionary of words, determine if the string can be segmented into a space-separated sequence of dictionary words. Approach: build a table storing whether a substring can be segmented using the given dictionary.
- Longest Palindromic Substring. Problem: given a string, find the longest palindromic substring. Approach: build a table storing whether substrings are palindromic.
- Count Distinct Subsequences. Problem: given a string, count the number of distinct subsequences of it. Approach: build a table storing the count of distinct subsequences for different subproblems.
- Maximum Sum Increasing Subsequence. Problem: given an array of integers, find the maximum sum of an increasing subsequence. Approach: build a table storing the maximum sum of increasing subsequences for different subproblems.
- Largest Sum Rectangle in a 2D Matrix. Problem: given a 2D matrix of integers, find the largest sum rectangle. Approach: reduce the problem to finding the largest sum subarray in each column.
- Egg Dropping Problem. Problem: you are given k eggs and a building with n floors; find the minimum number of drops needed to determine the critical floor from which eggs start to break. Approach: build a table storing the minimum number of drops for different subproblems.
- Counting Paths in a Grid. Problem: given a grid, find the number of unique paths from the top-left corner to the bottom-right corner. Approach: build a table storing the number of paths for different positions in the grid.
- Wildcard Pattern Matching. Problem: given a text and a wildcard pattern, implement wildcard pattern matching with '*' and '?'. Approach: build a table storing whether substrings match for different subproblems.
- Minimum Cost Path in a Matrix. Problem: given a 2D matrix with non-negative integers, find the minimum cost path from the top-left corner to the bottom-right corner. Approach: build a table storing the minimum cost for different subproblems.

Part III:
- Distinct Paths in a Grid. Problem: given an m x n grid, find the number of unique paths from the top-left corner to the bottom-right corner, where movement is allowed only down or to the right. Approach: build a table storing the number of unique paths for different positions in the grid.
- Count Palindromic Subsequences. Problem: given a string, count the number of palindromic subsequences. Approach: build a table storing the count of palindromic subsequences for different subproblems.
- Maximum Length Chain of Pairs. Problem: given pairs of integers, find the length of the longest chain of pairs such that each pair's second element is smaller than the next pair's first element. Approach: sort the pairs and apply dynamic programming to find the longest chain.
- Longest Bitonic Subsequence. Problem: given an array of integers, find the length of the longest bitonic subsequence (one that first increases and then decreases). Approach: build tables storing the lengths of increasing and decreasing subsequences for different subproblems.
- Partition Equal Subset Sum. Problem: given an array of positive integers, determine if it can be partitioned into two subsets with equal sums. Approach: build a table storing whether a subset with a particular sum is possible.
- Maximum Product Subarray. Problem: given an array of integers, find the contiguous subarray with the largest product. Approach: keep track of both the maximum and minimum product subarrays at each position.
- Decode Ways. Problem: a message containing letters from A-Z can be encoded into numbers; given a string, determine the number of ways it can be decoded. Approach: build a table storing the number of ways to decode substrings for different subproblems.
- Shortest Common Supersequence. Problem: given two strings, find the shortest string that has both strings as subsequences. Approach: build a table storing the length of the shortest common supersequence for different subproblems.
- Maximum Profit in Stock Market. Problem: given an array representing stock prices on different days, find the maximum profit obtainable by buying and selling stocks. Approach: keep track of the minimum stock price and maximum profit at each position.
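As a concrete instance of the "build a table" pattern most of these entries describe, here is a minimal sketch of the LCS length computation (the function name is illustrative):

```python
def lcs_length(a: str, b: str) -> int:
    # table[i][j] = length of the LCS of prefixes a[:i] and b[:j]
    m, n = len(a), len(b)
    table = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                # Matching characters extend the LCS of the shorter prefixes.
                table[i][j] = table[i - 1][j - 1] + 1
            else:
                # Otherwise drop the last character from one of the strings.
                table[i][j] = max(table[i - 1][j], table[i][j - 1])
    return table[m][n]
```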
Huge respect for your perseverance. Learning complex topics is frustrating; just put in the hours, be calm, and watch everything again.
All the great performers I have worked with are fuelled by a personal dream.
Dynamic programming -> Don't do what is already done
Thank you for creating videos like this. I've been in the industry for a couple of decades now and am still learning new things and approaches. More power to you, sir, and looking forward to more videos.
@@aqfj5zy what are you waffling about, blud.
Really cool. Found this channel through the git video and started watching all the other videos. Really enjoying them so far, so thanks for making them
I'm so glad that you're enjoying my videos. Thanks for taking the time to leave a comment.
Appreciate your video! I'm decent at algorithmic problems, and knew all of the basic dynamic problems you mentioned in this video, but I liked your emphasis on first writing a brute force solution and then optimizing it. I was a bit confused by your choice to always use bottom-up approach, because in my opinion bottom-up can sometimes be difficult to write as it can be hard to determine which states to update first. I've always found the recursive approach more intuitive, but maybe it's just me. Looking forward to part 2! Hope you can help me find some intuition to solve harder DP problems as those are still pretty difficult for me.
Thank you for the comment. Always using the bottom-up approach is a personal preference. The main advantage is that it's easier to reuse the memory of memoized solutions when they are no longer required. For example, to solve Fibonacci with DP, we don't have to use O(N) memory. Instead, we can store only the last two values. This is very easy to do with the bottom-up approach, but it's not obvious to me how to do it with the recursive one. Hopefully that makes sense. You are right that the recursive approach is more intuitive, and I think it's a perfectly fine approach in most cases.
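The "store only the last two values" idea is easy to sketch (assuming the 0-indexed convention F(0) = 0, F(1) = 1; the function name is mine):

```python
def fib_last_two(n: int) -> int:
    # Only the last two Fibonacci values are kept, so memory is O(1)
    # instead of the O(N) needed by a full memo table.
    a, b = 0, 1  # F(0), F(1)
    for _ in range(n):
        a, b = b, a + b
    return a
```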
Same, buddy. I too found the recursive approach more intuitive, but looping solutions generally perform better than recursion, since recursion uses a stack behind the scenes.
Hey, Nikola! This was a really useful video, as we're currently going through DP on my Master's. It's such a difficult paradigm to get used to, but your video really helped understanding the paradigm, and hopefully it'll help my future self learning the DP paradigm :) Also, any idea when the next part is coming out? :D
Hello Mikkel, I'm glad you've enjoyed it. Indeed, it takes time to get used to the paradigm, but it will happen - just keep practicing. I don't have an exact date for part 2, but a possible timeline is in a month or two. Do you have any suggestions on what you'd like to see in part 2?
I would like to know whether DP is restricted to counting and shortest path problems. I would also like to know how one can solve for an optimal order, if that makes sense? 😊
@@TechWithNikola I noticed you didn't do much recursion here, and I'd like to see recursion done using the DP paradigm (I noticed another comment saying something along the same lines). Other than that, no, I don't have any specific topics as of now that I'd like to see. I might post a new comment as my Master's course progresses, but for now recursion done with DP is the only thing :)
I remember solving the knapsack problem in C++ back when I was studying algorithms at the University. I found it so hard at first, but once it clicks it’s awesome
Indeed. I’ve had the same experience a long time ago. I think knapsack was one of the first problems I learned about.
I don't know how, but your video arrived as timely as ever! I am a student and I was given a recursion problem. It may not have been necessary, but I'm sure glad that I was able to optimize the recursion (which was so much slower than the iterative approach that already for the 50th element you had to wait more than 30 seconds, while the iterative approach managed in less than a second).
Thank you for the comment. I’m glad you’ve found it useful and that the timing was perfect! 😀
Really looking forward to the 2nd part!
Dynamic programming is recursion + memoization. Thank you so much, I will never forget that connection; it made the concept click for me.
You’re welcome! Glad you’ve found it useful. That’s the key realization, and interestingly it took me a long time before I started phrasing it like that.
This video is epic. The way everything is explained makes it very easy to understand. The audio is super well recorded. The visuals are well done. If that's not enough, he explains the solution in code in both a recursive and not recursive fashion. I don't use recursion much, and I didn't really understand how to switch between recursive and non-recursive, and now I understand that better as a bonus. Instant subscribe. I hope you make lots of videos.
Thanks a lot for the comment. I'm very happy when I hear from people that find it useful and informative. I will definitely make lots more videos, and I hope I'll have more time in the future to upload more regularly.
Great video Nikola. I would love to see more of these in future. Once again, Thank you for this 😊
Thank you 🙂 You’re welcome. Looking forward to hearing your thoughts on my future videos.
I knew the examples but didn't know what dynamic programming was until watching this nice explanation. It is like connecting broken pieces into a large one. Looking forward to the next part.
Yeah, it's exactly like connecting broken pieces into a large one. I'm glad you've found the video useful :)
This is SUCH a good video. Thanks so much! I did the minimum coins problem in Nushell to try to get the concept to stick; it was fun.
Thanks a lot. I’m glad you liked it! 😀
what a blessing of a channel, thanks!
You’re welcome!
Watching your videos truly helped me better understand this for my final. Thanks brother, here’s hoping I can graduate
really crystal clear explanation on dynamic programming! came here after your git video. Subscribed!!
Thanks a lot. I’m very happy to hear that you’ve liked both videos!
Hope this channel grows a lot. I'm on my first steps in coding, but I know it's gold.
Thank you! Consider sharing this channel with your friends. It would help with the growth a lot 😊
This was extremely useful and is one of the better dynamic programming explanations I've seen.
Thank you. I'm glad you've liked it!
Tech with Nikola = Tequila 🍸🎉
Fascinating, looking forward to future dynamic programming videos.
Thanks!
I would love to see more such quality content on YT ❤❤
Thank you for the kind words! ❤️
Very good explanation! keep it the good work and I love your videos!
Thank you! This means a lot to me.
Thanks so much. Will appreciate a part 2. I just subscribed!
Welcome aboard! Next video will be dynamic programming part 2. Apologies for the delay, I've been sick and didn't have enough time to make a new video.
Loved your content, need more of these :)
Thanks :) More to come!
Thanks, I have recently started learning DP. Your illustrations helped me.
You’re welcome. I’m glad to hear that.
I appreciate this content so much. I personally find recursive functions less intuitive and more of a headache because I constantly have to keep track of what I just did. If I lose it? then yea I have to start over. Maybe there's a smarter way to approach them, but I haven't come across any intuitive explanations for the past 3 years I've been into coding. Your bottom-up approach makes so much more sense though. I just love how we are storing the subproblems for later use. It makes accessing previous information very intuitive and less of a headache for me. I can see the pattern there, and I believe I could apply it to other problems too if I could frame them right. Thank you so much!
Hey, thank you for the comment. Yeah, recursion can be tricky but it becomes easy over time. I will consider making some videos about recursion, and try to share my thought process. Bottom-up approach is a good way to improve, but be careful as it can’t replace recursion in many non-DP problems.
Please make part 2!! This video was amazing. It helped me understand dynamic programming.
Thank you for taking the time to comment. I'm glad to hear that you've found it useful. I have just published part 2: kzhead.info/sun/palumJVpfamfia8/bejne.html Looking forward to hearing what you think about it. I've tried to keep the same style, but I was moving quicker this time, and I don't know if that's appropriate - any feedback is welcome.
Hey, I almost never comment, but your video is really amazing. After watching it and refreshing my brain about DP, I managed to solve a new medium DP question on Leetcode without help!! (DP is one of my weaknesses lol) I’m looking forward to watching your future video on harder DP questions!
Hey, thanks a lot for taking the time to comment. I’m glad you’ve found it useful!
Which Leetcode problem was it?
clearly explained. Thanks!
Thank you!
Thank you very much! It helped me understand basic concepts of DP a lot! You're a very cool guy.
I’m glad to hear that. Thank you for the kind words
Amazing video! hope to see more videos of yours
Thank you! Yeah, I’ll be making more videos.
Thanks for the video that was very well explained especially coin change problem animation 👍👍
Thanks. I’m so glad you’ve liked it! 😀
Very nice and clear video. Thank you!
Thank you!
Whenever a technical concept is being explained by an eastern european accent, you know you just struck gold.
😂 thanks!
Really good explanation. Thank you very much!
Thank you! I'm glad you've enjoyed it.
Without this video I wouldn't have gotten the power of both recursion and memorization. Thanks
That’s great to hear. You’re welcome!
Memoization you mean?
great video, very precise , concise and explanatory waiting for more, keep it up!!!
Thank you!
Wonderful video! So well explained! I had a weird distorted view of what dynamic programming meant before this. While thinking about the maze problem it occurred to me that it's really a 2D Fibonacci series. Also, only a 1D array is needed, reducing memory complexity and cache usage. You can even keep a running total for each column, initialized to 1, and forgo the initial setup (except for zeroing the array memory), halving the memory reads. In C it looks like:

```c
#include <string.h> /* for memset */

int paths(int n, int m) {
    int t;
    if (m < n) { t = m; m = n; n = t; } /* n (rows) is the smaller dimension */
    --n; /* reduce n by one since the first row is handled by t = 1 below */
    int memo[n];
    memset(memo, 0, sizeof memo);
    for (int i = 0; i < m; ++i) {     /* column counter */
        t = 1;                        /* first row is always 1 */
        for (int j = 0; j < n; ++j) { /* row counter */
            t = (memo[j] += t);       /* do the work */
        }
    }
    return t; /* we know what the final value was already */
}
```

And an asm version (Linux calling convention):

```asm
paths:
    cmp     rdi, rsi
    cmovlt  rax, rsi
    cmovlt  rsi, rdi
    cmovlt  rdi, rax
    dec     rsi
    mov     rcx, rsi
    mov     rdx, rsp        ;; use stack space directly for the array
    xor     rax, rax
zloop:
    mov     [rdx], rax      ;; clear the array memory (using int64s here)
    sub     rdx, 8
    sub     rcx, 1
    jnz     zloop
iloop:
    mov     rax, 1
    mov     rcx, rsi
    mov     rdx, rsp        ;; reset running total, array pointer and row count each column
jloop:
    add     rax, [rdx]      ;; do the work
    mov     [rdx], rax
    sub     rdx, 8          ;; iterate
    sub     rcx, 1
    jnz     jloop           ;; inner loop should execute once per clock cycle
    sub     rdi, 1
    jnz     iloop
    ret                     ;; result is already in rax, no registers needed to be spilled to stack
```
Thank you! I’m so glad to hear that this video cleared some things up. Your observations are correct. This is why I like bottom up approach more because it allows you to do these kinds of optimizations. I like the ASM solution 😀
I've been looking all over YouTube for a video like this until I finally found yours. Thank you, buddy. Keep it up ♥
You're welcome :) Glad to hear you've enjoyed it.
Unbelievably well done video👍👍 you deserve many more subscribers!!
Thanks a lot! Hopefully one day 😀
Great explanation and visuals! I got very high hopes for this channel.
Thank you!
best dynamic programming video i have ever seen
Thanks!
Fun fact, for the maze problem, you can also solve this with combinatorics. The key to note is that no matter the path, the number of right moves and down moves end up being the exact same. For the 18 x 6 grid example, you'd need to move 17 down, 5 right in total. Therefore, for 22 total moves, we pick 17 spaces for the down moves and leave the remaining for the right moves. C(22, 17) = 26334 This is equivalent to picking 5 spaces for the right moves and leaving the remaining for the down moves. C(22, 5) = 26334
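The counts above can be sanity-checked with Python's built-in math.comb:

```python
import math

# 22 moves total: choose which 17 are "down" (equivalently, which 5 are "right").
print(math.comb(22, 17))  # 26334
print(math.comb(22, 5))   # 26334
```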
This is masterpiece, we want more content like this.
Thanks, glad you've enjoyed it :)
You are a gifted teacher!!! Great video ... please keep doing what you are doing.👍👍👍
Thanks a lot for the kind words. I really hope I’ll get more free time to make videos soon.
Great video, the only thing I might recommend is adding a diagram explaining how `for i in range(1, m +1):` builds up the memo dict from the bottom up.
Thanks. Yeah, that would have been useful.
I cannot wait for the second part :D
I'm glad :) Would you prefer to see more advanced DP problems for part 2? The alternative is to explain how to construct the solution path. For example, by path I mean which coins we should choose to get to sum 13. Right now, our solution says "you need 3 coins", but it would be nice to say "you need 3 coins: 4, 4, 5". I want to do both eventually; it's just a matter of choosing which one to do first.
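For readers curious what that reconstruction could look like, here is a hypothetical sketch (not the video's code) that records which coin was last used for each sum, then walks the choices backwards:

```python
def min_coins_with_path(coins, amount):
    # best[i] = fewest coins needed to reach sum i; choice[i] = last coin used.
    INF = float("inf")
    best = [0] + [INF] * amount
    choice = [0] * (amount + 1)
    for i in range(1, amount + 1):
        for c in coins:
            if c <= i and best[i - c] + 1 < best[i]:
                best[i] = best[i - c] + 1
                choice[i] = c
    # Walk the recorded choices back from `amount` to recover the actual coins.
    path, i = [], amount
    while i > 0:
        path.append(choice[i])
        i -= choice[i]
    return best[amount], path
```

For coins [1, 4, 5] and sum 13 this reports 3 coins together with a concrete choice such as [4, 4, 5].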
Great video! starting my DP journey from this one.
Thank you! Good luck. Feel free to ask here if you need any help :)
Be thankful when you don't know something for it gives you the opportunity to learn.
You hit this one out of the park... please keep teaching...
Glad you’ve liked it. I will. Comments like yours always remind me that it’s worth making more videos
Thank you for this useful and comprehensive way to explain the concept of dynamic programming. That's a really good video; I watched it many times.
You're welcome Kevin. Thank you for taking the time to comment. I really appreciate it!
Great video!!!
Thank you
outstanding! 🙂
Thank you! 😀
Excellent video
Thank you
Great video, thanks 👍
Glad you enjoyed it
Your voice is really great❤
Thanks! ❤️
Another solution for Fibonacci is the "sliding window", where you work your way up, similar to the bottom-up approach. Example in C:

```c
size_t fib(size_t n) {
    size_t high = 1, low = 1, tmp;
    for (size_t i = 2; i < n; ++i) {
        /* add */
        low = high + low;
        /* swap */
        tmp = high;
        high = low;
        low = tmp;
    }
    return high;
}
```

Edit: I am not sure this is called dynamic programming.
Yup, this would work and use O(1) memory. This is the main advantage of the bottom-up approach because it makes it easy to reuse memory of memoized solutions that we don't need anymore.
Gold. Can't wait for the second part
Thanks. What would you like to see in part 2? Some options:
- How to reconstruct the path that leads to the optimal solution (e.g. which coins to choose for the minimum number of coins)
- More advanced DP problems
- Anything else
More advanced problems would be awesome. For example, I was trying to solve a problem where you need to find the shortest path from one corner of a grid to another. Some cells have walls and you can only cross one of them (or k in the general case). Also, you can only move up, down, left or right. Your video helped me a lot trying to solve this using DP, but I'm still having trouble, especially trying to think through the bottom-up approach.
@@nicolaslupi3111 Thanks for the suggestion. I’m glad the video helped. I’ll think of some advanced problems to cover in the next video. I may continue this series one problem per part from now on, to get videos out sooner.
Great explanation, thanks a lot for sharing. For us educators this is at least inspirational.
Thank you the comment. I’m glad you like the content 😀
very easy to follow and entertaining!!
Thank you!
Very nicely presented and animated! 👍 What tool have you used to create it?
Thank you. This specific video uses combination of powerpoint and keynote. For other videos I’ve also used Manim library.
Great stuff and well explained.
Thank you
Its really helpful. Thanks.
You’re welcome! I’m glad you’ve found it useful.
simple and easy to understand
Thank you :)
Very nice video
Thank you!
Great educational video, thank you!
You’re welcome! Glad you liked it.
nice vid, keep going!
Thank you!
❤❤❤ I love your work sir
Thank you ❤️
amazing explanation!
Thanks. Glad you liked it!
Thank you
Awesome explanation, thank you very much
Thank you. Glad you’ve enjoyed it!
YOU ARE AMAZING THANK YOU
Thank you for the kind words! It makes me very happy when I hear that people enjoy my videos.
Very good informative video. Thank you. The third problem can also be solved analytically as (x+y-2) choose (x-1).
You're welcome, and thank you for the comment. Yup, the third problem can be solved with simple combinatorics. It's still useful to learn DP on that problem, because it can easily change to e.g. include obstacles on the board, and then the combinatorics suddenly no longer work. :)
There's a slight issue with the "Coins: How Many Ways" problem. The given solution does not account for duplicates such as 1+1+2 and 1+2+1 (which together should count as 1 possible solution instead of 2). The example 5, [1,4,5] should have an output of 3:

1+1+1+1+1
1+4
5

But instead it is 4, because it counts 4+1 as a separate solution from 1+4. An iterative solution that handles this would be, for example:

```python
from collections import defaultdict

def count_ways(coins, amount):
    memo = defaultdict(int)
    memo[0] = 1
    for coin in coins:
        for i in range(coin, amount + 1):
            memo[i] += memo[i - coin]
    return memo[amount]
```
Hello, thanks for the comment. > The given solution does not account for duplicates such as 1+1+2 and 1+2+1 (which together should count as 1 possible solution instead of 2) Why should this count as one solution? Note that the problem definition explicitly says that these should count as 2 solutions, and 1+4 and 4+1 are explicitly called out as an example (see example at 14:04).
@@TechWithNikola Oh, you're right. I guess I just completely ignored that, lol. If order matters, then the given solution works.
The problem does become interesting if duplicates were to be excluded though.
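For contrast, the order-matters count used by the video's definition differs from the combinations count only in loop order, iterating amounts in the outer loop (a sketch; the function name is illustrative):

```python
def count_ordered_ways(coins, amount):
    # Amounts outer, coins inner: every sum may end in any coin,
    # so 1+4 and 4+1 are counted as two different ways.
    ways = [1] + [0] * amount
    for i in range(1, amount + 1):
        for c in coins:
            if c <= i:
                ways[i] += ways[i - c]
    return ways[amount]

print(count_ordered_ways([1, 4, 5], 5))  # 4 (1+1+1+1+1, 1+4, 4+1, 5)
```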
Top notch stuff
Thanks
Dan took the deep dive down the rabbit hole.
well put together
Thanks!
Is there syntax highlighting like that for C? I really like it.
Part 2 please!
Part 2 is here: kzhead.info/sun/palumJVpfamfia8/bejne.htmlsi=_dT9c6EOSpU_xa4A It's not as popular as part 1, but I do hope that it matches your expectations. Let me know! Any feedback is welcome.
Messing with names to bypass red tape never gets old.
Amazing. I think everyone should check this out first before starting to learn DSA.
Thank you!
Hello, incredible videos! They have very good effects on graphics and charts. What program do you make them with?
Hi, apologies for the late response. Thanks. I use a combination of:
- Powerpoint
- Keynote
- Manim
- Adobe Premiere
Absolutely beautiful
Thank you!
Is it possible for you to share which software you have used for making these cool ppt with animation?
Of course. I've used mostly Microsoft Powerpoint and Apple's Keynote for this video. With that said, I've struggled a lot to make some animations (e.g. coin moves) using powerpoint. I managed, but it was very hacky and not something I'd like to go through again. I'm still exploring alternatives for motion graphics. I've tried Manim python library for other videos, but some simple animations are harder there. I'll likely end up using the combination of these tools for my future videos.
hello, I loved your video and I would like to know how you do the animations of the arrangements and such, what kind of tools and/or technologies do you use?
Hi, glad to hear that. I use the following: - Keynote - Powerpoint - Manim - Adobe Premiere
🥺😭🤧 Finally!!! I found it... Subscribed!
Thank you!
Excellent video any link to the part 2 ?
Thanks. Not yet, but I’ll let you know once it’s ready. I plan to work on it in a few weeks
Amazing video, please can you tell me which tools do you use to make this animation and slides?
Thanks. I’ve used a combination of PowerPoint, Keynote and Manim for all of my videos, then I’d do further video editing with Adobe Premiere.
How do you define a dynamic array in C++? Can you get the same constant run-time on a function that reads/writes from a file?
Thank you for this awesome content. So, in a nutshell, by saying "dynamic programming", does that mean benefiting from computer storage or memory capacity to solve hard problems (NP-hard) that are too complex to solve in terms of time?
You're welcome! You could maybe look at it like that, but I wouldn't use that exact phrasing to describe DP. I think the general idea when we say DP is "solve small independent problems and use them to solve bigger problems". You're right that we are using memory to store the solution to each sub-problem, and we have to (with the exception of solutions that are no longer required).
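To make that idea concrete, here's a minimal bottom-up Fibonacci sketch in the spirit of the video: each bigger problem is built from smaller ones already solved, and since fib(i) depends only on the previous two values, we can even discard older solutions:

```python
def fib(n):
    # Solve the smallest subproblems first, then combine them.
    # fib(i) depends only on fib(i-1) and fib(i-2), so only the
    # last two solutions need to stay in memory.
    prev, curr = 0, 1
    for _ in range(n):
        prev, curr = curr, prev + curr
    return prev
```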
Nice video! Where did you get the images of the coins?
Thanks. I’ve got them from freepik. You can probably find similar images on other websites too, but I used freepik since I already have a subscription.
I love how you could just solve the maze one using combinations but you did it recursively anyway:
import math
def grid_paths(m, n):
    return math.factorial(m+n-2) // (math.factorial(m-1) * math.factorial(n-1))
Correct. The intent was to show ideas behind dynamic programming, though. Other problems can also be solved differently, and in better time complexity - for example, Fibonacci can be solved in O(lg N). One thing to keep in mind is that the DP approach can easily be extended to solve the maze problem with obstacles or other constraints.
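For the curious, here is a sketch of the O(lg N) Fibonacci mentioned above, using the standard fast-doubling identities (this technique is not covered in the video):

```python
def fib_fast(n):
    # Fast-doubling identities:
    #   F(2k)   = F(k) * (2*F(k+1) - F(k))
    #   F(2k+1) = F(k)^2 + F(k+1)^2
    # Halving n at each step gives O(lg N) arithmetic operations.
    def helper(k):
        if k == 0:
            return (0, 1)  # (F(0), F(1))
        a, b = helper(k // 2)       # (F(m), F(m+1)) for m = k // 2
        c = a * (2 * b - a)         # F(2m)
        d = a * a + b * b           # F(2m+1)
        if k % 2 == 0:
            return (c, d)
        return (d, c + d)
    return helper(n)[0]
```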
I'm curious to know whether bottom up approach for minimum coins problem ended up calculating some intermediate values that were never used
It's possible to construct examples where a recursive approach may perform fewer computations, e.g. if the coin amounts are massive and correlated in a way that many sums can't be produced. I think it's not worth worrying about this if the common case requires us to compute all solutions anyway.
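For illustration, a top-down memoized sketch of the minimum-coins problem (my own function names; `-1` signals an unreachable amount). Unlike the bottom-up table, it only ever computes amounts actually reachable from the input by subtracting coins:

```python
import functools

def min_coins(amount, coins):
    # Top-down memoization: only subproblems reachable from `amount`
    # are computed, which is where recursion can do less work than
    # filling a full bottom-up table.
    @functools.lru_cache(maxsize=None)
    def solve(rest):
        if rest == 0:
            return 0
        best = float('inf')
        for coin in coins:
            if coin <= rest:
                best = min(best, 1 + solve(rest - coin))
        return best
    result = solve(amount)
    return -1 if result == float('inf') else result
```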
DP all the way
There's one insight about bottom-up vs. top-down dynamic programming which I've had recently. To be concrete, let's say the top-down solution builds a memoization table by a recursive function first checking whether the answer to the current call is memoized, then computing and storing it if not. The computation is done by the function recursively calling itself with smaller arguments. On the other hand, a bottom-up solution builds the same table by calling the function with the smallest arguments first, then larger ones, in such a way that the function is never called twice with the same arguments, and when the function is called the table never has the answer already (for the arguments of that call). I would generally expect the bottom-up solution to be faster: the overhead of checking whether the table is full has been removed, as have the redundant memoized recursive calls; they are replaced by an unconditional table lookup. The major exception is if the top-down solution can fill out the table sparsely: then it can skip the work needed to fill out unnecessary table entries, and automatically does so. [But if that's possible, can't the bottom-up approach do the same, after a bit of thinking?]
Great summary. I agree with all your points. > But if that's possible, can't the bottom-up approach do the same, after a bit of thinking? It's difficult to track which problem to solve next with the bottom-up approach. We need a topological sort of the problems, which may be simple for some problems, but hard to generalize. If the problem space is sparse, then there's a good chance that we can solve the problem without DP.
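To make the contrast concrete, here's a top-down sketch of the maze problem from the video: the recursion plus the cache discovers the dependency order on its own, with no explicit topological sort (the `lru_cache` check plays the role of "is this answer memoized?"):

```python
import functools

def maze_paths(rows, cols):
    # Top-down: recursion walks the out-edges of each subproblem and
    # the cache check replaces any explicit topological ordering.
    @functools.lru_cache(maxsize=None)
    def paths(r, c):
        if r == 0 or c == 0:
            return 1  # only one way along an edge of the grid
        return paths(r - 1, c) + paths(r, c - 1)
    return paths(rows - 1, cols - 1)
```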
@@TechWithNikola > Great summary Thanks. > It's difficult to track which problem to solve next with the bottom-up approach. We need a Topological sort [...]. Yes, that's a good point! The bottom-up approach walks the argument/subproblem space in topological order, from sink-most to source-most. The top-down approach simply walks all the out-edges from the current grid cell. Checking for cached results enables us to ignore the topological ordering and discover enough information about it on the go. > If the problem space is sparse, then there's a good chance that we can solve the problem without DP. Hm, interesting. This is not obvious to me. Can you illustrate this principle with a few examples?
@@jonaskoelker > Can you illustrate this principle with a few examples? I'm just saying this from intuition, so take it with a grain of salt. I can't provide an example to generalize my statement because every problem is unique. The main reason is that I find it hard to imagine an interesting DP-like problem with the sparse solution space. It would be easier to think about this if you can find a problem, then we can try to solve it and maybe draw some conclusions. I attempted to phrase some of these problems so that the solution space is sparse, but I failed to come up with anything interesting. E.g. the minimum coin problem with sparse solution space would be something like "coin values are very large". Then instead of iterating over each solution with a for loop, we would compute possible solutions and add them to the queue (like a BFS algorithm maybe?). But this is just a direct way of coming up with the topological ordering that is specific to the problem. Sorry but I don't have a better answer at this point. I'll comment if I can think of anything interesting.
@@TechWithNikola I see. I figure the applicability of DP has to do with reusing solutions to subproblems. Of course, just because a DP solution exists doesn't mean a non-DP solution doesn't exist. I guess in some sense there is _always_ (or almost always) a non-DP solution: brute force. So maybe the thing to look for is the kind of problem where DP works but is not the simplest solution. A leetcode example comes to mind: sliding window maximum. Given an array a of length n and a window size k, return `[max(a[i:i+k]) for i in range(n - k + 1)]`. Given the maximum across a window of length (k-1) we can easily compute the maximum across a window of size k if it contains the smaller window. So DP applies. But it takes quadratic space (and thus time), and there is a solution which only takes linear space and time. (My first such solution involves rot13(pnegrvna gerrf).)
Oh, I should add: the sliding window maximum problem is relatively easy because we're computing the maximum for all windows simultaneously. There's a highly fancy data structure that uses linear space and lets us compute range maxima (or minima) in O(1) time per query, without the queries being batched. Erik DeMaine talks about it in one of his "Advanced Data Structures" lectures on OCW. It's on youtube. He talks in terms of RMQ: range _minimum_ queries. It turns out to relate to least common ancestor queries, so maybe looking for a lecture on trees will help you find it. [But no one will independently rediscover this in the context of a leet code problem. The batched window maximum problems seems of appropriate difficulty relative to the kind of leet code problems I'm familiar with.]
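For readers who want the batched version without decoding the spoiler above: one well-known linear-time approach uses a monotonic deque (a different technique from the rot13'd one):

```python
from collections import deque

def window_maxima(a, k):
    # The deque holds indices whose values are still candidates for a
    # window maximum, in decreasing value order. Each index enters and
    # leaves the deque at most once, so the whole pass is O(n).
    dq = deque()
    out = []
    for i, x in enumerate(a):
        while dq and a[dq[-1]] <= x:
            dq.pop()          # x dominates these older candidates
        dq.append(i)
        if dq[0] <= i - k:
            dq.popleft()      # front index slid out of the window
        if i >= k - 1:
            out.append(a[dq[0]])
    return out
```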
Great stuff. Do you know of any real world applications for dynamic programming?
Thanks. I know en.wikipedia.org/wiki/Seam_carving uses dynamic programming. I've also used it myself when I played a bit with natural language processing to calculate probabilities, but that doesn't necessarily mean it was the best way to solve the problem :) There are a couple of use-cases for finding string distances, or doing text justification alignment, that are useful in practice. Also, I've read that this can be useful when processing genetic information, but I don't know the details: en.wikipedia.org/wiki/Needleman%E2%80%93Wunsch_algorithm
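As a concrete example of the string-distance use case mentioned above, here's a sketch of the classic Levenshtein edit-distance DP (the sequence-alignment algorithms linked above follow the same table-filling pattern):

```python
def edit_distance(s, t):
    # d[i][j] is the minimum number of insertions, deletions, and
    # substitutions needed to turn s[:i] into t[:j].
    m, n = len(s), len(t)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i  # delete all of s[:i]
    for j in range(n + 1):
        d[0][j] = j  # insert all of t[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # delete from s
                          d[i][j - 1] + 1,         # insert into s
                          d[i - 1][j - 1] + cost)  # match or substitute
    return d[m][n]
```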