I Analyzed My Finance With Local LLMs
Get $200 off CourseraPlus 👉 imp.i384100.net/nLvdN9
GitHub repo 👉 github.com/thu-vu92/local-llm...
🔑 TIMESTAMPS
================================
0:00 - Project intro
1:35 - Sponsor (Coursera)
2:04 - Why use local LLMs?
3:34 - Install Ollama
4:14 - Run local Mistral model
6:17 - Run local Llama2 model
7:27 - Customize LLMs with Ollama
9:53 - Access Llama2 with Langchain (Python)
10:45 - Categorise bank transactions
14:46 - Create personal finance dashboard
17:24 - Conclusions
👩🏻💻 COURSES & RESOURCES
================================
📖 Google Advanced Data Analytics Certificate 👉 imp.i384100.net/anK9zZ
📖 Google Data Analytics Certificate 👉 imp.i384100.net/15v9y6
📖 Learn SQL Basics for Data Science Specialization 👉 imp.i384100.net/AovPnJ
📖 Excel Skills for Business 👉 coursera.pxf.io/doPaoy
📖 Machine Learning Specialization 👉 imp.i384100.net/RyjykN
📖 Data Visualization with Tableau Specialization 👉 imp.i384100.net/n15XWR
📖 Deep Learning Specialization 👉 imp.i384100.net/zavBA0
📖 Mathematics for Machine Learning and Data Science Specialization 👉 imp.i384100.net/LXK0gj
📖 Applied Data Science with Python 👉 imp.i384100.net/gbxOqv
🙋🏻♀️ LET'S CONNECT!
================================
🤓 Join my Discord server: / discord
📩 Newsletter: thu-vu.ck.page/profile
✍ Medium: / vuthihienthu.ueb
🔗 All links: linktr.ee/thuvuanalytics
As a member of the Amazon and Coursera Affiliate Programs, I earn a commission from qualifying purchases on the links above. By using the links you help support this channel at no cost to you.
#ai #datascience #ThuVu #dataanalytics
This is great. We're in the process of integrating LLMs into our "what if" scenario modelling platform and this gave me a few ideas on next steps. Sharing this video with my dev team!
Hi Thu! Last year I had referenced your panel dashboard video to build my personal finance dashboard. I like seeing how you built yours. Your content is very useful. Thank you!
This is such a great video. Thank you for making it. I had no idea this sort of thing was possible and I'm finding all sorts of ways to take advantage of it now.
What an amazing video! This is definitely a personal project that I've wanted to tackle and while I'm familiar with other languages, I'll definitely use your video as a guideline.
Always good to see more people bringing data skills to understand personal finance.
Wow absolutely wow, thank you for such a great project, so many ideas ringing in my head. Cheers
This was an excellent video - many thanks for sharing!
Incredible intro video for the semi-technical about how ChatGPT and similar models will be used in daily life to improve mundane tasks, with a side of cautions about incorrect answers and computational limitations! Great balance, I'm already sharing it around our team 😊
Thanks a lot for your comment and for sharing it around! Really appreciate it 🤩🙌
Excellent video, I used the concepts to enhance a project that I had already started in R and it worked fine, but so slowly on my computer (like 5 min to analyse 10 records). Now I know the concepts and I'll keep experimenting with other LLM models. Thank you!
Love the video! The beginning sets up the project perfectly and the tutorial is very easy to follow!
Thank you for sharing this! You covered the basics and showed the path to a great first goal with your own custom, on-premise, well-licensed LLM. Huge!
You are so welcome! Glad it was helpful 🙌
Thank you so much for making this video. Subscribed, this is exactly the content I look for
Loved it, a super clear video that goes straight to the point and gives us the joy of going off to explore the code
Wow this is fantastic video. Thank you, Thu!
Thanks for the great overview of using a local LLM, Thuy! Very useful and informative.
Amazing work you put in here. This is inspiring
Really awesome explanation! I am going to use this. Thank you Thu!!
Thanks so much! It's giving me inspiration for using this in a security analysis context.
thank you! this is a project i'd love to try, keep up the good work 😊
Great video... My 2 cents: we can force LLMs to respond only in JSON format by stating it in the system prompt, so you always get a consistent, parsable response (I've tried this with GPT-4). You can also provide a list of possible expense categories to avoid having to group them together later (like 'Food & Beverage' and 'Food/Beverage').
Yeah, it is very powerful! However, does llama2 also provide this?
@@martinmoder5900 llama2 and even gemma:2b do that too, but when I tried, they still generated "new" categories, and the JSON answers could be "odd": sometimes it would modify the name of the expense.
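The JSON-constraint approach discussed above can be sketched without a live model: the system prompt pins down the allowed categories, and a small validator catches the "new categories" and malformed replies the thread mentions. Everything here (the category names, the `parse_category` helper, the simulated replies) is illustrative, not from the video:

```python
import json

# Hypothetical fixed list of expense categories. Giving the model an explicit
# list avoids near-duplicate groups like "Food & Beverage" vs "Food/Beverage".
CATEGORIES = ["Food & Beverage", "Transport", "Housing", "Entertainment", "Other"]

# System prompt that demands a strict JSON reply (this wording is a sketch).
SYSTEM_PROMPT = (
    "You are a transaction classifier. Respond ONLY with a JSON object of the "
    'form {"category": "..."} where the category is one of: '
    + ", ".join(CATEGORIES)
    + ". Do not invent new categories."
)

def parse_category(raw_response: str) -> str:
    """Parse the model's JSON reply, falling back to 'Other' if it drifts."""
    try:
        data = json.loads(raw_response)
    except json.JSONDecodeError:
        return "Other"  # model ignored the JSON instruction entirely
    if not isinstance(data, dict):
        return "Other"
    category = data.get("category", "Other")
    # As noted in the replies, models sometimes still invent categories,
    # so validate against the fixed list rather than trusting the output.
    return category if category in CATEGORIES else "Other"

# Simulated model replies (in practice these would come from the local LLM):
print(parse_category('{"category": "Transport"}'))    # Transport
print(parse_category('{"category": "Cars & Fuel"}'))  # Other (not in the list)
print(parse_category("sorry, here is the answer"))    # Other (not valid JSON)
```

The validation step matters even with a strict system prompt, since smaller local models are more likely than GPT-4 to drift from the requested schema.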
Thanks for the demo and info. So detailed and analytics are great. Have a great day
Thank you so much. 🥰It is so well explained and a very cool project. I think LLMs are a powerful tool and running them locally will make it safe to share critical information with them.
Thank you, really appreciate it! ❤
Thanks for the video. Nicely done and presented, educational with an interesting use case
this is great.. thank you for the breakdown of all these options
Well done! I'll try to re-create this. Thank you once again
I learned so so much watching this. Thank you so much.
Great video to start using LLM! Thank you for sharing!
Are you a real human? I have NEVER seen an author on youtube cover so much incredible knowledge in such a short video. This is absolutely AMAZING!!! Thank you
Her being an AGI would make perfect sense
Thank you so much for sharing this with us!! I’ve been looking to do this for years but just thinking about the task ahead, I would give up. I will definitely analyze my own financial statements. Thanks mucho gusto!!
This is great! I was recently experimenting with a personal finance tracker dashboard and connecting it to a chat app, so the user could easily input their financial activity just by typing it. In the process, I tried using ChatGPT to simplify and generalise the format so we can input the data faster; I never thought it could be done with a local LLM. Looking forward to your next video.
Fantastic! Your videos are always good surprises in my feed.
I was looking for THIS! Thanks!!
Amazing. Thank you for sharing this, I learned so much!
this is one of the best videos I've watched about LLMs
Ayo, I'm just taking my first step, which is logging every expense since the start of this year. I'm thinking about building some sort of software to help me manage my expenses and savings, and this is exactly what I had in mind. Thank you for the high-quality video!
Great video .. The one project I wanted to take up during my holidays: learn and at the same time get a view of my personal finances ..
Thanks for sharing with us, much appreciation! ❤️
Thank you for watching! ❤️
Very concise and informative video. I appreciate it.
Outstanding video, especially for this beginner. Didn’t know you could run the models locally. Those ollama layers look like docker, fascinating how the context is setup. Time for me to spend some cycles on all your vids, not just the couple I’ve casually looked at. Thanks!
Glad to hear you found the videos helpful! Thanks for stopping by 🙌🏽
Me too. I thought you needed some monstrous supercomputer and weeks of configuring everything to run one of these models locally
I see how this is useful for being one's own accountant :) Super!
You are awesome! Thanks for making this video.
Amazing job explaining this!
Love it, I am subscribing instantly, I have a lot of questions.
Awesome research as always!
Very well explained. Looking forward to you posting the github repo.
Thank you for watching! I've added the repo link in the description 🙌🏽
Nice. Might give this a try over the weekend. Just need to figure out how to get my bank's data.
You are a very good presenter, easy to follow. Nice content
incredible, loved the content.
Thanks Thu, just heard about local LLMs from my boss today and look whose video is on the top to help me out! 😃
Hey Shivam! Thanks for watching! So happy to see your comment 😍🤗
Thanks for the great intro into how to get started with local LLMs. I'll give it a go after Tết 😄
Happy Tet holiday! 😀🎉
I loved this and hope to try this out for myself (though my programming skills are very rusty)
Finally the text classification video that I was searching for
Thanks Thu, great demo of Ollama, sorry you aren't going to be retiring anytime soon 😢 I really like the multimodal model support in Ollama; llava is a great model to try and runs on not much RAM.
Thank you Oliver! I would absolutely not mind making videos until I retire though 🤣. The multimodal support is interesting, I haven't tried it out yet but will look into those models a bit more 🙌🏽.
Thank you so much for this video. I really like the explanation. Thanks!
I love this video, thank you very much!!
I never ever ever comment on anything, but goddamn - what a great video/tutorial. Just finished playing with the notebook and I learned a ton!
That’s so awesome to hear! Thank you so much for commenting ❤️🤗
Your videos are well thought out .. Keep them coming - Don't want you "retiring soon" 🙂
Haha thank you for this! Don’t worry, with YouTube I don’t want to retire anytime soon 😉🤗
Your content is always useful! I like Panel a lot.
Thank you so much! So happy to hear 🤩
@@Thuvu5 💛
Great insights and well explained!
Great vid, great content, and easy to understand.
If you want to feed the model as much data as its token limit allows, you don't need to calculate the counts by hand. Instead, you can handle this with "chunks" in Langchain. Nice explanation, thank you!
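The chunking the comment above describes (which LangChain's text splitters, e.g. `RecursiveCharacterTextSplitter`, automate) boils down to batching the transaction list under a size budget. A minimal stand-alone sketch, with `max_items` as a crude stand-in for a real token count:

```python
def chunk_transactions(transactions, max_items=30):
    """Split a long list of transaction strings into batches small enough
    to fit in one prompt. A real implementation would count tokens (or use
    a LangChain text splitter); max_items is a rough proxy for that budget.
    """
    return [
        transactions[i:i + max_items]
        for i in range(0, len(transactions), max_items)
    ]

# Hypothetical transaction names, just to show the batching behaviour:
txns = [f"transaction {i}" for i in range(75)]
batches = chunk_transactions(txns)

print(len(batches))      # 3 batches: 30 + 30 + 15
print(len(batches[-1]))  # 15 (the remainder)
```

Each batch is then sent to the model in its own prompt, which is exactly the "chunk long lists of transactions to avoid the token limit" step the video walks through.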
Great info, and thanks a lot
This is incredible, a bit beyond my skills and the time on my hands. But surely inspiring!
OMG this is inspiring. I always wanted a third-party view of my expenses without losing control of my data, and this video hits the nail on the head.
So glad to hear! Good luck with your project 🤗
Thank you SOOOOOOO much for this !! this is an awesome tutorial
You are so welcome! Glad you like it!
very good! thank you for sharing!
That's awesome. I would also use Llama to write the code for generating plotly charts/dashboards haha!
Great video like always Thu! You never fail to fascinate me with your content as you make Data Science seem so fun to experiment with! Do you happen to have experience with the Bloomberg Terminal or any project idea to do using it? Would be amazing to know what you think of it! 🥰💛
Thank you for such kind words! No I haven’t had the chance to try out Bloomberg Terminal. It’s perhaps worth looking into for a future video 🤔
@@Thuvu5 excited and hoping to have a look at it 💫💕
🎯 Key Takeaways for quick navigation:
00:00 💲 *Reviewing Income and Expense Breakdown*
- Explained the process of analyzing financial transactions.
- Talked about classification of expenses into categories.
- Spoke about using low-tech ways and an AI assistant for classification.
02:16 💻 *Running a Large Language Model Locally*
- Discussed different ways to run an open-source language model locally.
- Listed various popular frameworks to run models on personal devices.
- Explained why these frameworks are needed, emphasizing model size and memory efficiency.
04:18 📚 *Installing and Understanding Language Models*
- Demonstrated how to install a language model through the terminal.
- Showed the interaction with the language model through queries in the terminal.
- Assessed the model's math capabilities, showing a failed example.
06:48 🎯 *Evaluating Expense Classification of Language Models*
- Checked whether the language models can categorize expenses properly through the terminal.
- Demonstrated how to switch models, correctly installing another model.
- Showed the differences between the models and the preferred one due to answer formatting.
08:24 🛠️ *Creating Custom Language Models*
- Explained how to specify base models and set parameters for language models.
- Demonstrated how to create a custom model through the terminal.
- Discussed viewing the list of available models and building a custom blueprint to meet specific requirements.
11:46 🔄 *Creating a For Loop to Classify Expenses*
- Discussed forming a for loop to classify multiple expenses.
- Detailed how to chunk long lists of transactions to avoid the language model's token limit.
- Mentioned the unpredictability of language models and the potential need for multiple queries.
14:32 🔍 *Analyzing and Categorizing Expenses*
- Demonstrated how to analyze and categorize transactions.
- Showed how to group transactions together, clean up the dataframe, and merge it with the main transaction dataframe.
15:14 📊 *Creating a Personal Finance Dashboard*
- Detailed the creation of a personal finance dashboard that includes income and expense breakdowns for two years.
- Introduced useful visualization tools such as Plotly Express and Panel, giving a short tutorial on how to use them.
- Demonstrated assembling a data dashboard from charts and supplementing it with custom text.
17:02 📈 *Visualizing Financial Behavior Over Time*
- Demonstrated the use of the finance dashboard, drawing observations.
- Concluded with a note on the importance of incorporating assets into financial management.
- Highlighted the value of running large language models on personal devices for tasks like these.
Made with HARPA AI
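The "for loop to classify expenses" step in the summary above can be sketched as follows. `classify_batch` stands in for the real local-LLM call (in the video, Llama 2 via LangChain); here it is stubbed with a hypothetical keyword lookup so the loop structure is runnable on its own:

```python
def classify_batch(names):
    """Stub for the LLM call: map transaction names to categories.
    The keyword table and category names are made up for illustration."""
    keyword_map = {"albert heijn": "Groceries", "ns.nl": "Transport"}
    results = []
    for name in names:
        category = "Other"
        for keyword, cat in keyword_map.items():
            if keyword in name.lower():
                category = cat
        results.append(category)
    return results

# Example transaction names (invented, not from the video's data):
transactions = ["Albert Heijn 1423", "NS.nl OV-chipkaart", "Unknown shop"]

# Chunk the list so each prompt would stay under the model's token limit,
# then classify each chunk and collect the results in order.
categories = []
batch_size = 2
for i in range(0, len(transactions), batch_size):
    categories.extend(classify_batch(transactions[i:i + batch_size]))

print(categories)  # ['Groceries', 'Transport', 'Other']
```

With a real model, the categories returned per chunk would then be merged back onto the transaction dataframe, which is the "clean up and merge" step the summary describes.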
Excellent video and practical application. You didn't get to cover pydantic much, which solves a current challenge with LLMs. As for the dashboard, maybe another framework or approach with less or no code could be more efficient :)
I was wondering where I listened to this music. Amazon learning has this background music. Thanks for sharing :)
Thanks, That was inspiring indeed :)
Amazing and inspiring 😊
This is a great video! I learned a lot. Thank you so much! 👍🎁🎁
Thank you! 🦙
Awesome video, learned a lot of new tools and want to try this out. For the dashboard, wonder if using Excel would be easier? Not sure.
Well explained ❤
Love this!
Cool project! I'd like to try it myself. One interesting idea is to have the LLM generate a memo field for each transaction (which can be controlled via prompting). Then by embedding these and doing hybrid retrieval, you can search in natural language as well as by metadata for transactions.
That’s an interesting idea! Would love to see how well the retrieval works 🤗
Thanks again for another wonderful video. Ollama is now available on Windows as a preview. I used that preview version on the solution you shared here and it worked great! 🙂 Can you recommend a tutorial on the panel library? Thanks in advance.
that's an awesome vid, thanks 🥰
"Although, as you can see I can't retire anytime soon" 😂😳 Thu, this was a pretty ingenious way to label data; one of the biggest parts of our time is data cleanup and this helps speed it up
Out of curiosity, why did you choose Ollama? (versus something like LM Studio)
Haha, yeah I thought I'd saved much more.. 😂 Definitely, I hope to explore more analysis use cases for local LLMs. I heard about LM studio but somehow I just like the setup with Ollama better. I guess they are very much the same in the backend.
Trust me, clicking the video and scrolling through the comments, I was anticipating your comment to be at the very top😅
As always, high-quality content from a highly competent woman!
That's so kind of you, I'm trying to be ;)
pretty cool work
I've noticed that most LLMs understand that you would like CSV-formatted output, and you can use that to get more consistent output.
This is a great introduction to Ollama
love your videos
Great video. Very inspiring. Also...I used to live in Amstelveen (20+ years ago!). Funny to see that name in there.
Oh haha, the world is small! 😀
Great stuff 👏🏻
Thanks for the visit! ;)
Fantastic video
Great Video! Still happy with Panel? Tried Gradio?
I just read about the latest Meta Llama model that is supposed to be better than GPT-4 for s/w dev! I hope we can run it as a LOCAL LLM! Thank you for this timely vid...
Ooh that’s pretty cool! 🤩 So great to hear many models are approaching GPT4 capabilities 🤯
Thanks for this video
Thanks for the video! It was very clear and helpful. I'm curious, why didn't you use the Langchain CSV agent? Have you tried it before? If so, did you find it to be overkill or not helpful for this case? I'm new to Langchain and LLMs, so this video was incredibly informative. Thanks again!
Thank you for a great video! Liked and subscribed! I have a small question: how do you validate the correctness of the splitting into categories? I mean, how do you automate verification that all records got the correct corresponding categories, not just a random 30 out of several thousand records?
Great Work.
Nice showcase that it's ok if things don't work out on the first try: there's another model / another try :)
Amazing!!
Wow 🎉🎉🎉thanks 🎉🎉🎉
As a Javascript coder, this was a mindblowing video, I had no idea Python was this powerful.