Free and Private GitHub Copilot Clone for VS Code Using Ollama and Continue
Install a Private AI Coding Assistant in VS Code for Free Using Ollama and Continue
In this video, we'll see how you can use Ollama and Continue to run a private GitHub Copilot clone locally, for free, using open-source large language models (LLMs) such as codellama and deepseek-coder.
This lets you try out different models, and even use uncensored models.
Don't send your private data to OpenAI's ChatGPT or GitHub's Copilot; keep it private on your PC or Mac.
👍 Please like if you found this video helpful, and subscribe to stay updated with my latest tutorials. 🔔
❤️ You can support this channel by buying me a ☕: buymeacoffee.com/codesfinance
🔖 Chapters:
00:00 Intro
00:22 Install Ollama and Models
01:34 Install Continue
05:00 Chat
05:31 Edit and Refactor
06:21 Fix Bugs
07:12 Other Uses
07:26 Outro
🍺 Homebrew installation commands:
brew install ollama
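Once Ollama is installed, you can pull the models mentioned in the video from the command line (model tags shown were current at recording time; check ollama.com/library for the latest versions):

```shell
# Download the coding models demonstrated in the video
ollama pull codellama
ollama pull deepseek-coder

# Start the Ollama server if it isn't already running in the background
ollama serve
```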
🔗 Video links:
Ollama: ollama.com/
Continue: continue.dev/
🐍 More Vincent Codes Finance:
- ✍🏻 Blog: vincent.codes.finance
- 🐦 X: / codesfinance
- 🧵 Threads: www.threads.net/@codesfinance
- 😺 GitHub: github.com/Vincent-Codes-Finance
- 📘 Facebook: / 61559283113665
- 👨💼 LinkedIn: / vincent-codes-finance
- 🎓 Academic website: www.vincentgregoire.com/
#chatgpt #llm #largelanguagemodels #ollama #copilot #githubcopilot #python #javascript #programming #code #gpt #opensource #opensourceai #llama2 #mistral #bigdata #research #researchtips #vscode #professor #datascience #dataanalytics #dataanalysis #uncensored #private
This was pretty cool! I'm just barely a hobby coder, some Arduino or Processing Java stuff once in a blue moon lol. I've barely scratched the surface with this extension. I've yet to see how much better this is than just copy-pasting to/from an LLM, but it certainly has potential! Seems pretty helpful, although you also have to be skeptical of its outputs and claims (same as ever); sometimes it seems to hallucinate a bit, or assume some stuff that's not quite right. But it's alright as long as one is aware of it. Using Linux Mint 21.3 and VSCodium. Already had Ollama present. I found the Continue add-on, and thanks to your config.json editing I got it working. So far with CodeGemma. Great video, thanks!
Thanks! I guess the main benefit of having it directly in VS Code is convenience: it lets you stay within VS Code, is integrated with the UI, and the buttons and slash commands automatically wrap prompts around your code. In terms of output quality, it will mostly depend on the model you use.
Thanks for awesome video! Great info :)
Glad you found it useful!
very good info
Thanks, glad it was useful!
Hi, can we set the Ollama provider URL? I want to use an Ollama instance that I serve on my own server.
Yes! You can specify the URL with the apiBase parameter. See docs.continue.dev/reference/Model%20Providers/ollama for an example.
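As a rough sketch of what that entry could look like in ~/.continue/config.json (the hostname "my-server" is a placeholder for your own machine; 11434 is Ollama's default port; adjust the model tag to whatever you have pulled):

```json
{
  "models": [
    {
      "title": "Code Llama (remote)",
      "provider": "ollama",
      "model": "codellama",
      "apiBase": "http://my-server:11434"
    }
  ]
}
```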
@@VincentCodesFinance Thank you 🙏
On macOS I can't even find the config.json :(
By default it is created at "~/.continue/config.json", but you can also open it directly by clicking the gear icon in the Continue extension sidebar.
Bruh, I moved it to the right side & after closing it's not visible now
If you moved it to the right, it's now in your Secondary Side Bar. If it was closed, you can reopen it from the menu: View > Appearance > Secondary Side Bar (Command-Option-B on Mac).
@@VincentCodesFinance oh tnx bruh 🙏🙏💪