so i've been playing around with getting offline AI code suggestions directly in VS Code using
Ollama plus the *Continue* extension (a separate open-source project that plugs into Ollama). it's pretty cool because you can keep your projects completely private while still getting smart autocomplete. basically, instead of sending code to a hosted service like ChatGPT (which might not always respect privacy), this setup keeps everything local on your machine. has anyone tried setting up something similar? i'd love to hear about any tips and tricks you've picked up along the way!
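for anyone curious, here's roughly what my setup looked like. a minimal sketch, assuming you've already installed the Ollama CLI and the Continue extension in VS Code — the model names (e.g. `starcoder2:3b`) are just examples, pick whatever fits your hardware:

```shell
# pull a chat model and a small autocomplete model locally
# (these commands are standard Ollama CLI; model choices are my own)
ollama pull llama3
ollama pull starcoder2:3b
```

then in Continue's config (it was `~/.continue/config.json` when i set it up — check the current docs, the format has changed over time), something like:

```json
{
  "models": [
    { "title": "Llama 3 (local)", "provider": "ollama", "model": "llama3" }
  ],
  "tabAutocompleteModel": {
    "title": "StarCoder2 (local)", "provider": "ollama", "model": "starcoder2:3b"
  }
}
```

a smaller model for tab-completion keeps latency down, while the bigger one handles chat — that split made a noticeable difference on my laptop.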
Source:
https://www.sitepoint.com/local-llm-code-completion-vs-code-ollama/?utm_source=rss