thirdBreakfast@lemmy.world to Selfhosted@lemmy.world • Set up Tailscale with NGINX Proxy Manager • English • 1 day ago
Great write-up, thanks. For video learners, Wolfgang does a good step-by-step on YouTube: *Guide to Self-Hosting LLMs with Ollama*.
`ollama run llama3.2`
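Once a model is pulled, Ollama also serves a local REST API (on port 11434 by default) that other self-hosted services can call. A minimal sketch, assuming the daemon is running and `llama3.2` has been pulled:

```shell
# Query a running Ollama instance via its HTTP API.
# "stream": false returns a single JSON response instead of a token stream.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

This is handy for wiring Ollama into dashboards or scripts behind your reverse proxy.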
Build anything small into a container on your laptop, push it to DockerHub or the Github package registry then host it on fly.io for free.
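The workflow above can be sketched roughly as follows; the image name `ghcr.io/yourname/myapp` is hypothetical, and this assumes Docker and `flyctl` are installed and you are logged in to both the registry and Fly (`docker login`, `fly auth login`):

```shell
# Build the image locally and push it to the GitHub package registry.
docker build -t ghcr.io/yourname/myapp:latest .
docker push ghcr.io/yourname/myapp:latest

# Deploy the pushed image on fly.io (fly launch creates the app
# and its fly.toml on first run; fly deploy rolls out updates).
fly launch --image ghcr.io/yourname/myapp:latest
fly deploy --image ghcr.io/yourname/myapp:latest
```

Deploying from a prebuilt registry image keeps the build on your laptop, so the Fly side stays a thin hosting step.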