Convenient Homelab LLMs with NixOS and WSL
2025-04-11
This post details a setup for running LLMs conveniently on a homelab using NixOS inside Windows Subsystem for Linux (WSL). The author overcame challenges such as VRAM locking, WSL auto-shutdown, and Nvidia driver issues. By leveraging Ollama, the Nvidia Container Toolkit, and NixOS's declarative configuration management, they achieved automated updates and easy system rebuilds. The guide covers keeping WSL running, installing NixOS, configuring Nvidia drivers, setting up an Ollama container, and optional Tailscale networking, ultimately providing a readily accessible local LLM environment.
Development
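The stack the summary describes can be sketched as a NixOS module. This is a hypothetical illustration, not the post's actual configuration: the option names follow the NixOS-WSL project and standard NixOS modules, and the container image tag and port are assumptions you would adapt to your own setup.

```nix
# Hypothetical NixOS-WSL configuration sketch for a GPU-backed Ollama container.
# Assumes the nixos-wsl module is imported; image, port, and Tailscale are examples.
{ config, pkgs, ... }:
{
  # Run NixOS inside WSL (option from the nixos-wsl module).
  wsl.enable = true;

  # Container runtime plus Nvidia Container Toolkit so containers see the GPU.
  virtualisation.docker.enable = true;
  hardware.nvidia-container-toolkit.enable = true;

  # Declaratively managed Ollama container exposing the default API port.
  virtualisation.oci-containers.containers.ollama = {
    image = "ollama/ollama";                      # assumed image tag
    ports = [ "11434:11434" ];                    # Ollama's default API port
    extraOptions = [ "--device=nvidia.com/gpu=all" ];  # CDI-style GPU passthrough
  };

  # Optional: reach the box from anywhere on your tailnet.
  services.tailscale.enable = true;
}
```

Because the whole setup lives in one module, rebuilding the machine or rolling out updates reduces to `nixos-rebuild switch`, which matches the post's point about easy system rebuilding.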