Running private LLMs on a ThinkPad P51 using Windows WSL.

TL;DR: Install Open WebUI in Docker on Ubuntu running under WSL on Windows 10. Pull the llama3.1, gemma2, and other LLM manifests and personalize them with your data. Forward port 3000 to the WSL IP address, and any device on your private LAN can reach any of the models from a web browser. QED.
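The container launch and port-forwarding steps above can be sketched as follows. This is a minimal sketch, not a tested recipe: it assumes Open WebUI's documented `docker run` invocation, that the UI is published on host port 3000, and that the WSL instance's address is the placeholder `172.x.x.x` (substitute the address reported by `ip addr`). The `netsh` commands must run in an elevated prompt on the Windows host.

```shell
# Inside the Ubuntu/WSL shell: start Open WebUI in Docker,
# publishing container port 8080 on host port 3000.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main

# Still inside WSL: find the instance's IP address (eth0).
ip addr show eth0

# On the Windows host (elevated prompt): forward port 3000 on all
# interfaces to the WSL IP. Replace 172.x.x.x with your WSL address.
netsh interface portproxy add v4tov4 listenaddress=0.0.0.0 listenport=3000 connectaddress=172.x.x.x connectport=3000

# Allow inbound LAN traffic to port 3000 through Windows Firewall.
netsh advfirewall firewall add rule name="Open WebUI" dir=in action=allow protocol=TCP localport=3000
```

Note that the WSL IP address can change across reboots, so the `portproxy` rule may need to be refreshed with the current address.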