If you want to run local LLMs (Large Language Models) with Ollama on your PC, on Windows at least, you have two options. The first is to use the Windows app and run it natively; the second is to run it inside WSL (Windows Subsystem for Linux).
Curious about the performance of Ollama on WSL versus just running on Windows 11, I did some quick comparisons. Here's what I found.
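If you want to put rough numbers on the comparison yourself, here's a minimal timing sketch in Python. It assumes the `requests` package is installed, that Ollama is running with its local API on the default port 11434, and that you've already pulled a model (the model name below is a placeholder). The `/api/generate` response includes `eval_count` and `eval_duration` fields, which you can turn into a tokens-per-second figure you can compare between a native Windows run and a WSL run:

```python
import time
import requests  # assumes `pip install requests`

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint
MODEL = "llama3"  # placeholder; use whatever you've pulled with `ollama pull`

def benchmark(prompt: str) -> None:
    """Send one non-streaming generation request and report tokens per second."""
    start = time.time()
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=600,
    )
    resp.raise_for_status()
    data = resp.json()
    # Ollama reports eval_count (generated tokens) and eval_duration (nanoseconds)
    tokens = data.get("eval_count", 0)
    duration_s = data.get("eval_duration", 1) / 1e9
    print(f"{tokens} tokens in {duration_s:.1f}s -> {tokens / duration_s:.1f} tok/s")
    print(f"wall-clock round trip: {time.time() - start:.1f}s")

if __name__ == "__main__":
    benchmark("Explain the difference between WSL and a virtual machine in one paragraph.")
```

Run the same script against the Windows install and then against the WSL install (same model, same prompt) and the tok/s figures give you a like-for-like comparison.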