SOLVED: Running Ollama as a Windows service (for use in server environments)
Currently, Ollama for Windows (preview) does not let you choose an install location; it installs into the logged-in user’s profile. For those of us who want to run Ollama as a service that doesn’t require a user to be logged in, this is not ideal.
After much research, trial, and error, I finally figured out how to get this running as a local service on Windows Server. Here’s how I did it:
Uninstall your existing Windows preview version of Ollama (unfortunately, you will lose your downloaded models).
Download the latest ollama-windows-amd64.zip standalone package from the Ollama GitHub releases page (https://github.com/ollama/ollama/releases) and extract it to a folder, e.g. somewhere under Program Files.
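For example, from an elevated command prompt (recent Windows builds ship a tar that can extract zip archives; right-click > Extract All works just as well, and the paths here are only examples):

```
:: Create the target folder and unpack the standalone build into it.
mkdir "C:\Program Files\Ollama"
tar -xf "%USERPROFILE%\Downloads\ollama-windows-amd64.zip" -C "C:\Program Files\Ollama"
```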
Install NSSM (the Non-Sucking Service Manager for Windows): https://nssm.cc/download https://github.com/kirillkovalenko/nssm
Add the nssm.exe and ollama.exe locations to the system PATH variable in Windows.
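One way to do that from an elevated prompt is sketched below; the folder names are assumptions based on where you extracted things, and the Environment Variables GUI is the safer route since setx truncates values longer than 1024 characters:

```
:: Append both folders to the machine PATH. Caution: setx /M rewrites the whole
:: value, and %PATH% here expands to the combined user+system PATH, so double-check
:: the result afterwards (or just use the GUI editor instead).
setx /M PATH "%PATH%;C:\Program Files\Ollama;C:\Program Files\nssm\win64"
```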
From a Windows command prompt, run: nssm install Ollama (it will pop up a GUI where you set everything up for the NSSM service in the next steps; if you’d rather script the whole thing, see the sketch after these steps).
In the “Path” field of the GUI that popped up, browse to ollama.exe, and set the “Startup directory” to the same location.
Add the word “serve” to the “Arguments” field.
In the “Details” tab, set “Display name” to “Ollama” (this will be the name of the service).
In the “Log on” tab, select the “Local System” account.
In the “Process” tab, set “Priority” to “High” (otherwise Ollama will run like absolute garbage; this setting made a HUGE difference in Ollama’s performance). You could also set it to “Realtime” to really kick it up a notch, if you have nothing else important running on the server.
Everything else you can leave at the defaults. Click save.
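If you’d rather skip the GUI entirely, NSSM’s set command can apply the same settings from a prompt. A minimal sketch, assuming ollama.exe lives in C:\Program Files\Ollama (adjust the paths to your install):

```
:: Create the service: program, argument ("serve"), and working directory.
nssm install Ollama "C:\Program Files\Ollama\ollama.exe" serve
nssm set Ollama AppDirectory "C:\Program Files\Ollama"
nssm set Ollama DisplayName Ollama
:: Run under the Local System account.
nssm set Ollama ObjectName LocalSystem
:: Equivalent of the GUI's "High" priority setting.
nssm set Ollama AppPriority HIGH_PRIORITY_CLASS
```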
Open the Windows Services app and start the newly created Ollama service.
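Or start and check it from the command line:

```
nssm start Ollama
:: Confirm the service state.
sc query Ollama
```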
If you want to edit anything later, run: nssm edit Ollama.
A few weird things:
- I can’t for the life of me locate where the models are stored now after turning Ollama into a service (the environment-variable sketch after this list shows one way to pin them to a known folder).
- In this version, there is no Ollama system-tray icon to show that it’s running. You can go to http://localhost:11434 in a browser to check that it’s running (or use the health-check sketch after this list).
- I don’t know where log files are stored now either; the log-redirection sketch after this list is one workaround. If you can find the original logs or the model files location, please share.
- If you want to use any Ollama environment variables (such as OLLAMA_FLASH_ATTENTION=1), add them to your Windows environment variables within Windows Control Panel > System Properties > Advanced > Environment Variables > System Variables > New.
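Since there’s no tray icon, here’s a quick health check, plus a sketch of NSSM’s stdout/stderr redirection as a workaround for the missing logs. The log folder is just an example path; on recent Ollama builds the root endpoint replies “Ollama is running”:

```
:: Health check (curl ships with Windows 10 1803+ / Server 2019+).
curl http://localhost:11434

:: Capture Ollama's output to files via NSSM so you get logs back.
mkdir "C:\Program Files\Ollama\logs"
nssm set Ollama AppStdout "C:\Program Files\Ollama\logs\ollama.out.log"
nssm set Ollama AppStderr "C:\Program Files\Ollama\logs\ollama.err.log"
nssm restart Ollama
```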
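Alternatively, NSSM can scope environment variables to just this service through its AppEnvironmentExtra parameter, leaving the system-wide settings alone. OLLAMA_MODELS (which Ollama honors as the model store location) is included here as a guess at taming the mystery model folder; C:\OllamaModels is an example path:

```
:: Per-service environment variables; these do not affect the rest of the system.
nssm set Ollama AppEnvironmentExtra OLLAMA_FLASH_ATTENTION=1 OLLAMA_MODELS=C:\OllamaModels
nssm restart Ollama
```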