Get the latest version of Local AI service
Learn how to get the latest version of Local AI service
For your convenience, Msty bundles the Local AI service (Ollama) with the app; the bundled version is the latest available at the time of the app's release.
However, if you want to get the latest version of the Local AI service, first try going to Settings > Local AI > Service Version and clicking Check for Updates. If a new version is available, it will be downloaded.
Manual download
If you are unable to download the latest version using this method, follow the steps below to manually download and install it.
Windows
Note: There is a small annoyance with the official build of Ollama where it opens a blank Terminal window during chatting. You can ignore it or wait for the next release of Msty. We have sent a PR to the Ollama team and are waiting for it to be merged. Please upvote this PR to help get it prioritized: https://github.com/ollama/ollama/pull/8668
- Go to the releases page on the Ollama repository: https://github.com/ollama/ollama/releases
- From under Assets, download `ollama-windows-amd64.zip`.
- Once downloaded, extract the contents of the zip file and move `ollama.exe` to `C:\Users\<username>\AppData\Roaming\Msty`.
- Rename `ollama.exe` to `msty-local.exe`.
- Move the `lib` folder to `C:\Users\<username>\AppData\Roaming\Msty`.
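If you'd rather script these steps than click through them, here is a minimal Python sketch of the same procedure. It is an illustration, not an official Msty tool: it assumes the default data directory (`%APPDATA%\Msty`, i.e. `C:\Users\<username>\AppData\Roaming\Msty`) and uses GitHub's latest-release download URL for the asset named above. Quit Msty before running it so `msty-local.exe` is not in use.

```python
import os
import shutil
import urllib.request
import zipfile

# GitHub's "latest release" URL for the asset named in the steps above.
ASSET_URL = ("https://github.com/ollama/ollama/releases/latest/"
             "download/ollama-windows-amd64.zip")
# Assumed default Msty data directory: C:\Users\<username>\AppData\Roaming\Msty
MSTY_DIR = os.path.join(os.environ["APPDATA"], "Msty")

# 1. Download the release asset to the temp directory.
zip_path = os.path.join(os.environ["TEMP"], "ollama-windows-amd64.zip")
urllib.request.urlretrieve(ASSET_URL, zip_path)

# 2. Extract the archive.
extract_dir = os.path.join(os.environ["TEMP"], "ollama-windows-amd64")
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(extract_dir)

# 3. Move ollama.exe into the Msty directory, renamed to msty-local.exe
#    (replacing the copy bundled with Msty).
dst_exe = os.path.join(MSTY_DIR, "msty-local.exe")
if os.path.exists(dst_exe):
    os.remove(dst_exe)
shutil.move(os.path.join(extract_dir, "ollama.exe"), dst_exe)

# 4. Move the lib folder alongside it, clearing any old copy first
#    (shutil.move fails on Windows if the destination exists).
dst_lib = os.path.join(MSTY_DIR, "lib")
if os.path.isdir(dst_lib):
    shutil.rmtree(dst_lib)
shutil.move(os.path.join(extract_dir, "lib"), dst_lib)
```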
Additional config for AMD ROCm™ GPU users only:
- Go to the releases page on the Ollama repository: https://github.com/ollama/ollama/releases
- From under Assets, download `ollama-windows-amd64-rocm.tgz`.
- Once downloaded, extract the contents and move the whole `rocm` folder found under `lib\ollama` to `C:\Users\<username>\AppData\Roaming\Msty\lib\ollama`.
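The ROCm step can be scripted the same way. This is again a sketch under the same assumptions as above; the archive layout (`rocm` under `lib\ollama`) follows from the step it automates.

```python
import os
import shutil
import tarfile
import urllib.request

ASSET_URL = ("https://github.com/ollama/ollama/releases/latest/"
             "download/ollama-windows-amd64-rocm.tgz")
MSTY_DIR = os.path.join(os.environ["APPDATA"], "Msty")

# 1. Download and extract the ROCm asset (a gzipped tarball).
tgz_path = os.path.join(os.environ["TEMP"], "ollama-windows-amd64-rocm.tgz")
urllib.request.urlretrieve(ASSET_URL, tgz_path)
extract_dir = os.path.join(os.environ["TEMP"], "ollama-rocm")
with tarfile.open(tgz_path, "r:gz") as tf:
    tf.extractall(extract_dir)

# 2. Move the whole rocm folder into Msty's lib\ollama,
#    clearing any old copy first.
src = os.path.join(extract_dir, "lib", "ollama", "rocm")
dst = os.path.join(MSTY_DIR, "lib", "ollama", "rocm")
if os.path.isdir(dst):
    shutil.rmtree(dst)
shutil.move(src, dst)
```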
Now restart Msty and verify the new version of the Local AI service by going to Settings > Local AI > Service Version.
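If you prefer a quick command-line check instead of the Settings screen, the bundled service is Ollama under the hood and exposes a version endpoint. Note the port here is an assumption: Msty's Local AI service commonly listens on 10000, but confirm the port shown under Settings > Local AI.

```python
import json
import urllib.request

# Port 10000 is an assumed default for Msty's Local AI service;
# check Settings > Local AI if this request fails.
with urllib.request.urlopen("http://localhost:10000/api/version") as resp:
    print(json.load(resp)["version"])  # prints the running Ollama version string
```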
You can clean up the downloaded files once the above steps are complete.