Get the latest version of the Local AI service

Learn how to get the latest version of the Local AI service

For your convenience, Msty bundles the version of the Local AI service (Ollama) that was the latest at the time of the app's release. To update it, first go to Settings > Local AI > Service Version and click Check for Updates. If a newer version is available, it will be downloaded.

Manual download

If you are unable to update the Local AI service this way, you can manually download and install the latest version by following the steps below.

Windows

  1. Go to the releases page of the Ollama repository: https://github.com/ollama/ollama/releases
  2. Under Assets, download ollama-windows-amd64.zip.
  3. Once downloaded, extract the contents of the zip file and move ollama.exe to C:\Users\<username>\AppData\Roaming\Msty.
  4. Rename ollama.exe to msty-local.exe.
  5. Move the lib folder to C:\Users\<username>\AppData\Roaming\Msty as well (a scripted version of these steps is sketched after this list).
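
If you prefer to script these steps, the sketch below shows one way to do it in Python (3.9+). It assumes ollama-windows-amd64.zip has already been downloaded to the current directory and that ollama.exe and the lib folder sit at the root of the archive, as the steps above describe; adjust the paths if your layout differs.

```python
import os
import shutil
import zipfile
from pathlib import Path

msty_dir = Path(os.environ["APPDATA"]) / "Msty"  # C:\Users\<username>\AppData\Roaming\Msty
extract_dir = Path("ollama-windows-amd64")

# Step 3: extract the downloaded zip
with zipfile.ZipFile("ollama-windows-amd64.zip") as zf:
    zf.extractall(extract_dir)

# Steps 3-4: move ollama.exe into the Msty folder, renaming it to msty-local.exe
target = msty_dir / "msty-local.exe"
target.unlink(missing_ok=True)  # replace the previously bundled binary, if any
shutil.move(extract_dir / "ollama.exe", target)

# Step 5: copy the lib folder into the Msty folder, merging with any existing one
shutil.copytree(extract_dir / "lib", msty_dir / "lib", dirs_exist_ok=True)
```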

Additional config for AMD ROCm™ GPU users only:

  1. Go to the releases page of the Ollama repository: https://github.com/ollama/ollama/releases
  2. Under Assets, download ollama-windows-amd64-rocm.tgz.
  3. Once downloaded, extract the contents and move the whole rocm folder (found under lib\ollama in the archive) to C:\Users\<username>\AppData\Roaming\Msty\lib\ollama (see the sketch after this list).
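
As before, here is a minimal Python sketch of these steps, assuming the .tgz archive has already been downloaded to the current directory and contains the rocm folder under lib\ollama as described above:

```python
import os
import shutil
import tarfile
from pathlib import Path

msty_dir = Path(os.environ["APPDATA"]) / "Msty"
extract_dir = Path("ollama-windows-amd64-rocm")

# Steps 2-3: extract the downloaded .tgz archive
with tarfile.open("ollama-windows-amd64-rocm.tgz", "r:gz") as tf:
    tf.extractall(extract_dir)

# Step 3: merge the rocm folder into Msty's lib\ollama directory
shutil.copytree(
    extract_dir / "lib" / "ollama" / "rocm",
    msty_dir / "lib" / "ollama" / "rocm",
    dirs_exist_ok=True,
)
```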

Now restart Msty and verify the new version of the Local AI service by going to Settings > Local AI > Service Version.

You can clean up the downloaded files once the above steps are complete.