llama-desktop 2.2.5

Desktop interface for Ollama

Llama Desktop


Desktop app to connect to Ollama and send queries.

Llama Desktop reads the Ollama service URI from the environment variable OLLAMA_HOST, defaulting to http://localhost:11434.
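For example, to point Llama Desktop at an Ollama instance on another machine, set OLLAMA_HOST before launching the app (the host and port below are placeholders, not values from this project):

```shell
# Example only: replace with the address of your Ollama server
export OLLAMA_HOST=http://192.168.0.10:11434

# Launch the app; it will read OLLAMA_HOST from the environment
llama-desktop
```

If OLLAMA_HOST is unset, the default http://localhost:11434 is used.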

Installation

Ollama

If you have an NVIDIA GPU and want to run Ollama locally:

curl -fsSL https://ollama.com/install.sh | sh
systemctl enable ollama
systemctl start ollama
ollama pull mistral:latest
ollama pull phind-codellama:latest

Last stable release

cargo install llama-desktop

Development version

cargo install --git https://github.com/cacilhas/llama-desktop.git

License