# Llama Desktop

Desktop app to connect to Ollama and send queries.

Llama Desktop reads the Ollama service URI from the `OLLAMA_HOST` environment
variable, defaulting to `http://localhost:11434`.
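For example, to point the app at an Ollama instance running on another machine (the address below is illustrative):

```shell
# Point Llama Desktop at a non-default Ollama instance (example address).
export OLLAMA_HOST=http://192.168.0.10:11434
echo "$OLLAMA_HOST"
```

When `OLLAMA_HOST` is unset, the app falls back to `http://localhost:11434`.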
## Installation
### Ollama
If you have an NVIDIA GPU and want to run Ollama locally:
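One common way to run Ollama with GPU support is the official Docker image with the NVIDIA Container Toolkit installed; this is a sketch based on Ollama's published Docker instructions, so verify it against the current Ollama documentation:

```shell
# Run the official Ollama image with all GPUs exposed (requires the
# NVIDIA Container Toolkit); model data persists in the "ollama" volume.
docker run -d --gpus=all -v ollama:/root/.ollama \
  -p 11434:11434 --name ollama ollama/ollama
```

This publishes the API on port 11434, matching Llama Desktop's default `OLLAMA_HOST`.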
### Last stable release
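Since the package is published on crates.io, the standard Cargo install should fetch the latest stable release:

```shell
# Install the latest published release from crates.io.
cargo install llama-desktop
```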
### Development version
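To track unreleased changes, Cargo can install directly from a Git repository; the URL below is a placeholder, substitute the project's actual repository address:

```shell
# Install from the Git repository (replace the placeholder URL).
cargo install --git <repository-url> llama-desktop
```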