cliplama 0.0.2

a local model that does literally nothing :3

Use the HOST and PORT environment variables to configure the server; it defaults to localhost:11434.
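The defaulting behaviour can be sketched in shell. This is an illustration of the documented defaults only, not cliplama's actual implementation:

```shell
# Fall back to the documented defaults when HOST/PORT are unset or empty
# (illustrative only; cliplama resolves these internally).
HOST="${HOST:-localhost}"
PORT="${PORT:-11434}"
echo "$HOST:$PORT"
```

With neither variable set, this prints localhost:11434, matching the default address above.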

The project implements the subset of the Ollama API that VS Code uses:

  • GET /api/version
  • GET /api/tags
  • POST /api/show
  • POST /chat/completions

Installation:

$ cargo install cliplama

Running:

$ PORT=2137 HOST=127.0.0.1 cliplama
Listening on 127.0.0.1:2137

$ cliplama help
cliplama — a local model that does literally nothing :3
version: 0.0.2
Use environment variables HOST and PORT to configure the server. Defaults to localhost:11434