Crate ferrum_cli

§Ferrum CLI Library

Ollama-style command-line interface for the Ferrum LLM inference framework.

§Commands

  • run: Run a model and start an interactive chat session
  • serve: Start the HTTP inference server
  • stop: Stop the running server
  • pull: Download a model from Hugging Face
  • list: List downloaded models
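A typical workflow strings these commands together. This is a hedged sketch: the binary name `ferrum` is assumed from the crate name, and the model identifier is a placeholder, not one documented by this crate.

```shell
# Download a model from Hugging Face, then chat with it interactively.
# (Binary name and model identifier are illustrative assumptions.)
ferrum pull meta-llama/Llama-3-8B
ferrum run meta-llama/Llama-3-8B

# Or run the HTTP inference server, inspect local models,
# and stop the server when done.
ferrum serve
ferrum list
ferrum stop
```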

Re-exports§

pub use config::CliConfig;

Modules§

commands
CLI Commands - Ollama-style interface
config
CLI configuration management
utils
CLI utility functions