ai_assistant_core 0.2.0

A simple, ergonomic Rust client and server for local LLMs (Ollama, LM Studio, and other OpenAI-compatible backends). Chat, list models, stream responses, and serve your model remotely.
Feature flags

There is no structured metadata describing these features; check the main library docs, the README, or Cargo.toml in case the author documented them there.

This version has 2 feature flags; neither is enabled by default.

default: enables no additional features.
nat: no description provided.
serve: no description provided.
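Because neither flag is enabled by default, using them means opting in from the dependent crate's Cargo.toml. A minimal sketch follows; since the flags are undocumented, what nat and serve actually gate (presumably NAT traversal and the server component, given the crate description) is an assumption to verify against the crate's own Cargo.toml.

```toml
[dependencies]
# Neither feature is on by default; enable only what you need.
# What "nat" and "serve" gate is undocumented on this page; verify against
# the crate's own Cargo.toml before relying on either flag.
ai_assistant_core = { version = "0.2.0", features = ["nat", "serve"] }
```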