# menta

Minimal Rust library for non-UI LLM/AI primitives.

`menta` is published on crates.io at version `0.0.4`.

## Workspace

- `crates/menta`: core library crate
- `crates/derive`: proc-macro crate for `#[derive(Tool)]`
- `crates/agent`: interactive CLI agent package (`menta-cli`)

## Installation

Add `menta` to your `Cargo.toml`:

```toml
[dependencies]
menta = "0.0.4"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```

`tokio` is needed only if you call async helpers such as `generate_text` or `stream_text`.

## Included

- Provider registry with string model ids like `openai/gpt-4.1-mini`
- Inventory-based provider discovery
- Built-in providers: `mock`, `openai`, `anthropic`
- Unified message model with content parts: `ModelMessage`, `Part`
- Text generation: `generate_text`
- Type-driven outputs with `GenerateTextRequest<T>`
- Streaming events: `stream_text`
- `schemars`-based typed schemas inferred from `T`
- Derive-based tools: `#[derive(Tool)]`, `ToolExecute`, `ToolSchema`
- Embeddings and similarity helpers: `embed`, `embed_many`, `cosine_similarity`, `rank_by_similarity`
- OpenAI examples via `OPENAI_API_KEY`
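
The similarity helpers in the list above boil down to standard cosine similarity. As an illustration of what `cosine_similarity` and `rank_by_similarity` compute, here is a self-contained sketch of the math; menta's own helpers may differ in signatures and edge-case handling:

```rust
// Self-contained sketch of cosine similarity and similarity ranking.
// These free functions show the underlying math only; they are not
// menta's actual implementation.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        0.0
    } else {
        dot / (norm_a * norm_b)
    }
}

// Rank candidate vectors by similarity to a query, highest first.
fn rank_by_similarity(query: &[f32], candidates: &[Vec<f32>]) -> Vec<(usize, f32)> {
    let mut scored: Vec<(usize, f32)> = candidates
        .iter()
        .enumerate()
        .map(|(i, c)| (i, cosine_similarity(query, c)))
        .collect();
    scored.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    scored
}

fn main() {
    let query = vec![1.0, 0.0];
    let docs = vec![vec![0.0, 1.0], vec![1.0, 1.0], vec![2.0, 0.0]];
    // The vector parallel to the query (index 2) ranks first.
    println!("{:?}", rank_by_similarity(&query, &docs));
}
```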

Plain text requests start from `GenerateTextRequest::new()`; typed outputs start from `GenerateTextRequest::<T>::typed()`.

For builder ergonomics, a single typed tool can be added with `.tool::<MyTool>()`.

## Quick Start

```rust
use menta::{GenerateTextRequest, generate_text};

#[tokio::main]
async fn main() -> Result<(), menta::Error> {
    let request = GenerateTextRequest::new()
        .model("openai/gpt-4.1-mini")
        .prompt("Write a one-line summary of Rust.");

    let response = generate_text(request).await?;
    println!("{}", response.text);

    Ok(())
}
```
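
The `"openai/gpt-4.1-mini"` string above packs a provider id and a model id into one value. A minimal sketch of how such an id can be split, assuming a `provider/model` format (this is an illustration, not menta's actual registry code):

```rust
// Split a "provider/model" id at the first slash, so model names that
// themselves contain slashes stay intact. Hypothetical sketch only.
fn split_model_id(id: &str) -> Option<(&str, &str)> {
    let (provider, model) = id.split_once('/')?;
    if provider.is_empty() || model.is_empty() {
        return None;
    }
    Some((provider, model))
}

fn main() {
    assert_eq!(
        split_model_id("openai/gpt-4.1-mini"),
        Some(("openai", "gpt-4.1-mini"))
    );
    assert_eq!(split_model_id("gpt-4.1-mini"), None); // no provider prefix
}
```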

## Examples

- `crates/menta/examples/generate_text.rs`: basic text generation
- `crates/menta/examples/stream_text.rs`: streaming events
- `crates/menta/examples/generate_object.rs`: structured output with `GenerateTextRequest::<Type>`
- `crates/menta/examples/embeddings.rs`: embeddings and similarity ranking
- `crates/menta/examples/tool.rs`: Tokio example for `#[derive(Tool)]` with automatic execution in `generate_text`
- `crates/menta/examples/stream_tools.rs`: streaming a tool-call turn and inspecting `Finish.parts`
- `crates/menta/examples/agent.rs`: interactive agent-style REPL example
- `crates/agent`: installable CLI project (`menta-cli` package)

Run one example with:

```sh
export OPENAI_API_KEY=...
cargo run -p menta --example generate_text
cargo run -p menta --example tool
cargo run -p menta --example agent
```

The agent example runs as a REPL loop, accepting prompts until you enter `exit` or `quit`.

Run or install the CLI agent project:

```sh
cargo run -p menta-cli
cargo install menta-cli --bin menta
```

Agent flags:

```sh
cargo run -p menta-cli -- --verbose
cargo run -p menta-cli -- --model anthropic/claude-3-5-sonnet-latest
```

Agent commands:

```text
/help
/new
/model [provider/model]
/pwd
/tools
!<shell-command>
```
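
The prefixes above suggest a simple dispatch: `/` for built-in commands, `!` for shell passthrough, and anything else treated as a prompt. A hypothetical sketch of that routing (the real CLI's parsing may differ):

```rust
// Classify one line of agent input by its prefix. Hypothetical sketch;
// not the actual menta-cli implementation.
#[derive(Debug, PartialEq)]
enum Input<'a> {
    Command { name: &'a str, arg: Option<&'a str> },
    Shell(&'a str),
    Prompt(&'a str),
}

fn classify(line: &str) -> Input<'_> {
    let line = line.trim();
    if let Some(rest) = line.strip_prefix('/') {
        // "/model openai/gpt-4.1-mini" -> name "model", arg Some("openai/gpt-4.1-mini")
        let mut parts = rest.splitn(2, ' ');
        let name = parts.next().unwrap_or("");
        Input::Command { name, arg: parts.next() }
    } else if let Some(cmd) = line.strip_prefix('!') {
        Input::Shell(cmd)
    } else {
        Input::Prompt(line)
    }
}

fn main() {
    assert_eq!(
        classify("/model openai/gpt-4.1-mini"),
        Input::Command { name: "model", arg: Some("openai/gpt-4.1-mini") }
    );
    assert_eq!(classify("!ls -la"), Input::Shell("ls -la"));
    assert_eq!(classify("hello"), Input::Prompt("hello"));
}
```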

## Publishing Notes

- Crate version: `0.0.4`
- License: `MIT OR Apache-2.0`
- Repository: `https://github.com/KABBOUCHI/menta`
- Publish order: `menta_derive` first, then `menta`, then `menta-cli`

## Design Notes

- This is intentionally a core-only library, not a UI/chat app.
- The API stays small and dependency-light.
- The mock provider is exercised in tests through the same registry as the real providers.

## Verification

Run:

```sh
cargo test --workspace
cargo package --allow-dirty --workspace
```