# menta

Minimal Rust library for non-UI LLM/AI primitives.

`menta` is being prepared for an initial `0.0.1` crates.io release.

## Installation

Add `menta` to your `Cargo.toml`:

```toml
[dependencies]
menta = "0.0.1"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```

`tokio` is only needed if you call async helpers such as `generate_text`.

## Included

- Provider registry with string model ids like `openai/gpt-4.1-mini`
- Inventory-based provider discovery
- Built-in providers: `mock`, `openai`, `anthropic`
- Unified message model with content parts: `ModelMessage`, `Part`
- Text generation: `generate_text`
- Type-driven outputs with `GenerateTextRequest<T>`
- Streaming events: `stream_text`
- `schemars`-based typed schemas inferred from `T`
- Derive-based tools: `#[derive(Tool)]`, `ToolExecute`, `ToolSchema`
- Embeddings and similarity helpers: `embed`, `embed_many`, `cosine_similarity`, `rank_by_similarity`
- OpenAI examples via `OPENAI_API_KEY`
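
Model ids follow a `provider/model` convention, which the registry uses to route requests. A minimal standalone sketch of how such an id splits into its two parts (the `split_model_id` helper here is hypothetical, shown only for illustration, not part of `menta`'s API):

```rust
/// Split a model id like "openai/gpt-4.1-mini" into (provider, model).
/// Hypothetical helper for illustration; not part of menta's public API.
fn split_model_id(id: &str) -> Option<(&str, &str)> {
    let (provider, model) = id.split_once('/')?;
    if provider.is_empty() || model.is_empty() {
        return None;
    }
    Some((provider, model))
}

fn main() {
    assert_eq!(
        split_model_id("openai/gpt-4.1-mini"),
        Some(("openai", "gpt-4.1-mini"))
    );
    // Ids without a provider prefix don't resolve.
    assert_eq!(split_model_id("gpt-4.1-mini"), None);
}
```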

Plain text requests start with `GenerateTextRequest::new()`; typed outputs start with `GenerateTextRequest::<T>::typed()`.

For builder ergonomics, a single typed tool can be added with `.tool::<MyTool>()`.

## Quick Start

```rust
use menta::{GenerateTextRequest, generate_text};

#[tokio::main]
async fn main() -> Result<(), menta::Error> {
    let request = GenerateTextRequest::new()
        .model("openai/gpt-4.1-mini")
        .prompt("Write a one-line summary of Rust.");

    let response = generate_text(request).await?;
    println!("{}", response.text);

    Ok(())
}
```

## Examples

- `examples/generate_text.rs`: basic text generation
- `examples/stream_text.rs`: streaming events
- `examples/generate_object.rs`: structured output with `GenerateTextRequest::<Type>`
- `examples/embeddings.rs`: embeddings and similarity ranking
- `examples/tool.rs`: Tokio example for `#[derive(Tool)]` with automatic execution in `generate_text`
- `examples/agent.rs`: interactive agent-style REPL example
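
The similarity ranking in `examples/embeddings.rs` reduces to cosine similarity over embedding vectors. A standalone sketch of that computation, independent of the crate (the exact signature of `menta`'s own `cosine_similarity` may differ):

```rust
/// Cosine similarity between two equal-length vectors.
/// Standalone sketch; menta's own helper may differ in signature.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        return 0.0; // degenerate zero vector
    }
    dot / (norm_a * norm_b)
}

fn main() {
    let a = [1.0, 0.0];
    let b = [0.0, 1.0];
    // A vector is maximally similar to itself, orthogonal vectors score 0.
    assert!(cosine_similarity(&a, &a) > 0.999);
    assert!(cosine_similarity(&a, &b).abs() < 1e-6);
}
```

`rank_by_similarity` can then be thought of as sorting candidates by this score against a query embedding, highest first.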

Run the examples with:

```sh
export OPENAI_API_KEY=...
cargo run --example generate_text
cargo run --example tool
cargo run --example agent
```

The agent example runs as a loop and accepts prompts until you type `exit` or `quit`.
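
That loop's exit handling can be sketched as a small predicate plus a read loop. The `is_exit_command` helper below is hypothetical and only illustrates the behavior; the real example additionally sends each prompt to the model:

```rust
use std::io::{self, BufRead, Write};

/// True when the user asked to leave the loop (hypothetical helper).
fn is_exit_command(line: &str) -> bool {
    matches!(line.trim().to_ascii_lowercase().as_str(), "exit" | "quit")
}

fn main() -> io::Result<()> {
    let stdin = io::stdin();
    print!("> ");
    io::stdout().flush()?;
    for line in stdin.lock().lines() {
        let line = line?;
        if is_exit_command(&line) {
            break;
        }
        // The real agent example would send `line` to the model here.
        println!("echo: {line}");
        print!("> ");
        io::stdout().flush()?;
    }
    Ok(())
}
```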

## Publishing Notes

- Crate version: `0.0.1`
- License: `MIT OR Apache-2.0`
- Repository: `https://github.com/KABBOUCHI/menta`
- Publish order: `menta_derive` first, then `menta`

## Design Notes

- This is intentionally a core-only library, not a UI/chat app.
- The API stays small and dependency-light.
- The mock provider is exercised in tests through the same registry as the real providers.

## Verification

Run:

```sh
cargo test
cargo package --allow-dirty --workspace
```