
✨ 𝕏-AI



A CLI, TUI, and SDK for interacting with the xAI Grok API: chat with Grok models, create embeddings, and inspect your API key and the available models.


🚀 Installation

To install the xai CLI:

cargo install x-ai --all-features

✨ Features

  • Interactive TUI with Settings, Chat, History, and Model Info tabs
  • Chat with Grok models (grok-4 by default)
  • Legacy text completions
  • Create text embeddings
  • List all available models
  • Inspect a specific model's details
  • Fetch API key information
  • Deferred Chat Completions
  • Full Responses API (Create, Get, Delete)

🔑 Environment Variables

Before using the CLI or SDK, export your API key:

export XAI_API_KEY=<your_xai_api_key>
export XAI_MODEL=grok-4   # optional, defaults to grok-4

Generate an API key from the xAI Console.
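Since XAI_MODEL is optional and falls back to grok-4, the resolution logic can be sketched in plain Rust (the function name is hypothetical, not part of this crate):

```rust
use std::env;

// Hypothetical sketch of how a model name could be resolved:
// the XAI_MODEL environment variable wins, otherwise "grok-4".
fn resolve_model() -> String {
    env::var("XAI_MODEL").unwrap_or_else(|_| "grok-4".to_string())
}

fn main() {
    println!("using model: {}", resolve_model());
}
```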

⌨️ Usage as CLI

Launch TUI (default, no subcommand):

xai

Chat with Grok:

xai chat -t "What is the answer to life?"

Legacy text completion:

xai complete -p "Once upon a time" --max-tokens 200

Create embeddings:

xai embed -t "Hello, world!"

List all models:

xai models

Get details for a specific model:

xai model -m grok-4

Show API key info:

xai apikey

🎨 Options

Option       Description
(none)       Launch TUI mode.
--api-key    xAI API key (overrides the XAI_API_KEY env var).
--model      Model to use (overrides the XAI_MODEL env var).
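The precedence described above (an explicit flag overrides the environment variable) can be sketched as follows; the helper function is an illustration, not part of the crate's API:

```rust
use std::env;

// Hypothetical sketch of option precedence:
// an explicit --api-key value wins over the XAI_API_KEY env var.
fn resolve_api_key(flag: Option<&str>) -> Option<String> {
    flag.map(str::to_string)
        .or_else(|| env::var("XAI_API_KEY").ok())
}

fn main() {
    match resolve_api_key(Some("xai-from-flag")) {
        Some(key) => println!("using key: {key}"),
        None => eprintln!("no API key configured"),
    }
}
```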

🛠 Subcommands

Subcommand   Description
chat         Chat with a Grok model.
complete     Legacy text completion.
embed        Create text embeddings.
models       List all available models.
model        Get details for a specific model.
apikey       Show API key information.

✨ Usage as SDK

Add to your Cargo.toml:

[dependencies]
x-ai = "0.1.0"
tokio = { version = "1", features = ["full"] }

Fetch API Key Information 🔑

use std::env;
use x_ai::api_key::ApiKeyRequestBuilder;
use x_ai::client::XaiClient;
use x_ai::traits::{ApiKeyFetcher, ClientConfig};

#[tokio::main]
async fn main() {
    let client = XaiClient::builder()
        .build()
        .expect("Failed to build XaiClient");

    client.set_api_key(
        env::var("XAI_API_KEY")
            .expect("XAI_API_KEY must be set!")
            .to_string(),
    );

    let builder = ApiKeyRequestBuilder::new(client.clone());
    let result = builder.fetch_api_key_info().await;
    println!("{:?}", result.unwrap());
}

Chat Completions 💬

use std::env;
use x_ai::chat_compl::{ChatCompletionsRequestBuilder, Message};
use x_ai::client::XaiClient;
use x_ai::traits::{ChatCompletionsFetcher, ClientConfig};

#[tokio::main]
async fn main() {
    let client = XaiClient::builder()
        .build()
        .expect("Failed to build XaiClient");

    client.set_api_key(
        env::var("XAI_API_KEY")
            .expect("XAI_API_KEY must be set!")
            .to_string(),
    );

    let messages = vec![
        Message::text("system", "You are Grok, a chatbot inspired by the Hitchhiker's Guide to the Galaxy."),
        Message::text("user", "What is the answer to life and the universe?"),
    ];

    let builder = ChatCompletionsRequestBuilder::new(client.clone(), "grok-4".to_string(), messages);
    let request = builder.clone().build().unwrap();
    let completion = builder.create_chat_completion(request).await.unwrap();
    println!("Response: {}", completion.choices[0].message.content);
}

Text Completions 📝

use std::env;
use x_ai::client::XaiClient;
use x_ai::completions::CompletionsRequestBuilder;
use x_ai::traits::{ClientConfig, CompletionsFetcher};

#[tokio::main]
async fn main() {
    let client = XaiClient::builder()
        .build()
        .expect("Failed to build XaiClient");

    client.set_api_key(
        env::var("XAI_API_KEY")
            .expect("XAI_API_KEY must be set!")
            .to_string(),
    );

    let builder = CompletionsRequestBuilder::new(
        client.clone(),
        "grok-4".to_string(),
        "What is AI?".to_string(),
    )
    .max_tokens(50);

    let request = builder.clone().build().unwrap();
    let completion = builder.create_completions(request).await.unwrap();
    println!("{}", completion.choices[0].text);
}

Embedding Creation 📊

use std::env;
use x_ai::client::XaiClient;
use x_ai::embedding::EmbeddingRequestBuilder;
use x_ai::traits::{ClientConfig, EmbeddingFetcher};

#[tokio::main]
async fn main() {
    let client = XaiClient::builder()
        .build()
        .expect("Failed to build XaiClient");

    client.set_api_key(
        env::var("XAI_API_KEY")
            .expect("XAI_API_KEY must be set!")
            .to_string(),
    );

    let builder = EmbeddingRequestBuilder::new(
        client.clone(),
        "grok-4".to_string(),
        vec!["Hello, world!".to_string()],
        "float".to_string(),
    );

    let request = builder.clone().build().unwrap();
    let embedding = builder.create_embedding(request).await.unwrap();
    println!("{:?}", embedding.data);
}
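A common next step after creating embeddings is comparing two vectors with cosine similarity. The helper below is a generic sketch, not part of the x-ai crate:

```rust
// Cosine similarity of two embedding vectors: dot product divided
// by the product of their magnitudes; returns 0.0 for zero vectors.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

fn main() {
    let a = [1.0, 0.0];
    let b = [0.0, 1.0];
    println!("similarity: {}", cosine_similarity(&a, &b));
}
```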

List Models 📜

use std::env;
use x_ai::client::XaiClient;
use x_ai::list_mod::ReducedModelListRequestBuilder;
use x_ai::traits::{ClientConfig, ListModelFetcher};

#[tokio::main]
async fn main() {
    let client = XaiClient::builder()
        .build()
        .expect("Failed to build XaiClient");

    client.set_api_key(
        env::var("XAI_API_KEY")
            .expect("XAI_API_KEY must be set!")
            .to_string(),
    );

    let builder = ReducedModelListRequestBuilder::new(client.clone());
    let models = builder.fetch_model_info().await.unwrap();
    for model in models.data {
        println!("{} (owned by: {})", model.id, model.owned_by);
    }
}

🤝 Contributing

Contributions and feedback are welcome! If you'd like to contribute, report an issue, or suggest an enhancement, please engage with the project on GitHub. Your contributions help improve this crate for the community.

📄 License

This project is licensed under the MIT License.

© 2026 Wise AI Foundation