<div align="center">
# ✨ 𝕏-AI
[CircleCI](https://dl.circleci.com/status-badge/redirect/circleci/2rCDyKJRUEePhb1wtzWHqR/wCPbiPVjGAET17HXk9U2d/tree/main)
[crates.io](https://crates.io/crates/x-ai)
[docs.rs](https://docs.rs/x-ai/)
[License](LICENSE)

</div>
> ✨ X-AI: A CLI, TUI, and SDK for interacting with the [xAI Grok API](https://docs.x.ai/api/), allowing you to chat with Grok models, create embeddings, and inspect your API key and available models.
## 📖 Table of Contents
- [Installation](#-installation)
- [Features](#-features)
- [Environment Variables](#environment-variables)
- [Usage as CLI](#-usage-as-cli)
- [Options](#-options)
- [Subcommands](#-subcommands)
- [Usage as SDK](#-usage-as-sdk)
- [Contributing](#-contributing)
- [License](#-license)
## 🚀 Installation
To install the `xai` CLI:
```bash
cargo install x-ai --all-features
```
## ✨ Features
- Interactive TUI with Settings, Chat, History, and Model Info tabs
- Chat with Grok models (`grok-4` by default)
- Legacy text completions
- Create text embeddings
- List all available models
- Inspect a specific model's details
- Fetch API key information
- Deferred Chat Completions
- Full Responses API (Create, Get, Delete)
## Environment Variables
Before using the CLI or SDK, export your API key:
```bash
export XAI_API_KEY=<your_xai_api_key>
export XAI_MODEL=grok-4 # optional, defaults to grok-4
```
Generate an API key from the [xAI Console](https://console.x.ai).
## ⌨️ Usage as CLI
### Launch TUI (default, no subcommand):
```sh
xai
```
### Chat with Grok:
```sh
xai chat -t "What is the answer to life?"
```
### Legacy text completion:
```sh
xai complete -p "Once upon a time" --max-tokens 200
```
### Create embeddings:
```sh
xai embed -t "Hello, world!"
```
### List all models:
```sh
xai models
```
### Get details for a specific model:
```sh
xai model -m grok-4
```
### Show API key info:
```sh
xai apikey
```
## 🎨 Options
| Option | Description |
| --- | --- |
| _(none)_ | Launch TUI mode. |
| `--api-key` | xAI API key (overrides `XAI_API_KEY` env var). |
| `--model` | Model to use (overrides `XAI_MODEL` env var). |
## 🛠 Subcommands
| Subcommand | Description |
| --- | --- |
| `chat` | Chat with a Grok model. |
| `complete` | Legacy text completion. |
| `embed` | Create text embeddings. |
| `models` | List all available models. |
| `model` | Get details for a specific model. |
| `apikey` | Show API key information. |
## ✨ Usage as SDK
Add to your `Cargo.toml`:
```toml
[dependencies]
x-ai = "0.1.0"
tokio = { version = "1", features = ["full"] }
```
### Fetch API Key Information 🔑
```rust,ignore
use std::env;
use x_ai::api_key::ApiKeyRequestBuilder;
use x_ai::client::XaiClient;
use x_ai::traits::{ApiKeyFetcher, ClientConfig};

#[tokio::main]
async fn main() {
    let client = XaiClient::builder()
        .build()
        .expect("Failed to build XaiClient");

    client.set_api_key(
        env::var("XAI_API_KEY")
            .expect("XAI_API_KEY must be set!")
            .to_string(),
    );

    let builder = ApiKeyRequestBuilder::new(client.clone());
    let result = builder.fetch_api_key_info().await;
    println!("{:?}", result.unwrap());
}
```
### Chat Completions 💬
```rust,ignore
use std::env;
use x_ai::chat_compl::{ChatCompletionsRequestBuilder, Message};
use x_ai::client::XaiClient;
use x_ai::traits::{ChatCompletionsFetcher, ClientConfig};

#[tokio::main]
async fn main() {
    let client = XaiClient::builder()
        .build()
        .expect("Failed to build XaiClient");

    client.set_api_key(
        env::var("XAI_API_KEY")
            .expect("XAI_API_KEY must be set!")
            .to_string(),
    );

    let messages = vec![
        Message::text(
            "system",
            "You are Grok, a chatbot inspired by the Hitchhiker's Guide to the Galaxy.",
        ),
        Message::text("user", "What is the answer to life and the universe?"),
    ];

    let builder =
        ChatCompletionsRequestBuilder::new(client.clone(), "grok-4".to_string(), messages);
    let request = builder.clone().build().unwrap();
    let completion = builder.create_chat_completion(request).await.unwrap();
    println!("Response: {}", completion.choices[0].message.content);
}
```
### Text Completions 📝
```rust,ignore
use std::env;
use x_ai::client::XaiClient;
use x_ai::completions::CompletionsRequestBuilder;
use x_ai::traits::{ClientConfig, CompletionsFetcher};

#[tokio::main]
async fn main() {
    let client = XaiClient::builder()
        .build()
        .expect("Failed to build XaiClient");

    client.set_api_key(
        env::var("XAI_API_KEY")
            .expect("XAI_API_KEY must be set!")
            .to_string(),
    );

    let builder = CompletionsRequestBuilder::new(
        client.clone(),
        "grok-4".to_string(),
        "What is AI?".to_string(),
    )
    .max_tokens(50);

    let request = builder.clone().build().unwrap();
    let completion = builder.create_completions(request).await.unwrap();
    println!("{}", completion.choices[0].text);
}
```
### Embedding Creation 📊
```rust,ignore
use std::env;
use x_ai::client::XaiClient;
use x_ai::embedding::EmbeddingRequestBuilder;
use x_ai::traits::{ClientConfig, EmbeddingFetcher};

#[tokio::main]
async fn main() {
    let client = XaiClient::builder()
        .build()
        .expect("Failed to build XaiClient");

    client.set_api_key(
        env::var("XAI_API_KEY")
            .expect("XAI_API_KEY must be set!")
            .to_string(),
    );

    let builder = EmbeddingRequestBuilder::new(
        client.clone(),
        "grok-4".to_string(),
        vec!["Hello, world!".to_string()],
        "float".to_string(),
    );

    let request = builder.clone().build().unwrap();
    let embedding = builder.create_embedding(request).await.unwrap();
    println!("{:?}", embedding.data);
}
```
### List Models 📜
```rust,ignore
use std::env;
use x_ai::client::XaiClient;
use x_ai::list_mod::ReducedModelListRequestBuilder;
use x_ai::traits::{ClientConfig, ListModelFetcher};

#[tokio::main]
async fn main() {
    let client = XaiClient::builder()
        .build()
        .expect("Failed to build XaiClient");

    client.set_api_key(
        env::var("XAI_API_KEY")
            .expect("XAI_API_KEY must be set!")
            .to_string(),
    );

    let builder = ReducedModelListRequestBuilder::new(client.clone());
    let models = builder.fetch_model_info().await.unwrap();
    for model in models.data {
        println!("• {} (owned by: {})", model.id, model.owned_by);
    }
}
```
## 🤝 Contributing
Contributions and feedback are welcome! If you'd like to contribute, report an issue, or suggest an enhancement, please engage with the project on [GitHub](https://github.com/wiseaidotdev/x-ai). Your contributions help improve this crate for the community.
## 📄 License
This project is licensed under the [MIT License](LICENSE).
© 2026 Wise AI Foundation