# ThinkingLanguage
A purpose-built language for Data Engineering & AI — Modern Problems need Modern Solutions.
ThinkingLanguage (TL) replaces the fragile Python + SQL + YAML + Spark glue-code stack with a single, coherent language where data pipelines, transformations, AI model training, and real-time streaming are first-class language constructs.
## Highlights
- Native tables — columnar data backed by Apache Arrow/DataFusion with pipe-based transforms
- AI/ML built-in — tensors, model training (linfa), ONNX inference, embeddings, LLM APIs
- Streaming & Pipelines — ETL/ELT constructs, windowed streams, Kafka integration
- GPU acceleration — wgpu-based tensor operations on Vulkan/Metal/DX12/WebGPU
- Multiple backends — bytecode VM (default), LLVM AOT native compilation, WASM browser target
- Gradual typing — optional type annotations, generics, traits, `Result`/`Option` with the `?` operator
- Ownership semantics — pipe-as-move, `.clone()`, read-only `&ref`, use-after-move detection
- Rich tooling — LSP server, VS Code extension, formatter, linter, doc generator, package manager
## Installation

### Pre-built binaries (recommended)
Download the latest release from GitHub Releases.
**Linux (x86_64)** and **macOS (Apple Silicon):** download the archive for your platform from the releases page, extract it, and add the `tl` binary to your `PATH`.

**Windows:** download `tl-x86_64-pc-windows-msvc.zip` from the releases page, extract it, and add `tl.exe` to your `PATH`.
### From crates.io

```sh
cargo install thinkinglanguage

# With SQLite support
cargo install thinkinglanguage --features sqlite
```
### From source
## Quick Start

```sh
# Run a script
tl run examples/hello.tl

# Start the REPL
tl shell
```
## A Taste of TL

### Variables and Functions

```
let name = "world"
let nums = [1, 2, 3, 4, 5]

fn greet(who: string) -> string {
    "Hello, {who}!"
}

let doubled = nums |> map((x) => x * 2) |> filter((x) => x > 4)

print(greet(name))   // Hello, world!
print(doubled)       // [6, 8, 10]
```
### Data Pipelines

```
let users = read_csv("users.csv")

users
  |> filter(age > 30)
  |> select(name, age, department)
  |> with { senior = age > 35 }
  |> aggregate(by: department, count: count(), avg_age: avg(age))
  |> sort(count, "desc")
  |> show()
```
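With the corresponding feature flags enabled, the same pipeline style should extend to the database connectors; a hedged sketch — `read_sql` and its connection-string form are assumed names, not confirmed API:

```
// Hypothetical: read from a SQLite connector, then pipe as usual.
let orders = read_sql("sqlite://app.db", "SELECT * FROM orders")

orders
  |> filter(total > 100)
  |> write_parquet("big_orders.parquet")
```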
### AI / Machine Learning

```
let data = read_csv("iris.csv")
let model = train_model(data, target: "species", algorithm: "random_forest")
let predictions = predict(model, new_data)

let t = tensor([[1.0, 2.0], [3.0, 4.0]])
let result = t |> matmul(tensor([[5.0], [6.0]]))   // [[17.0], [39.0]]
```
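ONNX inference, embeddings, and LLM APIs are also listed as built-ins; a speculative sketch of how they might be called — `load_onnx` and `embed` are assumed names, not confirmed API:

```
// Hypothetical: run a pre-trained ONNX model.
let session = load_onnx("sentiment.onnx")
let scores = predict(session, inputs)

// Hypothetical: compute text embeddings.
let vecs = embed(["hello world", "data engineering"])
```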
### Streaming

```
pipeline etl_daily {
    schedule: "0 6 * * *"
    steps {
        extract   read_csv("raw_data.csv")
        transform |> filter(status == "active") |> select(id, name, score)
        load      write_parquet("clean_data.parquet")
    }
}
```
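The highlights also mention windowed streams and Kafka integration; a hedged sketch of what that might look like — `from_kafka` and `window` are assumed names:

```
// Hypothetical: consume a Kafka topic as a stream.
stream clicks from_kafka("localhost:9092", topic: "clicks")

clicks
  |> window(size: "1m")                     // tumbling one-minute windows
  |> aggregate(by: page, views: count())
  |> show()
```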
## Architecture

```
Source (.tl)
     │
     ▼
┌─────────┐    ┌──────────┐    ┌─────────┐    ┌───────────┐
│  Lexer  │───▶│  Parser  │───▶│   AST   │───▶│ Compiler  │
│ (logos) │    │(rec.desc)│    │         │    │           │
└─────────┘    └──────────┘    └─────────┘    └─────┬─────┘
                                                    │
               ┌───────────────────┬────────────────┴──┐
               ▼                   ▼                   ▼
          ┌──────────┐       ┌───────────┐       ┌────────────┐
          │ Bytecode │       │   TL-IR   │       │    LLVM    │
          │    VM    │       │ Optimizer │       │  Backend   │
          │(default) │       │           │       │   (AOT)    │
          └──────────┘       └───────────┘       └────────────┘
                                   │                   │
                                   ▼                   ▼
                             ┌──────────┐        ┌────────────┐
                             │   WASM   │        │   Native   │
                             │ Backend  │        │   Binary   │
                             └──────────┘        └────────────┘
```
## Feature Flags

TL uses Cargo feature flags for optional integrations:

| Flag | Description | Dependencies |
|---|---|---|
| `sqlite` | SQLite connector | `rusqlite` (bundled) |
| `mysql` | MySQL connector | `mysql_async` |
| `redis` | Redis connector | `redis` |
| `s3` | S3 object storage | `aws-sdk-s3` |
| `kafka` | Kafka streaming | `rdkafka` |
| `python` | Python FFI bridge | `pyo3` |
| `gpu` | GPU tensor operations | `wgpu`, `bytemuck` |
| `llvm-backend` | LLVM AOT compilation | `inkwell`, `llvm-sys` |
| `async-runtime` | Tokio async I/O | `tokio`, `reqwest` |
| `notebook` | Interactive notebook TUI | `ratatui`, `crossterm` |
| `registry` | Package registry client | `reqwest` |
```sh
# Build with specific features
cargo build --features "sqlite,kafka"

# Build with all features (except llvm-backend, which needs LLVM 19 installed)
cargo build --features "sqlite,mysql,redis,s3,kafka,python,gpu,async-runtime,notebook,registry"
```
## CLI Commands

| Command | Description |
|---|---|
| `tl run <file>` | Execute a `.tl` source file |
| `tl shell` | Start the interactive REPL |
| `tl check <file>` | Type-check without executing |
| `tl test <path>` | Run tests in a file or directory |
| `tl fmt <path>` | Format source files |
| `tl lint <path>` | Lint for style and correctness |
| `tl build` | Build the current project (requires `tl.toml`) |
| `tl init <name>` | Initialize a new TL project |
| `tl doc <path>` | Generate documentation |
| `tl disasm <file>` | Disassemble bytecode |
| `tl compile <file>` | Compile to a native object file (LLVM) |
| `tl deploy <file>` | Generate deployment artifacts |
| `tl lineage <file>` | Show data lineage |
| `tl notebook <file>` | Open the interactive notebook |
| `tl lsp` | Start the LSP server |
| `tl models list\|info\|delete` | Manage the model registry |
| `tl migrate apply\|check\|diff` | Schema migration |
| `tl add <pkg>` | Add a dependency |
| `tl remove <pkg>` | Remove a dependency |
| `tl install` | Install all dependencies |
| `tl update [pkg]` | Update dependencies (with version diffs) |
| `tl update --dry-run` | Preview dependency updates without changes |
| `tl outdated` | Show outdated dependencies |
| `tl publish` | Publish to the package registry |
| `tl search <query>` | Search the package registry |
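A typical project workflow with the commands above might look like this transcript (the package name is illustrative, not a real registry entry):

```
tl init analytics && cd analytics
tl add http-client        # hypothetical package name
tl check src/main.tl
tl test .
tl build
```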
## Project Structure

```
crates/
  tl-lexer/        Tokenization (logos)
  tl-ast/          Abstract syntax tree
  tl-parser/       Recursive descent parser
  tl-interpreter/  Tree-walking interpreter
  tl-compiler/     Bytecode compiler + register VM
  tl-types/        Type system and checker
  tl-errors/       Error types and diagnostics
  tl-data/         Data engine (DataFusion, Arrow)
  tl-ai/           AI/ML (ndarray, linfa, ONNX)
  tl-stream/       Streaming and pipelines
  tl-ir/           Intermediate representation optimizer
  tl-llvm/         LLVM AOT backend (inkwell)
  tl-wasm/         WASM browser backend
  tl-gpu/          GPU tensor operations (wgpu)
  tl-lsp/          Language Server Protocol
  tl-package/      Package manager
  tl-registry/     Package registry server
  tl-cli/          CLI entry point
editors/
  vscode/          VS Code extension
benchmarks/        Criterion benchmarks
examples/          Example .tl programs
docs/              Documentation
```
## Documentation
Detailed documentation is available in the docs/ directory:
- Getting Started — Installation, quickstart, editor setup
- Language Guide — Syntax, types, functions, structs, generics, error handling
- Data Engineering — Tables, connectors, streaming, data quality
- AI & ML — Tensors, model training, ONNX inference
- Tools & Ecosystem — CLI, package manager, notebook, LSP
- Advanced Topics — Backends, Python FFI, security, schema evolution
## Tests

```sh
# Run all tests (excluding feature-gated crates)
cargo test --workspace

# With specific features
cargo test --workspace --features sqlite
```
## License
MIT OR Apache-2.0