Sema is a Scheme-like Lisp where prompts are s-expressions, conversations are persistent data structures, and LLM calls are just another form of evaluation. It combines a Scheme core with Clojure-style keywords (:foo), map literals ({:key val}), and vector literals ([1 2 3]).
## What It Looks Like
A coding agent with file tools, safety checks, and budget tracking — in ~40 lines:
```scheme
;; Define tools the LLM can call
(deftool read-file
  "Read a file's contents"
  {:path {:type :string :description "File path"}}
  (lambda (path)
    (if (file/exists? path) (file/read path) "File not found")))

(deftool edit-file
  "Replace text in a file"
  {:path {:type :string} :old {:type :string} :new {:type :string}}
  (lambda (path old new)
    (file/write path (string/replace (file/read path) old new))
    "Done"))

(deftool run-command
  "Run a shell command"
  {:command {:type :string :description "Shell command to run"}}
  (lambda (command) (:stdout (shell "sh" "-c" command))))

;; Create an agent with tools, system prompt, and turn limit
(defagent coder
  {:system (format "You are a coding assistant. Working directory: ~a" (sys/cwd))
   :tools [read-file edit-file run-command]
   :model "claude-sonnet-4-20250514"
   :max-turns 20})

;; Run it — budget is scoped, automatically restored after the block
(llm/with-budget {:max-cost-usd 0.50} (lambda ()
  (define result (agent/run coder "Add error handling to src/main.rs"))
  (println (:response result))
  (println (format "Cost: $~a" (:spent (llm/budget-remaining))))))
```
## Key Features
```scheme
;; Simple completion
(llm/complete "Explain monads in one sentence")

;; Structured data extraction — returns a map, not a string
(llm/extract
  {:vendor {:type :string} :amount {:type :number} :date {:type :string}}
  "Bought coffee for $4.50 at Blue Bottle on Jan 15")
;; => {:amount 4.5 :date "2025-01-15" :vendor "Blue Bottle"}

;; Classification
(llm/classify [:positive :negative :neutral] "This product is amazing!")
;; => :positive
```
```scheme
;; Multi-turn conversations as immutable data
(define conv (conversation/new {:model "claude-haiku-4-5-20251001"}))
(define conv (conversation/say conv "The secret number is 7"))
(define conv (conversation/say conv "What's the secret number?"))
(conversation/last-reply conv) ;; => "The secret number is 7."
```
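Because each step returns a *new* conversation value, earlier bindings are never mutated. A minimal Python model of that persistence (data shape only — the real `conversation/say` also performs an LLM call; the class name is invented):

```python
from dataclasses import dataclass

# Each operation returns a fresh Conversation; prior values stay intact.
@dataclass(frozen=True)
class Conversation:
    messages: tuple = ()

    def say(self, role, content):
        return Conversation(self.messages + ((role, content),))

c0 = Conversation()
c1 = c0.say("user", "The secret number is 7")
c2 = c1.say("user", "What's the secret number?")

print(len(c0.messages), len(c1.messages), len(c2.messages))  # 0 1 2
```

Sharing the underlying tuple makes old versions cheap to keep around, which is what lets you fork a conversation from any point.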
```scheme
;; Streaming
(llm/stream "Tell me a story" {:max-tokens 500})

;; Batch — all prompts sent concurrently
(llm/batch ["Translate 'hello' to French"
            "Translate 'hello' to Spanish"
            "Translate 'hello' to German"])
```
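Concurrent batching of this shape — fan the prompts out in parallel, collect results (assumed here to come back in prompt order) — can be sketched in Python with a thread pool and a stand-in for the API call:

```python
from concurrent.futures import ThreadPoolExecutor

def fake_complete(prompt):          # stand-in for a real LLM call
    return prompt.upper()

def batch(prompts, complete=fake_complete, max_workers=8):
    # Executor.map runs calls concurrently but preserves input order.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(complete, prompts))

print(batch(["bonjour", "hola", "hallo"]))  # ['BONJOUR', 'HOLA', 'HALLO']
```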
```scheme
;; Vision — extract structured data from images
(llm/extract-from-image
  {:text :string :background_color :string}
  "assets/logo.png")
;; => {:background_color "white" :text "Sema"}

;; Multi-modal chat — send images in messages
(define img (file/read-bytes "photo.jpg"))
(llm/chat [(message/with-image :user "Describe this image." img)])

;; Cost tracking
(llm/set-budget 1.00)
(llm/budget-remaining) ;; => {:limit 1.0 :spent 0.05 :remaining 0.95}
```
```scheme
;; Response caching — avoid duplicate API calls during development
(llm/with-cache (lambda ()
  (llm/complete "Explain monads")))
```
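A response cache of this kind is essentially memoization keyed on the request. A Python sketch (invented helper names, not Sema's internals — hashing prompt plus parameters keeps distinct requests distinct):

```python
import hashlib, json

_cache = {}

def cached_complete(prompt, complete, **params):
    # Key on the full request so different params don't collide.
    key = hashlib.sha256(
        json.dumps([prompt, params], sort_keys=True).encode()).hexdigest()
    if key not in _cache:
        _cache[key] = complete(prompt, **params)  # hit the API only on a miss
    return _cache[key]

calls = []
def fake_complete(prompt):
    calls.append(prompt)
    return "a monad is ..."

cached_complete("Explain monads", fake_complete)
cached_complete("Explain monads", fake_complete)
print(len(calls))  # 1 — the second call was served from cache
```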
```scheme
;; Fallback chains — automatic provider failover
(llm/with-fallback [:anthropic :openai :groq]
  (lambda () (llm/complete "Hello")))
```
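Failover of this shape is a loop over providers that returns the first success and raises only if every provider fails. A Python sketch with a stand-in provider call:

```python
def with_fallback(providers, call):
    errors = []
    for p in providers:
        try:
            return call(p)            # first success wins
        except Exception as e:
            errors.append((p, e))     # remember why each provider failed
    raise RuntimeError(f"all providers failed: {errors}")

def flaky(provider):                  # stand-in: first provider is down
    if provider == "anthropic":
        raise ConnectionError("rate limited")
    return f"hello from {provider}"

print(with_fallback(["anthropic", "openai", "groq"], flaky))
# hello from openai
```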
```scheme
;; In-memory vector store for semantic search (RAG)
(vector-store/create "docs")
(vector-store/add "docs" "id" (llm/embed "text") {:source "file.txt"})
(vector-store/search "docs" (llm/embed "query") 5)
```
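An in-memory vector store boils down to similarity ranking over stored embeddings. A toy Python sketch using cosine similarity, with hand-made two-dimensional "embeddings" standing in for real `llm/embed` output:

```python
import math

store = []  # entries of (id, vector, metadata)

def add(doc_id, vec, meta=None):
    store.append((doc_id, vec, meta or {}))

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

def search(query_vec, k):
    # Rank every stored vector by similarity to the query, keep top k.
    ranked = sorted(store, key=lambda e: cosine(query_vec, e[1]), reverse=True)
    return [(doc_id, meta) for doc_id, _, meta in ranked[:k]]

add("a", [1.0, 0.0], {"source": "a.txt"})
add("b", [0.0, 1.0], {"source": "b.txt"})
print(search([0.9, 0.1], 1))  # [('a', {'source': 'a.txt'})]
```

A linear scan like this is fine for small corpora; real stores swap in an approximate-nearest-neighbor index once the collection grows.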
```scheme
;; Text chunking for LLM pipelines
(text/chunk long-document {:size 500 :overlap 100})
```
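Fixed-size chunking with overlap means each chunk starts `size - overlap` units after the previous one. A character-based Python sketch (assumes `size > overlap`; Sema's `text/chunk` may differ in details such as token- vs. character-counting):

```python
def chunk(text, size, overlap):
    step = size - overlap  # how far each chunk's start advances
    return [text[i:i + size]
            for i in range(0, max(len(text) - overlap, 1), step)]

parts = chunk("abcdefghij", size=4, overlap=2)
print(parts)  # ['abcd', 'cdef', 'efgh', 'ghij']
```

The overlap is what keeps a sentence that straddles a boundary fully visible in at least one chunk.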
```scheme
;; Prompt templates
(prompt/render "Hello {{name}}" {:name "Alice"})
;; => "Hello Alice"
```
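A minimal model of `{{name}}` substitution in Python (illustrative only; `prompt/render` may support more than plain replacement):

```python
import re

def render(template, vars):
    # Replace each {{word}} with the matching value from vars.
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(vars[m.group(1)]), template)

print(render("Hello {{name}}", {"name": "Alice"}))  # Hello Alice
```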
```scheme
;; Persistent key-value store
(kv/open "cache" "cache.json")
(kv/set "cache" "key" {:data "value"})
(kv/get "cache" "key")
```
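A JSON-backed key-value store can be as simple as a dict flushed to disk on every write, so values survive restarts. A Python sketch (not Sema's implementation; the `KV` class is invented):

```python
import json, os, tempfile

class KV:
    def __init__(self, path):
        self.path = path
        # Reload any data a previous run left behind.
        self.data = json.load(open(path)) if os.path.exists(path) else {}

    def set(self, key, value):
        self.data[key] = value
        with open(self.path, "w") as f:
            json.dump(self.data, f)   # persist on every write

    def get(self, key, default=None):
        return self.data.get(key, default)

path = os.path.join(tempfile.mkdtemp(), "cache.json")
KV(path).set("key", {"data": "value"})
print(KV(path).get("key"))  # {'data': 'value'} — read back from disk
```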
## Supported Providers
All providers are auto-configured from environment variables — just set the API key and go.
| Provider | Chat | Stream | Tools | Embeddings | Vision |
|---|---|---|---|---|---|
| Anthropic | ✅ | ✅ | ✅ | — | ✅ |
| OpenAI | ✅ | ✅ | ✅ | ✅ | ✅ |
| Google Gemini | ✅ | ✅ | ✅ | — | ✅ |
| Ollama | ✅ | ✅ | ✅ | — | ✅ |
| Groq | ✅ | ✅ | ✅ | — | — |
| xAI | ✅ | ✅ | ✅ | — | — |
| Mistral | ✅ | ✅ | ✅ | — | — |
| Moonshot | ✅ | ✅ | ✅ | — | — |
| Jina | — | — | — | ✅ | — |
| Voyage | — | — | — | ✅ | — |
| Cohere | — | — | — | ✅ | — |
| Any OpenAI-compat | ✅ | ✅ | ✅ | — | ✅ |
| Custom (Lisp) | ✅ | — | ✅ | — | — |
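Auto-configuration from environment variables usually means checking which provider keys are present and picking accordingly. A Python sketch of that pattern — the variable names below are the providers' conventional ones and are assumptions here, not documented Sema behavior:

```python
import os

# Assumed key names, checked in priority order.
PROVIDER_KEYS = [
    ("anthropic", "ANTHROPIC_API_KEY"),
    ("openai", "OPENAI_API_KEY"),
    ("gemini", "GEMINI_API_KEY"),
    ("groq", "GROQ_API_KEY"),
]

def detect_provider(env=os.environ):
    for name, var in PROVIDER_KEYS:
        if env.get(var):
            return name               # first provider with a key set wins
    return None

print(detect_provider({"OPENAI_API_KEY": "sk-..."}))  # openai
```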
## It's Also a Real Lisp
460+ built-in functions, tail-call optimization, macros, modules, error handling — not a toy.
```scheme
;; Closures, higher-order functions, TCO
(define (fibonacci n)
  (let loop ((i 0) (a 0) (b 1))
    (if (= i n) a (loop (+ i 1) b (+ a b)))))
(fibonacci 50) ;; => 12586269025

;; Maps, keywords-as-functions, destructuring
(define person {:name "Ada" :age 36 :langs ["Lisp" "Rust"]})
(:name person) ;; => "Ada"

;; Functional pipelines
(->> (range 1 100)
     (filter even?)
     (map (fn (x) (* x x)))
     (take 5))
;; => (4 16 36 64 100)

;; Macros
(defmacro unless (test . body)
  `(if ,test nil (begin ,@body)))

;; Modules
(module utils (export square)
  (define (square x) (* x x)))

;; HTTP, JSON, regex, file I/O, crypto, CSV, datetime...
(define data (json/decode (http/get "https://api.example.com/data")))
```
📖 Full language reference, stdlib docs, and more examples at sema-lang.com/docs
## Try It Now
sema.run — Browser-based playground with 20+ example programs. No install required. Runs entirely in WebAssembly.
## Installation

Pre-built binaries are available via a shell installer on macOS / Linux, a PowerShell installer on Windows, and Homebrew; no Rust toolchain is required. You can also install from crates.io, or build from source, which leaves the binary at `target/release/sema`.
### Shell Completions

Sema can generate tab-completion scripts for Zsh, Bash, and Fish.
📖 Full setup instructions for all shells: sema-lang.com/docs/shell-completions
📖 Full CLI reference, flags, and REPL commands: sema-lang.com/docs/cli
## Editor Support
| Editor | Install |
|---|---|
| VS Code | cd editors/vscode/sema && npx @vscode/vsce package then install .vsix |
| Vim / Neovim | Plug 'helgesverre/sema', { 'rtp': 'editors/vim' } |
| Emacs | (require 'sema-mode) — see docs |
| Helix | Copy languages.toml + query files — see docs |
All editors provide syntax highlighting for 460+ builtins, special forms, keyword literals, character literals, LLM primitives, and more.
📖 Full installation instructions: sema-lang.com/docs/editors
## Example Programs
The examples/ directory has 50+ programs:
| Example | What it does |
|---|---|
| `coding-agent.sema` | Full coding agent with file editing, search, and shell tools |
| `review.sema` | AI code reviewer for git diffs |
| `commit-msg.sema` | Generate conventional commit messages from staged changes |
| `summarize.sema` | Summarize files or piped input |
| `game-of-life.sema` | Conway's Game of Life |
| `brainfuck.sema` | Brainfuck interpreter |
| `mandelbrot.sema` | ASCII Mandelbrot set |
| `json-api.sema` | Fetch and process JSON APIs |
| `test-vision.sema` | Vision extraction and multi-modal chat tests |
| `test-extract.sema` | Structured extraction and classification |
| `test-batch.sema` | Batch/parallel LLM completions |
| `test-pipeline.sema` | Caching, budgets, rate limiting, retry, fallback chains |
| `test-text-tools.sema` | Text chunking, prompt templates, document abstraction |
| `test-vector-store.sema` | In-memory vector store with similarity search |
| `test-kv-store.sema` | Persistent JSON-backed key-value store |
## Why Sema?
- LLMs as language primitives — prompts, messages, conversations, tools, and agents are first-class data types, not string templates bolted on
- Multi-provider — swap between Anthropic, OpenAI, Gemini, Ollama, any OpenAI-compatible endpoint, or define your own provider in Sema
- Pipeline-ready — response caching, fallback chains, rate limiting, retry with backoff, text chunking, prompt templates, vector store, and a persistent KV store
- Cost-aware — built-in budget tracking with dynamic pricing from llm-prices.com
- Practical Lisp — closures, TCO, macros, modules, error handling, HTTP, file I/O, regex, JSON, and 460+ stdlib functions
- Embeddable — available on crates.io, clean Rust crate structure with a builder API
- Developer-friendly — REPL with tab completion, structured error messages with hints, and 50+ example programs
## Why Not Sema?

- No full numeric tower (rationals, bignums, complex numbers)
- No continuations (`call/cc`) or hygienic macros (`syntax-rules`)
- Single-threaded — `Rc`-based, no cross-thread sharing of values
- No JIT — tree-walking interpreter and bytecode VM, no native code generation
- No package manager — `import` resolves local files only
- Young language — solid but not battle-tested at scale
## Architecture
```
crates/
  sema-core/     NaN-boxed Value type, errors, environment
  sema-reader/   Lexer and s-expression parser
  sema-vm/       Bytecode compiler and virtual machine
  sema-eval/     Trampoline-based evaluator, special forms, modules
  sema-stdlib/   460+ built-in functions across 21 modules
  sema-llm/      LLM provider trait + multi-provider clients
  sema-wasm/     WebAssembly build for sema.run playground
  sema/          CLI binary: REPL + file runner
```
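`sema-eval` is described as trampoline-based. The core of that technique, sketched in Python (illustrative, not Sema's code): instead of recursing, a function returns a thunk for its tail call, and a flat loop keeps bouncing until a non-callable value comes back, so tail calls run in constant stack depth.

```python
def trampoline(step):
    # Keep invoking thunks until a final (non-callable) value appears.
    while callable(step):
        step = step()
    return step

def countdown(n):
    # Tail-recursive shape: return a thunk instead of calling directly.
    return (lambda: countdown(n - 1)) if n > 0 else "done"

print(trampoline(countdown(1_000_000)))  # done — no RecursionError
```

The same idea lets a tree-walking evaluator written in a non-TCO host language still honor Scheme's tail-call guarantee.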
🔬 Deep-dive into the internals: Architecture · Evaluator · Lisp Comparison
## License
MIT — see LICENSE.