# 🏛️ LMM 🦀
LMM (Large Mathematical Model) is a pure-Rust framework that models higher-dimensional reality through symbolic mathematics and physics simulation. It is inspired by the Pharaonic model of intelligence: compress the world into durable, universal equations. No training. No GPU. No API key.
| 🐧 Linux (Recommended) | 🪟 Windows | 🐳 Docker |
|---|---|---|
| Download the `lmm` binary | Download the `lmm.exe` binary | `docker pull wiseaidev/lmm` |
| `cargo install lmm --features rust-binary` | `cargo install lmm --features rust-binary` | `docker run -it wiseaidev/lmm` |
| `lmm` → launches the CLI | `lmm` → launches the CLI | Read DOCKER.md |
## 🎬 Demo
The following demonstrates the symbolic prediction engine generating coherent English sentences powered entirely by deterministic mathematical equations and structural Subject-Verb-Object grammar. No neural networks, no statistical models. The engine supports a full suite of CLI subcommands, including `predict`, `summarize`, `sentence`, `paragraph`, `essay`, and `ask`, enabling multi-paragraph construction driven entirely by mathematics.
## 🧠 What Does LMM Provide?
LMM bridges multimodal perception and actionable scientific discovery through five tightly integrated layers:
| Layer | Modules | Purpose |
|---|---|---|
| Perception | `perception.rs`, `tensor.rs` | Raw bytes → normalised tensors |
| Symbolic | `equation.rs`, `symbolic.rs`, `discovery.rs` | GP symbolic regression, differentiation, simplification |
| Physics | `physics.rs`, `simulation.rs` | ODE models + Euler / RK4 / RK45 / leapfrog integrators |
| Causal | `causal.rs` | SCM graphs, do-calculus interventions, counterfactuals |
| Cognition | `consciousness.rs`, `world.rs`, `operator.rs` | Full perceive → encode → predict → act loop |
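As a concrete illustration of the integrator family listed in the Physics row, here is a minimal, self-contained RK4 step for a two-dimensional ODE system, applied to a harmonic oscillator. This is a generic sketch of the technique, not lmm's actual `Simulatable` implementation:

```rust
// Illustrative classical RK4 step for dy/dt = f(t, y) with a 2-component state.
// This is a standalone sketch, not lmm's internal integrator.
fn rk4_step(f: impl Fn(f64, &[f64; 2]) -> [f64; 2], t: f64, y: [f64; 2], h: f64) -> [f64; 2] {
    let k1 = f(t, &y);
    let k2 = f(t + h / 2.0, &[y[0] + h / 2.0 * k1[0], y[1] + h / 2.0 * k1[1]]);
    let k3 = f(t + h / 2.0, &[y[0] + h / 2.0 * k2[0], y[1] + h / 2.0 * k2[1]]);
    let k4 = f(t + h, &[y[0] + h * k3[0], y[1] + h * k3[1]]);
    [
        y[0] + h / 6.0 * (k1[0] + 2.0 * k2[0] + 2.0 * k3[0] + k4[0]),
        y[1] + h / 6.0 * (k1[1] + 2.0 * k2[1] + 2.0 * k3[1] + k4[1]),
    ]
}

fn main() {
    // Harmonic oscillator x'' = -x, written as the system [x, v]' = [v, -x].
    let f = |_t: f64, y: &[f64; 2]| [y[1], -y[0]];
    let (mut t, mut y, h) = (0.0_f64, [1.0_f64, 0.0], 0.01);
    while t < std::f64::consts::PI {
        y = rk4_step(f, t, y, h);
        t += h;
    }
    // After integrating for ~pi seconds, x should be close to cos(pi) = -1.
    println!("x = {:.4}, v = {:.4}", y[0], y[1]);
}
```

The same step function generalises to the Lorenz or pendulum systems by swapping in a different right-hand side `f`.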
## ⚙️ Architecture
```mermaid
flowchart TD
    A["Raw Input\n(bytes / sensors)"]
    B["MultiModalPerception\n → Tensor"]
    C["Consciousness Loop\nperceive → encode → predict\nevaluate → plan (lookahead)"]
    D["WorldModel\n(RK4 physics)"]
    E["SymbolicRegression\n(GP equation search)"]
    F["CausalGraph\nintervention / counterfactual"]
    G["Expression AST\ndifferentiate / simplify"]
    A --> B --> C
    C --> D
    C --> E
    E --> G
    G --> F
    D --> F
```
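To make the cognition box concrete, the sketch below shows the general shape of a perceive → encode → predict → act cycle. Every name in it (`WorldModel` as a trait, `ConstantVelocity`, `perceive`) is hypothetical and chosen for illustration only; the crate's real loop lives in `consciousness.rs` and `world.rs`:

```rust
// A minimal, hypothetical perceive -> encode -> predict -> act loop.
// All types and functions here are illustrative, not lmm's actual API.
trait WorldModel {
    fn predict(&self, state: &[f64], dt: f64) -> Vec<f64>;
}

struct ConstantVelocity;

impl WorldModel for ConstantVelocity {
    fn predict(&self, state: &[f64], dt: f64) -> Vec<f64> {
        // position += velocity * dt; velocity unchanged.
        vec![state[0] + state[1] * dt, state[1]]
    }
}

// Perceive + encode: turn raw bytes into a normalised state in [0, 1].
fn perceive(raw: &[u8]) -> Vec<f64> {
    raw.iter().map(|&b| b as f64 / 255.0).collect()
}

fn main() {
    let state = perceive(&[128, 64]);      // perceive + encode
    let model = ConstantVelocity;
    let next = model.predict(&state, 1.0); // predict one step ahead
    let action = if next[0] > 0.5 { "brake" } else { "coast" }; // act on the prediction
    println!("next = {:?}, action = {}", next, action);
}
```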
## 🔬 Key Capabilities
- 🧬 Genetic Programming: population-based symbolic regression with template seeding (linear, quadratic, periodic) and variable-enforcement guards.
- 📐 Symbolic Calculus: automatic differentiation (chain rule, product rule, trig) and constant-folding simplification.
- 🌀 Physics Suite: Harmonic Oscillator, Lorenz Attractor, Pendulum, SIR Epidemic, N-body Gravity; all implement `Simulatable`.
- 🔢 Field Calculus: N-D gradient, Laplacian, divergence, and 3-D curl via central differences.
- 🔗 Causal Reasoning: structural causal models, `do(X=v)` interventions, and counterfactual queries.
- 🧩 Neural Operators: circular convolution with SGD kernel learning and Fourier spectral operators.
- 🔤 Text ↔ Equation: losslessly encode any text string into a symbolic equation and recover it exactly via integer residuals.
- 🔮 Symbolic Prediction: equation-native text continuation using sliding-window GP regression and vocabulary anchoring.
- 🎲 Stochastic Enhancement: synonym-bank word replacement (`--stochastic`) delivers unique output each run while preserving mathematical sentence structure.
- 🎨 Spectral Image Synthesis: generate procedural PPM images from a text prompt by hashing it into Fourier wave components.
## 📦 Installation
The `lmm` crate ships the following Cargo features:

| Feature | Description |
|---|---|
| `rust-binary` | Enables the standalone `lmm` terminal CLI executable |
| `cli` | Core CLI scaffolding (a subset of `rust-binary`) |
| `net` | Internet-aware `ask` command via DuckDuckGo search |
| `python` | Python extension module via `pyo3` / `maturin` |
| `node` | Node.js native add-on via `napi-derive` |
## 🦀 Rust
The `lmm` library is available on crates.io. For the complete API reference, installation guide, and worked examples, see the Rust usage guide.
## 💻 Command-Line Interface
The `lmm` binary supports 15 subcommands spanning simulation, discovery, encoding, prediction, summarisation, and rich text generation, all powered by pure equations.
For the full option reference and usage examples, see the CLI documentation or run `lmm --help` after installing with `cargo install lmm --features rust-binary`.
## 🐍 Python
The Python bindings are published to PyPI as `lmm-rs` and are installed with `pip install lmm-rs`. Built with `maturin`, the package ships pre-compiled wheels for major CPython versions and runs a fully embedded Tokio runtime; no `asyncio` required.
For installation instructions, configuration options, and full method signatures, see the Python usage guide.
## 🟩 Node.js
The Node.js bindings are published to npm as `@wiseaidev/lmm` and are installed with `npm install @wiseaidev/lmm`. Built with `napi-rs`, the package ships a pre-compiled `.node` add-on with TypeScript type definitions.
For installation instructions, type definitions, and examples, see the Node.js usage guide.
## 🌐 WebAssembly (WASM)
LMM natively targets `wasm32-unknown-unknown`. Because `reqwest` switches to the browser `fetch` API automatically, you can deploy LMM inside Rust frontend frameworks such as Yew, Dioxus, and Leptos without any additional glue code.
For CORS considerations, build steps, and usage details, see the WASM usage guide.
## 🤖 Agent Framework
The `lmm-agent` crate extends LMM with a fully autonomous, equation-based agent layer: no LLM, no API key, no training data.
| Document | Description |
|---|---|
| AGENT.md | Architecture, quick-start, types, and async API reference |
| DERIVE.md | `#[derive(Auto)]` macro: generated traits and field contract |
| lmm-agent README | Crate-level API reference, builder, and example |
| lmm-derive README | Macro crate details and field rules |
## 📰 Publications & Research
The architecture, formal mathematics, and paradigm are fully documented in the official whitepaper: Read the Whitepaper (PDF).
### Blog Posts
- LLMs are Useful. LMMs will Break Reality: the original post that started this project.
- Training Is An Evil Concept. LMMs Eliminates It Altogether: ethical, architectural, and data advantages of training-free models.
## 📚 Citation
If you use LMM in your research, please cite our whitepaper:
## 🤝 Contributing
Contributions are welcome! Feel free to open issues or pull requests on GitHub.
## 📜 License
Licensed under the MIT License.
## ⭐ Star Us
If you use or enjoy LMM, please leave us a star on GitHub! It helps others discover the project and keeps the momentum going ⭐.

