# LMM
LMM is a pure-Rust framework that represents higher-dimensional realities through symbolic mathematics and physics simulation, inspired by the Pharaonic model of intelligence: compress the world into durable, universal equations.
## Demo
The following is a proof-of-concept demonstration of the predictive engine generating coherent English sentences. It is powered purely by deterministic mathematical equations and structural Subject-Verb-Object loops, using the standard Linux system dictionary (`/usr/share/dict/words`) as its vocabulary fallback.

The engine supports a complete suite of text-generation CLI commands, including `summarize`, `sentence`, `paragraph`, `essay`, and `ask`, enabling sophisticated multi-paragraph construction driven entirely by mathematics.
## Framework Overview
LMM bridges multimodal perception and actionable scientific discovery through five tightly integrated layers:
| Layer | Modules | Purpose |
|---|---|---|
| Perception | `perception.rs`, `tensor.rs` | Raw bytes → normalised tensors |
| Symbolic | `equation.rs`, `symbolic.rs`, `discovery.rs` | GP symbolic regression, differentiation, simplification |
| Physics | `physics.rs`, `simulation.rs` | ODE models + Euler / RK4 / RK45 / leapfrog integrators |
| Causal | `causal.rs` | SCM graphs, do-calculus interventions, counterfactuals |
| Cognition | `consciousness.rs`, `world.rs`, `operator.rs` | Full perceive → encode → predict → act loop |
## Architecture
```mermaid
flowchart TD
    A["Raw Input\n(bytes / sensors)"]
    B["MultiModalPerception\n──▶ Tensor"]
    C["Consciousness Loop\nperceive → encode → predict\nevaluate → plan (lookahead)"]
    D["WorldModel\n(RK4 physics)"]
    E["SymbolicRegression\n(GP equation search)"]
    F["CausalGraph\nintervention / counterfactual"]
    G["Expression AST\ndifferentiate / simplify"]
    A --> B --> C
    C --> D
    C --> E
    E --> G
    G --> F
    D --> F
```
## Key Capabilities
- Genetic Programming: real population-based symbolic regression that seeds templates (linear, quadratic, periodic) and enforces variable-containing equations.
- Symbolic Calculus: automatic differentiation (chain rule, product rule) and constant-folding simplification.
- Physics Suite: harmonic, Lorenz, pendulum, SIR epidemic, and N-body gravity models, all implementing `Simulatable`.
- Field Calculus: N-D gradient, Laplacian, divergence, and 3-D curl (central differences).
- Causal Reasoning: structural causal models, `do(X=v)` interventions, counterfactual queries.
- Neural Operators: circular convolution with SGD kernel learning, Fourier spectral operators.
- Text ↔ Equation: encode any text into a symbolic equation; decode it back exactly (lossless via residuals).
- Symbolic Prediction: LMM-native text continuation via sliding-window GP regression and vocabulary anchoring.
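As a taste of the symbolic-calculus layer, here is a minimal sketch of automatic differentiation over a tiny expression AST. The `Expr`, `diff`, and `eval` names are illustrative stand-ins, not the types in `symbolic.rs`:

```rust
// Minimal sketch of symbolic differentiation (product rule + constant rules).
// `Expr`, `diff`, `eval` are illustrative names, not LMM's actual API.
#[derive(Clone, Debug)]
enum Expr {
    Const(f64),
    Var, // the single variable x
    Add(Box<Expr>, Box<Expr>),
    Mul(Box<Expr>, Box<Expr>),
}

fn diff(e: &Expr) -> Expr {
    match e {
        Expr::Const(_) => Expr::Const(0.0),
        Expr::Var => Expr::Const(1.0),
        Expr::Add(a, b) => Expr::Add(Box::new(diff(a)), Box::new(diff(b))),
        // product rule: (a*b)' = a'*b + a*b'
        Expr::Mul(a, b) => Expr::Add(
            Box::new(Expr::Mul(Box::new(diff(a)), b.clone())),
            Box::new(Expr::Mul(a.clone(), Box::new(diff(b)))),
        ),
    }
}

fn eval(e: &Expr, x: f64) -> f64 {
    match e {
        Expr::Const(c) => *c,
        Expr::Var => x,
        Expr::Add(a, b) => eval(a, x) + eval(b, x),
        Expr::Mul(a, b) => eval(a, x) * eval(b, x),
    }
}

// Derivative of 3x·x (= 3x²) evaluated at x.
fn deriv_of_3x2_at(x: f64) -> f64 {
    let e = Expr::Mul(
        Box::new(Expr::Mul(Box::new(Expr::Const(3.0)), Box::new(Expr::Var))),
        Box::new(Expr::Var),
    );
    eval(&diff(&e), x)
}

fn main() {
    // d/dx 3x² = 6x, so at x = 2 this prints 12.
    println!("{}", deriv_of_3x2_at(2.0));
}
```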
## Installation

### From Source

Build with Cargo in release mode; the binary is at `./target/release/lmm`.

### Via Cargo

> [!NOTE]
> Requires Rust 1.86+. Install via rustup.

> [!TIP]
> To enable internet-aware commands (`ask`), build with the `net` feature:

```sh
cargo build --release --features cli,net
```
## CLI Usage
## Subcommand Reference
### 1. `simulate`: Harmonic Oscillator
Runs a harmonic oscillator using the RK4 integrator.
| Flag | Default | Description |
|---|---|---|
| `-s, --step` | `0.01` | Integration step size (Δt) |
| `-t, --steps` | `100` | Number of integration steps |
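As a rough illustration of what `simulate` computes, here is a self-contained RK4 step for the undamped oscillator x'' = −x, written as the first-order system [x, v]' = [v, −x]. The function names are illustrative, not the crate's API:

```rust
// RK4 sketch for the harmonic oscillator. State is [position, velocity].
fn deriv(s: [f64; 2]) -> [f64; 2] {
    [s[1], -s[0]] // x' = v, v' = -x
}

fn rk4_step(s: [f64; 2], dt: f64) -> [f64; 2] {
    let k1 = deriv(s);
    let k2 = deriv([s[0] + 0.5 * dt * k1[0], s[1] + 0.5 * dt * k1[1]]);
    let k3 = deriv([s[0] + 0.5 * dt * k2[0], s[1] + 0.5 * dt * k2[1]]);
    let k4 = deriv([s[0] + dt * k3[0], s[1] + dt * k3[1]]);
    [
        s[0] + dt / 6.0 * (k1[0] + 2.0 * k2[0] + 2.0 * k3[0] + k4[0]),
        s[1] + dt / 6.0 * (k1[1] + 2.0 * k2[1] + 2.0 * k3[1] + k4[1]),
    ]
}

fn simulate(mut s: [f64; 2], dt: f64, steps: usize) -> [f64; 2] {
    for _ in 0..steps {
        s = rk4_step(s, dt);
    }
    s
}

fn main() {
    // Defaults from the table above: Δt = 0.01, 100 steps, so t = 1.0.
    // With x(0) = 1, v(0) = 0 the exact solution is x(t) = cos(t).
    let s = simulate([1.0, 0.0], 0.01, 100);
    println!("x(1) ≈ {}, cos(1) = {}", s[0], 1.0f64.cos());
}
```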
### 2. `physics`: Physics Model Simulation
Simulate one of four built-in physics models.
The four models:

- `lorenz`: Lorenz chaotic attractor (σ=10, ρ=28, β=8/3)
- `pendulum`: nonlinear pendulum
- `sir`: SIR epidemic model
- `harmonic`: damped harmonic oscillator (default)
| Flag | Default | Description |
|---|---|---|
| `-m, --model` | `harmonic` | Model: `lorenz`, `pendulum`, `sir`, `harmonic` |
| `-s, --steps` | `200` | Number of integration steps |
| `-z, --step-size` | `0.01` | Step size Δt |
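To make the SIR model concrete, here is a sketch of its ODEs (dS = −βSI, dI = βSI − γI, dR = γI) stepped with forward Euler. The β and γ values and the function shape are illustrative assumptions, not the crate's implementation:

```rust
// One Euler step of the SIR epidemic model; state is [S, I, R] as fractions.
fn sir_step(s: [f64; 3], beta: f64, gamma: f64, dt: f64) -> [f64; 3] {
    let [su, inf, rec] = s;
    let new_inf = beta * su * inf; // rate of new infections
    [
        su - dt * new_inf,
        inf + dt * (new_inf - gamma * inf),
        rec + dt * gamma * inf,
    ]
}

fn main() {
    let mut s = [0.99, 0.01, 0.0]; // 1% initially infected
    for _ in 0..200 {
        s = sir_step(s, 0.5, 0.1, 0.01);
    }
    // The model only moves mass between compartments, so S + I + R stays 1.
    println!("S = {:.4}, I = {:.4}, R = {:.4}, total = {}", s[0], s[1], s[2], s[0] + s[1] + s[2]);
}
```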
### 3. `discover`: Symbolic Regression
Runs Genetic Programming (GP) to discover a symbolic equation from data.
The engine fits data points `(i*0.5, 2*i*0.5 + 1)` by default and finds the underlying linear law. Increase `--iterations` for more complex datasets.
| Flag | Default | Description |
|---|---|---|
| `-d, --data-path` | `synthetic` | Data source (`synthetic` = built-in linear data) |
| `-i, --iterations` | `100` | Number of GP evolution iterations |
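The fitness that drives the search is the MDL score shown in the Architecture Deep Dive, MDL = n·ln(MSE) + complexity·ln(2). A minimal sketch (function name illustrative) shows why it prefers tight fits with small trees:

```rust
// MDL fitness sketch: lower is better. The complexity term penalises
// bloated expression trees even when they fit slightly better.
fn mdl(errors: &[f64], complexity: usize) -> f64 {
    let n = errors.len() as f64;
    let mse = errors.iter().map(|e| e * e).sum::<f64>() / n;
    n * mse.ln() + complexity as f64 * 2.0f64.ln()
}

fn main() {
    // A tight fit with a small tree beats a sloppy fit with a big tree.
    let tight = mdl(&[0.01, -0.02, 0.015], 5);
    let sloppy = mdl(&[1.0, -2.0, 1.5], 20);
    println!("tight = {tight:.2}, sloppy = {sloppy:.2}");
}
```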
### 4. `consciousness`: Perceive → Predict → Act Loop

Runs one tick of the full consciousness loop: raw bytes → perception tensor → world model prediction → action plan.
| Flag | Default | Description |
|---|---|---|
| `-l, --lookahead` | `3` | Multi-step lookahead horizon depth |
### 5. `causal`: Causal Graph + do-Calculus

Builds a 3-node Structural Causal Model (x → y → z) and applies an intervention `do(node = value)`, printing before/after values.
For example, intervene on `x` (set `x = 10`) and observe how `y` and `z` change.
The SCM is:

```text
y = 2 * x
z = y + 1
```
| Flag | Default | Description |
|---|---|---|
| `-n, --intervene-node` | `x` | Name of the node to intervene on |
| `-v, --intervene-value` | `1.0` | Value to set the node to (do-calculus) |
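The semantics of `do(x = v)` on this exact SCM can be sketched in a few lines: the intervention severs x's own mechanism and pins it to v, while the downstream mechanisms y = 2x and z = y + 1 still run. The `evaluate` function is an illustrative stand-in for the graph machinery in `causal.rs`:

```rust
// do-calculus sketch on the 3-node SCM x → y → z (y = 2x, z = y + 1).
fn evaluate(x: f64, do_x: Option<f64>) -> (f64, f64, f64) {
    // do(x = v): ignore x's natural value and pin it to v.
    let x = do_x.unwrap_or(x);
    let y = 2.0 * x;     // y's mechanism still fires
    let z = y + 1.0;     // z's mechanism still fires
    (x, y, z)
}

fn main() {
    let before = evaluate(1.0, None);      // observational: (1, 2, 3)
    let after = evaluate(1.0, Some(10.0)); // do(x = 10):    (10, 20, 21)
    println!("before: {before:?}  after do(x=10): {after:?}");
}
```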
### 6. `field`: Scalar Field Calculus

Computes differential operators on a 1-D scalar field f(i) = i².

- Gradient: should approach 2i (central differences)
- Laplacian: should be ≈ 2 everywhere (second derivative of x²)
| Flag | Default | Description |
|---|---|---|
| `-s, --size` | `10` | Number of field points |
| `-o, --operation` | `gradient` | Operation: `gradient` or `laplacian` |
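Central differences on f(i) = i² reproduce both claims above exactly at interior points, because the second-order truncation error of a quadratic is zero. A minimal sketch with unit spacing (function names illustrative):

```rust
// Central-difference operators on a 1-D field with unit spacing,
// computed at interior points only.
fn gradient(f: &[f64]) -> Vec<f64> {
    // (f[i+1] - f[i-1]) / 2
    (1..f.len() - 1).map(|i| (f[i + 1] - f[i - 1]) / 2.0).collect()
}

fn laplacian(f: &[f64]) -> Vec<f64> {
    // f[i+1] - 2f[i] + f[i-1]
    (1..f.len() - 1).map(|i| f[i + 1] - 2.0 * f[i] + f[i - 1]).collect()
}

fn main() {
    let f: Vec<f64> = (0..10).map(|i| (i * i) as f64).collect();
    // Gradient of i² is exactly 2i at interior points; Laplacian is exactly 2.
    println!("grad = {:?}", gradient(&f));
    println!("lap  = {:?}", laplacian(&f));
}
```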
### 7. `encode`: Text → Symbolic Equation

This is the flagship demonstration of LMM's power. Any text is treated as a sequence of byte values indexed by position. The GP engine discovers a symbolic equation f(x) ≈ byte[x]. Integer residuals (byte[x] − round(f(x))) are stored alongside the equation, guaranteeing lossless round-trip recovery.
> [!NOTE]
> GP is stochastic: the discovered equation and residual values will differ across runs. The round-trip recovery is always perfect because the integer residuals correct any approximation error.
To encode from a file instead of inline text, pass its path via `--input`.
| Flag | Default | Description |
|---|---|---|
| `-i, --input` | `-` | Path to a text file to encode (`-` = use `--text`) |
| `-t, --text` | `Hello, LMM!` | Inline text (used when `--input` is `-`) |
| `--iterations` | `80` | GP evolution iterations |
| `--depth` | `4` | Maximum expression tree depth |
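The residual trick that makes this lossless is easy to demonstrate: store r[i] = byte[i] − round(f(i)) next to any approximate equation f, and decoding recovers every byte exactly no matter how poor the fit is. The linear `model` below is a hypothetical stand-in for whatever equation GP discovers:

```rust
// Lossless round-trip via integer residuals.
// `model` is a hypothetical GP output, not a real discovered equation.
fn model(x: f64) -> f64 {
    70.0 + 2.5 * x
}

fn encode(text: &str) -> Vec<i32> {
    text.bytes()
        .enumerate()
        .map(|(i, b)| b as i32 - model(i as f64).round() as i32)
        .collect()
}

fn decode(residuals: &[i32]) -> String {
    residuals
        .iter()
        .enumerate()
        // round(f(i)) + r[i] = round(f(i)) + byte[i] - round(f(i)) = byte[i]
        .map(|(i, r)| (model(i as f64).round() as i32 + r) as u8 as char)
        .collect()
}

fn main() {
    let text = "Hello, LMM!";
    let residuals = encode(text);
    println!("residuals = {:?}", residuals);
    println!("round-trip ok: {}", decode(&residuals) == text);
}
```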
### 8. `decode`: Symbolic Equation → Text

Reconstructs the original text from the equation and residuals printed by `encode`.
| Flag | Required | Description |
|---|---|---|
| `-e, --equation` | ✓ | Equation string (from `encode` output) |
| `-l, --length` | ✓ | Number of characters to recover |
| `-r, --residuals` | ✓ | Comma-separated residuals; use `--residuals="-3,1,..."` for negative values |
> [!IMPORTANT]
> Use `--residuals="-3,..."` (with `=`) or quote the argument when residuals contain negative values, to prevent the shell from treating them as flags.
### 9. `predict`: Symbolic Text Continuation

The `predict` command acts as LMM's continuation engine. Unlike neural-network LLMs that rely on massive statistical models, LMM strings together coherent English output using pure mathematics.

It does this by operating on three distinct, deterministic signals:

- GP Trajectory Equation: `f(pos) → word_byte_tone` (discovers long-range subject themes)
- GP Rhythm Equation: `g(pos) → word_length` (discovers alternating phonetic cadence)
- Dictionary Grammar Engine: maps mathematical values to a curated pool of English nouns, verbs, and adjectives with a system dictionary fallback (`/usr/share/dict`), flowing through cyclic Subject-Verb-Object (SVO) POS grammar loops.
```text
Loaded 63746 dictionary words
─── LMM PREDICTOR ───────────────────────────────
Input text  : "Wise AI built the first LMM"
Window used : 6 words
Trajectory  : (99.77577741824268 + ((x + 3.4804258799212793) + 1.7728570078579993))
Rhythm      : (cos(exp(x)) + 3.851491814600415)
─── PREDICTED CONTINUATION ──────────────────────
Wise AI built the first LMM in the true law often long time and a open path of an old scope is the solid order
```
> [!NOTE]
> The continuation is generated entirely by pure equations over a carefully constructed English vocabulary pool, mapping geometric relationships and POS states into elegant, occasionally mysterious sentences.
| Flag | Default | Description |
|---|---|---|
| `-i, --input` | `-` | Path to a text file (`-` = use `--text`) |
| `-t, --text` | `The Pharaohs encoded reality in` | Inline text seed |
| `-w, --window` | `32` | Context window in words |
| `-p, --predict-length` | `16` | Approximate character budget for continuation |
| `--iterations` | `80` | GP evolution iterations for the prediction model |
| `--depth` | `4` | Maximum expression tree depth |
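Vocabulary anchoring can be sketched in a few lines: evaluate the trajectory equation at each position and fold the value onto an index into the current POS word pool. The `trajectory` function here is loosely modelled on the demo output's discovered equation, and the pool, names, and modulo mapping are illustrative assumptions rather than LMM's actual selection logic:

```rust
// Sketch: map a deterministic equation value onto a word-pool index.
fn trajectory(pos: f64) -> f64 {
    99.78 + pos + 5.25 // same linear shape as the demo's discovered equation
}

fn next_word<'a>(pool: &[&'a str], pos: usize) -> &'a str {
    let v = trajectory(pos as f64);
    // Fold the (possibly large) equation value onto a valid pool index.
    pool[(v.abs() as usize) % pool.len()]
}

fn main() {
    let nouns = ["law", "path", "scope", "order", "time"];
    let words: Vec<&str> = (0..5).map(|p| next_word(&nouns, p)).collect();
    // Fully deterministic: the same seed equation always yields the same words.
    println!("{}", words.join(" "));
}
```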
### 10. `summarize`: Key Sentence Extraction

`summarize` distills a large body of text down to its most mathematically significant sentences, scoring each sentence by tone deviation, length variance, and relative position.
| Flag | Default | Description |
|---|---|---|
| `-t, --text` | ... | Input text to summarize |
| `-n, --sentences` | `2` | Number of key sentences to extract |
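One way to picture this kind of deviation scoring: weight each sentence by how far its length sits from the corpus mean, damped by position. The exact formula below is an illustrative assumption, not LMM's actual scorer:

```rust
// Hypothetical deviation-based sentence scoring sketch.
fn scores(sentences: &[&str]) -> Vec<f64> {
    let lens: Vec<f64> = sentences.iter().map(|s| s.len() as f64).collect();
    let mean = lens.iter().sum::<f64>() / lens.len() as f64;
    lens.iter()
        .enumerate()
        .map(|(i, l)| {
            let position_weight = 1.0 / (1.0 + i as f64); // earlier = heavier
            (l - mean).abs() * position_weight
        })
        .collect()
}

fn main() {
    let s = [
        "Short.",
        "A much longer and more deviant sentence overall wins here.",
        "Mid length one.",
    ];
    println!("{:?}", scores(&s));
}
```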
### 11. `sentence`: Single Sentence Generation

Generates a single, structurally elegant sentence inspired by a seed text, using rotating Subject-Verb-Object (SVO) patterns derived from mathematical tones.
### 12. `paragraph`: Cohesive Paragraph Generation

Chains multiple logically coherent sentences together, seeding each subsequent sentence's structure with keywords extracted from the original seed.
| Flag | Default | Description |
|---|---|---|
| `-t, --text` | ... | Seed topic for paragraph generation |
| `-n, --sentences` | `4` | Number of sentences in the paragraph |
### 13. `essay`: Full Essay Blueprint
Generates a fully structured essay, complete with an introduction, mathematical body paragraphs based on derived sub-topics, and a cohesive conclusion.
```text
┌──────────────────────────────────────────────────────┐
│ Essay · Generate a Full Essay                        │
└──────────────────────────────────────────────────────┘
Topic      : "Symmetry and the deeper patterns of physics"
Paragraphs : 2 (15 sentences each)
-- Essay -------------------------------------
──────────────────────────────────────
Symmetry And The Deeper Patterns Of Physics
──────────────────────────────────────
-- Introduction ------------------------------
Analysis illuminates the probabilistic motion of the truth. The deterministic probability reflects knowledge. Algebra holds the infinite truth of motion. The frequency of truth shapes chaos. At its core, the axiomatic probability transforms perception. Physics unveils bounded infinity beyond truth. Recursion connects the bounded balance of the truth. The dynamic wavelength unveils infinity. Wavelength remains the bounded truth through reality. The dimension of truth produces identity. At its core, the discrete frequency governs existence. Frequency produces invariant perception pervading truth. Structure unveils the invariant existence of the truth. The coherent integration produces existence. Integration remains the mathematical truth governing complexity.
-- Body · §1 ---------------------------------
At its core, the discrete computation governs truth. Computation manifests invariant harmony pervading symmetry. Topology enables the invariant truth of the symmetry. The coherent transformation manifests change. Transformation remains the mathematical symmetry governing time. The physics of symmetry encodes order. In this framework, the continuous calculus generates nature. Geometry enables axiomatic space across symmetry. Gradient determines the axiomatic time of the symmetry. The structural divergence illuminates matter. Physics remains the probabilistic symmetry beneath meaning. The mathematics of symmetry expresses truth. At its core, the deterministic recursion captures matter. Entropy unveils abstract balance of symmetry. Divergence connects the abstract limits of the symmetry.
-- Body · §2 ---------------------------------
The recursive transformation manifests energy. Transformation remains the continuous symmetry within limits. The calculus of symmetry enables meaning. At its core, the axiomatic analysis transforms knowledge. Gradient illuminates probabilistic motion inside symmetry. Pattern illuminates the probabilistic truth of the symmetry. The fundamental physics expresses matter. Physics remains the deterministic symmetry underlying harmony. The algebra of symmetry unveils energy. In this framework, the abstract recursion captures causality. Entropy describes bounded truth beyond symmetry. Divergence connects the bounded unity of the symmetry. The dynamic logic unveils truth. Resonance remains the bounded symmetry through knowledge. The entropy of symmetry produces knowledge.
-- Conclusion --------------------------------
Gradient unveils the invariant limits of the symmetry. The invariant computation compresses balance. Divergence is the bounded symmetry of identity. The pattern of symmetry defines matter. Fundamentally, the continuous gradient generates truth. Logic determines axiomatic harmony across symmetry. Recursion determines the axiomatic infinity of the symmetry. The structural entropy illuminates infinity. Entropy are the probabilistic symmetry beneath space. The dimension of symmetry expresses reality. Moreover, the deterministic frequency defines existence. Frequency connects abstract perception of symmetry. Structure encodes the abstract chaos of the symmetry. The elegant integration encodes existence. Integration are the dynamic symmetry within meaning.
```
| Flag | Default | Description |
|---|---|---|
| `-t, --text` | ... | Topic or title seed for the essay |
| `-n, --paragraphs` | `2` | Number of body paragraphs to generate |
| `-s, --sentences` | `3` | Number of sentences per paragraph |
### 14. `ask`: Internet-Aware Knowledge Synthesis (requires `net` feature)

Searches the internet via DuckDuckGo Lite, aggregates the result snippets into a single text corpus, then applies LMM's GP-scored equation engine to extract and compose the most mathematically significant sentences into a coherent response.
```text
┌──────────────────────────────────────────────────────┐
│ Ask · Internet-Aware Knowledge Synthesis             │
└──────────────────────────────────────────────────────┘
Prompt : "What is the Rust programming language?"
-- DuckDuckGo Results ------------------------
Rust (programming language)
Abstract: Rust is a general-purpose programming language. It is noted for its emphasis on performance, type safety, concurrency, and memory safety. Rust supports multiple programming paradigms. It was influenced by ideas from functional programming, including immutability, higher-order functions, algebraic data types, and pattern matching. It also supports object-oriented programming via structs, enums, traits, and methods. Rust is noted for enforcing memory safety without a conventional garbage collector; instead, memory safety errors and data races are prevented by the "borrow checker", which tracks the object lifetime of references at compile time. Software developer Graydon Hoare created Rust in 2006 while working at Mozilla, which officially sponsored the project in 2009. The first stable release, Rust 1.0, was published in May 2015.
Abstract Source: Wikipedia
Abstract URL: https://en.wikipedia.org/wiki/Rust_(programming_language)
Image URL: https://duckduckgo.com/i/832f249b21809a13.png
1. Rust (programming language) Category
URL: https://duckduckgo.com/c/Rust_(programming_language)?kp=%2D2
--------------------------------------------
2. History of programming languages - The history of programming languages spans from documentation of early mechanical computers to modern tools for software development. Early programming languages were highly specialized, relying on mathematical notation and similarly obscure syntax.
URL: https://duckduckgo.com/History_of_programming_languages?kp=%2D2
--------------------------------------------
3. Outline of the Rust programming language - The following outline is provided as an overview of and topical guide to Rust: Rust is a multi-paradigm programming language emphasizing performance, memory safety, and concurrency.
URL: https://duckduckgo.com/Outline_of_the_Rust_programming_language?kp=%2D2
--------------------------------------------
4. Pattern matching programming languages
URL: https://duckduckgo.com/c/Pattern_matching_programming_languages?kp=%2D2
--------------------------------------------
5. Multi-paradigm programming languages
URL: https://duckduckgo.com/c/Multi-paradigm_programming_languages?kp=%2D2
--------------------------------------------
-- LMM Response ------------------------------
Rust is a general-purpose programming language.
Rust supports multiple programming paradigms.
Outline of the Rust programming language - The following outline is provided as an overview of and topical guide to Rust: Rust is a multi-paradigm programming language emphasizing performance, memory safety, and concurrency.
```
> [!NOTE]
> The `ask` command requires building with `--features cli,net`. No API key is needed; it uses DuckDuckGo Lite (text-only, no JavaScript required).
| Flag | Default | Description |
|---|---|---|
| `-p, --prompt` | required | The question or search query |
| `-l, --limit` | `5` | Maximum number of search results to fetch |
| `-n, --sentences` | `3` | Number of key sentences to extract |
| `--region` | `wt-wt` | DuckDuckGo region code (e.g. `us-en`, `uk-en`) |
| `--iterations` | `40` | GP scoring iterations |
| `--depth` | `3` | Maximum GP expression depth |
## Architecture Deep Dive

### Genetic Programming Symbolic Regression
```mermaid
flowchart TD
    A["Seeded population\n(linear/quadratic/periodic templates + random)"]
    B["Evaluate fitness\nMDL = n·ln(MSE) + complexity·ln(2)"]
    C["Tournament selection (k=5)"]
    D["Crossover & mutation"]
    E["Reject constant collapse\n(inject fresh random expr 70% of the time)"]
    F{Iterations done?}
    G["Best variable-containing expression (simplified)"]
    A --> B --> C --> D --> E --> F
    F -- No --> B
    F -- Yes --> G
```
### Multi-Signal Prediction Engine
```mermaid
flowchart TD
    In["Context Window\n(Recent Tokens)"]
    In -->|Train| M["2nd-Order Markov Chain\nTransition probabilities"]
    In -->|GP Fit| T["Word-ID Trajectory GP\nf(pos) → word_id"]
    In -->|GP Fit| R["Word-Length Rhythm GP\ng(pos) → length"]
    In --> S["Suffix Pattern Matcher"]
    S -- "Match Found" --> Out["Exact Phrase Continuation"]
    S -- "No Match" --> Score["Composite Scorer"]
    M --> Score
    T --> Score
    R --> Score
    Score -->|Lowest Score| W["Select Best Word from Vocab"]
    W -->|Update Recency| Out
```
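The 2nd-order Markov signal in the diagram amounts to counting transitions (w[i−2], w[i−1]) → w[i] over the context window and looking up candidates by the last two tokens. A minimal sketch (the `train` function is illustrative, not the engine's API):

```rust
// 2nd-order Markov transition counts over a token window.
use std::collections::HashMap;

fn train<'a>(tokens: &[&'a str]) -> HashMap<(&'a str, &'a str), HashMap<&'a str, u32>> {
    let mut table: HashMap<(&'a str, &'a str), HashMap<&'a str, u32>> = HashMap::new();
    // Every sliding triple (a, b, c) records one observation of (a, b) → c.
    for w in tokens.windows(3) {
        *table.entry((w[0], w[1])).or_default().entry(w[2]).or_default() += 1;
    }
    table
}

fn main() {
    let tokens = ["the", "first", "lmm", "and", "the", "first", "law"];
    let table = train(&tokens);
    // After ("the", "first") we observed both "lmm" and "law" once each.
    println!("{:?}", table[&("the", "first")]);
}
```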
### RK45 Adaptive Integrator

All Butcher-tableau coefficients are named constants:
```rust
const RK45_A41: f64 = 1932.0 / 2197.0;
const RK45_A42: f64 = -7200.0 / 2197.0;
const RK45_A43: f64 = 7296.0 / 2197.0;
const RK45_B5_1: f64 = 16.0 / 135.0;
// ... etc.
```
Step size is adapted each iteration using the embedded 4th/5th-order error estimate.
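The standard Fehlberg update can be sketched as follows; since the local error scales as C·h⁵, the optimal rescale factor is (tol/error)^(1/5). The safety factor 0.9 and the clamp bounds are conventional textbook choices assumed here, not values read from the crate:

```rust
// Textbook RK45 step-size adaptation sketch (not lmm's exact code).
fn adapt_step(h: f64, error: f64, tol: f64) -> f64 {
    // error ~ C·h^5, so rescale by (tol / error)^(1/5), with a 0.9 safety margin.
    let factor = 0.9 * (tol / error).powf(0.2);
    // Clamp to avoid wild step-size jumps in either direction.
    h * factor.clamp(0.1, 4.0)
}

fn main() {
    // Large error → step shrinks; tiny error → step grows.
    println!("shrink: {}", adapt_step(0.01, 1e-3, 1e-6));
    println!("grow:   {}", adapt_step(0.01, 1e-9, 1e-6));
}
```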
## Whitepaper

*LLMs are Useful. LMMs will Break Reality*: the blog post that started this project.
## Contributing
Contributions are welcome! Feel free to open issues or pull requests.
## License

This project is licensed under the MIT License; see the LICENSE file for details.