§nexcore-cognition
Typed cognitive engine — the transformer algorithm as strict Rust.
§Meta-cognitive origin
This crate captures the fundamental algorithm behind large language model cognition: attention selects, transformation processes, generation builds. Each module maps to an observable pattern in how neural networks process information, translated faithfully into Rust’s type system.
§Architecture (bottom-up)
pipeline ──► generator ──► block ──► attention + feed_forward
                 │            │           │
                 ▼            ▼           ▼
             normalize      mask       tensor
             residual        │
             embedding     error

§T1 Primitive grounding
| Module | Primitives | Cognitive role |
|---|---|---|
| tensor | N, Σ, ×, ∂, κ | Numerical substrate |
| embedding | μ, λ, N | Symbol → vector |
| attention | κ, →, N, μ, Σ | Relevance selection |
| feed_forward | μ, ς | Nonlinear transformation |
| residual | π, Σ | Context preservation |
| normalize | ∂, N | Signal stability |
| block | σ, ∃ | Composable unit |
| mask | ∂, →, ∝ | Causal constraint |
| generator | σ, ρ, ∝, →, ∂ | Autoregressive output |
| sample | N, ∂, ν, κ | Stochastic selection |
| metrics | κ, N, ν, μ | Self-measurement |
| pipeline | σ, →, Σ, κ | Full cognitive flow |
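The attention row above corresponds to the scaled dot-product rule: weight each value vector by softmax(q·kᵢ/√d). A minimal std-only sketch of that computation, independent of this crate's actual tensor and attention types (all names here are hypothetical, not the crate's API):

```rust
/// Softmax over a slice, stabilized by subtracting the maximum score.
fn softmax(xs: &[f32]) -> Vec<f32> {
    let max = xs.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = xs.iter().map(|x| (x - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

/// Attend a single query over keys/values: softmax(q·kᵢ / sqrt(d)) · vᵢ.
fn attend(query: &[f32], keys: &[Vec<f32>], values: &[Vec<f32>]) -> Vec<f32> {
    let scale = (query.len() as f32).sqrt();
    let scores: Vec<f32> = keys
        .iter()
        .map(|k| query.iter().zip(k).map(|(q, k)| q * k).sum::<f32>() / scale)
        .collect();
    let weights = softmax(&scores);
    let mut out = vec![0.0; values[0].len()];
    for (w, v) in weights.iter().zip(values) {
        for (o, x) in out.iter_mut().zip(v) {
            *o += w * x;
        }
    }
    out
}

fn main() {
    let q = vec![1.0, 0.0];
    let keys = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    let vals = vec![vec![10.0, 0.0], vec![0.0, 10.0]];
    // The query aligns with the first key, so the first value dominates.
    println!("{:?}", attend(&q, &keys, &vals));
}
```

The softmax weights sum to one, so the output is a convex combination of the value vectors: relevance selection as weighted averaging.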
Modules§
- attention: Scaled dot-product attention and multi-head attention.
- block: Transformer block — the composable unit of cognition.
- embedding: Token embedding and positional encoding.
- error: Error types for the cognitive engine.
- feed_forward: Feed-forward network (FFN).
- generator: Autoregressive generation — building output token by token.
- mask: Causal masking for autoregressive attention.
- metrics: Meta-cognitive self-measurement.
- normalize: Layer normalization.
- pipeline: Full cognitive pipeline — the complete flow from input to output.
- residual: Residual connections (skip connections).
- sample: Sampling strategies for token generation.
- tensor: Dense tensor type and operations.
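The normalize, residual, and block modules listed here compose in the standard pre-norm pattern: x + sublayer(layer_norm(x)). A std-only sketch of that composition, assuming nothing about the crate's actual signatures (the function names below are illustrative):

```rust
/// Layer normalization without learned scale/shift: zero mean, unit
/// variance, with a small epsilon for numerical safety.
fn layer_norm(x: &[f32]) -> Vec<f32> {
    let n = x.len() as f32;
    let mean = x.iter().sum::<f32>() / n;
    let var = x.iter().map(|v| (v - mean).powi(2)).sum::<f32>() / n;
    let denom = (var + 1e-5).sqrt();
    x.iter().map(|v| (v - mean) / denom).collect()
}

/// Pre-norm residual step: x + sublayer(layer_norm(x)). The untouched
/// x on the addition path is what preserves context across layers.
fn residual_step(x: &[f32], sublayer: impl Fn(&[f32]) -> Vec<f32>) -> Vec<f32> {
    let normed = layer_norm(x);
    let out = sublayer(&normed);
    x.iter().zip(&out).map(|(a, b)| a + b).collect()
}

fn main() {
    let x = vec![1.0_f32, 2.0, 3.0, 4.0];
    // Identity sublayer: the result is x plus its normalized form.
    println!("{:?}", residual_step(&x, |v| v.to_vec()));
}
```

A block would chain two such steps, one with attention as the sublayer and one with the feed-forward network.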
Functions§
- make_rng: Create a seeded or OS-random StdRng for use with the cognitive engine.
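The seeded-or-OS-random pattern behind make_rng — a fixed seed for reproducible generation, fresh entropy otherwise — can be sketched without the rand dependency. The SplitMix64 generator and clock-based fallback below are illustrative stand-ins, not the crate's implementation (which returns rand's StdRng):

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Minimal SplitMix64 PRNG, standing in for StdRng in this sketch.
struct SplitMix64(u64);

impl SplitMix64 {
    fn next_u64(&mut self) -> u64 {
        self.0 = self.0.wrapping_add(0x9E37_79B9_7F4A_7C15);
        let mut z = self.0;
        z = (z ^ (z >> 30)).wrapping_mul(0xBF58_476D_1CE4_E5B9);
        z = (z ^ (z >> 27)).wrapping_mul(0x94D0_49BB_1331_11EB);
        z ^ (z >> 31)
    }
}

/// Seeded for reproducibility, or clock-derived when no seed is given
/// (a real implementation would draw from OS entropy instead).
fn make_rng_sketch(seed: Option<u64>) -> SplitMix64 {
    let s = seed.unwrap_or_else(|| {
        SystemTime::now()
            .duration_since(UNIX_EPOCH)
            .expect("clock before epoch")
            .as_nanos() as u64
    });
    SplitMix64(s)
}

fn main() {
    let mut a = make_rng_sketch(Some(42));
    let mut b = make_rng_sketch(Some(42));
    // Same seed, same stream: generation runs become reproducible.
    assert_eq!(a.next_u64(), b.next_u64());
    println!("seeded draw: {}", make_rng_sketch(Some(42)).next_u64());
}
```

Threading one RNG through sampling keeps generation deterministic under a fixed seed, which is what makes autoregressive runs repeatable in tests.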