§Antitransformer
AI text detector via statistical fingerprints.
Detects transformer-generated text through 5 statistical features:
- Zipf’s law deviation (power law smoothing)
- Entropy uniformity (suspiciously consistent information density)
- Burstiness dampening (loss of natural word clustering)
- Perplexity consistency (uniform surprise level)
- TTR anomaly (type-token ratio deviation from human baseline)
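As one concrete example of these features, burstiness can be scored with the standard coefficient B = (σ − μ)/(σ + μ) over the inter-occurrence gaps of a word: human text clusters words (B well above −1), while evenly spaced machine text drives B toward −1. A minimal sketch (not the crate's actual API):

```rust
/// Burstiness coefficient B = (sigma - mu) / (sigma + mu) over
/// inter-occurrence gaps of a repeated token. Regular spacing gives
/// B = -1; heavy clustering pushes B toward +1.
fn burstiness(gaps: &[f64]) -> f64 {
    let n = gaps.len() as f64;
    let mu = gaps.iter().sum::<f64>() / n;
    let var = gaps.iter().map(|g| (g - mu).powi(2)).sum::<f64>() / n;
    let sigma = var.sqrt();
    (sigma - mu) / (sigma + mu)
}

fn main() {
    // Perfectly regular gaps: sigma = 0, so B = -1 (maximally anti-bursty).
    let regular = [5.0, 5.0, 5.0, 5.0];
    // Clustered gaps: high variance relative to the mean, so B > 0.
    let bursty = [1.0, 1.0, 1.0, 17.0];
    println!("regular: {:.3}", burstiness(&regular));
    println!("bursty:  {:.3}", burstiness(&bursty));
}
```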
The features are aggregated through chemistry-primitive transfer:
- Beer-Lambert weighted summation
- Hill cooperative amplification
- Arrhenius threshold gating
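The three aggregators above have simple closed forms; the sketch below shows one plausible reading of them (parameter names and constants are illustrative assumptions, not the crate's actual signatures):

```rust
/// Beer-Lambert: A = sum_i eps_i * c_i * l. With path length l = 1 this
/// is a plain weighted sum of feature scores c with weights eps.
fn beer_lambert(scores: &[f64], weights: &[f64]) -> f64 {
    scores.iter().zip(weights).map(|(c, eps)| c * eps).sum()
}

/// Hill equation: theta = x^n / (k^n + x^n). Cooperativity n > 1
/// sharpens the response around the half-saturation point k,
/// amplifying mid-range signals toward 0 or 1.
fn hill(x: f64, k: f64, n: f64) -> f64 {
    x.powf(n) / (k.powf(n) + x.powf(n))
}

/// Arrhenius gate: the rate factor exp(-e_a / x) rises steeply once the
/// input x clears the activation energy e_a; thresholding it at `cutoff`
/// yields the final boolean decision.
fn arrhenius_gate(x: f64, e_a: f64, cutoff: f64) -> bool {
    (-e_a / x).exp() > cutoff
}

fn main() {
    // Hypothetical feature scores and weights, chained through all three.
    let signal = beer_lambert(&[0.8, 0.6, 0.9], &[0.5, 0.3, 0.2]);
    let amplified = hill(signal, 0.5, 4.0);
    println!("ai-like: {}", arrhenius_gate(amplified, 1.0, 0.2));
}
```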
§Primitive Grounding (T1 → Detection)
| Module | Dominant Primitives |
|---|---|
| tokenize | σ Sequence, N Quantity |
| zipf | κ Comparison, N Quantity |
| entropy | Σ Sum, N Quantity |
| burstiness | ν Frequency, ∂ Boundary |
| perplexity | ν Frequency, κ Comparison |
| aggregation | Σ Sum, ρ Recursion |
| classify | ∂ Boundary, → Causality |
§Modules
- aggregation: Signal Aggregation
- burstiness: Burstiness Coefficient
- chemistry: Inlined Chemistry Primitives
- classify: Classification via Arrhenius Threshold
- daemon: HTTP Daemon
- entropy: Sliding Window Shannon Entropy
- perplexity: Perplexity Variance
- pipeline: Analysis Pipeline
- tokenize: Text Tokenization
- zipf: Zipf’s Law Deviation
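The entropy module's name suggests a sliding-window Shannon entropy profile; a self-contained sketch of that idea (window size and whitespace tokenization are assumptions, not the crate's API) is:

```rust
use std::collections::HashMap;

/// Shannon entropy, in bits, of the token distribution in one window.
fn shannon_entropy(tokens: &[&str]) -> f64 {
    let mut counts: HashMap<&str, usize> = HashMap::new();
    for t in tokens {
        *counts.entry(*t).or_insert(0) += 1;
    }
    let n = tokens.len() as f64;
    counts
        .values()
        .map(|&c| {
            let p = c as f64 / n;
            -p * p.log2()
        })
        .sum()
}

/// Entropy of each window of `w` consecutive tokens. A detector would
/// then flag a suspiciously flat profile (entropy uniformity).
fn entropy_profile(tokens: &[&str], w: usize) -> Vec<f64> {
    tokens.windows(w).map(shannon_entropy).collect()
}

fn main() {
    let text = "the cat sat on the mat and the cat sat";
    let tokens: Vec<&str> = text.split_whitespace().collect();
    for (i, h) in entropy_profile(&tokens, 5).iter().enumerate() {
        println!("window {i}: {h:.3} bits");
    }
}
```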