███████╗████████╗██████╗ ██╗ ██╗██╗ ██╗███████╗
██╔════╝╚══██╔══╝██╔══██╗╚██╗ ██╔╝██║ ██╔╝██╔════╝
███████╗ ██║ ██████╔╝ ╚████╔╝ █████╔╝ █████╗
╚════██║ ██║ ██╔══██╗ ╚██╔╝ ██╔═██╗ ██╔══╝
███████║ ██║ ██║ ██║ ██║ ██║ ██╗███████╗
╚══════╝ ╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚═╝╚══════╝
[THE 2ND FASTEST DYNAMIC LANGUAGE IN THE WORLD]
"There is more than one way to do it — in parallel."
The 2nd-fastest dynamic language runtime we have benchmarked — behind only Mike Pall's LuaJIT, and beating it on 3 of 8 benchmarks. A Perl 5-compatible interpreter in Rust with native parallel primitives, NaN-boxed values, a three-tier regex engine, a bytecode VM + Cranelift JIT, streaming iterators, and rayon work-stealing across all cores. Faster than perl5, Python, Ruby, Julia, and Raku on every benchmark in our suite.
Strykelang is under heavy development and will soon replace zsh/fish/bash and all other login shells
Read the Docs · Coverage Report · Full Reference
Table of Contents
- [0x00] Overview
- [0x01] Install
- [0x02] Usage
- [0x03] Parallel Primitives
- [0x04] Shared State (`mysync`)
- [0x05] Native Data Scripting
- [0x06] Async / Trace / Timer
- [0x07] CLI Flags
- [0x08] Supported Perl Features
- [0x09] Architecture
- [0x0A] Examples
- [0x0B] Benchmarks
- [0x0C] Development & CI
- [0x0D] Standalone Binaries (`stryke build`)
- [0x0E] Inline Rust FFI (`rust { ... }`)
- [0x0F] Bytecode Cache (`.pec`)
- [0x10] Distributed `pmap_on` over SSH (cluster)
- [0x11] Language Server (`stryke lsp`)
- [0x12] Language Reflection
- [0x13] zshrs Shell
- [0x14] Documentation
- [0xFF] License
[0x00] OVERVIEW
stryke parses and executes Perl 5 scripts with rayon-powered work-stealing primitives across every CPU core. Highlights:
- New Parallel Subroutines and |> Pipeline Syntactic Sugar
- Runtime values — `PerlValue` is a NaN-boxed `u64`: immediates (`undef`, `i32`, raw `f64` bits) and tagged `Arc<HeapObject>` pointers for big ints, strings, arrays, hashes, refs, regexes, atomics, channels.
- Three-tier regex — Rust `regex` → `fancy-regex` (backrefs) → `pcre2` (PCRE-only verbs).
- Bytecode VM + JIT — match-dispatch interpreter with Cranelift block + linear-sub JIT (`src/vm.rs`, `src/jit.rs`).
- Rayon parallelism — every parallel builtin uses work-stealing across all cores.
- Over 3200 standard library functions
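The NaN-boxing idea above can be sketched in a few lines. This is an illustrative model only (the tag layout is hypothetical, not stryke's actual `PerlValue` encoding): an `f64` is stored as its raw bits, while other types are packed into the payload bits of a quiet NaN.

```python
# Illustrative NaN-boxing sketch. The tag constants are hypothetical;
# stryke's real layout in src/value.rs may differ.
import struct

QNAN    = 0x7FF8_0000_0000_0000   # quiet-NaN bit pattern
TAG_I32 = 0x0001_0000_0000_0000   # payload tag marking an immediate i32

def box_f64(x: float) -> int:
    # A real f64 is stored as its raw bits. (A genuine NaN float would
    # collide with the tag space; real implementations canonicalize NaNs.)
    return struct.unpack("<Q", struct.pack("<d", x))[0]

def box_i32(i: int) -> int:
    return QNAN | TAG_I32 | (i & 0xFFFF_FFFF)

def is_i32(v: int) -> bool:
    return (v & (QNAN | TAG_I32)) == (QNAN | TAG_I32)

def unbox_f64(v: int) -> float:
    return struct.unpack("<d", struct.pack("<Q", v))[0]

def unbox_i32(v: int) -> int:
    u = v & 0xFFFF_FFFF
    return u - 0x1_0000_0000 if u >= 0x8000_0000 else u
```

The payoff is that every scalar fits in one machine word: floats need no unboxing at all, and a single mask test distinguishes immediates from heap pointers.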
[0x01] INSTALL
# or from source
Zsh tab completion
# or: fpath=(/path/to/stryke/completions $fpath) in .zshrc
stryke <TAB> then completes flags, options, and script files.
[0x01b] CONCISENESS — STRYKE VS THE WORLD
stryke is the most concise yet readable ASCII-only general-purpose scripting language — shorter than Perl, Ruby, Python, and AWK for real-world tasks.
vs mainstream languages
| Task | stryke | chars | perl | chars | ruby | chars | python | chars |
|---|---|---|---|---|---|---|---|---|
| hello world | `p"hello"` | 8 | `print"hello"` | 12 | `puts"hello"` | 10 | `print("hello")` | 14 |
| sum 1-100 | `p sum 1:100` | 11 | `use List::Util'sum';say sum 1..100` | 38 | `p (1..100).sum` | 15 | `print(sum(range(1,101)))` | 24 |
| double+filter+sum | `~>1:10map{_*2}fi{_>5}sum p` | 28 | `say for grep{$_>5}map{$_*2}1..10` | 36 | `p (1..10).map{...}.select{...}.sum` | 42 | `print(sum(x for x in[...]))` | 56 |
| max of list | `p max 3,1,4,1,5` | 15 | `use List::Util'max';say max(...)` | 38 | `p [3,1,4,1,5].max` | 17 | `print(max([3,1,4,1,5]))` | 23 |
| reverse string | `p rev"hello"` | 12 | `say reverse"hello"` | 18 | `puts"hello".reverse` | 18 | `print("hello"[::-1])` | 20 |
| count array | `p cnt 1:10` | 10 | `say scalar 1..10` | 17 | `p (1..10).count` | 16 | `print(len(range(1,11)))` | 23 |
| join with comma | `p join",",1:5` | 14 | `say join",",1..5` | 17 | `puts (1..5).to_a.join(",")` | 24 | `print(",".join(map(...)))` | 36 |
| first element | `p first 1:10` | 13 | `say((1..10)[0])` | 16 | `p (1..10).first` | 16 | `print(list(range(...))[0])` | 27 |
| any even | `p any{even}1:5` | 14 | `use List::Util'any';say any{$_%2==0}1..5` | 42 | `p (1..5).any?{\|x\|x%2==0}` | 25 | | |
| unique values | `p uniq 1,2,2,3` | 15 | `use List::Util'uniq';say uniq(...)` | 38 | `p [1,2,2,3].uniq` | 17 | `print(list(set([...])))` | 27 |
stryke wins every task against Perl, Ruby, and Python.
vs K (array language)
K is more terse for pure array math: +/1+!100 (8 chars) vs stryke p sum 1:100 (11 chars). But K is a financial DSL, not a general-purpose language — it lacks:
| Feature | stryke | K |
|---|---|---|
| HTTP client | `fetch"url"` | ❌ |
| JSON parsing | `json_decode $s` | needs lib |
| Regex | `$s=~/\d+/` | limited |
| SHA256/crypto | `sha256"data"` | ❌ |
| Parallel map | `pmap{$_*2}@a` | ❌ |
| Compression | `gzip $data` | ❌ |
| Base64 | `b64e"hi"` | ❌ |
| UUID | `uuid` | ❌ |
| SQLite | `db_query $db,$sql` | ❌ |
| TOML/YAML | `toml_decode $s` | ❌ |
K is a calculator. stryke is a programming language.
vs golf languages
GolfScript, Pyth, 05AB1E, Jelly — these are shorter but are write-only puzzles designed for competitions, not real software. stryke remains readable and maintainable.
[0x01c] WHY STRYKE — ONE-LINER COMPARISON
stryke is a one-liner-first language. No -e flag needed, everything built in, shortest syntax wins.
Character count — real tasks
| Task | stryke | perl | ruby | python | awk / other |
|---|---|---|---|---|---|
| Print hello world | `s 'p "hello world"'` 19c | `perl -e 'print "hello world\n"'` 32c | `ruby -e 'puts "hello world"'` 29c | `python3 -c 'print("hello world")'` 34c | `echo \| awk '{print "hello world"}'` 36c |
| Sum 1..100 | `s 'p sum 1..100'` 16c | `perl -MList::Util=sum -e 'print sum 1..100'` 45c | `ruby -e 'puts (1..100).sum'` 28c | `python3 -c 'print(sum(range(1,101)))'` 38c | — |
| Word frequencies | `s -an 'freq(@F) \|> dd'` 22c | `perl -ane '$h{$_}++ for @F}{print "$_ $h{$_}\n" for keys %h'` 61c | — | — | `awk '{for(i=1;i<=NF;i++) a[$i]++} END{...}'` 65c+ |
| SHA256 of file | `s 'p s256"f"'` 13c | `perl -MDigest::SHA=sha256_hex -e '...'` 70c+ | — | `python3 -c 'import hashlib;...'` 80c+ | `shasum -a 256 f` 15c |
| Fetch JSON API | `s 'fetch_json(URL) \|> dd'` 25c | needs LWP + JSON modules | needs net/http + json | needs urllib + json | `curl -s URL \| jq .` ~40c |
| CSV → JSON | `s 'csv_read("f") \|> tj \|> p'` 28c | needs Text::CSV + JSON | needs csv + json | needs csv + json imports | — |
| Parallel map | `s '1:1e6 \|> pmap { $_ * 2 }'` 29c | not built in | not built in | not built in | `xargs -P8` 50c+ |
| Streaming parallel | `s 'range(0,1e9) \|> pmaps { $_ * 2 } \|> take 10'` 42c | not built in | not built in | not built in | not built in |
| Sparkline | `s '(3,7,1,9) \|> spark \|> p'` 27c | not built in | not built in | not built in | not built in |
| In-place sed (parallel) | `s -i -pe 's/foo/bar/g' *.txt` 28c | `perl -i -pe 's/foo/bar/g' *.txt` 33c (sequential) | `ruby -i -pe '$_.gsub!(...)'` 35c+ | — | `sed -i '' 's/foo/bar/g' *.txt` 31c (sequential) |
Feature matrix
| Feature | stryke | perl5 | ruby | python | awk | jq | nushell |
|---|---|---|---|---|---|---|---|
| No `-e` flag needed | yes | no | no | no (`-c`) | — | — | — |
| No semicolons | yes | no | yes | yes | yes | yes | yes |
| Built-in HTTP | yes | no | no | no | no | no | yes |
| Built-in JSON | yes | no | no | yes | no | yes | yes |
| Built-in CSV | yes | no | no | yes | no | `@csv` | yes |
| Built-in SQLite | yes | no | no | yes | no | no | yes |
| Parallel map/grep | yes | no | no | no | no | no | `par-each` |
| Pipe-forward `\|>` | yes | no | no | no | no | `\|` | `\|` |
| Thread macro `~>` | yes | no | no | no | no | no | no |
| In-place edit `-i` | parallel | sequential | sequential | no | no | no | no |
| Regex engine | 3-tier | PCRE | Onigmo | `re` | ERE | PCRE | — |
| Data viz (spark/bars/flame) | yes | no | no | no | no | no | no |
| Clipboard (clip/paste) | yes | no | no | no | no | no | `clip` |
| `$NR`/`$NF` AWK compat | yes | `-MEnglish` | no | no | native | no | no |
| Typed structs/enums/classes | yes | no | native | native | no | no | native |
| JIT compiler | Cranelift | no | YJIT | no | no | no | no |
| Single binary | 21MB | system pkg | system pkg | system pkg | system pkg | 3MB | 50MB+ |
[0x02] USAGE
STRYKE_BC_CACHE=1
`-e` is optional. If the first argument isn't a file on disk and looks like code, `stryke` runs it directly. `stryke 'p 42'` and `stryke -e 'p 42'` are equivalent. Use `-e` when combining with `-n`/`-p`/`-l`/`-a` (e.g. `stryke -lane 'p $F[0]'`).
Semicolons
A newline ends a statement, so you do not need a trailing ; on each line. Use semicolons only when you put more than one statement on the same physical line.
my $answer = 40 + 2
$answer
my $x = 1; my $y = 2; $x + $y
Interactive REPL
Run stryke with no arguments to enter a readline session: line editing, history (~/.stryke_history), tab completion for keywords, lexicals in scope, sub names, methods after -> on blessed objects, and file paths. exit/quit/Ctrl-D leaves. Non-TTY stdin is read as a complete program.
__DATA__
A line whose trimmed text is exactly __DATA__ ends the program; the trailing bytes are exposed via the DATA filehandle.
Stdin / -n / -p / -i
`-l` chomps each record and sets `$\`. `eof` with no args is true on the last line of stdin or each `@ARGV` file (Perl-compat).
Text decoding — script reads, `require`, `do`, `slurp`, `<>`, backticks, `par_lines`, etc. all use UTF-8 when valid, else Latin-1 octets per line/chunk (matching stock perl's tolerance). `use open ':encoding(UTF-8)'` switches `<>` to UTF-8 with U+FFFD replacement.
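The decode rule above has a simple shape: attempt strict UTF-8 first, then fall back to Latin-1, which maps every byte to a code point and so can never fail. A hypothetical Python model (not stryke's actual Rust implementation):

```python
# "UTF-8 when valid, else Latin-1" decode sketch (hypothetical helper).
def decode_line(raw: bytes) -> str:
    try:
        return raw.decode("utf-8")          # strict: rejects invalid sequences
    except UnicodeDecodeError:
        # Latin-1 assigns a code point to all 256 byte values, so this
        # always succeeds and preserves the original octets.
        return raw.decode("latin-1")
```

This is why arbitrary binary-ish lines still round-trip: invalid UTF-8 simply degrades to one-char-per-byte rather than raising.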
[0x03] PARALLEL PRIMITIVES
Each parallel block runs in its own interpreter context with captured lexical scope — no data races. Use mysync for shared counters. Optional progress => 1 enables an animated stderr bar (TTY) or per-item log lines (non-TTY).
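The semantics just described can be modeled in a few lines: order-preserving results, isolated per-item execution, and block failures becoming undef. This is a hypothetical Python sketch of the contract, not the rayon work-stealing implementation:

```python
# pmap semantics sketch: ordered results, failures -> None ("undef").
from concurrent.futures import ThreadPoolExecutor

def pmap(block, items, workers=8):
    def safe(x):
        try:
            return block(x)
        except Exception:
            return None            # pmap treats a failed block as undef
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Executor.map yields results in input order regardless of
        # which worker finished first.
        return list(pool.map(safe, items))
```

Use `pfor`-style primitives instead when a failure must abort the whole run; this model deliberately swallows per-item errors, matching the `pmap`/`pgrep` behavior described below.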
my @doubled = @data |> $_ * 2 , progress => 1
my @evens = @data |> $_ % 2 == 0
my @sorted = @data |> psort { $a <=> $b }
my $sum = @numbers |> preduce { $a + $b }
process, @items
my @hashes = sha256, @blobs, progress => 1
(0, 1e9) |> pmaps { ($_) } |> 10 |> (0, 1e6) |> pgreps { ($_) } |> (0, 1e6) |> pflat_maps { [$_, $_ * 10] } |>
my $sum2 = @nums |> pmap_reduce { $_ * 2 } { $a + $b }
my @squared = @million |> 1000 { $_ ** 2 }
my @once = @inputs |>
my $hist = @words |> preduce_init {}, { my ($acc, $x) = @_; $acc->++; $acc }
8, work, progress => 1 8 { ($_) } fan { ($_) } my @r = 8, my @r = 8 { $_ * $_ }
my @r = (@data |> pipeline)->({ $_ > 10 })->({ $_ * 2 })->(100)->
my @r = @data |> pipeline |> $_ > 10 |> $_ * 2 |> 100 |>
my @r = @data |> par_pipeline |> $_ > 10 |> $_ * 2 |>
my $n = (
source => { (STDIN) },
stages => [ parse_json, transform ],
workers => [4, 2],
buffer => 256,
)
my @r = ((1..1_000) |> par_pipeline_stream)->({ $_ > 500 })->({ $_ * 2 })->()
my @r = (1..1_000) |> par_pipeline_stream |> $_ > 500 |> $_ * 2 |>
my ($tx, $rx) = (128) my ($val, $idx) = ($rx1, $rx2)
my ($v, $i) = ($rx1, $rx2, timeout => 0.5)
my $sync = (3)
3 { $sync->; }
my $pool = (4)
$pool->({ ($_) }) for @tasks
my @results = $pool->()
my @logs = |> , { if // } , { if // } , , @paths my @rs = , my $n = @rs
,
,
stryke - 8 -e
my $cluster = ([, ])
my @r = @huge |> $cluster
Parallel capture safety — workers set Scope::parallel_guard after restoring captured lexicals. Assignments to captured non-mysync aggregates are rejected at runtime; mysync, package-qualified names, and topics ($_/$a/$b) are allowed. pmap/pgrep treat block failures as undef/false; use pfor when failures must abort.
Outer topic $_< — inside nested blocks (fan, fan_cap, map, grep, >{}), $_ is rebound per iteration. Use $_< to access the previous topic, $_<< for two levels up, up to $_<<<< (4 levels). This is a stryke extension — stock Perl 5 has no equivalent.
~> 10 >{ }
$_ = 100
my @r = 3 { $_< }
$_ = 100
my @r = 2 {
my $outer = $_< my $cr = fn { $outer + $_< } $cr->($_) }
$_ = 50; ~> 10 >{ $_ + $_< }
$_ =
1 { $_ = ; }
[0x04] SHARED STATE (mysync)
mysync declares variables backed by Arc<Mutex> shared across parallel blocks. Compound ops (++, +=, .=, |=, &=) hold the lock for the full read-modify-write cycle — fully atomic.
mysync $counter = 0
10000 { $counter++ } $counter
mysync @results
(1..100) |> pfor { push @results, $_ * $_ }
mysync %histogram
(0..999) |> pfor { $histogram += 1 }
$q = ()
$pq = heap { $a <=> $b }
For mysync scalars holding a Set, |/& are union/intersection. Without mysync, each thread gets an independent copy.
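The key property of `mysync` compound ops is that the lock is held across the entire read-modify-write, so increments from many workers never interleave. A hypothetical Python model of the `Arc<Mutex>` behavior:

```python
# mysync-style shared counter sketch (illustrative model, not the Rust code).
import threading

class SyncScalar:
    def __init__(self, value=0):
        self._lock = threading.Lock()
        self._value = value

    def incr(self, delta=1):
        with self._lock:          # whole read-modify-write under one lock
            self._value += delta
            return self._value

    def get(self):
        with self._lock:
            return self._value

counter = SyncScalar()
threads = [
    threading.Thread(target=lambda: [counter.incr() for _ in range(1000)])
    for _ in range(10)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
# 10 workers x 1000 increments: no lost updates.
```

Without the lock-per-compound-op guarantee, two workers could both read the same value and write back the same result, silently losing an increment.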
[0x05] NATIVE DATA SCRIPTING
| Area | Builtins |
|---|---|
| HTTP (ureq) | `fetch`, `fetch_json`, `fetch_async`, `await fetch_async_json`, `par_fetch`, `serve` |
| JSON (serde_json) | `json_encode`, `json_decode` |
| CSV (csv) | `csv_read` (AoH), `csv_write`, `par_csv_read` |
| DataFrame | `dataframe(path)` → columnar; `->filter`, `->group_by`, `->sum`, `->nrow`, `->ncol` |
| SQLite (rusqlite, bundled) | `sqlite(path)` → `->exec`, `->query`, `->last_insert_rowid` |
| TOML / YAML | `toml_decode`, `yaml_decode` |
| Crypto | `sha1`, `sha224`, `sha256`, `sha384`, `sha512`, `md5`, `hmac`, `hmac_sha256`, `crc32`, `uuid`, `base64_encode`/`decode`, `hex_encode`/`decode` |
| Compression (flate2, zstd) | `gzip`, `gunzip`, `zstd`, `zstd_decode` |
| Time (chrono, chrono-tz) | `datetime_utc`, `datetime_from_epoch`, `datetime_parse_rfc3339`, `datetime_strftime`, `datetime_now_tz`, `datetime_format_tz`, `datetime_parse_local`, `datetime_add_seconds`, `elapsed` |
| Structs / Enums / Classes / Types | struct Point { x => Float }, enum Color { Red, Green } (exhaustive match), class Dog extends Animal { breed: Str; fn bark { } }, abstract class/final class, trait Printable { fn to_str } (enforced, default method inheritance), pub/priv/prot visibility, static count: Int, BUILD/DESTROY, final fn, methods()/superclass()/does(), static::method(), typed my $x : Int |
| Cyberpunk Terminal Art | cyber_city (neon cityscape), cyber_grid (synthwave perspective grid), cyber_rain/matrix_rain (digital rain), cyber_glitch/glitch_text (text corruption), cyber_banner/neon_banner (block-letter banners), cyber_circuit (circuit board), cyber_skull, cyber_eye — all output ANSI-colored Unicode art |
my $data = |>
$data->
8080, ($req) {
my $data = +{ path => $req->, method => $req-> }
status => 200, body => ($data)
}
my @rows = |>
my $df = |>
my $db = |>
$db->()
struct Point { x => Float, y => Float }
struct Point { x => Float = 0.0, y => Float = 0.0 }
struct Pair { key, value }
my $p = (x => 1.5, y => 2.0) my $p = (1.5, 2.0) my $p = Point->(x => 1.5, y => 2.0) my $p = ()
$p-> $p->(3.0) $p->
Circle {
radius => Float,
area { 3.14159 * $self-> ** 2 }
($factor: Float) {
(radius => $self-> * $factor)
}
}
my $c = (radius => 5)
$c-> $c->(2)
my $q = $p->(y => 5) my $h = $p-> my @f = $p-> my $c = $p->
$p
my $a = (1, 2)
my $b = (1, 2)
$a == $b
enum Color { Red, Green, Blue }
enum Maybe { None, Some => Any }
enum Result { Ok => Int, Err => Str }
my $c = () my $m = (42) my $r = ()
$c $m $r
my $light = ()
my $s = ($light) {
() => ,
() => ,
}
() () (120, 40, 99) (80, 20) (80, 24) (, 7) (60, 20) () ()
Animal {
name:
age: Int = 0
speak { . $self-> }
}
Animal {
breed: Str =
bark { . $self-> }
speak { $self-> . } }
my $dog = (name => , age => 5, breed => )
my $dog = (, 5, )
$dog-> $dog->(6) $dog->
$dog->() $dog->()
Math {
Self.($a, $b) { $a + $b }
Self.pi { 3.14159 }
}
(3, 4) ()
Printable { to_str }
Printable {
name:
to_str { $self-> }
}
A, B { }
$dog->() $dog->() $dog->()
my @f = $dog->() my $h = $dog->() my $d2 = $dog->(age => 1) my $d3 = $dog->()
$dog
Secret {
visible: Int = 1
hidden: Int = 42
internal: Str = get_hidden { $self-> } }
Secret {
get_internal { $self-> } }
Shape {
name:
kind { } }
Shape {
radius:
area { 3.14159 * $self-> * $self-> }
}
Counter {
count: Int = 0
name:
BUILD { (() + 1) }
}
my $a = (name => )
my $b = (name => )
()
Logger {
log: Str =
BUILD { $self->() }
}
Resource {
DESTROY { }
}
my $r = ()
$r->()
Drawable { draw }
Drawable {
draw { } }
()->()
Greetable {
greeting { } }
Greetable {
n:
name { $self-> }
}
Greetable {
n:
name { $self-> }
greeting { } }
Singleton { value: Int = 1 }
Secure {
id { 42 }
label { } }
my @m = $dog->() my @p = $dog->()
Base {
class_name { () }
identify { }
}
Base {
identify { }
}
()->()
Vec2 {
x: ; Int = 42
my $add = ($a: Int, $b: Int) { $a + $b }
$add->(3, 4)
($name: Str) { }
()
my $data = {a => [1, 2], b => }
my $s = $data my $copy = $s $copy->
my $f = ($x: Int) { $x * 2 }
$f my $f2 = $f $f2->(21)
(1, 5) |> (5, 1) |>
Sets
Native sets deduplicate by value (internal canonical keys; insertion order preserved for ->values). Use the set(LIST) builtin or Set->new(LIST); |> can supply the list. | / & are union / intersection when either side is a set (otherwise bitwise int ops).
my $s = set(1, 2, 2, 3)
my $t = (1, 1, 2, 4) |> set
my $u = $s | $t
my $i = $s & $t
my @v = $s->values
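The set behavior described above (dedup by value, insertion order preserved for `->values`, `|`/`&` as union/intersection) can be modeled with an insertion-ordered dict. A hypothetical Python sketch of the semantics, not stryke's canonical-key implementation:

```python
# Ordered-set model: dict keys preserve insertion order in Python 3.7+.
class OrderedSet:
    def __init__(self, items=()):
        self._d = dict.fromkeys(items)       # dedups, keeps first-seen order

    def values(self):
        return list(self._d)

    def __or__(self, other):                 # union: left order, then new items
        return OrderedSet(list(self._d) + other.values())

    def __and__(self, other):                # intersection, left order
        return OrderedSet(x for x in self._d if x in other._d)

    def __contains__(self, x):
        return x in self._d
```

The dict-backed approach gives O(1) membership while keeping `->values` deterministic, which is the property a plain hash-set would lose.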
[0x06] ASYNC / TRACE / TIMER
my $data = async { |> fetch }
my $file = spawn { |> \& }
($data), ($file)
$counter = 0
trace { 10 { $counter++ } }
my $ms =
my $report = 1000
5
times => 3, backoff =>
(10, )
my $g = gen { $_ for 1..5 }
my $next = $g->
[0x07] CLI FLAGS
All stock perl flags are supported: -0, -a, -c, -C, -d, -D, -e, -E, -f, -F, -g, -h, -i, -I, -l, -m, -M, -n, -p, -s, -S, -t, -T, -u, -U, -v, -V, -w, -W, -x, -X. Perl-style single-dash (-version, -help) and GNU-style double-dash (--version, --help) long forms work. Bundled switches are expanded: -Mstrict → -M strict, -I/tmp → -I /tmp, -V:version → -V version, -lane → -l -a -n -e.
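The bundled-switch expansion above follows Perl's rule: argument-taking switches consume the rest of the bundle, others split one flag per letter. A hypothetical Python sketch covering the `-lane` and `-Mstrict` cases (not the full `-V:version` grammar):

```python
# Bundled switch expansion sketch: -lane -> -l -a -n -e, -Mstrict -> -M strict.
TAKES_ARG = set("MmIiFe0")   # switches whose argument is the rest of the bundle

def expand_switch(arg: str):
    if not arg.startswith("-") or arg.startswith("--") or len(arg) <= 2:
        return [arg]                              # long flags / bare switches pass through
    out, body = [], arg[1:]
    for idx, ch in enumerate(body):
        if ch in TAKES_ARG and idx + 1 < len(body):
            out += ["-" + ch, body[idx + 1:]]     # -Mstrict -> -M strict
            break
        out.append("-" + ch)                      # -lane -> -l -a -n -e
    return out
```

Note that `-e` at the end of a bundle (as in `-lane`) has nothing left to consume, so it splits like a boolean switch; an `-M` or `-I` mid-bundle swallows everything after it.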
stryke-specific long flags:
| Flag | Description |
|---|---|
| `--lint` / `--check` | Parse + compile bytecode without running |
| `--disasm` / `--disassemble` | Print bytecode disassembly to stderr before VM execution |
| `--ast` | Dump parsed AST as JSON and exit |
| `--fmt` | Pretty-print parsed Perl to stdout and exit |
| `--profile` | Wall-clock profile: per-line + per-sub timings on stderr |
| `--flame` | Flamegraph: colored terminal bars when interactive, SVG when piped (`stryke --flame x.stk > flame.svg`) |
| `--no-jit` | Disable Cranelift JIT (bytecode interpreter only) |
| `--compat` | Perl 5 strict-compatibility mode: disable all stryke extensions (`\|>`, `struct`, `enum`, `match`, `pmap`, `#{expr}`, etc.) |
| `--explain CODE` | Print expanded hint for an error code (e.g. `E0001`) |
| `--lsp` | Language server over stdio ([0x11]) |
| `-j N` / `--threads N` | Set number of parallel threads (rayon) |
| `--remote-worker` | Persistent cluster worker over stdio ([0x10]) |
| `--remote-worker-v1` | Legacy one-shot cluster worker over stdio |
| `build SCRIPT [-o OUT]` | AOT-compile script to standalone binary ([0x0D]) |
| `doc [TOPIC]` | Interactive reference book with vim-style navigation (`stryke doc`, `stryke doc pmap`, `stryke doc --toc`) |
| `serve [PORT] [SCRIPT]` | HTTP server (default port 8000): static files (`stryke serve`), script (`stryke serve 8080 app.stk`), one-liner (`stryke serve 3000 -e 'EXPR'`) |
| `fmt [-i] FILE...` | Format source files in place or to stdout (`stryke fmt -i .` formats all recursively) |
| `check FILE...` | Parse + compile without executing; report errors with file:line:col (CI/editor integration) |
| `disasm FILE` | Disassemble bytecode to stderr (learning the VM, debugging perf) |
| `profile [--flame] [--json] FILE` | Run with profiling; `--flame` generates SVG, `-o FILE` writes to file |
| `bench [FILE\|DIR]` | Discover and run benchmarks from `bench/` or `benches/` (`bench_*.stk`, `b_*.stk`) |
| `init [NAME]` | Scaffold a new project: `main.stk`, `lib/`, `bench/`, `t/`, `.gitignore` |
| `repl [--load FILE]` | Start interactive REPL explicitly, with optional pre-loaded file |
| `lsp` | Start Language Server Protocol over stdio (equivalent to `--lsp`) |
| `completions [SHELL]` | Emit shell completions to stdout (`stryke completions zsh > _stryke`) |
| `ast FILE` | Dump parsed AST as JSON to stdout |
| `prun FILE...` | Run multiple script files in parallel using all cores |
| `convert [-i] FILE...` | Convert Perl source to stryke syntax with `\|>` pipes |
| `deconvert [-i] FILE...` | Convert stryke `.stk` files back to standard Perl syntax |

[0x08] SUPPORTED PERL FEATURES
Data
Scalars $x, arrays @a, hashes %h, refs \$x/\@a/\%h/\&sub, anon [...]/{...}, code refs / closures (capture enclosing lexicals), qr// regex objects, blessed references, native sets (set(LIST) / Set->new(...)), deque(), heap().
Control flow
if/elsif/else/unless, while/until, do { } while/until, C-style for, foreach, last/next/redo with labels, postfix if/unless/while/until/for, ternary, try { } catch ($err) { } finally { }, given/when/default, algebraic match (EXPR) { PATTERN [if EXPR] => EXPR, ... } (regex, array, hash, wildcard, literal patterns; bindings scoped per arm; exhaustive enum variant checking), eval_timeout SECS { ... }.
Operators
Arithmetic, string ./x, comparison (including Raku-style chained comparisons like 1 < $x < 10), eq/ne/lt/gt/cmp, logical &&/||////!/and/or/not, bitwise (|/& are set ops on native Set), assignment + compound (+=, .=, //=, …), regex =~/!~, range .. / ... (incl. flip-flop with eof), arrow ->, pipe-forward |> (stryke extension — threads the LHS as the first argument of the RHS call; see Extensions beyond stock Perl 5).
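The pipe-forward threading rule above ("LHS becomes the first argument of the RHS call, left-associative") can be sketched directly. A hypothetical Python model of the desugaring; in stryke this happens at parse time with zero runtime cost:

```python
# Pipe-forward desugaring model: LHS |> f(a, b) becomes f(LHS, a, b),
# applied left-to-right. Stages are (callable, extra_args) pairs.
def pipe(value, *stages):
    for f, extra in stages:
        value = f(value, *extra)   # piped value is always the first argument
    return value

# `[3,1,2] |> sorted |> sum` desugars roughly to sum(sorted([3,1,2]))
result = pipe([3, 1, 2], (sorted, ()), (sum, ()))
```

Left-associativity matters: each stage sees the fully evaluated result of everything to its left, which is what makes long `a |> f |> g |> h` chains read as a linear data flow.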
Regex engine
Three-tier compile (Rust regex → fancy-regex → PCRE2). Perl's `$` end anchor (without `/m`) is rewritten to `(?:\n?\z)`. Match `=~`, dynamic `$str =~ $pat`, substitution `s///`, transliteration `tr///`, flags `g/i/m/s/x/e/r`, captures `$1`…`$n`, named groups → `%+`/`$+{name}`, `\Q...\E`, `quotemeta`, `m///`, `qr//`. The `/r` flag (non-destructive) returns the modified string instead of the match count — auto-injected when `s///` or `tr///` appear as pipe-forward RHS. Bare `/pat/` in statement/boolean context is `$_ =~ /pat/`.
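The tiered strategy is: try the fastest engine first and fall through only when the pattern needs features it lacks. A hypothetical Python sketch of that dispatch; the engine names and the backreference check stand in for the real regex / fancy-regex / pcre2 capability tests:

```python
# Tiered regex compile sketch: pick the first engine whose feature set
# covers the pattern. Both "tiers" here are Python's re module; the tier
# labels are illustrative stand-ins.
import re

def needs_backrefs(pat: str) -> bool:
    # \1 .. \9 in the pattern source means a backtracking engine is required
    return re.search(r"\\[1-9]", pat) is not None

def compile_tiered(pat: str):
    if not needs_backrefs(pat):
        return ("tier1-regex", re.compile(pat))   # fast path, no backrefs
    return ("tier2-fancy", re.compile(pat))       # backtracking fallback
```

The benefit is that the common case (no backrefs, no exotic verbs) stays on the linear-time engine, and only genuinely hard patterns pay for backtracking.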
Subroutines
fn name { } with optional prototype, typed parameters (fn add($a: Int, $b: Int)), default parameter values (fn greet($name = "world")), anon subs/closures, implicit return of last expression (VM), @_/shift/return, postfix return ... if COND, AUTOLOAD with $AUTOLOAD set to the FQN.
Built-ins (selected)
| Category | Functions |
|---|---|
| Array | push, pop, shift, unshift, splice, rev (scalar reverse), sort, map, grep, filter, reduce, fold, fore, e, preduce, scalar, partition, min_by, max_by, zip_with, interleave, frequencies, tally, count_by, pluck, grep_v |
| Hash | keys, values, each, delete, exists, select_keys, top, deep_clone/dclone, deep_merge/dmerge, deep_equal/deq |
| Functional | compose/comp, partial, curry, memoize/memo, once, constantly, complement, juxt, fnil |
| String | chomp, chop, length, substr, index, rindex, split, join, sprintf, printf, uc/lc/ucfirst/lcfirst, chr, ord, hex, oct, crypt, fc, pos, study, quotemeta, trim, lines, words, chars, digits, numbers, graphemes, columns, sentences, paragraphs, sections, snake_case, camel_case, kebab_case |
| Binary | pack, unpack (subset A a N n V v C Q q Z H x w i I l L s S f d + *), vec |
| Numeric | abs, int, sqrt, squared/sq, cubed/cb, expt(B,E), sin, cos, atan2, exp, log, rand, srand, avg, stddev, clamp, normalize, range(N, M) (lazy bidirectional) |
| I/O | print, p, printf, open (incl. open my $fh, files, -| / |- pipes), close, eof, readline, read, seek, tell, sysopen, sysread/syswrite/sysseek, handle methods ->print/->p/->printf/->getline/->close/->eof/->getc/->flush, slurp, input, backticks/qx{}, capture (structured: ->stdout/->stderr/->exit), pager/pg/less (pipes value into $PAGER; TTY-gated), binmode, fileno, flock, getc, select, truncate, formline, read_lines, append_file, to_file, read_json, write_json, tempfile, tempdir, xopen/xo (system open — open on macOS, xdg-open on Linux), clip/clipboard/pbcopy (copy to clipboard), paste/pbpaste (read clipboard) |
| Directory | opendir, readdir, closedir, rewinddir, telldir, seekdir, files, filesf/f, fr (recursive files, lazy iterator), dirs/d, dr (recursive dirs, lazy iterator), sym_links, sockets, pipes, block_devices, char_devices |
| File tests | -e, -f, -d, -l, -r, -w, -s, -z, -x, -t (defaults to $_) |
| System | system, exec, exit, chdir, mkdir, unlink, rename, chmod, chown, chroot, stat, lstat, link, symlink, readlink, glob, glob_par, glob_match, which_all, par_sed, par_find_files, par_line_count, ppool, barrier, fork, wait, waitpid, kill, alarm, sleep, times, dump, reset |
| System Stats | mem_total, mem_free, mem_used, swap_total, swap_free, swap_used, disk_total, disk_free, disk_avail, disk_used, load_avg, sys_uptime, page_size, os_version, os_family, endianness, pointer_width, proc_mem/rss |
| Sockets | socket, bind, listen, accept, connect, send, recv, shutdown, socketpair |
| Network | gethostbyname, gethostbyaddr, getpwnam, getpwuid, getpwent/setpwent/endpwent, getgrnam, getgrgid, getgrent/setgrent/endgrent, getprotobyname, getprotobynumber, getservbyname, getservbyport |
| SysV IPC | msgctl, msgget, msgsnd, msgrcv, semctl, semget, semop, shmctl, shmget, shmread, shmwrite (stubs — runtime error) |
| Type | defined, undef, ref, bless, tied, untie, type_of, byte_size |
| Serialization | to_json, to_csv, to_toml, to_yaml, to_xml, to_html, to_markdown, to_table/tbl, ddump, stringify/str, json_encode/json_decode |
| Visualization | sparkline/spark, bar_chart/bars, flame/flamechart, histo, gauge, spinner, spinner_start/spinner_stop |
| Control | die, warn, eval, do, require, caller, wantarray, goto LABEL, continue { } on loops, prototype |
| Number Theory | prime_factors, divisors, num_divisors, sum_divisors, is_perfect, is_abundant, is_deficient, collatz_length, collatz_sequence, lucas, tribonacci, nth_prime, primes_up_to/sieve, next_prime, prev_prime, triangular_number, pentagonal_number, is_pentagonal, perfect_numbers, twin_primes, goldbach, prime_pi, totient_sum, subfactorial, bell_number, partition_number, multinomial, is_smith, aliquot_sum, abundant_numbers, deficient_numbers |
| Statistics | skewness, kurtosis, linear_regression, moving_average, exponential_moving_average, coeff_of_variation, standard_error, normalize_array, cross_entropy, euclidean_distance, minkowski_distance, mean_absolute_error, mean_squared_error, median_absolute_deviation, winsorize, weighted_mean |
| Geometry | area_circle, area_triangle, area_rectangle, area_trapezoid, area_ellipse, circumference, perimeter_rectangle, perimeter_triangle, polygon_area, sphere_volume, sphere_surface, cylinder_volume, cone_volume, heron_area, point_distance, midpoint, slope, triangle_hypotenuse, degrees_to_compass |
| Financial | npv, depreciation_linear, depreciation_double, cagr, roi, break_even, markup, margin, discount, tax, tip |
| Encoding | morse_encode/morse, morse_decode, nato_phonetic, int_to_roman, roman_to_int, binary_to_gray, gray_to_binary, pig_latin, atbash, braille_encode, phonetic_digit, to_emoji_num |
| Color | hsl_to_rgb, rgb_to_hsl, hsv_to_rgb, rgb_to_hsv, color_blend, color_lighten, color_darken, color_complement, color_invert, color_grayscale, random_color, ansi_256, ansi_truecolor, color_distance |
| Constants | pi, tau, phi, epsilon, speed_of_light, gravitational_constant, planck_constant, avogadro_number, boltzmann_constant, elementary_charge, electron_mass, proton_mass, i64_max, i64_min, f64_max, f64_min |
| Matrix | matrix_transpose, matrix_inverse, matrix_hadamard, matrix_power, matrix_flatten, matrix_from_rows, matrix_map, matrix_sum, matrix_max, matrix_min |
| DSP / Signal | convolution, autocorrelation, fft_magnitude, zero_crossings, peak_detect |
| Algorithms | next_permutation, is_balanced_parens, eval_rpn, merge_sorted, binary_insert, reservoir_sample, run_length_encode_str, run_length_decode_str, range_expand, range_compress, group_consecutive_by, histogram, bucket, clamp_array, normalize_range |
| Validation | luhn_check, is_valid_hex_color, is_valid_cidr, is_valid_mime, is_valid_cron, is_valid_latitude, is_valid_longitude |
| Text | ngrams, bigrams, trigrams, char_frequencies, is_anagram, is_pangram, mask_string, chunk_string, camel_to_snake, snake_to_camel, collapse_whitespace, remove_vowels, remove_consonants, strip_html, metaphone, double_metaphone, initials, acronym, superscript, subscript, leetspeak, zalgo, sort_words, unique_words, word_frequencies, string_distance, string_multiply |
| Misc | fizzbuzz, roman_numeral_list, look_and_say, gray_code_sequence, sierpinski, mandelbrot_char, game_of_life_step, tower_of_hanoi, pascals_triangle, truth_table, base_convert, roman_add, haversine, bearing, bmi, bac_estimate |
Perl-compat highlights
- OOP — `@ISA` (incl. `our @ISA` outside `main`), C3 MRO (live, not cached), `$obj->SUPER::method`. `tie` for scalars/arrays/hashes with `TIESCALAR`/`TIEARRAY`/`TIEHASH`, `FETCH`/`STORE`, plus `EXISTS`/`DELETE` on tied hashes. `tied` returns the underlying object; `untie` removes the tie.
- `use overload` — `'op' => 'method'` or `\&handler`; binary dispatch with `(invocant, other)`, `nomethod`, unary `neg`/`bool`/`abs`, `""` for stringification, `fallback => 1`.
- `$?`/`$|` — packed POSIX status from `system`/backticks/pipe close; autoflush on `print`/`printf`.
- `$.` — undef until the first successful read, then the last-read line count.
- `print`/`p`/`printf` with no args — uses `$_` (and `printf`'s format defaults to `$_`).
- Bareword statement — `name;` calls a sub with `@_ = ($_)`.
- Typeglobs — `*foo = \&bar`; `*lhs = *rhs` copies sub/scalar/array/hash/IO slots; package-qualified `*Pkg::name` supported.
- `%SIG` (Unix) — `SIGINT`/`SIGTERM`/`SIGALRM`/`SIGCHLD` as code refs; handlers run between statements/opcodes via `perl_signal::poll`. `IGNORE` and `DEFAULT` honored.
- `format`/`write` — partial: `format NAME = ... .` registers a template; pictures `@<<<<`, `@>>>>`, `@||||`, `@####`, `@****`, literal `@@`. `formline` populates `$^A`. `write` (no args) uses `$~` to stdout. Not yet: `write FILEHANDLE`, `$^`.
- `@INC`/`%INC`/`require`/`use` — `@INC` is built from `-I`, `vendor/perl`, the system perl's `@INC` (set `STRYKE_NO_PERL_INC` to skip), the script dir, `STRYKE_INC`, then `.`. `List::Util` is implemented natively in Rust (`src/list_util.rs`). `use Module qw(a b);` honors `@EXPORT_OK`/`@EXPORT`. Built-in pragmas (`strict`, `warnings`, `utf8`, `feature`, `open`, `Env`) do not load files.
- `chunked`/`windowed`/`fold` — use pipe-forward: `LIST |> chunked(N)`, `LIST |> windowed(N)`, `LIST |> fold { BLOCK }` (same for `reduce`). `List::Util::fold` and `qw(...) |> List::Util::fold { }` alias `List::Util::reduce`. List context → arrayrefs per chunk/window or the folded value; scalar context → chunk/window count where applicable. Examples: `my @pairs = (1, 2, 3, 4) |> chunked(2)`, `my @slide = (1, 2, 3) |> windowed(2)`, `my $sum = (1, 2, 3, 4) |> fold { $a + $b }`.
- `use strict` — refs/subs/vars modes (per-mode `use strict 'refs'` etc.). `strict refs` rejects symbolic derefs at runtime; `strict vars` requires a visible binding.
- `BEGIN`/`UNITCHECK`/`CHECK`/`INIT`/`END` — Perl order; `${^GLOBAL_PHASE}` matches Perl in both the tree-walker and the VM.
- String interpolation — `$var`, `#{23 * 52}`, `$h{k}`, `$a[i]`, `@a`, `@a[slice]` (joined with `$"`), `$#a` in slice indices, `$0`, `$1..$n`. Escapes: `\n \r \t \a \b \f \e \0`, `\x{hex}`, `\xHH`, `\u{hex}`, `\o{oct}`, `\NNN` (octal), `\cX` (control), `\N{U+hex}`, `\N{UNICODE NAME}`, `\U..\E`, `\L..\E`, `\u`, `\l`, `\Q..\E`.
- `__FILE__`/`__LINE__` — compile-time literals.
- Heredocs — `<<EOF`, POD skipping, shebang handling, `qw()`/`q()`/`qq()` with paired delimiters.
- Special variables — a large set of `${^NAME}` scalars pre-seeded; see `SPECIAL_VARIABLES.md`. Still missing vs Perl 5: `English`, full `$^V` as a version object.
Extensions beyond stock Perl 5
-
Native CSV (
csv_read/csv_write), columnardataframe, embeddedsqlite. -
HTTP (
fetch/fetch_json/fetch_async/par_fetch), JSON (json_encode/json_decode). -
Crypto, compression, time, TOML, YAML helpers (see [0x05]).
-
All parallel primitives in [0x03] (
pmap,fan,pipeline,par_pipeline_stream,pchannel,pselect,barrier,ppool,glob_par,par_walk,par_lines,par_sed,par_find_files,par_line_count,pwatch,watch). -
Distributed compute ([0x10]):
cluster([...])builds an SSH worker pool;pmap_on $cluster { } @listandpflat_map_on $cluster { } @listfan a map across persistent remote workers with fault tolerance and per-job retries. -
Standalone binaries ([0x0D]):
stryke build SCRIPT -o OUTbakes a script into a self-contained executable. -
Inline Rust FFI ([0x0E]):
rust { pub extern "C" fn ... }blocks compile to a cdylib on first run, dlopen + register as Perl-callable subs. -
Bytecode cache ([0x0F]):
STRYKE_BC_CACHE=1skips parse + compile on warm starts via on-disk.pecbundles. -
Language server ([0x11]):
stryke lspruns an LSP server over stdio with diagnostics, hover, completion. -
mysyncshared state ([0x04]). -
frozen my(orconst my— same thing, more familiar spelling),typed my,struct,enum,class(full OOP withextends/impl),trait, algebraicmatch,try/catch/finally,eval_timeout,retry,rate_limit,every,gen { ... yield }. -
Raku-style chained comparisons —
1 < $x < 10desugars to(1 < $x) && ($x < 10)at parse time. Works with all comparison operators (<,<=,>,>=,lt,le,gt,ge) and chains of any length. -
Default parameter values —
fn greet($name = "world"),fn range(@vals = (1,2,3)),fn config(%opts = (debug => 0)). Defaults evaluated at call time when argument not provided. -
- Functional composition — `compose`, `partial`, `curry`, `memoize`, `once`, `constantly`, `complement`, `juxt`, `fnil`.
- Deep structure utilities — `deep_clone`/`dclone`, `deep_merge`/`dmerge`, `deep_equal`/`deq`, `tally`.
- Bare `_` as topic shorthand — in any expression position, bare `_` is equivalent to `$_`. Inspired by Raku's WhateverCode and Scala's placeholder syntax. Enables ultra-concise blocks: `map{_*2}` instead of `map{$_ * 2}`. The sigil-free form compresses better — no spaces needed around `_` when adjacent to operators.
- Outer topic `$_<` — access the enclosing scope's `$_` from nested blocks; up to 4 levels (`$_<` through `$_<<<<`). See [0x03].
- `fore` (alias `e`) — side-effect-only list iterator (like `map` but void; returns the item count). Works with `{ BLOCK } LIST`, blockless `e EXPR, LIST`, and pipe-forward `|> e`. Use for print/log/accumulator loops.
- Pipe-forward `|>` — parse-time desugaring (zero runtime cost); threads the LHS as the first argument of the RHS call, left-associative. `map`, `grep`/`filter`, `sort`, and `e` accept blockless expressions on the RHS of `|>` — no `{ }` required for simple transforms:

```perl
my @big     = 1..10 |> grep $_ > 5;        # blockless grep
my @squares = 1..5  |> map $_ * $_;        # blockless map
my $data    = $url  |> fetch_json;         # unary builtin on the RHS
```

Pipeline builtins — designed for `|>` chains: splitters (`lines`, `words`, `chars`), string helpers (`trim`, `snake_case`, `camel_case`, `kebab_case`), stats (`avg`, `stddev`, `frequencies`, `count_by`, `normalize`), serializers (`to_json`, `to_csv`, `to_toml`, `to_yaml`, `to_xml`), terminal visualizers (`spark`, `bars`, `histo`, `tbl`, `tmd`, `flame`), and `ddump` for inspecting any value mid-chain:

```perl
my $mean = 1 .. 100 |> avg;
my $sd   = 1 .. 100 |> stddev;
"hello" |> chars |> frequencies |> ddump;
```

Blockless `|>` rules for `grep`/`filter`: string literals test `$_ eq EXPR`, numbers test `$_ == EXPR`, regexes test `$_ =~ EXPR`, anything else (e.g. `defined`) uses standard Perl grep semantics (sets `$_`, evaluates the expression).

Precedence: `|>` binds looser than `||` but tighter than `?:` / `and` / `or` / `not` — the slot sits between `parse_ternary` and `parse_or_word` in the parser stack. So `$x + 1 |> f` parses as `f($x + 1)`, and `0 || 1 |> yes` parses as `yes(0 || 1)`. The RHS must be a call, builtin, method invocation, bareword, or coderef expression; bare binary expressions / literals on the right are a parse error (`42 |> 1 + 2` is rejected).
- `~>` macro (thread, `t`, `->>`) — Clojure-inspired threading macro for clean multi-stage pipelines without repeating `|>`. Stages are bare function names, functions with blocks, parenthesized calls `name(args)` where `$_` (or bare `_`) is the threaded-value placeholder (must appear at least once in args, can sit in any position — first, last, middle, nested), or anonymous blocks (`>{}` / `fn {}`). Use `|>` after `~>` to continue piping. Blocks can use bare `_` for maximum conciseness — `map{_*2}` is equivalent to `map{$_ * 2}`:

```perl
my @r = ~> @data grep{_ % 2 == 0} map{_ * _} sort{$_1 <=> $_0};
```

When to use `~>` vs `|>`:

- `~>`: best for chains of block-taking functions (`map { }`, `grep { }`, `sort { }`, `reduce { }`)
- `|>`: best for blockless expressions (`map $_ * 2`, `grep $_ > 5`) and unary functions

Stage types:
- Bare function: `~> "hello" uc trim` — applies unary builtins in sequence
- Function with block: `~> @data map{_ * 2} grep{_ > 5}` — block-taking functions (bare `_` or `$_`)
- Anonymous block: `~> 5 >{_ * 2}` or `fn { }` — custom transforms
Termination: `|>` ends the `~>` macro: `~> @l f1 f2 f3 |> f4` parses as `(~> @l f1 f2 f3) |> f4`.

The same shapes cover numeric/statistical pipelines (filter + transform + reduce over ranges), string pipelines, and sorting/aggregation — and every `~>` chain has an equivalent, if slightly wordier, `|>` spelling.

Language comparison — the same 10-stage string-cleanup pipeline is one `~>` line in stryke. Perl 5 needs 10+ lines plus CPAN case-conversion helpers; JavaScript and Python each need 15+ lines because neither ships built-in case converters. stryke: 1 line. Perl 5: 10+ lines + CPAN. JavaScript: 15+ lines. Python: 15+ lines.
Lisp hell — without `|>`, the same pipeline becomes unreadable: each stage adds a layer of nesting, so a long chain collapses into `f15(f14(...f1(x)...))` and ends in a wall of `)))))))))))))))`. The pipe-forward operator eliminates the cognitive overhead of matching parentheses and reading inside-out.
- Short aliases — 1–3 character aliases for common functions, designed for `~>`/`|>` pipelines:

  - Thread/Pipe: `~>` thread
  - String: `tm` trim, `uf` ucfirst, `lf` lcfirst, `qm` quotemeta, `ln` lines, `wd` words, `ch` chars, `len` length, `pr` print
  - Case: `sc` snake_case, `cc` camel_case, `kc` kebab_case
  - List: `gr` grep, `so` sort, `rd` reduce, `hd` head/take, `tl` tail, `drp` drop/skip, `fl` flatten, `cpt` compact, `shuf` shuffle, `cat` slurp, `il` interleave, `en` enumerate, `wi` with_index, `chk` chunk, `zp` zip, `fst` first, `frq` frequencies, `win` windowed
  - Unique/Dedup: `uq` uniq, `dup` dedup
  - Serialize: `tj` to_json, `ty` to_yaml, `tt` to_toml, `tc` to_csv, `tx` to_xml, `th` to_html, `tmd` to_markdown, `dd` ddump, `xo` xopen
  - Deserialize/Encode: `jd` json_decode, `yd` yaml_decode, `td` toml_decode, `xd` xml_decode, `je` json_encode, `ye` yaml_encode, `te` toml_encode, `xe` xml_encode
  - Stats: `sq` sqrt, `med` median, `std` stddev, `var` variance, `clp` clamp, `nrm` normalize
  - Crypto: `s1` sha1, `s256` sha256, `m5` md5, `uid` uuid
  - Encoding: `b64e` base64_encode, `b64d` base64_decode, `hxe` hex_encode, `hxd` hex_decode, `ue` url_encode, `ud` url_decode, `gz` gzip, `ugz` gunzip, `zst` zstd, `uzst` zstd_decode
  - File/Path: `sl` slurp, `wf` write_file, `rl` read_lines, `rb` read_bytes, `af` append_file, `rj` read_json, `wj` write_json, `bn` basename, `dn` dirname, `rp` realpath, `wh` which, `pwd` getcwd, `tf` tempfile, `tdr` tempdir, `hn` gethostname
  - HTTP: `ft` fetch, `ftj` fetch_json, `fta` fetch_async, `hr` http_request, `pft` par_fetch
  - CSV/Data: `cr` csv_read, `cw` csv_write, `pcr` par_csv_read, `df` dataframe, `sql` sqlite
  - DateTime: `utc` datetime_utc, `now` datetime_now_tz, `dte` datetime_from_epoch, `dtf` datetime_strftime
  - Misc: `el` elapsed, `def` defined, `rss` proc_mem
- `fn` keyword — alias for `sub`. Both `fn name { }` and `fn { }` work identically to `sub`:

```perl
my $f = fn { _ * 2 };
say $f->(21);    # 42
```
- Closure arguments `$_0`, `$_1`, ... `$_N` — numeric closure arguments inspired by Swift. All arguments passed to any fn (named or anonymous) are available as `$_0` (first), `$_1` (second), `$_2` (third), up to `$_N` for any number of arguments. These work alongside or instead of Perl's `@_`, `$_`, `$a`, `$b`. Bare `_`, `$_`, and `$_0` all refer to the first argument — `_ * 2`, `$_ * 2`, and `$_0 * 2` are equivalent. Use bare `_` for maximum conciseness in blocks:

```perl
my $add3 = fn { $_0 + $_1 + $_2 };
$add3->(1, 2, 3);                       # 6

my $mul5 = fn { $_0 * $_1 * $_2 * $_3 * $_4 };
$mul5->(1, 2, 3, 4, 5);                 # 120

(1..5) |> reduce { $_0 + $_1 };         # 15
```
- Block params `{ |$var| body }` — name the block's implicit arguments with Ruby-style `|$params|` at the start of a block. For single-param blocks (`map`, `grep`, `each`), the param aliases `$_`. For two-param blocks (`sort`, `reduce`), they alias `$a`/`$b`. For N≥3 params, they alias `$_`, `$_1`, `$_2`, etc.:

```perl
map  { |$n| $n * $n } 1..5;
grep { |$x| $x > 3 } 1..6;
sort { |$x, $y| $y <=> $x } 3, 1, 4, 1, 5;
reduce { |$acc, $val| $acc + $val } 1..10;
(1..3) |> map { |$n| $n + 10 };
```
stryke is not a full perl replacement: many real .pm files (especially XS modules) will not run. See PARITY_ROADMAP.md.
[0x09] ARCHITECTURE
┌─────────────────────────────────────────────────────┐
│ Source ──▶ Lexer ──▶ Parser ──▶ AST │
│ │ │
│ ▼ │
│ Compiler (compiler.rs) │
│ │ │
│ ▼ │
│ Bytecode (bytecode.rs) │
│ │ │
│ ┌───────────────────────┼───────────┐ │
│ ▼ ▼ ▼ │
│ Tree-walker fallback VM (vm.rs) Cranelift │
│ (interpreter.rs) │ JIT │
│ ▼ │
│ Rayon work-stealing scheduler │
│ CORE 0 │ CORE 1 │ ... │ CORE N │
└─────────────────────────────────────────────────────┘
- Lexer (`src/lexer.rs`) — context-sensitive tokenizer for Perl's ambiguous syntax (regex vs division, hash vs modulo, heredocs, interpolation).
- Parser (`src/parser.rs`) — recursive descent + Pratt precedence climbing.
- Compiler / VM (`src/compiler.rs`, `src/vm.rs`) — match-dispatch interpreter; `try_vm_execute` runs bytecode first, then falls back to the tree-walker on `CompileError::Unsupported` or unsupported ops. Compiled subs use slot ops for frame-local `my` scalars (O(1)). Lowering covers `BEGIN`/`UNITCHECK`/`CHECK`/`INIT`/`END` with `Op::SetGlobalPhase`, `mysync`, `tie`, scalar compound assigns via `Scope::atomic_mutate`, regex values, named-sub coderefs, folds, `pcache`, `pselect`, `par_lines`, `par_walk`, `par_sed`, `pwatch`, `each`, four-arg `substr`, dynamic `keys`/`values`/`delete`/`exists`, etc.
- JIT (`src/jit.rs`) — Cranelift two-tier JIT (linear-sub + block) with cached `OwnedTargetIsa`, tiered after `STRYKE_JIT_SUB_INVOKES` (default 50) interpreter invocations. Block JIT validates a CFG, joins typed `i64`/`f64` slots at merges, and compiles straight-line numeric hot loops. Disable with `--no-jit` / `STRYKE_NO_JIT=1`.
- Feature work policy — prefer new VM opcodes in `bytecode.rs`, lowering in `compiler.rs`, implementation in `vm.rs`. Do not add new `ExprKind`/`StmtKind` variants for new behavior.
- Tree-walker (`src/interpreter.rs`) — fallback execution with `Arc<RwLock>` for thread-safe ref types; `fib_like_tail.rs` specializes simple integer-base-case recursive `f(n-1)+f(n-2)` patterns to avoid nested scope frames.
- Parallelism — each parallel block spawns an isolated interpreter with captured scope; rayon does work-stealing.
[0x0A] EXAMPLES
# sets: dedupe + union / intersection (`scalar` gives member count, like `scalar @array`)
[0x0B] BENCHMARKS
stryke vs perl5 vs python3 vs ruby vs julia vs raku vs luajit
bash bench/run_bench_all.sh — stryke vs perl 5.42.2 vs Python 3.14.4 vs Ruby 4.0.2 vs Julia 1.12.6 vs Raku vs LuaJIT on Apple M5 18-core. Mean of 10 hyperfine runs with 3 warmups; includes process startup (not steady-state). Values <1.0x mean stryke is faster.
stryke benchmark harness (multi-language)
──────────────────────────────────────────────
stryke: stryke v0.7.7
perl5: perl 5.42.2 (darwin-thread-multi-2level)
python: Python 3.14.4
ruby: ruby 4.0.2 +PRISM [arm64-darwin25]
julia: julia 1.12.6
raku: Rakudo Star v2026.03
luajit: LuaJIT 2.1.1774896198
cores: 18
warmup: 3 runs
measure: hyperfine (min 10 runs)
bench stryke ms perl5 ms python3 ms ruby ms julia ms raku ms luajit ms vs perl5 vs python vs ruby vs julia vs raku vs luajit
--------- --------- -------- ---------- ------- -------- ------- --------- -------- --------- ------- -------- ------- ---------
startup 3.3 2.3 14.3 23.8 68.3 71.4 1.5 1.43x 0.23x 0.14x 0.05x 0.05x 2.20x
fib 6.7 184.0 60.1 56.6 76.4 261.3 4.7 0.04x 0.11x 0.12x 0.09x 0.03x 1.43x
loop 3.2 91.2 191.4 77.8 78.1 159.4 4.3 0.04x 0.02x 0.04x 0.04x 0.02x 0.74x
string 4.0 10.2 26.8 44.7 83.2 124.2 3.3 0.39x 0.15x 0.09x 0.05x 0.03x 1.21x
hash 6.8 24.6 25.5 32.6 105.7 143.7 2.0 0.28x 0.27x 0.21x 0.06x 0.05x 3.40x
array 9.8 24.8 33.2 39.4 88.2 843.9 59.0 0.40x 0.30x 0.25x 0.11x 0.01x 0.17x
regex 12.6 89.7 264.0 234.3 94.4 25043.8 178.2 0.14x 0.05x 0.05x 0.13x 0.00x 0.07x
map_grep 13.9 48.8 35.9 48.8 90.5 492.4 3.3 0.28x 0.39x 0.28x 0.15x 0.03x 4.21x
stryke vs perl5 — faster on 7 of 8 benches: fib 27x, loop 29x, regex 7.1x, hash 3.6x, map_grep 3.5x, string 2.6x, array 2.5x. perl5 wins startup (2.3 ms vs 3.3 ms, 1.43x).
stryke vs python3 — faster on all 8 benches: loop 60x, regex 21x, string 6.7x, fib 9.0x, startup 4.3x, hash 3.8x, array 3.4x, map_grep 2.6x.
stryke vs ruby — faster on all 8 benches: regex 19x, loop 24x, string 11x, fib 8.4x, startup 7.2x, hash 4.8x, array 4.0x, map_grep 3.5x.
stryke vs julia — faster on all 8 benches: loop 24x, startup 21x, string 21x, hash 16x, fib 11x, array 9.0x, regex 7.5x, map_grep 6.5x. Julia timings include LLVM JIT compilation cost — in long-running sessions Julia compiles to native code and would match C on numeric work. These benchmarks measure scripting use cases where startup + single-shot execution matters.
stryke vs raku — faster on all 8 benches by 20-2000x. Raku's regex is 25044ms vs stryke's 12.6ms (1988x). Raku (Perl 6) runs on MoarVM with heavy startup (~70ms+). Raku's strengths are language features (grammars, gradual typing, junctions), not runtime speed.
stryke vs luajit — LuaJIT is the fastest dynamic language runtime ever built (tracing JIT by Mike Pall). stryke beats LuaJIT on 3 of 8 benchmarks: loop (0.74x), array (0.17x), regex (0.07x). Near-parity on string (1.21x) and fib (1.43x). LuaJIT wins on startup (2.2x), hash (3.4x), and map_grep (4.2x), where its tracing JIT eliminates all dispatch overhead. LuaJIT uses Lua patterns (not PCRE) for the regex bench. stryke offers what LuaJIT cannot: $_, -ne, regex literals, PCRE, parallel primitives (pmap, pmaps, pgrep), streaming iterators, and one-liner ergonomics.
stryke vs perl5 (detailed)
bash bench/run_bench.sh — includes noJIT and perturbation columns for honesty verification. Re-run to get current numbers on your hardware.
Parallel & streaming speedup (100k items, $_ * 2)
map (eager, sequential): 0.01s — inline execution, zero per-item overhead
maps (streaming, sequential): 0.11s — lazy iterator, single interpreter reused
pmap (eager, 18 cores): 0.14s — pre-built interpreter pool, rayon par_iter
pmaps (streaming, 18 cores): 0.49s — background worker threads, bounded channel
maps/pmaps are streaming — they return lazy iterators that never materialize the full result list. Use pmaps for pipelines over billions of items where holding all results in memory is impractical, or with take for early termination: range(0, 1e9) |> pmaps { expensive($_) } |> take 10 |> ep.
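The memory argument is easy to see with a lazy stand-in. A minimal sketch (Python for illustration — sequential, so it models only the streaming/laziness, not the 18-core parallelism): taking 10 items from a billion-element pipeline computes exactly 10 elements.

```python
from itertools import islice

def maps_sketch(fn, items):
    """Lazy map: yields results one at a time, never materializing the full list."""
    for x in items:
        yield fn(x)

# take 10 from a 10^9-element pipeline: only 10 items are ever computed
first10 = list(islice(maps_sketch(lambda x: x * 2, range(10**9)), 10))
print(first10)
```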
[0x0C] DEVELOPMENT & CI
Pull requests and pushes to main run .github/workflows/ci.yml (Check, Test, Format, Clippy, Doc, Parity, Release Build).
- `Cargo.lock` is committed (CI uses `--locked`). If your global gitignore strips it, force-add updates: `git add -f Cargo.lock`.
- Disable JIT: `STRYKE_NO_JIT=1` or `stryke --no-jit`.
- Parity work is tracked in `PARITY_ROADMAP.md`.
[0x0D] STANDALONE BINARIES (stryke build)
Compile any Perl script to a single self-contained native executable. The output is a copy of the stryke binary with the script source embedded as a zstd-compressed trailer. scp it to any compatible machine and run it — no perl, no stryke, no @INC, no CPAN.
What's in the box:
- Parse / compile errors are surfaced at build time, not when users run the binary.
- The embedded script is detected at startup by a 32-byte trailer sniff (~50 µs), then decompressed and executed by the embedded VM. A script with no trailer runs normally as `stryke`.
- Builds are idempotent: `stryke build app.stk -o app` followed by `stryke --exe app build other.stk -o other` strips the previous trailer first, so binaries never stack.
- Unix: the output is marked `+x` automatically. macOS: unsigned — `codesign` before distribution if your environment requires it.
- Current AOT runtime sets `@INC = (".")`; modules outside the embedded script have to be inlined. (`require` of a local `.pm` next to the running binary still works.)
Under the hood (src/aot.rs): trailer layout is [zstd payload][u64 compressed_len][u64 uncompressed_len][u32 version][u32 reserved][8B magic b"STRYKEAOT"]. ELF / Mach-O loaders ignore bytes past the mapped segments so the embedded payload is invisible to the OS loader. The b"STRYKEAOT" magic plus version byte lets a future pre-compiled-bytecode payload ship alongside v1 without breaking already-shipped binaries.
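The trailer layout is simple enough to model in a few lines. A sketch (Python, purely illustrative — the real encoder is `src/aot.rs`; little-endian field order is an assumption, and this sketch stores the payload uncompressed rather than running the zstd step):

```python
import struct

MAGIC = b"STRYKEAOT"
# [payload][u64 compressed_len][u64 uncompressed_len][u32 version][u32 reserved][magic]
FOOTER = struct.Struct("<QQII")  # little-endian is an assumption of this sketch

def embed(binary: bytes, payload: bytes, version: int = 1) -> bytes:
    """Append a script payload + fixed-size footer + magic to a host binary."""
    footer = FOOTER.pack(len(payload), len(payload), version, 0)
    return binary + payload + footer + MAGIC

def sniff(image: bytes):
    """Return (version, payload) if a trailer is present, else None."""
    if not image.endswith(MAGIC):
        return None                      # plain binary: run as normal stryke
    end = len(image) - len(MAGIC)
    clen, ulen, version, _ = FOOTER.unpack(image[end - FOOTER.size:end])
    payload = image[end - FOOTER.size - clen:end - FOOTER.size]
    return version, payload
```

Because the footer is fixed-size and sits at the very end of the file, detection only needs to read the last few dozen bytes — which is why the startup sniff is effectively free.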
# 13 MB binary, no external runtime required:
stryke build app.stk -o app && ./app
[0x0E] INLINE RUST FFI (rust { ... })
Drop a block of Rust directly into a Perl script. On first run, stryke compiles it to a cdylib (cached at ~/.cache/stryke/ffi/<hash>.{dylib,so}), dlopens it, and registers every exported function as a regular Perl-callable sub.
```perl
rust {
    pub extern "C" fn add(a: i64, b: i64) -> i64 { a + b }
    pub extern "C" fn fib(n: i64) -> i64 {
        let (mut a, mut b) = (0i64, 1i64);
        for _ in 0..n { let t = a + b; a = b; b = t; }
        a
    }
}

say add(21, 21);   # 42
say fib(50);
```
v1 signature table (parser rejects anything outside this — users write private Rust helpers freely, only exported fns matching the table become Perl-callable):
| rust signature | perl call |
|---|---|
| `fn() -> i64` / `fn(i64, ...) -> i64` (1–4 args) | integer → integer |
| `fn() -> f64` / `fn(f64, ...) -> f64` (1–3 args) | float → float |
| `fn(*const c_char) -> i64` | string → integer |
| `fn(*const c_char) -> *const c_char` | string → string |
Requirements: rustc must be on PATH. First-run compile costs ~1 second; subsequent runs hit the cache and pay only dlopen (~10 ms). #[no_mangle] is auto-inserted by the wrapper — you don't need to write it. The body is #![crate_type = "cdylib"] with use std::os::raw::c_char; use std::ffi::{CStr, CString}; already in scope.
How it works (src/rust_sugar.rs, src/rust_ffi.rs): the source-level pre-pass desugars every top-level rust { ... } into a BEGIN { __stryke_rust_compile("<base64 body>", $line); } call. The __stryke_rust_compile builtin hashes the body, compiles via rustc --edition=2021 -O if the cache is cold, libc::dlopens the result, dlsyms each detected signature, and stores the raw symbol + arity/type tag in a process-global registry. Calls from Perl flow through a fallback arm in [crate::builtins::try_builtin] that dispatches on the signature tag via direct function-pointer transmute — no libffi dep, no per-call alloc, no marshalling overhead beyond the PerlValue::to_int / to_number / to_string calls you'd do for any builtin.
Combine with AOT for zero-friction deployment: stryke build script.stk -o prog bakes the Perl source — which includes the rust { ... } block — into a standalone binary. The FFI compile still happens on first run of ./prog, but the user only needs rustc once, then the ~/.cache/stryke/ffi/ entry is permanent.
Limitations (v1):
- Unix only (macOS + Linux). Windows support is a dlopen-equivalent swap away but isn't wired.
- Signatures beyond the table above are silently ignored (the function still exists in the cdylib, just not Perl-callable).
- Body must be self-contained Rust with `std` only — no `Cargo.toml` / external crate deps. If you need `regex` or similar, vendor the minimal code into the block.
- The cdylib runs with the calling process's privileges. Trust model is equivalent to `do FILE`.
[0x0F] BYTECODE CACHE (.pec)
STRYKE_BC_CACHE=1 enables the on-disk bytecode cache. The first run of a script parses + compiles + persists a .pec bundle to ~/.cache/stryke/bc/<sha256>.pec. Every subsequent run skips both parse and compile and feeds the cached chunk straight into the VM.
STRYKE_BC_CACHE=1 stryke script.stk   # cold: parse + compile + persist .pec
STRYKE_BC_CACHE=1 stryke script.stk   # warm: deserialize .pec, straight into the VM
Measured impact (Apple M5, 13 MB release stryke, hyperfine --warmup 5 -N, mean ± σ):
| script | cold (no cache) | warm (.pec) | speedup | .pec size |
|---|---|---|---|---|
| 6 002 lines, 3000 subs | 67.9 ms ± 5.1 | 19.9 ms ± 1.0 | 3.41× | 47 KB |
| 1 002 lines, 500 subs | 6.8 ms ± 0.5 | 6.5 ms ± 0.5 | 1.06× wall, 1.32× user CPU | 5 KB |
| 3 lines (toy) | 3.5 ms ± 0.3 | 4.8 ms ± 0.4 | cache loses | 1.9 KB |
The toy-script result is the honest one to call out: for tiny scripts the cache deserialize cost outweighs the parse cost it replaces. The cache wins decisively on anything substantial — startup time becomes O(deserialize) instead of O(parse + compile).
Tuning knobs:
- `STRYKE_BC_CACHE=1` — opt-in. (V1 is opt-in to avoid surprising users with stray cache files; flip to opt-out once we have a `stryke cache prune` subcommand and confidence in invalidation.)
- `STRYKE_BC_DIR=/path/to/dir` — override the cache location. Useful for test isolation and CI.
Format (src/pec.rs): [4B magic b"PEC2"][zstd-compressed bincode of PecBundle]. The PecBundle carries format_version, pointer_width (so a cache built on a 64-bit host is rejected on 32-bit), strict_vars (a mismatch is treated as a clean miss → re-compile), source_fingerprint, the parsed Program, and the compiled Chunk. Format version 2 introduced zstd compression — files dropped ~10× in size and warm-load latency dropped with them.
Cache key (pec::source_fingerprint): SHA-256 of (crate version, source filename, full source including -M prelude). Editing the script, upgrading stryke, or changing the -M flags all force a recompile. The crate version is mixed in so a cargo install strykelang upgrade silently invalidates everyone's cache rather than risking a stale-bytecode mismatch.
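The invalidation rule is simply "hash every input that could change the compiled output". A hypothetical sketch (Python for illustration; the exact byte layout and separators used by `pec::source_fingerprint` are assumptions):

```python
import hashlib

def source_fingerprint(crate_version: str, filename: str, source: str) -> str:
    """Cache key = SHA-256 over (crate version, filename, full source).
    NUL separators are an assumption of this sketch -- they keep
    ("ab", "c") and ("a", "bc") from hashing identically."""
    h = hashlib.sha256()
    for part in (crate_version, filename, source):
        h.update(part.encode("utf-8"))
        h.update(b"\x00")
    return h.hexdigest()

k_old = source_fingerprint("0.7.7", "app.stk", 'say "hi";')
k_new = source_fingerprint("0.7.8", "app.stk", 'say "hi";')  # upgrade -> new key
```

Any change to the version, filename, or source yields a different key, so a stale `.pec` can never be loaded for mismatched input — it simply misses and recompiles.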
Pairs with stryke build: AOT binaries pick up the cache for free. The first run of a shipped binary parses and compiles the embedded source; every subsequent run on the same machine reuses the cached chunk. The cache key includes the script name baked into the trailer, so two binaries with different embedded scripts never collide.
Limitations (v1):
- Bypassed for `-e`/`-E` one-liners. Measured: warm `.pec` is ~2-3× slower than cold for tiny scripts because the deserialize cost (~1-2 ms for fs read + zstd decode + bincode) dominates the parse+compile work it replaces (~500 µs). Each unique `-e` invocation would also pollute the cache directory with no GC. The break-even is around 1000 lines, so file-based scripts only.
- Bypassed for `-n`/`-p`/`--lint`/`--check`/`--ast`/`--fmt`/`--profile` modes (those paths run a different driver loop).
- No automatic eviction yet — old `.pec` files for edited scripts accumulate. `rm ~/.cache/stryke/bc/*.pec` is a fine workaround until `stryke cache prune` lands.
- Cache hit path cannot fall back to the tree walker mid-run — but this is unreachable in practice because `compile_program` only emits ops the VM implements before persisting.
[0x10] DISTRIBUTED pmap_on OVER SSH (cluster)
Distribute a pmap-style fan-out across many machines via SSH. The dispatcher spawns one persistent stryke --remote-worker process per slot, performs a HELLO + SESSION_INIT handshake once per slot, then streams JOB frames over the same stdin/stdout. Pairs perfectly with stryke build: ship one binary to N hosts, fan the workload across them.
```perl
# host names, slot counts, and paths here are illustrative
my $cluster = cluster([
    "build1",                                               # one slot, remote stryke from $PATH
    "build1:4",                                             # four slots
    { host => "build2", slots => 12, stryke => "/path/to/stryke" },
    { timeout => 30, retries => 2, connect_timeout => 5 },  # cluster-wide tunables, last
]);

my @hashes = @big_files |> pmap_on $cluster { slurp_raw |> sha256 };
```
Cluster syntax
Each list element to cluster([...]) is one of:
| Form | Meaning |
|---|---|
| `"host"` | One slot on host, remote stryke from `$PATH` |
| `"host:N"` | N slots on host |
| `"host:N:/path/to/stryke"` | N slots, custom remote stryke binary |
| `"user@host:N"` | ssh user override (kept verbatim, passed through to ssh) |
| `{ host => "...", slots => N, stryke => "..." }` | Hashref form with explicit fields |
| trailing `{ timeout => SECS, retries => N, connect_timeout => SECS }` | Cluster-wide tunables (must be the last argument; consumed only when all keys are tunable names) |
Tunables (defaults shown):
| Key | Default | Meaning |
|---|---|---|
| `timeout` (alias `job_timeout`) | `60` | Per-job wall-clock budget in seconds. Slots that exceed this are killed and the job is re-enqueued. |
| `retries` | `2` | Retries per job on top of the initial attempt. `retries=2` → up to 3 total tries. |
| `connect_timeout` | `10` | `ssh -o ConnectTimeout=N` for the initial handshake. |
Architecture
main thread ┌── slot 0 (ssh build1) ────┐
┌──────────────────┐ │ worker thread + ssh proc │
│ enqueue all jobs ├──► work_tx ─►│ HELLO + SESSION_INIT once │
│ collect results │ │ loop: take JOB from queue │
└──────────────────┘ │ send + read │
▲ │ push to results │
│ └────────────────────────────┘
│ ┌── slot 1 (ssh build1) ────┐
│ │ worker thread + ssh proc │
│ └────────────────────────────┘
│ ┌── slot 2 (ssh build2) ────┐
│ │ ... │
│ └────────────────────────────┘
│ │
└────────── result_rx ───────────────┘
Each slot runs in its own thread and pulls JOB messages from a shared crossbeam channel. Work-stealing emerges naturally — fast slots drain the queue faster, slow slots take fewer jobs. No round-robin assignment, which was the basic v1 implementation's biggest performance bug (fast hosts sat idle while slow hosts queued). The Interpreter on each remote worker is reused across jobs so package state, sub registrations, and module loads survive between items.
Wire protocol (v2)
Every message is [u64 LE length][u8 kind][bincode payload]. The single-byte kind discriminator lets future revisions extend the protocol without breaking older workers — an unknown kind is a hard error so version skew is loud. See src/remote_wire.rs.
dispatcher worker
│ │
│── HELLO ─────────────────►│ (proto version, build id)
│◄───────────── HELLO_ACK ──│ (worker stryke version, hostname)
│── SESSION_INIT ──────────►│ (subs prelude, block source, captured lexicals)
│◄────────── SESSION_ACK ───│ (or ERROR)
│── JOB(seq=0) ────────────►│ (item)
│◄────────── JOB_RESP(0) ───│
│── JOB(seq=1) ────────────►│
│◄────────── JOB_RESP(1) ───│
│ ... │
│── SHUTDOWN ──────────────►│
│ └─ exit 0
The basic v1 protocol shipped the entire subs prelude on every job and spawned a fresh ssh process per item. For a 10k-item map across 8 hosts that's 10 000 ssh handshakes (~50–200 ms each) + 10 000 copies of the subs prelude over the wire — minutes of overhead before any work runs. The v2 persistent session amortizes the handshake across the whole map and ships the prelude once.
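The framing itself can be sketched in a few lines. A minimal model (Python for illustration; whether the u64 length field covers the kind byte, and the bincode payload encoding, are assumptions — this sketch counts the kind byte in the length and treats the payload as opaque bytes):

```python
import struct

def encode_frame(kind: int, payload: bytes) -> bytes:
    # [u64 LE length][u8 kind][payload]; length covers kind + payload in this sketch
    return struct.pack("<Q", 1 + len(payload)) + bytes([kind]) + payload

def decode_frame(buf: bytes):
    """Return (kind, payload, rest-of-buffer) for the first complete frame."""
    (length,) = struct.unpack_from("<Q", buf)
    body = buf[8:8 + length]
    return body[0], body[1:], buf[8 + length:]

KIND_JOB = 4   # illustrative kind number, not the real discriminator
frame = encode_frame(KIND_JOB, b'{"seq":0,"item":7}')
kind, payload, rest = decode_frame(frame + encode_frame(9, b""))
```

Length-prefixed frames let both sides read from a plain byte stream (here, ssh's stdin/stdout) without any delimiter scanning, and the kind byte keeps the stream extensible.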
Fault tolerance
When a slot's read or write fails (ssh died, network blip, remote crash, per-job timeout), the worker thread re-enqueues the in-flight job to the shared queue with attempts++ and exits. Other living slots pick the job up. A job is permanently failed when its attempt count reaches cluster.max_attempts. The whole map fails only when every slot is dead or every queued job has exhausted its retry budget.
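The retry accounting reduces to a small loop over a shared queue. A toy model (Python; `max_attempts` here corresponds to the initial attempt plus `retries`, and a single thread stands in for the slot pool) showing that a job re-enqueued by dying slots either eventually succeeds on a living slot or fails permanently:

```python
from collections import deque

def run_with_retries(jobs, attempt_job, max_attempts):
    """attempt_job(job, attempt) -> result, or None when the slot died / timed out.
    A failed job is re-enqueued with attempts+1 until max_attempts is reached."""
    queue = deque((job, 1) for job in jobs)
    done, failed = {}, []
    while queue:
        job, attempt = queue.popleft()
        result = attempt_job(job, attempt)
        if result is not None:
            done[job] = result
        elif attempt < max_attempts:
            queue.append((job, attempt + 1))   # another living slot picks it up
        else:
            failed.append(job)                 # retry budget exhausted
    return done, failed

# job 2 fails twice then succeeds; job 3 always fails
flaky = lambda job, attempt: None if (job == 2 and attempt < 3) or job == 3 else job * 10
done, failed = run_with_retries([1, 2, 3], flaky, max_attempts=3)
```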
stryke --remote-worker
The worker subprocess. Reads a HELLO frame from stdin, parses the subs prelude + block source from SESSION_INIT exactly once, then handles JOB frames in a loop until SHUTDOWN or stdin EOF. Started by the dispatcher via `ssh HOST /path/to/stryke --remote-worker`. Also reachable directly for local testing by piping frames into `stryke --remote-worker` on stdin.
Limitations (v1)
- Unix only — hardcoded `ssh`, hardcoded POSIX dlopen path. Windows would need a similar shim.
- JSON-marshalled values — the `serde_json` round-trip loses bigints, blessed refs, and other heap-only `PerlValue` payloads. The supported types are: undef, bool, i64, f64, string, array, hash. Anything outside that returns an error from `pmap_on`.
- `mysync` / atomic capture is rejected — shared state across remote workers can't honour the cross-process mutex semantics in v1. Use the result list and aggregate locally.
- No streaming results — the dispatcher buffers the full result vector before returning. For huge fan-outs this is the next thing to fix (likely via `pchannel` integration).
- No SSH connection pool across calls — each `pmap_on` invocation builds fresh sessions. Subsequent `pmap_on` calls in the same script reconnect from scratch.
[0x11] LANGUAGE SERVER (stryke lsp)
stryke lsp (or stryke --lsp) runs an LSP server over stdio. Hooks into the existing parser, lexer, and symbol table — no separate analyzer to maintain. Surfaces:
- Diagnostics on save (parse + compile errors with line / column / message)
- Hover docs for builtins (`pmap`, `cluster`, `fetch_json`, `dataframe`, …) — including the parallel and cluster primitives from sections [0x03] and [0x10]
- Symbol lookup for subs and packages within the open file
- Completion for built-in function names and the keywords listed in [0x08]
Wire it into VS Code, JetBrains, or any LSP-aware editor by pointing the client at stryke lsp (or stryke --lsp) as the language-server command. There is no separate stryke-lsp binary — the same stryke you run scripts with also acts as its own language server.
// .vscode/settings.json
{
"stryke.serverPath": "/usr/local/bin/stryke",
"stryke.serverArgs": ["--lsp"]
}
[0x12] LANGUAGE REFLECTION
stryke exposes its own parser and dispatcher state as plain Perl hashes, so you can enumerate, look up, filter, and pipe over everything the interpreter knows about — no separate API surface to learn, just standard hash ops.
The data is derived at compile time by build.rs from the source of truth:
section-commented groups in is_perl5_core / stryke_extension_name (for
categories), try_builtin arm names (for aliases), and doc_for_label_text
in src/lsp.rs (for descriptions). No hand-maintained list, no stale counts.
Hashes
Eight hashes; every direct lookup ($h{name}) is O(1). Forward maps:
| Long name | Short | Key → Value |
|---|---|---|
| `%stryke::builtins` | `%b` | primary callable name → category (`"parallel"`, `"string"`, …). Primaries-only — clean unique-op count. |
| `%stryke::all` | `%all` | every spelling (primary + alias) → category. Aliases inherit their primary's tag. Use this for `scalar keys %all`. |
| `%stryke::perl_compats` | `%pc` | subset of `%b`: Perl 5 core only, name → category |
| `%stryke::extensions` | `%e` | subset of `%b`: stryke-only, name → category |
| `%stryke::aliases` | `%a` | alias → canonical primary (`$a{tj}` → `"to_json"`) |
| `%stryke::descriptions` | `%d` | name → one-line LSP summary (sparse) |
Inverted indexes for constant-time reverse queries:
| Long name | Short | Key → Value |
|---|---|---|
| `%stryke::categories` | `%c` | category → arrayref of names (`$c{parallel}` → `[pmap, pgrep, …]`) |
| `%stryke::primaries` | `%p` | primary → arrayref of its aliases (`$p{to_json}` → `[tj]`) |
Examples
# O(1) direct lookups
say $b{pmap};                              # "parallel"
say $a{tj};                                # "to_json"
# total callable spellings (primaries + aliases), one direct count
say scalar keys %all;
# see just Perl compats
say scalar keys %pc;
# see just stryke extensions
say scalar keys %e;
# enumerate a whole category in O(1)
say join ", ", @{$c{parallel}};
# browse any of them interactively via the pager
# frequency table: how many ops per category?
my %freq; $freq{$_}++ for values %b;
# find every documented op mentioning "parallel"
say for grep { $d{$_} =~ /parallel/i } keys %d;
# catalog the full reflection surface
Notes
- Every direct `$h{name}` lookup is O(1). Filter queries (`grep { cond } keys %h`) are O(n), but the two inverted indexes (`%c`, `%p`) give you O(1) reverse lookups for the two most common "find names by property" queries.
- The hash sigil namespace is separate from scalars and subs, so `%a`/`%b`/`%c`/`%d`/`%e`/`%p`/`%pc` don't collide with the `$a`/`$b` sort specials or the `e` extension sub.
- Short aliases are value copies of the long `%stryke::*` names — currently read-only in practice, so the copy never diverges.
- `%descriptions` is sparse: `exists $d{$name}` doubles as "is this documented in the LSP?". Undocumented ops still appear in `%builtins` with a category — they just lack a hover summary.
- A value of `"uncategorized"` in `%builtins` means the name is dispatched at runtime but doesn't match any `// ── category ──` section comment in `parser.rs` yet — a flag for "add a section header here", not an error.
[0x13] ZSHRS SHELL
zshrs is the most powerful shell ever created — a drop-in zsh replacement written in Rust that combines full bash/zsh/fish script compatibility with SQLite-indexed completions and native stryke parallel operations.
Install
Two build options:
# FAT build (24MB) - includes stryke for @ parallel mode
# LEAN build (5MB) - pure shell, no stryke dependency
| Build | Size | `@` mode | Use case |
|---|---|---|---|
| Fat (`strykelang`) | ~24MB | Yes | Full power — parallel ops, JSON, HTTP in shell |
| Lean (`zsh` crate) | ~4MB | No | Minimal — just the shell, fast startup |
Why zshrs?
| vs zsh | vs fish | vs bash |
|---|---|---|
| 10x faster startup | Full POSIX compat | Fish-quality UX |
| SQLite history (frequency, duration, exit status) | Runs `.bashrc` unchanged | Modern completions |
| Native stryke parallel ops | No syntax translation needed | Syntax highlighting |
| ZWC precompiled function support | Global/suffix aliases | Autosuggestions |
Core features
| Feature | Description |
|---|---|
| Full zsh/bash/fish compatibility | Runs existing .zshrc, .bashrc, shell scripts unchanged |
| Fish-style syntax highlighting | Real-time colorization as you type — commands green, strings yellow, errors red, variables cyan |
| Fish-style autosuggestions | Ghost-text completions from history, accept with → |
| Fish-style abbreviations | g → git, gco → git checkout, expandable with space |
| SQLite-indexed completions | FTS5 prefix search indexes all PATH executables for instant fuzzy completion |
| SQLite history | Frequency-ranked, timestamped, tracks duration and exit status per command |
| Native stryke mode | Prefix any line with @ to execute as stryke code with full parallel primitives |
| ZWC support | Reads compiled .zwc files for fast function loading from fpath |
| Job control | Full &, fg, bg, jobs, wait, disown, suspend support |
| Traps & signals | trap 'cmd' EXIT INT TERM ERR DEBUG with proper cleanup |
| PS1/PROMPT escapes | %n, %m, %~, %?, %j, %T, %D, \u, \h, \w, \W, \$ |
| Named directories | hash -d proj=/path/to/project then cd ~proj |
| Hooks | precmd, preexec, chpwd, periodic, add-zsh-hook |
| zsh modules | zpty, zsocket, zprof, sched, zformat, zparseopts, zregexparse |
| PCRE regex | pcre_compile, pcre_match with capture groups |
| Shell emulation | emulate zsh, emulate bash, emulate ksh, emulate sh |
Usage
In interactive mode, prefix any line with @ to enter stryke mode:
@ … |> sum |> p
@ … |> dd
@ … |> pmap { … } |> p     # parallel file sizes
@ … |> pgrep { … } |> p    # parallel prime count
Default abbreviations (fish-style)
Abbreviations expand inline when you press space — type `gs<space>` and it becomes `git status `.
| Abbr | Expansion | Abbr | Expansion |
|---|---|---|---|
| `g` | `git` | `ga` | `git add` |
| `gaa` | `git add --all` | `gc` | `git commit` |
| `gcm` | `git commit -m` | `gco` | `git checkout` |
| `gd` | `git diff` | `gds` | `git diff --staged` |
| `gp` | `git push` | `gpl` | `git pull` |
| `gs` | `git status` | `gsw` | `git switch` |
| `gb` | `git branch` | `gl` | `git log --oneline` |
| `gst` | `git stash` | `grb` | `git rebase` |
| `gm` | `git merge` | `gf` | `git fetch` |
| `cb` | `cargo build` | `cr` | `cargo run` |
| `ct` | `cargo test` | `cc` | `cargo check` |
| `cf` | `cargo fmt` | `ccl` | `cargo clippy` |
| `dc` | `docker compose` | `dcu` | `docker compose up` |
| `dcd` | `docker compose down` | `dps` | `docker ps` |
| `k` | `kubectl` | `kgp` | `kubectl get pods` |
| `kgs` | `kubectl get services` | `kgd` | `kubectl get deployments` |
| `l` | `ls -la` | `ll` | `ls -l` |
| `la` | `ls -la` | `md` | `mkdir -p` |
| `...` | `../..` | `....` | `../../..` |
| `.....` | `../../../..` | | |
Builtins (complete list)
I/O: echo, printf, print, read, cat
Navigation: cd, pwd, pushd, popd, dirs
Variables: export, unset, readonly, typeset, declare, local, integer, float, set
Aliases: alias, unalias
Sourcing: source, ., eval, exec
Control flow: exit, return, break, continue, shift
Options: setopt, unsetopt, shopt
Functions: autoload, functions, unfunction
Hashing: hash, unhash, rehash
Lookup: whence, where, which, type, command, builtin
Enable/disable: enable, disable
Completion: compgen, complete, compopt, compadd, compset, compctl
Jobs: jobs, fg, bg, wait, disown, suspend, kill
Limits: ulimit, umask, limit, unlimit, times
History: history, fc
Traps: trap
Tests: test, [, [[, true, false, :
Arithmetic: let, (( ))
Emulation: emulate
Modules: zmodload
zsh-specific: zpty, zsocket, zprof, sched, zformat, zparseopts, zregexparse, pcre_compile, pcre_match, zstyle, zstat, zle, bindkey, vared, strftime, promptinit, add-zsh-hook, noglob, nocorrect, repeat, coproc
Bash compat: shopt, help, caller, mapfile, readarray
Parameter expansion (full zsh support)
Defaults and assignment:
${var:-default}  ${var:=assign}  ${var:+alt}  ${var:?error}
Length and substring:
${#var}  ${var:offset:length}
Pattern removal:
${var#pat}  ${var##pat}  ${var%pat}  ${var%%pat}
Replacement:
${var/pat/rep}  ${var//pat/rep}  ${var/#pat/rep}  ${var/%pat/rep}
Case conversion (zsh flags):
${(U)var}  ${(L)var}  ${(C)var}
Array operations (zsh flags):
${(j:,:)arr}  ${(s:,:)str}  ${(o)arr}  ${(u)arr}
Nested expansion:
${${var#prefix}%suffix}
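A minimal runnable sketch of the portable expansion families above (bash-compatible subset; the zsh-only `(U)`/`(L)` and array flags are omitted, and the variable names are illustrative):

```shell
# Sketch of the expansion families listed above (bash/zsh-compatible subset)
v=hello
unset u
echo "${u:-default}"   # default when unset          -> default
echo "${#v}"           # length                      -> 5
echo "${v:1:3}"        # substring (offset 1, len 3) -> ell
p=/usr/local/bin
echo "${p#*/}"         # shortest prefix removal     -> usr/local/bin
echo "${p##*/}"        # longest prefix removal      -> bin
echo "${p%/*}"         # shortest suffix removal     -> /usr/local
echo "${v/l/L}"        # replace first match         -> heLlo
echo "${v//l/L}"       # replace all matches         -> heLLo
```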
Arrays
Indexed arrays:
arr=(a b c d e) # create array
arr[2]=X # assign element
arr+=(f g) # append elements
arr=($(cmd)) # from command substitution
Associative arrays:
declare -A map # or: typeset -A map
map[key]=value # assign
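A runnable sketch of both array kinds (note one assumption: bash and zsh's bash-compat mode index from 0, while native zsh indexes from 1, so `arr[2]` below is the third element):

```shell
arr=(a b c d e)        # create
arr[2]=X               # assign element (index 2 = third element in 0-based mode)
arr+=(f g)             # append
echo "${arr[@]}"       # a b X d e f g
echo "${#arr[@]}"      # 7

declare -A map         # associative array (or: typeset -A map)
map[key]=value
echo "${map[key]}"     # value
```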
Control flow
Conditionals:
if cond; then cmds; fi
if cond; then cmds; else cmds; fi
if cond; then cmds; elif cond2; then cmds; fi
Loops:
for x in a b c; do cmds; done
for ((i = 0; i < 10; i++)); do cmds; done   # C-style
for x; do cmds; done                        # iterates over "$@"
while cond; do cmds; done
until cond; do cmds; done
Case:
case $x in
  a) cmds ;;
  b) cmds ;&    # Fallthrough with ;&
  *) cmds ;;
esac
Select menu:
select choice in one two three; do echo "$choice"; break; done
Always block (zsh):
{ cmds } always { cleanup }
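A compact runnable sketch combining the forms above (values and variable names are illustrative):

```shell
x=3
if (( x > 2 )); then kind=big; else kind=small; fi
echo "$kind"           # big

sum=0
for n in 1 2 3 4; do (( sum += n )); done
echo "$sum"            # 10

case "$kind" in
  big)   echo "case: big" ;;
  small) echo "case: small" ;;
  *)     echo "case: other" ;;
esac
```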
Arithmetic
Basic operations:
echo $(( 2 + 3 * 4 ))
Bitwise:
echo $(( 5 & 3 )) $(( 5 | 3 )) $(( 5 ^ 3 )) $(( ~5 )) $(( 1 << 4 ))
Increment/decrement:
(( i++ ))   # post-increment
(( i-- ))   # post-decrement
Compound assignment:
(( x += 5 )); (( x *= 2 ))
Ternary and comma:
echo $(( x > 0 ? 1 : -1 ))
Number bases:
echo $(( 16#ff )) $(( 8#17 )) $(( 2#1010 ))
let builtin:
let "y = 3 ** 2"
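The arithmetic forms above in one runnable sketch (starting value of `x` is illustrative):

```shell
x=5
echo $(( 2 + 3 * 4 ))       # precedence: 14
echo $(( x & 3 ))           # bitwise AND: 1
(( x++ ))                   # post-increment
echo "$x"                   # 6
(( x *= 2 ))                # compound assignment
echo "$x"                   # 12
echo $(( x > 10 ? 1 : 0 ))  # ternary: 1
echo $(( 16#ff ))           # base-16 literal: 255
```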
Conditionals ([[ ]] and [ ])
Numeric:
# equal
# not equal
# less than
# greater than
# less or equal
# greater or equal
String:
# equal
# not equal
# lexically less
# lexically greater
# empty
# non-empty
# glob match
# regex match (BASH_REMATCH captures)
File tests:
# exists
# regular file
# directory
# symlink
# readable
# writable
# executable
# non-empty
# a newer than b
# a older than b
# same file (inode)
Logical:
# AND
# OR
# NOT
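A small runnable sketch exercising the test operators above (bash-flavored, so regex captures land in `BASH_REMATCH`; values are illustrative):

```shell
a=3; b=7; s=foo
[[ $a -lt $b ]] && echo "numeric: a < b"
[[ $s == f* ]] && echo "glob: matches f*"
[[ $s =~ ^fo+$ ]] && echo "regex: captured ${BASH_REMATCH[0]}"
[[ -n $s && ! -z $s ]] && echo "logical: non-empty"
tmp=$(mktemp)                       # temporary file for the file tests
[[ -f $tmp && -r $tmp ]] && echo "file: regular and readable"
rm -f "$tmp"
```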
Redirection
Basic:
cmd > out.txt      # stdout to file
cmd >> out.txt     # append
cmd 2> err.txt     # stderr to file
cmd < in.txt       # stdin from file
cmd &> all.txt     # stdout + stderr
File descriptor manipulation:
cmd 2>&1           # stderr to wherever stdout points
exec 3< in.txt     # open fd 3 for reading
exec 3<&-          # close fd 3
Noclobber:
cmd >| file        # force overwrite even with noclobber set
Here documents:
cat <<EOF
text
EOF
Here strings:
cmd <<< "string"
Process substitution:
diff <(sort a) <(sort b)
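A runnable sketch of the redirection forms (the temporary paths are illustrative):

```shell
tmp=$(mktemp -d)
echo hello > "$tmp/out.txt"             # stdout to file
echo world >> "$tmp/out.txt"            # append
cat < "$tmp/out.txt"                    # stdin from file
ls /nonexistent 2> "$tmp/err.txt" || true   # stderr captured, not printed
cat <<EOF
here-doc line
EOF
tr a-z A-Z <<< "here-string"            # HERE-STRING
diff <(printf 'a\n') <(printf 'a\n') && echo same   # process substitution
rm -r "$tmp"
```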
Pipelines
cmd1 | cmd2        # connect stdout to stdin
cmd1 |& cmd2       # connect stdout + stderr to stdin
Globbing
Basic patterns:
*.txt  file?.log  [abc]*  [^0-9]*
Brace expansion:
{a,b,c}.txt  img{1..5}.png
Extended glob (setopt extendedglob):
^*.txt             # everything except .txt
(foo|bar).log      # alternation
**/*.rs            # recursive
Glob qualifiers (zsh):
*(.)   # regular files only
*(/)   # directories only
*(om)  # ordered by modification time, newest first
Glob options:
setopt nullglob nocaseglob
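A runnable sketch of the portable parts (brace expansion and basic patterns; the zsh-only qualifiers are left out, and the temporary directory is illustrative):

```shell
echo {a,b,c}.txt       # a.txt b.txt c.txt
echo img{1..3}.png     # img1.png img2.png img3.png

tmp=$(mktemp -d)
touch "$tmp/x.rs" "$tmp/y.rs" "$tmp/z.txt"
echo "$tmp"/*.rs       # expands to the two .rs paths
rm -r "$tmp"
```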
Functions
# Definition styles
greet() { echo "hi $1"; }          # POSIX style
function greet { echo "hi $1"; }   # ksh/zsh style
# Arguments
greet world                        # $1..$n, plus $@ / $# inside the body
# Local variables
f() { local n=0; }
# Return values
f() { return 3; }; f; echo $?
# Recursive
fact() { (( $1 <= 1 )) && echo 1 || echo $(( $1 * $(fact $(( $1 - 1 ))) )); }
Autoloading (zsh):
autoload -Uz myfunc
Function inspection:
functions greet    # print the definition
whence -f greet
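A runnable sketch of definitions, arguments, return values, and recursion (function names are illustrative):

```shell
greet() { echo "hi $1"; }            # POSIX-style definition
greet world                          # -> hi world

fact() {                             # recursion via command substitution
  if (( $1 <= 1 )); then echo 1
  else echo $(( $1 * $(fact $(( $1 - 1 ))) ))
  fi
}
echo "$(fact 5)"                     # -> 120

ret3() { return 3; }                 # exit status travels through $?
ret3 || echo "returned $?"           # -> returned 3
```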
Aliases
Regular aliases:
alias ll='ls -l'
Global aliases (zsh) — expand anywhere:
alias -g G='| grep'
Suffix aliases (zsh) — run by extension:
alias -s txt=vim    # typing file.txt opens it in vim
Job control
cmd &              # run in background
jobs               # list jobs
wait               # wait for all background jobs
# Job specs
fg %1              # job 1
bg %+              # current job
kill %?name        # job whose command line contains "name"
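A minimal runnable sketch of background jobs and `wait` (usable in scripts, where `%` job specs are less common than `$!`):

```shell
sleep 0.2 &            # run in background; PID recorded in $!
pid=$!
jobs                   # list active jobs
wait "$pid"            # block until that specific job finishes
echo "job $pid exited with $?"
```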
Traps and signals
trap 'cleanup' EXIT INT TERM
trap               # list traps
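A runnable sketch of trap installation and the EXIT hook (bash semantics, which the feature table above says zshrs mirrors; function name is illustrative):

```shell
cleanup() { echo "cleaning up"; }
trap cleanup EXIT                 # run when the shell exits
trap 'echo interrupted' INT TERM  # run on signals
trap                              # list installed traps
echo "work"
# on exit, the EXIT trap prints "cleaning up" after "work"
```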
History
Commands:
history            # list history
fc -l -10          # last 10 entries
Expansion:
!! # last command
!-2 # 2 commands ago
!42 # history entry 42
!echo # last command starting with echo
!$ # last argument of the previous command
Options:
setopt share_history hist_ignore_dups
Prompt customization
Zsh escapes (PROMPT / PS1):
PROMPT='%n@%m %~ %# '    # user@host cwd, then % (or # for root)
Bash escapes (PS1):
\u # username
\h # hostname (short)
\H # hostname (full)
\w # working directory
\W # basename of working directory
\$ # $ or # for root
\n # newline
\[ ... \] # bracket non-printing sequences (e.g. colors)
Right prompt (zsh):
RPROMPT='%T' # show time on right
Hooks (zsh)
# Define hook functions
precmd() { :; }        # runs before each prompt
preexec() { :; }       # runs before each command line
# Or use add-zsh-hook
autoload -Uz add-zsh-hook
add-zsh-hook chpwd my_handler
zsh modules
zpty — pseudo-terminal:
zsocket — Unix domain sockets:
zprof — profiling:
# ... run code ...
sched — scheduled commands:
zformat — formatted output:
zparseopts — option parsing:
zregexparse — regex with captures:
pcre — Perl-compatible regex:
zstyle (completion configuration)
# Basic format: zstyle 'pattern' style value...
zstyle ':completion:*' menu select
# Query styles
zstyle -L
Shell options (setopt/unsetopt)
Common options:
setopt autocd extendedglob share_history
unsetopt beep
List all options:
setopt
Shell emulation
emulate bash       # switch emulation mode for the rest of the script
emulate -L sh      # scoped: restored when the enclosing function returns
Integration with stryke
The killer feature — prefix any line with @ to access stryke's parallel primitives:
# Parallel operations on shell data
@ … |> pmap { … } |> p
# Fetch and process JSON
@ fetch_json(…) |> … |> p
# Parallel grep across files
@ … |> pgrep { … } |> p
# Pipeline with stryke operators
@ … |> lines |> pgrep { … } |> p
# Mixed shell + stryke
for f in …; do @ … |> keys |> cnt |> p; done
[0x14] DOCUMENTATION
All documentation is served via GitHub Pages at menketechnologies.github.io/strykelang/.
| Document | Description |
|---|---|
| Docs Home | Stryke reference — quickstart, builtins, parallel primitives, pipe-forward syntax, reflection hashes |
| Full Reference | Complete language reference — every builtin, operator, special variable, and regex feature |
| Coverage Report | Engineering report — zshrs C-to-Rust port (55,236 C lines → 47,954 Rust), compsys (19,822 lines), strykelang (166,139 lines), full function mapping for all 859 C functions across 14 source files, 145 shell builtins, 20 loadable modules, ZLE line editor, and workspace grand total (270,403 lines, 2,936 tests) |
[0xFF] LICENSE
MIT — see LICENSE.
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░
░░ >>> PARSE. EXECUTE. PARALLELIZE. OWN YOUR CORES. <<< ░░
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░