pub struct Arena<T, const N: usize>
where
    T: Copy,
{ /* private fields */ }
Fixed-size arena allocator with O(1) allocation.
§Type Parameters
- T: The type of values stored (must be Copy for array initialization)
- N: Maximum number of slots (const generic)
§Memory Layout
- slots: Array of Slot<T> (either free with a next pointer, or occupied with a value)
- free_head: Head of the free list
- len: Number of currently allocated slots
- gc_enabled: Whether garbage collection is enabled
§O(1) Allocation
Uses a free-list for constant-time allocation and deallocation instead of scanning a bitmap.
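The free-list technique can be sketched in isolation. The following is a simplified, standalone illustration, not the crate's actual implementation (MiniArena and its members are hypothetical, though they mirror the layout described under Memory Layout):

```rust
// Minimal free-list arena sketch: each free slot stores the index of the
// next free slot, so alloc and free are both O(1) pointer updates.
#[derive(Clone, Copy)]
enum Slot<T: Copy> {
    Free { next: Option<usize> },
    Occupied(T),
}

struct MiniArena<T: Copy, const N: usize> {
    slots: [Slot<T>; N],
    free_head: Option<usize>,
    len: usize,
}

impl<T: Copy, const N: usize> MiniArena<T, N> {
    fn new() -> Self {
        let mut slots = [Slot::Free { next: None }; N];
        // Link every slot into the free list: slot i points at slot i + 1.
        for i in 0..N.saturating_sub(1) {
            slots[i] = Slot::Free { next: Some(i + 1) };
        }
        MiniArena {
            slots,
            free_head: if N > 0 { Some(0) } else { None },
            len: 0,
        }
    }

    fn alloc(&mut self, value: T) -> Option<usize> {
        // Pop the head of the free list and occupy it.
        let idx = self.free_head?;
        if let Slot::Free { next } = self.slots[idx] {
            self.free_head = next;
        }
        self.slots[idx] = Slot::Occupied(value);
        self.len += 1;
        Some(idx)
    }

    fn free(&mut self, idx: usize) {
        // Push the slot back onto the front of the free list.
        self.slots[idx] = Slot::Free { next: self.free_head };
        self.free_head = Some(idx);
        self.len -= 1;
    }
}
```

Because a freed slot becomes the new head of the free list, the most recently freed slot is the first one reused on the next allocation.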
§Garbage Collection
The arena supports mark-and-sweep garbage collection via the Trace trait.
GC can be enabled or disabled at runtime using Arena::set_gc_enabled.
When disabled, Arena::collect_garbage returns immediately without collecting.
§Implementations
impl<T, const N: usize> Arena<T, N>
where
    T: Copy,
pub fn new(_default_value: T) -> Arena<T, N>
Create a new arena.
All slots start as free, linked together in a free list.
§Example
use pwn_arena::Arena;
let arena: Arena<isize, 100> = Arena::new(0);

pub fn is_gc_enabled(&self) -> bool
Check if garbage collection is enabled.
When disabled, Arena::collect_garbage returns immediately without
performing any collection.
pub fn set_gc_enabled(&self, enabled: bool)
Enable or disable garbage collection.
When disabled, Arena::collect_garbage returns immediately with
zero marked/collected stats. This can be useful for:
- Performance-critical sections where you want to defer GC
- Debugging to isolate GC-related issues
- Temporarily pausing GC during batch operations
§Example
use pwn_arena::Arena;
let arena: Arena<isize, 100> = Arena::new(0);
// Disable GC for a batch operation
arena.set_gc_enabled(false);
// ... perform many allocations ...
// Re-enable and collect
arena.set_gc_enabled(true);

pub fn without_gc<F, R>(&self, f: F) -> R
where
    F: FnOnce() -> R,
Temporarily disable GC, run a closure, then restore the previous state.
This is useful for critical sections where GC should not run.
§Example
use pwn_arena::Arena;
let arena: Arena<isize, 100> = Arena::new(0);
let result = arena.without_gc(|| {
    // GC is disabled in here
    arena.alloc(42).unwrap()
});
// GC is re-enabled here

pub fn with_gc<F, R>(&self, f: F) -> R
where
    F: FnOnce() -> R,
Temporarily enable GC, run a closure, then restore the previous state.
This is useful when GC is normally disabled but you want to force a collection in a specific section.
pub fn alloc(&self, value: T) -> Result<ArenaIndex, ArenaError>
Allocate a value, returning its index.
§Errors
Returns ArenaError::OutOfMemory if no free slot is available.
pub fn get(&self, index: ArenaIndex) -> Result<T, ArenaError>
Get a copy of the value at the given index.
§Errors
Returns ArenaError::InvalidIndex if the index is out of bounds or not allocated.
pub fn set(&self, index: ArenaIndex, value: T) -> Result<(), ArenaError>
Set the value at the given index.
§Errors
Returns ArenaError::InvalidIndex if the index is out of bounds or not allocated.
pub fn modify<F>(&self, index: ArenaIndex, f: F) -> Result<(), ArenaError>
Modify a value in place using a closure.
This is more efficient than get + set as it avoids copying the value twice.
§Errors
Returns ArenaError::InvalidIndex if the index is out of bounds or not allocated.
§Example
use pwn_arena::Arena;
let arena: Arena<isize, 10> = Arena::new(0);
let idx = arena.alloc(42).unwrap();
arena.modify(idx, |v| *v += 10).unwrap();
assert_eq!(arena.get(idx).unwrap(), 52);

pub fn try_get(&self, index: ArenaIndex) -> Option<T>
Get a value, returning None instead of an error if invalid.
This is a convenience method for cases where you expect the index
might be invalid and want to handle it with Option instead of Result.
pub fn swap(&self, a: ArenaIndex, b: ArenaIndex) -> Result<(), ArenaError>
Swap the values at two indices.
§Errors
Returns an error if either index is invalid.
pub fn replace(&self, index: ArenaIndex, value: T) -> Result<T, ArenaError>
Replace the value at an index, returning the old value.
§Errors
Returns an error if the index is invalid.
pub fn free(&self, index: ArenaIndex) -> Result<(), ArenaError>
Free the value at the given index, returning its slot to the free list.
§Errors
Returns an error if the index is invalid.
pub fn is_allocated(&self, index: ArenaIndex) -> bool
Check if an index is currently valid (allocated).
pub fn clear(&self)
Clear all allocations, making the entire arena available.
§Warning
This does not call any destructors. Use with caution.
pub fn iter(&self) -> ArenaIterator<'_, T, N>
Iterate over all allocated indices and values.
§Example
use pwn_arena::Arena;
let arena: Arena<isize, 10> = Arena::new(0);
arena.alloc(1).unwrap();
arena.alloc(2).unwrap();
arena.alloc(3).unwrap();
for (idx, value) in arena.iter() {
    println!("Index {:?}: {}", idx, value);
}

pub fn stats(&self) -> ArenaStats
Get statistics about arena usage.
pub fn validate(&self) -> bool
Validate internal consistency of the arena.
Returns true if the arena’s internal state is consistent.
This is useful for debugging and testing.
Checks:
- Free list integrity (no cycles, correct length)
- Slot count matches len
- All free slots are in the free list
pub fn is_slot_occupied(&self, slot_index: usize) -> bool
Check if a slot index is currently occupied.
This is a low-level debugging method. For normal use, prefer Arena::is_allocated.
pub fn allocated_indices(&self) -> [ArenaIndex; N]
Get indices of all allocated slots.
Returns an array with the first len() elements being valid indices.
The remaining elements are ArenaIndex::NULL.
pub fn for_each<F>(&self, f: F)
Apply a function to all allocated values.
This is useful for bulk updates without the overhead of iteration.
pub fn for_each_mut<F>(&self, f: F)
Apply a mutating function to all allocated values.
pub fn count_where<F>(&self, predicate: F) -> usize
Count values matching a predicate.
pub fn find<F>(&self, predicate: F) -> Option<(ArenaIndex, T)>
Find the first value matching a predicate.
pub fn all<F>(&self, predicate: F) -> bool
Check if all allocated values match a predicate.
Returns true if the arena is empty.
pub fn alloc_contiguous(
    &self,
    count: usize,
    default: T,
) -> Result<ArenaIndex, ArenaError>
Allocate a contiguous block of count slots.
Returns the starting ArenaIndex if successful. The allocated slots
are consecutive in memory, starting from the returned index.
§Use Case
This is primarily used for string storage where characters need to be stored in contiguous memory for efficient access. The first slot typically stores length metadata, followed by character data.
§Errors
Returns ArenaError::InvalidIndex if count is 0.
Returns ArenaError::OutOfMemory if no contiguous block is available.
§Example
use pwn_arena::Arena;
let arena: Arena<isize, 100> = Arena::new(0);
// Allocate 5 contiguous slots
let start = arena.alloc_contiguous(5, 0).unwrap();
// All slots are consecutive
for i in 0..5 {
    let idx = arena.index_at_offset(start, i).unwrap();
    arena.set(idx, i as isize).unwrap();
}

pub fn index_at_offset(
    &self,
    start: ArenaIndex,
    offset: usize,
) -> Result<ArenaIndex, ArenaError>
Get an ArenaIndex at a given offset from a starting index.
This is used to access slots within a contiguous allocation.
§Errors
Returns ArenaError::InvalidIndex if the offset goes out of bounds
or the slot is not allocated.
§Example
use pwn_arena::Arena;
let arena: Arena<isize, 100> = Arena::new(0);
let start = arena.alloc_contiguous(3, 0).unwrap();
// Access slot at offset 1
let idx1 = arena.index_at_offset(start, 1).unwrap();
arena.set(idx1, 42).unwrap();
assert_eq!(arena.get(idx1).unwrap(), 42);

pub fn free_contiguous(
    &self,
    start: ArenaIndex,
    count: usize,
) -> Result<(), ArenaError>
Free a contiguous block starting at start with count slots.
All slots must be allocated.
§Errors
Returns ArenaError::InvalidIndex if any slot is out of bounds or
not allocated.
§Example
use pwn_arena::Arena;
let arena: Arena<isize, 100> = Arena::new(0);
let start = arena.alloc_contiguous(5, 0).unwrap();
assert_eq!(arena.len(), 5);
arena.free_contiguous(start, 5).unwrap();
assert_eq!(arena.len(), 0);

impl<T, const N: usize> Arena<T, N>
where
    T: Copy,
pub fn delete_recursive(&self, index: ArenaIndex) -> Result<(), ArenaError>
where
    T: ArenaDelete<T, N>,
Delete a value and recursively delete any children.
This requires T: ArenaDelete<T, N>.
pub fn copy_deep(&self, index: ArenaIndex) -> Result<ArenaIndex, ArenaError>
where
    T: ArenaCopy<T, N>,
Create a deep copy of a value and its children.
This requires T: ArenaCopy<T, N>.
impl<T, const N: usize> Arena<T, N>
where
    T: Copy,
pub fn collect_garbage(&self, roots: &[ArenaIndex]) -> GcStats
where
    T: Trace<T, N>,
Perform mark-and-sweep garbage collection.
Starting from the given roots, marks all reachable objects by
following ArenaIndex references (via the Trace trait), then
frees all unmarked (unreachable) objects.
§Algorithm
- Mark phase: Starting from roots, recursively mark all reachable objects. Handles cycles correctly by checking if already marked.
- Sweep phase: Iterate through all slots and free any that are allocated but not marked.
§Returns
Returns GcStats with information about what was collected.
§Complexity
- Time: O(reachable + N) where N is arena capacity
- Space: O(N) for the mark bitmap
§Example
use pwn_arena::{Arena, ArenaIndex, Trace};
#[derive(Clone, Copy)]
struct Node {
    value: isize,
    next: Option<ArenaIndex>,
}
impl<const N: usize> Trace<Node, N> for Node {
    fn trace<F: FnMut(ArenaIndex)>(&self, mut tracer: F) {
        if let Some(next) = self.next {
            tracer(next);
        }
    }
}
let arena: Arena<Node, 10> = Arena::new(Node { value: 0, next: None });
// Create a linked list: root -> n1 -> n2
let n2 = arena.alloc(Node { value: 3, next: None }).unwrap();
let n1 = arena.alloc(Node { value: 2, next: Some(n2) }).unwrap();
let root = arena.alloc(Node { value: 1, next: Some(n1) }).unwrap();
// Create some garbage
let _garbage = arena.alloc(Node { value: -1, next: None }).unwrap();
// Collect with root as the only GC root
let stats = arena.collect_garbage(&[root]);
assert_eq!(stats.collected, 1);
assert_eq!(arena.len(), 3);

pub fn collect_garbage_unconditional(&self, roots: &[ArenaIndex]) -> GcStats
where
    T: Trace<T, N>,
Perform garbage collection unconditionally, even if GC is disabled.
This ignores the gc_enabled flag and always performs collection.
Useful when you need to collect garbage regardless of the current
GC state.
§Example
use pwn_arena::{Arena, ArenaIndex, Trace};
#[derive(Clone, Copy)]
struct Leaf(isize);
impl<const N: usize> Trace<Leaf, N> for Leaf {
    fn trace<F: FnMut(ArenaIndex)>(&self, _tracer: F) {}
}
let arena: Arena<Leaf, 10> = Arena::new(Leaf(0));
arena.set_gc_enabled(false);
arena.alloc(Leaf(1)).unwrap();
arena.alloc(Leaf(2)).unwrap(); // garbage
let root = arena.alloc(Leaf(3)).unwrap();
// This will NOT collect (GC disabled)
let stats = arena.collect_garbage(&[root]);
assert_eq!(stats.collected, 0);
// This WILL collect (unconditionally)
let stats = arena.collect_garbage_unconditional(&[root]);
assert_eq!(stats.collected, 2);

pub fn collect_garbage_multi(&self, root_sets: &[&[ArenaIndex]]) -> GcStats
where
    T: Trace<T, N>,
Perform garbage collection with multiple root sets.
This iterates through all provided root sets and marks objects reachable from any of them.
Respects the gc_enabled flag; use Arena::collect_garbage_multi_unconditional to ignore it.
§Note
For no-alloc compatibility, this method iterates through root sets sequentially rather than flattening them. This has the same effect but uses constant stack space.
pub fn collect_garbage_multi_unconditional(
    &self,
    root_sets: &[&[ArenaIndex]],
) -> GcStats
where
    T: Trace<T, N>,
Perform garbage collection unconditionally with multiple root sets, ignoring the gc_enabled flag.
pub fn alloc_or_gc(
    &self,
    value: T,
    roots: &[ArenaIndex],
) -> Result<ArenaIndex, ArenaError>
where
    T: Trace<T, N>,
Allocate a value, running GC first if the arena is full.
If allocation fails due to OutOfMemory, this method runs garbage
collection with the provided roots and retries the allocation.
§Errors
Returns OutOfMemory if allocation still fails after GC.
§Example
use pwn_arena::{Arena, ArenaIndex, Trace};
#[derive(Clone, Copy)]
struct Node(isize);
impl<const N: usize> Trace<Node, N> for Node {
    fn trace<F: FnMut(ArenaIndex)>(&self, _: F) {}
}
let arena: Arena<Node, 3> = Arena::new(Node(0));
let root = arena.alloc(Node(1)).unwrap();
arena.alloc(Node(2)).unwrap(); // garbage
arena.alloc(Node(3)).unwrap(); // garbage
// Arena is full, but alloc_or_gc will collect garbage first
let new_idx = arena.alloc_or_gc(Node(4), &[root]).unwrap();
assert_eq!(arena.len(), 2); // root + new_idx