Struct Arena 

Source
pub struct Arena<T, const N: usize>
where T: Copy,
{ /* private fields */ }

Fixed-size arena allocator with O(1) allocation.

§Type Parameters

  • T: The type of values stored (must be Copy for array initialization)
  • N: Maximum number of cells (const generic)

§Memory Layout

  • slots: Array of Slot<T> (either free with next pointer, or occupied with value)
  • free_head: Head of the free list
  • len: Number of currently allocated slots
  • gc_enabled: Whether garbage collection is enabled

§O(1) Allocation

Uses a free-list for constant-time allocation and deallocation instead of scanning a bitmap.

§Garbage Collection

The arena supports mark-and-sweep garbage collection via the Trace trait. GC can be enabled or disabled at runtime using Arena::set_gc_enabled. When disabled, Arena::collect_garbage returns immediately without collecting.
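
§Example

A brief sketch of basic usage (allocation, access, and manual free); see Arena::collect_garbage below for a GC example:

use pwn_arena::Arena;

let arena: Arena<isize, 8> = Arena::new(0);

// O(1) allocation from the free list
let idx = arena.alloc(7).unwrap();
assert_eq!(arena.get(idx).unwrap(), 7);

// Freeing returns the slot to the free list
arena.free(idx).unwrap();
assert!(arena.is_empty());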

Implementations§

Source§

impl<T, const N: usize> Arena<T, N>
where T: Copy,

Source

pub fn new(_default_value: T) -> Arena<T, N>

Create a new arena.

All slots start as free, linked together in a free list.

§Example
use pwn_arena::Arena;

let arena: Arena<isize, 100> = Arena::new(0);
Source

pub fn is_gc_enabled(&self) -> bool

Check if garbage collection is enabled.

When disabled, Arena::collect_garbage returns immediately without performing any collection.

Source

pub fn set_gc_enabled(&self, enabled: bool)

Enable or disable garbage collection.

When disabled, Arena::collect_garbage returns immediately with zero marked/collected stats. This can be useful for:

  • Performance-critical sections where you want to defer GC
  • Debugging to isolate GC-related issues
  • Temporarily pausing GC during batch operations
§Example
use pwn_arena::Arena;

let arena: Arena<isize, 100> = Arena::new(0);

// Disable GC for a batch operation
arena.set_gc_enabled(false);

// ... perform many allocations ...

// Re-enable and collect
arena.set_gc_enabled(true);
Source

pub fn without_gc<F, R>(&self, f: F) -> R
where F: FnOnce() -> R,

Temporarily disable GC, run a closure, then restore the previous state.

This is useful for critical sections where GC should not run.

§Example
use pwn_arena::Arena;

let arena: Arena<isize, 100> = Arena::new(0);

let result = arena.without_gc(|| {
    // GC is disabled in here
    arena.alloc(42).unwrap()
});
// GC is re-enabled here
Source

pub fn with_gc<F, R>(&self, f: F) -> R
where F: FnOnce() -> R,

Temporarily enable GC, run a closure, then restore the previous state.

This is useful when GC is normally disabled but you want to force a collection in a specific section.
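
§Example

A minimal sketch mirroring the without_gc example above:

use pwn_arena::Arena;

let arena: Arena<isize, 100> = Arena::new(0);
arena.set_gc_enabled(false);

let idx = arena.with_gc(|| {
    // GC is enabled in here
    arena.alloc(42).unwrap()
});
assert_eq!(arena.get(idx).unwrap(), 42);

// The previous state (disabled) is restored here
assert!(!arena.is_gc_enabled());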

Source

pub const fn capacity(&self) -> usize

Get the maximum capacity of this arena.

Source

pub fn len(&self) -> usize

Get the number of currently allocated cells.

Source

pub fn is_empty(&self) -> bool

Check if the arena is empty (no allocated cells).

Source

pub fn is_full(&self) -> bool

Check if the arena is full (all cells allocated).

Source

pub fn available(&self) -> usize

Get the number of free cells.
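
§Example

A small sketch exercising capacity, len, available, is_empty, and is_full together:

use pwn_arena::Arena;

let arena: Arena<isize, 4> = Arena::new(0);
assert_eq!(arena.capacity(), 4);
assert!(arena.is_empty());

arena.alloc(1).unwrap();
assert_eq!(arena.len(), 1);
assert_eq!(arena.available(), 3);
assert!(!arena.is_full());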

Source

pub fn alloc(&self, value: T) -> Result<ArenaIndex, ArenaError>

Allocate a new cell and return its index.

Uses O(1) free-list based allocation.

§Errors

Returns ArenaError::OutOfMemory if the arena is full.

§Example
use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
let idx = arena.alloc(42).unwrap();
assert_eq!(arena.get(idx).unwrap(), 42);
Source

pub fn get(&self, index: ArenaIndex) -> Result<T, ArenaError>

Get a copy of the value at the given index.

§Errors

Returns ArenaError::InvalidIndex if the index is out of bounds or not allocated.

Source

pub fn set(&self, index: ArenaIndex, value: T) -> Result<(), ArenaError>

Set the value at the given index.

§Errors

Returns ArenaError::InvalidIndex if the index is out of bounds or not allocated.
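
§Example

A minimal sketch of reading back a value written with set:

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
let idx = arena.alloc(1).unwrap();

arena.set(idx, 7).unwrap();
assert_eq!(arena.get(idx).unwrap(), 7);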

Source

pub fn modify<F>(&self, index: ArenaIndex, f: F) -> Result<(), ArenaError>
where F: FnOnce(&mut T),

Modify a value in place using a closure.

This is more efficient than get + set as it avoids copying the value twice.

§Errors

Returns ArenaError::InvalidIndex if the index is out of bounds or not allocated.

§Example
use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
let idx = arena.alloc(42).unwrap();

arena.modify(idx, |v| *v += 10).unwrap();
assert_eq!(arena.get(idx).unwrap(), 52);
Source

pub fn try_get(&self, index: ArenaIndex) -> Option<T>

Get a value, returning None instead of an error if invalid.

This is a convenience method for cases where you expect the index might be invalid and want to handle it with Option instead of Result.
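
§Example

A short sketch, assuming a freed index reports as invalid until the slot is reused:

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
let idx = arena.alloc(5).unwrap();
assert_eq!(arena.try_get(idx), Some(5));

arena.free(idx).unwrap();
assert_eq!(arena.try_get(idx), None);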

Source

pub fn swap(&self, a: ArenaIndex, b: ArenaIndex) -> Result<(), ArenaError>

Swap the values at two indices.

§Errors

Returns an error if either index is invalid.
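
§Example

A minimal sketch:

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
let a = arena.alloc(1).unwrap();
let b = arena.alloc(2).unwrap();

arena.swap(a, b).unwrap();
assert_eq!(arena.get(a).unwrap(), 2);
assert_eq!(arena.get(b).unwrap(), 1);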

Source

pub fn replace(&self, index: ArenaIndex, value: T) -> Result<T, ArenaError>

Replace the value at an index, returning the old value.

§Errors

Returns an error if the index is invalid.
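
§Example

A minimal sketch showing the old value being returned:

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
let idx = arena.alloc(1).unwrap();

let old = arena.replace(idx, 9).unwrap();
assert_eq!(old, 1);
assert_eq!(arena.get(idx).unwrap(), 9);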

Source

pub fn free(&self, index: ArenaIndex) -> Result<(), ArenaError>

Free a cell, making it available for reuse.

§Errors

Returns ArenaError::InvalidIndex if the index is out of bounds or not allocated.

§Example
use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
let idx = arena.alloc(42).unwrap();
arena.free(idx).unwrap();
assert_eq!(arena.len(), 0);
Source

pub fn is_allocated(&self, index: ArenaIndex) -> bool

Check if an index is currently valid (allocated).
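
§Example

A minimal sketch:

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
let idx = arena.alloc(1).unwrap();
assert!(arena.is_allocated(idx));

arena.free(idx).unwrap();
assert!(!arena.is_allocated(idx));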

Source

pub fn clear(&self)

Clear all allocations, making the entire arena available.

§Warning

This does not call any destructors. Use with caution.
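
§Example

A minimal sketch:

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
arena.alloc(1).unwrap();
arena.alloc(2).unwrap();

arena.clear();
assert!(arena.is_empty());
assert_eq!(arena.available(), 10);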

Source

pub fn iter(&self) -> ArenaIterator<'_, T, N>

Iterate over all allocated indices and values.

§Example
use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
arena.alloc(1).unwrap();
arena.alloc(2).unwrap();
arena.alloc(3).unwrap();

for (idx, value) in arena.iter() {
    println!("Index {:?}: {}", idx, value);
}
Source

pub fn stats(&self) -> ArenaStats

Get statistics about arena usage.

Source

pub fn validate(&self) -> bool

Validate internal consistency of the arena.

Returns true if the arena’s internal state is consistent. This is useful for debugging and testing.

Checks:

  • Free list integrity (no cycles, correct length)
  • Slot count matches len
  • All free slots are in the free list
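
§Example

A minimal sketch, useful as a sanity check in tests:

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
arena.alloc(1).unwrap();
arena.alloc(2).unwrap();

// The free list and counters should always be consistent
assert!(arena.validate());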
Source

pub fn is_slot_occupied(&self, slot_index: usize) -> bool

Check if a slot index is currently occupied.

This is a low-level debugging method. For normal use, prefer Arena::is_allocated.

Source

pub fn allocated_indices(&self) -> [ArenaIndex; N]

Get indices of all allocated slots.

Returns an array with the first len() elements being valid indices. The remaining elements are ArenaIndex::NULL.
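
§Example

A short sketch, only assuming the contract stated above (the first len() entries are valid):

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
arena.alloc(1).unwrap();
arena.alloc(2).unwrap();

let indices = arena.allocated_indices();
for i in 0..arena.len() {
    assert!(arena.is_allocated(indices[i]));
}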

Source

pub fn for_each<F>(&self, f: F)
where F: FnMut(ArenaIndex, &T),

Apply a function to all allocated values.

This is useful for bulk operations without the overhead of constructing an iterator.
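
§Example

A minimal sketch summing all allocated values:

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
arena.alloc(1).unwrap();
arena.alloc(2).unwrap();

let mut sum = 0;
arena.for_each(|_idx, value| sum += *value);
assert_eq!(sum, 3);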

Source

pub fn for_each_mut<F>(&self, f: F)
where F: FnMut(ArenaIndex, &mut T),

Apply a mutating function to all allocated values.
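
§Example

A minimal sketch doubling every allocated value in place:

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
let idx = arena.alloc(1).unwrap();
arena.alloc(2).unwrap();

arena.for_each_mut(|_idx, value| *value *= 2);
assert_eq!(arena.get(idx).unwrap(), 2);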

Source

pub fn count_where<F>(&self, predicate: F) -> usize
where F: Fn(&T) -> bool,

Count values matching a predicate.
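
§Example

A minimal sketch:

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
arena.alloc(1).unwrap();
arena.alloc(2).unwrap();
arena.alloc(3).unwrap();

assert_eq!(arena.count_where(|v| *v > 1), 2);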

Source

pub fn find<F>(&self, predicate: F) -> Option<(ArenaIndex, T)>
where F: Fn(&T) -> bool,

Find the first value matching a predicate.
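
§Example

A minimal sketch (only the value is compared, so no extra trait bounds on ArenaIndex are assumed):

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
arena.alloc(1).unwrap();
arena.alloc(2).unwrap();

let (_idx, value) = arena.find(|v| *v == 2).unwrap();
assert_eq!(value, 2);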

Source

pub fn any<F>(&self, predicate: F) -> bool
where F: Fn(&T) -> bool,

Check if any allocated value matches a predicate.

Source

pub fn all<F>(&self, predicate: F) -> bool
where F: Fn(&T) -> bool,

Check if all allocated values match a predicate.

Returns true if the arena is empty.
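
§Example

A minimal sketch exercising both any and all:

use pwn_arena::Arena;

let arena: Arena<isize, 10> = Arena::new(0);
arena.alloc(2).unwrap();
arena.alloc(4).unwrap();

assert!(arena.any(|v| *v == 4));
assert!(arena.all(|v| *v % 2 == 0));
assert!(!arena.any(|v| *v > 10));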

Source

pub fn alloc_contiguous(&self, count: usize, default: T) -> Result<ArenaIndex, ArenaError>

Allocate a contiguous block of count slots.

Returns the starting ArenaIndex if successful. The allocated slots are consecutive in memory, starting from the returned index.

§Use Case

This is primarily used for string storage where characters need to be stored in contiguous memory for efficient access. The first slot typically stores length metadata, followed by character data.

§Errors

Returns ArenaError::InvalidIndex if count is 0. Returns ArenaError::OutOfMemory if no contiguous block is available.

§Example
use pwn_arena::Arena;
 
let arena: Arena<isize, 100> = Arena::new(0);
 
// Allocate 5 contiguous slots
let start = arena.alloc_contiguous(5, 0).unwrap();
 
// All slots are consecutive
for i in 0..5 {
    let idx = arena.index_at_offset(start, i).unwrap();
    arena.set(idx, i as isize).unwrap();
}
Source

pub fn index_at_offset(&self, start: ArenaIndex, offset: usize) -> Result<ArenaIndex, ArenaError>

Get an ArenaIndex at a given offset from a starting index.

This is used to access slots within a contiguous allocation.

§Errors

Returns ArenaError::InvalidIndex if the offset goes out of bounds or the slot is not allocated.

§Example
use pwn_arena::Arena;
 
let arena: Arena<isize, 100> = Arena::new(0);
let start = arena.alloc_contiguous(3, 0).unwrap();
 
// Access slot at offset 1
let idx1 = arena.index_at_offset(start, 1).unwrap();
arena.set(idx1, 42).unwrap();
assert_eq!(arena.get(idx1).unwrap(), 42);
Source

pub fn free_contiguous(&self, start: ArenaIndex, count: usize) -> Result<(), ArenaError>

Free a contiguous block starting at start with count slots.

All slots must be allocated.

§Errors

Returns ArenaError::InvalidIndex if any slot is out of bounds or not allocated.

§Example
use pwn_arena::Arena;
 
let arena: Arena<isize, 100> = Arena::new(0);
let start = arena.alloc_contiguous(5, 0).unwrap();
 
assert_eq!(arena.len(), 5);
 
arena.free_contiguous(start, 5).unwrap();
 
assert_eq!(arena.len(), 0);
Source§

impl<T, const N: usize> Arena<T, N>
where T: Copy,

Source

pub fn delete_recursive(&self, index: ArenaIndex) -> Result<(), ArenaError>
where T: ArenaDelete<T, N>,

Delete a value and recursively delete any children.

This requires T: ArenaDelete<T, N>.

Source

pub fn copy_deep(&self, index: ArenaIndex) -> Result<ArenaIndex, ArenaError>
where T: ArenaCopy<T, N>,

Create a deep copy of a value and its children.

This requires T: ArenaCopy<T, N>.

Source§

impl<T, const N: usize> Arena<T, N>
where T: Copy,

Source

pub fn collect_garbage(&self, roots: &[ArenaIndex]) -> GcStats
where T: Trace<T, N>,

Perform mark-and-sweep garbage collection.

Starting from the given roots, marks all reachable objects by following ArenaIndex references (via the Trace trait), then frees all unmarked (unreachable) objects.

§Algorithm
  1. Mark phase: Starting from roots, recursively mark all reachable objects. Handles cycles correctly by checking if already marked.
  2. Sweep phase: Iterate through all slots and free any that are allocated but not marked.
§Returns

Returns GcStats with information about what was collected.

§Complexity
  • Time: O(reachable + N) where N is arena capacity
  • Space: O(N) for the mark bitmap
§Example
use pwn_arena::{Arena, ArenaIndex, Trace};

#[derive(Clone, Copy)]
struct Node {
    value: isize,
    next: Option<ArenaIndex>,
}

impl<const N: usize> Trace<Node, N> for Node {
    fn trace<F: FnMut(ArenaIndex)>(&self, mut tracer: F) {
        if let Some(next) = self.next {
            tracer(next);
        }
    }
}

let arena: Arena<Node, 10> = Arena::new(Node { value: 0, next: None });

// Create a linked list: root -> n1 -> n2
let n2 = arena.alloc(Node { value: 3, next: None }).unwrap();
let n1 = arena.alloc(Node { value: 2, next: Some(n2) }).unwrap();
let root = arena.alloc(Node { value: 1, next: Some(n1) }).unwrap();

// Create some garbage
let _garbage = arena.alloc(Node { value: -1, next: None }).unwrap();

// Collect with root as the only GC root
let stats = arena.collect_garbage(&[root]);

assert_eq!(stats.collected, 1);
assert_eq!(arena.len(), 3);
Source

pub fn collect_garbage_unconditional(&self, roots: &[ArenaIndex]) -> GcStats
where T: Trace<T, N>,

Perform garbage collection unconditionally, even if GC is disabled.

This ignores the gc_enabled flag and always performs collection. Useful when you need to collect garbage regardless of the current GC state.

§Example
use pwn_arena::{Arena, ArenaIndex, Trace};

#[derive(Clone, Copy)]
struct Leaf(isize);

impl<const N: usize> Trace<Leaf, N> for Leaf {
    fn trace<F: FnMut(ArenaIndex)>(&self, _tracer: F) {}
}

let arena: Arena<Leaf, 10> = Arena::new(Leaf(0));
arena.set_gc_enabled(false);

arena.alloc(Leaf(1)).unwrap(); // garbage (not reachable from root)
arena.alloc(Leaf(2)).unwrap(); // garbage (not reachable from root)

let root = arena.alloc(Leaf(3)).unwrap();

// This will NOT collect (GC disabled)
let stats = arena.collect_garbage(&[root]);
assert_eq!(stats.collected, 0);

// This WILL collect (unconditionally)
let stats = arena.collect_garbage_unconditional(&[root]);
assert_eq!(stats.collected, 2);
Source

pub fn collect_garbage_multi(&self, root_sets: &[&[ArenaIndex]]) -> GcStats
where T: Trace<T, N>,

Perform garbage collection with multiple root sets.

This iterates through all provided root sets and marks objects reachable from any of them.

Respects the gc_enabled flag; use Arena::collect_garbage_multi_unconditional to ignore it.

§Note

For no-alloc compatibility, this method iterates through root sets sequentially rather than flattening them. This has the same effect but uses constant stack space.
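
§Example

A sketch reusing the Leaf/Trace setup from the collect_garbage_unconditional example above:

use pwn_arena::{Arena, ArenaIndex, Trace};

#[derive(Clone, Copy)]
struct Leaf(isize);

impl<const N: usize> Trace<Leaf, N> for Leaf {
    fn trace<F: FnMut(ArenaIndex)>(&self, _tracer: F) {}
}

let arena: Arena<Leaf, 10> = Arena::new(Leaf(0));

let a = arena.alloc(Leaf(1)).unwrap();
let b = arena.alloc(Leaf(2)).unwrap();
arena.alloc(Leaf(3)).unwrap(); // garbage: reachable from no root set

let roots_a = [a];
let roots_b = [b];
let stats = arena.collect_garbage_multi(&[&roots_a[..], &roots_b[..]]);

assert_eq!(stats.collected, 1);
assert_eq!(arena.len(), 2);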

Source

pub fn collect_garbage_multi_unconditional(&self, root_sets: &[&[ArenaIndex]]) -> GcStats
where T: Trace<T, N>,

Perform garbage collection unconditionally with multiple root sets, ignoring the gc_enabled flag.

Source

pub fn alloc_or_gc(&self, value: T, roots: &[ArenaIndex]) -> Result<ArenaIndex, ArenaError>
where T: Trace<T, N>,

Allocate a value, running GC first if the arena is full.

If allocation fails due to OutOfMemory, this method runs garbage collection with the provided roots and retries the allocation.

§Errors

Returns OutOfMemory if allocation still fails after GC.

§Example
use pwn_arena::{Arena, ArenaIndex, Trace};

#[derive(Clone, Copy)]
struct Node(isize);

impl<const N: usize> Trace<Node, N> for Node {
    fn trace<F: FnMut(ArenaIndex)>(&self, _: F) {}
}

let arena: Arena<Node, 3> = Arena::new(Node(0));

let root = arena.alloc(Node(1)).unwrap();
arena.alloc(Node(2)).unwrap(); // garbage
arena.alloc(Node(3)).unwrap(); // garbage

// Arena is full, but alloc_or_gc will collect garbage first
let new_idx = arena.alloc_or_gc(Node(4), &[root]).unwrap();
assert_eq!(arena.len(), 2); // root + new_idx

Auto Trait Implementations§

§

impl<T, const N: usize> !Freeze for Arena<T, N>

§

impl<T, const N: usize> !RefUnwindSafe for Arena<T, N>

§

impl<T, const N: usize> Send for Arena<T, N>
where T: Send,

§

impl<T, const N: usize> !Sync for Arena<T, N>

§

impl<T, const N: usize> Unpin for Arena<T, N>
where T: Unpin,

§

impl<T, const N: usize> UnwindSafe for Arena<T, N>
where T: UnwindSafe,

Blanket Implementations§

Source§

impl<T> Any for T
where T: 'static + ?Sized,

Source§

fn type_id(&self) -> TypeId

Gets the TypeId of self.
Source§

impl<T> Borrow<T> for T
where T: ?Sized,

Source§

fn borrow(&self) -> &T

Immutably borrows from an owned value.
Source§

impl<T> BorrowMut<T> for T
where T: ?Sized,

Source§

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.
Source§

impl<T> From<T> for T

Source§

fn from(t: T) -> T

Returns the argument unchanged.

Source§

impl<T, U> Into<U> for T
where U: From<T>,

Source§

fn into(self) -> U

Calls U::from(self).

That is, this conversion is whatever the implementation of From<T> for U chooses to do.

Source§

impl<T, U> TryFrom<U> for T
where U: Into<T>,

Source§

type Error = Infallible

The type returned in the event of a conversion error.
Source§

fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>

Performs the conversion.
Source§

impl<T, U> TryInto<U> for T
where U: TryFrom<T>,

Source§

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.
Source§

fn try_into(self) -> Result<U, <U as TryFrom<T>>::Error>

Performs the conversion.