Struct moving_gc_arena::Region

pub struct Region<T> { /* fields omitted */ }

The type of a collectable region.

This object can be used to allocate, collect, traverse and update the objects within it.

Access to the region is exposed through methods on the corresponding reference types, and requires references to this region in order to safely reference the data within. This ensures that garbage collections do not interrupt accesses and vice versa, and allows for a conservative compile-time check for uniqueness, rather than requiring use of an internal Cell type.

Since garbage collection is a property of the region, index validity is not statically checked. Weak and Root will always be in sync with their source region, but raw indices Ix may be invalidated. Some methods (which necessarily take &mut self) may invalidate raw indices by moving the objects, such as during a garbage collection. These methods are documented as such.

Implementations

impl<T> Region<T>[src]

pub fn new() -> Self[src]

impl<T> Region<T>[src]

pub fn capacity(&self) -> usize[src]

Return the current capacity of this region. An allocation will not trigger a collection unless the requested number of entries exceeds this capacity.

pub fn len(&self) -> usize[src]

Return the current number of entries in the region.

pub fn is_empty(&self) -> bool[src]

Returns true if there are currently no entries in this region.

impl<T: 'static + HasIx<T>> Region<T>[src]

pub fn ensure(&mut self, additional: usize)[src]

Ensure that the capacity supports additional more elements, collecting garbage if necessary.
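
A small sketch of how ensure interacts with capacity (this assumes, as in the examples below, that the unit type satisfies HasIx):

use moving_gc_arena as gc;
let mut r = gc::Region::<()>::new();

// Reserve room for at least 16 entries up front, so that
// later allocations need not trigger a collection.
r.ensure(16);
assert!(r.capacity() >= 16);
assert!(r.is_empty());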

pub fn alloc<F>(&mut self, make_t: F) -> Entry<'_, T> where
    F: FnOnce(&Self) -> T, 
[src]

Allocate a new object, returning a temporary handle, which can be used to mutate the object and to get roots, weak pointers, and internal pointers to the object.

This may trigger a garbage collection and invalidate raw indices. As such, a function is used to generate the new value, which can query the state of the world post-collection.
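
For example (a sketch; it assumes, as in the example for gc below, that the unit type satisfies HasIx and that holding a Root keeps its entry alive across collections):

use moving_gc_arena as gc;
let mut r = gc::Region::new();

// The closure sees the region after any collection that
// this allocation may have triggered.
let root = r.alloc(|_| ()).root();
assert_eq!(r.len(), 1);

// The Root keeps the entry alive through a collection.
r.gc();
assert_eq!(r.len(), 1);
drop(root);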

pub fn gc(&mut self)[src]

Immediately trigger a standard garbage collection.

This invalidates raw indices.

use moving_gc_arena as gc;
let mut r = gc::Region::new();

// This entry is never rooted, so the collection frees it.
r.alloc(|_| ());
r.gc();
assert!(r.is_empty());

pub fn gc_into(self, other: &mut Region<T>)[src]

Move the elements of this region onto the end of another Region. This can trigger a collection in the other region if its storage must be reallocated.
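
A sketch of moving a rooted entry between regions (assuming, as in the example for gc above, that the unit type satisfies HasIx and that rooted entries survive the move):

use moving_gc_arena as gc;
let mut from = gc::Region::new();
let mut to = gc::Region::new();

// Root the entry so it is kept when the regions are merged.
let _root = from.alloc(|_| ()).root();
from.gc_into(&mut to);
assert_eq!(to.len(), 1);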

pub fn traverse<'a, I, Strategy, Pre, Post>(
    &'a mut self,
    strategy: Strategy,
    start: I,
    pre: Pre,
    post: Post
) where
    T: HasIx<T>,
    Pre: FnMut(Entry<'_, T>),
    Post: FnMut(Entry<'_, T>),
    Strategy: Strategy<T, PreAndPost>,
    Strategy: Strategy<T, PreOnly>,
    I: IntoIterator<Item = &'a Ix<T>>,
    I::IntoIter: Clone
[src]

Traverse all reachable objects, applying a mutating function to each of the values passed over in pre-order.

The traverse::CallStack strategy can be used with any type whatsoever, and produces a depth-first search using the indices provided by HasIx::foreach_ix.

In order to ensure that this function performs well, the HasIx implementation for T must be deterministic: it should always return the same references. This is so that we can correctly maintain marking information.

There is no immutable implementation of this function, as it relies on unique access to object headers (but in exchange, requires no allocations).
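
For example, counting pre- and post-visits over a single rooted entry (a sketch assuming, as above, that traverse::CallStack satisfies both the PreOnly and PreAndPost strategy bounds and that the unit type satisfies HasIx):

use moving_gc_arena::{traverse, Region, Entry};

let mut r = Region::new();
let mut pre_count = 0;
let mut post_count = 0;
let root = r.alloc(|_| ()).root();

// With a single entry and no internal indices, each
// action runs exactly once.
r.traverse(traverse::CallStack,
           &[root.ix()],
           |_: Entry<_>| pre_count += 1,
           |_: Entry<_>| post_count += 1);

assert_eq!(pre_count, 1);
assert_eq!(post_count, 1);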

pub fn traverse_with<'a, I, Strategy, State, Visit>(
    &'a mut self,
    strategy: Strategy,
    start: I,
    visitor: Visit
) where
    T: HasIx<T>,
    Strategy: Strategy<T, State>,
    Strategy: Strategy<T, PreOnly>,
    Visit: Visitor<T, State>,
    I: IntoIterator<Item = &'a Ix<T>>,
    I::IntoIter: Clone
[src]

A more general form of traverse, parameterized over the visitor's state. This allows a specialized strategy to provide visitors beyond the simple pre- and post-actions of a blind DFS.

use moving_gc_arena::{traverse, Region, Entry};

let mut r = Region::new();
let mut count = 0;
let r1 = r.alloc(|_| ()).root();

// Annotations are usually needed when
// constructing generic visitors
let visitor = traverse::PrePostVisitor {
  pre: |_: Entry<_>| { count += 1 },
  post: ()
};
r.traverse_with(traverse::CallStack, &[r1.ix()], visitor);

assert_eq!(count, 1);

pub fn dfs_mut_cstack<'a, I, F>(&'a mut self, start: I, pre: F) where
    T: HasIx<T>,
    F: FnMut(Entry<'_, T>),
    I: IntoIterator<Item = &'a Ix<T>>,
    I::IntoIter: Clone
[src]

👎 Deprecated since 0.3.0:

Use Region::traverse with traverse::CallStack instead

pub fn deep_clone_with<F>(&mut self, start: Root<T>, do_clone: F) -> Root<T> where
    F: FnMut(Entry<'_, T>) -> T, 
[src]

Create a deep clone of the object subgraph starting at a particular root. Instead of using a standard Clone implementation, any clone function may be used.

This method assumes that the size of the reachable graph will not change. If the clone function does change it, enough space for the result must be reserved to avoid panics or leaks.
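
A minimal sketch of supplying a custom clone function (again assuming the unit type satisfies HasIx, as in the earlier examples):

use moving_gc_arena::{Region, Entry};

let mut r = Region::new();
let original = r.alloc(|_| ()).root();

// do_clone receives each reachable entry and produces the
// value for its copy; here it simply rebuilds a unit value.
let _copy = r.deep_clone_with(original, |_: Entry<_>| ());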

pub fn deep_clone_into_with<F>(
    &mut self,
    other: &mut Region<T>,
    start: Root<T>,
    do_clone: F
) -> Root<T> where
    F: FnMut(Entry<'_, T>) -> T, 
[src]

As deep_clone_with, but clones entries into a different Region.
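
For example (a sketch; the returned root is assumed to point into the destination region):

use moving_gc_arena::{Region, Entry};

let mut src = Region::new();
let mut dst = Region::new();
let start = src.alloc(|_| ()).root();

// The cloned subgraph ends up in dst.
let _copy = src.deep_clone_into_with(&mut dst, start, |_: Entry<_>| ());
assert_eq!(dst.len(), 1);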

pub fn deep_clone_with_mut<F>(&mut self, start: &mut Root<T>, do_clone: F) where
    F: FnMut(Entry<'_, T>) -> T, 
[src]

👎 Deprecated since 0.3.0:

Use Region::deep_clone_with instead

pub fn deep_clone_into_with_mut<F>(
    &mut self,
    other: &mut Region<T>,
    start: &mut Root<T>,
    do_clone: F
) where
    F: FnMut(Entry<'_, T>) -> T, 
[src]

👎 Deprecated since 0.3.0:

Use Region::deep_clone_into_with instead

impl<T: 'static + HasIx<T> + Clone> Region<T>[src]

pub fn deep_clone(&mut self, start: Root<T>) -> Root<T>[src]

Create a deep clone of the object subgraph starting at a particular root. Each T instance which is reachable from this root object will be cloned exactly once, and all connections will be preserved. Each cloned object points only at cloned objects, and each original object points only at original objects.

Currently, the method of traversal is not yet configurable and always uses the CallStack, as special functionality is needed for this method beyond the normal traverse strategies.
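
For example (a sketch using a unit payload, which is Clone and is assumed, as in the other examples, to satisfy HasIx):

use moving_gc_arena::Region;

let mut r = Region::new();
let original = r.alloc(|_| ()).root();

// Both the original entry and its copy now live in r.
let _copy = r.deep_clone(original);
assert_eq!(r.len(), 2);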

pub fn deep_clone_into(
    &mut self,
    other: &mut Region<T>,
    start: Root<T>
) -> Root<T>
[src]

As deep_clone, but clones entries into a different Region.

Trait Implementations

impl<T> Debug for Region<T>[src]

impl<T> Default for Region<T>[src]

Auto Trait Implementations

impl<T> !Send for Region<T>

impl<T> !Sync for Region<T>

impl<T> Unpin for Region<T> where
    T: Unpin

Blanket Implementations

impl<T> Any for T where
    T: 'static + ?Sized
[src]

impl<T> Borrow<T> for T where
    T: ?Sized
[src]

impl<T> BorrowMut<T> for T where
    T: ?Sized
[src]

impl<T> From<T> for T[src]

impl<T, U> Into<U> for T where
    U: From<T>, 
[src]

impl<T, U> TryFrom<U> for T where
    U: Into<T>, 
[src]

type Error = Infallible

The type returned in the event of a conversion error.

impl<T, U> TryInto<U> for T where
    U: TryFrom<T>, 
[src]

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.