pub struct Talc<O: OomHandler> {
pub oom_handler: O,
/* private fields */
}
The Talc Allocator!
To get started:
- Construct with the `new` or `with_arena` functions (use `ErrOnOom` to ignore OOM handling).
- Initialize with `init` or `extend`.
- Call `lock` to get a `Talck`, which supports the `GlobalAlloc` and `Allocator` traits.
Fields

oom_handler: O

Implementations
impl<O: OomHandler> Talc<O>
pub unsafe fn malloc(&mut self, layout: Layout) -> Result<NonNull<u8>, ()>
Allocate a contiguous region of memory according to layout, if possible.
Safety
layout.size() must be nonzero.
pub unsafe fn free(&mut self, ptr: NonNull<u8>, _: Layout)
Free previously allocated/reallocated memory.
Safety
ptr must have been previously allocated given layout.
pub unsafe fn grow(
    &mut self,
    ptr: NonNull<u8>,
    layout: Layout,
    new_size: usize
) -> Result<NonNull<u8>, ()>
Grow a previously allocated/reallocated region of memory to new_size.
Safety
ptr must have been previously allocated or reallocated given layout.
new_size must be larger than or equal to layout.size().
pub unsafe fn shrink(
    &mut self,
    ptr: NonNull<u8>,
    layout: Layout,
    new_size: usize
)
Shrink a previously allocated/reallocated region of memory to new_size.
This function is infallible given valid inputs, and the reallocation will always be done in-place, maintaining the validity of the pointer.
Safety
ptr must have been previously allocated or reallocated given layout.
new_size must be smaller than or equal to layout.size().
new_size should be nonzero.
pub unsafe fn with_arena(oom_handler: O, arena: Span) -> Self
pub const fn get_arena(&self) -> Span
Returns the Span which has been granted to this allocator as allocatable.
pub fn get_allocatable_span(&self) -> Span
Returns the Span in which allocations may be placed.
pub fn get_allocated_span(&self) -> Span
Returns the minimum Span containing all allocated memory.
pub unsafe fn init(&mut self, arena: Span)
Initialize the allocator heap.
Note that metadata will be placed into the bottom of the heap; it should be ~1KiB. If the arena isn't big enough, this function will not fail, but no memory will be made available for allocation, and allocations will signal OOM.
Safety
- The memory within the arena must be valid for reads and writes, and memory therein not allocated to the user must not be mutated for the lifetime of all the allocations of this allocator.
Panics
Panics if arena contains the null address.
pub unsafe fn extend(&mut self, new_arena: Span)
Increase the extent of the arena.
Safety
The entire new_arena memory must be readable and writable, and memory therein not allocated to the user must not be mutated, as with init.
Panics
This function panics if:
- new_arena doesn't contain the old arena (NB: empty arenas are contained by any arena)
- new_arena contains the null address
A recommended pattern for satisfying these criteria is:
```rust
// compute the new arena as an extension of the old arena
// for the sake of example we avoid the null page too
let new_arena = talc.get_arena().extend(1234, 5678).above(0x1000 as *mut u8);
// SAFETY: be sure not to extend into memory we can't use
unsafe { talc.extend(new_arena); }
```

pub fn truncate(&mut self, new_arena: Span)
Reduce the extent of the arena. The new extent must encompass all current allocations. See below.
Panics
This function panics if:
- the old arena doesn't contain new_arena
- new_arena doesn't contain all the allocated memory
The recommended pattern for satisfying these criteria is:
```rust
// note: lock the allocator, otherwise a race condition may occur
// in between get_allocated_span and truncate

// compute the new arena as a reduction of the old arena
let new_arena = talc.get_arena().truncate(1234, 5678).fit_over(talc.get_allocated_span());

// alternatively...
let new_arena = Span::from((1234 as *mut u8)..(5678 as *mut u8))
    .fit_within(talc.get_arena())
    .fit_over(talc.get_allocated_span());

// truncate the arena
talc.truncate(new_arena);
```

pub const fn lock<R: RawMutex>(self) -> Talck<R, O>
Wrap in Talck, a mutex-locked wrapper struct using lock_api.
This implements the GlobalAlloc trait and provides
access to the Allocator API.
Examples
```rust
use spin::Mutex;

let talc = Talc::new(ErrOnOom);
let talck = talc.lock::<Mutex<()>>();

unsafe {
    talck.alloc(Layout::from_size_align_unchecked(32, 4));
}
```