pub struct Talloc { /* private fields */ }
The TauOS Allocator!
Note, you’re probably looking for Tallock if you want
the spin-locked wrapper with the GlobalAlloc and Allocator
trait implementations.
Implementations

impl Talloc
pub unsafe fn malloc(&mut self, layout: Layout) -> Result<NonNull<u8>, AllocError>
Allocate a contiguous region of memory according to layout, if possible.
SAFETY:
layout.size() must be nonzero.
pub unsafe fn free(&mut self, ptr: NonNull<u8>, _: Layout)
Free previously allocated/reallocated memory.
SAFETY:
ptr must have been previously allocated given layout.
pub unsafe fn grow(&mut self, ptr: NonNull<u8>, layout: Layout, new_size: usize) -> Result<NonNull<u8>, AllocError>
Grow a previously allocated/reallocated region of memory to new_size.
SAFETY:
ptr must have been previously allocated or reallocated given old_layout.
new_size must be greater than or equal to old_layout.size().
pub unsafe fn shrink(&mut self, ptr: NonNull<u8>, layout: Layout, new_size: usize)
Shrink a previously allocated/reallocated region of memory to new_size.
This function is infallible given valid inputs, and the reallocation will always be done in-place, maintaining the validity of the pointer.
SAFETY:
- ptr must have been previously allocated or reallocated given old_layout.
- new_size must be smaller than or equal to old_layout.size().
- new_size should be nonzero.
pub const fn new() -> Self
pub const fn with_oom_handler( oom_handler: fn(_: &mut Talloc, _: Layout) -> Result<(), AllocError> ) -> Self
pub const fn get_arena(&self) -> Span
pub fn get_allocatable_span(&self) -> Span
Returns the span in which allocations may be placed.
pub fn get_allocated_span(&self) -> Span
Returns the minimum span containing all allocated memory.
None indicates there is no allocated memory.
pub unsafe fn init(&mut self, arena: Span)
Initialize the allocator heap.
SAFETY:
- After initialization, the allocator structure is invalidated if moved. This is because there are pointers on the heap to this struct.
- Initialization restores validity, but erases all knowledge of previous allocations.
Alternatively, use the mov method.
pub unsafe fn extend(&mut self, new_arena: Span)
Increase the extent of the arena.
SAFETY:
The entire new_arena memory must be readable and writable, and unmutated besides that which is allocated.
Panics:
This function panics if:
- new_arena doesn’t contain the old arena
- new_arena contains the null address
A recommended pattern for satisfying these criteria is:
let mut talloc = tallock.0.lock();
// compute the new arena as an extension of the old arena
let new_arena = talloc.get_arena().extend(1234, 5678).above(0x1000);
// extend the arena
// SAFETY: must be sure that we aren't extending into memory we can't use
unsafe { talloc.extend(new_arena); }

pub fn truncate(&mut self, new_arena: Span)
Reduce the extent of the arena. The new extent must encompass all current allocations. See below.
Panics:
This function panics if:
- old arena doesn’t contain new_arena
- new_arena doesn’t contain all the allocated memory
The recommended pattern for satisfying these criteria is:
// lock the allocator otherwise a race condition may occur
// in between get_allocated_span and truncate
let mut talloc = tallock.0.lock();
// compute the new arena as a reduction of the old arena
let new_arena = talloc.get_arena().truncate(1234, 5678).fit_over(talloc.get_allocated_span());
// alternatively...
let new_arena = Span::from(1234..5678).fit_within(talloc.get_arena()).fit_over(talloc.get_allocated_span());
// truncate the arena
talloc.truncate(new_arena);

pub fn mov(self, dest: &mut MaybeUninit<Self>) -> &mut Self
Move the allocator structure to a new destination safely.