#[repr(transparent)]
pub struct VMExternRef(_);

An external reference to some opaque data.

VMExternRefs dereference to their underlying opaque data as dyn Any.

Unlike the externref in the Wasm spec, VMExternRefs are non-nullable and always point to a valid value. You may use Option<VMExternRef> to represent nullable references; Option<VMExternRef> is guaranteed to have the same size and alignment as a raw pointer, with None represented as the null pointer.

VMExternRefs are reference counted, so cloning is a cheap, shallow operation. It also means they are inherently shared, so you may not get a mutable, exclusive reference to their inner contents, only a shared, immutable reference. You may use interior mutability with RefCell or Mutex to work around this restriction, if necessary.

VMExternRefs have pointer-equality semantics, not structural-equality semantics. Given two VMExternRefs a and b, a == b if and only if a and b point to the same allocation; two different allocations holding identical copies of the same data are considered not equal. The hashing and ordering implementations likewise operate only on the pointer.

Example

use std::cell::RefCell;
use wasmtime_runtime::VMExternRef;

// Open a file. Wasm doesn't know about files, but we can let Wasm instances
// work with files via opaque `externref` handles.
let file = std::fs::File::create("some/file/path")?;

// Wrap the file up as a `VMExternRef` that can be passed to Wasm.
let extern_ref_to_file = VMExternRef::new(file);

// `VMExternRef`s dereference to `dyn Any`, so you can use `Any` methods to
// perform runtime type checks and downcasts.

assert!(extern_ref_to_file.is::<std::fs::File>());
assert!(!extern_ref_to_file.is::<String>());

if let Some(mut file) = extern_ref_to_file.downcast_ref::<std::fs::File>() {
    use std::io::Write;
    writeln!(&mut file, "Hello, `VMExternRef`!")?;
}
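
Beyond the file example above, here is a minimal sketch (not taken from the wasmtime repository) of the cloning, pointer-equality, and Option guarantees described earlier; it uses only the new, clone, and VMExternRef::eq APIs documented on this page.

use wasmtime_runtime::VMExternRef;

// `Option<VMExternRef>` is pointer-sized: `None` is represented as the
// null pointer, per the guarantee documented above.
assert_eq!(
    std::mem::size_of::<Option<VMExternRef>>(),
    std::mem::size_of::<*mut u8>(),
);

// Cloning is a cheap reference-count bump; both handles point at the same
// allocation, so `VMExternRef::eq` reports them as equal.
let a = VMExternRef::new(42_u32);
let b = a.clone();
assert!(VMExternRef::eq(&a, &b));

// A structurally identical value in a *different* allocation is not equal.
let c = VMExternRef::new(42_u32);
assert!(!VMExternRef::eq(&a, &c));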

Implementations§

VMExternRef::new

Wrap the given value inside a VMExternRef.

VMExternRef::new_with

Construct a new VMExternRef in place by invoking make_value.

Examples found in repository
src/externref.rs (line 318)
    pub fn new<T>(value: T) -> VMExternRef
    where
        T: 'static + Any + Send + Sync,
    {
        VMExternRef::new_with(|| value)
    }
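
As a hedged usage sketch of new_with itself (assuming it accepts any closure producing a value with the same bounds as new, as the implementation above suggests):

use wasmtime_runtime::VMExternRef;

// The closure constructs the value in place; this is equivalent to
// `VMExternRef::new(String::from("hello"))`.
let r = VMExternRef::new_with(|| String::from("hello"));
assert!(r.is::<String>());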

VMExternRef::as_raw

Turn this VMExternRef into a raw, untyped pointer.

Unlike into_raw, this does not consume and forget self. It is not safe to use from_raw on pointers returned from this method; only use clone_from_raw!

Nor does this method increment the reference count. You must ensure that self (or some other clone of self) stays alive until clone_from_raw is called.

Examples found in repository
src/libcalls.rs (line 404)
unsafe fn externref_global_get(vmctx: *mut VMContext, index: u32) -> *mut u8 {
    let index = GlobalIndex::from_u32(index);
    let instance = (*vmctx).instance();
    let global = instance.defined_or_imported_global_ptr(index);
    match (*global).as_externref().clone() {
        None => ptr::null_mut(),
        Some(externref) => {
            let raw = externref.as_raw();
            let (activations_table, module_info_lookup) =
                (*instance.store()).externref_activations_table();
            activations_table.insert_with_gc(externref, module_info_lookup);
            raw
        }
    }
}
More examples
src/externref.rs (line 892)
pub unsafe fn gc(
    module_info_lookup: &dyn ModuleInfoLookup,
    externref_activations_table: &mut VMExternRefActivationsTable,
) {
    log::debug!("start GC");

    #[cfg(debug_assertions)]
    assert!(externref_activations_table.gc_okay);

    debug_assert!({
        // This set is only non-empty within this function. It is built up when
        // walking the stack and interpreting stack maps, and then drained back
        // into the activations table's bump-allocated space at the
        // end. Therefore, it should always be empty upon entering this
        // function.
        externref_activations_table.precise_stack_roots.is_empty()
    });

    // This function proceeds by:
    //
    // * walking the stack,
    //
    // * finding the precise set of roots inside Wasm frames via our stack maps,
    //   and
    //
    // * resetting our bump-allocated table's over-approximation to the
    //   newly-discovered precise set.

    // The `activations_table_set` is used for `debug_assert!`s checking that
    // every reference we read out from the stack via stack maps is actually in
    // the table. If that weren't true, then either we forgot to insert a
    // reference in the table when passing it into Wasm (a bug) or we are
    // reading invalid references from the stack (another bug).
    let mut activations_table_set: DebugOnly<HashSet<_>> = Default::default();
    if cfg!(debug_assertions) {
        externref_activations_table.elements(|elem| {
            activations_table_set.insert(elem.as_raw() as *mut VMExternData);
        });
    }

    log::trace!("begin GC trace");
    Backtrace::trace(|frame| {
        let pc = frame.pc();
        debug_assert!(pc != 0, "we should always get a valid PC for Wasm frames");

        let fp = frame.fp();
        debug_assert!(
            fp != 0,
            "we should always get a valid frame pointer for Wasm frames"
        );

        let module_info = module_info_lookup
            .lookup(pc)
            .expect("should have module info for Wasm frame");

        let stack_map = match module_info.lookup_stack_map(pc) {
            Some(sm) => sm,
            None => {
                log::trace!("No stack map for this Wasm frame");
                return std::ops::ControlFlow::Continue(());
            }
        };
        log::trace!(
            "We have a stack map that maps {} words in this Wasm frame",
            stack_map.mapped_words()
        );

        let sp = fp - stack_map.mapped_words() as usize * mem::size_of::<usize>();

        for i in 0..(stack_map.mapped_words() as usize) {
            // Stack maps have one bit per word in the frame, and the
            // zero^th bit is the *lowest* addressed word in the frame,
            // i.e. the closest to the SP. So to get the `i`^th word in
            // this frame, we add `i * sizeof(word)` to the SP.
            let stack_slot = sp + i * mem::size_of::<usize>();

            if !stack_map.get_bit(i) {
                log::trace!(
                    "Stack slot @ {:p} does not contain externrefs",
                    stack_slot as *const (),
                );
                continue;
            }

            let stack_slot = stack_slot as *const *mut VMExternData;
            let r = std::ptr::read(stack_slot);
            log::trace!("Stack slot @ {:p} = {:p}", stack_slot, r);

            debug_assert!(
                r.is_null() || activations_table_set.contains(&r),
                "every on-stack externref inside a Wasm frame should \
                 have an entry in the VMExternRefActivationsTable; \
                 {:?} is not in the table",
                r
            );

            if let Some(r) = NonNull::new(r) {
                VMExternRefActivationsTable::insert_precise_stack_root(
                    &mut externref_activations_table.precise_stack_roots,
                    r,
                );
            }
        }

        std::ops::ControlFlow::Continue(())
    });
    log::trace!("end GC trace");

    externref_activations_table.sweep();

    log::debug!("end GC");
}
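
A hedged sketch of the intended as_raw / clone_from_raw pairing; it assumes strong_count (documented further below) has the signature strong_count(&self) -> usize.

use wasmtime_runtime::VMExternRef;

let original = VMExternRef::new("hello");

// `as_raw` hands out a pointer without consuming `original` and without
// touching the reference count, so `original` must stay alive for as long
// as `raw` is used.
let raw = original.as_raw();
assert_eq!(original.strong_count(), 1); // assumed signature, see lead-in

// `clone_from_raw` (not `from_raw`) is the correct way to turn such a
// pointer back into an owned reference; it increments the count.
let copy = unsafe { VMExternRef::clone_from_raw(raw) };
assert_eq!(original.strong_count(), 2);

drop(copy);
assert_eq!(original.strong_count(), 1);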

VMExternRef::into_raw

Consume this VMExternRef into a raw, untyped pointer.

Safety

This method forgets self, so it is possible to create a leak of the underlying reference counted data if not used carefully.

Use from_raw to recreate the VMExternRef.

Examples found in repository
src/table.rs (line 89)
    unsafe fn into_table_value(self) -> usize {
        match self {
            Self::UninitFunc => 0,
            Self::FuncRef(e) => (e as usize) | FUNCREF_INIT_BIT,
            Self::ExternRef(e) => e.map_or(0, |e| e.into_raw() as usize),
        }
    }

    /// Consumes a table element into a pointer/reference, as it
    /// exists outside the table itself. This strips off any tag bits
    /// or other information that only lives inside the table.
    ///
    /// Can only be done to an initialized table element; lazy init
    /// must occur first. (In other words, lazy values do not survive
    /// beyond the table, as every table read path initializes them.)
    ///
    /// # Safety
    ///
    /// The same warnings as for `into_table_values()` apply.
    pub(crate) unsafe fn into_ref_asserting_initialized(self) -> usize {
        match self {
            Self::FuncRef(e) => e as usize,
            Self::ExternRef(e) => e.map_or(0, |e| e.into_raw() as usize),
            Self::UninitFunc => panic!("Uninitialized table element value outside of table slot"),
        }
    }

VMExternRef::from_raw

Recreate a VMExternRef from a pointer returned from a previous call to as_raw.

Safety

Unlike clone_from_raw, this does not increment the reference count of the underlying data. It is not safe to continue to use the pointer passed to this function.

Examples found in repository
src/table.rs (line 54)
    unsafe fn from_table_value(ty: TableElementType, ptr: usize) -> Self {
        match (ty, ptr) {
            (TableElementType::Func, 0) => Self::UninitFunc,
            (TableElementType::Func, ptr) => Self::FuncRef((ptr & FUNCREF_MASK) as _),
            (TableElementType::Extern, 0) => Self::ExternRef(None),
            (TableElementType::Extern, ptr) => {
                Self::ExternRef(Some(VMExternRef::from_raw(ptr as *mut u8)))
            }
        }
    }
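
A hedged round-trip sketch for into_raw and from_raw, assuming both are unsafe fns as their Safety sections and the table.rs call sites suggest.

use wasmtime_runtime::VMExternRef;

let r = VMExternRef::new(1234_u64);

// `into_raw` consumes and forgets `r`: the reference count is unchanged,
// but nothing owns the allocation until it is reconstructed.
let raw = unsafe { r.into_raw() };

// `from_raw` reclaims ownership without incrementing the count; `raw` must
// not be used again after this call.
let r = unsafe { VMExternRef::from_raw(raw) };
assert!(r.is::<u64>());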

VMExternRef::clone_from_raw

Recreate a VMExternRef from a pointer returned from a previous call to as_raw.

Safety

Wildly unsafe to use with anything other than the result of a previous as_raw call!

Additionally, it is your responsibility to ensure that this raw VMExternRef’s reference count has not dropped to zero. Failure to do so will result in use after free!

Examples found in repository
src/externref.rs (line 718)
    fn insert_precise_stack_root(
        precise_stack_roots: &mut HashSet<VMExternRefWithTraits>,
        root: NonNull<VMExternData>,
    ) {
        let root = unsafe { VMExternRef::clone_from_raw(root.as_ptr().cast()) };
        log::trace!("Found externref on stack: {:p}", root);
        precise_stack_roots.insert(VMExternRefWithTraits(root));
    }
More examples
src/table.rs (line 70)
    unsafe fn clone_from_table_value(ty: TableElementType, ptr: usize) -> Self {
        match (ty, ptr) {
            (TableElementType::Func, 0) => Self::UninitFunc,
            (TableElementType::Func, ptr) => Self::FuncRef((ptr & FUNCREF_MASK) as _),
            (TableElementType::Extern, 0) => Self::ExternRef(None),
            (TableElementType::Extern, ptr) => {
                Self::ExternRef(Some(VMExternRef::clone_from_raw(ptr as *mut u8)))
            }
        }
    }
src/libcalls.rs (line 208)
unsafe fn table_grow(
    vmctx: *mut VMContext,
    table_index: u32,
    delta: u32,
    // NB: we don't know whether this is a pointer to a `VMCallerCheckedAnyfunc`
    // or is a `VMExternRef` until we look at the table type.
    init_value: *mut u8,
) -> Result<u32> {
    let instance = (*vmctx).instance_mut();
    let table_index = TableIndex::from_u32(table_index);
    let element = match instance.table_element_type(table_index) {
        TableElementType::Func => (init_value as *mut VMCallerCheckedAnyfunc).into(),
        TableElementType::Extern => {
            let init_value = if init_value.is_null() {
                None
            } else {
                Some(VMExternRef::clone_from_raw(init_value))
            };
            init_value.into()
        }
    };
    Ok(match instance.table_grow(table_index, delta, element)? {
        Some(r) => r,
        None => -1_i32 as u32,
    })
}

use table_grow as table_grow_funcref;
use table_grow as table_grow_externref;

// Implementation of `table.fill`.
unsafe fn table_fill(
    vmctx: *mut VMContext,
    table_index: u32,
    dst: u32,
    // NB: we don't know whether this is a `VMExternRef` or a pointer to a
    // `VMCallerCheckedAnyfunc` until we look at the table's element type.
    val: *mut u8,
    len: u32,
) -> Result<(), Trap> {
    let instance = (*vmctx).instance_mut();
    let table_index = TableIndex::from_u32(table_index);
    let table = &mut *instance.get_table(table_index);
    match table.element_type() {
        TableElementType::Func => {
            let val = val as *mut VMCallerCheckedAnyfunc;
            table.fill(dst, val.into(), len)
        }
        TableElementType::Extern => {
            let val = if val.is_null() {
                None
            } else {
                Some(VMExternRef::clone_from_raw(val))
            };
            table.fill(dst, val.into(), len)
        }
    }
}

use table_fill as table_fill_funcref;
use table_fill as table_fill_externref;

// Implementation of `table.copy`.
unsafe fn table_copy(
    vmctx: *mut VMContext,
    dst_table_index: u32,
    src_table_index: u32,
    dst: u32,
    src: u32,
    len: u32,
) -> Result<(), Trap> {
    let dst_table_index = TableIndex::from_u32(dst_table_index);
    let src_table_index = TableIndex::from_u32(src_table_index);
    let instance = (*vmctx).instance_mut();
    let dst_table = instance.get_table(dst_table_index);
    // Lazy-initialize the whole range in the source table first.
    let src_range = src..(src.checked_add(len).unwrap_or(u32::MAX));
    let src_table = instance.get_table_with_lazy_init(src_table_index, src_range);
    Table::copy(dst_table, src_table, dst, src, len)
}

// Implementation of `table.init`.
unsafe fn table_init(
    vmctx: *mut VMContext,
    table_index: u32,
    elem_index: u32,
    dst: u32,
    src: u32,
    len: u32,
) -> Result<(), Trap> {
    let table_index = TableIndex::from_u32(table_index);
    let elem_index = ElemIndex::from_u32(elem_index);
    let instance = (*vmctx).instance_mut();
    instance.table_init(table_index, elem_index, dst, src, len)
}

// Implementation of `elem.drop`.
unsafe fn elem_drop(vmctx: *mut VMContext, elem_index: u32) {
    let elem_index = ElemIndex::from_u32(elem_index);
    let instance = (*vmctx).instance_mut();
    instance.elem_drop(elem_index);
}

// Implementation of `memory.copy` for locally defined memories.
unsafe fn memory_copy(
    vmctx: *mut VMContext,
    dst_index: u32,
    dst: u64,
    src_index: u32,
    src: u64,
    len: u64,
) -> Result<(), Trap> {
    let src_index = MemoryIndex::from_u32(src_index);
    let dst_index = MemoryIndex::from_u32(dst_index);
    let instance = (*vmctx).instance_mut();
    instance.memory_copy(dst_index, dst, src_index, src, len)
}

// Implementation of `memory.fill` for locally defined memories.
unsafe fn memory_fill(
    vmctx: *mut VMContext,
    memory_index: u32,
    dst: u64,
    val: u32,
    len: u64,
) -> Result<(), Trap> {
    let memory_index = MemoryIndex::from_u32(memory_index);
    let instance = (*vmctx).instance_mut();
    instance.memory_fill(memory_index, dst, val as u8, len)
}

// Implementation of `memory.init`.
unsafe fn memory_init(
    vmctx: *mut VMContext,
    memory_index: u32,
    data_index: u32,
    dst: u64,
    src: u32,
    len: u32,
) -> Result<(), Trap> {
    let memory_index = MemoryIndex::from_u32(memory_index);
    let data_index = DataIndex::from_u32(data_index);
    let instance = (*vmctx).instance_mut();
    instance.memory_init(memory_index, data_index, dst, src, len)
}

// Implementation of `ref.func`.
unsafe fn ref_func(vmctx: *mut VMContext, func_index: u32) -> *mut u8 {
    let instance = (*vmctx).instance_mut();
    let anyfunc = instance
        .get_caller_checked_anyfunc(FuncIndex::from_u32(func_index))
        .expect("ref_func: caller_checked_anyfunc should always be available for given func index");
    anyfunc as *mut _
}

// Implementation of `data.drop`.
unsafe fn data_drop(vmctx: *mut VMContext, data_index: u32) {
    let data_index = DataIndex::from_u32(data_index);
    let instance = (*vmctx).instance_mut();
    instance.data_drop(data_index)
}

// Returns a table entry after lazily initializing it.
unsafe fn table_get_lazy_init_funcref(
    vmctx: *mut VMContext,
    table_index: u32,
    index: u32,
) -> *mut u8 {
    let instance = (*vmctx).instance_mut();
    let table_index = TableIndex::from_u32(table_index);
    let table = instance.get_table_with_lazy_init(table_index, std::iter::once(index));
    let elem = (*table)
        .get(index)
        .expect("table access already bounds-checked");

    elem.into_ref_asserting_initialized() as *mut _
}

// Drop a `VMExternRef`.
unsafe fn drop_externref(_vmctx: *mut VMContext, externref: *mut u8) {
    let externref = externref as *mut crate::externref::VMExternData;
    let externref = NonNull::new(externref).unwrap();
    crate::externref::VMExternData::drop_and_dealloc(externref);
}

// Do a GC and insert the given `externref` into the
// `VMExternRefActivationsTable`.
unsafe fn activations_table_insert_with_gc(vmctx: *mut VMContext, externref: *mut u8) {
    let externref = VMExternRef::clone_from_raw(externref);
    let instance = (*vmctx).instance();
    let (activations_table, module_info_lookup) = (*instance.store()).externref_activations_table();

    // Invariant: all `externref`s on the stack have an entry in the activations
    // table. So we need to ensure that this `externref` is in the table
    // *before* we GC, even though `insert_with_gc` will ensure that it is in
    // the table *after* the GC. This technically results in one more hash table
    // look up than is strictly necessary -- which we could avoid by having an
    // additional GC method that is aware of these GC-triggering references --
    // but it isn't really a concern because this is already a slow path.
    activations_table.insert_without_gc(externref.clone());

    activations_table.insert_with_gc(externref, module_info_lookup);
}

// Perform a Wasm `global.get` for `externref` globals.
unsafe fn externref_global_get(vmctx: *mut VMContext, index: u32) -> *mut u8 {
    let index = GlobalIndex::from_u32(index);
    let instance = (*vmctx).instance();
    let global = instance.defined_or_imported_global_ptr(index);
    match (*global).as_externref().clone() {
        None => ptr::null_mut(),
        Some(externref) => {
            let raw = externref.as_raw();
            let (activations_table, module_info_lookup) =
                (*instance.store()).externref_activations_table();
            activations_table.insert_with_gc(externref, module_info_lookup);
            raw
        }
    }
}

// Perform a Wasm `global.set` for `externref` globals.
unsafe fn externref_global_set(vmctx: *mut VMContext, index: u32, externref: *mut u8) {
    let externref = if externref.is_null() {
        None
    } else {
        Some(VMExternRef::clone_from_raw(externref))
    };

    let index = GlobalIndex::from_u32(index);
    let instance = (*vmctx).instance();
    let global = instance.defined_or_imported_global_ptr(index);

    // Swap the new `externref` value into the global before we drop the old
    // value. This protects against an `externref` with a `Drop` implementation
    // that calls back into Wasm and touches this global again (we want to avoid
    // it observing a halfway-deinitialized value).
    let old = mem::replace((*global).as_externref_mut(), externref);
    drop(old);
}

VMExternRef::strong_count

Get the strong reference count for this VMExternRef.

Note that this loads with a SeqCst ordering to synchronize with other threads.
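
A short hedged sketch, assuming the signature strong_count(&self) -> usize:

use wasmtime_runtime::VMExternRef;

let r = VMExternRef::new(());
assert_eq!(r.strong_count(), 1);

// `clone` bumps the strong count; dropping the clone brings it back down.
let r2 = r.clone();
assert_eq!(r.strong_count(), 2);
drop(r2);
assert_eq!(r.strong_count(), 1);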

Methods that would normally be trait implementations, but aren't, in order to avoid potential footguns around VMExternRef's pointer-equality semantics.

Note that none of these are methods on &self; they all require a fully-qualified VMExternRef::foo(my_ref) invocation.

VMExternRef::eq

Check whether two VMExternRefs point to the same inner allocation.

Note that this uses pointer-equality semantics, not structural-equality semantics: only the pointers are compared, and the pointed-to values' Eq or PartialEq implementations (if any) are not used.

Examples found in repository
src/externref.rs (line 496)
    fn eq(&self, other: &Self) -> bool {
        VMExternRef::eq(&self.0, &other.0)
    }

VMExternRef::hash

Hash a given VMExternRef.

Note that this just hashes the pointer to the inner value; it does not use the inner value's Hash implementation (if any).

Examples found in repository
src/externref.rs (line 490)
    fn hash<H>(&self, hasher: &mut H)
    where
        H: Hasher,
    {
        VMExternRef::hash(&self.0, hasher)
    }
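
A hedged sketch using the fully-qualified VMExternRef::hash(externref, hasher) form shown in the repository example above; DefaultHasher stands in for any Hasher.

use std::collections::hash_map::DefaultHasher;
use std::hash::Hasher;
use wasmtime_runtime::VMExternRef;

let r = VMExternRef::new("hello");
let clone = r.clone();

// Clones share one allocation; since only the pointer is hashed, they hash
// identically, while an equal value in a separate allocation generally
// would not.
let mut h1 = DefaultHasher::new();
VMExternRef::hash(&r, &mut h1);

let mut h2 = DefaultHasher::new();
VMExternRef::hash(&clone, &mut h2);

assert_eq!(h1.finish(), h2.finish());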

VMExternRef::cmp

Compare two VMExternRefs.

Note that this compares only the pointers, not the pointed-to values, and so does not use any Ord or PartialOrd implementation of the pointed-to values.

Trait Implementations§

Clone: Returns a copy of the value; clone_from performs copy-assignment from source.
Debug: Formats the value using the given formatter.
Deref: Dereferences the value to the underlying dyn Any data.
Drop: Executes the destructor for this type.
From: Converts to this type from the input type.
Pointer: Formats the value using the given formatter.

Auto Trait Implementations§

Blanket Implementations§

Any: Gets the TypeId of self.
Borrow<T>: Immutably borrows from an owned value.
BorrowMut<T>: Mutably borrows from an owned value.
From<T>: Returns the argument unchanged.
Into<U>: Calls U::from(self). That is, this conversion is whatever the implementation of From<T> for U chooses to do.
ToOwned: Owned is the resulting type after obtaining ownership; to_owned creates owned data from borrowed data, usually by cloning; clone_into uses borrowed data to replace owned data, usually by cloning.
TryFrom<T>: Error is the type returned in the event of a conversion error; try_from performs the conversion.
TryInto<U>: Error is the type returned in the event of a conversion error; try_into performs the conversion.