# Macro `index_vec::define_index_type`
Generate the boilerplate for a newtyped index struct, for use with `IndexVec`.
## Usage

### Standard
The rough usage pattern of this macro is:
```rust
index_vec::define_index_type! {
    // Note that this isn't actually a type alias; `MyIndex` is
    // actually defined as a struct.
    // XXX is this too confusing?
    pub struct MyIndex = u32;
    // Optional extra configuration here, of the form:
    // `OPTION_NAME = stuff;`
    // See below for details.
}
```
Note that you can use index types other than `u32`, and you can set the visibility of the wrapped field as well, e.g. `MyIndex(pub u32)`. Currently the wrapped item must be a tuple struct, however (patches welcome).
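As a rough mental model, the expansion is approximately the following hand-written sketch. This is illustrative only: the real macro generates more methods and checks, and the method names below are assumptions for the sketch, not the crate's exact API.

```rust
// Hand-written approximation of what `define_index_type!` might
// generate for `pub struct MyIndex = u32;`. Illustrative only.
#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub struct MyIndex(u32);

impl MyIndex {
    // The real macro asserts against MAX_INDEX here; we use the
    // raw type's full range as the default bound.
    pub fn from_usize(value: usize) -> Self {
        assert!(value <= u32::MAX as usize);
        MyIndex(value as u32)
    }

    // Convert back to a usize for indexing.
    pub fn index(self) -> usize {
        self.0 as usize
    }
}

fn main() {
    let i = MyIndex::from_usize(7);
    assert_eq!(i.index(), 7);
}
```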
### Customization
After the struct declaration, there are a number of configuration options the macro uses to customize how the type it generates behaves. For example:
```rust
index_vec::define_index_type! {
    #[repr(transparent)]
    pub struct Span = u32;
    // Don't allow any spans with values higher than this.
    MAX_INDEX = 0x7fff_ff00;
    // But I also am not too worried about it, so only
    // perform the asserts in debug builds.
    DISABLE_MAX_INDEX_CHECK = !cfg!(debug_assertions);
}
```
## Configuration options
This macro has a few options for customizing its output behavior. There's not really any great syntax for them, but they get the job done.
### `MAX_INDEX = <expr producing usize>;`

Assert if anything tries to construct an index above this value.

By default, this is `$raw_type::max_value() as usize`, i.e. we check that our cast from `usize` to the wrapper is lossless, but we assume every instance of `$raw_type` is valid in this index domain.

Note that these checks can be disabled entirely, or conditionally, with `DISABLE_MAX_INDEX_CHECK`. Additionally, the generated type has `from_usize_unchecked` and `from_raw_unchecked` functions which can be used to skip these checks.
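The difference between the checked and unchecked paths can be sketched in plain Rust, with standalone functions standing in for the generated methods (the `MAX_INDEX` value is borrowed from the earlier `Span` example; names are illustrative, not the macro's literal output):

```rust
// Illustrative sketch only, not the actual macro expansion.
const MAX_INDEX: usize = 0x7fff_ff00;

// Mirrors the checked path: panics on out-of-range values.
fn from_usize(value: usize) -> u32 {
    assert!(value <= MAX_INDEX, "index {} exceeds MAX_INDEX", value);
    value as u32
}

// Mirrors `from_usize_unchecked`: skips the assert entirely, so a
// value above MAX_INDEX passes through silently.
fn from_usize_unchecked(value: usize) -> u32 {
    value as u32
}

fn main() {
    assert_eq!(from_usize(100), 100);
    // 0xffff_ffff > MAX_INDEX: the checked path would panic here,
    // but the unchecked path just converts it.
    assert_eq!(from_usize_unchecked(0xffff_ffff), u32::MAX);
}
```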
### `DISABLE_MAX_INDEX_CHECK = <expr>;`

Set to `true` to disable the assertions mentioned above. `false` by default.

To be clear, if this is set to `true`, we blindly assume all casts between `usize` and `$raw_type` succeed.

A common use is setting `DISABLE_MAX_INDEX_CHECK = !cfg!(debug_assertions)` to skip the checks in release builds.

For the sake of clarity: disabling this cannot lead to memory unsafety. We still go through bounds checks when accessing slices, and no `unsafe` code should rely on these checks (don't write any that does; use this option for correctness only!).
### `DEFAULT = <expr>;`

If provided, we'll implement `Default` for the index type using this expression.

Example:
```rust
index_vec::define_index_type! {
    pub struct MyIdx = u16;
    MAX_INDEX = (u16::max_value() - 1) as usize;
    // Set the default index to be an invalid index, as
    // a hacky way of having this type behave somewhat
    // like it were an `Option<MyIdx>` without consuming
    // extra space.
    DEFAULT = (MyIdx::from_raw_unchecked(u16::max_value()));
}
```
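The sentinel-default pattern above can be demonstrated with a plain hand-written newtype. This is not the macro's output, just the idea it enables: the reserved maximum value acts as a space-free stand-in for `Option<MyIdx>`'s `None`.

```rust
// Hand-written sketch of the sentinel-default pattern; `MyIdx`
// here is a plain newtype, not the macro-generated one.
#[derive(Copy, Clone, PartialEq, Eq, Debug)]
struct MyIdx(u16);

// The reserved sentinel, matching the DEFAULT in the example above.
const INVALID: MyIdx = MyIdx(u16::MAX);

impl Default for MyIdx {
    fn default() -> Self {
        INVALID
    }
}

impl MyIdx {
    fn is_valid(self) -> bool {
        self != INVALID
    }
}

fn main() {
    // A defaulted index behaves like "no index".
    assert!(!MyIdx::default().is_valid());
    assert!(MyIdx(3).is_valid());
}
```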
### `NO_DERIVES = true;`

By default the generated type will derive all traits needed to make itself work, specifically `Copy`, `Clone`, `Debug`, `PartialEq`, `Eq`, `Hash`, `PartialOrd`, and `Ord`. If you'd like to provide your own implementation of one of these, that's a problem.

It can be worked around by setting `NO_DERIVES` and providing the implementations yourself, usually with a combination of manual impls and derives. For example, to use a custom `Debug` impl:
```rust
index_vec::define_index_type! {
    // Derive everything needed except `Debug`.
    #[derive(Copy, Clone, PartialEq, Eq, Hash, PartialOrd, Ord)]
    struct MyIdx = usize;
    NO_DERIVES = true;
}

// ...and then implement `Debug` manually.
impl core::fmt::Debug for MyIdx {
    fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
        write!(f, "{}", self.raw())
    }
}
```