pub struct ChainCoder<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize>
where Word: BitArray + Into<State>, State: BitArray + AsPrimitive<Word>,
{ /* private fields */ }

Experimental entropy coder for advanced variants of bits-back coding.

See the module-level documentation for motivation and an explanation of the implemented entropy coding algorithm.

Intended Usage

A typical usage cycle goes through the following steps:

When compressing data using the bits-back trick

  0. Start with some stack of (typically already compressed) binary data, which you want to piggy-back into the choice of certain latent variables.
  1. Create a ChainCoder by calling ChainCoder::from_binary or ChainCoder::from_compressed (depending on whether you can guarantee that the stack of binary data has a nonzero word on top).
  2. Use the ChainCoder and a sequence of entropy models to decode some symbols.
  3. Export the remainders data from the ChainCoder by calling .into_remainders().

When decompressing the data

  1. Create a ChainCoder by calling ChainCoder::from_remainders.
  2. Encode the symbols you obtained in Step 2 above back onto the new chain coder (in reverse order) using the same entropy models.
  3. Recover the original binary data from Step 0 above by calling .into_binary() or .into_compressed() (using the analogous choice as in Step 1 above).

Examples

The following two examples show two variants of the typical usage cycle described above.

use constriction::stream::{model::DefaultLeakyQuantizer, Decode, chain::DefaultChainCoder};
use probability::distribution::Gaussian;

// Step 0 of the compressor: Generate some sample binary data for demonstration purpose.
let original_data = (0..100u32).map(
    |i| i.wrapping_mul(0xad5f_b2ed).wrapping_add(0xed55_4892)
).collect::<Vec<_>>();

// Step 1 of the compressor: obtain a `ChainCoder` from the original binary data.
let mut coder = DefaultChainCoder::from_binary(original_data.clone()).unwrap();

// Step 2 of the compressor: decode data into symbols using some entropy models.
let quantizer = DefaultLeakyQuantizer::new(-100..=100);
let models = (0..50u32).map(|i| quantizer.quantize(Gaussian::new(i as f64, 10.0)));
let symbols = coder.decode_symbols(models.clone()).collect::<Result<Vec<_>, _>>().unwrap();

// Step 3 of the compressor: export the remainders data.
let (remainders_prefix, remainders_suffix) = coder.into_remainders().unwrap();
// (verify that we've indeed reduced the amount of data:)
assert!(remainders_prefix.len() + remainders_suffix.len() < original_data.len());

// ... do something with the `symbols`, then recover them later ...

// Step 1 of the decompressor: create a `ChainCoder` from the remainders data. We only really
// need the `remainders_suffix` here, but it would also be legal to use the concatenation of
// `remainders_prefix` with `remainders_suffix` (see other example below).
let mut coder = DefaultChainCoder::from_remainders(remainders_suffix).unwrap();

// Step 2 of the decompressor: re-encode the symbols in reverse order.
coder.encode_symbols_reverse(symbols.into_iter().zip(models)).unwrap();

// Step 3 of the decompressor: recover the original data.
let (recovered_prefix, recovered_suffix) = coder.into_binary().unwrap();
assert!(recovered_prefix.is_empty());  // Empty because we discarded `remainders_prefix` above.
let mut recovered = remainders_prefix;  // But we have to prepend it to the recovered data now.
recovered.extend_from_slice(&recovered_suffix);

assert_eq!(recovered, original_data);

In Step 3 of the compressor in the example above, calling .into_remainders() on a ChainCoder returns a tuple of a remainders_prefix and a remainders_suffix. The remainders_prefix contains superfluous data that we didn’t need when decoding the symbols (remainders_prefix is an unaltered prefix of the original data). We therefore don’t need remainders_prefix for re-encoding the symbols, so we didn’t pass it to ChainCoder::from_remainders in Step 1 of the decompressor above.

If we were to write out remainders_prefix and remainders_suffix to a file, it would be tedious to keep track of where the prefix ends and where the suffix begins. Luckily, we don’t have to do this. We can just as well concatenate remainders_prefix and remainders_suffix right away. The only additional change this will cause is that the call to .into_binary() in Step 3 of the decompressor will then return a non-empty recovered_prefix, because the second ChainCoder will then also have some superfluous data. So we’ll have to again concatenate the two returned buffers. The following example shows how this works:

// ... compressor same as in the previous example above ...

// Alternative Step 1 of the decompressor: concatenate `remainders_prefix` with
// `remainders_suffix` before creating a `ChainCoder` from them.
let mut remainders = remainders_prefix;
remainders.extend_from_slice(&remainders_suffix);
let mut coder = DefaultChainCoder::from_remainders(remainders).unwrap();

// Step 2 of the decompressor: re-encode symbols in reverse order (same as in previous example).
coder.encode_symbols_reverse(symbols.into_iter().zip(models)).unwrap();

// Alternative Step 3 of the decompressor: recover the original data by another concatenation.
let (recovered_prefix, recovered_suffix) = coder.into_binary().unwrap();
assert!(!recovered_prefix.is_empty());  // No longer empty because there was superfluous data.
let mut recovered = recovered_prefix;   // So we have to concatenate `recovered_{pre,suf}fix`.
recovered.extend_from_slice(&recovered_suffix);

assert_eq!(recovered, original_data);

Implementations

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where Word: BitArray + Into<State>, State: BitArray + AsPrimitive<Word>,

pub fn from_binary( data: CompressedBackend ) -> Result<Self, CoderError<CompressedBackend, CompressedBackend::ReadError>>
where CompressedBackend: ReadWords<Word, Stack>, RemaindersBackend: Default,

Creates a new ChainCoder for decoding from the provided data.

The provided data must have enough words to initialize the chain heads but can otherwise be arbitrary. In particular, data doesn’t necessarily have to come from an AnsCoder. If you know that data does come from an AnsCoder then it’s slightly better to call from_compressed instead.

Returns an error if data does not have enough words to initialize the chain heads or if reading from data leads to an error.
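
For example (a minimal sketch; the two words below are arbitrary placeholder data):

use constriction::stream::chain::DefaultChainCoder;

// Any binary data works, as long as it contains enough words to initialize the chain heads.
let _coder = DefaultChainCoder::from_binary(vec![0x0123_4567u32, 0x89ab_cdef]).unwrap();

// An empty backend cannot initialize the chain heads, so construction fails.
assert!(DefaultChainCoder::from_binary(Vec::<u32>::new()).is_err());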

pub fn from_compressed( compressed: CompressedBackend ) -> Result<Self, CoderError<CompressedBackend, CompressedBackend::ReadError>>
where CompressedBackend: ReadWords<Word, Stack>, RemaindersBackend: Default,

Creates a new ChainCoder for decoding from the compressed data of an AnsCoder.

The provided read backend, compressed, must have enough words to initialize the chain heads and must not have a zero word at the current read position. The latter is always satisfied for (nonempty) data returned from AnsCoder::into_compressed.

Returns an error if compressed does not have enough words, if reading from compressed leads to an error, or if the first word read from compressed is zero.
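
For illustration, the following sketch runs through the whole usage cycle starting from data produced by an AnsCoder (the particular symbols and entropy models are arbitrary; they merely ensure that the AnsCoder emits enough compressed words):

use constriction::stream::{
    chain::DefaultChainCoder, model::DefaultLeakyQuantizer, stack::DefaultAnsCoder, Decode,
};
use probability::distribution::Gaussian;

// Produce some ANS-compressed data to start from.
let quantizer = DefaultLeakyQuantizer::new(-100..=100);
let ans_models = (0..100u32).map(
    |i| quantizer.quantize(Gaussian::new((i % 20) as f64 - 10.0, 15.0))
);
let ans_symbols = (0..100i32).map(|i| (i % 15) - 7).collect::<Vec<_>>();
let mut ans = DefaultAnsCoder::new();
ans.encode_symbols_reverse(ans_symbols.iter().zip(ans_models)).unwrap();
let original = ans.into_compressed().unwrap();

// Nonempty data from `AnsCoder::into_compressed` never has a zero word on top, so we can
// hand it to `from_compressed` and decode some symbols from it.
let mut coder = DefaultChainCoder::from_compressed(original.clone()).unwrap();
let models = (0..10u32).map(|i| quantizer.quantize(Gaussian::new(i as f64, 10.0)));
let symbols = coder.decode_symbols(models.clone()).collect::<Result<Vec<_>, _>>().unwrap();
let (remainders_prefix, remainders_suffix) = coder.into_remainders().unwrap();

// Re-encode the symbols and recover the original compressed data, this time calling
// `.into_compressed()` (rather than `.into_binary()` as in the examples above).
let mut coder = DefaultChainCoder::from_remainders(remainders_suffix).unwrap();
coder.encode_symbols_reverse(symbols.into_iter().zip(models)).unwrap();
let (recovered_prefix, recovered_suffix) = coder.into_compressed().unwrap();
let mut recovered = remainders_prefix;
recovered.extend_from_slice(&recovered_prefix);
recovered.extend_from_slice(&recovered_suffix);
assert_eq!(recovered, original);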

pub fn into_remainders( self ) -> Result<(CompressedBackend, RemaindersBackend), RemaindersBackend::WriteError>
where RemaindersBackend: WriteWords<Word>,

Terminates decoding and returns the remainders bit string as a tuple (prefix, suffix).

You can use the returned tuple (prefix, suffix) in either of the following two ways (see examples in the struct level documentation):

  • Either put prefix away and continue only with suffix as follows:
    1. obtain a new ChainCoder by calling ChainCoder::from_remainders(suffix);
    2. encode the same symbols that you decoded from the original ChainCoder back onto the new ChainCoder (in reverse order);
    3. call .into_binary() or .into_compressed() on the new ChainCoder to obtain another tuple (prefix2, suffix2);
    4. concatenate prefix, prefix2, and suffix2 to recover the data from which you created the original ChainCoder when you constructed it with ChainCoder::from_binary or ChainCoder::from_compressed, respectively.
  • Or you can concatenate prefix with suffix, create a new ChainCoder from the concatenation by calling ChainCoder::from_remainders(concatenation), continue with steps 2 and 3 above, and then just concatenate prefix2 with suffix2 to recover the original data.

pub fn from_remainders( remainders: RemaindersBackend ) -> Result<Self, CoderError<RemaindersBackend, RemaindersBackend::ReadError>>
where RemaindersBackend: ReadWords<Word, Stack>, CompressedBackend: Default,

Creates a new ChainCoder for encoding some symbols together with the data previously obtained from into_remainders.

See into_remainders for detailed explanation.

pub fn into_compressed( self ) -> Result<(RemaindersBackend, CompressedBackend), CoderError<Self, CompressedBackend::WriteError>>
where CompressedBackend: WriteWords<Word>,

Terminates encoding if possible and returns the compressed data as a tuple (prefix, suffix).

Call this method only if the original ChainCoder used for decoding was constructed with ChainCoder::from_compressed (typically if the original data came from an AnsCoder). If the original ChainCoder was instead constructed with ChainCoder::from_binary then call .into_binary() instead.

Returns an error unless there’s currently an integer amount of Words in the compressed data (which will be the case if you’ve used the ChainCoder correctly, see also is_whole).

See into_remainders for usage instructions.

pub fn into_binary( self ) -> Result<(RemaindersBackend, CompressedBackend), CoderError<Self, CompressedBackend::WriteError>>
where CompressedBackend: WriteWords<Word>,

Terminates encoding if possible and returns the compressed data as a tuple (prefix, suffix).

Call this method only if the original ChainCoder used for decoding was constructed with ChainCoder::from_binary. If the original ChainCoder was instead constructed with ChainCoder::from_compressed then call .into_compressed() instead.

Returns an error unless there’s currently an integer amount of Words in both the compressed data and the remainders data (which will be the case if you’ve used the ChainCoder correctly and if the original chain coder was constructed with from_binary rather than from_compressed).

See into_remainders for usage instructions.

pub fn is_whole(&self) -> bool

Returns true iff there’s currently an integer amount of Words in the compressed data.
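
For example (a sketch assuming the default configuration with 32-bit Words and 24 bits of precision, so that each decoded symbol consumes exactly 24 bits of compressed data):

use constriction::stream::{model::DefaultLeakyQuantizer, Decode, chain::DefaultChainCoder};
use probability::distribution::Gaussian;

let data = (0..40u32).map(
    |i| i.wrapping_mul(0xad5f_b2ed).wrapping_add(0xed55_4892)
).collect::<Vec<_>>();
let mut coder = DefaultChainCoder::from_binary(data).unwrap();
assert!(coder.is_whole()); // Nothing decoded yet, so we still have whole words only.

let quantizer = DefaultLeakyQuantizer::new(-100..=100);
let _symbol = coder.decode_symbol(quantizer.quantize(Gaussian::new(0.0, 10.0))).unwrap();
// 24 bits were consumed, which is not a multiple of the 32-bit word size.
assert!(!coder.is_whole());

// After three more symbols, 4 * 24 = 96 bits (exactly 3 words) have been consumed in total.
for _ in 0..3 {
    coder.decode_symbol(quantizer.quantize(Gaussian::new(0.0, 10.0))).unwrap();
}
assert!(coder.is_whole());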

pub fn encode_symbols_reverse<S, M, I>( &mut self, symbols_and_models: I ) -> Result<(), EncoderError<Word, CompressedBackend, RemaindersBackend>>
where S: Borrow<M::Symbol>, M: EncoderModel<PRECISION>, M::Probability: Into<Word>, Word: AsPrimitive<M::Probability>, I: IntoIterator<Item = (S, M)>, I::IntoIter: DoubleEndedIterator, CompressedBackend: WriteWords<Word>, RemaindersBackend: ReadWords<Word, Stack>,

pub fn try_encode_symbols_reverse<S, M, E, I>( &mut self, symbols_and_models: I ) -> Result<(), TryCodingError<EncoderError<Word, CompressedBackend, RemaindersBackend>, E>>
where S: Borrow<M::Symbol>, M: EncoderModel<PRECISION>, M::Probability: Into<Word>, Word: AsPrimitive<M::Probability>, I: IntoIterator<Item = Result<(S, M), E>>, I::IntoIter: DoubleEndedIterator, CompressedBackend: WriteWords<Word>, RemaindersBackend: ReadWords<Word, Stack>,

pub fn encode_iid_symbols_reverse<S, M, I>( &mut self, symbols: I, model: M ) -> Result<(), EncoderError<Word, CompressedBackend, RemaindersBackend>>
where S: Borrow<M::Symbol>, M: EncoderModel<PRECISION> + Copy, M::Probability: Into<Word>, Word: AsPrimitive<M::Probability>, I: IntoIterator<Item = S>, I::IntoIter: DoubleEndedIterator, CompressedBackend: WriteWords<Word>, RemaindersBackend: ReadWords<Word, Stack>,

pub fn increase_precision<const NEW_PRECISION: usize>( self ) -> Result<ChainCoder<Word, State, CompressedBackend, RemaindersBackend, NEW_PRECISION>, CoderError<Infallible, BackendError<Infallible, RemaindersBackend::WriteError>>>
where RemaindersBackend: WriteWords<Word>,

pub fn decrease_precision<const NEW_PRECISION: usize>( self ) -> Result<ChainCoder<Word, State, CompressedBackend, RemaindersBackend, NEW_PRECISION>, CoderError<EncoderFrontendError, BackendError<Infallible, RemaindersBackend::ReadError>>>
where RemaindersBackend: ReadWords<Word, Stack>,

pub fn change_precision<const NEW_PRECISION: usize>( self ) -> Result<ChainCoder<Word, State, CompressedBackend, RemaindersBackend, NEW_PRECISION>, ChangePrecisionError<Word, RemaindersBackend>>
where RemaindersBackend: WriteWords<Word> + ReadWords<Word, Stack>,

Converts the ChainCoder into a new ChainCoder that accepts entropy models with a different fixed-point precision.

Here, “precision” refers to the number of bits with which probabilities are represented in entropy models passed to the decode_XXX methods.

The generic argument NEW_PRECISION can usually be omitted because the compiler can infer its value from the first time the new ChainCoder is used for decoding. The recommended usage pattern is to store the returned ChainCoder in a variable that shadows the old ChainCoder (since the old one gets consumed anyway), i.e., let mut coder = coder.change_precision().unwrap(). See the example below.

Failure Case

The conversion can only fail if all of the following conditions are true:

  • NEW_PRECISION < PRECISION; and
  • the ChainCoder has already been used incorrectly: it must have encoded too many symbols or used the wrong sequence of entropy models, causing it to use up just a few more bits of remainders than available (but also not exceeding the capacity enough for this to be detected during encoding).

In the event of this failure, change_precision returns an error of type ChangePrecisionError.

Example
use constriction::stream::{model::LeakyQuantizer, Decode, chain::DefaultChainCoder};

// Construct two entropy models with 24 bits and 20 bits of precision, respectively.
let continuous_distribution = probability::distribution::Gaussian::new(0.0, 10.0);
let quantizer24 = LeakyQuantizer::<_, _, u32, 24>::new(-100..=100);
let quantizer20 = LeakyQuantizer::<_, _, u32, 20>::new(-100..=100);
let distribution24 = quantizer24.quantize(continuous_distribution);
let distribution20 = quantizer20.quantize(continuous_distribution);

// Construct a `ChainCoder` and decode some data with the 24 bit precision entropy model.
let data = vec![0x0123_4567u32, 0x89ab_cdef];
let mut coder = DefaultChainCoder::from_binary(data).unwrap();
let _symbol_a = coder.decode_symbol(distribution24);

// Change `coder`'s precision and decode data with the 20 bit precision entropy model.
// The compiler can infer the new precision based on how `coder` will be used.
let mut coder = coder.change_precision().unwrap();
let _symbol_b = coder.decode_symbol(distribution20);

Trait Implementations

impl<Word, State, CompressedBackend: Clone, RemaindersBackend: Clone, const PRECISION: usize> Clone for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where Word: BitArray + Into<State> + Clone, State: BitArray + AsPrimitive<Word> + Clone,

fn clone( &self ) -> ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>

Returns a copy of the value. Read more

fn clone_from(&mut self, source: &Self)

Performs copy-assignment from source. Read more

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> Code for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where Word: BitArray + Into<State>, State: BitArray + AsPrimitive<Word>,

type Word = Word

The smallest unit of compressed data that this coder can emit or read at once. Most coders guarantee that encoding emits at most one Word per symbol (plus a constant overhead).

type State = ChainCoderHeads<Word, State, PRECISION>

The internal coder state, as returned by the method state. Read more

fn state(&self) -> Self::State

Returns the current internal state of the coder. Read more

fn encoder_maybe_full<const PRECISION: usize>(&self) -> bool
where Self: Encode<PRECISION>,

Checks if there might not be any room to encode more data. Read more

fn decoder_maybe_exhausted<const PRECISION: usize>(&self) -> bool
where Self: Decode<PRECISION>,

Checks if there might be no compressed data left for decoding. Read more

impl<Word, State, CompressedBackend: Debug, RemaindersBackend: Debug, const PRECISION: usize> Debug for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where Word: BitArray + Into<State> + Debug, State: BitArray + AsPrimitive<Word> + Debug,

fn fmt(&self, f: &mut Formatter<'_>) -> Result

Formats the value using the given formatter. Read more

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> Decode<PRECISION> for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where Word: BitArray + Into<State>, State: BitArray + AsPrimitive<Word>, CompressedBackend: ReadWords<Word, Stack>, RemaindersBackend: WriteWords<Word>,

type FrontendError = DecoderFrontendError

The error type for logical decoding errors. Read more

type BackendError = BackendError<<CompressedBackend as ReadWords<Word, Stack>>::ReadError, <RemaindersBackend as WriteWords<Word>>::WriteError>

The error type for reading in encoded data. Read more

fn decode_symbol<M>( &mut self, model: M ) -> Result<M::Symbol, DecoderError<Word, CompressedBackend, RemaindersBackend>>
where M: DecoderModel<PRECISION>, M::Probability: Into<Self::Word>, Self::Word: AsPrimitive<M::Probability>,

Decodes a single symbol using the given entropy model. Read more

fn maybe_exhausted(&self) -> bool

Checks if there might be no compressed data left for decoding. Read more

fn decode_symbols<'s, I, M>( &'s mut self, models: I ) -> DecodeSymbols<'s, Self, I::IntoIter, PRECISION>
where I: IntoIterator<Item = M> + 's, M: DecoderModel<PRECISION>, M::Probability: Into<Self::Word>, Self::Word: AsPrimitive<M::Probability>,

Decodes a sequence of symbols, using an individual entropy model for each symbol. Read more

fn try_decode_symbols<'s, I, M, E>( &'s mut self, models: I ) -> TryDecodeSymbols<'s, Self, I::IntoIter, PRECISION>
where I: IntoIterator<Item = Result<M, E>> + 's, M: DecoderModel<PRECISION>, M::Probability: Into<Self::Word>, Self::Word: AsPrimitive<M::Probability>,

Decodes a sequence of symbols from a fallible iterator over entropy models. Read more

fn decode_iid_symbols<M>( &mut self, amt: usize, model: M ) -> DecodeIidSymbols<'_, Self, M, PRECISION>
where M: DecoderModel<PRECISION> + Copy, M::Probability: Into<Self::Word>, Self::Word: AsPrimitive<M::Probability>,

Decodes amt symbols using the same entropy model for all symbols. Read more

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> Encode<PRECISION> for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where Word: BitArray + Into<State>, State: BitArray + AsPrimitive<Word>, CompressedBackend: WriteWords<Word>, RemaindersBackend: ReadWords<Word, Stack>,

type FrontendError = EncoderFrontendError

The error type for logical encoding errors. Read more

type BackendError = BackendError<<CompressedBackend as WriteWords<Word>>::WriteError, <RemaindersBackend as ReadWords<Word, Stack>>::ReadError>

The error type for writing out encoded data. Read more

fn encode_symbol<M>( &mut self, symbol: impl Borrow<M::Symbol>, model: M ) -> Result<(), EncoderError<Word, CompressedBackend, RemaindersBackend>>
where M: EncoderModel<PRECISION>, M::Probability: Into<Self::Word>, Self::Word: AsPrimitive<M::Probability>,

Encodes a single symbol with the given entropy model. Read more

fn maybe_full(&self) -> bool

Checks if there might not be any room to encode more data. Read more

fn encode_symbols<S, M>( &mut self, symbols_and_models: impl IntoIterator<Item = (S, M)> ) -> Result<(), CoderError<Self::FrontendError, Self::BackendError>>
where S: Borrow<M::Symbol>, M: EncoderModel<PRECISION>, M::Probability: Into<Self::Word>, Self::Word: AsPrimitive<M::Probability>,

Encodes a sequence of symbols, each with its individual entropy model. Read more

fn try_encode_symbols<S, M, E>( &mut self, symbols_and_models: impl IntoIterator<Item = Result<(S, M), E>> ) -> Result<(), TryCodingError<CoderError<Self::FrontendError, Self::BackendError>, E>>
where S: Borrow<M::Symbol>, M: EncoderModel<PRECISION>, M::Probability: Into<Self::Word>, Self::Word: AsPrimitive<M::Probability>,

Encodes a sequence of symbols from a fallible iterator. Read more

fn encode_iid_symbols<S, M>( &mut self, symbols: impl IntoIterator<Item = S>, model: M ) -> Result<(), CoderError<Self::FrontendError, Self::BackendError>>
where S: Borrow<M::Symbol>, M: EncoderModel<PRECISION> + Copy, M::Probability: Into<Self::Word>, Self::Word: AsPrimitive<M::Probability>,

Encodes a sequence of symbols, all with the same entropy model. Read more

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> Pos for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where Word: BitArray + Into<State>, State: BitArray + AsPrimitive<Word>, CompressedBackend: Pos, RemaindersBackend: Pos,

fn pos(&self) -> Self::Position

Returns the position in the compressed data, in units of Words. Read more

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> PosSeek for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where Word: BitArray + Into<State>, State: BitArray + AsPrimitive<Word>, CompressedBackend: PosSeek, RemaindersBackend: PosSeek,

type Position = (BackendPosition<<CompressedBackend as PosSeek>::Position, <RemaindersBackend as PosSeek>::Position>, ChainCoderHeads<Word, State, PRECISION>)

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> Seek for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where Word: BitArray + Into<State>, State: BitArray + AsPrimitive<Word>, CompressedBackend: Seek, RemaindersBackend: Seek,

fn seek(&mut self, (pos, state): Self::Position) -> Result<(), ()>

Jumps to a given position in the compressed data. Read more

Auto Trait Implementations

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> RefUnwindSafe for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where CompressedBackend: RefUnwindSafe, RemaindersBackend: RefUnwindSafe, State: RefUnwindSafe, <Word as BitArray>::NonZero: RefUnwindSafe,

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> Send for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where CompressedBackend: Send, RemaindersBackend: Send, State: Send, <Word as BitArray>::NonZero: Send,

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> Sync for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where CompressedBackend: Sync, RemaindersBackend: Sync, State: Sync, <Word as BitArray>::NonZero: Sync,

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> Unpin for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where CompressedBackend: Unpin, RemaindersBackend: Unpin, State: Unpin, <Word as BitArray>::NonZero: Unpin,

impl<Word, State, CompressedBackend, RemaindersBackend, const PRECISION: usize> UnwindSafe for ChainCoder<Word, State, CompressedBackend, RemaindersBackend, PRECISION>
where CompressedBackend: UnwindSafe, RemaindersBackend: UnwindSafe, State: UnwindSafe, <Word as BitArray>::NonZero: UnwindSafe,

Blanket Implementations

impl<T> Any for T
where T: 'static + ?Sized,

fn type_id(&self) -> TypeId

Gets the TypeId of self. Read more

impl<T> Borrow<T> for T
where T: ?Sized,

fn borrow(&self) -> &T

Immutably borrows from an owned value. Read more

impl<T> BorrowMut<T> for T
where T: ?Sized,

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value. Read more

impl<T> From<T> for T

fn from(t: T) -> T

Returns the argument unchanged.

impl<T, U> Into<U> for T
where U: From<T>,

fn into(self) -> U

Calls U::from(self).

That is, this conversion is whatever the implementation of From<T> for U chooses to do.

impl<T> ToOwned for T
where T: Clone,

type Owned = T

The resulting type after obtaining ownership.

fn to_owned(&self) -> T

Creates owned data from borrowed data, usually by cloning. Read more

fn clone_into(&self, target: &mut T)

Uses borrowed data to replace owned data, usually by cloning. Read more

impl<T, U> TryFrom<U> for T
where U: Into<T>,

type Error = Infallible

The type returned in the event of a conversion error.

fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>

Performs the conversion.

impl<T, U> TryInto<U> for T
where U: TryFrom<T>,

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

fn try_into(self) -> Result<U, <U as TryFrom<T>>::Error>

Performs the conversion.