Struct rust_bert::mbart::MBartConfig

pub struct MBartConfig {
    pub vocab_size: i64,
    pub max_position_embeddings: i64,
    pub encoder_layers: i64,
    pub encoder_attention_heads: i64,
    pub encoder_ffn_dim: i64,
    pub encoder_layerdrop: f64,
    pub decoder_layers: i64,
    pub decoder_ffn_dim: i64,
    pub decoder_attention_heads: i64,
    pub decoder_layerdrop: f64,
    pub is_encoder_decoder: Option<bool>,
    pub activation_function: Option<Activation>,
    pub d_model: i64,
    pub dropout: f64,
    pub activation_dropout: f64,
    pub attention_dropout: f64,
    pub classifier_dropout: Option<f64>,
    pub scale_embedding: Option<bool>,
    pub bos_token_id: Option<i64>,
    pub eos_token_id: Option<i64>,
    pub pad_token_id: Option<i64>,
    pub forced_eos_token_id: Option<i64>,
    pub decoder_start_token_id: Option<i64>,
    pub id2label: Option<HashMap<i64, String>>,
    pub label2id: Option<HashMap<String, i64>>,
    pub init_std: f64,
    pub min_length: Option<i64>,
    pub no_repeat_ngram_size: Option<i64>,
    pub normalize_embedding: Option<bool>,
    pub num_hidden_layers: i64,
    pub output_attentions: Option<bool>,
    pub output_hidden_states: Option<bool>,
    pub output_past: Option<bool>,
    pub static_position_embeddings: Option<bool>,
}

MBART model configuration

Defines the MBART model architecture (e.g. number of layers, hidden layer size, label mapping…)
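Since the struct mirrors the Transformers configuration file format, a config.json for an MBART model typically looks like the fragment below. The values shown are illustrative, not those of any particular pretrained checkpoint; fields with Option types in the struct may be omitted, while the non-optional fields (shown here) must all be present for parsing to succeed.

```json
{
  "vocab_size": 250027,
  "max_position_embeddings": 1024,
  "encoder_layers": 12,
  "encoder_attention_heads": 16,
  "encoder_ffn_dim": 4096,
  "encoder_layerdrop": 0.0,
  "decoder_layers": 12,
  "decoder_ffn_dim": 4096,
  "decoder_attention_heads": 16,
  "decoder_layerdrop": 0.0,
  "d_model": 1024,
  "dropout": 0.1,
  "activation_dropout": 0.0,
  "attention_dropout": 0.0,
  "init_std": 0.02,
  "num_hidden_layers": 12,
  "activation_function": "gelu",
  "is_encoder_decoder": true,
  "bos_token_id": 0,
  "pad_token_id": 1,
  "eos_token_id": 2
}
```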


Trait Implementations

Clone
    clone: Returns a copy of the value.
    clone_from: Performs copy-assignment from source.

Config
    from_file: Loads a Config object from a JSON file. The format is expected to be aligned with the Transformers library configuration files for each model. Parsing fails if non-optional keys expected by the model are missing.
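As a sketch of how this configuration is typically loaded (the path is illustrative, standing in for a config.json downloaded from a Transformers-style model repository; `Config` is the crate's configuration trait providing `from_file`):

```rust
use rust_bert::mbart::MBartConfig;
use rust_bert::Config; // trait that provides `from_file`

fn main() {
    // Illustrative path to a Transformers-style configuration file.
    let config_path = "path/to/mbart/config.json";

    // `from_file` parses the JSON; it fails if a non-optional key
    // expected by the model (e.g. `vocab_size`, `d_model`) is missing.
    let config = MBartConfig::from_file(config_path);

    println!("encoder layers: {}", config.encoder_layers);
}
```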

Debug
    fmt: Formats the value using the given formatter.

Deserialize<'de>
    deserialize: Deserializes this value from the given Serde deserializer.

Serialize
    serialize: Serializes this value into the given Serde serializer.

Auto Trait Implementations

Blanket Implementations

Any
    type_id: Gets the TypeId of self.

Borrow<T>
    borrow: Immutably borrows from an owned value.

BorrowMut<T>
    borrow_mut: Mutably borrows from an owned value.

From<T>
    from: Performs the conversion.

Instrument
    instrument: Instruments this type with the provided Span, returning an Instrumented wrapper.
    in_current_span: Instruments this type with the current Span, returning an Instrumented wrapper.

Into<U>
    into: Performs the conversion.

Pointable
    ALIGN: The alignment of pointer.
    Init: The type for initializers.
    init: Initializes a pointer with the given initializer.
    deref: Dereferences the given pointer.
    deref_mut: Mutably dereferences the given pointer.
    drop: Drops the object pointed to by the given pointer.

Same<T>
    Output: Should always be Self.

ToOwned
    Owned: The resulting type after obtaining ownership.
    to_owned: Creates owned data from borrowed data, usually by cloning.
    clone_into: Uses borrowed data to replace owned data, usually by cloning. (🔬 nightly-only experimental API: toowned_clone_into)

TryFrom<U>
    Error: The type returned in the event of a conversion error.
    try_from: Performs the conversion.

TryInto<U>
    Error: The type returned in the event of a conversion error.
    try_into: Performs the conversion.