Struct rust_bert::bert::BertLayer

pub struct BertLayer { /* private fields */ }

BERT Layer

Layer used in BERT encoders. It is made of the following blocks (a sketch of how they compose follows the list):

  • attention: self-attention BertAttention layer
  • cross_attention: (optional) cross-attention BertAttention layer (if the model is used as a decoder)
  • is_decoder: flag indicating if the model is used as a decoder
  • intermediate: BertIntermediate intermediate layer
  • output: BertOutput output layer
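
When used in an encoder, the forward pass applies self-attention, then the intermediate feed-forward block, then the output block (projection back to hidden_size, residual connection and layer normalization); when is_decoder is set and encoder hidden states are provided, cross-attention is applied to the self-attention output before the intermediate block. A minimal sketch of this composition, using hypothetical closure-based signatures rather than the actual rust_bert block APIs:

use tch::Tensor;

// Conceptual composition only: the real blocks also take attention masks and a
// train flag, and return attention weights alongside the hidden states.
fn bert_layer_forward(
    attention: impl Fn(&Tensor) -> Tensor,       // self-attention (and optional cross-attention)
    intermediate: impl Fn(&Tensor) -> Tensor,    // feed-forward expansion to the intermediate size
    output: impl Fn(&Tensor, &Tensor) -> Tensor, // projection + residual connection + layer norm
    hidden_states: &Tensor,
) -> Tensor {
    let attention_output = attention(hidden_states);
    let intermediate_output = intermediate(&attention_output);
    output(&intermediate_output, &attention_output)
}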

Implementations§

Build a new BertLayer

Arguments
  • p - Variable store path for the root of the BERT model
  • config - BertConfig object defining the model architecture
Example
use rust_bert::bert::{BertConfig, BertLayer};
use rust_bert::Config;
use std::path::Path;
use tch::{nn, Device};

let config_path = Path::new("path/to/config.json");
let device = Device::Cpu;
let p = nn::VarStore::new(device);
let config = BertConfig::from_file(config_path);
let layer: BertLayer = BertLayer::new(&p.root(), &config);

Forward pass through the layer

Arguments
  • hidden_states - input tensor of shape (batch size, sequence_length, hidden_size).
  • mask - Optional attention mask of shape (batch size, sequence_length). Masked positions have value 0, non-masked positions have value 1. If None, all positions are attended (equivalent to a mask of ones); see the mask-construction sketch after this list.
  • encoder_hidden_states - Optional encoder hidden states of shape (batch size, encoder_sequence_length, hidden_size). If the model is configured as a decoder and encoder_hidden_states is provided, they are used as keys and values in the cross-attention layer (the query comes from the decoder).
  • encoder_mask - Optional encoder attention mask of shape (batch size, encoder_sequence_length). If the model is configured as a decoder and encoder_hidden_states is provided, it masks the encoder values; positions with value 0 are masked.
  • train - boolean flag to turn on/off the dropout layers in the model. Should be set to false for inference.
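
For reference on the mask convention above (1 = attend, 0 = masked), a padding mask can be derived from per-example sequence lengths. The sketch below assumes a recent tch version (Tensor::from_slice); padding_mask is a hypothetical helper, not part of rust_bert:

use tch::{Device, Kind, Tensor};

// Hypothetical helper: builds a (batch_size, max_len) mask with 1 for real tokens
// and 0 for padding positions beyond each sequence's length.
fn padding_mask(lengths: &[i64], max_len: i64, device: Device) -> Tensor {
    let positions = Tensor::arange(max_len, (Kind::Int64, device)); // [max_len]
    let lengths = Tensor::from_slice(lengths).to_device(device).unsqueeze(-1); // [batch, 1]
    positions.unsqueeze(0).lt_tensor(&lengths).to_kind(Kind::Int64) // [batch, max_len]
}

// padding_mask(&[3, 5], 5, Device::Cpu) yields [[1, 1, 1, 0, 0], [1, 1, 1, 1, 1]]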
Returns
  • BertLayerOutput containing:
    • hidden_state - Tensor of shape (batch size, sequence_length, hidden_size)
    • attention_scores - Option<Tensor> containing the self-attention weights, of shape (batch size, number of attention heads, sequence_length, sequence_length)
    • cross_attention_scores - Option<Tensor> containing the cross-attention weights (decoder configurations only), of shape (batch size, number of attention heads, sequence_length, encoder_sequence_length)
Example
use rust_bert::bert::{BertConfig, BertLayer};
use rust_bert::Config;
use std::path::Path;
use tch::{nn, no_grad, Device, Kind, Tensor};

let config_path = Path::new("path/to/config.json");
let device = Device::Cpu;
let vs = nn::VarStore::new(device);
let config = BertConfig::from_file(config_path);
let layer: BertLayer = BertLayer::new(&vs.root(), &config);
let (batch_size, sequence_length, hidden_size) = (64, 128, 512);
let input_tensor = Tensor::rand(
    &[batch_size, sequence_length, hidden_size],
    (Kind::Float, device),
);
let mask = Tensor::ones(&[batch_size, sequence_length], (Kind::Int64, device)); // 1 = position is attended

let layer_output = no_grad(|| layer.forward_t(&input_tensor, Some(&mask), None, None, false));
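
The returned BertLayerOutput exposes the fields listed under Returns above; the updated hidden state keeps the input dimensions and can be fed to the next layer. A short continuation of the example, assuming the setup above:

// hidden_state keeps the (batch size, sequence_length, hidden_size) shape
assert_eq!(layer_output.hidden_state.size(), [64, 128, 512]);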
