Module bert

BERT (Bidirectional Encoder Representations from Transformers)

Implements BERT as described in “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding” (Devlin et al., 2018). The module provides the encoder together with task-specific heads:

  • Token embeddings
  • Segment embeddings
  • Position embeddings
  • Transformer encoder layers
  • Masked language modeling head
  • Next sentence prediction head
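
Per the paper, the token, segment, and position embeddings are summed elementwise at each position, and the result is layer-normalized before entering the encoder. The toy sketch below illustrates that combination on plain `Vec<f32>` buffers; the sizes, table contents, and helper names are made up for illustration and are not this module's API (the real `BertEmbeddings` operates on proper tensors).

```rust
// Toy illustration of BERT's input embedding: for each position, the token,
// segment, and position embedding vectors are summed, then layer-normalized.
// All names and shapes here are illustrative, not this module's signatures.

const HIDDEN: usize = 4; // toy hidden size; BERT-base uses 768

/// Look up row `id` of a flat [rows, HIDDEN] embedding table.
fn lookup(table: &[f32], id: usize) -> &[f32] {
    &table[id * HIDDEN..(id + 1) * HIDDEN]
}

/// LayerNorm over one hidden vector (gamma = 1, beta = 0 for simplicity).
fn layer_norm(v: &mut [f32]) {
    let n = v.len() as f32;
    let mean = v.iter().sum::<f32>() / n;
    let var = v.iter().map(|x| (x - mean).powi(2)).sum::<f32>() / n;
    let inv_std = 1.0 / (var + 1e-12).sqrt();
    for x in v.iter_mut() {
        *x = (*x - mean) * inv_std;
    }
}

fn main() {
    // Tiny embedding tables: 3 token ids, 2 segment ids, 8 positions.
    let tok: Vec<f32> = (0..3 * HIDDEN).map(|i| i as f32 * 0.1).collect();
    let seg: Vec<f32> = (0..2 * HIDDEN).map(|i| i as f32 * 0.01).collect();
    let pos: Vec<f32> = (0..8 * HIDDEN).map(|i| i as f32 * 0.001).collect();

    let input_ids = [1usize, 2, 0];      // token ids for a 3-token sequence
    let token_type_ids = [0usize, 0, 1]; // segments A, A, B

    for (p, (&t, &s)) in input_ids.iter().zip(&token_type_ids).enumerate() {
        let mut e: Vec<f32> = lookup(&tok, t)
            .iter()
            .zip(lookup(&seg, s))
            .zip(lookup(&pos, p))
            .map(|((a, b), c)| a + b + c) // sum of the three embeddings
            .collect();
        layer_norm(&mut e); // normalize before the encoder stack
        println!("position {p}: {e:?}");
    }
}
```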

Structs

BertConfig
BERT configuration
BertEmbeddings
BERT embeddings layer
BertForMaskedLM
BERT for Masked Language Modeling
BertForSequenceClassification
BERT for Sequence Classification
BertForTokenClassification
BERT for Token Classification (NER, POS tagging)
BertModel
Base BERT model (embeddings + transformer encoder)
BertOutput
BERT output
BertPooler
BERT pooler (for classification tasks)
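
As a rough orientation for how `BertPooler` feeds the classification heads: in the paper's formulation, the pooler takes the encoder's hidden vector for the first ([CLS]) token and applies a dense layer followed by tanh. Below is a minimal, self-contained toy of that step; the weights, shapes, and variable names are illustrative assumptions, not this module's actual signatures.

```rust
// Toy sketch of the pooler step: dense layer + tanh over the [CLS] vector.
// Identity weights and zero bias are used so the example stays readable.

const HIDDEN: usize = 4; // toy hidden size; BERT-base uses 768

fn main() {
    // Pretend encoder output for a 3-token sequence: [seq_len][HIDDEN].
    let hidden_states = [
        [0.5f32, -0.2, 0.1, 0.8], // [CLS] token
        [0.3, 0.3, -0.1, 0.0],
        [0.9, -0.5, 0.2, 0.4],
    ];

    // Dense layer: identity weights and zero bias, for simplicity.
    let w = [
        [1.0f32, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ];
    let b = [0.0f32; HIDDEN];

    let cls = &hidden_states[0];
    let pooled: Vec<f32> = (0..HIDDEN)
        .map(|i| {
            let z: f32 = (0..HIDDEN).map(|j| w[i][j] * cls[j]).sum::<f32>() + b[i];
            z.tanh() // the pooler's activation in the original BERT
        })
        .collect();

    // `pooled` is the kind of vector a head such as
    // BertForSequenceClassification would pass to its final linear classifier.
    println!("pooled [CLS] representation: {pooled:?}");
}
```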