Module rust_bert::distilbert
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (Sanh et al.)
Implementation of the DistilBERT language model (Sanh, Debut, Chaumond, Wolf, 2019, https://arxiv.org/abs/1910.01108).
The base model is implemented in the distilbert::DistilBertModel struct. Several language model heads have also been implemented, including:
- Masked language model: distilbert::DistilBertModelMaskedLM
- Question answering: distilbert::DistilBertForQuestionAnswering
- Sequence classification: distilbert::DistilBertModelClassifier
- Token classification (e.g. NER, POS tagging): distilbert::DistilBertForTokenClassification
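Every head is constructed the same way, from a DistilBertConfig and a path into a tch variable store, as in the full masked language model example further below. A minimal sketch for the question answering head (the configuration path is a placeholder):

```rust
use tch::{nn, Device};
use rust_bert::distilbert::{DistilBertConfig, DistilBertForQuestionAnswering};
use rust_bert::Config;

// Build the question answering head from a configuration file; weight
// loading is omitted here and shown in the full example below.
let device = Device::cuda_if_available();
let vs = nn::VarStore::new(device);
let config = DistilBertConfig::from_file("path/to/config.json");
let qa_model = DistilBertForQuestionAnswering::new(&vs.root(), &config);
```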
Model set-up and pre-trained weights loading
A full working example is provided in examples/distilbert_masked_lm.rs, run with cargo run --example distilbert_masked_lm.
The example below illustrates the set-up of a DistilBERT masked language model; the structure is similar for the other models.
All models expect the following resources:
- Configuration file expected to have a structure following the Transformers library
- Model weights are expected to have a structure and parameter names following the Transformers library. A conversion using the Python utility scripts is required to convert the .bin weights to the .ot format.
- BertTokenizer using a vocab.txt vocabulary

Pretrained models are available and can be downloaded using RemoteResources.
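As an alternative to the local files used in the example below, the same three resources can be declared remotely. A sketch assuming the DISTIL_BERT pretrained identifiers exposed by the resource structs listed at the bottom of this page:

```rust
use rust_bert::distilbert::{
    DistilBertConfigResources, DistilBertModelResources, DistilBertVocabResources,
};
use rust_bert::resources::{RemoteResource, Resource};

// Point the three required resources at the pretrained DistilBERT files;
// they are downloaded and cached locally on the first call to get_local_path.
let config_resource = Resource::Remote(RemoteResource::from_pretrained(
    DistilBertConfigResources::DISTIL_BERT,
));
let vocab_resource = Resource::Remote(RemoteResource::from_pretrained(
    DistilBertVocabResources::DISTIL_BERT,
));
let weights_resource = Resource::Remote(RemoteResource::from_pretrained(
    DistilBertModelResources::DISTIL_BERT,
));
```

The rest of the set-up is identical to the local-resource example that follows.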
use std::path::PathBuf;

use tch::{nn, Device};
use rust_bert::distilbert::{
    DistilBertConfig, DistilBertConfigResources, DistilBertModelMaskedLM,
    DistilBertModelResources, DistilBertVocabResources,
};
use rust_bert::resources::{LocalResource, RemoteResource, Resource};
use rust_bert::Config;
use rust_tokenizers::tokenizer::BertTokenizer;

// Resources pointing at a local copy of the configuration, vocabulary and
// converted weights (see the conversion note above).
let config_resource = Resource::Local(LocalResource {
    local_path: PathBuf::from("path/to/config.json"),
});
let vocab_resource = Resource::Local(LocalResource {
    local_path: PathBuf::from("path/to/vocab.txt"),
});
let weights_resource = Resource::Local(LocalResource {
    local_path: PathBuf::from("path/to/model.ot"),
});
let config_path = config_resource.get_local_path()?;
let vocab_path = vocab_resource.get_local_path()?;
let weights_path = weights_resource.get_local_path()?;

// Set up the device, variable store, tokenizer and model, then load the weights.
let device = Device::cuda_if_available();
let mut vs = nn::VarStore::new(device);
let tokenizer: BertTokenizer =
    BertTokenizer::from_file(vocab_path.to_str().unwrap(), true, true)?;
let config = DistilBertConfig::from_file(config_path);
let bert_model = DistilBertModelMaskedLM::new(&vs.root(), &config);
vs.load(weights_path)?;
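Once the weights are loaded, the model can be run on tokenized input. The sketch below assumes the set-up above and the forward_t signature of the rust_bert versions that return the DistilBertMaskedLMOutput container listed below; the exact signature and field names may differ in your version:

```rust
use rust_tokenizers::tokenizer::{Tokenizer, TruncationStrategy};
use tch::{no_grad, Tensor};

// Tokenize the input and assemble a (batch size, sequence length) tensor.
let input = ["Looks like one [MASK] is missing"];
let tokenized_input =
    tokenizer.encode_list(&input, 128, &TruncationStrategy::LongestFirst, 0);
let token_ids = tokenized_input
    .iter()
    .map(|input| Tensor::of_slice(&input.token_ids))
    .collect::<Vec<_>>();
let input_tensor = Tensor::stack(&token_ids, 0).to(device);

// Forward pass without gradient tracking.
let model_output =
    no_grad(|| bert_model.forward_t(Some(input_tensor), None, None, false))?;
// prediction_scores has shape (batch size, sequence length, vocabulary size).
let _scores = model_output.prediction_scores;
```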
Structs
| Struct | Description |
| --- | --- |
| DistilBertConfig | DistilBERT model configuration |
| DistilBertConfigResources | DistilBERT pretrained model config files |
| DistilBertForQuestionAnswering | DistilBERT for question answering |
| DistilBertForTokenClassification | DistilBERT for token classification (e.g. NER, POS) |
| DistilBertMaskedLMOutput | Container for the DistilBERT masked LM model output |
| DistilBertModel | DistilBERT base model |
| DistilBertModelClassifier | DistilBERT for sequence classification |
| DistilBertModelMaskedLM | DistilBERT for masked language modeling |
| DistilBertModelResources | DistilBERT pretrained model weight files |
| DistilBertQuestionAnsweringOutput | Container for the DistilBERT question answering model output |
| DistilBertSequenceClassificationOutput | Container for the DistilBERT sequence classification model output |
| DistilBertTokenClassificationOutput | Container for the DistilBERT token classification model output |
| DistilBertVocabResources | DistilBERT pretrained model vocab files |