Details: https://spacy.io/models/en#en_trf_distilbertbaseuncased_lg
File checksum: a244180b0d66f0fd9f781547cd284156e1789621f4ef5f96c4320d8a18964ac8
Provides weights and configuration for the pretrained transformer model distilbert-base-uncased, published by Hugging Face. The package uses Hugging Face's transformers implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data.
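For example, the token-aligned vectors can be pooled into fixed-size features for an external classifier. A minimal sketch (the doc.tensor attribute and the 768-dimension width follow the spacy-transformers 0.x documentation for this model family; the pooling step is purely illustrative):

```python
import spacy

nlp = spacy.load("en_trf_distilbertbaseuncased_lg")
doc = nlp("Transformer features are contextual: the same word gets "
          "different vectors in different sentences.")

# trf_tok2vec aligns the wordpiece output to spaCy tokens and stores
# one 768-dimensional vector per token on doc.tensor.
token_features = doc.tensor          # shape: (n_tokens, 768)

# Illustrative downstream use: mean-pool into a single document vector.
doc_vector = token_features.mean(axis=0)
```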
Feature | Description |
--- | --- |
Name | en_trf_distilbertbaseuncased_lg |
Version | 2.3.0 |
spaCy | >=2.3.0,<2.4.0 |
Model size | 233 MB |
Pipeline | sentencizer, trf_wordpiecer, trf_tok2vec |
Vectors | 0 keys, 0 unique vectors (0 dimensions) |
Sources | distilbert-base-uncased (Hugging Face) |
License | MIT |
Author | Hugging Face (repackaged by Explosion) |
Requires the spacy-transformers package to be installed. A CUDA-compatible GPU is recommended for reasonable performance.
Installation
```bash
pip install spacy
python -m spacy download en_trf_distilbertbaseuncased_lg
```
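Once downloaded, the model loads like any other spaCy package. A quick sanity check (pipeline component names as listed in the table above):

```python
import spacy

# Load the downloaded transformer pipeline.
nlp = spacy.load("en_trf_distilbertbaseuncased_lg")
print(nlp.pipe_names)
# ['sentencizer', 'trf_wordpiecer', 'trf_tok2vec']
```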