Details: https://spacy.io/models/de#de_trf_bertbasecased_lg
File checksum: 4480207301f7eb52a41bd6cf2078486da1f74792ccbc1b9becb5227bd5b2b7c8
Provides weights and configuration for the pretrained transformer model bert-base-german-cased, published by deepset. The package uses HuggingFace's transformers implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data.
Feature | Description |
---|---|
Name | de_trf_bertbasecased_lg |
Version | 2.3.0 |
spaCy | >=2.3.0,<2.4.0 |
Model size | 386 MB |
Pipeline | sentencizer, trf_wordpiecer, trf_tok2vec |
Vectors | 0 keys, 0 unique vectors (0 dimensions) |
Sources | bert-base-german-cased (deepset) |
License | MIT |
Author | deepset (repackaged by Explosion) |
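The trf_wordpiecer component listed above segments spaCy tokens into BERT's subword units before they reach the transformer. As a rough illustration, here is a minimal sketch of the greedy longest-match-first WordPiece algorithm; the toy vocabulary is invented for the example (the real German BERT vocabulary has roughly 30,000 entries, and the actual tokenizer comes from the transformers library).

```python
def wordpiece(word, vocab):
    """Greedy longest-match-first split of `word` into subword units."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        while start < end:
            candidate = word[start:end]
            if start > 0:
                candidate = "##" + candidate  # marks a word-internal piece
            if candidate in vocab:
                piece = candidate
                break
            end -= 1  # shrink the candidate until it is in the vocabulary
        if piece is None:
            return ["[UNK]"]  # no piece matched: emit the unknown token
        pieces.append(piece)
        start = end
    return pieces

# Invented toy vocabulary, purely for illustration:
toy_vocab = {"Haus", "##tür", "##en"}
print(wordpiece("Haustüren", toy_vocab))  # → ['Haus', '##tür', '##en']
```

The transformer then assigns a hidden-state vector to each piece, and trf_tok2vec aligns those vectors back to the original spaCy tokens.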
Requires the spacy-transformers package to be installed. A CUDA-compatible GPU is advised for reasonable performance.
Installation

```
pip install spacy
python -m spacy download de_trf_bertbasecased_lg
```
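Once downloaded, the model loads like any other spaCy pipeline. A usage sketch follows; it assumes the download above succeeded, and the `doc._.trf_last_hidden_state` extension name follows the spacy-transformers 0.x API that this spaCy 2.3 model targets, so treat it as illustrative rather than definitive. The code degrades gracefully if the model is absent.

```python
try:
    import spacy
    nlp = spacy.load("de_trf_bertbasecased_lg")
except (ImportError, OSError):
    nlp = None  # spaCy or the model package is not installed

if nlp is not None:
    doc = nlp("Hier ist ein Satz auf Deutsch.")
    # Per-wordpiece hidden states from the transformer
    # (extension registered by spacy-transformers):
    print(doc._.trf_last_hidden_state.shape)
    # Contextual vectors aligned back to spaCy tokens:
    print(doc.tensor.shape)
```

These per-token vectors are what downstream pipeline components consume as features when you train them on your own data.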