explosion/spacy-models: en_trf_robertabase_lg-2.3.0


Details: https://spacy.io/models/en#en_trf_robertabase_lg

File checksum: 202e68591296546ceb1c31393e336e7337a44dcbfc4fc2ed97312079ce4d50b5

Provides weights and configuration for the pretrained transformer model roberta-base, published by Facebook. The package uses HuggingFace's transformers implementation of the model. Pretrained transformer models assign detailed contextual word representations, using knowledge drawn from a large corpus of unlabelled text. You can use the contextual word representations as features in a variety of pipeline components that can be trained on your own data.
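For example, once the pipeline is downloaded (see Installation below), the contextual representations are exposed on each Doc as a token-aligned tensor. A minimal sketch; the example sentence and the printed shape are illustrative only:

```python
import spacy

# Load the pretrained RoBERTa pipeline (sentencizer, trf_wordpiecer, trf_tok2vec).
# Requires spacy-transformers to be installed alongside spaCy.
nlp = spacy.load("en_trf_robertabase_lg")

doc = nlp("Apple shares rose on the news.")

# doc.tensor holds one contextual vector per spaCy token, aligned from the
# underlying wordpiece representations (width 768 for roberta-base).
print(doc.tensor.shape)  # e.g. (7, 768) for this example sentence

# These rows can be used as features for components trained on your own data.
features = doc.tensor
```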

| Feature | Description |
| --- | --- |
| Name | en_trf_robertabase_lg |
| Version | 2.3.0 |
| spaCy | >=2.3.0,<2.4.0 |
| Model size | 278 MB |
| Pipeline | sentencizer, trf_wordpiecer, trf_tok2vec |
| Vectors | 0 keys, 0 unique vectors (0 dimensions) |
| Sources | roberta-base (Facebook) |
| License | MIT |
| Author | Facebook (repackaged by Explosion) |

Requires the spacy-transformers package to be installed. A CUDA-compatible GPU is advised for reasonable performance.

Installation

pip install spacy
python -m spacy download en_trf_robertabase_lg
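After downloading, a quick sanity check; the spacy.prefer_gpu() call is optional and only takes effect if a CUDA device and a matching CuPy build are available:

```python
import spacy

# Optional: use the GPU if one is available (recommended for this model).
spacy.prefer_gpu()

nlp = spacy.load("en_trf_robertabase_lg")
print(nlp.pipe_names)  # ['sentencizer', 'trf_wordpiecer', 'trf_tok2vec']
```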
