Checksums:

| File | SHA-256 |
| --- | --- |
| `.tar.gz` | `40bab7d231a67a17cdac169bdf3a18070d34e941302bc037841b5361d95b89f3` |
| `.whl` | `ae37a39df924099ea9e9d9d4d2912bbed9c534089c9f5f3ac0a3564ca7815521` |
Details: https://spacy.io/models/en#en_core_web_lg
English pipeline optimized for CPU. Components: tok2vec, tagger, parser, senter, ner, attribute_ruler, lemmatizer.
| Feature | Description |
| --- | --- |
| Name | `en_core_web_lg` |
| Version | `3.2.0` |
| spaCy | `>=3.2.0,<3.3.0` |
| Default Pipeline | `tok2vec`, `tagger`, `parser`, `attribute_ruler`, `lemmatizer`, `ner` |
| Components | `tok2vec`, `tagger`, `parser`, `senter`, `attribute_ruler`, `lemmatizer`, `ner` |
| Vectors | 684830 keys, 684830 unique vectors (300 dimensions) |
| Sources | OntoNotes 5 (Ralph Weischedel, Martha Palmer, Mitchell Marcus, Eduard Hovy, Sameer Pradhan, Lance Ramshaw, Nianwen Xue, Ann Taylor, Jeff Kaufman, Michelle Franchini, Mohammed El-Bachouti, Robert Belvin, Ann Houston); ClearNLP Constituent-to-Dependency Conversion (Emory University); WordNet 3.0 (Princeton University); GloVe Common Crawl (Jeffrey Pennington, Richard Socher, and Christopher D. Manning) |
| License | MIT |
| Author | Explosion |
| Model size | 741 MB |
Label Scheme (114 labels for 4 components)
| Component | Labels |
| --- | --- |
| `tagger` | `$`, `''`, `,`, `-LRB-`, `-RRB-`, `.`, `:`, `ADD`, `AFX`, `CC`, `CD`, `DT`, `EX`, `FW`, `HYPH`, `IN`, `JJ`, `JJR`, `JJS`, `LS`, `MD`, `NFP`, `NN`, `NNP`, `NNPS`, `NNS`, `PDT`, `POS`, `PRP`, `PRP$`, `RB`, `RBR`, `RBS`, `RP`, `SYM`, `TO`, `UH`, `VB`, `VBD`, `VBG`, `VBN`, `VBP`, `VBZ`, `WDT`, `WP`, `WP$`, `WRB`, `XX`, ``` `` ``` |
| `parser` | `ROOT`, `acl`, `acomp`, `advcl`, `advmod`, `agent`, `amod`, `appos`, `attr`, `aux`, `auxpass`, `case`, `cc`, `ccomp`, `compound`, `conj`, `csubj`, `csubjpass`, `dative`, `dep`, `det`, `dobj`, `expl`, `intj`, `mark`, `meta`, `neg`, `nmod`, `npadvmod`, `nsubj`, `nsubjpass`, `nummod`, `oprd`, `parataxis`, `pcomp`, `pobj`, `poss`, `preconj`, `predet`, `prep`, `prt`, `punct`, `quantmod`, `relcl`, `xcomp` |
| `senter` | `I`, `S` |
| `ner` | `CARDINAL`, `DATE`, `EVENT`, `FAC`, `GPE`, `LANGUAGE`, `LAW`, `LOC`, `MONEY`, `NORP`, `ORDINAL`, `ORG`, `PERCENT`, `PERSON`, `PRODUCT`, `QUANTITY`, `TIME`, `WORK_OF_ART` |
Accuracy

| Type | Score |
| --- | --- |
| `TOKEN_ACC` | 99.93 |
| `TOKEN_P` | 99.57 |
| `TOKEN_R` | 99.58 |
| `TOKEN_F` | 99.57 |
| `TAG_ACC` | 97.42 |
| `SENTS_P` | 91.79 |
| `SENTS_R` | 89.06 |
| `SENTS_F` | 90.41 |
| `DEP_UAS` | 92.01 |
| `DEP_LAS` | 90.22 |
| `ENTS_P` | 85.74 |
| `ENTS_R` | 84.90 |
| `ENTS_F` | 85.32 |
Installation

```bash
pip install spacy
python -m spacy download en_core_web_lg
```