Universe

spacy-transformers

spaCy pipelines for pretrained BERT, XLNet and GPT-2

This package provides spaCy model pipelines that wrap Hugging Face's transformers package, so you can use them in spaCy. The result is convenient access to state-of-the-art transformer architectures, such as BERT, GPT-2, XLNet, etc.

Example

import spacy

nlp = spacy.load("en_trf_bertbaseuncased_lg")
doc = nlp("Apple shares rose on the news. Apple pie is delicious.")
print(doc[0].similarity(doc[7]))
print(doc._.trf_last_hidden_state.shape)
Author info

Explosion

GitHub: explosion/spacy-transformers

Categories: pipeline, models, research

Submit your project

If you have a project that you want the spaCy community to make use of, you can suggest it by submitting a pull request to the spaCy website repository. The Universe database is open-source and collected in a simple JSON file. For more details on the formats and available fields, see the documentation. Looking for inspiration for your own spaCy plugin or extension? Check out the project idea label on the issue tracker.
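A Universe entry in that JSON file might look roughly like the sketch below. The field names here are illustrative assumptions based on this page's own listing (slogan, GitHub repo, author, categories); check the website repository's JSON file and the documentation for the exact schema before submitting.

```json
{
  "id": "spacy-transformers",
  "title": "spacy-transformers",
  "slogan": "spaCy pipelines for pretrained BERT, XLNet and GPT-2",
  "github": "explosion/spacy-transformers",
  "code_example": [
    "import spacy",
    "nlp = spacy.load(\"en_trf_bertbaseuncased_lg\")",
    "doc = nlp(\"Apple shares rose on the news.\")"
  ],
  "author": "Explosion",
  "category": ["pipeline", "models", "research"]
}
```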

Read the docs · JSON source