spacy-transformers with GPT-2 #9491

Oct 17, 2021

You can use other models by passing the model name (from the Hugging Face Hub) to the Transformer model, as described in the docs. In code, that looks something like:

import spacy
from spacy_transformers import Transformer, TransformerModel
from spacy_transformers.annotation_setters import null_annotation_setter
from spacy_transformers.span_getters import get_doc_spans

nlp = spacy.blank("en")

# Build a Transformer component around any model name from the Hugging Face
# Hub, e.g. "gpt2" instead of "bert-base-cased".
trf = Transformer(
    nlp.vocab,
    TransformerModel(
        "bert-base-cased",
        get_spans=get_doc_spans,  # run the model over whole docs
        tokenizer_config={"use_fast": True},
    ),
    set_extra_annotations=null_annotation_setter,  # set no extra custom annotations
    max_batch_items=4096,  # maximum size of a padded batch
)

You can also change this in the config.
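
For example, the relevant block of a training config might look roughly like this. This is a sketch of the transformer component settings: "gpt2" stands in for whatever Hugging Face Hub model name you want to use, and the @architectures version string depends on your installed spacy-transformers version.

[components.transformer]
factory = "transformer"
max_batch_items = 4096

[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v3"
name = "gpt2"
tokenizer_config = {"use_fast": true}

[components.transformer.model.get_spans]
@span_getters = "spacy-transformers.doc_spans.v1"

[components.transformer.set_extra_annotations]
@annotation_setters = "spacy-transformers.null_annotation_setter.v1"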

Answer selected by Btibert3
Labels: feat / transformer