DisplaCy UserWarning: [W006] when evaluating NER model #7672
-
I put together a simple NER model following the Project Templates. Everything seems to be running fine except for the evaluation step, which raises the [W006] warning from displacy. This is the command I'm executing from my .yml file (it's essentially the evaluate command from the template, with the default paths):
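```yaml
# Essentially the template's default evaluate command (paths assumed)
- name: evaluate
  help: "Evaluate the model and export metrics"
  script:
    - "python -m spacy evaluate training/model-best corpus/dev.spacy --output training/metrics.json"
```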
As suggested in the warning message, I checked the `doc.ents` property manually, and it came back empty.
That being the case, I decided to have a look at my data, which was annotated with Label Studio and exported in the CoNLL 2003 NER format:
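The export has one token per line with the NER tag in the last column, along these lines (a generic CoNLL 2003 sample, not my actual data):

```
U.N. NNP I-NP I-ORG
official NN I-NP O
Ekeus NNP I-NP I-PER
heads VBZ I-VP O
for IN I-PP O
Baghdad NNP I-NP I-LOC
. . O O
```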
I used the `spacy convert` command to turn it into spaCy's binary `.spacy` format.
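This is roughly how I'm checking the converted data with displacy (paths follow the default project layout):

```python
import spacy
from spacy import displacy
from spacy.tokens import DocBin

# A blank pipeline of the right language is enough to deserialize the DocBin
nlp = spacy.blank("en")
doc_bin = DocBin().from_disk("corpus/dev.spacy")  # assumed path
docs = list(doc_bin.get_docs(nlp.vocab))

# Renders the manual annotations stored in the converted data
displacy.render(docs, style="ent")
```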
-
What do the evaluation results look like for `NER` and `NER per type` before you run into this warning? It sounds like your model isn't predicting any entities for these texts.

What do the evaluation steps look like while training? Here's what it looks like for a very simple toy example, where in step 3 you can see the score going up as the model improves: https://github.com/explosion/spaCy/tree/de4f4c9b8a46395b5f2be5fa80afd746c4a7f3cb/examples#-get-started-with-a-demo-project

Your check with displacy is only loading the reference docs (with your manual annotation), not the predicted annotation using the model. Instead, try something like:

```python
docs = list(nlp.pipe([doc.text for doc in doc_bin.get_docs(nlp.vocab)]))
```

There is currently a bug in that conversion related to …
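Fleshed out as a runnable check (the paths here are assumptions based on the default project layout, adjust as needed):

```python
import spacy
from spacy import displacy
from spacy.tokens import DocBin

# Load the trained pipeline, not a blank one (path assumed)
nlp = spacy.load("training/model-best")
doc_bin = DocBin().from_disk("corpus/dev.spacy")  # assumed path

# Re-run the raw texts through the model to get *predicted* entities
docs = list(nlp.pipe([doc.text for doc in doc_bin.get_docs(nlp.vocab)]))

for doc in docs:
    print([(ent.text, ent.label_) for ent in doc.ents])

# W006 is raised when a doc passed to displacy has no entities at all
displacy.render(docs, style="ent")
```

If `doc.ents` is empty for every doc here, the problem is the model's predictions, not the visualization.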
-
For people using the rel_component tutorial who run into this issue, see these discussions: …