Brilliant work here, Morgan - really looking forward to using this with my students on a project. Deepest apologies if I'm not doing this right - I'm very new to GitHub and also not a particularly good programmer.
It looks like the FastAI v2 team may have made a change to Tokenizer that makes it choke on the sep argument when instantiating your custom tokenizer in the fasthugs_language_model notebook.
Taking the sep argument out seemed to fix the issue at first, but then the fastai_tokenizer kept the datasets from being created. I checked the various other components and isolated the issue to the tokenizer, but wasn't able to parse the error message that resulted.
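In case it helps to reproduce, here's a minimal sketch of the kind of instantiation I mean. The wrapper class, model checkpoint, and variable names below are placeholders I made up to illustrate the call, not your actual notebook code:

```python
from fastai.text.all import Tokenizer
from transformers import AutoTokenizer

# Hypothetical stand-in for the notebook's custom tokenizer: a callable that
# maps a batch of texts to lists of tokens, which is the interface fastai's
# Tokenizer expects from its tok argument.
class TransformersTokenizer:
    def __init__(self, pretrained_name='bert-base-uncased'):  # model name assumed
        self.tok = AutoTokenizer.from_pretrained(pretrained_name)
    def __call__(self, items):
        return (self.tok.tokenize(str(t)) for t in items)

# Roughly what the notebook does, which errors on my fastai version:
# fastai_tokenizer = Tokenizer(tok=TransformersTokenizer(), rules=[], sep=' ')

# The workaround I tried: drop the sep keyword entirely
fastai_tokenizer = Tokenizer(tok=TransformersTokenizer(), rules=[])
```

With sep removed, the instantiation itself succeeds, but the later dataset-creation step is where things break for me.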
Here are the head and tail of the resulting ten or so pages of error messages (again, apologies if I'm not following protocol here):
Anyway, I hope this is helpful. Please keep up the amazing work!