HuggingFace warnings / errors on experiment runs - e.g., "... model 'OptimizedModule' is not supported ..." #670
Another potential compatibility issue with the recent HuggingFace updates - this warning is reported at the start of training:
An additional warning is being reported by HF for recent experiments. It occurs at the end of preprocessing / start of training.
ClearML warning at the start of the training step:
Torch warning at the start of training:
Warning at the end of training when the model is being saved:
I am seeing this warning during mid-training evals:
Warning at the start of training, just before the
HuggingFace is reporting an error at the start of the test step during an experiment run:
[ERROR|base.py:1149] 2025-02-28 12:05:52,437 >> The model 'OptimizedModule' is not supported for . Supported models are ['BartForConditionalGeneration', 'BigBirdPegasusForConditionalGeneration', 'BlenderbotForConditionalGeneration', 'BlenderbotSmallForConditionalGeneration', 'EncoderDecoderModel', 'FSMTForConditionalGeneration', 'GPTSanJapaneseForConditionalGeneration', 'LEDForConditionalGeneration', 'LongT5ForConditionalGeneration', 'M2M100ForConditionalGeneration', 'MarianMTModel', 'MBartForConditionalGeneration', 'MT5ForConditionalGeneration', 'MvpForConditionalGeneration', 'NllbMoeForConditionalGeneration', 'PegasusForConditionalGeneration', 'PegasusXForConditionalGeneration', 'PLBartForConditionalGeneration', 'ProphetNetForConditionalGeneration', 'Qwen2AudioForConditionalGeneration', 'SeamlessM4TForTextToText', 'SeamlessM4Tv2ForTextToText', 'SwitchTransformersForConditionalGeneration', 'T5ForConditionalGeneration', 'UMT5ForConditionalGeneration', 'XLMProphetNetForConditionalGeneration'].
However, the test step appears to work successfully for this experiment despite the error. The model is set to 'facebook/nllb-200-distilled-1.3B'.
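A likely explanation (an assumption, not a confirmed diagnosis) is that torch.compile() wraps the model in torch._dynamo's OptimizedModule, so when the translation pipeline runs check_model_type() it sees the wrapper's class name rather than the underlying NLLB model class and logs the "not supported" error, even though generation still works because the wrapper forwards calls to the original model. A minimal sketch of that behaviour, using the model named in this issue:

```python
# Sketch of the suspected cause; the pipeline task and workaround are assumptions.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline

model_name = "facebook/nllb-200-distilled-1.3B"  # model reported in this issue
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

compiled = torch.compile(model)
print(type(compiled).__name__)  # -> "OptimizedModule", the class name in the error

# Building a translation pipeline from the compiled wrapper is what we suspect
# triggers the logged error; inference still runs because the wrapper forwards
# attribute access and calls to the underlying model.
translator = pipeline("translation", model=compiled, tokenizer=tokenizer)

# Possible workaround (untested against this repo): unwrap the original module
# before handing the model to the pipeline, so the class-name check passes.
translator = pipeline("translation", model=compiled._orig_mod, tokenizer=tokenizer)
```

If this is the cause, the error should be harmless noise, which would match the observation above that the test step still completes successfully.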