Hi, I really like and appreciate what you guys are doing.
Is it possible to apply the method to other languages, e.g., Chinese?
The method uses a general-purpose language model, so I was curious about the difference between a domain-specific fine-tuned language model and the MASKER models described in the paper, and how each affects OOD detection and downstream-task performance. Have you tried this yet?
Really looking forward to your response. Thanks.
Hi, I'm not an expert on other languages, but I think MASKER could be applied to other languages as long as pre-trained language models are available (e.g., a Chinese version of BERT), since implementing masked keyword regularization on top of such models would be straightforward.
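To illustrate why the language barely matters here, below is a minimal, language-agnostic sketch of the keyword-masking step only (the function name and the keyword set are hypothetical, not from the MASKER codebase; the actual method additionally trains a reconstruction objective over the masked positions using the pre-trained LM's mask token):

```python
def mask_keywords(tokens, keywords, mask_token="[MASK]"):
    """Replace keyword tokens with the mask token, returning the masked
    sequence and the (position, original token) pairs that a
    masked-keyword reconstruction loss would be computed over."""
    masked, targets = [], []
    for i, tok in enumerate(tokens):
        if tok in keywords:
            masked.append(mask_token)
            targets.append((i, tok))
        else:
            masked.append(tok)
    return masked, targets

# Toy example: the step is identical for Chinese tokens and English ones,
# given a tokenizer for that language (e.g., a Chinese BERT tokenizer).
tokens = ["这", "部", "电影", "很", "好看"]
keywords = {"电影", "好看"}  # hypothetical domain keywords
masked, targets = mask_keywords(tokens, keywords)
# masked  -> ["这", "部", "[MASK]", "很", "[MASK]"]
# targets -> [(2, "电影"), (4, "好看")]
```

In other words, the only language-specific pieces are the tokenizer and the pre-trained model checkpoint; the masking and regularization logic itself is unchanged.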