JingyanChen22/Cross-lingual-lightweight-tuning

Parameter-efficient transfer learning in a cross-lingual scenario

Beyond fine-tuning, which requires an entirely new model for every task, performing downstream tasks in a more cost-efficient way has steadily gained importance across the NLP field.
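The repository title refers to lightweight (parameter-efficient) tuning as an alternative to full fine-tuning. A minimal sketch of one common variant, low-rank adaptation of a frozen weight matrix, is shown below; this is an illustration of the general idea, not the method implemented in this repository, and all names (`W`, `A`, `B`, `d`, `r`) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 768, 8                            # hidden size, adapter rank

W = rng.standard_normal((d, d))          # frozen pretrained weight
A = rng.standard_normal((d, r)) * 0.01   # trainable down-projection
B = np.zeros((r, d))                     # trainable up-projection (init 0)

def forward(x):
    # base output plus a low-rank, task-specific correction
    return x @ W + x @ A @ B

full_params = W.size                     # what full fine-tuning would update
adapter_params = A.size + B.size         # what lightweight tuning updates

print(adapter_params / full_params)      # fraction of parameters trained
```

Only `A` and `B` would receive gradient updates per task, so each downstream task stores roughly 2% of the parameters that a fully fine-tuned copy of the model would require.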
