201015 Fine-Tuning Pre-trained Language Model with Weak Supervision.md

https://arxiv.org/abs/2010.07835

Fine-Tuning Pre-trained Language Model with Weak Supervision: A Contrastive-Regularized Self-Training Approach (Yue Yu, Simiao Zuo, Haoming Jiang, Wendi Ren, Tuo Zhao, Chao Zhang)

Weak-supervision transfer learning: the model is fine-tuned on noisy labels generated by applying hand-written rules to the text.
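The rule-based labeling step can be sketched as follows. This is a minimal illustration, not the paper's method: the keyword rules and the majority-vote aggregation are hypothetical examples of how weak labels are typically produced before fine-tuning.

```python
# Minimal sketch of rule-based weak labeling (hypothetical rules, not from the paper).
# Each rule returns a class label or abstains; labels are aggregated by majority vote.

ABSTAIN = None

def rule_positive(text):
    # Hypothetical keyword rule for the positive class (label 1).
    return 1 if any(w in text.lower() for w in ("great", "excellent")) else ABSTAIN

def rule_negative(text):
    # Hypothetical keyword rule for the negative class (label 0).
    return 0 if any(w in text.lower() for w in ("terrible", "awful")) else ABSTAIN

RULES = [rule_positive, rule_negative]

def weak_label(text):
    # Majority vote over rules that fired; abstain if no rule matched.
    votes = [v for v in (r(text) for r in RULES) if v is not ABSTAIN]
    if not votes:
        return ABSTAIN
    return max(set(votes), key=votes.count)

# Texts with a weak label would then be used as (noisy) training data
# for fine-tuning a pre-trained language model.
corpus = ["This movie was great", "An awful experience", "It happened on Tuesday"]
weakly_labeled = [(t, weak_label(t)) for t in corpus if weak_label(t) is not ABSTAIN]
```

In the paper's setting, such noisy labels are the only supervision; the contrastive regularization and self-training are what make fine-tuning robust to the label noise.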

#transfer #weak_supervision #language_model