Reading Notes: Improving Language Understanding by Generative Pre-Training

This is part of a four-part series on pretrained language representations, an application of transfer learning to NLP. The posts are best read in order; after finishing them, you should have a clear picture of this research direction. (1) ELMo: Deep contextualized word representations (2) Universal Language Model Fine-tuning for Text Classification (3) OpenAI GPT