BERT

Word Embedding, RNN/LSTM/GRU + Seq2Seq + Attention + Self-Attention mechanisms, and Contextual Word Embedding (Universal Sentence Embedding)
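
The core of the Self-Attention mechanism (and hence of BERT's Transformer encoder) is scaled dot-product attention: every position in a sequence attends to every other position and the output is a weighted mix of the value vectors. Below is a minimal NumPy sketch of a single attention head with no masking; the shapes, weight matrices, and toy inputs are illustrative assumptions, not code from the original article.

import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_model); Wq/Wk/Wv: projection matrices for queries, keys, values
    Q = X @ Wq
    K = X @ Wk
    V = X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # pairwise attention scores, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # each output mixes all positions

# toy usage: 4 tokens, model dimension 8
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)                  # shape (4, 8)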

Fine-Tuning
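
Fine-tuning means loading the pretrained BERT weights, attaching a small task-specific head, and continuing training on the downstream task so that all parameters adapt to it. Below is a minimal sketch, assuming the Hugging Face transformers library and a tiny made-up two-example sentiment dataset; none of this comes from the original article.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

# hypothetical two-example sentiment data, for illustration only
texts = ["this movie was great", "this movie was terrible"]
labels = torch.tensor([1, 0])

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for epoch in range(3):                        # a few epochs is typical for fine-tuning
    optimizer.zero_grad()
    outputs = model(**inputs, labels=labels)  # classification head sits on the [CLS] token
    outputs.loss.backward()                   # gradients flow through the head and all BERT layers
    optimizer.step()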

Original article: https://www.cnblogs.com/jev-0987/p/14980499.html