## A roundup of BERT resources ##

Chinese BERT with whole-word masking (wwm):

https://github.com/ymcui/Chinese-BERT-wwm
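As a quick illustration (not taken from the repo itself), the released weights load through the standard `transformers` API; the hub name `hfl/chinese-bert-wwm-ext` is the one given in that repo's README, so treat it as an assumption if the naming has changed:

```python
import torch
from transformers import BertTokenizer, BertModel

# 'hfl/chinese-bert-wwm-ext' is the whole-word-masking checkpoint the
# repo above publishes on the Hugging Face hub (name assumed from its
# README; the smaller variant is 'hfl/chinese-bert-wwm').
tokenizer = BertTokenizer.from_pretrained('hfl/chinese-bert-wwm-ext')
model = BertModel.from_pretrained('hfl/chinese-bert-wwm-ext')

inputs = tokenizer("哈工大讯飞联合实验室", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768])
```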

An explanation of how RoBERTa works:

https://zhuanlan.zhihu.com/p/103205929
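To make the difference concrete: RoBERTa keeps BERT's architecture but drops next-sentence prediction, trains longer with dynamic masking, and uses byte-level BPE instead of WordPiece. A minimal sketch of loading it, assuming the stock `roberta-base` checkpoint:

```python
from transformers import RobertaTokenizer, RobertaModel

tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
model = RobertaModel.from_pretrained('roberta-base')

# Byte-level BPE: 'Ġ' marks a leading space, unlike BERT's '##' pieces.
print(tokenizer.tokenize("Hello world"))  # ['Hello', 'Ġworld']

# No NSP objective, so RoBERTa does not use token_type_ids.
enc = tokenizer("Hello world", return_tensors="pt")
out = model(**enc)
print(out.last_hidden_state.shape)        # (1, seq_len, 768)
```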

The pytorch-transformers repository (now Hugging Face Transformers):

https://github.com/huggingface/transformers

The README's table pairing each model class with its tokenizer and a pretrained-weights shortcut, reproduced with the imports it needs:

```python
from transformers import (
    BertModel, BertTokenizer, OpenAIGPTModel, OpenAIGPTTokenizer,
    GPT2Model, GPT2Tokenizer, CTRLModel, CTRLTokenizer,
    TransfoXLModel, TransfoXLTokenizer, XLNetModel, XLNetTokenizer,
    XLMModel, XLMTokenizer, DistilBertModel, DistilBertTokenizer,
    RobertaModel, RobertaTokenizer, XLMRobertaModel, XLMRobertaTokenizer)

#          Model          | Tokenizer          | Pretrained weights shortcut
MODELS = [(BertModel,       BertTokenizer,       'bert-base-uncased'),
          (OpenAIGPTModel,  OpenAIGPTTokenizer,  'openai-gpt'),
          (GPT2Model,       GPT2Tokenizer,       'gpt2'),
          (CTRLModel,       CTRLTokenizer,       'ctrl'),
          (TransfoXLModel,  TransfoXLTokenizer,  'transfo-xl-wt103'),
          (XLNetModel,      XLNetTokenizer,      'xlnet-base-cased'),
          (XLMModel,        XLMTokenizer,        'xlm-mlm-enfr-1024'),
          (DistilBertModel, DistilBertTokenizer, 'distilbert-base-cased'),
          (RobertaModel,    RobertaTokenizer,    'roberta-base'),
          (XLMRobertaModel, XLMRobertaTokenizer, 'xlm-roberta-base'),
         ]
```
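In the README, this table is driven by a loop along these lines, which encodes the same sentence with every model (the exact snippet may differ across transformers versions):

```python
import torch

for model_class, tokenizer_class, pretrained_weights in MODELS:
    # Download and cache the vocabulary and pretrained weights.
    tokenizer = tokenizer_class.from_pretrained(pretrained_weights)
    model = model_class.from_pretrained(pretrained_weights)

    # Encode text; add_special_tokens adds [CLS]/[SEP] where applicable.
    input_ids = torch.tensor(
        [tokenizer.encode("Here is some text to encode", add_special_tokens=True)])
    with torch.no_grad():
        last_hidden_states = model(input_ids)[0]  # (batch, seq_len, hidden)
```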

A visual guide to BERT:

http://jalammar.github.io/a-visual-guide-to-using-bert-for-the-first-time/

https://github.com/jalammar/jalammar.github.io/tree/master/notebooks/bert
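The notebook linked above walks through sentence classification by feeding DistilBERT's [CLS]-position vectors into a logistic-regression classifier. Roughly, the pipeline looks like this (a sketch, with a toy two-sentence dataset standing in for the SST-2 data the post uses):

```python
import torch
from sklearn.linear_model import LogisticRegression
from transformers import DistilBertTokenizer, DistilBertModel

tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
model = DistilBertModel.from_pretrained('distilbert-base-uncased')

sentences = ["a visual masterpiece", "the worst movie of the year"]  # toy data
labels = [1, 0]

enc = tokenizer(sentences, padding=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**enc).last_hidden_state       # (batch, seq_len, 768)

features = hidden[:, 0, :].numpy()                # vector at the [CLS] position
clf = LogisticRegression().fit(features, labels)  # the sentence classifier
```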

Original post: https://www.cnblogs.com/tfknight/p/13354848.html