Transformer

References:

[BERT is all the rage, but still don't understand the Transformer? This article is all you need]

https://zhuanlan.zhihu.com/p/54356280 (Chinese version)

http://jalammar.github.io/illustrated-transformer/ (Jay Alammar's blog, English original)

https://github.com/hanxiao/bert-as-service#q-the-cosine-similarity-of-two-sentence-vectors-is-unreasonably-high-eg-always--08-whats-wrong (collected questions on using BERT)

[NLP] The Transformer Explained in Detail

https://zhuanlan.zhihu.com/p/44121378

[On Multi-Head Attention and Positional Encoding]

http://blog.leanote.com/post/lincent/Attention-is-all-you-need%EF%BC%882%EF%BC%89%E5%85%B3%E4%BA%8EMulti-Head%E5%92%8CPositional-Encoding-2

[Universal Transformers Explained]: https://zhuanlan.zhihu.com/p/44655133, https://www.leiphone.com/news/201808/1nhPCi9jWWNGv6aw.html

[Turing completeness]: https://www.zhihu.com/question/20115374/answer/288346717

[Transformer Applications]

The Transformer in Meituan's search-ranking practice: https://tech.meituan.com/2020/04/16/transformer-in-meituan.html

Original post: https://www.cnblogs.com/ying-chease/p/10508944.html