Attention Mechanism


Attention is a component of a network's architecture that manages and quantifies the interdependence:

  1. Between the input and output elements (General Attention)
  2. Within the input elements (Self-Attention)

While Attention does have applications in other fields of deep learning such as Computer Vision, its main breakthrough and success come from its application to Natural Language Processing (NLP) tasks. This is because Attention was introduced to address the problem of long sequences in Machine Translation, a problem shared by most other NLP tasks.
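As a concrete illustration of the second variant above, here is a minimal sketch of single-head scaled dot-product self-attention in NumPy. The projection matrices `Wq`, `Wk`, `Wv` and the dimensions are illustrative assumptions, not part of the original text:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over the rows of X."""
    # Project the input elements into queries, keys, and values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    # Dot-product scores quantify interdependence within the input elements
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

# Toy example: 4 input elements of dimension 8 (hypothetical sizes)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, w = self_attention(X, Wq, Wk, Wv)
```

Each row of `w` is a probability distribution saying how much the corresponding element attends to every other element; the output is the attention-weighted mix of the values.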

Original article: https://www.cnblogs.com/dulun/p/12240867.html