Implementing Attention with TensorFlow 2.0 / Keras

    import tensorflow as tf

    # h1: hidden-state sequence of shape (batch, timesteps, units), e.g. from an LSTM with return_sequences=True
    h1_c = h1[:, -1:, :]                                             # query: last hidden state, (batch, 1, units)
    tmp = tf.keras.backend.batch_dot(h1_c, tf.keras.layers.Permute((2, 1))(h1))  # scores, (batch, 1, timesteps)
    scores = tf.keras.layers.Softmax()(tmp)                          # attention weights over the timesteps
    at = tf.keras.backend.batch_dot(scores, h1)                      # context vector, (batch, 1, units)
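To see the shapes end to end, here is a minimal sketch (not from the original post) that wires the same four steps into a small functional model; the input size (10 timesteps x 8 features), the LSTM(16) layer, and the helper name last_step_attention are illustrative assumptions.

    import tensorflow as tf

    def last_step_attention(h1):
        # Use the last hidden state as the query and attend over all timesteps of h1.
        h1_c = h1[:, -1:, :]                                         # (batch, 1, units)
        tmp = tf.keras.backend.batch_dot(
            h1_c, tf.keras.layers.Permute((2, 1))(h1))               # (batch, 1, timesteps)
        scores = tf.keras.layers.Softmax()(tmp)                      # attention weights
        return tf.keras.backend.batch_dot(scores, h1)                # context, (batch, 1, units)

    inputs = tf.keras.Input(shape=(10, 8))                           # 10 timesteps, 8 features (assumed)
    h1 = tf.keras.layers.LSTM(16, return_sequences=True)(inputs)     # (batch, 10, 16)
    context = last_step_attention(h1)                                # (batch, 1, 16)
    model = tf.keras.Model(inputs, context)
    model.summary()

The Permute followed by batch_dot is simply a batched dot product between the last hidden state and every timestep (Luong-style dot attention); tf.matmul(h1_c, h1, transpose_b=True) would compute the same score matrix in one call.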
Original article: https://www.cnblogs.com/oldBook/p/13917728.html