1.3 Common TensorFlow 2.0 Functions

tf.GradientTape records the operations executed inside the with block, so that tape.gradient(loss, w) returns the derivative of loss with respect to w:

import tensorflow as tf

w = tf.Variable(tf.constant(3.0))    # trainable variable, watched by the tape
with tf.GradientTape() as tape:
  loss = tf.pow(w, 2)                # loss = w^2
grad = tape.gradient(loss, w)        # d(loss)/dw = 2w = 6.0
print(grad)
# tf.Tensor(6.0, shape=(), dtype=float32)
tf.one_hot(data_to_convert, depth=number_of_classes) converts class labels into one-hot encoded vectors.
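A minimal sketch of tf.one_hot; the label values are assumed for illustration:

import tensorflow as tf

labels = tf.constant([1, 0, 2])        # example class indices (assumed)
output = tf.one_hot(labels, depth=3)   # depth = number of classes
print(output)
# tf.Tensor(
# [[0. 1. 0.]
#  [1. 0. 0.]
#  [0. 0. 1.]], shape=(3, 3), dtype=float32)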
tf.nn.softmax() maps a vector of logits to a probability distribution whose elements lie in (0, 1) and sum to 1.
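A minimal sketch of tf.nn.softmax; the logit values are assumed for illustration:

import tensorflow as tf

logits = tf.constant([1.01, 2.01, -0.66])   # example logits (assumed)
probs = tf.nn.softmax(logits)               # e^x_i / sum_j e^x_j
print(probs)                                # values in (0, 1) that sum to 1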
assign_sub performs an in-place subtraction on a tf.Variable: w.assign_sub(x) computes w = w - x (the target must be a tf.Variable, not a constant).
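A minimal sketch of assign_sub; the initial value and the amount subtracted are assumed for illustration:

import tensorflow as tf

w = tf.Variable(4.0)   # must be a tf.Variable; constants cannot be updated in place
w.assign_sub(1.0)      # in-place update: w = w - 1
print(w)               # <tf.Variable 'Variable:0' shape=() dtype=float32, numpy=3.0>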
tf.argmax returns the index of the maximum value along the specified axis.
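A minimal sketch of tf.argmax; the input matrix is assumed for illustration:

import tensorflow as tf

x = tf.constant([[1, 2, 3],
                 [2, 3, 4],
                 [5, 4, 3],
                 [8, 7, 2]])
print(tf.argmax(x, axis=0))   # index of the max in each column: [3 3 1]
print(tf.argmax(x, axis=1))   # index of the max in each row:    [2 2 0 0]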
Original article: https://www.cnblogs.com/gao-chao/p/13356436.html