Searched refs:softmax_cross_entropy_with_logits_v2 (Results 1 – 10 of 10) sorted by relevance
72 out = tf.nn.softmax_cross_entropy_with_logits_v2(
227 x = nn_ops.softmax_cross_entropy_with_logits_v2(
56 logits = tf.nn.softmax_cross_entropy_with_logits_v2(...)
782 losses = nn.softmax_cross_entropy_with_logits_v2(
1954 sampled_losses = nn_ops.softmax_cross_entropy_with_logits_v2(
2997 def softmax_cross_entropy_with_logits_v2(labels, logits, axis=-1, name=None):
3239 return softmax_cross_entropy_with_logits_v2(
360 name: "softmax_cross_entropy_with_logits_v2"
3932 return nn.softmax_cross_entropy_with_logits_v2(labels=target, logits=output)
1210 "loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits_v2(\n",
805 * Add `tf.nn.softmax_cross_entropy_with_logits_v2` which enables backprop
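For reference, the math the hits above invoke can be sketched in pure Python. This is not TensorFlow's implementation — just the per-example loss that `tf.nn.softmax_cross_entropy_with_logits_v2(labels, logits)` computes: the cross-entropy between `labels` (a probability distribution over classes) and `softmax(logits)`, shown here with the standard log-sum-exp stabilization.

```python
import math

def softmax_cross_entropy(labels, logits):
    """Pure-Python sketch of the loss computed by
    tf.nn.softmax_cross_entropy_with_logits_v2 for a single example.
    `labels` must be a valid probability distribution (sums to 1)."""
    # Log-sum-exp trick: subtract the max logit for numerical stability.
    m = max(logits)
    log_sum_exp = m + math.log(sum(math.exp(x - m) for x in logits))
    # log softmax(x)_i = x_i - log_sum_exp; cross-entropy sums over classes.
    return -sum(l * (x - log_sum_exp) for l, x in zip(labels, logits))

loss = softmax_cross_entropy([1.0, 0.0, 0.0], [2.0, 1.0, 0.1])
```

Unlike the original `softmax_cross_entropy_with_logits`, the `_v2` variant also lets gradients flow into `labels` (as noted in the release-note hit), which matters for soft or learned label distributions.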