sigmoid_cross_entropy_with_logits and weighted_cross_entropy_with_logits
2018-01-03 12:00
When a sample's labels describe several independent binary classification problems (i.e. multi-label classification), the activation function feeding the loss should be a sigmoid; softmax no longer applies, because the classes are not mutually exclusive. With a sigmoid activation, the matching loss function is sigmoid_cross_entropy_with_logits.
In TensorFlow, "logits" always refers to the raw model output before any activation function is applied, i.e. the unscaled values.
sigmoid_cross_entropy_with_logits
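For reference, with x = logits and z = labels, the per-element loss unfolds as follows (the same derivation appears in the TensorFlow documentation):

  z * -log(sigmoid(x)) + (1 - z) * -log(1 - sigmoid(x))
= z * log(1 + exp(-x)) + (1 - z) * (x + log(1 + exp(-x)))
= (1 - z) * x + log(1 + exp(-x))

For x < 0, exp(-x) overflows, so TensorFlow uses the equivalent, numerically stable form

  max(x, 0) - x * z + log(1 + exp(-abs(x)))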
For the exact implementation, see the official documentation; here is my NumPy version:

import numpy as np

def my_binary_crossentropy(labels, output):
    """Binary cross entropy between an output tensor and a target tensor."""
    # Transform the probabilities back to logits.
    epsilon = 1e-08
    output = np.clip(output, epsilon, 1.0 - epsilon)
    output = np.log(output / (1.0 - output))
    # x = logits, z = labels
    # max(x, 0) - x * z + log(1 + exp(-abs(x)))
    loss1 = -np.multiply(output, labels)
    loss2 = np.maximum(output, 0)
    loss3 = np.log(1.0 + np.exp(-np.abs(output)))
    return loss1 + loss2 + loss3
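As a quick sanity check, the NumPy version can be compared against tf.nn.sigmoid_cross_entropy_with_logits itself. The sketch below (assuming TensorFlow 2.x with eager execution, and reusing my_binary_crossentropy from above) feeds the same example through both paths:

import numpy as np
import tensorflow as tf

logits = np.array([[-2.0, 0.5, 3.0]], dtype=np.float64)
labels = np.array([[0.0, 1.0, 1.0]], dtype=np.float64)

# TensorFlow works directly on the unscaled logits.
tf_loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits).numpy()

# The NumPy version expects probabilities, so apply the sigmoid first.
probs = 1.0 / (1.0 + np.exp(-logits))
np_loss = my_binary_crossentropy(labels, probs)

print(np.allclose(tf_loss, np_loss, atol=1e-06))  # expect: True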
weighted_cross_entropy_with_logits
When the numbers of positive and negative examples in the training set are unbalanced, we can add a weight on the positive term so the model learns the minority class better. The implementation again follows the official documentation:

import numpy as np

def my_weighted_binary_crossentropy(labels, output, pos_weight):
    # https://stackoverflow.com/questions/42158866/neural-network-for-multi-label-classification-with-large-number-of-classes-outpu/47313183#47313183
    """Weighted binary cross entropy between an output tensor and a target tensor."""
    # Transform the probabilities back to logits.
    epsilon = 1e-08
    output = np.clip(output, epsilon, 1.0 - epsilon)
    output = np.log(output / (1.0 - output))
    # x = logits, z = labels, q = pos_weight, l = 1 + (q - 1) * z
    # (1 - z) * x + l * (log(1 + exp(-abs(x))) + max(-x, 0))
    l = 1.0 + (pos_weight - 1.0) * labels
    loss1 = np.multiply(1.0 - labels, output)
    tmp = np.log(1.0 + np.exp(-np.abs(output))) + np.maximum(-output, 0)
    loss2 = np.multiply(l, tmp)
    return loss1 + loss2
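Usage mirrors the unweighted case. The sketch below (again assuming TensorFlow 2.x, and reusing my_weighted_binary_crossentropy from above) checks the NumPy version against tf.nn.weighted_cross_entropy_with_logits:

import numpy as np
import tensorflow as tf

logits = np.array([[-2.0, 0.5, 3.0]], dtype=np.float64)
labels = np.array([[0.0, 1.0, 1.0]], dtype=np.float64)
pos_weight = 4.0  # a missed positive costs 4x a missed negative

tf_loss = tf.nn.weighted_cross_entropy_with_logits(
    labels=labels, logits=logits, pos_weight=pos_weight).numpy()

probs = 1.0 / (1.0 + np.exp(-logits))
np_loss = my_weighted_binary_crossentropy(labels, probs, pos_weight)

print(np.allclose(tf_loss, np_loss, atol=1e-06))  # expect: True

Setting pos_weight > 1 increases the penalty for false negatives, trading some precision for recall on the rare positive class.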