
Error Calculation
Published: 2021-05-09
Outline
- MSE
- Cross Entropy Loss
- Hinge Loss
MSE
- \(loss = \frac{1}{N}\sum(y-out)^2\)
- \(L_{2\text{-}norm} = \sqrt{\sum(y-out)^2}\)
```python
import tensorflow as tf

y = tf.constant([1, 2, 3, 0, 2])
y = tf.one_hot(y, depth=4)  # labels go up to 3, so depth=4
y = tf.cast(y, dtype=tf.float32)
out = tf.random.normal([5, 4])

loss1 = tf.reduce_mean(tf.square(y - out))     # element-wise MSE
loss2 = tf.square(tf.norm(y - out)) / (5 * 4)  # squared L2 norm / N
loss3 = tf.reduce_mean(tf.losses.MSE(y, out))  # built-in MSE
# loss1, loss2 and loss3 are all equal
```
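As a quick sanity check (not from the original post), the three formulations can be verified to agree with plain NumPy, mirroring the TF code above:

```python
import numpy as np

rng = np.random.default_rng(0)
y = np.eye(4, dtype=np.float32)[[1, 2, 3, 0, 2]]  # one-hot targets, depth=4
out = rng.standard_normal((5, 4)).astype(np.float32)

loss1 = np.mean((y - out) ** 2)                   # element-wise mean of squares
loss2 = np.linalg.norm(y - out) ** 2 / (5 * 4)    # squared L2 norm over N elements
loss3 = ((y - out) ** 2).mean(axis=1).mean()      # per-sample MSE, then batch mean

print(np.isclose(loss1, loss2), np.isclose(loss1, loss3))
```

All three are the same scalar because each is just the sum of squared differences divided by the total element count.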
Entropy
- Uncertainty
- measure of surprise
- lower entropy --> less uncertainty (a more peaked, more confident distribution)
\[\text{Entropy} = -\sum_{i}P(i)\log P(i)\]
```python
a = tf.fill([4], 0.25)
a * tf.math.log(a) / tf.math.log(2.)                  # per-element contributions (base 2)
-tf.reduce_sum(a * tf.math.log(a) / tf.math.log(2.))  # 2.0: maximum entropy for 4 outcomes

a = tf.constant([0.1, 0.1, 0.1, 0.7])
-tf.reduce_sum(a * tf.math.log(a) / tf.math.log(2.))  # ≈ 1.357

a = tf.constant([0.01, 0.01, 0.01, 0.97])
-tf.reduce_sum(a * tf.math.log(a) / tf.math.log(2.))  # ≈ 0.242
```
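The same numbers can be reproduced without TensorFlow; a minimal plain-Python helper (`entropy_bits` is an illustrative name, not part of any library):

```python
import math

def entropy_bits(p):
    """Shannon entropy in bits: -sum p_i * log2(p_i), skipping zero terms."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 (maximum for 4 outcomes)
print(entropy_bits([0.1, 0.1, 0.1, 0.7]))      # ≈ 1.357
print(entropy_bits([0.01, 0.01, 0.01, 0.97]))  # ≈ 0.242
```

The more the mass concentrates on one outcome, the lower the entropy, matching the intuition above.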
Cross Entropy
\[H(p,q) = -\sum_x p(x)\log q(x) \\ H(p,q) = H(p) + D_{KL}(p\|q)\]
- for p = q
Minimum: H(p,q) = H(p)
- for P: one-hot encoding
\(H(p{:}[0,1,0]) = -1\log 1 = 0\)
\(H([0,1,0],[p_0,p_1,p_2]) = 0 + D_{KL}(p\|q) = -1\log q_1\) (if the prediction q equals the one-hot ground truth p, the cross entropy is 0)
Binary Classification
- Two cases (the single-output form predicts only one probability; the second output carries no extra information, so this saves computation)
Single output
\[H(P,Q) = -P(\text{cat})\log Q(\text{cat}) - (1-P(\text{cat}))\log(1-Q(\text{cat})), \quad P(\text{dog}) = 1-P(\text{cat})\]
\[\begin{aligned}H(P,Q) & = -\sum_{i\in\{\text{cat},\text{dog}\}}P(i)\log Q(i)\\& = -P(\text{cat})\log Q(\text{cat}) - P(\text{dog})\log Q(\text{dog})\\& = -\big(y\log p+(1-y)\log(1-p)\big)\end{aligned}\]
Classification
- \(H([0,1,0],[p_0,p_1,p_2])=0+D_{KL}(p\|q) = -1\log q_1\)
\[\begin{aligned}& P_1 = [1,0,0,0,0]\\& Q_1=[0.4,0.3,0.05,0.05,0.2]\end{aligned}\]
\[\begin{aligned}H(P_1,Q_1) & = -\sum_i P_1(i)\log Q_1(i) \\& = -(1\log 0.4+0\log 0.3+0\log 0.05+0\log 0.05+0\log 0.2) \\& = -\log 0.4 \\& \approx 0.916\end{aligned}\]
\[\begin{aligned}& P_1 = [1,0,0,0,0]\\& Q_1=[0.98,0.01,0,0,0.01]\end{aligned}\]
\[\begin{aligned}H(P_1,Q_1) & = -\sum_i P_1(i)\log Q_1(i) \\& = -\log 0.98 \\& \approx 0.02\end{aligned}\]
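These two worked examples can be checked with a few lines of plain Python (the `cross_entropy` helper is an illustrative name, using natural log as in the derivation):

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum p_i * ln(q_i), skipping terms where p_i == 0."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

P1 = [1, 0, 0, 0, 0]
print(cross_entropy(P1, [0.4, 0.3, 0.05, 0.05, 0.2]))   # ≈ 0.916
print(cross_entropy(P1, [0.98, 0.01, 0.0, 0.0, 0.01]))  # ≈ 0.020
```

With a one-hot target only the true class contributes, so the loss collapses to \(-\log q_{\text{true}}\): the more confident the correct prediction, the smaller the loss.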
```python
tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.25, 0.25, 0.25, 0.25])  # -log 0.25 ≈ 1.386
tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.1, 0.1, 0.8, 0.1])      # -log 0.1  ≈ 2.303 (confidently wrong)
tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.1, 0.7, 0.1, 0.1])      # -log 0.7  ≈ 0.357
tf.losses.categorical_crossentropy([0, 1, 0, 0], [0.01, 0.97, 0.01, 0.01])  # -log 0.97 ≈ 0.030
```
```python
tf.losses.BinaryCrossentropy()([1], [0.1])  # class API:      -log 0.1 ≈ 2.303
tf.losses.binary_crossentropy([1], [0.1])   # functional API, same result
```
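The binary case follows the single-output formula from above; a minimal plain-Python sketch (`binary_cross_entropy` is an illustrative name; the `eps` clipping mimics how TF keeps \(\log 0\) from occurring):

```python
import math

def binary_cross_entropy(y, p, eps=1e-7):
    """-(y*ln(p) + (1-y)*ln(1-p)); eps clips p away from exactly 0 or 1."""
    p = min(max(p, eps), 1 - eps)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

print(binary_cross_entropy(1, 0.1))  # ≈ 2.303, matching the TF calls above
```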
Why not MSE?
- sigmoid + MSE can cause the gradient to vanish when the output saturates
- it tends to converge more slowly than cross entropy
- However, MSE is still useful in some settings, e.g. meta-learning
- In practice, pass raw logits to the cross-entropy loss (from_logits=True) rather than probabilities, for numerical stability
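The "logits --> CrossEntropy" recommendation boils down to the log-sum-exp trick: computing \(-\log \text{softmax}(z)_y\) directly from the logits avoids taking the log of a probability that has underflowed to 0. A minimal sketch (`ce_from_logits` is an illustrative name, mirroring what `from_logits=True` does internally):

```python
import math

def ce_from_logits(logits, label):
    """Numerically stable cross entropy from raw logits:
    -log softmax(logits)[label] = logsumexp(logits) - logits[label]."""
    m = max(logits)  # subtract the max so exp() cannot overflow
    log_sum_exp = m + math.log(sum(math.exp(z - m) for z in logits))
    return log_sum_exp - logits[label]

print(ce_from_logits([2.0, 1.0, 0.1], 0))  # ≈ 0.417
```

In TF2 this corresponds to `tf.losses.categorical_crossentropy(y, logits, from_logits=True)`, which fuses softmax and log the same way.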