The difference between batch normalization and dropout

Dropout is mostly a technique for regularization. It introduces noise into a neural network to force the network to learn to generalize well enough to deal with that noise. (This is a big oversimplification; dropout is really about a lot more than just robustness to noise.)
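As a minimal sketch of that noise injection (assuming PyTorch; the rate p=0.5 is an arbitrary choice), inverted dropout zeroes each activation with probability p during training and rescales the survivors by 1/(1-p), so every forward pass sees a different random mask:

```python
import torch
import torch.nn as nn

# Inverted dropout: during training, each element is zeroed with
# probability p and the survivors are scaled by 1/(1-p).
drop = nn.Dropout(p=0.5)
x = torch.ones(8)

drop.train()
print(drop(x))  # e.g. tensor([2., 0., 2., 2., 0., 0., 2., 2.]) -- different each call
drop.eval()
print(drop(x))  # tensor([1., 1., ..., 1.]) -- identity at evaluation time
```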
Batch normalization is mostly a technique for improving optimization. As a side effect, batch normalization happens to introduce some noise into the network, so it can regularize the model a little bit.
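A minimal sketch of that side-effect noise (again assuming PyTorch; the feature and batch sizes are arbitrary): in training mode an example is normalized with the mean and variance of whatever batch it happens to share, so its output depends on the other examples in the batch:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(4)
bn.train()  # use per-batch statistics, not running averages

x = torch.randn(1, 4)                        # one fixed example
batch_a = torch.cat([x, torch.randn(7, 4)])  # two batches that both contain x
batch_b = torch.cat([x, torch.randn(7, 4)])

out_a = bn(batch_a)[0]
out_b = bn(batch_b)[0]
print(torch.allclose(out_a, out_b))  # expected False: same input, different output
```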
When you have a large dataset, it is important to optimize well and less important to regularize well, so batch normalization matters more for large datasets. You can of course use both batch normalization and dropout at the same time, as in the sketch below.
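Putting the two together, a minimal sketch (assuming PyTorch; the layer sizes and dropout rate are placeholders) of a network that uses batch normalization to ease optimization and dropout to regularize:

```python
import torch
import torch.nn as nn

# A small MLP combining both techniques. Layer sizes are arbitrary.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),  # normalize activations to aid optimization
    nn.ReLU(),
    nn.Dropout(p=0.5),    # inject noise to regularize
    nn.Linear(256, 10),
)

x = torch.randn(32, 784)  # a batch of 32 fake inputs

model.train()  # dropout active, batch norm uses batch statistics
train_out = model(x)

model.eval()   # dropout disabled, batch norm uses running statistics
eval_out = model(x)
```

Remember to switch between `model.train()` and `model.eval()`: both layers change behavior between training and inference.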