This article introduces the ReLU Layer from the Caffe Prototxt activation layer series; hopefully it serves as a useful reference for developers.
The ReLU layer is one of the nonlinear activations used in deep learning, and it usually follows a convolution or normalization layer (though this is not required).
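For reference, the standard ReLU is applied element-wise as

f(x) = max(0, x)

and, as the negative_slope parameter below shows, Caffe also supports a leaky variant: f(x) = x for x > 0 and f(x) = negative_slope * x for x <= 0.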
First, let's take a look at ReLUParameter:
// Message that stores parameters used by ReLULayer
message ReLUParameter {
  // Allow non-zero slope for negative inputs to speed up optimization
  // Described in:
  // Maas, A. L., Hannun, A. Y., & Ng, A. Y. (2013). Rectifier nonlinearities
  // improve neural network acoustic models. In ICML Workshop on Deep Learning
  // for Audio, Speech, and Language Processing.
  optional float negative_slope = 1 [default = 0]; // slope for x < 0: 0 gives the standard ReLU, a non-zero value gives a ReLU variant (leaky ReLU)
  enum Engine {
    DEFAULT = 0;
    CAFFE = 1;
    CUDNN = 2;
  }
  optional Engine engine = 2 [default = DEFAULT];
}
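As a minimal sketch of how negative_slope would be set in practice (the layer and blob names below are hypothetical, not taken from any particular network), a leaky-ReLU layer could be written as:

layer {
  name: "relu1"             # hypothetical layer name
  type: "ReLU"
  bottom: "conv1"           # hypothetical input blob
  top: "conv1"
  relu_param {
    negative_slope: 0.1     # non-zero slope for x < 0, i.e. leaky ReLU
  }
}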
This is how a ReLU layer is written in a prototxt:
layer {
  name: "relu"
  type: "ReLU"
  bottom: "conv/bn"
  top: "conv/bn"
}
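Note that bottom and top name the same blob here: ReLU supports in-place computation in Caffe, so writing it this way overwrites the input blob and saves memory.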
For example, in MobileNet:
layer {
  name: "relu6_4"
  type: "ReLU"
  bottom: "conv6_4/bn"
  top: "conv6_4/bn"
}
This concludes the article on the Caffe Prototxt activation layer series: ReLU Layer; hopefully it is helpful.