This post is a short note from reading the Keras code, focusing on the relu function; hopefully it is a useful reference for other developers.
Overview
relu is one of the activation functions. Many sources give its formula simply as:
f(x) = \max(0, x)
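As a concrete illustration, here is a minimal NumPy sketch of that element-wise definition (my own addition, not part of the Keras or Theano code):

import numpy as np

def relu_naive(x):
    # Element-wise rectifier: keep positive entries, zero out negative ones.
    return np.maximum(0, x)

print(relu_naive(np.array([-2.0, -0.5, 0.0, 1.5])))  # negative entries become 0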
Looking at Theano's code:
def relu(x, alpha=0):
    """
    Compute the element-wise rectified linear activation function.

    .. versionadded:: 0.7.1

    Parameters
    ----------
    x : symbolic tensor
        Tensor to compute the activation function for.
    alpha : scalar or tensor, optional
        Slope for negative input, usually between 0 and 1. The default value
        of 0 will lead to the standard rectifier, 1 will lead to
        a linear activation function, and any value in between will give a
        leaky rectifier. A shared variable (broadcastable against `x`) will
        result in a parameterized rectifier with learnable slope(s).

    Returns
    -------
    symbolic tensor
        Element-wise rectifier applied to `x`.

    Notes
    -----
    This is numerically equivalent to ``T.switch(x > 0, x, alpha * x)``
    (or ``T.maximum(x, alpha * x)`` for ``alpha < 1``), but uses a faster
    formulation or an optimized Op, so we encourage to use this function.
    """
    # This is probably the fastest implementation for GPUs. Both the forward
    # pass and the gradient get compiled into a single GpuElemwise call.
    # TODO: Check if it's optimal for CPU as well; add an "if" clause if not.
    # TODO: Check if there's a faster way for the gradient; create an Op if so.
    if alpha == 0:
        return 0.5 * (x + abs(x))
    else:
        f1 = 0.5 * (1 + alpha)
        f2 = 0.5 * (1 - alpha)
        return f1 * x + f2 * abs(x)
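For reference, a small usage sketch of this function (my own addition, assuming a Theano installation of version 0.7.1 or later, where it is exposed as theano.tensor.nnet.relu):

import numpy as np
import theano
import theano.tensor as T

# Symbolic input and two rectified outputs: standard (alpha=0) and leaky.
x = T.matrix('x')
y_standard = T.nnet.relu(x)
y_leaky = T.nnet.relu(x, alpha=0.1)

f = theano.function([x], [y_standard, y_leaky])
standard, leaky = f(np.array([[-1.0, 2.0]]))
print(standard)  # -1 -> 0,    2 -> 2
print(leaky)     # -1 -> -0.1, 2 -> 2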
The code actually uses the following reformulation. When alpha is 0:
f(x) = \frac{x + |x|}{2}
When alpha is nonzero, it is:
f(x) = \frac{1 + \alpha}{2} x + \frac{1 - \alpha}{2} |x|
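To check that this reformulation matches the usual (leaky) rectifier definition, here is a small NumPy verification (my own sketch, not part of the original post):

import numpy as np

x = np.array([-3.0, -0.5, 0.0, 2.0])

for alpha in (0.0, 0.2):
    # Reference definition: max(x, alpha * x), valid for alpha < 1.
    reference = np.maximum(x, alpha * x)
    # Theano's reformulation: f1 * x + f2 * |x|.
    f1 = 0.5 * (1 + alpha)
    f2 = 0.5 * (1 - alpha)
    reformulated = f1 * x + f2 * np.abs(x)
    assert np.allclose(reference, reformulated)

print("reformulation matches max(x, alpha * x) for alpha = 0 and 0.2")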