Softsign activation function

Reposted by x33g5p2x on 2021-11-12, category: Other


In a face recognition model, I tried using softsign to replace the PReLU and BN layers; the results were not good.

PyTorch function:

import torch
import torch.nn.functional as F

a_tensor = torch.Tensor([[-1, 2, 30], [4, 5, 60], [7, 8, 9]])
aaaaa = F.softsign(a_tensor)  # elementwise x / (1 + |x|)

print(aaaaa)
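PyTorch also provides a module form, `torch.nn.Softsign`, which can be dropped into a network the same way an activation layer such as PReLU would be. A minimal sketch (the layer sizes here are made up for illustration):

```python
import torch
import torch.nn as nn

# nn.Softsign is the module form of F.softsign; output values lie in (-1, 1).
model = nn.Sequential(nn.Linear(4, 4), nn.Softsign())
out = model(torch.randn(2, 4))
print(out.shape)  # torch.Size([2, 4])
```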
A standalone NumPy implementation of the forward pass and its derivative:

import numpy as np
import matplotlib.pyplot as plt

class SoftSign:
    def __init__(self, x):
        self.x = np.asarray(x, dtype=float)

    def forward(self):
        # softsign(x) = x / (1 + |x|)
        self.p = self.x / (1 + np.abs(self.x))
        return self.p

    def backward(self):
        # derivative expressed through the forward output p;
        # both branches equal 1 / (1 + |x|)^2
        self.derivative = np.full_like(self.p, 0.0)
        self.derivative[self.x >= 0] = np.power(1 - self.p[self.x >= 0], 2)
        self.derivative[self.x < 0] = np.power(1 + self.p[self.x < 0], 2)
        return self.derivative

    def __call__(self):
        return self.forward(), self.backward()

if __name__ == "__main__":
    x = np.linspace(-10, 10, 500)
    plt.plot(x, SoftSign(x)()[0], label='softSign_forward')
    plt.plot(x, SoftSign(x)()[1], label='softSign_backward')
    plt.legend(loc='best')
    plt.show()
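The branching in backward() works because for x ≥ 0, 1 − p = 1/(1 + x), and for x < 0, 1 + p = 1/(1 − x); squaring either one gives the standard closed-form derivative 1/(1 + |x|)². A quick numerical check of that identity, using NumPy only:

```python
import numpy as np

x = np.linspace(-10, 10, 500)
p = x / (1 + np.abs(x))

# derivative computed from the forward output p, as in backward() above
d_from_p = np.where(x >= 0, (1 - p) ** 2, (1 + p) ** 2)

# standard closed form: d/dx softsign(x) = 1 / (1 + |x|)^2
d_closed = 1.0 / (1 + np.abs(x)) ** 2

print(np.allclose(d_from_p, d_closed))  # True
```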

Result plot: (figure not preserved; the script above plots the softsign forward curve and its derivative)
