Softsign activation function
In a face recognition model, I tried using softsign in place of the PReLU activations and BN layers; the results were poor.
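For reference, softsign is defined as f(x) = x / (1 + |x|). Like tanh it squashes inputs into (-1, 1), but it saturates polynomially rather than exponentially, and its derivative is f'(x) = 1 / (1 + |x|)^2.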
PyTorch function (F.softsign applies the transform element-wise):

import torch
import torch.nn.functional as F

a_tensor = torch.Tensor([[-1, 2, 30], [4, 5, 60], [7, 8, 9]])
out = F.softsign(a_tensor)
print(out)
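As a minimal sketch of the kind of substitution described above (the conv block here is an assumption for illustration, not the actual face-recognition model), torch.nn.Softsign can stand in wherever nn.PReLU sits:

import torch
import torch.nn as nn

# Hypothetical conv block using BN + PReLU (placeholder, not the real model)
block_prelu = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1),
    nn.BatchNorm2d(64),
    nn.PReLU(64),
)

# The same block with Softsign standing in for both BN and PReLU
block_softsign = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1),
    nn.Softsign(),
)

x = torch.randn(1, 3, 112, 112)
print(block_prelu(x).shape, block_softsign(x).shape)  # both torch.Size([1, 64, 112, 112])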
NumPy implementation (the Activation base class below is a minimal stand-in so the snippet is self-contained):

import numpy as np
import matplotlib.pyplot as plt

class Activation:
    # Minimal base class: calling the instance returns (forward, backward)
    def __init__(self, x):
        self.x = x

    def __call__(self):
        return self.forward(), self.backward()

class SoftSign(Activation):
    def __init__(self, x):
        super(SoftSign, self).__init__(x)

    def forward(self):
        # f(x) = x / (1 + |x|)
        self.p = self.x / (1 + np.abs(self.x))
        return self.p

    def backward(self):
        # f'(x) = 1 / (1 + |x|)^2, written in terms of the forward output p:
        # for x >= 0, (1 - p)^2 = 1 / (1 + x)^2; for x < 0, (1 + p)^2 = 1 / (1 - x)^2
        self.derivative = np.zeros_like(self.p)
        self.derivative[self.x >= 0] = np.power(1 - self.p[self.x >= 0], 2)
        self.derivative[self.x < 0] = np.power(1 + self.p[self.x < 0], 2)
        return self.derivative

if __name__ == "__main__":
    x = np.linspace(-10, 10, 500)
    forward, backward = SoftSign(x)()  # one call returns both curves
    plt.plot(x, forward, label='softSign_forward')
    plt.plot(x, backward, label='softSign_backward')
    plt.legend(loc='best')
    plt.show()
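As a quick sanity check, the analytic derivative used above can be compared against PyTorch's autograd; this is a minimal sketch assuming float64 inputs:

import numpy as np
import torch
import torch.nn.functional as F

x = np.linspace(-5, 5, 101)
xt = torch.tensor(x, requires_grad=True)  # float64 tensor tracking gradients
F.softsign(xt).sum().backward()           # grad of sum() is the elementwise derivative

analytic = 1.0 / (1.0 + np.abs(x)) ** 2   # f'(x) = 1 / (1 + |x|)^2
print(np.allclose(xt.grad.numpy(), analytic))  # expected: True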
Resulting plot: (figure showing the softSign_forward and softSign_backward curves over x in [-10, 10])
Original link: https://blog.csdn.net/jacke121/article/details/121284755