ReLU activation function (Rectified Linear Unit)
The derivative is 0 for negative inputs and 1 for positive inputs (at x = 0 the derivative is undefined; implementations conventionally use 0 there).
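For reference, the defining formula: ReLU(x) = max(x, 0), i.e. positive inputs pass through unchanged and negative inputs are set to 0.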
import torch
import matplotlib.pyplot as plt
x = torch.arange(-8.0, 8.0, 0.1, requires_grad=True)  # requires_grad so we can plot the gradient below
y = torch.relu(x)
plt.plot(x.detach(), y.detach())
plt.xlabel('x')
plt.ylabel('relu(x)')
plt.show()
# Backpropagate a vector of ones so x.grad holds d relu(x)/dx at every point;
# retain_graph=True allows calling backward again later
y.backward(torch.ones_like(x), retain_graph=True)
plt.plot(x.detach(), x.grad)
plt.xlabel('x')
plt.ylabel('grad of relu')
plt.show()
sigmoid (squashing function)
The sigmoid function transforms any input on (-∞, ∞) into an output in the interval (0, 1), which is why it is often called a squashing function: large negative inputs are squashed toward 0 and large positive inputs toward 1.
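For reference, sigmoid(x) = 1 / (1 + exp(-x)); its derivative is sigmoid(x) * (1 - sigmoid(x)), which peaks at 0.25 at x = 0.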
When the input is near 0, the sigmoid function is close to a linear transformation.
import torch
import matplotlib.pyplot as plt
# Fall back to CPU when no GPU is available
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
x = torch.arange(-8.0, 8.0, 0.1, requires_grad=True, device=device)
y = torch.sigmoid(x)  # y is computed on the same device as x
# detach() drops the autograd graph; move to CPU for matplotlib
plt.plot(x.detach().cpu(), y.detach().cpu(), '.')
plt.xlabel('x')
plt.ylabel('sigmoid(x)')
plt.show()
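Following the ReLU example above, a minimal sketch for visualizing the sigmoid gradient (assuming x and y from the sigmoid snippet are still in scope):

x.grad = None  # clear any stale gradient before backpropagating
y.backward(torch.ones_like(x), retain_graph=True)
plt.plot(x.detach().cpu(), x.grad.cpu())
plt.xlabel('x')
plt.ylabel('grad of sigmoid')
plt.show()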
tanh (hyperbolic tangent)
Like sigmoid, tanh squashes its inputs, but into the range (-1, 1); near 0 it is also approximately linear.
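A minimal sketch for plotting tanh, mirroring the snippets above:

import torch
import matplotlib.pyplot as plt

x = torch.arange(-8.0, 8.0, 0.1, requires_grad=True)
y = torch.tanh(x)
plt.plot(x.detach(), y.detach())
plt.xlabel('x')
plt.ylabel('tanh(x)')
plt.show()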