
This commit is contained in:
wizardforcel
2020-10-19 21:48:57 +08:00
parent 74f7d35aeb
commit 045dee5888
20 changed files with 73 additions and 73 deletions


@@ -10,7 +10,7 @@
![](img/761c210ceb0fdd69c7e0f8bd85e39698.png)
-```
+```py
import torch
from torch.autograd import Variable
import matplotlib.pyplot as plt
@@ -42,7 +42,7 @@ plt.show()
Here we build two neural networks: one without dropout and one with dropout. The one without dropout overfits easily, so we name it net_overfitting; the other is net_dropped. In torch.nn.Dropout(0.5), the 0.5 means that a random 50% of the neurons are switched off/dropped.
-```
+```py
net_overfitting = torch.nn.Sequential(
torch.nn.Linear(1, N_HIDDEN),
torch.nn.ReLU(),
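The hunk above shows only the first layers of net_overfitting. A self-contained sketch of what the two full definitions look like follows; N_HIDDEN and the exact layer count are assumptions based on the visible fragment, not the tutorial's exact values:

```python
import torch

# N_HIDDEN is an assumed width; the tutorial defines it earlier in the script.
N_HIDDEN = 300

# No dropout: free to memorise noise in the training data.
net_overfitting = torch.nn.Sequential(
    torch.nn.Linear(1, N_HIDDEN),
    torch.nn.ReLU(),
    torch.nn.Linear(N_HIDDEN, N_HIDDEN),
    torch.nn.ReLU(),
    torch.nn.Linear(N_HIDDEN, 1),
)

# Same shape, but each Dropout(0.5) zeroes a random 50% of activations
# on every forward pass during training.
net_dropped = torch.nn.Sequential(
    torch.nn.Linear(1, N_HIDDEN),
    torch.nn.Dropout(0.5),
    torch.nn.ReLU(),
    torch.nn.Linear(N_HIDDEN, N_HIDDEN),
    torch.nn.Dropout(0.5),
    torch.nn.ReLU(),
    torch.nn.Linear(N_HIDDEN, 1),
)
```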
@@ -66,7 +66,7 @@ net_dropped = torch.nn.Sequential(
During training, the two networks are trained separately, under identical training conditions.
-```
+```py
optimizer_ofit = torch.optim.Adam(net_overfitting.parameters(), lr=0.01)
optimizer_drop = torch.optim.Adam(net_dropped.parameters(), lr=0.01)
loss_func = torch.nn.MSELoss()
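The diff cuts off after the optimizer setup. A self-contained sketch of the full training loop follows; the toy data x, y and the network sizes are stand-ins, not the tutorial's exact values:

```python
import torch

torch.manual_seed(1)

# Toy noisy regression data standing in for the tutorial's training set (assumption).
x = torch.unsqueeze(torch.linspace(-1, 1, 20), dim=1)
y = x + 0.3 * torch.randn(20, 1)

N_HIDDEN = 100  # assumed width
net_overfitting = torch.nn.Sequential(
    torch.nn.Linear(1, N_HIDDEN), torch.nn.ReLU(),
    torch.nn.Linear(N_HIDDEN, 1))
net_dropped = torch.nn.Sequential(
    torch.nn.Linear(1, N_HIDDEN), torch.nn.Dropout(0.5), torch.nn.ReLU(),
    torch.nn.Linear(N_HIDDEN, 1))

# One optimizer per network, identical hyper-parameters, shared loss:
# any difference in the fitted curves then comes from dropout alone.
optimizer_ofit = torch.optim.Adam(net_overfitting.parameters(), lr=0.01)
optimizer_drop = torch.optim.Adam(net_dropped.parameters(), lr=0.01)
loss_func = torch.nn.MSELoss()

for t in range(500):
    loss_ofit = loss_func(net_overfitting(x), y)
    loss_drop = loss_func(net_dropped(x), y)

    # Each network is updated by its own optimizer, independently.
    optimizer_ofit.zero_grad()
    loss_ofit.backward()
    optimizer_ofit.step()

    optimizer_drop.zero_grad()
    loss_drop.backward()
    optimizer_drop.step()
```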
@@ -92,7 +92,7 @@ for t in range(500):
![](img/a545e4a49909bd7a80e042fd6d8267cb.png)
-```
+```py
...
optimizer_ofit.step()
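The elided test-time code above would need the networks switched out of training mode, because torch.nn.Dropout only drops units during training; in eval mode it is the identity. A minimal sketch of that behaviour:

```python
import torch

torch.manual_seed(0)
drop = torch.nn.Dropout(0.5)
x = torch.ones(1000)

drop.train()                 # training mode: ~50% of units zeroed,
out = drop(x)                # survivors scaled by 1/(1-0.5) = 2
assert (out == 0).any()

drop.eval()                  # evaluation mode: dropout is a no-op
assert torch.equal(drop(x), x)
```

Presumably the tutorial calls net_overfitting.eval() and net_dropped.eval() before computing the test predictions for the plot, and restores .train() before the next training step.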