"""
An :class:`Activation` is just a function
that takes some parameters and returns an element-wise activation function.
For the most part we just use
`PyTorch activations <https://pytorch.org/docs/master/nn.html#non-linear-activations>`_.
Here we provide a thin wrapper to allow registering them and instantiating them ``from_params``.

The available activation functions are

* "linear"
* `"relu" <https://pytorch.org/docs/master/nn.html#torch.nn.ReLU>`_
* `"relu6" <https://pytorch.org/docs/master/nn.html#torch.nn.ReLU6>`_
* `"elu" <https://pytorch.org/docs/master/nn.html#torch.nn.ELU>`_
* `"prelu" <https://pytorch.org/docs/master/nn.html#torch.nn.PReLU>`_
* `"leaky_relu" <https://pytorch.org/docs/master/nn.html#torch.nn.LeakyReLU>`_
* `"threshold" <https://pytorch.org/docs/master/nn.html#torch.nn.Threshold>`_
* `"hardtanh" <https://pytorch.org/docs/master/nn.html#torch.nn.Hardtanh>`_
* `"sigmoid" <https://pytorch.org/docs/master/nn.html#torch.nn.Sigmoid>`_
* `"tanh" <https://pytorch.org/docs/master/nn.html#torch.nn.Tanh>`_
* `"log_sigmoid" <https://pytorch.org/docs/master/nn.html#torch.nn.LogSigmoid>`_
* `"softplus" <https://pytorch.org/docs/master/nn.html#torch.nn.Softplus>`_
* `"softshrink" <https://pytorch.org/docs/master/nn.html#torch.nn.Softshrink>`_
* `"softsign" <https://pytorch.org/docs/master/nn.html#torch.nn.Softsign>`_
* `"tanhshrink" <https://pytorch.org/docs/master/nn.html#torch.nn.Tanhshrink>`_
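
For example, a registered activation can be looked up by name, instantiated, and
applied element-wise to a tensor. This is only a minimal sketch, assuming
:class:`Activation` is importable from ``allennlp.nn`` and that the usual
``Registrable.by_name`` lookup is available::

    import torch
    from allennlp.nn import Activation

    # Look up the registered "relu" constructor and instantiate it with no arguments.
    relu = Activation.by_name("relu")()

    # The result is an element-wise function over tensors.
    output = relu(torch.FloatTensor([-2.0, 0.0, 3.0]))  # -> [0.0, 0.0, 3.0]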
"""

import torch