* [Concat](#nn.Concat) : concatenates in one layer several modules along dimension `dim`;
* [DepthConcat](#nn.DepthConcat) : like Concat, but adds zero-padding when non-`dim` sizes don't match;
* [Bottle](#nn.Bottle) : allows an input of any dimensionality to be forwarded through a module;
See also the [Table Containers](#nn.TableContainers) for manipulating tables of [Tensors](https://github.com/torch/torch7/blob/master/doc/tensor.md).
<a name="nn.Container"></a>
## Container ##
This is an abstract [Module](module.md#nn.Module) class which declares methods defined in all containers.
It reimplements many of the Module methods such that calls are propagated to the
contained modules. For example, a call to [zeroGradParameters](module.md#nn.Module.zeroGradParameters)
will be propagated to all contained modules.
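A minimal sketch of this propagation (the layer sizes here are illustrative, not from the original text):

```lua
require 'nn'

-- A Sequential is a Container: method calls on it are forwarded
-- to every contained module.
local net = nn.Sequential()
net:add(nn.Linear(10, 5))
net:add(nn.Linear(5, 1))

net:zeroGradParameters() -- zeroes the gradient buffers of both Linear modules

-- The same propagation applies to mode switches:
net:evaluate()           -- sets every contained module to evaluation mode
net:training()           -- and back to training mode
```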

<a name="nn.Sequential"></a>
## Sequential ##

Sequential provides a means to plug layers together
in a feed-forward fully connected manner.

E.g. creating a one hidden-layer multi-layer perceptron is thus just as easy as:
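For instance, a sketch of such a one-hidden-layer MLP (the layer sizes are illustrative):

```lua
require 'nn'

local mlp = nn.Sequential()
mlp:add(nn.Linear(10, 25)) -- 10 inputs, 25 hidden units
mlp:add(nn.Tanh())         -- hidden-layer non-linearity
mlp:add(nn.Linear(25, 1))  -- 1 output

print(mlp:forward(torch.randn(10)))
```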
<a name="nn.Parallel"></a>
## Parallel ##

Creates a container module that applies its `ith` child module to the `ith` slice of the input Tensor by using [select](https://github.com/torch/torch7/blob/master/doc/tensor.md#tensor-selectdim-index)
on dimension `inputDimension`. It concatenates the results of its contained modules together along dimension `outputDimension`.
Example:
```lua
mlp = nn.Parallel(2,1);   -- Parallel container will associate a module to each slice of dimension 2
                          -- (column space), and concatenate the outputs over the 1st dimension.

mlp:add(nn.Linear(10,3)); -- Linear module (input 10, output 3), applied on 1st slice of dimension 2
mlp:add(nn.Linear(10,2))  -- Linear module (input 10, output 2), applied on 2nd slice of dimension 2

-- After going through the Linear modules the outputs are
-- concatenated along the unique dimension, to form a 1D Tensor
> mlp:forward(torch.randn(10,2)) -- of size 5.
```

A more complicated example:

```lua
mlp = nn.Sequential();
c = nn.Parallel(1,2)      -- Parallel container will associate a module to each slice of dimension 1
                          -- (row space), and concatenate the outputs over the 2nd dimension.
```

<a name="nn.WeightNorm"></a>
## WeightNorm ##

WeightNorm implements the reparametrization presented in [Weight Normalization](https://arxiv.org/pdf/1602.07868v3.pdf), which decouples the length of neural network weight vectors from their direction. The weight vector `w` is instead determined by parameters `g` and `v` such that `w = g * v / ||v||`, where `||v||` is the Euclidean norm of vector `v`. This container can wrap nn layers that have weights.

It accepts a parameter `outputDim` that represents the output dimension of the module weight it wraps, which defaults to 1. If `outputDim` is not 1, the container will transpose the weight appropriately. If the module weight is not 2D, the container will view the weight into an appropriate 2D shape based on the `outputDim` specified by the user.
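A minimal usage sketch (the wrapped layer and its sizes are illustrative):

```lua
require 'nn'

-- Wrap a Linear layer; its weight w is reparametrized as g * v / ||v||,
-- with g and v learned in place of w.
local wn = nn.WeightNorm(nn.Linear(5, 3))

local out = wn:forward(torch.randn(5)) -- output of size 3, as for the plain Linear

-- For a module whose weight's output dimension is not the first,
-- pass outputDim explicitly (value here is illustrative):
-- local wn2 = nn.WeightNorm(someModule, 2)
```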
<a name="nn.TableContainers"></a>
## Table Containers ##
While the above containers are used for manipulating input [Tensors](https://github.com/torch/torch7/blob/master/doc/tensor.md), table containers are used for manipulating tables: