
Updated the docs for PytorchSeq2VecWrapper to specify that mask is required #5386

Merged · 8 commits · Mar 11, 2022
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -23,6 +23,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- LZMA compression is now supported.
- Added a way to give JSON blobs as input to dataset readers in the `evaluate` command.
- Added the argument `sub_module` in `PretrainedTransformerMismatchedEmbedder`
- Updated the docs for `PytorchSeq2VecWrapper` to clarify that a binary `mask` is required rather than sequence lengths.

### Changed

7 changes: 4 additions & 3 deletions allennlp/modules/seq2vec_encoders/pytorch_seq2vec_wrapper.py
@@ -35,9 +35,10 @@ class PytorchSeq2VecWrapper(Seq2VecEncoder):
This is what PyTorch's RNNs look like - just make sure your class looks like those, and it
should work.

Note that we *require* you to pass sequence lengths when you call this module, to avoid subtle
bugs around masking. If you already have a `PackedSequence` you can pass `None` as the
second parameter.
Note that we *require* you to pass a binary `mask` of shape
(batch_size, sequence_length) when you call this module, to avoid subtle
bugs around masking. If you already have a `PackedSequence` you can pass
`None` as the second parameter.
"""

def __init__(self, module: torch.nn.modules.RNNBase) -> None:
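For reference, here is a minimal usage sketch of the calling convention the updated docstring describes. It is not part of this diff, and the module sizes and batch shapes are illustrative assumptions:

```python
# Minimal sketch of calling PytorchSeq2VecWrapper with the required binary
# mask of shape (batch_size, sequence_length). Sizes here are illustrative.
import torch
from allennlp.modules.seq2vec_encoders import PytorchSeq2VecWrapper

# Wrap a batch-first LSTM so inputs are (batch_size, sequence_length, input_size).
lstm = torch.nn.LSTM(input_size=8, hidden_size=16, batch_first=True)
encoder = PytorchSeq2VecWrapper(lstm)

inputs = torch.randn(2, 5, 8)  # (batch_size=2, sequence_length=5, input_size=8)
mask = torch.tensor([[True, True, True, False, False],
                     [True, True, True, True, True]])  # (batch_size, sequence_length)

encoding = encoder(inputs, mask)  # final hidden states, shape (2, 16)
```

Passing the mask (rather than raw sequence lengths) lets the wrapper derive the lengths, pack the sequences itself, and return the hidden state at each sequence's true final step.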