
Commit a038c01

Update transformers requirement from <2.11,>=2.9 to >=2.9,<2.12 (#4315)
* Update transformers requirement from <2.11,>=2.9 to >=2.9,<2.12

  Updates the requirements on [transformers](https://github.com/huggingface/transformers) to permit the latest version.
  - [Release notes](https://github.com/huggingface/transformers/releases)
  - [Commits](huggingface/transformers@v2.9.0...v2.11.0)

  Signed-off-by: dependabot-preview[bot] <[email protected]>

* try fix

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
Co-authored-by: epwalsh <[email protected]>
Parent: 345459e

2 files changed: +2 −2

setup.py (+1 −1)

```diff
@@ -64,7 +64,7 @@
         "scikit-learn",
         "scipy",
         "pytest",
-        "transformers>=2.9,<2.11",
+        "transformers>=2.9,<2.12",
         "jsonpickle",
         "dataclasses;python_version<'3.7'",
         "filelock>=3.0,<3.1",
```

tests/modules/token_embedders/pretrained_transformer_embedder_test.py (+1 −1)

```diff
@@ -214,7 +214,7 @@ def test_unfold_long_sequences(self):
         assert (unfolded_embeddings_out == unfolded_embeddings).all()

     def test_encoder_decoder_model(self):
-        token_embedder = PretrainedTransformerEmbedder("bart-large", sub_module="encoder")
+        token_embedder = PretrainedTransformerEmbedder("facebook/bart-large", sub_module="encoder")
         token_ids = torch.LongTensor([[1, 2, 3], [2, 3, 4]])
         mask = torch.ones_like(token_ids).bool()
         token_embedder(token_ids, mask)
```
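The test fix reflects that newer transformers releases resolve BART under the namespaced model ID `facebook/bart-large`, which appears to be why the bare `bart-large` was replaced here. A minimal standalone sketch of the embedder call exercised by the test (the tensor values mirror the test; downloading the pretrained weights is assumed):

```python
import torch
from allennlp.modules.token_embedders import PretrainedTransformerEmbedder

# Load only the encoder half of the BART encoder-decoder model via
# sub_module="encoder", using the namespaced model ID.
embedder = PretrainedTransformerEmbedder("facebook/bart-large", sub_module="encoder")

token_ids = torch.LongTensor([[1, 2, 3], [2, 3, 4]])  # (batch_size=2, seq_len=3)
mask = torch.ones_like(token_ids).bool()

# Output shape: (2, 3, embedder.get_output_dim())
embeddings = embedder(token_ids, mask)
```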
