
Commit c472b59

stas00 authored and amyeroberts committed
1 parent 1e7062a commit c472b59

File tree

2 files changed: +2 -2 lines changed


src/transformers/generation_utils.py

Lines changed: 1 addition & 1 deletion
@@ -1200,7 +1200,7 @@ def generate(
         input_ids_seq_length = input_ids.shape[-1]
         if max_length is None and max_new_tokens is None:
             warnings.warn(
-                "Neither `max_length` nor `max_new_tokens` have been set, `max_length` will default to "
+                "Neither `max_length` nor `max_new_tokens` has been set, `max_length` will default to "
                 f"{self.config.max_length} (`self.config.max_length`). Controlling `max_length` via the config is "
                 "deprecated and `max_length` will be removed from the config in v5 of Transformers -- we recommend "
                 "using `max_new_tokens` to control the maximum length of the generation.",
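The warning in this hunk fires only when neither length control is set, with the (deprecated) config value as the fallback. A minimal standalone sketch of that resolution logic, using a hypothetical helper `resolve_max_length` and a stand-in config constant rather than the library's actual internals:

```python
import warnings

CONFIG_MAX_LENGTH = 20  # stands in for self.config.max_length

def resolve_max_length(max_length=None, max_new_tokens=None, input_ids_seq_length=0):
    """Sketch of the precedence the warning describes: explicit arguments win,
    otherwise fall back to the deprecated config default (with a warning)."""
    if max_length is None and max_new_tokens is None:
        warnings.warn(
            "Neither `max_length` nor `max_new_tokens` has been set, "
            f"`max_length` will default to {CONFIG_MAX_LENGTH}.",
            UserWarning,
        )
        return CONFIG_MAX_LENGTH
    if max_new_tokens is not None:
        # max_new_tokens counts tokens generated on top of the prompt
        return input_ids_seq_length + max_new_tokens
    return max_length
```

Calling `resolve_max_length()` with no arguments emits the warning and returns the config default, while passing `max_new_tokens=5` with a prompt length of 3 yields a total length of 8.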

src/transformers/models/fsmt/modeling_fsmt.py

Lines changed: 1 addition & 1 deletion
@@ -220,7 +220,7 @@
         input_ids (`torch.LongTensor` of shape `(batch_size, sequence_length)`):
             Indices of input sequence tokens in the vocabulary.

-            IIndices can be obtained using [`FSTMTokenizer`]. See [`PreTrainedTokenizer.encode`] and
+            Indices can be obtained using [`FSTMTokenizer`]. See [`PreTrainedTokenizer.encode`] and
             [`PreTrainedTokenizer.__call__`] for details.

             [What are input IDs?](../glossary#input-ids)
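The docstring this hunk fixes describes `input_ids` as vocabulary indices produced by a tokenizer's encode step. A toy sketch of that idea, with a hypothetical whitespace tokenizer and vocabulary (real tokenizers such as the FSMT one use learned subword vocabularies, not whitespace splits):

```python
# Hypothetical minimal vocabulary; index 0 is the unknown-token fallback.
VOCAB = {"<unk>": 0, "hello": 1, "world": 2}

def encode(text):
    """Map each whitespace-separated token to its vocabulary index."""
    return [VOCAB.get(token, VOCAB["<unk>"]) for token in text.lower().split()]

print(encode("Hello world"))  # → [1, 2]
```

Tokens outside the vocabulary map to the `<unk>` index, so `encode("Hello there")` yields `[1, 0]`.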
