Description
The input text length is greater than the maximum length (8376 > 8192) and has been truncated!
The input text length is greater than the maximum length (8607 > 8192) and has been truncated!
The input text length is greater than the maximum length (8295 > 8192) and has been truncated!
The input text length is greater than the maximum length (8733 > 8192) and has been truncated!
The input text length is greater than the maximum length (8592 > 8192) and has been truncated!
The input text length is greater than the maximum length (9489 > 8192) and has been truncated!
The input text length is greater than the maximum length (8283 > 8192) and has been truncated!
The input text length is greater than the maximum length (8330 > 8192) and has been truncated!
The input text length is greater than the maximum length (9055 > 8192) and has been truncated!
Traceback (most recent call last):
File "/root/siton-data-0553377b2d664236bad5b5d0ba8aa419/workspace/FlashRAG/examples/methods/run_exp.py", line 650, in <module>
func(args)
File "/root/siton-data-0553377b2d664236bad5b5d0ba8aa419/workspace/FlashRAG/examples/methods/run_exp.py", line 456, in ircot
result = pipeline.run(test_data)
^^^^^^^^^^^^^^^^^^^^^^^
File "/root/siton-data-0553377b2d664236bad5b5d0ba8aa419/workspace/FlashRAG/flashrag/pipeline/active_pipeline.py", line 1040, in run
self.run_batch(dataset)
File "/root/siton-data-0553377b2d664236bad5b5d0ba8aa419/workspace/FlashRAG/flashrag/pipeline/active_pipeline.py", line 986, in run_batch
new_thoughts_batch = self.generator.generate(input_prompts, stop=['.', '\n'])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/siton-data-0553377b2d664236bad5b5d0ba8aa419/workspace/FlashRAG/flashrag/generator/generator.py", line 258, in generate
outputs = self.model.generate(input_list, sampling_params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/envs/flashrag/lib/python3.11/site-packages/vllm/
ValueError: The decoder prompt (length 8192) is longer than the maximum model length of 8192. Make sure that `max_model_len` is no smaller than the number of text tokens.
Changing max_len to 1024 does not help either.
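Judging from the log above, the prompts are being truncated to the 8192-token limit, so the prompt length ends up exactly equal to `max_model_len`, leaving no room for any generated tokens, which is why vLLM rejects it. One possible workaround is to trim the tokenized prompt so that the prompt plus a generation budget stays within the model window. This is only a sketch; `fit_prompt_to_window` and its parameters are hypothetical helpers, not part of FlashRAG or vLLM:

```python
def fit_prompt_to_window(token_ids, max_model_len=8192, max_new_tokens=256):
    """Trim a tokenized prompt so that len(prompt) + max_new_tokens <= max_model_len.

    Keeps the tail of the prompt, on the assumption that the most relevant
    retrieval context and the question sit at the end. Hypothetical helper,
    not a FlashRAG or vLLM API.
    """
    budget = max_model_len - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens must be smaller than max_model_len")
    if len(token_ids) <= budget:
        return token_ids
    # Drop tokens from the front, keeping the most recent `budget` tokens.
    return token_ids[-budget:]
```

If the installed vLLM version supports it, passing `truncate_prompt_tokens` in `SamplingParams` may achieve the same effect without manual trimming; check the vLLM version's documentation before relying on it.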