This repository was archived by the owner on Mar 9, 2025. It is now read-only.

FileNotFoundError: Couldn't find a module script at /content/wer/wer.py. Module 'wer' doesn't exist on the Hugging Face Hub either. #230

Open
Gyimah3 opened this issue Feb 26, 2025 · 0 comments · May be fixed by #231

Comments


Gyimah3 commented Feb 26, 2025

I got this error while fine-tuning the model as specified in the README:

trainer.train(
    output_dir="Gyimah3/whisper-small-finetuned",
    warmup_steps=10,
    max_steps=500,
    learning_rate=0.0001,
    lr_scheduler_type="constant_with_warmup",
    per_device_train_batch_size=16,  # adjust based on available RAM; increase if more is available
    per_device_eval_batch_size=16,   # adjust based on available RAM; increase if more is available
    optim="adamw_bnb_8bit",
    save_steps=100,
    logging_steps=100,
    eval_steps=100,
    gradient_checkpointing=True,
)

Reading metadata...: 1926it [00:00, 3807.73it/s]
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Reading metadata...: 660it [00:00, 1316.56it/s]
wandb: WARNING The `run_name` is currently set to the same value as `TrainingArguments.output_dir`. If this was not intended, please specify a different run name by setting the `TrainingArguments.run_name` parameter.
wandb: Currently logged in as: evelyngyim1111 (evelyngyim1111-inlaks). Use `wandb login --relogin` to force relogin
wandb version 0.19.7 is available! To upgrade, please run: $ pip install wandb --upgrade
Tracking run with wandb version 0.17.4
Run data is saved locally in /content/wandb/run-20250226_110645-bb86w4fx
Reading metadata...: 1926it [00:00, 2775.14it/s]
 [101/500 03:57 < 15:56, 0.42 it/s, Epoch 0.20/9223372036854775807]
Step | Training Loss | Validation Loss
-- | -- | --

Reading metadata...: 660it [00:00, 934.74it/s]
You have passed task=transcribe, but also have set `forced_decoder_ids` to [[1, None], [2, 50359]] which creates a conflict. `forced_decoder_ids` will be ignored in favor of task=transcribe.
The attention mask is not set and cannot be inferred from input because pad token is same as eos token. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results.
---------------------------------------------------------------------------
FileNotFoundError                         Traceback (most recent call last)
<ipython-input-7-ab54e5d8e014> in <cell line: 0>()
----> 1 trainer.train(
      2     output_dir="Gyimah3/whisper-small-finetuned",
      3     warmup_steps=10,
      4     max_steps=500,
      5     learning_rate=0.0001,


13 frames
/usr/local/lib/python3.11/dist-packages/evaluate/loading.py in evaluation_module_factory(path, module_type, revision, download_config, download_mode, force_local_path, dynamic_modules_path, **download_kwargs)
    679             if not isinstance(e1, (ConnectionError, FileNotFoundError)):
    680                 raise e1 from None
--> 681             raise FileNotFoundError(
    682                 f"Couldn't find a module script at {relative_to_absolute_path(combined_path)}. "
    683                 f"Module '{path}' doesn't exist on the Hugging Face Hub either."

FileNotFoundError: Couldn't find a module script at /content/wer/wer.py. Module 'wer' doesn't exist on the Hugging Face Hub either.
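For reference, the failure seems to come from the metric-loading step rather than from the training arguments: training stopped right after step 100, which matches eval_steps=100. Below is a minimal sketch that should reproduce the same error on its own, assuming the trainer loads the WER metric through the evaluate package:

import evaluate

# This is the call that fails in the traceback above: "wer" is resolved either
# from a local script (here /content/wer/wer.py) or downloaded from the Hugging Face Hub.
wer_metric = evaluate.load("wer")

# If loading succeeds, computing WER on a toy pair should return 0.0
print(wer_metric.compute(predictions=["hello world"], references=["hello world"]))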

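Until a fix is merged, one possible interim workaround (my assumption, not the repository's documented approach) is to compute WER with the jiwer package directly instead of loading the metric through evaluate, for example:

import jiwer

def compute_wer(predictions, references):
    # jiwer computes word error rate directly, with no metric script to
    # resolve locally or on the Hub; result is expressed as a percentage.
    return {"wer": 100 * jiwer.wer(references, predictions)}

# Example: one substituted word out of two reference words -> 50.0
print(compute_wer(predictions=["hello world"], references=["hello word"]))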

KevKibe self-assigned this on Feb 27, 2025
KevKibe linked a pull request (#231) on Feb 27, 2025 that will close this issue