
Update open-clip-torch requirement from <2.26.1,>=2.23.0 to >=2.23.0,<2.29.1 #2405


Conversation

dependabot[bot] (Contributor) commented on behalf of github · Nov 4, 2024

Updates the requirements on open-clip-torch to permit the latest version.
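Concretely, the change widens the upper bound of the version specifier. A minimal before/after sketch of the requirement line (the exact file and surrounding formatting in this repository may differ):

    # before
    open-clip-torch>=2.23.0,<2.26.1
    # after
    open-clip-torch>=2.23.0,<2.29.1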

Changelog

Sourced from open-clip-torch's changelog.

2.24.0

  • Fix missing space in error message
  • use model flag for normalizing embeddings
  • init logit_bias for non siglip pretrained models
  • Fix logit_bias load_checkpoint addition
  • Make CoCa model match CLIP models for logit scale/bias init
  • Fix missing return of "logit_bias" in CoCa.forward
  • Add NLLB-CLIP with SigLIP models
  • Add get_logits method and NLLB tokenizer (see the usage sketch after this list)
  • Remove the empty file src/open_clip/generation_utils.py
  • Update params.py: "BatchNorm" -> "LayerNorm" in the description string for "--lock-text-freeze-layer-norm"
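The get_logits method noted above landed in 2.24.0. A minimal usage sketch, assuming the standard open_clip API; the model name, pretrained tag, and image path are illustrative placeholders:

    import torch
    from PIL import Image
    import open_clip

    # Any open_clip model/pretrained pair should work; these are examples.
    model, _, preprocess = open_clip.create_model_and_transforms(
        "ViT-B-32", pretrained="laion2b_s34b_b79k"
    )
    tokenizer = open_clip.get_tokenizer("ViT-B-32")
    model.eval()

    image = preprocess(Image.open("example.jpg")).unsqueeze(0)
    text = tokenizer(["a photo of a cat", "a photo of a dog"])

    with torch.no_grad():
        # get_logits returns per-image and per-text logits with the logit
        # scale (and logit_bias, where the model defines one) applied.
        logits_per_image, logits_per_text = model.get_logits(image, text)
        probs = logits_per_image.softmax(dim=-1)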

2.23.0

  • Add CLIPA-v2 models
  • Add SigLIP models
  • Add MetaCLIP models
  • Add NLLB-CLIP models
  • CLIPA train code
  • Minor changes/fixes
    • Remove protobuf version limit
    • Stop checking model name when loading CoCa models
    • Log native wandb step
    • Use bool instead of long masks

2.21.0

  • Add SigLIP loss + training support
  • Add more DataComp models (B/16, B/32 and B/32@256)
  • Update default num workers
  • Update CoCa generation for transformers>=4.31
  • PyTorch 2.0 state_dict() compatibility fix for compiled models
  • Fix padding in ResizeMaxSize
  • Convert JIT model on state dict load for pretrained='filename…'
  • Other minor changes and fixes (typos, README, dependencies, CI)

2.20.0

  • Add EVA models
  • Support serial worker training
  • Fix Python 3.7 compatibility

2.19.0

  • Add DataComp models

2.18.0

  • Enable int8 inference without .weight attribute

... (truncated)

Commits
  • 82d7496 Release 2.29.0
  • aedd550 Missed hf_hub entries for two of the metaclip weights (ViT-B-32)
  • d11e54a Improve cache_dir behaviour
  • 1b01224 Remove safeglobals add, not worth having with all pretrained weights on hub a...
  • 84f7d2f All default pretrained weights pushed to HF hub, stragglers uploaded to timm ...
  • 427c434 Release 2.28.0
  • 13ee1d7 Tweak comments
  • cd15a7b Add safeglobals to allow metaclip models to load with weights_only=True, add ...
  • c82349a Move device check ahead of dist check
  • a49469a A few more distributed device handling tweaks
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Updates the requirements on [open-clip-torch](https://github.com/mlfoundations/open_clip) to permit the latest version.
- [Release notes](https://github.com/mlfoundations/open_clip/releases)
- [Changelog](https://github.com/mlfoundations/open_clip/blob/main/HISTORY.md)
- [Commits](mlfoundations/open_clip@v2.23.0...v2.29.0)

---
updated-dependencies:
- dependency-name: open-clip-torch
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
dependabot[bot] requested a review from samet-akcay as a code owner · November 4, 2024 11:21
dependabot[bot] added the "Dependencies" (Pull requests that update a dependency file) and "python" (Pull requests that update Python code) labels · Nov 4, 2024
samet-akcay closed this · Nov 5, 2024
dependabot[bot] (Contributor, Author) commented on behalf of github · Nov 5, 2024

OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting @dependabot ignore this major version or @dependabot ignore this minor version. You can also ignore all major, minor, or patch releases for a dependency by adding an ignore condition with the desired update_types to your config file.
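For example, an ignore condition in .github/dependabot.yml might look like the sketch below; the schedule and directory values are placeholders and should match whatever this repository already configures:

    # .github/dependabot.yml (illustrative values)
    version: 2
    updates:
      - package-ecosystem: "pip"
        directory: "/"
        schedule:
          interval: "weekly"
        ignore:
          # Skip minor and patch bumps for open-clip-torch;
          # major-version updates will still open PRs.
          - dependency-name: "open-clip-torch"
            update-types:
              - "version-update:semver-minor"
              - "version-update:semver-patch"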

If you change your mind, just re-open this PR and I'll resolve any conflicts on it.

dependabot[bot] deleted the dependabot/pip/open-clip-torch-gte-2.23.0-and-lt-2.29.1 branch · November 5, 2024 07:56