🌷 PromptHash: Affinity-Prompted Collaborative Cross-Modal Learning for Adaptive Hashing Retrieval [CVPR 2025]

Overview 🌟

PromptHash (Affinity-Prompted Collaborative Cross-Modal Learning for Adaptive Hashing Retrieval) is a cross-modal hashing method that uses affinity prompting and collaborative cross-modal learning for adaptive hashing retrieval. This repository contains the code associated with the paper:

"PromptHash: Affinity-Prompted Collaborative Cross-Modal Learning for Adaptive Hashing Retrieval" (accepted to CVPR 2025)
Qiang Zou, Shuli Cheng and Jiayi Chen

Usage 🛠️

  1. To begin, clone this repository and navigate to the PromptHash folder:
git clone https://github.com/CCRG-XJU/PromptHash.git
cd PromptHash
  2. Create and activate a conda environment:
conda create -n prompthash python=3.13 -y
conda activate prompthash
  3. Install PyTorch 2.7.0, mamba_ssm, and causal-conv1d:
# Install PyTorch 2.7.0
pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu128

# Install mamba_ssm
# Please refer to https://github.com/state-spaces/mamba for detailed installation instructions
# If you are using a CUDA 12.8 environment, you can download the pre-built whl file from the release page

# Install causal-conv1d
# Please refer to https://github.com/Dao-AILab/causal-conv1d for detailed installation instructions
# If you are using a CUDA 12.8 environment, you can download the pre-built whl file from the release page
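A quick sanity check can confirm that the GPU build of PyTorch and the Mamba extensions import correctly. The Python snippet below is a minimal sketch assuming the packages above were installed; it only prints versions and availability.

# Minimal sanity check for the environment created above (assumed setup).
import torch

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())

# These imports fail if the CUDA extensions were not built for your CUDA version.
import mamba_ssm        # noqa: F401
import causal_conv1d    # noqa: F401

print("mamba_ssm and causal_conv1d imported successfully")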

Data 🗂️

You should generate the following .mat files for each dataset. The directory structure of ./dataset should be:

    dataset
    ├── coco
    │   ├── caption.mat 
    │   ├── index.mat
    │   └── label.mat 
    ├── flickr25k
    │   ├── caption.mat
    │   ├── index.mat
    │   └── label.mat
    └── nuswide
        ├── caption.mat
        ├── index.mat 
        └── label.mat

Please preprocess each dataset into this input format.

The dataset .mat files are already provided as archives in the ./dataset directory; please use the unzip command to extract them.

Additionally, the cleaned datasets (MIRFLICKR-25K, MS-COCO, and NUS-WIDE) used in our experiments are available on pan.baidu.com:

link: https://pan.baidu.com/s/1ZyDTR2IEHlY4xIdLgxtaVA password: kdq7 (Source: CMCL)
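As a quick check of the prepared data, the .mat files can be inspected with SciPy. The sketch below is illustrative only: the dataset path and the variable names inside each file are assumptions (it simply lists whatever keys are present), and files saved in MATLAB v7.3 format would require h5py instead of scipy.io.

# Illustrative sketch: inspect the extracted .mat files (the path is an assumption).
import scipy.io as sio

for name in ("caption", "index", "label"):
    mat = sio.loadmat(f"./dataset/flickr25k/{name}.mat")
    for key, value in mat.items():
        if key.startswith("__"):  # skip MATLAB metadata entries
            continue
        print(f"{name}.mat -> {key}: shape {getattr(value, 'shape', None)}, "
              f"dtype {getattr(value, 'dtype', None)}")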

Training 🏋️‍♂️

After preparing the Python environment, the pretrained MetaCLIP model, and the datasets, you can train the PromptHash model.

See PromptHash_$DATASET$.sh.

Please refer to the instructions in the paper for hyperparameter settings.

Evaluation 📊

See PromptHash_$DATASET$_TEST.sh.

We have uploaded the hash code files generated by our algorithm to the repository. If needed, you can use the hash codes in the test.zip archive directly for verification.
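For reference, cross-modal hashing retrieval is typically evaluated with mean average precision (mAP) over Hamming-distance rankings. The function below is a generic sketch of that metric for binary codes in {-1, +1}, not the repository's own evaluation code; the array names, shapes, and the multi-label relevance criterion are assumptions.

# Generic mAP over Hamming ranking for binary hash codes (illustrative sketch).
import numpy as np

def mean_average_precision(query_codes, db_codes, query_labels, db_labels, topk=None):
    """query_codes: (nq, nbits), db_codes: (nd, nbits), entries in {-1, +1};
    query_labels: (nq, c), db_labels: (nd, c) multi-hot label matrices."""
    nq, nbits = query_codes.shape
    aps = []
    for i in range(nq):
        # Hamming distance obtained from the inner product of +/-1 codes.
        hamming = 0.5 * (nbits - db_codes @ query_codes[i])
        order = np.argsort(hamming)
        if topk is not None:
            order = order[:topk]
        # A database item is relevant if it shares at least one label with the query.
        relevant = (db_labels[order] @ query_labels[i]) > 0
        if not relevant.any():
            continue
        cum_hits = np.cumsum(relevant)
        ranks = np.flatnonzero(relevant) + 1  # 1-indexed positions of the hits
        aps.append((cum_hits[relevant] / ranks).mean())
    return float(np.mean(aps)) if aps else 0.0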

Citation 📜

If you find the PromptHash paper and code useful for your research and applications, please cite using this BibTeX:

@inproceedings{PromptHash,
  title={{PromptHash}: Affinity-Prompted Collaborative Cross-Modal Learning for Adaptive Hashing Retrieval},
  author={Qiang Zou and Shuli Cheng and Jiayi Chen},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2025},
}

Any questions ✉️

If you have any questions, please feel free to contact Shuli Cheng ([email protected]) or Qiang Zou ([email protected]).

Acknowledgements 🌸

This project is based on open_clip and CMCL; special thanks to all the contributors.
