
[ICME 2025 Oral] KTDA

Knowledge Transfer and Domain Adaptation for Fine-Grained Remote Sensing Image Segmentation

arXiv Paper | Project Page | Hugging Face Models | Hugging Face Datasets

(Figure: KTDA framework overview)


Installation

  1. Clone the Repository

git clone https://github.com/XavierJiezou/KTDA.git
cd KTDA

  2. Install Dependencies

You can either set up the environment manually or use our pre-configured environment for convenience:

  • Option 1: Manual Installation

Ensure you are using Python 3.8 or higher, then install the required dependencies:

pip install -r requirements.txt  
  • Option 2: Use Pre-configured Environment

We provide a pre-configured environment (env.tar.gz) hosted on Hugging Face. Download it from there and follow the instructions on the page to set up and activate the environment.

Once you have downloaded env.tar.gz, extract and activate it with the following commands:

mkdir -p envs
tar -xzf env.tar.gz -C envs
source envs/bin/activate
conda-unpack
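
Either way, a quick sanity check confirms the environment is usable before training. The snippet below is a minimal sketch that assumes the project builds on PyTorch and MMSegmentation (as the configs/ and tools/ layout suggests):

# check_env.py -- minimal environment sanity check (assumes PyTorch + MMSegmentation)
import torch

print("torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

try:
    import mmseg  # assumed from the configs/tools layout
    print("mmseg:", mmseg.__version__)
except ImportError:
    print("mmseg not found -- check requirements.txt or the packed environment")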

Prepare Data

We have open-sourced all datasets used in the paper, which are hosted on Hugging Face Datasets. Please follow the instructions on the dataset page to download the data.
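
If you prefer to script the download, the huggingface_hub client can fetch the whole dataset repository into ./data; the repo_id below is a placeholder, so substitute the id shown on the Hugging Face dataset page:

# download_data.py -- fetch the datasets into ./data (repo_id is a placeholder)
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="XavierJiezou/ktda-datasets",  # placeholder: use the id from the dataset page
    repo_type="dataset",
    local_dir="data",
)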

After downloading, organize the dataset as follows:

KTDA
├── ...
├── data
│   ├── grass
│   │   ├── ann_dir
│   │   │   ├── train
│   │   │   ├── val
│   │   ├── img_dir
│   │   │   ├── train
│   │   │   ├── val
│   ├── l8_biome (cloud)
│   │   ├── ann_dir
│   │   │   ├── train
│   │   │   ├── val
│   │   │   ├── test
│   │   ├── img_dir
│   │   │   ├── train
│   │   │   ├── val
│   │   │   ├── test

Training

Step 1: Modify the Configuration File

After downloading the vision transformer backbone weights from Hugging Face, make sure the checkpoint path is specified correctly in your configuration file.

For example:

# configs/_base_/models/ktda.py
model = dict(
    backbone=dict(
        init_cfg=dict(
            type="Pretrained",
            checkpoint="checkpoints/dinov2-base.pth", # you can set vision transformer models path here
        ),
    ),
   
)

Update the configs directory with your training configuration, or use one of the provided example configurations. You can customize the backbone, dataset paths, and hyperparameters in the configuration file (e.g., configs/ktda/ktda_cloud.py).
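
For instance, a minimal derived config could inherit from a provided one and override only what you need; the file name and values below are illustrative, assuming MMEngine-style config inheritance:

# configs/ktda/ktda_grass_custom.py -- hypothetical derived config (values are illustrative)
_base_ = ["./ktda_grass.py"]

# Point the backbone at a locally downloaded checkpoint.
model = dict(
    backbone=dict(
        init_cfg=dict(type="Pretrained", checkpoint="checkpoints/dinov2-base.pth"),
    ),
)

# Override training hyperparameters as needed.
optim_wrapper = dict(optimizer=dict(lr=1e-4))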

Step 2: Start Training

Use the following command to begin training on the grass dataset:

python tools/train.py configs/ktda/ktda_grass.py

You can also train on the cloud dataset:

python tools/train.py configs/ktda/ktda_cloud.py

Step 3: Resume or Fine-tune

To resume training from a checkpoint or fine-tune using pretrained weights, run:

python tools/train.py configs/ktda/ktda_grass.py --resume-from path/to/checkpoint.pth  
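
Note that resuming restores the full training state (optimizer state, iteration counter), whereas fine-tuning typically loads only the weights. In MMSegmentation-style configs the latter is usually expressed with load_from; a minimal sketch, assuming that convention (the file name is hypothetical):

# configs/ktda/ktda_grass_finetune.py -- hypothetical config for fine-tuning from weights only
_base_ = ["./ktda_grass.py"]

# Initialize from pretrained weights without restoring optimizer state or iteration counter,
# unlike --resume-from, which continues the interrupted run.
load_from = "path/to/checkpoint.pth"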

Evaluation

All model weights used in the paper have been open-sourced and are available on Hugging Face Models.

Use the following command to evaluate the trained model:

python tools/test.py configs/ktda/ktda_grass.py path/to/checkpoint.pth  

Alternatively, you can find the evaluation results in the eval_result folder within this repository.
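
For a quick qualitative check on a single image, the high-level MMSegmentation APIs can also be used. This sketch assumes MMSegmentation 1.x (for 0.x the equivalents are init_segmentor/inference_segmentor), and the paths are illustrative:

# Single-image inference sketch (assumes MMSegmentation 1.x APIs; paths are illustrative)
from mmseg.apis import init_model, inference_model

model = init_model("configs/ktda/ktda_grass.py", "path/to/checkpoint.pth", device="cuda:0")
result = inference_model(model, "path/to/image.png")
print(result.pred_sem_seg.data.shape)  # predicted segmentation mask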

Visualization

We have uploaded the visualization results of various models on the Grass and Cloud datasets to Hugging Face. You can view them at the following link:

Hugging Face Visualization Results

Citation

If you use our code or models in your research, please cite our paper:

@misc{ktda,
      title={Knowledge Transfer and Domain Adaptation for Fine-Grained Remote Sensing Image Segmentation}, 
      author={Shun Zhang and Xuechao Zou and Kai Li and Congyan Lang and Shiying Wang and Pin Tao and Tengfei Cao},
      year={2024},
      eprint={2412.06664},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2412.06664}, 
}

Acknowledgments

