
Spiking Neural Network as Adaptive Event Stream Slicer (NeurIPS'24)

Links: Paper PDF · Proceedings · Supplementary · Hugging Face Model Hub


Installation

The environment is largely adapted from pytracking; if you run into any issues, please carefully follow their official guidelines.

Install dependencies

  • Create and activate a conda environment
    conda create -n spikeslicer python=3.9 -y
    conda activate spikeslicer
  • Install packages
    pip install -r requirements.txt
    pip install tb-nightly
    conda install cpython  

Note: if an error occurs when installing tb-nightly, run pip install tb-nightly -i https://mirrors.aliyun.com/pypi/simple after the other packages are installed.

Dataset

Please prepare FE108 on your device and set its path in fe108_dir of ltr/admin/local.py and pytracking/evaluation/local.py.
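
Assuming these local.py files follow pytracking's EnvironmentSettings convention, the relevant entry looks roughly like this (a sketch, not the exact file contents):

# ltr/admin/local.py (pytracking/evaluation/local.py has an analogous fe108_dir setting)
class EnvironmentSettings:
    def __init__(self):
        # Root directory of the FE108 dataset on your machine
        self.fe108_dir = '/path/to/FE108'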

Replace the txt files in the dataset directory with our txt files, which are split into train_all (ANN pretraining), train (ANN training), val (SNN training), and test (evaluation) sets:

cp dataset_txts/* your_fe108_directory/

Then convert the raw event data from aedat4 format into npy format, which is more convenient to load:

python event2frame.py

Note that this script uses a default group number of 5, which divides each event stream into 5 frames with equal time intervals under the voxel grid event representation.
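
For intuition, the equal-time slicing works roughly as follows. This is a simplified sketch rather than the repository's event2frame.py; the (t, x, y, polarity) array layout, sensor resolution, and function name are assumptions:

import numpy as np

def events_to_frames(events, num_groups=5, height=260, width=346):
    """Split an event stream into num_groups equal-time slices and
    accumulate each slice into a 2-D frame (a simple voxel-grid-style
    representation). events is an (N, 4) array of (t, x, y, polarity)."""
    t = events[:, 0]
    edges = np.linspace(t.min(), t.max(), num_groups + 1)
    frames = np.zeros((num_groups, height, width), dtype=np.float32)
    for i in range(num_groups):
        if i < num_groups - 1:
            mask = (t >= edges[i]) & (t < edges[i + 1])
        else:
            mask = (t >= edges[i]) & (t <= edges[i + 1])
        xs = events[mask, 1].astype(int)
        ys = events[mask, 2].astype(int)
        ps = np.where(events[mask, 3] > 0, 1.0, -1.0)
        np.add.at(frames[i], (ys, xs), ps)  # signed event counts per pixel
    return frames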

Train

Replace "SpikeSlicer" in workspace_dir of ltr/admin/local.py with the path to your own code directory.
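
Continuing the hypothetical ltr/admin/local.py sketch above, the entry to change is roughly:

class EnvironmentSettings:
    def __init__(self):
        # Your own code directory, replacing the default "SpikeSlicer" placeholder
        self.workspace_dir = '/path/to/your/SpikeSlicer'
        self.fe108_dir = '/path/to/FE108'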

To pretrain a model, use a command like the following:

cd ltr
python run_training.py --train_module transt --train_name transt

To train an SNN along with the ANN, first enter the path of the checkpoint saved in the pretraining stage into the corresponding file under ltr/train_settings. Then run the spikeslicer version, for example:

cd ltr
python run_training.py --train_module transt --train_name spikeslicer_transt
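
Where exactly the checkpoint path goes depends on the settings file under ltr/train_settings; a purely hypothetical excerpt (the attribute name and structure may differ in the actual file):

# Hypothetical excerpt of ltr/train_settings/transt/spikeslicer_transt.py
def run(settings):
    # Path to the ANN checkpoint saved by the pretraining stage
    settings.pretrained_model_path = '/path/to/checkpoints/transt_pretrained.pth.tar'
    # ... remaining SNN + ANN joint-training configuration ...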

Test

Step 1: Generate the tracking results.

Activate the conda environment and run the script pytracking/run_tracker_fe108.py to generate tracking results with different SNN types.

cd pytracking
python run_tracker_fe108.py --tracker_name transt --tracker_param transt50 --trained_model MODEL/WEIGHT/PATH --eval_save_dir SAVE/BBOX/PATH --event_representation frame --snn_type small

Step 2: Calculate the quantitative results.

Based on the tracking files from step 1, we can now compute the quantitative results, e.g., RSR, OP and RPR.

cd notebooks
python eval_results.py --tracker_name transt --tracker_param transt50 --eval_save_dir SAVE/BBOX/PATH --mode "all"
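
For reference, RSR, OP, and RPR are commonly defined following standard single-object-tracking conventions (area under the success curve, overlap precision at IoU > 0.5, and precision at a 20-pixel center-error threshold). A rough, repository-independent sketch of how such metrics are computed from predicted and ground-truth boxes (not the code in eval_results.py):

import numpy as np

def box_iou(pred, gt):
    """IoU between (x, y, w, h) boxes, computed row-wise."""
    x1 = np.maximum(pred[:, 0], gt[:, 0])
    y1 = np.maximum(pred[:, 1], gt[:, 1])
    x2 = np.minimum(pred[:, 0] + pred[:, 2], gt[:, 0] + gt[:, 2])
    y2 = np.minimum(pred[:, 1] + pred[:, 3], gt[:, 1] + gt[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    union = pred[:, 2] * pred[:, 3] + gt[:, 2] * gt[:, 3] - inter
    return inter / np.maximum(union, 1e-9)

def summarize(pred, gt):
    """Approximate success (AUC), overlap precision, and precision scores."""
    ious = box_iou(pred, gt)
    centers_p = pred[:, :2] + pred[:, 2:] / 2
    centers_g = gt[:, :2] + gt[:, 2:] / 2
    errs = np.linalg.norm(centers_p - centers_g, axis=1)
    success = np.mean([np.mean(ious > t) for t in np.linspace(0, 1, 21)])  # ~RSR
    op = np.mean(ious > 0.5)                                               # ~OP
    precision = np.mean(errs <= 20)                                        # ~RPR
    return success, op, precision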

We also provide example experimental results reported in the article (the test logs and weights of TransT can be found in the link), which can be evaluated through:

cd notebooks
python eval_results.py --tracker_name transt --tracker_param transt50 --eval_save_dir "TransT_SpikeSlicer_Small_test_log.zip" --mode "all"


Acknowledgement

Our code is generally built upon pytracking, Transformer Tracking, and spikingjelly. We thank all these authors for their nicely open-sourced code and their great contributions to the community.

For any help or issues with this project, please contact Jiahang Cao or Mingyuan Sun.

Citation

If you find our work useful, please consider citing:

@inproceedings{cao2024spiking,
  title={Spiking Neural Network as Adaptive Event Stream Slicer},
  author={Cao, Jiahang and Sun, Mingyuan and Wang, Ziqing and Cheng, Hao and Zhang, Qiang and Xu, Renjing and others},
  booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems},
  year={2024}
}
