
DEIO: Deep Event Inertial Odometry

Abstract

Event cameras show great potential for visual odometry (VO) in handling challenging situations, such as fast motion and high dynamic range. Despite this promise, the sparse and motion-dependent characteristics of event data continue to limit the performance of feature-based or direct-based data association methods in practical applications. To address these limitations, we propose Deep Event Inertial Odometry (DEIO), the first monocular learning-based event-inertial framework, which combines a learning-based method with traditional nonlinear graph-based optimization. Specifically, an event-based recurrent network is adopted to provide accurate and sparse associations of event patches over time. DEIO further integrates it with the IMU to recover up-to-scale pose and provide robust state estimation. The Hessian information derived from the learned differentiable bundle adjustment (DBA) is utilized to optimize the co-visibility factor graph, which tightly incorporates event patch correspondences and IMU pre-integration within a keyframe-based sliding window. Comprehensive validations demonstrate that DEIO achieves superior performance on 10 challenging public benchmarks compared with more than 20 state-of-the-art methods.
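As background for the IMU term described above, the sketch below shows what IMU pre-integration between two keyframes computes: relative rotation, velocity, and position deltas accumulated from raw gyroscope and accelerometer samples, independent of the keyframe's global state. This is a minimal NumPy illustration (constant bias ignored, no noise propagation), not the repository's implementation:

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

def exp_so3(w):
    """Rodrigues' formula: rotation matrix from a rotation vector."""
    theta = np.linalg.norm(w)
    if theta < 1e-9:
        return np.eye(3) + skew(w)
    K = skew(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * K @ K

def preintegrate(gyro, acc, dt):
    """Accumulate delta rotation dR, delta velocity dv, and delta
    position dp from IMU samples taken between two keyframes."""
    dR = np.eye(3)
    dv = np.zeros(3)
    dp = np.zeros(3)
    for w, a in zip(gyro, acc):
        dp += dv * dt + 0.5 * dR @ a * dt**2
        dv += dR @ a * dt
        dR = dR @ exp_so3(w * dt)
    return dR, dv, dp
```

In a factor graph (e.g. via GTSAM, as used here), these deltas form a single relative-motion factor between consecutive keyframe states, so raw IMU samples never have to be re-integrated when the linearization point changes.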

Update log

  • README Upload (2024/10/28)
  • Paper Upload (2024/11/06)
  • Estimated Trajectories Upload (2024/11/07)
  • Code Upload (2025/07/19)
  • More Raw Results of VECtor Dataset (2025/07/20)

Setup and Installation

# for cuda 11.7
conda env create -f environment.yml   
conda activate DEIO
# conda remove --name DEIO --all

pip install .
pip install numpy-quaternion==2022.4.3

# install GTSAM
cd thirdparty/gtsam
mkdir build
cd build
cmake .. -DGTSAM_BUILD_PYTHON=1 -DGTSAM_PYTHON_VERSION=3.10.11
make python-install

Run an Example

conda activate DEIO

# example for the davis240c dataset
CUDA_VISIBLE_DEVICES=0 PYTHONPATH={YOUR_PATH}/DEIO python script/eval_deio/davis240c.py \
    --inputdir=/media/lfl-data2/davis240c \
    --config=config/davis240c.yaml \
    --val_split=script/splits/davis240c/davis240c_val.txt \
    --enable_event \
    --network={YOUR_PATH}/DEVO/DEVO.pth  \
    --plot \
    --save_trajectory \
    --trials=5

The results will be saved in the `results` directory.

Using Our Results as Comparison

For convenient comparison, we release the estimated trajectories of DEIO in TUM format in the `estimated_trajectories` directory. We also provide sample code for quantitative and qualitative evaluation using the evo package.
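For reference, the headline number evo reports for a TUM-format trajectory is the absolute trajectory error (ATE) after rigid alignment. The NumPy sketch below (Umeyama/Kabsch alignment without scale) is illustrative of that computation, not the exact evo implementation:

```python
import numpy as np

def align_rigid(est, gt):
    """Least-squares rigid (rotation + translation) alignment of
    estimated positions (N x 3) onto ground-truth positions."""
    mu_e, mu_g = est.mean(0), gt.mean(0)
    H = (est - mu_e).T @ (gt - mu_g)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: keep det(R) = +1
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = mu_g - R @ mu_e
    return est @ R.T + t

def ate_rmse(est, gt):
    """Root-mean-square absolute trajectory error after alignment."""
    err = align_rigid(est, gt) - gt
    return np.sqrt((err ** 2).sum(axis=1).mean())
```

This corresponds roughly to `evo_ape tum gt.txt est.txt -a` (translation part, with alignment enabled).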

Acknowledgement

  • This work is based on DPVO, DEVO, DROID-SLAM, DBA-Fusion, and GTSAM.
  • If you find this work helpful in your research, a star or a citation of our work would be the best affirmation for us. 😊
@inproceedings{GWPHKU:DEIO,
  title={DEIO: Deep Event Inertial Odometry},
  author={Guan, Weipeng and Lin, Fuling and Chen, Peiyu and Lu, Peng},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision Workshops},
  year={2025}
}

About

(ICCVW2025) Learning-based Event-Inertial Odometry
