LiftFeat: 3D Geometry-Aware Local Feature Matching


Real-time SuperPoint demonstration (left) compared to LiftFeat (right) on a textureless scene.
- 🎉 New! Training code is now available 🎉
- 🎉 New! The test code and pretrained model have been released 🎉
Introduction
This repository contains the official implementation of the paper:
LiftFeat: 3D Geometry-Aware Local Feature Matching, to be presented at ICRA 2025.
Overview of LiftFeat's architecture

LiftFeat is a lightweight and robust local feature matching network designed to handle challenging scenarios such as drastic lighting changes, low-texture regions, and repetitive patterns. By incorporating 3D geometric cues through surface normals predicted from monocular depth, LiftFeat enhances the discriminative power of 2D descriptors. Our proposed 3D geometry-aware feature lifting module effectively fuses these cues, leading to significant improvements in tasks like relative pose estimation, homography estimation, and visual localization.
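To make the lifting idea concrete, the sketch below shows one plausible way to fuse 2D descriptors with surface-normal features: project the normals into the descriptor space and fuse the two with a small MLP. This is an illustrative PyTorch sketch only; the class name, dimensions, and fusion strategy are assumptions for explanation, not the actual LiftFeat module.

# Illustrative sketch of 3D geometry-aware feature lifting.
# NOT the actual LiftFeat module: layer names, dimensions, and the fusion
# strategy (concatenation + MLP) are assumptions for explanation only.
import torch
import torch.nn as nn

class FeatureLiftingSketch(nn.Module):
    def __init__(self, desc_dim=64, normal_dim=3, hidden_dim=128):
        super().__init__()
        # Project the 3-channel surface normals into the descriptor space
        self.normal_proj = nn.Sequential(
            nn.Linear(normal_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, desc_dim),
        )
        # Fuse 2D descriptors with the projected geometric features
        self.fuse = nn.Sequential(
            nn.Linear(2 * desc_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, desc_dim),
        )

    def forward(self, desc, normals):
        # desc:    (N, desc_dim)  2D descriptors at keypoint locations
        # normals: (N, 3)         surface normals predicted from monocular depth
        geo = self.normal_proj(normals)
        fused = self.fuse(torch.cat([desc, geo], dim=-1))
        # L2-normalize so fused descriptors can be matched by cosine similarity
        return nn.functional.normalize(fused, dim=-1)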
Installation
If you use conda as your virtual environment manager, you can create a new environment with:
git clone https://github.com/lyp-deeplearning/LiftFeat.git
cd LiftFeat
conda create -n LiftFeat python=3.8
conda activate LiftFeat
pip install -r requirements.txt
Usage
Inference
To run LiftFeat on an image pair, simply run:
python demo.py --img1=<reference image> --img2=<query image>
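If you prefer to call LiftFeat from Python rather than through the demo script, the pattern looks roughly like the sketch below. The module path, class name, and method name are assumptions inferred from the demo; please check demo.py for the exact API.

# Hypothetical programmatic usage; the wrapper import, class, and method names
# are assumptions -- consult demo.py for the exact API in your checkout.
import cv2
from models.liftfeat_wrapper import LiftFeat  # assumed wrapper module

img1 = cv2.imread("reference.png")   # reference image
img2 = cv2.imread("query.png")       # query image

matcher = LiftFeat()                              # assumed to load the pretrained weights
mkpts1, mkpts2 = matcher.match_liftfeat(img1, img2)  # assumed method name
print(f"{len(mkpts1)} matches found")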
Training
To train LiftFeat as described in the paper, you will need the MegaDepth dataset and the COCO_20k subset of COCO2017, as described in the paper XFeat: Accelerated Features for Lightweight Image Matching. You can obtain the full COCO2017 train data at https://cocodataset.org/. However, we make a subset of COCO available for convenience; we simply selected 20k images according to image resolution. Please check the COCO terms of use before using the data.
To reproduce the training setup from the paper, please follow these steps:
- Download COCO_20k containing a subset of COCO2017;
- Download the MegaDepth dataset. You can follow the LoFTR instructions; we use the same data organization as LoFTR. Then put the MegaDepth indices inside the MegaDepth root folder, following the structure below:
{megadepth_root_path}/train_data/megadepth_indices #indices
{megadepth_root_path}/MegaDepth_v1 #images & depth maps & poses
- Finally, you can start training:
python train.py --megadepth_root_path <path_to>/MegaDepth --synthetic_root_path <path_to>/coco_20k --ckpt_save_path /path/to/ckpts
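Before launching a long training run, it can help to sanity-check that your MegaDepth folder matches the layout above. A minimal sketch (the paths are placeholders, as in the command above):

# Minimal sanity check for the expected MegaDepth layout (paths are placeholders).
import os

megadepth_root = "<path_to>/MegaDepth"
expected = [
    os.path.join(megadepth_root, "train_data", "megadepth_indices"),  # indices
    os.path.join(megadepth_root, "MegaDepth_v1"),                     # images, depth maps, poses
]
for path in expected:
    status = "ok" if os.path.isdir(path) else "MISSING"
    print(f"{status}: {path}")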
Evaluation
All evaluation code is in the evaluation folder. You can download the HPatches dataset following D2-Net and the MegaDepth test dataset following LoFTR.
Download and process HPatch
cd /data
# Download the dataset
wget https://huggingface.co/datasets/vbalnt/hpatches/resolve/main/hpatches-sequences-release.zip
# Extract the dataset
unzip hpatches-sequences-release.zip
# Remove the high-resolution sequences
cd hpatches-sequences-release
rm -rf i_contruction i_crownnight i_dc i_pencils i_whitebuilding v_artisans v_astronautis v_talent
cd <LiftFeat>/data
ln -s /data/hpatches-sequences-release ./HPatch
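After removing the eight high-resolution sequences, the commonly used D2-Net split should contain 108 sequences. A quick check (a sketch assuming the symlink created above):

# Quick check that the pruned HPatches split has the expected 108 sequences
# (116 originals minus the 8 high-resolution ones removed above).
import os

hpatch_root = "data/HPatch"  # relative to the LiftFeat repository root (symlink created above)
seqs = sorted(d for d in os.listdir(hpatch_root)
              if os.path.isdir(os.path.join(hpatch_root, d)))
print(f"{len(seqs)} sequences found (expected 108)")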
Download and process MegaDepth1500
We provide a download link to megadepth_test_1500.
tar xvf <path to megadepth_test_1500.tar>
cd <LiftFeat>/data
ln -s <path to megadepth_test_1500> ./megadepth_test_1500
Homography Estimation
python evaluation/HPatch_evaluation.py
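For reference, HPatches-style homography evaluation typically estimates a homography from the matched keypoints with RANSAC and measures the corner reprojection error against the ground-truth homography. The sketch below shows that standard recipe; it is not necessarily identical to what HPatch_evaluation.py implements.

# Standard HPatches-style homography check: estimate H from matches with RANSAC
# and measure the mean corner error against the ground-truth homography H_gt.
# Illustrative recipe only; HPatch_evaluation.py may differ in details.
import cv2
import numpy as np

def homography_corner_error(mkpts0, mkpts1, H_gt, w, h):
    # mkpts0, mkpts1: (N, 2) float32 matched keypoints; w, h: reference image size
    H_est, _ = cv2.findHomography(mkpts0, mkpts1, cv2.RANSAC, 3.0)
    if H_est is None:
        return np.inf
    corners = np.array([[0, 0], [w, 0], [w, h], [0, h]], dtype=np.float32).reshape(-1, 1, 2)
    warped_est = cv2.perspectiveTransform(corners, H_est)
    warped_gt = cv2.perspectiveTransform(corners, H_gt)
    return float(np.linalg.norm(warped_est - warped_gt, axis=-1).mean())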
Relative Pose Estimation
For Megadepth1500 dataset:
python evaluation/MegaDepth1500_evaluation.py
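Relative pose on MegaDepth-1500 is conventionally recovered from the matches via the essential matrix with known intrinsics, then decomposed into rotation and translation. The sketch below shows that common recipe; it is not necessarily the exact code in MegaDepth1500_evaluation.py.

# Common relative-pose recipe for MegaDepth-1500: essential matrix from matched
# keypoints in normalized camera coordinates, then decompose to R and t.
# Illustrative only; MegaDepth1500_evaluation.py may differ in thresholds/details.
import cv2
import numpy as np

def estimate_relative_pose(mkpts0, mkpts1, K0, K1, ransac_thr=0.5):
    # Normalize the pixel threshold by the mean focal length
    norm_thr = ransac_thr / np.mean([K0[0, 0], K0[1, 1], K1[0, 0], K1[1, 1]])
    # Convert keypoints to normalized camera coordinates
    pts0 = cv2.undistortPoints(mkpts0.reshape(-1, 1, 2), K0, None).reshape(-1, 2)
    pts1 = cv2.undistortPoints(mkpts1.reshape(-1, 1, 2), K1, None).reshape(-1, 2)
    E, mask = cv2.findEssentialMat(pts0, pts1, np.eye(3),
                                   method=cv2.RANSAC, prob=0.99999, threshold=norm_thr)
    if E is None:
        return None
    _, R, t, _ = cv2.recoverPose(E, pts0, pts1, np.eye(3), mask=mask)
    return R, t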
Citation
If you find this code useful for your research, please cite the paper:
@misc{liu2025liftfeat3dgeometryawarelocal,
title={LiftFeat: 3D Geometry-Aware Local Feature Matching},
author={Yepeng Liu and Wenpeng Lai and Zhou Zhao and Yuxuan Xiong and Jinchi Zhu and Jun Cheng and Yongchao Xu},
year={2025},
eprint={2505.03422},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2505.03422},
}
License
Acknowledgements
We would like to thank the authors of the following open-source repositories for their valuable contributions, which have inspired or supported this work.
We deeply appreciate the efforts of the research community in releasing high-quality codebases.