
Towards Satellite Image Road Graph Extraction: A Global-Scale Dataset and A Novel Method

¹Xi'an Jiaotong University  ²Chinese Academy of Sciences  ³Xidian University

[Project][arXiv]

Overview of our proposed SAM-Road++. The red line indicates training only and the blue line indicates inference only.

Abstract

Recently, road graph extraction has garnered increasing attention due to its crucial role in autonomous driving, navigation, etc. However, accurately and efficiently extracting road graphs remains a persistent challenge, primarily due to the severe scarcity of labeled data. To address this limitation, we collect a global-scale satellite road graph extraction dataset, i.e. the Global-Scale dataset. Specifically, the Global-Scale dataset is $\sim20 \times$ larger than the largest existing public road extraction dataset and spans over 13,800 $km^2$ globally. Additionally, we develop a novel road graph extraction model, i.e. SAM-Road++, which adopts a node-guided resampling method to alleviate the mismatch issue between training and inference in SAM-Road, a pioneering state-of-the-art road graph extraction model. Furthermore, we propose a simple yet effective "extended-line" strategy in SAM-Road++ to mitigate the occlusion issue on the road. Extensive experiments demonstrate the validity of the collected Global-Scale dataset and the proposed SAM-Road++ method, particularly highlighting its superior predictive power in unseen regions.

Installation

You need the following:

  • an NVIDIA GPU with a recent CUDA toolkit and driver.
  • a recent PyTorch build.
  • PyTorch Lightning.
  • wandb.
  • Go (needed only for the APLS metric).
  • `pip install` anything else that turns out to be missing.
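As a rough sketch of the setup (package names are the usual PyPI ones; the exact PyTorch wheel depends on your CUDA version):

```shell
# Install PyTorch first, matching your CUDA version (see pytorch.org for the right wheel index),
# then the remaining Python dependencies:
pip install torch torchvision pytorch-lightning wandb
# Go is required only for computing the APLS metric.
```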

Getting Started

SAM Preparation

Download the ViT-B checkpoint from the official SAM repository. Put it under:

-sam_road++  
--sam_ckpts  
---sam_vit_b_01ec64.pth  

Data Preparation

Refer to the instructions in the sam_road repo to download the City-scale and SpaceNet datasets. Place them in the project root with the following structure:

-sam_road++  
--cityscale  
---20cities  
--spacenet  
---RGB_1.0_meter  

Then run `python generate_labels.py` inside each dataset directory.

For the Global-scale dataset, follow the City-scale data preparation steps. The Global-scale dataset will be publicly released soon.

Training

City-scale dataset:

python train.py --config=config/toponet_vitb_512_cityscale.yaml  

Global-scale dataset:

python train.py --config=config/toponet_vitb_512_globalscale.yaml

or

python train.py --config=config/toponet_vitb_256_globalscale.yaml

SpaceNet dataset:

python train.py --config=config/toponet_vitb_256_spacenet.yaml 

You can find the resulting checkpoints under the lightning_logs dir.
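Lightning writes checkpoints under `lightning_logs/version_*/checkpoints/` by default. A small helper to locate the most recent one (a sketch; the layout assumed here is Lightning's default and may differ if you customized the logger):

```python
from pathlib import Path


def latest_checkpoint(log_dir: str = "lightning_logs"):
    """Return the most recently modified .ckpt under a Lightning log dir, or None."""
    ckpts = sorted(
        Path(log_dir).glob("version_*/checkpoints/*.ckpt"),
        key=lambda p: p.stat().st_mtime,
    )
    return ckpts[-1] if ckpts else None


if __name__ == "__main__":
    print(latest_checkpoint())
```

The returned path can be passed directly as the `--checkpoint` argument for inference.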

Inference

python inferencer.py \
  --config=path_to_the_same_config_for_training \
  --checkpoint=path_to_ckpt

Test

For the APLS and TOPO metrics, please refer to Sat2Graph. Note that the Global-scale dataset is evaluated with the same metrics as the City-scale dataset.

Demos

Visualized road network graph predictions from SAM-Road++ compared with two recent state-of-the-art methods.

Citation

@article{yin2024satelliteimageroadgraph,
  title={Towards Satellite Image Road Graph Extraction: A Global-Scale Dataset and A Novel Method},
  author={Yin, Pan and Li, Kaiyu and Cao, Xiangyong and Yao, Jing and Liu, Lei and Bai, Xueru and Zhou, Feng and Meng, Deyu},
  journal={arXiv preprint arXiv:2411.16733},
  year={2024}
}

Acknowledgement

We sincerely appreciate the authors of the following codebases which made this project possible:

TODO List

  • [x] Basic instructions
  • [ ] Organize configs
  • [x] Add dependency list
  • [ ] Add demos
  • [ ] Add trained checkpoints
