
Optimization-Inspired Cross-Attention Transformer for Compressive Sensing (CVPR 2023)

This repository is for OCTUF introduced in the following paper:

Jiechong Song, Chong Mou, Shiqi Wang, Siwei Ma, Jian Zhang, "Optimization-Inspired Cross-Attention Transformer for Compressive Sensing", in the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2023. PDF

🎨 Abstract

By integrating certain optimization solvers with deep neural networks, deep unfolding networks (DUNs) with good interpretability and high performance have attracted growing attention in compressive sensing (CS). However, existing DUNs often improve the visual quality at the price of a large number of parameters and suffer from feature information loss during iteration. In this paper, we propose an Optimization-inspired Cross-attention Transformer (OCT) module as an iterative process, leading to a lightweight OCT-based Unfolding Framework (OCTUF) for image CS. Specifically, we design a novel Dual Cross Attention (Dual-CA) sub-module, which consists of an Inertia-Supplied Cross Attention (ISCA) block and a Projection-Guided Cross Attention (PGCA) block. The ISCA block introduces multi-channel inertia forces and increases the memory effect through a cross attention mechanism between adjacent iterations. The PGCA block achieves enhanced information interaction by introducing the inertia force into the gradient descent step through a cross attention block. Extensive CS experiments show that our OCTUF achieves superior performance compared to state-of-the-art methods while requiring lower complexity.
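For readers who want a concrete picture of the cross-attention idea described above, here is a minimal PyTorch sketch of a cross-attention block in which queries come from the current iteration's features and keys/values come from the previous iteration's features. This is not the authors' ISCA/PGCA implementation; the module name, the (B, C, H, W) feature layout, channel width, and head count are all assumptions made purely for illustration.

```python
# Minimal sketch of cross attention between features of adjacent iterations.
# NOT the authors' implementation; shapes and names are illustrative only.
import torch
import torch.nn as nn

class CrossAttentionBlock(nn.Module):
    def __init__(self, dim=32, num_heads=4):
        super().__init__()
        self.norm_q = nn.LayerNorm(dim)
        self.norm_kv = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads)  # expects (L, B, C) inputs

    def forward(self, x_curr, x_prev):
        # x_curr, x_prev: (B, C, H, W) feature maps from the current and
        # previous iterations; flatten spatial positions into tokens.
        B, C, H, W = x_curr.shape
        q = self.norm_q(x_curr.flatten(2).permute(2, 0, 1))    # (HW, B, C)
        kv = self.norm_kv(x_prev.flatten(2).permute(2, 0, 1))  # (HW, B, C)
        out, _ = self.attn(q, kv, kv)   # queries from the current step attend to previous-step features
        out = out.permute(1, 2, 0).reshape(B, C, H, W)
        return x_curr + out             # residual connection

# Example: fuse current features with features kept from the last iteration.
x_t   = torch.randn(1, 32, 33, 33)
x_tm1 = torch.randn(1, 32, 33, 33)
fused = CrossAttentionBlock()(x_t, x_tm1)
```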

🔥 Network Architecture

(Figure: network architecture of OCTUF)

🔧 Requirements

  • Python == 3.8.5
  • PyTorch == 1.8.0

🚩 Results

(Figure: results comparison)

👀 Datasets

💻 Command

Train

python train_OCTUF.py --sensing_rate 0.1/0.25/0.3/0.4/0.5

Test

python test_octuf.py --sensing_rate 0.1/0.25/0.3/0.4/0.5 --test_name Set11/Urban100
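The slashes in --sensing_rate and --test_name indicate alternative values, one per run. Below is a small optional Python sketch (not part of the repository) that sweeps all sensing rates and test sets by invoking the test script via subprocess; the flag names match the command above, everything else is an assumption.

```python
# Optional helper (not part of the repo): run test_octuf.py for every
# sensing rate and test set listed in the README.
import subprocess

sensing_rates = ["0.1", "0.25", "0.3", "0.4", "0.5"]
test_sets = ["Set11", "Urban100"]

for rate in sensing_rates:
    for name in test_sets:
        subprocess.run(
            ["python", "test_octuf.py", "--sensing_rate", rate, "--test_name", name],
            check=True,  # stop if any run fails
        )
```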

📧 Contact

If you have any questions, please email [email protected].

🤗 Acknowledgements

This code is built on FSOINet. We thank the authors for sharing their code.
