ALA: Naturalness-aware Adversarial Lightness Attack

This is the official repository of ALA: Naturalness-aware Adversarial Lightness Attack. The paper was accepted at ACM MM 2023.


ALA: Naturalness-aware Adversarial Lightness Attack
Yihao Huang, Liangru Sun, Qing Guo, Felix Juefei-Xu, Jiayi Zhu, Jincao Feng, Yang Liu, Geguang Pu

Abstract:
Most researchers have tried to enhance the robustness of DNNs by revealing and repairing their vulnerabilities with specialized adversarial examples. Some of these attacks produce imperceptible perturbations restricted by an Lp norm. However, because of their high-frequency nature, such perturbations can be removed by denoising defenses and are hard to realize in the physical world. To avoid these defects, some works have proposed unrestricted attacks, which offer better robustness and practicality; unfortunately, the resulting examples usually look unnatural and can alert human observers. In this paper, we propose the Adversarial Lightness Attack (ALA), a white-box unrestricted adversarial attack that focuses on modifying the lightness of images, leaving the shape and color of the samples, which are crucial to human perception, barely influenced. To obtain adversarial examples with a high attack success rate, we propose an unconstrained enhancement of the light-and-shade relationship in images. To enhance the naturalness of the images, we craft a naturalness-aware regularization based on the range and distribution of lightness. The effectiveness of ALA is verified on two popular datasets for different tasks, i.e., ImageNet for image classification and Places-365 for scene recognition.
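
As a rough illustration of the idea (not the exact implementation in this repository), the sketch below converts an image to the LAB color space, applies a learnable scale and shift to the lightness channel, and optimizes these parameters to fool a classifier, with a plain L2 term toward the original lightness standing in for the naturalness-aware regularization. The use of kornia for the color-space conversion and all function and parameter names are assumptions for this sketch.

import torch
import torch.nn.functional as F
import kornia.color as KC  # assumed here for RGB <-> LAB conversion

def lightness_attack(model, image, label, steps=50, lr=0.05, reg_weight=0.1):
    # image: (1, 3, H, W) RGB tensor in [0, 1]; label: (1,) true class index.
    lab = KC.rgb_to_lab(image)              # L channel in [0, 100], a/b channels untouched
    L0, ab = lab[:, :1], lab[:, 1:]

    # Learnable global scale and shift applied to the lightness channel only.
    scale = torch.ones(1, device=image.device, requires_grad=True)
    shift = torch.zeros(1, device=image.device, requires_grad=True)
    opt = torch.optim.Adam([scale, shift], lr=lr)

    for _ in range(steps):
        L_adv = (L0 * scale + shift).clamp(0.0, 100.0)           # keep lightness in a valid range
        adv = KC.lab_to_rgb(torch.cat([L_adv, ab], dim=1)).clamp(0.0, 1.0)
        attack_loss = -F.cross_entropy(model(adv), label)        # push prediction away from true class
        natural_loss = reg_weight * (L_adv - L0).pow(2).mean()   # stay close to original lightness
        opt.zero_grad()
        (attack_loss + natural_loss).backward()
        opt.step()

    with torch.no_grad():
        L_adv = (L0 * scale + shift).clamp(0.0, 100.0)
        return KC.lab_to_rgb(torch.cat([L_adv, ab], dim=1)).clamp(0.0, 1.0)

The code in this repository and the paper define the actual parameterization of the lightness adjustment and the naturalness-aware regularization; the snippet above only mirrors the overall structure of the attack.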

Requirements

python3
torch == 1.8.0
torchvision == 0.9.0
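
These dependencies can be installed with pip, for example: pip install torch==1.8.0 torchvision==0.9.0 (the exact command may vary with your Python and CUDA setup).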

References

@inproceedings{huang2023ala,
  title={ALA: Naturalness-aware Adversarial Lightness Attack},
  author={Huang, Yihao and Sun, Liangru and Guo, Qing and Juefei-Xu, Felix and Zhu, Jiayi and Feng, Jincao and Liu, Yang and Pu, Geguang},
  booktitle={Proceedings of the 31st ACM International Conference on Multimedia},
  pages={2418--2426},
  year={2023}
}
