Adversarial Robustness through Lipschitz-Guided Stochastic Depth in Neural Networks

Laith Nayal, Mahmoud Mousatat, Bader Rasheed

Published: 2025/9/12

Abstract

Deep neural networks, including Vision Transformers, achieve state-of-the-art performance in computer vision but remain highly vulnerable to adversarial perturbations. Standard defenses often incur high computational cost or lack formal guarantees. We propose a Lipschitz-guided stochastic depth (DropPath) method in which drop probabilities increase with depth to control the effective Lipschitz constant of the network. This approach regularizes deeper layers, improving robustness while preserving clean accuracy and reducing computation. Experiments on CIFAR-10 with ViT-Tiny show that our custom depth-dependent schedule maintains near-baseline clean accuracy, enhances robustness under FGSM, PGD-20, and AutoAttack, and significantly reduces FLOPs compared to baseline and linear DropPath schedules.
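To make the mechanism concrete, the sketch below illustrates the general idea under stated assumptions: a depth-increasing drop schedule and a simple expected-path Lipschitz bound for a residual network. The paper's exact "custom" schedule is not reproduced here; the power-law ramp, the per-block Lipschitz constant, and the bound `prod_i (1 + p_i * L)` (with keep probability `p_i` and block Lipschitz `L`) are illustrative assumptions, not the authors' formulation.

```python
def drop_schedule(num_layers: int, max_rate: float = 0.2, power: float = 2.0):
    """Hypothetical depth-dependent DropPath schedule.

    Drop probability grows with layer index, so deeper residual blocks
    are skipped more often during training. `power > 1` concentrates
    the drops near the end of the network (an assumption, not the
    paper's exact schedule).
    """
    return [max_rate * ((i + 1) / num_layers) ** power for i in range(num_layers)]


def expected_lipschitz_bound(keep_probs, block_lipschitz: float = 1.5):
    """Expected-path Lipschitz bound for a residual network.

    For blocks x -> x + f_i(x), where each f_i has Lipschitz constant L
    and is kept with probability p_i, a simple expected bound on the
    composition is prod_i (1 + p_i * L). Lowering p_i at depth shrinks
    the product, which is the intuition behind the method.
    """
    bound = 1.0
    for p in keep_probs:
        bound *= 1.0 + p * block_lipschitz
    return bound


# Compare a network with no stochastic depth against the ramped schedule.
drops = drop_schedule(num_layers=12)
no_drop_bound = expected_lipschitz_bound([1.0] * 12)
ramped_bound = expected_lipschitz_bound([1.0 - d for d in drops])
print(no_drop_bound > ramped_bound)  # deeper drops tighten the bound
```

The point of the comparison is only qualitative: any schedule that lowers keep probabilities at depth reduces this multiplicative bound, trading a small amount of expected capacity for a smaller effective Lipschitz constant.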
