Lipschitz constant estimation for 1D convolutional neural networks (2211.15253v2)
Abstract: In this work, we propose a dissipativity-based method for Lipschitz constant estimation of 1D convolutional neural networks (CNNs). In particular, we analyze the dissipativity properties of convolutional, pooling, and fully connected layers, making use of incremental quadratic constraints for nonlinear activation functions and pooling operations. The Lipschitz constant of the concatenation of these mappings is then estimated by solving a semidefinite program that we derive from dissipativity theory. To make our method as efficient as possible, we exploit the structure of convolutional layers by realizing these finite impulse response filters as causal dynamical systems in state space and carrying out the dissipativity analysis for the state space realizations. The examples we provide show that our Lipschitz bounds are advantageous in terms of both accuracy and scalability.
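To illustrate the general idea of certifying a Lipschitz constant via a semidefinite feasibility condition, the sketch below implements a LipSDP-style bound (Fazlyab et al., 2019) for a dense one-hidden-layer network rather than the paper's conv-specific state-space method. For ReLU (slope-restricted in [0, 1]) and the scalar multiplier choice T = tI, the smallest feasible bound has a closed form via a Schur complement, so a 1D grid search over t stands in for the full semidefinite program. The network sizes and the grid range are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def lipsdp_bound_1layer(W1, W2, num_grid=200):
    """Certified Lipschitz upper bound for f(x) = W2 @ relu(W1 @ x).

    LipSDP-style sketch: rho certifies Lipschitz constant sqrt(rho) if
        [[rho*I, W1^T T], [T W1, 2T - W2^T W2]] >= 0
    for some diagonal T >= 0. With T = t*I and 2tI - W2^T W2 > 0, the
    Schur complement gives the smallest feasible rho in closed form,
    rho(t) = lambda_max(t^2 * W1^T (2tI - W2^T W2)^{-1} W1),
    which we minimize over a 1D grid instead of solving the SDP.
    """
    A = W2.T @ W2
    s_max2 = np.linalg.eigvalsh(A)[-1]           # largest eigenvalue of W2^T W2
    n_hidden = W1.shape[0]
    best = np.inf
    # Feasibility of the Schur complement requires t > s_max2 / 2.
    for t in np.linspace(0.51 * s_max2, 4.0 * s_max2, num_grid):
        S = 2.0 * t * np.eye(n_hidden) - A       # positive definite on this grid
        M = (t ** 2) * (W1.T @ np.linalg.solve(S, W1))
        best = min(best, np.linalg.eigvalsh(M)[-1])
    return np.sqrt(best)

rng = np.random.default_rng(0)
W1 = rng.standard_normal((8, 4))                 # hypothetical layer weights
W2 = rng.standard_normal((3, 8))

L_sdp = lipsdp_bound_1layer(W1, W2)
L_naive = np.linalg.norm(W1, 2) * np.linalg.norm(W2, 2)
print(f"LipSDP-style bound: {L_sdp:.3f}, spectral-norm product: {L_naive:.3f}")
```

The multiplier-based bound is never worse than the naive product of spectral norms (choosing t near the largest eigenvalue of W2^T W2 recovers it), which mirrors the accuracy advantage the abstract claims for dissipativity-based certificates over layerwise norm bounds.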