Journal article
License

Closed access

On the Verification of Neural ODEs with Stochastic Guarantees

Authors
Gruenbacher, Sophie
Hasani, Ramin
Lechner, Mathias
Cyranka, Jacek
Smolka, Scott A.
Grosu, Radu
Publication date
2021
Abstract (EN)

We show that Neural ODEs, an emerging class of time-continuous neural networks, can be verified by solving a set of global-optimization problems. For this purpose, we introduce Stochastic Lagrangian Reachability (SLR), an abstraction-based technique for constructing a tight Reachtube (an over-approximation of the set of reachable states over a given time-horizon), and provide stochastic guarantees in the form of confidence intervals for the Reachtube bounds. SLR inherently avoids the infamous wrapping effect (accumulation of over-approximation errors) by performing local optimization steps to expand safe regions instead of repeatedly forward-propagating them as is done by deterministic reachability methods. To enable fast local optimizations, we introduce a novel forward-mode adjoint sensitivity method to compute gradients without the need for backpropagation. Finally, we establish asymptotic and non-asymptotic convergence rates for SLR.
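The abstract highlights a forward-mode sensitivity scheme that supplies gradients for the local optimization steps without reverse-mode backpropagation. The sketch below is only an illustration of that general idea, not the authors' implementation: it integrates a hypothetical one-layer neural vector field with fixed-step RK4 and uses jax.jvp to obtain the flow's directional derivative with respect to the initial state in a single forward pass. All names, sizes, and the integration scheme are assumptions made for the demo.

import jax
import jax.numpy as jnp

def f(x, params):
    # Hypothetical time-invariant neural vector field: a single tanh layer.
    W, b = params
    return jnp.tanh(W @ x + b)

def flow(x0, params, t_end=1.0, steps=100):
    # Fixed-step RK4 integration of dx/dt = f(x) from t = 0 to t_end.
    h = t_end / steps
    def rk4_step(x, _):
        k1 = f(x, params)
        k2 = f(x + 0.5 * h * k1, params)
        k3 = f(x + 0.5 * h * k2, params)
        k4 = f(x + h * k3, params)
        return x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4), None
    x_end, _ = jax.lax.scan(rk4_step, x0, None, length=steps)
    return x_end

key = jax.random.PRNGKey(0)
dim = 3
params = (0.5 * jax.random.normal(key, (dim, dim)), jnp.zeros(dim))

x0 = jnp.array([1.0, 0.0, -1.0])          # nominal initial state
direction = jnp.array([1.0, 0.0, 0.0])    # perturbation direction in the initial set

# Forward mode: the flow's endpoint and its directional derivative in one pass,
# with no reverse-mode backpropagation through the ODE solver.
x_end, dx_end = jax.jvp(lambda x: flow(x, params), (x0,), (direction,))
print(x_end, dx_end)

In a reachability setting such as the one described above, directional derivatives of this kind could drive the local optimization of Reachtube bounds; the specific adjoint construction and the stochastic confidence-interval machinery of SLR are detailed in the paper itself.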

Keywords (EN)
Safety
Robustness & Trustworthiness
(Deep) Neural Network Learning Theory
Sampling/Simulation-based Search
PBN discipline
computer science
Pages (from-to)
11525-11535
Open access license
Closed access