3.9 Article

A domain-theoretic framework for robustness analysis of neural networks

Related references

Note: Only a subset of the references is listed.
Article Computer Science, Software Engineering

A Dual Number Abstraction for Static Analysis of Clarke Jacobians

Jacob Laurel et al.

Summary: The authors propose a novel abstraction for bounding the Clarke Jacobian of a locally Lipschitz continuous function over a local input region. A new abstract domain based on dual numbers over-approximates all first derivatives needed to compute the Clarke Jacobian, and the approach is shown to be general and scalable on a range of deep neural networks and non-differentiable input perturbations (a minimal sketch of the dual-number idea follows this entry).

PROCEEDINGS OF THE ACM ON PROGRAMMING LANGUAGES-PACMPL (2022)
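The core idea can be illustrated with a minimal sketch: a forward-mode dual-number computation whose value and derivative components are intervals, so that evaluating a function over an input region encloses both its range and the set of its first derivatives. This is not the paper's abstract domain or its Clarke Jacobian machinery; the `Interval` and `IntervalDual` names below are illustrative, and only a single scalar input with a ReLU nonlinearity is handled.

```python
# Minimal sketch (not the paper's abstract domain): interval-valued dual numbers.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"


class IntervalDual:
    """Pair (value interval, derivative interval) propagated in forward mode."""
    def __init__(self, val, dot):
        self.val, self.dot = val, dot

    def __add__(self, other):
        return IntervalDual(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        # Product rule, evaluated in interval arithmetic.
        return IntervalDual(self.val * other.val,
                            self.val * other.dot + self.dot * other.val)


def relu(x: IntervalDual) -> IntervalDual:
    """ReLU with a sound enclosure of its generalized derivative on the interval."""
    lo, hi = x.val.lo, x.val.hi
    val = Interval(max(lo, 0.0), max(hi, 0.0))
    if lo >= 0:
        slope = Interval(1.0, 1.0)
    elif hi <= 0:
        slope = Interval(0.0, 0.0)
    else:
        slope = Interval(0.0, 1.0)   # the input interval straddles the kink at 0
    return IntervalDual(val, slope * x.dot)


# f(x) = relu(x) * x on the input region x in [-0.5, 1.0], seeded with dx/dx = 1.
x = IntervalDual(Interval(-0.5, 1.0), Interval(1.0, 1.0))
y = relu(x) * x
print("enclosure of f:", y.val, " enclosure of f':", y.dot)
```

On this region the true derivative range of f is [0, 2]; the printed enclosure contains it, which is the soundness property such an analysis aims for (the paper's domain achieves this with far greater precision and generality).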

Article Automation & Control Systems

Training Robust Neural Networks Using Lipschitz Bounds

Patricia Pauli et al.

Summary: This work proposes a framework for training multi-layer neural networks with increased robustness. The training procedure minimizes an upper bound on the network's Lipschitz constant using semidefinite programming, and two examples demonstrate the effectiveness of the framework (a coarser Lipschitz bound is sketched after this entry).

IEEE CONTROL SYSTEMS LETTERS (2022)
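As a point of reference for the quantity such training controls, a much coarser Lipschitz upper bound for a feedforward ReLU network is the product of the spectral norms of its weight matrices (ReLU itself is 1-Lipschitz). The sketch below computes this bound for random stand-in weights; it is not the paper's semidefinite-programming certificate, which is tighter.

```python
# Coarse Lipschitz upper bound: product of layer spectral norms (random stand-in weights).
import numpy as np

rng = np.random.default_rng(0)
weights = [rng.standard_normal((16, 8)),
           rng.standard_normal((16, 16)),
           rng.standard_normal((1, 16))]

def naive_lipschitz_bound(weights):
    """Valid (but loose) global Lipschitz bound for a ReLU feedforward network."""
    bound = 1.0
    for W in weights:
        bound *= np.linalg.norm(W, ord=2)   # largest singular value of the layer
    return bound

print("naive Lipschitz upper bound:", naive_lipschitz_bound(weights))
```

Such a product bound could also serve as a crude training penalty, at the cost of considerable conservatism compared to the SDP-based bound.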

Proceedings Paper Computer Science, Software Engineering

Exploiting Verified Neural Networks via Floating Point Numerical Error

Kai Jia et al.

Summary: The authors show that neural network verification algorithms which model idealized real arithmetic can be exploited: floating point numerical error in the network's actual implementation can violate robustness guarantees established for the idealized model. Accurately modeling floating point computation is therefore crucial for reliable verification (the underlying numerical issue is sketched after this entry).

STATIC ANALYSIS, SAS 2021 (2021)
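The numerical root cause can be seen without any verifier: in float32, two mathematically equivalent orderings of the same dot product disagree, so a classification margin proved strictly positive under exact real arithmetic may take a different value, possibly of a different sign, in the deployed floating point implementation. The sketch below only demonstrates this discrepancy; it is not the paper's attack, and the sizes and scales are arbitrary.

```python
# Floating point addition is not associative: the same dot product, two orderings.
import numpy as np

rng = np.random.default_rng(1)
w = (rng.standard_normal(10_000) * 1e4).astype(np.float32)
x = rng.standard_normal(10_000).astype(np.float32)

forward = np.float32(0.0)
backward = np.float32(0.0)
for i in range(len(w)):
    forward += w[i] * x[i]            # accumulate front to back
for i in reversed(range(len(w))):
    backward += w[i] * x[i]           # accumulate back to front

print("forward sum :", forward)
print("backward sum:", backward)
print("discrepancy :", abs(float(forward) - float(backward)))
```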

Article Mathematics

Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation

Mikhail Belkin

Summary: This survey examines the gap between the mathematical theory of machine learning and its practice, focusing on interpolation and over-parametrization as the key themes for understanding the foundations of deep learning (a small numerical illustration of interpolation follows this entry).

ACTA NUMERICA (2021)
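The interpolation phenomenon the survey centers on can be illustrated numerically: with many more random features than training points, the minimum-norm least-squares solution fits noisy labels exactly, and one can then inspect how it behaves off the training set. This is only an illustrative sketch with arbitrary choices (random cosine features, a sine target); it is not Belkin's analysis.

```python
# Minimum-norm interpolation of noisy labels in an over-parameterized linear model.
import numpy as np

rng = np.random.default_rng(2)
n_train, n_features = 40, 400            # many more features than samples
freqs = rng.uniform(0.0, 8.0, n_features)
phases = rng.uniform(0.0, 2 * np.pi, n_features)

def features(x):
    """Random cosine feature map, shared by training and test points."""
    return np.cos(np.outer(x, freqs) + phases)

x_train = rng.uniform(-1.0, 1.0, n_train)
y_train = np.sin(3 * x_train) + 0.1 * rng.standard_normal(n_train)   # noisy labels

# Minimum-norm solution that interpolates the training data exactly.
theta = np.linalg.pinv(features(x_train)) @ y_train

x_test = np.linspace(-1.0, 1.0, 200)
print("max train residual (interpolation, ~0):",
      np.max(np.abs(features(x_train) @ theta - y_train)))
print("mean squared test error vs. clean sin(3x):",
      np.mean((features(x_test) @ theta - np.sin(3 * x_test)) ** 2))
```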

Article Multidisciplinary Sciences

Benign overfitting in linear regression

Peter L. Bartlett et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2020)

Article Computer Science, Theory & Methods

Adversarial Examples on Object Recognition: A Comprehensive Survey

Alex Serban et al.

ACM COMPUTING SURVEYS (2020)

Proceedings Paper Computer Science, Software Engineering

Abstract Neural Networks

Matthew Sotoudeh et al.

STATIC ANALYSIS (SAS 2020) (2020)

Proceedings Paper Computer Science, Theory & Methods

Domain Theoretic Second-Order Euler's Method for Solving Initial Value Problems

Abbas Edalat et al.

ELECTRONIC NOTES IN THEORETICAL COMPUTER SCIENCE (2020)

Article Engineering, Electrical & Electronic

Artificial Intelligence Aided Automated Design for Reliability of Power Electronic Systems

Tomislav Dragicevic et al.

IEEE TRANSACTIONS ON POWER ELECTRONICS (2019)

News Item Multidisciplinary Sciences

Deep Trouble for Deep Learning

Douglas Heaven

NATURE (2019)

Proceedings Paper Computer Science, Information Systems

Harmless interpolation of noisy data in regression

Vidya Muthukumar et al.

2019 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT) (2019)

Article Computer Science, Software Engineering

An Abstract Domain for Certifying Neural Networks

Gagandeep Singh et al.

PROCEEDINGS OF THE ACM ON PROGRAMMING LANGUAGES-PACMPL (2019)

Article Computer Science, Theory & Methods

Safe & robust reachability analysis of hybrid systems

Eugenio Moggi et al.

THEORETICAL COMPUTER SCIENCE (2018)

Article Computer Science, Theory & Methods

Domain Theory, Its Ramifications and Interactions

Klaus Keimel

ELECTRONIC NOTES IN THEORETICAL COMPUTER SCIENCE (2017)

Article Computer Science, Artificial Intelligence

Steps Toward Robust Artificial Intelligence

Thomas G. Dietterich

AI MAGAZINE (2017)

Proceedings Paper Computer Science, Information Systems

Towards Evaluating the Robustness of Neural Networks

Nicholas Carlini et al.

2017 IEEE SYMPOSIUM ON SECURITY AND PRIVACY (SP) (2017)

Article Computer Science, Theory & Methods

A computational model for multi-variable differential calculus

Abbas Edalat et al.

INFORMATION AND COMPUTATION (2013)

Article Computer Science, Hardware & Architecture

Continuity and Robustness of Programs

Swarat Chaudhuri et al.

COMMUNICATIONS OF THE ACM (2012)

Article

Denotational semantics of hybrid automata

Abbas Edalat et al.

JOURNAL OF LOGIC AND ALGEBRAIC PROGRAMMING (2007)

Article Computer Science, Theory & Methods

SHRAD: A language for sequential real number computation

Amin Farjudian

THEORY OF COMPUTING SYSTEMS (2007)

Article Computer Science, Theory & Methods

Domain theory and differential calculus (functions of one variable)

A Edalat et al.

MATHEMATICAL STRUCTURES IN COMPUTER SCIENCE (2004)

Article Computer Science, Theory & Methods

Integration in real PCF

A Edalat et al.

INFORMATION AND COMPUTATION (2000)