Document Type

Thesis - Open Access

Award Date

2024

Degree Name

Master of Science (MS)

Department / School

Mathematics and Statistics

First Advisor

Michael Puthawala

Abstract

In this thesis, we study the Inverse Lipschitz Constant (ILC) of injective ReLU layers, focusing on the tightness of the ILC lower bound established in Puthawala et al. Our approach has three components. First, we show that requiring injectivity only along lines yields a weaker condition than the general injectivity condition given in Puthawala et al. Second, we perform numerical experiments to assess the tightness of the existing ILC lower bound and find that the bound is overly conservative. Third, we identify the likely source of the slack in the proof of the existing ILC bound and perform further numerical experiments that support this hypothesis. An accurate ILC is crucial for understanding, and potentially reducing, a network's sensitivity to input variations, which is essential for improving the performance and stability of neural networks in real-world applications.
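To make the quantity under study concrete, the sketch below (not taken from the thesis; dimensions, the W = [B; -B] construction, and the sampling scheme are illustrative assumptions) empirically estimates the smallest expansion ratio ||ReLU(Wx) - ReLU(Wy)|| / ||x - y|| of an injective ReLU layer by random sampling. The reciprocal of this ratio bounds the Lipschitz constant of the inverse map on the layer's range, which is the kind of quantity an ILC lower bound controls.

```python
# Hypothetical illustration (not the thesis code): estimate the lower
# expansion ratio of an injective ReLU layer x -> ReLU(Wx) by sampling.
import numpy as np

rng = np.random.default_rng(0)

n = 8                                # input dimension (assumed for illustration)
B = rng.standard_normal((n, n))      # a generic (almost surely invertible) matrix
W = np.vstack([B, -B])               # W = [B; -B] makes x -> ReLU(Wx) injective,
                                     # since ReLU(Bx) - ReLU(-Bx) recovers Bx

def relu_layer(x):
    """Apply the injective ReLU layer."""
    return np.maximum(W @ x, 0.0)

# Empirical estimate of inf_{x != y} ||f(x) - f(y)|| / ||x - y||.
# A theoretical ILC lower bound should sit at or below this value;
# a large gap would suggest the bound is conservative.
smallest_ratio = np.inf
for _ in range(200_000):
    x = rng.standard_normal(n)
    y = rng.standard_normal(n)
    den = np.linalg.norm(x - y)
    if den > 1e-12:
        num = np.linalg.norm(relu_layer(x) - relu_layer(y))
        smallest_ratio = min(smallest_ratio, num / den)

print("empirical lower expansion ratio:", smallest_ratio)
```

Random sampling only gives an upper estimate of the true infimum, so in practice such experiments are paired with structured searches (e.g., pairs of nearby points or points straddling activation boundaries) to probe the worst case more tightly.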

Library of Congress Subject Headings

Mathematical constants
Lipschitz spaces
Neural networks (Computer science)
Deep learning (Machine learning)

Publisher

South Dakota State University

Rights Statement

In Copyright