Session 9: The Embedding Gap, When Are Manifolds Close?

Presenter Information / Coauthors Information

Michael Puthawala, South Dakota State University

Presentation Type

Invited

Student

No

Track

Methodology

Abstract

The manifold hypothesis, that high-dimensional data of interest usually clusters around low-dimensional manifolds, is perhaps the most common assumption made when analyzing high-dimensional data. Making this eminently intuitive maxim rigorous is a challenge, but one that yields insight into practical questions. Motivated by the desire to build tools for the pure analysis of manifolds that yield insight into practical problems, we introduce the Embedding Gap, a novel non-symmetric divergence that measures the degree to which one manifold fails to embed in another. We demonstrate how the Embedding Gap intuitively measures an important geometric property, is amenable to computation, and gives affirmative answers to practical questions about the approximation power and rates of neural networks learning manifold-supported distributions. No background in geometry is needed, as ample geometric intuition will be provided in the form of illustrative figures.

Start Date

2-6-2024 2:30 PM

End Date

2-6-2024 3:30 PM

Location

Pasque 255
