Feb. 2025
Speaker: Jisu Kim
Institution: Seoul National University
Time: 11:00 - 12:00
Location: 2L8
Topological Data Analysis (TDA) broadly refers to the use of topological features extracted from data. One main focus in TDA is persistent homology, which observes data at various resolutions and summarizes the topological features that persist across them. TDA has proven valuable in enhancing machine learning applications. This presentation focuses on the application of TDA in machine learning, specifically in two aspects: featurization and evaluation.
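As a brief illustration of what persistent homology summarizes (not part of the talk material), the minimal sketch below computes persistence diagrams for a noisy circle; it assumes the `ripser` package is available and simply reports the lifetime of the most persistent degree-1 feature, which corresponds to the circle's loop.

```python
import numpy as np
from ripser import ripser  # assumption: the ripser package is installed

# Sample a noisy circle; its single loop should appear as one
# long-lived feature in the degree-1 persistence diagram.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 200)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(200, 2))

# Persistence diagrams of the Vietoris-Rips filtration up to degree 1.
dgms = ripser(X, maxdim=1)["dgms"]
h1 = dgms[1]                      # (birth, death) pairs in degree 1
lifetimes = h1[:, 1] - h1[:, 0]
print("most persistent H1 lifetime:", lifetimes.max())
```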
The intricate structure of persistent homology poses challenges when it is used directly in statistical or machine learning frameworks. To overcome this, persistent homology is often featurized in a Euclidean or functional space. Three papers will be discussed as examples. First, I will present “PLLay: Efficient Topological Layer based on Persistence Landscapes”, explaining how persistence landscapes are used to create a topological layer in a deep learning framework. Then, I will present “ECLayr: Fast and Robust Topological Layer based on Differentiable Euler Characteristic Curve”, which uses the Euler Characteristic Curve to speed up computation compared to PLLay. I will also present “Generalized Penalty for Circular Coordinate Representation”, discussing how circular coordinates are utilized for visualization and dimension reduction.
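To make the two featurizations concrete, here is a minimal NumPy sketch of a persistence landscape and an Euler characteristic curve evaluated on a grid. It is not the PLLay or ECLayr layer itself; the function names, the grid, and the toy inputs are illustrative assumptions.

```python
import numpy as np

def persistence_landscape(diagram, ts, k_max=3):
    """First k_max persistence landscape functions evaluated on the grid ts.

    Each diagram point (b, d) contributes a tent function
    max(0, min(t - b, d - t)); the k-th landscape at t is the k-th
    largest tent value among all points.
    """
    diagram = np.asarray(diagram, dtype=float)
    tents = np.maximum(
        0.0,
        np.minimum(ts[None, :] - diagram[:, [0]], diagram[:, [1]] - ts[None, :]),
    )
    tents = -np.sort(-tents, axis=0)          # sort tent values descending per t
    return tents[: min(k_max, len(diagram))]

def euler_characteristic_curve(simplices, ts):
    """Euler characteristic curve of a filtered complex on the grid ts.

    simplices: iterable of (filtration_value, dimension) pairs; at each t
    the curve is the alternating sum of (-1)**dimension over simplices
    that have appeared by filtration value t.
    """
    ecc = np.zeros_like(ts, dtype=float)
    for filt, dim in simplices:
        ecc += ((-1) ** dim) * (ts >= filt)
    return ecc

# Toy usage: one diagram point, and a triangle appearing along a filtration.
ts = np.linspace(0.0, 1.0, 101)
L = persistence_landscape([(0.2, 0.8)], ts)
ecc = euler_characteristic_curve(
    [(0.0, 0), (0.0, 0), (0.0, 0),        # three vertices
     (0.3, 1), (0.4, 1), (0.5, 1),        # three edges
     (0.9, 2)],                            # one triangle
    ts,
)
```

In PLLay and ECLayr, summaries of this kind are made differentiable with respect to the input so that gradients can flow through the topological layer; the sketch above only illustrates the forward computation.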
Recently, efforts have emerged to use TDA for evaluating data or models and to integrate such evaluations into machine learning models. I will present “TopP&R: Robust Support Estimation Approach for Evaluating Fidelity and Diversity in Generative Models”, where TDA-based confidence estimation is employed to obtain robust and reliable evaluation metrics for generative models.
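To convey the flavor of support-based evaluation, the hedged sketch below thresholds a kernel density estimate to decide which samples lie inside an estimated support, and derives precision- and recall-style scores from that. It is a simplified stand-in, not the actual TopP&R estimator, which sets its threshold robustly via TDA-based confidence levels rather than a fixed quantile; the `level` parameter and function names are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde  # assumption: SciPy is available

def in_estimated_support(samples, query, level=0.05):
    # Fit a Gaussian KDE on `samples` (shape (n, d)) and flag which rows of
    # `query` score above the level-quantile of the KDE values on the samples.
    # NOTE: simplification; TopP&R derives its threshold from a confidence
    # band rather than a fixed quantile.
    kde = gaussian_kde(samples.T)
    threshold = np.quantile(kde(samples.T), level)
    return kde(query.T) >= threshold

def support_precision_recall(real, fake, level=0.05):
    # Precision: fraction of generated samples inside the estimated real support.
    # Recall:    fraction of real samples inside the estimated generated support.
    precision = in_estimated_support(real, fake, level).mean()
    recall = in_estimated_support(fake, real, level).mean()
    return precision, recall

# Toy usage with Gaussian blobs standing in for real and generated features.
rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(500, 2))
fake = rng.normal(0.5, 1.0, size=(500, 2))
print(support_precision_recall(real, fake))
```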