
Spectral convergence of diffusion maps

  • Diffusion maps is a manifold learning algorithm widely used for dimensionality reduction. Using a sample from a distribution, it approximates the eigenvalues and eigenfunctions of associated Laplace-Beltrami operators. Theoretical bounds on the approximation error are, however, generally much weaker than the rates that are seen in practice. This paper uses new approaches to improve the error bounds in the model case where the distribution is supported on a hypertorus. For the data sampling (variance) component of the error we make spatially localized compact embedding estimates on certain Hardy spaces; we study the deterministic (bias) component as a perturbation of the Laplace-Beltrami operator's associated PDE and apply relevant spectral stability results. Using these approaches, we match long-standing pointwise error bounds for both the spectral data and the norm convergence of the operator discretization. We also introduce an alternative normalization for diffusion maps based on Sinkhorn weights. This normalization approximates a Langevin diffusion on the sample and yields a symmetric operator approximation. We prove that it has better convergence compared with the standard normalization on flat domains, and we present a highly efficient rigorous algorithm to compute the Sinkhorn weights.
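As a minimal illustrative sketch (not the authors' code), the two constructions described in the abstract can be set up as follows: the standard diffusion-maps normalization turns a Gaussian kernel matrix into a row-stochastic Markov matrix, while the Sinkhorn variant scales the same kernel into a symmetric, approximately doubly stochastic operator. The sample (points on a circle, as a simple model of the hypertorus setting), the bandwidth, and the symmetric scaling iteration used here are assumptions for illustration only.

```python
# Illustrative sketch only: diffusion maps with the standard row-stochastic
# normalization and a Sinkhorn-normalized symmetric variant.  Bandwidth,
# sample size, and the scaling iteration are assumed, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
# Sample 200 points on the unit circle, a simple 1-D model of the hypertorus.
theta = rng.uniform(0.0, 2.0 * np.pi, 200)
X = np.column_stack([np.cos(theta), np.sin(theta)])

eps = 0.1  # kernel bandwidth (assumed value)
D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-D2 / eps)  # symmetric Gaussian kernel matrix

# Standard diffusion-maps normalization: row-normalize K to a Markov matrix P.
P = K / K.sum(axis=1, keepdims=True)

# Sinkhorn normalization: find positive weights w so that diag(w) K diag(w)
# is (approximately) doubly stochastic.  The damped fixed-point iteration
# w <- sqrt(w / (K w)) is one standard scheme for symmetric matrix scaling;
# at its fixed point, w_i * (K w)_i = 1 for every i.
w = np.ones(len(X))
for _ in range(5000):
    w_new = np.sqrt(w / (K @ w))
    if np.max(np.abs(w_new - w)) < 1e-12:
        w = w_new
        break
    w = w_new

S = (w[:, None] * K) * w[None, :]  # symmetric operator approximation

# The top eigenvalues/eigenvectors of S (or P) give the diffusion-map
# embedding; for a doubly stochastic S the leading eigenvalue is 1.
lam = np.linalg.eigvalsh(S)  # sorted ascending; lam[-1] is near 1
```

Because S is symmetric, its eigendecomposition is numerically well behaved (real eigenvalues, orthogonal eigenvectors), which is one practical payoff of the Sinkhorn normalization over the non-symmetric Markov matrix P.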

Metadata
Author details:Caroline L. Wormell, Sebastian Reich
DOI:https://doi.org/10.1137/20M1344093
ISSN:0036-1429
ISSN:1095-7170
Title of parent work (English):SIAM Journal on Numerical Analysis / Society for Industrial and Applied Mathematics
Subtitle (English):Improved error bounds and an alternative normalization
Publisher:Society for Industrial and Applied Mathematics
Place of publishing:Philadelphia
Publication type:Article
Language:English
Year of first publication:2021
Publication year:2021
Release date:2024/05/24
Tag:Sinkhorn problem; diffusion maps; graph Laplacian; kernel methods
Volume:59
Issue:3
Number of pages:48
First page:1687
Last Page:1734
Funding institution:Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) [SFB 1294/1-318763901]; European Research Council (ERC) under the European Union's Horizon 2020 Research and Innovation Programme [787304]
Organizational units:Mathematisch-Naturwissenschaftliche Fakultät / Institut für Mathematik
DDC classification:5 Naturwissenschaften und Mathematik / 51 Mathematik / 510 Mathematik
Peer review:Refereed