publications

(*) denotes equal contribution

2024

  1. NeurIPS (Spotlight)
    Mirror and Preconditioned Gradient Descent in Wasserstein Space
    Clément Bonet, Théo Uscidda, Adam David, and 2 more authors
    In the 38th Annual Conference on Neural Information Processing Systems
    Spotlight Presentation
    Wasserstein Gradient Flows, Optimization
  2. NeurIPS
    GENOT: Entropic (Gromov) Wasserstein Flow Matching with Applications to Single-Cell Genomics
    Dominik Klein*, Théo Uscidda*, Fabian Theis, and 1 more author
    In the 38th Annual Conference on Neural Information Processing Systems
    Generative Modeling, Multimodality, Neural Optimal Transport, Flow Matching
  3. SPIGM @ ICML
    Disentangled Representation Learning with the Gromov-Monge Gap
    Théo Uscidda*, Luca Eyring*, Karsten Roth, and 3 more authors
    In the Structured Probabilistic Inference & Generative Modeling Workshop at the 41st International Conference on Machine Learning
    Representation Learning, Geometric Learning, Neural Optimal Transport, VAEs
  4. ICLR
    Unbalancedness in Neural Monge Maps Improves Unpaired Domain Translation
    Luca Eyring*, Dominik Klein*, Théo Uscidda*, and 4 more authors
    In the 12th International Conference on Learning Representations
    Generative Modeling, Neural Optimal Transport, Flow Matching
  5. CCAI @ ICLR
    On the Potential of Optimal Transport in Geospatial Data Science
    Nina Wiedemann, Théo Uscidda, and Martin Raubal
    In the Tackling Climate Change with Machine Learning Workshop at the 12th International Conference on Learning Representations
    Geographic Information, Transportation Planning

2023

  1. ICML
    The Monge Gap: A Regularizer to Learn All Transport Maps
    Théo Uscidda and Marco Cuturi
    In the 40th International Conference on Machine Learning
    Generative Modeling, Neural Optimal Transport