publications

(*) denotes equal contribution

2024

  1. arXiv
    Mirror and Preconditioned Gradient Descent in Wasserstein Space
    Clément Bonet, Théo Uscidda, Adam David, and 2 more authors
    arXiv preprint, 2024
  2. SPIGM
    Disentangled Representation Learning through Geometry Preservation with the Gromov-Monge Gap
    Théo Uscidda*, Luca Eyring*, Karsten Roth, and 3 more authors
    In the Structured Probabilistic Inference & Generative Modeling Workshop at the 41st International Conference on Machine Learning, 2024
  3. ICLR
    Unbalancedness in Neural Monge Maps Improves Unpaired Domain Translation
    Luca Eyring*, Dominik Klein*, Théo Uscidda*, and 4 more authors
    In the 12th International Conference on Learning Representations, 2024

2023

  1. arXiv
    Entropic (Gromov) Wasserstein Flow Matching with GENOT
    Dominik Klein*, Théo Uscidda*, Fabian Theis, and 1 more author
    arXiv:2310.09254, 2023
  2. ICML
    The Monge Gap: A Regularizer to Learn All Transport Maps
    Théo Uscidda and Marco Cuturi
    In the 40th International Conference on Machine Learning, 2023