Théo Uscidda

Ph.D. with Marco Cuturi @ CREST - ENSAE, Institut Polytechnique de Paris
Visiting Ph.D. hosted by Fabian Theis @ Helmholtz Munich, Technical University of Munich
Internships: Amazon AWS AI Lab (Fundamental Research Team); Flatiron Institute, Simons Foundation.


Education: I am a Ph.D. student in machine learning working with Marco Cuturi (Apple MLR). Prior to that, I completed a master’s degree in Mathematics, Vision and Learning (MVA) at ENS Paris-Saclay. During my master’s thesis, I worked with Claire Boyer (Sorbonne Université), Julie Josse (INRIA) and Boris Muzellec (Owkin) at LPSM.

Research: I work on the interplay between optimal transport (OT), generative modeling, and representation learning. I aim to show how OT can introduce optimality structure into flow-, diffusion-, and VAE-based models, improving their performance. I have applied these methods across diverse areas, including image generation and unpaired image translation, disentangled representation learning, optimization over probability spaces, geographic information, and various single-cell genomics challenges, such as reconstructing cellular trajectories, predicting cellular responses to perturbations, and translating between single-cell data modalities. Recently, my focus has been on using OT to enhance and analyze the properties of vision and language foundation models.

Experience: Since February 2024, I have been a visiting Ph.D. student in Fabian Theis’ lab at Helmholtz Munich, where I have been working on generative models for predicting cellular responses to perturbations. From June to August 2024, I interned at the Flatiron Institute in Michael Shelley’s team, developing multi-marginal generative models for population dynamics. Since December 2024, I have been interning in the fundamental research team of Amazon led by Stefano Soatto, where I have been working on large language models for reasoning.

news

Jan 31, 2025 Our paper on disentangled representation learning using the Gromov-Wasserstein paradigm has been accepted to ICLR 2025. See you in Singapore!
Sep 26, 2024 Thrilled to announce two accepted papers at NeurIPS 2024: GENOT: Entropic (Gromov) Wasserstein Flow Matching and Mirror and Preconditioned Gradient Descent in Wasserstein Space, with the latter receiving a spotlight! Huge thanks to my coauthors for these amazing collaborations. See you in Vancouver!
Sep 2, 2024 Back in Munich! I have rejoined Fabian Theis’ lab at Helmholtz Munich, where I will keep working on single-cell perturbations.
Apr 25, 2024 I presented the unbalanced Monge maps at Google DeepMind’s reading group on generative models, transport and sampling, organized by Valentin de Bortoli and Arnaud Doucet. Looking forward to presenting the paper at ICLR 2024!
Jan 17, 2024 Our paper on unbalanced Monge maps has been accepted to ICLR 2024. See you in Vienna!

selected publications

  1. ICLR
    Disentangled Representation Learning with the Gromov-Monge Gap
    Théo Uscidda*, Luca Eyring*, Karsten Roth, and 3 more authors
    In the 13th International Conference on Learning Representations
    Representation Learning Geometric Learning Neural Optimal Transport VAEs
  2. NeurIPS Spotlight
    Mirror and Preconditioned Gradient Descent in Wasserstein Space
    Clément Bonet, Théo Uscidda, Adam David, and 2 more authors
    In the 38th Annual Conference on Neural Information Processing Systems
    Spotlight Presentation
    Wasserstein Gradient Flows Optimization
  3. NeurIPS
    GENOT: Entropic (Gromov) Wasserstein Flow Matching with Applications to Single-Cell Genomics
    Dominik Klein*, Théo Uscidda*, Fabian Theis, and 1 more author
    In the 38th Annual Conference on Neural Information Processing Systems
    Generative Modeling Multimodality Neural Optimal Transport Flow Matching
  4. ICLR
    Unbalancedness in Neural Monge Maps Improves Unpaired Domain Translation
    Luca Eyring*, Dominik Klein*, Théo Uscidda*, and 4 more authors
    In the 12th International Conference on Learning Representations
    Generative Modeling Neural Optimal Transport Flow Matching
  5. CCAI @ ICLR
    On the potential of Optimal Transport in Geospatial Data Science
    Nina Wiedemann, Théo Uscidda, and Martin Raubal
    In the Tackling Climate Change with Machine Learning Workshop at the 12th International Conference on Learning Representations
    Geographic Information Transportation Planning
  6. ICML
    The Monge Gap: A Regularizer to Learn All Transport Maps
    Théo Uscidda and Marco Cuturi
    In the 40th International Conference on Machine Learning
    Generative Modeling Neural Optimal Transport