Théo Uscidda
Ph.D. with Marco Cuturi @ CREST - ENSAE, Institut Polytechnique de Paris
Visiting Ph.D. student hosted by Fabian Theis @ Helmholtz Munich, Technical University of Munich
Internships: Amazon AWS AI Lab (Fundamental Research Team); Flatiron Institute, Simons Foundation.

Education: I am a Ph.D. student in machine learning working with Marco Cuturi (Apple MLR). Prior to that, I completed a master’s degree in Mathematics, Vision and Learning (MVA) at ENS Paris-Saclay. For my master’s thesis, I worked with Claire Boyer (Sorbonne Université), Julie Josse (INRIA), and Boris Muzellec (Owkin) at LPSM.
Research: I work on the interplay between optimal transport (OT), generative modeling, and representation learning. I aim to show how OT can introduce optimality structure into flow-, diffusion-, and VAE-based models, improving their performance. I have applied these methods across diverse areas, including image generation and unpaired image translation, disentangled representation learning, optimization over probability spaces, geographic information, and various single-cell genomics challenges, such as reconstructing cellular trajectories, predicting cellular responses to perturbations, and translating between single-cell data modalities. More recently, I have focused on using OT to analyze and enhance the properties of vision and language foundation models.
Experiences: Since February 2024, I have been a visiting Ph.D. student in Fabian Theis’ lab at Helmholtz Munich, working on generative models for predicting cellular responses to perturbations. From June to August 2024, I interned at the Flatiron Institute on Michael Shelley’s team, developing multi-marginal generative models for population dynamics. Since December 2024, I have been interning on Amazon’s fundamental research team, led by Stefano Soatto, working on large language models for reasoning.
news
Jan 31, 2025: Our paper on disentangled representation learning using the Gromov-Wasserstein paradigm has been accepted to ICLR 2025. See you in Singapore!
Sep 26, 2024: Thrilled to announce two accepted papers at NeurIPS 2024: GENOT: Entropic (Gromov) Wasserstein Flow Matching and Mirror and Preconditioned Gradient Descent in Wasserstein Space, with the latter receiving a spotlight! Huge thanks to my coauthors for these amazing collaborations. See you in Vancouver!
Sep 2, 2024: Back in Munich! I have rejoined Fabian Theis’ lab at Helmholtz Munich, where I will keep working on single-cell perturbations.
Apr 25, 2024: I presented our work on unbalanced Monge maps at Google DeepMind’s reading group on generative models, transport, and sampling, organized by Valentin De Bortoli and Arnaud Doucet. Looking forward to presenting the paper at ICLR 2024!
Jan 17, 2024: Our paper on unbalanced Monge maps has been accepted to ICLR 2024. See you in Vienna!