Maths and algorithms

Coming soon!

If you’re interested in computational Optimal Transport, I strongly recommend reading my tutorial on gradient flows and the notebooks in the gallery that present the multiscale Sinkhorn algorithm.
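
As a taste of what those notebooks cover, here is a minimal sketch of a Sinkhorn-driven gradient flow between two point clouds, written against the SamplesLoss interface of this library. The toy point clouds, blur value and step size are illustrative placeholders, not values taken from the tutorial.

    # A minimal sketch of a gradient flow driven by a Sinkhorn divergence.
    # The point clouds, blur and step size below are illustrative only.
    import torch
    from geomloss import SamplesLoss

    x = torch.rand(500, 2, requires_grad=True)   # source samples, free to move
    y = torch.rand(800, 2)                       # target samples, kept fixed

    # De-biased Sinkhorn divergence with a blur (temperature) of 0.05.
    loss = SamplesLoss(loss="sinkhorn", p=2, blur=0.05)

    for _ in range(100):
        L = loss(x, y)                           # scalar divergence value
        [g] = torch.autograd.grad(L, [x])        # gradient w.r.t. the source points
        x.data -= 0.05 * len(x) * g              # explicit Euler step on the particles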

Going further

A comprehensive reference on the algorithms implemented here will be provided in my PhD thesis (expected around November 2019), covering everything from CUDA tricks to theoretical results on Sinkhorn divergences and applications to medical data. Until then, you may be interested in:

  • My slides and poster on Hausdorff distances, Kernel MMDs and Sinkhorn divergences.

  • The tutorial on gradient flows that I wrote for students at the math department of the ENS.

  • Our AISTATS 2019 paper, which proves the positive-definiteness and convexity of Sinkhorn divergences.

  • Our ShapeMI 2018 paper, which puts the three geometric loss functions in a common framework (see the sketch after this list).

  • Aude Genevay’s PhD thesis, which provides an in-depth study of the statistical properties of Sinkhorn divergences.

  • Bernhard Schmitzer’s sparse scaling paper, which introduced the first multiscale Sinkhorn loop.
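
To make the “common framework” concrete, here is a hedged sketch showing how the three families of losses (kernel MMDs, Hausdorff divergences and Sinkhorn divergences) can all be evaluated through the same SamplesLoss interface; the point clouds and blur values are arbitrary placeholders, for illustration only.

    # The three families of geometric losses, accessed through one interface.
    # Point clouds and blur values are arbitrary, for illustration only.
    import torch
    from geomloss import SamplesLoss

    x, y = torch.rand(100, 3), torch.rand(150, 3)

    losses = {
        "kernel MMD (Gaussian)": SamplesLoss(loss="gaussian", blur=0.1),
        "Hausdorff divergence":  SamplesLoss(loss="hausdorff", p=2, blur=0.1),
        "Sinkhorn divergence":   SamplesLoss(loss="sinkhorn", p=2, blur=0.1),
    }
    for name, loss in losses.items():
        print(f"{name:>24}: {loss(x, y).item():.4f}")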