Diffeomorphic Learning

Several papers have explored the use of homeomorphic or diffeomorphic transformations within feed-forward machine learning models. Discrete invertible transformations, closely related to the ResNet architecture (He et al., CVPR 2016), were proposed as "normalizing flows" by Rezende and Mohamed (ICML 2015), and extended to a time-continuous form by Chen et al. (NeurIPS 2018). Continuous-time optimal control as a learning principle was proposed by Weinan E (Commun. Math. Stat., 2017).
In a similar vein, the LDDMM framework can be adapted to build powerful predictors in typical data-science contexts. The following two papers explore this paradigm for classification [1] and regression [2].
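
The common idea, in both the flow-based and the LDDMM-inspired settings, is to transport the data through a (near-)invertible transformation before applying a simple predictor. The sketch below is only an illustration of this principle, not the method of [1] or [2]: it discretizes a flow as a sequence of small residual steps with a shared velocity field, followed by a linear read-out, and omits the kernel-based regularization that the LDDMM framework would impose. All class and parameter names (VelocityField, DiffeoClassifier, n_steps) are illustrative assumptions.

import torch
import torch.nn as nn

class VelocityField(nn.Module):
    """Small network producing the velocity v(x) that drives the flow."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, x):
        return self.net(x)

class DiffeoClassifier(nn.Module):
    """Euler discretization of a flow, x <- x + (1/T) v(x), repeated T times
    (each small step is close to an invertible map), followed by a linear
    classifier on the transported data."""
    def __init__(self, dim, n_classes, n_steps=10):
        super().__init__()
        self.velocity = VelocityField(dim)
        self.n_steps = n_steps
        self.readout = nn.Linear(dim, n_classes)

    def forward(self, x):
        h = 1.0 / self.n_steps
        for _ in range(self.n_steps):
            x = x + h * self.velocity(x)
        return self.readout(x)

# Toy usage: a two-class problem in 2D that a linear classifier cannot solve.
if __name__ == "__main__":
    torch.manual_seed(0)
    X = torch.randn(200, 2)
    y = (X[:, 0] * X[:, 1] > 0).long()  # XOR-like labels
    model = DiffeoClassifier(dim=2, n_classes=2)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(200):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    acc = (model(X).argmax(dim=1) == y).float().mean().item()
    print(f"training accuracy: {acc:.2f}")

Replacing the linear read-out with a scalar output and a squared-error loss gives the analogous regression sketch.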


[1] Diffeomorphic Learning, L. Younes, Journal of Machine Learning Research, 21(220):1-28, 2020.
[2] FineMorphs: Affine-diffeomorphic sequences for regression, M. Lohr and L. Younes, arXiv:2305.17255, 2023.