Radu-Alexandru Dragomir

- Researcher in numerical optimization and data science -

About me


Since 2024, I have been an assistant professor in Applied Mathematics in the S2A team at Télécom Paris. My research revolves around machine learning and signal processing methods.

Prior to that, I held two post-doctoral positions: first at UCLouvain with Yurii Nesterov, then at EPFL in the OPTIM group of Nicolas Boumal.

From 2018 to 2021, I did my PhD under the joint supervision of Jérôme Bolte and Alexandre d'Aspremont, within the SIERRA team in Paris.

Email: dragomir [at] telecom-paris.fr


Research

I study optimization methods for solving large-scale problems arising in signal processing and data science. My research focuses on understanding problems with non-quadratic and non-convex objectives. Among other topics, I am interested in:

  • Mirror descent
  • Implicit bias of neural network training
  • Nonlinear inverse problems
  • Computer-aided performance estimation
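
To give a flavor of the first topic, here is a minimal sketch of mirror descent with the entropy mirror map, which minimizes a function over the probability simplex via multiplicative updates. This is a generic textbook illustration (the function names and step size are my own choices, not code from my papers):

```python
import numpy as np

def entropic_mirror_descent(grad, x0, step, n_iters):
    """Mirror descent on the probability simplex with the entropy mirror map.

    Each iteration performs the multiplicative update
    x_{k+1} proportional to x_k * exp(-step * grad(x_k)),
    then renormalizes (the Bregman projection onto the simplex).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x * np.exp(-step * grad(x))
        x /= x.sum()  # project back onto the simplex
    return x

# Toy example: minimize the linear function <c, x> over the simplex.
# The minimizer concentrates all mass on the smallest-cost coordinate.
c = np.array([3.0, 1.0, 2.0])
x_star = entropic_mirror_descent(lambda x: c, np.ones(3) / 3,
                                 step=0.5, n_iters=200)
```

For a linear objective the iterates converge to the vertex of the simplex with the smallest cost, here the second coordinate.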

I am also interested in the ethical aspects of machine learning and artificial intelligence. I explore topics such as:

  • Reliability and robustness of machine learning algorithms
  • Political impact of social networks and recommender systems
  • Environmental cost of AI

Teaching

  • Optimization for Machine Learning (Télécom Paris, M1)
  • Signal processing tools for audio and image (Télécom Paris, L3)
  • Numerical analysis (Télécom Paris, L3)
  • Ecological and social transition (Télécom Paris, L3)

Preprints

  • R-A. Dragomir, Y. Nesterov. Convex Quartic Problems: Homogenized Gradient Method and Preconditioning. [arxiv] [slides]

Publications

  • S. Pesme, R-A. Dragomir, N. Flammarion. Implicit Bias of Mirror Flow on Separable Data.
    NeurIPS, 2024. [arxiv]
  • R-A. Dragomir, M. Even, H. Hendrikx. Fast Stochastic Bregman Gradient Methods: Sharp Analysis and Variance Reduction.
    International Conference on Machine Learning, 2021. [PMLR] [arxiv] [slides]
  • R-A. Dragomir, A. B. Taylor, A. d'Aspremont, J. Bolte. Optimal Complexity and Certification of Bregman First-Order Methods.
    Mathematical Programming, 2021. [springer] [arxiv] [GeoGebra demo] [code]
  • R-A. Dragomir, A. d'Aspremont, J. Bolte. Quartic First-Order Methods for Low-Rank Minimization.
    Journal of Optimization Theory and Applications, 2021. [springer] [arxiv] [code]

Thesis

  • R-A. Dragomir, Bregman Gradient Methods for Relatively-Smooth Optimization.
    PhD thesis, 2021. Advised by Jérôme Bolte and Alexandre d'Aspremont. [pdf] [slides] [video]