Research
My research is in machine learning, with a focus on efficient algorithms for estimating and sampling from probabilistic models, and on learning useful representations of brain activity.
Sampling from Energy-Based Probabilistic Models
Provable Convergence and Limitations of Geometric Tempering for Langevin Dynamics
Omar Chehab, Anna Korba, Austin Stromme, Adrien Vacher
arXiv, 2024
arxiv / poster
We presented this work at the Yale workshop on sampling.
A Practical Diffusion Path for Sampling
Omar Chehab, Anna Korba
SPIGM Workshop, International Conference on Machine Learning (ICML), 2024
arxiv / poster
Estimating Energy-Based Probabilistic Models
Provable benefits of annealing for estimating normalizing constants: Importance Sampling, Noise-Contrastive Estimation, and beyond
Omar Chehab, Aapo Hyvärinen, Andrej Risteski
Spotlight, Advances in Neural Information Processing Systems (NeurIPS), 2023
arxiv / code / poster
The Optimal Noise in Noise-Contrastive Learning Is Not What You Think
Omar Chehab, Alexandre Gramfort, Aapo Hyvärinen
Conference on Uncertainty in Artificial Intelligence (UAI), 2022
arxiv / code / poster
Learning Representations of Brain Activity
Deep Recurrent Encoder: an end-to-end network to model magnetoencephalography at scale
Omar Chehab*, Alexandre Défossez*, Jean-Christophe Loiseau, Alexandre Gramfort, Jean-Rémi King
Neurons, Behavior, Data Analysis, and Theory, 2022
arxiv / code
Learning with self-supervision on EEG data
Alexandre Gramfort, Hubert Banville, Omar Chehab, Aapo Hyvärinen, Denis Engemann
IEEE Workshop on Brain-Computer Interface, 2021
arxiv
Uncovering the structure of clinical EEG signals with self-supervised learning
Hubert Banville, Omar Chehab, Aapo Hyvärinen, Denis Engemann, Alexandre Gramfort
Journal of Neural Engineering, 2021
arxiv
A mean-field approach to the dynamics of networks of complex neurons, from nonlinear Integrate-and-Fire to Hodgkin–Huxley models
Mallory Carlu, Omar Chehab, Leonardo Dalla Porta, Damien Depannemaecker, Charlotte Héricé, Maciej Jedynak, Elif Köksal Ersöz, Paulo Muratore, Selma Souihe, Cristiano Capone, Yann Zerlaut, Alain Destexhe, Matteo di Volo
Journal of Neurophysiology, 2020
arxiv
Talks
* 10/2024, Gregory Wornell's team seminar, MIT, USA
* 10/2024, Youssef Marzouk's team seminar, MIT, USA
* 03/2024, Takeru Matsuda's team seminar, RIKEN, Japan
* 03/2023, Self-Supervised Learning Reading Group, Vector Institute, Canada
* 02/2020, First International Workshop on Nonlinear ICA, Inria, France
Teaching
I was a teaching assistant for the following Master's courses.
Optimization for Data Science, Institut Polytechnique de Paris (2021-2023)
Professors: Alexandre Gramfort, Pierre Ablin
Optimization, CentraleSupélec, Université Paris-Saclay (2020-2021)
Professors: Jean-Christophe Pesquet, Sorin Olaru, Stéphane Font
Advanced Machine Learning, CentraleSupélec, Université Paris-Saclay (2020-2022)
Professors: Émilie Chouzenoux, Frédéric Pascal
Service
I review machine learning papers for the NeurIPS, ICML, ICLR and AISTATS conferences.
I am grateful to have been recognized as a "top reviewer" for AISTATS 2022 and for NeurIPS 2022, 2023 and 2024.