CV
Here is an abridged version of my CV.
Basics
Name: Hugo Latourelle-Vigeant
Education
-
2024 - Present Connecticut, USA
-
2022 - 2024 Montreal, Canada
M.Sc. Mathematics and Statistics
McGill University
- Master's Thesis: The matrix Dyson equation for machine learning: Correlated linearizations and the test error in random features regression
-
2018 - 2022 Montreal, Canada
B.Sc. Joint Honours Mathematics and Computer Science
McGill University
- Graduated with First Class Joint Honours
Work
-
2024 - Present Montreal, Canada
Data Science Intern
CDPQ
Data science internship at CDPQ during the summer of 2024, as part of the NLP team.
Publications
-
2024 The matrix Dyson equation for machine learning: Correlated linearizations and the test error in random features regression
Hugo Latourelle-Vigeant
McGill University
Summary: Extended the matrix Dyson equation framework to derive an anisotropic global law for pseudo-resolvents with general correlation structures. Applied this framework to provide an exact deterministic expression for the empirical test error in random features ridge regression, addressing aspects such as existence-uniqueness, spectral support bounds, and stability properties.
-
2024 Dyson Equation for Correlated Linearizations and Test Error of Random Features Regression
Hugo Latourelle-Vigeant and Elliot Paquette
Preprint: arXiv:2312.09194v2
Summary: Developed theory for the matrix Dyson equation for correlated linearizations, including existence-uniqueness, spectral support bounds, and stability properties, and applied this to derive a deterministic equivalent for the test error in random features ridge regression.
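For context, here is a minimal sketch of the random features ridge regression setting this work studies; the notation (F, W, sigma, lambda, theta-hat) is chosen here for illustration and is not necessarily the paper's own.

```latex
% Minimal sketch (illustrative notation, not necessarily the paper's).
% Data X in R^{d x n}, labels y in R^n, random weights W in R^{k x d},
% entrywise nonlinearity \sigma, ridge parameter \lambda > 0.
\[
  F = \sigma(W X) \in \mathbb{R}^{k \times n}, \qquad
  \hat{\theta} = \arg\min_{\theta \in \mathbb{R}^{k}}
    \frac{1}{n}\| y - F^{\top}\theta \|_{2}^{2} + \lambda \|\theta\|_{2}^{2}.
\]
% The test error on fresh data (\tilde{X}, \tilde{y}) is the random quantity whose
% deterministic equivalent is derived via the matrix Dyson equation:
\[
  \mathcal{E}_{\mathrm{test}}
    = \frac{1}{\tilde{n}} \| \tilde{y} - \sigma(W\tilde{X})^{\top}\hat{\theta} \|_{2}^{2}
    \;\approx\; \mathcal{E}_{\mathrm{det}},
\]
% where \mathcal{E}_{\mathrm{det}} is deterministic and is computed from the solution of
% the matrix Dyson equation associated with a correlated linearization of the model.
```

The contribution summarized above is the characterization of this deterministic equivalent through the matrix Dyson equation applied to a correlated linearization of the problem.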
Presentations
-
2023.12.03 Matrix Dyson Equation for Correlated Linearizations
The Many Facets of Random Matrix Theory workshop at the Canadian Mathematical Society Winter Meeting
Summary: Extended the matrix Dyson equation framework for linearizations to derive an anisotropic global law for pseudo-resolvents with general correlation structures, and applied this to derive an exact asymptotic expression for the validation error of random features ridge regression.
-
2023.09.06 Matrix Dyson Equation for Linearizations
Seminar in random matrix theory, machine learning and optimization at McGill University
Summary: Extended the matrix Dyson equation framework to analyze rational expressions in random matrices using a linearization trick, and applied this to study the test error of a random feature model.
-
2021.08.23 GD and Large Linear Regression: Concentration and Asymptotics for a Spiked Model
4th Undergraduate Student Research Conference at McGill University
Summary: Demonstrated that the halting time in large-scale spiked random least squares problems trained with gradient descent exhibits a universality property, independent of the input probability distribution, and provided explicit asymptotic results.
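For context, a minimal sketch of the kind of halting-time quantity this talk concerns, assuming gradient descent on a least squares objective; the symbols (A, b, eta, T_epsilon) are illustrative and not taken from the talk itself.

```latex
% Illustrative notation only: gradient descent on a random least squares problem.
\[
  f(x) = \frac{1}{2n}\| A x - b \|_{2}^{2}, \qquad
  x_{k+1} = x_{k} - \eta \nabla f(x_{k}),
\]
% with the halting time defined as the first iterate whose gradient is small:
\[
  T_{\varepsilon} = \min\{\, k \ge 0 : \|\nabla f(x_{k})\|_{2} \le \varepsilon \,\}.
\]
% Universality here means that, in the high-dimensional limit, the distribution of
% T_{\varepsilon} does not depend on the distribution of the random entries of A,
% which is the property described for a spiked model in this talk.
```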
Teaching
-
Winter 2024 Calculus 2 - MATH 141
Teaching Assistant
-
Winter 2024 Convex Optimization - MATH 463/563
Graduate Course Assistant
-
Winter 2023 Calculus 2 - MATH 141
Teaching Assistant
-
Winter 2023 Convex Optimization - MATH 463/563
Graduate Course Assistant
-
Fall 2022 Numerical Optimization - MATH 560
Graduate Course Assistant
-
Fall 2022 Calculus 2 - MATH 141
Teaching Assistant
-
Winter 2022 Numerical Optimization - MATH 560
Undergraduate Course Assistant
Awards
-
2022
First-class honours in Mathematics and Computer Science
McGill University
-
2021
Undergraduate Student Research Award
NSERC
The NSERC Undergraduate Student Research Award is a competitive award granted by the Natural Sciences and Engineering Research Council of Canada (NSERC) on the basis of academic excellence and research potential; it supports a full-time undergraduate summer research project.
-
2018
Major Entrance Scholarship in Science
Hydro-Québec
Organizer
-
Fall 2023 Montreal RMT-ML-OPT seminar at McGill University
Reviewer
-
NeurIPS 2024 OPT Workshop on Optimization for Machine Learning
-
ICML 2024 2nd Workshop on High-dimensional Learning Dynamics (HiLD): The Emergence of Structure and Reasoning
-
ICML 2023 High-dimensional Learning Dynamics Workshop
-
NeurIPS 2023 OPT Workshop on Optimization for Machine Learning
-
NeurIPS 2022 OPT Workshop on Optimization for Machine Learning