I am an MSCA postdoctoral researcher at INRIA Paris, in the Sierra team, working with Francis Bach.
Prior to INRIA, I received my PhD from UBC in 2024, where I worked with Mark Schmidt, and my BSc and MSc from EPFL, where I worked with Martin Jaggi. I have also had the chance to work at the MPI with Philipp Hennig and at RIKEN with Emtiyaz Khan.
My research interests lie at the intersection of optimization theory and machine learning.
Papers
- Scaling Laws for Gradient Descent and Sign Descent for Linear Bigram Models under Zipf's Law
  FK, Francis Bach. NeurIPS 2025.
- Heavy-Tailed Class Imbalance and Why Adam Outperforms Gradient Descent on Language Models
  FK, Robin Yadav, Alan Milligan, Mark Schmidt, Alberto Bietti. NeurIPS 2024.
- Searching for Optimal Per-Coordinate Step-sizes with Multidimensional Backtracking
  FK, Victor Sanches Portella, Mark Schmidt, Nick Harvey. NeurIPS 2023.
- Homeomorphic-Invariance of EM: Non-Asymptotic Convergence in KL Divergence for Exponential Families via Mirror Descent
  FK, Raunak Kumar, Mark Schmidt. AISTATS 2021.
- BackPACK: Packing more into backprop
  Felix Dangel, FK, Philipp Hennig. ICLR 2020.
Software utilities
Citation explorer for literature reviews (through Greg d’Eon)
Bibcleaner to clean BibTeX entries
Tex2UTF8 for places that do not support LaTeX but happily render UTF8 (finally!)
DatasetDownloader for libsvm datasets