Distributed Multi-Task Learning

Abstract

We consider the problem of distributed multi-task learning, where each machine learns a separate, but related, task. Specifically, each machine learns a linear predictor in high-dimensional space, where all tasks share the same small support. We present a communication-efficient estimator based on the debiased lasso and show that it is comparable with the optimal centralized method.
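The abstract's setup can be sketched in a few lines: each machine fits a lasso to its own task, applies a one-step debiasing correction, and communicates only its d-dimensional debiased vector; the center averages and thresholds to recover the shared support. This is a simplified illustration, not the paper's exact estimator: the design here is i.i.d. standard Gaussian, so the precision matrix is the identity and the debiasing matrix M = I is a valid shortcut, and the regularization level and threshold are illustrative choices.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper):
n, d, s, n_machines = 200, 50, 5, 4  # samples/machine, dimension, support size, tasks
support = rng.choice(d, size=s, replace=False)  # support shared by all tasks

def debiased_lasso(X, y, alpha=0.1):
    """One machine: lasso fit plus a one-step debiasing correction.

    With an isotropic Gaussian design the population precision matrix is
    the identity, so M = I is used here in place of a nodewise-regression
    estimate of the precision matrix (a deliberate simplification)."""
    beta = Lasso(alpha=alpha, fit_intercept=False).fit(X, y).coef_
    m = X.shape[0]
    return beta + X.T @ (y - X @ beta) / m  # debiasing step with M = I

# Each machine solves a separate task; only the debiased vector is sent.
debiased = []
for _ in range(n_machines):
    beta_star = np.zeros(d)
    beta_star[support] = 2.0          # task signal on the shared support
    X = rng.normal(size=(n, d))
    y = X @ beta_star + 0.1 * rng.normal(size=n)
    debiased.append(debiased_lasso(X, y))

# Center: average squared debiased coefficients across tasks, then
# threshold to estimate the common support (threshold is illustrative).
score = np.mean(np.square(debiased), axis=0)
est_support = set(np.flatnonzero(score > 0.5))
```

Averaging across machines reduces the variance of the debiased coordinates, which is what lets the one-round protocol approach the centralized estimator's support-recovery performance.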

Publication
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
Jialei Wang
PhD (2013-2018)

Jialei received his PhD in Computer Science from the University of Chicago in June 2019. His advisors were Nathan Srebro and Mladen Kolar.

Mladen Kolar
Associate Professor of Econometrics and Statistics

Mladen Kolar is an Associate Professor of Econometrics and Statistics at the University of Chicago Booth School of Business. His research focuses on high-dimensional statistical methods, graphical models, varying-coefficient models, and data mining, driven by the need to uncover interesting and scientifically meaningful structure in observational data.