Feb 25, 2020 · This work considers the setting of distributed empirical risk minimization, where multiple machines compute the gradients in parallel. In order to reduce the number of communications required to reach a given accuracy, we propose a \emph{preconditioned} accelerated gradient method where the preconditioning is done by solving a local optimization problem over a subsampled dataset. We have introduced SPAG, an accelerated algorithm that performs statistical preconditioning for large-scale distributed optimization.