
[BPT19] Distributed Optimization for Deep Learning with Gossip Exchange

International peer-reviewed journal: Neurocomputing, 2019, (doi:10.1016/j.neucom.2018.11.002)
Abstract: We address the issue of speeding up the training of convolutional neural networks by studying a distributed method adapted to stochastic gradient descent. Our parallel optimization setup uses several threads, each applying individual gradient descents on a local variable. We propose a new way of sharing information between different threads based on gossip algorithms that show good consensus convergence properties. Our method, called GoSGD, has the advantage of being fully asynchronous and decentralized.
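The sketch below is not the authors' GoSGD implementation; it is a minimal illustration of the general idea described in the abstract, under simplifying assumptions: a synthetic least-squares task, sequential simulation of the threads, and a plain pairwise-averaging gossip rule. All names (grad, gossip_prob, etc.) and hyperparameters are illustrative choices, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression task shared by all workers (stand-in for CNN training).
X = rng.normal(size=(256, 10))
w_true = rng.normal(size=10)
y = X @ w_true + 0.1 * rng.normal(size=256)

def grad(w, batch_idx):
    """Stochastic gradient of the squared loss on a mini-batch."""
    Xb, yb = X[batch_idx], y[batch_idx]
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(batch_idx)

n_workers, lr, steps, gossip_prob = 4, 0.05, 500, 0.2
workers = [rng.normal(size=10) for _ in range(n_workers)]  # one local variable per thread

for t in range(steps):
    for i in range(n_workers):
        # Local SGD step on this worker's own variable.
        batch = rng.choice(len(y), size=16, replace=False)
        workers[i] -= lr * grad(workers[i], batch)
        # Occasionally gossip: exchange with a randomly chosen peer and
        # average the two local variables (a simple consensus-style update).
        if rng.random() < gossip_prob:
            j = int(rng.integers(n_workers))
            if j != i:
                avg = 0.5 * (workers[i] + workers[j])
                workers[i], workers[j] = avg.copy(), avg.copy()

consensus = np.mean(workers, axis=0)
print("distance of averaged model to w_true:", np.linalg.norm(consensus - w_true))

In the actual method, each worker runs in its own thread and exchanges happen asynchronously over the network rather than in a single loop; the simulation above only conveys the local-descent-plus-random-pairwise-exchange structure.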

BibTeX

@article{BPT19,
  title="{Distributed Optimization for Deep Learning with Gossip Exchange}",
  author="M. Blot and D. Picard and N. Thome and M. Cord",
  journal="Neurocomputing",
  year=2019,
  doi="10.1016/j.neucom.2018.11.002",
}