Deep Neural Network Training with Distributed K-FAC

Published in 2022 in "IEEE Transactions on Parallel and Distributed Systems"

DOI: 10.1109/tpds.2022.3161187

Abstract: Scaling deep neural network training to more processors and larger batch sizes is key to reducing end-to-end training time; yet, maintaining comparable convergence and hardware utilization at larger scales is a challenge. Increases in training…
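For context, K-FAC (Kronecker-Factored Approximate Curvature) is a second-order optimization method that approximates each layer's block of the Fisher information matrix as a Kronecker product of two small covariance matrices, making the preconditioner cheap enough to invert. The sketch below is not the paper's distributed implementation; it is a minimal single-layer illustration of the K-FAC preconditioning step, with the function name `kfac_precondition` and the toy shapes chosen here for demonstration.

```python
import numpy as np

def kfac_precondition(grad_W, acts, grad_out, damping=1e-3):
    """Precondition a linear layer's gradient with K-FAC's
    Kronecker-factored curvature approximation F ~= A (x) G.

    grad_W:   (out, in)   gradient of the loss w.r.t. the weights
    acts:     (batch, in)  layer inputs (activations)
    grad_out: (batch, out) gradients w.r.t. the layer's outputs
    """
    n = acts.shape[0]
    # Kronecker factors: input covariance A and output-gradient covariance G,
    # each damped with a small multiple of the identity for invertibility.
    A = acts.T @ acts / n + damping * np.eye(acts.shape[1])
    G = grad_out.T @ grad_out / n + damping * np.eye(grad_out.shape[1])
    # Using (A (x) G)^{-1} vec(V) == vec(G^{-1} V A^{-1}) for symmetric A, G:
    return np.linalg.solve(G, grad_W) @ np.linalg.inv(A)

# Toy usage: one linear layer (4 inputs, 3 outputs), batch of 8.
rng = np.random.default_rng(0)
acts = rng.standard_normal((8, 4))
grad_out = rng.standard_normal((8, 3))
grad_W = grad_out.T @ acts / 8
update = kfac_precondition(grad_W, acts, grad_out)
print(update.shape)  # same shape as grad_W
```

In a distributed setting, the expensive pieces are forming and inverting the factors `A` and `G` for every layer; distributed K-FAC schemes typically shard that work across workers, which is the scaling problem the paper addresses.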

Keywords: neural network; training; deep neural; training distributed; …