Published in 2021 in IEEE Transactions on Cloud Computing
DOI: 10.1109/tcc.2021.3062398
Abstract: Distributed deep learning has been widely used in training deep neural networks, especially for big models on massive datasets. Parameter Server (PS) architecture is the most popular distributed training framework, which can flexibly design the…
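For orientation, the snippet below is a minimal, illustrative sketch of the general Parameter Server (PS) idea the abstract refers to: workers compute gradients on local data shards and a central server averages them and updates the global model. The class names (ParameterServer, Worker), the synchronous averaging scheme, and the toy least-squares task are assumptions for illustration only; they are not taken from the paper and do not reflect its specific method for handling stragglers.

# Toy synchronous Parameter Server loop (illustrative sketch, not the paper's algorithm).
import numpy as np

class ParameterServer:
    """Holds the global model and aggregates worker gradients."""
    def __init__(self, dim, lr=0.1):
        self.weights = np.zeros(dim)
        self.lr = lr

    def update(self, gradients):
        # Average gradients from all workers, then apply one SGD step.
        avg_grad = np.mean(gradients, axis=0)
        self.weights -= self.lr * avg_grad
        return self.weights

class Worker:
    """Computes a gradient on its local data shard (here: a least-squares loss)."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def compute_gradient(self, weights):
        pred = self.x @ weights
        return self.x.T @ (pred - self.y) / len(self.y)

# Toy run: 4 workers, each holding a random shard of synthetic data.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
workers = []
for _ in range(4):
    x = rng.normal(size=(32, 2))
    workers.append(Worker(x, x @ true_w))

ps = ParameterServer(dim=2)
for step in range(100):
    grads = [w.compute_gradient(ps.weights) for w in workers]
    ps.update(grads)

print(ps.weights)  # approaches [2.0, -1.0]

In a real deployment the server waits on (or selectively ignores) slow workers each round, which is where straggler-mitigation strategies such as the one studied in this paper come into play.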
Keywords: distributed deep; deep learning; group; eliminating stragglers; …