Published in 2022 in "IEEE Transactions on Parallel and Distributed Systems"
DOI: 10.1109/tpds.2022.3200518
Abstract: In distributed training, workers collaboratively refine the global model parameters by pushing their updates to the Parameter Server and pulling fresher parameters for the next iteration. This introduces high communication costs for training at scale,…
Keywords: parameter server; yet effective; framework; effective framework