Published in 2022 in IEEE Transactions on Neural Networks and Learning Systems
DOI: 10.1109/tnnls.2022.3151736
Abstract: Distributed second-order optimization, as an effective strategy for training large-scale machine learning systems, has been widely investigated due to its low communication complexity. However, the existing distributed second-order optimization algorithms, including distributed approximate Newton (DANE),…
Keywords: distributed second; accelerated distributed; approximate newton; distributed approximate ...