Machine-learned regression models represent a promising tool to implement accurate and computationally affordable energy-density functionals to solve quantum many-body problems via density functional theory. However, while they can easily be trained to accurately map ground-state density profiles to the corresponding energies, their functional derivatives often turn out to be too noisy, leading to instabilities in self-consistent iterations and in gradient-based searches for the ground-state density profile. We investigate how these instabilities arise when standard deep neural networks are adopted as regression models, and we show how to avoid them by using an ad hoc convolutional architecture featuring an interchannel averaging layer. The main testbed we consider is a realistic model for noninteracting atoms in optical speckle disorder. With the interchannel average, accurate and systematically improvable ground-state energies and density profiles are obtained via gradient-descent optimization, without instabilities or violations of the variational principle.
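To make the idea concrete, below is a minimal sketch, assuming a PyTorch implementation, of a 1D convolutional energy functional with an interchannel averaging layer, followed by a gradient-descent search for the ground-state density. The channel count, kernel size, softplus activations, placement of the averaging layers, and the softmax parametrization used to keep the density positive and normalized are all illustrative assumptions, not the architecture or constraint handling used in the paper.

```python
# Illustrative sketch: CNN energy functional with interchannel averaging,
# plus a gradient-descent search for the ground-state density profile.
# Hyperparameters and layer choices are assumptions for demonstration only.
import torch
import torch.nn as nn


class InterchannelAverage(nn.Module):
    """Replace every channel with the mean over channels.

    Averaging over channels suppresses channel-specific noise that would
    otherwise contaminate the functional derivative dE/drho(x) obtained by
    backpropagating through the network.
    """

    def forward(self, x):  # x: (batch, channels, grid_points)
        return x.mean(dim=1, keepdim=True).expand_as(x)


class EnergyFunctionalCNN(nn.Module):
    """Maps a discretized density profile rho(x) to a scalar energy."""

    def __init__(self, channels=16, kernel_size=13):
        super().__init__()
        pad = kernel_size // 2
        self.net = nn.Sequential(
            nn.Conv1d(1, channels, kernel_size, padding=pad),
            nn.Softplus(),
            InterchannelAverage(),
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.Softplus(),
            InterchannelAverage(),
            nn.Conv1d(channels, 1, kernel_size, padding=pad),
        )

    def forward(self, rho):  # rho: (batch, 1, grid_points)
        # Average the local output over the grid to obtain a scalar energy.
        return self.net(rho).mean(dim=-1).squeeze(-1)


# Gradient-descent search: minimize the predicted energy with respect to rho,
# parametrized through a softmax so it stays positive and normalized.
model = EnergyFunctionalCNN()
grid_points = 256
logits = torch.zeros(1, 1, grid_points, requires_grad=True)
optimizer = torch.optim.SGD([logits], lr=0.1)
for step in range(200):
    rho = torch.softmax(logits, dim=-1)  # positive, sums to 1 on the grid
    energy = model(rho).sum()
    optimizer.zero_grad()
    energy.backward()
    optimizer.step()
```

The design choice highlighted by the sketch is that the averaging acts across channels rather than along the spatial grid, so the spatial resolution of the density profile is preserved while the backpropagated gradient with respect to rho(x) becomes smooth enough for stable gradient-descent iterations.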
               