Published in 2022 in "Natural Language Engineering"
DOI: 10.1017/s1351324922000237
Abstract: This paper describes gft (general fine-tuning), a little language for deep nets, introduced at an ACL-2022 tutorial. gft makes deep nets accessible to a broad audience including non-programmers. It is standard practice in many…
Keywords: general fine-tuning; deep nets; gft; emerging trends

Published in 2022 in "Natural Language Engineering"
DOI: 10.1017/s1351324922000365
Abstract: Deep nets are becoming larger and larger in practice, with no respect for (non-)factors that ought to limit growth, including the so-called curse of dimensionality (CoD). Donoho suggested that dimensionality can be a blessing…
Keywords: deep nets thrive; practice; emerging trends

Published in 2019 in "IEEE Transactions on Neural Networks and Learning Systems"
DOI: 10.1109/tnnls.2018.2868980
Abstract: Along with the rapid development of deep learning in practice, theoretical explanations for its success become urgent. Generalization and expressivity are two widely used measurements to quantify theoretical behaviors of deep nets. The expressivity focuses…
Keywords: expressivity; deep nets; capacity

Published in 2021 in "Communications of the ACM"
DOI: 10.1145/3446773
Abstract: … that a traditional measure, Rademacher complexity, is high for the deep net architecture. Subsequent work has explored the authors' suggestion that the training algorithm (a variant of gradient descent) plays a powerful role in how…
Keywords: deep nets; overfit training