A Virtual Knowledge Distillation via Conditional GAN

Published in 2022 in IEEE Access

DOI: 10.1109/access.2022.3163398

Abstract: Knowledge distillation aims at transferring the knowledge from a pre-trained complex model, called teacher, to a relatively smaller and faster one, called student. Unlike previous works that transfer the teacher's softened distributions or feature spaces, …
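For context on the baseline the abstract contrasts against: classical knowledge distillation (Hinton-style) trains the student to match the teacher's temperature-softened output distribution. The sketch below is a minimal, dependency-free illustration of that standard objective, not the conditional-GAN method proposed in this paper; the function names and the temperature value are illustrative choices.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: larger T yields a "softer",
    # more uniform distribution over classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T
    # (as in Hinton et al.'s original formulation).
    p = softmax(teacher_logits, T)  # teacher's soft targets
    q = softmax(student_logits, T)  # student's soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl
```

When student and teacher logits agree, the loss is zero; any mismatch in the softened distributions yields a positive penalty that the student minimizes during training.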

Keywords: softened distributions; distillation; virtual knowledge; knowledge …