Published in 2024 in "Scientific Reports"
DOI: 10.1038/s41598-024-63195-5
Abstract: Knowledge distillation is an effective approach for training robust multi-modal machine learning models when synchronous multimodal data are unavailable. However, traditional knowledge distillation techniques have limitations in comprehensively transferring knowledge across modalities and models. This…
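The abstract is truncated above, but since the paper centers on knowledge distillation, a brief sketch of the standard soft-label distillation objective (Hinton et al., 2015) may be useful context. This is a minimal illustration of the generic baseline that distillation work builds on, not the paper's multiscale cross-modal method; the function name and the `temperature` and `alpha` parameters are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Standard soft-label knowledge distillation (Hinton et al., 2015).

    Blends ordinary cross-entropy on hard labels with a KL term that
    pulls the student's softened output distribution toward the teacher's.
    """
    # Soften both distributions with the temperature.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)

    # KL divergence between softened distributions; the T^2 factor keeps
    # gradient magnitudes comparable as the temperature changes.
    kd = F.kl_div(soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2

    # Supervised loss on the ground-truth labels (e.g. activity classes).
    ce = F.cross_entropy(student_logits, labels)

    return alpha * kd + (1.0 - alpha) * ce
```

In multimodal settings such as this paper's, the teacher and student would typically see different modalities of the same sample, with the distillation term transferring knowledge across that modality gap.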
Keywords: human activity; knowledge distillation; knowledge; multiscale knowledge …