Articles with "multiscale knowledge" as a keyword



Multiscale knowledge distillation with attention based fusion for robust human activity recognition

Published in 2024 in Scientific Reports

DOI: 10.1038/s41598-024-63195-5

Abstract: Knowledge distillation is an effective approach for training robust multi-modal machine learning models when synchronous multimodal data are unavailable. However, traditional knowledge distillation techniques have limitations in comprehensively transferring knowledge across modalities and models. This…
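The abstract refers to knowledge distillation. As context, a minimal sketch of the classic temperature-scaled logit-distillation loss (Hinton-style; this is a generic illustration, not the paper's multiscale, attention-fused variant) might look like:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in classic knowledge distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

teacher = [2.0, 0.5, -1.0]
# A student that matches the teacher incurs zero loss;
# a mismatched student incurs a positive loss.
print(kd_loss(teacher, teacher))               # 0.0
print(kd_loss([0.0, 0.0, 0.0], teacher) > 0)   # True
```

In practice the distillation term is combined with a standard cross-entropy loss on ground-truth labels; the paper's contribution concerns transferring such knowledge across modalities and scales.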

Keywords: human activity; knowledge distillation; knowledge; multiscale knowledge …