Published in 2022 at "International Journal of Intelligent Systems"
DOI: 10.1002/int.22819
Abstract: Graph convolutional network (GCN)‐based recommendation has recently attracted significant attention in the recommender system community. Although current studies propose various GCNs to improve recommendation performance, existing methods suffer from two main limitations. First, user–item interaction…
Keywords: distillation; recommendation; knowledge distillation; model
Published in 2022 at "Applied Intelligence"
DOI: 10.1007/s10489-022-03355-0
Abstract: Data augmentation has been proven effective in training deep models. Existing data augmentation methods tackle the fine-grained problem by blending image pairs and fusing the corresponding labels according to the statistics of mixed pixels, which produces additional…
Keywords: knowledge distillation; distillation; ensemble knowledge; fine grained
Published in 2024 at "Neural Processing Letters"
DOI: 10.1007/s11063-024-11646-5
Abstract: Deep neural networks perform better than shallow neural networks, but the former tends to be deeper or wider, introducing large numbers of parameters and computations. We know that networks that are too wide have a…
Keywords: narrow deep; deep networks; knowledge distillation; distillation based
Published in 2025 at "Nonlinear Dynamics"
DOI: 10.1007/s11071-025-10916-8
Abstract: With the advancement of machine learning and deep learning, Physics-Informed Neural Networks (PINNs) have emerged as a prominent approach for solving partial differential equation (PDE) problems. In this article, we introduce a novel distillation framework…
Keywords: physics informed; knowledge distillation; framework; informed neural
Published in 2021 at "Neurocomputing"
DOI: 10.1016/j.neucom.2021.01.086
Abstract: Deep model-based semantic segmentation has received ever-increasing research focus in recent years. However, due to complex model architectures, existing works are still unable to achieve high accuracy in real-time applications. In this…
Keywords: segmentation; semantic segmentation; knowledge distillation; real time
Published in 2021 at "Neurocomputing"
DOI: 10.1016/j.neucom.2021.04.026
Abstract: Knowledge distillation (KD) is a standard teacher-student learning framework that trains a lightweight student network under the guidance of a well-trained, large teacher network. As an effective teaching strategy, interactive teaching has been widely employed…
Keywords: student; interactive knowledge; knowledge; knowledge distillation
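Several of the entries above build on the classic soft-target distillation objective (Hinton et al.), in which a student is trained against both the true labels and the teacher's temperature-softened output distribution. A minimal pure-Python sketch of that loss, for illustration only (the temperature and alpha values are conventional defaults, not taken from any of the listed papers):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, label,
                      temperature=4.0, alpha=0.7):
    # Soft-target term: KL divergence between teacher and student
    # distributions at temperature T, scaled by T^2 so its gradient
    # magnitude stays comparable as T changes (Hinton et al. convention).
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s))
    # Hard-target term: ordinary cross-entropy with the true label.
    ce = -math.log(softmax(student_logits)[label])
    return alpha * (temperature ** 2) * kl + (1 - alpha) * ce
```

In practice the student minimizes this loss over minibatches while the teacher's weights stay frozen; the variants in the papers above (interactive, ensemble, multiscale, cosine-similarity-based) replace or augment the soft-target term rather than the overall recipe.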
Published in 2024 at "Scientific Reports"
DOI: 10.1038/s41598-024-58409-9
Abstract: Current state-of-the-art anomaly detection methods based on knowledge distillation (KD) typically depend on smaller student networks or on reverse distillation to address the vanishing representation discrepancy on anomalies. These methods often struggle to achieve precise detection…
Keywords: detection; knowledge distillation; anomaly detection; similarity
Published in 2024 at "Scientific Reports"
DOI: 10.1038/s41598-024-59553-y
Abstract: Multimodal spectral imaging offers a unique approach to the enhancement of the analytical capabilities of standalone spectroscopy techniques by combining information gathered from distinct sources. In this manuscript, we explore such opportunities by focusing on…
Keywords: spectroscopy; knowledge distillation; sensor fusion; mineral identification
Published in 2024 at "Scientific Reports"
DOI: 10.1038/s41598-024-63195-5
Abstract: Knowledge distillation is an effective approach for training robust multimodal machine learning models when synchronous multimodal data are unavailable. However, traditional knowledge distillation techniques have limitations in comprehensively transferring knowledge across modalities and models. This…
Keywords: human activity; knowledge distillation; knowledge; multiscale knowledge
Published in 2024 at "Scientific Reports"
DOI: 10.1038/s41598-024-69813-6
Abstract: This paper presents a Cosine Similarity-Based Knowledge Distillation (CSKD) for robust, lightweight object detectors. Knowledge Distillation (KD) has been effective in enhancing the performance of compact models in image classification by leveraging deep CNN models.…
Keywords: distillation; model; cosine similarity; knowledge distillation
Published in 2025 at "Scientific Reports"
DOI: 10.1038/s41598-025-16001-9
Abstract: Accurate emotion recognition in social media text is critical for applications such as sentiment analysis, mental health monitoring, and human-computer interaction. However, existing approaches face challenges like computational complexity and class imbalance, limiting their deployment…
Keywords: emotion recognition; knowledge distillation; emotion; recognition