Articles with "knowledge distillation" as a keyword

A two‐phase knowledge distillation model for graph convolutional network‐based recommendation

Published in 2022 at "International Journal of Intelligent Systems"

DOI: 10.1002/int.22819

Abstract: Graph convolutional network (GCN)‐based recommendation has recently attracted significant attention in the recommender system community. Although current studies propose various GCNs to improve recommendation performance, existing methods suffer from two main limitations. First, user–item interaction…

Keywords: distillation; recommendation; knowledge distillation; model ...

CEKD: Cross ensemble knowledge distillation for augmented fine-grained data

Published in 2022 at "Applied Intelligence"

DOI: 10.1007/s10489-022-03355-0

Abstract: Data augmentation has been proven effective in training deep models. Existing data augmentation methods tackle the fine-grained problem by blending image pairs and fusing corresponding labels according to the statistics of mixed pixels, which produces additional…

Keywords: knowledge distillation; distillation; ensemble knowledge; fine grained ...
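
The label-fusing augmentation this abstract builds on is the mixup family: two images are blended pixel-wise and their labels fused with the same mixing ratio. A minimal sketch of that generic idea, assuming NumPy arrays and one-hot labels (not CEKD's cross-ensemble method itself):

```python
import numpy as np

def mixup_pair(x1, y1, x2, y2, alpha=1.0):
    """Blend an image pair and fuse their one-hot labels with the same
    ratio (generic mixup-style augmentation, not CEKD specifically)."""
    lam = np.random.beta(alpha, alpha)   # mixing ratio drawn from Beta(alpha, alpha)
    x = lam * x1 + (1.0 - lam) * x2      # pixel-wise blend of the two images
    y = lam * y1 + (1.0 - lam) * y2      # labels fused in the same proportion
    return x, y
```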

Real-time semantic segmentation via sequential knowledge distillation

Published in 2021 at "Neurocomputing"

DOI: 10.1016/j.neucom.2021.01.086

Abstract: Deep model-based semantic segmentation has received ever-increasing research focus in recent years. However, due to the complex model architectures, existing works are still unable to achieve high accuracy in real-time applications. In this…

Keywords: segmentation; semantic segmentation; knowledge distillation; real time ...

Interactive Knowledge Distillation for image classification

Published in 2021 at "Neurocomputing"

DOI: 10.1016/j.neucom.2021.04.026

Abstract: Knowledge distillation (KD) is a standard teacher-student learning framework to train a lightweight student network under the guidance of a well-trained, large teacher network. As an effective teaching strategy, interactive teaching has been widely employed…

Keywords: student; interactive knowledge; knowledge; knowledge distillation ...
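
The teacher-student framework this abstract starts from is the standard distillation objective: soften both networks' logits with a temperature and penalize their divergence alongside the usual label loss. A minimal PyTorch sketch, with the temperature T and mixing weight alpha as illustrative values rather than this paper's settings:

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hinton-style distillation loss: KL divergence between
    temperature-softened distributions, blended with cross-entropy."""
    soft_targets = F.softmax(teacher_logits / T, dim=1)      # softened teacher
    log_student = F.log_softmax(student_logits / T, dim=1)   # softened student
    # T^2 keeps the soft-target gradients on the same scale as the hard loss
    soft = F.kl_div(log_student, soft_targets, reduction="batchmean") * T * T
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```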

Variational Bayesian Group-Level Sparsification for Knowledge Distillation

Published in 2020 at "IEEE Access"

DOI: 10.1109/access.2020.3008854

Abstract: Deep neural networks are capable of learning powerful representations, but are often limited by heavy network architectures and high computational cost. Knowledge distillation (KD) is one of the effective ways to perform model compression and inference…

Keywords: variational bayesian; knowledge distillation; group level; bayesian group ...

Knowledge Distillation in Acoustic Scene Classification

Published in 2020 at "IEEE Access"

DOI: 10.1109/access.2020.3021711

Abstract: Common acoustic properties that different classes share degrade the performance of acoustic scene classification systems. This results in a phenomenon where a few confusing pairs of acoustic scenes dominate a significant proportion of all misclassified…

Keywords: acoustic scene; scene classification; knowledge; classification ...

A Virtual Knowledge Distillation via Conditional GAN

Published in 2022 at "IEEE Access"

DOI: 10.1109/access.2022.3163398

Abstract: Knowledge distillation aims at transferring the knowledge from a pre-trained complex model, called teacher, to a relatively smaller and faster one, called student. Unlike previous works that transfer the teacher’s softened distributions or feature spaces,…

Keywords: softened distributions; distillation; virtual knowledge; knowledge ...
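
The "softened distributions or feature spaces" this abstract contrasts against are the two classic transfer targets: the logit-based loss sketched earlier in this list, and FitNets-style feature matching. A minimal sketch of the latter, where the 1x1 projection and L2 objective are the generic technique rather than this paper's conditional-GAN approach:

```python
import torch.nn as nn
import torch.nn.functional as F

class HintLoss(nn.Module):
    """FitNets-style feature distillation: project student features to the
    teacher's channel width, then match them with an L2 loss."""
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        # 1x1 convolution aligns the student's channel count with the teacher's
        self.proj = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        return F.mse_loss(self.proj(student_feat), teacher_feat)
```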

Analysis of Model Compression Using Knowledge Distillation

Published in 2022 at "IEEE Access"

DOI: 10.1109/access.2022.3197608

Abstract: In the development of deep learning, several convolutional neural network (CNN) models are designed to solve various tasks. However, these CNN models are complex and cumbersome to achieve state-of-the-art performance. The current CNN models remain…

Keywords: compression; using knowledge; knowledge distillation; model ...

Progressive Network Grafting With Local Features Embedding for Few-Shot Knowledge Distillation

Published in 2022 at "IEEE Access"

DOI: 10.1109/access.2022.3218890

Abstract: Compared with traditional knowledge distillation, which relies on a large amount of data, few-shot knowledge distillation can distill student networks with good performance using only a small number of samples. Some recent studies treat the…

Keywords: network; local features; knowledge distillation; block ...
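
The grafting idea named in the title is usually realized by swapping student blocks into the frozen teacher network one stage at a time, so each block can be trained with only a few samples. A rough sketch of that general pattern, assuming both networks are split into compatible block lists (the helper below is hypothetical, not this paper's code):

```python
import torch.nn as nn

def graft(teacher_blocks, student_blocks, k):
    """Hybrid network whose first k blocks come from the student and the
    rest from the teacher (generic progressive-grafting sketch)."""
    return nn.Sequential(*student_blocks[:k], *teacher_blocks[k:])
```

At stage k, only the newly grafted student block would typically be trainable, with every remaining teacher block kept frozen.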

Defect Detection Method Based on Knowledge Distillation

Published in 2023 at "IEEE Access"

DOI: 10.1109/access.2023.3252910

Abstract: Because traditional surface-defect detection is easily affected by complex industrial environments and cannot extract effective features, a deep learning-based knowledge distillation anomaly detection model is proposed. First, a pre-trained teacher network…

Keywords: network; knowledge; detection; knowledge distillation ...
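
Teacher-student anomaly detection of the kind this abstract describes typically trains the student to mimic the teacher only on defect-free samples, so the feature discrepancy at test time acts as an anomaly score. A minimal sketch of that general scheme, with the names and the cosine-distance choice as assumptions rather than this paper's exact model:

```python
import torch.nn.functional as F

def anomaly_map(teacher_feat, student_feat):
    """Per-pixel anomaly score as the cosine distance between normalized
    teacher and student feature maps; high values flag likely defects."""
    t = F.normalize(teacher_feat, dim=1)   # unit-normalize along channels
    s = F.normalize(student_feat, dim=1)
    return 1.0 - (t * s).sum(dim=1)        # shape: (batch, H, W)
```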

Enhancing Recommendation Capabilities Using Multi-Head Attention-Based Federated Knowledge Distillation

Published in 2023 at "IEEE Access"

DOI: 10.1109/access.2023.3271678

Abstract: As the internet and mobile computing have advanced, recommendation algorithms are used to manage large amounts of data. However, traditional recommendation systems usually require collecting user data on a central server, which may expose user…

Keywords: multi head; federated knowledge; knowledge distillation; recommendation ...
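
In federated distillation of the kind this abstract outlines, clients keep raw interaction data local and share only model outputs, which the server aggregates into a teacher signal for everyone to distill from. A minimal sketch of the server step, using plain averaging as a stand-in for the paper's multi-head-attention aggregation:

```python
import torch

def aggregate_client_logits(client_logits):
    """Server step of a generic federated-distillation round: average
    per-client predictions on a shared reference set to form the teacher
    signal each client then distills from locally. (Plain averaging here
    stands in for the paper's multi-head-attention aggregation.)"""
    # client_logits: list of (num_reference_items, num_classes) tensors
    return torch.stack(client_logits, dim=0).mean(dim=0)
```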