Most existing methods of garbage classification rely on transfer learning to achieve acceptable performance, and they focus on relatively small-scale settings, for example, datasets with few samples or few categories. However, such approaches are hard to deploy on small devices, such as a smartphone or a Raspberry Pi, because of their huge number of parameters. Moreover, they have insufficient generalization capability. For these reasons, a promising cascade approach is proposed. It outperforms transfer learning in classifying garbage at a large scale, while requiring fewer parameters and less training time, which makes it more suitable for potential applications such as deployment on a small device. Several commonly used convolutional neural network backbones are investigated in this study. In addition, two different tasks are conducted: one in which the target domain is the same as the source domain, and one in which the two domains differ. Results indicate that, with ResNet101 as the backbone, the proposed algorithm outperforms other existing approaches. The innovation is that, to the best of our knowledge, this study is the first to combine a pre-trained convolutional neural network as a feature extractor with an extreme learning machine to classify garbage. Furthermore, the training time is significantly shorter and the number of trainable parameters is significantly smaller.
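To make the cascade idea concrete, the following is a minimal sketch of the general pattern the abstract describes: a frozen pre-trained CNN (here ResNet101 from torchvision) used as a feature extractor, feeding an extreme learning machine whose only trained parameters are the closed-form output weights. The hidden-layer size, tanh activation, pseudoinverse solve, and class names are illustrative assumptions, not the exact configuration reported in the paper.

```python
# Sketch: frozen pre-trained CNN features + extreme learning machine (ELM).
import numpy as np
import torch
import torchvision.models as models

# Frozen backbone: ResNet101 with its final fully connected layer removed,
# so each image is mapped to a 2048-dimensional feature vector.
backbone = models.resnet101(weights=models.ResNet101_Weights.IMAGENET1K_V1)
backbone.fc = torch.nn.Identity()
backbone.eval()

@torch.no_grad()
def extract_features(images: torch.Tensor) -> np.ndarray:
    """Map a batch of preprocessed images to ResNet101 features."""
    return backbone(images).cpu().numpy()

class ELMClassifier:
    """Single-hidden-layer ELM: random input weights, closed-form output weights."""

    def __init__(self, n_hidden: int = 1000, seed: int = 0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X: np.ndarray, y: np.ndarray) -> "ELMClassifier":
        n_classes = int(y.max()) + 1
        # Random, untrained hidden-layer weights and biases.
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)   # hidden activations
        T = np.eye(n_classes)[y]           # one-hot targets
        # Only the output weights are "trained", via a single pseudoinverse solve,
        # which is why training is fast and the trainable parameter count is small.
        self.beta = np.linalg.pinv(H) @ T
        return self

    def predict(self, X: np.ndarray) -> np.ndarray:
        H = np.tanh(X @ self.W + self.b)
        return (H @ self.beta).argmax(axis=1)
```

Since the backbone is never fine-tuned, the only parameters fitted on the garbage dataset are the ELM output weights, which is consistent with the abstract's claim of shorter training time and fewer trainable parameters than end-to-end transfer learning.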