Class attention map distillation for efficient semantic segmentation
Karimi Bavandpour, N ; Sharif University of Technology | 2020
				
- Type of Document: Article
- DOI: 10.1109/MVIP49855.2020.9116875
- Publisher: IEEE Computer Society, 2020
- Abstract:
- In this paper, a novel method is proposed for capturing the information of a powerful, trained deep convolutional neural network and distilling it into a smaller network during training. This is the first time a saliency-map method has been employed to extract useful knowledge from a convolutional neural network for distillation. Unlike many other methods, which operate only on the final layers, this method can extract information suitable for distillation from the intermediate layers of a network by building class-specific attention maps and then forcing the student network to mimic them. This knowledge-distillation training is implemented with the state-of-the-art DeepLab and PSPNet segmentation networks, and its effectiveness is shown by experiments on the standard Pascal VOC 2012 dataset. © 2020 IEEE
- Keywords:
- Knowledge Distillation ; Saliency Maps ; Computer vision ; Convolution ; Deep neural networks ; Distillation ; Distilleries ; Image segmentation ; Network layers ; Semantics ; Intermediate layers ; Saliency map ; Semantic segmentation ; State of the art ; Student network ; Convolutional neural networks
- Source: 1st International Conference on Machine Vision and Image Processing, MVIP 2020, 19 February 2020 through 20 February 2020 ; Volume 2020-February , 2020
- URL: https://ieeexplore.ieee.org/document/9116875
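The core idea in the abstract — build a class-specific attention map from intermediate features of both teacher and student, then penalize the student for deviating from the teacher's map — can be illustrated with a minimal NumPy sketch. All names, the weighted-channel-sum form of the attention map, and the MSE loss are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def class_attention_map(features, class_weights):
    """Collapse (C, H, W) intermediate features into one (H, W) attention
    map for a single class, using per-channel class weights (assumed form)."""
    amap = np.tensordot(class_weights, features, axes=([0], [0]))  # (H, W)
    amap = np.maximum(amap, 0.0)          # keep positive class evidence only
    norm = np.linalg.norm(amap)
    return amap / (norm + 1e-8)           # normalize so scale doesn't dominate

def attention_distillation_loss(teacher_feats, student_feats,
                                teacher_weights, student_weights):
    """MSE between normalized teacher and student class attention maps;
    a hypothetical stand-in for the paper's mimicry objective."""
    t_map = class_attention_map(teacher_feats, teacher_weights)
    s_map = class_attention_map(student_feats, student_weights)
    return float(np.mean((t_map - s_map) ** 2))
```

In an actual training loop, this scalar would be added (with some weighting) to the student's segmentation loss, and the gradient would flow only through the student's features.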