The STUDIA UNIVERSITATIS BABEŞ-BOLYAI issue article summary

    STUDIA INFORMATICA - Issue no. 2 / 2020  
         
  Article:   EXPERIMENTAL STUDY OF SOME PROPERTIES OF KNOWLEDGE DISTILLATION.

Authors:  ÁDÁM SZIJÁRTÓ, PÉTER LEHOTAY-KÉRY, ATTILA KISS.
DOI: 10.24193/subbi.2020.2.01

Published Online: 2020-10-27
Published Print: 2020-12-30
pp. 5-16


Abstract. For more complex classification problems, it is inevitable that we use increasingly complex and cumbersome classification models. However, we often lack the memory or processing power to deploy such models.
Knowledge distillation is an effective way to improve the accuracy of an otherwise smaller, simpler model by using a more complex teacher network or an ensemble of networks. In this way we can obtain a classifier whose accuracy is comparable to that of the teacher while remaining small enough to deploy.
In this paper we evaluate certain properties of this distillation method while trying to improve its results. These experiments and the properties they reveal may also help to develop the technique further.
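The summary does not spell out the distillation objective the paper studies; a common formulation (following Hinton-style knowledge distillation, assumed here for illustration) combines a KL-divergence term between temperature-softened teacher and student distributions with a standard cross-entropy term on the hard label. A minimal NumPy sketch, with illustrative `temperature` and `alpha` values:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher T yields a softer distribution."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.5):
    """Weighted sum of (a) KL divergence between the softened teacher and
    student distributions and (b) cross-entropy against the hard label.
    The T^2 factor keeps the soft-target gradients on a scale comparable
    to the hard-label term as the temperature changes."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    soft_loss = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))
    hard_loss = -np.log(softmax(student_logits)[true_label])
    return alpha * temperature**2 * soft_loss + (1 - alpha) * hard_loss
```

When the student's logits match the teacher's, the KL term vanishes and only the hard-label cross-entropy remains, scaled by `1 - alpha`.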


Received by the editors: 7 August 2020.
2010 Mathematics Subject Classification. 68T05, 68T30.
1998 CR Categories and Descriptors. I.2.6 [Artificial Intelligence]: Learning - Connectionism and neural nets;
Key words and phrases. Artificial Intelligence, Convolutional Neural Networks, Deep learning, Knowledge distillation, Neural networks.