Article abstract, STUDIA UNIVERSITATIS BABEŞ-BOLYAI edition

STUDIA INFORMATICA - Issue no. 2 of 2020
         
Article: EXPERIMENTAL STUDY OF SOME PROPERTIES OF KNOWLEDGE DISTILLATION

Authors: ÁDÁM SZIJÁRTÓ, PÉTER LEHOTAY-KÉRY, ATTILA KISS
Abstract:
DOI: 10.24193/subbi.2020.2.01

Published Online: 2020-10-27
Published Print: 2020-12-30
pp. 5-16

Abstract. For more complex classification problems, it is inevitable that we use increasingly complex and cumbersome classification models. However, we often lack the space or processing power to deploy these models.
Knowledge distillation is an effective way to improve the accuracy of an otherwise smaller, simpler model using a more complex teacher network or an ensemble of networks. This way we can obtain a classifier whose accuracy is comparable to that of the teacher while remaining small enough to deploy.
In this paper we evaluate certain features of this distillation method while trying to improve its results. These experiments and examinations, and the properties they reveal, may also help to further develop this technique.
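The teacher-student setup described above is commonly trained with a combined loss: a KL-divergence term between temperature-softened teacher and student outputs, plus a standard cross-entropy term on the true label. The sketch below illustrates that loss in plain NumPy; the temperature `T`, the weighting `alpha`, and the `T**2` scaling are standard choices from the distillation literature, not values taken from this paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution,
    # exposing the teacher's relative confidence across wrong classes.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_label,
                      T=4.0, alpha=0.5):
    # Soft targets: teacher and student distributions at temperature T.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL divergence from student to teacher, scaled by T^2 so the
    # soft-target gradients stay comparable in magnitude to the hard ones.
    soft_loss = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student))) * T**2
    # Ordinary cross-entropy with the hard (ground-truth) label at T = 1.
    hard_loss = -np.log(softmax(student_logits, 1.0)[true_label])
    # Weighted combination of the two objectives.
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

When the student's logits match the teacher's, the soft term vanishes and only the hard cross-entropy remains; a student that disagrees with the teacher is penalised by both terms.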


Received by the editors: 7 August 2020.
2010 Mathematics Subject Classification. 68T05, 68T30.
1998 CR Categories and Descriptors. I.2.6 [Artificial Intelligence]: Learning - Connectionism and neural nets;
Key words and phrases. Artificial Intelligence, Convolutional Neural Networks, Deep learning, Knowledge distillation, Neural networks.