Article abstract - STUDIA UNIVERSITATIS BABEŞ-BOLYAI edition

The abstract of the selected article is shown below. To return to the table of contents of the issue containing this article, follow the link in the title. To view all articles in the archive authored or co-authored by one of the authors below, follow the link in the author's name.

 
       
         
STUDIA INFORMATICA - Issue no. 1 of 2021
         
Article: AN ANALYSIS ON VERY DEEP CONVOLUTIONAL NEURAL NETWORKS: PROBLEMS AND SOLUTIONS

Author: TIDOR-VLAD PRICOPE

Abstract:
DOI: 10.24193/subbi.2021.1.01

Published Online: 2021-06-30
Published Print: 2021-06-30
pp. 5-22


Neural networks have become a powerful tool in computer vision thanks to recent breakthroughs in computation time and model architecture. Very deep models allow for better deciphering of the hidden patterns in the data; however, training them successfully is not trivial because of the notorious vanishing/exploding gradient problem. We illustrate this problem on VGG models with 8 and 38 hidden layers on the CIFAR-100 image dataset, visualizing how the gradients evolve during training. We explore known solutions to this problem, such as Batch Normalization (BatchNorm) and Residual Networks (ResNets), explaining the theory behind them. Our experiments show that the deeper model suffers from the vanishing gradient problem, and that BatchNorm and ResNets do solve it. The employed solutions slightly improve the performance of shallower models as well, yet the fixed deeper models outperform them.
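The vanishing-gradient effect the abstract describes can be sketched numerically: during backpropagation through a chain of sigmoid layers, the upstream gradient is multiplied by each layer's Jacobian, and since the sigmoid derivative never exceeds 0.25, the gradient norm tends to shrink roughly exponentially with depth. The following is a minimal NumPy illustration, not code from the paper; the layer width, weight scaling, and the 8- vs. 38-layer depths are assumptions chosen to mirror the abstract's setup:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

def gradient_norm(depth, width=64):
    """Norm of a gradient backpropagated through `depth` sigmoid layers."""
    grad = np.ones(width)  # upstream gradient at the output layer
    for _ in range(depth):
        # Random fully connected layer with 1/sqrt(width) weight scaling.
        w = rng.normal(0.0, 1.0 / np.sqrt(width), (width, width))
        pre = rng.normal(size=width)  # hypothetical pre-activations
        # Chain rule: multiply by the weight matrix and the sigmoid derivative,
        # which is sigmoid(x) * (1 - sigmoid(x)) <= 0.25 everywhere.
        s = sigmoid(pre)
        grad = (w.T @ grad) * s * (1.0 - s)
    return np.linalg.norm(grad)

shallow = gradient_norm(8)   # roughly matches the 8-layer VGG setting
deep = gradient_norm(38)     # roughly matches the 38-layer VGG setting
print(shallow, deep)  # the deeper chain yields a far smaller gradient
```

BatchNorm and residual skip connections counteract exactly this multiplicative shrinkage: BatchNorm renormalizes activations at each layer, and a residual block adds an identity path along which the gradient flows unattenuated.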

Keywords and phrases: Deep Learning, Neural Network, Image Classification, Deep Convolutional Neural Network, Vanishing Gradient Problem, VGG.

2010 Mathematics Subject Classification: 68T45.

1998 CR Categories and Descriptors: I.2.1 [Artificial Intelligence]: Learning – Connectionism and neural nets.
 
         
     
         
         