Title: Hyperparameters of Multilayer Perceptron with Normal Distributed Weights
Authors: Karaki, Y.
Ivanov, N.
Keywords: publications of scientists; neural networks; hyperparameters; Gaussian distribution; Bayesian optimization
Issue Date: 2020
Publisher: Springer
Citation: Karaki, Y. Hyperparameters of Multilayer Perceptron with Normal Distributed Weights / Karaki Y., Ivanov N. // Pattern Recognition and Image Analysis. – 2020. – V. 30, № 3. – P. 170–173. – DOI: 10.1134/S1054661820020054.
Abstract: Multilayer perceptrons, recurrent neural networks, convolutional networks, and other types of neural networks are widespread nowadays. Neural networks have hyperparameters such as the number of hidden layers, the number of units in each hidden layer, the learning rate, and the activation function. Bayesian optimization is one of the methods used for tuning hyperparameters. This technique usually treats the values of neurons in the network as stochastic Gaussian processes. This article reports experimental results on a multivariate normality test and shows that the neuron vectors deviate considerably from a Gaussian distribution.
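The following is a minimal sketch of the kind of normality check the abstract describes, not the authors' experimental setup: it builds a toy MLP hidden layer with normally distributed weights, collects the hidden-unit activations ("neuron vectors"), and applies scipy.stats.normaltest (a univariate D'Agostino–Pearson test run per unit, rather than a full multivariate test such as Mardia's) to see whether the Gaussian hypothesis holds. All sizes and the tanh activation are illustrative assumptions.

    # Minimal sketch (assumed setup, not the paper's): per-unit normality
    # test on hidden-layer activations of a toy MLP with Gaussian weights.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    n_inputs, n_hidden, n_samples = 20, 10, 500
    W = rng.normal(0.0, 1.0, size=(n_inputs, n_hidden))  # normally distributed weights
    b = rng.normal(0.0, 1.0, size=n_hidden)

    X = rng.normal(0.0, 1.0, size=(n_samples, n_inputs))  # random inputs
    H = np.tanh(X @ W + b)                                # hidden-unit activations

    # D'Agostino-Pearson normality test on each hidden unit separately;
    # a small p-value rejects the hypothesis that the unit's values are Gaussian.
    for j in range(n_hidden):
        stat, p = stats.normaltest(H[:, j])
        print(f"unit {j}: statistic={stat:.2f}, p-value={p:.4f}")

With a nonlinear activation such as tanh, the test typically rejects normality for many units, which is consistent with the article's conclusion that neuron vectors are far from Gaussian.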
URI: https://libeldoc.bsuir.by/handle/123456789/41147
Appears in Collections: Публикации в зарубежных изданиях (Publications in foreign journals)

Files in This Item:
File                        Size       Format
Karaki_Hyperparameters.pdf  162.88 kB  Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.