Please use this identifier to cite or link to this item: https://libeldoc.bsuir.by/handle/123456789/41147
Full metadata record
DC Field | Value | Language
dc.contributor.author | Karaki, Y. | -
dc.contributor.author | Ivanov, N. | -
dc.date.accessioned | 2020-11-19T12:12:31Z | -
dc.date.available | 2020-11-19T12:12:31Z | -
dc.date.issued | 2020 | -
dc.identifier.citation | Karaki, Y. Hyperparameters of Multilayer Perceptron with Normal Distributed Weights / Karaki Y., Ivanov N. // Pattern Recognition and Image Analysis. – 2020. – V. 30, № 3. – P. 170–173. – DOI: 10.1134/S1054661820020054. | ru_RU
dc.identifier.uri | https://libeldoc.bsuir.by/handle/123456789/41147 | -
dc.description.abstract | Multilayer Perceptrons, recurrent neural networks, convolutional networks, and other types of neural networks are widespread nowadays. Neural networks have hyperparameters such as the number of hidden layers, the number of units in each hidden layer, the learning rate, and the activation function. Bayesian optimization is one of the methods used for tuning hyperparameters. This technique usually treats the values of neurons in the network as stochastic Gaussian processes. This article reports experimental results on a multivariate normality test and shows that the neuron vectors deviate considerably from a Gaussian distribution. | ru_RU
dc.language.iso | en | ru_RU
dc.publisher | Springer | ru_RU
dc.subject | publications of scientists | ru_RU
dc.subject | neural networks | ru_RU
dc.subject | hyperparameters | ru_RU
dc.subject | Gaussian distribution | ru_RU
dc.subject | Bayesian optimization | ru_RU
dc.title | Hyperparameters of Multilayer Perceptron with Normal Distributed Weights | ru_RU
dc.type | Article | ru_RU
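
The abstract above mentions a multivariate normality test on neuron vectors. As a minimal illustrative sketch (not the authors' code), the Python snippet below implements Mardia's skewness and kurtosis test, one common multivariate normality test; the paper itself may use a different procedure. The function name mardia_test and the synthetic ReLU-like activation data are assumptions made for illustration.

import numpy as np
from scipy import stats

def mardia_test(X):
    """Mardia's skewness/kurtosis test for multivariate normality.

    X: (n, p) array, one row per observation (e.g. one neuron vector).
    Returns p-values for the skewness and kurtosis statistics.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = (Xc.T @ Xc) / n                 # MLE covariance estimate
    D = Xc @ np.linalg.inv(S) @ Xc.T    # pairwise Mahalanobis products
    b1 = (D ** 3).sum() / n ** 2        # multivariate skewness
    b2 = (np.diag(D) ** 2).sum() / n    # multivariate kurtosis
    df = p * (p + 1) * (p + 2) / 6
    p_skew = stats.chi2.sf(n * b1 / 6, df)
    z_kurt = (b2 - p * (p + 2)) / np.sqrt(8 * p * (p + 2) / n)
    p_kurt = 2 * stats.norm.sf(abs(z_kurt))
    return p_skew, p_kurt

# Hypothetical usage: rows are activation vectors collected from a hidden layer.
rng = np.random.default_rng(0)
activations = np.maximum(rng.normal(size=(500, 8)), 0)  # ReLU output, non-Gaussian
print(mardia_test(activations))

Small p-values from either statistic reject multivariate normality, which is the kind of outcome the abstract reports for neuron vectors.
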
Appears in Collections: Publications in foreign editions

Files in This Item:
File | Description | Size | Format
Karaki_Hyperparameters.pdf |  | 162.88 kB | Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.