DC Field | Value | Language |
dc.contributor.author | Karaki, Y. | - |
dc.contributor.author | Ivanov, N. | - |
dc.date.accessioned | 2020-11-19T12:12:31Z | - |
dc.date.available | 2020-11-19T12:12:31Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | Karaki, Y. Hyperparameters of Multilayer Perceptron with Normal Distributed Weights / Karaki Y., Ivanov N. // Pattern Recognition and Image Analysis. – 2020. – V. 30, № 3. – P. 170-173. – DOI: 10.1134/S1054661820020054. | ru_RU |
dc.identifier.uri | https://libeldoc.bsuir.by/handle/123456789/41147 | - |
dc.description.abstract | Multilayer perceptrons, recurrent neural networks, convolutional networks, and other types of neural networks are widespread nowadays. Neural networks have hyperparameters such as the number of hidden layers, the number of units in each hidden layer, the learning rate, and the activation function. Bayesian optimization is one of the methods used for tuning hyperparameters. This technique usually treats the values of neurons in the network as stochastic Gaussian processes. This article reports experimental results of a multivariate normality test and shows that the neuron vectors deviate considerably from a Gaussian distribution. | ru_RU |
dc.language.iso | en | ru_RU |
dc.publisher | Springer | ru_RU |
dc.subject | publications of researchers | ru_RU |
dc.subject | neural networks | ru_RU |
dc.subject | hyperparameters | ru_RU |
dc.subject | Gaussian distribution | ru_RU |
dc.subject | Bayesian optimization | ru_RU |
dc.title | Hyperparameters of Multilayer Perceptron with Normal Distributed Weights | ru_RU |
dc.type | Article | ru_RU |
Appears in Collections: | Publications in foreign editions |