Title: | Hyperparameters of Multilayer Perceptron with Normal Distributed Weights |
Authors: | Karaki, Y.; Ivanov, N. |
Keywords: | publications of scientists;neural networks;hyperparameters;Gaussian distribution;Bayesian optimization |
Issue Date: | 2020 |
Publisher: | Springer |
Citation: | Karaki, Y. Hyperparameters of Multilayer Perceptron with Normal Distributed Weights / Karaki Y., Ivanov N. // Pattern Recognition and Image Analysis. – 2020. – V. 30, № 3. – P. 170-173. – DOI: 10.1134/S1054661820020054. |
Abstract: | Multilayer perceptrons, recurrent neural networks, convolutional networks, and other types of neural networks are widespread nowadays. Neural networks have hyperparameters such as the number of hidden layers, the number of units in each hidden layer, the learning rate, and the activation function. Bayesian optimization is one of the methods used for tuning hyperparameters; this technique usually treats the values of neurons in the network as stochastic Gaussian processes. This article reports experimental results of a multivariate normality test and shows that the neuron vectors deviate considerably from a Gaussian distribution. |
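The abstract does not state which multivariate normality test the authors applied. As an illustration only, a minimal sketch of one common choice — Mardia's skewness and kurtosis test — applied to a matrix of neuron vectors (rows are samples, columns are dimensions), assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy import stats

def mardia_test(X):
    """Mardia's multivariate normality test.

    X: (n, p) array, n samples of p-dimensional vectors
       (e.g. neuron activation or weight vectors).
    Returns (p_skew, p_kurt): p-values for the skewness and
    kurtosis statistics; small values reject normality.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = np.cov(Xc, rowvar=False, bias=True)   # ML covariance estimate
    D = Xc @ np.linalg.inv(S) @ Xc.T          # Mahalanobis cross-products

    b1 = (D ** 3).sum() / n ** 2              # multivariate skewness
    b2 = (np.diag(D) ** 2).mean()             # multivariate kurtosis

    # Skewness statistic ~ chi2 with p(p+1)(p+2)/6 degrees of freedom
    skew_stat = n * b1 / 6.0
    df = p * (p + 1) * (p + 2) // 6
    p_skew = stats.chi2.sf(skew_stat, df)

    # Kurtosis statistic ~ N(0, 1) under normality
    kurt_stat = (b2 - p * (p + 2)) / np.sqrt(8.0 * p * (p + 2) / n)
    p_kurt = 2 * stats.norm.sf(abs(kurt_stat))
    return p_skew, p_kurt
```

For example, exponentially distributed vectors are strongly skewed and should be rejected, while Gaussian samples should usually pass; the concrete test used in the paper may differ.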
URI: | https://libeldoc.bsuir.by/handle/123456789/41147 |
Appears in Collections: | Publications in foreign editions |
File | Description | Size | Format
---|---|---|---
Karaki_Hyperparameters.pdf | | 162.88 kB | Adobe PDF