DC Field | Value | Language |
dc.contributor.author | Matskevich, V. | - |
dc.coverage.spatial | Minsk | en_US |
dc.date.accessioned | 2024-02-21T06:05:58Z | - |
dc.date.available | 2024-02-21T06:05:58Z | - |
dc.date.issued | 2023 | - |
dc.identifier.citation | Matskevich, V. Fast Random Search Algorithm in Neural Networks Training / V. Matskevich // Pattern Recognition and Information Processing (PRIP'2023) = Распознавание образов и обработка информации (2023) : Proceedings of the 16th International Conference, October 17–19, 2023, Minsk, Belarus / United Institute of Informatics Problems of the National Academy of Sciences of Belarus. – Minsk, 2023. – P. 22–24. | en_US |
dc.identifier.uri | https://libeldoc.bsuir.by/handle/123456789/54296 | - |
dc.description.abstract | The paper deals with a state-of-the-art applied problem related to neural network training. Currently, gradient descent algorithms are widely used for training. Despite their high convergence rate, they have a number of disadvantages which, as the scope of neural networks expands, can turn out to be critical. The paper proposes a fast algorithm for neural network training based on random search. Experiments show that the proposed algorithm is almost comparable to the best of the gradient algorithms in convergence rate, while significantly ahead of it in quality. | en_US |
dc.language.iso | en | en_US |
dc.publisher | BSU | en_US |
dc.subject | conference proceedings | en_US |
dc.subject | neural networks | en_US |
dc.subject | random search | en_US |
dc.subject | gradient descent | en_US |
dc.subject | training | en_US |
dc.title | Fast Random Search Algorithm in Neural Networks Training | en_US |
dc.type | Article | en_US |
Appears in Collections: | Pattern Recognition and Information Processing (PRIP'2023) = Распознавание образов и обработка информации (2023) |
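Note. To make the abstract's terminology concrete, the sketch below illustrates generic accept/reject random-search training of a tiny neural network. It is not the paper's proposed fast algorithm (whose details are not given in this record); the network shape, perturbation scale, and data are assumptions made for illustration only.

import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumed, for demonstration only).
X = rng.normal(size=(64, 4))
y = np.sin(X.sum(axis=1, keepdims=True))

def forward(params, x):
    # One-hidden-layer network: tanh(x W1 + b1) W2 + b2 (assumed shape).
    W1, b1, W2, b2 = params
    return np.tanh(x @ W1 + b1) @ W2 + b2

def loss(params):
    # Mean squared error on the toy data set.
    return float(np.mean((forward(params, X) - y) ** 2))

params = [rng.normal(scale=0.5, size=(4, 8)), np.zeros(8),
          rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)]
best = loss(params)
step = 0.1  # perturbation scale (assumed hyperparameter)

for it in range(2000):
    # Propose a random perturbation of all weights; keep it only if it
    # lowers the loss (basic accept/reject random search, no gradients).
    trial = [p + step * rng.normal(size=p.shape) for p in params]
    trial_loss = loss(trial)
    if trial_loss < best:
        params, best = trial, trial_loss
    else:
        step *= 0.999  # slowly shrink the search radius after a rejection

print(f"final MSE: {best:.4f}")

Because no gradients are computed, this family of methods applies even when the loss is non-differentiable; the paper's contribution, per the abstract, is making such a search fast enough to rival gradient descent in convergence rate.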