Title: Fast Random Search Algorithm in Neural Networks Training
Authors: Matskevich, V.
Keywords: conference proceedings;neural networks;random search;gradient descent;training
Issue Date: 2023
Publisher: BSU
Citation: Matskevich, V. Fast Random Search Algorithm in Neural Networks Training / V. Matskevich // Pattern Recognition and Information Processing (PRIP'2023) = Распознавание образов и обработка информации (2023) : Proceedings of the 16th International Conference, October 17–19, 2023, Minsk, Belarus / United Institute of Informatics Problems of the National Academy of Sciences of Belarus. – Minsk, 2023. – P. 22–24.
Abstract: The paper addresses a state-of-the-art applied problem in neural network training. Gradient descent algorithms are currently the standard choice for training. Despite their high convergence rate, they have a number of disadvantages which, as the scope of neural network applications expands, may prove critical. The paper proposes a fast algorithm for training neural networks based on random search. Experiments show that the proposed algorithm's convergence rate is nearly comparable to that of the best gradient algorithm, while in solution quality it is significantly ahead.
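Note: the record above does not reproduce the paper's algorithm. As a point of orientation only, the following is a minimal sketch of the generic random-search family the title refers to: perturb the weights, and keep a perturbation only if it lowers the loss. The toy regression task, the perturbation scale, and the decay schedule are all assumptions made for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic regression task (invented for this sketch).
X = rng.normal(size=(64, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=64)

def loss(w):
    # Mean squared error of a linear model; stands in for a network's loss.
    return np.mean((X @ w - y) ** 2)

w = np.zeros(3)      # current weights
best = loss(w)
step = 0.5           # perturbation scale (assumed, not from the paper)

for _ in range(2000):
    candidate = w + step * rng.normal(size=w.shape)
    c_loss = loss(candidate)
    if c_loss < best:        # accept only improving moves
        w, best = candidate, c_loss
    else:
        step *= 0.999        # slowly shrink the search radius (assumed schedule)

print("final loss:", best, "weights:", w)
```

Unlike gradient descent, this procedure needs only loss evaluations, no gradients, which is the usual appeal of random-search training; the paper's contribution is a fast variant of this idea.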
URI: https://libeldoc.bsuir.by/handle/123456789/54296
Appears in Collections:Pattern Recognition and Information Processing (PRIP'2023) = Распознавание образов и обработка информации (2023)

Files in This Item:
File: Matskevich_Fast.pdf (229.14 kB, Adobe PDF)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.