Please use this identifier to cite or link to this item: https://libeldoc.bsuir.by/handle/123456789/54364
Title: Human Pose Estimation using SimCC and Swin Transformer
Authors: Li, Tongrui
Ablameyko, S.
Keywords: conference proceedings;human pose estimation;swin transformer;SimCC
Issue Date: 2023
Publisher: BSU
Citation: Tongrui Li. Human Pose Estimation using SimCC and Swin Transformer / Tongrui Li, S. Ablameyko // Pattern Recognition and Information Processing (PRIP'2023) = Распознавание образов и обработка информации (2023) : Proceedings of the 16th International Conference, October 17–19, 2023, Minsk, Belarus / United Institute of Informatics Problems of the National Academy of Sciences of Belarus. – Minsk, 2023. – P. 197–201.
Abstract: 2D human pose estimation is an important task in computer vision. In recent years, deep learning methods for human pose estimation have been proposed in rapid succession and have achieved good results. Among existing models, the built-in attention layers of the Transformer allow the model to effectively capture long-range relationships and reveal the dependencies between predicted keypoints. SimCC formulates keypoint localization as a classification problem: it divides the horizontal and vertical axes into equal-width numbered bins and discretizes continuous coordinates into integer bin labels. We propose a new model that combines SimCC with a Swin Transformer backbone to predict the bins in which the keypoints lie, thereby localizing the keypoints. This method achieves better results than comparable models, with sub-pixel localization accuracy and low quantization error.
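The coordinate-binning scheme described in the abstract can be illustrated with a short sketch. The code below is a minimal, hypothetical PyTorch illustration, not the authors' implementation: the names SimCCHead, encode, and decode are assumptions, the Swin Transformer backbone is stubbed out as a generic feature vector, and the splitting factor k sets how many bins each pixel is divided into (k > 1 is what yields sub-pixel accuracy).

import torch
import torch.nn as nn

class SimCCHead(nn.Module):
    # Hypothetical head: per-keypoint 1D classification over x bins and y bins.
    def __init__(self, feat_dim, num_keypoints, img_w, img_h, k=2.0):
        super().__init__()
        self.k = k
        self.num_keypoints = num_keypoints
        self.x_bins = int(img_w * k)  # equal-width numbered bins, horizontal axis
        self.y_bins = int(img_h * k)  # equal-width numbered bins, vertical axis
        self.fc_x = nn.Linear(feat_dim, num_keypoints * self.x_bins)
        self.fc_y = nn.Linear(feat_dim, num_keypoints * self.y_bins)

    def forward(self, feats):  # feats: (B, feat_dim), e.g. pooled backbone features
        logits_x = self.fc_x(feats).view(-1, self.num_keypoints, self.x_bins)
        logits_y = self.fc_y(feats).view(-1, self.num_keypoints, self.y_bins)
        return logits_x, logits_y

def encode(coords, k=2.0):
    # Discretize continuous (x, y) pixel coordinates into integer bin labels.
    return torch.round(coords * k).long()  # coords: (B, K, 2)

def decode(logits_x, logits_y, k=2.0):
    # Recover coordinates from the most probable bins; sub-pixel when k > 1.
    x = torch.argmax(logits_x, dim=-1).float() / k
    y = torch.argmax(logits_y, dim=-1).float() / k
    return torch.stack([x, y], dim=-1)  # (B, K, 2)

Training would minimize a cross-entropy loss between the predicted logits and the encoded bin labels for each keypoint and axis; at inference, decode turns the two 1D classifications back into coordinates, avoiding the quantization error of taking an argmax over a low-resolution heatmap.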
URI: https://libeldoc.bsuir.by/handle/123456789/54364
Appears in Collections:Pattern Recognition and Information Processing (PRIP'2023) = Распознавание образов и обработка информации (2023)

Files in This Item:
File: Tongrui_Li_Human.pdf (6.39 MB, Adobe PDF)
