| Title: | Generalized synergistic edge-guided graph reasoning network for biomedical image segmentation |
| Authors: | Di Zhao; Yi Tang; Pertsau, D.; Gourinovitch, A. |
| Keywords: | scientists' publications; medical image segmentation; graph reasoning; graph convolutional network; MRI; CT |
| Issue Date: | 2026 |
| Publisher: | Institute of Mathematics |
| Citation: | Generalized synergistic edge-guided graph reasoning network for biomedical image segmentation / Di Zhao, Yi Tang, Dmitry Pertsau, Alevtina Gourinovitch // Informatics and Control Problems. – 2026. – Volume 46, Issue 1. – P. 39–49. |
| Abstract: | Biomedical image segmentation plays a vital role in computer-aided diagnosis and treatment planning. However, existing methods often struggle to model complex anatomical structures and capture long-range dependencies. To address these limitations, we propose a generalized Synergistic Edge-Guided Graph Reasoning Network (SEGRNet) that integrates convolutional feature extraction with graph-based global reasoning. The model projects pixel-level region and edge features into a graph domain, enabling adaptive interaction between local and global features via a graph convolutional network. After reasoning, the enhanced features are mapped back to the pixel domain for refined segmentation. Experiments on three public datasets (BUSI, LGG, and CHAOS) show that the proposed method outperforms state-of-the-art models in terms of Dice coefficient, mean intersection over union, and structural similarity. These results confirm the effectiveness and generalization ability of the proposed method across various medical imaging scenarios, making it suitable for future clinical applications. |
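The abstract's pipeline (project pixel features into a graph domain, reason with a graph convolution, map back) can be sketched in a minimal NumPy example. This is an illustrative reconstruction, not the authors' SEGRNet implementation: the number of graph nodes, the random projection/adjacency/weight matrices, and the residual refinement are all assumptions standing in for learned parameters.

```python
import numpy as np

def graph_reasoning(features, num_nodes=8, seed=0):
    """Sketch of projection -> graph reasoning -> reprojection.

    features: (H*W, C) pixel-level feature matrix.
    Returns a refined feature matrix of the same shape.
    """
    rng = np.random.default_rng(seed)
    hw, c = features.shape
    # Projection weights are learned in practice; random here for illustration.
    logits = rng.standard_normal((hw, num_nodes))
    B = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # soft assignment of pixels to nodes
    Z = B.T @ features                      # (num_nodes, C): aggregate pixels into graph nodes
    # One graph-convolution step, Z' = ReLU(A Z W), with a near-identity adjacency A.
    A = np.eye(num_nodes) + 0.01 * rng.standard_normal((num_nodes, num_nodes))
    W = 0.1 * rng.standard_normal((c, c))
    Z = np.maximum(A @ Z @ W, 0.0)
    # Map reasoned node features back to the pixel grid and refine residually.
    return features + B @ Z

x = np.ones((16, 4))                        # 16 "pixels", 4 channels
y = graph_reasoning(x)
print(y.shape)                              # (16, 4)
```

Because `B` is row-stochastic, `B.T @ features` is a weighted pooling of pixels into nodes and `B @ Z` broadcasts the reasoned node features back, which is the generic form of the graph-projection trick the abstract describes.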
| URI: | https://libeldoc.bsuir.by/handle/123456789/63185 |
| DOI: | https://doi.org/10.54381/icp.2026.1.05 |
| Appears in Collections: | Publications in foreign editions |
| File | Description | Size | Format |
|---|---|---|---|
| Di_Zhao_Generalized.pdf | | 824.84 kB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.