Chen, S., Ding, C., Liu, M., Cheng, J., & Tao, D. (2023). CPP-Net: Context-Aware Polygon Proposal Network for Nucleus Segmentation. IEEE Transactions on Image Processing, 32, 980–994. https://doi.org/10.1109/tip.2023.3237013
Nucleus segmentation is a challenging task due to the crowded distribution and blurry boundaries of nuclei.
Recent approaches represent nuclei by means of polygons to differentiate between touching and overlapping nuclei and have accordingly achieved promising performance. Each polygon is
represented by a set of centroid-to-boundary distances, which are in turn predicted by features of the centroid pixel for a single nucleus. However, using the centroid pixel alone does not provide
sufficient contextual information for robust prediction and thus degrades the segmentation accuracy. To handle this problem, we propose a Context-aware Polygon Proposal Network (CPP-Net) for nucleus segmentation. First, we sample a point set rather than one single pixel within each cell for distance prediction. This strategy substantially enhances contextual information and thereby improves the robustness of the prediction. Second, we
propose a Confidence-based Weighting Module, which adaptively fuses the predictions from the sampled point set. Third, we introduce a novel Shape-Aware Perceptual (SAP) loss that constrains the shape of the predicted polygons. Here, the SAP loss is based on an additional network that is pre-trained to map the centroid probability map and the pixel-to-boundary distance maps to a different nucleus representation. Extensive experiments justify the effectiveness of each component in the
proposed CPP-Net. Finally, CPP-Net is found to achieve state-of-the-art performance on three publicly available databases,
namely DSB2018, BBBC06, and PanNuke. Code of this paper
is available at https://github.com/csccsccsccsc/cpp-net.
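The confidence-based fusion described above can be sketched as follows: several points sampled within a nucleus each predict a centroid-to-boundary distance along the same ray, and the predictions are fused with confidence-derived weights. The array names and the softmax weighting are illustrative assumptions for this sketch, not the paper's exact formulation.

```python
import numpy as np

def fuse_distances(dist_preds, confidences):
    """Fuse per-point distance predictions with softmax confidence weights.

    dist_preds:  (K,) boundary distances predicted at K sampled points,
                 each already converted to a distance from the centroid.
    confidences: (K,) raw confidence scores for those predictions.
    Both arrays are hypothetical stand-ins for the network's outputs.
    """
    # Numerically stable softmax over the confidence scores.
    w = np.exp(confidences - confidences.max())
    w /= w.sum()
    # Confidence-weighted average of the sampled distance predictions.
    return float(np.dot(w, dist_preds))

# Toy example: three sampled points along one ray roughly agree on the
# boundary distance; the most confident prediction dominates the fusion.
d = np.array([10.0, 12.0, 11.0])
c = np.array([0.5, 2.0, 1.0])
fused = fuse_distances(d, c)
```

In the full model this weighting is learned per ray and per pixel, so predictions from points deep inside a nucleus (which see more context) can outweigh the centroid's own estimate.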
This work was supported by the National Natural Science Foundation of China under Grants 62076101 and 61702193, the Program for Guangdong Introducing Innovative and Entrepreneurial Teams under Grant 2017ZT07X183, the Natural Science Fund of Guangdong Province under Grant 2021A1515011651, and the Guangdong Provincial Key Laboratory of Human Digital Twin under Grant 2022B1212010004.