Abstract:
At present, the diagnosis of prostate cancer mainly relies on the level of prostate-specific antigen (PSA) followed by a prostate biopsy. Transrectal ultrasound (TRUS) has been the most popular method for guiding prostate biopsy because of its advantages, such as real-time imaging, low cost, and easy operation. However, the low imaging quality of ultrasound equipment makes it difficult to distinguish regions of malignant tumors from healthy tissue, which results in missed diagnoses or overtreatment. In contrast, malignant tumors can be quickly located in magnetic resonance (MR) images of the prostate. It is therefore crucial to register the annotated MR images with the corresponding TRUS images to perform a targeted biopsy of the prostate tumor. The registration and fusion of prostate MR and TRUS images help to improve the accuracy of targeted biopsy of prostate lesions. Traditional registration methods usually rely on manually selected anatomical landmarks in segmented regions as references and perform rigid or nonrigid registration; this is inefficient because of the low quality of prostate TRUS images and the substantial differences in pixel intensity of the prostate between MR and TRUS images. This paper proposes a novel prostate MR/TRUS image segmentation and automatic registration method based on a supervised learning framework. First, an active appearance model of the prostate was trained for the TRUS image segmentation task, and a random forest classifier was used to build a boundary-driven mathematical model, realizing automatic segmentation of the TRUS images. Then, sets of contour landmarks in the MR/TRUS images were computed by matching their corresponding shape descriptors and were used for registration.
The method was validated by comparing the automatic contour segmentation results with reference standards, and the registration results with those of a traditional registration method. Results showed that our method can accurately realize the automatic segmentation and registration of prostate TRUS and MR images. The Dice similarity coefficient (DSC) of nine sets of registration results is higher than 0.98, and the average localization accuracy of the urethral opening is 1.64 mm, demonstrating good registration performance.