Abstract:
In radiotherapy, image guidance based on cone-beam computed tomography (CBCT) can effectively correct patient setup errors and monitor changes in lesion volume, but its inherent scatter noise and reconstruction artifacts distort image grayscale values, limiting its clinical application. To achieve fast calibration of CBCT Hounsfield unit (HU) values for intra-fraction adaptive radiotherapy, we propose a generative adversarial network model based on a deformable registration mechanism, the registration-enhanced generative adversarial network (Reg-GAN), which achieves fast grayscale calibration of CBCT images into pseudo-CT, also called synthetic CT (sCT), by efficiently mapping unpaired medical image data. The study included paired simulation CT, also known as planning CT (pCT), and CBCT image data from 46 patients with head and neck tumors (acquisition interval <24 h). Stratified random sampling was used to divide the dataset into training (38 cases) and validation (eight cases) groups. In the preprocessing stage, a rigid registration algorithm was applied to spatially align the pCT with the CBCT coordinate system, and voxel resampling was used to standardize the voxel spacing.
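A minimal sketch of this preprocessing step with SimpleITK is shown below; the file names, the mutual-information settings, the air fill value of -1000 HU, and the 1 mm isotropic target spacing are illustrative assumptions, not parameters reported in this study.

```python
import SimpleITK as sitk

# Load the planning CT and CBCT volumes (file names are placeholders).
pct = sitk.ReadImage("pct.nii.gz", sitk.sitkFloat32)
cbct = sitk.ReadImage("cbct.nii.gz", sitk.sitkFloat32)

# Rigid (6-DOF) registration of the pCT onto the CBCT coordinate system.
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.01)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()
initial = sitk.CenteredTransformInitializer(
    cbct, pct, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)
reg.SetInitialTransform(initial, inPlace=False)
rigid = reg.Execute(cbct, pct)  # fixed = CBCT, moving = pCT

# Resample the pCT onto the CBCT grid, filling outside voxels with air.
pct_on_cbct = sitk.Resample(pct, cbct, rigid, sitk.sitkLinear,
                            -1000.0, sitk.sitkFloat32)

# Voxel resampling to a standardized spacing (1 mm isotropic is an assumption).
def resample_to_spacing(img, spacing=(1.0, 1.0, 1.0)):
    size = [int(round(sz * old / new))
            for sz, old, new in zip(img.GetSize(), img.GetSpacing(), spacing)]
    return sitk.Resample(img, size, sitk.Transform(), sitk.sitkLinear,
                         img.GetOrigin(), spacing, img.GetDirection(),
                         -1000.0, img.GetPixelID())

pct_std = resample_to_spacing(pct_on_cbct)
cbct_std = resample_to_spacing(cbct)
```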
The Reg-GAN architecture is based on the cycle-consistent adversarial network (Cycle-GAN) and integrates a deep-learning-based multimodal registration module; by jointly optimizing the image generation loss and the spatial deformation-field constraints, it significantly improves the robustness of the model to noise and artifacts.
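The abstract does not state the exact loss formulation, so the following PyTorch sketch only illustrates such a joint objective: the generator `G`, registration network `R`, discriminator `D`, and spatial-transformer `warp` are hypothetical interfaces, the first-order smoothness penalty stands in for the deformation-field constraint, and the least-squares adversarial term follows the Cycle-GAN convention.

```python
import torch
import torch.nn.functional as F

def smoothness_loss(flow):
    """First-order smoothness penalty on a dense 3D deformation field.

    flow: tensor of shape (N, 3, D, H, W) holding per-voxel displacements.
    """
    dz = (flow[:, :, 1:, :, :] - flow[:, :, :-1, :, :]).abs().mean()
    dy = (flow[:, :, :, 1:, :] - flow[:, :, :, :-1, :]).abs().mean()
    dx = (flow[:, :, :, :, 1:] - flow[:, :, :, :, :-1]).abs().mean()
    return dz + dy + dx

def generator_loss(G, R, D, warp, cbct, pct, lam_smooth=10.0, lam_adv=1.0):
    """Joint objective: correction loss on the registered sCT, a smoothness
    constraint on the deformation field, and an adversarial term."""
    sct = G(cbct)                    # synthetic CT generated from the CBCT
    flow = R(sct, pct)               # deformation field aligning sCT to pCT
    sct_warped = warp(sct, flow)     # spatially deformed sCT
    loss_corr = F.l1_loss(sct_warped, pct)     # generation (correction) loss
    loss_smooth = smoothness_loss(flow)        # deformation-field constraint
    d_out = D(sct)
    loss_adv = F.mse_loss(d_out, torch.ones_like(d_out))  # LSGAN generator term
    return loss_corr + lam_smooth * loss_smooth + lam_adv * loss_adv
```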
Quantitative evaluation, comparing the HU values of corresponding voxels in a common spatial coordinate system, showed that within anatomical structures the HU difference between CBCT and pCT ranged from 0 to 250 HU, whereas the difference between sCT and pCT ranged from −50 to 50 HU and was approximately 0 HU for soft tissue and brain tissue. The sCT generated by Reg-GAN also showed statistically significant improvement (all P<0.05) in image quality metrics over the original CBCT: the mean absolute error (MAE) decreased from (52.5±26.6) HU to (36.6±11.6) HU (P=0.041), the peak signal-to-noise ratio (PSNR) increased from (25.1±3.1) dB to (27.1±2.4) dB (P=0.006), and the structural similarity index (SSIM) improved from 0.82±0.03 to 0.84±0.02 (P=0.022).
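These three metrics can be computed as in the sketch below; the `data_range` of 2000 HU, the toy volumes, and the paired t-test are illustrative assumptions (the abstract reports P-values without naming the statistical test).

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def image_quality(ref, test, data_range=2000.0):
    """MAE (HU), PSNR (dB), and SSIM between two co-registered volumes.

    data_range is an assumed HU span for PSNR/SSIM normalization.
    """
    mae = np.mean(np.abs(ref - test))
    psnr = peak_signal_noise_ratio(ref, test, data_range=data_range)
    ssim = structural_similarity(ref, test, data_range=data_range)
    return mae, psnr, ssim

# Toy demonstration on synthetic volumes (real use: pCT vs. CBCT/sCT arrays).
rng = np.random.default_rng(0)
pct = rng.normal(0.0, 300.0, size=(64, 64, 64)).astype(np.float32)
sct = pct + rng.normal(0.0, 30.0, size=pct.shape).astype(np.float32)
print(image_quality(pct, sct))

# Per-patient metric pairs across the eight validation cases can then be
# compared with a paired test, e.g.:
#   from scipy import stats
#   t, p = stats.ttest_rel(mae_cbct_per_patient, mae_sct_per_patient)
```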
Dosimetric validation was performed using a multimodal image fusion strategy: the pCT served as the baseline image, the sCT was rigidly registered to it, and the target volumes and organs at risk were mapped onto the sCT by deformable contour propagation. Dose calculations in the treatment planning system (TPS) showed that the dose distributions and dose–volume histograms (DVHs) generated from sCT and pCT were highly consistent, and the P-values of the key dosimetric parameters were all >0.05, with no statistically significant differences, validating the dosimetric accuracy of sCT in adaptive radiotherapy.
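As an illustration of the DVH comparison, a minimal cumulative-DVH computation over a dose grid and a structure mask might look like the following; the dose arrays, the mask, and the bin count are hypothetical, not taken from the study.

```python
import numpy as np

def cumulative_dvh(dose, mask, num_bins=200):
    """Cumulative DVH: fraction of the structure volume receiving at least
    each dose level. dose is a 3D array (Gy); mask selects the structure."""
    d = dose[mask]
    levels = np.linspace(0.0, d.max(), num_bins)
    volume_fraction = np.array([(d >= lv).mean() for lv in levels])
    return levels, volume_fraction

# Toy example: compare a structure's DVH under pCT- and sCT-based doses.
rng = np.random.default_rng(0)
dose_pct = rng.uniform(0.0, 70.0, size=(32, 32, 32))
dose_sct = dose_pct + rng.normal(0.0, 0.5, size=dose_pct.shape)
mask = np.zeros(dose_pct.shape, dtype=bool)
mask[8:24, 8:24, 8:24] = True
for dose in (dose_pct, dose_sct):
    levels, vf = cumulative_dvh(dose, mask)
    d95 = levels[vf >= 0.95][-1]  # highest dose still covering 95% of volume
    print(round(d95, 2))
```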
In this study, the limitation that CBCT grayscale distortion imposes on dose calculation was effectively resolved through the synergistic optimization of deep registration and adversarial generation. The proposed Reg-GAN model not only enhances the workflow efficiency of image-guided radiotherapy but also delivers sCT with excellent image quality and dosimetric properties, providing reliable technical support for the clinical implementation of online adaptive radiotherapy.