Large-Scale Validation of VQ-VAE for Predicting Anatomical Changes in Rectal Cancer Adaptive Radiotherapy
Abstract
Purpose
To investigate the feasibility and robustness of a VQ-VAE-based deep learning model for predicting daily anatomical changes in rectal cancer radiotherapy, validated on a large-scale dataset.
Methods
A total of 2461 daily CT images from 443 rectal cancer patients were utilized in this study. A generative model combining a Vector Quantized Variational Autoencoder (VQ-VAE) with Adaptive Instance Normalization (AdaIN) was established to predict daily anatomy. The model extracted anatomical features from planning CTs and learned inter-fractional non-rigid deformations. The dataset was divided into training and validation sets. Model performance was evaluated quantitatively with the Structural Similarity Index (SSIM) and Peak Signal-to-Noise Ratio (PSNR), and visually by assessing the anatomical plausibility of the bladder and rectum boundaries.
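The two building blocks named above can be sketched in a few lines. This is a minimal NumPy illustration of the core operations, vector quantization (nearest-codebook lookup) and AdaIN feature re-normalization; the array shapes, function names, and random inputs are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def vector_quantize(z, codebook):
    """VQ-VAE quantization step: map each latent vector to its nearest
    codebook entry. z: (N, D) latents; codebook: (K, D) learned embeddings
    (hypothetical shapes for illustration)."""
    # Squared Euclidean distance from every latent to every code: (N, K)
    d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = d.argmin(axis=1)
    return codebook[idx], idx

def adain(content, style, eps=1e-5):
    """Adaptive Instance Normalization: re-scale content features to the
    per-channel mean/std of the style features. Arrays are (C, H, W)."""
    c_mu = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mu = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    return s_std * (content - c_mu) / (c_std + eps) + s_mu

rng = np.random.default_rng(0)
zq, idx = vector_quantize(rng.normal(size=(8, 4)), rng.normal(size=(16, 4)))
out = adain(rng.normal(size=(2, 5, 5)), rng.normal(size=(2, 5, 5)))
print(zq.shape, idx.shape, out.shape)  # (8, 4) (8,) (2, 5, 5)
```

In a trained model the codebook and the AdaIN statistics would be learned, with the planning-CT features supplying the content and the fraction-specific deformation code supplying the style.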
Results
The model achieved stable convergence on the extensive cohort. Quantitative analysis demonstrated that the predicted CTs achieved an average SSIM of 0.825 and PSNR of 24.98 dB. Compared with the baseline planning CTs (SSIM: 0.816), the proposed model improved structural fidelity to the daily ground truth. Visually, the model successfully reconstructed soft-tissue deformations and preserved bony landmarks. The generated images showed clear textures without significant artifacts, indicating that the model captures the key features of anatomical variation.
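For reference, the PSNR values reported above follow from the mean squared error between the predicted and ground-truth daily CTs. The snippet below is a standard textbook definition, not code taken from this study; the `data_range` argument is an assumption about how the intensities are scaled.

```python
import numpy as np

def psnr(ref, pred, data_range=1.0):
    """Peak Signal-to-Noise Ratio in dB: 10*log10(data_range^2 / MSE).
    `data_range` is the assumed dynamic range of the images (e.g. 1.0
    for intensities normalized to [0, 1])."""
    mse = np.mean((ref.astype(np.float64) - pred.astype(np.float64)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(data_range ** 2 / mse)

# Toy check: a uniform error of 0.1 on a [0, 1] range gives MSE = 0.01,
# hence PSNR = 10*log10(1/0.01) = 20 dB.
ref = np.ones((4, 4))
pred = np.full((4, 4), 0.9)
print(round(psnr(ref, pred), 1))  # 20.0
```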
Conclusion
The VQ-VAE-based deep learning model can predict anatomical changes in rectal cancer patients and shows potential for guiding adaptive radiotherapy strategies when trained on large-scale datasets.