Synthetic Computed Tomography Generation via Cycle Bidirectional Transformer
Graphical Abstract
Abstract
Magnetic resonance imaging (MRI)-guided radiotherapy allows treatment plans to be adjusted in real time according to changes in the tumor and surrounding organs. It relies primarily on pseudo-computed tomography (CT) images generated from MRI for radiotherapy. Current pseudo-CT synthesis techniques are mainly based on generative adversarial networks. These methods, which update network parameters with pixel-level losses during training, are prone to mode collapse and therefore produce unstable pseudo-CT images. To accurately generate pseudo-CT images from magnetic resonance (MR) images, this paper proposes a novel medical image synthesis method, the Cycle Bidirectional Transformer. It combines the context sensitivity of vision Transformers with the inductive bias of convolutional operators. During encoding and prediction, the Cycle Bidirectional Transformer represents images with a codebook obtained from U-Net encoding, and shortens the generated code sequence through non-autoregressive encoding and vector quantization, thereby producing locally realistic and globally consistent images. Furthermore, normalized mutual information is employed as a loss function and a cycle consistency loss is introduced to address data mismatch. A series of experiments on a collected cranial MRI dataset validates the effectiveness of the proposed method: the Cycle Bidirectional Transformer achieves a mean absolute error (MAE) of 86.3 HU, a peak signal-to-noise ratio (PSNR) of 25.96 dB, and a structural similarity index (SSIM) of 0.897, outperforming the compared methods.
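The abstract describes representing images with a codebook derived from U-Net encoder features via vector quantization. The sketch below is a minimal, illustrative PyTorch implementation of such a quantization step in the common VQ-VAE style (nearest-codebook-entry lookup with a straight-through estimator); the class name, codebook size, feature dimension, and loss weighting are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of vector quantization of encoder features (not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class VectorQuantizer(nn.Module):
    def __init__(self, num_codes=1024, code_dim=256, beta=0.25):
        super().__init__()
        # Learnable codebook: num_codes entries of dimension code_dim (sizes are illustrative).
        self.codebook = nn.Embedding(num_codes, code_dim)
        self.codebook.weight.data.uniform_(-1.0 / num_codes, 1.0 / num_codes)
        self.beta = beta  # weight of the commitment term

    def forward(self, z):
        # z: (B, C, H, W) continuous features from a U-Net-style encoder.
        b, c, h, w = z.shape
        flat = z.permute(0, 2, 3, 1).reshape(-1, c)  # (B*H*W, C)
        # Squared Euclidean distance from each feature vector to every codebook entry.
        dist = (flat.pow(2).sum(1, keepdim=True)
                - 2 * flat @ self.codebook.weight.t()
                + self.codebook.weight.pow(2).sum(1))
        idx = dist.argmin(dim=1)  # discrete code index per spatial position
        z_q = self.codebook(idx).view(b, h, w, c).permute(0, 3, 1, 2)
        # Codebook loss + commitment loss, then straight-through gradient to the encoder.
        loss = F.mse_loss(z_q, z.detach()) + self.beta * F.mse_loss(z, z_q.detach())
        z_q = z + (z_q - z).detach()
        return z_q, idx.view(b, h, w), loss
```

A cycle consistency term of the kind mentioned in the abstract is typically an L1 penalty between an input MR image and its reconstruction through both mapping directions, e.g. `F.l1_loss(ct2mr(mr2ct(mr)), mr)`, where `mr2ct` and `ct2mr` are hypothetical generator names; the exact formulation used in the paper is given in the methods section, not here.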