Deform-GAN: An Unsupervised Learning Model for Deformable Registration
Xiaoyue Zhang, Weijian Jian, Yu Chen, Shihting Yang
3.1. Dataset
We evaluated our method on the Brain Tumor Segmentation (BraTS) 2018 dataset [12], which provides a large number of multi-sequence MRI scans, including T1, T2, FLAIR, and T1Gd. The different sequences in the dataset are already well aligned, and we evaluated registration on T1-T2 data pairs. We randomly chose 235 scans for training and the remaining 50 for testing, and cropped and downsized the images to an input size of 112 × 128 × 96. We applied random shift, rotation, scaling, and elastic deformation to the scans to generate data pairs for registration, so the synthetic deformation fields can be regarded as registration ground truth. The deformations can be as large as ±40 voxels, which makes the registration task challenging.
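As a rough illustration of this pair-generation step, the sketch below builds a smooth random displacement field of up to ±40 voxels and warps a volume with it. This is a minimal NumPy/SciPy sketch under our own assumptions; the function names, smoothing parameters, and loader are illustrative, not from the Deform-GAN code.

```python
# Illustrative sketch of synthetic-deformation pair generation; not the
# authors' pipeline. Assumes a NumPy volume of shape (112, 128, 96).
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def random_elastic_field(shape, max_disp=40.0, smooth=8.0, seed=None):
    """Smooth random displacement field with magnitude up to ~max_disp voxels."""
    rng = np.random.default_rng(seed)
    field = rng.uniform(-1.0, 1.0, size=(3, *shape))
    for axis in range(3):
        field[axis] = gaussian_filter(field[axis], sigma=smooth)
    # Rescale so the largest displacement reaches max_disp voxels.
    field *= max_disp / (np.abs(field).max() + 1e-8)
    return field

def warp(volume, field):
    """Warp a volume by a dense displacement field (trilinear interpolation)."""
    grid = np.meshgrid(*[np.arange(s) for s in volume.shape], indexing="ij")
    coords = [g + f for g, f in zip(grid, field)]
    return map_coordinates(volume, coords, order=1, mode="nearest")

# Usage (loader is hypothetical): the warped scan acts as the floating
# image F, the original as the reference R, and `field` is the ground truth.
# R = load_t1_volume(...)
# field = random_elastic_field(R.shape, seed=0)
# F = warp(R, field)
```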
3.2. Baseline Methods

We compare two well-established image registration methods with ours: a conventional MI-based approach [?] and the VoxelMorph method [4]. In the former, MI is implemented as the driving force within a fluid registration framework. The latter introduces novel diffeomorphic integration layers combined with a transform layer to enable unsupervised end-to-end learning. However, the original VoxelMorph uses local cross-correlation as its similarity metric, which functions well only for single-modal registration. As described in Section 2.2, we also tried several similarity metrics within the VoxelMorph framework, such as MI, NGF, and LG, but only LG proved capable of this registration task, so we use VoxelMorph with LG for comparison.
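The exact LG (local gradient) formulation is given in Section 2.2 and is not reproduced here. As a hedged illustration of why gradient-based terms survive cross-sequence registration, the sketch below computes a normalized-gradient-field-style agreement; the eps regularizer and the squared inner product are our assumptions, in the spirit of NGF rather than the paper's LG definition.

```python
# Gradient-based cross-modality similarity sketch (NGF-style); illustrative
# only, not the paper's LG loss.
import numpy as np

def ngf_similarity(r, f, eps=1e-2):
    """Normalized-gradient-field agreement between volumes r and f.

    High where image edges align regardless of absolute intensities,
    which is why gradient-based terms work across sequences (T1 vs. T2).
    """
    gr = np.stack(np.gradient(r))          # (3, X, Y, Z) gradient field
    gf = np.stack(np.gradient(f))
    nr = np.sqrt((gr ** 2).sum(0) + eps ** 2)
    nf = np.sqrt((gf ** 2).sum(0) + eps ** 2)
    inner = (gr * gf).sum(0) / (nr * nf)   # normalized inner product
    return (inner ** 2).mean()             # 1.0 = perfectly aligned edges
```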
3.3. Evaluation

We set the loss weights as α = 1 in Eq. 4, β = 2 in Eq. 5, and µ = 5, λ = 100 in Eq. 8. For CC and the local gradient, the window size is set to 7 × 7 × 7. We use the ADAM optimizer with a learning rate of 1e−4, on an NVIDIA GeForce 1080 Ti GPU with 11 GB of memory.
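For concreteness, a minimal sketch of windowed cross-correlation with this 7 × 7 × 7 window, in the style popularized by VoxelMorph, might look as follows; the uniform_filter-based implementation and the eps guard are our assumptions, not the authors' code.

```python
# Windowed (local) cross-correlation sketch; illustrative implementation.
import numpy as np
from scipy.ndimage import uniform_filter

def local_cc(r, f, win=7, eps=1e-5):
    """Mean local squared normalized cross-correlation over win^3 windows."""
    mean = lambda x: uniform_filter(x, size=win)   # local window average
    r_m, f_m = mean(r), mean(f)
    cross = mean(r * f) - r_m * f_m                # local covariance
    r_var = mean(r * r) - r_m ** 2                 # local variances
    f_var = mean(f * f) - f_m ** 2
    cc = (cross ** 2) / np.maximum(r_var * f_var, eps)
    return cc.mean()                               # maximize, or minimize 1 - cc
```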
Fig. 2. Registration results for different methods.

The registration results are illustrated in Fig. 2. MI is intrinsically a global measure, so its local estimation is difficult. VoxelMorph with local CC is based on gray values and cannot function well across sequences. The results of VoxelMorph with the gradient loss are much better and can handle the large deformation between R and F, which demonstrates the effectiveness of the local gradient loss. Our methods, both Deform-GAN-1 and Deform-GAN-2, achieve higher registration accuracy. Even for blurred and noisy images (see the second row), Deform-GAN obtains satisfying results, and Deform-GAN-2 is clearly better still.

For a further comparison of the two settings of Deform-GAN, the warped floating images F(φ) and the synthesized images F′(φ) from the two GANs at different stages of training are shown in Fig. 3. We can see that the gradient constraint brings faster convergence during training: even at the first epoch, white matter can be seen clearly in F′(φ). What's more, Deform-GAN-2 is more stable during training (as the yellow arrows point out, there is less noise in F′(φ) of Deform-GAN-2 than in that of Deform-GAN-1). Note that because F′(φ) is used to calculate CC(R, F′(φ)), it should be strictly aligned to F(φ); the red arrows show that the alignment between F(φ) and F′(φ) is much better for Deform-GAN-2.

To quantify the registration results of our method and the compared methods, we performed additional evaluation experiments. For the BraTS dataset, we can warp the floating image with the synthetic deformation fields as ground truth; hence, the root-mean-square error (RMSE) of pixel-wise intensity can be calculated for the evaluation. Also, because the tumor masks are provided by the BraTS challenge, we can calculate the Dice score to evaluate registration around the tumor area. Table 1 shows the quantitative results: our method outperforms the others in terms of both tumor Dice and RMSE. In terms of registration speed, deep-learning based methods are also much faster than the conventional approach.
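A minimal sketch of these two measures, assuming NumPy volumes and binary tumor masks (array and function names are illustrative):

```python
# Evaluation metrics sketch: pixel-wise RMSE against the synthetically
# warped ground truth, and Dice over the BraTS tumor mask.
import numpy as np

def rmse(warped, ground_truth):
    """Root-mean-square intensity error between two volumes."""
    return float(np.sqrt(np.mean((warped - ground_truth) ** 2)))

def dice(mask_a, mask_b, eps=1e-8):
    """Dice overlap of two binary masks (e.g., warped vs. reference tumor)."""
    mask_a, mask_b = mask_a.astype(bool), mask_b.astype(bool)
    inter = np.logical_and(mask_a, mask_b).sum()
    return float(2.0 * inter / (mask_a.sum() + mask_b.sum() + eps))
```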
We declare that we do not have any commercial or associative interest that represents a conflict of interest in connection with the work submitted.
5. REFERENCES