Deep Learning Based Reconstruction For Tailored Magnetic Resonance Fingerprinting
Amaresha Shridhar Konar1, Vineet Vinay Bhombore1, Imam Ahmed Shaik1, Seema Bhat1, Rajagopalan Sundaresan2, Gul Moonis3, Prachi Desai3, Sachin Jambawalikar3, Ramesh Venkatesan2, Thomas Vaughan3 and Sairam Geethanath1,4*
1Medical Imaging Research Centre (MIRC), Dayananda Sagar Institutions, Bangalore, India; 2Wipro GE Healthcare, Bangalore, India;
3Dept. of Radiology, Columbia University Medical Center, New York, USA; 4Magnetic Resonance Research Program Columbia University Medical Center, New York, USA
Introduction
Magnetic Resonance Fingerprinting (MRF) is an accelerated acquisition and reconstruction technique used to generate multiple parametric maps [1]. Tailored MRF (TMRF) was recently proposed with a block-based (T1, T2 and PD) acquisition approach [2] to overcome conventional MRF's underestimation of long-T2 components [1]. Dictionary-based matching in conventional MRF is computationally intensive and limited by the dictionary resolution. To overcome this limitation, a deep learning based approach is proposed and demonstrated here for the reconstruction of TMRF data. Accelerated reconstruction from a minimal acquisition that provides multiple parametric maps would add value to MR imaging. A Deep Neural Network (DNN) with two hidden layers, built in TensorFlow (Google Inc., USA), was employed for training.
Methods
The reconstruction pipeline for tailored MRF using the block-based acquisition is illustrated in Figure 1.
Acquisition: The signal intensity of a gradient echo based sequence depends more on the Flip Angle (FA) than on the Repetition Time (TR), owing to the minimal TRs typically employed in such sequences [3]. The required contrast can therefore be achieved by an optimal choice of FA. This was used to form three blocks that optimize contrast for Proton Density (PD), T1 and T2 weighting, so that each tissue type is hyperintense in at least one of the three blocks. TRs and FAs were designed independently for each block and then combined into a single sequence. Each block comprised 240 acquisitions (720 TR/FA combinations in total). Four in vivo human brain datasets were acquired on a 1.5 T GE Signa scanner with a spiral readout time of 5 ms and a fixed Echo Time (TE) of 2.7 ms, as part of an institution-approved study. The spiral trajectory consisted of 48 arms, and the 720 images acquired with this TR/FA schedule were sliding-window reconstructed to yield 673 images.
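The frame counts are consistent with a sliding window spanning the 48 spiral arms (720 − 48 + 1 = 673). Below is a minimal NumPy sketch of such a view-sharing step; it assumes the per-TR single-arm spiral data have already been gridded onto a Cartesian grid (the actual pipeline would grid the raw spiral arms, e.g., with a NUFFT), and the array names are illustrative.

```python
import numpy as np

def sliding_window_recon(kspace_per_tr, window=48):
    """Combine k-space from `window` consecutive TRs (one spiral arm each)
    into one frame, then inverse-FFT to image space.

    kspace_per_tr : complex array, shape (n_tr, nx, ny) -- per-TR gridded
                    single-arm k-space (a simplifying assumption here).
    Returns images of shape (n_tr - window + 1, nx, ny).
    """
    n_tr, nx, ny = kspace_per_tr.shape
    n_frames = n_tr - window + 1          # 720 - 48 + 1 = 673
    frames = np.empty((n_frames, nx, ny), dtype=complex)
    for i in range(n_frames):
        # Arms within the window cover complementary k-space regions,
        # so summing them approximates one fully sampled frame.
        combined = kspace_per_tr[i:i + window].sum(axis=0)
        frames[i] = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(combined)))
    return frames
```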
Figure 1: Block diagram depicting the reconstruction of TMRF Data using Deep Learning methods
Training: We considered two approaches, one using natural images for general-purpose training and one using brain-specific images.
Approach 1: 10,000 natural images of size 32x32 were downloaded from CIFAR-100 [4] to train the NN. One half of the data was corrupted by introducing three artifacts commonly observed in TMRF data: random rotation, Rician noise and circular shift. As input to the NN, 10,000 voxels were randomly selected from the synthesized data of size 32x32x673x10000 and trained against the corresponding voxels of the Ground Truth (GT) T1 and T2 maps.
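A minimal sketch of the corruption step for Approach 1 is given below. The rotation range, noise level and shift amounts are illustrative assumptions; the abstract does not specify the values used.

```python
import numpy as np
from scipy.ndimage import rotate

rng = np.random.default_rng(0)

def corrupt(img, sigma=0.05):
    """Apply the three corruptions described above to one 32x32 image:
    random rotation, Rician noise, and a circular shift."""
    out = rotate(img, angle=rng.uniform(-15, 15), reshape=False, mode='nearest')
    # Rician noise: magnitude of the signal plus complex Gaussian noise.
    noise_r = rng.normal(0, sigma, out.shape)
    noise_i = rng.normal(0, sigma, out.shape)
    out = np.sqrt((out + noise_r) ** 2 + noise_i ** 2)
    # Circular shift along both image axes.
    out = np.roll(out, shift=tuple(rng.integers(-4, 5, size=2)), axis=(0, 1))
    return out

# Corrupt one half of the training set, as described above.
# images = ...  # shape (10000, 32, 32), loaded from CIFAR-100
# images[:5000] = [corrupt(im) for im in images[:5000]]
```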
Approach 2: Synthetic brain images of size 128x128 were generated using the same method to create a training dataset of size 128x128x673. As in the previous approach, 12,000 voxels were selected from the synthesized maps and the GT maps (synthetic brain images) for NN training; only non-zero voxels within the brain were selected.
DNN: T1 and T2 networks were trained separately for Approaches 1 and 2. A DNN with two hidden layers was used for both parameters: the T2 network had layer widths of 64, 32 and 1, whereas the T1 network had layer widths of 128, 64 and 1.
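Below is a minimal TensorFlow/Keras sketch of these voxel-wise networks, using the layer widths stated above; the activation functions, optimizer and loss are illustrative assumptions, as the abstract does not specify them.

```python
import tensorflow as tf

def build_dnn(hidden=(64, 32), n_timepoints=673):
    """Voxel-wise regressor: a 673-point TMRF signal evolution in,
    one parametric value (e.g., T2) out. Layer widths follow the text;
    activations, optimizer and loss are illustrative assumptions."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(n_timepoints,)),
        tf.keras.layers.Dense(hidden[0], activation='relu'),
        tf.keras.layers.Dense(hidden[1], activation='relu'),
        tf.keras.layers.Dense(1),          # predicted parameter value
    ])
    model.compile(optimizer='adam', loss='mse')
    return model

t2_net = build_dnn((64, 32))    # T2: layer widths 64, 32, 1
t1_net = build_dnn((128, 64))   # T1: layer widths 128, 64, 1
```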
Testing: Voxel-wise testing was performed for both approaches on data acquired with the TMRF method. Four such datasets were used to test the DNN. The data were forward propagated through the trained NN to obtain reconstructed voxels, which were then reshaped into reconstructed maps in MATLAB (The MathWorks Inc., MA). The reconstructed maps were compared with the scanner-generated GT maps to validate the proposed approaches. The code for this implementation is available online [5].
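A minimal sketch of this testing step (in Python rather than MATLAB, with illustrative array names): each voxel's 673-point signal evolution is forward propagated through the trained network, and the predictions are reshaped back into a map.

```python
import numpy as np

# tmrf_data: acquired TMRF series, shape (nx, ny, 673) -- illustrative name.
# t1_net, t2_net: trained networks from the sketch above.
def reconstruct_map(net, tmrf_data):
    """Forward-propagate every voxel's signal evolution through the
    trained DNN and reshape the predictions into a parametric map."""
    nx, ny, nt = tmrf_data.shape
    voxels = tmrf_data.reshape(-1, nt)          # one row per voxel
    predictions = net.predict(voxels, verbose=0)
    return predictions.reshape(nx, ny)

# t1_map = reconstruct_map(t1_net, tmrf_data)
# t2_map = reconstruct_map(t2_net, tmrf_data)
```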
Results
Figure 2 shows the T1 and T2 results from both approaches, compared with the respective scanner-generated GT maps.
Figure 2: Comparison of Ground Truth T1 and T2 maps with Natural-Image (NI) and Brain-specific Image (BI) DNN-reconstructed maps
Quantitative analysis for both approaches is reported in Figure 3. A comparison of the learning curves (Figure 3(a)) shows the differences between the two approaches. The error between the reconstructed and scanner-generated GT maps, measured with the Normalized Root Mean Square Error (NRMSE) metric, is shown in Figure 3(b); no significant difference is seen between Approach 1 and Approach 2. However, Figure 2 shows that natural-image-based training retains parametric values better than brain-specific training across the four datasets. Figure 4 shows histogram plots comparing the Ground Truth and DNN reconstructions.
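For reference, a sketch of one common form of the NRMSE metric, normalized here by the range of the ground-truth map; the abstract does not state which normalization was used.

```python
import numpy as np

def nrmse(gt, recon):
    """Root-mean-square error normalized by the ground-truth range;
    the choice of normalization is an assumption."""
    rmse = np.sqrt(np.mean((gt - recon) ** 2))
    return rmse / (gt.max() - gt.min())
```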
Figure 3: (a) Learning curves for the DNN training of T1 and T2 for both approaches; (b) NRMSE between Ground Truth and reconstructed maps for the two cases
Figure 4: Histogram plots comparing Ground Truth and DNN-reconstructed maps for both approaches
Discussion and Conclusion
The results obtained with natural-image-based training were better than those with brain-image-based training. We attribute this to over-fitting of the second neural network, which is also indicated by the training learning curves (Figure 3(a)). A salient feature of the natural-image approach is that it is not restricted to a single organ and could therefore enable accelerated multi-organ imaging that provides high-value MRI.
References
[1] Dan Ma et al., Nature 2013; [2] Shaik Imam et al., ISMRM MRF Workshop 2017; [3] Brian Hargreaves et al., JMRI 2012; [4] https://www.cs.toronto.edu/~kriz/cifar.html; [5] https://github.com/mirc-dsi/IMRI MIRC/tree/master/MR%20RECON/CODE/TMRF_DNN_Recon