Electronic Theses and Dissertations for Graduate School
APPLYING DEEP LEARNING TECHNIQUES TO GENERATE MEDICAL IMAGES
Restricted (Penn State Only)
Computer Science and Engineering
Master of Science
Date of Defense: July 15, 2019
Mohamed Khaled Almekkawy, Thesis Advisor
Mehrdad Mahdavi, Committee Member
Chitaranjan Das, Committee Member
Proper diagnosis of a disease often requires more than one modality of medical imaging. Perhaps a higher-resolution magnetic resonance image is needed to resolve finer details in brain tissue but the additional expense prevents the patient from getting it, or a computed tomography scan is needed for a pregnant woman who cannot be exposed to radiation and can only undergo ultrasound or magnetic resonance imaging. As such situations become more frequent, the need grows for a tool that can convert medical images from one modality to another. Our study addresses this problem by using deep generative adversarial networks to learn from pairs of medical image modalities and then generate one modality from the other. Specifically, we explore two imaging modalities: magnetic resonance imaging T1 and T2. We train our network on patches from each corresponding image pair and compare the generated images against the ground truth. Developing the model required perfectly aligned data; however, the T1 and T2 scans in our dataset varied in size, scale, and cutoff areas. We therefore apply medical image registration with a step gradient descent optimizer to align the image pairs. For training, we consider four configurations and present the results in order of improving image clarity: i) a reduced scan size to increase the receptive area of the training patches, ii) no gradient difference loss, iii) gradient difference loss, and iv) gradient difference loss with a threshold. These configurations progressively improve model performance until the generated images are almost identical to the ground truth.