Master of Science (MS)
Hyperspectral imaging presents detailed information about the electromagnetic spectrum of an object in three dimensions. A key property of hyperspectral images is that they contain tens or hundreds of spectral layers, which provide precise data about the composition of the studied material. Therefore, hyperspectral images have become popular in many fields of study, such as medical diagnostic imaging. Speed and precision are crucial for saving lives in disease diagnosis, and applying machine learning techniques to medical hyperspectral images helps answer this need. Convolutional neural networks (CNNs) are among the most popular machine learning methods for classifying medical images. However, training neural networks generally requires a large dataset, and the small size of medical imaging datasets poses a problem. This problem can be addressed with the help of sparse coding algorithms: in this thesis, we propose sparse coding algorithms to regenerate the hyperspectral data and feed the reconstructions to the CNN model for training. We focus on a colon cancer hyperspectral image dataset and on sparse coding methods using K-SVD and A+ (with and without patching) for dictionary learning. The reconstructed images were added to the original image set, producing three new training sets with double the number of images (246) for training the CNN. Using the augmented datasets, the test accuracy rose to 86.53%, which is 30.13% higher than with the original dataset (56.4%). We also generated another dataset that mixes the three reconstruction methods, increasing the number of training images to 266. Using the mixed dataset, the accuracy reached 94.23%, and the gap between test and training accuracy dropped by 15.42%. In addition, the precision increased to 100%, meaning no non-malignant image was classified as a lesional image.
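The augmentation idea summarized above can be sketched as: learn a dictionary on image patches, sparse-code each patch, and use the reconstructed image as an additional training sample. The sketch below is illustrative only, assuming scikit-learn's MiniBatchDictionaryLearning as a stand-in for the K-SVD and A+ dictionary learners named in the thesis; the array sizes, patch size, and sparsity level are hypothetical choices, not the thesis's actual settings.

```python
# Minimal sketch of patch-based sparse-coding augmentation.
# Assumptions: MiniBatchDictionaryLearning replaces K-SVD/A+; a random
# 64x64 array stands in for one spectral band of a hyperspectral image.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import (
    extract_patches_2d,
    reconstruct_from_patches_2d,
)

rng = np.random.default_rng(0)
band = rng.random((64, 64))  # stand-in spectral band

# 1. Extract overlapping patches (the "with patching" variant).
patch_size = (8, 8)
patches = extract_patches_2d(band, patch_size)
X = patches.reshape(patches.shape[0], -1)
mean = X.mean(axis=0)
X_centered = X - mean

# 2. Learn a dictionary and sparse-code the patches with OMP.
dico = MiniBatchDictionaryLearning(
    n_components=32,
    transform_algorithm="omp",
    transform_n_nonzero_coefs=5,
    random_state=0,
)
codes = dico.fit_transform(X_centered)

# 3. Reconstruct patches from the sparse codes and reassemble the band.
recon_patches = (codes @ dico.components_ + mean).reshape(patches.shape)
recon_band = reconstruct_from_patches_2d(recon_patches, band.shape)

# The reconstruction approximates the original; in the thesis's pipeline
# such reconstructions are appended to the training set as new samples.
rel_err = float(np.linalg.norm(recon_band - band) / np.linalg.norm(band))
```

Repeating this per spectral layer (or with different dictionary learners) yields the multiple augmented training sets described above.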
Zandi, Rojin, "Sparse Coding for Data Augmentation of Hyperspectral Medical Images" (2021). Master's Theses. 5250.