
    A Transfer Learning and Two-Level Hyperparameter Optimization Based Model for Improved Classification of Diabetic Retinopathy

    Full Text Thesis (2.054Mb)
    Date
    2022-10
    Author
    Wambugu, Jackson K.
    Abstract
    Automated diagnosis of disease from medical images using machine learning has been on the rise in recent years. One such case is the classification of diabetic retinopathy from fundus images. Diabetic retinopathy is an eye disease caused by diabetes mellitus and is a major cause of blindness among people of working age. It has five main classes: No DR, Mild DR, Moderate DR, Severe DR, and Proliferative DR. Deep learning has been used in this field and has proved better than conventional machine learning approaches. However, deep learning involves training a model from scratch, which makes it data hungry, costly to train, poor at generalizing, and often unable to deliver high performance. Meta-learning, also known as learning-to-learn, is a field of machine learning that aims to improve deep learning by enabling models to improve their performance capabilities and reduce training cost. Meta-learning techniques include multi-task learning, transfer learning, self-optimization, and few-shot learning. Several transfer learning architectures pre-trained on the ImageNet dataset have been used by different researchers and have demonstrated superior performance over deep learning. However, domain-shift generalizability and optimal performance of pre-trained architectures remain major challenges for transfer learning, because these models are not properly tuned for cross-domain optimality. The aim of this study was to develop an improved model for classifying diabetic retinopathy into its five classes. To achieve this, the researcher used the following approach: a VGG16 network pre-trained on ImageNet was modified such that the top layer was rebuilt and an attention model was added. Two-level optimization was used during training, in which the model was allowed to self-tune its learning rate based on the training parameters.
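    The abstract does not give the exact layers of the rebuilt top, so as an illustrative sketch only (not the author's code), the following NumPy snippet shows the kind of spatial-attention pooling such a modification typically adds on top of VGG16's final 7×7×512 feature map: each spatial location gets a learned score, and the feature map is collapsed into a single vector by the resulting attention weights. The scoring weights `w` are a hypothetical stand-in for whatever attention parameters the thesis model learns.

    ```python
    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def attention_pool(features, w):
        """Collapse an (H, W, C) feature map to a C-vector via spatial attention.

        features: activations from the convolutional base.
        w:        (C,) scoring weights giving one attention logit per location.
        """
        h, wd, c = features.shape
        flat = features.reshape(h * wd, c)   # (HW, C)
        scores = flat @ w                    # (HW,) one logit per spatial cell
        alpha = softmax(scores)              # attention weights sum to 1
        return alpha @ flat                  # (C,) attention-weighted average

    rng = np.random.default_rng(0)
    feat = rng.normal(size=(7, 7, 512))  # VGG16 final feature map for 224x224 input
    vec = attention_pool(feat, rng.normal(size=512))
    print(vec.shape)  # prints (512,)
    ```

    In a Keras implementation, this pooled vector would replace the original flatten-and-dense top of VGG16 before the final five-way softmax over the DR grades.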
The EyePACS dataset, obtained from the Kaggle repository, was used to train, validate, and test the model. The model was developed on the Google Colaboratory platform using the Python programming language, TensorFlow, and Keras. The study achieved the following results: accuracy 89.06%, precision 88.9%, recall 89.2%, F1-score 75%, quadratic Cohen kappa 0.84, and area under the curve (AUC) 93.3%. These results demonstrate improved performance compared to other existing models in the literature that classify diabetic retinopathy into five classes, such as Qummar et al. (2019), Jinfeng et al. (2020), and Chilukoti et al. (2022). The study concluded that leveraging previously acquired knowledge and efficiently optimizing neural networks through data-driven self-optimization delivers better performance than conventional machine learning and deep learning. In future work, researchers can consider using reinforcement learning and transfer learning in the classification of diabetic retinopathy.
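    Among the reported metrics, the quadratic Cohen kappa (0.84) is the one tailored to ordinal grading: a prediction one grade off (e.g. Moderate vs. Severe) is penalized far less than one four grades off (No DR vs. Proliferative). The abstract does not say how it was computed, so the following is a minimal from-scratch sketch of the standard quadratic-weighted kappa formula, not the thesis's own evaluation code.

    ```python
    import numpy as np

    def quadratic_kappa(y_true, y_pred, n_classes=5):
        """Quadratic-weighted Cohen's kappa for ordinal grades 0..n_classes-1."""
        O = np.zeros((n_classes, n_classes))
        for t, p in zip(y_true, y_pred):
            O[t, p] += 1                         # observed confusion matrix
        # penalty grows with the squared distance between true and predicted grade
        w = np.array([[(i - j) ** 2 for j in range(n_classes)]
                      for i in range(n_classes)], dtype=float) / (n_classes - 1) ** 2
        E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()  # chance agreement
        return 1.0 - (w * O).sum() / (w * E).sum()

    print(quadratic_kappa([0, 1, 2, 3, 4], [0, 1, 2, 3, 4]))  # prints 1.0
    ```

    A score of 1.0 means perfect agreement, 0 means chance-level agreement; the same quantity is available as `sklearn.metrics.cohen_kappa_score(..., weights="quadratic")`.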
    URI
    http://hdl.handle.net/123456789/6379
    Collections
    • School of Computing and IT (MT) [5]

    MUT Library copyright © 2017-2024  MUT Library Website
    Contact Us | Send Feedback