Current Journal of Applied Science and Technology
https://journalcjast.com/index.php/CJAST
<p style="text-align: justify;"><strong>Current Journal of Applied Science and Technology (ISSN: 2457-1024)</strong> is dedicated to publish research papers, reviews, case studies and short communications from all disciplines of science and technology. By not excluding papers on the basis of subject area, CJAST facilitates the research and wishes to publish papers as long as they are technically correct and scientifically motivated. Subject areas cover, but not limited to, medicine, physics, chemistry, biology, environmental sciences, geology, engineering, agriculture, biotechnology, nanotechnology, arts, education, sociology and psychology, business and economics, finance, mathematics and statistics, computer science, social sciences, linguistics, architecture, industrial and all other science and engineering disciplines. By not excluding papers based on novelty, this journal facilitates the research and wishes to publish papers as long as they are technically correct and scientifically motivated. The journal also encourages the submission of useful reports of negative results. This is a quality controlled, OPEN peer-reviewed, open-access INTERNATIONAL journal.</p>SCIENCEDOMAIN internationalen-USCurrent Journal of Applied Science and Technology2457-1024Automated Diabetic Retinopathy Severity Classification Using Transfer Learning with DenseNet201
https://journalcjast.com/index.php/CJAST/article/view/4690
<p>Background: Diabetic Retinopathy (DR) remains one of the most common causes of preventable blindness worldwide, particularly among working-age adults between 20 and 64 years of age. Despite its prevalence, early diagnosis remains a significant clinical challenge because the disease frequently progresses without noticeable symptoms until substantial, often irreversible, damage to the retinal tissue has already occurred. Methods: An automated classification system was developed to identify the severity of Diabetic Retinopathy from fundus images using a convolutional neural network built on the DenseNet201 architecture with transfer learning from ImageNet pretrained weights. The system classifies retinal images into five distinct stages of severity: No DR, Mild Non-Proliferative DR, Moderate Non-Proliferative DR, Severe Non-Proliferative DR, and Proliferative DR. To address the substantial class imbalance present in the training dataset, image augmentation techniques including random horizontal flipping and random rotation were employed. The model was trained on Gaussian-filtered retinal images sourced from a publicly available Kaggle dataset containing 3,662 original samples, with a 75-25 stratified train-test split to preserve class proportions. The model was evaluated using multiple metrics, including accuracy, weighted precision, weighted F1 score, and a detailed confusion matrix analysis across all five classes. Results: The experimental results demonstrate that the proposed approach achieves 94% overall accuracy, a weighted precision of 0.949, and a weighted F1 score of 0.937 on the validation set containing 500 randomly sampled images. A thorough analysis of per-class performance is also presented, along with a comparison of results against other popular architectures reported in the literature.
The limitations of the current approach are discussed, and concrete directions for future improvement are proposed, including multi-model ensembling, attention mechanisms, and explainability through gradient-based visualization techniques. The complete implementation, including data preprocessing, model training, and validation scripts, is provided in a public GitHub repository to support full reproducibility of the reported results.</p>
Balachandar Jeganathan
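The weighted metrics reported in the abstract (weighted precision, weighted F1) aggregate per-class scores by class support. As a minimal sketch of how those numbers are derived from a five-class confusion matrix, the plain-Python example below uses purely illustrative counts (the `conf` matrix is hypothetical, not the paper's actual results); class names follow the five severity stages listed above:

```python
# Weighted precision / recall / F1 from a 5-class confusion matrix.
# Rows = true class, columns = predicted class. Counts are illustrative only.
CLASSES = ["No DR", "Mild NPDR", "Moderate NPDR", "Severe NPDR", "Proliferative DR"]

conf = [
    [240,   2,   3,   0,   0],
    [  4,  40,   5,   0,   1],
    [  2,   3, 120,   4,   1],
    [  0,   1,   5,  35,   2],
    [  0,   0,   2,   3,  27],
]

def weighted_scores(cm):
    """Return (weighted precision, weighted recall, weighted F1) for matrix cm."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    w_prec = w_rec = w_f1 = 0.0
    for k in range(n):
        support = sum(cm[k])                      # true samples of class k
        tp = cm[k][k]                             # correctly predicted class k
        pred_k = sum(cm[i][k] for i in range(n))  # all predictions of class k
        prec = tp / pred_k if pred_k else 0.0
        rec = tp / support if support else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        weight = support / total                  # weight by class support
        w_prec += weight * prec
        w_rec += weight * rec
        w_f1 += weight * f1
    return w_prec, w_rec, w_f1

accuracy = sum(conf[k][k] for k in range(len(conf))) / sum(sum(r) for r in conf)
p, r, f1 = weighted_scores(conf)
print(f"accuracy={accuracy:.3f} weighted_precision={p:.3f} weighted_F1={f1:.3f}")
```

Weighting by support means the majority class (No DR) dominates the averages, which is why per-class and confusion-matrix analysis, as the abstract notes, is still needed on an imbalanced dataset.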
Copyright (c) 2026 Author(s). The licensee is the journal publisher. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
2026-04-22 | 2026-04-22 | Vol. 45, Issue 5, pp. 1-18 | DOI: 10.9734/cjast/2026/v45i54690