Advanced Deep Learning Models for Cloud-Based Bug Tracking and Software Defect Prediction: Integrating Transformer
Pages: 619-627, DOI: https://doi.org/10.14741/ijmcr/v.12.6.4
Software defect prediction is crucial for maintaining software quality by detecting defective modules early in the development process. Conventional models suffer from high dimensionality, noisy features, and poorly chosen hyperparameters, resulting in lower accuracy. Even with the growth of machine learning and deep learning, accurate and efficient defect prediction remains a challenge. This research proposes an advanced deep learning model that combines Social Spider Optimization (SSO) with a Deep Neural Network (DNN) to address the feature selection and hyperparameter optimization problems. The proposed model is trained on the Kaggle Software Defect Prediction Dataset, which contains software metrics such as lines of code, cyclomatic complexity, and past bug reports. SSO performs feature selection and hyperparameter tuning, optimizing parameters such as the number of layers, the learning rate, and dropout rates; the DNN is then used to classify defects. The model achieved a classification accuracy of 94.4%, outperforming conventional models, and quantitative measures of Precision (98.6%), Recall (98%), and F1-score (98.4%) further confirm its effectiveness. Deployment testing on the cloud stage showed latency between 180 ms and 250 ms, throughput from 38 Mbps to 45 Mbps, and availability in excess of 99.7% over seven days. Scalability tests showed a linear increase in response time from 1.2 s to 3.0 s as the user count increased from 100 to 700. Resource optimization was also observed across training iterations, with CPU utilization decreasing from 65% to 50% and memory usage decreasing from 70% to 59%. These results demonstrate that integrating SSO and DNN yields an accurate, efficient, and scalable defect prediction model, well suited to real-time cloud-based applications.
Keywords: Software Defect Prediction, Deep Neural Network, Social Spider Optimization, Feature Selection, Hyperparameter Optimization
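The abstract describes SSO searching a hyperparameter space (number of layers, learning rate, dropout rate) before a DNN performs classification. The following is a minimal, self-contained sketch of that idea, not the paper's code: the surrogate objective, the bounds, and the simplified movement rule (every spider drifting toward the current best with random vibration noise, a stand-in for SSO's cooperative male/female operators) are all illustrative assumptions. In the actual pipeline the objective would be the validation error of a DNN trained on the selected features.

```python
# Illustrative social-spider-style hyperparameter search (assumed sketch,
# not the authors' implementation). Pure standard library for portability.
import random

random.seed(0)

# Assumed search space: (number of layers, learning rate, dropout rate).
BOUNDS = [(1, 6), (1e-4, 1e-1), (0.0, 0.5)]

def surrogate_loss(params):
    """Stand-in for DNN validation error: penalizes distance from a
    hypothetical optimum (3 layers, lr=0.01, dropout=0.2)."""
    layers, lr, drop = params
    return (layers - 3) ** 2 + (100 * (lr - 0.01)) ** 2 + (5 * (drop - 0.2)) ** 2

def clip(v, lo, hi):
    return max(lo, min(hi, v))

def sso_search(n_spiders=20, n_iters=50):
    # Initialize spider positions uniformly inside the bounds.
    spiders = [[random.uniform(lo, hi) for lo, hi in BOUNDS]
               for _ in range(n_spiders)]
    best = min(spiders, key=surrogate_loss)
    for _ in range(n_iters):
        for i, s in enumerate(spiders):
            # Simplified movement: drift toward the best-known solution
            # plus small Gaussian "vibration" noise, clipped to bounds.
            new = [clip(x + random.uniform(0.4, 0.9) * (b - x)
                        + random.gauss(0, 0.01 * (hi - lo)), lo, hi)
                   for x, b, (lo, hi) in zip(s, best, BOUNDS)]
            if surrogate_loss(new) < surrogate_loss(s):  # greedy acceptance
                spiders[i] = new
        best = min(spiders, key=surrogate_loss)
    layers, lr, drop = best
    return round(layers), lr, drop  # layer count must be an integer

if __name__ == "__main__":
    layers, lr, drop = sso_search()
    print("layers:", layers, "lr:", round(lr, 4), "dropout:", round(drop, 3))
```

The tuned configuration returned by `sso_search` would then parameterize the DNN classifier (depth, optimizer step size, and dropout regularization) that labels modules as defective or clean.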