Independent researcher.
World Journal of Advanced Engineering Technology and Sciences, 2025, 15(02), 877-884
Article DOI: 10.30574/wjaets.2025.15.2.0608
Received on 26 March 2025; revised on 03 May 2025; accepted on 05 May 2025
This paper explores the application of artificial intelligence (AI) and machine learning (ML) techniques to optimizing cloud resource allocation. The study investigates how AI-driven approaches can improve the efficiency of cloud computing systems through dynamic resource allocation. We present a comprehensive review of existing methodologies, propose novel algorithms, and conduct extensive experiments to validate the effectiveness of our approach. The results demonstrate significant improvements in resource utilization, cost reduction, and overall system performance compared with traditional static allocation methods.
Cloud Computing; Resource Allocation; Artificial Intelligence; Machine Learning; Deep Q-Network; LSTM; Genetic Algorithms; Dynamic Optimization; SLA Violations
Manoj Bhoyar. AI-driven cloud optimization: Leveraging machine learning for dynamic resource allocation. World Journal of Advanced Engineering Technology and Sciences, 2025, 15(02), 877-884. Article DOI: https://doi.org/10.30574/wjaets.2025.15.2.0608.