Open Access
Issue
E3S Web Conf.
Volume 616, 2025
2nd International Conference on Renewable Energy, Green Computing and Sustainable Development (ICREGCSD 2025)
Article Number 03038
Number of page(s) 17
Section Sustainable Development
DOI https://doi.org/10.1051/e3sconf/202561603038
Published online 24 February 2025
  1. Q. Yan, B. Zhang, and M. Kezunovic. “Optimization of electric • Q. Yan, B. Zhang, and M. Kezunovic, “Optimization of electric vehicle movement for efficient energy consumption,” in Proc. North American Power Symposium (NAPS), 2014, pp. 1–6, doi: 10.1109/NAPS.2014.6965467. [Google Scholar]
  2. J. Jiang et al., “Research on Global Optimization Algorithm of Integrated Energy and Thermal Management for Plug-In Hybrid Electric Vehicles,” Sensors, vol. 23, no. 16, pp. 7149, Aug. 2023, doi: 10.3390/s23167149. [CrossRef] [PubMed] [Google Scholar]
  3. M. Nour, J. P. C. Ávila, G. Magdy, and Á. S. Miralles, “Review of Positive and Negative Impacts of Electric Vehicles Charging on Electric Power Systems,” Energies, vol. 13, no. 18, pp. 4675, Sep. 2020, doi: 10.3390/en13184675. [CrossRef] [Google Scholar]
  4. B. Moulik and D. Soffker, “Battery Management System for Future Electric Vehicles,” Applied Sciences, vol. 10, no. 15, pp. 5095, Jul. 2020, doi: 10.3390/app10155095. [CrossRef] [Google Scholar]
  5. H. S. Hamut, Í. Dinner, and G. Naterer, “Analysis and optimization of hybrid electric vehicle thermal management systems,” J. Power Sources, vol. 250, pp. 167–178, Mar. 2014, doi: 10.1016/j.jpowsour.2013.08.131. [Google Scholar]
  6. S. Aslam, A. Altaweel, and A. B. Nassif, “Optimization Algorithms in Smart Grids: A Systematic Literature Review,” arXiv preprint, arXiv: 2301.07512, 2023, doi: 10.48550/arXiv.2301.07512. [Google Scholar]
  7. M. Rocha, R. Mendes, O. Rocha, I. Rocha, and E. C. Ferreira, “Optimization of fedbatch fermentation processes with bio-inspired algorithms,” Expert Syst. Appl., vol. 41, no. 5, pp. 2186–2195, Apr. 2014, doi: 10.1016/j.eswa.2013.09.017. [CrossRef] [Google Scholar]
  8. A. F. Mirza, Q. Ling, M. Y. Javed, and M. Mansoor, “Novel MPPT techniques for photovoltaic systems under uniform irradiance and partial shading,” Solar Energy, vol. 184, pp. 628–648, May 2019, doi: 10.1016/j.solener.2019.04.034. [CrossRef] [Google Scholar]
  9. A. M. Eltamaly, “An Improved Cuckoo Search Algorithm for Maximum Power Point Tracking of Photovoltaic Systems under Partial Shading Conditions,” Energies, vol. 14, no. 4, pp. 953, Feb. 2021, doi: 10.3390/en14040953. [CrossRef] [Google Scholar]
  10. N. A. Kamarzaman and C. W. Tan, “A comprehensive review of maximum power point tracking algorithms for photovoltaic systems,” Renewable and Sustainable Energy Reviews, vol. 37, pp. 585–598, Sep. 2014, doi: 10.1016/j.rser.2014.05.045. [CrossRef] [Google Scholar]
  11. I. Pervez, A. Pervez, M. Tariq, A. Sarwar, R. K. Chakrabortty, and M. J. Ryan, “Rapid and robust adaptive Jaya (Ajaya) based maximum power point tracking of a PV-based generation system,” IEEE Access, vol. 9, pp. 48679–48703, Oct. 2020, doi: 10.1109/ACCESS.2020.3028609. [Google Scholar]
  12. X. Guo, D. Meng, and J. Wang, “A Coupled Simulated Annealing and Particle Swarm Optimization Reliability-Based Design Optimization Strategy under Hybrid Uncertainties,” Mathematics, vol. 11, no. 23, pp. 4790, 2023, doi: 10.3390/math11234790. [CrossRef] [Google Scholar]
  13. J. Xiang, Y. Zhang, X. Cao, and Z. Zhou, “An Improved Multi-Objective Hybrid Genetic-Simulated Annealing Algorithm for AGV Scheduling under Composite Operation Mode,” Computers, Materials & Continua, vol. 77, no. 3, pp. 3443–3466, 2023, doi: 10.32604/cmc.2023.045120. [CrossRef] [Google Scholar]
  14. M. Joshi, M. Gyanchandani, and R. Wadhvani, “Analysis of Genetic Algorithm, Particle Swarm Optimization and Simulated Annealing on Benchmark Functions,” in 2021 Int. Conf, on Computer and Multimedia Communications (ICCMC), 2021, pp. 19–23, doi: 10.1109/ICCMC51019.2021.9418458. [Google Scholar]
  15. P. Trojovsky and M. Dehghani, “Subtraction-Average-Based Optimizer: A New Swarm-Inspired Metaheuristic Algorithm for Solving Optimization Problems,” Biomimetics, vol. 8, no. 2, p. 149, 2023, doi: 10.3390/biomimetics8020149. [CrossRef] [PubMed] [Google Scholar]
  16. B. Wang, Q. Jin, R. Zhao, and Y. Zhang, “A New Optimization Idea: Parallel SearchBased Golden Jackal Algorithm,” IEEE Access, vol. 11, pp. 13427–13438, 2023, doi: 10.1109/ACCESS.2023.3312684. [Google Scholar]
  17. X. Liu, Y. Bai, C. Yu, H. Yang, H. Gao, J. Wang, Q. Chang, and X. Wen, “Multi-Strategy Improved Sparrow Search Algorithm and Application,” Mathematics and Computers in Simulation, 2023, doi: 10.3390/mca27060096. [Google Scholar]
  18. X. Deng, X. Yu, C. Wang, and Y. Zhang, “LMGT: Optimizing Exploration-Exploitation Balance in Reinforcement Learning through Language Model Guided Trade-offs,” arXiv Preprint, 7 Sep 2024. [Online]. Available: https://arxiv.org/abs/2409.04744. [Google Scholar]
  19. D. Zha, X. Liu, and H. Yang, “Awesome Exploration Methods in Reinforcement Learning,” GitHub Repository, 8 Oct 2024. [Online]. Available: https://github.com/opendilab/awesome-exploration-rl. [Google Scholar]
  20. H. Zhang, J. Yang, T. Qin, Y. Fan, Z. Li, and W. Wei, “A Multi-Strategy Improved Sparrow Search Algorithm for Solving the Node Localization Problem in Heterogeneous Wireless Sensor Networks,” Applied Sciences, vol. 12, no. 10, p. 5080, 2022, doi: 10.3390/app12105080. [CrossRef] [Google Scholar]
  21. T. Liu, H. Yang, Y. Bai, and Z. Wang, “Toward Understanding Why Adam Converges Faster Than SGD for Transformers,” arXiv Preprint, 7 Sep 2024. [Online]. Available: https://arxiv.labs.arxiv.org/html/2306.00204. [Google Scholar]
  22. B. S. Obbu and Z. Jabeen, “Study of convergence speed of chaotic particle swarm optimization algorithm,” Pollack Periodica, vol. 19, no. 1, pp. 157–163, 2023, doi: 10.1556/606.2023.00933. [Online]. Available: https://doi.org/10.1556/606.2023.00933. [Google Scholar]
  23. T. Shaqarin and B. R. Noack, “A Fast-Converging Particle Swarm Optimization through Targeted, Position-Mutated, Elitism (PSO-TPME),” Int. J. Comput. Intell. Syst., vol. 16, no. 6, 2023, doi: 10.1007/s44196-023-00183-z. [CrossRef] [Google Scholar]
  24. D. Bertsimas and G. Margaritis, “Global Optimization: A Machine Learning Approach,” arXiv Preprint, 2023. [Online]. Available: https://paperswithcode.com/paper/global-optimization-a-machine-learning. [Google Scholar]
  25. D. Bertsimas and B. Öztürk, “Global Optimization via Optimal Decision Trees,” J. Global Optimization, vol. 87, no. 2, pp. 253–279, 2023, doi: 10.1007/s10898-023-01311-x. [Google Scholar]
  26. X. Liu, Y. Wang, H. Yang, and Z. Zhang, “IOOA: A Multi-Strategy Fusion Improved Osprey Optimization Algorithm,” Mathematics, vol. 11, no. 20, p. 4793, 2023, doi: 10.3934/era.2024093. [CrossRef] [Google Scholar]
  27. J. Liang, X. Ban, K. Yu, B. Qu, K. Qiao, and C. Yue, “A Survey on Evolutionary Constrained Multiobjective Optimization,” IEEE Transactions on Evolutionary Computation, vol. 27, no. 2, pp. 201–221, Apr. 2023, doi: 10.1109/TEVC.2022.3155533. [CrossRef] [Google Scholar]
  28. F. Ming, W. Gong, L. Wang, and L. Gao, “A Constraint-Handling Technique for Decomposition-Based Constrained Many-Objective Evolutionary Algorithms,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 53, no. 12, pp. 7783–7793, Dec. 2023, doi: 10.1109/TSMC.2023.3299570. [CrossRef] [Google Scholar]
  29. C. Fan, J. Wang, L. T. Yang, L. Xiao, and Z. Ai, “Efficient Constrained Large-Scale Multi-Objective Optimization Based on Reference Vector-Guided Evolutionary Algorithm,” Applied Intelligence, vol. 53, no. 18, pp. 21027–21049, May 2023, doi: 10.1007/s10489-023-04663-9. [CrossRef] [Google Scholar]
  30. H. Zhai, Q. Hu, and J. Chen, “Hybrid Reinforcement Learning Framework for MixedVariable Problems,” arXiv Preprint, 30 May 2024, doi: 10.48550/arXiv.2405.20500. [Google Scholar]
  31. L. Brevault and M. Balesdent, “Bayesian Quality-Diversity Approaches for Constrained Optimization Problems with Mixed Continuous, Discrete, and Categorical Variables,” Engineering Applications of Artificial Intelligence, vol. 113, p. 108118, 2024, doi: 10.1016/j.engappai.2024.108118. [CrossRef] [Google Scholar]
  32. H. M. Sheikh and P. S. Marcus, “Bayesian Optimization for Multi-Objective MixedVariable Problems,” Structural and Multidisciplinary Optimization, vol. 65, no. 4, pp. 1229–1241, 2022, doi: 10.1007/s00158-022-03382-y. [CrossRef] [Google Scholar]
  33. J. Wu, Z. Zhang, and Y. Wang, “Efficient Constrained Multi-objective Evolutionary Algorithm for Large-scale Problems,” IEEE Transactions on Cybernetics, vol. 54, no. 4, pp. 2200–2212, 2023, doi: 10.1109/TCYB.2023.3247890. [Google Scholar]
  34. J. Liang, X. Ban, K. Yu, B. Qu, K. Qiao, and C. Yue, “A Survey on Evolutionary Constrained Multiobjective Optimization,” IEEE Transactions on Evolutionary Computation, vol. 27, no. 2, pp. 201–221, Apr. 2023, doi: 10.1109/TEVC.2022.3155533. [CrossRef] [Google Scholar]
  35. F. Ming, W. Gong, L. Wang, and L. Gao, “A Constraint-Handling Technique for Decomposition-Based Constrained Many-Objective Evolutionary Algorithms,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 53, no. 12, pp. 7783–7793, Dec. 2023, doi: 10.1109/TSMC.2023.3299570. [CrossRef] [Google Scholar]
  36. A. Greco, S. D. Riccio, J. Timmis, and G. Nicosia, “Assessing Algorithm Parameter Importance Using Global Sensitivity Analysis,” in Analysis of Experimental Algorithms, vol. 11544, Springer, 2019, pp. 392–407, doi: 10.1007/978-3-030-34029-2_26. [CrossRef] [Google Scholar]
  37. F. Pinel, G. Danoy, and P. Bouvry, “Evolutionary Algorithm Parameter Tuning with Sensitivity Analysis,” in Proceedings of the 2011 IEEE Congress on Evolutionary Computation, New Orleans, LA, USA, 2011, pp. 1–8, doi: 10.1109/CEC.2011.5949700. [Google Scholar]
  38. W. Zhou and G. Luo, “Parameter Sensitivity Analysis for the Progressive SamplingBased Bayesian Optimization Method for Automated Machine Learning Model Selection,” in Heterogeneous Data Management, Polystores, and Analytics for Healthcare, V. Gadepally et al., Eds. Cham: Springer, 2021, vol. 12633, pp. 203–217, doi: 10.1007/978-3-030-71055-2_17. [Google Scholar]
  39. M. Blanchard, J. Zhang, and P. Jaillet, “Memory-Constrained Algorithms for Convex Optimization via Recursive Cutting-Planes,” Advances in Neural Information Processing Systems, vol. 36, 2023, doi: 10.48550/arXiv.2306.10096. [Google Scholar]
  40. K. Lv, H. Yan, Q. Guo, H. Lv, and X. Qiu, “AdaLomo: Low-memory Optimization with Adaptive Learning Rate,” arXiv Preprint, 16 Oct 2023, doi: 10.48550/arXiv.2310.10195. [Google Scholar]
  41. B. Lewandowski and A. Kosson, “Memory Efficient Mixed-Precision Optimizers,” arXiv Preprint, 21 Sep 2023, doi: 10.48550/arXiv.2309.12381. [Google Scholar]
  42. C. Tadonki, “OpenMP Parallelization of Dynamic Programming and Greedy Algorithms,” arXiv Preprint, 20 Jan 2020, doi: 10.48550/arXiv.2001.07103. [Google Scholar]
  43. B. A. de Melo Menezes, H. Kuchen, and F. B. de Lima Neto, “Parallelization of Swarm Intelligence Algorithms: Literature Review,” International Journal of Parallel Programming, vol. 50, pp. 486–514, 2022, doi: 10.1007/s10766-022-00736-3. [CrossRef] [Google Scholar]
  44. S. Tavares, C. P. Bras, A. L. Custodio, V. Duarte, and P. Medeiros, “Parallel Strategies for Direct Multisearch,” arXiv Preprint, 6 May 2021, doi: 10.48550/arXiv.2105.03000. [Google Scholar]
  45. M. Shakeri, E. Miahi, A. Gupta, and Y.-S. Ong, “Scalable Transfer Evolutionary Optimization: Coping with Big Task Instances,” IEEE Transactions on Cybernetics, vol. 53, no. 8, pp. 3200–3211, 2023, doi: 10.1109/TCYB.2022.3164399. [Google Scholar]
  46. D. M. Chitty, E. Wanner, R. Parmar, and P. R. Lewis, “Can Bio-Inspired Swarm Algorithms Scale to Modern Societal Problems,” arXiv Preprint, 20 May 2019, doi: 10.48550/arXiv.1905.08126. [Google Scholar]
  47. N. A. Ezhova and L. B. Sokolinsky, “Scalability Evaluation of Iterative Algorithms Used for Supercomputer Simulation of Physical Processes,” in Proceedings of the Global Simulation and Intelligent Computing Conference (GloSIC), 2018, pp. 1–5, doi: 10.1109/GloSIC.2018.8570107. [Google Scholar]
  48. F. Pinel, G. Danoy, and P. Bouvry, “Evolutionary Algorithm Parameter Tuning with Sensitivity Analysis,” in Security and Intelligent Information Systems, P. Bouvry et al., Eds. Berlin, Heidelberg: Springer, 2012, vol. 7053, pp. 241–254, doi: 10.1007/978-3-642-25261-7_16. [Google Scholar]
  49. A. Greco, S. D. Riccio, J. Timmis, and G. Nicosia, “Assessing Algorithm Parameter Importance Using Global Sensitivity Analysis,” in Analysis of Experimental Algorithms, vol. 11544, Springer, 2019, pp. 392–407, doi: 10.1007/978-3-030-34029-2_26. [CrossRef] [Google Scholar]
  50. M. F. Briones-Baez, L. Aguilera-Vazquez, N. Rangel-Valdez, C. Zuniga, A. L. Martinez-Salazar, and C. Gomez-Santillan, “Pitfalls in Metaheuristics Solving Stoichiometric-Based Optimization Models for Metabolic Networks,” Algorithms, vol. 17, no. 8, p. 336, 2024, doi: 10.3390/a17080336. [CrossRef] [Google Scholar]
  51. R. Egele, T. Chang, Y. Sun, V. Vishwanath, and P. Balaprakash, “Parallel MultiObjective Hyperparameter Optimization with Uniform Normalization and Bounded Objectives,” arXiv Preprint, 26 Sep 2023, doi: 10.48550/arXiv.2309.14936. [Google Scholar]
  52. T. Huang and K. Li, “Direct Preference-Based Evolutionary Multi-Objective Optimization with Dueling Bandit,” arXiv Preprint, 23 Nov 2023, doi: 10.48550/arXiv.2311.14003. [Google Scholar]
  53. A. Greco, S. D. Riccio, J. Timmis, and G. Nicosia, “Assessing Algorithm Parameter Importance Using Global Sensitivity Analysis,” in Analysis of Experimental Algorithms, vol. 11544, Springer, 2019, pp. 392–407, doi: 10.1007/978-3-030-34029-2_26. [CrossRef] [Google Scholar]
  54. F. Pinel, G. Danoy, and P. Bouvry, “Evolutionary Algorithm Parameter Tuning with Sensitivity Analysis,” in Proceedings of the 2011 IEEE Congress on Evolutionary Computation, New Orleans, LA, USA, 2011, pp. 1–8, doi: 10.1109/CEC.2011.5949700. [Google Scholar]
  55. W. Zhou and G. Luo, “Parameter Sensitivity Analysis for the Progressive SamplingBased Bayesian Optimization Method for Automated Machine Learning Model Selection,” in Heterogeneous Data Management, Polystores, and Analytics for Healthcare, V. Gadepally et al., Eds. Cham: Springer, 2021, vol. 12633, pp. 203–217, doi: 10.1007/978-3-030-71055-2_17. [Google Scholar]
  56. Y. Li, M. R. Luyten, and M. van der Schaar, “Risk-Sensitive Diffusion for Perturbation- Robust Optimization,” arXiv Preprint, 3 Feb 2024, doi: 10.48550/arXiv.2402.02081. [Google Scholar]
  57. F. Farokhi, “Distributionally-Robust Optimization with Noisy Data for Discrete Uncertainties Using Total Variation Distance,” IEEE Control Systems Letters, vol. 7, pp. 1146–1151, 2023, doi: 10.1109/LCSYS.2023.3271434. [Google Scholar]
  58. E. Dedeoglu, H. T. Kesgin, and M. F. Amasyali, “A Robust Optimization Method for Label Noisy Datasets Based on Adaptive Threshold: Adaptive-k,” arXiv Preprint, 26 Mar 2022, doi: 10.48550/arXiv.2203.14165. [Google Scholar]
  59. Y. Liu and S. D. Tajbakhsh, “Stochastic Optimization Algorithms for Problems with Controllable Biased Oracles,” arXiv Preprint, 13 Jun 2023, doi: 10.48550/arXiv.2306.07810. [Google Scholar]
  60. A. Cutkosky, H. Mehta, and F. Orabona, “Optimal Stochastic Non-smooth Non-convex Optimization through Online-to-Non-convex Conversion,” arXiv Preprint, 7 Feb 2023, doi: 10.48550/arXiv.2302.03775. [Google Scholar]
  61. Y. Rychener, D. Kuhn, and T. Sutter, “End-to-End Learning for Stochastic Optimization: A Bayesian Perspective,” arXiv Preprint, 7 Jun 2023, doi: 10.48550/arXiv.2306.04174. [Google Scholar]
  62. M. Psarakis, A. Dounis, A. Almabrok, S. Stavrinidis, and G. Gkekas, “An FPGA-Based Accelerated Optimization Algorithm for Real-Time Applications,” Journal of Signal Processing Systems, vol. 92, pp. 1155–1176, 2020, doi: 10.1007/s11265-020-01522-5. [CrossRef] [Google Scholar]
  63. J. Liang, X. Ban, K. Yu, B. Qu, K. Qiao, and C. Yue, “A Survey on Evolutionary Constrained Multiobjective Optimization,” IEEE Transactions on Evolutionary Computation, vol. 27, no. 2, pp. 201–221, Apr. 2023, doi: 10.1109/TEVC.2022.3155533. [CrossRef] [Google Scholar]
  64. F. Ming, W. Gong, L. Wang, and L. Gao, “A Constraint-Handling Technique for Decomposition-Based Constrained Many-Objective Evolutionary Algorithms,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 53, no. 12, pp. 7783–7793, Dec. 2023, doi: 10.1109/TSMC.2023.3299570. [CrossRef] [Google Scholar]
  65. T. Friedrich, T. Kotzing. F. Neumann, and A. Radhakrishnan, “Theoretical Study of Optimizing Rugged Landscapes with the cGA,” in Parallel Problem Solving from Nature - PPSN XVII, G. Rudolph et al., Eds. Cham: Springer, 2022, vol. 13399, pp. 41–55, doi: 10.1007/978-3-031-14721-0_41. [Google Scholar]
  66. R. Monteiro and K. Sau, “Landscape-Sketch-Step: An AI/ML-Based Metaheuristic for Surrogate Optimization Problems,” arXiv Preprint, 14 Sep 2023, doi: 10.48550/arXiv.2309.07936. [Google Scholar]
  67. J. Li, A. P. P. Abdul Majeed, and P. Lefevre, “Halfway Escape Optimization: A Quantum-Inspired Solution for Complex Optimization Problems,” arXiv Preprint, 5 May 2024, doi: 10.48550/arXiv.2405.02850. [Google Scholar]
  68. M. Joshi, M. Gyanchandani, and R. Wadhvani, “Analysis of Genetic Algorithm, Particle Swarm Optimization, and Simulated Annealing on Benchmark Functions,” in Proceedings of the 2021 5th International Conference on Computing Methodologies and Communication (ICCMC), Coimbatore, India, 2021, pp. 1152–1157, doi: 10.1109/ICCMC51019.2021.9418458. [CrossRef] [Google Scholar]
  69. B. Wang, Q. Jin, R. Zhao, and Y. Zhang, “A New Optimization Idea: Parallel SearchBased Golden Jackal Algorithm,” IEEE Access, vol. 11, pp. 12345–12356, 2023, doi: 10.1109/ACCESS.2023.3312684. [Google Scholar]
  70. H. Zhang, J. Yang, T. Qin, Y. Fan, Z. Li, and W. Wei, “A Multi-Strategy Improved Sparrow Search Algorithm for Solving the Node Localization Problem in Heterogeneous Wireless Sensor Networks,” Applied Sciences, vol. 12, no. 10, p. 5080, 2022, doi: 10.3390/app12105080. [CrossRef] [Google Scholar]

Current usage metrics show cumulative count of Article Views (full-text article views including HTML views, PDF and ePub downloads, according to the available data) and Abstracts Views on Vision4Press platform.

Data correspond to usage on the plateform after 2015. The current usage metrics is available 48-96 hours after online publication and is updated daily on week days.

Initial download of the metrics may take a while.