Open Access
E3S Web Conf., Volume 499 (2024)
The 1st Trunojoyo Madura International Conference (1st TMIC 2023)
Article Number: 01017
Number of pages: 8
Section: Dense Matter
DOI: https://doi.org/10.1051/e3sconf/202449901017
Published online: 06 March 2024
  1. R. Benzer, Population dynamics forecasting using artificial neural networks, Fresenius Environmental Bulletin, 24, 2, (2015). [Google Scholar]
  2. V. Riiman, A. Wilson, P. Pirkelbauer, Comparing Artificial Neural Network and Cohort-Component Models for Population Forecasts, Population Review, (2019). [Google Scholar]
  3. N. Widyas, T. S. M. Widi, E. Baliarti, Predicting Madura cattle growth curve using non-linear model, IOP Conf. Ser.: Earth Environ. Sci., 142, 012006, (2018). [CrossRef] [Google Scholar]
  4. U. Paputungan, M. J. Hendrik, W. Utiah, Predicting live weight of Indonesian Local-Bali cattle using body volume formula, Livestock Research for Rural Development, 30, (2018). [Google Scholar]
  5. P. Alkhairi, E. R. Batubara, R. Rosnelly, W. Wanayaumini, and H. S. Tambunan, Effect of Gradient Descent With Momentum Backpropagation Training Function in Detecting Alphabet Letters, Sinkron: Jurnal Penelitian Teknik Informatika, 8, pp. 574–583, (2023). [Google Scholar]
  6. T. Akiba, S. Sano, T. Yanase, T. Ohta, M. Koyama, Optuna: A next-generation hyperparameter optimization framework, in Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, (2019). [Google Scholar]
  7. F. R. Ramadani, I. Permana, M. Afdal, and S. Monalisa, Model for Estimating Waste Generation in Pekanbaru Using Backpropagation Algorithm, J. Informatics Telecommun. Eng., 7, pp. 317–327, (2023). [CrossRef] [Google Scholar]
  8. M. G. M. Abdolrasol et al., Artificial neural networks based optimization techniques: A review, Electronics, 10, 21, (2021). https://doi.org/10.3390/electronics10212689 [Google Scholar]
  9. K. M. R. Alam, N. Siddique, and H. Adeli, A dynamic ensemble learning algorithm for neural networks, Neural Comput. Appl., 32, 12, pp. 8675–8690, (2020). [Google Scholar]
  10. A. Zheng, Chapter 4: Hyperparameter tuning, in Evaluating Machine Learning Models, O’Reilly Media, Inc., USA, (2015). [Google Scholar]
  11. M. Feurer and F. Hutter, Hyperparameter optimization, in Automated Machine Learning: Methods, Systems, Challenges, Springer, pp. 3–33, (2019). [Google Scholar]
  12. Y. A. Du, Research on the Route Pricing Optimization Model of the Car-Free Carrier Platform Based on the BP Neural Network Algorithm, Complexity, (2021). [Google Scholar]
  13. K. Adamczyk, D. Zaborski, W. Grzesiak, J. Makulska, W. Jagusiak, Recognition of culling reasons in Polish dairy cows using data mining methods, Comput. Electron. Agric., 26–27, (2016). [CrossRef] [Google Scholar]
  14. D. H. Lee, S.-H. Lee, B.-K. Cho, C. Wakholi, Y. W. Seo, S.-H. Cho, T.-H. Kang, W.-H. Lee, Estimation of carcass weight of Hanwoo (Korean Native Cattle) as a function of body measurements using statistical models and a neural network, Asian-Australas. J. Anim. Sci., 33, (2020). [Google Scholar]
  15. L. Krpálková, V. E. Cabrera, J. Kvapilík, J. Burdych, P. Crump, Associations between age at first calving, rearing average daily weight gain, herd milk yield and dairy herd production, reproduction, and profitability, J. Dairy Sci., 97, (2014). [Google Scholar]
  16. J. Bergstra, Y. Bengio, Random search for hyper-parameter optimization, J. Mach. Learn. Res., 13, pp. 281–305, (2012). [Google Scholar]
  17. N. Widyas, S. Prastowo, T. S. M. Widi, and E. Baliarti, Predicting Madura cattle growth curve using non-linear model, IOP Conf. Ser.: Earth Environ. Sci., 142, (2018). [Google Scholar]
  18. J. Choi, D. Kim, M. Ko, D. Lee, K. Wi, H. Lee, Compressive strength prediction of ternary-blended concrete using deep neural network with tuned hyperparameters, Journal of Building Engineering, 75, (2023). [Google Scholar]
  19. M. Jin, Q. Liao, S. Patil, A. Abdulraheem, D. Al-Shehri, G. Glatz, Hyperparameter Tuning of Artificial Neural Networks for Well Production Estimation Considering the Uncertainty in Initialized Parameters, ACS Omega, 7, pp. 24145–24156, (2022). [CrossRef] [PubMed] [Google Scholar]
  20. M. Ahuja, D. P. Mishra, D. Mohanty, H. Agrawal, S. Roy, Development of Empirical and Artificial Neural Network Model for the Prediction of Sorption Time to Assess the Potential of CO2 Sequestration in Coal, ACS Omega, 8, 34, pp. 31480–31492, (2023). [CrossRef] [PubMed] [Google Scholar]
  21. Z. S. Kadhim, H. S. Abdullah, K. I. Ghathwan, Artificial Neural Network Hyperparameters Optimization: A Survey, International Journal of Online and Biomedical Engineering (iJOE), 18, 15, pp. 59–87, (2022). [Google Scholar]
  22. S. Bansal and A. Kumar, Automatic Deep Neural Network Hyper-Parameter Optimization for Maize Disease Detection, IOP Conf. Ser.: Materials Science and Engineering, 1022, 012089, (2021). [CrossRef] [Google Scholar]
  23. L. Yang, A. Shami, On hyperparameter optimization of machine learning algorithms: Theory and practice, Neurocomputing, 415, pp. 295–316, (2020). [CrossRef] [Google Scholar]
  24. A. Esmaeili, Z. Ghorrati, E. T. Matson, Agent-Based Collaborative Random Search for Hyperparameter Tuning and Global Function Optimization, Systems, 11, 228, (2023). [CrossRef] [Google Scholar]
  25. F. F. Firdaus, H. A. Nugroho, and I. Soesanti, Deep Neural Network with Hyperparameter Tuning for Detection of Heart Disease, in 2021 IEEE Asia Pacific Conference on Wireless and Mobile (APWiMob), Bandung, Indonesia, pp. 59–65, (2021), doi: 10.1109/APWiMob51111.2021.9435250. [Google Scholar]
  26. I. Jamaleddyn, R. El Ayachi, M. Biniz, An improved approach to Arabic news classification based on hyperparameter tuning of machine learning algorithms, Journal of Engineering Research, 11, 2, (2023). [Google Scholar]
  27. L. Wen, X. Ye, L. Gao, A new automatic machine learning based hyperparameter optimization for workpiece quality prediction, Measurement and Control, 53, 7, pp. 1088–1098, (2020). [CrossRef] [Google Scholar]
  28. S. Karmakar, G. Shrivastava, M. K. Kowar, Impact of learning rate and momentum factor in the performance of back-propagation neural network to identify internal dynamics of chaotic motion, Kuwait Journal of Science (KJS), 41, 2, (2014). [Google Scholar]
  29. K. M. R. Alam, N. Siddique, and H. Adeli, A dynamic ensemble learning algorithm for neural networks, Neural Comput. Appl., 32, 12, pp. 8675–8690, (2020). [CrossRef] [Google Scholar]
  30. J. Bergstra, R. Bardenet, Y. Bengio, and B. Kégl, Algorithms for hyper-parameter optimization, in Advances in Neural Information Processing Systems 24 (NIPS 2011), pp. 1–9, (2011). [Google Scholar]
  31. B. Raharjo, N. Farida, P. Subekti, R. H. S. Siburian, P. D. H. Ardana, and R. Rahim, Optimization Forecasting Using Back-Propagation Algorithm, J. Appl. Eng. Sci., 19, pp. 1083–1089, (2021). [CrossRef] [Google Scholar]
  32. G. I. Diaz, A. Fokoue-Nkoutche, G. Nannicini, and H. Samulowitz, An effective algorithm for hyperparameter optimization of neural networks, IBM J. Res. Dev., 61, 4, (2017). [Google Scholar]
  33. P. Liashchynskyi and P. Liashchynskyi, Grid search, random search, genetic algorithm: A big comparison for NAS, pp. 1–11, (2019). [Online]. Available: http://arxiv.org/abs/1912.06059. [Google Scholar]
