Open Access
E3S Web Conf.
Volume 267, 2021
7th International Conference on Energy Science and Chemical Engineering (ICESCE 2021)
Article Number 02059
Number of page(s) 7
Section Environmental Chemistry Research and Chemical Preparation Process
DOI https://doi.org/10.1051/e3sconf/202126702059
Published online 04 June 2021
  1. Ulissi, Z. W., Medford, A. J., Bligaard, T., Nørskov, J. K. To address surface reaction network complexity using scaling relations machine learning and DFT calculations. Nature communications, 8, 1, 1–7. (2017).
  2. Faber, F. A., Hutchison, L., Huang, B., et al. Prediction errors of molecular machine learning models lower than hybrid DFT error. Journal of chemical theory and computation, 13, 11, 5255–5264. (2017).
  3. Shen, L., & Yang, W. Molecular dynamics simulations with quantum mechanics/molecular mechanics and adaptive neural networks. Journal of chemical theory and computation, 14, 3, 1442–1455. (2018).
  4. Rosenbrock, C. W., Homer, E. R., Csányi, G., Hart, G. L. Discovering the building blocks of atomic systems using machine learning: application to grain boundaries. npj Computational Materials, 3, 1, 1–7. (2017).
  5. Behler, J. Representing potential energy surfaces by high-dimensional neural network potentials. Journal of Physics: Condensed Matter, 26, 18, 183001. (2014).
  6. Wong, S. Y., Bund, R. K., Connelly, R. K., et al. Modeling the crystallization kinetic rates of lactose via artificial neural network. Crystal growth & design, 10, 6, 2620–2628. (2010).
  7. Bartók, A. P., Csányi, G. Gaussian approximation potentials: A brief tutorial introduction. International Journal of Quantum Chemistry, 115, 16, 1051–1057. (2015).
  8. Bartók, A. P., Payne, M. C., Kondor, R., et al. Gaussian approximation potentials: The accuracy of quantum mechanics, without the electrons. Physical review letters, 104, 13, 136403. (2010).
  9. Hansen, K., Biegler, F., Ramakrishnan, R., Pronobis, W., et al. Machine learning predictions of molecular properties: Accurate many-body potentials and nonlocality in chemical space. The journal of physical chemistry letters, 6, 12, 2326–2331. (2015).
  10. Schütt, K. T., Arbabzadah, F., Chmiela, S., et al. Quantum-chemical insights from deep tensor neural networks. Nature communications, 8, 1, 1–8. (2017).
  11. Yao, K., Herr, J. E., Toth, D. W., et al. The TensorMol-0.1 model chemistry: a neural network augmented with long-range physics. Chemical science, 9, 8, 2261–2269. (2018).
  12. Huang, S. D., Shang, C., Kang, P. L., et al. Atomic structure of boron resolved using machine learning and global sampling. Chemical science, 9, 46, 8644–8655. (2018).
  13. Gómez-Bombarelli, R., Wei, J. N., Duvenaud, D., et al. Automatic chemical design using a data-driven continuous representation of molecules. ACS central science, 4, 2, 268–276. (2018).
  14. Ramprasad, R., Batra, R., Pilania, G., et al. Machine learning in materials informatics: recent applications and prospects. npj Computational Materials, 3, 1, 1–13. (2017).
  15. Shapeev, A. V. Applications of machine learning for representing interatomic interactions. In Computational Materials Discovery. Royal Society of Chemistry. (2018).
  16. Behler, J. Perspective: Machine learning potentials for atomistic simulations. The Journal of chemical physics, 145, 17, 170901. (2016).
  17. Xie, T., Grossman, J. C. Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties. Physical review letters, 120, 14, 145301. (2018).
  18. Khorshidi, A., Peterson, A. A. Amp: A modular approach to machine learning in atomistic simulations. Computer Physics Communications, 207, 310–324. (2016).
  19. Bartók, A. P., Kondor, R., Csányi, G. On representing chemical environments. Physical Review B, 87, 18, 184115. (2013).
  20. Imbalzano, G., Anelli, A., Giofré, D., et al. Automatic selection of atomic fingerprints and reference configurations for machine-learning potentials. The Journal of chemical physics, 148, 24, 241730. (2018).
  21. Zhang, K., Yin, L., Liu, G. Physically inspired atom-centered symmetry functions for the construction of high dimensional neural network potential energy surfaces. Computational Materials Science, 186, 110071. (2021).
  22. Huo, H., Rupp, M. Unified representation of molecules and crystals for machine learning. arXiv preprint arXiv:1704.06439. (2017).
  23. Kim, H., Park, J. Y., Choi, S. Energy refinement and analysis of structures in the QM9 database via a highly accurate quantum chemical method. Scientific data, 6, 1, 1–8. (2019).
  24. Zeledon, J. A. H., Romero, A. H., Ren, P., et al. The structural information filtered features (SIFF) potential: Maximizing information stored in machine-learning descriptors for materials prediction. Journal of Applied Physics, 127, 21, 215108. (2020).
