Open Access

E3S Web Conf., Volume 420, 2023
EBWFF 2023 - International Scientific Conference Ecological and Biological Well-Being of Flora and Fauna (Part 1)

| | |
|---|---|
| Article Number | 10040 |
| Number of page(s) | 9 |
| Section | Environmental Education and Sustainable Tourism |
| DOI | https://doi.org/10.1051/e3sconf/202342010040 |
| Published online | 04 September 2023 |
- Z. Al-Halah, J. S. Jang, Emotion recognition from speech using deep learning. Neural Networks, 118, 211-223 (2019)
- B. Schuller, Speech emotion recognition: The need for benchmarking generalization. Journal of the Acoustical Society of America, 143(1), EL475-EL481 (2018)
- S. Kim, J. Andrea, Emotion recognition from speech using machine learning approaches: A review. IEEE Access, 6, 14728-14739 (2018)
- K. Han, D. Kim, Emotion recognition based on audio-visual data using deep learning: A review. Sensors, 20(18), 5207 (2020)
- F. Eyben, K. R. Scherer, B. W. Schuller, J. Sundberg, E. André, C. Busso, A. Batliner, The Geneva minimalistic acoustic parameter set (GeMAPS) for voice research and affective computing. IEEE Transactions on Affective Computing, 7(2), 190-202 (2015)
- N. L. Ko, Y. Suh, H. G. Lee, H. G. Kim, Emotion recognition in the wild using transfer learning from face and audio. Information Sciences, 568, 401-417 (2021)
- A. Mollahosseini, D. Chan, M. H. Mahoor, Going deeper in facial expression recognition using deep neural networks. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 189-197 (2017)
- B. W. Schuller, Speech emotion recognition: Two decades in a nutshell, benchmarks, and ongoing trends. Communications of the ACM, 61(5), 90-99 (2018)
- Y. Baveye, M. Goudbeek, B. Schuller, The acoustic correlates of emotions: A review of methods and challenges. IEEE Transactions on Affective Computing, 12(6), 1189-1210 (2019)
- C. Busso, M. Bulut, C. C. Lee, A. Kazemzadeh, E. Mower, S. Kim, … & S. Narayanan, IEMOCAP: Interactive emotional dyadic motion capture database. Language Resources and Evaluation, 42(4), 335-359 (2008)