Bibliometric Analysis of the Scientific Production of Deep Learning and Big Data

Wagner Vicente-Ramos, Amanda Durán-Carhuamaca

Research output: Contribution to journal › Original Article › peer-review

Abstract

This article presents a bibliometric analysis of the worldwide scientific production on deep learning and big data. Using the R package and the associated biblioshiny application, the study analyzed 456 research articles published in Scopus between 2003 and 2023, applying performance analysis, keyword analysis, and thematic analysis. China is the country with the highest production (536 publications), followed by India (260 publications); likewise, most of these collaborations flow from China to the United States, Hong Kong, Sweden, Australia, Pakistan, Saudi Arabia, and other countries. The rapid growth of the keywords deep learning, big data, learning systems, and data analytics demonstrates the interest of researchers, industry professionals, governments, investors, and other key players in optimizing information processing with artificial intelligence capabilities. Finally, the thematic analysis shows that predictive improvements through big data will be applied to traffic management, medical care, and the forecasting of economic trends. As future work, Data Masking should be considered as a security measure, along with the incorporation of multi-cloud architectures, Data Fabric, and Data Mesh to manage and improve the exchange of data from different sources.
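
The workflow described in the abstract can be sketched with the bibliometrix R package, which provides the biblioshiny interface mentioned above. The sketch below is illustrative, not the authors' exact script; the Scopus export file name "scopus.bib" and the analysis parameters are assumptions.

    # Minimal sketch of a bibliometrix workflow (assumes a Scopus BibTeX export "scopus.bib")
    library(bibliometrix)

    # Import the Scopus export into a bibliographic data frame
    M <- convert2df(file = "scopus.bib", dbsource = "scopus", format = "bibtex")

    # Performance analysis: annual production, most productive countries, sources, authors
    results <- biblioAnalysis(M, sep = ";")
    summary(results, k = 10)

    # Thematic analysis based on Keywords Plus (field = "ID"); parameters are illustrative
    thematicMap(M, field = "ID", n = 250, minfreq = 5)

    # Launch the interactive biblioshiny app for keyword growth and collaboration maps
    biblioshiny()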

Original language: American English
Pages (from-to): 355-362
Number of pages: 8
Journal: International Journal of Intelligent Systems and Applications in Engineering
Volume: 11
Issue number: 4
State: Indexed - 21 Sep 2023
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2023, Ismail Saritas. All rights reserved.

Keywords

  • Artificial intelligence
  • bibliometric analysis
  • big data
  • deep learning
