Generative Artificial Intelligence and Movement and Behavior Tracking Tools, Remote Sensing and Cognitive Computing Systems, and Immersive Audiovisual Content in Virtually Simulated Workspace Environments
Victor V. Dengov1, Katarina Zvarikova2, and Raluca-Ștefania Balica3
ABSTRACT. The purpose of this study is to examine generative artificial intelligence and ambient scene detection tools that deploy mobile biometric and sentiment data, realistic movement simulations, and real-time predictive analytics in virtual work environments. This article collates previous research findings indicating that generative artificial intelligence and deep learning computer vision algorithms can shape predictive workflows, tailored job upskilling, and meaningful performance management. Throughout August 2023, a quantitative literature review of the Web of Science, Scopus, and ProQuest databases was performed, with search terms including “generative artificial intelligence and movement and behavior tracking tools” + “remote sensing and cognitive computing systems,” “immersive audiovisual content,” and “virtually simulated workspace environments.” As research published in 2023 was inspected, only 173 articles satisfied the eligibility criteria, and 50 mainly empirical sources were selected. Data visualization tools included Dimensions (bibliometric mapping) and VOSviewer (layout algorithms). Reporting quality was assessed with PRISMA, and methodological quality with AXIS, Dedoose, MMAT, and SRDR.
Keywords: generative artificial intelligence; movement and behavior tracking tools; remote sensing; cognitive computing; immersive audiovisual content; virtual; simulation; workspace
How to cite: Dengov, V. V., Zvarikova, K., and Balica, R.-Ș. (2023). “Generative Artificial Intelligence and Movement and Behavior Tracking Tools, Remote Sensing and Cognitive Computing Systems, and Immersive Audiovisual Content in Virtually Simulated Workspace Environments,” Analysis and Metaphysics 22: 274–293. doi: 10.22381/am22202315.
Received 11 September 2023 • Received in revised form 16 December 2023
Accepted 25 December 2023 • Available online 30 December 2023