Mobile Biometric and Sentiment Data, Generative Artificial Intelligence and Behavior Tracking Tools, and Wearable Sensor-based and Connected Monitoring Devices in Immersive Interconnected 3D Worlds
Kriselda Gura1, Alice AlAkoum2, Mihaela Melenciuc2, and Susan Henley3

ABSTRACT. Despite the relevance of generative artificial intelligence and immersive virtual reality training tools in shaping virtual employee engagement, productivity improvements, and long-term talent pipelines, only limited research has been conducted on this topic. We contribute to the literature by showing that Web3 technology-based immersive remote work experiences can be achieved through generative artificial intelligence and mobile analytics algorithms, text mining and analytics, and body-tracking data metrics. Throughout July 2023, a quantitative literature review of the Web of Science, Scopus, and ProQuest databases was performed, with search terms including “generative artificial intelligence and behavior tracking tools” + “mobile biometric and sentiment data,” “wearable sensor-based and connected monitoring devices,” and “immersive interconnected 3D worlds.” As only research published in 2023 was inspected, 167 articles satisfied the eligibility criteria, of which 51 mainly empirical sources were selected. Data visualization tools included Dimensions (bibliometric mapping) and VOSviewer (layout algorithms). Reporting quality was assessed with PRISMA; methodological quality was assessed with AXIS, Dedoose, DistillerSR, and MMAT.
Keywords: mobile biometric and sentiment data; generative artificial intelligence; behavior tracking tools; wearable sensor-based and connected monitoring devices; immersive interconnected 3D worlds
How to cite: Gura, K., AlAkoum, A., Melenciuc, M., and Henley, S. (2023). “Mobile Biometric and Sentiment Data, Generative Artificial Intelligence and Behavior Tracking Tools, and Wearable Sensor-based and Connected Monitoring Devices in Immersive Interconnected 3D Worlds,” Analysis and Metaphysics 22: 255–273. doi: 10.22381/am22202314.
Received 19 August 2023 • Received in revised form 18 December 2023
Accepted 22 December 2023 • Available online 30 December 2023