Behavioral Analytics, Immersive Technologies, and Machine Vision Algorithms in the Web3-powered Metaverse World
Maria Kovacova1, Jakub Horak2, and Michael Higgins3

ABSTRACT. We draw on a substantial body of theoretical and empirical research on behavioral analytics, immersive technologies, and machine vision algorithms in the Web3-powered metaverse world. With increasing evidence of data tracking, management, measurement, optimization, and analysis across metaverse worlds, there is a pressing need to understand whether technology-enabled live shopping can improve customer engagement and satisfaction in virtual and augmented reality-based immersive environments. In this research we cumulate prior findings indicating that ambient scene detection and visual analytics can drive customer retention and acquisition in the virtual retail market, building brand awareness and hyper-personalization across e-commerce operations. We carried out a quantitative literature review of ProQuest, Scopus, and the Web of Science throughout March 2022, with search terms including “metaverse” + “behavioral analytics,” “immersive technologies,” and “machine vision algorithms.” Because we analyzed only research published in 2022, just 80 papers met the eligibility criteria. After removing controversial or unclear findings (scanty or unimportant data), results unsupported by replication, insufficiently detailed content, and papers with closely similar titles, we selected 15 mainly empirical sources. Data visualization was performed with Dimensions (bibliometric mapping) and VOSviewer (layout algorithms); reporting quality was assessed with PRISMA; and methodological quality was assessed with AMSTAR, Distiller SR, ROBIS, and SRDR.
Keywords: immersive; analytics; metaverse; algorithm; machine vision; behavior
How to cite: Kovacova, M., Horak, J., and Higgins, M. (2022). “Behavioral Analytics, Immersive Technologies, and Machine Vision Algorithms in the Web3-powered Metaverse World,” Linguistic and Philosophical Investigations 21: 57–72. doi: 10.22381/lpi2120224.
Received 27 March 2022 • Received in revised form 25 May 2022
Accepted 27 May 2022 • Available online 30 May 2022