Context Modeling and Urban Internet of Things Sensing Tools, Spatio-Temporal Fusion and Deep Learning Image Segmentation Algorithms, and Metaverse and Digital Twin Technologies in Immersive 3D and Robotic Simulation Environments
Gheorghe H. Popescu1, Mehmet Emin Kalgı2, Alexandru Bogdan3, Petrică Tudosă3, and Firdaus Abdullah4

ABSTRACT. In this research, prior findings were synthesized indicating that spatial and edge computing technologies, artificial intelligence-based decision support and bio-inspired swarm robotic systems, 5G communication networks and machine-learning-based wireless system operations, and multiphysics simulation and urban analytics tools support multimedia immersive environments. A quantitative literature review of ProQuest, Scopus, and the Web of Science was carried out throughout June 2024, with search terms including “immersive 3D and robotic simulation environments” + “context modeling and urban Internet of Things sensing tools,” “spatio-temporal fusion and deep learning image segmentation algorithms,” and “metaverse and digital twin technologies.” As research published in 2022 and 2023 was inspected, only 167 articles satisfied the eligibility criteria, and 31 mainly empirical sources were selected. Data visualization tools comprised Dimensions (bibliometric mapping) and VOSviewer (layout algorithms). Reporting quality was assessed with PRISMA. Methodological quality assessment tools included AMSTAR, Colandr, PICO Portal, ROBIS, SluRp, and Systematic Review Accelerator.
Keywords: urban Internet of Things; spatio-temporal fusion; deep learning image segmentation; metaverse; digital twin; immersive 3D environment
How to cite: Popescu, G. H., Kalgı, M. E., Bogdan, A., Tudosă, P., and Abdullah, F. (2024). “Context Modeling and Urban Internet of Things Sensing Tools, Spatio-Temporal Fusion and Deep Learning Image Segmentation Algorithms, and Metaverse and Digital Twin Technologies in Immersive 3D and Robotic Simulation Environments,” Geopolitics, History, and International Relations 16(2): 97–112. doi: 10.22381/GHIR16220245.
Received 10 July 2024 • Received in revised form 18 December 2024
Accepted 23 December 2024 • Available online 29 December 2024