Visual Analytics and Digital Twin Modeling Tools, Spatio-Temporal Fusion and Predictive Modeling Algorithms, and Deep Learning-based Sensing and Image Recognition Technologies in Data-driven Smart Sustainable Cities and Immersive Multisensory Virtual Spaces
Milos Poliak1, Adela Poliakova1, Oana-Diana Crîșmariu2, Raluca-Ștefania Balica3

ABSTRACT. This paper draws on a substantial body of theoretical and empirical research on urban big data analytics, immersive virtual and data-driven planning technologies, and smart city logistics deployed in smart networked environments. A quantitative literature review of ProQuest, Scopus, and the Web of Science was carried out throughout April 2023, with search terms including “data-driven smart sustainable cities and immersive multisensory virtual spaces” + “visual analytics and digital twin modeling tools,” “spatio-temporal fusion and predictive modeling algorithms,” and “deep learning-based sensing and image recognition technologies.” Across the inspected research published in 2022 and 2023, only 146 articles satisfied the eligibility criteria, and from these, 21 mainly empirical sources were selected. Data visualization tools: Dimensions (bibliometric mapping) and VOSviewer (layout algorithms). Reporting quality assessment tool: PRISMA. Methodological quality assessment tools: AMSTAR, Distiller SR, ROBIS, and SRDR.
Keywords: visual analytics; digital twin modeling; spatio-temporal fusion; predictive modeling algorithms; deep learning; sensing and image recognition; smart sustainable cities; immersive multisensory virtual spaces
How to cite: Poliak, M., Poliakova, A., Crîșmariu, O.-D., and Balica, R.-Ș. (2023). “Visual Analytics and Digital Twin Modeling Tools, Spatio-Temporal Fusion and Predictive Modeling Algorithms, and Deep Learning-based Sensing and Image Recognition Technologies in Data-driven Smart Sustainable Cities and Immersive Multisensory Virtual Spaces,” Geopolitics, History, and International Relations 15(1): 91–105. doi: 10.22381/GHIR15120236.
Received 13 May 2023 • Received in revised form 27 June 2023
Accepted 29 June 2023 • Available online 30 June 2023