Machine Vision and Immersive 3D Technologies, Digital Twin Modeling and Simulation Tools, and Multi-Sensor Data Fusion and Spatial Computing Algorithms in Virtual Urban Environments and Networked Smart Cities
Raluca-Ștefania Balica1, Mile Vasic2, Cătălina Ioana Bonciu3, Ștefan Gabriel Burcea4, Izzat Al-Hadi Razali5

ABSTRACT. This article advances the existing literature concerning machine vision and immersive 3D technologies, digital twin modeling and simulation tools, and multi-sensor data fusion and spatial computing algorithms in virtual urban environments and networked smart cities. The analysis highlights that smart city and big data visualization analytics, intelligent sensor and digital twin networks, virtual reality modeling and big data mining tools, Internet of Things sensor and 3D virtual simulation technologies, and ambient sound recognition software enhance immersive 3D environments. The machine learning-based study selection process harnessed text mining systematic review management software and tools, including BIBOT, Citationchaser, EPPI-Reviewer, JBI SUMARI, Litstream, PICO Portal, and SWIFT-Active Screener. The case study covers how Seoul’s smart city infrastructure and artificial intelligence algorithms enable efficient public service delivery and natural disaster simulation; collaborative planning and management of urban mobility, waste, and renewable energy; and environmental condition and greenhouse gas emission monitoring and analysis for carbon neutrality.
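For readers unfamiliar with the multi-sensor data fusion techniques the article surveys, the following minimal sketch illustrates one core primitive: combining redundant sensor readings by inverse-variance weighting, which is equivalent to a scalar Kalman update. The sensor names, numeric values, and the fuse_estimates helper are illustrative assumptions for this sketch, not taken from the reviewed studies.

def fuse_estimates(z1: float, var1: float, z2: float, var2: float) -> tuple[float, float]:
    """Fuse two independent noisy estimates of the same quantity by
    inverse-variance weighting (a scalar Kalman update)."""
    w1, w2 = 1.0 / var1, 1.0 / var2          # weight each sensor by its precision
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)  # precision-weighted mean
    fused_var = 1.0 / (w1 + w2)              # fused variance <= min(var1, var2)
    return fused, fused_var

# Hypothetical example: a LiDAR range and a camera depth estimate of the same point.
lidar_range, lidar_var = 12.4, 0.05    # metres, metres squared
camera_range, camera_var = 12.9, 0.40

estimate, variance = fuse_estimates(lidar_range, lidar_var, camera_range, camera_var)
print(f"fused range: {estimate:.2f} m (variance {variance:.3f})")

The fused variance is never larger than either input variance, which is the formal reason that combining machine vision, LiDAR, and Internet of Things sensor streams tightens state estimates in digital twin and spatial computing pipelines.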
Keywords: machine vision; digital twin; multi-sensor data fusion; spatial computing; virtual urban environments; networked smart cities
How to cite: Balica, R.-Ș., Vasic, M., Bonciu, C. I., Burcea, Ș. G., and Razali, I. Al-H. (2025). “Machine Vision and Immersive 3D Technologies, Digital Twin Modeling and Simulation Tools, and Multi-Sensor Data Fusion and Spatial Computing Algorithms in Virtual Urban Environments and Networked Smart Cities,” Geopolitics, History, and International Relations 17(2): 65–76. doi: 10.22381/GHIR17220255.
Received 18 July 2025 • Received in revised form 21 October 2025
Accepted 28 October 2025 • Available online 30 October 2025
