Remote Sensing Data Fusion Techniques, Autonomous Vehicle Driving Perception Algorithms, and Mobility Simulation Tools in Smart Transportation Systems
Tomas Kliestik1, Hussam Musa2, Veronika Machova3, and Linda Rice4

ABSTRACT. The objective of this paper is to systematically review remote sensing data fusion techniques, autonomous vehicle driving perception algorithms, and mobility simulation tools in smart transportation systems. The findings and analyses highlight that visual perception algorithms, sensing and computing technologies, and route planning and control tools configure networked digital infrastructures. Throughout March 2022, a quantitative literature review of the Web of Science, Scopus, and ProQuest databases was performed, with search terms including “smart transportation systems” + “remote sensing data fusion techniques,” “autonomous vehicle driving perception algorithms,” and “mobility simulation tools.” Of the research published between 2019 and 2022 that was inspected, only 92 articles satisfied the eligibility criteria. After excluding controversial or ambiguous findings (insufficient/irrelevant data), outcomes unsubstantiated by replication, overly general material, and studies with nearly identical titles, we selected 15 mainly empirical sources. Data visualization tools employed were Dimensions (bibliometric mapping) and VOSviewer (layout algorithms). The reporting quality assessment tool was PRISMA. Methodological quality assessment tools included AMSTAR, Dedoose, DistillerSR, and SRDR.
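As a minimal illustration of the screening workflow summarized in the abstract, the Python sketch below applies the boolean search terms and the 2019–2022 publication window to a set of retrieved records. The record data, field names, and helper function are assumptions for demonstration only; they do not reproduce the authors' actual exports from Web of Science, Scopus, or ProQuest, and the counts noted in comments (92 eligible, 15 selected) are taken from the abstract, not computed here.

```python
# Illustrative sketch of the PRISMA-style screening described in the abstract.
# All records and helper names below are hypothetical placeholders.

SEARCH_TERMS = [
    '"smart transportation systems" AND "remote sensing data fusion techniques"',
    '"smart transportation systems" AND "autonomous vehicle driving perception algorithms"',
    '"smart transportation systems" AND "mobility simulation tools"',
]

# Hypothetical retrieved records: title, publication year, and screening flags.
records = [
    {"title": "Data fusion for urban mobility", "year": 2021,
     "empirical": True, "ambiguous": False},
    {"title": "Perception algorithms overview", "year": 2018,
     "empirical": False, "ambiguous": False},
]


def eligible(record):
    """Keep 2019-2022 publications whose findings are not ambiguous/irrelevant."""
    return 2019 <= record["year"] <= 2022 and not record["ambiguous"]


screened = [r for r in records if eligible(r)]       # abstract reports 92 eligible articles
selected = [r for r in screened if r["empirical"]]   # abstract reports 15 mainly empirical sources

print(f"Eligible: {len(screened)}, selected: {len(selected)}")
```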
Keywords: autonomous vehicle; remote sensing; data fusion; mobility simulation
How to cite: Kliestik, T., Musa, H., Machova, V., and Rice, L. (2022). “Remote Sensing Data Fusion Techniques, Autonomous Vehicle Driving Perception Algorithms, and Mobility Simulation Tools in Smart Transportation Systems,” Contemporary Readings in Law and Social Justice 14(1): 137–152. doi: 10.22381/CRLSJ14120229.
Received 17 March 2022 • Received in revised form 25 July 2022
Accepted 28 July 2022 • Available online 30 July 2022