Emotion Recognition and Visual Matching Algorithms, Augmented Reality Makeup Try-on and 3D Body Scanning Technologies, and Artificial Neural Network-based Neuromorphic Computing and In-Sensor Motion Perception Systems for Physical Appearance and Attractiveness
George Lăzăroiu*

ABSTRACT. The objective of this paper is to systematically review 3D body sensory and hyper-realistic augmented reality beauty technologies, deep learning-based realistic-sounding synthetic voices, and digital clothing try-on apps. The findings and analyses highlight that augmented reality face filters further negative behaviors and sentiments, perceptions of body shape and attractiveness, social appearance comparisons, and cognitive and affective engagement. Throughout June 2024, a quantitative literature review of the Web of Science, Scopus, and ProQuest databases was performed, with search terms including “physical appearance and attractiveness” + “emotion recognition and visual matching algorithms,” “augmented reality makeup try-on and 3D body scanning technologies,” and “artificial neural network-based neuromorphic computing and in-sensor motion perception systems.” Of the research published between 2017 and 2024 that was inspected, only 166 articles satisfied the eligibility criteria, and 29 mainly empirical sources were selected. Data visualization tools: Dimensions (bibliometric mapping) and VOSviewer (layout algorithms). Reporting quality assessment tool: PRISMA. Methodological quality assessment tools: Colandr, JBI SUMARI, PICO Portal, Rayyan, SRDR+, and SWIFT-Active Screener.
Keywords: emotion recognition; augmented reality makeup try-on; 3D body scanning; visual matching; in-sensor motion perception; physical appearance and attractiveness
How to cite: Lăzăroiu, G. (2024). “Emotion Recognition and Visual Matching Algorithms, Augmented Reality Makeup Try-on and 3D Body Scanning Technologies, and Artificial Neural Network-based Neuromorphic Computing and In-Sensor Motion Perception Systems for Physical Appearance and Attractiveness,” Journal of Research in Gender Studies 14(2): 83–98. doi: 10.22381/JRGS14220245.
Received 18 July 2024 • Received in revised form 20 December 2024
Accepted 25 December 2024 • Available online 30 December 2024