3D Virtual Modeling and Artificial Intelligence Clothing Try-on Technologies, Hyper-Realistic Augmented Reality-powered Makeup Filters, and Cognitive and Affective Algorithms for Beauty Ideal Cultural Constructions
Gheorghe H. Popescu1, Silvia Elena Iacob2, Alexandra Carmen Bran2, Cristian Stana2, and Mădălina Radu3

ABSTRACT. This article advances the existing literature concerning 3D virtual modeling and artificial intelligence clothing try-on technologies, hyper-realistic augmented reality-powered makeup filters, and cognitive and affective algorithms for beauty ideal cultural constructions. We contribute to the literature on thinness ideal internalization, negative mood and self-esteem, and self-presentation perceptions and behaviors by showing that beauty filters and facial recognition technologies modify facial features, simulate makeup, balance complexion, and smooth skin, ensuring natural-looking results and appearance optimization, thus enhancing visual appeal. Machine learning classifiers, reference management software, and study screening tools leveraged include AMSTAR, DistillerSR, Eppi-Reviewer, the METAGEAR package for R, PICO Portal, and Systematic Review Accelerator. The case studies cover Peachy, Photoshop Express, the BIGVU beauty face filter app, YouCam Makeup AI-powered face editing and virtual makeup tools, and AirBrush AI Face Retouching and advanced 3D tools.
Keywords: 3D virtual modeling; artificial intelligence clothing try-on; hyper-realistic augmented reality-powered makeup; cognition; affection; beauty ideal cultural construction
How to cite: Popescu, G. H., Iacob, S. E., Bran, A. C., Stana, C., and Radu, M. (2025). “3D Virtual Modeling and Artificial Intelligence Clothing Try-on Technologies, Hyper-Realistic Augmented Reality-powered Makeup Filters, and Cognitive and Affective Algorithms for Beauty Ideal Cultural Constructions,” Journal of Research in Gender Studies 15(1): 33–40. doi: 10.22381/JRGS15120253.
Received 9 February 2025 • Received in revised form 22 July 2025
Accepted 23 July 2025 • Available online 28 July 2025
