Classification and Gender Recognition from Veiled Faces - Classical Machine Learning to Deep Learning
Our aim here is to investigate to what extent a computer system can identify a veiled person and recognize gender from the eyes and the uncovered part of the face. To this end, we created a new veiled-persons image (VPI) database captured with a mobile phone camera, imaging 100 different veiled persons over two sessions. After preprocessing and segmentation, we applied a fused feature-extraction method that combines geometrical (edge ratio) and textural (probability density function of the color moments) features. Experimental results with different classifiers gave person-identification accuracies ranging from 88.63% to 97.22% before feature selection, and up to 97.55% after feature selection. The proposed method achieved up to a 99.41% success rate for gender classification. We next tested the ability of a deep learning based automated computer system not only to identify persons, but also to recognize gender, age, and facial expressions such as the eye smile. Our experimental results indicate high accuracy on all of these tasks: the best recorded accuracies are up to 99.95% for person identification, 99.9% for gender recognition, 99.9% for age recognition, and 80.9% for facial expression (eye smile) recognition.
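To illustrate the textural side of the fused features, the sketch below computes the first three color moments (mean, standard deviation, and skewness) per channel of an image patch, a standard compact color/texture descriptor. This is a minimal illustration only: the exact probability-density-based variant of the color moments used for the VPI features, and the edge-ratio geometrical feature, follow the paper and may differ from this assumption.

```python
import numpy as np

def color_moments(image):
    """Return the first three color moments (mean, std, skewness)
    per channel of an H x W x C image, concatenated into one vector.

    Note: a generic color-moments descriptor, assumed here for
    illustration; the published method uses a PDF-based variant."""
    pixels = image.reshape(-1, image.shape[-1]).astype(np.float64)
    mean = pixels.mean(axis=0)
    std = pixels.std(axis=0)
    # Skewness taken as the signed cube root of the third central moment.
    third = ((pixels - mean) ** 3).mean(axis=0)
    skew = np.cbrt(third)
    return np.concatenate([mean, std, skew])

# Example: a synthetic 8x8 RGB patch yields a 9-dimensional feature vector
# (3 moments x 3 channels), which could then be fused with geometrical features.
rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(8, 8, 3))
features = color_moments(patch)
print(features.shape)  # (9,)
```

In a fusion scheme like the one described above, such a textural vector would simply be concatenated with the geometrical (edge ratio) features before classification and feature selection.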
Dataset details: The veiled-persons image (VPI) database is available upon request from Ahmad Hassanat.
A. B. Hassanat, V. B. S. Prasath, B. M. Al-Mahadeen, S. M. M. Alhasanat. Classification and Gender Recognition from veiled-faces. International Journal of Biometrics. 9(4), 347-364, September 2017. doi:10.1504/IJBM.2017.10009351
A. B. A. Hassanat, A. A. Albustanji, A. S. Tarawneh, M. Alrashidi, H. Alharbi, M. Alanazi, M. Alghamdi, I. S. Alkhazi, V. B. S. Prasath. DeepVeil: Deep learning for identification of face, gender, expression recognition under veiled conditions. International Journal of Biometrics, 14(3/4), 453-480, August 2022. doi:10.1504/IJBM.2022.10048981. Preliminary version at arXiv, doi:10.48550/arXiv.2111.01930