Zhang, L., Shum, H. P. H., Liu, L., Guo, G., & Shao, L. (2019). Multiview discriminative marginal metric learning for makeup face verification. Neurocomputing, 333, 339–350. doi:10.1016/j.neucom.2018.12.003
Abstract:
Makeup face verification in the wild is an important research problem owing to its growing prevalence in real-world applications. However, little effort has been made to tackle it in computer vision. In this work, we first build a new database, the Facial Beauty Database (FBD), which contains paired facial images of 8933 subjects without and with makeup, captured in different real-world scenarios. To the best of our knowledge, FBD is the largest makeup face database to date compared with existing databases for facial makeup research. Moreover, we propose a new discriminative marginal metric learning (DMML) algorithm to address this problem in the wild. Motivated by the observation that interclass marginal faces are usually more discriminative than interclass nonmarginal faces when learning a discriminative metric space, we use the interclass marginal faces to characterize the discriminative information. At the same time, we require that interclass marginal faces without makeup relations be separated from each other as far as possible, so that more discriminative information between facial images without and with makeup can be exploited for verification. Furthermore, since multiple features can provide comprehensive information by describing facial representations from diverse points of view and extracting more informative cues from facial images, we also introduce a multiview discriminative marginal metric learning (MDMML) algorithm that learns a robust metric space in which multiple features from different views are integrated to effectively enhance the performance of makeup face verification. Experimental results on two real-world makeup face databases demonstrate the effectiveness of our method and the feasibility of verifying makeup relations from facial images in the real world.
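The following is a minimal sketch of the marginal-metric-learning idea summarized in the abstract: learn a Mahalanobis metric that pulls matched (no-makeup, makeup) pairs of the same subject together while pushing apart only the closest impostor pairs, i.e., the interclass marginal faces. It is not the authors' DMML or MDMML formulation; the hinge-style objective, the margin value, the number of marginal neighbours, and the projected-gradient update are all assumptions made for illustration.

```python
# Illustrative sketch (assumed, not the paper's implementation) of learning a
# Mahalanobis metric M that shrinks distances between genuine no-makeup/makeup
# pairs and enlarges distances to the nearest impostor (marginal) faces.
import numpy as np

def pair_dist(M, x, y):
    """Squared Mahalanobis distance (x - y)^T M (x - y)."""
    d = x - y
    return float(d @ M @ d)

def learn_marginal_metric(X_nomakeup, X_makeup, margin=1.0, lr=0.01, epochs=50, k=3):
    """X_nomakeup[i] and X_makeup[i] are feature vectors of the same subject."""
    n, dim = X_nomakeup.shape
    M = np.eye(dim)  # start from the Euclidean metric
    for _ in range(epochs):
        grad = np.zeros_like(M)
        for i in range(n):
            # Pull the genuine (same-subject) pair together.
            d = X_nomakeup[i] - X_makeup[i]
            grad += np.outer(d, d)
            # Locate the k closest impostor makeup faces: the "marginal" pairs.
            others = [j for j in range(n) if j != i]
            dists = [pair_dist(M, X_nomakeup[i], X_makeup[j]) for j in others]
            for r in np.argsort(dists)[:k]:
                j = others[r]
                # Push a marginal impostor away only if it violates the margin.
                if pair_dist(M, X_nomakeup[i], X_makeup[j]) < margin:
                    dj = X_nomakeup[i] - X_makeup[j]
                    grad -= np.outer(dj, dj)
        M -= lr * grad / n
        # Project back onto the positive semidefinite cone so M stays a valid metric.
        w, V = np.linalg.eigh(M)
        M = (V * np.clip(w, 0, None)) @ V.T
    return M
```

A multiview extension in this spirit could learn one such metric per feature type (e.g., texture and shape descriptors) and fuse the resulting distances, but the weighting scheme used by MDMML is described in the paper itself, not here.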
License type:
Attribution 4.0 International (CC BY 4.0)
Funding Info:
This project was supported by the Engineering and Physical Sciences Research Council (EPSRC).