Recognition of emotional expressions using the grouping crowdings of characteristic mimic states

O.V. Barmak, E.A. Manziuk, O.D. Kalyta, Iu. Krak, V.O. Kuznetsov, A.I. Kulias

Abstract


The characteristic forms of facial expressions that accompany a person's emotional states are largely common across people, owing to the shared physiological structure and arrangement of the muscles that form the human face. This commonality is one of the main reasons why emotions are reflected in the face in similar ways. From the nature and form of facial expressions, the emotional state of a person can be determined with high probability, with some correction for the cultural characteristics and traditions of particular groups. Building on the existence of these common mimic forms of emotional manifestation, an approach is proposed for constructing a model that recognizes emotional manifestations on the human face with relatively low requirements for photo and video capture equipment and with acceptable speed in a video stream. The model is based on a hyperplane classification of the mimic manifestations of the basic emotional states. One of the main advantages of the proposed approach is its low computational complexity, which makes it possible to recognize changes in a person's emotional state without special equipment, for example with low-resolution or long-distance video cameras. In addition, a model built on the proposed approach achieves adequate recognition accuracy with low requirements for image quality, which considerably extends the scope of practical application. Examples of practical application include monitoring drivers while operating a vehicle, operators of complex production facilities, and other automated visual surveillance systems. The set of detected emotional states is formed according to the task at hand, which makes it possible to focus on the recognition of mimic forms and to group characteristic structural manifestations based on the selected set of distinguishing features.
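
The abstract's core idea is that each emotional state is represented by a set of characteristic mimic features and recognition reduces to a hyperplane (linear) classification over those features, which keeps the computational cost low. The following is a minimal illustrative sketch of that idea only, not the authors' hyperplane-synthesis procedure: the use of scikit-learn's LinearSVC, the synthetic landmark-style features, and the emotion list are assumptions made purely for demonstration.

```python
# Illustrative sketch: linear (hyperplane) classification of emotional states
# from facial-landmark-derived feature vectors. The data here is synthetic and
# stands in for real landmark measurements; the paper's own hyperplane
# construction method is not reproduced.

import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

EMOTIONS = ["neutral", "happiness", "sadness", "anger", "surprise", "fear", "disgust"]
N_FEATURES = 20  # e.g. normalized distances/angles between facial landmarks

# Synthetic stand-in: each emotion is a Gaussian cluster around its own
# characteristic "mimic" feature pattern.
X, y = [], []
for label, _ in enumerate(EMOTIONS):
    center = rng.normal(size=N_FEATURES)                 # characteristic pattern
    samples = center + 0.3 * rng.normal(size=(100, N_FEATURES))
    X.append(samples)
    y.append(np.full(100, label))
X, y = np.vstack(X), np.concatenate(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# One-vs-rest linear SVM: each emotion gets a separating hyperplane (w, b),
# so prediction is only a few dot products per frame.
clf = LinearSVC(C=1.0, max_iter=10000)
clf.fit(X_train, y_train)

print("hyperplanes (w):", clf.coef_.shape)               # (n_classes, n_features)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Because the decision rule of such a classifier is a handful of dot products per frame, it illustrates why a hyperplane-based model can run in a video stream without special hardware, which is the efficiency argument made in the abstract.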

Problems in Programming 2020; 2-3: 173–181

 

Keywords


facial expression emotion recognition; hyperplane classification; simplified classification model

References


Hjortsjö C.H. Man's Face and Mimic Language. Lund, Sweden: Studentlitteratur, 1970.

Ekman P., Friesen W.V. Manual for the Facial Action Coding System. Palo Alto, CA: Consulting Psychologists Press. 1978.

Ekman P., Friesen W.V., Hager J.C. The Facial Action Coding System. Salt Lake City, UT: Research Nexus eBook. 2002.

Hamm J., Kohler C.G., Gur R.C., Verma R. Automated Facial Action Coding System for dynamic analysis of facial expressions in neuropsychiatric disorders. Journal of Neuroscience Methods. Philadelphia, USA. 2011.

Brunelli R., Poggio T. Face recognition: features versus templates. IEEE Transactions on Pattern Analysis and Machine Intelligence. 1993. 15(10). P. 1042–1052.

Kryvonos I.G., Krak I.V. Modeling human hand movements, facial expressions, and articulation to synthesize and visualize gesture information. Cybernetics and Systems Analysis. 2011. 47. P. 501–505.

Krak Yu.V., Barmak A.V., Baraban E.M. Usage of NURBS-approximation for construction of spatial model of human face. Journal of Automation and Information Sciences. 2011. Vol. 43(2). P. 71–81.

Martinez A., Du S. A model of the perception of facial expressions of emotion by humans: research overview and perspectives. Journal of Machine Learning Research. 2012. 13. P. 1589–1608.

Kryvonos I.G., Krak I.V., Barmak O.V., Ternov A.S., Kuznetsov V.O. Information Technology for the Analysis of Mimic Expressions of Human Emotional States. Cybernetics and Systems Analysis. 2015. 51. P. 25–33.

Barmak O.V., Kalyta O.D., Hashchuk T.O., Skrypnyk T.K. Information technology for determining the criteria of facial areas that reproduce emotional facial expressions. Herald of Khmelnytskyi National University. Technical Sciences. 2018. Issue 6(2). P. 130–134.

Barmak A.V., Krak Y.V., Manziuk E.A., Kasianiuk V.S. Information technology of separating hyperplanes synthesis for linear classifiers. Journal of Automation and Information Sciences. 2019. 51(5). P. 54–64.

Krak I., Barmak O., Manziuk E. Using visual analytics to develop human and machine centric models: A review of approaches and proposed information technology. Computational Intelligence. 2020. P. 1–26.

Cox T.F., Cox M.A.A. Multidimensional scaling, 2nd edn. Chapman and Hall / CRC, Boca Raton, 2001.

Krak I.V., Kudin G.I., Kulyas A.I. Multidimensional Scaling by Means of Pseudoinverse Operations. Cybernetics and Systems Analysis. 2019. 55(1). P. 22–29.

Barmak A.V., Krak Yu.V., Manziuk E.A., Kasianiuk V.S. Definition of information core for documents classification. Journal of Automation and Information Sciences. 2018. Vol. 50(4). P. 25–34.

Maaten van der L.J.P., Postma E.O., Herik van den H.J. Dimensionality reduction: a comparative review. Technical report TiCC-TR 2009-005. Tilburg University, 2009.

Leeuw de J., Mair P. Multidimensional scaling using majorization: SMACOF in R. Journal of Statistical Software. 2009. Vol. 31. Issue 3. P. 1–30.

Wingenbach T.S.H., Ashwin C., Brosnan M. Correction: Validation of the Amsterdam Dynamic Facial Expression Set – Bath Intensity Variations (ADFES-BIV): A Set of Videos Expressing Low, Intermediate, and High Intensity Emotions. PLOS ONE. 2016. 11(12): e0168891.




DOI: https://doi.org/10.15407/pp2020.02-03.173
