
A new tool to support diagnosis of neurological disorders by means of facial expressions

Vitoantonio Bevilacqua 1,2, Dario D'Ambruoso 1, Giovanni Mandolino 1, Marco Suma 1


1 Dipartimento di Elettrotecnica ed Elettronica, Politecnico di Bari, Via Orabona 4, 70125 Bari, Italy; 2 e.B.I.S. s.r.l. (electronic Business in Security), Spin-Off of Polytechnic of Bari, Via Pavoncelli 139, Bari, Italy. Corresponding author: bevilacqua@poliba.it
Abstract — Recognizing facial emotions is an important aspect of interpersonal communication that may be impaired in various neurological disorders: Asperger's syndrome, Autism, Schizoid Personality, Parkinsonism, Urbach-Wiethe disease, Amyotrophic Lateral Sclerosis, Bipolar Disorder, Depression, and Alzheimer's disease. Although it is not possible to define emotions uniquely, we can say that they are mental states and physiological and psychophysiological changes associated with internal or external stimuli, both natural and learned. This paper highlights certain requirements that a specification approach would need to meet for the production of such tools to be achievable. In particular, we present an innovative and still experimental tool to support the diagnosis of neurological disorders by means of facial-expression monitoring. At the same time, we propose a new study to measure several impairments of patients' emotion-recognition ability, and to improve the reliability of using them in computer-aided diagnosis strategies.
Keywords: Emotion Recognition; Action Units Recognition; Apathy; Absence Expression; Flattening Expression; Neurological Disorders.

I. INTRODUCTION

The ability to produce and recognize facial expressions of emotion is an important component of interpersonal communication in humans and primates. Cultures may differ in social rules and customs, yet certain facial expressions of emotion are universally recognized. The word "emotion" comes from the Latin "ex", meaning out, and "motio", meaning movement, so "emotion" indicates a movement, in particular a body movement. Ekman's theory is based on experimental analysis and cross-cultural comparison (Ekman et al., 1972). It has been observed that an emotional expression produced within a specific population is interpreted correctly and uniformly by other populations, and vice versa. Expressions that are common to the whole of humanity, and the emotions that generate them, can therefore be defined as primary (FIGURE 1). Ekman thus identified six primary emotions: happiness, surprise, disgust, anger, fear, and sadness.

FIGURE 1. Examples of Emotional Expressions, from left to right: Mild (row 1) and Extreme (row 2) expressions.

While universal emotions are recognized across different cultural and ethnic groups, social emotions, such as guilt, shame, arrogance, admiration and flirtatiousness, are particular to culture-specific interactions. Processing of affect is supported by distributed neural systems and, as lesion studies show, preferentially by the right hemisphere. On the basis of human and primate studies, cortical areas of the inferior occipital gyrus, fusiform gyrus and inferior temporal gyrus have been reported as essential for face processing. The amygdala (FIGURE 2) receives input from these cortical areas and from the visual pathways of the midbrain and thalamus [1]. The amygdala is mainly involved in the processing of fear, but is activated upon passive and implicit presentation of facial expressions of emotions in general. Orbitofrontal areas, in particular on the right, are involved in the explicit identification of facial emotions.

FIGURE 2. Anatomical section of the limbic lobe.

The amygdala and frontal areas can modulate sensory areas via feedback mechanisms, can engage other cortical areas, such as the hippocampus and neocortex, and can generate a physical response via connections to the motor cortex, hypothalamus, and brainstem. There are two main theories regarding hemispheric specialization in processing emotions. The right-hemisphere hypothesis posits that the right hemisphere is specialized to process all emotions, whereas the valence hypothesis regards the right hemisphere as superior for processing negative emotions and the left hemisphere as superior for positive ones. Common negative emotions are sadness, fear and anger; common positive emotions are happiness and, possibly, surprise. While cognitive dysfunction has been thoroughly evaluated across neuropsychiatric disorders, impairments in emotion recognition have received increasing attention within the past 15 years. Early investigations on emotion recognition were limited to groups of persons with schizophrenia, depression and brain injury; within the past decade, however, emotion recognition deficits have been explored in a wide range of brain-related disorders. We will now analyze the neurological disorders linked to emotion recognition deficits and to marked impairment in the use of facial expressions.

II. EMOTIONAL RECOGNITION DEFICIT

In Schizophrenia the emotion processing deficits may relate to dysfunction of mesial temporal regions. Schizophrenia patients are impaired in overall emotion recognition, particularly fear and disgust, and neutral cues are frequently misidentified as unpleasant or threatening.

Subjects affected by Bipolar Disorder suffer from specific deficits of facial emotion perception: these patients present impaired recognition of disgust, fear and sadness [4]. Urbach-Wiethe disease (lipoid proteinosis) is a rare disorder with progressive calcification and necrosis of the amygdala; patients with this disease show a disproportionate impairment in recognizing fear in facial expressions, and only a much milder impairment in recognizing the intensity of other negative emotions [5]. Patients with Alzheimer's disease (AD) may show a deficit in processing some or all facial expressions of emotion [6]. Emotion recognition deficits occur in bulbar Amyotrophic Lateral Sclerosis (ALS), particularly with emotional facial expressions, and can arise independently of depressive and dementia symptoms or co-morbidity with depression and dementia. These findings expand the scope of cognitive dysfunction detected in ALS, and bolster the view of ALS as a multisystem disorder involving cognitive and motor deficits [7].

III. IMPAIRMENT IN THE USE OF FACIAL EXPRESSION

In Asperger's Disorder there is a marked impairment in the use of multiple nonverbal behaviors, such as eye-to-eye gaze, facial expressions, body postures, and gestures to regulate social interaction, and a lack of spontaneous seeking to share enjoyment [2]. In Parkinsonism there is a known decrease of spontaneous facial expressions, gestures, speech, or body movements. Patients with Schizoid Personality Disorder usually display a "bland" exterior without visible emotional reactivity; they rarely reciprocate gestures or facial expressions, such as smiles or nods, and rarely experience strong emotions such as anger and joy. In Major Depressive Disorder there is a depressed mood that can be inferred from the person's facial expression and demeanor. Some individuals emphasize somatic complaints (e.g., bodily aches and pains) rather than reporting feelings of sadness, and many report or exhibit increased irritability (e.g., persistent anger, a tendency to respond to events with angry outbursts). In Autistic Disorder there is a known marked impairment in the use of multiple nonverbal behaviors, such as eye-to-eye gaze and facial expressions [3].

IV. ACTION UNITS

Replicating the studies of Charles Darwin, the American psychologist Paul Ekman confirmed that an important feature of basic emotions is that they are expressed universally. The face, as is known, is where most of the sensory information in a communication process is concentrated, whether exhibited (by the sender) or read (by the receiver). According to Ekman and Friesen, "the face is a multi-signal, multi-message response system capable of tremendous flexibility and specificity". These signals combine to produce different messages: the movements of the facial muscles pull the skin, temporarily distorting the shape of the eyes, eyebrows and lips, and producing folds, wrinkles and bulges in different parts of the face. Before describing the tool, we explain and discuss the meaning of Action Units (AUs) [8]. AUs are the basic elements for the construction of an expression; they represent minimal facial actions that cannot be further separated into simpler actions. Muscle actions and Action Units do not correspond one-to-one: an AU may involve the action of one or more muscles, and a muscle can be associated with several AUs. An AU is, in short, a basic change in the appearance of the face, caused by the activation of one or more facial muscles. The AUs are divided into groups according to the position and/or the type of action involved. The first AUs are in the UPPER FACE, and affect the eyebrows, forehead, and eyelids. The LOWER FACE AUs are then presented in five groups: Up/Down, Horizontal, Oblique, Orbital, and Miscellaneous.
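This grouping lends itself to a simple data representation. The following minimal sketch (in C++, the language of the tool) encodes an AU by its FACS number, face region and lower-face action group; the type and field names are our illustrative assumptions, not the authors' implementation.

```cpp
#include <map>
#include <string>

// Sketch: AUs keyed by FACS number, with the face region and the
// lower-face action group they belong to (empty for upper-face AUs).
enum class Region { UpperFace, LowerFace };

struct ActionUnit {
    std::string name;   // FACS name of the minimal facial action
    Region region;      // upper face (brows, forehead, lids) or lower face
    std::string group;  // one of the five lower-face groups, if any
};

const std::map<int, ActionUnit> AUS = {
    {1,  {"Inner Brow Raiser", Region::UpperFace, ""}},
    {4,  {"Brow Lowerer",      Region::UpperFace, ""}},
    {12, {"Lip Corner Puller", Region::LowerFace, "Oblique"}},
};
```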

V. METHODS

The basic idea is to detect and then track the face using STASM [9], a C++ software library for finding features in faces, or Automatic Facial Feature Points Detection [10]. Stasm extends the Active Shape Model of Tim Cootes and his colleagues with other techniques. Before searching for facial features, Stasm uses the Rowley or the Viola-Jones face detector to locate the overall position of the face. STASM fits a standard shape to frontal face images with the Active Shape Model (ASM) algorithm. The shape is fitted by moving its points along a line orthogonal to the boundary, called the whisker; points are moved by weighting the gray-level changes along the whisker. Each point has an associated profile vector, and the image is sampled along the one-dimensional whisker. After the shape is fitted, STASM creates a log file and an image with the overlapped shape. Our algorithm begins by acquiring the shape coordinate points from the log file. After the face is selected, Stasm tracks the fiducial points: given an image of a face, it returns the positions of the facial features. It also allows building custom Active Shape Models (FIGURE 3).

FIGURE 3. The "mean face" (yellow) positioned over a search face at the start of a search.
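As an illustration of this step, here is a minimal sketch of locating the landmarks with Stasm from C++. It follows the C interface of the later Stasm 4 release (stasm_search_single, stasm_NLANDMARKS, stasm_lasterr); the version used for the tool may expose a different API, so this is an assumption rather than the authors' code.

```cpp
#include <cstdio>
#include <opencv2/opencv.hpp>
#include "stasm_lib.h"

int main()
{
    // Load the frontal face image in gray-scale, as Stasm expects.
    cv::Mat_<unsigned char> img(cv::imread("face.jpg", cv::IMREAD_GRAYSCALE));
    if (!img.data) { std::printf("cannot load face.jpg\n"); return 1; }

    int foundface;
    float landmarks[2 * stasm_NLANDMARKS]; // x,y pairs of the fitted shape

    // Locate the face and fit the ASM shape to it.
    if (!stasm_search_single(&foundface, landmarks,
                             (const char*)img.data, img.cols, img.rows,
                             "face.jpg", "data" /* path to Stasm model files */))
    {
        std::printf("stasm_search_single failed: %s\n", stasm_lasterr());
        return 1;
    }

    if (foundface) // print the fiducial points that would then be tracked
        for (int i = 0; i < stasm_NLANDMARKS; i++)
            std::printf("%d: %.1f %.1f\n", i, landmarks[2*i], landmarks[2*i+1]);
    return 0;
}
```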

In this tool we use a modified version of Automatic Facial Feature Points Detection that performs a face segmentation localizing the various face components (eyebrows, eyes, mouth, nose). It now uses the Machine Perception Toolbox (MPT) [11] to find the face and eyes. After this, the frame is converted to a gray-scale image and binarized; the darkest clusters contain face features such as the eyes, mouth, nostrils and eyebrows. It detects 17 feature points (FIGURE 4) using image processing and anthropometric considerations: the two pupils, the four eye corners, the four eyebrow corners, the two nostrils, the nose tip, the two mouth corners, and the upper and lower lip extremities. It is adapted to work in real time.

FIGURE 4. Identification of feature points on the face.
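A minimal sketch of the binarization step just described, using OpenCV; the threshold choice (Otsu) and the minimum blob area are our assumptions, not values from the paper.

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Sketch: isolate the dark facial clusters (eyes, eyebrows, nostrils,
// mouth) on which the 17 feature points are then searched.
std::vector<std::vector<cv::Point>> darkClusters(const cv::Mat& frame)
{
    cv::Mat gray, bin;
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);          // gray-scale conversion
    cv::threshold(gray, bin, 0, 255,
                  cv::THRESH_BINARY_INV | cv::THRESH_OTSU); // darkest pixels -> white

    std::vector<std::vector<cv::Point>> contours, clusters;
    cv::findContours(bin, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    for (const auto& c : contours)
        if (cv::contourArea(c) > 20.0)                      // drop tiny noise blobs
            clusters.push_back(c);
    return clusters;                                        // candidate feature regions
}
```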

VI. FROM ACTION UNIT TO EMOTION

The state of the art on the encoding of emotions according to the Facial Action Coding System (FACS) [8] allowed us to compile a matrix (TABLE I) in which each AU is associated with the expressions (and emotions) in which it occurs. Below we list only three of all the AUs, precisely the ones we selected as indicative of the difference between positive and negative attitudes.
TABLE I. EMOTIONS ASSOCIATED WITH THE SAME AU IN DIFFERENT SOURCES

AU 1, Inner Brow Raiser:
Fear, Sadness, Surprise [12]; Fear, Sadness, Surprise [13]; Fear, Sadness, Surprise [15]; Fear, Sadness, Surprise [16];
Fear 100%, Sadness 100%, Surprise 87.5%, Disgust 37.5% [17].

AU 4, Brow Lowerer:
Anger, Fear, Sadness [12]; Anger, Fear, Sadness [13]; Anger, Fear, Sadness [15]; Anger, Fear, Sadness [16];
Sadness 100%, Disgust 87.5%, Anger 85.5%, Fear 75%, Surprise 37.5%, Embarrassment 12.5% [17].

AU 12, Lip Corner Puller:
Happy [12]; Happy [13]; Happy [14]; Happy [15]; Happy [16];
Joy 100%, Happy 50% [17].
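The table distills into a direct lookup from AU to candidate emotions. A minimal sketch in C++; the container and names are our illustrative choices, not the tool's actual code.

```cpp
#include <map>
#include <string>
#include <vector>

// AU -> emotions it indicates, distilled from TABLE I ([12], [13], [15]-[17]).
const std::map<int, std::vector<std::string>> auToEmotions = {
    {1,  {"fear", "sadness", "surprise"}},  // AU 1, Inner Brow Raiser
    {4,  {"anger", "fear", "sadness"}},     // AU 4, Brow Lowerer
    {12, {"happiness"}},                    // AU 12, Lip Corner Puller
};

// AU 1 and AU 4 mark negative attitudes, AU 12 positive ones.
bool isPositiveAU(int au) { return au == 12; }
```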

VII. FACIAL ACTION UNITS RECOGNITION

Analyzing the photos at our disposal, as summarized in TABLE I, we can conclude that to distinguish between positive and negative emotions it is not necessary to evaluate all the AUs: we therefore use AU 1 (FIGURE 6.a) or AU 4 (FIGURE 6.b) for negative expressions, and AU 12 (FIGURE 6.c) for positive ones.

FIGURE 6. These images represent particular AUs, in order: Inner Brow Raiser, Brow Lowerer, Lip Corner Puller.

FIGURE 7. This image shows the neutral situation.

To evaluate the AUs we calculate the variation of the polygon areas with respect to the neutral situation shown in FIGURE 7. AU 1 is revealed when the area of the polygon created from the eyebrow corners and the eye corners grows; it is strongly related to the sadness and fear emotions. AU 4 is revealed when the same polygon shrinks; it is related to the anger and disgust emotions. AU 12 is revealed when the polygon created around the mouth remains constant; it is strongly related to the happiness emotion (FIGURE 8.a). Furthermore, the tool is able to recognize some useful combinations of other AUs, which discriminate expressions of diagnostic importance (FIGURE 8.c).
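A minimal sketch of this area test with OpenCV; the 10% tolerance and the function names are illustrative assumptions rather than the authors' parameters.

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Sketch of the polygon-area test described above. 'neutralArea' is the
// area of the brow/eye-corner polygon measured on the neutral frame.
enum class AU { None, AU1, AU4 };

AU detectBrowAU(const std::vector<cv::Point2f>& browEyePolygon, double neutralArea)
{
    double area = cv::contourArea(browEyePolygon);
    if (area > 1.10 * neutralArea) return AU::AU1; // polygon grows: Inner Brow Raiser
    if (area < 0.90 * neutralArea) return AU::AU4; // polygon shrinks: Brow Lowerer
    return AU::None;                               // close to the neutral situation
}
```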

FIGURE 8. These images explain how the areas change in relation to the emotions.

Using the close links between AUs and emotions, the tool displays the emotions obtained, with their relative intensity. At the moment the tool evaluates the intensity of an emotion through the area of the polygons: as the area grows, the intensity increases. Obviously the intensity of the emotion is also reflected in the appearance of wrinkles and furrows in well-determined regions of the face.

VIII. IMPLEMENTATION

This tool is written entirely in C++; it uses the OpenCV library for image processing and MPT for face and eye detection. The Graphical User Interface (GUI) is implemented with the Qt library and is divided into two areas: the first shows the processing frame (FIGURE 9), the second shows the emotional state (FIGURE 10).
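A minimal sketch of the two-area GUI layout with Qt Widgets; the widget choices and labels are our assumptions, not the authors' actual interface code.

```cpp
#include <QApplication>
#include <QWidget>
#include <QHBoxLayout>
#include <QLabel>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    // Two-area layout: processing frame on the left, emotional state on the right.
    QWidget window;
    QHBoxLayout *layout = new QHBoxLayout(&window);

    QLabel *frameView   = new QLabel("processing frame");  // would show the OpenCV frame
    QLabel *emotionView = new QLabel("emotional state");   // would show per-emotion percentages
    layout->addWidget(frameView);
    layout->addWidget(emotionView);

    window.setWindowTitle("Facial expression monitoring tool");
    window.show();
    return app.exec();
}
```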

FIGURE 9. This image shows one of the two areas of the GUI: the processing frame.

FIGURE 10. This image shows an example of emotion processing.
The emotional state is divided into: 1. Emotional details, containing the percentages of the six emotions; 2. Emotional situation, containing the overall situation of the human subject.

IX. CONCLUSIONS

Neurodegenerative diseases are among the most expensive for public health, and they are frequently detected only when it is too late, inhibiting the possibility of efficacious therapy. The solution is prevention, in particular for people over fifty. The target of this tool is to provide an instrument for early diagnosis in the initial stage of the disease, exploiting the flat affect of these patients; flat affect is the term for a reduction or absence of emotional expressivity [2]. This non-invasive instrument provides rapid results for possible or probable cases; obviously it is not a deterministic method. There is a direct relationship between neurological distress and the expressivity of some emotions: for example, depression presents a persistence of anger and sadness. The tool ultimately aims to recognize the expressions of a face and then translate them into positive and negative attitudes. The final aim of all this is to help diagnose neurological disorders that involve a loss of expressiveness in the face: a photo or video is shown to the patient, the patient responds by changing facial expression, and the tool decodes the expression intensity or its absence. Therefore, when the tool reveals a loss of expressivity, i.e. the subject does not react adequately to the external stimuli, it signals the potential presence of disease. Apathy is among the most frequent behavioral disturbances in patients with Alzheimer's disease; thus this application can associate the absence or flattening of emotional expression with some neurological disorders, such as Alzheimer's disease and Depression.
X. REFERENCES

[1] C. G. Kohler, T. H. Turner, R. E. Gur, R. C. Gur et al., "Recognition of Facial Emotions in Neuropsychiatric Disorders," CNS Spectrums, vol. 9, no. 4, pp. 267-274, 2004.
[2] Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, American Psychiatric Association, Washington, D.C., 1994.
[3] A. J. Lerner, Diagnostic Criteria in Neurology, 2006.
[4] C. C. de Almeida Rocca, E. V. D. Heuvel, S. C. Caetano, B. Lafer, "Facial Emotion Recognition in Bipolar Disorder: a Critical Review," Rev. Bras. Psiquiatr., vol. 31, no. 2, São Paulo, 2009.
[5] C. Cristinzio, D. Sander, P. Vuilleumier, "Recognition of Emotional Face Expressions and Amygdala Pathology," Epileptologie, vol. 24, pp. 130-138, 2007.
[6] C. G. Kohler, G. Anselmo-Gallagher, W. Bilker, J. Karlawish et al., "Emotion-Discrimination Deficits in Mild Alzheimer Disease," Am. J. Geriatr. Psychiatry, vol. 13, pp. 926-933, 2005.
[7] E. K. Zimmerman, P. J. Eslinger, "Emotional Perception Deficits in Amyotrophic Lateral Sclerosis," Cogn. Behav. Neurol., vol. 20, no. 2, pp. 79-82, 2007.
[8] P. Ekman, FACS: Facial Action Coding System, Research Nexus division of Network Information Research Corporation, Salt Lake City, UT, 2002.
[9] S. Milborrow and F. Nicolls, "Locating Facial Features with an Extended Active Shape Model," ECCV 2008, Part IV, LNCS 5305, pp. 504-513, 2008.
[10] V. Bevilacqua, A. Ciccimarra, I. Leone, and G. Mastronardi, "Automatic Facial Feature Points Detection," LNAI 5227, D.-S. Huang (Ed.), pp. 1142-1149, 2008.
[11] The Machine Perception Toolbox, University of California San Diego, 2005. [Online]. Available: http://mplab.ucsd.edu/grants/project1/freesoftware/mptwebsite/introduction.html
[12] I. Kotsia and I. Pitas, "Facial Expression Recognition in Image Sequences using Geometric Deformation Features and Support Vector Machines," IEEE Transactions on Image Processing, vol. 16, no. 1, 2007.
[13] B. M. Waller, J. J. Cray, A. M. Burrows, "Selection for Universal Facial Emotion," Emotion, vol. 8, no. 3, pp. 435-439, 2008.
[14] H. Seyedarabi and W.-S. Lee, "Classification of Upper and Lower Face Action Units and Facial Expressions using Hybrid Tracking System and Probabilistic Neural Networks," Proceedings of the 5th WSEAS International Conference on Signal Processing, pp. 161-166, 2006.
[15] W. V. Friesen and P. Ekman, EMFACS: Emotional Facial Action Coding System, unpublished manual, University of California, 1984.
[16] P. Ekman, W. V. Friesen, and J. C. Hager, Facial Action Coding System Investigator's Guide, A Human Face, Salt Lake City, UT, 2002.
[17] S. T. Hawk, G. A. Van Kleef, A. H. Fischer, and J. Van Der Schalk, "Worth a Thousand Words: Absolute and Relative Decoding of Nonlinguistic Affect Vocalizations," Emotion, vol. 9, no. 3, pp. 293-305, 2009.
[18] V. Contreras Flores, Anatomical Basis of Facial Expression Learning Tool, 2005. [Online]. Available: http://www.artnatomia.net/uk/artnatomiaIng.html
