
Facial Expression Based Satisfaction Index for Empathic Buildings

Fahad Sohrab
Faculty of Information Technology and Communication Sciences, Tampere University, Tampere, Finland
fahad.sohrab@tuni.fi

Jenni Raitoharju
Programme for Environmental Information, Finnish Environment Institute, Jyväskylä, Finland
jenni.raitoharju@environment.fi

Moncef Gabbouj
Faculty of Information Technology and Communication Sciences, Tampere University, Tampere, Finland
moncef.gabbouj@tuni.fi

Figure 1: Facial expression recognition system for well-being analysis

ABSTRACT

In this work, we examine the suitability of automatic facial expression recognition for satisfaction analysis in an Empathic Building environment. We use machine learning based facial expression recognition at the working stations to integrate an online satisfaction index into the Empathic Building platform. To analyze the suitability of facial expression recognition to reflect longer-term satisfaction, we examine the changes and trends in the happiness curves of our test users. We also correlate the happiness curves with the temperature, humidity, and light intensity of the test users' local city (Tampere, Finland). The results indicate that the proposed analysis indeed shows some trends that may be used for long-term satisfaction analysis in different kinds of intelligent buildings.

CCS CONCEPTS

• Human-centered computing → Collaborative and social computing theory, concepts and paradigms; • Applied computing → Sociology.

KEYWORDS

Empathic Building; Facial Expressions; Satisfaction Index; Machine Learning

ACM Reference Format:

Fahad Sohrab, Jenni Raitoharju, and Moncef Gabbouj. 2020. Facial Expression Based Satisfaction Index for Empathic Buildings. In Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers (UbiComp/ISWC '20 Adjunct), September 12–16, 2020, Virtual Event, Mexico. ACM, New York, NY, USA, 4 pages. https://doi.org/10.1145/3410530.3414443

1 INTRODUCTION

Employers increasingly understand that their employees' well-being is an important asset that can help to ensure their long-term motivation and productivity. There are multiple ways to increase well-being at work, one of them being a pleasant working environment. Intelligent buildings are viewed as entities that provide a responsive, effective, and supportive environment to the users of the building, within which a specific objective is achieved [5]. The approaches to defining intelligent buildings can be categorized into performance-based, services-based, and system-based definitions [19]. Performance-based definitions cover buildings that provide efficient use of resources and utilize the facilities efficiently at minimum cost. Services-based definitions describe intelligent buildings from the service-providing perspective. System-based definitions solely focus on the technology and systems that intelligent buildings should include. The common goal of all approaches is to increase user satisfaction. Thus, it is important to understand which changes or services are indeed contributing to increased satisfaction levels and to follow general satisfaction in intelligent buildings.

Traditionally, long-term satisfaction is judged by carrying out different surveys [16]. These surveys can be tedious to design and prone to human error. Repeated surveys may even be seen as a factor decreasing work satisfaction. Furthermore, surveys cannot provide immediate feedback on user satisfaction that would allow addressing potential problems (e.g., a too noisy environment) without delay. Therefore, different approaches for automatic satisfaction and emotion recognition have recently gathered much attention. There have been studies to monitor human emotional states automatically from different cues, for example, from speech during a conversation [4]. More sophisticated approaches for real-time mood monitoring involve using a wearable sensor such as Moodmetric rings¹. Moodmetric rings use electrodermal activity to measure mood changes in real time [8]. The goal is to build a real-time emotion detection system for analyzing social mood in a particular environment, so that the services provided in the environment can be adapted and adjusted by reacting to different moods of users [13, 20].

One of the leading indicators of emotions is facial expression. In human-to-human interaction, the communication contributed by verbal cues, vocal cues, and facial expressions is 7, 38, and 55 percent, respectively [11]. Facial expressions allow people to express emotional states in day-to-day routines and convey emotional experiences [14]. The six basic emotions are happiness (including amusement), surprise, disgust, sadness, anger, and fear [6]. In this paper, we analyze the suitability of automatic facial expression analysis carried out on office computers for satisfaction monitoring.

To this end, we propose a simple system that uses the amount of recorded happy expressions as a measure of satisfaction. The expression of happiness in facial expressions is a smile or, more specifically, a Duchenne smile, which can be automatically detected by deploying an automatic emotion recognition system. Figure 1 depicts the overall idea of the emotion recognition system based on visual cues for the Empathic Building² platform. The Empathic Building platform focuses on improving employee well-being and happiness by providing solutions/answers to end-user problems. It offers a view of data coming from many different sensors on a map of the premises in real time.

While several studies have highlighted the role of well-being and satisfaction as a factor that influences people's evaluation of happiness [2, 10], it is not straightforward to use happy expressions as a measure of satisfaction with the environment. Happiness and satisfaction are often used interchangeably in the literature for well-being analysis; however, happiness is the emotional component, while satisfaction is the cognitive component [12]. Furthermore, emotions are not necessarily shown as observable expressions [9], and it is likely that changes in the office conditions generate only mild emotions compared to other factors such as personal interactions. It is evident that a facial expression at a particular moment cannot be assumed to be a measure of overall satisfaction, but we analyze whether there are observable patterns in the shown expressions related to the time of the day. As it is known that the human mood can be affected by direct sunlight or a view of indirect sunlight from a window [1], we also analyze whether we can see any correlation between the observed expressions and the outside weather.

The rest of the paper is organized as follows. In Section 2, we explain the basic components of our approach for measuring the satisfaction index based on automatic facial expression recognition. In Section 3, we describe our experimental data collection process and carry out the analysis. In Section 4, conclusions are drawn.

¹ https://moodmetric.com/

² https://empathicbuilding.com/

2 SYSTEM DESCRIPTION

The main components of the proposed system for satisfaction monitoring are the Empathic Building platform and a facial expression recognition algorithm. Our goal is to have both a real-time satisfaction index displayed on the Empathic Building platform and a long-term satisfaction analysis. We use webcams placed at the working stations of different users for expression recognition in real time. Figure 2 shows the satisfaction indices of different users displayed at their corresponding positions on a virtual map of the premises in the Empathic Building platform.

Figure 2: A screenshot of the Empathic Building platform with real-time satisfaction indices displayed in blue circular bubble icons (on a scale of 1-100).

The system for facial expression recognition has two primary steps. First, the face of the subject is detected in the image, and relevant features are extracted. In the second step, a classifier is used to predict the emotions based on the extracted features. For face detection, we use the Haar cascade frontal face detection algorithm implemented in the OpenCV library [18]. In the Haar cascade frontal face detection algorithm, Haar-like features are used to encode the local appearance of objects; in our case, the objects are faces. If a face is detected, it is further processed; otherwise, the captured image is discarded. To speed up the face detection, the idea of cascade classifiers is applied to each image. After the face is detected, we apply a pre-trained neural network for emotion recognition to identify the emotions of the detected face. The pre-trained model used in this work is MiniXception [3], which is trained on the facial expression recognition (FER) dataset [7]. The FER dataset consists of 48×48 pixel grayscale face images. The dataset contains 28,709 training images, 3,589 validation images, and 3,589 test images from seven categories (0=Angry, 1=Disgust, 2=Fear, 3=Happy, 4=Sad, 5=Surprise, 6=Neutral). In this work, we use only the happy emotion as a satisfaction index to display on the Empathic Building platform and for analysis over a longer period. As a measure of the happiness level, we scale the output of the softmax layer, i.e., the output layer of MiniXception, for the happy class to the range of 1-100. All the images are discarded as soon as the happiness level is collected, so the system does not collect any privacy-sensitive information.
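To make the pipeline above concrete, the following is a minimal sketch of per-frame happiness scoring, assuming OpenCV's bundled Haar cascade and a locally available Keras export of a MiniXception-style model trained on FER. The file name mini_xception_fer.h5, the preprocessing, and the scaling details are assumptions for illustration, not the authors' published code.

```python
# Hypothetical sketch of the per-frame happiness scoring described above.
# Assumes a Keras/TensorFlow export of a MiniXception-style FER model is
# available locally as 'mini_xception_fer.h5' (the path is an assumption).
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Haar cascade frontal face detector shipped with OpenCV
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
emotion_model = load_model("mini_xception_fer.h5")  # assumed file name
HAPPY_INDEX = 3  # FER label order: angry, disgust, fear, happy, sad, surprise, neutral


def happiness_score(frame_bgr):
    """Return a 1-100 happiness score for the first detected face, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return None  # no frontal face detected -> the frame is discarded
    x, y, w, h = faces[0]
    # Crop, resize to the 48x48 grayscale input; the normalization here is a
    # guess and must match how the model was actually trained.
    face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
    probs = emotion_model.predict(face[np.newaxis, :, :, np.newaxis], verbose=0)[0]
    return 1 + 99 * float(probs[HAPPY_INDEX])  # map happy-class softmax output to 1-100
```

In the deployed system, a score produced in this way would be logged with a timestamp and pushed to the Empathic Building platform as the user's current satisfaction index.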

Figure 3: The averaged and scaled daily satisfaction curves for persons A, B, C, D, and E.

Whenever the expression recognition script is executed, it runs a series of checks necessary for record management, such as creating a new record every day and saving the data accordingly in a series with timestamps. The developed system also provides the option of recording different log events, such as lunch, coffee, or meeting, to explain absences from the working space. This diary of recorded events can be useful for evaluating the prior and post-event satisfaction levels for a particular subject or, in general, for a group of people.
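As a rough illustration of the record management described above, one daily file with timestamped rows and optional diary events could be maintained as follows; the CSV layout, file naming, and event labels are assumptions, not the authors' implementation.

```python
# Hypothetical record-management sketch: one CSV file per day, timestamped rows.
import csv
import datetime
import pathlib

RECORD_DIR = pathlib.Path("satisfaction_records")  # assumed storage location


def append_record(happiness=None, event=""):
    """Append a timestamped happiness value and/or diary event to today's record."""
    RECORD_DIR.mkdir(exist_ok=True)
    path = RECORD_DIR / f"{datetime.date.today().isoformat()}.csv"
    new_file = not path.exists()  # a new record is created on the first call each day
    with path.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["timestamp", "happiness", "event"])
        writer.writerow([datetime.datetime.now().isoformat(), happiness, event])


# Usage: append_record(72.5) for a frame score, append_record(event="lunch") for a diary entry.
```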

The raw data collected by the system are at the frame-rate level of the video, i.e., whenever a frame with a frontal face is detected, the corresponding satisfaction index is saved with that particular timestamp. If no frontal face is detected, the emotion recognition system is not triggered, and hence no information is collected. For long-term satisfaction analysis, we systematically process the collected expression data to obtain the satisfaction index for a particular time period. To analyze the typical daily satisfaction curves of different persons, we first find the average happiness value of each minute of each day and then average it with the corresponding values (e.g., 10:00-10:01) of the other days. After finding the mean values of every minute averaged over all corresponding daily values, we apply the Savitzky-Golay filter over the data [15]. We use a quadratic polynomial as the model in the Savitzky-Golay method. We keep the span of the smoothing filter at 30 percent of the total collected data. After smoothing, we scale the smoothed values $x_i$ between 0 and 1 as $\tilde{x}_i = \frac{x_i - x_{min}}{x_{max} - x_{min}}$, where $\tilde{x}_i$ is the scaled value, $x_{min}$ is the minimum, and $x_{max}$ the maximum obtained smoothed value for the corresponding subject.
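A possible sketch of this post-processing, assuming the per-frame scores for one subject are available as a pandas DataFrame, is given below. The paper specifies the quadratic polynomial and the 30-percent span; the exact window computation and the data layout here are assumptions.

```python
# Rough sketch of the daily-curve post-processing: per-minute averages across days,
# quadratic Savitzky-Golay smoothing with a ~30% span, then min-max scaling to [0, 1].
import pandas as pd
from scipy.signal import savgol_filter


def daily_satisfaction_curve(df):
    """df: columns ['timestamp', 'happiness'] with per-frame scores for one subject."""
    ts = pd.to_datetime(df["timestamp"])
    keys = [ts.dt.date.rename("date"), ts.dt.hour.rename("hour"), ts.dt.minute.rename("minute")]
    per_minute = df["happiness"].groupby(keys).mean()
    # Average each minute-of-day (e.g. 10:00-10:01) over all recorded days.
    curve = per_minute.groupby(level=["hour", "minute"]).mean().to_numpy()

    window = max(5, int(0.3 * len(curve)) | 1)  # ~30% span, forced to an odd length
    smoothed = savgol_filter(curve, window_length=window, polyorder=2)

    # Min-max scale the smoothed values to [0, 1] for this subject.
    return (smoothed - smoothed.min()) / (smoothed.max() - smoothed.min())
```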

3 DATA COLLECTION AND ANALYSIS

Real-time data were collected for five volunteers at the Hervanta campus of Tampere University, Finland. The cameras were placed on their corresponding working machines in the office environment. The recordings were carried out during daytime in good lighting conditions. Each volunteer was asked to start the expression recognition system upon daily arrival at the working desk. The collected measurements for persons A, B, C, D, and E span around 14, 36, 9, 11, and 5 days, respectively. The recorded data contain only the predicted emotions for the subject under observation and do not contain any personal information or images. The diary events were not considered in this study.

The obtained average daily satisfaction curves for the different test persons are plotted in Figure 3. The figure shows that all the subjects follow a certain personal daily pattern, where, e.g., (assumed) lunch times affect the satisfaction level. This leads us to assume that the proposed approach for measuring satisfaction from facial expressions indeed shows some regularity, and that changes in such patterns for multiple people may later be used to analyze the long-term effects of changes in the working environment.

Figure 4: The averaged and scaled daily satisfaction levels along with the averaged and scaled measurements of temperature, humidity, and light for corresponding days for persons A and B.

We also computed the daily average happiness levels for different persons and compared them with the averaged temperature, humidity, and light for that particular day. To obtain the daily averages, we smoothed the emotion values using the Savitzky-Golay filter as explained above and then took the average over the whole day. We retrieved the temperature, humidity, and light measurements from Thingspeak³, which is an online open-source platform for saving and accessing data from many different sensors. From Thingspeak, we retrieved the sensor data from the Ursa Astronomical Association's publicly available channel (channel ID: 37245), which gives the local weather information [17]. The averaged daily satisfaction levels along with the averaged measurements of temperature, humidity, and light are plotted in Figure 4 for persons A and B. Due to space limitations, we show these plots only for the persons with the highest amount of collected data. All data are scaled between 0 and 1 for clarity of analysis. We can see some correlation between the satisfaction levels and the amount of light (sunshine), especially for the first six days for subject A. Nevertheless, it is also clear that weather conditions alone cannot explain the changes in satisfaction. This is natural and must be taken into account if using the proposed approach for long-term satisfaction analysis in an office environment.

³ https://thingspeak.com/
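For reference, daily averages from a public ThingSpeak channel can be retrieved through its REST read API roughly as sketched below; the mapping of channel fields to temperature, humidity, and light is an assumption and must be checked against the channel metadata before correlating with the satisfaction curves.

```python
# Hypothetical sketch: fetch a public ThingSpeak channel feed and compute daily means.
# Which fieldN holds temperature, humidity, or light is an assumption; check the channel.
import pandas as pd
import requests

CHANNEL_ID = 37245  # Ursa Astronomical Association weather channel cited in the paper


def daily_weather_means(results=8000):
    url = f"https://api.thingspeak.com/channels/{CHANNEL_ID}/feeds.json"
    feeds = requests.get(url, params={"results": results}, timeout=30).json()["feeds"]
    df = pd.DataFrame(feeds)
    df["created_at"] = pd.to_datetime(df["created_at"])
    fields = [c for c in df.columns if c.startswith("field")]
    df[fields] = df[fields].apply(pd.to_numeric, errors="coerce")
    # Average every available field per calendar day.
    return df.groupby(df["created_at"].dt.date)[fields].mean()
```

The resulting daily means can then be min-max scaled in the same way as the satisfaction data and compared against the per-subject daily averages, for example with a simple correlation coefficient.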

4 CONCLUSION

In this work, we implemented a real-time facial expression recognition system for the Empathic Building platform. We analyzed the daily satisfaction curves of our test subjects and observed that there are personal patterns. We also observed some level of correlation between the satisfaction levels and the amount of sunshine. These observations lead us to assume that the proposed approach has potential to be used for long-term, large-scale satisfaction surveys in intelligent buildings, but it must be taken into account that the facial expression of an individual person at a certain time is not indicative on its own; instead, we need to focus on collective long-term curves. With the Empathic Building platform, this analysis can also be connected with specific zones in the building.

ACKNOWLEDGMENTS

TietoEVRY and Business Finland supported this work for the Virpa-D and CVDI AMALIA projects. We want to thank all the technical staff of the Empathic Building platform and extend our sincere gratitude to Pauli Ervi (Dead Set Bit), Tomi Teikko, and Matti Vakkuri (TietoEVRY) for the technical discussions and financial support.

REFERENCES

[1] Mihyang An, Stephen M Colarelli, Kimberly O'Brien, and Melanie E Boyajian. 2016. Why we need more nature at work: Effects of natural elements and sunlight on employee mental health and work attitudes. PloS One 11, 5 (2016), 1–17.
[2] Metin Argan, Mehpare Tokay Argan, and Mehmet Tahir Dursun. 2018. Examining relationships among well-being, leisure satisfaction, life satisfaction, and happiness. International Journal of Medical Research & Health Sciences 7, 4 (2018), 49–59.
[3] Octavio Arriaga, Matias Valdenegro-Toro, and Paul Plöger. 2017. Real-time convolutional neural networks for emotion and gender classification. (2017). arXiv:1710.07557
[4] Ling Cen, Zhu Liang Yu, and Wee Ser. 2015. Maximum a Posteriori Based Fusion Method for Speech Emotion Recognition. In Emotion Recognition: A Pattern Analysis Approach. John Wiley and Sons, Inc, New Jersey, USA.
[5] T Derek and J Clements-Croome. 1997. What do we mean by intelligent buildings? Automation in Construction 6, 5-6 (1997), 395–400.
[6] Paul Ekman and Dacher Keltner. 1997. Universal facial expressions of emotion. In U. Segerstrale and P. Molnar (Eds.), Nonverbal Communication: Where Nature Meets Culture (1997), 27–46.
[7] Ian J Goodfellow, Dumitru Erhan, Pierre Luc Carrier, Aaron Courville, Mehdi Mirza, Ben Hamner, Will Cukierski, Yichuan Tang, David Thaler, Dong-Hyun Lee, et al. 2015. Challenges in representation learning: A report on three machine learning contests. Neural Networks 64 (2015), 59–63.
[8] Eija Halkola, Lauri Lovén, Marta Cortes, Ekaterina Gilman, and Susanna Pirttikangas. 2019. Towards measuring well-being in smart environments. In Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers. Association for Computing Machinery, New York, United States, 1166–1169.
[9] Carroll E. Izard and George Mandler. 1997. Emotions and facial expressions: A perspective from Differential Emotions Theory. Cambridge University Press, Cambridge, UK, 57–77. https://doi.org/10.1017/CBO9780511659911.005
[10] Jin-Ding Lin, Pei-Ying Lin, and Chia-Ling Wu. 2010. Wellbeing perception of institutional caregivers working for people with disabilities: Use of Subjective Happiness Scale and Satisfaction with Life Scale analyses. Research in Developmental Disabilities 31, 5 (2010), 1083–1090.
[11] Albert Mehrabian. 2008. Communication without words. Communication Theory 6 (2008), 193–200.
[12] Amado Peiro. 2006. Happiness, satisfaction and socio-economic conditions: Some international evidence. The Journal of Socio-Economics 35, 2 (2006), 348–365.
[13] Anjali Rai and Deepak Tandon. 2018. Emotional Intelligence & Job Satisfaction in IT Industry Employee. Asian Journal of Research in Social Sciences and Humanities 8, 5 (2018), 139–147.
[14] Matthew S Ratliff and Eric Patterson. 2008. Emotion recognition using facial expressions with active appearance models. In Proceeding Human Computer Interaction ACTA. Citeseer, Press, CA, United States, 138–143.
[15] Ronald W Schafer. 2011. What is a Savitzky-Golay filter? [lecture notes]. IEEE Signal Processing Magazine 28, 4 (2011), 111–117.
[16] Feryal Subaşı and Osman Hayran. 2005. Evaluation of life satisfaction index of the elderly people living in nursing homes. Archives of Gerontology and Geriatrics 41, 1 (2005), 23–29.
[17] URSA. 2020. Tähtitieteellinen yhdistys Ursa. Ursa Astronomical Association. Retrieved June 22, 2020 from https://www.ursa.fi/
[18] Paul Viola, Michael Jones, et al. 2001. Rapid object detection using a boosted cascade of simple features. CVPR (1) 1, 511-518 (2001), 3.
[19] Shengwei Wang. 2009. Intelligent Buildings and Building Automation. Routledge, London, UK.
[20] Mingmin Zhao, Fadel Adib, and Dina Katabi. 2016. Emotion recognition using wireless signals. In Proceedings of the 22nd Annual International Conference on Mobile Computing and Networking (New York City, New York). Association for Computing Machinery, New York, NY, USA, 95–108.
