
Sunil Khan Bhadur

Master's thesis

Detection of Facial Emotions while reading online text

School of Computing, Computer Science

May 2021

UNIVERSITY OF EASTERN FINLAND, Faculty of Science and Forestry, Joensuu

School of Computing, Computer Science

Sunil Khan Bhadur: Demonstration of Facial Emotions in an Online Learning Environment
Master's thesis, 54 p.

Supervisor of the Master's thesis: Ph.D. Markku Tukiainen
May 2021

Abstract: Emotion detection while reading online text is an innovative way to collect feedback from real-time facial expressions. Facial expressions are a form of non-verbal communication that humans use to convey their feelings to others. The primary purpose of this study is to present methods to detect and analyze human facial expressions while reading online text and to use those expressions for further analysis. Several trained models are available for detecting facial expressions, among which CNN-based models are popular and widely used. We developed an application using the CNN-based facial expression recognition model of an open-source library, the JavaScript face-API, and tested the application empirically to detect facial expressions. Detecting human emotions plays a crucial role when students learn through an online learning platform. This approach makes it easy to detect whether a student understands a particular topic, so it is a more innovative way of getting feedback than traditional methods such as survey forms.

Keywords: Facial expression, emotion detection, online text, online learning, JavaScript face API, design science research

Foreword

This thesis was done at the School of Computing, University of Eastern Finland during spring 2021.

I want to extend my gratitude to my teachers, parents, and friends who encouraged me and helped me complete this thesis. I would not have been able to do it without the encouragement of all those people.

First of all, I would like to thank our Lord Jesus Christ for every success of my life, because without Him I am not able to do anything.

I am very thankful to Prof. Markku Tukiainen for his guidance and supervision in completing my thesis. I am grateful to him for his help and care regarding my work.

I would also like to thank my family, especially my mother, for her motivation and support throughout my studies.

Lastly, I would like to say thank you to the School of Computing, University of Eastern Finland for providing me with this wonderful opportunity, and to all the professors/instructors of my courses who taught me throughout this study program.

List of abbreviations

API Application Programming Interface
AI Artificial Intelligence
ANN Artificial Neural Network
CNN Convolutional Neural Network
DNN Deep Neural Network
HCI Human-Computer Interaction
ML Machine Learning
OS Operating System
UI User Interface
URL Uniform Resource Locator
UX User Experience
VN Vision Network

Contents

1 Introduction
1.1 Motivation
1.2 Development of Online Learning system
1.3 Implementing an example emotion detection
1.4 Limitations in online learning methods
1.5 Problem Statement
1.6 Research Objectives
1.7 Research Questions
2 Literature Review
2.1 Overview of Facial expression
2.1.1 Facial Expressions concepts
2.1.2 Facial Expression Recognition
2.1.3 Technological Classification of Human Expressions
2.2 Technologies for the detection of facial expression
2.2.1 Extraction of information through facial expression
2.2.2 Expression and Challenges for detecting non-verbal Expressions
2.3 Facial Expression and emotion detection role in Education
2.3.1 Facial Expression and Virtual Classes
2.3.2 Class Environment and Facial Expression
2.4 Covid-19 and Online Learning
2.4.1 Impact of Covid-19 on Education
2.4.2 Nature of Online learning platform
2.4.3 Most Used Online Platforms
2.5 Facial Expression with Deep Learning and Computer Vision
2.5.1 Deep Learning for Facial Expression
2.5.2 Methods for Facial Expression Recognition
2.5.3 Correct identification of Emotions
2.5.4 JavaScript and Facial Manifestation
3 Development Environment and Framework
3.1 JavaScript Face-API
3.2 Customized Algorithm
3.3 Database and illustration through charts
3.4 API integration
4 Evaluation of Emotion Detection Application
4.1 Research Content and Participants
4.2 Learning Activity and Experiment Design
4.3 Data Analysis
4.3.1 First Case Analysis
4.3.2 Second Case Analysis
4.4 Results
5 Discussion
6 Conclusion and Future work
References

1 Introduction

Humans can perceive information through verbal and non-verbal channels. Verbal communication shares information using words, while facial expression is a non-verbal communication method (Al-Saqqa, 2018). Facial expressions are one of the most expressive non-verbal ways for humans to convey their emotional state (Altameem, 2020). The face is considered a multi-signal channel for the receiver and sender, capable of tremendous specificity and flexibility. Human emotion detection is critical in digital communication, HCI, intelligent robotics, and health care (Bandhakavi, 2020). Much of the research on recognizing human expressions has been performed under tightly controlled conditions, but face applications can also be constructed that operate very well in natural, real-world environments (Breuer, 2017). Facial expression recognition detects an expression and classifies it into categories such as fear, happiness, sadness, and anger. Different kinds of results can be concluded from these evaluated expressions; for example, from an expression of confusion we can deduce that a person is facing some difficulty or does not understand something (Chatterjee, 2019). The ability to perceive emotional states and feelings is a critical concept for recognizing human competence with a task (Fayolle, 2014).

Online learning plays a vital role in the educational field nowadays. With these online learning communication systems, students complete their learning tasks in a more accessible and comfortable way (Georgakopoulos, 2018). In online learning, teachers need to provide content with a high level of engagement to motivate students to learn. The content should be presented in a more accessible way so that students can easily understand it (Hasani, 2018). To obtain sound learning effects and a positive correlation among students, learning engagement is a prerequisite for academic achievement and the further development of abilities and skills.

Deep learning, part of the broader family of machine learning (ML), provides methods based on ANNs that give the right direction for understanding learning (Herzig, 2018). Learning can be supervised, semi-supervised, or unsupervised. According to research conducted by Hasani & Mahoor (2017), emotions in case of failure contain a relevant layer of synthetic-feature information that provides many cues for emotion detection (Hasani & Mahoor, 2017).

Learning facial expressions contributes a lot, and teachers can use these expressions and technology as a handy tool for guiding further teaching or assessing students' learning capability in class. There are various deep learning algorithms that can be used to detect facial expressions, such as deep learning networks (DLN) and convolutional neural networks (CNN) (Syafeeza et al., 2015). According to other studies, the CNN is one of the best methods for emotion detection, and we have used this algorithm in our study (Hasani & Mahoor, 2017). Similarly, Georgakopoulos et al. (2018) focus on describing the best model for emotion detection, and Wang et al. (2016) help to elaborate the concept of the neural network and its importance in emotion detection (Wang et al., 2016).

The CNN is a particular type of multi-layer perceptron (MLP) that exploits the expected spatial relationships between pixels. A CNN is unique because its architecture typically consists of 4-8 layers that build image-processing tasks into the design, performing segmentation, feature extraction, and classification in the processing module with minimal preprocessing of the input image (Krizhevsky et al., 2017). The first CNN architecture designed for frontal images with facial expressions was very simple. Deep learning architectures follow different models to detect emotions; for emotion detection, the CNN is used together with other models, and the models are trained on predefined datasets (Wang et al., 2016).

In the practical part of this thesis, we used the CNN-based JavaScript Face API, an open-source library available on GitHub. It contains different models, i.e., face detection, face landmark detection, face recognition, emotion recognition, age estimation, and gender recognition. We have used three of them in our project: face detection, emotion recognition, and the face expression recognition model. For face detection, face-api.js uses an SSD MobileNet v1, which is based on a CNN (Sawyer et al., 2017).
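
As an illustration, a minimal sketch of how these face-api.js models can be loaded and applied to a video element follows; it is not the thesis code, and the '/models' path and element id are assumptions.

// Minimal face-api.js sketch: load the models named above and run face
// detection with landmarks and expressions on a <video> element.
async function detectExpressions() {
  await faceapi.nets.ssdMobilenetv1.loadFromUri('/models');    // face detection
  await faceapi.nets.faceLandmark68Net.loadFromUri('/models'); // landmarks
  await faceapi.nets.faceExpressionNet.loadFromUri('/models'); // expressions

  const video = document.getElementById('video');
  const detections = await faceapi
    .detectAllFaces(video)
    .withFaceLandmarks()
    .withFaceExpressions();

  // Each detection carries one probability per expression,
  // e.g. { neutral: 0.92, happy: 0.05, surprised: 0.01, ... }
  detections.forEach(d => console.log(d.expressions));
}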

1.1 Motivation

The motivation behind this work is the ongoing shift in the education system from physical learning to online learning. This rapid growth of online learning drives the production of more online content like books, blogs, articles, and research. Teachers/reviewers need a way to detect the emotions of students/learners while they read through online platforms. In face-to-face teaching, it is straightforward to detect students' emotions by looking at them, but in online learning teachers/reviewers cannot observe the emotions of students directly (Muyldermans, 2019).

Another motivation behind this work is to create an interactive platform where teachers and students can interact with each other and teachers can learn about their students' emotions in a better way to make future decisions. By knowing their students' emotions, teachers can also help them understand a particular concept or topic.

1.2 Development of Online Learning system

Online education changes teaching and learning styles, so adopting these changes worldwide has become a topic of immense interest among researchers, administrators, educators, and businesses (Oyelere, 2018). Today online learning is advancing, and education is becoming more economical, feasible, and operational. Various online programs are offered, and online books are provided for students to learn online. A rich and diverse education system produces a substantial body of research examining various aspects of online education. Online education has a tremendous impact both inside and outside education, offering new opportunities to transform the learning delivery landscape (Passarelli, 2018). From the teachers' perspective, online education poses distinct challenges to maintaining and increasing teacher responsibility so that the learning environment remains rewarding and cultivating (Ragheb, 2019).

It is worth noting that students worldwide learn online, and online platforms have brought about the greatest revolution in learning. Everyone wants to learn something, which makes online learning a vast and open opportunity for everyone (Refat, 2019).

People often think that students cannot learn as well through online services as in physical classes because, during physical classes, teachers can see the students in front of them and easily recognize whether students understand a particular concept, or whether they are confused or disappointed by a particular topic. When learning through an online system, teachers cannot physically see their students and recognize their emotions. A system is needed that detects students' emotions while they are learning, e.g., reading online text. Through online learning, students learn more comfortably without any physical bondage (Smythe, 2021).

1.3 Implementing an example emotion detection

Various types of face detection models are implemented in different projects, which we discuss in detail. SSD MobileNet V1 implements an SSD for detecting faces; its neural network computes the location of each face in an image and returns bounding boxes together with the probability of each face. This face detector aims at accurate detection of face bounding boxes rather than low inference time. The size of this model is about 5.4 MB (Mühler, 2020).

Another model of the JavaScript face-API is the Tiny Face Detector, a real-time face detector that is performant, smaller, faster, and consumes fewer resources than the SSD face detector. This model has proved to be user-friendly and takes only 190 KB. It predicts bounding boxes that entirely cover the face, so it produces better results in combination with subsequent face landmark detection (Espinosa, 2019).
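
As an illustrative sketch, both detectors plug into the same face-api.js detection call, so an application can trade accuracy for speed; the option values below are examples, not settings from this thesis.

// Either detector can back the same detection call; both detector nets
// must be loaded beforehand. Thresholds are example values.
async function compareDetectors(input) {
  const accurate = await faceapi.detectAllFaces(
    input, new faceapi.SsdMobilenetv1Options({ minConfidence: 0.5 }));
  const fast = await faceapi.detectAllFaces(
    input, new faceapi.TinyFaceDetectorOptions({ inputSize: 320, scoreThreshold: 0.5 }));
  return { accurate, fast }; // arrays of face bounding boxes with scores
}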

MTCNN (Sathik, 2013) is an alternative to the SSD and Tiny Yolo face detectors, offering more room for configuring face points. Given an input, MTCNN applies three cascaded CNNs and simultaneously returns five face landmark points together with the bounding box and score for each face. The size of this model is 2 MB (Sathik, 2013).

Another model, the face recognition model (Klym, 2020), computes a face descriptor with a ResNet-34 architecture that describes a person's facial characteristics. This model not only describes the set of facial features of a person but can also be used for face recognition of any person: it can determine the similarity of two faces by comparing the face descriptors, computing the Euclidean distance and using a classifier of choice. The face recognition model is very lightweight and quick and achieves reasonable accuracy. This study uses the JavaScript face API, an open library on GitHub containing different models for face analysis, for example gender recognition, age estimation, emotion detection, face recognition, and face landmark detection. We use three of these models, emotion recognition, face detection, and facial expression recognition, in our project. For face detection, face-api.js uses an SSD MobileNet v1 based on a CNN (Klym, 2020).
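
A brief sketch of the descriptor comparison described above follows; the inputs are assumed to be cropped face images, and the 0.6 decision threshold is the commonly used value from the face-api.js documentation.

// Compare two faces by the Euclidean distance between their 128-value
// descriptors computed with the ResNet-34 face recognition model.
async function isSamePerson(faceImg1, faceImg2) {
  await faceapi.nets.faceRecognitionNet.loadFromUri('/models');
  const desc1 = await faceapi.computeFaceDescriptor(faceImg1);
  const desc2 = await faceapi.computeFaceDescriptor(faceImg2);
  const distance = faceapi.euclideanDistance(desc1, desc2); // smaller = more similar
  return distance < 0.6; // commonly used similarity threshold
}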

1.4 Limitations in online learning methods

The scope of emotion detection while reading online text needs to be examined. Many researchers have performed review studies regarding online education and the detection of facial expressions, using different kinds of methods to identify facial expressions, like the CNN (Sawyer et al., 2017). The limitations found in research on detecting human emotions while learning online are inaccuracy and the time consumed in detecting emotions in real time.

During the COVID-19 pandemic, most students learned through online books, so in this situation there is a great need to develop a system that detects human emotions, for example whether the reader understands the text.

1.5 Problem Statement

Online education is increasing daily, and worldwide students are moving from physical to online education. The availability of online education, books, documents, and files has made people's lives easier because people can read a particular topic more conveniently. During physical classes, teachers and professors can easily detect students' expressions by looking at their faces, but this is more difficult when students are learning online.

When students read an online text, it becomes difficult to recognize their emotions, so a system is needed that determines a student's emotions when they read an online text. The extent of prior work appears to imply that this has not been discussed, so this study aims to evaluate a system that detects the student's emotions on each page of a document and evaluates what the student feels when reading a particular text.

1.6 Research Objectives

The objective of this research is to design and develop a system that helps make the online education system handier. It is based on the following ideas:

By exploring the physical education system and the online education system, it is evaluated whether there is a need to detect human emotions in an online learning system. Hence, our first objective is to explore the concept of online learning.

Our second objective is to evaluate students' interests by detecting their expressions for particular content.

1.7 Research Questions

To achieve the above objectives and explore the problem statement, the following research questions have been formulated.

RQ1: What are the student’s emotions while learning through an online platform?

RQ2: How does the content of literature affect students' emotions while reading online text?

RQ3: How can this emotion detection application be useful for teachers to analyze students' emotions?

These research questions explore the advantages of online learning and the approaches to detecting emotions during online learning.

2 Literature Review

Achievement in digital communication leads student learning communities towards a new dimension of online learning. Virtual schools are increasing worldwide, and online education is nowadays part of education in every field (Wang et al., 2016). Munezero (2014) and Kolog (2018) explored the detection of facial expressions, using different tools across various experiments. Many researchers have developed different techniques for detecting facial expressions, and one of the best methods described is the CNN (Munezero, 2014; Jin et al., 2019). In this study, we aim to find related concepts throughout the research.

2.1 Overview of Facial expression

Facial expression is the motion of one or more positions of the muscles under the skin of the face. In this section, we review the research of different researchers regarding facial expressions and explain the main concept of facial expression. We explain facial recognition in detail, along with some case studies done on this topic by other researchers.

2.1.1 Facial Expressions concepts

According to a controversial set of theories, these muscle movements convey the emotional state of an individual to the observer (Ekman, 1999). Emotions or facial expressions are non-verbal forms of communication for conveying social information among humans. These expressions can also be found in other mammals and animal species (Dimberg, 2018). A human can produce expressions voluntarily or involuntarily, and the neural mechanisms responsible for controlling these expressions differ in each case. Ekman and Keltner (1997) argue that human emotions and facial behavior are socially conditioned and associated with emotions through culturally variable learning. Facial expression is the emotional experience of the brain and is highly involved in the recognition process (Ekman et al., 1997).

2.1.2 Facial Expression Recognition

A technology known as a facial expression recognition system has been introduced, which consists of features like face detection, facial feature extraction, and expression classification. Utilizing precisely this technology, we will try to implement a sentiment analysis tool with the ability to detect the basic universal expressions of a human: anger, surprise, fear, disgust, sadness, and happiness (Qi, 2018). Gestures and expressions of humans convey non-verbal hints that are important for interpersonal relations. These hints help the listener interpret the spoken words with the intended meaning, which is why facial expression recognition extracts and analyzes information from video and images and can deliver an unbiased, unfiltered, emotional response from the data (Dimberg, 2018).

2.1.3 Technological Classification of Human Expressions

Hammal et al. (2007) proposed a method for classifying facial expressions by analyzing facial deformations. Classification is based on a transferable belief framework related to the emotions disgust, anger, fear, sadness, surprise, and joy, plus a neutral state. This classifier evaluates the data with a contour segmentation technique that extracts human emotion through facial features like the eyebrows, eyes, and mouth and derives coefficients of the face image. These characteristics lead to a decision-based system with rules that rely on data fusion and the transferable belief model (TBM) (Hammal, 2007).

Krithika and Priya (2020) proposed a graph-based approach to classifying facial expressions, known as "graph-based feature extraction and hybrid classification". Through this graphical approach, human facial expressions are extracted very effectively. First, the face is identified in the image using the Viola-Jones algorithm (Krithika & Priya, 2020), and then facial parts like the left eye, right eye, nose, and mouth are extracted from the image. Edge-based invariant features are computed for the extracted facial parts, and these dimensions are optimized using a weighted visibility graph, through which facial expressions are extracted and recognized. Expressions are then recognized by a network classifier based on a self-organizing map.

2.2 Technologies for the detection of facial expression

Many researchers have introduced different kinds of technologies for the detection of facial expressions. This section covers research done by different researchers on technologies that can be used for the detection of facial expressions.

2.2.1 Extraction of information through facial expression

Yang et al. (2018) performed a study on the combination of the expressive components of a person. The purpose is to extract information through a de-expression learning process called De-expression Residue Learning. The generative model produces a natural face image from any input face image, hence "de-expression": the expression is filtered out by the generative model. The deposition of the given natural face image remains in the intermediate layers of the generative model, and this residue is essential because it contains the expressive components produced for any input facial expression image (Yang, 2018).

Virtual learning uses audio, video, slides, presentations, and text to simulate physical learning and the learning environment as closely as possible. The virtual environment is heavily used for pedagogical purposes such as distance learning. Virtual schools and online learning are growing daily, benefiting from new computer technologies considered essential for how future generations learn (Yang et al., 2018). Learning technique researchers say that virtual education offers instruction in a learning environment where a teacher interacts with students across different time zones, environments, and spaces. Teachers provide course management applications using the internet, student communication, conferencing, and multimedia (Zhang, 2020). Compared with physical learning in classrooms, online virtual learning provides flexibility and adaptability. The internet provides students with lots of information at the same time while they learn virtually from a single place, without going anywhere to get books or other materials (Gleaves, 2020).

2.2.2 Expression and Challenges for detecting non-verbal Expressions

Ekman (2006), a research professor of facial expression detection, believes that most facial expressions are blends of a number of feelings, so recognizing these blended expressions is the most challenging part of detecting non-verbal expressions. A clear understanding of these expressions has proved to be very challenging. However, emotional expressions can be used as signals: winking an eye is a signal of approval, and sticking the tongue out is a signal of playful distaste. In this way, recognizing expressions becomes more straightforward to comprehend (Ekman, 2006).

Díaz-Aristizabal et al. (2019) state that facial expression is the least controversial of all non-verbal communication channels; hence emotional expressions are considered the most studied group of gestures. Therefore, attention needs to be focused on facial expressions rather than other parts of the body, which is widely accepted. Facial features represent many things through expressions, like shock, surprise, anger, happiness, and confusion (Díaz-Aristizabal et al., 2019).

Fig. 1. Facial expression classification (image by Heechul Jung, 19 Oct 2015; retrieved from https://images.app.goo.gl/JNwKPzBwRk6WSJu47)

In 2020, Wikipedia states that emotional expressions result from the motions of muscles, and the face consists of many muscles that combine to produce expressions (Bandhakavi, 2020). The movement of these muscles conveys many emotions from individuals (see Fig. 1). Emotional expressions are thus a primary part of non-verbal communication, conveying social information from human to human. Díaz-Aristizabal states that human face muscles produce many expressions; consider how these muscles produce expressions, for example when a human is surprised there is a wide range of muscle movement and an emotional expression is produced (Díaz-Aristizabal, 2019). As we can see in Fig. 1, through the movement of facial muscles we can recognize emotions and also distinguish them based on that movement. In learning situations, such as teacher-student interaction, emotions play a most crucial role. Ragheb (2019) and Shivhare (2017) describe an attention-based model for detecting emotions and their contrast; by understanding students' interest, teachers can easily recognize whether students understand, are satisfied with the teaching method, or need more attention (Refat, 2019; Shivhare, 2017). Emotion detection is most useful to teachers because it makes it easier to understand students while teaching them in the classroom, eventually leading to better learning outcomes.

2.3 Facial Expression and emotion detection role in Education

As the world we live in changes to embrace technology, how and what we teach in our education system will likewise be reshaped to stay up to date with the latest technologies. The education system is moving fast from a physical environment to a virtual environment, so facial expression and emotion detection can play a vital role in better understanding students.

2.3.1 Facial Expression and Virtual Classes

Students learn in a physical classroom with face-to-face communication with teachers, and students easily engage with one another. The purpose of virtual classes is to implement regular online courses and chat with classmates using classroom conferences or video conferencing. In business virtual learning, the most crucial role, referred to as virtual communication, is teaching in a unique virtual environment to deliver a business presentation (Sawyer et al., 2017). We know that humans unconsciously and consciously receive facial clues or signals; in a physical environment, students get various facial clues in the classroom for two primary reasons (Sawyer et al., 2017).

According to Sarianidi, facial expressions have minor proportions of the atomic movements that facial features can produce (Kawulok, 2016). The CNN method used for recognizing emotions turned out to work even on small and noisy face images: the detected facial image is 20 by 20 pixels with a classifier, and the CNN provides an end-to-end expression recognition system (Muyldermans, 2019).

2.3.2 Class Environment and Facial Expression

In a class environment, teacher-to-student and student-to-student interaction is vital, so facial expression is powerful in communication. The face is rich in emotional expressions, representing much information about individual identity, mental state, and mood. Four studies are evaluated which reveal that facial expression is the most prominent and expressive channel for displaying information (Sawyer et al., 2017). Next to words, recognizing facial expressions is the primary source for determining the internal feelings of humans. During lectures, students use facial expressions to understand one another. A study reveals that online facial expression recognition is necessary for staying motivated and interested during lessons and lectures (Kawulok, 2016). Thus, when a lecturer lectures, recognition of students' facial expressions is the most valuable source of helpful feedback. Sathik (2013) helps to explain facial expression recognition and says that when lecturers detect a student's facial expression, they can recognize whether they have to slow down their teaching material or continue (Zen et al., 2016). In optimizing students' learning behavior, teachers' primary strategy is to sense the students' state of mind, which changes continuously, so there must be a reasonable resource through which the observer detects students' facial expressions, movements, and actions. This strategy helps to understand students' weaknesses and strengths to adapt the teaching to suit their learning (Yu, 2015).

2.4 Covid-19 and Online Learning

Currently, we are facing the very dangerous COVID-19 pandemic, which impacts almost every field of life. This pandemic also impacts our education systems and has exposed many inadequacies in them. To decrease the spread of COVID-19, almost every government in the world closed its educational institutes. At that point, another way to provide education to our students was needed, because education is no exception. So in this section, we review the impact of COVID-19 on the education system and the benefits of online learning during this pandemic.

2.4.1 Impact of Covid-19 on Education

The World Health Organization declared COVID-19 a pandemic, which also imposed a contemporary threat on people. The pandemic impacted several activities in all fields of the world and forced a global shutdown of every activity, including educational activities (Mukhtar & Javed, 2020). Due to the pandemic, the education system is susceptible to external changes, and there is a digital transformation with logistical challenges and attitudinal modifications. COVID-19 affected the educational system in the following ways:

Pandemic anxieties have a negative impact on students' academic performance.

Students' academic performance is affected by economic, resource, and racial differences.

Instructors are not ready to deliver high-quality instruction remotely.

Online learning fully depends on technological devices with the internet, on instructors, and on students. During online learning, students and teachers sometimes do not understand each other properly, which impacts students' academic records (Verawardina et al., 2020).

2.4.2 Nature of Online Learning Platform

COVID-19 resulted in a shutdown of schools around the world; over 1.3 billion children were out of their physical classes, so education changed dramatically with the distinctive rise of online learning, with teaching undertaken remotely on digital platforms. Online learning completely depends on technological devices, which provide flexibility for students to learn anywhere in the world. The current increase in its adoption in educational institutes directs their actions towards aligning both local and global practices with policies to overcome the COVID-19 pandemic (Sun et al., 2020). The pandemic quickly drove a digital transformation of educational activities, but alongside the many benefits, challenges arise on online learning platforms. During online learning, students have to learn using online materials like web pages, PDF, and Word documents, where it is difficult for teachers to interact with students properly and recognize whether they understand the lecture or not. During online reading, teachers and students do not interact with one another or recognize each other's expressions, which is only possible when they interact physically, so a platform is needed that recognizes students' expressions and evaluates how they feel after reading a particular text. Such a platform helps teachers a lot in understanding whether their students understand the text (Verma & Prakash, 2020).

2.4.3 Most Used Online Platforms

E-learning platforms are offering great access to services for online interaction in response to significant demand, like the Bangalore-based education technology and tutoring firm BYJU'S, now a highly valued tech company. According to its chief operating officer, BYJU'S observed a 200% increase in the number of students using its product. Tencent Classroom has been used extensively since mid-February, when the Chinese government instructed its students to study through online platforms, resulting in the largest online movement in the history of education, with 81% of K-12 students attending classes through Tencent (Johnson & Maitland, 2020).

Some companies provide bolstering capabilities for teachers and students. Lark, a Singapore-based product developed by ByteDance, is an internet tool that, to meet potential growth, offers teachers and students unlimited video conferencing time, real-time project editing, auto-translation, smart calendar scheduling, and other features. Lark improved its infrastructure and engineering capabilities to make it more reliable and strengthen connectivity. Alibaba's distance learning solution DingTalk also plays a great role: to prepare for the influx and support the large-scale network of remote work, the platform tapped Alibaba Cloud, which deployed more than 10,000 new cloud servers in 2 hours, setting a new record for rapid capacity expansion (Chakraborty & Maity, 2020). Some schools implement unique partnerships, offering educational broadcasts on different channels with a focus on age groups and digital options. Media outlets like the BBC power virtual learning: Bitesize Daily offers a 14-week curriculum based on kids' learning across the UK.

Nowadays, professors give lectures online through various platforms like video conferencing software, Google Meet, Zoom, and other social media to teach particular courses to their students (Nawrot & Doucet, 2014). Online platforms like Blackboard and Google Classroom allow them to share notes and other resources through multimedia with their students. These online learning platforms allow students to share their assignments, take quizzes, and give presentations, and allow professors to keep track of students' progress. Video conferencing tools like Zoom, Google Meet, and Microsoft Teams help in organizing online lectures and discussions for students. These online platforms provide slide shows and online chat as well. Many universities share their resources and course materials through their websites and learning management systems. To teach science courses, many professors use virtual laboratories, which allow students to simulate the experiments of their related coursework. Such tools are used for data visualization and simulation.

Rural areas often do not have adequate access to information and communication technology, so rural people face a lot of difficulties in attending online classes. While the pandemic does not allow students to get together in the classroom, online learning remains feasible to use and can make the class more effective (Abuhassna et al., 2020).

2.5 Facial Expression with Deep Learning and Computer Vision

Deep learning plays a vital role in facial expression recognition (FER). It provides different and popular methods for facial expression recognition. Different types of algorithms, such as the CNN, are used to develop FER systems. In this section, we review and analyze those deep learning methods.

2.5.1 Deep Learning for Facial Expression

Emotions, and muscle movements like wrinkling, curling lips, rolling eyes, and eyebrow-raising, are part of facial expressions. According to medical studies, when people feel uncomfortable they lower their eyebrows, shrink their brows, get wrinkles in the vertical and horizontal directions, and take time to maintain eye contact (Sathik, 2013). To detect these expressions correctly, humans have to be familiar with the non-verbal types of expressions that people/students send. Studies show that students' emotional states expressed by human behavior can be detected automatically, so capturing them through facial features like the forehead, mouth, nose, and eyes is essential, like how people act in front of others (Kawulok, 2016). The task of the spontaneous facial expression category is the extraction of emotions with their underlying emotional states from these features. Some related studies are evaluated below, in which researchers present their views regarding facial expression and emotion detection strategies.

2.5.2 Methods for Facial Expression Recognition

Ghosh and Bandyopadhyay (2015) say that facial expression plays a major role in face recognition and image processing as a human-machine interface technique. Several techniques are used for the selection of facial features, like distance calculation among face components, principal component analysis, and template matching. The algorithm is designed as simple template matching based on a facial feature selection technique that detects the facial expression from the distances among facial features, using datasets. Facial detection methods are divided into two primary techniques: view-based methods and feature-based methods. A global filter, known as the appearance-based approach, is used to detect facial expressions. Principal component analysis is used for feature size reduction (Ghosh, 2015).

A neural network is considered deep learning when, in addition to the input and output layers, hidden middle layers are recognized. Every node is calculated from its input and output nodes with respect to the previous nodes. When the weight values can be adjusted for a specific recognition task, the network is referred to as a CNN. The CNN has a lot of computational power, calculating the weighted values of interconnected nodes, which makes efficient data movement even more important (Gleaves, 2020). These neural networks implement computer vision efficiently because they reuse many weights across the image; the advantage of the two-dimensional structure of the input is that it reduces redundant computation. Implementation of a deep neural network requires two independent phases: the first is known as the training phase and the second as the deployment phase (Hammal, 2007).

Table 1. Related work on emotion detection

Solution: Munezero, M., Montero, C. S., Sutinen, E., & Pajunen, J. (2014)
Short description: Automatic conversion of social behavior into text.
Results: Results show that emotional information has a 90% load of words with ABC classification.

Solution: Ahmed, M., Rasool, A. G., Afzal, H., & Siddiqi, I. (2017)
Short description: Gender classification, binary classification problems, and discriminating features in a vector space, including emotion feature bias.
Results: Their experiment shows that information exploitation from text reaches 80% cross-validation accuracy with a support vector machine, a favorable implication for emotional features.

Solution: Alluqmani, A., & Shamir, L. (2018)
Short description: This research explores various writing styles across author gender and text genre.
Results: Linguistic feature analysis (word categories, n-grams, style metrics) presents an exploratory study on emotions.

Solution: Gleaves, A., Walker, C., & Grey, J. (2020)
Short description: Analyzing studying manually; learning diaries are essential for getting informative feedback.
Results: Their system automatically detects emotions and changes in emotion style when a student's learning is expressed in a journal.

Solution: Pérez-Rosas, V., Kleinberg, B., Lefevre, A., & Mihalcea, R. (2017)
Short description: Detecting sentiment, feelings, and emotions requires properly differentiating subjective terms in text and understanding how these emotions relate to one another.
Results: Their findings clarify that the differences among the five subjective terms reveal significant concepts for the computational linguistics community.

Solution: Breuer, R., & Kimmel, R. (2017)
Short description: ML technology is used to detect sentiments when learning stories.
Results: School staff, students, and peers have a primary influence on student academics, so students are the class label for supervised classification.

Solution: Zhang, Y., Zhao, D., Sun, J., Zou, G., & Li, W. (2016)
Short description: An adaptive CNN (ACNN) is proposed for determining the CNN structure without comparison.
Results: Results show a better tradeoff between ACNN recognition and training time consumption.

Solution: Krumhuber, E. G., & Skora, L. (2016)
Short description: Research suggests that emotional expression is significant for character recognition but depends on a person's intentions and emotions.
Results: Results show that character recognition and emotional expression are interrelated.

Solution: Montero, C. S., Munezero, M., & Kakkonen, T. (2014, April)
Short description: Various popular deep learning methods are used to detect and classify the correct emotional state of humans; FER-style algorithms are used with a deep belief network.
Results: Results show that emotions can be detected with deep learning techniques using different algorithms.

Solution: Munezero, M., Montero, C. S., Mozgovoy, M., & Sutinen, E. (2013, November)
Short description: A model is evaluated for face pattern recognition by a computer with a target system stimulus.
Results: Results show that face patterns help to find the target emotions.

Solution: Passarelli, M., Masini, M., Bracco, F., Petrosino, M., & Chiorri, C. (2018)
Short description: A research study on the development and validation of an emotional expression recognition test.
Results: Their results show that validation of emotions is most important.

The scope of facial expression recognition is marvelous, and it is needed for student examination. In this chapter, a literature review was performed in which many researchers' studies on emotion detection and facial expression detection were examined. Many researchers apply different methods and techniques for the correct identification of emotions.

2.5.3 Correct identification of Emotions

To improve the correct identification of emotions, the CNN algorithm has been developed and used by many researchers to detect a human's emotions. By reviewing all the literature, it is found that there are still some limitations that authors do not focus on. Accuracy and the time consumed in detecting emotions in real time are points that need attention. Today, during COVID-19, many students learn through online platforms using different sources like PDF books, so given these situations, emotion detection is necessary. In the literature, no study was found that focuses on visualizing emotion detection while reading online text. Therefore, this study's focus is to cover these aspects by using the JavaScript language to develop a web-based emotion detection application.

2.5.4 JavaScript and Facial Manifestation

We use the JavaScript language because it is widely used in web development, enterprise, and Android development applications. Recently, the information technology industry has been growing rapidly, and the demand for JavaScript is growing with it, so we consider JavaScript the most suitable language for this study.

The JavaScript face API is an open library available on GitHub consisting of different kinds of models, like face recognition, face landmark detection, face expression detection, gender recognition, and age estimation. In this study, three models, named face recognition, face expression recognition, and face detection, are used. Face-api.js implements the face detector as an SSD MobileNet v1, so it might not be possible to achieve real-time expression detection with this face detector unless the web application runs on a machine with a decent built-in GPU.

3 Development Environment and Framework

The emotion detection application is a web-based application used to detect the facial expressions of a reader while reading online text through the platform. With the help of an algorithm, the facial expressions of readers are detected. The whole architecture of this application is implemented on the web platform so that it is accessible to anyone. The architecture of the application is presented in Fig. 2.

Fig. 2. Use case diagram for Emotion Detection System.

3.1 JavaScript Face-API

To detect facial emotions and facial landmarks, the JavaScript Face-API, which is implemented with CNNs, has been used. Face-api.js is built on top of TensorFlow.js and is optimized for the web browser. It is an open-source API with several features, such as face recognition, face landmark detection, face expression recognition, and age estimation and gender recognition. In this application, we have used three features of this API: face recognition, face landmark detection, and face expression recognition.
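
A minimal sketch of how the application wires the webcam to this API is shown below; the element id is an assumption, and the models are assumed to be loaded as described in Section 1.3.

// Start the webcam and log the dominant expression of the detected face.
async function startCamera() {
  const video = document.getElementById('reader-video');
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });

  const result = await faceapi
    .detectSingleFace(video) // SSD MobileNet v1 is the default detector
    .withFaceLandmarks()
    .withFaceExpressions();
  if (result) {
    // Pick the expression label with the highest probability.
    const [label, score] = Object.entries(result.expressions)
      .reduce((best, cur) => (cur[1] > best[1] ? cur : best));
    console.log('dominant expression:', label, score);
  }
}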

3.2 Customized Algorithm

To capture the facial emotions of readers during their interaction with the online text, the JavaScript face-API has been used. Another objective is to calculate the time spent on each paragraph of each page so that we can predict the reader's expression on a particular paragraph of a specific page. To achieve that objective, a customized algorithm has been implemented: when a reader opens a text in our application and starts reading, the application stores the current page number and starting time in the database. Every five seconds, the application captures the facial expression and stores it in the database, and this process goes on. When the reader jumps to the next page, the application stores the end time of the previous page and the starting time of the new page in the database. By this algorithm, we store the total time spent by the user on each page. We then calculate the estimated time spent on a single paragraph using the following formula:

Time spent on a single paragraph = time spent on the page / number of paragraphs on the page

After applying this formula, we can derive two things: first, the estimated time the reader spent on a single paragraph, and second, the facial expressions of the reader that were captured during that time slot.
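
The following sketch outlines this algorithm; the helpers saveExpression and savePageTime are hypothetical names for the persistence calls, and the video element comes from Section 3.1.

// Capture an expression every 5 seconds and close/open the page time
// interval on every page turn, as described above.
const video = document.getElementById('reader-video');
let currentPage = 1;
let pageStart = Date.now();

setInterval(async () => {
  const result = await faceapi.detectSingleFace(video).withFaceExpressions();
  if (result) saveExpression(currentPage, result.expressions, Date.now());
}, 5000);

function onPageTurn(newPage) { // call from the page-turn UI event
  const now = Date.now();
  savePageTime(currentPage, pageStart, now); // end time of the previous page
  currentPage = newPage;
  pageStart = now;                           // start time of the new page
}

// Average time per paragraph, per the formula above.
function timePerParagraph(pageTimeMs, paragraphsOnPage) {
  return pageTimeMs / paragraphsOnPage;
}

function saveExpression(page, expressions, at) { /* hypothetical: send to server */ }
function savePageTime(page, start, end) { /* hypothetical: send to server */ }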

3.3 Database and illustration through charts

We designed the database to store the captured facial expressions of the reader along with other data. For this purpose, we used the relational database management system MySQL, a popular database system, especially for web-based applications. We also developed a dashboard for reviewers/admins to see a graphical view of the statistics. On that dashboard, two types of graph, a bar graph and a pie graph, have been implemented. The teacher/reviewer can see a graphical view of the reader's facial expressions while reading any particular book or any page of that book.
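
On the server side, storing one captured expression could look like the following Node.js sketch; the mysql2 library is used here for illustration, and the table and column names are assumptions, not the actual schema.

// Node.js + mysql2 sketch: persist one captured expression row.
const mysql = require('mysql2/promise');

async function storeExpression(row) {
  const db = await mysql.createConnection({
    host: 'localhost', user: 'app', password: 'secret', database: 'emotion_app'
  });
  await db.execute(
    'INSERT INTO expressions (student, page_no, paragraph_no, expression, captured_at) ' +
    'VALUES (?, ?, ?, ?, ?)',
    [row.student, row.pageNo, row.paragraphNo, row.expression, row.capturedAt]
  );
  await db.end();
}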

3.4 API integration

Another major objective was to work out how this application can be integrated into any other online learning platform. We have designed a customized API to integrate this application with other online learning platforms to detect the facial expressions of readers.

There are many learning platforms that provide students/learners opportunities to educate themselves, and various types of data are available on those platforms, i.e., video tutorials, books, audio, etc. In the case of text reading, this application would be beneficial for those platforms to detect the facial expressions of readers. For this purpose, we have developed an API service to integrate our application with a particular platform. There are a few general integration steps for this interface (a usage sketch follows the steps below):

1. As this application is web-based, you need a laptop/computer with a camera to use its features.

2. Our application needs camera access from the user, so the user should have a webcam and also allow camera access.

3. There should be enough light available so that the camera can capture clear expressions.

4. For now, we have only integrated this API for PDF text, so the text should be in PDF format.
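
As a usage sketch, a host platform could report a captured expression to the integration API like this; the endpoint path and payload fields are assumptions for illustration.

// Hypothetical call from a host learning platform to the integration API.
async function reportExpression() {
  await fetch('/api/expressions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      studentId: 'student1',
      pageNo: 2,
      paragraphNo: 1,
      expression: 'neutral',
      capturedAt: new Date().toISOString()
    })
  });
}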

4 Evaluation of Emotion Detection Application

For the formative evaluation of the emotion detection application, ten students were recruited to test the application. The main reason for the evaluation was to learn the students' emotions while reading through an online platform and how the content of the literature affects those emotions.

For this purpose, we divided the students into two equal groups and selected two articles with different content, one for each group. The five students of the first group read a scientific article related to health care (Smythe, 2021). The other five students of the second group read a short funny story (The Play's the Thing, 2020).

Moreover, we also evaluated the students' interest in this platform, because we implemented live expression charts and a live video stream so that students can see their expressions while reading. Another purpose of our research was to find out how useful teachers/reviewers consider this application for detecting students' emotions while they read through an online platform.

4.1 Research Content and Participants

This research was conducted in the computer science department of the University of Eastern Finland. We recruited 10 participants who volunteered for the experiment. All participants were computer science students, and all of them went through the whole process of emotion detection while reading short articles through this application.

4.2 Learning Activity and Experiment Design

The experiment had different phases. First, we gave the students a short overview of the application and created login credentials for all of them. All students logged in to the emotion detection application and read the article assigned to them for this research. Our application recorded their expressions while they were reading and stored those expressions in our database. In the next phase, we used the stored expression data to plot charts and tables for data illustration and presentation.

4.3 Data Analysis

We used two different articles for the evaluation of this application so that we could evaluate how different content affects the emotions of students. We used the JavaScript Face-API to detect emotions; it uses publicly available datasets, i.e., CK+ Facial Expression, JAFFE, and KDEF. These public datasets include over 2000 images of different facial expressions, on which the API has been tested to be 96.3% accurate. As mentioned, we used two different articles for analysis, so we divided our data analysis section into two subsections to analyze them separately.

4.3.1 First Case Analysis

This first case analysis presents the facial expressions obtained from the first group of five students, who read the article by Smythe (2021). This particular article contains healthcare-related content. Students followed each step according to the given instructions and read the article. When a student allowed camera access to the application, our system detected their facial expressions and stored them in the database.

As Table 2 shows, we have the count of students' facial expressions on each paragraph of all pages, along with the time duration. This time duration is the time spent by the student on a particular paragraph, and our system recorded their expressions during that time slot. For example, in the first row of Table 2, Student1 gave 7 neutral expressions on paragraph 1 of page 2 between 21:44:47 and 21:45:12. These per-paragraph slots follow from the formula in Section 3.2: Student1 spent 100 seconds on page 2 (21:44:47 to 21:46:27), which has four paragraphs, so each paragraph is assigned a 100/4 = 25-second slot.

Table 2. Students' expressions while reading a short article (Smythe, 2021)

Students  Page No.  Paragraph No.  Expression Type  Expression Count  Time

Student1 2 1 neutral 7 21:44:47 - 21:45:12

Student1 2 2 neutral 4 21:45:12 - 21:45:37

Student1 2 3 neutral 3 21:45:37 - 21:46:02

Student1 2 4 neutral 3 21:46:02 - 21:46:27

Student1 3 1 neutral 1 21:46:49 - 21:47:14

Student1 3 2 neutral 2 21:47:14 - 21:47:39

Student1 3 3 neutral 5 21:47:39 - 21:48:04

Student1 3 4 neutral 2 21:48:04 - 21:48:29

Student1 4 1 neutral 3 21:48:40 - 21:49:01

Student1 4 2 neutral 2 21:49:01 - 21:49:22

Student1 4 2 surprised 1 21:49:01 - 21:49:22

Student1 4 3 neutral 2 21:49:22 - 21:49:43

Student1 4 4 neutral 5 21:49:43 - 21:50:04

Student2 2 1 neutral 1 13:36:35 - 13:37:13

Student2 2 1 happy 3 13:36:35 - 13:37:13

Student2 2 1 neutral 8 13:36:35 - 13:37:13

Student2 2 2 neutral 8 13:37:13 - 13:37:51

Student2 2 3 neutral 6 13:37:51 - 13:38:29

Student2 2 4 neutral 8 13:38:29 - 13:39:07

Student2 3 1 neutral 10 13:39:08 - 13:39:58

Student2 3 2 neutral 10 13:39:58 - 13:40:48

Student2 3 3 neutral 11 13:40:48 - 13:41:38

Student2 3 4 neutral 10 13:41:38 - 13:42:28

Student2 4 1 happy 1 13:42:33 - 13:43:17

Student2 4 1 neutral 9 13:42:33 - 13:43:17

Student2 4 2 happy 1 13:43:17 - 13:44:01

Student2 4 2 neutral 8 13:43:17 - 13:44:01

Student2 4 3 neutral 8 13:44:01 - 13:44:45

Student2 4 4 neutral 10 13:44:45 - 13:45:29

Student3 2 1 neutral 3 10:28:16 - 10:28:38

Student3 2 2 neutral 1 10:28:38 - 10:29:00

Student3 2 3 neutral 5 10:29:00 - 10:29:22

Student3 2 4 neutral 4 10:29:22 - 10:29:44

Student3 3 1 neutral 2 10:29:46 - 10:30:11

Student3 3 2 neutral 4 10:30:11 - 10:30:36

Student3 3 3 neutral 4 10:30:36 - 10:31:01

Student3 3 4 neutral 4 10:31:01 - 10:31:26

Student3 4 1 neutral 5 10:31:36 - 10:32:01

Student3 4 2 neutral 2 10:32:01 - 10:32:26

Student3 4 3 neutral 4 10:32:51 - 10:33:16

Student4 2 1 neutral 9 19:42:51 - 19:43:14

Student4 2 2 neutral 4 19:43:14 - 19:43:37

Student4 2 3 neutral 2 19:43:37 - 19:44:00

Student4 2 4 neutral 3 19:44:00 - 19:44:23

Student4 3 1 neutral 4 19:44:26 - 19:44:49

Student4 3 2 neutral 4 19:44:49 - 19:45:12

Student4 3 2 surprised 1 19:44:49 - 19:45:12

Student4 3 3 neutral 4 19:45:12 - 19:45:35

Student4 3 4 neutral 5 19:45:35 - 19:45:58

Student4 4 1 neutral 3 19:46:01 - 19:46:21

Student4 4 2 neutral 5 19:46:21 - 19:46:41

Student4 4 3 neutral 5 19:46:41 - 19:47:01

Student4 4 4 neutral 5 19:47:01 - 19:47:21

Student5 2 1 neutral 2 20:16:44 - 20:16:58

Student5 2 2 neutral 1 20:16:58 - 20:17:12

Student5 2 3 neutral 2 20:17:12 - 20:17:26

Student5 2 3 sad 1 20:17:12 - 20:17:26

Student5 2 4 neutral 4 20:17:26 - 20:17:40

Student5 3 1 neutral 4 20:17:45 - 20:18:02

Student5 3 2 neutral 3 20:18:02 - 20:18:19

Student5 3 3 neutral 4 20:18:19 - 20:18:36

Student5 3 4 neutral 3 20:18:36 - 20:18:53

Student5 4 1 neutral 5 20:18:55 - 20:19:18

Student5 4 2 neutral 5 20:19:18 - 20:19:41

Student5 4 3 neutral 4 20:19:41 - 20:20:04

Student5 4 4 neutral 4 20:20:04 - 20:20:27

4.3.2 Second Case Analysis

This section presents the analysis of facial expressions obtained from the second group of five students, who read the short story (The Play’s the Thing, 2020) (See Table 3.). This short story (The Play’s the Thing, 2020) contains different content from the previous article (Smythe, 2021).

As in Table 2, Table 3 lists the count of each facial expression per paragraph of each page. For example, Student1 spent the interval from 09:31:42 to 09:32:19 on paragraph 3 of page 2, during which the system recorded a total of 8 expressions: 2 happy and 6 neutral.

Table 3. Students’ expressions while reading a short story (The Play’s the Thing, 2020)

Students   Page No.   Paragraph No.   Expression Type   Expression Count   Time

Student1 2 1 happy 1 09:30:28 - 09:31:05

Student1 2 1 neutral 8 09:30:28 - 09:31:05

Student1 2 2 neutral 7 09:31:05 - 09:31:42

Student1 2 3 happy 2 09:31:42 - 09:32:19

Student1 2 3 neutral 6 09:31:42 - 09:32:19

Student1 2 4 happy 1 09:32:19 - 09:32:56

Student1 2 4 neutral 6 09:32:19 - 09:32:56

Student1 3 1 happy 2 09:32:58 - 09:33:31

Student1 3 1 neutral 5 09:32:58 - 09:33:31

Student1 3 2 happy 1 09:33:31 - 09:34:04

Student1 3 2 neutral 6 09:33:31 - 09:34:04

Student1 3 3 angry 1 09:34:04 - 09:34:37

Student1 3 3 happy 1 09:34:04 - 09:34:37

Student1 3 3 neutral 4 09:34:04 - 09:34:37


Student1 3 4 happy 1 09:34:37 - 09:35:10

Student1 3 4 neutral 6 09:34:37 - 09:35:10

Student1 4 1 neutral 6 09:35:13 - 09:35:42

Student1 4 2 happy 2 09:35:42 - 09:36:11

Student1 4 2 neutral 4 09:35:42 - 09:36:11

Student1 4 3 happy 1 09:36:11 - 09:36:40

Student1 4 3 neutral 5 09:36:11 - 09:36:40

Student1 4 4 happy 1 09:36:40 - 09:37:09

Student1 4 4 neutral 5 09:36:40 - 09:37:09

Student1 5 1 neutral 3 09:37:13 - 09:37:26

Student1 5 2 neutral 3 09:37:26 - 09:37:39

Student1 5 3 happy 1 09:37:39 - 09:37:52

Student1 5 3 neutral 1 09:37:39 - 09:37:52

Student1 5 4 neutral 3 09:37:52 - 09:38:05

Student2 2 1 neutral 5 13:29:07 - 13:29:26

Student2 2 2 happy 2 13:29:26 - 13:29:45

Student2 2 2 neutral 1 13:29:26 - 13:29:45

Student2 2 3 neutral 2 13:29:45 - 13:30:04

Student2 2 4 happy 2 13:30:04 - 13:30:23

Student2 2 4 neutral 2 13:30:04 - 13:30:23

Student2 3 1 neutral 1 13:30:28 - 13:30:41

Student2 3 1 sad 2 13:30:28 - 13:30:41

Student2 3 2 happy 1 13:30:41 - 13:30:54


Student2 3 2 neutral 1 13:30:41 - 13:30:54

Student2 3 2 sad 1 13:30:41 - 13:30:54

Student2 3 3 neutral 2 13:30:54 - 13:31:07

Student2 3 4 happy 1 13:31:07 - 13:31:20

Student2 3 4 neutral 1 13:31:07 - 13:31:20

Student2 4 1 neutral 3 13:31:28 - 13:31:38

Student2 4 2 neutral 3 13:31:38 - 13:31:48

Student2 4 3 happy 1 13:31:48 - 13:31:58

Student2 4 3 neutral 2 13:31:48 - 13:31:58

Student2 4 4 neutral 3 13:31:58 - 13:32:08

Student2 5 1 happy 1 13:32:13 - 13:32:18

Student2 5 1 neutral 1 13:32:13 - 13:32:18

Student2 5 2 happy 1 13:32:18 - 13:32:23

Student2 5 2 neutral 1 13:32:18 - 13:32:23

Student2 5 3 neutral 2 13:32:23 - 13:32:28

Student2 5 4 neutral 2 13:32:28 - 13:32:33

Student3 2 1 neutral 4 09:30:32 - 09:33:17

Student3 2 2 happy 2 09:38:47 - 09:41:32

Student3 2 2 neutral 28 09:38:47 - 09:41:32

Student3 3 1 neutral 4 09:41:40 - 09:41:59

Student3 3 2 happy 1 09:41:59 - 09:42:18

Student3 3 2 neutral 3 09:41:59 - 09:42:18

Student3 3 3 neutral 2 09:42:18 - 09:42:37


Student3 3 4 neutral 4 09:42:37 - 09:42:56

Student3 4 1 neutral 1 09:43:00 - 09:43:02

Student3 4 2 neutral 1 09:43:04 - 09:43:06

Student4 2 1 neutral 5 21:00:09 - 21:00:22

Student4 2 2 neutral 3 21:00:22 - 21:00:35

Student4 2 3 neutral 3 21:00:35 - 21:00:48

Student4 2 4 happy 1 21:00:48 - 21:01:01

Student4 2 4 neutral 2 21:00:48 - 21:01:01

Student4 3 1 neutral 2 21:01:06 - 21:01:15

Student4 3 2 happy 2 21:01:15 - 21:01:24

Student4 3 3 happy 1 21:01:24 - 21:01:33

Student4 3 3 neutral 1 21:01:24 - 21:01:33

Student4 3 4 neutral 2 21:01:33 - 21:01:42

Student4 4 1 happy 1 21:01:46 - 21:01:54

Student4 4 1 neutral 1 21:01:46 - 21:01:54

Student4 4 2 happy 2 21:01:54 - 21:02:02

Student4 4 3 neutral 1 21:02:02 - 21:02:10

Student4 4 4 neutral 2 21:02:10 - 21:02:18

Student4 5 1 neutral 2 21:02:21 - 21:02:26

Student4 5 2 neutral 2 21:02:26 - 21:02:31

Student4 5 3 happy 1 21:02:31 - 21:02:36

Student4 5 3 neutral 1 21:02:31 - 21:02:36

Student4 5 4 happy 2 21:02:36 - 21:02:41


Student5 2 1 neutral 7 09:42:45 - 09:43:09

Student5 2 2 happy 1 09:43:09 - 09:43:33

Student5 2 2 neutral 4 09:43:09 - 09:43:33

Student5 2 3 neutral 2 09:43:33 - 09:43:57

Student5 2 4 happy 1 09:43:57 - 09:44:21

Student5 2 4 neutral 2 09:43:57 - 09:44:21

Student5 3 1 neutral 2 09:44:24 - 09:44:32

Student5 3 2 happy 2 09:44:32 - 09:44:40

Student5 3 3 neutral 1 09:44:40 - 09:44:48

Student5 3 4 neutral 2 09:44:48 - 09:44:56

Student5 4 1 neutral 1 09:44:59 - 09:45:02

Student5 4 2 neutral 1 09:45:02 - 09:45:05

Student5 4 3 neutral 1 09:45:08 - 09:45:11

Student5 5 1 neutral 1 09:45:14 - 09:45:14

Student5 5 2 neutral 1 09:45:14 - 09:45:14

Student5 5 3 neutral 1 09:45:14 - 09:45:14

Student5 5 4 neutral 1 09:45:14 - 09:45:14


4.4 Results

We tested and evaluated this emotion detection application with ten students while they read the given texts. As shown in the data analysis section, this produced two data sets: we used two texts with different content to elicit the expressions.

When a student logged in to our application and granted camera access, the system began detecting their facial expressions from the live video stream. The detected expressions were stored securely in our database together with the start and end times of each reading interval. We used this data to present the results as tables and charts.
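As an illustration of this recording step, a minimal sketch of the detection loop with face-api.js could look as follows. It assumes face-api.js is loaded globally as faceapi (e.g., via a script tag), that the pre-trained models are served from a /models directory, that videoEl is the <video> element attached to the webcam stream, and that saveExpression() is a hypothetical stand-in for our database call:

```javascript
// Minimal sketch of periodic expression detection with face-api.js.
async function startDetection(videoEl) {
  // Load the face detector and the expression recognition model.
  await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
  await faceapi.nets.faceExpressionNet.loadFromUri('/models');

  setInterval(async () => {
    const result = await faceapi
      .detectSingleFace(videoEl, new faceapi.TinyFaceDetectorOptions())
      .withFaceExpressions();
    if (!result) return; // no face visible in this frame

    // result.expressions maps each of the seven expressions (neutral,
    // happy, sad, angry, fearful, disgusted, surprised) to a
    // probability; keep the most likely one.
    const [expression] = Object.entries(result.expressions)
      .sort(([, a], [, b]) => b - a)[0];

    // Hypothetical database call storing the expression with a timestamp.
    saveExpression({ expression, time: new Date().toISOString() });
  }, 1000); // roughly one detection per second
}
```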

Table 2 contains the facial expression data recorded while the students read the article (Smythe, 2021); as the table shows, most of the expressions are neutral. We attribute this to the content of that particular article: the students read it calmly, without noticeable changes of expression. The same data is also presented as a chart (see Fig. 3).

Fig. 3. Illustration of emotions while reading the article (Smythe, 2021).
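The chart simply totals the per-paragraph counts of Table 2 by expression type. Using the illustrative row shape from the earlier aggregation sketch, this is a short reduction:

```javascript
// Sum the per-paragraph counts by expression type for charting.
// `rows` uses the illustrative shape from the earlier sketch.
const totals = rows.reduce((acc, r) => {
  acc[r.expression] = (acc[r.expression] || 0) + r.count;
  return acc;
}, {});
// For Student1's rows in Table 2 this yields { neutral: 39, surprised: 1 }.
```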
