
1949-3045 (c) 2019 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.

Evoking Physiological Synchrony and Empathy Using Social VR with Biofeedback

Mikko Salminen, Simo Järvelä, Antti Ruonala, Ville J. Harjunen, Juho Hamari, Giulio Jacucci, and Niklas Ravaja

Abstract—With the advent of consumer-grade virtual reality (VR) headsets and physiological measurement devices, new possibilities for mediated social interaction emerge, enabling immersion in environments where the visual features react to the users' physiological activation. In this study, we investigated whether and how individual and interpersonally shared biofeedback (visualized respiration rate and frontal asymmetry of electroencephalography, EEG) enhance synchrony between the users' physiological activity and perceived empathy towards the other during a compassion meditation exercise carried out in a social VR setting. The study was conducted as a laboratory experiment (N = 72) employing a Unity3D-based Dynecom immersive social meditation environment and two amplifiers to collect the psychophysiological signals for the biofeedback. The biofeedback on empathy-related EEG frontal asymmetry evoked higher self-reported empathy towards the other user than the biofeedback on respiratory activation, but the perceived empathy was highest when both feedbacks were presented simultaneously. In addition, the participants reported more empathy when there was stronger EEG frontal asymmetry synchronization between the users. The presented results inform the field of affective computing on the possibilities that VR offers for different applications of empathic technologies.

Index Terms—biofeedback, electroencephalography, empathy, respiration, virtual reality


1 INTRODUCTION

The advent of virtual reality (VR) and various means to extract, mediate, and visualize affective information for virtual environments with biofeedback functionalities enable novel applications of affective computing and technology-mediated social interaction. Therefore, it is of utmost importance to study the psychological effects that are related to the use of these technologies.

Virtual reality (VR) has been advertised as the "ultimate empathy machine" capable of placing us into another person's shoes [1]. There are already some VR applications developed for evoking empathy, including virtual reality environments (VRE) that use illusions of body ownership for empathy training [2]. In addition, immersive journalism may evoke empathy and train empathy-related skills by allowing a change of perspective [3].

The previous applications build on the perspective taking aspect of empathy but pay little attention to its affective facet, that is, merging of affective states between persons [4], [5], [6]. In social settings, empathy is linked to the synchronization of physiological activities between individuals [7]. It can thus be assumed that promoting such synchrony by means of a VR application could be used to evoke greater empathy between users.

Virtually mediated social interactions enable various ways to share and convey information about users' emotions. Some changes in our emotional states are visible to others during social interaction, such as gestures, postures and facial expressions, but others, such as heartbeat and brain activity, are not directly observable by others [8].

With biosensors and VR technology we can bring these previously hidden emotional fluctuations into view and, by so doing, promote merging of the interactants' emotional states.

In the current study we investigate how shared visualizations of the users' physiological signals in social VR could contribute to evoking empathy in the context of a social meditation exercise where both participants are virtually present.

1.1 Related work

1.1.1 Empathy

Empathy has been defined as a person's ability to understand the internal states of others and as the merging of affective states between persons [4], [5], [6], whereas in compassion the other's emotions and feelings are acknowledged but not felt as such [9]. A similar differentiation has been suggested with the concepts of affective and cognitive empathy [5] and by the so-called Russian doll model of empathy, which divides empathy into multiple layers or mechanisms based on each mechanism's evolutionary history [10], [11]. The inner core of the model includes motor mimicry and emotional contagion and represents a phylogenetically early mechanism associated with automatically activated neural representations of the other person's feelings [9]. The phylogenetically more recent layers include sympathetic concern and perspective taking, or the deliberate taking of another person's point of view. The multilayered nature of empathy needs to be kept in mind when considering different subjective and physiological indices of empathy and their visualization.

————————————————

M.S. and J.H. are with the Gamification Group, Faculty of Information Technology and Communication Sciences, Tampere University, FI-33014, Finland. E-mail: firstname.lastname@tuni.fi.

S.J., V.H., and N.R. are with the Department of Psychology and Logopedics, Faculty of Medicine, P.O. Box 63, FI-00014 University of Helsinki, Finland. E-mail: {simo.v.jarvela, ville.harjunen, niklas.ravaja}@helsinki.fi.

A.R. and G.J. are with the Helsinki Institute for Information Technology HIIT, Department of Computer Science, P.O. Box 68, FI-00014 University of Helsinki, Finland. E-mail: firstname.lastname@helsinki.fi.

1.1.2 Empathy and psychophysiology

On the individual psychophysiological level, empathy has been associated with various changes in central nervous system activity [12], such as increased frontal alpha asymmetry visible in the 8–13 Hz frequency band of electroencephalography (EEG), which has previously been considered an index of approach and withdrawal motivation [13], [14]. Due to the multifaceted nature of empathy, it has previously been related to both left and right frontal EEG asymmetry. Empathy may be an approach-related reaction to the suffering of the other, which would be related to left frontal asymmetry; right frontal asymmetry-related empathy can be due to vicarious sharing of the pain or sorrow of the other [14].

In social settings, empathy has also been linked to the synchronization of physiological activities between individuals [7]. For example, the synchronization of electrodermal activities between a therapist and patient has been related to perceived empathy towards the therapist [15]. In addition to the autonomic nervous system level, synchronization may occur also on the level of brain activities, and there are specific neural mechanisms that have been proposed as being related to cortical synchronization [16], [17].

1.1.3 Meditation and biofeedback

Meditation as a broad concept covers various methods for self-regulation of the mind and body. The possible benefits of meditation have been studied for several decades, e.g., [18], [19]. There are already some VR and biofeedback applications targeted at enhancing meditation effectiveness. For example, electrodermal activation (EDA) has been utilized as an index of relaxation in a mindfulness-based stress reduction VRE for chronic pain patients [20]. In addition, there are VR meditation applications that utilize heart rate feedback [21].

In some meditation traditions the observing of breathing is an integral part of the practice and, possibly due to improved emotion regulation through awareness of one's own internal states, mindful breathing has been shown to be related to less mind-wandering and better mood [22], [23], [24]. In biofeedback meditation applications, feedback on respiration rate has been used mainly to guide the practitioner towards deeper breathing and a slower respiration rate [25], [21], [26].

Visual feedback of the user's brain activation, referred to as neurofeedback, has also been used in meditation applications as well as for various other clinical purposes [27], [28]. In previous biofeedback meditation applications, EEG-based neurofeedback has been utilized for making the user aware of his or her brain activation by visualization or sonification of the EEG, e.g. [29], or to guide the user to achieve a certain state, e.g. [30], [31], [32]. For neurofeedback purposes the oscillatory responses of the EEG are used most commonly. Previous studies have identified specific EEG frequency bands for attentional, affective, and memory processes [33], [34], as well as for meditative states; for a review, see [35].

2 CURRENT STUDY AND HYPOTHESIS

Biofeedback has been utilized previously in solitary meditation applications targeted at relaxation or attentional processes. However, to the best of our knowledge, there are no social biofeedback VREs where the synchronized physiological activities of two or more users would be visualized for the purpose of empathy training. To fill this gap, we have developed a social biofeedback VRE for conducting simplified empathy exercises inspired by traditional meditation practices. The biofeedback functionality, which provides information on the synchronization of the physiological signals between two users, is suggested to benefit the conducting of these empathy exercises.

Modern consumer-grade VR devices allow for more immersive and engaging environments to train empathy and other affective skills. In addition, biofeedback functionalities could enhance the training of various affective skills by making visible the otherwise hidden bodily responses. It is thus important to study the possibilities that these new and emerging technologies could have in the training of skills that contribute to human flourishing. In the future, these types of functionalities could also be utilized for various communication and entertainment purposes.

The current study also contributes to the wider fields of affective computing and social signal processing. In line with the suggestions of Chanel and Mühl [8], we present a system for computer-mediated interaction between human users that utilizes individual users' physiological signals as novel social cues and also uses inter-user physiological indices as information about the interaction.

In the current VRE, we included visual feedback of both respiration and EEG activation to promote perceived empathy and physiological synchrony between the users. We suggest that the reported empathy would be highest in a condition where both these functionalities were used, given that there would then be the most information available for the users to interpret and to evoke contagious affective reactions.

Some physiological changes are visible to partners during social interaction, such as blushing, but other changes, such as heartbeat, are not directly observable by others, e.g., [8]. However, with current technology it is possible to measure, visualize, and mediate both these types of signals as a form of social information. This would serve as one means of widening the otherwise often narrow bandwidth of computer-mediated communication. In addition, respiration is an at least partly voluntary action that is rather straightforward to control, whereas EEG activation may be more difficult to control. Previously it has been shown that the aurally presented heart rate (HR) of another person may be interpreted as a socially relevant signal conveying affective information [36], and … linkage and experienced social presence [37]. It is possible that the technologically mediated breathing rate of the partner would be interpreted similarly as a relevant signal when attempting to evoke empathy towards the other. On the other hand, it may be that the EEG activation related to empathy processing is considered a more informative affective signal than breathing, since it is more easily associated with psychological processes than the rather mechanistic act of breathing.

In VR, the users are often represented as avatars. To study the effects of social presence, we compare two experimental conditions: one in which the users did the empathy exercise with another avatar (representing another human participant) and another in which they were facing a cold, inanimate statue. This comparison enables studying the difference in evoking empathy when only one's own biofeedback visualizations are present versus when the other participant's visualizations are also shown.

In concentrative meditation, the focus is often targeted at a specific mental or sensory target, such as a visual object, a repeated sound, or a bodily activity like breathing [35]. We intend to study whether an inanimate visual object could also be used in an empathy-evoking meditative practice, or whether a visual object representing another human with biofeedback visualizations would be more effective.

We suggest that it would be easier for the user to evoke empathy towards the opposing object if it represents another human being and also shows biofeedback visualizations, instead of being a cold inanimate object [e.g., 38].

Given the preceding review of prior research, we suggest the following two hypotheses:

H1: Biofeedback functionality evokes heightened self-reported empathy in a social VR empathy exercise.

H2: There is higher self-reported empathy after conditions where the empathy is targeted towards a VR avatar (representing a human participant) than in conditions where empathy is targeted towards a cold inanimate object that doesn't represent a human participant.

3 THE EMPIRICAL STUDY

3.1 Participants

The participants were 72 persons contacted through mailing lists of university student unions. Altogether, they formed 36 same-sex dyads, of which 22 were female–female and 14 were male–male pairs. The participants' ages ranged between 19 and 50 years, with a mean age of 26.11 years (SD = 6.10). The dyad members were acquaintances who had known each other on average for 7.4 years (SD = 6.3). As compensation for their time, each participant received two movie tickets. One participant chose to discontinue the experiment because of experienced distress; the respective dyad was excluded from the analyses. In addition, due to technical issues the experiment had to be stopped for one dyad, which was also excluded from further analyses. Following the Declaration of Helsinki, written consent was obtained from all the participants.

The experiment was planned and conducted following the recommendations of the national board of research integrity.

Fig. 1. A room view in the Dynecom VR for the 2-min baseline recording preceding each of the eight experiment conditions. The users were instructed to fixate on the cross on the wall.

3.2 The DYNECOM VRE

The DYNECOM (DYadic NEuro-COMpassion) VRE draws inspiration from loving-kindness meditation (LKM) and compassion meditation (CM), which are both related to the currently popular mindfulness meditation practices.

The emphasis of these two practices is to increase "unconditional, positive emotional states of kindness and compassion" [39, p. 1126]. The techniques of LKM are targeted at building unconditional kindness towards others, whereas in CM sympathy is evoked towards those in need with a wish to help them; these thoughts are targeted towards oneself, others, or all living beings [40], [39]. The benefits of CM and LKM practice have been shown to include, for example, an increase in positive and a decrease in negative affect, increased activation in brain areas related to emotion and empathy, and empathic accuracy; for reviews, see [39], [41], [42]. Notably for the current study, even a short-duration mindfulness intervention has been shown to lead to enhanced empathy [43], [44].

3.2.1 Design of the VRE

The DYNECOM VRE includes a room environment for baseline measurement (Fig. 1) and the environment for conducting the meditative exercise (Fig. 2). Both users were represented by statue-like avatars in the DYNECOM VRE. The avatars were seated in a circle with four other similar statues; the visual design of the VRE was inspired by a setting of relaxed social meditation by a campfire (Fig. 2). The statues were surrounded by a low wall, behind which a dark forest scenery was presented; this served to keep the user's attention on the statues and thus to discourage excess explorative scanning. To keep focus on the task, movement in the environment was not possible. An ambient pink-noise audio track resembled the sound of wind. The audio track served to increase immersion by blocking possible noise from the laboratory. We chose to represent the users as static and neutral statues to ensure maximal concentration on the adaptive visualizations of the psychophysiological signals. The four non-active statues were included in the setting for possible future studies in which a larger group's physiological synchronization could be examined.

Fig. 2. DYNECOM VRE. The users are represented by the two illuminated statue-like avatars. The coloured aura around the statues and the illuminated bricks on the connecting bridge visualize the users' physiological activation. In the pictured condition both the EEG and the respiration-based biofeedbacks are on.

Depending on the experimental condition, the two avatars representing the users were both surrounded by an aura and connected by a tile bridge or path.

Both these visual elements were designed for the visualization of the physiological activation of the users. The bridge was inspired by the idea of highlighting that the two users were connected. The aura around the statues drew on the concept of "mental powers" in religious imagery and in popular culture. The user's respiration rate was visualized as a movement of the pulsating aura and also as a gradual illumination of the tiles of the bridge (Fig. 2, Fig. 3). Synchronization of the users' outbreaths was visualized by the collision and highlighting of the illuminated tiles on the bridge (Fig. 2). Empathy-related approach motivation towards the other participant, calculated from the frontal asymmetry of the users' EEG, was visualized as a color cue in the bridge tiles, the sidebars of the bridge, and the aura (Fig. 3). The color visualization of the brain electrical activation varied between green (withdrawal motivation, low empathy), yellow, orange, red, and pink (highest approach motivation, highest empathy), with a glowing effect on the sidebars of the bridge as an index of synchronized EEG frontal asymmetries between the users (Fig. 3).

Fig. 3. The user's view in the DYNECOM VRE. Glowing sidebars on the sides of the bridge visualize synchronized levels of empathy-related EEG frontal asymmetries between the users. Pink colour in the aura and in the sidebars represents the highest level of empathy-related EEG frontal asymmetry.

3.2.2 Technical setup

In the DYNECOM VRE, the participants' physiological activation was measured with two Brain Products QuickAmp amplifiers, each connected to a desktop computer. The physiological data were recorded to hard disks for offline analyses; in addition, the OpenViBE platform [45] was used to process and stream the data for the bio-adaptive functionalities to two MSI laptop computers running Unity3D (https://unity3d.com/) for the VRE implementation. Two Oculus Rift HMDs (https://www.oculus.com/rift/) were used to present the VRE, and responses for the self-reports at the end of each condition were collected using Oculus Touch hand controllers. The 3D models of the VRE were developed using Blender (www.blender.org/) and Adobe Photoshop (www.adobe.com/Photoshop). In addition, some elements were obtained from the Unity3D Asset Store (https://assetstore.unity.com/). All the relevant source code for the DYNECOM VRE is shared on GitHub (https://github.com/furtherform/dynecom).

3.2.3 The bio-adaptive functionalities

Abdominal respiratory movements were measured with a Brain Products elastic respiration belt attached around the participant's torso. The respiration visualizations were responsive to the detection of the user's outbreath. To the participant, these effects seemed instantaneous and responsive to his or her volitional respiratory actions. When the participants reached a synchronous breathing rate, the illuminated tiles colliding at the middle of the bridge blinked with a highlight effect.
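The paper does not specify the outbreath-detection algorithm itself; as a minimal sketch under the assumption that an outbreath onset corresponds to a falling-edge threshold crossing of the (zero-centered) belt signal, detection could look like the following (the function name and threshold are hypothetical, not from the original implementation):

```python
def detect_outbreaths(signal, threshold=0.0):
    """Return sample indices where the respiration signal crosses below
    `threshold` (falling edge), taken here as outbreath onsets.
    The threshold-crossing rule is an assumption; the paper does not
    describe the actual detection logic used in OpenViBE."""
    onsets = []
    for i in range(1, len(signal)):
        if signal[i - 1] >= threshold and signal[i] < threshold:
            onsets.append(i)
    return onsets
```

The per-condition outbreath count written to the log files would then simply be `len(detect_outbreaths(...))` over that condition's samples.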

EEG was measured with six electrodes (F3, F4, C3, C4, P3, P4) attached to a lycra cap. A Brain Products QuickAmp amplifier was used to acquire the data at a 2000 Hz sampling rate, and 0.1 Hz high-pass, 100 Hz low-pass, and 50 Hz notch filters were applied. For the EEG-based feedback, the frontal asymmetry of the alpha (8–13 Hz) frequency band was calculated using the formula ln(F4) − ln(F3), since in the current setting empathy related to approach motivation was targeted; with this formula, higher scores putatively indicate relatively greater left frontal activation, assuming that alpha-band activation is inversely related to neural activation [13], [46], [11].
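The asymmetry formula can be illustrated with a short sketch. The paper does not detail the spectral estimation performed in OpenViBE, so a plain periodogram over the alpha band is assumed here:

```python
import numpy as np

def frontal_asymmetry(f3, f4, fs=2000.0, band=(8.0, 13.0)):
    """Compute ln(alpha power at F4) - ln(alpha power at F3).
    Higher values putatively index relatively greater left frontal
    activation, given that alpha power is inversely related to
    cortical activation. Periodogram-based power is an assumption."""
    def alpha_power(x):
        x = np.asarray(x, dtype=float)
        spec = np.abs(np.fft.rfft(x)) ** 2 / len(x)   # periodogram
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return spec[mask].sum()
    return np.log(alpha_power(f4)) - np.log(alpha_power(f3))
```

Because power scales quadratically with amplitude, doubling the F4 amplitude relative to F3 yields an asymmetry of ln 4 regardless of the exact spectral leakage.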

The algorithm for calculating the EEG-based feedback was adaptive for each participant: individual minimum and maximum values were determined, and this range was tracked and updated during each session. A moving average of the last 9 seconds was used in calculating the value for the color cue. The two participants' EEG frontal asymmetries were considered synchronized when their FA values were in the same percentage range within their respective individual (and adaptive) ranges. The settings and constants for the reactivity of the system's functions were decided based on iterative development work and testing with various users. In addition, the built-in functionalities of OpenViBE for epoch averaging and moving averages were used. A more detailed description of the implementation of the biofeedback functionalities can be found in the Appendix.
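A minimal sketch of this adaptive scheme follows. The bin width for the "same percentage range" criterion and the exact update rule are assumptions (the actual constants were tuned iteratively and are reported only in the Appendix):

```python
from collections import deque

class AdaptiveRange:
    """Track a per-user min/max range online and map new values to a
    relative 0-1 position within it; a moving average (9 s in the
    paper) smooths the feedback value first."""
    def __init__(self, window_s=9, rate_hz=1):
        self.buf = deque(maxlen=window_s * rate_hz)
        self.lo = None
        self.hi = None

    def update(self, value):
        self.buf.append(value)
        smoothed = sum(self.buf) / len(self.buf)
        self.lo = smoothed if self.lo is None else min(self.lo, smoothed)
        self.hi = smoothed if self.hi is None else max(self.hi, smoothed)
        if self.hi == self.lo:
            return 0.0
        return (smoothed - self.lo) / (self.hi - self.lo)

def synchronized(rel_a, rel_b, n_bins=5):
    """Two users count as synchronized when their relative values fall
    in the same bin of their respective adaptive ranges (quintile
    bins are an assumption)."""
    bin_of = lambda r: min(int(r * n_bins), n_bins - 1)
    return bin_of(rel_a) == bin_of(rel_b)
```

Each user gets their own `AdaptiveRange`, so synchrony is judged on relative positions rather than raw asymmetry values, which differ between individuals.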


For analysis purposes, the numeric values of the physiological indexes related to the visualizations were written by the system to a log file. For respiration, the log files recorded the number of outbreaths during each condition and the number of times the visual effect for respiration synchronization was shown. For the EEG frontal asymmetry-based feedback, the log files recorded the average relative frontal asymmetry value and the number of seconds during which frontal asymmetry synchronization was visualized to occur during each condition.

3.3 Procedure

Upon arrival, the experimental setting and the technical setup were first described to the participants. Written consent was obtained and the electrodes were attached. During the experiment the participants were seated in comfortable chairs in an electrically shielded room, with a partition between them. At the beginning of each of the eight experimental conditions there was a two-minute baseline, during which the participants were instructed to fixate on a cross on the wall of an empty room in the VRE (Fig. 1). For the first 30 s of each experimental condition the biofeedbacks were off, so no visualizations were shown to the participants; this period was used to calculate the initial values for the feedback. For the following 6 min of each condition the biofeedbacks worked as intended.

The eight conditions, presented in randomized order, were combinations of the sociality of the meditation exercise (solo, dyadic) × the type of biofeedback (no feedback, respiration-based feedback, EEG-based feedback, or both the EEG-based and respiration-based feedbacks).
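The 2 × 4 design can be enumerated and shuffled per dyad; a small sketch (identifiers are illustrative, not the study's actual code):

```python
import itertools
import random

SOCIALITY = ["solo", "dyadic"]
FEEDBACK = ["none", "respiration", "EEG", "both"]

def condition_order(seed=None):
    """All eight sociality x feedback combinations in a randomized
    per-dyad presentation order."""
    conditions = list(itertools.product(SOCIALITY, FEEDBACK))
    rng = random.Random(seed)   # per-dyad seed for reproducibility
    rng.shuffle(conditions)
    return conditions
```

The full crossing guarantees that every feedback type is experienced both with and without the social presence of the partner's avatar.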

In the solo condition the participants faced a cold statue that showed no visualizations of bio-signals, and the participants were aware that the statue did not represent another user but was an inanimate object. In the dyadic condition with no bio-adaptations on, the statues likewise showed no visualizations of bio-signals, but they were lit to indicate that they nevertheless represented a living human, the other member of the dyad. The participants were aware of these cues.

The participants were instructed to evoke empathetic, warm, and compassionate feelings towards the opposing statue, whether it was cold and inanimate or represented the other human participant. In this task the participants were encouraged to use the information provided by the biofeedbacks when they were presented. At the end of each condition there was a set of self-report questions to answer.

3.4 Self-reported empathy

Self-reported empathy was assessed at the end of each condition by asking the participants to rate (1 = not at all, 7 = extremely) how much they felt sympathetic, compassionate, soft-hearted, warm, tender, and moved. The average of these six ratings was used in the analysis as an index of experienced empathy. For the current sample, the internal reliability of this scale was .91. This measure has been used previously to assess subjectively experienced empathy [47]; for a review, see [48]; for a validity assessment, see [49].
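The empathy index and its internal reliability can be computed as follows. This is a sketch; the paper does not state which reliability coefficient was used, so Cronbach's alpha is assumed here:

```python
import numpy as np

ITEMS = ["sympathetic", "compassionate", "soft-hearted",
         "warm", "tender", "moved"]

def empathy_index(ratings):
    """Average of the six 1-7 item ratings for one condition."""
    return float(np.mean(ratings))

def cronbach_alpha(item_scores):
    """Internal consistency of the scale; rows = respondents,
    columns = items (.91 was reported for the current sample)."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = x.sum(axis=1).var(ddof=1)     # variance of sum scores
    return k / (k - 1) * (1 - item_vars / total_var)
```

For perfectly correlated items the coefficient reaches 1.0, its theoretical maximum.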

3.5 Analyses

The data were analyzed with IBM SPSS (v. 24) using the Linear Mixed Models (LMM) procedure with restricted maximum likelihood estimation. Compared to more traditional analysis methods, mixed models are stated to be more efficient, parsimonious, and flexible [50, p. 13]. All the analyses were conducted on continuous variables; for visualization purposes only, some variables were dichotomized (Figure 5).

The dyad members were not independent in the statistical sense (see Table 1 for correlations) but were affected by the other member of the dyad (the partner) during the experiment. Thus, following the suggestions of Kenny and colleagues [51], the analyses were conducted on the level of the dyad instead of the level of the individual. In this analysis approach each dyad member is considered both an actor and a partner, and it is possible to analyze the so-called actor and partner effects. An actor effect denotes the effect that a score on a predictor variable has on the same person's outcome variable; a partner effect is the effect that a person's score on a predictor variable has on his or her partner's score on an outcome variable [51].

In the current experiment, this analysis approach enables studying, for example, the effect of either participant's EEG visualization on the self-reported empathy, or interaction effects that are relevant for the current study, e.g., whether the effect that one participant's high-empathy-indexing EEG visualization has on self-reported empathy depends on the perceiver's own status of EEG visualization (high or low empathy).
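The actor–partner restructuring described above can be sketched as follows (names are hypothetical; the actual analyses were run in SPSS):

```python
def actor_partner_rows(dyads):
    """Reshape dyad data so each member appears once as actor, with
    the other member's value attached as the partner predictor.
    `dyads` is a list of (value_a, value_b) tuples."""
    rows = []
    for dyad_id, (a, b) in enumerate(dyads):
        rows.append({"dyad": dyad_id, "actor": a, "partner": b})
        rows.append({"dyad": dyad_id, "actor": b, "partner": a})
    return rows

def grand_mean_center(rows, key):
    """Center a predictor on its grand mean, as done before entering
    centered actor and partner values into the fixed effects model."""
    m = sum(r[key] for r in rows) / len(rows)
    for r in rows:
        r[key + "_c"] = r[key] - m
    return rows
```

Each physiological value thus enters the model twice, once as an actor score and once as a partner score, which is what allows the actor × partner interaction to be estimated.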

For studying the hypotheses, a fixed effect for the experimental condition (eight conditions) was defined in the model. The within-dyad average self-reported empathy was set as the dependent variable, and the dyad ID was the subject variable, to conduct the analyses on the dyad level.

The order of the presented conditions was defined as the repeated variable, and compound symmetry was used as the covariance structure. Five planned contrasts were used to study the effect of the sociality of the condition (dyadic vs. solo, Contrast 1); the effect of EEG-based biofeedback (EEG feedback on vs. no feedback, Contrast 2); the difference between the EEG-based and the respiration-based feedbacks (EEG vs. respiration, Contrast 3); the effect of respiration-based feedback (respiration vs. no feedback, Contrast 4); and the effect of both biofeedbacks being simultaneously on (both vs. no feedback, Contrast 5).

To study the effects of the individual visualized EEG and respiratory feedbacks, the analyses were conducted with an LMM in which the within-dyad average self-reported empathy was the dependent variable and, separately for respiration and EEG frontal asymmetry values, centered actor and partner values and their interaction were included in a fixed effects model. The presentation order of the conditions was set as the repeated variable with a compound symmetry covariance structure, and the dyad ID was set as the subject variable.

For the study of the effects of the visualized synchronization indices, an LMM was specified in which the within-dyad average self-reported empathy was the dependent variable and the synchronization score of either the respiration rates or the frontal EEG asymmetries was set as the independent variable in a fixed effects model. The scenario (1 to 8) was the repeated variable, the dyad ID was the subject variable, and a compound symmetry covariance structure was used.

4 RESULTS

Correlations for the variables used in the analyses are presented in Table 1.


Results of the contrast analyses for the self-reported empathy are presented in Table 2. There was higher self-reported empathy in the EEG-feedback than in the no-feedback condition (Contrast 2, p < .001; Fig. 4), thus supporting hypothesis H1. A similar effect was observed for the respiration-based biofeedback (p = .03; Fig. 4). Self-reported empathy was also higher after conditions where both feedbacks were on than when there was no feedback (Contrast 5, p < .001; Fig. 4). In addition, there was an almost statistically significant effect such that higher self-reported empathy was observed after EEG-feedback than after respiration-feedback conditions (Contrast 3, p = .05; Fig. 4). Supporting hypothesis H2, there was a statistically significant effect of sociality on the self-reported empathy (Contrast 1, p < .001; Fig. 4); higher empathy was reported in dyadic than in solo conditions.

Fig. 4. Self-reported empathy for the different conditions. D = dyadic meditation condition; S = solo meditation condition; OFF = no biofeedback; EEG = EEG feedback on; RESP = respiration feedback on; BOTH = both the EEG-based and the respiration-based feedbacks on. Error bars denote 95% confidence intervals.

TABLE 2
RESULTS OF CONTRAST ANALYSES

Dyad vs. Solo = dyadic vs. solo; EEG vs. None = EEG adaptation on vs. no adaptation; EEG vs. Resp = EEG adaptation on vs. respiration adaptation on; Resp vs. None = respiration adaptation on vs. no adaptation; Both vs. None = both adaptations on vs. no adaptations.

TABLE 3
RESULTS OF DYADIC ANALYSES

Act. X Part. = interaction of actor and partner effects of EEG frontal asymmetry and respiration values on self-reported empathy.

Next, we examined the actor and partner effects, and their interactions, more closely. Although the correlation between EEG frontal asymmetry and the self-reported empathy was statistically significant, there was no statistically significant actor or partner effect of EEG frontal asymmetry on the self-reported empathy (p's = .19 and .27, respectively; Table 3). However, there was a statistically significant actor X partner interaction effect of EEG frontal asymmetry on the self-reported empathy (p = .02), such that when the actor's (the user's own) EEG frontal asymmetry was higher, a higher partner EEG frontal asymmetry had a more pronounced effect on the self-reported empathy (Fig. 5). For respiration, there was no statistically significant actor effect, partner effect, or actor X partner interaction effect on the self-reported empathy (p's = .36, .76, and .85, respectively).

TABLE 1
CORRELATIONS FOR THE DEPENDENT AND INDEPENDENT VARIABLES OF THE STUDY

                  1      2      3      4      5      6      7
1. Empathy        -
2. FA           .16**    -
3. FA-sync.     .22**  .54**    -
4. RESP         .07    .01    .05      -
5. RESP-sync.   .14**  .03    .31**  .56*     -
6. P-FA         .19**  .94**  .54**  .01    .04      -
7. P-RESP       .03    .01    .05    .70**  .57**  .02     -

Empathy = self-reported empathy; FA = average visualized EEG frontal asymmetry; FA-sync. = number of seconds during which frontal asymmetry synchronization was visualized to occur; RESP = the number of visualized outbreaths; RESP-sync. = the number of seconds the participant's outbreaths were visualized to be in synchrony; P-FA = partner's average visualized EEG frontal asymmetry; P-RESP = the number of the partner's visualized outbreaths. *p < 0.05; **p < 0.01.

Table 2

Variable         Estimate   SE     df       t      p
Empathy
 Dyad vs. Solo     1.10    .27   254.22   4.03   <.001
 EEG vs. None       .80    .19   235.49   4.18   <.001
 EEG vs. Resp       .38    .19   256.45   1.97    .05
 Resp vs. None      .42    .19   253.26   2.16    .03
 Both vs. None      .90    .19   248.28   4.70   <.001

Table 3

Source           df          F      p
EEG
 Actor           1, 253.57   1.75   .19
 Partner         1, 253.57   1.20   .27
 Act. X Part.    1, 252.97   5.53   .02
Resp.
 Actor           1, 242.14    .84   .36
 Partner         1, 242.14    .09   .76
 Act. X Part.    1, 250.48    .04   .85
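The actor-partner structure behind these analyses can be sketched as a regression design with an interaction term. This is a simplified ordinary least squares illustration (the study itself used linear mixed models), and the function and variable names are assumptions made for the sketch.

```python
# Simplified APIM-style design: intercept, actor FA, partner FA, and the
# actor x partner product term whose effect is reported above.
# Illustrative only; the paper's analysis used linear mixed models.
import numpy as np

def apim_design(actor_fa: np.ndarray, partner_fa: np.ndarray) -> np.ndarray:
    return np.column_stack([
        np.ones_like(actor_fa),   # intercept
        actor_fa,                 # actor effect
        partner_fa,               # partner effect
        actor_fa * partner_fa,    # actor x partner interaction
    ])

def fit_ols(y: np.ndarray, X: np.ndarray) -> np.ndarray:
    """Least-squares coefficients for the design above."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta
```

A positive interaction coefficient corresponds to the pattern reported here: the partner's frontal asymmetry matters more when the actor's own frontal asymmetry is high.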

The number of respiration synchronizations during a condition did not have a statistically significant relation with the self-reported empathy (p = .20). However, the EEG frontal asymmetry synchronization had a statistically significant effect on the self-reported empathy, F(1, 216.95) = 26.51, p < .001; stronger EEG frontal asymmetry synchronization was associated with higher self-reported empathy (M = 4.80, SD = 1.06, versus M = 4.32, SD = 1.22).
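The synchronization indices above are counts of seconds spent in synchrony. As a purely hypothetical illustration (the paper's exact synchronization criterion is not restated in this section), a per-second count could be computed along these lines, where a second counts as synchronized when both users' frontal asymmetry (FA) values deviate to the same side of their own means.

```python
# Illustrative sketch only: the "same side of own mean" rule is an
# assumption made here, not the paper's documented criterion.
from typing import Sequence

def count_synchronized_seconds(fa_a: Sequence[float],
                               fa_b: Sequence[float]) -> int:
    """Count seconds where two 1-Hz FA series deviate from their
    respective means in the same direction."""
    mean_a = sum(fa_a) / len(fa_a)
    mean_b = sum(fa_b) / len(fa_b)
    return sum(
        1 for a, b in zip(fa_a, fa_b)
        if (a - mean_a) * (b - mean_b) > 0
    )
```

Two series that rise and fall together would score every second, while anti-phase series would score none.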

Fig. 5. Interaction effect of actor X partner EEG approach motivation on self-reported empathy. Error bars denote 95% confidence intervals.

5 DISCUSSION

In this study, we investigated how socially shared biofeedback visualizations affect experienced empathy towards the partner and synchrony between the users' physiological activity during a social meditation exercise carried out in VR.

Confirming the hypothesis, the participants reported more empathy towards avatars (a statue representing the other human participant with biofeedback visualizations) than towards a cold, inanimate statue. In some focused attention meditative practices, one is instructed to focus on a certain visual object, e.g., a flower in a vase or a detail of a painting. However, the present results suggest that an inanimate focus object may not be optimal for conducting empathy-related meditative exercises.

The employed self-report scale by Batson and colleagues [47] assessed rather general empathy, not empathy directed especially towards the other person. In future studies, a scale assessing empathy directed more towards the other could be utilized to test whether the observed effect would persist or be even stronger.

One solution for increasing empathy towards the other user's avatar even further would be to replace the statues with more realistic human characters or to add non-verbal gestures to the avatars. The chameleon effect (mimicry-induced social influence) evoked by, for example, mimicked head movements has been shown to be effective also in computer-mediated settings [52]. In future versions of the currently described social VRE, such non-verbal cues could be implemented. These cues could be based on the physiological activation of the user to enable mimicry for enhanced efficiency of the social practice of empathy skills. Possibly the gestural mimicry could lead to emotional contagion and thus evoke empathy, although credible technological implementation may be challenging, e.g. [53].

The biofeedback visualizations of the EEG and respiratory activations had the hypothesized effects on the self-reported empathy. The highest self-reported empathy was reported after the conditions where both feedbacks were on. It is suggested that this was due to the higher amount of information presented for forming inferences about the partner's mental state, and also about the participant's own state. The higher amount of information about the partner's state may have led to higher perceived empathy towards him or her, due to stronger emotional contagion or because the partner possibly felt more actively present in the VR.

The observed difference between the two biofeedbacks in subjective empathy suggests that the visualization of the EEG-based frontal asymmetry may have been perceived as more relevant for the task of evoking and targeting empathy than the visualization of the respiration rate. On the other hand, this observed difference could also be due to the quality of the visualizations; it is possible that the visualization for the EEG-based feedback was more convincing than the visualization for the respiratory activity. Since the EEG visualization was based on color, it may have been more easily detectable than the movement-based respiration feedback. In the future, different visualizations for these physiological signals should be tested to tease apart the independent effects of certain visual features on user experience. However, there was a statistically significant correlation between the synchronization of the EEG frontal asymmetry and the respiration rate, which suggests that the respiration rate was not completely irrelevant for the task but similarly informed about the other's emotional state, although it was less salient.

The self-reported empathy was not influenced by the participant's own respiratory rhythm or by the partner's visualized respiration rate. It is possible that physiological activation that is under direct voluntary control is not optimal for empathy exercises where biofeedback is utilized, because it may lead to a mechanistic pursuit of the desired physiological state without focusing on the targeted subjective state. On the other hand, the only way to synchronize EEG was to follow the instructions and empathize with the other by concentrating on one's mental and affective state. The observed effect of EEG frontal asymmetry synchronization between the participants on the self-reported empathy, together with the statistically significant positive correlations between perceived empathy and FA, FA synchronization, and respiratory rate synchronization, suggests at least some association between these physiological indices and the subjective perception. In future versions of the DYNECOM VR system, the visualized synchronization of other physiological signals could also be considered. For example, heart rate and heart rate synchronization-based feedback has been shown to be an effective cue for connectedness and to evoke a sense of social presence [37].

There was no statistically significant partner effect of the EEG-based visualization on the perceived empathy. Similar results were observed for the respiration rate-based responsive visualization; the partner's respiration-based visualization did not induce statistically significant effects on the self-reported empathy. Either the visualizations of the partner's bio-signals were not effective, or they were not considered relevant cues when assessing perceived empathy. However, there was a statistically significant correlation between the respiration rates and the EEG frontal asymmetries of the two participants. This implies that there may have been at least some coupling of these bio-signals evoked by the responsive visualizations.

The observed interaction between the actor and partner effects of EEG frontal asymmetries on the self-reported empathy may imply that the participants were more influenced by their partner's visualized EEG FA when they were themselves experiencing more prominent approach motivation and empathy, as indexed by the EEG FA. Putatively, the motivational tendency to approach promoted more open and receptive processing of the partner's socio-affective cues, much as Fredrickson's broaden-and-build theory suggests that a positive emotional state broadens momentary thought-action repertoires [54].

The currently presented DYNECOM environment is, to the best of our knowledge, the first biofeedback social VRE that utilizes the synchronization of the bio-signals of two users as feedback in the context of empathy exercises or meditative practices. The DYNECOM VRE represents a type of empathic technology that supports emotional convergence by making empathy a more salient and impactful construct: it infers empathy-related information from the users and their behaviour and then presents this information as feedback via visual cues or other modalities [55]. In addition, the current study relates to the fields of affective computing, where machine-based emotion detection is studied [56]; social signal processing, where machine-based detection of socially relevant signals during interaction is studied [57]; and two-person neuroscience [58]. The biofeedback utilized in the current study could also be studied in the context of computer-supported collaborative work (CSCW). There are already promising results on the positive effects of shared emotional states on the collaboration effectiveness and transactivity of a distributed, technology-mediated group [59], [60].

In the current experiment, the only source of information about the partner's affective state was the biofeedback visualizations. Affective contagion could occur only via the visualizations, and the participants were instructed to empathize with the partner using the information provided by the visualizations. It has been proposed that emotional contagion could evoke empathy through spontaneous mimicry of the other person, during which neural representations of the person are formed, leading to emotional resonance [61], [62]. On the other hand, the perception-action mechanism [63] proposed by the neural theories of empathy suggests a more straightforward route from perceptions to neural mappings and finally to actions. With a different conceptualization, two systems for (evoking) empathy have been identified: an emotional contagion-based system and a cognitive perspective-taking dependent system [64].

Given the novel and non-natural form of affective information in the current experiment, further studies would be needed to verify which of these processes was dominant.

The participants of the current study were always aware that the adaptive visualizations reflected the actual physiological processes of a human partner. That is, a cold statue (which did not represent a human partner as an avatar) did not display any visualizations that would reflect EEG or respiratory activation. This may present a limitation to the interpretation of the findings. In future studies, this issue could be handled by including (e.g., randomly generated) feedback visualizations also on the cold statues. Future studies could also manipulate whether the participants are aware that the statue represents another human or is merely a cold object; this would enable the study of the interpretations that users make of biofeedback visualizations.

One future direction would be to include elements of gamification in the system. In a review, Johnson and colleagues [65] found promising results on the effectiveness of gamified elements in interventions related to health and wellbeing. In addition, another form of interactive digital media, digital games, has been shown to be a possibly viable tool for evoking mindfulness [66]. Within the current DYNECOM environment, the gamified elements could include, for example, a competitive task to synchronize one's own physiological activation with a third, computer-controlled agent statue that would exhibit visualizations typical of a desired meditative or empathetic state. More generally, various types of social "relax to win" settings could also be implemented within the DYNECOM framework. The gamification of the empathy and meditation exercises could motivate sustained practice. However, gamification of these types of exercises has to be done cautiously. After all, meditation and empathy are domains where competitive elements could also have detrimental effects.

6 CONCLUSION

The present study showed the effect of social biofeedback cues on the interpersonal process of empathy in a dyadic technology-mediated setting. The findings of the experiment suggest that the biofeedback based on EEG frontal asymmetry was more effective in evoking experienced empathy than the biofeedback based on respiration rate. In addition, the synchronization of the EEG frontal asymmetry activities between the users was related to increased perceived empathy. These observations encourage exploring the use of affective cues based on physiological activation also in other types of social settings, including multiplayer games and collaborative virtual platforms. It is suggested that such cues could lead to emotional contagion, empathy, and physiological synchronization, which would putatively contribute to the effectiveness of the collaboration.

ACKNOWLEDGMENTS

This work was supported by the Academy of Finland (305576, 305577).

REFERENCES

[1] C. Milk, "How Virtual Reality Can Create the Ultimate Empathy Machine," 2015, https://www.ted.com/talks/chris_milk_how_virtual_reality_can_create_the_ultimate_empathy_machine, Accessed 28th Aug. 2019.
[2] P. Bertrand, J. Guegan, L. Robieux, C.A. McCall and F. Zenasni, "Learning Empathy Through Virtual Reality: Multiple Strategies for Training Empathy-Related Abilities Using Body Ownership Illusions in Embodied Virtual Reality," Front. Robot. AI, vol. 5, no. 26, 2018.
[3] A.L. Sánchez Laws, "Can Immersive Journalism Enhance Empathy?" Digital Journalism, vol. 54, pp. 1-16, 2017.
[4] F. De Vignemont and T. Singer, "The Empathic Brain: How, When and Why?" Trends Cogn. Sci., vol. 10, no. 10, pp. 435-441, 2006.
[5] B.M. Cuff, S.J. Brown, L. Taylor and D.J. Howat, "Empathy: A Review of the Concept," Emotion Rev., vol. 8, no. 2, pp. 144-153, 2016.
[6] J.E. Escalas and B.B. Stern, "Sympathy and Empathy: Emotional Responses to Advertising Dramas," J. Consum. Res., vol. 29, no. 4, pp. 566-578, 2003.
[7] J.A. Soto and R.W. Levenson, "Emotion Recognition Across Cultures: The Influence of Ethnicity on Empathic Accuracy and Physiological Linkage," Emotion, vol. 9, no. 6, pp. 874-884, 2009.
[8] G. Chanel and C. Mühl, "Connecting Brains and Bodies: Applying Physiological Computing to Support Social Interaction," Interact. Comput., vol. 27, no. 5, pp. 534-550, 2015.
[9] T. Singer and O.M. Klimecki, "Empathy and Compassion," Curr. Biol., vol. 24, no. 18, pp. R875-R878, 2014.
[10] F.B.M. de Waal, "Putting the Altruism Back into Altruism: The Evolution of Empathy," Annu. Rev. Psychol., vol. 59, pp. 279-300, 2008.
[11] A.M. Tullett, E. Harmon-Jones and M. Inzlicht, "Right Frontal Cortical Asymmetry Predicts Empathic Reactions: Support for a Link Between Withdrawal Motivation and Empathy," Psychophysiol., vol. 49, no. 8, pp. 1145-1153, 2012.
[12] J. Zaki and K.N. Ochsner, "The Neuroscience of Empathy: Progress, Pitfalls and Promise," Nat. Neurosci., vol. 15, no. 5, pp. 675-680, 2012.
[13] J.J. Allen, J.A. Coan and M. Nazarian, "Issues and Assumptions on the Road from Raw Signals to Metrics of Frontal EEG Asymmetry in Emotion," Biol. Psychol., vol. 67, no. 1-2, pp. 183-218, 2004.
[14] A.M. Tullett, E. Harmon-Jones and M. Inzlicht, "Right Frontal Cortical Asymmetry Predicts Empathic Reactions: Support for a Link Between Withdrawal Motivation and Empathy," Psychophysiol., vol. 49, no. 8, pp. 1145-1153, 2012.
[15] C.D. Marci, J. Ham, E. Moran and S.P. Orr, "Physiologic Correlates of Perceived Therapist Empathy and Social-Emotional Process During Psychotherapy," J. Nerv. Ment. Dis., vol. 195, no. 2, pp. 103-111, 2007.
[16] G. Rizzolatti and L. Craighero, "The Mirror-Neuron System," Annu. Rev. Neurosci., vol. 27, pp. 169-192, 2004.
[17] E. Tognoli, J. Lagarde, G.C. DeGuzman and J.A.S. Kelso, "The Phi Complex as a Neuromarker of Human Social Coordination," P. Natl. Acad. Sci., vol. 104, pp. 8190-8195, 2007.
[18] H. Benson, H.P. Klemchuk and J.R. Graham, "The Usefulness of the Relaxation Response in the Therapy of Headache," Headache: J. Head Face Pain, vol. 14, no. 1, pp. 49-52, 1974.
[19] R.L. Woolfolk, L. Carr-Kaffashan, T.F. McNulty and P.M. Lehrer, "Meditation Training as a Treatment for Insomnia," Behav. Ther., vol. 7, no. 3, pp. 359-365, 1976.
[20] D. Gromala, X. Tong, A. Choo, M. Karamnejad and C.D. Shaw, "The Virtual Meditative Walk: Virtual Reality Therapy for Chronic Pain Management," Proc. 33rd Annual ACM Conference on Human Factors in Computing Systems, ACM, pp. 521-524, 2015.
[21] C.D. Shaw, D. Gromala and A.F. Seay, "The Meditation Chamber: Enacting Autonomic Senses," Proc. 4th International Conference on Enactive Interfaces, ENACTIVE/07, pp. 405-408, 2007.
[22] R. Gilpin, "The Use of Theravāda Buddhist Practices and Perspectives in Mindfulness-Based Cognitive Therapy," Contemp. Buddhism, vol. 9, no. 2, pp. 227-251, 2008.
[23] D.B. Levinson, E.L. Stoll, S.D. Kindy, H.L. Merry and R.J. Davidson, "A Mind You Can Count On: Validating Breath Counting as a Behavioral Measure of Mindfulness," Front. Psychol., vol. 5, no. 1202, 2014.
[24] J. Sliwinski, M. Katsikitis and C.M. Jones, "A Review of Interactive Technologies as Support Tools for the Cultivation of Mindfulness," Mindfulness, vol. 8, no. 5, pp. 1150-1159, 2017.
[25] M. Prpa, K. Cochrane and B.E. Riecke, "Hacking Alternatives in 21st Century: Designing a Bio-Responsive Virtual Environment for Stress Reduction," S. Serino, A. Matic, D. Giakoumis, G. Lopez and P. Cipresso, eds., Pervasive Computing Paradigms for Mental Health, MindCare 2015, Communications in Computer and Information Science, vol. 604, Springer, Cham, 2016.
[26] A. Shamekhi and T. Bickmore, "Breathe Deep: A Breath-Sensitive Interactive Meditation Coach," N. Minsky and V. Osmani, eds., Proc. PervasiveHealth'18, New York, NY: ACM, pp. 108-117, 2018.
[27] J. Lubar, M. Swartwood, J. Swartwood and P. O'Donnell, "Evaluation of the Effectiveness of EEG Neurofeedback Training for ADHD in a Clinical Setting as Measured by Changes in T.O.V.A. Scores, Behavioral Ratings, and WISC-R Performance," Biofeedback and Self-regulation, vol. 20, pp. 83-99, 1995.
[28] M. Sterman and T. Egner, "Foundation and Practice of Neurofeedback for the Treatment of Epilepsy," Appl. Psychophys. Biof., vol. 31, pp. 21-35, 2006.
[29] C. Sas and R. Chopra, "MeditAid: A Wearable Adaptive Neurofeedback-Based System for Training Mindfulness State," Pers. Ubiquit. Comput., vol. 19, pp. 1169-1182, 2015.
[30] T. Hinterberger, "The Sensorium: A Multimodal Neurofeedback Environment," Advances in Hum.-Comp. Interact., art. no. 724204, 2011.
[31] I. Kosunen, M. Salminen, S. Järvelä, A. Ruonala, N. Ravaja and G. Jacucci, "RelaWorld: Neuroadaptive and Immersive Virtual Reality Meditation System," Proc. 21st International Conference on Intelligent User Interfaces, ACM, pp. 208-217, 2016.
[32] I. Kosunen, A. Ruonala, M. Salminen, S. Järvelä, N. Ravaja and G. Jacucci, "Neuroadaptive Meditation in the Real World," Proc. 2017 ACM Workshop on An Application-Oriented Approach to BCI Out of the Laboratory, pp. 29-33, 2017.
[33] L. Aftanas and S. Golocheikine, "Human Anterior and Frontal Midline Theta and Lower Alpha Reflect Emotionally Positive State and Internalized Attention: High-Resolution EEG Investigation of Meditation," Neurosci. Let., vol. 310, no. 1, pp. 57-60, 2001.
[34] M. Schurmann and E. Başar, "Functional Aspects of Alpha Oscillations in the EEG," Int. J. Psychophysiol., vol. 39, pp. 151-158, 2001.
[35] B.R. Cahn and J. Polich, "Meditation States and Traits: EEG, ERP, and Neuroimaging Studies," Psychol. Bull., vol. 132, no. 2, pp. 180-211, 2006.
[36] J.H. Janssen, W.A. Ijsselsteijn, J.H. Westerink, P. Tacken and G.J. de Vries, "The Tell-Tale Heart: Perceived Emotional Intensity of Heartbeats," Int. J. Synthetic Emot., vol. 4, no. 1, pp. 65-91, 2013.
[37] S. Järvelä, J. Kätsyri, N. Ravaja, G. Chanel and P. Henttonen, "Intragroup Emotions: Physiological Linkage and Social Presence," Frontiers in Psychol., vol. 7, no. 105, 2016.
[38] J.N. Bailenson, J. Blascovich, A.C. Beall and J.M. Loomis, "Interpersonal Distance in Immersive Virtual Environments," Pers. Soc. Psychol. Bulletin, vol. 29, no. 7, pp. 819-833, 2003.
[39] S.G. Hofmann, P. Grossman and D.E. Hinton, "Loving-Kindness and Compassion Meditation: Potential for Psychological Interventions," Clinic. Psychol. Rev., vol. 31, no. 7, pp. 1126-1132, 2011.
[40] P. Grossman and N.T. Van Dam, "Mindfulness, by Any Other Name…: Trials and Tribulations of Sati in Western Psychology and Science," Contemp. Buddhism, vol. 12, no. 1, pp. 219-239, 2011.
[41] A. Ridderinkhof, E.I. de Bruin, E. Brummelman and S.M. Bögels, "Does Mindfulness Meditation Increase Empathy? An Experiment," Self Identity, vol. 16, no. 3, pp. 251-269, 2017.
[42] E. Shonin, W. Van Gordon, A. Compare, M. Zangeneh and M.D. Griffiths, "Buddhist-Derived Loving-Kindness and Compassion Meditation for the Treatment of Psychopathology: A Systematic Review," Mindfulness, vol. 6, no. 5, pp. 1161-1180, 2015.
[43] L.B. Tan, B.C. Lo and C.N. Macrae, "Brief Mindfulness Meditation Improves Mental State Attribution and Empathizing," PLoS One, vol. 9, pp. 1-5, 2014.
[44] A.P. Winning and S. Boag, "Does Brief Mindfulness Training Increase Empathy? The Role of Personality," Pers. Indiv. Differ., vol. 86, pp. 492-498, 2015.
[45] Y. Renard, F. Lotte, G. Gibert, M. Congedo, E. Maby, V. Delannoy, O. Bertrand and A. Lécuyer, "OpenViBE: An Open-Source Software Platform to Design, Test, and Use Brain-Computer Interfaces in Real and Virtual Environments," Presence: Teleop. Virt., vol. 19, no. 1, pp. 35-53, 2010.
[46] C.A. Moyer, M.P. Donnelly, J.C. Anderson, K.C. Valek, S.J. Huckaby, D.A. Wiederholt, R.L. Doty, A.S. Rehlinger and B.L. Rice, "Frontal Electroencephalographic Asymmetry Associated with Positive Emotion Is Produced by Very Brief Meditation Training," Psychol. Sci., vol. 22, no. 10, pp. 1277-1279, 2011.
[47] C.D. Batson, J. Fultz and P.A. Schoenrade, "Distress and Empathy: Two Qualitatively Distinct Vicarious Emotions with Different Motivational Consequences," J. Pers., vol. 55, no. 1, pp. 19-39, 1987.
[48] C.D. Batson, The Altruism Question: Toward a Social-Psychological Answer. Hillsdale, NJ: Erlbaum, 1991.
[49] C.D. Batson, J. Chang, R. Orr and J. Rowland, "Empathy, Attitudes, and Action: Can Feeling for a Member of a Stigmatized Group Motivate One to Help the Group?" Pers. Soc. Psychol. Bullet., vol. 28, no. 12, pp. 1656-1666, 2002.
[50] E. Bagiella, R.P. Sloan and D.F. Heitjan, "Mixed-Effects Models in Psychophysiology," Psychophysiology, vol. 37, no. 1, pp. 13-20, 2000.
[51] D.A. Kenny, D.A. Kashy and W.L. Cook, Dyadic Data Analysis. Guilford Press, 2006.
[52] J.N. Bailenson and N. Yee, "Digital Chameleons: Automatic Assimilation of Nonverbal Gestures in Immersive Virtual Environments," Psychol. Sci., vol. 16, no. 10, pp. 814-819, 2005.
[53] N. Ravaja, G. Bente, J. Kätsyri, M. Salminen and T. Takala, "Virtual Character Facial Expressions Influence Human Brain and Facial EMG Activity in a Decision-Making Game," IEEE Trans. Affect. Comput., vol. 9, no. 2, pp. 1-14, 2016.
[54] B.L. Fredrickson, "The Role of Positive Emotions in Positive Psychology," Am. Psychol., vol. 56, no. 3, pp. 218-226, 2001.
[55] J.H. Janssen, "A Three-Component Framework for Empathic Technologies to Augment Human Interaction," J. Multimodal User Interfaces, vol. 6, no. 3-4, pp. 143-161, 2012.
[56] R.W. Picard, Affective Computing. Cambridge, Mass.: MIT Press, 1997.
[57] A. Vinciarelli, M. Pantic and H. Bourlard, "Social Signal Processing: Survey of an Emerging Domain," Image Vision Comput., vol. 27, no. 12, pp. 1743-1759, 2009.
[58] R. Hari and M.V. Kujala, "Brain Basis of Human Social Interaction: From Concepts to Brain Imaging," Physiol. Rev., vol. 89, no. 2, pp. 453-479, 2009.
[59] G. Molinari, G. Chanel, M. Betrancourt, T. Pun and C. Bozelle Giroud, "Emotion Feedback During Computer-Mediated Collaboration: Effects on Self-Reported Emotions and Perceived Interaction," CSCL 2013 Conference Proceedings, pp. 336-343, 2013.
[60] G. Chanel, G. Molinari, D. Cereghetti, M. Bétrancourt and T. Pun, "Assessment of Computer-Supported Collaborative Processes Using Interpersonal Physiological and Eye-Movement Coupling," Conference on Affective Computing and Intelligent Interaction, Geneva, Switzerland, 2013.
[61] A.J. Hofelich and S.D. Preston, "The Meaning in Empathy: Distinguishing Conceptual Encoding from Facial Mimicry, Trait Empathy, and Attention to Emotion," Cogn. Emot., vol. 26, no. 1, pp. 119-128, 2012.
[62] L.M. Oberman, P. Winkielman and V.S. Ramachandran, "Face to Face: Blocking Facial Mimicry Can Selectively Impair Recognition of Emotional Expressions," Soc. Neurosci., vol. 2, no. 3-4, pp. 167-178, 2007.
[63] S.D. Preston and F.B. De Waal, "Empathy: Its Ultimate and Proximate Bases," Behav. Brain Sci., vol. 25, no. 1, pp. 1-20, 2002.
[64] S.G. Shamay-Tsoory, J. Aharon-Peretz and D. Perry, "Two Systems for Empathy: A Double Dissociation Between Emotional and Cognitive Empathy in Inferior Frontal Gyrus Versus Ventromedial Prefrontal Lesions," Brain, vol. 132, no. 3, pp. 617-627, 2009.
[65] D. Johnson, S. Deterding, K.A. Kuhn, A. Staneva, S. Stoyanov and L. Hides, "Gamification for Health and Wellbeing: A Systematic Review of the Literature," Internet Interventions, vol. 6, pp. 89-106, 2016.
[66] J. Sliwinski, M. Katsikitis and C.M. Jones, "Mindful Gaming: How Digital Games Can Improve Mindfulness," J. Abascal, S. Barbosa, M. Fetter, T. Gross, P. Palanque and M. Winckler, eds., Human-Computer Interaction - INTERACT 2015, Lecture Notes in Computer Science, vol. 9298, Springer, Cham, 2015.

Mikko Salminen received the PhD degree in psychology from the University of Helsinki in 2018. He started his academic career at the Aalto University School of Business in 2003 and currently works at Tampere University. His current research interests include affective processes when using VR, playing games, and during (technology-mediated) social interaction.

Simo Järvelä received his BBA degree in 2001 and his MA (cognitive science) in 2017. He started his research career at the Aalto University School of Business in 2007 and currently works at the University of Helsinki. He is finalizing his doctoral thesis on interpersonal physiological synchrony and social presence by the end of 2019. He has published 11 journal articles, 2 book chapters, 9 scientific conference papers, and 9 design articles. His research interests include emotions, physiological synchrony, political morality, empathy, games, VR, meditation, and bioadaptive systems.

Antti Ruonala is studying for an MA in Game Design at the Aalto University Media Lab in Espoo, Finland. He has a background in 3D visualization and the games industry. Since 2015, he has been working as a research assistant in the Ubiquitous Computing group of the University of Helsinki Department of Computer Science and at HIIT, the Helsinki Institute for Information Technology. His work focuses on research of novel interfaces, notably support of meditation through virtual reality, use of AR in navigating cultural heritage, and mobile app design supporting self-monitoring for behaviour change in gestational diabetes.

Ville J. Harjunen received the Master's degree in social sciences from the University of Helsinki in 2014. He is currently working as a PhD candidate in the research group of Emotional Communication and eHealth led by Prof. Niklas Ravaja in the Faculty of Medicine, University of Helsinki. He has authored and co-authored six international research papers. His topics of research include emotional information processing, affective communication in virtual reality, and the underlying brain processes.

Juho Hamari is a Professor of Gamification and head of the Gamification Group. Dr. Hamari and the Gamification Group operate under Tampere University and the University of Turku, as well as part of the Centre of Excellence in Game Culture Studies. Dr. Hamari received his PhD in information systems science from the Aalto University Business School in 2015. Dr. Hamari has published approximately 100 articles related to topics including gamification, games, and the online economy from the perspectives of consumer behaviour, human-computer interaction, game studies, and information systems science. Dr. Hamari has been cited over 10,000 times, and he was named "information systems scholar of the year" by Tietojenkäsittelytieteen Seura and "Emerging virtual scholar" by the American Educational Research Association (AERA). His research has been featured on the list of most notable articles in computer science by the ACM, and he has received several awards for scientific productivity.

Giulio Jacucci is a Professor at the Department of Computer Science at the University of Helsinki. He was a Professor at the Aalto University Department of Design in 2009-2010. His research field and competencies are in human-computer interaction, in particular multimodal interaction, interactive intent modelling, and mixed reality, providing innovation for applications in information discovery and wellbeing. Giulio Jacucci publishes in various journals and conferences in HCI, ubiquitous computing, and information retrieval and management, with over 5,000 citations in Google Scholar. He has experience in coordinating international projects: he is the coordinator of CO-ADAPT, an H2020 project on Adaptive Environments and Conversational Agent Based Approaches for Healthy Ageing and Work Ability (2018-2022), and has coordinated the FP7 EU ICT projects BeAware, on smart environments for energy awareness, and MindSee, on symbiotic mind-computer interaction for information seeking. He is a co-founder and has served as chairman of the board of MultiTaction.com Ltd., a leader in visual collaboration environments based on proprietary modular screen technology. He is also a co-founder of Etsimo Healthcare Ltd., which offers a platform leveraging AI on top of health data, making it possible for healthcare providers to instantly offer their customers preventive healthcare. He has co-authored two patents in the areas of information seeking and modular screens.

Niklas Ravaja received the PhD degree in psychology from the University of Helsinki, Finland, in 1996. He is a full professor of eHealth and well-being at the University of Helsinki and an expert on emotional and physiological processes during mediated social interaction.

