Date of Award

12-2020

Document Type

Campus Access Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Developmental and Brain Sciences

First Advisor

Vivian M. Ciaramitaro

Second Advisor

Erik Blaser

Third Advisor

Richard Hunter

Abstract

Correctly interpreting emotional information from faces is a crucial part of social interaction. While emotional information can be extracted from a face alone, faces are usually accompanied by emotional sounds, and combining information from faces and voices can optimize emotional processing. Although some studies have examined the processing of multisensory emotional information, the results are equivocal, and there is little literature on the effect of attention on multimodal emotional processing. Previous studies have suggested that children might process multimodal stimuli differently from adults, yet few studies have examined whether multisensory emotional information provides an advantage for children. The goal of this dissertation was to examine the influence of a heard emotion on the judgement of faces, and the influence of attention on this process, in both children and adults.

In Chapter 2, I present a set of experiments run in adults examining the conditions under which congruent stimuli (visual and auditory stimuli that match in emotional valence) afford a perceptual advantage. I found that emotional sounds did not influence the judgement of emotional faces when face stimuli were fully salient, when numerosity was matched between visual and auditory stimuli, or when face stimuli were reduced in salience. However, a congruent advantage emerged when face stimuli were reduced in salience and attention was directed towards the emotion in the faces.

In Chapter 3, I present a similar set of experiments run in children at the Museum of Science. In agreement with the adult data, children showed no congruent advantage when faces were fully salient or reduced in salience. However, we failed to replicate the finding that the combination of reduced face salience and increased attention revealed a congruent advantage. These findings contribute to our understanding of the development of multisensory interactions and are in line with previous data showing that multisensory emotional processing follows a U-shaped developmental curve.

In Chapter 4, I present a study in adults following up on the attention result from Chapter 2. In this experiment, we directed participants' attention towards either the emotional information or the gender information in the faces, to test the hypothesis that emotional information is processed only when there are sufficient attentional resources to do so. We did not observe a congruent advantage or an effect of attention, though this is likely due to our small sample size.

In Chapter 5, I present a study using steady-state visual evoked potentials to examine the neural underpinnings of emotional multisensory integration and the effect of attention in both children and adults. This study has been preregistered with the journal Cognition and Emotion and has received Stage 1 approval. Data collection has been delayed due to the pandemic but will begin when it is safe to do so. Here, I describe the background literature and the planned methods in detail.

Taken together, these findings suggest that emotionally congruent stimuli provide a perceptual advantage over incongruent stimuli, but only under certain conditions. These results were not paralleled in children, however, suggesting a possible U-shaped developmental trajectory for multisensory emotional processing.

Comments

Free and open access to this Campus Access Dissertation is made available to the UMass Boston community by ScholarWorks at UMass Boston. Those not on campus and those without a UMass Boston campus username and password may gain access to this dissertation through resources like Proquest Dissertations & Theses Global or through Interlibrary Loan. If you have a UMass Boston campus username and password and would like to download this work from off-campus, click on the "Off-Campus UMass Boston Users" link above.
