Laurie Bayet

Peer-reviewed publications

  1. Bayet L., Behrendt H., Cataldo J., Westerlund A., & Nelson C. A. (in press) Recognition of facial emotions of varying intensities by three-year-olds. Developmental Psychology

  2. Bayet L., Quinn P.C., Laboissière R., Caldara R., Lee K., & Pascalis O. (2017) Fearful but not happy expressions boost face detection in human infants. Proceedings of the Royal Society B: Biological Sciences 284: 20171054 doi: 10.1098/rspb.2017.1054 | PDF SI data
    abstract
    Human adults show an attentional bias towards fearful faces, an adaptive behaviour that relies on amygdala function. This attentional bias emerges in infancy between 5 and 7 months, but the underlying developmental mechanism is unknown. To examine possible precursors, we investigated whether 3.5-, 6- and 12-month-old infants show facilitated detection of fearful faces in noise, compared to happy faces. Happy or fearful faces, mixed with noise, were presented to infants (N = 192), paired with pure noise. We applied multivariate pattern analyses to several measures of infant looking behaviour to derive a criterion-free, continuous measure of face detection evidence in each trial. Analyses of the resulting psychometric curves supported the hypothesis of a detection advantage for fearful faces compared to happy faces, from 3.5 months of age and across all age groups. Overall, our data show a readiness to detect fearful faces (compared to happy faces) in younger infants that developmentally precedes the previously documented attentional bias to fearful faces in older infants and adults.
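    Illustrative sketch: the analysis above derives a continuous, trial-level measure of face detection evidence and fits psychometric curves to it. The Python snippet below is a minimal, hypothetical illustration of that second step, fitting a logistic psychometric function to simulated data; the stimulus-strength variable, parameter values, and data are assumptions for illustration only, not the authors' published pipeline (see the data link above).

      # Minimal sketch (not the published analysis): fit a logistic psychometric
      # function with a lapse rate to simulated, continuous detection-evidence data.
      import numpy as np
      from scipy.optimize import curve_fit

      def psychometric(x, threshold, slope, lapse):
          # Logistic psychometric function bounded by a symmetric lapse rate.
          return lapse + (1 - 2 * lapse) / (1 + np.exp(-slope * (x - threshold)))

      # Hypothetical data: a stimulus-strength variable (e.g., face-to-noise ratio)
      # and a simulated continuous detection-evidence score for each trial.
      rng = np.random.default_rng(0)
      signal_level = np.repeat(np.linspace(0.1, 0.9, 9), 20)
      evidence = psychometric(signal_level, 0.5, 10.0, 0.05) + rng.normal(0, 0.1, signal_level.size)

      params, _ = curve_fit(psychometric, signal_level, evidence,
                            p0=[0.5, 5.0, 0.02], bounds=([0, 0, 0], [1, 50, 0.5]))
      print("estimated threshold, slope, lapse:", params)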
  3. Zinszer B. D., Bayet L., Emberson L. L., Raizada R. D. S., & Aslin R. N. (2017) Decoding semantic representations from functional near-infrared spectroscopy signals. Neurophotonics 5, 011003 doi: 10.1117/1.NPh.5.1.011003 | PDF data+code
    abstract
    This study uses representational similarity-based neural decoding to test whether semantic information elicited by words and pictures is encoded in functional near-infrared spectroscopy (fNIRS) data. In experiment 1, subjects passively viewed eight audiovisual word and picture stimuli for 15 min. Blood oxygen levels were measured using the Hitachi ETG-4000 fNIRS system with a posterior array over the occipital lobe and a left lateral array over the temporal lobe. Each participant’s response patterns were abstracted to representational similarity space and compared to the group average (excluding that subject, i.e., leave-one-out cross-validation) and to a distributional model of semantic representation. Mean accuracy for both decoding tasks significantly exceeded chance. In experiment 2, we compared three group-level models by averaging the similarity structures from sets of eight participants in each group. In these models, the posterior array was accurately decoded by the semantic model, while the lateral array was accurately decoded in the between-groups comparison. Our findings indicate that semantic representations are encoded in the fNIRS data, preserved across subjects, and decodable by an extrinsic representational model. These results are the first attempt to link the functional response pattern measured by fNIRS to higher-level representations of how words are related to each other.
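    Illustrative sketch: the decoding approach above abstracts each participant's response patterns to a representational similarity space and compares them to a group average with that subject left out. The Python snippet below sketches that leave-one-subject-out comparison on simulated data; array shapes, names, and values are assumptions for illustration only, not the published pipeline (see the data+code link above).

      # Minimal sketch (not the published analysis): leave-one-subject-out
      # representational similarity comparison on simulated response patterns.
      import numpy as np
      from scipy.spatial.distance import pdist
      from scipy.stats import spearmanr

      n_subjects, n_stimuli, n_channels = 8, 8, 24
      rng = np.random.default_rng(1)
      responses = rng.normal(size=(n_subjects, n_stimuli, n_channels))  # simulated patterns

      # Stimulus-by-stimulus dissimilarity (1 - correlation) for each subject.
      rdms = np.array([pdist(responses[s], metric="correlation") for s in range(n_subjects)])

      # Correlate each subject's similarity structure with the mean of all others.
      for s in range(n_subjects):
          group_mean = rdms[np.arange(n_subjects) != s].mean(axis=0)
          rho, _ = spearmanr(rdms[s], group_mean)
          print(f"subject {s}: Spearman rho vs. left-out group mean = {rho:.2f}")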
  4. Bayet L., Quinn P. C., Tanaka J., Lee K., Gentaz E., & Pascalis O. (2015) Face gender influences the looking preference for smiling expressions in 3.5-month-old human infants. PLOS ONE 10, e0129812 doi: 10.1371/journal.pone.0129812 | PDF data+code
    abstract
    Young infants are typically thought to prefer looking at smiling expressions. Although some accounts suggest that the preference is automatic and universal, we hypothesized that it is not rigid and may be influenced by other face dimensions, most notably the face’s gender. Infants are sensitive to the gender of faces; for example, 3-month-olds raised by female caregivers typically prefer female over male faces. We presented neutral versus smiling pairs of faces from the same female or male individuals to 3.5-month-old infants (n = 25), controlling for low-level cues. Infants looked longer to the smiling face when faces were female but longer to the neutral face when faces were male, i.e., there was an effect of face gender on the looking preference for smiling. The results indicate that a preference for smiling in 3.5-month-olds is limited to female faces, possibly reflective of differential experience with male and female faces.
  5. Damon F., Bayet L., Hillairet de Boisferon A., Méary D., Dupierrix E., Quinn P. C., Lee K., & Pascalis O. (2015) Can human eyes prevent perceptual narrowing for monkey faces? Developmental Psychobiology 57, 637-64 doi:10.1002/dev.21319 | PDF
    abstract
    Perceptual narrowing has been observed in human infants for monkey faces: 6-month-olds can discriminate between them, whereas older infants from 9 months of age display difficulty discriminating between them. The source of the difficulty that infants from 9 months of age have in processing monkey faces has not been clearly identified. It could be due to the structural characteristics of monkey faces, particularly the key facial features that differ from human faces. The current study aimed to investigate whether the information conveyed by the eyes is of importance. We examined whether the presence of Caucasian human eyes in monkey faces allows recognition to be maintained in 6-month-olds and facilitates recognition in 9- and 12-month-olds. Our results revealed that the presence of human eyes in monkey faces maintains recognition for those faces at 6 months of age and partially facilitates recognition of those faces at 9 months of age, but not at 12 months of age. The findings are interpreted in the context of perceptual narrowing and suggest that the attenuation of processing of other-species faces is not reversed by the presence of human eyes. Keywords: infant; perceptual narrowing; monkey faces; human eyes
  6. Bayet L., Pascalis O., Quinn P.C., Lee K., Gentaz E., & Tanaka J. (2015) Angry facial expressions bias gender categorization in children and adults: behavioral and computational evidence. Frontiers in Psychology 6, 346 doi:10.3389/fpsyg.2015.00346 | PDF data+code
    abstract
    Angry faces are perceived as more masculine by adults. However, the developmental course and underlying mechanism (bottom-up stimulus driven or top-down belief driven) associated with the angry-male bias remain unclear. Here we report that anger biases face gender categorization toward “male” responding in children as young as 5-6 years. The bias is observed for both own- and other-race faces, and is remarkably unchanged across development (into adulthood) as revealed by signal detection analyses (Experiments 1-2). The developmental course of the angry-male bias, along with its extension to other-race faces, combines to suggest that it is not rooted in extensive experience, e.g., observing males engaging in aggressive acts during the school years. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Our findings are thus consistent with stimulus- and stereotyped-belief-driven accounts of the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling. Keywords: face, emotion, gender, children, representation, stereotype
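    Illustrative sketch: the signal detection analyses mentioned above separate sensitivity from response bias in gender categorization. The Python snippet below shows a standard d' and criterion computation on hypothetical response counts; the counts and the log-linear correction are assumptions for illustration only, not the authors' reported analysis (see the data+code link above).

      # Minimal sketch (not the published analysis): sensitivity (d') and response
      # bias (criterion c) for "male" responses, from hypothetical trial counts.
      from scipy.stats import norm

      def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
          # Log-linear correction (+0.5 / +1) avoids infinite z-scores at rates of 0 or 1.
          hit_rate = (hits + 0.5) / (hits + misses + 1)
          fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
          z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
          return z_hit - z_fa, -0.5 * (z_hit + z_fa)

      # Hypothetical counts: "male" responses to angry male faces (hits/misses)
      # and to angry female faces (false alarms/correct rejections).
      d, c = dprime_and_criterion(hits=42, misses=8, false_alarms=18, correct_rejections=32)
      print(f"d' = {d:.2f}, criterion c = {c:.2f}")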
  7. Marti S., Bayet L., & Dehaene S. (2015) Subjective report of eye fixations during serial search. Consciousness and Cognition 33, 1-15 doi:10.1016/j.concog.2014.11.007 | PDF
    abstract
    Humans readily introspect upon their thoughts and their behavior, but how reliable are these subjective reports? In the present study, we explored the consistencies of and differences between the observer’s subjective report and actual behavior within a single trial. On each trial of a serial search task, we recorded eye movements and the participants’ beliefs of where their eyes moved. The comparison of reported versus real eye movements revealed that subjects successfully reported a subset of their eye movements. Limits in subjective reports stemmed from both the number and the type of eye movements. Furthermore, subjects sometimes reported eye movements they actually never made. A detailed examination of these reports suggests that they could reflect covert shifts of attention during overt serial search. Our data provide quantitative and qualitative measures of observers’ subjective reports and reveal experimental effects of visual search that would otherwise be inaccessible. Keywords: Subjective reports, Metacognition, Introspection, Visual search, Eye movements, Attention, Consciousness
  8. Bayet L., Pascalis O., & Gentaz E. (2014) Le développement de la discrimination des expressions faciales émotionnelles chez les nourrissons dans la première année [The development of emotional facial expression discrimination in infants during the first year]. L’Année Psychologique 114, 469-500 doi:10.4074/S0003503314003030 | PDF
    abstract
    Here we review studies of emotional facial expression discrimination by newborns and infants in the first year of life. These studies show that (1) sensitivity to changes in facial expression and an attraction to smiling faces might exist in newborns and are present in the first months of life, (2) the ability to discriminate joy from several other expressions appears before 6 months of age, (3) older infants (aged 6 or 7 months) show an attraction to fearful faces due to attentional effects, and (4) those older infants begin to develop the ability to discriminate between several expressions other than joy. We then discuss infants' sensitivity to the genuinely emotional content of facial expressions, which remains largely unresolved by the reviewed studies, and some possible causal explanations for its development.

Thesis

  • Bayet L. (2015) Le développement de la perception des expressions faciales [The development of facial expression perception] | PDF slides
    abstract
    This thesis addressed the question of how the perception of emotional facial expressions develops, reframing it in the theoretical framework of face perception: the separation of variant (expression, gaze) and invariant (gender, race) streams, the role of experience, and social attention. More specifically, we investigated how in infants and children the perception of angry, smiling, or fearful facial expressions interacts with gender perception (Studies 1-2), gaze perception (Study 3), and face detection (Study 4). In a first study, we found that adults and 5- to 12-year-old children tend to categorize angry faces as male (Study 1). Comparing human performance with that of several automatic classifiers suggested that this reflects a strategy of using specific features and second-order relationships in the face to categorize gender. The bias was constant over all ages studied and extended to other-race faces, further suggesting that it does not require extensive experience. A second set of studies examined whether, in infants, the perception of smiling depends on experience-sensitive, invariant dimensions of the face such as gender and race (Study 2). Indeed, infants are typically most familiar with own-race female faces. The visual preference of 3.5-month-old infants for open-mouth, own-race smiling (versus neutral) faces was restricted to female faces and reversed for male faces. The effect did not replicate with own- or other-race closed-mouth smiles. We attempted to extend these results to an object-referencing task in 3.5-, 9- and 12-month-olds (Study 3). Objects previously referenced by smiling faces attracted similar attention to objects previously cued by neutral faces, regardless of age group and face gender, and despite differences in gaze following. Finally, we used univariate (face side preference) and multivariate (face versus noise side decoding evidence) trial-level measures of face detection, coupled with non-linear mixed modeling of psychometric curves, to reveal the detection advantage of fearful faces (compared to smiling faces) embedded in phase-scrambled noise in 3.5-, 6-, and 12-month-old infants (Study 4). The advantage was at least as evident in the youngest group as in the two older age groups. Taken together, these results provide insights into the early ontogeny and underlying causes of gender-emotion relationships in face perception and of the sensitivity to fear. Keywords: infant, children, perception, face, emotion, facial expression

Electronic copies of publications provided on this website are for individual, non-commercial use only. Copyright belongs to those designated within each publication. Files provided herein are not to be disseminated or reposted without permission of the appropriate entities.


Contact:


Laboratories of Cognitive Neuroscience, Boston Children’s Hospital, 1 Autumn Street, Boston MA, 02215, USA.

Google Scholar | ResearchGate | Twitter | Figshare | ORCID | Publons


Last updated on Sun Jun 03 20:54:17 2018.