Laurie Bayet

Publications

  1. Bayet L., Behrendt H., Cataldo J., Westerlund A., & Nelson C. A. (in press) Recognition of facial emotions of varying intensities by three-year-olds. Developmental Psychology

  2. Bayet L., Quinn P.C., Laboissière R., Caldara R., Lee K., & Pascalis O. (2017) Fearful but not happy expressions boost face detection in human infants. Proceedings of the Royal Society B: Biological Sciences 284: 20171054 doi: 10.1098/rspb.2017.1054 | PDF SI data
    abstract
    Human adults show an attentional bias towards fearful faces, an adaptive behaviour that relies on amygdala function. This attentional bias emerges in infancy between 5 and 7 months, but the underlying developmental mechanism is unknown. To examine possible precursors, we investigated whether 3.5-, 6- and 12-month-old infants show facilitated detection of fearful faces in noise, compared to happy faces. Happy or fearful faces, mixed with noise, were presented to infants (N = 192), paired with pure noise. We applied multivariate pattern analyses to several measures of infant looking behaviour to derive a criterion-free, continuous measure of face detection evidence in each trial. Analyses of the resulting psychometric curves supported the hypothesis of a detection advantage for fearful faces compared to happy faces, from 3.5 months of age and across all age groups. Overall, our data show a readiness to detect fearful faces (compared to happy faces) in younger infants that developmentally precedes the previously documented attentional bias to fearful faces in older infants and adults. A minimal, illustrative sketch of this type of psychometric-curve analysis is given after the publication list below.
  3. Zinszer B. D., Bayet L., Emberson L. L., Raizada R. D. S., & Aslin R. N. (2017) Decoding semantic representations from functional near-infrared spectroscopy signals. Neurophotonics 5, 011003 doi: 10.1117/1.NPh.5.1.011003 | PDF data+code
    abstract
    This study uses representational similarity-based neural decoding to test whether semantic information elicited by words and pictures is encoded in functional near-infrared spectroscopy (fNIRS) data. In experiment 1, subjects passively viewed eight audiovisual word and picture stimuli for 15 min. Blood oxygen levels were measured using the Hitachi ETG-4000 fNIRS system with a posterior array over the occipital lobe and a left lateral array over the temporal lobe. Each participant’s response patterns were abstracted to representational similarity space and compared to the group average (excluding that subject, i.e., leave-one-out cross-validation) and to a distributional model of semantic representation. Mean accuracy for both decoding tasks significantly exceeded chance. In experiment 2, we compared three group-level models by averaging the similarity structures from sets of eight participants in each group. In these models, the posterior array was accurately decoded by the semantic model, while the lateral array was accurately decoded in the between-groups comparison. Our findings indicate that semantic representations are encoded in the fNIRS data, preserved across subjects, and decodable by an extrinsic representational model. These results are the first attempt to link the functional response pattern measured by fNIRS to higher-level representations of how words are related to each other. A minimal, illustrative sketch of this type of leave-one-out, similarity-based decoding is given after the publication list below.
  4. Bayet L., Quinn P. C., Tanaka J., Lee K., Gentaz E., & Pascalis O. (2015) Face gender influences the looking preference for smiling expressions in 3.5-month-old human infants. PLOS ONE 10, e0129812 doi: 10.1371/journal.pone.0129812 | PDF data+code
    abstract
    Young infants are typically thought to prefer looking at smiling expressions. Although some accounts suggest that the preference is automatic and universal, we hypothesized that it is not rigid and may be influenced by other face dimensions, most notably the face’s gender. Infants are sensitive to the gender of faces; for example, 3-month-olds raised by female caregivers typically prefer female over male faces. We presented neutral versus smiling pairs of faces from the same female or male individuals to 3.5-month-old infants (n = 25), controlling for low-level cues. Infants looked longer to the smiling face when faces were female but longer to the neutral face when faces were male, i.e., there was an effect of face gender on the looking preference for smiling. The results indicate that a preference for smiling in 3.5-month-olds is limited to female faces, possibly reflective of differential experience with male and female faces.
  5. Damon F., Bayet L., Hillairet de Boisferon A., Méary D., Dupierrix E., Quinn P. C., Lee K., & Pascalis O. (2015) Can human eyes prevent perceptual narrowing for monkey faces? Developmental Psychobiology 57, 637-64 doi: 10.1002/dev.21319 | PDF
    abstract
    Perceptual narrowing has been observed in human infants for monkey faces: 6-month-olds can discriminate between them, whereas older infants from 9 months of age display difficulty discriminating between them. The source of the difficulty that infants from 9 months of age have in processing monkey faces has not been clearly identified. It could be due to the structural characteristics of monkey faces, particularly the key facial features that differ from human faces. The current study aimed to investigate whether the information conveyed by the eyes is of importance. We examined whether the presence of Caucasian human eyes in monkey faces allows recognition to be maintained in 6-month-olds and facilitates recognition in 9- and 12-month-olds. Our results revealed that the presence of human eyes in monkey faces maintains recognition for those faces at 6 months of age and partially facilitates recognition of those faces at 9 months of age, but not at 12 months of age. The findings are interpreted in the context of perceptual narrowing and suggest that the attenuation of processing of other-species faces is not reversed by the presence of human eyes. Keywords: infant; perceptual narrowing; monkey faces; human eyes
  6. Bayet L., Pascalis O., Quinn P.C., Lee K., Gentaz E., & Tanaka J. (2015) Angry facial expressions bias gender categorization in children and adults: behavioral and computational evidence. Frontiers in Psychology 6, 346 doi: 10.3389/fpsyg.2015.00346 | PDF data+code
    abstract
    Angry faces are perceived as more masculine by adults. However, the developmental course and underlying mechanism (bottom-up stimulus driven or top-down belief driven) associated with the angry-male bias remain unclear. Here we report that anger biases face gender categorization toward “male” responding in children as young as 5-6 years. The bias is observed for both own- and other-race faces, and is remarkably unchanged across development (into adulthood) as revealed by signal detection analyses (Experiments 1-2). The developmental course of the angry-male bias, along with its extension to other-race faces, combine to suggest that it is not rooted in extensive experience, e.g., observing males engaging in aggressive acts during the school years. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Our findings are thus consistent with stimulus-driven and stereotyped-belief-driven accounts of the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling. Keywords: face, emotion, gender, children, representation, stereotype
  7. Marti S., Bayet L., & Dehaene S. (2015) Subjective report of eye fixations during serial search. Consciousness and Cognition 33, 1-15 doi: 10.1016/j.concog.2014.11.007 | PDF
    abstract
    Humans readily introspect upon their thoughts and their behavior, but how reliable are these subjective reports? In the present study, we explored the consistencies of and differences between the observer’s subjective report and actual behavior within a single trial. On each trial of a serial search task, we recorded eye movements and the participants’ beliefs of where their eyes moved. The comparison of reported versus real eye movements revealed that subjects successfully reported a subset of their eye movements. Limits in subjective reports stemmed from both the number and the type of eye movements. Furthermore, subjects sometimes reported eye movements they actually never made. A detailed examination of these reports suggests that they could reflect covert shifts of attention during overt serial search. Our data provide quantitative and qualitative measures of observers’ subjective reports and reveal experimental effects of visual search that would otherwise be inaccessible. Keywords: Subjective reports, Metacognition, Introspection, Visual search, Eye movements, Attention, Consciousness
  8. Bayet L., Pascalis O., & Gentaz E. (2014) Le développement de la discrimination des expressions faciales émotionnelles chez les nourrissons dans la première année [The development of the discrimination of emotional facial expressions in infants during the first year]. L’Année Psychologique 114, 469-500 doi: 10.4074/S0003503314003030 | PDF
    abstract
    This review presents a synthesis of studies examining the discrimination of emotional facial expressions by infants during the first year of life. These studies show (1) a sensitivity to changes in facial expression, as well as an attraction to happy faces, probably from the first days after birth and certainly during the first months; (2) the ability to distinguish happy faces from other expressions after the first months; (3) a later attraction, around 6 to 7 months, to fearful faces, driven by a modulation of attention; and (4) the emergence, around 6 to 7 months, of the ability to distinguish expressions other than smiling from one another. Finally, we discuss the intrinsically emotional nature of this early discrimination of emotional facial expressions, a question largely left open by the studies reviewed, as well as the causal explanation of its development.
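
The following Python sketch illustrates, on synthetic data, the kind of psychometric-curve analysis described in publication 2 (and in Study 4 of the thesis below): a logistic function with a lapse rate is fitted to detection rates across signal levels. Everything here (the data, signal levels, parameter values, and function names) is invented for illustration; the published analyses modelled continuous, multivariate detection-evidence measures with nonlinear mixed models rather than this simplified single-condition fit.

    # Minimal sketch, on synthetic data, of fitting a logistic psychometric
    # curve to face-detection performance across signal levels.
    # All values below are made up for illustration only.
    import numpy as np
    from scipy.optimize import curve_fit

    def psychometric(x, x0, k, lapse):
        """Logistic psychometric function with threshold x0, slope k, and a lapse rate."""
        return lapse + (1 - 2 * lapse) / (1 + np.exp(-k * (x - x0)))

    rng = np.random.default_rng(0)

    # Hypothetical signal levels (proportion of face signal vs. noise),
    # 40 simulated trials per level, and simulated binary detection outcomes.
    signal_levels = np.repeat(np.linspace(0.05, 0.6, 8), 40)
    true_p = psychometric(signal_levels, x0=0.30, k=15.0, lapse=0.05)
    detected = rng.binomial(1, true_p)

    # Detection rate per level, then a least-squares fit of the curve.
    levels = np.unique(signal_levels)
    rates = np.array([detected[signal_levels == lv].mean() for lv in levels])
    params, _ = curve_fit(psychometric, levels, rates,
                          p0=[0.3, 10.0, 0.02],
                          bounds=([0.0, 0.0, 0.0], [1.0, 100.0, 0.5]))

    x0_hat, k_hat, lapse_hat = params
    print(f"Estimated threshold {x0_hat:.2f}, slope {k_hat:.1f}, lapse {lapse_hat:.3f}")

In the published work the quantity of interest was the difference between such curves for fearful versus happy faces, estimated jointly across infants and age groups; this sketch only shows the basic curve-fitting step for a single hypothetical condition.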
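
Similarly, the sketch below illustrates, on random data, the general logic of the leave-one-out, representational similarity-based decoding described in publication 3: each held-out subject's representational dissimilarity matrix (RDM) is compared to the average RDM of the remaining subjects, and decoding is scored by whether the correct stimulus labeling aligns the two better than random relabelings. The array sizes, the permutation-based scoring, and the function name loo_decoding_accuracy are assumptions made for this illustration and do not reproduce the published pipeline.

    # Minimal sketch of leave-one-out, similarity-based decoding on random
    # data standing in for fNIRS response patterns (stimuli x channels per
    # subject). Illustrative only; not the published analysis pipeline.
    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    n_subjects, n_stimuli, n_channels = 8, 8, 24
    patterns = rng.normal(size=(n_subjects, n_stimuli, n_channels))

    # One representational dissimilarity matrix (RDM) per subject:
    # correlation distance between response patterns for every stimulus pair.
    rdms = np.array([squareform(pdist(p, metric="correlation")) for p in patterns])

    def loo_decoding_accuracy(rdms, n_perm=200):
        """For each held-out subject, test whether the correct stimulus labeling
        aligns their RDM with the group-average RDM (Spearman correlation of the
        upper triangles) better than random relabelings of the stimuli."""
        iu = np.triu_indices(rdms.shape[1], k=1)
        hits = 0
        for s in range(len(rdms)):
            group = np.delete(rdms, s, axis=0).mean(axis=0)
            correct = spearmanr(rdms[s][iu], group[iu])[0]
            null = []
            for _ in range(n_perm):
                order = rng.permutation(rdms.shape[1])
                relabeled = rdms[s][order][:, order]  # permute stimulus labels
                null.append(spearmanr(relabeled[iu], group[iu])[0])
            hits += correct > np.median(null)
        return hits / len(rdms)

    # With random data the score should hover around chance (about 0.5 here).
    print("Leave-one-out decoding score:", loo_decoding_accuracy(rdms))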

Thesis

  • Bayet L. (2015) Le développement de la perception des expressions faciales [The development of facial expression perception] | PDF slides
    abstract
    This thesis examines the development of the perception of emotional facial expressions, situating it within the theoretical framework of face perception: the separation between variant (expression, gaze) and invariant (gender, race) aspects of faces, the role of experience, and social attention. More specifically, we sought to demonstrate, in both children and infants, reciprocal interactions between the perception of angry, smiling, or fearful facial expressions and the perception of gender (Studies 1-2), the perception of gaze (Study 3), and face detection (Study 4). First, we showed that adults and children aged 5 to 12 years tend to categorize angry faces as male (Study 1). Comparing human performance with that of automatic classifiers suggests that this bias reflects the use of certain facial features and second-order relations to determine face gender. The bias was identical at all ages studied, as well as for other-race faces. Second, we tested whether, in infants, the perception of smiling depends on invariant, experience-sensitive dimensions of the face, namely gender and race (Study 2). Infants generally have more experience with female faces of a single race. Infants aged 3.5 months showed a visual preference for smiling faces (with visible teeth, versus neutral, of a familiar race) when those faces were female; the reverse was observed when they were male. The effect was not replicated when the teeth of the smiling faces (of a familiar or unfamiliar race) were not visible. We then sought to generalize these results to an object-referencing task in infants aged 3.5, 9, and 12 months (Study 3). Objects previously referenced by smiling faces were looked at as much as objects previously referenced by neutral faces, regardless of age group or face gender, despite differences in gaze following. Finally, using a univariate measure (visual preference for the face) and a multivariate measure (overall evidence distinguishing the face from noise) of face detection on each trial, combined with modelling of the psychometric curves using nonlinear mixed models, we demonstrated better detection of fearful faces (compared to smiling faces) in phase-scrambled noise in infants aged 3.5, 6, and 12 months (Study 4). These results shed light on the early development and mechanism of the relations between gender and emotion in face perception, as well as of the sensitivity to fear. Keywords: infant, child, perception, face, emotion, facial expression

Electronic copies of publications provided on this website are for individual, non-commercial use only. Copyright belongs to those designated within each publication. Files provided herein are not to be disseminated or reposted without permission of the appropriate entities.


Contact:


Laboratories of Cognitive Neuroscience, Boston Children’s Hospital, 1 Autumn Street, Boston MA, 02215, USA.

Google Scholar | ResearchGate | Twitter | Figshare | ORCID | Publons


Last updated: Sun Jun 03 20:58:42 2018.