Bicocca Open Archive Research Data

Datasets within this collection

77 results
  • Unaware Processing of Words Activates Experience-Derived Information in Conceptual-Semantic Brain Networks
    Supplementary materials, neuroimaging data, and analysis code for the submitted manuscript "Unaware Processing of Words Activates Experience-Derived Information in Conceptual-Semantic Brain Networks", by Marta Ghio (1,2,*), Barbara Cassone (3,*), and Marco Tettamanti (2,3). (1) Faculty of Mathematics and Natural Sciences, Heinrich Heine University Düsseldorf, Düsseldorf, Germany; (2) CIMeC - Center for Mind/Brain Sciences, University of Trento, Italy; (3) Department of Psychology, University of Milan-Bicocca, Milan, Italy. (*) Equal contribution. Keywords: conceptual-semantic representations; experience; awareness; continuous flash suppression; manipulable objects; emotions.
    • Dataset
  • Data for: "Music Literacy shapes the Specialization of a Right-hemispheric Word Reading area, beyond VWFA"
    This study investigates the neural mechanisms underlying word reading in professional musicians compared to musically naïve individuals (control group), focusing on the N170 component of ERPs dedicated to orthographic processing. The application of standardized weighted low-resolution electromagnetic tomography (swLORETA) to individual data contributes to the innovative nature of this project. The results showed that musicians exhibited bilateral activation of the Visual Word Form Area (VWFA, BA37), whereas controls showed clearly left-lateralized activation of the middle occipital gyrus (MOG, BA19). Musicians also showed enhanced reading skills compared to controls. Musicians are thought to develop this additional reading region to process the spatial and holistic aspects of musical notation. ERPs were recorded in 80 participants (men and women, musicians and non-musicians). The study involved the visual presentation of 300 Italian words of varying length and complexity, presented in random order on a computer screen, as described in detail in Proverbio et al. (2013). The words, written in upper case, ranged from 4 to 10 letters. Participants performed a recognition task in which they had to press a key when they saw a specific target letter within a word, depending on the experimental condition, while ignoring non-target letters. Words lasted 1,600 ms, and the interstimulus interval (ISI) varied randomly between 1,000 and 1,200 ms. ERPs were averaged from -100 to 1,200 ms, and the N170 component was quantified between 150 and 190 ms (an illustrative quantification sketch follows this listing). swLORETA was applied to the N170 responses during word reading in both groups. The full list of dipoles and the neuroimaging data are presented here. The data are a compendium to the paper "Music Literacy shapes the Specialization of a Right-hemispheric Word Reading area" and include raw data collected from 2013 to 2023 at the Cognitive ERP Lab of UNIMIB for the "Neuroscience of Music" project. Related papers: Pantaleo MM, Arcuri G, Manfredi M, Proverbio AM. Music literacy improves reading skills via bilateral orthographic development. Sci Rep. 2024 Feb 12;14(1):3506. Proverbio AM, Manfredi M, Zani A, Adorni R. Musical expertise affects neural bases of letter recognition. Neuropsychologia. 2013 Feb;51(3):538-49.
    • Dataset
  • Mapping the Emotional Homunculus with fMRI
    Emotions are commonly associated with bodily sensations, e.g., boiling with anger when overwhelmed by rage. Studies have shown that emotions are related to specific body parts, suggesting that somatotopically organized cortical regions that commonly respond to somatosensory and motor experiences might be involved in the generation of emotions. We used functional magnetic resonance imaging to investigate whether subjective feelings of emotion are accompanied by the activation of somatotopically defined sensorimotor brain regions, thus aiming to reconstruct an “emotional homunculus”. By mapping the brain activation patterns evoked by self-generated emotions during scanning onto a sensorimotor map created from participants’ tactile and motor brain activity, we showed that all the evoked emotions activated parts of this sensorimotor map, yet with considerable overlap among different emotions. Although we could not find a highly specific segmentation of discrete emotions over sensorimotor regions, our results support an embodied experience of emotions. Here you will find the original code to digitise emBODY pen-and-paper silhouettes (an illustrative digitisation sketch follows this listing).
    • Dataset
  • Illumination and gaze effects on face evaluation: the Bi-AGI Database
    Face evaluation and first impression generation can be affected by multiple face elements, such as invariant facial features, gaze direction, and environmental context; however, the composite modulation exerted by eye gaze and illumination on faces of different genders and ages had not been previously investigated. We aimed to test how these different facial and contextual features affect ratings of social attributes. We therefore created and validated the Bi-AGI Database, a freely available new set of male and female face stimuli varying in age across the lifespan from 18 to 87 years, in gaze direction, and in illumination conditions. Judgments on attractiveness, femininity-masculinity, dominance, and trustworthiness were collected for each stimulus. Results show that the different variables interact in modulating social trait attribution; in particular, illumination affects ratings differently across age, gaze, and gender, with less impact on older adult faces and a greater effect on young faces.
    • Dataset
  • Effect of race on Gaze Cueing in adults with high and low autistic traits
    In an online study, we investigated whether White Italian adults with high autistic traits (measured by the Autism-spectrum Quotient, AQ) show reduced implicit racial bias (measured by the Implicit Association Test) and whether this bias leads to differences in the gaze cueing effect (GCE) triggered by the gaze direction of faces of different races. Results showed that participants with high and low-to-medium autistic traits had the same ingroup bias. Interestingly, females with high autistic traits showed a significant GCE only for White faces; with Black cueing faces, the GCE decreased or was absent as the AQ score increased. In females, a higher level of autistic traits is thus associated with a reduced ability to orient attention toward the outgroup.
    • Dataset
  • Pictionary-based communication tool for assessing individual needs and motivational states in locked-in patients: P.A.I.N. set
    This set of communication tools is intended for understanding the needs of locked-in (LIS) patients unable to speak or communicate through verbal, sign, or body language, and is well suited to communication through a Brain-Computer Interface (with the P300 paradigm, speller, machine learning, EEG/ERP classification, eye tracking). It comprises 60 validated, easily comprehensible pictorial tables depicting adults in various contexts that affect their physiological or psychological state; their motivational states are illustrated in a small cloud. The drawings are in color and representative of both sexes and various ethnicities. THIS MATERIAL CAN BE FREELY USED FOR RESEARCH OR CLINICAL PURPOSES, PROVIDED THAT APPROPRIATE CREDIT IS GIVEN TO THE SOURCE: Proverbio, A.M., Pischedda, F. (2022). Validation of a Pictionary-based communication tool for assessing individual needs and motivational states in locked-in patients (P.A.I.N. set). Terms and conditions: this material cannot be used for commercial purposes; it therefore cannot be transferred to technological devices of any type or be sold. This material cannot be placed on any internet website, nor can it be provided to profit-making companies or to the media (television, journals). To request the P.A.I.N. set for clinical use, or for non-profit academic research purposes, write to mado.proverbio@unimib.it. We do not accept requests directly from students, nor do we accept requests made by students on behalf of their advisor; students should ask their faculty advisor to make the request.
    • Dataset
  • Data related to article "Facemasks impair the recognition of facial expressions that stimulate empathy (sadness, fear and disgust): an ERP study"
    In this study, ERPs were recorded in a group of 26 healthy male and female students engaged in recognizing six facial expressions of people wearing or not wearing surgical masks. Faces were preceded by the visual presentation of emotion prime words that were congruent or incongruent with the facial expression. The data in this collection are grand-average waveforms of event-related potentials (ERPs) recorded in healthy right-handed humans from 128 electrode sites at a 512 Hz sampling rate, with a time epoch of 800 ms (an illustrative grand-averaging sketch follows this listing). ERP data were processed in two different ways: i) considering all EEG trials, or ii) considering only those trials not associated with an incorrect response. However, because of the largely uneven error rate across conditions (e.g., for masked faces or subtle emotions), ranging from 3.22% to 55.66%, the second method was considered methodologically unsound, and the more conservative and rigorous first approach was preferred. Differences across conditions (masked vs. unmasked) were in fact larger when considering only correct responses; it should be noted, though, that ERPs computed on the correct recognitions mostly contained signals reflecting the processing of easier-to-recognize stimuli (e.g., happy unmasked faces).
    • Dataset
  • Data for: ERP markers of visual and auditory imagery: a ‘mind reading’ approach for BCI systems
    Grand-average ERP waveforms recorded during mental imagery and perception of visual and auditory stimuli in a large group of healthy right-handed participants. Visual and auditory stimuli representing biologically relevant categories (e.g., faces, animals, voices) were presented to 30 participants during a perceptual and an imagery condition, in order to collect the corresponding neural electrical signals. Unprecedented electrophysiological markers of imagery (in the absence of sensory stimulation) were identified, showing a specific response at given scalp sites and latencies during the imagination of infants (centroparietal positivity, CPP, and late CPP), human faces (anterior negativity, AN), animals (anterior positivity, AP), music (P300), speech (N400), vocalizations (P2-like), and sensory (visual vs. auditory) modality (PN300). These ERP markers might be valuable tools for BCI systems (pattern recognition, classification, or A.I. algorithms) applied to patients affected by disorders of consciousness (e.g., in a vegetative or comatose state) or locked-in patients (e.g., spinal or ALS patients); an illustrative classification sketch follows this listing.
    • Dataset
  • Data for: Improvising enhances anticipatory skills. A comparison between jazz and classical musicians.
    The dataset included here relates to the research project “Improvising enhances anticipatory skills. A comparison between jazz and classical musicians”, approved by the ethical committee of the University of Milan-Bicocca (RM_2016_78). It includes the performances of two groups of musicians (classical and jazz) and a control group of non-musician participants on an auditory and a visual anticipation task. Three conditions of the task are included for each modality (Rhythm 1 = unstressed condition, Rhythm 2 = stressed condition, and Rhythm 5 = unpredictable condition). Each condition includes 10 trials, reported in 10 columns for each participant. Performances on the Purdue Pegboard battery (Tiffin, 1999) and on the rhythm discrimination task (expressed here as d-prime scores; a worked d-prime example follows this listing) are also included for all participants. The last columns report the average scores and standard deviations across the anticipation task trials.
    • Dataset
  • Data for: Discrimination of ordinal relationships in temporal sequences by 4-month-old infants
    Raw data for habituation and test trials by subject (sex and age).
    • Dataset
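
The "Music Literacy" entry above quantifies the N170 between 150 and 190 ms on epochs averaged from -100 to 1,200 ms. Below is a minimal sketch of that kind of mean-amplitude quantification, assuming baseline-corrected epochs in a NumPy array; the sampling rate, trial and channel counts, and random stand-in data are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

# Epoch limits and the N170 window come from the dataset description;
# the sampling rate and array dimensions are assumed for illustration.
sfreq = 512.0                                   # sampling rate (Hz), assumed
tmin, tmax = -0.100, 1.200                      # epoch limits (s)
n_trials, n_channels = 300, 64                  # assumed dimensions
n_samples = int(round((tmax - tmin) * sfreq))

# epochs: trials x channels x samples of baseline-corrected data (µV);
# random stand-in values replace the real recordings here
epochs = np.random.randn(n_trials, n_channels, n_samples)

erp = epochs.mean(axis=0)                       # average over trials
times = tmin + np.arange(n_samples) / sfreq     # time axis (s)

win = (times >= 0.150) & (times <= 0.190)       # N170 window
n170_mean_amplitude = erp[:, win].mean(axis=1)  # one value per channel
```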
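
The "Emotional Homunculus" entry mentions original code to digitise emBODY pen-and-paper silhouettes. One plausible digitisation step is sketched below under assumptions: a participant's scanned sheet is subtracted from a blank template scan so that pen markings become a binary activation map. All file names, the ink threshold, and the body-mask image are hypothetical, and the archived code may differ.

```python
import numpy as np
from PIL import Image

# Hypothetical inputs: a blank silhouette sheet and a participant's coloured
# sheet, scanned at the same resolution and already aligned.
template = np.asarray(Image.open("blank_silhouette.png").convert("L"), dtype=float)
scan = np.asarray(Image.open("participant_scan.png").convert("L"), dtype=float)

diff = template - scan           # pen strokes darken the page
marked = diff > 30.0             # assumed ink threshold (grey levels)

# Hypothetical region-of-interest image restricting pixels to the body shape
body = np.asarray(Image.open("body_mask.png").convert("L"), dtype=float) > 128
activation_map = marked & body   # per-pixel binary colouring inside the body
```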
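
For the facemasks ERP dataset, the grand-average waveforms are averages of per-subject ERPs (26 subjects, 128 channels, 512 Hz, 800 ms epochs, per the description). A minimal grand-averaging sketch, with random stand-in data in place of the per-subject averages:

```python
import numpy as np

n_subjects, n_channels = 26, 128              # from the description
sfreq, epoch_s = 512.0, 0.800                 # sampling rate (Hz), epoch (s)
n_samples = int(round(epoch_s * sfreq))

# subject_erps: subjects x channels x samples, each entry one subject's
# average ERP for a given condition (random stand-in data here)
subject_erps = np.random.randn(n_subjects, n_channels, n_samples)

grand_average = subject_erps.mean(axis=0)     # channels x samples waveforms
```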
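
The imagery-ERP entry above envisions classification of its markers by BCI systems. Below is a minimal sketch of such a pattern-recognition step using scikit-learn's linear discriminant analysis on hypothetical marker amplitudes; the features, labels, and classifier choice are assumptions, not the authors' method.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials = 200

# X: one row per trial, e.g. mean amplitudes at the marker sites/latencies
# (CPP, AN, AP, P300, N400, PN300); random stand-in values here
X = rng.normal(size=(n_trials, 6))
y = rng.integers(0, 2, size=n_trials)   # imagined-category labels (stand-in)

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())                    # near chance (0.5) on random data
```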
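
Finally, a worked example of the d-prime scores reported for the rhythm discrimination task in the improvisation dataset: d' = Z(hit rate) - Z(false-alarm rate). The log-linear correction for extreme rates used below is an assumption; the authors' exact procedure is not stated.

```python
from scipy.stats import norm

def dprime(hits, misses, false_alarms, correct_rejections):
    """d' = Z(hit rate) - Z(false-alarm rate), with log-linear correction."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# e.g., 18 hits / 2 misses and 4 false alarms / 16 correct rejections
print(dprime(18, 2, 4, 16))   # ~ 1.97
```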