Datasets within this collection
121 results
- Data for analysis steps. Database to
- Subject personal evaluation of hand and fingers. Complete individual evaluation, in cm, of the hand and fingers of sighted and blind participants.
- A Nonverbal Signs Dataset for the Italian Population. 1,522 colorful stimuli of spontaneous social communication, for experimental settings including EEG/ERP experiments. The stimuli are made available with the understanding that proper acknowledgment and citation of the source are required in any resulting work or publication. Proper attribution not only ensures academic integrity but also duly recognizes the effort involved in their development. Further details regarding citation can be provided upon request. Source and validation studies: Proverbio AM, Gabaro V, Orlandi A, Zani A. Semantic brain areas are involved in gesture comprehension: An electrical neuroimaging study. Brain Lang. 2015 Aug;147:30-40. doi: 10.1016/j.bandl.2015.05.002. Proverbio AM, Ornaghi L, Gabaro V. How face blurring affects body language processing of static gestures in women and men. Soc Cogn Affect Neurosci. 2018 Jun 1;13(6):590-603. doi: 10.1093/scan/nsy033.
- Unaware Processing of Words Activates Experience-Derived Information in Conceptual-Semantic Brain Networks. Supplementary materials, neuroimaging data, and analysis code for the submitted manuscript of the same title. Authors: Marta Ghio (1,2,*), Barbara Cassone (3,*), Marco Tettamanti (2,3). 1 Faculty of Mathematics and Natural Sciences, Heinrich Heine University Düsseldorf, Düsseldorf, Germany; 2 CIMeC - Center for Mind/Brain Sciences, University of Trento, Italy; 3 Department of Psychology, University of Milan-Bicocca, Milano, Italy. * Equal contribution. Keywords: conceptual-semantic representations; experience; awareness; continuous flash suppression; manipulable objects; emotions.
- (Restricted Access) Data for: "Music Literacy shapes the Specialization of a Right-hemispheric Word Reading area, beyond VWFA". This study investigates the neural mechanisms underlying word reading in professional musicians compared to musically naïve individuals (control group), focusing on the N170 component of ERPs, which indexes orthographic processing. The application of standardized weighted low-resolution electromagnetic tomography (swLORETA) to individual data contributes to the innovative nature of this project. Results showed that musicians had bilateral activation of the Visual Word Form Area (VWFA, BA37), in contrast to controls, who showed a clearly left-lateralized activation of the middle occipital gyrus (MOG, BA19). Musicians also showed enhanced reading skills compared to controls. It is thought that musicians develop this extra reading region to process the spatial and holistic aspects of musical notation. ERPs were recorded in 80 participants (men and women, musicians and non-musicians). The study involved the visual presentation of 300 Italian words of different length and complexity, presented randomly on a computer screen, as described in detail in the study by Proverbio et al. (2013). The words, written in upper case, ranged from 4 to 10 letters. In a recognition task, participants had to press a key when they saw a specific target letter within a word, depending on the experimental condition, while ignoring non-target letters. Words lasted 1,600 ms, and the interstimulus interval (ISI) varied randomly between 1,000 and 1,200 ms. ERPs were averaged from -100 to 1,200 ms. The N170 component was quantified between 150 and 190 ms. swLORETA was applied to N170 responses during word reading in both groups. The full list of dipoles and the neuroimaging data are presented here.
The data are a compendium to the paper "Music Literacy shapes the Specialization of a Right-hemispheric Word Reading area" and include raw data collected from 2013 to 2023 at the Cognitive Lab ERP of UNIMIB for the "Neuroscience of Music" project. Related papers: Pantaleo MM, Arcuri G, Manfredi M, Proverbio AM. Music literacy improves reading skills via bilateral orthographic development. Sci Rep. 2024 Feb 12;14(1):3506. Proverbio AM, Manfredi M, Zani A, Adorni R. Musical expertise affects neural bases of letter recognition. Neuropsychologia. 2013 Feb;51(3):538-49.
- Mapping the Emotional Homunculus with fMRI. Emotions are commonly associated with bodily sensations, e.g., boiling with anger when overwhelmed with rage. Studies have shown that emotions are related to specific body parts, suggesting that somatotopically organized cortical regions that commonly respond to somatosensory and motor experiences might be involved in the generation of emotions. We used functional magnetic resonance imaging to investigate whether the subjective feelings of emotion are accompanied by the activation of somatotopically defined sensorimotor brain regions, thus aiming to reconstruct an "emotional homunculus". By mapping the convergence of brain activation patterns evoked by self-generated emotions during scanning onto a sensorimotor map created from participants' tactile and motor brain activity, we showed that all the evoked emotions activated parts of this sensorimotor map, yet with considerable overlap among different emotions. Although we could not find a highly specific segmentation of discrete emotions over sensorimotor regions, our results support an embodied experience of emotions. Here, you will find the original code to digitise emBODY pen-and-paper silhouettes.
- Illumination and gaze effects on face evaluation: the Bi-AGI Database. Face evaluation and first-impression generation can be affected by multiple face elements, such as invariant facial features, gaze direction and environmental context; however, the combined modulation of eye gaze and illumination on faces of different gender and age had not been previously investigated. We aimed to test how these different facial and contextual features affect ratings of social attributes. Thus, we created and validated the Bi-AGI Database, a freely available new set of male and female face stimuli varying in age across the lifespan from 18 to 87 years, gaze direction and illumination conditions. Judgments on attractiveness, femininity-masculinity, dominance and trustworthiness were collected for each stimulus. Results show that the different variables interact in modulating social trait attribution; in particular, illumination affects ratings differently across age, gaze and gender, with less impact on older adults and a greater effect on young faces.
- (Restricted Access) Effect of race on Gaze Cueing in adults with high and low autistic traits. In an online study of White Italian adults, we investigated whether individuals with high autistic traits (measured by the Autism Spectrum Quotient, AQ) show reduced implicit racial bias (measured by the Implicit Association Test), and whether this bias leads to differences in the gaze cueing effect (GCE) triggered by the gaze direction of faces of different races. Results showed that participants with high and low-to-medium autistic traits had the same ingroup bias. Interestingly, females with high autistic traits showed a significant GCE only for White faces. With Black cueing faces, however, the GCE decreased or was absent as the AQ score increased. In females, higher levels of autistic traits are associated with a reduced ability to orient attention toward the outgroup.
- (Restricted Access) Pictionary-based communication tool for assessing individual needs and motivational states in locked-in patients: P.A.I.N. set. This set of communication tools is intended for understanding the needs of locked-in (LIS) patients unable to communicate through verbal, sign, or body language, and is suited to communication through a Brain Computer Interface (with the P300 paradigm, speller, machine learning, EEG/ERP classification, or eye tracking). It comprises 60 validated, easily comprehensible pictorial tables depicting adults in various contexts that affect their physiological or psychological state; their motivational states are illustrated in a small cloud. The drawings are in color and representative of both sexes and various ethnicities. THIS MATERIAL CAN BE FREELY USED FOR RESEARCH OR CLINICAL PURPOSES, PROVIDED THAT APPROPRIATE CREDIT IS GIVEN TO THE SOURCE: Proverbio, A.M., Pischedda, F. (2022). Validation of a Pictionary-based communication tool for assessing individual needs and motivational states in locked-in patients (P.A.I.N. set). Terms and conditions: This material cannot be used for commercial purposes; therefore it cannot be transferred to technological devices of any type or be sold. This material cannot be placed on any internet website, nor can it be provided to profit-making companies or to the media (television, journals). To request the P.A.I.N. set for clinical use, or for non-profit academic research purposes, write to mado.proverbio@unimib.it. We do not accept requests directly from students, nor do we accept any requests from students on behalf of their advisor. Students should ask their faculty advisor to make the request.
- (Restricted Access) Data related to the article "Facemasks impair the recognition of facial expressions that stimulate empathy (sadness, fear and disgust): an ERP study". In this study, ERPs were recorded in a group of 26 healthy male and female students engaged in recognizing 6 facial expressions of people wearing or not wearing surgical masks. Faces were preceded by the visual presentation of emotion prime words that were congruent or incongruent with the facial expression. This collection contains grand-average waveforms of Event-Related Potentials (ERPs) recorded in healthy right-handed humans from 128 electrode sites at a 512 Hz sampling rate; the time epoch is 800 ms. ERP data were processed in two different ways: i) considering all EEG trials, or ii) considering only trials not associated with an incorrect response. However, due to the largely uneven error rate across conditions (e.g., for masked faces or subtle emotions), ranging from 3.22% to 55.66%, the second method was considered methodologically unsound, and the more conservative and rigorous approach was preferred. Differences across conditions (masked vs. unmasked) were in fact larger when considering only correct responses. It should be noted that ERPs computed on correct recognitions mostly contained signals reflecting the processing of easier-to-recognize stimuli (e.g., happy unmasked faces).
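As a purely illustrative sketch of the data layout described above (128 electrode sites, 512 Hz sampling rate, 800 ms epochs, grand-average waveforms), the following Python/NumPy snippet shows how such arrays are typically shaped and how per-subject ERPs and a grand average are computed. The trial counts and simulated values are hypothetical, not taken from the dataset:

```python
import numpy as np

# Parameters stated in the dataset description
SFREQ = 512           # sampling rate, Hz
EPOCH_MS = 800        # epoch length, ms
N_CHANNELS = 128      # electrode sites

n_samples = int(EPOCH_MS / 1000 * SFREQ)  # 409 samples per epoch

rng = np.random.default_rng(0)

# Hypothetical single-subject data: (n_trials, n_channels, n_samples)
trials = rng.normal(size=(50, N_CHANNELS, n_samples))

# Subject-level ERP: average across trials
subject_erp = trials.mean(axis=0)  # shape (128, 409)

# Grand average: mean of subject ERPs across the group (here 26 subjects)
subject_erps = [rng.normal(size=(N_CHANNELS, n_samples)) for _ in range(26)]
grand_average = np.mean(subject_erps, axis=0)  # shape (128, 409)

print(subject_erp.shape, grand_average.shape)
```

In practice such averaging is usually done with dedicated EEG software; the sketch only makes the array dimensions implied by the description concrete.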