The musical imagery fMRI dataset was acquired by the Laboratory of Integrated Brain Imaging (LIBI, https://engineering.purdue.edu/libi/). The dataset includes the original stimuli (an 8 min music piece and its visualized version) and the corresponding BOLD fMRI responses.
The auditory stimulus was the first 8 min 22 s of the first movement of Beethoven’s Symphony No. 9 (sampling rate: 11025 Hz). The music was visualized as a movie with Stephen Malinowski’s Music Animation Machine (https://musanim.com). The movie provided real-time visual cues to control the timing and inform the content of musical imagery [1].
This dataset contains all data from the first three subjects (age 23-26, 1 female, average 9.3 years of musical training) who participated in the study, with informed written consent obtained from each subject according to a research protocol approved by the Institutional Review Board at Purdue University.
In the musical-perception sessions, each subject was instructed to listen to the music with his or her eyes closed; no movie was presented. In the musical-imagery sessions, each subject was instructed to imagine the music piece while watching the silent music visualization. The three subjects performed the musical-perception task eight times and the musical-imagery task twelve times, over a period of five days. Each subject also underwent six sessions of resting-state fMRI on a different day; in each of these sessions, the subject was instructed to rest for 8 min with eyes closed without falling asleep. In four additional sessions for subject 1, the music was presented together with the same visual display (the unmuted movie) used in the musical-imagery task, providing a control condition for validating findings from the intra-subject reproducibility analysis.
If you use this data in publications, please cite the data and this paper:
Zhang, Y., Chen, G., Wen, H., Lu, K.-H. & Liu, Z. Musical Imagery Involves Wernicke’s Area in Bilateral and Anti-Correlated Network Interactions in Musicians. Scientific Reports 7, 17066 (2017). https://doi.org/10.1038/s41598-017-17178-4
[1] Malinowski, S. & Turetsky, L. Music Animation Machine. http://www.musanim.com/mam/mamhist.htm (2011; accessed Jul. 8, 2009).
Cite this work
Researchers should cite this work as follows:
- Zhang, Y., Chen, G., Wen, H., Lu, K., Liu, Z. (2018). fMRI Data for Human Subjects During Musical Perception and Imagery. Purdue University Research Repository. doi:10.4231/R7W957B3
MRI/fMRI images were preprocessed using a pipeline similar to that of the Human Connectome Project. The fMRI data are in the folder “/subject*/fmri/”. The subfolder “raw” contains the unpreprocessed fMRI data (NIFTI format), “mni” contains the preprocessed fMRI data in MNI space, and “cifti” contains the fMRI data on the cortical surface template (CIFTI format). The anatomical data are in the folder “/subject*/smri/” (NIFTI format). The NIFTI files can be read by any standard MRI software; the surface data can be read with the Connectome Workbench toolbox developed by the Human Connectome Project (available at https://www.humanconnectome.org/software/connectome-workbench).
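In practice, the NIFTI files would be loaded with standard MRI software or a library such as nibabel. As an illustration of the NIFTI-1 layout only, below is a minimal pure-Python sketch that builds a synthetic 348-byte NIFTI-1 header and parses the fields needed to locate and shape the voxel data. The image dimensions, voxel sizes, and TR used here are placeholders, not the parameters of this dataset.

```python
import struct

# Build a minimal synthetic NIFTI-1 header (348 bytes) for a hypothetical
# 64 x 64 x 36 x 200 4-D fMRI volume. Field offsets follow the NIFTI-1 standard.
hdr = bytearray(348)
struct.pack_into("<i", hdr, 0, 348)                             # sizeof_hdr
struct.pack_into("<8h", hdr, 40, 4, 64, 64, 36, 200, 1, 1, 1)   # dim[0..7]
struct.pack_into("<h", hdr, 70, 16)                             # datatype: float32
struct.pack_into("<h", hdr, 72, 32)                             # bitpix
struct.pack_into("<8f", hdr, 76, 0, 3.0, 3.0, 3.5, 2.0, 0, 0, 0)  # pixdim; [4] = TR
struct.pack_into("<f", hdr, 108, 352.0)                         # vox_offset
struct.pack_into("<4s", hdr, 344, b"n+1\x00")                   # magic (single-file .nii)

def parse_nifti1_header(buf: bytes) -> dict:
    """Parse the NIFTI-1 fields needed to read the voxel data."""
    sizeof_hdr = struct.unpack_from("<i", buf, 0)[0]
    magic = struct.unpack_from("<4s", buf, 344)[0]
    assert sizeof_hdr == 348 and magic in (b"n+1\x00", b"ni1\x00")
    dim = struct.unpack_from("<8h", buf, 40)      # dim[0] = number of dimensions
    pixdim = struct.unpack_from("<8f", buf, 76)   # voxel sizes; pixdim[4] = TR (s)
    return {
        "shape": dim[1:1 + dim[0]],
        "datatype": struct.unpack_from("<h", buf, 70)[0],
        "voxel_size_mm": pixdim[1:4],
        "tr_s": pixdim[4],
        "vox_offset": int(struct.unpack_from("<f", buf, 108)[0]),
    }

info = parse_nifti1_header(bytes(hdr))
print(info["shape"])  # (64, 64, 36, 200)
```

The voxel data follow the header in the same file, starting at `vox_offset` bytes, stored in the data type given by the `datatype` code (16 = float32); the CIFTI surface files use a different (NIFTI-2-based) layout and are best handled by Connectome Workbench.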
Laboratory of Integrated Brain Imaging
This publication belongs to the Laboratory of Integrated Brain Imaging group.