fMRI Data for Human Subjects During Musical Perception and Imagery

Listed in Datasets publications by the Laboratory of Integrated Brain Imaging group

By Yizhen Zhang1, Gang Chen2, Haiguang Wen1, Kun-Han Lu1, Zhongming Liu3

1. Purdue University 2. Scientific and Statistical Computing Core, National Institute of Mental Health, National Institutes of Health 3. Weldon School of Biomedical Engineering, School of Electrical and Computer Engineering, Purdue University

This fMRI dataset includes the original stimuli and the BOLD fMRI responses for a musical imagery study.


Version 1.0 - published on 05 Jan 2018. doi:10.4231/R7W957B3

Licensed under CC0 1.0 Universal



The musical imagery fMRI dataset was acquired by the Laboratory of Integrated Brain Imaging (LIBI). The dataset includes the original stimuli (an 8 min music piece and its visualized version) and the BOLD fMRI responses.

The auditory stimulus was the first 8 min 22 s of the first movement of Beethoven’s Symphony No. 9 (sampling rate: 11,025 Hz). This music piece was visualized as a movie with Stephen Malinowski’s Music Animation Machine. The movie offered real-time visual cues to control the timing and inform the content of musical imagery [1].

This dataset contains all data from the first three subjects (ages 23–26; one female; 9.3 years of musical training on average) who participated in the study. Informed written consent was obtained from each subject according to a research protocol approved by the Institutional Review Board at Purdue University.

In the musical-perception sessions, each subject was instructed to listen to the music with his or her eyes closed while no movie was presented. In the musical-imagery sessions, each subject was instructed to imagine the music piece while watching the silent music visualization. The three subjects performed the musical-perception task eight times and the musical-imagery task twelve times, over a period of five days. Each subject also underwent six sessions of resting-state fMRI on a different day; in each of these sessions, the subject was instructed to rest for 8 min with eyes closed, without falling asleep. In four additional sessions for subject 1, we also presented the music with the same visual display (the unmuted movie) as used in the musical-imagery task, yielding a control condition with which to validate findings from the intra-subject reproducibility analysis.

If you use this data in publications, please cite the data and this paper:

Zhang, Y., Chen, G., Wen, H., Lu, K.-H. & Liu, Z. Musical Imagery Involves Wernicke’s Area in Bilateral and Anti-Correlated Network Interactions in Musicians. Scientific Reports 7, 17066 (2017).


Stimulus visualization:

1. Malinowski, S. & Turetsky, L. Music Animation Machine (2011).



MRI/fMRI images were preprocessed with a pipeline similar to that of the Human Connectome Project. The fMRI data are in the folder “/subject*/fmri/”. The folder “raw” contains the fMRI data without preprocessing (NIFTI format). The folder “mni” contains the preprocessed fMRI data in MNI space. The folder “cifti” contains the fMRI data on the cortical surface template (CIFTI format). The anatomical structural data are in the folder “/subject*/smri/” (NIFTI format). The NIFTI files can be read by any standard MRI software; the CIFTI surface data can be read with the Connectome Workbench toolbox developed by the Human Connectome Project.

