By members and collaborators of the Functional Imaging Laboratory

Multi-modal Face Dataset

This dataset was contributed by R. Henson.


This dataset contains EEG, MEG and fMRI data from the same subject within the same paradigm.

It can be used to examine how various measures of face perception, such as the "N170" ERP (EEG), the "M170" ERF (MEG) and fusiform activation (fMRI), are related. For example, the localisation of the generator(s) of the N170 and/or M170 can be constrained by the fMRI activations.

It also includes a high-resolution anatomical MRI image (aMRI) for constructing a head model for the EEG and MEG data, together with data from a Polhemus digitizer that can be used to coregister the EEG and MEG data with the aMRI.
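SPM handles this coregistration internally; as an illustration of the underlying idea, the sketch below aligns digitised fiducial points (nasion, left and right pre-auricular) to their locations marked on the aMRI by a least-squares rigid transform (the Kabsch algorithm). The fiducial coordinates here are hypothetical, purely for demonstration, and do not come from this dataset.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (rotation R, translation t) mapping
    src points onto dst points, via the Kabsch algorithm."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    src_c = src - src.mean(axis=0)          # centre both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])              # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical fiducials (nasion, LPA, RPA) in Polhemus space (cm)
polhemus = np.array([[0.0, 8.5, 0.0], [-7.0, 0.0, 0.0], [7.0, 0.0, 0.0]])
# Simulate the same fiducials marked in aMRI space: rotated and shifted
theta = np.deg2rad(10)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
mri = polhemus @ R_true.T + np.array([1.0, -2.0, 3.0])

R, t = rigid_transform(polhemus, mri)
print(np.allclose(polhemus @ R.T + t, mri))  # → True
```

With only three non-collinear fiducials the rigid transform is exactly determined; additional digitised head-shape points, as collected here, allow a more robust surface-matching fit.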


Overview of the dataset and a step-by-step description of the analysis:
SPM8: manual.pdf
Data Set:
EEG data: multimodal_eeg.zip (662 MB)
MEG data: multimodal_meg.zip (377 MB)
fMRI data: multimodal_fmri.zip (128 MB)
Structural data: multimodal_smri.zip (6.5 MB)
Preprocessed MEG data (SPM12 format): cdbespm12_SPM_CTF_MEG_example_faces1_3D.zip (53 MB)
MEG simulations:
Tutorial: SPM8 manual
Simulated data: meg_simulations.zip (83 MB)

A previous version of this dataset, compatible with SPM5 and described in the tutorial section of the SPM5 manual, can be found here (85 MB).