Collection of scripts and data files for a study testing how episodic features are integrated during encoding
This repository includes scripts and data for the following paper:
The hallmark of episodic memory is recollecting multiple perceptual details tied to a specific spatial-temporal context. To remember an event, it is therefore necessary to integrate such details into a coherent representation during initial encoding. Here we tested how the brain encodes and binds multiple, distinct kinds of features in parallel, and how this process evolves over time during the event itself. We analyzed data from 27 human subjects (16 females, 11 males) who learned a series of objects uniquely associated with a color, a panoramic scene location, and an emotional sound while functional magnetic resonance imaging data were collected. By modeling how brain activity relates to memory for upcoming or just-viewed information, we were able to test how the neural signatures of individual features as well as the integrated event changed over the course of encoding. We observed a striking dissociation between early and late encoding processes: left inferior frontal and visuo-perceptual signals at the onset of an event tracked the amount of detail subsequently recalled and were dissociable based on distinct remembered features. In contrast, memory-related brain activity shifted to the left hippocampus toward the end of an event, which was particularly sensitive to binding item color and sound associations with spatial information. These results provide evidence of early, simultaneous feature-specific neural responses during episodic encoding that predict later remembering and suggest that the hippocampus integrates these features into a coherent experience at an event transition.
Scripts to run the task in Psychtoolbox as well as links to stimuli can be found here.
The R script - Encoding-BindingfMRI-Analysis.Rmd - to run all analyses on single-trial betas can be found in the analysis folder, and the corresponding report can be accessed here. The whole-brain parcellation - bindingfmri_wholebrain_ROIs.nii, which uses the 200-parcel atlas from Schaefer et al. (2018) - and a corresponding .csv file with ROI ID numbers can be found in the data folder.
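For reference, a minimal R sketch of how the parcellation and the ROI look-up table might be loaded. This assumes the RNifti package; the .csv filename and its column names are placeholders (only the .nii file is named above), so adjust them to the actual files in the data folder.

```r
library(RNifti)

# whole-brain parcellation: each voxel holds a parcel/ROI ID (0 = outside the atlas)
parcellation <- readNifti("data/bindingfmri_wholebrain_ROIs.nii")

# look-up table of ROI ID numbers (hypothetical filename and column names)
roi_labels <- read.csv("data/bindingfmri_wholebrain_ROI_ids.csv")

# e.g. count voxels per ROI and attach the region labels
voxel_counts <- as.data.frame(table(parcellation[parcellation > 0]),
                              stringsAsFactors = FALSE)
names(voxel_counts) <- c("ROI_ID", "n_voxels")
voxel_counts$ROI_ID <- as.integer(voxel_counts$ROI_ID)
head(merge(roi_labels, voxel_counts, by = "ROI_ID"))
```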
The output of all whole-brain linear mixed effects analyses can be found in the analysis folder, which contains csv files with statistics for every region, nifti files showing whole-brain t statistics as well as those that pass FDR correction for multiple comparisons, and png images visualizing the significant results.
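A minimal sketch of how one of these region-wise csv files could be screened for FDR-significant regions. The column names ("t_value", "p_value") are assumptions, and the correction is re-applied here for illustration; check the actual csv headers before using this.

```r
# read the region-wise statistics for one analysis (file named in the workflow below)
stats <- read.csv("analysis/memorydetail_effects_onsetoffset.csv")

# FDR-correct the per-region p values and keep regions that survive correction
stats$p_fdr <- p.adjust(stats$p_value, method = "fdr")
significant <- subset(stats, p_fdr < 0.05)

# regions ordered by the size of their t statistic
significant[order(-abs(significant$t_value)), ]
```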
The workflow of the main script - Encoding-BindingfMRI-Analysis.Rmd - is as follows:
1. Load the behavioral (memory) data - AllSubjects_bindingfMRI-behavior.csv - and label each trial according to successful retrieval of i) color, ii) sound, and iii) scene features, as well as iv) the total number of details recalled (0-3).
2. Run linear mixed effects models, for every region in the whole-brain parcellation, testing how activity at event onset and offset relates to the number of details subsequently recalled - saved as memorydetail_effects_onsetoffset.csv (a minimal sketch of this kind of model is given after this list).
3. Run control analyses i) re-running the memory-detail models with an additional covariate - saved as memorydetail_effects_covariate.csv -, ii) showing that hippocampal memory correlates at the end of the event are not present throughout the encoding trial - saved as memorydetail_effects_eventepoch.csv -, and iii) showing that the results of the onset- and offset-based analyses look extremely similar to predicting activity during each 1s ITI with memory for the upcoming or preceding event - saved as memorydetail_effects_ITI.csv.
4. Run parallel linear mixed effects models testing memory for each specific feature (color, sound, and scene) at event onset and offset - saved as feature_effects_onsetoffset.csv.
5. Test these feature-specific effects across the event epochs - saved as feature_effects_eventepoch.csv.
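The sketch below illustrates the kind of single-trial mixed model referred to in step 2. It assumes the lme4/lmerTest packages and hypothetical column names ("beta" for the single-trial estimate in one ROI, "n_details" for the number of details recalled, "subject" for the subject ID); the models in the Rmd may include additional terms.

```r
library(lme4)
library(lmerTest)  # adds p values for the fixed effects

# hypothetical input: one row per encoding trial, with the single-trial beta
# for one ROI merged with the behavioral labels from step 1
trials <- read.csv("AllSubjects_bindingfMRI-behavior_with_betas.csv")

# does onset (or offset) activity scale with the number of details later recalled,
# allowing each subject their own intercept?
m <- lmer(beta ~ n_details + (1 | subject), data = trials)
summary(m)$coefficients["n_details", ]
```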
All data included in this project is licensed under the CC BY 4.0 license, and all code is licensed under the MIT license.
Please direct any comments to Rose Cooper, rose.cooper at bc.edu. Feel free to use any of these scripts, but unfortunately I cannot provide support for adapting them to your own data. Notice a bug? Please tell me.