It has been a while, but at last I am resurrecting the fMRIPrep preprocessing pipeline. There have been a few updates and tweaks to the xnat-tools put out by the CCV folks, which also come with great documentation. Shoutout to Isabel Restrepo. 🙂
I won’t put the scripts in this post, but you should be able to access and download them from my GitHub Repo.
Step 1: TCB_xnat2bids_bidspostprocess_parallel.sh
This script pulls DICOMs from XNAT to the server and stores them in an xnat-export folder (note: naming irregularities should be noted in a .json file), converts the DICOMs to NIfTI format, and then converts them to BIDS format, which is stored in a bids folder. Note that you will need the accession number associated with each subject, which you can find in the XNAT GUI. Alternatively, you can go to your “https://bnc.brown.edu/xnat/data/experiments/” URL to identify the accession number associated with your subject. Different institutions may have different specifics with respect to the URL, but it should generate a list of your acquired scans like the following:

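If you prefer the command line, you can pull the same list with curl against the XNAT REST API. This is a minimal sketch, and the query parameters may vary slightly across XNAT instances:

```bash
# Minimal sketch: list experiments (and their accession numbers) from the XNAT REST API.
# You will be prompted for your password; the ?format=csv parameter is standard XNAT,
# but double-check against your institution's instance.
curl -u "$USER" "https://bnc.brown.edu/xnat/data/experiments?format=csv" | head
```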
Next, you will want to edit and submit the scripts to the cluster. A minor addition to these scripts is that, since we have collected field maps, the “bidspostprocess” function updates the associated .json file to include the runs that the field maps should be applied to (more detail below). My scripts are set up so that all you will need to edit is the input to the associative array (a sketch of what that looks like is below). Note that our T1 MPRAGE scan spits out both an RMS and an expanded image, and we want to ignore the non-RMS T1 scan. Once we update this info, we can run the script.
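Conceptually, the associative array is just a mapping from subject label to accession number. Here is an illustrative sketch only; the labels, accession numbers, and variable name are made up, so use whatever the script actually defines:

```bash
# Illustrative only: map each subject label to its XNAT accession number.
declare -A SUBJECTS=(
  [sub-001]="XNAT_E00101"
  [sub-002]="XNAT_E00102"
)

# The parallelized script can then loop over this array and handle one subject per job.
for subj in "${!SUBJECTS[@]}"; do
  echo "${subj} -> ${SUBJECTS[$subj]}"
done
```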

Navigate to the directory with the scripts and then submit the job to the cluster.
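For example, something along these lines (depending on how the parallelization is set up, you may also need to pass a subject label or array index as an argument):

```bash
cd /path/to/your/scripts   # wherever you cloned the repo
sbatch TCB_xnat2bids_bidspostprocess_parallel.sh

# Optionally, keep an eye on the queued jobs:
squeue -u "$USER"
```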


Once the job has been completed successfully, we should see the following:
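One way to check is to confirm that the jobs finished cleanly and that the two output folders mentioned above exist. The paths and layout sketched in the comments are placeholders and will depend on your project and xnat-tools version:

```bash
# Confirm the SLURM jobs finished without errors.
sacct -u "$USER" --format=JobID,JobName%40,State,Elapsed | tail

# Then look for the output folders described above (paths here are placeholders):
ls /path/to/your/study
# xnat-export/   <- raw DICOMs pulled from XNAT
# bids/          <- BIDS-formatted NIfTI images and .json sidecars
```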

Step 2: TCB_bids-validator.sh
Next, we will want to check that the data are BIDS-validated. If you look in the “bids” folder, and specifically in the func folder (for task-based fMRI), the data should be in BIDS format.
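For example, a single task run should produce a NIfTI image plus a .json sidecar named according to the BIDS convention. The subject and task labels below are made up, and your bids path may include additional study and session subfolders:

```bash
ls bids/sub-001/func/
# sub-001_task-mytask_run-01_bold.nii.gz
# sub-001_task-mytask_run-01_bold.json
```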

Additionally, you will want to check that bidspostprocess worked; if it did, you will see an “IntendedFor” field in the field map’s .json file listing the functional runs it will be applied to.
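A quick way to spot-check this from the terminal (the filenames and run labels here are illustrative):

```bash
# Print the IntendedFor entry from a field map sidecar; paths are relative to the subject folder.
grep -A 3 '"IntendedFor"' bids/sub-001/fmap/*.json
# "IntendedFor": [
#     "func/sub-001_task-mytask_run-01_bold.nii.gz",
#     "func/sub-001_task-mytask_run-02_bold.nii.gz"
# ],
```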


Next, you can run the bids-validator script, which tests for BIDS compliance within the bids folder. Here, I haven’t specified the events.tsv files (I will do that later), so the bids-validator just gives me a warning. But otherwise it looks great.
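If you ever want to run the validator by hand outside of the script, the standalone CLI just needs to be pointed at the bids directory (assuming bids-validator, or the container your cluster provides, is on your path):

```bash
# Manual check for BIDS compliance; warnings (e.g., missing events.tsv) are not fatal.
bids-validator bids/ --verbose
```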


Step 3: TCB_fmriprep_fieldmap_ica_parallel.sh
Next, we will run fMRIPrep with field map correction and ICA-AROMA. The script is parallelized so that participants are sent to separate nodes. (There is more information in the header of the bash script that is not included below.)
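The core of the per-participant call looks something like the sketch below. All paths and the container version are placeholders; note that --use-aroma only exists in older fMRIPrep releases, and that field map correction is applied automatically once the IntendedFor fields from Step 1 are in place:

```bash
# Sketch of a single-participant fMRIPrep call via Singularity (paths/version are placeholders).
singularity run --cleanenv /path/to/fmriprep-<version>.simg \
    bids/ derivatives/ participant \
    --participant-label 001 \
    --fs-license-file /path/to/freesurfer/license.txt \
    --use-aroma \
    -w /path/to/scratch/work
```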

Step 4: TCB_postfmriprep_data.sh
This step unzips the .nii.gz files that fMRIPrep generates and puts all of the relevant functional and anatomical NIfTI images into an spm-data folder for GLM analyses. This is an optional step, and it is highly variable depending on which software you wish to perform your analyses in.
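In essence, the step boils down to something like this sketch (the derivatives paths follow fMRIPrep’s default naming, but the spm-data layout here is just illustrative):

```bash
# Illustrative sketch: copy preprocessed outputs and unzip them so SPM can read them.
mkdir -p spm-data/sub-001
cp derivatives/sub-001/func/*desc-preproc_bold.nii.gz spm-data/sub-001/
cp derivatives/sub-001/anat/*desc-preproc_T1w.nii.gz  spm-data/sub-001/
gunzip spm-data/sub-001/*.nii.gz
```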
Anyways, that’s all for preprocessing. Stay tuned for more updated scripts for generating stimulus onset files and setting up first- and second-level GLMs in SPM.