An automatically and computationally reproducible neuroimaging analysis from scratch

This use case sketches the basics of a portable analysis that can be automatically computationally reproduced, starting from the acquisition of a neuroimaging dataset with a magnetic resonance imaging (MRI) scanner up to complete data analysis results:

  1. Two extension packages, datalad-container and datalad-neuroimaging, extend DataLad’s functionality with the ability to work with computational containers and neuroimaging data workflows.

  2. The analysis is conducted in a way that leaves comprehensive provenance (including software environments) all the way from the raw data, and structures study components in a way that facilitates reuse.

  3. It starts with preparing a raw data (dicom) dataset, and subsequently uses the prepared data for a general linear model (GLM) based analysis.

  4. After completion, data and results are archived, and disk usage of the dataset is maximally reduced.

This use case is adapted from the ReproIn/DataLad tutorial by Michael Hanke and Yaroslav Halchenko, given at the 2018 OHBM training course run by ReproNim.

Note

This use case is multi-layered, assumes familiarity with neuroimaging data structures and standards, and some understanding of neuroimaging software. It includes many neuroimaging-specific tools, commands, or concepts that are not covered in the Basics of the handbook.

For a less complex yet still automatically reproducible analysis of public data (without DataLad extensions), please see the hidden section below. This hidden use case demonstrates a more “introductory” automatically reproducible analysis.

An automatically reproducible analysis of public neuroimaging data

This hidden use case sketches the basics of an analysis that can be automatically reproduced by anyone:

  1. Public open data stems from the DataLad superdataset ///.

  2. Automatic data retrieval can be ensured by using DataLad’s commands in the analysis scripts, or the --input specification of datalad run.

  3. Analyses are executed using datalad run and datalad rerun commands to capture everything relevant to reproduce the analysis.

  4. The final dataset can be kept as lightweight as possible by dropping input that can be easily re-obtained.

  5. A complete reproduction of the computation (including input retrieval), is possible with a single datalad rerun command.

This use case is a specialization of Writing a reproducible paper: It is a data analysis that requires and creates large data files, uses specialized analysis software, and is fully automated using solely DataLad commands and tools. While the exact data types, analysis methods, and software mentioned in this use case belong to the scientific field of neuroimaging, the basic workflow is domain-agnostic.

The Challenge

Creating reproducible (scientific) analyses seems to require a lot: One needs to share data, scripts, results, and instructions on how to use data and scripts to obtain the results. A researcher at any stage of their career can struggle to remember which script needs to be run in which order, or to create comprehensible instructions for others on where and how to obtain the data and how to run which script at what point in time. This leads to failed replications, a loss of confidence in results, and major time costs for anyone trying to reproduce others’ analyses, or even their own.

The DataLad Approach

Scientific studies should be reproducible, and with the increasing accessibility of data, there is not much excuse for a lack of reproducibility anymore. DataLad can help with the technical aspects of reproducible science.

For neuroscientific studies, the DataLad superdataset /// provides unified access to a large amount of data. Installing datasets from it into an analysis superdataset makes it easy to share the data together with the analysis. By ensuring that all relevant data are downloaded with datalad get (either called in the analysis scripts themselves, or triggered by the --input specification of datalad run), an analysis can retrieve all required inputs fully automatically during execution. Recording executed commands with datalad run makes it possible to rerun complete analysis workflows with a single command, even if the input data do not exist locally. Combining these three steps allows sharing fully automatically reproducible analyses as lightweight datasets.

Step-by-Step

It always starts with a dataset:

$ datalad create -c yoda demo
[INFO] Creating a new annex repo at /home/me/usecases/repro/demo 
[INFO] Running procedure cfg_yoda 
[INFO] == Command start (output follows) ===== 
[INFO] == Command exit (modification check follows) ===== 
create(ok): /home/me/usecases/repro/demo (dataset)

For this demo we are using two public brain imaging datasets that were published on OpenFMRI.org, and are available from the DataLad superdataset /// (datasets.datalad.org). When installing datasets from this superdataset, we can use its abbreviation ///. The two datasets, ds000001 and ds000002, are installed into the subdirectory inputs/.

$ cd demo
$ datalad clone -d . ///openfmri/ds000001 inputs/ds000001
[INFO] Cloning dataset to Dataset(/home/me/usecases/repro/demo/inputs/ds000001)
[INFO] Attempting to clone from http://datasets.datalad.org/openfmri/ds000001 to /home/me/usecases/repro/demo/inputs/ds000001
[INFO] Attempting to clone from http://datasets.datalad.org/openfmri/ds000001/.git to /home/me/usecases/repro/demo/inputs/ds000001
[INFO] Start counting objects
[INFO] Start compressing objects
[INFO] Start receiving objects
[INFO] Start resolving deltas
[INFO] Completed clone attempts for Dataset(/home/me/usecases/repro/demo/inputs/ds000001)
install(ok): inputs/ds000001 (dataset)
add(ok): inputs/ds000001 (file)
add(ok): .gitmodules (file)
save(ok): . (dataset)
action summary:
  add (ok: 2)
  install (ok: 1)
  save (ok: 1)
$ datalad clone -d . ///openfmri/ds000002 inputs/ds000002
[INFO] Cloning dataset to Dataset(/home/me/usecases/repro/demo/inputs/ds000002)
[INFO] Attempting to clone from http://datasets.datalad.org/openfmri/ds000002 to /home/me/usecases/repro/demo/inputs/ds000002
[INFO] Attempting to clone from http://datasets.datalad.org/openfmri/ds000002/.git to /home/me/usecases/repro/demo/inputs/ds000002
[INFO] Start counting objects
[INFO] Start compressing objects
[INFO] Start receiving objects
[INFO] Start resolving deltas
[INFO] Completed clone attempts for Dataset(/home/me/usecases/repro/demo/inputs/ds000002)
install(ok): inputs/ds000002 (dataset)
add(ok): inputs/ds000002 (file)
add(ok): .gitmodules (file)
save(ok): . (dataset)
action summary:
  add (ok: 2)
  install (ok: 1)
  save (ok: 1)

Both datasets are now registered as subdatasets, and their precise versions (e.g., in the form of the commit shasum of the latest commit) are on record:

$ datalad --output-format '{path}: {gitshasum}' subdatasets
/home/me/usecases/repro/demo/inputs/ds000001: f7fe2e38852915e7042ca1755775fcad0ff166e5
/home/me/usecases/repro/demo/inputs/ds000002: 6b16eff0c9e8d443ee551784981ddd954f657071

DataLad datasets are fairly lightweight: in their minimal form, they contain only pointers to data plus history information. Thus, so far very little data has actually been downloaded:

$ du -sh inputs/
14M	inputs/

Both datasets would actually be several gigabytes in size, once the dataset content gets downloaded:

$ datalad -C inputs/ds000001 status --annex
$ datalad -C inputs/ds000002 status --annex
130 annex'd files (2.3 GB recorded total size)
nothing to save, working tree clean
274 annex'd files (2.7 GB recorded total size)
nothing to save, working tree clean

Both datasets contain brain imaging data and are compliant with the BIDS standard. This makes it really easy to locate particular images and perform analyses across datasets.
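As a rough illustration of why this works, consider how a single glob pattern addresses the same file across datasets. The snippet below builds a throwaway mock layout (hypothetical empty files, not the actual datasets) to demonstrate the pattern used later in this demo:

```shell
# Build a throwaway mock layout (a stand-in for the real BIDS datasets)
tmp=$(mktemp -d)
mkdir -p "$tmp"/inputs/ds000001/sub-01/anat "$tmp"/inputs/ds000002/sub-01/anat
touch "$tmp"/inputs/ds000001/sub-01/anat/sub-01_T1w.nii.gz \
      "$tmp"/inputs/ds000002/sub-01/anat/sub-01_T1w.nii.gz
cd "$tmp"

# Thanks to the uniform BIDS naming, one glob selects
# subject 01's T1-weighted image in every dataset
ls inputs/ds*/sub-01/anat/sub-01_T1w.nii.gz
# prints:
# inputs/ds000001/sub-01/anat/sub-01_T1w.nii.gz
# inputs/ds000002/sub-01/anat/sub-01_T1w.nii.gz
```

The same principle applies to any number of BIDS-compliant datasets: the path to a given image type is predictable, so scripts never need per-dataset special cases.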

Here we will use a small script that performs ‘brain extraction’ using FSL as a stand-in for a full analysis pipeline. The script will be stored inside the code/ directory that the yoda procedure created at the time of dataset creation.

$ cat << EOT > code/brain_extraction.sh
# enable FSL
. /etc/fsl/5.0/fsl.sh

# obtain all inputs
datalad get "\$@"
# perform brain extraction
count=1
for nifti in "\$@"; do
   subdir="sub-\$(printf %03d \$count)"
   mkdir -p "\$subdir"
   echo "Processing \$nifti"
   bet "\$nifti" "\$subdir/anat" -m
   count=\$((count + 1))
done
EOT

Note that this script uses the datalad get command which automatically obtains the required files from their remote source – we will see this in action shortly.
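The sub-NNN output directories the script creates come from zero-padding a running counter with printf. In isolation (with hypothetical file names standing in for the real inputs), the naming logic behaves like this:

```shell
# Same naming scheme as in code/brain_extraction.sh:
# a counter, zero-padded to three digits, one directory per input image
count=1
for nifti in ds000001_T1w.nii.gz ds000002_T1w.nii.gz; do
    subdir="sub-$(printf %03d $count)"
    echo "$nifti -> $subdir/anat"
    count=$((count + 1))
done
# prints:
# ds000001_T1w.nii.gz -> sub-001/anat
# ds000002_T1w.nii.gz -> sub-002/anat
```

This is why the results later appear under sub-001/ and sub-002/ (one directory per processed input), rather than under the subject identifiers of the original datasets.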

We are saving this script in the dataset. This way, we will know exactly which code was used for the analysis. Everything inside of code/ is tracked with Git thanks to the yoda-procedure, so we can see more easily how it was edited over time. In addition, we will “tag” this state of the dataset with the tag setup_done to mark the repository state at which the analysis script was completed. This is optional, but it can help to identify important milestones more easily.

$ datalad save --version-tag setup_done -m "Brain extraction script" code/brain_extraction.sh
add(ok): code/brain_extraction.sh (file)
save(ok): . (dataset)
action summary:
  add (ok: 1)
  save (ok: 1)

Now we can run our analysis code to produce results. However, instead of running it directly, we will run it with DataLad – this will automatically create a record of exactly how this script was executed.

For this demo we will just run it on the structural images (T1w) of the first subject (sub-01) from each dataset. The uniform structure of the datasets makes this very easy. Of course we could run it on all subjects; we are simply saving some time for this demo. While the command runs, you should notice a few things:

  1. We run this command with bash -e to stop at any failure that may occur.

  2. You’ll see the required data files being obtained as they are needed, and only those that are actually required will be downloaded (thanks to the --input specification of datalad run; since a datalad get is also included in the bash script, forgetting an --input specification would not be a problem).

$ datalad run -m "run brain extract workflow" \
  --input "inputs/ds*/sub-01/anat/sub-01_T1w.nii.gz" \
  --output "sub-*/anat" \
  bash -e code/brain_extraction.sh inputs/ds*/sub-01/anat/sub-01_T1w.nii.gz
[INFO] Making sure inputs are available (this may take some time) 
[INFO] == Command start (output follows) ===== 
action summary:
  get (notneeded: 4)
Processing inputs/ds000001/sub-01/anat/sub-01_T1w.nii.gz
Processing inputs/ds000002/sub-01/anat/sub-01_T1w.nii.gz
[INFO] == Command exit (modification check follows) ===== 
get(ok): inputs/ds000001/sub-01/anat/sub-01_T1w.nii.gz (file) [from web...]
get(ok): inputs/ds000002/sub-01/anat/sub-01_T1w.nii.gz (file) [from web...]
add(ok): sub-001/anat.nii.gz (file)
add(ok): sub-001/anat_mask.nii.gz (file)
add(ok): sub-002/anat.nii.gz (file)
add(ok): sub-002/anat_mask.nii.gz (file)
save(ok): . (dataset)
action summary:
  add (ok: 4)
  get (notneeded: 2, ok: 2)
  save (notneeded: 2, ok: 1)

The analysis step is done; all generated results were saved in the dataset. All changes, including the command that caused them, are on record:

$ git show --stat
commit 27ca92d1ec7a0ba8185bfe444a4a31a73b82f555
Author: Elena Piscopia <elena@example.net>
Date:   Tue Jun 23 20:45:32 2020 +0200

    [DATALAD RUNCMD] run brain extract workflow
    
    === Do not change lines below ===
    {
     "chain": [],
     "cmd": "bash -e code/brain_extraction.sh inputs/ds000001/sub-01/anat/sub-01_T1w.nii.gz inputs/ds000002/sub-01/anat/sub-01_T1w.nii.gz",
     "dsid": "a89af03a-b581-11ea-90a2-3119e6b9cf19",
     "exit": 0,
     "extra_inputs": [],
     "inputs": [
      "inputs/ds*/sub-01/anat/sub-01_T1w.nii.gz"
     ],
     "outputs": [
      "sub-*/anat"
     ],
     "pwd": "."
    }
    ^^^ Do not change lines above ^^^

 sub-001/anat.nii.gz      | 1 +
 sub-001/anat_mask.nii.gz | 1 +
 sub-002/anat.nii.gz      | 1 +
 sub-002/anat_mask.nii.gz | 1 +
 4 files changed, 4 insertions(+)

DataLad has enough information stored to be able to re-run a command.

On command exit, it will inspect the results and save them again, but only if they are different. In our case, the re-run yields bit-identical results, hence nothing new is saved.

$ datalad rerun
[INFO] Making sure inputs are available (this may take some time) 
[INFO] == Command start (output follows) ===== 
action summary:
  get (notneeded: 4)
Processing inputs/ds000001/sub-01/anat/sub-01_T1w.nii.gz
Processing inputs/ds000002/sub-01/anat/sub-01_T1w.nii.gz
[INFO] == Command exit (modification check follows) ===== 
unlock(ok): sub-001/anat.nii.gz (file)
unlock(ok): sub-001/anat_mask.nii.gz (file)
unlock(ok): sub-002/anat.nii.gz (file)
unlock(ok): sub-002/anat_mask.nii.gz (file)
add(ok): sub-001/anat.nii.gz (file)
add(ok): sub-001/anat_mask.nii.gz (file)
add(ok): sub-002/anat.nii.gz (file)
add(ok): sub-002/anat_mask.nii.gz (file)
action summary:
  add (ok: 4)
  get (notneeded: 4)
  save (notneeded: 3)
  unlock (notneeded: 4, ok: 4)

Now that we are done, and have checked that we can reproduce the results ourselves, we can clean up. DataLad can easily verify if any part of our input dataset was modified since we configured our analysis, using datalad diff and the tag we provided:

$ datalad diff setup_done inputs

Nothing was changed.

With DataLad we don’t have to keep those inputs around to retain the ability to reproduce the analysis. Let’s uninstall them, and check the disk usage before and after.

$ du -sh
26M	.
$ datalad uninstall inputs/*
drop(ok): inputs/ds000001/sub-01/anat/sub-01_T1w.nii.gz (file) [checking http://openneuro.s3.amazonaws.com/ds000001/ds000001_R1.1.0/uncompressed/sub001/anatomy/highres001.nii.gz?versionId=8TJ17W9WInNkQPdiQ9vS7wo8ZJ9llF80...]
drop(ok): inputs/ds000001 (directory)
uninstall(ok): inputs/ds000001 (dataset)
drop(ok): inputs/ds000002/sub-01/anat/sub-01_T1w.nii.gz (file) [checking http://openneuro.s3.amazonaws.com/ds000002/ds000002_R2.0.0/uncompressed/sub-01/anat/sub-01_T1w.nii.gz?versionId=vXK2.bQ360phhPqbVV_n6RMYqaWAy4Dg...]
drop(ok): inputs/ds000002 (directory)
uninstall(ok): inputs/ds000002 (dataset)
action summary:
  drop (ok: 4)
  uninstall (ok: 2)
$ du -sh
3.1M	.

The dataset is substantially smaller as all inputs are gone…

$ ls inputs/*
inputs/ds000001:

inputs/ds000002:

But as these inputs were registered in the dataset when we installed them, getting them back is very easy. Only the remaining data (our code and the results) need to be kept and require a backup for long-term archival. Everything else can be re-obtained whenever needed.

As DataLad knows everything needed about the inputs, including where to get the right version, we can re-run the analysis with a single command. Watch how DataLad re-obtains all required data, re-runs the code, and checks that none of the results changed and need saving.

$ datalad rerun
[INFO] Making sure inputs are available (this may take some time) 
[INFO] Cloning dataset to Dataset(/home/me/usecases/repro/demo/inputs/ds000001)
[INFO] Attempting to clone from http://datasets.datalad.org/openfmri/ds000001/.git to /home/me/usecases/repro/demo/inputs/ds000001
[INFO] Start counting objects
[INFO] Start compressing objects
[INFO] Start receiving objects
[INFO] Start resolving deltas
[INFO] Completed clone attempts for Dataset(/home/me/usecases/repro/demo/inputs/ds000001)
[INFO] Cloning dataset to Dataset(/home/me/usecases/repro/demo/inputs/ds000002)
[INFO] Attempting to clone from http://datasets.datalad.org/openfmri/ds000002/.git to /home/me/usecases/repro/demo/inputs/ds000002
[INFO] Start counting objects
[INFO] Start compressing objects
[INFO] Start receiving objects
[INFO] Start resolving deltas
[INFO] Completed clone attempts for Dataset(/home/me/usecases/repro/demo/inputs/ds000002)
[INFO] == Command start (output follows) ===== 
action summary:
  get (notneeded: 4)
Processing inputs/ds000001/sub-01/anat/sub-01_T1w.nii.gz
Processing inputs/ds000002/sub-01/anat/sub-01_T1w.nii.gz
[INFO] == Command exit (modification check follows) ===== 
install(ok): inputs/ds000001 (dataset) [Installed subdataset in order to get /home/me/usecases/repro/demo/inputs/ds000001]
install(ok): inputs/ds000002 (dataset) [Installed subdataset in order to get /home/me/usecases/repro/demo/inputs/ds000002]
get(ok): inputs/ds000001/sub-01/anat/sub-01_T1w.nii.gz (file) [from web...]
get(ok): inputs/ds000002/sub-01/anat/sub-01_T1w.nii.gz (file) [from web...]
unlock(ok): sub-001/anat.nii.gz (file)
unlock(ok): sub-001/anat_mask.nii.gz (file)
unlock(ok): sub-002/anat.nii.gz (file)
unlock(ok): sub-002/anat_mask.nii.gz (file)
add(ok): sub-001/anat.nii.gz (file)
add(ok): sub-001/anat_mask.nii.gz (file)
add(ok): sub-002/anat.nii.gz (file)
add(ok): sub-002/anat_mask.nii.gz (file)
action summary:
  add (ok: 4)
  get (notneeded: 2, ok: 2)
  install (ok: 2)
  save (notneeded: 3)
  unlock (notneeded: 4, ok: 4)

Reproduced!

This dataset could now be published and shared as a lightweight yet fully reproducible resource and enable anyone to replicate the exact same analysis – with a single command. Public data and reproducible execution for the win!

Note, though, that reproducibility can and should go further: with more complex software dependencies, it becomes essential to also keep track of the software environment involved in the analysis. If you are curious how to do this, read on into the main use case below.

The Challenge

Allan is an exemplary neuroscientist and researcher. He has spent countless hours diligently learning not only the statistical methods for his research questions and the software tools for his computations, but has also taught himself about version control and data standards in neuroimaging, such as the brain imaging data structure (BIDS). For his final PhD project, he patiently acquires functional MRI data from many participants, and afterwards prepares it according to the BIDS standard. It takes him a full week and two failed attempts, but he eventually has a BIDS-compliant dataset.

When he writes his analysis scripts he takes extra care to responsibly version control every change. He happily notices how much cleaner his directories are, and how he and others can transparently see how his code evolved. Once everything is set up, he runs his analysis using large and complex neuroscientific software packages that he installed on his computer a few years back. Finally, he writes a paper and publishes his findings in a prestigious peer-reviewed journal. His data and code can be accessed by others easily, as he makes them publicly available. Colleagues and supervisors admire him for his wonderful contribution to open science.

However, a few months after publication, Allan starts to get emails that report that his scripts do not produce the same results as the ones reported in the publication. Startled and confused, he investigates what may be the issue. After many sleepless nights he realizes: the software he used was fairly old! More recent versions of the same software compute results slightly differently, have renamed functions, or fixed bugs in the underlying source code. Shocked, he realizes that his scripts are even incompatible with the most recent release of the software package he used and throw an error. Luckily, he can quickly fix this by adding information about the required software versions to the README of his project, and he is grateful for colleagues and other scientists who provide adjusted versions of his code for more recent software releases. In the end, his results prove to be robust regardless of software version. But while Allan shared code and data, not including any information about his software environment prevented his analysis from being computationally reproducible.

The DataLad Approach

Even if an analysis workflow is fully captured and version-controlled, and data and code are linked, an analysis may still fail to reproduce. Comprehensive computational reproducibility requires that the software involved in an analysis, including its precise versions, is known as well. DataLad can help with this: using the datalad-container extension, complete software environments can be captured in computational containers, added to (and thus shared together with) datasets, and linked with the commands and outputs they were used for.

Step-by-Step

The first part of this Step-by-Step guide details how to perform data preparation from raw DICOM files to BIDS-compliant NIfTI images in an automatically and computationally reproducible way. The actual analysis, a first-level GLM for a localization task, is performed in the second part. A final paragraph shows how to prepare the dataset for the afterlife.

For this use case, two DataLad extensions are required: datalad-neuroimaging and datalad-container.

You can install them via pip like this:

$ pip install datalad-neuroimaging datalad-container

Data Preparation

We start by creating a home for the raw data:

$ datalad create localizer_scans
$ cd localizer_scans
[INFO] Creating a new annex repo at /home/me/usecases/repro2/localizer_scans
create(ok): /home/me/usecases/repro2/localizer_scans (dataset)

For this example, we use a number of publicly available DICOM files. Luckily, at the time of data acquisition, these DICOMs were already equipped with the relevant metadata: their headers contain all necessary information to identify the purpose of individual scans and encode essential properties to create a BIDS-compliant dataset from them. The DICOMs are stored on GitHub (as a Git repository1), so they can be installed as a subdataset. As they are the raw inputs of the analysis, we store them in a directory we call inputs/rawdata.

$ datalad clone --dataset . \
 https://github.com/datalad/example-dicom-functional.git  \
 inputs/rawdata
[INFO] Cloning dataset to Dataset(/home/me/usecases/repro2/localizer_scans/inputs/rawdata)
[INFO] Attempting to clone from https://github.com/datalad/example-dicom-functional.git to /home/me/usecases/repro2/localizer_scans/inputs/rawdata
[INFO] Start enumerating objects
[INFO] Start receiving objects
[INFO] Start resolving deltas
[INFO] Completed clone attempts for Dataset(/home/me/usecases/repro2/localizer_scans/inputs/rawdata)
install(ok): inputs/rawdata (dataset)
add(ok): inputs/rawdata (file)
add(ok): .gitmodules (file)
save(ok): . (dataset)
action summary:
  add (ok: 2)
  install (ok: 1)
  save (ok: 1)

The datalad subdatasets command reports the installed dataset to indeed be a subdataset of the superdataset localizer_scans:

$ datalad subdatasets
subdataset(ok): inputs/rawdata (dataset)

Although we have obtained the raw data, it is not yet ready for analysis. Prior to performing actual computations, the data needs to be converted into appropriate formats and standardized to an intuitive layout. For neuroimaging, a useful conversion is from DICOM to the NIfTI format, a format specifically designed for scientific analyses of brain images; an intuitive layout is the BIDS standard. Performing these conversions and standardizations, however, requires software. For the task at hand, HeudiConv, a DICOM converter, is our software of choice. Beyond converting DICOMs, it also provides assistance in converting a raw dataset to the BIDS standard, and it integrates with DataLad to place converted and original data under Git/git-annex version control, while automatically annotating files with sensitive information (e.g., non-defaced anatomicals).
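For orientation, a minimal BIDS layout for a single functional subject looks roughly like this (a sketch only; the task and run entities shown here match the ones produced later in this demo):

```
dataset_description.json
participants.tsv
task-oneback_bold.json
sub-02/
└── func/
    ├── sub-02_task-oneback_run-01_bold.nii.gz
    ├── sub-02_task-oneback_run-01_bold.json
    └── sub-02_task-oneback_run-01_events.tsv
```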

To know exactly what software is used (both to be able to go back to it at a later stage, should we need to investigate an issue, and to capture the full provenance of the conversion process), we are using a software container that contains the relevant software setup. A ready-made Singularity container is available from Singularity Hub at shub://ReproNim/ohbm2018-training:heudiconvn.

Using the datalad containers-add command we can add this container to the localizer_scans superdataset. We are giving it the name heudiconv.

$ datalad containers-add heudiconv --url shub://ReproNim/ohbm2018-training:heudiconvn
add(ok): .datalad/config (file)
save(ok): . (dataset)
containers_add(ok): /home/me/usecases/repro2/localizer_scans/.datalad/environments/heudiconv/image (file)
action summary:
  add (ok: 1)
  containers_add (ok: 1)
  save (ok: 1)

The command datalad containers-list can verify that this worked:

$ datalad containers-list
heudiconv -> .datalad/environments/heudiconv/image

Great. The dataset now tracks all of the input data and the computational environment for the DICOM conversion. Thus far, we have a complete record of all components. For the actual data conversion, let’s stay transparent and automatically reproducible by wrapping the necessary heudiconv command seen below:

$ heudiconv -f reproin -s 02 -c dcm2niix -b -l "" --minmeta -a . \
 -o /tmp/heudiconv.sub-02 --files inputs/rawdata/dicoms

within a datalad containers-run command. To save time, we will only convert one subject’s data (sub-02, hence the subject identifier -s 02 in the command). Note that the output below is indeed how it should look; the software we are using in this example produces very verbose output.

$ datalad containers-run -m "Convert sub-02 DICOMs into BIDS" \
  --container-name heudiconv \
  heudiconv -f reproin -s 02 -c dcm2niix -b -l "" --minmeta -a . \
  -o /tmp/heudiconv.sub-02 --files inputs/rawdata/dicoms
[INFO] Making sure inputs are available (this may take some time)
[INFO] == Command start (output follows) =====
INFO: Running heudiconv version 0.5.2-dev
INFO: Analyzing 5460 dicoms
INFO: Filtering out 0 dicoms based on their filename
WARNING: dcmstack without support of pydicom >= 1.0 is detected. Adding a plug
INFO: Generated sequence info for 1 studies with 1 entries total
INFO: Processing sequence infos to deduce study/session
INFO: Study session for {'locator': 'Hanke/Stadler/0083_transrep2', 'session': None, 'subject': '02'}
INFO: Need to process 1 study sessions
INFO: PROCESSING STARTS: {'outdir': '/tmp/heudiconv.sub-02/', 'session': None, 'subject': '02'}
INFO: Processing 1 pre-sorted seqinfo entries
INFO: Processing 1 seqinfo entries
INFO: Doing conversion using dcm2niix
INFO: Converting ./sub-02/func/sub-02_task-oneback_run-01_bold (5460 DICOMs) -> ./sub-02/func . Converter: dcm2niix . Output types: ('nii.gz', 'dicom')
INFO: Generating grammar tables from /usr/lib/python3.5/lib2to3/Grammar.txt
INFO: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
200518-07:48:58,821 nipype.workflow INFO:
	 [Node] Setting-up "convert" in "/tmp/dcm2niix7qddgg6t/convert".
INFO: [Node] Setting-up "convert" in "/tmp/dcm2niix7qddgg6t/convert".
200518-07:48:59,749 nipype.workflow INFO:
	 [Node] Running "convert" ("nipype.interfaces.dcm2nii.Dcm2niix"), a CommandLine Interface with command:
dcm2niix -b y -z y -x n -t n -m n -f func -o . -s n -v n /tmp/dcm2niix7qddgg6t/convert
INFO: [Node] Running "convert" ("nipype.interfaces.dcm2nii.Dcm2niix"), a CommandLine Interface with command:
dcm2niix -b y -z y -x n -t n -m n -f func -o . -s n -v n /tmp/dcm2niix7qddgg6t/convert
200518-07:49:01,731 nipype.interface INFO:
	 stdout 2020-05-18T07:49:01.730960:Chris Rorden's dcm2niiX version v1.0.20180622 GCC6.3.0 (64-bit Linux)
INFO: stdout 2020-05-18T07:49:01.730960:Chris Rorden's dcm2niiX version v1.0.20180622 GCC6.3.0 (64-bit Linux)
200518-07:49:01,731 nipype.interface INFO:
	 stdout 2020-05-18T07:49:01.730960:Found 5460 DICOM file(s)
INFO: stdout 2020-05-18T07:49:01.730960:Found 5460 DICOM file(s)
200518-07:49:01,731 nipype.interface INFO:
	 stdout 2020-05-18T07:49:01.730960:swizzling 3rd and 4th dimensions (XYTZ -> XYZT), assuming interslice distance is 3.300000
INFO: stdout 2020-05-18T07:49:01.730960:swizzling 3rd and 4th dimensions (XYTZ -> XYZT), assuming interslice distance is 3.300000
200518-07:49:01,731 nipype.interface INFO:
	 stdout 2020-05-18T07:49:01.730960:Warning: Images sorted by instance number  [0020,0013](1..5460), but AcquisitionTime [0008,0032] suggests a different order (160423..160223)
INFO: stdout 2020-05-18T07:49:01.730960:Warning: Images sorted by instance number  [0020,0013](1..5460), but AcquisitionTime [0008,0032] suggests a different order (160423..160223)
200518-07:49:01,731 nipype.interface INFO:
	 stdout 2020-05-18T07:49:01.730960:Using RWVSlope:RWVIntercept = 4.00757:0
INFO: stdout 2020-05-18T07:49:01.730960:Using RWVSlope:RWVIntercept = 4.00757:0
200518-07:49:01,731 nipype.interface INFO:
	 stdout 2020-05-18T07:49:01.730960: Philips Scaling Values RS:RI:SS = 4.00757:0:0.0132383 (see PMC3998685)
INFO: stdout 2020-05-18T07:49:01.730960: Philips Scaling Values RS:RI:SS = 4.00757:0:0.0132383 (see PMC3998685)
200518-07:49:01,731 nipype.interface INFO:
	 stdout 2020-05-18T07:49:01.730960:Convert 5460 DICOM as ./func (80x80x35x156)
INFO: stdout 2020-05-18T07:49:01.730960:Convert 5460 DICOM as ./func (80x80x35x156)
200518-07:49:02,410 nipype.interface INFO:
	 stdout 2020-05-18T07:49:02.409947:compress: "/usr/bin/pigz" -n -f -6 "./func.nii"
INFO: stdout 2020-05-18T07:49:02.409947:compress: "/usr/bin/pigz" -n -f -6 "./func.nii"
200518-07:49:02,410 nipype.interface INFO:
	 stdout 2020-05-18T07:49:02.409947:Conversion required 2.598621 seconds (1.967562 for core code).
INFO: stdout 2020-05-18T07:49:02.409947:Conversion required 2.598621 seconds (1.967562 for core code).
200518-07:49:02,563 nipype.workflow INFO:
	 [Node] Finished "convert".
INFO: [Node] Finished "convert".
INFO: Populating template files under ./
INFO: PROCESSING DONE: {'outdir': '/tmp/heudiconv.sub-02/', 'session': None, 'subject': '02'}
[INFO] == Command exit (modification check follows) =====
add(ok): CHANGES (file)
add(ok): README (file)
add(ok): dataset_description.json (file)
add(ok): participants.tsv (file)
add(ok): sourcedata/README (file)
add(ok): sourcedata/sub-02/func/sub-02_task-oneback_run-01_bold.dicom.tgz (file)
add(ok): sub-02/func/sub-02_task-oneback_run-01_bold.json (file)
add(ok): sub-02/func/sub-02_task-oneback_run-01_bold.nii.gz (file)
add(ok): sub-02/func/sub-02_task-oneback_run-01_events.tsv (file)
add(ok): sub-02/sub-02_scans.tsv (file)
add(ok): task-oneback_bold.json (file)
save(ok): . (dataset)
action summary:
  add (ok: 11)
  get (notneeded: 1)
  save (notneeded: 1, ok: 1)

Find out what changed after this command by comparing the most recent commit by DataLad (i.e., HEAD) to the previous one (i.e., HEAD~1) with datalad diff:

$ datalad diff -f HEAD~1
    added: CHANGES (file)
    added: README (file)
    added: dataset_description.json (file)
    added: participants.tsv (file)
    added: sourcedata/README (file)
    added: sourcedata/sub-02/func/sub-02_task-oneback_run-01_bold.dicom.tgz (file)
    added: sub-02/func/sub-02_task-oneback_run-01_bold.json (file)
    added: sub-02/func/sub-02_task-oneback_run-01_bold.nii.gz (file)
    added: sub-02/func/sub-02_task-oneback_run-01_events.tsv (file)
    added: sub-02/sub-02_scans.tsv (file)
    added: task-oneback_bold.json (file)

As expected, the DICOM files of one subject were converted into NIfTI files, and the outputs follow the BIDS standard’s layout and naming conventions! But what’s even better is that this action and the relevant software environment were fully recorded.

There is only one thing missing before the functional imaging data can be analyzed: A stimulation protocol, so that we know what stimulation was done at which point during the scan. Thankfully, the data was collected using an implementation that exported this information directly in the BIDS events.tsv format. The file came with our DICOM dataset and can be found at inputs/rawdata/events.tsv. All we need to do is copy it to the right location under the BIDS-mandated name. To keep track of where this file came from, we will also wrap the copying into a datalad run command. The {inputs} and {outputs} placeholders can help to avoid duplication in the command call:

$ datalad run -m "Import stimulation events" \
  --input inputs/rawdata/events.tsv \
  --output sub-02/func/sub-02_task-oneback_run-01_events.tsv \
  cp {inputs} {outputs}
[INFO] Making sure inputs are available (this may take some time)
[INFO] == Command start (output follows) =====
[INFO] == Command exit (modification check follows) =====
unlock(ok): sub-02/func/sub-02_task-oneback_run-01_events.tsv (file)
add(ok): sub-02/func/sub-02_task-oneback_run-01_events.tsv (file)
save(ok): . (dataset)
action summary:
  add (ok: 1)
  get (notneeded: 3)
  save (notneeded: 1, ok: 1)
  unlock (ok: 1)

git log shows what information DataLad captured about this command’s execution:

$ git log -n 1
commit 690d26059c51a3f73fbd1a0680e3b7caf483557a
Author: Elena Piscopia <elena@example.net>
Date:   Mon May 18 07:49:20 2020 +0200

    [DATALAD RUNCMD] Import stimulation events

    === Do not change lines below ===
    {
     "chain": [],
     "cmd": "cp '{inputs}' '{outputs}'",
     "dsid": "126b62e8-98cb-11ea-b8a1-1371662818ca",
     "exit": 0,
     "extra_inputs": [],
     "inputs": [
      "inputs/rawdata/events.tsv"
     ],
     "outputs": [
      "sub-02/func/sub-02_task-oneback_run-01_events.tsv"
     ],
     "pwd": "."
    }
    ^^^ Do not change lines above ^^^
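
The JSON between the two marker lines is a machine-readable run record that commands such as datalad rerun rely on. As a toy illustration, using a shortened, hand-written record rather than the actual commit, such a record can be sliced out of a commit message with standard tools:

```shell
# A shortened, hypothetical run record as it would appear in a commit message
msg='[DATALAD RUNCMD] Import stimulation events

=== Do not change lines below ===
{
 "cmd": "cp {inputs} {outputs}",
 "exit": 0
}
^^^ Do not change lines above ^^^'

# Extract the JSON between the two marker lines
record=$(printf '%s\n' "$msg" \
  | sed -n '/=== Do not change lines below ===/,/\^\^\^ Do not change lines above \^\^\^/p' \
  | sed '1d;$d')
printf '%s\n' "$record"
```

DataLad’s own tooling does this parsing for you; the sketch merely shows that the record is ordinary, scriptable text.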

Analysis execution

Since the raw data are reproducibly prepared according to the BIDS standard, we can now go one step further and conduct an analysis. For this example, we will implement a very basic first-level GLM analysis using FSL that takes only a few minutes to run. As before, we will capture all provenance: inputs, computational environments, code, and outputs.

Following the YODA principles2, the analysis is set up in a new dataset, with the input dataset localizer_scans as a subdataset:

# get out of localizer_scans
$ cd ../

$ datalad create glm_analysis
$ cd glm_analysis
[INFO] Creating a new annex repo at /home/me/usecases/repro2/glm_analysis
create(ok): /home/me/usecases/repro2/glm_analysis (dataset)

We install localizer_scans by providing its path to datalad clone, and register it as a subdataset of the analysis dataset with the -d . option:

$ datalad clone -d . \
  ../localizer_scans \
  inputs/rawdata
[INFO] Cloning dataset to Dataset(/home/me/usecases/repro2/glm_analysis/inputs/rawdata)
[INFO] Attempting to clone from ../localizer_scans to /home/me/usecases/repro2/glm_analysis/inputs/rawdata
[INFO] Completed clone attempts for Dataset(/home/me/usecases/repro2/glm_analysis/inputs/rawdata)
install(ok): inputs/rawdata (dataset)
add(ok): inputs/rawdata (file)
add(ok): .gitmodules (file)
save(ok): . (dataset)
action summary:
  add (ok: 2)
  install (ok: 1)
  save (ok: 1)

datalad subdatasets reports the newly registered subdataset:

$ datalad subdatasets
subdataset(ok): inputs/rawdata (dataset)
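
Under the hood, this registration is a regular Git submodule entry in .gitmodules. A minimal sketch, using a hand-written stand-in file instead of the actual dataset, of how such an entry can be queried with plain git config:

```shell
# Write a .gitmodules-style entry like the one 'datalad clone -d .' creates
# (hypothetical file in a temporary directory; keys mirror Git's submodule format)
tmp=$(mktemp -d)
cat > "$tmp/gitmodules" <<'EOF'
[submodule "inputs/rawdata"]
        path = inputs/rawdata
        url = ../localizer_scans
EOF

# Query individual keys without needing a repository
git config -f "$tmp/gitmodules" --get submodule.inputs/rawdata.path
git config -f "$tmp/gitmodules" --get submodule.inputs/rawdata.url
```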

We almost forgot something really useful: structuring the dataset with the help of DataLad! Luckily, procedures such as yoda can not only be applied upon creation of a dataset (as in Create a dataset), but also retroactively with the run-procedure command (as in Configurations to go):

$ datalad run-procedure cfg_yoda
[INFO] Running procedure cfg_yoda
[INFO] == Command start (output follows) =====
[INFO] == Command exit (modification check follows) =====

The analysis obviously needs custom code. For the simple GLM analysis with FSL we use:

  1. A small script to convert BIDS-formatted events.tsv files into the EV3 format FSL understands, available at https://raw.githubusercontent.com/myyoda/ohbm2018-training/master/section23/scripts/events2ev3.sh

  2. An FSL analysis configuration template script, available at https://raw.githubusercontent.com/myyoda/ohbm2018-training/master/section23/scripts/ffa_design.fsf

These scripts should be stored and tracked inside the dataset within code/. The datalad download-url command downloads these scripts and records where they were obtained from:

$ datalad download-url  --path code/ \
  https://raw.githubusercontent.com/myyoda/ohbm2018-training/master/section23/scripts/events2ev3.sh \
  https://raw.githubusercontent.com/myyoda/ohbm2018-training/master/section23/scripts/ffa_design.fsf
[INFO] Downloading 'https://raw.githubusercontent.com/myyoda/ohbm2018-training/master/section23/scripts/events2ev3.sh' into '/home/me/usecases/repro2/glm_analysis/code/'
[INFO] Downloading 'https://raw.githubusercontent.com/myyoda/ohbm2018-training/master/section23/scripts/ffa_design.fsf' into '/home/me/usecases/repro2/glm_analysis/code/'
download_url(ok): /home/me/usecases/repro2/glm_analysis/code/events2ev3.sh (file)
download_url(ok): /home/me/usecases/repro2/glm_analysis/code/ffa_design.fsf (file)
add(ok): code/events2ev3.sh (file)
add(ok): code/ffa_design.fsf (file)
save(ok): . (dataset)
action summary:
  add (ok: 2)
  download_url (ok: 2)
  save (ok: 1)

The commit message that DataLad created shows the URL where each script has been downloaded from:

$ git log -n 1
commit a2f447edfe927f101c6c6324da414c3f4c2eba35
Author: Elena Piscopia <elena@example.net>
Date:   Mon May 18 07:49:25 2020 +0200

    [DATALAD] Download URLs

    URLs:
      https://raw.githubusercontent.com/myyoda/ohbm2018-training/master/section23/scripts/events2ev3.sh
      https://raw.githubusercontent.com/myyoda/ohbm2018-training/master/section23/scripts/ffa_design.fsf

Prior to the actual analysis, we need to run the events2ev3.sh script to transform inputs into the format that FSL expects. datalad run makes this maximally reproducible and easy, as the files given via --input and --output are automatically managed by DataLad.

$ datalad run -m 'Build FSL EV3 design files' \
  --input inputs/rawdata/sub-02/func/sub-02_task-oneback_run-01_events.tsv \
  --output 'sub-02/onsets' \
  bash code/events2ev3.sh sub-02 {inputs}
[INFO] Making sure inputs are available (this may take some time)
[INFO] == Command start (output follows) =====
sub-02
1
[INFO] == Command exit (modification check follows) =====
get(ok): inputs/rawdata/sub-02/func/sub-02_task-oneback_run-01_events.tsv (file) [from origin...]
add(ok): sub-02/onsets/run-1/body.txt (file)
add(ok): sub-02/onsets/run-1/face.txt (file)
add(ok): sub-02/onsets/run-1/house.txt (file)
add(ok): sub-02/onsets/run-1/object.txt (file)
add(ok): sub-02/onsets/run-1/scene.txt (file)
add(ok): sub-02/onsets/run-1/scramble.txt (file)
save(ok): . (dataset)
action summary:
  add (ok: 6)
  get (notneeded: 1, ok: 1)
  save (notneeded: 1, ok: 1)
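
To see what the conversion boils down to: a BIDS events.tsv lists onset, duration, and trial_type per event, whereas FSL expects one three-column file (onset, duration, weight) per condition. The following toy sketch, with made-up timings and not the actual events2ev3.sh logic, illustrates the split:

```shell
tmp=$(mktemp -d); cd "$tmp"
# Hypothetical, minimal BIDS events file
printf 'onset\tduration\ttrial_type\n0.0\t2.0\tface\n2.5\t2.0\thouse\n5.0\t2.0\tface\n' > events.tsv

# Split into one three-column file per condition: onset, duration, weight (1)
awk -F'\t' 'NR>1 { print $1, $2, 1 > ($3 ".txt") }' events.tsv

cat face.txt
```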

The dataset now contains and manages all of the required inputs, and we’re ready for FSL. Since FSL is a complex software suite rather than a simple program, we make sure to record the precise software environment for the analysis with datalad containers-run. First, we get a container with FSL in the version we require:

$ datalad containers-add fsl --url shub://mih/ohbm2018-training:fsl
add(ok): .datalad/config (file)
save(ok): . (dataset)
containers_add(ok): /home/me/usecases/repro2/glm_analysis/.datalad/environments/fsl/image (file)
action summary:
  add (ok: 1)
  containers_add (ok: 1)
  save (ok: 1)
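
The registration ends up in .datalad/config (the file modified in the output above). The key names below are assumptions about datalad-container’s storage layout; the sketch merely illustrates that the registration is a plain, queryable Git-style config entry:

```shell
tmp=$(mktemp -d)
# Hand-written stand-in for .datalad/config after 'datalad containers-add fsl ...'
# (hypothetical keys, for illustration only)
cat > "$tmp/config" <<'EOF'
[datalad "containers.fsl"]
        image = .datalad/environments/fsl/image
        cmdexec = singularity exec {img} {cmd}
EOF

git config -f "$tmp/config" --get datalad.containers.fsl.image
```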

As the analysis setup is now complete, let’s label this state of the dataset:

$ datalad save --version-tag ready4analysis
save(ok): . (dataset)
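
The version tag is an ordinary Git tag, so any Git or DataLad command that accepts a commit reference can use it later, for example datalad rerun --since ready4analysis. A self-contained sketch in a throwaway repository:

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q .
git -c user.name=demo -c user.email=demo@example.net commit -q --allow-empty -m "analysis setup"
git tag ready4analysis

# The tag now names this exact state of the history
git describe --tags
```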

All we have left is to configure the desired first-level GLM analysis with FSL. At this point, the template contains placeholders for the base path and the subject ID, and they need to be replaced. The following command uses the arcane, yet powerful sed editor to do this. We will again use datalad run to invoke our command, so that the history stores how this template was generated and we can audit, alter, or regenerate this file fearlessly in the future.

$ datalad run \
 -m "FSL FEAT analysis config script" \
 --output sub-02/1stlvl_design.fsf \
 bash -c 'sed -e "s,##BASEPATH##,{pwd},g" -e "s,##SUB##,sub-02,g" \
 code/ffa_design.fsf > {outputs}'
[INFO] == Command start (output follows) =====
[INFO] == Command exit (modification check follows) =====
add(ok): sub-02/1stlvl_design.fsf (file)
save(ok): . (dataset)
action summary:
  add (ok: 1)
  save (notneeded: 1, ok: 1)
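
The placeholder substitution itself is plain sed. Here is a self-contained toy version, with a hypothetical two-line template instead of the real ffa_design.fsf:

```shell
tmp=$(mktemp -d); cd "$tmp"
# Hypothetical template with the same placeholders as the real design file
printf 'set fmri(outputdir) "##BASEPATH##/##SUB##/1stlvl_glm"\nset feat_files(1) "##BASEPATH##/inputs/rawdata/##SUB##/func/bold.nii.gz"\n' > design.fsf.template

# Fill in the placeholders, as the datalad run call above does with {pwd}
sed -e "s,##BASEPATH##,$PWD,g" -e "s,##SUB##,sub-02,g" design.fsf.template > design.fsf

grep 'sub-02' design.fsf
```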

To compute the analysis, a simple feat sub-02/1stlvl_design.fsf command is wrapped into a datalad containers-run command with appropriate --input and --output specifications:

$ datalad containers-run --container-name fsl -m "sub-02 1st-level GLM" \
  --input sub-02/1stlvl_design.fsf \
  --input sub-02/onsets \
  --input inputs/rawdata/sub-02/func/sub-02_task-oneback_run-01_bold.nii.gz \
  --output sub-02/1stlvl_glm.feat \
  feat {inputs[0]}
[INFO] Making sure inputs are available (this may take some time)
[INFO] == Command start (output follows) =====
To view the FEAT progress and final report, point your web browser at /home/me/usecases/repro2/glm_analysis/sub-02/1stlvl_glm.feat/report_log.html
[INFO] == Command exit (modification check follows) =====
get(ok): inputs/rawdata/sub-02/func/sub-02_task-oneback_run-01_bold.nii.gz (file) [from origin...]
add(ok): sub-02/1stlvl_glm.feat/.files/fsl.css (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/3.1r.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/3.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/flirt-bg.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-bg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-bg.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-logo-big.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-logo.gif (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-logo.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-logo.png (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-macos-snapshot.tiff (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fslstart.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fslstart.png (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fugue-bg.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/tick.gif (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/vert2.png (file)
add(ok): sub-02/1stlvl_glm.feat/.ramp.gif (file)
add(ok): sub-02/1stlvl_glm.feat/absbrainthresh.txt (file)
add(ok): sub-02/1stlvl_glm.feat/cluster_mask_zstat1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/cluster_zstat1.html (file)
add(ok): sub-02/1stlvl_glm.feat/cluster_zstat1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/confoundevs.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev2.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev3.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev4.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev5.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev6.txt (file)
add(ok): sub-02/1stlvl_glm.feat/design.con (file)
add(ok): sub-02/1stlvl_glm.feat/design.frf (file)
add(ok): sub-02/1stlvl_glm.feat/design.fsf (file)
add(ok): sub-02/1stlvl_glm.feat/design.mat (file)
add(ok): sub-02/1stlvl_glm.feat/design.min (file)
add(ok): sub-02/1stlvl_glm.feat/design.png (file)
add(ok): sub-02/1stlvl_glm.feat/design.ppm (file)
add(ok): sub-02/1stlvl_glm.feat/design.trg (file)
add(ok): sub-02/1stlvl_glm.feat/design_cov.png (file)
add(ok): sub-02/1stlvl_glm.feat/design_cov.ppm (file)
add(ok): sub-02/1stlvl_glm.feat/example_func.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/filtered_func_data.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/lmax_zstat1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat0 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat0_init.e294223 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat0_init.o294223 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat1 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat1a_init (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat2_pre (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat2_pre.e294303 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat2_pre.o294303 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat3_film.e294757 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat3_film.o294757 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat3_stats (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat4_post (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat4_post.e295374 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat4_post.o295374 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat5_stop.e296006 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat5_stop.o296006 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat9 (file)
add(ok): sub-02/1stlvl_glm.feat/mask.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/mc/disp.png (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0000 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0001 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0002 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0003 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0004 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0005 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0006 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0007 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0008 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0009 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0010 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0011 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0012 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0013 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0014 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0015 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0016 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0017 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0018 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0019 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0020 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0021 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0022 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0023 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0024 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0025 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0026 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0027 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0028 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0029 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0030 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0031 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0032 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0033 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0034 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0035 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0036 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0037 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0038 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0039 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0040 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0041 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0042 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0043 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0044 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0045 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0046 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0047 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0048 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0049 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0050 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0051 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0052 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0053 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0054 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0055 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0056 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0057 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0058 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0059 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0060 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0061 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0062 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0063 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0064 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0065 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0066 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0067 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0068 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0069 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0070 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0071 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0072 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0073 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0074 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0075 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0076 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0077 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0078 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0079 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0080 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0081 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0082 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0083 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0084 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0085 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0086 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0087 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0088 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0089 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0090 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0091 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0092 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0093 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0094 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0095 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0096 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0097 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0098 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0099 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0100 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0101 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0102 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0103 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0104 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0105 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0106 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0107 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0108 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0109 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0110 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0111 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0112 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0113 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0114 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0115 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0116 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0117 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0118 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0119 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0120 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0121 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0122 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0123 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0124 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0125 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0126 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0127 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0128 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0129 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0130 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0131 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0132 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0133 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0134 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0135 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0136 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0137 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0138 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0139 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0140 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0141 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0142 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0143 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0144 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0145 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0146 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0147 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0148 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0149 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0150 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0151 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0152 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0153 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0154 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0155 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.par (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf_abs.rms (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf_abs_mean.rms (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf_final.par (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf_rel.rms (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf_rel_mean.rms (file)
add(ok): sub-02/1stlvl_glm.feat/mc/rot.png (file)
add(ok): sub-02/1stlvl_glm.feat/mc/trans.png (file)
add(ok): sub-02/1stlvl_glm.feat/mean_func.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/rendered_thresh_zstat1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/rendered_thresh_zstat1.png (file)
add(ok): sub-02/1stlvl_glm.feat/report.html (file)
add(ok): sub-02/1stlvl_glm.feat/report_log.html (file)
add(ok): sub-02/1stlvl_glm.feat/report_poststats.html (file)
add(ok): sub-02/1stlvl_glm.feat/report_prestats.html (file)
add(ok): sub-02/1stlvl_glm.feat/report_reg.html (file)
add(ok): sub-02/1stlvl_glm.feat/report_stats.html (file)
add(ok): sub-02/1stlvl_glm.feat/stats/cope1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/dof (file)
add(ok): sub-02/1stlvl_glm.feat/stats/logfile (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe10.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe11.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe12.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe13.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe14.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe15.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe16.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe17.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe18.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe2.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe3.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe4.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe5.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe6.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe7.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe8.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe9.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/res4d.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/sigmasquareds.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/smoothness (file)
add(ok): sub-02/1stlvl_glm.feat/stats/threshac1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/tstat1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/varcope1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/zstat1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/thresh_zstat1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/thresh_zstat1.vol (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev1.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev10.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev10.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev10p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev11.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev11.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev11p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev12.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev12.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev12p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev1p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev2.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev2.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev2p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev3.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev3.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev3p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev4.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev4.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev4p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev5.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev5.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev5p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev6.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev6.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev6p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev7.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev7.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev7p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev8.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev8.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev8p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev9.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev9.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev9p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev1.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev10.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev10.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev10p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev11.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev11.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev11p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev12.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev12.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev12p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev1p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev2.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev2.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev2p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev3.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev3.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev3p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev4.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev4.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev4p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev5.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev5.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev5p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev6.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev6.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev6p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev7.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev7.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev7p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev8.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev8.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev8p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev9.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev9.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev9p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_index (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_index.html (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_zstat1.html (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_zstat1.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_zstat1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_zstat1p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplotc_zstat1.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplotc_zstat1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplotc_zstat1p.png (file)
save(ok): . (dataset)
action summary:
  add (ok: 344)
  get (notneeded: 4, ok: 1)
  save (notneeded: 1, ok: 1)

Once this command finishes, DataLad will have captured the entire FSL output, and the dataset will contain a complete record all the way from the input BIDS dataset to the GLM results. The BIDS subdataset, in turn, has a complete record of all processing from the raw DICOMs onward.

Note

See how many files were created and added in this computation for a single participant? If your study has many participants, analyses like the one above can inflate your dataset. Please check out the chapter Go big or go home, in particular the section Calculate in greater numbers, for tips and tricks on how to create analysis datasets that scale.

Archive data and results

After study completion it is important to properly archive data and results, for example for future inquiries by reviewers or readers of the associated publication. Thanks to the modularity of the study units, this task is easy and avoids needless duplication.

The raw data is tracked in its own dataset (localizer_scans) that only needs to be archived once, regardless of how many analyses use it as input. This means that we can “throw away” this subdataset copy within this analysis dataset. DataLad can re-obtain the correct version at any point in the future, as long as the recorded location remains accessible.
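Should the inputs be needed again later, for instance to rerun the analysis, the subdataset can be re-obtained with datalad get. The snippet below is a sketch, assuming the recorded clone location of localizer_scans is still accessible from within this analysis dataset:

```shell
# Re-install the uninstalled input subdataset at its recorded version;
# -n/--no-data clones the subdataset without fetching any file content
datalad get -n inputs/rawdata

# fetch content for an individual file only when it is actually needed
datalad get inputs/rawdata/sub-02/func/sub-02_task-oneback_run-01_bold.nii.gz
```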

To make sure we’re not deleting information we are not aware of, datalad diff and git log can help to verify that the subdataset is in the same state as when it was initially added:

$ datalad diff -- inputs

The command does not show any output, thus indicating that there is indeed no difference. git log confirms that the only action performed on inputs/ was its addition as a subdataset:

$ git log -- inputs
commit 721add9dec06817845a62a30e1dcc50cc4bbc61e
Author: Elena Piscopia <elena@example.net>
Date:   Mon May 18 07:49:22 2020 +0200

    [DATALAD] Recorded changes

Since the state of the subdataset is exactly the state of the original localizer_scans dataset, it is safe to uninstall it.

$ datalad uninstall --dataset . inputs --recursive
drop(ok): inputs/rawdata/sub-02/func/sub-02_task-oneback_run-01_bold.nii.gz (file)
drop(ok): inputs/rawdata/sub-02/func/sub-02_task-oneback_run-01_events.tsv (file)
drop(ok): inputs/rawdata (directory)
uninstall(ok): inputs/rawdata (dataset)
action summary:
  drop (ok: 3)
  uninstall (ok: 1)

Prior to archiving the results, we can go one step further and verify their computational reproducibility. DataLad’s rerun command is capable of “replaying” any recorded command. The following command re-executes the FSL analysis by re-running everything since the dataset was tagged as ready4analysis. It will record the recomputed results in a separate Git branch named verify. Afterwards, we can automatically compare these new results to the original ones in the master branch. We will see that all outputs can be reproduced in bit-identical form. The only changes are observed in log files that contain volatile information, such as timestamps.

$ datalad rerun --branch verify --onto ready4analysis --since ready4analysis
[INFO] == Command start (output follows) =====
[INFO] == Command exit (modification check follows) =====
[INFO] Making sure inputs are available (this may take some time)
[INFO] Cloning dataset to Dataset(/home/me/usecases/repro2/glm_analysis/inputs/rawdata)
[INFO] Attempting to clone from /home/me/usecases/repro2/glm_analysis/../localizer_scans to /home/me/usecases/repro2/glm_analysis/inputs/rawdata
[INFO] Completed clone attempts for Dataset(/home/me/usecases/repro2/glm_analysis/inputs/rawdata)
[INFO] == Command start (output follows) =====
To view the FEAT progress and final report, point your web browser at /home/me/usecases/repro2/glm_analysis/sub-02/1stlvl_glm.feat/report_log.html
[INFO] == Command exit (modification check follows) =====
add(ok): sub-02/1stlvl_design.fsf (file)
save(ok): . (dataset)
get(ok): inputs/rawdata/sub-02/func/sub-02_task-oneback_run-01_bold.nii.gz (file) [from origin...]
add(ok): sub-02/1stlvl_glm.feat/.files/fsl.css (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/3.1r.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/3.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/flirt-bg.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-bg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-bg.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-logo-big.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-logo.gif (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-logo.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-logo.png (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fsl-macos-snapshot.tiff (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fslstart.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fslstart.png (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/fugue-bg.jpg (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/tick.gif (file)
add(ok): sub-02/1stlvl_glm.feat/.files/images/vert2.png (file)
add(ok): sub-02/1stlvl_glm.feat/.ramp.gif (file)
add(ok): sub-02/1stlvl_glm.feat/absbrainthresh.txt (file)
add(ok): sub-02/1stlvl_glm.feat/cluster_mask_zstat1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/cluster_zstat1.html (file)
add(ok): sub-02/1stlvl_glm.feat/cluster_zstat1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/confoundevs.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev2.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev3.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev4.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev5.txt (file)
add(ok): sub-02/1stlvl_glm.feat/custom_timing_files/ev6.txt (file)
add(ok): sub-02/1stlvl_glm.feat/design.con (file)
add(ok): sub-02/1stlvl_glm.feat/design.frf (file)
add(ok): sub-02/1stlvl_glm.feat/design.fsf (file)
add(ok): sub-02/1stlvl_glm.feat/design.mat (file)
add(ok): sub-02/1stlvl_glm.feat/design.min (file)
add(ok): sub-02/1stlvl_glm.feat/design.png (file)
add(ok): sub-02/1stlvl_glm.feat/design.ppm (file)
add(ok): sub-02/1stlvl_glm.feat/design.trg (file)
add(ok): sub-02/1stlvl_glm.feat/design_cov.png (file)
add(ok): sub-02/1stlvl_glm.feat/design_cov.ppm (file)
add(ok): sub-02/1stlvl_glm.feat/example_func.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/filtered_func_data.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/lmax_zstat1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat0 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat0_init.e299367 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat0_init.o299367 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat1 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat1a_init (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat2_pre (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat2_pre.e299450 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat2_pre.o299450 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat3_film.e299957 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat3_film.o299957 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat3_stats (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat4_post (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat4_post.e300332 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat4_post.o300332 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat5_stop.e300964 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat5_stop.o300964 (file)
add(ok): sub-02/1stlvl_glm.feat/logs/feat9 (file)
add(ok): sub-02/1stlvl_glm.feat/mask.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/mc/disp.png (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0000 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0001 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0002 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0003 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0004 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0005 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0006 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0007 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0008 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0009 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0010 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0011 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0012 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0013 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0014 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0015 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0016 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0017 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0018 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0019 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0020 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0021 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0022 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0023 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0024 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0025 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0026 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0027 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0028 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0029 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0030 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0031 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0032 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0033 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0034 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0035 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0036 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0037 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0038 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0039 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0040 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0041 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0042 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0043 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0044 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0045 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0046 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0047 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0048 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0049 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0050 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0051 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0052 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0053 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0054 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0055 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0056 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0057 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0058 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0059 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0060 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0061 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0062 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0063 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0064 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0065 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0066 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0067 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0068 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0069 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0070 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0071 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0072 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0073 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0074 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0075 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0076 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0077 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0078 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0079 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0080 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0081 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0082 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0083 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0084 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0085 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0086 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0087 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0088 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0089 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0090 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0091 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0092 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0093 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0094 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0095 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0096 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0097 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0098 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0099 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0100 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0101 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0102 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0103 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0104 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0105 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0106 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0107 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0108 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0109 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0110 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0111 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0112 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0113 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0114 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0115 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0116 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0117 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0118 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0119 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0120 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0121 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0122 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0123 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0124 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0125 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0126 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0127 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0128 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0129 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0130 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0131 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0132 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0133 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0134 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0135 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0136 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0137 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0138 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0139 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0140 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0141 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0142 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0143 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0144 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0145 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0146 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0147 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0148 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0149 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0150 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0151 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0152 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0153 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0154 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.mat/MAT_0155 (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf.par (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf_abs.rms (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf_abs_mean.rms (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf_final.par (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf_rel.rms (file)
add(ok): sub-02/1stlvl_glm.feat/mc/prefiltered_func_data_mcf_rel_mean.rms (file)
add(ok): sub-02/1stlvl_glm.feat/mc/rot.png (file)
add(ok): sub-02/1stlvl_glm.feat/mc/trans.png (file)
add(ok): sub-02/1stlvl_glm.feat/mean_func.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/rendered_thresh_zstat1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/rendered_thresh_zstat1.png (file)
add(ok): sub-02/1stlvl_glm.feat/report.html (file)
add(ok): sub-02/1stlvl_glm.feat/report_log.html (file)
add(ok): sub-02/1stlvl_glm.feat/report_poststats.html (file)
add(ok): sub-02/1stlvl_glm.feat/report_prestats.html (file)
add(ok): sub-02/1stlvl_glm.feat/report_reg.html (file)
add(ok): sub-02/1stlvl_glm.feat/report_stats.html (file)
add(ok): sub-02/1stlvl_glm.feat/stats/cope1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/dof (file)
add(ok): sub-02/1stlvl_glm.feat/stats/logfile (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe10.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe11.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe12.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe13.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe14.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe15.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe16.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe17.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe18.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe2.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe3.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe4.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe5.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe6.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe7.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe8.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/pe9.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/res4d.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/sigmasquareds.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/smoothness (file)
add(ok): sub-02/1stlvl_glm.feat/stats/threshac1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/tstat1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/varcope1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/stats/zstat1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/thresh_zstat1.nii.gz (file)
add(ok): sub-02/1stlvl_glm.feat/thresh_zstat1.vol (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev1.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev10.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev10.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev10p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev11.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev11.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev11p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev12.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev12.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev12p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev1p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev2.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev2.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev2p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev3.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev3.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev3p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev4.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev4.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev4p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev5.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev5.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev5p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev6.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev6.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev6p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev7.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev7.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev7p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev8.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev8.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev8p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev9.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev9.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplot_zstat1_ev9p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev1.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev10.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev10.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev10p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev11.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev11.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev11p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev12.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev12.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev12p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev1p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev2.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev2.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev2p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev3.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev3.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev3p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev4.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev4.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev4p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev5.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev5.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev5p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev6.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev6.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev6p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev7.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev7.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev7p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev8.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev8.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev8p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev9.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev9.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/ps_tsplotc_zstat1_ev9p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_index (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_index.html (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_zstat1.html (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_zstat1.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_zstat1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplot_zstat1p.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplotc_zstat1.png (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplotc_zstat1.txt (file)
add(ok): sub-02/1stlvl_glm.feat/tsplot/tsplotc_zstat1p.png (file)
save(ok): . (dataset)
action summary:
  add (ok: 345)
  get (notneeded: 4, ok: 1)
  save (notneeded: 1, ok: 2)
# check that we are now on the new `verify` branch
$ git branch
  git-annex
  master
* verify
# compare which files have changes with respect to the original results
$ git diff master --stat
 sub-02/1stlvl_glm.feat/logs/feat0                                      | 2 +-
 sub-02/1stlvl_glm.feat/logs/{feat0_init.e294223 => feat0_init.e299367} | 0
 sub-02/1stlvl_glm.feat/logs/{feat0_init.o294223 => feat0_init.o299367} | 0
 sub-02/1stlvl_glm.feat/logs/feat1                                      | 2 +-
 sub-02/1stlvl_glm.feat/logs/{feat2_pre.e294303 => feat2_pre.e299450}   | 0
 sub-02/1stlvl_glm.feat/logs/{feat2_pre.o294303 => feat2_pre.o299450}   | 0
 sub-02/1stlvl_glm.feat/logs/{feat3_film.e294757 => feat3_film.e299957} | 0
 sub-02/1stlvl_glm.feat/logs/{feat3_film.o294757 => feat3_film.o299957} | 0
 sub-02/1stlvl_glm.feat/logs/{feat4_post.e295374 => feat4_post.e300332} | 0
 sub-02/1stlvl_glm.feat/logs/{feat4_post.o295374 => feat4_post.o300332} | 0
 sub-02/1stlvl_glm.feat/logs/{feat5_stop.e296006 => feat5_stop.e300964} | 0
 sub-02/1stlvl_glm.feat/logs/{feat5_stop.o296006 => feat5_stop.o300964} | 0
 sub-02/1stlvl_glm.feat/report.html                                     | 2 +-
 sub-02/1stlvl_glm.feat/report_log.html                                 | 2 +-
 14 files changed, 4 insertions(+), 4 deletions(-)
# switch back to the master branch and remove the `verify` branch
$ git checkout master
$ git branch -D verify
Switched to branch 'master'
Deleted branch verify (was eeac9a0).

The outcome of this use case can be found as a dataset on GitHub here.

Footnotes

1

“Why can such data exist as a Git repository? Shouldn’t large files always be stored outside of Git?” you may ask. The DICOMs exist in a Git repository for a number of reasons: First, it makes them easily available for demonstrations and tutorials without involving DataLad at all. Second, the DICOMs are comparatively small: 21K per file. Importantly, the repository is not meant to version control future states of these files, or derivatives and results obtained from them – that would bring a Git repository to its knees.

2

To re-read everything about the YODA principles, check out the section YODA: Best practices for data analyses in a dataset.