The reasons for using Python are discussed elsewhere. The fact is that Python offers a solid and growing toolchain for analyzing complex data, and it is the de facto standard in much of “Data Science”.

A few excellent introductions to Python and to data analysis in Python:

Also, if you want to learn neuroimaging data analysis, have a look at nilearn; for general machine learning, use sklearn.

Installing (Ana)conda

The Python toolchain is a collection of packages that work together, which means we need a system to maintain the right version of each so they remain mutually compatible. For this, we recommend using (Ana)conda.

First, you’ll want to download the latest version of Anaconda. Choose the command-line installer, and choose your version of Python wisely: Python 3.5 is more modern, but not all relevant packages support it yet; Python 2.7 ensures full support for more packages, but remember your `from __future__ import division`!

Once you have downloaded the .sh file from the Anaconda website, install it by entering:
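For example (the exact filename depends on the Python version and release you downloaded; the one below is a hypothetical placeholder):

```shell
# run the command-line installer you downloaded;
# substitute the actual filename of your installer
bash Anaconda2-x.x.x-Linux-x86_64.sh
```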


or the name of the relevant installer, of course. Install it in your home directory, perhaps even under ~/bin. Don’t let the installer change your .bashrc; preferably add the relevant lines to your own ~/.bash_profile instead.

Installing a virtual environment

Always use a virtual environment in which to run your analyses. This encapsulates the software collection used, and makes it easy to publish the exact software packages and their versions used to create your results.
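As a sketch, creating a named environment and later exporting its exact package list for publication might look like this (the environment name and package choices here are just examples):

```shell
# create a named environment with a pinned Python version and some packages
conda create -n myanalysis python=2.7 numpy scipy

# activate it before working
source activate myanalysis

# export the exact packages and versions, to publish alongside your results
conda env export > environment.yml
```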

Word of Warning

Make sure that you take care of your virtual environments!

Don’t hesitate to clone/duplicate an environment before installing or updating packages, because such changes may alter the way your analyses run.
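Cloning an environment is a one-liner; conda supports this directly via `--clone` (the environment names below are hypothetical):

```shell
# snapshot the current environment before experimenting with new packages
conda create -n myanalysis-backup --clone myanalysis
```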

One option: using intel packages

We’re starting to use an Intel-provided distribution of packages, which have all been compiled to run faster than the standard Python and numerical packages on our server. Follow the instructions on the Intel website. Briefly, these are the commands:

conda update conda

Note: the latest version of conda breaks the Intel channel; see the Intel website link above for more information.

# add the intel channel:
conda config --add channels intel

# add the very useful conda-forge channel:
conda config --add channels conda-forge

# a new virtual environment for python 2.7:
conda create -n i2 intelpython2_full python=2

# a new virtual environment for python 3.5:
conda create -n i3 intelpython3_full python=3

# to activate:
source activate i2

Once you’re in the environment, you can start installing neuroimaging related packages, such as nilearn, bottleneck, lmfit, fir, etc. Many of these are in the conda-forge channel.
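For instance, installing from the conda-forge channel explicitly might look like this (whether a given package lives in conda-forge can change, so check if a package isn’t found):

```shell
# install neuroimaging-related packages, pulling from the conda-forge channel
conda install -c conda-forge nilearn bottleneck lmfit
```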

Keep in mind, you’re of course also very welcome to set this up yourself, by installing another type of virtual environment.

Also, feel free to add the activation of your ‘standard environment’ to your ~/.bash_profile.
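One way to do this is to append the activation line to your profile (the environment name `i2` matches the example above; use whichever environment you consider your standard):

```shell
# activate the 'i2' environment automatically in every new login shell
echo "source activate i2" >> ~/.bash_profile
```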

Legacy packages

Some packages are used in our legacy code, and are not in conda because they are no longer actively maintained. Two examples: ParallelPython, which has been superseded by joblib (easier to use), and PyNifti (no website), an older package for accessing NIfTI files that has been superseded by nibabel (in conda-forge). To install these packages into a 2.7 virtual environment, use the following zipfiles:

Unzip these files, go to the enclosing folder, and install them into your VE using pip install -e pynifti or pip install -e pp.
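The full sequence might look like this (the zipfile names are hypothetical; the folder names match the install commands above, and your target environment must already be active):

```shell
# unpack the legacy packages next to each other
unzip pynifti.zip
unzip pp.zip

# from the enclosing folder, install each in editable mode into the active VE
pip install -e pynifti
pip install -e pp
```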