Neurodocker
Todo: Add content.
This example demonstrates how to build and run an image with Jupyter Notebook.
Note: When you exit a Docker container, any files you created in that container are lost. So if you create Jupyter notebooks inside a container, remember to save them to a mounted directory; otherwise, the notebooks will be deleted (and unrecoverable) once the container exits.
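The example below does exactly this: it mounts the current host directory to the container's working directory, so notebooks written there persist after the container exits.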
neurodocker generate docker \
    --pkg-manager apt \
    --base-image debian:buster-slim \
    --miniconda version=latest \
        conda_install="matplotlib notebook numpy pandas seaborn" \
    --user nonroot \
    --workdir /work \
> notebook.Dockerfile

# Build the image.
docker build --tag notebook --file notebook.Dockerfile .

# Run the image. The current directory is mounted to the working directory of the
# Docker image, so our notebooks are saved to the current directory.
docker run --rm -it --publish 8888:8888 --volume $(pwd):/work notebook \
    jupyter-notebook --no-browser --ip 0.0.0.0
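When the container starts, jupyter-notebook prints a URL containing an access token. Because port 8888 is published above, you can open the corresponding localhost URL in a browser on the host to reach the notebook server.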
This example demonstrates how to create a Docker image with multiple conda environments.
neurodocker generate docker \
    --pkg-manager apt \
    --base-image debian:buster-slim \
    --miniconda \
        version=latest \
        env_name=envA \
        env_exists=false \
        conda_install=pandas \
    --miniconda \
        version=latest \
        installed=true \
        env_name=envB \
        env_exists=false \
        conda_install=scipy \
> multi-conda-env.Dockerfile

docker build --tag multi-conda-env --file multi-conda-env.Dockerfile .
One can use the image in the following way:
docker run --rm -it multi-conda-env bash

# Pandas is installed in envA.
conda activate envA
python -c "import pandas"

# Scipy is installed in envB.
conda activate envB
python -c "import scipy"
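To confirm that both environments were created, you can list them from inside the container with the standard conda command (this is plain conda, not a Neurodocker-specific feature):

conda env list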