Docker containers provide an isolated environment for distributing and running software tools with ease. We recommend placing each major step of the pipeline in its own Docker container, to be run in a step-wise fashion. This page provides instructions for containerizing your pipeline with Docker (information on saving Docker images remotely can be found here).
Step 1: create the Dockerfile
In a folder containing any necessary scripts or files, create and open a file called "Dockerfile":
nano Dockerfile
Specify a base image to build upon (e.g. a minimal Ubuntu image) by adding to the file:
FROM ubuntu
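Note that "ubuntu" with no tag pulls the latest Ubuntu release, which changes over time; for a reproducible build, consider pinning a specific version, for example:
FROM ubuntu:22.04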
Install any required dependencies (e.g. make, gcc, and git):
RUN apt-get update -qq \
    && apt-get install -y make gcc \
    && apt-get install -y git
(the -y flag automatically answers yes so that installation continues through any prompts)
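Optionally, the apt cache can be removed in the same layer to keep the image small, for example:
RUN apt-get update -qq \
    && apt-get install -y make gcc git \
    && rm -rf /var/lib/apt/lists/*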
If installing a tool from GitHub:
RUN git clone https://github.com/alexdobin/STAR.git
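Note that this clone is placed in /STAR, while the ENTRYPOINT example below uses the versioned directory /STAR-2.7.10b, which is the layout produced by a release archive. As a sketch (assuming the 2.7.10b release), the release tarball could be downloaded and extracted instead:
RUN apt-get install -y wget \
    && wget https://github.com/alexdobin/STAR/archive/refs/tags/2.7.10b.tar.gz \
    && tar -xzf 2.7.10b.tar.gz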
For R scripts requiring external packages, install these packages with:
RUN R -e "install.packages('BiocManager')"
RUN R -e "BiocManager::install('DESeq2')"
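Note that these commands assume R is already installed in the image; rather than a bare Ubuntu base, an R image can be used as the base, for example (using the official r-base image):
FROM r-base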
Specify the program to run when the container is started:
ENTRYPOINT ["/STAR-2.7.10b/bin/Linux_x86_64/STAR"]
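With ENTRYPOINT, any arguments passed to docker run are appended to the command. To provide default arguments that users can override at run time, CMD can be added, for example (assuming the tool accepts a --help flag):
CMD ["--help"]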
Save the Dockerfile and exit the editor
Step 2: build the docker image
Specify the Dockerfile (-f) and a tag (-t) by naming the image and assigning a version number (e.g. ":0.0.1"); the trailing "." tells Docker to use the current directory as the build context:
docker build -f Dockerfile -t image_name:0.0.1 .
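Once the image is built, it can be tested locally. With an ENTRYPOINT set, anything after the image name is passed as arguments to the program (here, STAR), for example:
docker run --rm image_name:0.0.1 --version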
To push to your Docker Hub repository, retag the image with your Docker Hub username:
docker tag image_name:0.0.1 dockerhub_user_name/image_name:0.0.1
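Both tags now point to the same image; this can be verified by listing local images and comparing image IDs:
docker images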
Step 3: push to Docker Hub
docker push dockerhub_user_name/image_name:0.0.1
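If the push fails with an authentication error, log in to Docker Hub first and retry:
docker login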
(check here for more information on saving Docker images)