How I use Bioconductor with Docker (part 2): More memory, faster Bioconductor with Docker

Nitesh Turaga
3 min read · Jul 2, 2019


GitHub: nturaga, Twitter: niteshturaga

My previous post showed “How I use Bioconductor with Docker”; think of this one as an extension of that.

It was the case (way back when) that I found Docker containers “slow” and not a very efficient way of using R. This is because R does its computation in memory. It is important not to let a running container consume too much of the host machine’s memory, but at the same time it needs enough memory to run my container, in this case R/Bioconductor, efficiently.

Of the 16 GB of RAM on my machine, the default ‘hard limit’ Docker gave my containers was 2 GB, and R gets a little less than that. So, the way to get my containers to run faster is to increase the memory available to them.
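If you are curious what a container currently sees, here is a minimal sketch you can run from the terminal (the image tag matches the one used later in this post; exact output wording can vary between Docker versions):

## Memory ceiling Docker Desktop currently gives the VM that runs containers, in bytes
docker info --format '{{.MemTotal}}'

## What a Linux container reports; the MemTotal line reflects Docker's resource
## limit rather than the host machine's full 16 GB
docker run --rm bioconductor/bioconductor_full:devel head -n 1 /proc/meminfo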

Step 1: Go to Preferences in Docker Desktop (Image 1)

Image 1

Step 2: Go to the Advanced option (Image 2)

Increase the RAM to an amount you think is sufficient based on your host machine. Make sure you understand what these resources mean before changing them (also see the references section).

Image 2

You’ll have to restart your Docker engine after changing these settings.
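After the restart, you can confirm the new ceiling from the command line (a quick sketch; the “Total Memory” line in docker info is formatted slightly differently across Docker versions):

docker info | grep -i "total memory"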

Now, once you have set your resource limits, you can run each image with a specific amount of memory if you wish, or just let Docker do its thing and figure it out automatically based on the number of containers you are running.

Step 3: Run your image

Run your image, and set memory and CPU limits if you need them.

## --rm      automatically remove the container when it exits
## --memory  memory limit for this container
## --cpus    number of CPUs this container may use
docker run \
  --rm \
  --memory=6g \
  --cpus=1.5 \
  -v /shared/data-store:/home/rstudio/data \
  -v /shared/library-store:/usr/local/lib/R/host-site-library \
  -e PASSWORD=bioc \
  -p 8787:8787 \
  bioconductor/bioconductor_full:devel

As you can see, my Docker container takes as much memory as it needs, up to the container-specific 6 GB limit I set, which itself sits within the 8 GB resource limit I gave Docker.

As a sanity check, my ‘Activity Monitor’ (Image 3) shows that it is using about 4.94 GB of memory.
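You can also cross-check from the terminal with docker stats, which reports each running container’s current usage against its limit (a sketch; the column layout may differ slightly between Docker versions):

## One-shot snapshot of usage for all running containers;
## the MEM USAGE / LIMIT column should stay at or below 6 GiB here
docker stats --no-stream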

Image 3

If I launch another image, the containers will together stay within the 8 GB resource limit.
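For example, a second RStudio container can be started on another host port with its own, smaller limit (a sketch; the port and memory values are purely illustrative):

## A second container of the same image, capped at 2 GB and exposed on port 8788
docker run \
  --rm \
  --memory=2g \
  -e PASSWORD=bioc \
  -p 8788:8787 \
  bioconductor/bioconductor_full:devel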

References on customizations

  • Read about container resources and what each flag means.

TL;DR: Increase the memory limit to get the R/Bioconductor containers to run faster.
