C++ – Building Apps with Docker: Handling Dynamic Linking


I have a C++ project, cloned from GitHub, which I'd like to build with a Docker image (a customized Debian stable). The image contains the usual tools for compiling C++ apps plus all the dependencies required by the project.

My ideal workflow would be:

  1. cd into the project dir;
  2. invoke Docker and let the container configure+build the app;
  3. run the executable in the host system (i.e. not in the Docker container).
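
Concretely, step 2 might look something like the sketch below; the image name (cpp-builder) and the CMake-based layout are just placeholders for illustration:

    cd my-project
    docker run --rm -v "$PWD":/src -w /src cpp-builder \
        sh -c "cmake -B build && cmake --build build"
    ./build/my-app    # step 3: run on the host, which is where the linking problem appears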

Unfortunately the executable links against some shared libraries, available in the Docker image but not installed in the host. I see two options here:

  1. force a static build – tricky but should get the job done;
  2. install the missing dependencies in the host – easy, but it would defeat the whole purpose: keeping the host system clean of additional stuff.

What is a smart way to overcome this hurdle? Is there any known best practice on the subject?

Best Answer

Managing dependencies and dynamic libraries is not a new problem. In your scenario, the usage of Docker is unrelated to that problem as you are only using Docker to bundle your build toolchain (i.e., compilers).

The environment where your application is executed will need all dependencies. However, that does not mean you need to pollute the host system.

For example, it may be possible to execute the application within a Docker container that provides the dependencies. However, that is often not appropriate, as Docker strongly prefers immutable images and may impose more isolation than you want.
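
As a rough sketch (the image name and paths are hypothetical), this could mean bind-mounting the host-built binary into a container image that ships the required shared libraries:

    docker run --rm \
        -v "$PWD/build/my-app":/usr/local/bin/my-app:ro \
        deps-runtime my-app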

Of course other containerization tech exists as well. This may be appropriate if your software requires specially configured services/daemons. But you may not want to deploy via containers e.g. because users would have to install container management software, because images may require excessive storage, or because deploying container images makes it difficult to apply security updates to dependencies independently of your software.

Using static linking may be a very convenient approach, when possible. Note that this may not work if your dependencies encompass not only libraries but also other resources (fonts, images, databases, external services). Static linking may also affect software licensing, in particular if you have LGPL dependencies.
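
For illustration, with g++ the two common flavours might look like this; -lfoo stands in for whatever project dependencies you have, and a fully static build additionally requires static archives (.a) of all of them:

    # fully static: no shared libraries needed at run time
    g++ -static main.cpp -o my-app -lfoo

    # partially static: only the GNU runtime libraries are linked statically
    g++ -static-libstdc++ -static-libgcc main.cpp -o my-app -lfoo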

It may be convenient to declare all dependencies and use an external dependency management system such as APT to resolve them. This can be very user-friendly, but it also means relying on dependency versions you do not control. If a dependency is not provided by the distribution, you can package it yourself so that it installs under /opt.
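
A sketch of both halves, with placeholder package and project names:

    # dependencies that APT can resolve
    sudo apt-get install --no-install-recommends libfoo-dev libbar2

    # a dependency APT does not provide, built and installed under /opt
    cmake -B libbaz-build -DCMAKE_INSTALL_PREFIX=/opt/libbaz path/to/libbaz
    cmake --build libbaz-build
    sudo cmake --install libbaz-build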

It is not necessary to install dependencies globally. If you configure your dependencies to install under a local prefix (and set LD_LIBRARY_PATH suitably), you can keep them within your project directory. You can then deploy them together with your app. Crucially, this allows you to easily install multiple versions of your dependencies alongside each other. (Re-)installing dependencies should then be part of your normal build process.
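
A minimal sketch of that layout, assuming CMake-based dependencies and a deps/ prefix inside the project directory (all names are placeholders):

    # install a dependency into a project-local prefix instead of the system
    cmake -B libfoo-build -DCMAKE_INSTALL_PREFIX="$PWD/deps" path/to/libfoo
    cmake --build libfoo-build && cmake --install libfoo-build

    # build the app against that prefix
    cmake -B build -DCMAKE_PREFIX_PATH="$PWD/deps" .
    cmake --build build

    # run with the project-local libraries on the loader's search path
    LD_LIBRARY_PATH="$PWD/deps/lib" ./build/my-app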

These strategies are not exclusive but can be mixed and matched as necessary. For example, requiring that libstdc++ in a suitable version is installed on the host system might be OK, but you may want to link other dependencies statically. Or if you have a deployment script that can install dependencies locally, that script could also be used in the Dockerfile of the build container.
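
For example, a single (hypothetical) install-deps.sh could be invoked both on the host and from the build container's Dockerfile:

    # on the host: dependencies go into the project-local prefix
    ./scripts/install-deps.sh --prefix "$PWD/deps"

    # the build image's Dockerfile would contain something like
    #   RUN ./scripts/install-deps.sh --prefix /usr/local
    docker build -t cpp-builder .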

In the end, which of these variants is preferable depends on how your software will be installed and used. Anything is potentially fine, as long as everything can be scripted.
