Docker – How to forward application logs from Docker containers to ELK

docker docker-compose logstash

I'm trying to centralise logging in an environment that uses multiple application technologies (Java, Rails and various DBs).

We want developers to bring up stacks with Docker Compose, but we want them to refer to a central log source (ELK) to debug issues, rather than trying to open shells into running Docker containers.

The applications all write to the file system rather than to STDOUT/STDERR, which rules out the options built around the Docker logging driver, and logspout as well.

What we have done is configure the containers to have rsyslog include the application log files and forward them to Logstash, which has a syslog input. This works in terms of moving the logs from A to B, but managing multi-technology logs in ELK based on the syslog input is horrible (e.g. trying to capture multiline Java stack traces, or MySQL slow queries).
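For reference, the current setup is roughly the following rsyslog configuration in each container. This is a minimal sketch: the log path, tag, and Logstash hostname/port are illustrative, not actual values from our environment.

```
# Load rsyslog's file-input module and tail the application log
module(load="imfile")
input(type="imfile"
      File="/var/log/app/application.log"   # illustrative path
      Tag="myapp:"
      Severity="info")

# Forward everything to the Logstash syslog input over TCP
# (hostname and port are assumptions)
*.* @@logstash.internal:5000
```

The problem is that by the time events arrive at the syslog input, every line is an independent syslog message, so multiline events have already been split apart.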

Is there a better way to do this? Should I be running Logstash in each container, so that I can apply filters and codecs directly to the log files and not have to rely on the syslog input?
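To illustrate what the per-container approach would look like, here is a sketch of a Logstash pipeline that reads the files directly and reassembles multiline events before shipping. The path, multiline pattern, and Elasticsearch hostname are assumptions for illustration:

```
input {
  file {
    path => "/var/log/app/*.log"   # illustrative path
    codec => multiline {
      # Any line that does not start with a timestamp is treated as a
      # continuation of the previous event, which stitches Java stack
      # traces back into a single message (pattern is an assumption)
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate  => true
      what    => "previous"
    }
  }
}
output {
  elasticsearch { hosts => ["elasticsearch:9200"] }   # hostname assumed
}
```

This solves the multiline problem, but at the cost of running a JVM-based Logstash process in every container.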

Or is there some way to use the Docker logging driver with application log files that are written to the file system?

Best Answer

Recent versions of Docker support transmitting logs in 'GELF' format to a network port. Logstash has a GELF input. You could run Logstash on every node and have all Docker instances on the node forward to it.

As a Logstash input: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-gelf.html

input {
  gelf {
    port => 12201
  }
}

For Docker output: https://docs.docker.com/engine/admin/logging/overview/#gelf

$ docker run -dit \
             --log-driver=gelf \
             --log-opt gelf-address=udp://127.0.0.1:12201 \
             alpine sh

(The gelf-address is resolved from the host's perspective, not the container's.)
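Since the question mentions Docker Compose, the same logging driver can be set per service in the Compose file. A minimal sketch, with the image name assumed for illustration:

```yaml
version: "2"
services:
  app:
    image: myapp:latest          # illustrative image name
    logging:
      driver: gelf
      options:
        # Resolved by the Docker daemon on the host, not inside the container
        gelf-address: "udp://127.0.0.1:12201"
```

With Logstash (or a GELF-capable collector) listening on each node at that port, every service's output reaches ELK without any in-container agents.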
