Linux – Unix / Linux command to count lines per second from stdin

linux, performance

I am trying to count the number of SQL queries per second from a log file, and I want to do it in real time by piping stdout from grep into some command. (I am doing some performance testing.)

I could write it myself, but thought for sure this would exist.

I looked at wc, but I didn't see an option for this.

I could also use it to count requests per second by piping a tail from the access log.
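
If nothing exists, I guess I could hack something together myself, along these lines (the log path and grep pattern are made up; this assumes GNU awk for systime()):

tail -f /var/log/mysql/query.log | grep --line-buffered 'SELECT' | awk 'systime() != t { if (t) printf "%d lines/s\n", n; n = 0; t = systime() } { n++ }'

One catch with that: awk only checks the clock when a line arrives, so the output stalls whenever the input does.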

Best Answer

pv is your command! Pipe Viewer prints stats about the data passing through it, and can run anywhere in your pipeline, since it pipes stdin directly over to stdout. For example:

tail -f /var/log/nginx/access.log | pv --line-mode --rate > /dev/null

The pv command prints to stderr the current number of lines per second (the default is bytes per second), which, for this particular data source (Nginx's default log file), equates to incoming web requests per second. I only care about the counts, so I pipe stdout into /dev/null. There are also options like:

  • -b (total number of lines),
  • --average-rate (average rate since starting), and
  • --timer (tracks how long the pipe has been going).
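
These flags combine, so a fuller report on the same Nginx log might look like:

tail -f /var/log/nginx/access.log | pv --line-mode -b --timer --rate --average-rate > /dev/null

which prints the running line total, elapsed time, current rate, and average rate together on stderr.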

If you don't pass --line-mode, pv counts bytes instead, which is probably not what you want for server logs but can be handy elsewhere.
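
Byte mode shines for things like watching a long-running copy or decompression (the file names here are invented):

pv big-dump.sql.gz | gunzip > big-dump.sql

and since pv can stat a file argument, it even adds a progress bar and ETA when the total size is known.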

Final note: ... | pv -lb > file.txt is a lot like ... | tee file.txt | awk '{printf "\r%lu", NR}', which is also handy for counting lines, but the pv call is far shorter. Its output is a little less lively, though: pv updates once per second by default, while the awk command updates on every line.
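
For completeness, here is that tee variant spelled out against the same hypothetical access log:

tail -f /var/log/nginx/access.log | tee requests.txt | awk '{printf "\r%lu", NR}'

tee saves every line to requests.txt while awk keeps a running count on a single line (the \r makes each update overwrite the previous one).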