I have a Linux process generating very big log files.
These files could grow indefinitely if I didn't do anything.
Is there any way to limit a file's size and make it act like a kind of FIFO buffer, keeping only a certain amount of data?
I also tried logrotate, but it can't rotate the file as soon as it reaches a given size.
The log files can grow very fast, and I don't want to run logrotate only on a daily basis.
Thanks for your help.
Best Answer
You can run logrotate with a config file written specifically for the log file in question, and put it into a cron job that runs more often, e.g. every hour or every 15 minutes.
See
man logrotate
.
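As a minimal sketch (the log path /var/log/myapp.log, the config path, and the state-file path are all hypothetical placeholders), a standalone size-based config could look like this:

```
# /etc/logrotate-myapp.conf — rotate whenever the file exceeds 100 MB
/var/log/myapp.log {
    size 100M       # rotate once the file grows past this size
    rotate 5        # keep at most 5 rotated copies
    compress        # gzip old copies to save space
    missingok       # don't error if the log is absent
    notifempty      # skip rotation when the file is empty
}
```

Then a crontab entry can invoke logrotate against that config every 15 minutes, using a separate state file so it doesn't interfere with the system-wide daily run:

```
*/15 * * * * /usr/sbin/logrotate --state /var/lib/logrotate-myapp.state /etc/logrotate-myapp.conf
```

Note that logrotate only rotates when the size threshold is actually exceeded at the time it runs, so the file can still overshoot 100 MB by whatever is written between cron invocations.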