Inputting bulk logs in Logstash (stored in different folders) using the “file” plugin without specifying individual filters

logstash

So I have a question about the basic functionality of Logstash. I have many different .L01 files which I have exported into log files. Each .L01 file represents a host, and the export function dumps its logs into a folder tree. I need to read the logs from these individual "host folders", including all their sub-folders. Manually adding the path of each input log is obviously not feasible. The files are all different types of logs, so I can't define a single standard filter for them. If I just run Logstash without defining any filters, it dumps the whole log line into the message field like this:

   "message" => "6.703: Destination:C:\\WINDOWS\\INF\\inetres.adm \r",
  "@version" => "1",
"@timestamp" => "2014-07-18T00:42:32.544Z",
      "tags" => [
    [0] "host_1"
],
      "host" => "<hostname>",
      "path" => "<whatever the path is>"

So how do I configure Logstash when I essentially need to read all the logs generated on a computer (so, naturally, different types of logs), but from a folder (which rules out input plugins like syslog)? Do I need to write an individual filter for every single kind of log? That would require me to manually walk through all the folders and sub-folders, examine the content of each log, and then write a filter for it. I don't think I can do that, because I need to do this for approximately 20 hosts. I would appreciate any help with this.

Thank You

Best Answer

You can specify multiple paths in the file input:

logstash / inputs / file

path (required setting)

Value type is array

There is no default value for this setting.

You may also configure multiple paths. See an example on the Logstash configuration page.

If you need to look into subdirectories, you can use a glob pattern with /**/ in the path, which matches any number of directory levels recursively rather than just one level deeper.
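For example, a minimal input sketch (the /data/exports base directory and the host_*/ folder naming are assumptions; substitute whatever your .L01 export tool actually produces):

    input {
      file {
        # "**" recurses into every sub-folder under each host folder.
        path => [ "/data/exports/host_*/**/*.log" ]
        # Read existing files from the top rather than tailing only new lines,
        # which is what you want for a one-off bulk import.
        start_position => "beginning"
      }
    }

    output {
      stdout { codec => rubydebug }
    }

With a single file input like this you get one event per log line across all hosts; the path field on each event tells you which host folder and log file it came from, so you can route events to different filters later (e.g. with conditionals on path) instead of defining one input per log type up front.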

I hope that helps clarify things.