Why doesn’t logstash grab or index the files from the mapped drive

elasticsearch, kibana, logging, logstash, windows-server-2008-r2

I don't understand why logstash is so finicky with network resources. I shared a folder on another machine and then mapped it as Z: in Windows Explorer. I've verified the path and everything. I can get logstash (as part of the ELK stack) to read local files, but it just doesn't seem to do anything with network or mapped resources.

Is there something insanely simple I'm missing here? Do I need additional arguments to get input from a mapped drive into elasticsearch?

input {
  file {
    type => "BbLog"
    path => "Z:/*"
  }
}

output {
  elasticsearch {
    host => "localhost"
  }
}

Best Answer

A quick search for similar posts turned up a possible answer: try setting your path to something like

path => "z:/**/*.log"

Obviously you would want to edit that to fit your configuration; the double * acts as a recursive wildcard, so subdirectories are expanded as well. I would also suggest that pointing logstash at everything on a network drive probably isn't the best idea: by default the file input re-expands its wildcard globs every 15 seconds, and over a slow network share one expansion might never finish before the next one starts.
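For reference, a fuller input block along those lines might look something like the sketch below. This is only a sketch: the discover_interval, start_position and sincedb_path settings are standard file-input options I'm assuming you might want, not anything from your original config, and the sincedb path is just an illustrative value to adjust for your machine.

input {
  file {
    type => "BbLog"
    # recursive glob: ** descends into subdirectories of the mapped drive
    path => "Z:/**/*.log"
    # re-expand the glob less often than the 15-second default on a slow share
    discover_interval => 60
    # read existing files from the start instead of only tailing new lines
    start_position => "beginning"
    # on Windows, an explicit local sincedb file avoids home-directory problems
    sincedb_path => "C:/logstash/sincedb"
  }
}

Your existing elasticsearch output shouldn't need any extra arguments for this; the problem is on the input side.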

I can't say I have ever used logstash on Windows, but I know there are a few issues when people do; these are probably documented on the GitHub pages if you go and have a dig. Failing that, the #logstash IRC channel on freenode is almost always quite active.