Using Logstash as a shipper

elasticsearch, logging, logstash, monitoring, redis

We are shipping logs from several servers, running Logstash on each server as the shipper.

So we read logs from the glob "/root/Desktop/Logstash-Input/**/*_log".

input {
    file {
        path => "/root/Desktop/Logstash-Input/**/*_log"
        start_position => "beginning"
    }
}

From this glob we extract fields from the path that we want added to the event, e.g. the server, logtype, and so on from the directory components. We do this:

filter {
    grok {
        match => ["path", "/root/Desktop/Logstash-Input/(?<server>[^/]+)/(?<logtype>[^/]+)/(?<logdate>[\d]+.[\d]+.[\d]+)/(?<logfilename>.*)_log"]
    }
}

And then we output these logs to the central Logstash server using the lumberjack output plugin.

output {
    lumberjack {
        hosts => ["xx.xx.xx.xx"]
        port => 4545
        ssl_certificate => "./logstash.pub"
    }

    stdout { codec => rubydebug }
}

The problem is that logs shipped to the central server lose the fields added by grok: server, logtype, etc. are not present on the central server. The client machine's console shows the added fields, but on the central Logstash server only message, @timestamp and @version are present.

Client (from where logs are shipped) console:

output received {:event=>{"message"=>"2014-05-26T00:00:01+05:30 host crond[268]: (root) CMD (2014/05/31/server2/cron/log)", "@version"=>"1", "@timestamp"=>"2014-07-16T06:07:21.927Z", "host"=>"host", "path"=>"/root/Desktop/Logstash-Input/Server2/CronLog/2014.05.31/cron_log", "server"=>"Server2", "logtype"=>"CronLog", "logdate"=>"2014.05.31", "logfilename"=>"cron"}, :level=>:debug, :file=>"(eval)", :line=>"37"}
    {
              "message" => "2014-05-26T00:00:01+05:30 bx920as1 crond[268]: (root) CMD (2014/05/31/server2/cron/log)",
             "@version" => "1",
           "@timestamp" => "2014-07-16T06:07:21.927Z",
                 "host" => "host",
                 "path" => "/root/Desktop/Logstash-Input/Server2/CronLog/2014.05.31/cron_log",
               "server" => "Server2",
              "logtype" => "CronLog",
              "logdate" => "2014.05.31",
          "logfilename" => "cron"
    }

Central Server (where logs are shipped to) console:

{
       "message" => "2014-07-16T05:33:17.073+0000 host 2014-05-26T00:00:01+05:30 bx920as1 crond[288]: (root) CMD (2014/05/31/server2/cron/log)",
      "@version" => "1",
    "@timestamp" => "2014-07-16T05:34:02.370Z"
}

So the grokked fields are dropped during shipping. Why is that?

How can I retain the fields?

Best Answer

SOLVED:

I solved it by adding codec => "json" to both my lumberjack output and my lumberjack input.

Output:

output {
    lumberjack {
        hosts => ["xx.xx.xx.xx"]
        port => 4545
        ssl_certificate => "./logstash.pub"
        codec => "json"
    }
}

Input:

input {
    lumberjack {
        port => 4545
        ssl_certificate => "/etc/ssl/logstash.pub"
        ssl_key => "/etc/ssl/logstash.key"
        codec => "json"
    }
}
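With the json codec on both ends, the whole structured event is serialized instead of just the message line. To verify on the central server, a temporary stdout output with the rubydebug codec (the same codec already used on the client above) will print every decoded field; a minimal sketch:

```
output {
    # Temporary debug output on the central server: prints the full event,
    # so the grokked fields (server, logtype, logdate, logfilename)
    # should now appear alongside message, @timestamp and @version.
    stdout { codec => rubydebug }
}
```

Once the fields show up here, this block can be removed or replaced with the real downstream output.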