Kibana: no indices error

elasticsearch, kibana, logstash, rsyslog, syslog

None of the existing answers helped, so here is a new question.

Use case: redirecting syslog, or monitoring a static file.

I have successfully installed Logstash (1.4.2), Elasticsearch (1.1.1), and Kibana (3.0.1), but I am struggling to get rid of this error:

No results There were no results because no indices were found that match your selected time span

The Logstash config files I'm using are below. Please let me know if anything else is required from my end.
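(For context: Kibana shows that error when no logstash-YYYY.MM.DD indices exist for the selected time span. A quick way to see whether Logstash has created any indices at all, assuming Elasticsearch is on its default port 9200:)

```shell
# List all indices; a working pipeline creates daily logstash-YYYY.MM.DD indices
curl -s 'http://localhost:9200/_cat/indices?v'
```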

  1. syslog (listening on port 9000; yes, I have added "*.* @@localhost:9000" to /etc/rsyslog.d/50-default.conf and restarted rsyslog)

    sudo cat > /etc/logstash/conf.d/10-syslog.conf <<EOF
    input {
      tcp {
        port => 9000
        type => syslog
      }
      udp {
        port => 9000
        type => syslog
      }
    }
    filter {
      if [type] == "syslog" {
        grok {
          match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
          add_field => [ "received_at", "%{@timestamp}" ]
          add_field => [ "received_from", "%{host}" ]
        }
        syslog_pri { }
        date {
          match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
        }
      }
    }
    output {
      elasticsearch { host => localhost }
      stdout { codec => rubydebug }
    }
    EOF
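One detail worth checking in the date filter above: syslog pads single-digit days with an extra space, which is why both "MMM  d HH:mm:ss" and "MMM dd HH:mm:ss" are listed. A quick sanity check of the format your system emits (GNU date on Linux; the sample date is arbitrary):

```shell
# %e space-pads single-digit days, matching the "MMM  d" syslog style
ts=$(LC_ALL=C date -d '2014-12-09 06:55:46' '+%b %e %H:%M:%S')
echo "$ts"   # "Dec  9 06:55:46" (note the two spaces before the 9)
```

If the timestamp in your log lines uses a different shape, the date filter silently falls back to the event's arrival time.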
    
  2. Static file (a file containing syslog-type data)

    input {
       file {
          path => "/var/log/awasthi.log"
          type => syslog
          start_position => "beginning"
          sincedb_path => "/dev/null"
       }
    }
    
    filter {
      if [type] == "syslog" {
        grok {
          match => [ "message", "%{SYSLOGTIMESTAMP} %{NOTSPACE:hostname1}/%{NOTSPACE}"]
        }
      }
    }
    
    output {
      stdout { codec => rubydebug }
    }
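Note that the grok pattern in item 2 only matches lines whose hostname field is followed by a "/" (e.g. "host1/app"); an ordinary syslog line like "host1 sshd[123]: ..." would be tagged _grokparsefailure instead. A rough local check with a hypothetical sample line (the regex below only approximates grok's SYSLOGTIMESTAMP):

```shell
# Does a sample line have the shape the pattern above expects?
line='Dec  9 06:55:46 host1/app message body'
if echo "$line" | grep -Eq '^[A-Z][a-z]{2} +[0-9]{1,2} [0-9]{2}:[0-9]{2}:[0-9]{2} [^ /]+/[^ ]+'; then
  echo "pattern shape matches"
else
  echo "no match (check your pattern against a real log line)"
fi
```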
    

Best Answer

As mentioned in another thread:

"I had something similar; it sounds like you haven't set up ACLs to allow the logstash user to view that log file.

Use 'setfacl -m u:logstash:r-x /var/log', for example, and then test by editing /etc/passwd and temporarily giving the logstash user a shell. Then run 'su - logstash' and try to cd into /var/log or cat the file. If that works, the data should appear in your Kibana setup."

I use Redis in front of Elasticsearch, so I can test whether Logstash is working simply by running 'LLEN' and 'LPOP' against the queue.
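Concretely, that check looks like this; the list key name ("logstash" here) is an assumption and depends on how your redis output/input pair is configured:

```shell
# Hypothetical queue check, assuming Redis on localhost and a list key "logstash"
redis-cli LLEN logstash   # nonzero length means shippers are delivering events
redis-cli LPOP logstash   # pop one event to inspect its raw JSON
```

If LLEN grows but Kibana stays empty, the problem is on the indexer side; if it stays at zero, the shipper never delivered anything.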