Info sent from Logstash via elastic output not showing in Kibana, but file output works fine - what am I doing wrong?

I have an "ELK stack" configuration and, at first, was doing the standard 'filebeat' syslog feeding from logstash with the elasticsearch output plugin. It worked just fine.

Now I have added a TCP input port (with an assigned "type" for this data so I can use if [type] == "thistype" to differentiate in the filters), its own grok filter, and output to both elasticsearch (with its own unique index name and document_type) and to a file. When the data comes in over the TCP port, the properly formatted data is written to the output file as expected by the file output plugin, but no data shows up in Kibana when I choose the given index. Kibana recognizes the index from my output configuration and also lists all the fields/keys I assign in the grok filter; however, again, no data is searchable. The data is definitely being grok'd properly since, as I mentioned, it appears in the file plugin output.

What am I doing wrong here? My configuration is as follows:

input {
  tcp {
    host => "10.1.1.10"
    port => 12345
    type => "odata"
    id => "odata"
    codec => line
  }
}

filter {
  if [type] == "odata" {
    grok {
      match => { "message" => "%{QUOTEDSTRING:oid},\"%{WORD:oword1}\",\"%{IPV4:oclientip}\",\"%{DATA:otimestamp}\",%{QUOTEDSTRING:opl},%{QUOTEDSTRING:oos},%{QUOTEDSTRING:oua}" }
      remove_field => "message"
    }
    date {
      match => ["otimestamp", "YYYY-MM-dd HH:mm:ss Z"]
    }
    mutate {
      remove_field => "otimestamp"
      remove_field => "host"
      remove_field => "@version"
    }
  }
}

output {
  # the if .. is here because there are other types that are handled in this
  # output, since I centralize the input, filter, and output files into three
  # distinct files.
  if [type] == "odata" {
    elasticsearch {
      hosts => ["10.1.1.1:9200", "10.1.1.2:9200"]
      sniffing => false
      index => "odataindex"
      document_type => "odatatype"
    }
    file {
      path => "/tmp/odata_output"
    }
  }
}

Again, the grok filter is fine, the data is ingested via TCP just fine, and the file output plugin writes the interpreted/grok'd fields fine. Kibana sees the "odataindex" index and the fields from the grok filter (such as oid, oclientip, oua, etc.). It just doesn't return any data when I do a search.

Any ideas? I am new to elastic/kibana/logstash and would also appreciate any tips on debugging these things.
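The only debugging step I have thought of so far is to add a stdout output inside the same conditional, so I can see exactly what events Logstash is emitting (including the @timestamp produced by the date filter) and compare that with the time range I have selected in Kibana. A minimal sketch of what I had in mind, not something I have run yet:

output {
  if [type] == "odata" {
    # print each parsed event to the Logstash console, including the
    # @timestamp set by the date filter, to compare against the time
    # range selected in Kibana
    stdout { codec => rubydebug }
  }
}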

Thank you in advance!
