
Some recipes for Filebeat.

Input

The input section declares the file prospectors.

Use pipeline

Assign an Elasticsearch ingest pipeline at the prospector level.


# monit logs
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/monit.log
  fields:
    type: "logs"
    host: "${beat.hostname:dhost}"
    application: "monit"
    environment: "${APP_ENV:default}"
  fields_under_root: true
  exclude_files: [".gz$"]
  pipeline: "monit_logs"

Output

Send events to Logstash or Elasticsearch.
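The examples in this section target Elasticsearch directly. For completeness, a minimal Logstash output is sketched below; the hostnames are placeholders, so point them at your own Logstash instances.

```yaml
# Send events to Logstash instead of Elasticsearch.
# Hostnames are placeholders for this sketch.
output.logstash:
  hosts: ["logstash1:5044", "logstash2:5044"]
  loadbalance: true
```

Note that Filebeat allows only one output to be enabled at a time, so use either output.logstash or output.elasticsearch, not both.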

Deciding Index Output

  • An ingest pipeline can also be assigned in the output section, but it is better to declare it at the prospector level
  • Use conditions to determine which index an event should be written to

output.elasticsearch:
  hosts: ["localhost:9200"]  # placeholder; set your Elasticsearch hosts here
  username: "le-mapper"
  password: "mapper-king"
  index: "logs-${FO_ENV:default}-%{+yyyy.MM.dd}"
  indices:
    - index: "myapp-%{+yyyy.MM.dd}"  # Elasticsearch index names must be lowercase
      pipeline: "myApp"
      when.equals:
        application: "myApp"
    - index: "error-%{+yyyy.MM.dd}"
      when.equals:
        level: "ERROR"
    - index: "devops-%{+yyyy.MM.dd}"
      when.equals:
        application: "monit"
        

Index Destination

Using a field value as the target index is also possible.

filebeat.prospectors:
- input_type: log
  paths: ["/var/log/syslog"]
  fields.kind: "syslog"

Then add to the output section:


output.elasticsearch:  
  hosts: ["elastic1:9200", "elastic2:9200", "aws-es:9200"]
  index: "%{[fields.kind]:logs}-%{+yyyy.MM.dd}"
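With this output, each event is routed to an index named after its own fields.kind value, and events without that field fall back to the logs- index. A sketch with a hypothetical second prospector (the auth.log path and "auth" value are illustrative):

```yaml
filebeat.prospectors:
- input_type: log
  paths: ["/var/log/syslog"]
  fields.kind: "syslog"          # routed to syslog-YYYY.MM.dd
- input_type: log
  paths: ["/var/log/auth.log"]   # hypothetical second prospector
  fields.kind: "auth"            # routed to auth-YYYY.MM.dd
```

Because Elasticsearch index names must be lowercase, keep the fields.kind values lowercase as well.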