Installing and configuring ELK: Elasticsearch + Logstash + Kibana (with Filebeat)

Installing and setting up Kibana to analyze some log files is not a trivial task. Kibana is just one part of a stack of tools typically used together:

  • Elasticsearch: full text search engine. Installation instructions here
  • Logstash: data collection and filtering. Installation instructions here
  • Kibana: analytics and visualization platform. Installation instructions here

Together these are referred to as the ELK stack.

Working through the install steps and many other how-to guides, I kept seeing another piece of the puzzle referenced: Filebeat, which watches local files (e.g. logs) and ships them to either Elasticsearch or Logstash. Installation instructions here.

I have an nginx access.log file that I want to examine for common patterns and trends. The flow of data through each of these tools in the stack looks like:

  • Filebeat -> Logstash -> Elasticsearch -> Kibana
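
In terms of ports: Filebeat ships events to Logstash on the port defined in the beats input (5043 here), Logstash indexes them into Elasticsearch on its default port 9200, and Kibana reads from Elasticsearch and serves its UI on its default port 5601.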

First I configured Logstash with a simple pipeline for ingesting my log files (no filtering configured yet), saved as pipeline1.conf:

input {
    # Listen for incoming events from Filebeat on port 5043
    beats {
        port => "5043"
    }
}
# The filter part of this file is commented out to indicate that it is
# optional.
# filter {
#
# }
output {
    # Index the events into the Elasticsearch node
    elasticsearch {
        hosts => [ "192.168.1.94:9200" ]
    }
}

This takes incoming log data from Filebeat on port 5043 and outputs the results to Elasticsearch.
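
When I do get around to filtering, the commented-out filter block is where the parsing would go. As a sketch: nginx's default "combined" access-log format matches Logstash's stock COMBINEDAPACHELOG grok pattern, so something like this should break each line into named fields:

filter {
    grok {
        # Parse each raw access-log line into fields such as clientip,
        # verb, request, response, bytes, referrer and agent
        match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
}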

Start up Logstash with:

sudo bin/logstash --path.settings=/etc/logstash/ -f pipeline1.conf --config.reload.automatic
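
To confirm Logstash actually came up, its monitoring API (port 9600 by default on Logstash 5.x and later) responds with basic node info:

curl -XGET 'http://localhost:9600/?pretty'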

Now to configure Filebeat, in /etc/filebeat/filebeat.yml, to push my nginx access.log into Logstash:

filebeat.prospectors:   # renamed to filebeat.inputs in Filebeat 6.3+
- type: log
  enabled: true
  paths:
    - /home/kev/vps1-logs/access.log   # local copy of the nginx access log

output.logstash:
  hosts: ["192.168.1.94:5043"]   # the Logstash beats input configured above
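
Filebeat can sanity-check the YAML before you run it; on 6.x+ this is the test config subcommand (older 5.x releases use a -configtest flag instead):

sudo filebeat test config -c /etc/filebeat/filebeat.yml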

Start Filebeat to upload the nginx access.log file (this is a local copy of the access.log from my nginx server, but I understand you can use Filebeat to ship updates from your server to your ELK server on the fly):

/usr/share/filebeat/bin$ sudo ./filebeat -e -c /etc/filebeat/filebeat.yml -d "publish"
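
Before heading to Kibana, it's worth verifying the documents actually landed in Elasticsearch. The Logstash elasticsearch output writes to daily logstash-YYYY.MM.dd indices by default, which should show up in the index listing:

curl 'http://192.168.1.94:9200/_cat/indices?v'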

After the data has transferred, I can hit the Kibana site at http://192.168.1.94:5601/ (Kibana asks for an index pattern on first visit; logstash-* matches the indices above) and start querying my log data.

It looks like I have some work to do to better tag my log data, but with the data imported, it's time to start checking out the query syntax.
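
Once the lines are parsed into fields (e.g. with the grok sketch above), Lucene-style queries in Kibana's Discover search bar look something like this (field names assume the COMBINEDAPACHELOG pattern):

response:404
clientip:192.168.1.1
verb:POST AND response:404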