Wednesday, October 4, 2017

Configuring and Processing Apache Tomcat logs using Elasticsearch-Logstash-Kibana (ELK) Stack

We can process Apache Tomcat log files with the ELK stack, which makes debugging them much easier.
Please refer to Installing the ELK Stack on Windows for installation and initial configuration.

First we need to index the log file into Elasticsearch using Logstash, and then configure the indexed data in Kibana for visualization.


Step 1

Create a file called “apache-log-file.conf” in “D:\ELK\logstash-5.6.0\bin” and add the content below.
input {
    # Tail the Tomcat catalina log file from the beginning
    file {
        type => "log"
        path => ["C:/Tomcat/apache-tomcat-7.0.37/logs/catalina.2017-08-30.log"]
        start_position => "beginning"
    }
}
filter {
    # Try to parse each line with the Apache access-log pattern
    grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    # Use the parsed timestamp as the event timestamp
    date {
        match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
}
output {
    # Push events into the "apache" index in Elasticsearch
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "apache"
    }
    # Also print each event to the console for debugging
    stdout { codec => rubydebug }
}
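Note: %{COMBINEDAPACHELOG} is meant for Apache HTTP access logs, so the plain catalina log lines will not match it and get tagged with _grokparsefailure (you can see this in the console output in Step 2). If you want the catalina entries parsed into fields, a rough sketch of an alternative filter is shown below; it assumes the CATALINA_DATESTAMP, JAVACLASS and JAVALOGMESSAGE patterns bundled with Logstash and may need adjusting to your exact log layout.

filter {
    # Sketch only: parse the first line of a catalina entry, e.g.
    # "Aug 30, 2017 11:19:11 PM org.apache.coyote.AbstractProtocol init"
    grok {
        match => { "message" => "%{CATALINA_DATESTAMP:timestamp} %{JAVACLASS:class} %{JAVALOGMESSAGE:logmessage}" }
    }
    # Parse the catalina timestamp (one- and two-digit days)
    date {
        match => [ "timestamp", "MMM dd, yyyy hh:mm:ss a", "MMM d, yyyy hh:mm:ss a" ]
    }
}

Continuation lines such as "INFO: Initializing ProtocolHandler ..." would still fail this match; combining them with the preceding line into one event (for example with a multiline codec on the file input) is a further refinement.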
Step 2
Restart the Logstash service if it is already running, and then start Logstash with the configuration file as shown below.
D:\ELK\logstash-5.6.0\bin>logstash -f apache-log-file.conf
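Optionally, you can validate the configuration file first without starting the pipeline (flag name as in Logstash 5.x):

D:\ELK\logstash-5.6.0\bin>logstash -f apache-log-file.conf --config.test_and_exit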

You should see a message that Logstash started successfully.
As Tomcat writes to its log, Logstash captures each line, prints it to the console, and pushes it to the Elasticsearch host as configured.
The console output looks like the following:
[2017-10-04T23:22:12,520][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port
{
          "path" => "C:/ Tomcat/apache-tomcat-7.0.37/logs/catalina.2017-08-30.log",
    "@timestamp" => 2017-10-04T17:55:15.811Z,
      "@version" => "1",
          "host" => "BALA-LAPTOP",
       "message" => "Aug 30, 2017 11:19:11 PM org.apache.coyote.AbstractProtocol init\r",
          "type" => "log",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}
{
          "path" => "C:/ Tomcat/apache-tomcat-7.0.37/logs/catalina.2017-08-30.log",
    "@timestamp" => 2017-10-04T17:55:15.814Z,
      "@version" => "1",
          "host" => " BALA-LAPTOP ",
       "message" => "INFO: Initializing ProtocolHandler [\"http-bio-8080\"]\r",
          "type" => "log",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}
{
          "path" => "C/Tomcat/apache-tomcat-7.0.37/logs/catalina.2017-08-30.log",
    "@timestamp" => 2017-10-04T17:55:15.818Z,
      "@version" => "1",
          "host" => " BALA-LAPTOP ",
       "message" => "Aug 30, 2017 11:19:11 PM org.apache.coyote.AbstractProtocol init\r",
          "type" => "log",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}
You can verify that the index was created by hitting http://localhost:9200/_cat/indices in a browser.
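For example, from a command prompt with curl available (the index name apache comes from our output configuration; document counts will differ):

curl "http://localhost:9200/_cat/indices?v"
curl "http://localhost:9200/apache/_search?q=message:ProtocolHandler&pretty"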

Step 3
Next, we will load the indexed data into Kibana.
Go to Management -> Index Patterns and click the + Create Index Pattern button.

Enter apache as the index pattern and click the Create button.

Go to the Discover page and select your index pattern to view the data.
From here you can search the data, create visualizations, and much more.
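For example, entering a Lucene query such as the one below in the Discover search bar should narrow the results to the ProtocolHandler lines we saw in the console (adjust the field name if your filter parses messages into other fields):

message:ProtocolHandler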
Happy debugging!
Thanks.
