Logstash - Parsing the Logs
Logstash receives the logs using input plugins and then uses the filter plugins to parse and transform the data. The parsing and transformation of logs are performed according to the systems present in the output destination. Logstash parses the logging data and forwards only the required fields; these fields are later transformed into a form that is compatible with, and understandable by, the destination system.
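For illustration, the following is a minimal pipeline sketch that ties the three stages together, assuming an Apache access log read from a local file and indexed into Elasticsearch; the file path, the pattern and the host are placeholder assumptions, not part of this tutorial's setup.

input {
   file {
      path => "/var/log/apache2/access.log"              # assumed location of the source log
   }
}
filter {
   grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }   # parse the raw line into named fields
   }
   mutate {
      remove_field => [ "message" ]                      # forward only the parsed fields
   }
}
output {
   elasticsearch {
      hosts => ["localhost:9200"]                        # assumed destination system
   }
}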
How to Parse the Logs?
Parsing of the logs is performed by using GROK patterns. A library of predefined GROK patterns is available on GitHub.
Logstash matches the data of the logs with a specified GROK pattern or a pattern sequence, such as "%{COMBINEDAPACHELOG}", which is commonly used for Apache logs.
The parsed data is more structured and easier to search and query. Logstash looks for the specified GROK patterns in the input logs and extracts the matching lines from the logs. You can use the GROK Debugger to test your GROK patterns.
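Besides the online debugger, a pattern can also be tried locally with a small test pipeline that reads lines from stdin and prints every extracted field to the console; this is only a sketch for experimentation using the standard stdin, grok and stdout plugins.

input {
   stdin { }                                             # paste sample log lines here
}
filter {
   grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
   }
}
output {
   stdout { codec => rubydebug }                         # print the parsed event with all its fields
}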
The syntax for a GROK pattern is %{SYNTAX:SEMANTIC}. A Logstash GROK filter is written in the following form −
%{PATTERN:FieldName}
Here, PATTERN represents the GROK pattern and FieldName is the name of the field that holds the parsed data in the output.
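For instance, applied to an address such as 192.168.1.1 (the client address in the sample below), the pattern %{IP:clientip} uses the predefined IP syntax and stores the matched text in a field named clientip −

%{IP:clientip}
"clientip": "192.168.1.1"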
For example, using the online GROK Debugger −
Input
A sample error line in a log −
[Wed Dec 07 21:54:54.048805 2016] [:error] [pid 1234:tid 3456829102] [client 192.168.1.1:25007] JSP Notice: Undefined index: abc in /home/manu/tpworks/tutorialspoint.com/index.jsp on line 11
GROK Pattern Sequence
This GROK pattern sequence matches the log event, which comprises a timestamp followed by the log level, process id, transaction id and an error message.
\[(%{DAY:day} %{MONTH:month} %{MONTHDAY} %{TIME} %{YEAR})\] \[.*:%{LOGLEVEL:loglevel}\] \[pid %{NUMBER:pid}:tid %{NUMBER:tid}\] \[client %{IP:clientip}:.*\] %{GREEDYDATA:errormsg}
Output
The output is in JSON format.
{ "day": [ "Wed" ], "month": [ "Dec" ], "loglevel": [ "error" ], "pid": [ "1234" ], "tid": [ "3456829102" ], "cpentip": [ "192.168.1.1" ], "errormsg": [ "JSP Notice: Undefined index: abc in /home/manu/tpworks/tutorialspoint.com/index.jsp on pne 11" ] }Advertisements