
Collecting nginx logs with Logstash: parsing log lines with the grok filter plugin

2018-06-12 11:50

grok is a Logstash filter plugin that parses lines of text logs against named patterns and splits each line into structured fields.

  • The nginx log format configuration:
log_format  main  '$remote_addr - $remote_user [$time_local] "$request" $status $body_bytes_sent "$http_referer" "$http_user_agent" "$http_x_forwarded_for"';
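For reference, a line produced by this log_format looks like the following (the IP address, path, and user agent are made-up example values):

```
203.0.113.5 - - [12/Jun/2018:11:50:01 +0800] "GET /index.html HTTP/1.1" 200 612 "-" "curl/7.29.0" "-"
```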

 

  • The grok patterns for Logstash (appended to the /usr/local/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns/grok-patterns file). The pattern must match the log_format above field for field; any mismatch makes grok tag the event with _grokparsefailure:

WZ ([^ ]*)
NGINXACCESS %{IP:remote_ip} - %{DATA:remote_user} \[%{HTTPDATE:timestamp}\] "%{WORD:method} %{WZ:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:status} %{NUMBER:bytes} %{QS:referer} %{QS:agent} %{QS:xforward}
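Grok patterns compile down to named-group regular expressions. As a quick way to sanity-check the field extraction outside Logstash, here is a rough Python equivalent of a pattern matching the log_format above; the group names mirror the grok field names, and the sample line is a made-up value:

```python
import re

# Approximate regex equivalent of the NGINXACCESS grok pattern.
# Group names mirror the grok field names.
NGINX_ACCESS = re.compile(
    r'(?P<remote_ip>\S+) - (?P<remote_user>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\w+) (?P<request>\S+) HTTP/(?P<httpversion>[\d.]+)" '
    r'(?P<status>\d+) (?P<bytes>\d+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)" "(?P<xforward>[^"]*)"'
)

# A made-up sample line in the log_format shown earlier.
line = ('203.0.113.5 - - [12/Jun/2018:11:50:01 +0800] '
        '"GET /index.html HTTP/1.1" 200 612 "-" "curl/7.29.0" "-"')

match = NGINX_ACCESS.match(line)
fields = match.groupdict()
print(fields["remote_ip"])  # 203.0.113.5
print(fields["request"])    # /index.html
print(fields["status"])     # 200
```

If `match` comes back `None` for a real log line, the pattern and the log_format have drifted apart, which is exactly what produces `_grokparsefailure` tags in Logstash.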

 

The Logstash pipeline configuration:

input {
  file {
    path => "/usr/local/nginx/logs/access.log"
    type => "nginx"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{NGINXACCESS}" }
  }
}

output {
  if [type] == "nginx" {
    elasticsearch {
      hosts => ["172.17.102.202:9200"]
      index => "nginx"
    }
  }
}
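Editing the bundled grok-patterns file works, but those changes are lost when Logstash is upgraded. A more durable option is to keep the custom patterns in their own file and point the grok filter at that directory with `patterns_dir` (the directory path below is an assumption, not part of the original setup):

```
# /usr/local/logstash-6.2.4/patterns/nginx  (hypothetical pattern file)
# containing the WZ and NGINXACCESS lines shown above

filter {
  grok {
    patterns_dir => ["/usr/local/logstash-6.2.4/patterns"]
    match => { "message" => "%{NGINXACCESS}" }
  }
}
```

Either way, the pipeline can be syntax-checked before starting it with `bin/logstash -f <config file> --config.test_and_exit`.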

 
