Collecting nginx logs with the ELK stack
A previous article covered how to make nginx write POST request data into its log; see:
Logging the POST request body in nginx
Next we use filebeat, logstash, and elasticsearch to collect, parse, and store the log file so it can be viewed and analyzed later. The packages to install are:
filebeat-7.7.0-linux-x86_64.tar.gz
Download link: https://pan.baidu.com/s/1hRTEZpBzXEfulmOtmqQmmQ
Extraction code: fdbr
logstash-7.7.0.tar.gz
Download link: https://pan.baidu.com/s/12KLCDNbJ5rwT3r0a9ustCw
Extraction code: qpi6
1. filebeat configuration
1) Edit filebeat.yml in the filebeat directory. Only the modified parts are shown here; note that the other outputs must be commented out and the Logstash output enabled:
```yaml
  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /usr/local/nginx_stream/nginx/logs/app_post.log

#============================== Kibana =====================================
#setup.kibana:

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]
```
2) Start filebeat with the following command (adjust the paths to your own installation):
/usr/local/nginx_stream/filebeat-7.7.0-linux-x86_64/filebeat -c /usr/local/nginx_stream/filebeat-7.7.0-linux-x86_64/filebeat.yml run &
2. logstash configuration
1) Go into logstash-7.7.0/config, copy "logstash-sample.conf" to logstash.conf (the name used by the start command below), and modify it to suit your needs, as follows:
```conf
# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => [
      "message", "\[%{HTTPDATE:dt}\]\s%{GREEDYDATA:jsondata}"
    ]
  }

  mutate {
    gsub => ["jsondata", "[\\]", ""]
    #gsub => ["jsondata", "[\"]", ""]
    remove_field => ["host"]
    remove_field => ["tags"]
    remove_field => ["agent"]
    remove_field => ["ecs"]
    remove_field => ["@version"]
    remove_field => ["input"]
    remove_field => ["log"]
  }

  split {
    field => "jsondata"
    terminator => "#"
  }

  json {
    source => "jsondata"
  }
}

output {
  stdout { }

  elasticsearch {
    hosts => ["192.168.1.227:9200"]
    index => "testlog-%{+YYYY.MM.dd}"
  }
}
```
The meaning of each block is explained later; the final output goes to elasticsearch.
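The filter chain can be sketched in plain Python to make each step concrete: the grok pattern splits off the timestamp, the gsub strips the backslashes nginx inserts when escaping the body into the log, and the split/json pair turns each '#'-delimited document into its own event. The sample line below is an assumption modeled on the request body shown in the testing section.

```python
import json
import re

# A sample line in the shape nginx writes: "[<HTTPDATE>] <body>", where the
# body carries one or more '#'-prefixed JSON documents (the exact line is an
# assumption based on the request body shown in the testing section).
line = ('[27/May/2020:06:11:26 +0000] '
        '#{"a": "im ll text", "b": "aaaaaaaa"}#{"a": "im ll more", "b": "bbbbbbbbbbb"}')

# grok: \[%{HTTPDATE:dt}\]\s%{GREEDYDATA:jsondata}
m = re.match(r'\[(?P<dt>[^\]]+)\]\s(?P<jsondata>.*)', line)
dt, jsondata = m.group('dt'), m.group('jsondata')

# mutate/gsub ["jsondata", "[\\]", ""]: strip the backslashes nginx adds
# when it escapes the body into the log.
jsondata = jsondata.replace('\\', '')

# split { terminator => "#" } plus json { source => "jsondata" }:
# one parsed event per '#'-delimited JSON document.
events = [json.loads(part) for part in jsondata.split('#') if part.strip()]

print(dt)
print(events)
```

This also shows why the split stage matters: without it, the two JSON documents in one log line would be indexed as a single event.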
3. Startup
The startup commands are as follows:
nginx:
/usr/local/nginx_stream/nginx/sbin/nginx
filebeat:
/usr/local/nginx_stream/filebeat-7.7.0-linux-x86_64/filebeat -c /usr/local/nginx_stream/filebeat-7.7.0-linux-x86_64/filebeat.yml run &
logstash:
/usr/local/nginx_stream/logstash-7.7.0/bin/logstash -f /usr/local/nginx_stream/logstash-7.7.0/config/logstash.conf &
4. Testing
Use Postman to send a request with data to nginx; the log entries then show up in elasticsearch.
The data in the request body is as follows:
#{"a": "im ll 中文", "b": "aaaaaaaa"}#{"a":"im ll 英文", "b":"bbbbbbbbbbb"}
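Instead of Postman, the same request can be sent from a short script. A minimal sketch using only the standard library (the URL path is an assumption; point it at your own nginx and the location that logs POST bodies):

```python
import json
import urllib.request

# The endpoint is an assumption -- substitute your nginx address and the
# location that logs request bodies.
url = 'http://localhost/app_post'
payload = {"a": "im ll 中文", "b": "aaaaaaaa"}

req = urllib.request.Request(
    url,
    data=json.dumps(payload, ensure_ascii=False).encode('utf-8'),
    headers={'Content-Type': 'application/json'},
    method='POST',
)

# Actually sending it needs a running nginx, so the call is left commented out:
# urllib.request.urlopen(req)
print(req.get_method(), req.full_url)
```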
A few seconds later, seeing data like the following in elasticsearch means it worked:
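As a side note, the `testlog-%{+YYYY.MM.dd}` index pattern in the logstash output puts each day's events into a dated index, named from the event's @timestamp (UTC by default). The resulting name can be reproduced as:

```python
from datetime import datetime, timezone

# Logstash's index => "testlog-%{+YYYY.MM.dd}" names the index from the
# event's @timestamp, so an event from 27 May 2020 goes into
# "testlog-2020.05.27".
ts = datetime(2020, 5, 27, 6, 11, 26, tzinfo=timezone.utc)
index = 'testlog-' + ts.strftime('%Y.%m.%d')
print(index)
```

Daily indices make it easy to query or delete one day's logs at a time.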
Finally, the reference links are posted below. Overall the setup did not have many pitfalls and went fairly smoothly.
1) The backslash substitution in logstash: startup kept failing with the following error:
```
[2020-05-27T06:11:26,062][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", \"{\", \",\", \"]\" at line 18, column 33 (byte 294) after filter {\n grok {\n match => [\n \"message\", \"\\[%{HTTPDATE:dt}\\]\\s%{GREEDYDATA:jsondata}\"\n ]\n }\n\n mutate {\n gsub => [\"jsondata\", \"\\\\\", \"", :backtrace=>["/usr/local/nginx_stream/logstash-7.7.0/logstash-core/lib/logstash/compiler.rb:58:in `compile_imperative'", "/usr/local/nginx_stream/logstash-7.7.0/logstash-core/lib/logstash/compiler.rb:66:in `compile_graph'", "/usr/local/nginx_stream/logstash-7.7.0/logstash-core/lib/logstash/compiler.rb:28:in `block in compile_sources'", "org/jruby/RubyArray.java:2577:in `map'", "/usr/local/nginx_stream/logstash-7.7.0/logstash-core/lib/logstash/compiler.rb:27:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:181:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:67:in `initialize'", "/usr/local/nginx_stream/logstash-7.7.0/logstash-core/lib/logstash/java_pipeline.rb:43:in `initialize'", "/usr/local/nginx_stream/logstash-7.7.0/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/usr/local/nginx_stream/logstash-7.7.0/logstash-core/lib/logstash/agent.rb:342:in `block in converge_state'"]}
```
This post eventually provided the fix:
https://www.geek-share.com/detail/2728904711.html
2) Other reference links:
filebeat official documentation
logstash official download page
logstash official documentation
grok debugger
ELK Stack 中文指南 (ELK Stack Chinese Guide)