
Log Analysis: Practice and Application


The first scenario: to preserve throughput, an application normally logs at the INFO or ERROR level. When a problem appears, we want to switch the level to DEBUG dynamically, without restarting the service, so the execution details become visible. The steps below show how to do this with a small piece of code, a scheduled task, and a configuration-center service.

The second scenario: in a microservice deployment, logs are scattered across the nodes of each service cluster and are awkward to inspect one by one, so we collect them in a central place for storage, viewing, and analysis.

Logging configuration in the application

logback.xml

  • 1. Inside configuration, add include entries for Spring Boot's defaults.xml, console-appender.xml, and file-appender.xml so their variables, default settings, and policies can be reused.
  • 2. Define a Logstash appender, specifying the target host, port, and which encoder to use. Through this appender, log events are shipped to the central log management platform for storage and further analysis. (LogstashTcpSocketAppender and LogstashEncoder come from the logstash-logback-encoder library, which must be on the classpath.)
  • Note: Spring Boot recommends logback-spring.xml over a plain logback.xml (the -spring variant lets Spring fully control log initialization and enables extensions such as springProfile). In testing, the plain logback.xml configuration still works and is picked up after changes, so it was kept here; just make sure the configuration attribute scan is not set to true and leave the scanning to Spring.

The full configuration is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>
    <include resource="org/springframework/boot/logging/logback/console-appender.xml"/>
    <include resource="org/springframework/boot/logging/logback/file-appender.xml"/>

    <!-- Console appender -->
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <!-- Log pattern -->
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss} [%level] - %m%n</pattern>
        </encoder>
    </appender>

    <!-- File appender for error logs -->
    <appender name="ERROR-OUT" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <!-- Path of the log file -->
        <file>d:/logs/error.log</file>
        <!-- Log pattern -->
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss} [%class:%line] - %m%n</pattern>
        </encoder>
        <!-- Level filter -->
        <filter class="ch.qos.logback.classic.filter.LevelFilter">
            <!-- Level to match -->
            <level>ERROR</level>
            <!-- On match: accept (record the event) -->
            <onMatch>ACCEPT</onMatch>
            <!-- On mismatch: deny (drop the event) -->
            <onMismatch>DENY</onMismatch>
        </filter>
        <!-- Rolling policy: roll the file over by date -->
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- File name pattern for rolled files -->
            <fileNamePattern>error.%d{yyyy-MM-dd}.log</fileNamePattern>
            <!-- Keep at most 30 days of history -->
            <maxHistory>30</maxHistory>
        </rollingPolicy>
    </appender>

    <!-- Logstash shipping (option 1) -->
    <appender name="STASH-OUT" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:4560</destination>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <!-- Fluentd shipping (option 2) -->
    <appender name="FLUENT" class="ch.qos.logback.more.appenders.DataFluentAppender">
        <remoteHost>${fluentHost}</remoteHost>
    </appender>

    <!-- Root logger at DEBUG: whether the console or the file actually records an event also depends on
         the filter configured on that appender; if an appender has no filter, the root level alone applies. -->
    <root level="debug">
        <appender-ref ref="STDOUT"/>
        <appender-ref ref="ERROR-OUT"/>
        <appender-ref ref="STASH-OUT"/>
    </root>
</configuration>
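
With the STASH-OUT appender attached to the root logger, application code keeps logging through the ordinary SLF4J API; LogstashEncoder serializes each event as JSON and ships it to the configured TCP destination. A minimal sketch of how the configuration is used (the class and field names are illustrative, not taken from the article); values placed in the MDC are added to the JSON event by LogstashEncoder by default, which makes them searchable as fields in Kibana later:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class OrderService {

    private static final Logger log = LoggerFactory.getLogger(OrderService.class);

    public void placeOrder(String orderId) {
        // MDC entries become extra JSON fields in the shipped event,
        // so "orderId" can be filtered on in Kibana's Discover view.
        MDC.put("orderId", orderId);
        try {
            log.info("order received");
            log.debug("order details loaded for {}", orderId);
        } finally {
            MDC.remove("orderId");
        }
    }
}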

Scheduled log-level refresh code

The @Scheduled task below polls the configuration center every two minutes and applies the configured level to the com.ftsafe logger (the ConfigService/Config API used here matches the Apollo configuration-center client):

/**
 * Refresh the log level from the configuration center every 2 minutes.
 */
@Scheduled(fixedRate = 1000 * 60 * 2)
public void refresh() {
    String moduleKey = "com.ftsafe";
    // This only applies when logback is the SLF4J implementation behind 'log'.
    if (log instanceof ch.qos.logback.classic.Logger) {
        Config applicationConfig = ConfigService.getAppConfig();
        String levelConfig = applicationConfig.getProperty("logger.level." + moduleKey, null);
        // Guard against a missing key: Level.toLevel(null) falls back to DEBUG
        // and would silently enable debug logging.
        if (levelConfig != null) {
            ch.qos.logback.classic.Logger classicLog = (ch.qos.logback.classic.Logger) log;
            ch.qos.logback.classic.Logger logger = classicLog.getLoggerContext().getLogger(moduleKey);
            logger.setLevel(Level.toLevel(levelConfig));
            // Logged at both levels so the output shows whether DEBUG is now active.
            log.debug("logger modify level {}", levelConfig);
            log.info("logger modify level {}", levelConfig);
        }
    }
    log.info("logger refresh invoked!");
    log.debug("logger refresh invoked!");
}
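
For the task above to run, the method has to live in a Spring-managed bean with scheduling enabled, and log has to be the class's own SLF4J logger. A possible enclosing class, shown only as a sketch (the class name, Lombok usage, and placement of @EnableScheduling are assumptions, not from the article):

import ch.qos.logback.classic.Level;
import com.ctrip.framework.apollo.Config;
import com.ctrip.framework.apollo.ConfigService;
import lombok.extern.slf4j.Slf4j;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Slf4j             // supplies the 'log' field used in refresh()
@Component         // registers the bean so the scheduler can find @Scheduled methods
@EnableScheduling  // more commonly placed on a @Configuration class; shown here for completeness
public class LogLevelRefresher {

    // ... the refresh() method from above goes here ...
}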

Centralized log management (Windows environment)

Elasticsearch

A search and analytics engine.
Unpack

https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.0.0.zip

Start

d:/baiduYun/java/elasticsearch-6.0.0/bin/elasticsearch.bat

Kibana

Lets you visualize data in Elasticsearch with charts and graphs.
Unpack

https://artifacts.elastic.co/downloads/kibana/kibana-6.0.0-windows-x86_64.zip

Start

d:/baiduYun/java/kibana-6.0.0-windows-x86_64/bin/kibana.bat

Logstash

A server-side data-processing pipeline that ingests data from multiple sources simultaneously, transforms it, and sends it on to a "stash" such as Elasticsearch.
Unpack

https://artifacts.elastic.co/downloads/logstash/logstash-6.0.0.zip

Create a logstash.conf configuration file with the following content:

input {
  tcp {
    port => 4560
    host => "localhost"
    # Events arriving from LogstashEncoder are JSON lines; parse them into fields.
    codec => json_lines
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

Start

d:/baiduYun/java/logstash-6.0.0/bin/logstash.bat -f d:\baiduYun\java\logstash-6.0.0\bin\logstash.conf

Browsing the logs

Open http://localhost:5601 to reach the Kibana UI. Define an index pattern (e.g. logstash-*) under Management > Index Patterns first; after that, enter search criteria on the Discover page to find the entries you are looking for.
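
As a quick sanity check without Kibana, you can also query Elasticsearch's REST search API directly. A minimal sketch, assuming the default index naming of the elasticsearch output (daily logstash-* indices) and the level field written by LogstashEncoder; the query string is only an example:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class LogSearch {

    public static void main(String[] args) throws Exception {
        // URI search: up to five events whose level field is ERROR.
        URL url = new URL("http://localhost:9200/logstash-*/_search?q=level:ERROR&size=5");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);  // raw JSON response from Elasticsearch
            }
        }
    }
}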

Appendix:

Spring Boot logging documentation

https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-logging.html
defaults.xml

<?xml version="1.0" encoding="UTF-8"?>

<!--
Default logback configuration provided for import, equivalent to the programmatic
initialization performed by Boot
-->

<included>
    <conversionRule conversionWord="clr" converterClass="org.springframework.boot.logging.logback.ColorConverter" />
    <conversionRule conversionWord="wex" converterClass="org.springframework.boot.logging.logback.WhitespaceThrowableProxyConverter" />
    <conversionRule conversionWord="wEx" converterClass="org.springframework.boot.logging.logback.ExtendedWhitespaceThrowableProxyConverter" />
    <property name="CONSOLE_LOG_PATTERN" value="${CONSOLE_LOG_PATTERN:-%clr(%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}}"/>
    <property name="FILE_LOG_PATTERN" value="${FILE_LOG_PATTERN:-%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}} ${LOG_LEVEL_PATTERN:-%5p} ${PID:- } --- [%t] %-40.40logger{39} : %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}}"/>

    <logger name="org.apache.catalina.startup.DigesterFactory" level="ERROR"/>
    <logger name="org.apache.catalina.util.LifecycleBase" level="ERROR"/>
    <logger name="org.apache.coyote.http11.Http11NioProtocol" level="WARN"/>
    <logger name="org.apache.sshd.common.util.SecurityUtils" level="WARN"/>
    <logger name="org.apache.tomcat.util.net.NioSelectorPool" level="WARN"/>
    <logger name="org.eclipse.jetty.util.component.AbstractLifeCycle" level="ERROR"/>
    <logger name="org.hibernate.validator.internal.util.Version" level="WARN"/>
</included>

console-appender.xml

<?xml version="1.0" encoding="UTF-8"?>

<!--
Console appender logback configuration provided for import, equivalent to the programmatic
initialization performed by Boot
-->

<included>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>${CONSOLE_LOG_PATTERN}</pattern>
        </encoder>
    </appender>
</included>

file-appender.xml

<?xml version="1.0" encoding="UTF-8"?>

<!--
File appender logback configuration provided for import, equivalent to the programmatic
initialization performed by Boot
-->

<included>
    <appender name="FILE"
            class="ch.qos.logback.core.rolling.RollingFileAppender">
        <encoder>
            <pattern>${FILE_LOG_PATTERN}</pattern>
        </encoder>
        <file>${LOG_FILE}</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
            <fileNamePattern>${LOG_FILE}.%d{yyyy-MM-dd}.%i.gz</fileNamePattern>
            <maxFileSize>${LOG_FILE_MAX_SIZE:-10MB}</maxFileSize>
            <maxHistory>${LOG_FILE_MAX_HISTORY:-0}</maxHistory>
        </rollingPolicy>
    </appender>
</included>
