
Nginx Log Rotation and Nginx Log Analysis Scripts (Very Practical)

2012-11-12 17:06



Nginx log rotation script:

vi /root/cutlog.sh
#!/bin/bash

I=`ps aux | grep nginx | grep root | grep -v 'grep nginx' | awk '{print $14}'` # find the nginx master process (its binary path)

if [ "$I" == "/usr/local/nginx/sbin/nginx" ];then

ACCLOG=`cat /usr/local/nginx/conf/nginx.conf | grep ' access_log' | awk '{print $2}'` # if nginx is running, read the access_log path from the config file

ERRLOG=`cat /usr/local/nginx/conf/nginx.conf | grep ^error | awk '{print $2}' | cut -d";" -f1` # path of the error log

ls $ACCLOG # check that the access log exists

if [ $? -eq 0 ];then # if it does

mv $ACCLOG $ACCLOG.`date -d "-1 day" +%F` # rename the current logs with yesterday's date

mv $ERRLOG $ERRLOG.`date -d "-1 day" +%F`

touch $ACCLOG # create fresh, empty logs

touch $ERRLOG

chown nginx:root $ACCLOG # restore ownership so nginx can write to them

chown nginx:root $ERRLOG

[ -f /usr/local/nginx/logs/nginx.pid ] && kill -USR1 `cat /usr/local/nginx/logs/nginx.pid` # if the pid file exists, send USR1 so nginx reopens its log files and writes new entries to the freshly created logs

/mnt/logs/checklog.sh $ACCLOG.`date -d "-1 day" +%F` # run the log analysis script (see below)

gzip $ACCLOG.`date -d "-1 day" +%F` # compress the rotated logs

gzip $ERRLOG.`date -d "-1 day" +%F`
mv $ACCLOG.`date -d "-10 day" +%F`.* /mnt/history.nginx.log/ # move logs older than 10 days to an archive directory (change this to rm if you would rather delete them)

mv $ERRLOG.`date -d "-10 day" +%F`.* /mnt/history.nginx.log/

fi

fi
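The two grep commands above read the log paths straight from nginx.conf, so they assume the access_log and error_log directives sit on their own lines, e.g. access_log /usr/local/nginx/logs/access.log main; and error_log /usr/local/nginx/logs/error.log; (these paths are only illustrative). To rotate the logs once a day, schedule the script with cron shortly after midnight. A minimal sketch, assuming the script was saved as /root/cutlog.sh as above (the exact time is just an example):

chmod +x /root/cutlog.sh

crontab -e, then add a line such as:

# run the rotation script every day at 00:01
1 0 * * * /bin/bash /root/cutlog.sh >/dev/null 2>&1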

Nginx log analysis script:
vi /mnt/logs/checklog.sh
#!/bin/bash

echo -e "####################`date +%F`" >> /mnt/logs/400.txt

echo -e "####################`date +%F`" >> /mnt/logs/URL.txt

echo -e "####################`date +%F`" >> /mnt/logs/IP.txt

cat $1 | wc -l >> /mnt/logs/IP.txt # total number of requests (written into the IP report)

cat $1 | awk -F'"' '{print $3}' | awk '{print $1}' | sort | uniq -c | sort -rn > /mnt/logs/CODE.txt # tally HTTP status codes

cat $1 | awk '{print $1}' | sort | uniq -c | sort -rn | head -n20 >> /mnt/logs/IP.txt # top 20 client IPs

N=`cat /mnt/logs/CODE.txt | wc -l` # number of distinct status codes

for I in $(seq 1 $N)

do

M=`head -n$I /mnt/logs/CODE.txt | tail -n1 | awk '{print $2}'` # status code on line $I of CODE.txt

if [ "$M" -ge 400 ]

then
echo "#####FIND $M###############">>/mnt/logs/400.txt #分析错误请求

cat $1 | grep "\" $M " | grep -v ' "-" "-" - ' | sort | awk '{print $1 $2 $3 $6 $7 $8 $9 $10 $11 $12 $13 $14 $15 $16 $17 $18 $19 $20 $21}' | sort | uniq -c | sort -rn | head -n5 >> /mnt/logs/400.txt

fi

done

cat $1 | grep -v ' "-" "-" - ' | awk -F'T' '{print $2}' | awk -F'?' '{print $1}' | sort | awk '{print $1}' | sed 's/\(\/review\/file\/download\/\).*/\1/g' | sort | uniq -c | sort -rn | head -n20 >> /mnt/logs/URL.txt # top 20 requested URL paths (query strings stripped, /review/file/download/... collapsed into one entry)
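The rotation script above already calls checklog.sh on the freshly rotated access log, but it can also be run by hand against any access log. A minimal usage sketch, with a purely illustrative log path:

chmod +x /mnt/logs/checklog.sh

/mnt/logs/checklog.sh /usr/local/nginx/logs/access.log.2012-11-11

The results end up in:

/mnt/logs/IP.txt - total request count plus the top 20 client IPs
/mnt/logs/CODE.txt - counts of each HTTP status code (overwritten on every run)
/mnt/logs/400.txt - top 5 request lines for every status code >= 400
/mnt/logs/URL.txt - top 20 requested URL paths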