
A Python script, run daily, to delete the Elasticsearch index from 30 days ago

2018-05-10 14:18

Our company uses the ELK stack to collect logs, but resources are limited and the disk keeps filling up. So I wrote a Python script and put it in crontab to run once a day and delete the Elasticsearch index from 30 days ago.
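For example, a crontab entry along these lines would run the script every night; the interpreter path, script path, and schedule below are just placeholders for illustration, so adjust them for your own environment:

0 2 * * * /usr/bin/python /opt/scripts/delete_old_es_indices.py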

I'm not much of a programmer, so please go easy on me.

#!/usr/bin/env python
# coding: utf-8
# author: yd

import urllib2
import re
import datetime

def http_get(url_get):                                      # fetch the list of all Elasticsearch indices
    request_get = urllib2.Request(url_get)
    response_get = urllib2.urlopen(request_get).read()
    return response_get

def match(response_get):
    pattern = re.compile(r'\d+\.\d+\.\d+')
    res_match = pattern.findall(response_get)               # extract the date suffix of every index name
    date_now = datetime.date.today()                        # today's date
    days_before_30 = date_now - datetime.timedelta(days=30) # the date 30 days ago
    date_format = days_before_30.strftime('%Y.%m.%d')       # format it like the index suffix, e.g. 2018.04.23
    if date_format in res_match:
        http_delete(date_format)
    else:
        pass

def http_delete(url_delete_date):                           # delete the index from 30 days ago
    url_delete = 'http://localhost:9200/logstash-%s?pretty' % (url_delete_date)
    request_delete = urllib2.Request(url_delete)
    request_delete.get_method = lambda: 'DELETE'            # override the HTTP method so urllib2 sends DELETE
    response_delete = urllib2.urlopen(request_delete).read()
    return response_delete

if __name__ == '__main__':
    url_get = 'http://localhost:9200/_cat/indices?v'
    match(http_get(url_get))
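The script above targets Python 2 (urllib2). If the machine running cron only has Python 3, a minimal equivalent sketch could look like the following; it assumes Python 3.3+, the same logstash-YYYY.MM.DD index naming, and Elasticsearch on localhost:9200:

#!/usr/bin/env python3
# Minimal Python 3 sketch of the same logic (assumptions: Python 3.3+,
# logstash-YYYY.MM.DD index names, Elasticsearch listening on localhost:9200).
import re
import datetime
import urllib.request

ES_HOST = 'http://localhost:9200'

def main():
    # List all indices and pull out the date suffixes
    indices = urllib.request.urlopen(ES_HOST + '/_cat/indices?v').read().decode('utf-8')
    dates = re.findall(r'\d+\.\d+\.\d+', indices)
    # Work out the date exactly 30 days ago, formatted like the index suffix
    target = (datetime.date.today() - datetime.timedelta(days=30)).strftime('%Y.%m.%d')
    if target in dates:
        # Send an HTTP DELETE for that day's logstash index
        req = urllib.request.Request('%s/logstash-%s?pretty' % (ES_HOST, target), method='DELETE')
        print(urllib.request.urlopen(req).read().decode('utf-8'))

if __name__ == '__main__':
    main()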

