
Logstash Study 10: Configuration Demos for Receiving Data from Kafka or from a File

2017-02-13 14:51
Below are two Logstash configuration demos:

Demo1:

input {
  kafka {
    zk_connect => "10.10.16.2:2181,10.10.16.3:2181,10.10.16.4:2181"
    group_id => "test-consumer-group"
    topic_id => "MyPattern"
    reset_beginning => false   # boolean (optional), default: false
    consumer_threads => 5      # number (optional), default: 1
    decorate_events => true    # boolean (optional), default: false
  }
}

filter {
  mutate {
    split => ["message", ","]
  }
  # Map each comma-separated part of the message to a named field
  mutate {
    add_field => {
      "SRC_ADDRESS" => "%{[message][0]}"
      "DEST_ADDRESS" => "%{[message][1]}"
      "SRC_PORT" => "%{[message][2]}"
      "DEST_PORT" => "%{[message][3]}"
      "TRANS_PROTOCOL" => "%{[message][4]}"
      "PACKETS" => "%{[message][5]}"
      "BYTES" => "%{[message][6]}"
      "FLAGS" => "%{[message][7]}"
      "START_TIME" => "%{[message][8]}"
    }
  }
}
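The filter above first splits the comma-separated message, then maps each part to a named field. The following Python sketch mirrors that split + add_field behavior outside of Logstash; the sample flow record is hypothetical, and the field names are taken from the add_field block above.

```python
# Field names in the same order as the add_field block in Demo1
FIELD_NAMES = [
    "SRC_ADDRESS", "DEST_ADDRESS", "SRC_PORT", "DEST_PORT",
    "TRANS_PROTOCOL", "PACKETS", "BYTES", "FLAGS", "START_TIME",
]

def parse_flow_record(message: str) -> dict:
    """Split the comma-separated message and map the parts to named
    fields, mirroring mutate split + mutate add_field."""
    parts = message.split(",")
    return dict(zip(FIELD_NAMES, parts))

# Hypothetical sample record
event = parse_flow_record("10.0.0.1,10.0.0.2,5050,80,TCP,12,3400,SYN,2017-02-13T14:51:00")
print(event["SRC_ADDRESS"])     # 10.0.0.1
print(event["TRANS_PROTOCOL"])  # TCP
```

Note that, as in the Logstash config, every field value stays a string; numeric fields like PACKETS or BYTES would need a separate conversion (e.g. the mutate filter's convert option) before numeric queries.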


Demo2:

input {
  file {
    path => ["/home/test.csv"]
    #type => "system"
    #start_position => "beginning"
  }
}

filter {
  mutate {
    gsub => ["message", "\r", ""]
  }

  mutate {
    split => ["message", ","]
  }

  mutate {
    add_field => {
      "id" => "%{[message][0]}"
      "time" => "%{[message][1]}"
      "userId" => "%{[message][2]}"
      "pc" => "%{[message][3]}"
      "stat" => "%{[message][4]}"
    }
  }
}

output {
  stdout { codec => rubydebug }
}
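Demo2's filter chain does three things to each line read from the file: strip carriage returns (gsub), split on commas, and map the parts to named fields. A minimal Python sketch of the same chain, with a hypothetical CSV line:

```python
# Field names in the same order as the add_field block in Demo2
CSV_FIELDS = ["id", "time", "userId", "pc", "stat"]

def parse_csv_line(line: str) -> dict:
    """Mirror Demo2's filter chain on one line of the file."""
    line = line.replace("\r", "")        # mutate gsub  => ["message", "\r", ""]
    parts = line.split(",")              # mutate split => ["message", ","]
    return dict(zip(CSV_FIELDS, parts))  # mutate add_field

# Hypothetical sample line, including the Windows-style \r the gsub removes
event = parse_csv_line("1001,2017-02-13 14:51,u42,PC-07,OK\r")
print(event["id"])    # 1001
print(event["stat"])  # OK
```

The gsub step matters for files written on Windows: without it, the last field would end with a stray "\r" ("OK\r" instead of "OK").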


The first demo receives data from Kafka and the second from a file, so only the input configuration differs. (Demo1 omits an output section; an output like Demo2's stdout block can be added to inspect the events.)