
A Simple Installation of the Flume Log Collection System on CentOS 7

2017-04-13 13:49
First, set JAVA_HOME:

JAVA_HOME=/opt/java

PATH=$PATH:$JAVA_HOME/bin

export JAVA_HOME PATH
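The three lines above only affect the current shell. A minimal sketch of setting and checking the variables (assuming the JDK was unpacked to /opt/java; adjust the path to your actual install):

```shell
# Set the variables for this shell session; /opt/java is an
# assumed install path -- point it at your actual JDK.
export JAVA_HOME=/opt/java
export PATH="$PATH:$JAVA_HOME/bin"
echo "$JAVA_HOME"
```

To make them permanent, the same two export lines can go in /etc/profile.d/java.sh (system-wide) or ~/.bash_profile.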

Download the latest release from http://flume.apache.org/download.html

tar zxvf apache-flume-1.*.0-bin.tar.gz

mv apache-flume-1.*.0-bin flume

cd flume

cp conf/flume-conf.properties.template conf/flume-conf.properties

First, test log collection with telnet:

vi conf/flume-conf.properties

# The configuration file needs to define the sources,

# the channels and the sinks.

# Sources, channels and sinks are defined per agent,

# in this case called 'agent'

agent.sources = r1

agent.channels = c1

agent.sinks = s1

# For each one of the sources, the type is defined

agent.sources.r1.type = netcat

agent.sources.r1.bind = localhost

agent.sources.r1.port = 8888
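Two optional netcat-source settings are worth knowing (the values shown are the documented defaults): the source acknowledges each received line with "OK", and lines over the length limit are rejected.

```properties
# Optional netcat source tuning (defaults shown)
agent.sources.r1.ack-every-event = true
agent.sources.r1.max-line-length = 512
```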

# Alternatively, to collect a local log file, swap the netcat source
# above for an exec source (left commented out here because the
# telnet test below depends on netcat):

#agent.sources.r1.type = exec

#agent.sources.r1.command = tail -F /etc/httpd/logs/access_log

#agent.sources.r1.batchSize = 20

# The channel can be defined as follows.

agent.sources.r1.channels = c1

# Each sink's type must be defined

agent.sinks.s1.type = file_roll

agent.sinks.s1.sink.directory = /tmp/log/flume

# For a Kafka sink instead, swap the file_roll sink above for a
# KafkaSink (left commented out here because the telnet test below
# checks /tmp/log/flume):

#agent.sinks.s1.type = org.apache.flume.sink.kafka.KafkaSink

# Kafka broker address and port
#agent.sinks.s1.brokerList = 192.168.6.202:9092

# Kafka topic
#agent.sinks.s1.topic = test

# Serialization class (old Kafka producer API)
#agent.sinks.s1.serializer.class = kafka.serializer.StringEncoder
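Note that Flume 1.7 and later renamed the Kafka sink's properties; on those releases the equivalent settings are (broker address and topic carried over from above):

```properties
# Flume 1.7+ property names for the same Kafka sink
agent.sinks.s1.type = org.apache.flume.sink.kafka.KafkaSink
agent.sinks.s1.kafka.bootstrap.servers = 192.168.6.202:9092
agent.sinks.s1.kafka.topic = test
```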

#Specify the channel the sink should use

agent.sinks.s1.channel = c1

# Each channel's type is defined.

agent.channels.c1.type = memory

# Other config values specific to each type of channel (sink or source)

# can be defined as well

# In this case, it specifies the capacity of the memory channel

agent.channels.c1.capacity = 100
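Two memory-channel knobs often need raising together (the values below are illustrative, not recommendations): transactionCapacity caps how many events move per transaction and must not exceed capacity, and a source's batchSize must not exceed transactionCapacity.

```properties
# Illustrative memory channel sizing
agent.channels.c1.capacity = 1000
agent.channels.c1.transactionCapacity = 100
```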

Once the configuration is in place:

mkdir -p /tmp/log/flume

bin/flume-ng agent --conf conf -f conf/flume-conf.properties -n agent &
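If the agent misbehaves, it helps to run it in the foreground with logs printed to the console instead of backgrounding it with &:

```shell
# Debug run: log to the console at INFO level
bin/flume-ng agent --conf conf -f conf/flume-conf.properties \
    -n agent -Dflume.root.logger=INFO,console
```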

telnet localhost 8888

Type:

hello world!

hello Flume!

The lines you typed will then appear in the files under /tmp/log/flume.
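If telnet is not installed, the same two test lines can be sent with nc (on CentOS 7 it is provided by the nmap-ncat package):

```shell
# Send the test events without telnet
printf 'hello world!\nhello Flume!\n' | nc localhost 8888
```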
Tags: Flume