
Flume: Watching a Directory for New Files and Uploading Them to HDFS

2017-05-08 16:21
Create the agent configuration (the file name here must match the one passed to flume-ng below):

vim /usr/local/flume/conf/exec-hdfs.conf

a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /opt/data/logs
a1.sources.r1.fileHeader = true

# Describe the sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://master:9000/input/flume/%y/%m
a1.sinks.k1.hdfs.filePrefix = events-
a1.sinks.k1.hdfs.round = true
a1.sinks.k1.hdfs.roundValue = 40
a1.sinks.k1.hdfs.roundUnit = second
a1.sinks.k1.hdfs.useLocalTimeStamp = true

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
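The %y/%m escapes in hdfs.path are filled in from each event's timestamp. The spooldir source does not add a timestamp header itself, which is why useLocalTimeStamp = true is set: it makes the HDFS sink stamp events with the local clock. A quick sketch of how the path expands (the printf line is only an illustration, not part of Flume):

```shell
# With useLocalTimeStamp = true, an event written in May 2017 lands under
# hdfs://master:9000/input/flume/17/05/. By default the sink writes an
# open file with a .tmp suffix and drops the suffix when the file rolls.
# The same %y/%m expansion can be previewed with date(1):
printf 'hdfs://master:9000/input/flume/%s/%s\n' "$(date +%y)" "$(date +%m)"
```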

# Start the agent
flume-ng agent --conf /usr/local/flume/conf --conf-file /usr/local/flume/conf/exec-hdfs.conf --name a1 -Dflume.root.logger=INFO,console
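A simple way to smoke-test the pipeline is to drop a file into the watched directory. The sketch below assumes the agent above is running and HDFS is reachable at master:9000; the file name is illustrative:

```shell
# Drop a test event into the spool directory watched by the r1 source.
SPOOL_DIR="${SPOOL_DIR:-/opt/data/logs}"
mkdir -p "$SPOOL_DIR"

# Spooldir rules: each file name must be unique, and a file must not be
# modified after it is placed in the directory.
FILE="$SPOOL_DIR/events-$(date +%s).log"
echo "hello flume" > "$FILE"

# Once ingested, Flume renames the file to *.COMPLETED; the event should
# then be visible under the date-bucketed HDFS path:
#   hdfs dfs -ls /input/flume/$(date +%y)/$(date +%m)
```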