Hadoop in Action: Flume Sink Failover (Part 16)
2017-05-16 22:52
The failover sink processor sends all events to the highest-priority sink in the group (here k2, priority 10); only if that sink fails do events fail over to the next-highest (k1). maxpenalty caps the back-off, in milliseconds, applied to a failed sink before it is retried.

a1.sources = r1
a1.sinks = k1 k2
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Describe the sinks
a1.sinks.k1.type = file_roll
a1.sinks.k1.sink.directory = /home/chenyun/data/flume/file_sinke1
a1.sinks.k2.type = file_roll
a1.sinks.k2.sink.directory = /home/chenyun/data/flume/file_sinke2

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sinks to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
a1.sinks.k2.channel = c1

# Failover sink group: the sink with the higher priority does the work
a1.sinkgroups = g1
a1.sinkgroups.g1.sinks = k1 k2
a1.sinkgroups.g1.processor.type = failover
a1.sinkgroups.g1.processor.priority.k1 = 5
a1.sinkgroups.g1.processor.priority.k2 = 10
a1.sinkgroups.g1.processor.maxpenalty = 10000
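To try the configuration out, it can be saved to a file and the agent started with flume-ng. A minimal sketch, assuming the filename failover.conf, a working Flume installation, and nc available for the netcat source (all assumptions, not from the original post):

```shell
# Write the agent configuration above to a file (filename is an assumption).
cat > failover.conf <<'EOF'
a1.sources = r1
a1.sinks = k1 k2
a1.channels = c1
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444
a1.sinks.k1.type = file_roll
a1.sinks.k1.sink.directory = /home/chenyun/data/flume/file_sinke1
a1.sinks.k2.type = file_roll
a1.sinks.k2.sink.directory = /home/chenyun/data/flume/file_sinke2
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
a1.sinks.k2.channel = c1
a1.sinkgroups = g1
a1.sinkgroups.g1.sinks = k1 k2
a1.sinkgroups.g1.processor.type = failover
a1.sinkgroups.g1.processor.priority.k1 = 5
a1.sinkgroups.g1.processor.priority.k2 = 10
a1.sinkgroups.g1.processor.maxpenalty = 10000
EOF

# Launch the agent (requires Flume on the machine; shown for reference):
#   bin/flume-ng agent --conf conf --conf-file failover.conf --name a1 \
#       -Dflume.root.logger=INFO,console
#
# In another terminal, feed events to the netcat source:
#   echo "hello failover" | nc localhost 44444
#
# With k2 at priority 10 and k1 at 5, rolled files should appear under
# file_sinke2; if k2 fails, new events fail over to k1 after the
# maxpenalty back-off elapses.
```

Note that the --name argument must match the agent-name prefix used in the config file (a1 here), or Flume will start an agent with no components.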