Flume Installation and Testing
2015-12-30 14:47
1. Download the latest binary Flume release from the official site: apache-flume-1.6.0-bin.tar.gz
2. Extract the archive into the target directory /home/hadoop/: tar -zxvf apache-flume-1.6.0-bin.tar.gz
3. In the Flume installation directory, run the following command to verify the installation: bin/flume-ng version
If the command prints the Flume version information (e.g. "Flume 1.6.0"), the installation succeeded.
4. Flume test cases:
Case 1: Avro
An Avro client can send a given file to Flume; the Avro source uses the Avro RPC mechanism.
a) Create the agent configuration file
vi /home/hadoop/apache-flume-1.6.0-bin/conf/avro.conf
a1.sources = r1
a1.sinks = k1
a1.channels = c1
# Describe/configure the source
a1.sources.r1.type = avro
a1.sources.r1.channels = c1
a1.sources.r1.bind = 172.26.40.74
a1.sources.r1.port = 4141
# Describe the sink
a1.sinks.k1.type = logger
# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100
# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
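The memory channel above is fast but loses any buffered events if the agent process dies. As a sketch of a durable alternative (the checkpoint and data paths here are illustrative, not part of the original setup), the same channel could be declared as a file channel:

```properties
# File channel variant: events are persisted to disk and survive agent restarts.
# checkpointDir/dataDir values below are illustrative.
a1.channels.c1.type = file
a1.channels.c1.checkpointDir = /home/hadoop/flume-checkpoint
a1.channels.c1.dataDir = /home/hadoop/flume-data
```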
b) Start Flume agent a1
bin/flume-ng agent -c . -f /home/hadoop/apache-flume-1.6.0-bin/conf/avro.conf -n a1 -Dflume.root.logger=INFO,console
c) Create a test file
echo "hello world" > /home/hadoop/apache-flume-1.6.0-bin/log.00
d) Send the file with avro-client
bin/flume-ng avro-client -c . -H 172.26.40.74 -p 4141 -F /home/hadoop/apache-flume-1.6.0-bin/log.00
e) On the agent console at 172.26.40.74, the received event is logged by the logger sink.
Case 2: Spool
The Spooling Directory source watches a configured directory for new files and reads out their contents. Two caveats:
1) A file copied into the spool directory must not be opened or edited afterwards.
2) The spool directory must not contain subdirectories.
a) Create the agent configuration file
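The first caveat above can be satisfied by writing the file somewhere else and then moving it into the spool directory in a single step; on the same filesystem, mv is an atomic rename, so the source never sees a half-written file. A minimal sketch (the paths are illustrative, not the ones from this setup):

```shell
# Write the file outside the spool directory first, then mv it in.
# mv within one filesystem is an atomic rename, so the spooldir
# source can never pick up a partially written file.
SPOOL_DIR=/tmp/spool_demo          # stand-in for the real spoolDir
mkdir -p "$SPOOL_DIR"
printf 'spool test1\n' > /tmp/spool_text.log.tmp   # write completely...
mv /tmp/spool_text.log.tmp "$SPOOL_DIR/spool_text.log"  # ...then publish
cat "$SPOOL_DIR/spool_text.log"
```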
vi /home/hadoop/apache-flume-1.6.0-bin/conf/spool.conf
a1.sources = r1
a1.sinks = k1
a1.channels = c1
# Describe/configure the source
a1.sources.r1.type = spooldir
a1.sources.r1.channels = c1
a1.sources.r1.spoolDir = /home/hadoop/apache-flume-1.6.0-bin/logs
a1.sources.r1.fileHeader = true
# Describe the sink
a1.sinks.k1.type = logger
# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100
# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
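Beyond the required spoolDir, the Spooling Directory source accepts a few tuning properties that address the caveats above; the values here are illustrative additions, not part of the original configuration:

```properties
# Optional spooldir settings (values are illustrative)
a1.sources.r1.fileSuffix = .COMPLETED    # suffix appended once a file is fully ingested
a1.sources.r1.deletePolicy = never       # "immediate" deletes files after ingestion
a1.sources.r1.ignorePattern = ^.*\.tmp$  # skip files that are still being written
```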
b) Start Flume agent a1
bin/flume-ng agent -c . -f /home/hadoop/apache-flume-1.6.0-bin/conf/spool.conf -n a1 -Dflume.root.logger=INFO,console
c) Write a file into the /home/hadoop/apache-flume-1.6.0-bin/logs directory
echo "spool test1" > /home/hadoop/apache-flume-1.6.0-bin/logs/spool_text.log
d) On the agent console, the event is logged, including a file header since fileHeader = true.