Writing from Flume to HDFS
2016-06-01 17:09
Apache Flume is a service for collecting log data. You can capture events in Flume and store them in HDFS for analysis. For a conceptual description of Flume, see the Flume User Guide. This example is a quick walkthrough to get Flume up and running.
Flume Out of the Box
To use Flume in a fresh QuickStart VM:
Import a new VM instance.
Configure the new VM.
Allocate a minimum of 10023 MB of memory.
Allocate 2 CPUs.
Allocate 20 MB of video memory.
Consider setting the clipboard to bidirectional.
Start the VM.
Launch Cloudera Manager.
In the browser, click the Cloudera Manager link.
Start Hue.
Start Flume.
Use Telnet to test the default Flume implementation.
Open a terminal window.
Install Telnet with the command sudo yum install telnet.
Launch Telnet with the command telnet localhost 10001.
At the prompt, enter Hello world!.
Open /var/log/flume-ng/flume-cmf-flume-AGENT-quickstart.cloudera.log.
Scroll to the bottom of the log, which should have an entry similar to the following.
2015-06-05 15:45:55,561 INFO org.apache.flume.sink.LoggerSink: Event: { headers:{} body: 48 65 6C 6C 6F 20 77 6F 72 6C 64 21 0D Hello world!. }
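Out of the box, the QuickStart agent wires a source listening on port 10001 to a logger sink, which is why the message appears in the Flume agent log. The default wiring can be sketched roughly as follows (the agent name tier1 matches the HDFS configuration used later in this example; the netcat source type and the memory channel settings are assumptions, not copied from the VM):

```properties
# Component names for the agent ("tier1" is the agent name used later in this example)
tier1.sources  = source1
tier1.channels = channel1
tier1.sinks    = sink1

# Source listening on the port used by the telnet test (netcat type is an assumption)
tier1.sources.source1.type = netcat
tier1.sources.source1.bind = localhost
tier1.sources.source1.port = 10001
tier1.sources.source1.channels = channel1

# In-memory channel (capacity value is illustrative)
tier1.channels.channel1.type = memory
tier1.channels.channel1.capacity = 1000

# Logger sink writes each event to the Flume agent log
tier1.sinks.sink1.type = logger
tier1.sinks.sink1.channel = channel1
```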
Writing from Flume to HDFS
You can configure Flume to write incoming messages to data files stored in HDFS for later processing.
To configure Flume to write to HDFS:
In the VM web browser, open Hue.
Click File Browser.
Create the /flume/events directory.
In the /user/cloudera directory, click New->Directory.
Create a directory named flume.
In the flume directory, create a directory named events.
Check the box to the left of the events directory, then click the Permissions setting.
Enable Write access for Group and Other users.
Click Submit.
Change the Flume configuration.
Open Cloudera Manager in your web browser.
In the list of services, click Flume.
Click the Configuration tab.
Scroll or search for the Configuration File item.
Append the following lines to the Configuration File settings.
tier1.sinks.sink1.type = HDFS
tier1.sinks.sink1.fileType = DataStream
tier1.sinks.sink1.channel = channel1
tier1.sinks.sink1.hdfs.path = hdfs://localhost:8020/user/cloudera/flume/events
At the top of the settings list, click Save Changes.
On the far right, choose Actions->Restart to restart Flume.
When the restart is complete, click Close.
Click the Home tab. If necessary, start the Yarn service.
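With the appended lines in place, the sink section of the agent configuration replaces the default logger sink with an HDFS sink. The full agent definition then looks roughly like this (only the four sink lines come from this example; the source and channel entries are assumptions about the QuickStart defaults):

```properties
tier1.sources  = source1
tier1.channels = channel1
tier1.sinks    = sink1

# Source and channel as in the default QuickStart configuration (assumed)
tier1.sources.source1.type = netcat
tier1.sources.source1.bind = localhost
tier1.sources.source1.port = 10001
tier1.sources.source1.channels = channel1
tier1.channels.channel1.type = memory

# HDFS sink lines appended in the step above
tier1.sinks.sink1.type = HDFS
tier1.sinks.sink1.fileType = DataStream
tier1.sinks.sink1.channel = channel1
tier1.sinks.sink1.hdfs.path = hdfs://localhost:8020/user/cloudera/flume/events
```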
In a terminal window, launch Telnet with the command telnet localhost 10001.
At the prompt, enter Hello HDFS!.
In the Hue File Browser, open the /user/cloudera/flume/events directory.
There will be a file named FlumeData with a serial number as the file extension. Click the file name link to view the data sent by Flume to HDFS. The output is similar to the following.
0000000: 53 45 51 06 21 6f 72 67 2e 61 70 61 63 68 65 2e SEQ.!org.apache.
0000010: 68 61 64 6f 6f 70 2e 69 6f 2e 4c 6f 6e 67 57 72 hadoop.io.LongWr
0000020: 69 74 61 62 6c 65 22 6f 72 67 2e 61 70 61 63 68 itable"org.apach
0000030: 65 2e 68 61 64 6f 6f 70 2e 69 6f 2e 42 79 74 65 e.hadoop.io.Byte
0000040: 73 57 72 69 74 61 62 6c 65 00 00 00 00 00 00 85 sWritable.......
0000050: a6 6f 46 0c f4 16 33 a6 eb 43 c2 21 5c 1b 4f 00 .oF...3..C.!\.O.
0000060: 00 00 18 00 00 00 08 00 00 01 4d c6 1b 01 1f 00 ..........M.....
0000070: 00 00 0c 48 65 6c 6c 6f 20 48 44 46 53 21 0d ...Hello HDFS!.
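The dump shows a Hadoop SequenceFile: the SEQ magic number, followed by the key class (org.apache.hadoop.io.LongWritable) and value class (org.apache.hadoop.io.BytesWritable), with the event body at the end. The body bytes decode directly back to the text sent over Telnet, as this small Python sketch shows (the hex string is copied from the last line of the dump):

```python
# Event body bytes, copied from the tail of the hex dump above
body = bytes.fromhex("48 65 6c 6c 6f 20 48 44 46 53 21 0d")

# The trailing 0x0d is the carriage return from the Telnet line ending
text = body.decode("ascii")
print(repr(text))  # 'Hello HDFS!\r'
```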
Sentiment Analysis of Input from Flume
Now that Flume is sending data to HDFS, you can apply the Sentiment Analysis example to comments you enter.
All of the source for this example is provided in flumeToHDFS.tar.gz, which contains:
flume.config
makefile
Map.java
MrManager.java
Reduce.java
neg-words.txt
pos-words.txt
stop-words.txt
/shakespeare
comedies
histories
poems
tragedies
To test sentiment analysis with Flume input:
Expand flumeToHDFS.tar.gz on the VM.
In a terminal window, navigate to the /flume2hdfs directory.
Launch Telnet with the command telnet localhost 10001.
Enter the following lines, hitting Enter after each line. (Telnet returns the response OK to each line.)
I enjoy using CDH.
I think CDH is wonderful.
I like the power and flexibility of CDH.
I dislike brussels sprouts.
I hate mustard greens.
Flume is a great product.
I have several use cases in mind for which it is well suited.
Enter make run_flume to start the Sentiment Analysis example via the makefile.
The application returns results from all counters, ending with the custom counters and report.
org.myorg.Map$Gauge
NEGATIVE=2
POSITIVE=6
**********
Sentiment score = (6.0 - 2.0) / (6.0 + 2.0)
Sentiment score = 0.5
Positivity score = 6.0/(6.0+2.0)
Positivity score = 75%
**********
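The score in the report is a standard normalized sentiment measure: the difference between the positive and negative word counts divided by their total. The arithmetic can be reproduced with a short Python sketch (the counts 6 and 2 are the POSITIVE and NEGATIVE counter values from the run above):

```python
positive = 6.0  # POSITIVE counter from the run above
negative = 2.0  # NEGATIVE counter from the run above

# Sentiment: (pos - neg) / (pos + neg), ranges from -1.0 to 1.0
sentiment = (positive - negative) / (positive + negative)

# Positivity: fraction of matched words that were positive
positivity = positive / (positive + negative)

print(f"Sentiment score = {sentiment}")        # Sentiment score = 0.5
print(f"Positivity score = {positivity:.0%}")  # Positivity score = 75%
```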
Page generated October 23, 2015.