
An Example of Writing Data to HDFS with Flume

2013-01-09 00:00

Goal:

Use Flume to poll a folder on the local file system and write its contents to HDFS.

Version Information:

hadoop-0.22.0

apache-flume-1.3.1

Flume Configuration:

Edit flume-env.sh under $FLUME_HOME/conf:

export JAVA_HOME=your Java home
export FLUME_CLASSPATH=your flume home
export HADOOP_CLASSPATH=your hadoop home
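With concrete paths filled in, flume-env.sh might look like the sketch below. All three locations are placeholders, not values from this setup; substitute your own install directories:

```shell
# Placeholder paths -- substitute your own installation locations.
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk
export FLUME_CLASSPATH=/usr/local/apache-flume-1.3.1-bin
export HADOOP_CLASSPATH=/usr/local/hadoop-0.22.0
```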

Edit flume-conf.properties under $FLUME_HOME/conf:

# Configure the agent
agent.sources = spooldirSource
agent.channels = memoryChannel
agent.sinks = hdfsSink

# Configure the source
agent.sources.spooldirSource.type = spooldir
agent.sources.spooldirSource.spoolDir = /tmp/flume/
agent.sources.spooldirSource.channels = memoryChannel

# Configure the sink
agent.sinks.hdfsSink.type = hdfs
agent.sinks.hdfsSink.hdfs.path = hdfs://masternode:9000/flume/events
agent.sinks.hdfsSink.hdfs.filePrefix = events-
agent.sinks.hdfsSink.channel = memoryChannel

# Configure the channel
agent.channels.memoryChannel.type = memory
agent.channels.memoryChannel.capacity = 100
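The sink configuration above relies on the HDFS sink's defaults, so Flume will write SequenceFiles and roll to a new file frequently. If you want plain-text output or larger files, the sink accepts optional tuning keys; the values below are illustrative, not part of the original setup:

```
# Optional HDFS sink tuning (illustrative values)
agent.sinks.hdfsSink.hdfs.fileType = DataStream    # plain text instead of SequenceFile
agent.sinks.hdfsSink.hdfs.writeFormat = Text
agent.sinks.hdfsSink.hdfs.rollInterval = 300       # roll every 5 minutes
agent.sinks.hdfsSink.hdfs.rollSize = 0             # disable size-based rolling
agent.sinks.hdfsSink.hdfs.rollCount = 0            # disable event-count rolling
```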


Copy Hadoop Jars to Flume lib directory:

Copy hadoop-hdfs-0.22.0.jar and hadoop-common-0.22.0.jar to $FLUME_HOME/lib.

Start Flume Agent:

./bin/flume-ng agent -n agent -c conf -f conf/flume-conf.properties

Write File:

echo "Hello World" >> /tmp/flume/test
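One caveat with the spooldir source: it expects files to be complete and immutable once they appear in the watched directory, and it will raise an error if a file it is reading keeps changing. A safer pattern (paths assumed from the configuration above) is to write the file elsewhere and move it into the spool directory once it is finished:

```shell
SPOOL_DIR=/tmp/flume             # spoolDir from the configuration above
mkdir -p "$SPOOL_DIR"
# Write the file outside the spool directory first...
echo "Hello World" > /tmp/events.tmp
# ...then move it in atomically once it is complete, so Flume
# never sees a half-written file.
mv /tmp/events.tmp "$SPOOL_DIR/events-demo.log"
```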


View Logs:

Flume's runtime logs are written under $FLUME_HOME/logs.
Tags: Flume Hadoop