Errors when Flume writes files to HDFS
2016-05-05 17:26
I configured a Flume sink to write files into HDFS and hit the following errors when starting the agent.
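For context, a minimal agent definition with an HDFS sink looks roughly like this. This is a sketch, not the configuration from the original post: the agent/source/channel names, the netcat source, and the HDFS path are placeholder assumptions you would adapt to your setup.

```properties
# Hypothetical agent "a1" with a netcat source, memory channel, and HDFS sink
a1.sources = r1
a1.sinks = k1
a1.channels = c1

a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://localhost:9000/flume/events
a1.sinks.k1.hdfs.fileType = DataStream

a1.channels.c1.type = memory
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

Both errors below occur with a configuration like this, because the HDFS sink needs Hadoop's classes on Flume's classpath.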
1. Failed to start agent because dependencies were not found in classpath

```
[ERROR - org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:145)] Failed to start agent because dependencies were not found in classpath. Error follows.
java.lang.NoClassDefFoundError: org/apache/hadoop/io/SequenceFile$CompressionType
	at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:239)
	at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
	at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:413)
	at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:98)
	at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:140)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SequenceFile$CompressionType
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	... 12 more
```
This happens because the required Hadoop jars are missing from Flume's classpath. Copy the jars under /hadoop/share/hadoop/common/*.jar and /hadoop/share/hadoop/common/lib/*.jar into the lib directory of the Flume installation and this error goes away.
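The copy step can be scripted. This is a minimal sketch: `copy_hadoop_common_jars` is a hypothetical helper name, and the Hadoop/Flume install locations passed to it are assumptions you must adjust to your machine.

```shell
# Hypothetical helper: copy the Hadoop common jars (and their dependencies)
# into Flume's lib directory, so classes like SequenceFile are on the classpath.
copy_hadoop_common_jars() {
  local hadoop_home="$1" flume_home="$2"
  cp "$hadoop_home"/share/hadoop/common/*.jar "$flume_home/lib/"
  cp "$hadoop_home"/share/hadoop/common/lib/*.jar "$flume_home/lib/"
}
```

Usage (assuming Hadoop lives in /hadoop and Flume in /usr/local/flume): `copy_hadoop_common_jars /hadoop /usr/local/flume`, then restart the agent.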
2. HDFS IO error

```
[WARN - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:455)] HDFS IO error
java.io.IOException: No FileSystem for scheme: hdfs
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2385)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2392)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:243)
	at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:235)
	at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:679)
	at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
	at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:676)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
```
Even with the common jars in place, the agent still cannot resolve the hdfs:// scheme at runtime. In the Hadoop installation directory, find /share/hadoop/hdfs/hadoop-hdfs-2.4.1.jar and copy it into Flume's lib directory as well. After restarting, the agent writes to HDFS normally.
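This second fix can be scripted the same way. Again a sketch: `copy_hadoop_hdfs_jar` is a hypothetical helper, the glob accommodates versions other than 2.4.1, and the paths are assumptions.

```shell
# Hypothetical helper: copy the hadoop-hdfs jar, which provides the class
# backing the "hdfs" filesystem scheme, into Flume's lib directory.
copy_hadoop_hdfs_jar() {
  local hadoop_home="$1" flume_home="$2"
  cp "$hadoop_home"/share/hadoop/hdfs/hadoop-hdfs-*.jar "$flume_home/lib/"
}
```

Usage (same assumed paths as above): `copy_hadoop_hdfs_jar /hadoop /usr/local/flume`, then restart the agent.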