
Fixing the Snappy error in HBase

2016-06-10 20:22 · 369 views

Error description

Recently, running
hbase shell
produced the following error:

ERROR: org.apache.hadoop.hbase.DoNotRetryIOException: Compression algorithm 'snappy' previously failed test.


Official explanation

Per the "Install Snappy Support" section of the official documentation:

HBase does not ship with Snappy support because of licensing issues. You can install Snappy binaries (for instance, by using yum install snappy on CentOS) or build Snappy from source. After installing Snappy, search for the shared library, which will be called libsnappy.so.X where X is a number. If you built from source, copy the shared library to a known location on your system, such as /opt/snappy/lib/.

In addition to the Snappy library, HBase also needs access to the Hadoop shared library, which will be called something like libhadoop.so.X.Y, where X and Y are both numbers. Make note of the location of the Hadoop library, or copy it to the same location as the Snappy library.

Each of these library locations need to be added to the environment variable HBASE_LIBRARY_PATH for the operating system user that runs HBase. You need to restart the RegionServer for the changes to take effect.

In short: HBase does not ship the Snappy library because of licensing issues, so the user must manually set the HBASE_LIBRARY_PATH environment variable to point at the Snappy library's location.
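Before setting the variable, it helps to confirm where the native libraries actually live on the host. A quick sketch (the paths and the $HADOOP_HOME fallback are assumptions; adjust them to your system):

```shell
# Look for the Snappy shared library in the dynamic linker cache
# (may print nothing if Snappy is not installed system-wide).
ldconfig -p 2>/dev/null | grep -i libsnappy || true

# Look for the Hadoop native library under a Hadoop install;
# /usr/lib/hadoop is only a common default, not guaranteed.
find "${HADOOP_HOME:-/usr/lib/hadoop}" -name 'libhadoop.so*' 2>/dev/null || true
```

Whatever directories these turn up are the ones HBASE_LIBRARY_PATH needs to cover.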

Solution

Test HBase's Snappy support (with HBASE_LIBRARY_PATH still unset):

hbase org.apache.hadoop.hbase.util.CompressionTest file:///home/asin/Temp/test.txt snappy


This runs HBase's compression smoke test against the local file
/home/asin/Temp/test.txt
with the snappy codec; it fails as follows:

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:303)
at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:90)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:849)
at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:124)
at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:116)
at org.apache.hadoop.hbase.io.hfile.HFileWriterV3.<init>(HFileWriterV3.java:67)
at org.apache.hadoop.hbase.io.hfile.HFileWriterV3$WriterFactoryV3.createWriter(HFileWriterV3.java:59)
at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:298)
at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:124)
at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:160)
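The UnsatisfiedLinkError above means Hadoop's native code (and with it Snappy) could not be loaded. Hadoop ships a checknative tool that reports exactly which native codecs are available, so running it on the same host is a quick sanity check (guarded here in case hadoop is not on the PATH):

```shell
# Print which native libraries Hadoop can load (hadoop, zlib, snappy, ...).
# 'hadoop checknative -a' is a standard Hadoop CLI subcommand.
if command -v hadoop >/dev/null 2>&1; then
    hadoop checknative -a
else
    echo "hadoop not on PATH; run this on the cluster host"
fi
```

If the snappy line reports false, the library path fix below is needed.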


Set the environment variable:

export HBASE_LIBRARY_PATH=/home/asin/SoftWare/hbase-1.1.4/lib/Linux-amd64-64


Note: the Linux-amd64-64 directory contains
libsnappy.so
and related files; it was taken from a CDH distribution.
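An export in an interactive shell only lasts for that session. To make the setting survive restarts, it can be appended to HBase's conf/hbase-env.sh instead (a sketch, reusing this post's example install path):

```shell
# conf/hbase-env.sh -- read by the scripts that launch HBase daemons.
# The path below is this post's example layout; substitute your own.
export HBASE_LIBRARY_PATH=/home/asin/SoftWare/hbase-1.1.4/lib/Linux-amd64-64
```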

Restart the HBase server and run the test again (with the environment variable set):

SUCCESS


Note: placing the Linux-amd64-64 directory under the HBase installation's lib directory has the same effect as setting the environment variable.
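That alternative is a single copy; a guarded sketch, where the source path is a placeholder for wherever the Linux-amd64-64 directory was obtained (e.g. a CDH install) and the HBase path is this post's example:

```shell
# Sketch: copy the native-library directory into HBase's lib/.
# Both paths are assumptions; adjust them for your system.
NATIVE_DIR=/path/to/Linux-amd64-64          # placeholder source directory
HBASE_HOME=/home/asin/SoftWare/hbase-1.1.4  # example install root

if [ -d "$NATIVE_DIR" ] && [ -d "$HBASE_HOME/lib" ]; then
    cp -r "$NATIVE_DIR" "$HBASE_HOME/lib/"
else
    echo "adjust NATIVE_DIR and HBASE_HOME for your system"
fi
```

As with the environment-variable route, the RegionServer must be restarted afterwards for the change to take effect.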

References

Hadoop/HBase: the ultimate tutorial for configuring and installing Snappy

Adding Snappy compression to an HBase table

Resolving the "compression test fail" problem when using Snappy compression with HBase
Tags: hbase, snappy, compression