Enabling LZO support in Spark
2016-04-13 09:12
Add the following to spark-env.sh:
export SPARK_LIBRARY_PATH=$SPARK_LIBRARY_PATH:/usr/hdp/current/share/lzo/0.6.0/lib/native/Linux-amd64-64/*
export SPARK_CLASSPATH=$SPARK_CLASSPATH:/usr/hdp/2.2.8.0-3150/spark/lib/hadoop-lzo-0.4.20-SNAPSHOT.jar
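The two exports put the hadoop-lzo jar on Spark's classpath and the LZO native libraries on its library path. Note that `SPARK_CLASSPATH` and `SPARK_LIBRARY_PATH` were deprecated in later Spark releases in favor of configuration properties; a hedged equivalent via spark-defaults.conf, reusing the same HDP 2.2.8.0-3150 paths from above (adjust to your installation), would be:

```
# spark-defaults.conf — sketch equivalent to the spark-env.sh exports above;
# jar and native-library paths are this post's HDP layout, not universal defaults
spark.driver.extraClassPath      /usr/hdp/2.2.8.0-3150/spark/lib/hadoop-lzo-0.4.20-SNAPSHOT.jar
spark.executor.extraClassPath    /usr/hdp/2.2.8.0-3150/spark/lib/hadoop-lzo-0.4.20-SNAPSHOT.jar
spark.driver.extraLibraryPath    /usr/hdp/current/share/lzo/0.6.0/lib/native/Linux-amd64-64
spark.executor.extraLibraryPath  /usr/hdp/current/share/lzo/0.6.0/lib/native/Linux-amd64-64
```

With this in place (and `com.hadoop.compression.lzo.LzoCodec` registered in Hadoop's `io.compression.codecs`), reading `.lzo` files with `sc.textFile` should decompress transparently after a restart.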