Setting up a single-node CDH environment and updating its Spark installation
2018-01-21 11:25
1. Install VMware Player: http://dlsw.baidu.com/sw-search-sp/soft/90/13927/VMware_player_7.0.0_2305329.1420626349.exe
2. Enable virtualization in the BIOS: http://www.cnblogs.com/stono/p/8323516.html
3. Download the CDH QuickStart VM: https://downloads.cloudera.com/demo_vm/vmware/cloudera-quickstart-vm-5.12.0-0-vmware.zip
4. Start the CDH VM with VMware Player, giving it 8 GB of RAM and 4 CPUs; the root password is cloudera.
5. Reinstall Spark. Download command: wget http://apache.mirrors.tds.net/spark/spark-2.0.0/spark-2.0.0-bin-hadoop2.7.tgz
   The download may return a 404 at first, so retry it a few times (see the download note after the configuration commands in step 6).
6. After downloading, configure Spark:
tar xzvf spark-2.0.0-bin-hadoop2.7.tgz
cd spark-2.0.0-bin-hadoop2.7

vi /etc/profile.d/spark2.sh        # add the two lines below
export SPARK_HOME=/home/cloudera/spark-2.0.0-bin-hadoop2.7
export PATH=$PATH:/home/cloudera/spark-2.0.0-bin-hadoop2.7/bin

cp conf/spark-env.sh.template conf/spark-env.sh
cp conf/spark-defaults.conf.template conf/spark-defaults.conf

vi conf/spark-env.sh               # add the two lines below
export HADOOP_CONF_DIR=/etc/hadoop/conf
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera

cp /etc/hive/conf/hive-site.xml conf/

Finally, change the log level in conf/log4j.properties to ERROR.
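A note on the download in step 5: the mirror URL above sometimes drops older releases, which is the likely source of the 404s. The sketch below simply retries the mirror and then falls back to the Apache release archive; the archive URL is my own suggestion, not from the original post.

# resume partial downloads (-c) and retry the mirror a few times (-t 5)
wget -c -t 5 http://apache.mirrors.tds.net/spark/spark-2.0.0/spark-2.0.0-bin-hadoop2.7.tgz

# if the mirror keeps returning 404, fall back to the Apache release archive,
# which keeps old versions (alternative source, not mentioned in the original post)
wget -c https://archive.apache.org/dist/spark/spark-2.0.0/spark-2.0.0-bin-hadoop2.7.tgz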
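Once step 6 is done, a quick smoke test helps confirm that the new Spark 2.0 build picks up the CDH Hadoop and Hive configuration. The sketch below assumes the QuickStart VM paths used in step 6; the sed line is just one way to apply the log4j change mentioned above, and the spark-shell invocation is only a sanity check, so adjust it to your setup.

# load the environment variables defined in /etc/profile.d/spark2.sh
source /etc/profile.d/spark2.sh
cd /home/cloudera/spark-2.0.0-bin-hadoop2.7

# one way to set the console log level to ERROR (assumes the stock template)
cp conf/log4j.properties.template conf/log4j.properties
sed -i 's/^log4j.rootCategory=INFO/log4j.rootCategory=ERROR/' conf/log4j.properties

# smoke test: the banner should show version 2.0.0; with hive-site.xml copied into conf/,
# spark.sql("show databases").show() should list the Hive databases
spark-shell --master yarn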