Installing Alluxio, a distributed in-memory file system, with HDFS as the under storage
2016-12-05 23:56
Install Alluxio from alluxio-1.3.0-hadoop2.6-bin.tar
CDH 5.8 (Hadoop 2.6)
spark2 (NameNode); DataNodes: spark3, spark4, spark5
one master, three workers
HDFS is used as the under storage
(1) tar -xvf alluxio-1.3.0-hadoop2.6-bin.tar -C /opt/ (extract into /opt, which creates the /opt/alluxio-1.3.0 directory)
(2) execute the command: ./bin/alluxio bootstrapConf spark2
this generates the alluxio-env.sh file in the conf directory
here spark2 is the Alluxio master (<ALLUXIO_MASTER_HOSTNAME>)
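A rough sketch of what the generated conf/alluxio-env.sh should contain (only the hostname is taken from the command above; the memory size value here is an illustrative assumption, since bootstrapConf picks it based on the machine's RAM):

```shell
# conf/alluxio-env.sh (sketch of the bootstrapConf output)
# Master hostname, taken from the bootstrapConf argument:
ALLUXIO_MASTER_HOSTNAME=spark2
# Ramdisk size for each worker (assumed value; bootstrapConf
# computes one from the available memory):
ALLUXIO_WORKER_MEMORY_SIZE=1GB
```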
(3) edit conf/workers and add all the worker addresses (spark3, spark4, spark5), one per line
(4) cp conf/alluxio-site.properties.template conf/alluxio-site.properties, then change the under storage address:
# alluxio.underfs.address=${alluxio.work.dir}/underFSStorage ==>>
alluxio.underfs.address=hdfs://spark2:8020/user/alluxio
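The resulting conf/alluxio-site.properties would then contain something like the following (8020 matches the CDH NameNode RPC port used above):

```shell
# conf/alluxio-site.properties
# Point the under file system (UFS) at HDFS instead of the
# local-disk default under ${alluxio.work.dir}/underFSStorage.
alluxio.underfs.address=hdfs://spark2:8020/user/alluxio
```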
(5) cd /opt
scp -r ./alluxio-1.3.0/ root@spark3:/opt
scp -r ./alluxio-1.3.0/ root@spark4:/opt
scp -r ./alluxio-1.3.0/ root@spark5:/opt
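The three copy commands in step (5) can be generated with a small loop; this is a dry-run sketch that echoes each command instead of executing it (drop the echo to actually copy):

```shell
# Dry-run: print the copy command for each worker instead of executing it.
for host in spark3 spark4 spark5; do
  echo scp -r /opt/alluxio-1.3.0 "root@${host}:/opt"
done
```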
(6) cd $ALLUXIO_HOME
./bin/alluxio format
./bin/alluxio-start.sh all
check the web UI at http://spark2:19999
run tests on the cluster:
./bin/alluxio runTests
visit the HDFS web UI at http://spark2:50070
in the directory /user/alluxio/ you will find files
named like /default_tests_files/BasicFile_STORE_SYNC_PERSIST
stop the cluster:
./bin/alluxio-stop.sh all
Well Done!!!
(7) learn to use it:
some basic commands:
cd $ALLUXIO_HOME/bin
./alluxio fs ls /                                 list all files
./alluxio fs copyFromLocal ./LICENSE /LICENSE     copy LICENSE from the local disk to the Alluxio file system
./alluxio fs cat /LICENSE                         view the file content
./alluxio fs persist /LICENSE                     persist /LICENSE to the UFS (Alluxio's under storage file system, here HDFS)
By default Alluxio writes data only to Alluxio storage (i.e. to memory) and not to the UFS.
Through the HDFS web UI (or the Alluxio web UI) you can compare the files in the Alluxio directory on HDFS before and after persisting.
Full Alluxio path: / ==>> alluxio://spark2:19998/
./alluxio fs rm /LICENSE                          remove the file from both memory and the UFS
./alluxio fs free /LICENSE                        remove the file from memory only
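Since Alluxio by default writes only to its own in-memory tier, persisting can also be made automatic by changing the default write type; a hedged sketch of the relevant property in conf/alluxio-site.properties (CACHE_THROUGH writes to memory and the UFS synchronously, replacing the memory-only MUST_CACHE default of this Alluxio 1.x line):

```shell
# conf/alluxio-site.properties (optional)
# Write each file to Alluxio memory and the UFS at the same time,
# so a separate "fs persist" step is no longer needed.
alluxio.user.file.writetype.default=CACHE_THROUGH
```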