Downloading a file from HDFS to the local filesystem
2015-09-17 17:27
import java.io.FileOutputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsDownload {
    public static void main(String[] args) throws Exception {
        // HDFS source file and local destination file
        String dest = "hdfs://master001:9000/zhaimo/test62015_09_17_04_38_02/part-r-00000";
        String local = "d:\\test16.txt";
        Configuration conf = new Configuration();
        // Get a FileSystem handle for the HDFS URI
        FileSystem fs = FileSystem.get(URI.create(dest), conf);
        // Open the HDFS file and copy it to the local file in 4096-byte chunks;
        // the final 'true' tells copyBytes to close both streams when done
        FSDataInputStream fsdi = fs.open(new Path(dest));
        OutputStream output = new FileOutputStream(local);
        IOUtils.copyBytes(fsdi, output, 4096, true);
    }
}

Note: FileSystem also offers fs.copyToLocalFile(new Path(dest), new Path(local)), which does the same download in a single call.
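What IOUtils.copyBytes(in, out, 4096, true) does is a plain chunked stream copy. A minimal JDK-only sketch of that loop (no Hadoop dependency; the class and data here are invented for illustration):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyBytesDemo {
    // Copy in fixed-size chunks and close both streams,
    // mirroring IOUtils.copyBytes(in, out, bufSize, true)
    static void copyBytes(InputStream in, OutputStream out, int bufSize) throws IOException {
        byte[] buf = new byte[bufSize];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        in.close();
        out.close();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "hello hdfs".getBytes("UTF-8");
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        copyBytes(new ByteArrayInputStream(data), sink, 4096);
        System.out.println(sink.toString("UTF-8")); // prints "hello hdfs"
    }
}
```

The 4096-byte buffer size in the original code is just this chunk size; larger buffers can reduce the number of read calls for big HDFS files.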