Common HDFS APIs
2018-01-18 23:56
Reading data from a URL
InputStream in = null;
try {
    in = new URL("hdfs://hadoop:9000/input/text1.txt").openStream();
    IOUtils.copyBytes(in, System.out, 4096, false);
} finally {
    IOUtils.closeStream(in);
}
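Opening an hdfs:// URL with java.net.URL only works after Hadoop's URL stream handler factory has been registered, and a JVM allows that registration exactly once. A minimal self-contained sketch (the class name UrlCat is made up for illustration; the host and path are taken from the example above):

import java.io.InputStream;
import java.net.URL;
import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.io.IOUtils;

public class UrlCat {
    static {
        // setURLStreamHandlerFactory may be called at most once per JVM,
        // so the registration lives in a static initializer.
        URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
    }

    public static void main(String[] args) throws Exception {
        InputStream in = null;
        try {
            in = new URL("hdfs://hadoop:9000/input/text1.txt").openStream();
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}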
FileSystem
Reading data

Configuration conf = new Configuration();
String uri = "hdfs://centos1:9000/input/bank_log.txt";
FileSystem fs = FileSystem.get(URI.create(uri), conf);
InputStream in = null;
try {
    in = fs.open(new Path(uri));
    IOUtils.copyBytes(in, System.out, 4096, false);
} finally {
    IOUtils.closeStream(in);
}
Getting file metadata
Path file = new Path("/dir/file");
FileStatus stat = fs.getFileStatus(file);
assertThat(stat.getPath().toUri().getPath(), is("/dir/file"));
assertThat(stat.isDirectory(), is(false));
assertThat(stat.getLen(), is(7L));
assertThat(stat.getModificationTime(), is(lessThanOrEqualTo(System.currentTimeMillis())));
assertThat(stat.getReplication(), is((short) 1));
assertThat(stat.getBlockSize(), is(128 * 1024 * 1024L));
assertThat(stat.getOwner(), is(System.getProperty("user.name")));
assertThat(stat.getGroup(), is("supergroup"));
assertThat(stat.getPermission().toString(), is("rw-r--r--"));
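getFileStatus() throws FileNotFoundException when the path does not exist, so code that probes arbitrary paths usually checks first. A minimal sketch reusing fs and file from the example above:

// Probe for existence before asking for metadata; alternatively,
// catch FileNotFoundException around getFileStatus().
if (fs.exists(file)) {
    FileStatus stat = fs.getFileStatus(file);
    System.out.println(stat.getLen() + " bytes, replication " + stat.getReplication());
} else {
    System.out.println(file + " does not exist");
}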
Listing files
String uri = "hdfs://centos1:9000/input/"; FileSystem fs =FileSystem.get(URI.create(uri), conf); //FileStatus[] status = fs.globStatus(new Path("/*"), new PathFilter) FileStatus[] status = fs.globStatus(new Path("/*")); // FileStatus[] status = fs.listStatus(new Path(uri)); Path[] listPath = FileUtil.stat2Paths(status); for(Path p:listPath){ System.out.println(p); }
PathFilter
public class RegexExcludePathFilter implements PathFilter {

    private final String regex;

    public RegexExcludePathFilter(String regex) {
        this.regex = regex;
    }

    public boolean accept(Path path) {
        return !path.toString().matches(regex);
    }
}
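A filter like this plugs into the globStatus overload mentioned in the listing example above. A minimal usage sketch (the glob and the exclusion regex are made up for illustration):

// Expand the glob, then drop every path that matches the exclusion regex.
FileStatus[] status = fs.globStatus(
        new Path("/2007/12/*"),
        new RegexExcludePathFilter("^.*/2007/12/3[01]$"));
for (Path p : FileUtil.stat2Paths(status)) {
    System.out.println(p);
}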
FSDataInputStream
String uri = "hdfs://hadoop:9000/input/text1.txt"; FileSystem fs =FileSystem.get(URI.create(uri), conf); FSDataInputStream in = null; try { in = fs.open(new Path(uri)); IOUtils.copyBytes(in, System.out, 4096, false); //seek移动到文件中任意一个绝对位置 //inputSream.skip() 只能相对当前位置定位到另一个新位置 in.seek(0); IOUtils.copyBytes(in, System.out, 4096, false); }finally{ IOUtils.closeStream(in); }
FSDataOutputStream
Writing data

String localUri = "F:/NL/hadoop/input/bank_log.txt";
String uri = "hdfs://centos1:9000/input/bank_log.txt";
InputStream in = new BufferedInputStream(new FileInputStream(localUri));
FileSystem fs = FileSystem.get(URI.create(uri), conf);
OutputStream out = fs.create(new Path(uri), new Progressable() {
    public void progress() {
        // Invoked periodically by Hadoop as data is written to the pipeline;
        // printing a dot gives a rough progress indicator.
        System.out.print(".");
    }
});
IOUtils.copyBytes(in, out, 4096, false);
IOUtils.closeStream(out);
IOUtils.closeStream(in);
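For a straightforward local-to-HDFS copy where per-packet progress reporting is not needed, FileSystem also provides a one-call helper. A minimal sketch reusing fs and the paths from above:

// Copy a local file into HDFS in a single call (no Progressable callback).
fs.copyFromLocalFile(new Path("F:/NL/hadoop/input/bank_log.txt"),
        new Path("hdfs://centos1:9000/input/bank_log.txt"));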