Hadoop Study Notes (1): HDFS API
2013-11-23 11:29
http://www.cnblogs.com/liuling/p/2013-6-17-01.html is also worth reading; http://www.teamwiki.cn/hadoop/thrift covers Thrift programming.

1. Upload a local file to HDFS

```java
package proj;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyFile {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Key point: without this setting, the file is copied to the local
        // filesystem instead of HDFS
        conf.set("fs.default.name", "hdfs://localhost:9000");
        FileSystem hdfs = FileSystem.get(conf);
        Path src = new Path("/codes/c/hello.c");
        Path dst = new Path("in");
        hdfs.copyFromLocalFile(src, dst);
        System.out.println("Upload to " + conf.get("fs.default.name"));
        // List the destination directory to confirm the upload
        FileStatus[] files = hdfs.listStatus(dst);
        for (FileStatus fileStatus : files) {
            System.out.println(fileStatus.getPath());
        }
    }
}
```
2. Create an HDFS file

```java
package proj;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CreateFile {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Key point: without this setting, the file is created on the local
        // filesystem instead of HDFS
        conf.set("fs.default.name", "hdfs://localhost:9000");
        FileSystem hdfs = FileSystem.get(conf);
        Path dfs = new Path("in/test.txt");
        FSDataOutputStream outputStream = hdfs.create(dfs);
        byte[] buff = "hello prince of persia".getBytes();
        outputStream.write(buff, 0, buff.length);
        // Close the stream so the data is flushed to HDFS
        outputStream.close();
        System.out.println("run over");
    }
}
```
3. Rename

```java
package proj;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Rename {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Key point: without this setting, the rename happens on the local
        // filesystem instead of HDFS
        conf.set("fs.default.name", "hdfs://localhost:9000");
        FileSystem hdfs = FileSystem.get(conf);
        Path frpath = new Path("in/test.txt");
        Path topath = new Path("in/test3.txt");
        // rename returns true on success, false otherwise
        boolean isRename = hdfs.rename(frpath, topath);
        System.out.println(isRename);
    }
}
```
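To verify the results of the examples above, the file can be read back with `FileSystem.open`. This is only a sketch: it assumes the same pseudo-distributed setup at hdfs://localhost:9000 and the file `in/test3.txt` produced by examples 2 and 3; the class name `ReadFile` is my own choice, not from the original notes.

```java
package proj;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadFile {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Same key point as above: point the client at HDFS, not the local FS
        conf.set("fs.default.name", "hdfs://localhost:9000");
        FileSystem hdfs = FileSystem.get(conf);
        // Assumes in/test3.txt exists (created in example 2, renamed in example 3)
        Path path = new Path("in/test3.txt");
        FSDataInputStream in = hdfs.open(path);
        BufferedReader reader = new BufferedReader(new InputStreamReader(in));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
        reader.close();
    }
}
```

If everything above ran, this should print "hello prince of persia".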