Highly recommended: Life's Second Challenge! (Reposted from the 51CTO Enterprise Network Admin New Generation group)
2009-09-14 18:18
Example 3-1. Displaying files from a Hadoop filesystem on standard output using a
URLStreamHandler
// Example 3-1. Reading data from a Hadoop URL
import java.io.InputStream;
import java.net.URL;

import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.io.IOUtils;

public class URLCat {

  static {
    // setURLStreamHandlerFactory() may be called at most once per JVM,
    // so it lives in a static initializer
    URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
  }

  public static void main(String[] args) throws Exception {
    InputStream in = null;
    try {
      in = new URL(args[0]).openStream();
      IOUtils.copyBytes(in, System.out, 4096, false);
    } finally {
      IOUtils.closeStream(in);
    }
  }
}

Here's a sample run:

% hadoop URLCat hdfs://localhost/user/tom/quangle.txt
On the top of the Crumpetty Tree
The Quangle Wangle sat,
But his face you could not see,
On account of his Beaver Hat.
Example 3-2. Displaying files from a Hadoop filesystem on standard output by using the FileSystem
directly
import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileSystemCat {
  public static void main(String[] args) throws Exception {
    String uri = args[0];
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create(uri), conf);
    InputStream in = null;
    try {
      in = fs.open(new Path(uri));
      IOUtils.copyBytes(in, System.out, 4096, false);
    } finally {
      IOUtils.closeStream(in);
    }
  }
}

The program runs as follows:

% hadoop FileSystemCat hdfs://localhost/user/tom/quangle.txt
On the top of the Crumpetty Tree
The Quangle Wangle sat,
But his face you could not see,
On account of his Beaver Hat.
Example 3-3 is a simple extension of Example 3-2 that writes a file to standard out
twice: after writing it once, it seeks to the start of the file and streams through it once
again.
// Example 3-3. Displaying files from a Hadoop filesystem on standard output twice, by using seek
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileSystemDoubleCat {
  public static void main(String[] args) throws Exception {
    String uri = args[0];
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create(uri), conf); // get() returns a FileSystem for the URI's scheme
    FSDataInputStream in = null;
    try {
      in = fs.open(new Path(uri)); // open() returns an FSDataInputStream, which supports seek()
      IOUtils.copyBytes(in, System.out, 4096, false);
      in.seek(0); // go back to the start of the file
      IOUtils.copyBytes(in, System.out, 4096, false);
    } finally {
      IOUtils.closeStream(in);
    }
  }
}

Here's the result of running it on a small file:

% hadoop FileSystemDoubleCat hdfs://localhost/user/tom/quangle.txt
On the top of the Crumpetty Tree
The Quangle Wangle sat,
But his face you could not see,
On account of his Beaver Hat.
On the top of the Crumpetty Tree
The Quangle Wangle sat,
But his face you could not see,
On account of his Beaver Hat.
Example 3-4 shows how to copy a local file to a Hadoop filesystem. We illustrate progress
by printing a period every time the progress() method is called by Hadoop, which
is after each 64 K packet of data is written to the datanode pipeline. (Note that this
particular behavior is not specified by the API, so it is subject to change in later versions
of Hadoop. The API merely allows you to infer that “something is happening.”)
// Example 3-4. Copying a local file to a Hadoop filesystem, showing progress
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

public class FileCopyWithProgress {
  public static void main(String[] args) throws Exception {
    String localSrc = args[0];
    String dst = args[1];

    InputStream in = new BufferedInputStream(new FileInputStream(localSrc));

    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create(dst), conf);
    OutputStream out = fs.create(new Path(dst), new Progressable() {
      public void progress() {
        System.out.print("."); // one period per packet written to the pipeline
      }
    });

    IOUtils.copyBytes(in, out, 4096, true);
  }
}

Typical usage:

% hadoop FileCopyWithProgress input/docs/1400-8.txt hdfs://localhost/user/tom/1400-8.txt
...............
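The callback mechanics above can be sketched without Hadoop at all. The following is a minimal, self-contained illustration of the same pattern: a copy loop that fires a callback once per packet, with the caller supplying what the callback does. The `Progress` interface, `ProgressSketch` class, and the fixed 64 KB packet size are illustrative assumptions mirroring the behavior described above, not part of the Hadoop API.

```java
// Hypothetical stand-in for org.apache.hadoop.util.Progressable
interface Progress {
    void progress();
}

public class ProgressSketch {
    static final int PACKET_SIZE = 64 * 1024; // assumed 64 KB packet, as observed above

    // Walks the data packet by packet, invoking the callback once per packet
    // (including the final, possibly partial one); returns bytes "written".
    static long copyWithProgress(byte[] data, Progress callback) {
        long written = 0;
        for (int off = 0; off < data.length; off += PACKET_SIZE) {
            int len = Math.min(PACKET_SIZE, data.length - off);
            written += len;      // pretend this chunk reached the pipeline
            callback.progress(); // one callback per packet
        }
        return written;
    }

    public static void main(String[] args) {
        byte[] data = new byte[200 * 1024]; // 200 KB -> 3 full packets + 1 partial
        long n = copyWithProgress(data, () -> System.out.print("."));
        System.out.println();
        System.out.println(n + " bytes"); // prints four periods, then "204800 bytes"
    }
}
```

As in Example 3-4, the copy loop stays generic and the caller decides what "progress" means, here just printing a period.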