Writing Java Code for HDFS Create, Delete, Update and Query Operations: A Code Example

This article shares a working example of Java code that performs create, delete, update and query operations against HDFS, for your reference. The details are as follows.

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.URI;

import org.apache.commons.compress.utils.IOUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FileOperation {

	public static void main(String[] args) throws IOException {
		//CreateFile();
		//DeleteFile();
		//CopyFileToHDFS();
		//MkDirs();
		//DelDirs();
		ListDirectory();
		DownLoad();

	}
	public static void CreateFile() throws IOException {
		String uri = "hdfs://Alvis:9000";
		Configuration configuration = new Configuration();
		FileSystem fSystem = FileSystem.get(URI.create(uri), configuration);
		byte[] file_content_buff = "hello hadoop world, test write file !\n".getBytes();
		Path dfs = new Path("/home/test.txt");
		FSDataOutputStream outputStream = fSystem.create(dfs);
		// Write the buffer itself, not just its length, then close the stream
		outputStream.write(file_content_buff);
		outputStream.close();
	}
	public static void DeleteFile() throws IOException {
		String uri = "hdfs://Alvis:9000";
		Configuration configuration = new Configuration();
		FileSystem fSystem = FileSystem.get(URI.create(uri), configuration);
		Path deletf = new Path("/home/test.txt");
		boolean delResult = fSystem.delete(deletf,true);
		System.out.println(delResult ? "Delete succeeded" : "Delete failed");
	}
  
	public static void CopyFileToHDFS() throws IOException {
		String uri = "hdfs://Alvis:9000";
		Configuration configuration = new Configuration();
		FileSystem fSystem = FileSystem.get(URI.create(uri), configuration);
		Path src = new Path("E:\\SerializationTest\\APITest.txt");
		Path dest_src = new Path("/home");
		fSystem.copyFromLocalFile(src, dest_src);
	}
	
	public static void MkDirs() throws IOException {
		String uri = "hdfs://Alvis:9000";
		Configuration configuration = new Configuration();
		FileSystem fSystem = FileSystem.get(URI.create(uri), configuration);
		Path src = new Path("/Test");
		fSystem.mkdirs(src);
		
	}
	
	public static void DelDirs() throws IOException {
		String uri = "hdfs://Alvis:9000";
		Configuration configuration = new Configuration();
		FileSystem fSystem = FileSystem.get(URI.create(uri), configuration);
		Path src = new Path("/Test");
		fSystem.delete(src, true); // recursive delete; the single-argument delete(Path) is deprecated

	}
	
	public static void ListDirectory() throws IOException {
		String uri = "hdfs://Alvis:9000";
		Configuration configuration = new Configuration();
		FileSystem fSystem = FileSystem.get(URI.create(uri), configuration);
		FileStatus[] fStatus = fSystem.listStatus(new Path("/output"));
		for (FileStatus status : fStatus) {
			if (status.isFile()) {
				System.out.println("File path: " + status.getPath().toString());
				System.out.println("Replication: " + status.getReplication());
				System.out.println("Block size: " + status.getBlockSize());
				// Use the file length so the locations of every block are returned
				BlockLocation[] blockLocations = fSystem.getFileBlockLocations(status, 0, status.getLen());
				for (BlockLocation location : blockLocations) {
					System.out.println("Host: " + location.getHosts()[0]);
					System.out.println("Block name (host:port): " + location.getNames()[0]);
				}
			} else {
				System.out.println("Directory: " + status.getPath().toString());
			}
		}
	}
	
	public static void DownLoad() throws IOException {
		Configuration configuration = new Configuration();
		configuration.set("fs.defaultFS", "hdfs://Alvis:9000");
		FileSystem fSystem = FileSystem.get(configuration);
		FSDataInputStream inputStream = fSystem.open(new Path("/input/wc.jar"));
		FileOutputStream outputStream = new FileOutputStream(new File("E:\\LearnLife\\DownLoad\\wc.jar"));
		IOUtils.copy(inputStream, outputStream);
		// Close both streams once the copy finishes
		outputStream.close();
		inputStream.close();
		System.out.println("Download complete!");
	}
}

Approach:

1. Define the HDFS URI exposed by the (virtual machine) cluster.

2. First obtain a Configuration object, the entry point for remote calls to HDFS.

3. Get a FileSystem object for the distributed file system from that configuration.

4. Specify the target Path.

5. Invoke the desired operation on the FileSystem object (see the sketch below).
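
To make these five steps concrete, here is a minimal, self-contained sketch that strings them together for a single mkdirs call. It assumes the same hdfs://Alvis:9000 address used above; the class name HdfsPatternSketch and the /Test path are only illustrative. Unlike the methods above, it closes the FileSystem with try-with-resources.

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPatternSketch {
	public static void main(String[] args) throws IOException {
		// 1. The HDFS URI of the NameNode running on the virtual machine (illustrative address)
		String uri = "hdfs://Alvis:9000";
		// 2. Configuration holds the client-side settings for the remote call
		Configuration configuration = new Configuration();
		// 3. Obtain the distributed FileSystem handle; try-with-resources closes it afterwards
		try (FileSystem fs = FileSystem.get(URI.create(uri), configuration)) {
			// 4. The path to operate on
			Path dir = new Path("/Test");
			// 5. Invoke the operation through the FileSystem object
			boolean created = fs.mkdirs(dir);
			System.out.println(created ? "Directory created" : "mkdirs returned false");
		}
	}
}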

That is the walkthrough of performing create, delete, update and query operations on HDFS with Java code. I hope it is helpful; if you have any questions, please leave me a message and I will reply as soon as possible. Thank you as well for your support of this site!

