Connecting Java to a Kerberos-Secured HA HDFS Cluster

This article walks through connecting a Java client to an HDFS cluster that has both NameNode HA and Kerberos authentication enabled.
Source: http://suo.im/5SGnSD
Before implementing the solution, the reader is assumed to be basically familiar with the following (not covered in detail here):
Java, Maven
HDFS, Kerberos
Implementation
The final project layout looks roughly like this:
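A minimal sketch of the layout, assuming the configuration files are kept under src/main/resources so that the classpath lookup in HDFSUtils below can find them (the project root name is a placeholder):

<project-root>/
├── pom.xml
└── src/main
    ├── java/deng/yb/hdfsUtils
    │   ├── HDFSUtils.java
    │   └── App.java
    └── resources
        ├── core-site.xml
        ├── krb5.conf
        └── deng_yb.keytab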
Create a new Maven project and add the following to pom.xml:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.6.5</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.5</version>
</dependency>
Copy the krb5.conf file from the KDC:
[libdefaults]
default_realm = W.COM
dns_lookup_realm = false
dns_lookup_kdc = false
ticket_lifetime = 24h
renew_lifetime = 7d
forwardable = true

[realms]
W.COM = {
  kdc = node1:88
  admin_server = node1:749
  default_domain = W.COM
  kdc = bi-slave1
}
# The KDC here is HA, so the standby KDC must also be added to the realm
# The default port numbers can also be omitted
Generate the keytab file on the KDC (it will later be placed on the client's classpath):
kadmin.local: xst -norandkey -k deng_yb.keytab deng_yb@W.COM
Download the client configuration and obtain the core-site.xml file; place it on the classpath (e.g. under src/main/resources). For an HA cluster, the downloaded client configuration normally also contains hdfs-site.xml, which defines the nameservice alias used in the test below; keeping it on the classpath as well lets the client resolve hdfs://nameservice1/.
HDFSUtils
package deng.yb.hdfsUtils;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class HDFSUtils {

    /**
     * Load the HDFS client configuration
     * @return
     */
    public static Configuration initConfiguration() {
        Configuration configuration = new Configuration();
        configuration.addResource(new Path(getPath("core-site.xml")));
        return configuration;
    }

    /**
     * Initialize the Kerberos environment
     */
    public static void initKerberosENV(Configuration conf) {
        System.setProperty("java.security.krb5.conf", getPath("krb5.conf"));
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");
        // System.setProperty("sun.security.krb5.debug", "true");
        try {
            UserGroupInformation.setConfiguration(conf);
            UserGroupInformation.loginUserFromKeytab("deng_yb@W.COM",
                    getPath("deng_yb.keytab"));
            System.out.println(UserGroupInformation.getCurrentUser());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static String getPath(String file) {
        if (null == file) {
            return null;
        }
        return Thread.currentThread().getContextClassLoader()
                .getResource(file).getPath();
    }
}
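Kerberos tickets expire (ticket_lifetime is 24h in the krb5.conf above), so a long-running client should periodically renew its login from the keytab. Below is a minimal sketch, assuming the login performed in initKerberosENV above; the helper class name KerberosRelogin is only an illustration and is not part of the original code:

package deng.yb.hdfsUtils;

import java.io.IOException;

import org.apache.hadoop.security.UserGroupInformation;

/**
 * Sketch only: periodically refresh the Kerberos login of a long-running
 * client, assuming initKerberosENV() has already been called once.
 */
public class KerberosRelogin {

    /** Call before long-lived HDFS operations, e.g. from a scheduled task. */
    public static void reloginIfNeeded() {
        try {
            // No-op while the current TGT is still valid; otherwise it logs in
            // again from the keytab used by loginUserFromKeytab().
            UserGroupInformation.getLoginUser().checkTGTAndReloginFromKeytab();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}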
Test
package deng.yb.hdfsUtils;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/**
 * Connect to an HA, Kerberos-secured cluster
 */
public class App {

    public static void main(String[] args) {
        // Initialize the HDFS Configuration
        Configuration configuration = HDFSUtils.initConfiguration();
        // Initialize the Kerberos environment
        HDFSUtils.initKerberosENV(configuration);
        try {
            // nameservice1 is the HA nameservice alias
            Path dstPath = new Path("hdfs://nameservice1/");
            FileSystem fileSystem = FileSystem.get(configuration);
            FileStatus[] listStatus = fileSystem.listStatus(dstPath);
            for (FileStatus fileStatus : listStatus) {
                Path path = fileStatus.getPath();
                System.out.println(path);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Run the main method of App; if everything is configured correctly, it prints the authenticated principal followed by the paths directly under hdfs://nameservice1/.
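Beyond listing the root directory, the same authenticated FileSystem can be used for ordinary reads and writes. The following is a minimal sketch, not from the original article; the class name WriteReadDemo and the target path /tmp/kerberos-test.txt are illustrative assumptions:

package deng.yb.hdfsUtils;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/**
 * Sketch only: write a small file through the Kerberos-authenticated,
 * HA-aware FileSystem and read it back. The target path is an assumption.
 */
public class WriteReadDemo {

    public static void main(String[] args) throws Exception {
        Configuration configuration = HDFSUtils.initConfiguration();
        HDFSUtils.initKerberosENV(configuration);

        FileSystem fileSystem = FileSystem.get(configuration);
        Path file = new Path("hdfs://nameservice1/tmp/kerberos-test.txt");

        // Write a short text file (overwrite if it already exists)
        try (FSDataOutputStream out = fileSystem.create(file, true)) {
            out.writeUTF("hello from a Kerberos-authenticated client");
        }

        // Read it back to confirm the round trip works
        try (FSDataInputStream in = fileSystem.open(file)) {
            System.out.println(in.readUTF());
        }
    }
}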