Background

1. The application (a data-dump system) has to be migrated to the production environment: a domestic operating system and a domestic resource pool (the HBase storage itself stays unchanged).

2. On the old environment the dump system already suffered from HBase write-performance problems, and writes from some provinces failed outright (roughly a 20% failure rate).

3. All 31 provinces nationwide write data, and the volume is large.

4. A previous attempt to migrate to the domestic environment had not succeeded, and the author had never worked on this project (now six years old) before.

5. The migration also involves many prerequisites, such as opening host ports and getting network connectivity in place.

Environment

 1. Oracle JDK 8u202 (the last non-commercial release of Oracle JDK 8). Download: Oracle JDK8u202

 2. HBase 1.1.2.2.5: Apache HBase – Apache HBase Downloads (an old project, hence the old version; this release has been archived: Index of /dist/hbase)

 3. Hadoop 2.7.1.2.5: Apache Hadoop (this release has been archived: Central Repository: org/apache/hadoop/hadoop-common/2.7.1)

 4. Kerberos 1.10.3-10: MIT Kerberos Consortium (this release has been archived: Historic MIT Kerberos Releases)

HBase test code

Without further ado, here is the test code the author hand-wrote on the pod. It is only the login code, because once the Kerberos login succeeds everything else falls into place:

Main class

package com.asiainfo.crm.ai;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.security.UserProvider;
import org.apache.hadoop.hbase.shaded.com.google.common.base.Preconditions;
import org.apache.hadoop.hbase.util.Strings;
import org.apache.hadoop.net.DNS;

/**
 * export LIB_HOME=/app/tomcat/webapps/ROOT/WEB-INF/lib/
 * javac -d ../classes -cp $LIB_HOME\*:. Transfer2Application.java
 * java -cp $LIB_HOME\*:. Transfer2Application
 */
public class Transfer2Application
{
    public static void main(String[] args)
    {
        try {
            System.out.println("##################  hello world! 1 ");
            // initialize the HBase connection information
            initHbase();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    /**
     * Initializes the HBase connection information at startup, including the Kerberos login.
     *
     * @throws Exception
     */
    private static void initHbase() throws Exception
    {
        System.out.println("##################  hello world! initHbase  ");
        System.getProperties().setProperty("hadoop.home.dir", "/app/tomcat/temp/hbase");
        System.setProperty("java.security.krb5.conf", "/app/tomcat/krb5.conf");

        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.keytab.file", "/app/tomcat/gz**.app.keytab");
        conf.set("hbase.kerberos.principal", "gz***/gz**-***.dcs.com@DCS.COM");

        UserProvider userProvider = UserProvider.instantiate(conf);
        // login the server principal (if using secure Hadoop)
        if (userProvider.isHadoopSecurityEnabled() && userProvider.isHBaseSecurityEnabled()) {
            String machineName = Strings.domainNamePointerToHostName(DNS.getDefaultHost(
                    conf.get("hbase.rest.dns.interface", "default"),
                    conf.get("hbase.rest.dns.nameserver", "default")));
            String keytabFilename = conf.get("hbase.keytab.file");
            Preconditions.checkArgument(keytabFilename != null && !keytabFilename.isEmpty(),
                    "hbase.keytab.file" + " should be set if security is enabled");
            String principalConfig = conf.get("hbase.kerberos.principal");
            Preconditions.checkArgument(principalConfig != null && !principalConfig.isEmpty(),
                    "hbase.kerberos.principal" + " should be set if security is enabled");
            userProvider.login("hbase.keytab.file", "hbase.kerberos.principal", machineName);
            System.out.println("########## login success! #######");
        } else {
            System.out.println("Login preconditions are not met!");
            System.out.println("userProvider.isHadoopSecurityEnabled() :" + userProvider.isHadoopSecurityEnabled());
            System.out.println("userProvider.isHBaseSecurityEnabled():" + userProvider.isHBaseSecurityEnabled());
        }
    }
}
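Once the login above succeeds, the same Configuration can be reused to open an HBase connection and issue a test write. The sketch below is illustrative only and is not part of the original test code: the ZooKeeper quorum, table name and column family are placeholder values, and the standard HBase 1.x client API (ConnectionFactory / Table) is assumed.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class ConnectionSmokeTest
{
    public static void main(String[] args) throws IOException
    {
        Configuration conf = HBaseConfiguration.create();
        // placeholder ZooKeeper settings; on the real pod they come from hbase-site.xml on the classpath
        conf.set("hbase.zookeeper.quorum", "zk-host1,zk-host2,zk-host3");
        conf.set("hbase.zookeeper.property.clientPort", "2181");

        // the Connection is heavyweight and thread-safe: create it once and share it
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("TEST_TABLE"))) {
            Put put = new Put(Bytes.toBytes("row-1"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v"));
            table.put(put);
            System.out.println("########## put success! #######");
        }
    }
}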

Key commands for hand-building the code on the pod

# lib directory on the application pod, used for the java classpath
export LIB_HOME=/app/tomcat/webapps/ROOT/WEB-INF/lib/
# enter the src directory and create the main-class source file there
cd src
# compile the main class, putting every jar under the lib directory on the classpath
javac -d ../classes -cp $LIB_HOME\*:. Transfer2Application.java
# enter the classes directory
cd ../classes
# run the compiled class
java -cp $LIB_HOME\*:. Transfer2Application

Successful run log (shell-based fallback, native library not loaded)

##################  hello world! 1 
##################  hello world! initHbase  
14:23:01.194 [main] DEBUG org.apache.hadoop.util.Shell - setsid exited with exit code 0
14:23:01.207 [main] DEBUG org.apache.hadoop.security.Groups -  Creating new Groups object
14:23:01.234 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
14:23:01.235 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
14:23:01.235 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
14:23:01.235 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14:23:01.235 [main] DEBUG org.apache.hadoop.util.PerformanceAdvisory - Falling back to shell based
14:23:01.235 [main] DEBUG org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
14:23:01.285 [main] DEBUG org.apache.hadoop.security.Groups - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
14:23:01.338 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
14:23:01.345 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
14:23:01.345 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[GetGroups])
14:23:01.346 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl - UgiMetrics, User and group related metrics
14:23:01.556 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login
14:23:01.557 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login commit
14:23:01.558 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - using kerberos user:gz***/gz***-ctdfs01.dcs.com@DCS.COM
14:23:01.558 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Using user: "gz***/gz***-ctdfs01.dcs.com@DCS.COM" with name gz***/gz***-ctdfs01.dcs.com@DCS.COM
14:23:01.558 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - User entry: "gz***/gz***-ctdfs01.dcs.com@DCS.COM"
14:23:01.558 [main] INFO org.apache.hadoop.security.UserGroupInformation - Login successful for user gz***/gz***-ctdfs01.dcs.com@DCS.COM using keytab file /app/tomcat/gz***.app.keytab
########## login success! #######

Successful run log (native library loaded)

sh-4.2# java -cp $LIB_HOME\*:. Transfer2Application
##################  hello world! 1 
##################  hello world! initHbase  
17:46:05.469 [main] DEBUG org.apache.hadoop.util.Shell - setsid exited with exit code 0
17:46:05.481 [main] DEBUG org.apache.hadoop.security.Groups -  Creating new Groups object
17:46:05.504 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
17:46:05.504 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Loaded the native-hadoop library
17:46:05.504 [main] DEBUG org.apache.hadoop.security.JniBasedUnixGroupsMapping - Using JniBasedUnixGroupsMapping for Group resolution
17:46:05.505 [main] DEBUG org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
17:46:05.553 [main] DEBUG org.apache.hadoop.security.Groups - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
17:46:05.604 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
17:46:05.611 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
17:46:05.611 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[GetGroups])
17:46:05.612 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl - UgiMetrics, User and group related metrics
17:46:05.818 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login
17:46:05.819 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login commit
17:46:05.820 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - using kerberos user:gzcrm/gzcrm-ctdfs01.dcs.com@DCS.COM
17:46:05.820 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Using user: "gzcrm/gzcrm-ctdfs01.dcs.com@DCS.COM" with name gzcrm/gzcrm-ctdfs01.dcs.com@DCS.COM
17:46:05.820 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - User entry: "gzcrm/gzcrm-ctdfs01.dcs.com@DCS.COM"
17:46:05.821 [main] INFO org.apache.hadoop.security.UserGroupInformation - Login successful for user gzcrm/gzcrm-ctdfs01.dcs.com@DCS.COM using keytab file /app/tomcat/gzcrm.app.keytab
########## login success! #######

Note:

    Operating HBase with the native library loaded gives better performance.
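To confirm at startup which of the two modes is actually in effect, a small check like the one below can be added. This is an illustrative sketch only; NativeCodeLoader is the Hadoop class whose DEBUG lines appear in the logs above, and isNativeCodeLoaded() reports whether libhadoop was really loaded.

import org.apache.hadoop.util.NativeCodeLoader;

public class NativeLibCheck
{
    public static void main(String[] args)
    {
        // true only when libhadoop.so was found on java.library.path and loaded successfully
        System.out.println("native hadoop loaded: " + NativeCodeLoader.isNativeCodeLoaded());
        // the search path, the same value printed in the DEBUG log above
        System.out.println("java.library.path: " + System.getProperty("java.library.path"));
    }
}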

Common problems (FAQ):

Q1: At runtime, "Kerberos krb5 configuration not found" is reported


18:39:46.030 [main] DEBUG org.apache.hadoop.util.Shell - setsid exited with exit code 0
18:39:46.041 [main] DEBUG org.apache.hadoop.security.Groups -  Creating new Groups object
18:39:46.058 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
18:39:46.059 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
18:39:46.059 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
18:39:46.059 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18:39:46.059 [main] DEBUG org.apache.hadoop.util.PerformanceAdvisory - Falling back to shell based
18:39:46.059 [main] DEBUG org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
18:39:46.113 [main] DEBUG org.apache.hadoop.security.Groups - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
18:39:46.166 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
18:39:46.172 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
18:39:46.172 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[GetGroups])
18:39:46.173 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl - UgiMetrics, User and group related metrics
18:39:46.219 [main] DEBUG org.apache.hadoop.security.authentication.util.KerberosName - Kerberos krb5 configuration not found, setting default realm to empty
java.lang.IllegalArgumentException: Can't get Kerberos realm
    at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:277)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:262)
    at org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:339)
    at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:333)
    at org.apache.hadoop.hbase.security.User$SecureHadoopUser.isSecurityEnabled(User.java:428)
    at org.apache.hadoop.hbase.security.User.isSecurityEnabled(User.java:268)
    at org.apache.hadoop.hbase.security.UserProvider.isHadoopSecurityEnabled(UserProvider.java:159)
    at Transfer2Application.initHbase(Transfer2Application.java:44)
    at Transfer2Application.main(Transfer2Application.java:22)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:84)
    at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
    ... 9 more
Caused by: KrbException: Cannot locate default realm
    at sun.security.krb5.Config.getDefaultRealm(Config.java:1029)
    ... 15 more
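
A: The JVM cannot locate a krb5.conf, so the default Kerberos realm cannot be resolved. Copy the cluster's krb5.conf onto the pod and point the JVM at it before the Configuration is created, exactly as the test class above does:

System.setProperty("java.security.krb5.conf", "/app/tomcat/krb5.conf");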

Q2: At runtime, "hbase.keytab.file should be set if security is enabled" is reported

18:41:21.911 [main] DEBUG org.apache.hadoop.util.Shell - setsid exited with exit code 0
18:41:21.922 [main] DEBUG org.apache.hadoop.security.Groups -  Creating new Groups object
18:41:21.945 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
18:41:21.945 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
18:41:21.945 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
18:41:21.946 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18:41:21.946 [main] DEBUG org.apache.hadoop.util.PerformanceAdvisory - Falling back to shell based
18:41:21.946 [main] DEBUG org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
18:41:21.995 [main] DEBUG org.apache.hadoop.security.Groups - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
18:41:22.046 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
18:41:22.052 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
18:41:22.053 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, about=, sampleName=Ops, type=DEFAULT, valueName=Time, value=[GetGroups])
18:41:22.053 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl - UgiMetrics, User and group related metrics
java.lang.IllegalArgumentException: hbase.keytab.file should be set if security is enabled
    at org.apache.hadoop.hbase.shaded.com.google.common.base.Preconditions.checkArgument(Preconditions.java:92)
    at Transfer2Application.initHbase(Transfer2Application.java:53)
    at Transfer2Application.main(Transfer2Application.java:22)

A: hbase.keytab.file was not set correctly on the Configuration:

conf.set("hbase.keytab.file", "/app/tomcat/gzcrm.app.keytab");
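
The code applies the same precondition to the principal, so hbase.kerberos.principal must be set as well before userProvider.login() is called (value masked here, as in the main class above):

conf.set("hbase.kerberos.principal", "gz***/gz**-***.dcs.com@DCS.COM");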

Q3: At runtime, "Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path" is reported

18:39:46.041 [main] DEBUG org.apache.hadoop.security.Groups -  Creating new Groups object
18:39:46.058 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
18:39:46.059 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
18:39:46.059 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
18:39:46.059 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18:39:46.059 [main] DEBUG org.apache.hadoop.util.PerformanceAdvisory - Falling back to shell based

A: The native hadoop library could not be loaded. Download it from the official archive and copy it into one of the native library directories. Official download: https://archive.apache.org/dist/hadoop/common/hadoop-2.7.1/hadoop-2.7.1.tar.gz

Path of the native libraries inside the tarball: hadoop-2.7.1/lib/native

Copy all files from that directory into /usr/lib, which is one of the java.library.path directories shown in the log.
Note:

     The files copied into /usr/lib do not, by themselves, contain a loadable shared library (nothing gets loaded), because libhadoop.so in the tarball is a symbolic link and the link is lost during the copy. Recreate the symlink, make a second copy, or rename the file:

sh-4.2# ls -lh /usr/lib|grep hadoop
-rwxrwxrwx 1 root root 1.4M Aug 24 14:22 libhadoop.a
-rwxrwxrwx 1 root root 1.6M Aug 24 14:22 libhadooppipes.a
-rwxrwxrwx 1 root root 790K Aug 24 14:22 libhadoop.so.1.0.0
-rwxrwxrwx 1 root root 466K Aug 24 14:22 libhadooputils.a
sh-4.2# 
sh-4.2# 
sh-4.2# cd /usr/lib
sh-4.2# cp libhadoop.so.1.0.0 libhadoop.so
sh-4.2# cp libhdfs.so.0.0.0 libhdfs.so
sh-4.2# 
sh-4.2# ls -lh /usr/lib|grep hadoop
-rwxrwxrwx 1 root root 1.4M Aug 24 14:22 libhadoop.a
-rwxrwxrwx 1 root root 1.6M Aug 24 14:22 libhadooppipes.a
-rwxr-xr-x 1 root root 790K Aug 24 17:44 libhadoop.so
-rwxrwxrwx 1 root root 790K Aug 24 14:22 libhadoop.so.1.0.0
-rwxrwxrwx 1 root root 466K Aug 24 14:22 libhadooputils.a
sh-4.2# 
sh-4.2# 
sh-4.2# ls -lh /usr/lib|grep hdfs
-rwxrwxrwx 1 root root 437K Aug 24 14:22 libhdfs.a
-rwxr-xr-x 1 root root 276K Aug 24 17:45 libhdfs.so
-rwxrwxrwx 1 root root 276K Aug 24 14:22 libhdfs.so.0.0.0

Local view: lib/native directory listings for hadoop-2.7.1 and hadoop-2.10.2 (shown as images in the original).

Q4: At runtime, various network-unreachable errors are reported, including failure to reach the Kerberos service

A: Open the network paths and the HBase server-side host ports. The list is below (your actual port numbers may differ, but the same set of services is involved):

Port     Description
8090     Resource manager web UI port
50470    HTTPS node access port
8485     Shared directory port
8020     RPC port
50070    Node HTTP access port (must be opened from the client)
2181     ZooKeeper service port (must be opened from the client)
8088     RM (ResourceManager) HTTP port
16000    Master service port
3888     Primary ZooKeeper port
16010    Master info port
13562    MapReduce shuffle service port
2049     NFS filesystem access port
16100    Multicast port
8080     REST service port
16020    RS (RegionServer) region service port
21       FTP port
4242     NFS mount port
88       KDC communication port
80       Kerberos service port
749      Optional port, mainly used for the Kerberos TCP service
800      Used for communication between Kerberos KDCs, especially in setups spanning multiple KDCs
801      Used for communication between Kerberos KDCs, especially in setups spanning multiple KDCs
464      kpasswd service port, used to change Kerberos user passwords
750      Some implementations may use this port for additional kpasswd functionality
636      If Kerberos is configured to use LDAP over SSL (LDAPS), the LDAPS service usually runs on this port; not part of Kerberos itself, but often used to manage Kerberos user accounts and policies
9090     Thrift interface port, for non-Java clients accessing HBase
60020    RegionServer worker port, handles client read/write requests
2888     Intra-cluster communication; the ZooKeeper Leader listens on this port

Appendix 1: a script for batch-verifying network port connectivity

Linux shell 批量验证网络主机端口连通性_linux shell批量测试ip连通性-CSDN博客
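
The linked post contains the shell version of the check. As a rough, illustrative alternative in Java (the host names and ports below are placeholders, not values from the real environment), TCP connectivity can be probed with a connect timeout:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck
{
    public static void main(String[] args)
    {
        // placeholder targets; replace with the real HBase / ZooKeeper / KDC hosts and ports
        String[][] targets = {
                {"hbase-master-host", "16000"},
                {"zookeeper-host", "2181"},
                {"kdc-host", "88"}
        };
        for (String[] t : targets) {
            String host = t[0];
            int port = Integer.parseInt(t[1]);
            try (Socket socket = new Socket()) {
                // 3-second connect timeout; success means the port is reachable from this pod
                socket.connect(new InetSocketAddress(host, port), 3000);
                System.out.println("OK      " + host + ":" + port);
            } catch (IOException e) {
                System.out.println("FAILED  " + host + ":" + port + "  (" + e.getMessage() + ")");
            }
        }
    }
}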
