No trusted certificate found when running hdfs fsck /

rdlzhqv9  posted on 2021-06-03  in Hadoop

I have configured hadoop-2.6.0 secured with Kerberos on Windows, and everything works fine. But when I run the following command

hdfs fsck /

I get the following exception:

Connecting to namenode via https://hostname:50470
Exception in thread "main" javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: No trusted certificate found
    at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
    at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1884)
    at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:276)
    at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:270)
    at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1341)
    at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:153)
    at sun.security.ssl.Handshaker.processLoop(Handshaker.java:868)
    at sun.security.ssl.Handshaker.process_record(Handshaker.java:804)
    at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1016)
    at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1312)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1339)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1323)
    at sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:563)
    at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:185)
    at sun.net.www.protocol.https.HttpsURLConnectionImpl.connect(HttpsURLConnectionImpl.java:153)
    at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:186)
    at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:216)
    at org.apache.hadoop.hdfs.web.URLConnectionFactory.openConnection(URLConnectionFactory.java:164)
    at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:303)
    at org.apache.hadoop.hdfs.tools.DFSck.access$000(DFSck.java:72)
    at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:145)
    at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:142)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.hdfs.tools.DFSck.run(DFSck.java:141)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.hdfs.tools.DFSck.main(DFSck.java:341)
Caused by: sun.security.validator.ValidatorException: No trusted certificate found
    at sun.security.validator.SimpleValidator.buildTrustedChain(SimpleValidator.java:384)
    at sun.security.validator.SimpleValidator.engineValidate(SimpleValidator.java:134)
    at sun.security.validator.Validator.validate(Validator.java:260)
    at sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:326)
    at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:231)
    at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:107)
    at org.apache.hadoop.security.ssl.ReloadingX509TrustManager.checkServerTrusted(ReloadingX509TrustManager.java:129)
    at sun.security.ssl.AbstractTrustManagerWrapper.checkServerTrusted(SSLContextImpl.java:813)
    at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1323)
    ... 24 more
I have exported the certificate from the browser and added it to the keystore with the following command, but I still get the same error.

keytool -import -alias nncert -keystore c:\Java\jre\lib\security\cacerts -file nn.crt
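
A quick way to confirm that the import is visible to the JVM is to list the alias back from the same truststore. A minimal check, assuming the alias nncert used above and the default cacerts password changeit (adjust both if your setup differs):

keytool -list -alias nncert -keystore c:\Java\jre\lib\security\cacerts -storepass changeit

If the alias is not listed, or if Hadoop is actually running under a different JRE than c:\Java\jre, the imported certificate will not be picked up by hdfs fsck.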

Please help me resolve this issue.
Thanks.


sczxawaw  1#

I solved this issue by following the steps in this link.
Once the certificate has been exported, it should be added to the default Java truststore location, %JAVA_HOME%\jre\lib\security\cacerts, because the fsck command uses the default SSL truststore.
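
As a concrete sketch, assuming the NameNode certificate was exported to nn.crt, the alias nncert, and the default truststore password changeit, the import into that default location would look like:

keytool -import -alias nncert -keystore "%JAVA_HOME%\jre\lib\security\cacerts" -file nn.crt -storepass changeit

After the import, re-running hdfs fsck / should be able to complete the TLS handshake against https://hostname:50470.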
Hope this helps someone.
