How do I launch a Spark 3.0.0 Kubernetes workload without Kerberos?

bvpmtnay · posted 2021-05-27 · in Spark

It seems that on Spark 3.0.0, when I do a spark-submit against Kubernetes, it requires Kerberos. I am using the exact same spark-submit invocation that worked fine on 2.4.5, and I get the following error:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/07/04 08:17:51 INFO SparkKubernetesClientFactory: Auto-configuring K8S client using current context from users K8S config file
20/07/04 08:17:51 INFO KerberosConfDriverFeatureStep: You have not specified a krb5.conf file locally or via a ConfigMap. Make sure that you have the krb5.conf locally on the driver image.
Exception in thread "main" org.apache.hadoop.security.KerberosAuthException: failure to login: javax.security.auth.login.LoginException: java.lang.NullPointerException: invalid null input: name

To be more specific: I do not want to use Kerberos at all.
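For context, the submit is the plain Kubernetes form with no Kerberos-related options; a minimal sketch of that kind of command (the API server address, image name and jar path below are placeholders, not the values actually used here):

```
./bin/spark-submit \
  --master k8s://https://<k8s-apiserver>:6443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=<repo>/spark:3.0.0 \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0.jar
```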


q0qdq0h2 1#

Either you handle that uid (185) at the OS level, or you can comment it out in the Dockerfile, but then you will be running as root... Eric
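In Dockerfile terms, the two options mentioned above look roughly like this. This is only a sketch, assuming the stock Spark 3.0.0 image Dockerfile (which declares ARG spark_uid=185 and ends with USER ${spark_uid}) and a Debian-based base image where useradd is available; the user/group name "spark" is arbitrary:

```
# Option A: give uid 185 a real passwd entry at the OS level inside the image,
# so the login code can resolve a user name for it (names here are arbitrary):
RUN groupadd -g 185 spark && useradd -u 185 -g 185 -d /opt/spark -s /bin/false spark

# Option B: comment out the USER directive, so the container runs as root:
# USER ${spark_uid}
```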


eiee3dmh 2#

OK, found it: you need to specify a spark_uid in the image build step.
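For reference, a sketch of what that can look like with the docker-image-tool.sh shipped with Spark 3.0.0, whose -u option sets the UID used for the image's USER directive, or equivalently by overriding the spark_uid build argument with plain docker (repo, tag and the Dockerfile path below follow the binary distribution layout and are otherwise placeholders):

```
# Build and push the Spark image with an explicit UID:
./bin/docker-image-tool.sh -r <repo> -t 3.0.0 -u 185 build
./bin/docker-image-tool.sh -r <repo> -t 3.0.0 push

# Or, with plain docker, override the spark_uid build argument declared in the Dockerfile
# (run from the root of the Spark distribution):
docker build --build-arg spark_uid=185 \
  -t <repo>/spark:3.0.0 \
  -f kubernetes/dockerfiles/spark/Dockerfile .
```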
