spark-shell fails to start with "not found: value sqlContext" (spark-1.6.1-bin-hadoop2.6 version)

k97glaaz · posted 2021-06-26 · in Hive

I installed this Spark version: spark-1.6.1-bin-hadoop2.6.tgz.
Now when I start it with the ./spark-shell command I get this problem (it prints a lot of error lines, so I have only included the ones that seem important):

Cleanup action completed
        16/03/27 00:19:35 ERROR Schema: Failed initialising database.
        Failed to create database 'metastore_db', see the next exception for details.
        org.datanucleus.exceptions.NucleusDataStoreException: Failed to create database 'metastore_db', see the next exception for details.
            at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:516)

        Caused by: java.sql.SQLException: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.
            org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
            ... 128 more
        Caused by: ERROR XBM0H: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.

        Nested Throwables StackTrace:
        java.sql.SQLException: Failed to create database 'metastore_db', see the next exception for details.
  org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
            ... 128 more
        Caused by: ERROR XBM0H: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.
            at org.apache.derby.iapi.error.StandardException.newException

        Caused by: java.sql.SQLException: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.
            at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
            at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
            at 
            ... 128 more

        <console>:16: error: not found: value sqlContext
                 import sqlContext.implicits._
                        ^
        <console>:16: error: not found: value sqlContext
                 import sqlContext.sql
                        ^

        scala>

I have tried some configuration changes to fix this, taken from other questions about the "not found: value sqlContext" problem, for example:
/etc/hosts file:

127.0.0.1   hadoophost localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
10.2.0.15   hadoophost
`echo $HOSTNAME` returns:
hadoophost
The .bashrc file contains:

export SPARK_LOCAL_IP=127.0.0.1

But it is still not working. Can you help me understand why Spark is not starting correctly?
hive-default.xml.template (the question also quoted the beginning of this file, which is just the standard Apache License 2.0 header)

In the home folder I get the same problem:

[hadoopadmin@hadoop home]$ pwd
/home
[hadoopadmin@hadoop home]$

Folder permissions:

[hadoopdadmin@hadoop spark-1.6.1-bin-hadoop2.6]$ ls -la
total 1416
drwxr-xr-x. 12 hadoop hadoop 4096 .
drwxr-xr-x. 16 root root 4096 ..
drwxr-xr-x. 2 hadoop hadoop 4096 bin
-rw-r--r--. 1 hadoop hadoop 1343562 CHANGES.txt
drwxr-xr-x. 2 hadoop hadoop 4096 conf
drwxr-xr-x. 3 hadoop hadoop 4096 data
drwxr-xr-x. 3 hadoop hadoop 4096 ec2
drwxr-xr-x. 3 hadoop hadoop 4096 examples
drwxr-xr-x. 2 hadoop hadoop 4096 lib
-rw-r--r--. 1 hadoop hadoop 17352 LICENSE
drwxr-xr-x. 2 hadoop hadoop 4096 licenses
-rw-r--r--. 1 hadoop hadoop 23529 NOTICE
drwxr-xr-x. 6 hadoop hadoop 4096 python
drwxr-xr-x. 3 hadoop hadoop 4096 R
-rw-r--r--. 1 hadoop hadoop 3359 README.md
-rw-r--r--. 1 hadoop hadoop 120 RELEASE
drwxr-xr-x. 2 hadoop hadoop 4096 sbin
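One detail worth noting in this listing: everything under the Spark directory is owned by hadoop, while the prompts above show the shell running as hadoopadmin. A minimal way to check whether the current user can actually write where Derby tries to create metastore_db (a hedged sketch; the path is taken from the error message above):

```
# Who is running the shell, and who owns the directory Derby writes to?
whoami
ls -ld /usr/local/spark-1.6.1-bin-hadoop2.6/bin

# Try to create a file there; a failure reproduces the cause of the XBM0H error.
touch /usr/local/spark-1.6.1-bin-hadoop2.6/bin/.write_test \
  && echo "writable" \
  || echo "not writable"
rm -f /usr/local/spark-1.6.1-bin-hadoop2.6/bin/.write_test
```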

zbwhf8kr 1#

You are using a Spark build with Hive support.
There are two possible solutions, depending on what you want to do later in your spark-shell or in your Spark jobs:
1. You want to access Hive tables from your Hadoop + Hive installation. In that case, place hive-site.xml in the conf subdirectory of your Spark installation, taking it from your existing Hive installation; for example, in my Cloudera VM hive-site.xml is at /usr/lib/hive/conf. After this step, launching spark-shell will connect to the existing Hive metastore and will not try to create a temporary metastore_db database in the current working directory (see the command sketch after these two options).
2. You do not want to access Hive tables from a Hadoop + Hive installation. If you do not care about connecting to Hive tables, you can follow Alberto's solution: fix the permission problem in the directory from which you launch spark-shell, and make sure you are allowed to create directories/files there.
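A minimal command sketch of option 1, assuming an existing Hive installation with hive-site.xml at the Cloudera-style path mentioned above (adjust both paths for your own setup):

```
# Copy hive-site.xml from the existing Hive installation into Spark's conf directory.
cp /usr/lib/hive/conf/hive-site.xml /usr/local/spark-1.6.1-bin-hadoop2.6/conf/

# Start the shell again; it should now use the existing Hive metastore instead of
# trying to create a local metastore_db in the current working directory.
/usr/local/spark-1.6.1-bin-hadoop2.6/bin/spark-shell
```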
Hope this helps.

chy5wohz 2#

Evidently you do not have permission to write in that directory. I suggest you run ./spark-shell from your HOME directory (you may want to add that command to your PATH), or from any other directory your user can access and write to.
This may also be relevant if you use notebooks along with Spark.
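A hedged sketch of this suggestion (the SPARK_HOME variable and the exact lines added to ~/.bashrc are illustrative, not from the original answer):

```
# Put Spark's bin directory on the PATH so spark-shell can be started from anywhere.
export SPARK_HOME=/usr/local/spark-1.6.1-bin-hadoop2.6
echo "export PATH=\$PATH:$SPARK_HOME/bin" >> ~/.bashrc
source ~/.bashrc

# Run from a directory the user can write to; Derby will create metastore_db here.
cd "$HOME"
spark-shell
```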
