Hive exception when initializing the schema with Derby

fhity93d asked on 2021-05-27 in Hadoop

I am trying to initialize Hive by following the instructions here (https://kontext.tech/column/hadoop/309/apache-hive-311-installation-on-windows-10-using-windows-subsystem-for-linux).
Every time I run it I get the exception below. I know something is off, because there appear to be two SLF4J bindings on the classpath, but until now that only produced a warning, and now everything seems to be going south... Any ideas are welcome, as usual.
myuser@my001-pc:~$ $HIVE_HOME/bin/schematool -dbType derby -initSchema
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/myuser/hadoop/apache-hive-3.1.2-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/myuser/hadoop/hadoop-3.3.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Illegal to have multiple roots (start tag in epilog?).
 at [row,col,system-id]: [28,2,"file:/home/myuser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml"]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3051)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2995)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2875)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1223)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1840)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1817)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
    at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:304)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:301)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: com.ctc.wstx.exc.WstxParsingException: Illegal to have multiple roots (start tag in epilog?).
 at [row,col,system-id]: [28,2,"file:/home/myuser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:475)
    at com.ctc.wstx.sr.BasicStreamReader.handleExtraRoot(BasicStreamReader.java:2242)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2156)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1183)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3347)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3141)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3034)
    ... 10 more
Exception in thread "Thread-1" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Illegal to have multiple roots (start tag in epilog?).
 at [row,col,system-id]: [28,2,"file:/home/myuser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml"]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3051)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2995)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2875)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1223)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1840)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1817)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145)
    at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102)
Caused by: com.ctc.wstx.exc.WstxParsingException: Illegal to have multiple roots (start tag in epilog?).
 at [row,col,system-id]: [28,2,"file:/home/myuser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:475)
    at com.ctc.wstx.sr.BasicStreamReader.handleExtraRoot(BasicStreamReader.java:2242)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2156)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1183)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3347)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3141)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3034)
    ... 9 more


2guxujil (Answer #1)

It looks like you have more than one <configuration> element defined in core-site.xml. Look at line 28 of your /home/MyUser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml. A well-formed XML document must have exactly one top-level ("root") element. For example, the following file is legal:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
   <property>
      <name>hadoop.tmp.dir</name>
      <value>/usr/local/Cellar/hadoop/hdfs/tmp</value>
      <description>A base for other temporary directories</description>
   </property>
   <property>
      <name>fs.default.name</name>
      <value>hdfs://localhost:8020</value>
   </property>
</configuration>

The following document, on the other hand, is illegal because the <configuration> element is defined twice:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
   <property>
      <name>hadoop.tmp.dir</name>
      <value>/usr/local/Cellar/hadoop/hdfs/tmp</value>
      <description>A base for other temporary directories</description>
   </property>
   <property>
      <name>fs.default.name</name>
      <value>hdfs://localhost:8020</value>
   </property>
</configuration>
<configuration>
</configuration>
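
Once the duplicate root is removed, you can confirm that the file is well-formed before re-running schematool. Below is a minimal sketch using Python's standard-library XML parser; the file path is copied from the stack trace and is only an assumption about your layout, and a tool such as xmllint --noout would flag the same problem.

import sys
import xml.etree.ElementTree as ET

# Path taken from the stack trace above; adjust if your installation differs.
path = "/home/myuser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml"

try:
    # A second top-level element makes the parser fail with an error like
    # "junk after document element: line 28, column 0".
    ET.parse(path)
    print(f"{path}: well-formed, single root")
except ET.ParseError as err:
    print(f"{path}: not well-formed -> {err}")
    sys.exit(1)

If the check passes, $HIVE_HOME/bin/schematool -dbType derby -initSchema should get past the Configuration.loadResource failure.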
