How do I fix Sqoop if I hit a java.lang.NoClassDefFoundError during export?

vzgqcmou · posted 2021-07-13 in Hive
Follow (0) | Answers (0) | Views (297)

I am trying to export a Hive database table to a MySQL database table on an Amazon AWS cluster with the following command:

sqoop export --connect jdbc:mysql://database_hostname/universities --table 19_20 --username admin -P --export-dir '/final/hive/19_20'

I am exporting from the folder '/final/hive/19_20', which is the Hive output directory, into the table '19_20' of the MySQL database 'universities'.
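For reference, a rough sanity check of both ends of the export, assuming the standard HDFS and MySQL client CLIs are available on the cluster's master node (hostname and paths as in the command above):

# Assumption: run on the cluster master node where the hdfs and mysql clients are available.
# Confirm the export directory exists and peek at the field delimiter in the data files
# (Hive-written files often use the non-printing \001 delimiter rather than commas).
hdfs dfs -ls /final/hive/19_20
hdfs dfs -cat '/final/hive/19_20/*' | head -n 2

# Confirm the target table exists and that its column layout matches the exported files.
mysql -h database_hostname -u admin -p -e 'DESCRIBE `universities`.`19_20`;'

In response, I get: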

Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/share/aws/redshift/jdbc/redshift-jdbc42-1.2.37.1061.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
21/04/11 01:42:13 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Enter password:
21/04/11 01:42:18 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
21/04/11 01:42:18 INFO tool.CodeGenTool: Beginning code generation
21/04/11 01:42:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `19_20` AS t LIMIT 1
21/04/11 01:42:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `19_20` AS t LIMIT 1
21/04/11 01:42:19 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
/tmp/sqoop-hadoop/compile/8aac2b94e7d11dc02d064c8213465c05/_19_20.java:37: warning: Can't initialize javac processor due to (most likely) a class loader problem: java.lang.NoClassDefFoundError: com/sun/tools/javac/processing/JavacProcessingEnvironment
public class _19_20 extends SqoopRecord  implements DBWritable, Writable {
       ^
        at lombok.javac.apt.LombokProcessor.getJavacProcessingEnvironment(LombokProcessor.java:411)
        at lombok.javac.apt.LombokProcessor.init(LombokProcessor.java:91)
        at lombok.core.AnnotationProcessor$JavacDescriptor.want(AnnotationProcessor.java:124)
        at lombok.core.AnnotationProcessor.init(AnnotationProcessor.java:177)
        at lombok.launch.AnnotationProcessorHider$AnnotationProcessor.init(AnnotationProcessor.java:73)
        at com.sun.tools.javac.processing.JavacProcessingEnvironment$ProcessorState.<init>(JavacProcessingEnvironment.java:508)
        at com.sun.tools.javac.processing.JavacProcessingEnvironment$DiscoveredProcessors$ProcessorStateIterator.next(JavacProcessingEnvironment.java:605)
        at com.sun.tools.javac.processing.JavacProcessingEnvironment.discoverAndRunProcs(JavacProcessingEnvironment.java:698)
        at com.sun.tools.javac.processing.JavacProcessingEnvironment.access$1800(JavacProcessingEnvironment.java:91)
        at com.sun.tools.javac.processing.JavacProcessingEnvironment$Round.run(JavacProcessingEnvironment.java:1043)
        at com.sun.tools.javac.processing.JavacProcessingEnvironment.doProcessing(JavacProcessingEnvironment.java:1184)
        at com.sun.tools.javac.main.JavaCompiler.processAnnotations(JavaCompiler.java:1170)
        at com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:856)
        at com.sun.tools.javac.main.Main.compile(Main.java:523)
        at com.sun.tools.javac.api.JavacTaskImpl.doCall(JavacTaskImpl.java:129)
        at com.sun.tools.javac.api.JavacTaskImpl.call(JavacTaskImpl.java:138)
        at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:224)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:63)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
  Caused by: java.lang.ClassNotFoundException: com.sun.tools.javac.processing.JavacProcessingEnvironment
        at java.lang.ClassLoader.findClass(ClassLoader.java:523)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at lombok.launch.ShadowClassLoader.loadClass(ShadowClassLoader.java:530)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        ... 26 more
Note: /tmp/sqoop-hadoop/compile/8aac2b94e7d11dc02d064c8213465c05/_19_20.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
1 warning
21/04/11 01:42:24 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/8aac2b94e7d11dc02d064c8213465c05/19_20.jar
21/04/11 01:42:24 INFO mapreduce.ExportJobBase: Beginning export of 19_20
21/04/11 01:42:24 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
21/04/11 01:42:26 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
21/04/11 01:42:26 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/04/11 01:42:26 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
21/04/11 01:42:26 INFO client.RMProxy: Connecting to ResourceManager at ip-172-31-6-179.ec2.internal/172.31.6.179:8032
21/04/11 01:42:26 INFO client.AHSProxy: Connecting to Application History server at ip-172-31-6-179.ec2.internal/172.31.6.179:10200
21/04/11 01:42:28 INFO input.FileInputFormat: Total input files to process : 1
21/04/11 01:42:29 INFO input.FileInputFormat: Total input files to process : 1
21/04/11 01:42:29 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
21/04/11 01:42:29 INFO lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 3fb854bbfdabadafad1fa2cca072658fa097fd67]
21/04/11 01:42:29 INFO mapreduce.JobSubmitter: number of splits:4
21/04/11 01:42:29 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/04/11 01:42:29 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1618090360850_0017
21/04/11 01:42:29 INFO conf.Configuration: resource-types.xml not found
21/04/11 01:42:29 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
21/04/11 01:42:29 INFO resource.ResourceUtils: Adding resource type - name = memory-mb, units = Mi, type = COUNTABLE
21/04/11 01:42:29 INFO resource.ResourceUtils: Adding resource type - name = vcores, units = , type = COUNTABLE
21/04/11 01:42:29 INFO impl.YarnClientImpl: Submitted application application_1618090360850_0017
21/04/11 01:42:29 INFO mapreduce.Job: The url to track the job: http://ip-172-31-6-179.ec2.internal:20888/proxy/application_1618090360850_0017/
21/04/11 01:42:29 INFO mapreduce.Job: Running job: job_1618090360850_0017
21/04/11 01:42:37 INFO mapreduce.Job: Job job_1618090360850_0017 running in uber mode : false
21/04/11 01:42:37 INFO mapreduce.Job:  map 0% reduce 0%
21/04/11 01:43:00 INFO mapreduce.Job:  map 100% reduce 0%
21/04/11 01:43:01 INFO mapreduce.Job: Job job_1618090360850_0017 failed with state FAILED due to: Task failed task_1618090360850_0017_m_000002
Job failed as tasks failed. failedMaps:1 failedReduces:0

21/04/11 01:43:01 INFO mapreduce.Job: Counters: 12
        Job Counters
                Failed map tasks=3
                Killed map tasks=1
                Launched map tasks=4
                Data-local map tasks=4
                Total time spent by all maps in occupied slots (ms)=3779136
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=78732
                Total vcore-milliseconds taken by all map tasks=78732
                Total megabyte-milliseconds taken by all map tasks=120932352
        Map-Reduce Framework
                CPU time spent (ms)=0
                Physical memory (bytes) snapshot=0
                Virtual memory (bytes) snapshot=0
21/04/11 01:43:01 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
21/04/11 01:43:01 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 34.8867 seconds (0 bytes/sec)
21/04/11 01:43:01 INFO mapreduce.ExportJobBase: Exported 0 records.
21/04/11 01:43:01 ERROR mapreduce.ExportJobBase: Export job failed!
21/04/11 01:43:01 ERROR tool.ExportTool: Error during export:
Export job failed!
        at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:445)
        at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
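
Note that the java.lang.NoClassDefFoundError above is only a warning during code generation (the jar was still written and the job was submitted); the export itself fails in the map tasks, and the driver output only reports "Export job failed!" without the underlying cause. A rough sketch for pulling the failed task's container logs, assuming YARN log aggregation is enabled, using the application ID printed in the run above:

# Assumption: YARN log aggregation is enabled on the cluster.
yarn logs -applicationId application_1618090360850_0017 | grep -E -B 2 -A 20 'Exception|ERROR'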

Please let me know whether this can be fixed and, if so, how.
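
As for the NoClassDefFoundError warning itself: the stack trace shows Lombok's annotation processor failing to find com.sun.tools.javac.processing.JavacProcessingEnvironment, which on Java 8 ships in the JDK's tools.jar. That usually suggests Sqoop's code generation is compiling against a JRE-only Java, or that a lombok jar on the Hive/Sqoop classpath is being picked up unexpectedly. A hedged check (the paths below are typical locations, not confirmed for this cluster):

# Assumption: a typical Java 8 layout; adjust paths for your installation.
echo "$JAVA_HOME"
ls "$JAVA_HOME/lib/tools.jar"        # present on a full JDK 8, absent on a bare JRE
ls /usr/lib/sqoop/lib /usr/lib/hive/lib | grep -i lombok    # any lombok jar on the classpath?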

No answers yet.
