Using Sqoop to import data from Postgres into Hive

rjjhvcjd posted on 2021-05-27 in Hadoop

I want to import data from Postgres into Hive, so I ran the following command:

sqoop import --connect jdbc:postgresql://localhost:5432/ --username postgres --password postgres --table users --hive-import --m 1

But the import fails with the following output:

Warning: /usr/local/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/local/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2366: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: invalid variable name
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2461: HADOOP_ORG.APACHE.SQOOP.SQOOP_OPTS: invalid variable name
2020-11-16 09:12:43,658 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
2020-11-16 09:12:43,711 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
2020-11-16 09:12:43,711 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
2020-11-16 09:12:43,711 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
2020-11-16 09:12:43,779 INFO manager.SqlManager: Using default fetchSize of 1000
2020-11-16 09:12:43,780 INFO tool.CodeGenTool: Beginning code generation
2020-11-16 09:12:43,981 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "users" AS t LIMIT 1
2020-11-16 09:12:44,009 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-hadoop/compile/1de46ca6c2305faed7095f3728a74afc/users.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
2020-11-16 09:12:44,665 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/1de46ca6c2305faed7095f3728a74afc/users.jar
2020-11-16 09:12:44,747 WARN manager.PostgresqlManager: It looks like you are importing from postgresql.
2020-11-16 09:12:44,747 WARN manager.PostgresqlManager: This transfer can be faster! Use the --direct
2020-11-16 09:12:44,747 WARN manager.PostgresqlManager: option to exercise a postgresql-specific fast path.
2020-11-16 09:12:44,751 INFO mapreduce.ImportJobBase: Beginning import of users
2020-11-16 09:12:44,751 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2020-11-16 09:12:44,820 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
2020-11-16 09:12:45,145 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
2020-11-16 09:12:45,205 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
2020-11-16 09:12:45,923 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/hadoop/.staging/job_1605504371417_0002
2020-11-16 09:12:46,471 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
[... the same SASL encryption trust check line repeats 40 more times, from 09:12:46 through 09:13:04 ...]
2020-11-16 09:13:05,203 INFO db.DBInputFormat: Using read commited transaction isolation
2020-11-16 09:13:06,286 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:06,675 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:07,163 INFO mapreduce.JobSubmitter: number of splits:1
2020-11-16 09:13:07,565 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-11-16 09:13:07,661 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1605504371417_0002
2020-11-16 09:13:07,661 INFO mapreduce.JobSubmitter: Executing with tokens: []
2020-11-16 09:13:07,858 INFO conf.Configuration: resource-types.xml not found
2020-11-16 09:13:07,858 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2020-11-16 09:13:07,926 INFO impl.YarnClientImpl: Submitted application application_1605504371417_0002
2020-11-16 09:13:07,968 INFO mapreduce.Job: The url to track the job: http://alim-VirtualBox:8088/proxy/application_1605504371417_0002/
2020-11-16 09:13:07,968 INFO mapreduce.Job: Running job: job_1605504371417_0002
2020-11-16 09:13:12,079 INFO mapreduce.Job: Job job_1605504371417_0002 running in uber mode : false
2020-11-16 09:13:12,082 INFO mapreduce.Job:  map 0% reduce 0%
2020-11-16 09:13:16,147 INFO mapreduce.Job:  map 100% reduce 0%
2020-11-16 09:13:19,246 INFO mapreduce.Job: Job job_1605504371417_0002 completed successfully
2020-11-16 09:13:19,306 INFO mapreduce.Job: Counters: 33
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=234905
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=87
        HDFS: Number of bytes written=54
        HDFS: Number of read operations=6
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
        HDFS: Number of bytes read erasure-coded=0
    Job Counters 
        Launched map tasks=1
        Other local map tasks=1
        Total time spent by all maps in occupied slots (ms)=2231
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=2231
        Total vcore-milliseconds taken by all map tasks=2231
        Total megabyte-milliseconds taken by all map tasks=2284544
    Map-Reduce Framework
        Map input records=3
        Map output records=3
        Input split bytes=87
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=26
        CPU time spent (ms)=770
        Physical memory (bytes) snapshot=215732224
        Virtual memory (bytes) snapshot=2561839104
        Total committed heap usage (bytes)=200802304
        Peak Map Physical memory (bytes)=215732224
        Peak Map Virtual memory (bytes)=2561839104
    File Input Format Counters 
        Bytes Read=0
    File Output Format Counters 
        Bytes Written=54
2020-11-16 09:13:19,309 INFO mapreduce.ImportJobBase: Transferred 54 bytes in 34.1584 seconds (1.5809 bytes/sec)
2020-11-16 09:13:19,316 INFO mapreduce.ImportJobBase: Retrieved 3 records.
2020-11-16 09:13:19,316 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table users
2020-11-16 09:13:19,343 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "users" AS t LIMIT 1
2020-11-16 09:13:19,353 INFO hive.HiveImport: Loading uploaded data into Hive
2020-11-16 09:13:19,360 INFO conf.HiveConf: Found configuration file file:/usr/local/hive/conf/hive-site.xml
2020-11-16 09:13:20,274 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
2020-11-16 09:13:20,274 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2020-11-16 09:13:20,274 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2020-11-16 09:13:20,274 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2020-11-16 09:13:20,277 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2020-11-16 09:13:21,685 INFO hive.HiveImport: Hive Session ID = c35a4fbf-8b8b-488c-838f-68711d017e49
2020-11-16 09:13:21,726 INFO hive.HiveImport: 
2020-11-16 09:13:21,727 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-3.1.2.jar!/hive-log4j2.properties Async: true
2020-11-16 09:15:05,415 INFO hive.HiveImport: FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
2020-11-16 09:15:58,418 ERROR tool.ImportTool: Import failed: java.io.IOException: Hive exited with status 64
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:384)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

What is going wrong here, and how can I fix this failure?

mfpqipee (answer 1)

I checked my PostgreSQL data types and changed the ones that Hive does not support.
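
For reference, a minimal sketch of how such a fix can be applied from the Sqoop side, without altering the source table: Sqoop's --map-column-hive option overrides the Hive type it would otherwise choose for a column. The column names below (id, created_at) are placeholders, not columns known to exist in this users table; substitute whichever of your columns carry types Hive rejects. -P is used in place of --password, as the log above itself recommends.

# Hypothetical sketch: re-run the import, forcing problematic columns
# (e.g. uuid or timestamptz) to be stored as STRING in Hive.
# "id" and "created_at" are placeholder column names -- adjust to your schema.
sqoop import \
  --connect jdbc:postgresql://localhost:5432/ \
  --username postgres \
  -P \
  --table users \
  --hive-import \
  --map-column-hive id=STRING,created_at=STRING \
  -m 1

Mapping to STRING is the safe fallback; where Hive has a narrower type matching the PostgreSQL column, that type can be used instead.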
