Reading a query from MariaDB with PySpark

8ehkhllq  ·  published 2021-05-27  ·  in Hadoop
Follow (0) | Answers (1) | Views (462)

This question already has an answer here:

In Apache Spark 2.0.0, is it possible to fetch a query from an external database (rather than the whole table)? (1 answer)
Closed 11 months ago.
I am trying to read the result of a query from MariaDB into a PySpark DataFrame. The jar I used is

--jars mariadb-java-client-2.2.2.jar

I am able to read a whole table:

df = spark.read.format("jdbc")\
        .option("url","jdbc:mariadb://xxx.xxx.xx.xx:xxxx/hdpms")\
        .option("driver", "org.mariadb.jdbc.Driver")\
        .option("dbtable", Mytable)\
        .option("user", "xxxxx_xxxxx")\
        .option("password", "xxxxx")\
        .load()

Now I am looking for a way to run a simple query, such as

SELECT col1,col2,col3,.. From MyTable Where date>2019 and cond2;

I tried passing the condition as the `dbtable` value, for example

"MyTable date>2019 and cond2 --"

since the connector wraps the value with `SELECT * FROM` at the start and `where 1=0` at the end, but I am facing the following error:

py4j.protocol.Py4JJavaError: An error occurred while calling o455.showString.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 3.0 failed 4 times, most recent failure: Lost task 0.3 in stage 3.0 (TID 12, xhadoopm3095p.aetna.com, executor 2): java.sql.SQLException: Value "DATE_CREATED" cannot be parse as Timestamp
        at org.mariadb.jdbc.internal.com.read.resultset.rowprotocol.TextRowProtocol.getInternalTimestamp(TextRowProtocol.java:592)
        at org.mariadb.jdbc.internal.com.read.resultset.SelectResultSet.getTimestamp(SelectResultSet.java:1178)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetter$11.apply(JdbcUtils.scala:439)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$makeGetter$11.apply(JdbcUtils.scala:438)

Can anyone help me with this? Thank you.


ibps3vxo #1

df = spark.read.format("jdbc")\
        .option("url","jdbc:mariadb://xxx.xxx.xx.xx:xxxx/hdpms")\
        .option("driver", "org.mariadb.jdbc.Driver")\
        .option("dbtable", "(SELECT col1,col2,col3,.. From MyTable Where date>2019 and cond2) tmp")\
        .option("user", "xxxxx_xxxxx")\
        .option("password", "xxxxx")\
        .load()

Wrap the query in parentheses and give it an alias so it can stand in for a table, and it works.
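The pattern above works because Spark's JDBC reader treats the `dbtable` value as something it can put after `SELECT * FROM ...`, so a parenthesized subquery with an alias is accepted just like a table name. A minimal sketch of a helper that applies this wrapping (the function name `as_subquery` is my own, not part of any library):

```python
def as_subquery(sql, alias="tmp"):
    """Wrap a SQL query so it is valid as a JDBC 'dbtable' value.

    Spark substitutes the value into 'SELECT * FROM <dbtable>', so the
    query must be parenthesized and aliased; a trailing semicolon would
    break the generated SQL, so it is stripped here.
    """
    return "({}) {}".format(sql.strip().rstrip(";"), alias)


# It would then be passed to the reader, e.g.:
# df = spark.read.format("jdbc") \
#         .option("dbtable", as_subquery("SELECT col1, col2 FROM MyTable WHERE date > 2019")) \
#         ...
```

Note that on Spark 2.4 and later, the JDBC source also accepts a `query` option directly, so the subquery alias trick is only needed on older versions.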
