PySpark 2.4.5 is incompatible with Python 3.8.3. How can I fix this?

2admgd59 posted on 2021-05-29 in Spark

Code

from pyspark import SparkContext, SparkConf

conf = SparkConf().setMaster('local').setAppName('Test App')
# Pass the config by keyword: SparkContext's first positional
# parameter is `master`, not `conf`.
sc = SparkContext(conf=conf)

Error message

Traceback (most recent call last):
      File "C:\Users\Test\PycharmProjects\python-test\MainFile.py", line 5, in <module>
        from pyspark import SparkContext,SparkConf
      File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\__init__.py", line 51, in <module>
        from pyspark.context import SparkContext
      File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\context.py", line 31, in <module>
        from pyspark import accumulators
      File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\accumulators.py", line 97, in <module>
        from pyspark.serializers import read_int, PickleSerializer
      File "C:\Test\Python_3.8.3_Latest\lib\sit`enter code here`e-packages\pyspark\serializers.py", line 72, in <module>
        from pyspark import cloudpickle
      File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\cloudpickle.py", line 145, in <module>
        _cell_set_template_code = _make_cell_set_template_code()
      File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\cloudpickle.py", line 126, in _make_cell_set_template_code
        return types.CodeType(
    TypeError: an integer is required (got type bytes)

2cmtqfgy1#

Although the latest Spark docs say it supports Python 2.7+/3.4+, it does not actually support Python 3.8 yet. According to this PR, Spark 3.0 will support Python 3.8. So you can either try out the Spark 3.0 preview (assuming you are not planning a production deployment), or fall back to Python 3.6/3.7 "for now" with Spark 2.4.x.
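The TypeError at the bottom of the traceback is the telltale symptom: PySpark 2.4.x bundles a cloudpickle that calls types.CodeType() with the pre-3.8 constructor signature, and Python 3.8 added a new leading posonlyargcount parameter, which shifts every positional argument by one. As a minimal sketch, while you stay on 2.4.x you could guard the import up front (the version bound reflects this answer's advice, not an official support matrix):

import sys

# PySpark 2.4.x's bundled cloudpickle was written against the
# pre-3.8 types.CodeType signature, so importing it on Python 3.8
# fails with "TypeError: an integer is required (got type bytes)".
if sys.version_info >= (3, 8):
    raise RuntimeError(
        "PySpark 2.4.x cannot run on Python %d.%d; "
        "use Python 3.6/3.7 or move to Spark 3.0+" % sys.version_info[:2]
    )

from pyspark import SparkContext, SparkConf

If switching environments is an option, either path above works: pip install "pyspark>=3.0.0" to stay on Python 3.8 (a preview release at the time of this answer), or a fresh Python 3.6/3.7 environment with pip install pyspark==2.4.5.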
