How do I fix this Spark context path problem? AnalysisException: Path does not exist: file:/opt/workspace/

x33g5p2x  posted on 2021-05-27  in Spark

I am running JupyterLab on macOS. Part of the code:

# Collect the matching object keys from the S3 bucket
new_list = []
for k in get_matching_s3_keys(bucket='cw-milenko-tests', prefix='Json_gzips/ticr_calculated_2', suffix='.gz'):
    new_list.append(k)

# Read each key as JSON (this is where the path error occurs)
dfs = [spark.read.json(file) for file in new_list]

# Print the number of fields in each schema (wrap map() in list() so it prints in Python 3)
print(list(map(lambda df: len(df.schema), dfs)))

I list the objects from S3 and save the keys to a list, and then I get this error:

AnalysisException: Path does not exist: file:/opt/workspace/Json_gzips/ticr_calculated_2_2020-05-27T00-01-21.json.gz;
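Note that the loop above only collects the object keys; nothing is actually downloaded, so Spark resolves each key as a local path under the driver's working directory, which is why the error points at file:/opt/workspace/. A hypothetical sketch of downloading the objects into /opt/workspace first, assuming boto3 and AWS credentials are available in the JupyterLab container (neither is shown above):

import os
import boto3  # assumption: boto3 is installed and credentials are configured

s3 = boto3.client('s3')
os.makedirs('/opt/workspace/Json_gzips', exist_ok=True)

for key in new_list:
    # e.g. key = 'Json_gzips/ticr_calculated_2_2020-05-27T00-01-21.json.gz'
    s3.download_file('cw-milenko-tests', key, f'/opt/workspace/{key}')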

This is the Spark cluster I used.

I run the Spark cluster on Docker with this repo (spark-cluster-on-docker).
How can I check whether the Docker containers are communicating with each other?

docker ps
CONTAINER ID        IMAGE                          COMMAND                  CREATED             STATUS              PORTS                                            NAMES
a01477cd9316        andreper/spark-worker:latest   "/bin/sh -c 'bin/spa…"   4 days ago          Up 3 hours          0.0.0.0:8082->8081/tcp                           spark-worker-2
f448de886c72        andreper/spark-worker:latest   "/bin/sh -c 'bin/spa…"   4 days ago          Up 3 hours          0.0.0.0:8081->8081/tcp                           spark-worker-1
5789c47ef46e        andreper/jupyterlab:latest     "/bin/sh -c 'jupyter…"   4 days ago          Up 3 hours          0.0.0.0:8888->8888/tcp                           jupyterlab
63e3d3c90ed6        andreper/spark-master:latest   "/bin/sh -c 'bin/spa…"   4 days ago          Up 3 hours          0.0.0.0:7077->7077/tcp, 0.0.0.0:8080->8080/tcp   spark-master

I inspected the mounts of the jupyterlab and spark-master containers:

milenko@Cloudwalkers-MacBook-Pro spark-cluster-on-docker % docker inspect -f '{{ .Mounts }}' 5789c47ef46e
[{volume hadoop-distributed-file-system /var/lib/docker/volumes/hadoop-distributed-file-system/_data /opt/workspace local rw true }]
milenko@Cloudwalkers-MacBook-Pro spark-cluster-on-docker % docker inspect -f '{{ .Mounts }}' 63e3d3c90ed6
[{volume hadoop-distributed-file-system /var/lib/docker/volumes/hadoop-distributed-file-system/_data /opt/workspace local rw true }]
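Both containers mount the same hadoop-distributed-file-system volume at /opt/workspace, so anything written there from the notebook is also visible on the master. A quick hypothetical check from the notebook that the path Spark complains about really does not exist yet:

import os

# the exact local path from the AnalysisException above
path = '/opt/workspace/Json_gzips/ticr_calculated_2_2020-05-27T00-01-21.json.gz'
print(os.path.exists(path))  # False until the object is downloaded or copied there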

How do I upload this file to the corresponding path in HDFS?


aydmsdu9 1#

You can add a file from local storage to HDFS with hdfs dfs -copyFromLocal /local/path/to.json /hdfs/path/to.json.
Alternatively, prefix the path with file:///path/to/your.json and check whether Spark can find it on the local file system.
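A minimal sketch of both options, assuming the file has already been copied into HDFS or into the shared /opt/workspace volume; the hdfs://namenode:9000 address is a placeholder, not the real address of this cluster:

# Option 1: after hdfs dfs -copyFromLocal ... /Json_gzips/, read it back with an explicit HDFS URI
df_hdfs = spark.read.json('hdfs://namenode:9000/Json_gzips/ticr_calculated_2_2020-05-27T00-01-21.json.gz')

# Option 2: keep the file in the shared volume and address it with the file:// scheme,
# so Spark reads from the local file system instead of the default one
df_local = spark.read.json('file:///opt/workspace/Json_gzips/ticr_calculated_2_2020-05-27T00-01-21.json.gz')
df_local.printSchema()

Note that with a file:// path every executor needs to see the same file, which here depends on the worker containers mounting the same /opt/workspace volume as the notebook and the master.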
