ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped

Asked by qrjkbowd on 2021-05-27, tagged Spark

I installed Spark and set all the environment variables for Spark and Python correctly, as described on stackoverflow, but I still get the following warning when starting Spark:

20/09/06 13:33:52 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped

yizd12fk1#

I faced the same issue. The cause was that the environment variables were not set correctly. Previously, all I did when the warning appeared was press Enter, and execution continued from the next line without any problem. That was only a temporary workaround, but it worked.
The other solution, which fixed it for me permanently, was to configure the environment/path variables correctly.

Steps:

1. Make sure the py4j zip file is available. My py4j was in the directory
   "C:\Spark\spark-3.0.1-bin-hadoop2.7\python\lib".
2. Now, go to Environment Variables.
3. Go to PATH under the user variables.
4. Click Edit and add the following entries as new paths:
   - %SPARK_HOME%\bin
   - %SPARK_HOME%\python
   - %SPARK_HOME%\python\lib\py4j-(version number)-src.zip — mine was 0.10.9,
     so I typed %SPARK_HOME%\python\lib\py4j-0.10.9-src.zip
5. Finally, add %PYTHONPATH%.

Click OK and save all the changes. Re-run the program and it should work fine.
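If you prefer to validate the same configuration from Python before (or instead of) editing the Windows dialogs, here is a minimal sketch. It assumes the same install directory and py4j version named in the steps above (C:\Spark\spark-3.0.1-bin-hadoop2.7, py4j 0.10.9); substitute your own paths. It points the current process at Spark's Python sources and the py4j zip, then checks that a local session starts.

    # Minimal sketch, assuming the paths from the steps above; adjust for your install.
    import os
    import sys

    spark_home = r"C:\Spark\spark-3.0.1-bin-hadoop2.7"
    py4j_zip = os.path.join(spark_home, "python", "lib", "py4j-0.10.9-src.zip")

    # Same effect as the SPARK_HOME / PYTHONPATH entries, but only for this process.
    os.environ["SPARK_HOME"] = spark_home
    sys.path.insert(0, os.path.join(spark_home, "python"))
    sys.path.insert(0, py4j_zip)

    from pyspark.sql import SparkSession

    # If the paths are wrong, the import or the session creation below will fail,
    # which tells you the environment variables still need fixing.
    spark = SparkSession.builder.master("local[*]").appName("env-check").getOrCreate()
    print(spark.version)
    spark.stop()

If this snippet runs cleanly but the warning still appears when launching Spark normally, the permanent environment variables (rather than the per-process ones set here) are what still need to be corrected.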
