Python – Encountering "WARN ProcfsMetricsGetter: Exception when trying to compute pagesize" error when running Spark

apache-spark, pyspark, python

I installed Spark and, when trying to run it, I am getting this error:

    WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped

Can someone help me with that?

Best Answer

I received this same message running Spark 3.0.1 on Windows 10 with Scala 2.12.10. It is not actually an error, in the sense that it ends your program's execution; it's a warning related to /proc file systems on Linux machines.

If you are also on a Windows machine, the answer may be, to quote Wing Yew Poon @ Apache: "The warning happened because the command "getconf PAGESIZE" was run and it is not a valid command on Windows so an exception was caught." (From the Spark JIRA issue tracking this warning.)
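You can reproduce the failing call yourself to see why the exception occurs. The sketch below is only a rough Python approximation of what Spark does internally (the real ProcfsMetricsGetter is Scala code): on Linux it prints the page size, while on Windows the getconf binary does not exist and the call raises an exception, which Spark catches and reports as this warning.

    import subprocess

    try:
        # Spark shells out to "getconf PAGESIZE" to learn the memory page size
        page_size = int(subprocess.check_output(["getconf", "PAGESIZE"]))
        print(page_size)  # typically 4096 on Linux
    except (OSError, subprocess.CalledProcessError) as exc:
        # On Windows there is no getconf binary, so this branch is taken;
        # Spark likewise catches the exception and just logs the warning
        print("getconf is not available here:", exc)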

If your program fails right after this warning appears, it is failing for some other reason. In my case, Spark was crashing with the following error right after the warning:

    20/11/13 12:41:51 ERROR MicroBatchExecution: Query [id = 32320bc7-d7ba-49b4-8a56-1166a4f2d6db, runId = d7cc93c2-41ef-4765-aecd-9cd453c25905] terminated with error
    org.apache.spark.SparkException: Job 1 cancelled because SparkContext was shut down

This warning can be hidden by setting spark.executor.processTreeMetrics.enabled to false. To quote Mr. Poon again, "it is a minor bug that you see this warning. But it can be safely ignored."
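In PySpark you can set that flag when building the session. A minimal sketch, assuming you create the session yourself (the app name here is just a placeholder):

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("my-app")  # placeholder app name
        # disable the ProcessTree metrics probe that triggers the warning
        .config("spark.executor.processTreeMetrics.enabled", "false")
        .getOrCreate()
    )

The same setting can also be passed on the command line, e.g. spark-submit --conf spark.executor.processTreeMetrics.enabled=false, or added to conf/spark-defaults.conf so it applies to every job.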