Apache Spark – How to set Spark application exit status

Tags: apache-spark, exit-code, hadoop-yarn, spark-submit

I'm writing a Spark application and running it with the spark-submit shell script (in yarn-cluster/yarn-client mode).

As far as I can see, the exit code of spark-submit is determined by the final status of the corresponding YARN application: 0 if the status is SUCCEEDED, 1 otherwise.

I want the option to return a different exit code, for the case where my application succeeds but with some errors.

Is it possible to return a different exit code from the application?

I tried to use System.exit(), but it didn't work…

Thanks.

Best Answer

It is possible in client mode, but not in cluster mode. In yarn-client mode the driver runs inside the spark-submit JVM, so an exit code set by the driver (for example via System.exit()) becomes the exit code of spark-submit itself; in yarn-cluster mode the driver runs in the Application Master on the cluster, and spark-submit only reports 0 or 1 based on the YARN application's final status. For cluster mode there is a workaround.

My answer to this question should help you.
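For illustration only (this is not the linked answer), here is a minimal sketch of the client-mode case; the object name and the exit value 3 are arbitrary choices, and the only assumption is that the driver calls System.exit() after stopping the SparkSession:

```scala
import org.apache.spark.sql.SparkSession

object ExitStatusExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ExitStatusExample").getOrCreate()

    // Placeholder for the real job; suppose it finished but hit some recoverable errors.
    val succeededWithErrors = true

    // Stop Spark cleanly so the YARN application finishes normally before we exit.
    spark.stop()

    // Arbitrary convention: 0 = clean success, 3 = succeeded with errors.
    System.exit(if (succeededWithErrors) 3 else 0)
  }
}
```

Submitted with --deploy-mode client, the shell's $? after spark-submit would then be 3; with --deploy-mode cluster the same code does not change spark-submit's exit code, which is why a workaround is needed there.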
