Apache Spark – "Could not bind on a random free port" error while trying to connect to the Spark master

amazon-ec2, apache-spark, pyspark, python-3.x

I have a Spark master running on Amazon EC2.
I tried to connect to it with PySpark from another EC2 instance as follows:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("MyApp") \
                            .master("spark_url_as_obtained_in_web_ui") \
                            .getOrCreate()

I got the following errors:

To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

2018-04-04 20:03:04 WARN Utils:66 - Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.

…………

java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.

I tried all the solutions described in the following posts, but to no avail:

  1. Connecting to a remote Spark master – Java / Scala

  2. All masters are unresponsive ! ? Spark master is not responding with datastax architecture

  3. Spark Standalone Cluster – Slave not connecting to Master

  4. Spark master-machine:7077 not reachable

  5. spark submit "Service 'Driver' could not bind on port" error

  6. https://community.hortonworks.com/questions/8257/how-can-i-resolve-it.html

What could be going wrong?

Best Answer

Set spark.driver.bindAddress to your local IP, for example 127.0.0.1:

pyspark --conf spark.driver.bindAddress=127.0.0.1
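
The same property can also be set from the SparkSession builder instead of the command line. A minimal sketch, reusing the placeholder master URL from the question:

from pyspark.sql import SparkSession

# Bind the driver service to the local address; the master URL below is the
# placeholder from the question, not a real address.
spark = SparkSession.builder.appName("MyApp") \
                            .master("spark_url_as_obtained_in_web_ui") \
                            .config("spark.driver.bindAddress", "127.0.0.1") \
                            .getOrCreate()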