Apache Spark User List - Setting executor memory when using spark-shell
I set executor memory on a standalone cluster when launching spark-shell like this:
./bin/spark-shell --master $MASTER --total-executor-cores $CORES_ACROSS_CLUSTER --driver-java-options "-Dspark.executor.memory=$MEMORY_PER_EXECUTOR"
It doesn't seem particularly clean, but it works.
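A cleaner route, assuming Spark 1.0 or later: spark-shell accepts the dedicated `--executor-memory` flag, and the generic `--conf` flag can set `spark.executor.memory` directly, so the `--driver-java-options` workaround isn't needed. A sketch with hypothetical values standing in for `$MASTER`, `$CORES_ACROSS_CLUSTER`, and `$MEMORY_PER_EXECUTOR`:

```shell
# Hypothetical values standing in for the variables in the post above.
MASTER="spark://master-host:7077"
CORES_ACROSS_CLUSTER=8
MEMORY_PER_EXECUTOR=4g

# Option 1: the dedicated spark-shell flag.
CMD1="./bin/spark-shell --master $MASTER \
  --total-executor-cores $CORES_ACROSS_CLUSTER \
  --executor-memory $MEMORY_PER_EXECUTOR"

# Option 2: set the underlying property directly via --conf.
CMD2="./bin/spark-shell --master $MASTER \
  --total-executor-cores $CORES_ACROSS_CLUSTER \
  --conf spark.executor.memory=$MEMORY_PER_EXECUTOR"

echo "$CMD1"
echo "$CMD2"
```

Both forms set the same `spark.executor.memory` property; the flag is just shorthand, and either one keeps the JVM system-property plumbing out of the command line.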