Class SparkExecutionContext.SparkClusterConfig

java.lang.Object
  org.apache.sysds.runtime.controlprogram.context.SparkExecutionContext.SparkClusterConfig

Enclosing class:
SparkExecutionContext

public static class SparkExecutionContext.SparkClusterConfig extends Object
Captures relevant Spark cluster configuration properties, e.g., memory budgets and degree of parallelism. This configuration abstracts legacy (< Spark 1.6) and current configurations and provides a unified view.
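To illustrate the kind of abstraction this class provides, the sketch below derives a data-memory budget from an executor heap size the way Spark's unified memory manager (>= 1.6) does, i.e., heap times spark.memory.fraction. This is an assumption for illustration only; the helper name and formula are hypothetical and not SystemDS's actual internal computation.

```java
public class MemoryBudgetSketch {
    // Hypothetical helper: data-memory budget as executor heap bytes scaled
    // by the unified memory fraction (spark.memory.fraction, default 0.6
    // since Spark 1.6). Illustrative only; not the SystemDS implementation.
    static long dataMemoryBudget(long executorMemBytes, double memoryFraction) {
        return (long) (executorMemBytes * memoryFraction);
    }

    public static void main(String[] args) {
        long executorMem = 4L * 1024 * 1024 * 1024; // 4 GB executor heap
        System.out.println(dataMemoryBudget(executorMem, 0.6));
    }
}
```

Abstracting this computation behind getters like getDataMemoryBudget lets callers stay oblivious to whether the legacy (< 1.6) or unified memory model supplied the fractions.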
Field Summary
static long    RESERVED_SYSTEM_MEMORY_BYTES
Constructor Summary
SparkClusterConfig()
Method Summary
void    analyzeSparkConfiguation(org.apache.spark.SparkConf conf)
void    analyzeSparkConfiguationLegacy(org.apache.spark.SparkConf conf)
long    getBroadcastMemoryBudget()
long    getDataMemoryBudget(boolean min, boolean refresh)
int     getDefaultParallelism(boolean refresh)
int     getNumExecutors()
String  toString()
Field Detail

RESERVED_SYSTEM_MEMORY_BYTES
public static final long RESERVED_SYSTEM_MEMORY_BYTES
See Also:
Constant Field Values
Method Detail

getBroadcastMemoryBudget
public long getBroadcastMemoryBudget()

getDataMemoryBudget
public long getDataMemoryBudget(boolean min, boolean refresh)

getNumExecutors
public int getNumExecutors()

getDefaultParallelism
public int getDefaultParallelism(boolean refresh)

analyzeSparkConfiguationLegacy
public void analyzeSparkConfiguationLegacy(org.apache.spark.SparkConf conf)

analyzeSparkConfiguation
public void analyzeSparkConfiguation(org.apache.spark.SparkConf conf)
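A degree-of-parallelism value like the one returned by getDefaultParallelism is conventionally derived from the executor count and cores per executor, with a small floor, mirroring how Spark itself defaults spark.default.parallelism. The helper below is a hypothetical sketch of that convention, not the SystemDS implementation; the function name, the floor of 2, and the formula are assumptions.

```java
public class ParallelismSketch {
    // Hypothetical: default parallelism as executors * cores per executor,
    // floored at 2, mirroring Spark's typical spark.default.parallelism
    // derivation for cluster deployments. Not the SystemDS implementation.
    static int defaultParallelism(int numExecutors, int coresPerExecutor) {
        return Math.max(numExecutors * coresPerExecutor, 2);
    }

    public static void main(String[] args) {
        // e.g., 4 executors with 8 cores each
        System.out.println(defaultParallelism(4, 8));
    }
}
```

The refresh flag on getDefaultParallelism suggests the cached value can be recomputed against the live cluster, which matters under dynamic allocation where the executor count changes over time.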