Class SparkExecutionContext.SparkClusterConfig

  • Enclosing class:
    SparkExecutionContext

    public static class SparkExecutionContext.SparkClusterConfig
    extends Object
    Captures relevant Spark cluster configuration properties, e.g., memory budgets and degree of parallelism. This configuration abstracts over legacy (< Spark 1.6) and current configurations and provides a unified view.
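The legacy/current distinction above can be illustrated with a small, self-contained sketch. The property names (`spark.executor.memory`, `spark.storage.memoryFraction`, `spark.memory.fraction`, `spark.memory.storageFraction`) are standard Spark configuration keys; the class name, helper methods, and the exact computation below are illustrative assumptions, not the actual SystemDS implementation:

```java
import java.util.Map;

public class ClusterConfigSketch {
    // Hypothetical helper: computes a per-executor storage-memory budget,
    // abstracting over the legacy (< Spark 1.6) and unified (>= 1.6)
    // memory models behind one method, as SparkClusterConfig does.
    static long storageBudgetBytes(Map<String, String> conf) {
        long executorMem = parseMemory(conf.getOrDefault("spark.executor.memory", "1g"));
        if (conf.containsKey("spark.storage.memoryFraction")) {
            // legacy model: a fixed storage fraction of the executor heap
            double frac = Double.parseDouble(conf.get("spark.storage.memoryFraction"));
            return (long) (executorMem * frac);
        }
        // unified model (Spark >= 1.6): usable fraction of the heap, of which
        // a part is protected for storage (defaults per Spark documentation)
        double memFrac = Double.parseDouble(conf.getOrDefault("spark.memory.fraction", "0.6"));
        double storFrac = Double.parseDouble(conf.getOrDefault("spark.memory.storageFraction", "0.5"));
        return (long) (executorMem * memFrac * storFrac);
    }

    // Parses Spark-style memory strings such as "512m" or "4g" into bytes.
    static long parseMemory(String s) {
        s = s.trim().toLowerCase();
        char unit = s.charAt(s.length() - 1);
        long mult = 1;
        switch (unit) {
            case 'k': mult = 1L << 10; break;
            case 'm': mult = 1L << 20; break;
            case 'g': mult = 1L << 30; break;
            case 't': mult = 1L << 40; break;
        }
        String num = Character.isDigit(unit) ? s : s.substring(0, s.length() - 1);
        return Long.parseLong(num) * mult;
    }

    public static void main(String[] args) {
        Map<String, String> legacy = Map.of(
            "spark.executor.memory", "4g",
            "spark.storage.memoryFraction", "0.5");
        Map<String, String> current = Map.of("spark.executor.memory", "4g");
        System.out.println(storageBudgetBytes(legacy));   // 4g * 0.5
        System.out.println(storageBudgetBytes(current));  // 4g * 0.6 * 0.5
    }
}
```

The point of such an abstraction is that callers like getBroadcastMemoryBudget() or getDataMemoryBudget(...) never need to know which memory model the cluster runs under.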
    • Constructor Detail

      • SparkClusterConfig

        public SparkClusterConfig()
    • Method Detail

      • getBroadcastMemoryBudget

        public long getBroadcastMemoryBudget()
      • getDataMemoryBudget

        public long getDataMemoryBudget(boolean min,
                                        boolean refresh)
      • getNumExecutors

        public int getNumExecutors()
      • getDefaultParallelism

        public int getDefaultParallelism(boolean refresh)
      • analyzeSparkConfiguationLegacy

        public void analyzeSparkConfiguationLegacy(org.apache.spark.SparkConf conf)
      • analyzeSparkConfiguation

        public void analyzeSparkConfiguation(org.apache.spark.SparkConf conf)
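A hedged sketch of how getNumExecutors() and getDefaultParallelism(boolean) might derive their values: the property names (`spark.default.parallelism`, `spark.executor.instances`, `spark.executor.cores`) are standard Spark keys, but the fallback logic, defaults, and method names here are illustrative assumptions rather than the actual SystemDS code:

```java
import java.util.Map;

public class ParallelismSketch {
    // Hypothetical derivation of the default degree of parallelism:
    // prefer an explicitly configured spark.default.parallelism, otherwise
    // fall back to executors * cores per executor.
    static int defaultParallelism(Map<String, String> conf) {
        if (conf.containsKey("spark.default.parallelism"))
            return Integer.parseInt(conf.get("spark.default.parallelism"));
        int numExecutors = Integer.parseInt(conf.getOrDefault("spark.executor.instances", "1"));
        int coresPerExecutor = Integer.parseInt(conf.getOrDefault("spark.executor.cores", "1"));
        return numExecutors * coresPerExecutor;
    }

    public static void main(String[] args) {
        Map<String, String> conf = Map.of(
            "spark.executor.instances", "10",
            "spark.executor.cores", "4");
        System.out.println(defaultParallelism(conf)); // 10 executors * 4 cores
    }
}
```

The `refresh` flag on the real getDefaultParallelism(boolean) suggests the value is cached and can be re-read from the live SparkConf; a sketch like the one above would simply be re-invoked on refresh.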