Spark: Out of Memory During Workbook Execution

Problem

When a workbook is saved and run, workbook jobs that use Spark fail with out of memory (OOM) errors.

Cause

The Spark jobs launched for the workbook execution do not have enough memory available.

Solution

Out of memory errors can have many causes. If they occur, try one of the following configuration adjustments; a sketch of each option follows the list:

  • Set das.execution-framework to Tez, or
  • Set das.compute.column-metrics to false, or
  • Set das.execution-framework to SparkClient and increase Datameer memory in das-env.sh with export JAVA_OPTIONS="-Xmx2048m", or
  • Set das.execution-framework to SparkCluster and set spark.driver.memory to 2g.
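
A minimal sketch of each option is shown below. The file locations are assumptions based on a typical Datameer installation (Datameer properties in conf/das-common.properties or as workbook custom properties, JVM options in conf/das-env.sh); adjust them to wherever your deployment defines these settings.

    # Option 1: fall back to the Tez execution framework
    das.execution-framework=Tez

    # Option 2: disable column metrics computation to reduce memory pressure
    das.compute.column-metrics=false

    # Option 3: Spark in client mode, with a larger Datameer JVM heap
    das.execution-framework=SparkClient
    # in conf/das-env.sh:
    export JAVA_OPTIONS="-Xmx2048m"

    # Option 4: Spark in cluster mode, with a larger Spark driver heap
    das.execution-framework=SparkCluster
    spark.driver.memory=2g

Apply one option at a time and re-run the workbook, so it is clear which adjustment resolved the error.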