A deep dive into memoryOverhead

A Spark SQL tuning experience: SparkException: Job aborted / spark.yarn.executor.memoryOverhead

Problem background: One day, a Spark SQL job failed with the following error:

    org.apache.spark.SparkException: Job aborted.
        at org.apache.spark.sql.execution.datasources.FileFormatWriter.write(FileFormatWriter.scala:1
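For context, here is a minimal sketch of how the setting named in the error is typically raised when building a Spark session. The memory sizes, app name, and output path are illustrative assumptions, not values from the original job:

    import org.apache.spark.sql.SparkSession

    object MemoryOverheadDemo {
      def main(args: Array[String]): Unit = {
        // On YARN, each executor container is sized as
        // spark.executor.memory (JVM heap) + memoryOverhead (off-heap).
        // If the container exceeds that total, YARN kills it and the
        // stage retries until the whole job aborts, as seen above.
        val spark = SparkSession.builder()
          .appName("memory-overhead-demo") // hypothetical app name
          .config("spark.executor.memory", "4g") // assumed heap size
          // Spark on YARN 2.2 and earlier uses this key; Spark 2.3+
          // renamed it to spark.executor.memoryOverhead. The value is
          // interpreted as MiB when no unit is given.
          .config("spark.yarn.executor.memoryOverhead", "2048")
          .getOrCreate()

        // A stand-in write workload; FileFormatWriter.write is the
        // code path that surfaced in the stack trace.
        spark.range(0, 1000000).write.mode("overwrite").parquet("/tmp/demo")
        spark.stop()
      }
    }

The same two settings can instead be passed on the command line via spark-submit --conf, which avoids hard-coding cluster sizing in application code.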