Spark Code Tuning (Part 1)
Under extremely poor cluster conditions, the following code was executed:

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{DataFrame, Row, SQLContext}
import org.apache.spark.sql.hive.HiveContext

val sqlContext = new HiveContext(sc)
val sql = sqlContext.sql("select * from ysylbs9 ").collect

Partway through, the job failed with:

cluster.YarnScheduler: Lost executor 2 on zdbdsps025.iccc.com: Container marked as failed: container_e55_1478671093534_0624_01_000003 on host: zdbdsps025.iccc.com. Exit status: 143. Diagnostics: Container killed ...
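
Exit status 143 means YARN terminated the container with SIGTERM, which in practice most often happens when an executor exceeds the memory it was allocated; collecting an entire table back to the driver only adds to the pressure. As a rough sketch of possible mitigations (the memory values, the app name, and the count/limit usage below are illustrative assumptions, not part of the original example), one could give each container more headroom and avoid pulling every row to the driver:

// If containers are being killed for exceeding memory, request more headroom
// up front (values are illustrative; tune them to the actual cluster).
// In spark-shell the same settings would be passed at launch, e.g.
//   --executor-memory 4g --conf spark.yarn.executor.memoryOverhead=1024
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf()
  .setAppName("ysylbs9-query")                        // hypothetical app name
  .set("spark.executor.memory", "4g")                 // heap per executor
  .set("spark.yarn.executor.memoryOverhead", "1024")  // extra off-heap MB per container (Spark 1.x property)
val sc = new SparkContext(conf)
val sqlContext = new HiveContext(sc)

// Avoid collect() on the whole table: keep the heavy work on the executors
// and bring back only a bounded amount of data for inspection.
val df = sqlContext.sql("select * from ysylbs9")
val total = df.count()                 // aggregation runs on the executors
val preview = df.limit(100).collect()  // only a small, bounded result reaches the driver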

