Spark 1.4: spark-shell fails during initialization
The error message is as follows:

15/11/03 16:48:15 INFO spark.SparkContext: Running Spark version 1.4.1
15/11/03 16:48:15 WARN spark.SparkConf: In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
15/11/03 16:48:15 WARN spark.SparkConf: SPARK_JAVA_OPTS was detected (set to '-verbose:gc -XX:-UseGCOverheadLimit -XX:+UseCompressedOops -XX:-PrintGCDetails -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError -XX:H...
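The second warning says the deprecated SPARK_JAVA_OPTS environment variable is still set, most likely in conf/spark-env.sh. The log is cut off before the actual failure, so the following is only a minimal sketch of the usual remedy: remove SPARK_JAVA_OPTS from spark-env.sh and move the JVM flags into conf/spark-defaults.conf using the per-role properties spark.driver.extraJavaOptions and spark.executor.extraJavaOptions. The flag values below are copied from the warning itself (the truncated trailing flag is omitted); whether every flag belongs on both the driver and the executors is an assumption to adjust for your setup.

# conf/spark-defaults.conf -- replaces the deprecated SPARK_JAVA_OPTS
# (flags taken from the warning above; the truncated -XX:H... flag is left out)
spark.driver.extraJavaOptions    -verbose:gc -XX:-UseGCOverheadLimit -XX:+UseCompressedOops -XX:-PrintGCDetails -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError
spark.executor.extraJavaOptions  -verbose:gc -XX:-UseGCOverheadLimit -XX:+UseCompressedOops -XX:-PrintGCDetails -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError

The first warning is informational: since Spark 1.0, any spark.local.dir setting is ignored in favor of the cluster manager's value, so scratch directories should be set via SPARK_LOCAL_DIRS (standalone/mesos) or LOCAL_DIRS (YARN) instead.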