java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
    at ...$ofRef.foreach(ArrayOps.scala:186)
    at org.apache.spark.storage.DiskBlockManager...

However, the appattempt logs show a completely different exception, one related to IO/networking. My questions: should I trust the on-screen diagnostics or the appattempt logs? Did the IO exception cause the termination, or did running out of memory cause the IO exception seen in the appattempt logs? Is there another log/diagnostic I should be looking at? Thanks.
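The "Could not initialize class" wording is significant: the JVM raises it only on the second and later uses of a class whose static initializer has already failed; the first use fails with an ExceptionInInitializerError that carries the real cause. That suggests the earlier exception in the appattempt log is the one closer to the root cause. A minimal sketch (hypothetical names, not Spark code) reproducing the pattern:

    // First access to an object whose initializer throws yields
    // ExceptionInInitializerError with the real cause attached; any later
    // access yields the bare "NoClassDefFoundError: Could not initialize
    // class Broken$" with the cause gone.
    object Broken {
      val value: Int = throw new RuntimeException("the real root cause")
    }

    object InitDemo extends App {
      try Broken.value catch { case t: Throwable => println(s"first:  $t") }
      try Broken.value catch { case t: Throwable => println(s"second: $t") }
    }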
    at ...$.apply(DeltaLog.scala:740)
    at org.apache.spark.sql.delta.DeltaLog$.forTable(DeltaLog.scala:712)
    at org.apache.spark.sql.delta.sources.DeltaDataSource.createRelation(DeltaDataSource.scala:169)
    at org.apac...
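For orientation, the DeltaLog.forTable / DeltaDataSource.createRelation frames above sit on the path of an ordinary Delta read, so a failure there fires before any action runs. A minimal sketch of a read that exercises this path (the table path is a placeholder, and the Delta Lake package is assumed to be on the classpath):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("delta-read-sketch")
      .getOrCreate()

    // Loading a Delta table resolves its transaction log via DeltaLog.forTable,
    // the same entry point visible in the trace above.
    val df = spark.read.format("delta").load("/path/to/delta-table")
    df.show()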
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala...
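HiveUtils.newClientForMetadata builds Spark's isolated Hive metastore client, so this frame appears the first time a Hive-enabled session touches the metastore; class-loading problems with the Hive jars surface exactly here. A minimal sketch of the call sequence that reaches it:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("hive-client-sketch")
      .enableHiveSupport() // requires Hive classes on the classpath
      .getOrCreate()

    // The first metastore operation forces creation of the Hive client
    // (HiveUtils.newClientForMetadata in the trace above).
    spark.sql("SHOW DATABASES").show()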
")错误:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql$.resolveTable(JDBCRDD.scala:167) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelatio
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStreamat org.apache.spark.Logging$.<init>(Logging.scala:162)
at org.apache.spark</em
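A missing org/apache/hadoop/fs/FSDataInputStream while org.apache.spark.Logging initializes is the classic symptom of running a "without Hadoop" Spark build with no Hadoop jars on the classpath; on a cluster, the documented remedy is exporting SPARK_DIST_CLASSPATH from the output of `hadoop classpath`. For a self-built application jar, one hedged sbt sketch (versions are examples only and must match the cluster):

    // build.sbt: make Hadoop's client classes available alongside Spark.
    // Versions are illustrative; align them with the cluster installation.
    libraryDependencies ++= Seq(
      "org.apache.spark"  %% "spark-core"    % "1.6.3",
      "org.apache.hadoop" %  "hadoop-client" % "2.7.7"
    )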
My Spark Streaming program fails with the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/internal/...

What I found online suggests that org.apache.spark.Logging from older Spark versions became org.apache.spark.internal.Logging in newer versions...
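That reading is correct as far as it goes: org.apache.spark.Logging was public API in Spark 1.x and moved to the private org.apache.spark.internal.Logging in Spark 2.0, so a jar compiled against Spark 1.x (or pulling in an old transitive dependency such as a 1.x-era streaming connector) throws this on a newer cluster. The usual fix is aligning every Spark artifact with the cluster's version; a build.sbt sketch (the version number is an example):

    // build.sbt: compile against the cluster's Spark line and mark it
    // "provided" so the cluster's own jars are used at runtime.
    val sparkVersion = "3.3.1" // example; substitute the cluster's version

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"      % sparkVersion % "provided",
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
    )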
scala> 22/10/11 15:47:31 INFO org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered...