When I try to save to a MySQL database through Spark, I get the following error:

    java.lang.RuntimeException: ... for details
        at scala.sys.package$.error(package.scala:27)
        at org.apache.spark.sql.sources.ResolvedDataSource$...(SQLContext.scala:10...)
        at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
        at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:209)
        at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter...)
        at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1121)
        at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1071)
        ...
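Note that saveAsTable writes into the catalog behind the SQLContext/SparkSession, not into an external MySQL database; for MySQL the JDBC writer is the usual route. Below is a minimal sketch; the session setup, URL, table name, and credentials are placeholders, not values from the question:

    import java.util.Properties
    import org.apache.spark.sql.{SaveMode, SparkSession}

    val spark = SparkSession.builder().appName("mysql-write-sketch").getOrCreate()
    val df = spark.range(5).toDF("id")  // stand-in for the real DataFrame

    // Hypothetical connection details -- replace with your own.
    val url = "jdbc:mysql://db-host:3306/schemaName"
    val props = new Properties()
    props.setProperty("user", "dbuser")
    props.setProperty("password", "dbpass")
    props.setProperty("driver", "com.mysql.jdbc.Driver")

    // Write through the JDBC data source, replacing the table's contents.
    df.write.mode(SaveMode.Overwrite).jdbc(url, "tableName", props)

The MySQL connector JAR also has to be visible to the driver and executors (e.g. passed via --jars), otherwise the driver class cannot be loaded.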
I am trying to save the data into a Hive table. The simplified code looks like this:

    df.write().mode("overwrite").saveAsTable("schemaName.tableName")

However, I am facing a gateway problem: some hosts cannot be resolved, even though they carry the public IPs of the EC2 instances. The relevant frames of the stack trace:

        at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:251)
        at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:444)
        at org.apache.spark.sql.DataFrameW...
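The unresolved-host part is environmental rather than an API problem: the driver typically has to resolve the internal hostnames that the NameNode and metastore advertise, which on EC2 may mean /etc/hosts entries mapping those names to the instances' IPs, or running the driver inside the same network. For the write itself, a Hive-enabled session plus the corrected call would look roughly like the sketch below (the session setup is an assumption, and the table name is the one from the snippet above):

    import org.apache.spark.sql.{SaveMode, SparkSession}

    // saveAsTable only reaches the Hive metastore when Hive support is enabled.
    val spark = SparkSession.builder()
      .appName("hive-write-sketch")
      .enableHiveSupport()
      .getOrCreate()

    val df = spark.range(5).toDF("id")  // stand-in for the real DataFrame

    // Corrected form of the call from the question (the original snippet
    // was missing its closing quote and parenthesis).
    df.write.mode(SaveMode.Overwrite).saveAsTable("schemaName.tableName")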
I have a strange error: I am trying to write data to Hive, and it works fine in spark-shell, but when I use spark-submit it throws a "Database/Table not found in default" error. Below is the code I am trying to run through spark-submit; I am using a custom-built Spark 2.0.0:

    val sqlContext = new org.apache.spark.sql.SQLContext(sc)

The driver log shows the parse step just before the failure:

    ...18 INFO SparkSqlParser: Parsing command: spark_schema.mea...
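In spark-shell the pre-built context is already wired to the Hive metastore, but a bare new SQLContext(sc) inside a submitted job uses the in-memory catalog, where spark_schema does not exist; that matches the "not found in database 'default'" symptom. A sketch of the Hive-enabled Spark 2.0 equivalent follows (the table name is a placeholder, since the real name is cut off in the log above):

    import org.apache.spark.sql.SparkSession

    // In Spark 2.0, a Hive-enabled SparkSession replaces SQLContext and
    // sees the same metastore that spark-shell does.
    val spark = SparkSession.builder()
      .appName("hive-read-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // Placeholder name: the real table name is truncated in the log.
    val df = spark.table("spark_schema.some_table")
    df.show()

The submitted job also commonly needs hive-site.xml on its classpath (e.g. in $SPARK_HOME/conf or shipped with --files) so that it points at the right metastore.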
I created a Spark SQL table by calling .saveAsTable on my DataFrame. The command completed successfully. However, when I now query the table, the Parquet files appear to be corrupted:

    java.io.IOException: hdfs://ip:8020/user/hive/warehouse/people/part-r-00001.parquet not a SequenceFile
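"not a SequenceFile" usually points at table metadata rather than at the Parquet files themselves: whatever is reading the table (for instance the Hive CLI, or a session without Hive/Parquet support) is falling back to Hive's default SequenceFile input format. One way to confirm the data is intact is to read the files directly, bypassing the table metadata; the path below is taken from the error message:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

    // Reading the directory as Parquet skips the table's input-format
    // metadata entirely; if this works, the files are not corrupted.
    val people = spark.read.parquet("hdfs://ip:8020/user/hive/warehouse/people")
    people.show()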