Compile error:
The method updateStateByKey(Function2<List<Integer>,Optional<S>,Optional<S>>) in the type JavaPairDStream<String,Integer> is not applicable for the arguments (Function2<List<Integer>,Optional<Integer>,Optional<Integer>>)
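This signature mismatch typically appears when the wrong Optional class is imported — com.google.common.base.Optional (Guava) instead of org.apache.spark.api.java.Optional, which is what JavaPairDStream.updateStateByKey expects in Spark 2.x. Below is a minimal sketch of the update function's logic (sum new counts onto the previous state), using java.util.Optional as a stand-in so it runs without a Spark dependency; the class and method names are illustrative:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional; // stand-in: in Spark code, import org.apache.spark.api.java.Optional

public class UpdateStateSketch {
    // Same shape as the Function2 passed to updateStateByKey:
    // (new values for a key in this batch, previous state) -> new state.
    static Optional<Integer> updateFunction(List<Integer> newValues, Optional<Integer> state) {
        int sum = state.orElse(0);          // start from the previous count, or 0
        for (Integer v : newValues) {
            sum += v;                        // add this batch's values
        }
        return Optional.of(sum);
    }

    public static void main(String[] args) {
        // 10 (previous state) + 1 + 2 + 3 = 16
        System.out.println(updateFunction(Arrays.asList(1, 2, 3), Optional.of(10)).get());
    }
}
```

In the real job, make sure every import of Optional in the file (including on the Function2 lambda's parameter types) resolves to org.apache.spark.api.java.Optional.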
I tried to use transform to join a JavaPairRDD with the RDDs inside a JavaPairDStream, but I get the following error:
incompatible types: no instance(s) of type variable(s) W exist so that org.apache.spark.api.java.JavaPairRDD<java.lang.String,scala.Tuple2<LogType,W>> conforms to …
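The "no instance(s) of type variable(s) W exist" message usually means javac cannot infer the value type W of the right-hand JavaPairRDD in the join — commonly because one side is typed as a raw type or the transform function's generic parameters are not declared explicitly (using transformToPair with explicit type arguments often resolves it). To make the intended semantics concrete, here is a plain-Java sketch of what join computes — an inner join by key producing (key, (left, right)) pairs; the method name and the use of SimpleEntry as a stand-in for scala.Tuple2 are illustrative assumptions:

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.HashMap;
import java.util.Map;

public class JoinSketch {
    // Inner-join two maps by key, like JavaPairRDD.join:
    // only keys present in both sides survive, paired as (leftValue, rightValue).
    static <K, V, W> Map<K, SimpleEntry<V, W>> join(Map<K, V> left, Map<K, W> right) {
        Map<K, SimpleEntry<V, W>> out = new HashMap<>();
        for (Map.Entry<K, V> e : left.entrySet()) {
            W w = right.get(e.getKey());
            if (w != null) {
                out.put(e.getKey(), new SimpleEntry<>(e.getValue(), w));
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = new HashMap<>();
        counts.put("a", 1);
        counts.put("c", 3);
        Map<String, String> labels = new HashMap<>();
        labels.put("a", "x");
        labels.put("b", "y");
        // Only key "a" is on both sides, so the result pairs 1 with "x".
        System.out.println(join(counts, labels));
    }
}
```

In the Spark version, giving both JavaPairRDDs fully written-out generic types (e.g. JavaPairRDD<String, LogType> and JavaPairRDD<String, W> with a concrete W) lets the compiler infer the Tuple2 result type.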
I am trying to learn Spark Streaming. My demo works fine when the master is "local[2]", but then it fails with: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. Note that I run the code from inside IDEA.

@Component
private static final String[]{"E:\\project\\spark-de
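On Windows, Spark's local mode still goes through Hadoop's native shims and looks for bin\winutils.exe under HADOOP_HOME or the hadoop.home.dir system property; when neither is set, Hadoop throws exactly this FileNotFoundException. One common workaround is to set the property in code before creating the SparkConf/StreamingContext — the C:\hadoop path below is a hypothetical example and must point at a directory that actually contains bin\winutils.exe:

```java
public class HadoopHomeFix {
    public static void main(String[] args) {
        // Hypothetical path: replace with the directory containing bin\winutils.exe.
        // Must run before any Spark/Hadoop class is initialized.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");

        // ... create SparkConf / JavaStreamingContext after this line ...
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Alternatively, set the HADOOP_HOME environment variable in the IDEA run configuration, which avoids hard-coding a path in the source.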