Using Kafka (v2.11-0.10.1.0) with Spark Streaming (v2.0.1-bin-hadoop2.7).
I have a Kafka producer and a Spark Streaming consumer producing and consuming data. Everything works fine until I stop the consumer (for about 2 minutes) and then restart it. The consumer starts up and reads data perfectly well from that point on, but the 2 minutes of data produced while the consumer was down is lost. On restart, the consumer also logs this exception:
...from channel, socket has likely been closed.
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:607)
    at org.apache.spark.streaming.<
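For context, a direct stream keeps its offsets only in memory unless they are checkpointed or committed somewhere, so a restarted consumer with `auto.offset.reset` left at its default (`largest` in the 0.8 API) resumes from the log tail and skips anything produced while it was down. Below is a minimal sketch of a restart-tolerant consumer using the `spark-streaming-kafka` 0.8-style direct API that the stack trace above points at; the broker address, topic name, and checkpoint directory are hypothetical placeholders, not values from this question:

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object ResilientConsumer {
  // Hypothetical checkpoint path; must be durable storage (e.g. HDFS) in production.
  val checkpointDir = "/tmp/spark-checkpoint"

  def createContext(): StreamingContext = {
    val conf = new SparkConf().setAppName("kafka-direct-consumer")
    val ssc  = new StreamingContext(conf, Seconds(10))
    // Checkpointing persists the stream's offset ranges so a restart
    // resumes where the previous run stopped instead of at the log tail.
    ssc.checkpoint(checkpointDir)

    val kafkaParams = Map(
      "metadata.broker.list" -> "localhost:9092", // hypothetical broker
      // "smallest" replays from the earliest retained offset when no
      // checkpointed offsets exist; the 0.8 direct API defaults to
      // "largest", which skips data produced while the consumer was down.
      "auto.offset.reset" -> "smallest"
    )

    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, Set("my-topic")) // hypothetical topic
    stream.map(_._2).print()
    ssc
  }

  def main(args: Array[String]): Unit = {
    // On restart, rebuild the context (and its offsets) from the
    // checkpoint if one exists; otherwise create a fresh one.
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTermination()
  }
}
```

With this setup, stopping the consumer for a couple of minutes and restarting it should replay the gap from the checkpointed offsets, provided Kafka's retention has not expired those messages.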