// Read data from HBase using Spark:
jsc.newAPIHadoopRDD(conf, TableInputFormat.class, ImmutableBytesWritable.class, Result.class);
The problem is with the newAPIHadoopRDD method; the compiler reports "Bound mismatch: The generic method newAPIHadoopRDD ..."
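I can't see the imports in that snippet, but a common cause of exactly this bound mismatch is picking up the old-API TableInputFormat: newAPIHadoopRDD only accepts input formats that extend the new org.apache.hadoop.mapreduce.InputFormat, so the import has to be the mapreduce one (this is a guess until the imports are confirmed):

// Satisfies F <: org.apache.hadoop.mapreduce.InputFormat[K, V]:
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
// The old-API class with the same simple name does NOT satisfy the bound:
//   org.apache.hadoop.hbase.mapred.TableInputFormat

A fuller, configured read example is sketched after the stack trace below.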
My code in Scala:
val documents = sc.newAPIHadoopRDD(...)

It fails to compile with:

type arguments [Object,org.bson.BSONObject,com.mongodb.hadoop.mapred.BSONFileInputFormat] do not conform to method newAPIHadoopRDD's type parameter bounds [K,V,F <: org.apache.hadoop.mapreduce.InputFormat[K,V]]
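The bound in that message is the whole story: com.mongodb.hadoop.mapred.BSONFileInputFormat implements the old org.apache.hadoop.mapred API, so it can never satisfy F <: org.apache.hadoop.mapreduce.InputFormat[K,V]. The connector also ships a new-API class, com.mongodb.hadoop.BSONFileInputFormat, and the mongo-hadoop documentation reads BSON dumps with it roughly as below (a sketch only; the app name and HDFS path are placeholders I made up):

import com.mongodb.hadoop.BSONFileInputFormat // new-API format, not the .mapred one
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat
import org.apache.spark.{SparkConf, SparkContext}
import org.bson.BSONObject

val sc = new SparkContext(new SparkConf().setAppName("ReadBsonDump"))

// Placeholder path to a mongodump .bson file on HDFS.
val documents = sc.newAPIHadoopFile(
  "hdfs:///dumps/mycollection.bson",
  classOf[BSONFileInputFormat].asSubclass(classOf[FileInputFormat[Object, BSONObject]]),
  classOf[Object],
  classOf[BSONObject])

documents.take(5).foreach { case (_, doc) => println(doc) }

If the goal is to read a live collection instead of a dump file, the same connector's com.mongodb.hadoop.MongoInputFormat works directly with newAPIHadoopRDD together with a "mongo.input.uri" entry in the Hadoop Configuration.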
At runtime the HBase read then fails with:

Exception in thread "main" java.lang.RuntimeException: java.lang.NullPointerException
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
    at org.apache.hado
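The NullPointerException inside ClientScanner is a runtime problem rather than a typing one, and the truncated trace doesn't show its cause; a frequent culprit (an assumption here, since the configuration isn't shown) is a scan created without the table name or without a reachable ZooKeeper quorum. Here is a minimal, self-contained Scala sketch of the HBase read with those keys spelled out; the quorum host and table name are placeholders to adapt:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("HBaseRead"))

// Start from HBaseConfiguration so an hbase-site.xml on the classpath is picked up.
val conf = HBaseConfiguration.create()
conf.set("hbase.zookeeper.quorum", "zk-host")        // placeholder: your ZooKeeper quorum
conf.set("hbase.zookeeper.property.clientPort", "2181")
conf.set(TableInputFormat.INPUT_TABLE, "my_table")   // placeholder: the table being scanned

val hbaseRDD = sc.newAPIHadoopRDD(
  conf,
  classOf[TableInputFormat],
  classOf[ImmutableBytesWritable],
  classOf[Result])

println(s"rows: ${hbaseRDD.count()}")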