I installed Flume fresh today, and while using Flume to bridge Kafka and Hive I hit the following two errors:
Caused by: org.apache.hive.hcatalog.streaming.ConnectionError: HiveEndPoint{metaStoreUri='thrift://localhost:9083', database='db', table='student', partitionVals=}
(HiveWriter.java:99)
at org.apache.flume.sink.hive.HiveSink.getOrCreateWriter(HiveSink.java:346)
at org.apache.flume.sink.hive.HiveSink.drainOneBatch(HiveSink.java:297)
at org.apache.flume.sink.hive.HiveSink.process(HiveSink.java:254)
at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:67)
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
Taking the attitude that problems should be solved one at a time, in order, I first searched for Caused by: org.apache.hive.hcatalog.streaming.ConnectionError: HiveEndPoint{metaStoreUri='thrift://localhost:9083', database='db', table='student', partitionVals=}. Most of the Baidu results, however, said the table must be bucketed, transactions must be enabled, and the storage format must be ORC — so their problems were not quite the same as mine.
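For reference, the advice in those results does match the documented prerequisites of Hive's streaming ingest API: the target table must be bucketed, stored as ORC, and marked transactional. A minimal DDL sketch meeting those requirements — the column names are placeholder assumptions, only db.student comes from the error message:

```shell
# Hedged sketch: a table definition that Hive streaming ingest can write to.
# Columns (id, name) and the bucket count are assumptions for illustration.
ddl='CREATE TABLE db.student (id INT, name STRING)
CLUSTERED BY (id) INTO 2 BUCKETS
STORED AS ORC
TBLPROPERTIES ("transactional"="true");'
echo "$ddl"
# Apply it with the hive CLI (needs a running metastore):  hive -e "$ddl"
```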
After getting nowhere with the first error, it occurred to me that the second one might be causing the first — a failed precondition — so I turned to the second error:
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
Ha — it turned out the guava jars under hive/lib and flume/lib were different versions. I had run into the same thing before, when the guava under hive did not match the one under hadoop while installing Hive. A pity it still cost me quite a chunk of time.
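The fix is to make both components load the same guava: delete the older jar from flume/lib and copy hive's newer one in. A minimal sketch of those steps, using temp directories to stand in for $HIVE_HOME/lib and $FLUME_HOME/lib — the version numbers shown are assumptions, check what your own installs actually ship:

```shell
# Stand-ins for the real lib directories of the two installs.
hive_lib=$(mktemp -d)
flume_lib=$(mktemp -d)
touch "$hive_lib/guava-27.0-jre.jar"    # what hive/lib might ship (assumed version)
touch "$flume_lib/guava-11.0.2.jar"     # what flume/lib might ship (assumed version)

# Remove flume's older guava, then copy hive's newer one over so both
# sides resolve Preconditions.checkArgument from the same class file.
rm "$flume_lib"/guava-*.jar
cp "$hive_lib"/guava-*.jar "$flume_lib"/
ls "$flume_lib"
```

On the real installs the same two commands would run against $FLUME_HOME/lib and $HIVE_HOME/lib; restart the Flume agent afterwards so the new jar is picked up.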