When submitting a Spark job, I want to pass the parameters as JSON strings.
For example:
spark2-submit --class com.iflytek.test.Jcseg_HiveDemo spark_hive.jar {"tablename":"dhzp","fields":["text1","text2"]} {"tablename":"dhzp111","fields":["text1_jcseg","text2_jcseg"]}
First argument: {"tablename":"dhzp","fields":["text1","text2"]}
Second argument: {"tablename":"dhzp111","fields":["text1_jcseg","text2_jcseg"]}
But the arguments the driver actually receives look like this:
tablename:dhzp fields:[text1 text2] tablename:dhzp111 fields:[text1_jcseg text2_jcseg]
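This splitting is the shell's doing, not Spark's: in bash, an unquoted token of the form {a,b} undergoes brace expansion, which splits on the unquoted commas and then removes the quotes before spark2-submit ever sees the arguments. A minimal sketch reproducing it without Spark (assumes bash is installed; `printf` merely stands in for the submitted job):

```shell
# Reproduce the splitting without Spark (assumes bash is available).
# printf "%s\n" stands in for the driver: it prints each argument it
# receives on its own line, so we can count how many actually arrived.
out=$(bash -c 'printf "%s\n" {"tablename":"dhzp","fields":["text1","text2"]}')
echo "$out"
```

The single JSON string arrives as three separate arguments: `tablename:dhzp`, `fields:[text1`, and `text2]`, which matches what the driver logged above.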
The shell did not pass each JSON string through as a single argument; it split them apart on the commas and stripped the double quotes. How can this be fixed? In general it takes two steps:
1. Wrap the whole JSON string in double quotes.
2. Escape every double quote inside the wrapped string with a backslash (\").
Like this:
spark2-submit --class com.iflytek.test.Jcseg_HiveDemo spark_hive.jar "{\"tablename\":\"dhzp\",\"fields\":[\"text1\",\"text2\"]}" "{\"tablename\":\"dhzp111\",\"fields\":[\"text1_jcseg\",\"text2_jcseg\"]}"
The driver then receives the arguments intact:
{"tablename":"dhzp","fields":["text1","text2"]} {"tablename":"dhzp111","fields":["text1_jcseg","text2_jcseg"]}
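The quoting rules can be checked without launching Spark at all. A minimal sketch, using a small shell function as a stand-in for the driver (the function name `show_args` is made up for illustration):

```shell
# show_args stands in for the submitted job: it records how many
# arguments it received and what the first one was.
show_args() {
  argc=$#
  first=$1
}

# Double quotes around the whole string, backslash-escaped quotes inside:
# the JSON goes through as ONE argument.
show_args "{\"tablename\":\"dhzp\",\"fields\":[\"text1\",\"text2\"]}"
echo "argc=$argc"
echo "first=$first"

# Alternative: single quotes need no escaping at all, because double
# quotes inside single quotes are taken literally by the shell.
show_args '{"tablename":"dhzp","fields":["text1","text2"]}'
echo "argc=$argc"
echo "first=$first"
```

Both calls report argc=1 with the JSON intact. When typing the command by hand, single quotes are usually the less error-prone choice; the backslash-escaping form is still needed when the command itself sits inside another double-quoted layer (for example, when it is assembled in a script or passed through ssh).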