Together with spark-class, this amounts to a two-step execution: java -Xmx128m -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit ...
4) Append the arguments parsed from the spark-submit command line plus the main class org.apache.spark.deploy.SparkSubmit. ...
5) This yields a complete java command whose main class is SparkSubmit.
org.apache.spark.deploy.SparkSubmit: taking Spark on YARN as an example, the main logic is to sort out the arguments ... and submit the job to YARN:
org.apache.spark.deploy.SparkSubmit#main
org.apache.spark.deploy.SparkSubmit#doSubmit ...
org.apache.spark.deploy.SparkSubmit#parseArguments
org.apache.spark.deploy.SparkSubmit
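As a rough, self-contained sketch of that call chain (main -> doSubmit -> parseArguments -> submit): ParsedArgs below is a stand-in for Spark's real SparkSubmitArguments, and only --class and --master are handled, purely for illustration.

object SparkSubmitFlowSketch {
  // Stand-in for SparkSubmitArguments: just the fields this sketch needs.
  case class ParsedArgs(action: String, mainClass: String, master: String)

  def main(args: Array[String]): Unit = doSubmit(args)

  // Mirrors the main -> doSubmit -> parseArguments -> submit chain named above.
  def doSubmit(args: Array[String]): Unit = {
    val parsed = parseArguments(args)
    parsed.action match {
      case "SUBMIT" => submit(parsed)
      case other    => println(s"action $other not handled in this sketch")
    }
  }

  // Real parsing lives in SparkSubmitArguments; here we only pick out --class and --master.
  def parseArguments(args: Array[String]): ParsedArgs = {
    val opts = args.sliding(2, 2).collect { case Array(k, v) => k -> v }.toMap
    ParsedArgs(
      action = "SUBMIT",
      mainClass = opts.getOrElse("--class", "org.apache.spark.examples.SparkPi"),
      master = opts.getOrElse("--master", "yarn"))
  }

  // In Spark this is where the resolved main class is run (or handed off to YARN in cluster mode).
  def submit(parsed: ParsedArgs): Unit =
    println(s"submitting ${parsed.mainClass} with master ${parsed.master}")
}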
... (Utils.scala:1138)
... 31 more
Exception in thread "main" org.apache.spark.SparkException: Failed ...
... (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main
... :38)
at org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:120)
at org.apache.spark.streaming.DStreamGraph ...
... (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit ... :205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main ...
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit ...
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit ...
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit
... (QueryExecution.scala:48)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:63)
at org.apache.spark.sql.SparkSession.sql ...
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main
... (SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:855)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
at org.apache.spark.deploy.SparkSubmit.main
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main ...
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit ... :205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Approach to resolving the error
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main ...
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit ... :212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main
II. Source walkthrough of submitting an application in Standalone mode
1. Launching the Driver:
org.apache.spark.launcher.Main
org.apache.spark.deploy.SparkSubmit ...
org.apache.spark.deploy.rest.RestSubmissionClient
org.apache.spark.deploy.rest.StandaloneRestServer ...
org.apache.spark.deploy.master.Master
org.apache.spark.deploy.worker.Worker
org.apache.spark.deploy.worker.DriverRunner ...
2. Launching the Executors:
org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend
org.apache.spark.deploy.client.AppClient ...
org.apache.spark.deploy.master.Master
org.apache.spark.deploy.worker.Worker
org.apache.spark.deploy.worker.ExecutorRunner
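A purely conceptual sketch of the two hand-off chains listed above; the types and methods below are illustrative stand-ins, not Spark's real RPC messages or classes.

object StandaloneLaunchSketch {
  // Conceptual requests: driver launch (Client -> Master -> Worker -> DriverRunner)
  // and executor launch (SchedulerBackend/AppClient -> Master -> Worker -> ExecutorRunner).
  case class DriverRequest(mainClass: String)
  case class ExecutorRequest(appId: String, cores: Int)

  // The Master chooses Workers and forwards each request to them.
  def masterLaunchDriver(req: DriverRequest, workers: Seq[String]): Unit =
    println(s"${workers.head}: DriverRunner starts a JVM running ${req.mainClass}")

  def masterLaunchExecutors(req: ExecutorRequest, workers: Seq[String]): Unit =
    workers.foreach(w => println(s"$w: ExecutorRunner starts an executor for ${req.appId} with ${req.cores} cores"))

  def main(args: Array[String]): Unit = {
    val workers = Seq("worker-1", "worker-2")
    masterLaunchDriver(DriverRequest("org.apache.spark.examples.SparkPi"), workers) // step 1: driver
    masterLaunchExecutors(ExecutorRequest("app-0001", 2), workers)                  // step 2: executors
  }
}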
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala ...
... (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Running mvn clean deploy ... to push the jar to the private repository fails with: Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.8.2:deploy (default-deploy) on project xxx-xxx-xxx: Deployment failed: repository element was not specified in ...
First we connect to MySQL with the newer API and load the data to create a DataFrame:
import org.apache.spark.sql.DataFrame
import org.apache.spark. ...
... (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main
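A minimal sketch of the kind of JDBC load mentioned at the top of this excerpt, assuming Spark 2.x's SparkSession (in 1.x the equivalent entry point is sqlContext.read); the URL, table name, credentials and driver class are placeholders.

import org.apache.spark.sql.{DataFrame, SparkSession}

object JdbcLoadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("jdbc-load-sketch").master("local[*]").getOrCreate()

    // Load a MySQL table into a DataFrame through the generic JDBC data source.
    // The MySQL JDBC driver jar must be on the classpath (e.g. passed with --jars).
    val df: DataFrame = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://localhost:3306/testdb") // placeholder URL
      .option("dbtable", "some_table")                     // placeholder table
      .option("user", "user")                              // placeholder credentials
      .option("password", "password")
      .option("driver", "com.mysql.jdbc.Driver")
      .load()

    df.show()
    spark.stop()
  }
}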
CoarseGrainedExecutorBackend
1. Analysis of the bin/spark-submit launch script. The class it launches is org.apache.spark.deploy.SparkSubmit; the script ends with
exec "${SPARK_HOME}"/bin/spark-class org.apache.spark.deploy.SparkSubmit "$@"
and bin/spark-class in turn runs exec "${CMD ... , which expands to a java command along the lines of
... /jars/*:/opt/module/hadoop-2.7.2/etc/hadoop/ org.apache.spark.deploy.SparkSubmit --master ... /examples/jars/spark-examples_2.11-2.1.1.jar 100
2. Source analysis of org.apache.spark.deploy.SparkSubmit. SparkSubmit ...
/spark-yarn/conf/:/opt/module/spark-yarn/jars/*:/opt/module/hadoop-2.7.2/etc/hadoop/ -Xmx1g org.apache.spark.deploy.SparkSubmit
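The same two-step launch can also be driven from code through the spark-launcher module, the same module that provides org.apache.spark.launcher.Main. A sketch, reusing the example jar path from above; SPARK_HOME, the master/deploy mode (assumed yarn/cluster here) and the application argument are placeholders.

import org.apache.spark.launcher.SparkLauncher

object LauncherSketch {
  def main(args: Array[String]): Unit = {
    // Builds and spawns a spark-submit invocation equivalent to the command shown above.
    val process = new SparkLauncher()
      .setSparkHome("/opt/module/spark-yarn") // placeholder SPARK_HOME
      .setAppResource("/opt/module/spark-yarn/examples/jars/spark-examples_2.11-2.1.1.jar")
      .setMainClass("org.apache.spark.examples.SparkPi")
      .setMaster("yarn")
      .setDeployMode("cluster")
      .addAppArgs("100")
      .launch()
    process.waitFor() // wait for the spawned spark-submit process to finish
  }
}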
I. SparkSubmit submission
Last time we covered how Spark Standalone's Master and Worker start up; this time we start from a command that submits a Spark job and read the job-submission source code ...
/bin/spark-submit --class org.apache.spark.examples.SparkPi \ --master local \ --deploy-mode cluster ...
Inside the script, the last statement executed is:
exec "${SPARK_HOME}"/bin/spark-class org.apache.spark.deploy.SparkSubmit "$@"
What runs is ... the main work is to resolve, from the arguments, the main class to execute, childMainClass. Step into it and focus on where childMainClass is assigned: in standalone mode, when not using the REST form, the main class is org.apache.spark.deploy.ClientApp ... in yarn-cluster mode, the main class is org.apache.spark.deploy.yarn.YarnClusterApplication; below that there is similar handling for submission to Mesos and K8s.
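A simplified paraphrase of that childMainClass dispatch, covering only the two branches named above (standalone cluster without REST, and yarn-cluster); this is an illustrative sketch, not the actual body of Spark's prepareSubmitEnvironment, and the REST, Mesos and K8s branches are omitted.

object ChildMainClassSketch {
  def resolveChildMainClass(master: String, deployMode: String, userMainClass: String): String =
    (master, deployMode) match {
      case (m, "cluster") if m.startsWith("spark://") =>
        // standalone cluster mode, non-REST form
        "org.apache.spark.deploy.ClientApp"
      case ("yarn", "cluster") =>
        "org.apache.spark.deploy.yarn.YarnClusterApplication"
      case _ =>
        // client mode: SparkSubmit runs the user's main class directly in this JVM
        userMainClass
    }

  def main(args: Array[String]): Unit =
    println(resolveChildMainClass("yarn", "cluster", "org.apache.spark.examples.SparkPi"))
}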
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit ... :205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main ...
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit ... :205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main ...
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit
... (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:178)
at ...
... :98)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:220)
at org.apache.spark.rdd.RDD ... (RDD.scala:218)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1335)
at org.apache.spark.rdd.RDD.count ...
... (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:367)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:77)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NullPointerException
To deploy to Tomcat automatically from the command line, I added the Tomcat Maven plugin to the pom file; as a result, when Eclipse deployed the web project to Tomcat today it reported java.lang.NoSuchMethodException: org.apache.catalina.deploy.WebXml ...
<groupId>org.apache.tomcat.maven</groupId>
<artifactId>tomcat7-maven-plugin</artifactId>