How to Analyze the Startup Logs of a Spark Cluster

This article walks through how to analyze the startup logs of a Spark cluster. The content is fairly detailed; readers who are interested can use it as a reference, and hopefully it will be of some help.


Created by Wang, Jerry, last modified on Aug 24, 2015

added by Jerry:…
/root/devExpert/spark-1.4.1/sbin/../conf – ( Jerry: I haven't copied out the template into my own configuration file yet )
starting org.apache.spark.deploy.master.Master, logging to /root/devExpert/spark-1.4.1/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-NKGV50849583FV1.out
Jerry: location of the log file:

NKGV50849583FV1:~/devExpert/spark-1.4.1/logs # vi spark-root-org.apache.spark.deploy.master.Master-1-NKGV50849583FV1.out

added by Jerry: loading load-spark-env.sh !!!1
added by Jerry, number of Jars: 1
added by Jerry, launch_classpath: /root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar
added by Jerry,RUNNER:/usr/jdk1.7.0_79/bin/java
added by Jerry, printf argument list: org.apache.spark.deploy.master.Master --ip NKGV50849583FV1 --port 7077 --webui-port 8080
Jerry: this is default value
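These defaults (--ip set to the hostname, --port 7077, --webui-port 8080) are filled in by sbin/start-master.sh when nothing else is configured. A minimal sketch of overriding them through conf/spark-env.sh, using the standard Spark 1.x standalone variable names (the values shown are only illustrative):

# conf/spark-env.sh -- illustrative overrides for the standalone master
export SPARK_MASTER_IP=NKGV50849583FV1     # host/IP the master binds to (Spark 1.x name)
export SPARK_MASTER_PORT=7077              # master RPC port, default 7077
export SPARK_MASTER_WEBUI_PORT=8080        # master web UI port, default 8080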

Spark Command: /usr/jdk1.7.0_79/bin/java -cp /root/devExpert/spark-1.4.1/sbin/../conf/:/root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-core-3.2.10.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip NKGV50849583FV1 --port 7077 --webui-port 8080
========================================
added by Jerry, I am in if-else branch: /usr/jdk1.7.0_79/bin/java -cp /root/devExpert/spark-1.4.1/sbin/../conf/:/root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-core-3.2.10.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip NKGV50849583FV1 --port 7077 --webui-port 8080
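Note that the lines prefixed with "added by Jerry" are not printed by Spark itself; they come from echo statements inserted into the launch scripts (load-spark-env.sh and bin/spark-class). A rough sketch of the kind of tracing that would produce the output above, assuming the variable names used by bin/spark-class in Spark 1.4.1 (num_jars, LAUNCH_CLASSPATH, RUNNER):

# illustrative debug lines added to bin/spark-class (not part of stock Spark)
echo "added by Jerry, number of Jars: $num_jars"
echo "added by Jerry, launch_classpath: $LAUNCH_CLASSPATH"
echo "added by Jerry,RUNNER:$RUNNER"
echo "added by Jerry, printf argument list: $@"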
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/08/16 12:45:17 INFO Master: Registered signal handlers for [TERM, HUP, INT]
15/08/16 12:45:17 WARN Utils: Your hostname, NKGV50849583FV1 resolves to a loopback address: 127.0.0.1; using 10.128.184.131 instead (on interface eth0)
15/08/16 12:45:17 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Jerry: useful hint
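As the warning says, the hostname resolves to 127.0.0.1, so Spark falls back to the address of interface eth0 (10.128.184.131). To bind to a specific address instead, SPARK_LOCAL_IP can be set in conf/spark-env.sh, which is created from the template mentioned earlier. A minimal sketch (the address is illustrative; use the interface you actually want):

# create spark-env.sh from the shipped template (run once)
cp /root/devExpert/spark-1.4.1/conf/spark-env.sh.template /root/devExpert/spark-1.4.1/conf/spark-env.sh
# pin the address Spark binds to
echo 'export SPARK_LOCAL_IP=10.128.184.131' >> /root/devExpert/spark-1.4.1/conf/spark-env.sh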
15/08/16 12:45:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/16 12:45:18 INFO SecurityManager: Changing view acls to: root
15/08/16 12:45:18 INFO SecurityManager: Changing modify acls to: root
15/08/16 12:45:18 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/08/16 12:45:19 INFO Slf4jLogger: Slf4jLogger started
15/08/16 12:45:19 INFO Remoting: Starting remoting
15/08/16 12:45:19 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkMaster@NKGV50849583FV1:7077]
15/08/16 12:45:19 INFO Utils: Successfully started service 'sparkMaster' on port 7077.

15/08/16 12:45:19 INFO Utils: Successfully started service on port 6066.
15/08/16 12:45:19 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066
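Port 6066 is the standalone cluster's REST submission gateway (available since Spark 1.3); spark-submit can target it in cluster deploy mode as an alternative to the legacy 7077 endpoint. A hedged example using the bundled SparkPi class (the examples jar path is an assumption based on the build layout shown above):

# submit an example application through the REST gateway in cluster deploy mode
/root/devExpert/spark-1.4.1/bin/spark-submit \
  --master spark://NKGV50849583FV1:6066 \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  /root/devExpert/spark-1.4.1/examples/target/scala-2.10/spark-examples-1.4.1-hadoop2.4.0.jar 100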
15/08/16 12:45:19 INFO Master: Starting Spark master at spark://NKGV50849583FV1:7077
15/08/16 12:45:19 INFO Master: Running Spark version 1.4.1
15/08/16 12:45:19 INFO Utils: Successfully started service 'MasterUI' on port 8080.

15/08/16 12:45:19 INFO MasterWebUI: Started MasterWebUI at http://10.128.184.131:8080
15/08/16 12:45:20 INFO Master: I have been elected leader! New state: ALIVE - cool!
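With the master ALIVE, workers can now register against the master URL printed above (spark://NKGV50849583FV1:7077), and each registration shows up both in this log and on the web UI at http://10.128.184.131:8080. A minimal sketch of bringing up a worker on the same machine, assuming the 1.4-style start-slave.sh that takes the master URL as its argument:

# start a worker and point it at the running master
/root/devExpert/spark-1.4.1/sbin/start-slave.sh spark://NKGV50849583FV1:7077
# then look for a "Registering worker ..." line in the master log, or check the web UI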

That wraps up how to analyze the startup logs of a Spark cluster. Hopefully the material above is of some help and lets you learn a bit more. If you found the article useful, feel free to share it with others.

