A simple Hadoop startup script
#!/bin/bash
# The next lines are for chkconfig on RedHat systems.
# chkconfig: 35 98 02
# description: Starts and stops hadoop Server
# author: 516249940@qq.com
# date: 2017-03-06
# The next lines are for chkconfig on SuSE systems.
# /etc/init.d/xxx
#
### BEGIN INIT INFO
# Provides:          xxx
# Required-Start:    $network $syslog
# Required-Stop:
# Default-Start:     2 3 5
# Default-Stop:      0 6
# Short-Description: Starts and stops hadoop Server
# Description:       Starts and stops hadoop Server
### END INIT INFO

HADOOP_SBIN="/usr/local/hadoop/sbin"
JAVA_ETC="/etc/alternatives"

case $1 in
start)
    $HADOOP_SBIN/start-dfs.sh
    $HADOOP_SBIN/start-yarn.sh
    $HADOOP_SBIN/mr-jobhistory-daemon.sh start historyserver
    echo "hadoop is started"
    ;;
stop)
    $HADOOP_SBIN/mr-jobhistory-daemon.sh stop historyserver
    $HADOOP_SBIN/stop-yarn.sh
    $HADOOP_SBIN/stop-dfs.sh
    echo "hadoop is stopped"
    ;;
restart|reload|force-reload)
    $HADOOP_SBIN/mr-jobhistory-daemon.sh stop historyserver
    $HADOOP_SBIN/stop-yarn.sh
    $HADOOP_SBIN/stop-dfs.sh
    echo "hadoop is stopped"
    sleep 3
    $HADOOP_SBIN/start-dfs.sh
    $HADOOP_SBIN/start-yarn.sh
    $HADOOP_SBIN/mr-jobhistory-daemon.sh start historyserver
    echo "hadoop is started"
    ;;
status)
    # Steps for checking status: list the running JVM processes
    $JAVA_ETC/jps
    ;;
*)
    echo "$0 {start|stop|restart|status}"
    exit 4
    ;;
esac
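The heart of the script is the `case $1 in ... esac` dispatch on the first positional argument, which is what lets `service hadoop start|stop|restart|status` map to the right sequence of daemon commands. Below is a minimal, runnable sketch of that same pattern with the Hadoop calls replaced by placeholder `echo` lines, so you can verify the dispatch logic on any machine without a Hadoop install; the function name `hadoop_ctl` and its messages are illustrative, not part of the original script.

```shell
#!/bin/bash
# Minimal sketch of the init script's case-based dispatch.
# The Hadoop sbin calls are stubbed out with echo so this runs anywhere.
hadoop_ctl() {
    case $1 in
    start)
        echo "hadoop is started"        # would run start-dfs.sh, start-yarn.sh, ...
        ;;
    stop)
        echo "hadoop is stopped"        # would run the stop scripts in reverse order
        ;;
    restart|reload|force-reload)
        echo "hadoop is restarted"      # stop everything, sleep, then start again
        ;;
    status)
        echo "status: jps would run here"
        ;;
    *)
        echo "usage: {start|stop|restart|status}"
        return 4                        # LSB: 4 = user had insufficient privilege / bad usage
        ;;
    esac
}

hadoop_ctl start
hadoop_ctl restart
hadoop_ctl bogus || echo "exit code: $?"
```

Note that the stop branch reverses the start order (history server first, then YARN, then DFS), which is the usual convention so that dependent services shut down before the services they rely on.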