Contents

I. Preparation
  1. Install the JDK (run on every node)
  2. Update the host configuration (run on every node)
  3. Set up passwordless SSH (run on every node)
II. Install Hadoop (run on every node)
III. Cluster start-up configuration (run on every node)
  1. core-site.xml
  2. hdfs-site.xml
  3. yarn-site.xml
  4. mapred-site.xml
  5. workers
IV. Start the cluster and test
  1. Configure the Java environment
  2. Allow root to start the daemons
  3. Start the cluster
    3.1 If this is the cluster's first start
    3.2 Start HDFS (on the hadoop1 node)
    3.3 Start YARN (on hadoop2, the ResourceManager node)
    3.4 View the HDFS NameNode web UI
    3.5 View the YARN ResourceManager web UI
  4. Testing
    4.1 Basic operations
    4.2 Block storage path
    4.3 Word count
V. Hadoop helper scripts
  1. Start/stop script hadoop.sh
  2. Process-listing script jpsall.sh
  3. Copy to the other servers

I. Preparation

Node layout:

          hadoop1               hadoop2                        hadoop3
  IP      192.168.139.176       192.168.139.214                192.168.139.215
  HDFS    NameNode, DataNode    DataNode                       SecondaryNameNode, DataNode
  YARN    NodeManager           ResourceManager, NodeManager   NodeManager
1. Install the JDK (run on every node)

tar -zxf jdk-8u431-linux-x64.tar.gz
mv jdk1.8.0_431 /usr/local/java

# Go to /etc/profile.d and edit the environment variables
vim java_env.sh

# java
JAVA_HOME=/usr/local/java
JRE_HOME=/usr/local/java/jre
CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib
PATH=$JAVA_HOME/bin:$PATH
export PATH JAVA_HOME CLASSPATH

# Reload the environment
source /etc/profile
2. Update the host configuration (run on every node)

vim /etc/hosts

192.168.139.176 hadoop1
192.168.139.214 hadoop2
192.168.139.215 hadoop3

# Change the hostname (set the matching name on each node)
vim /etc/hostname

hadoop1

Note: also add these entries to the hosts file on your local machine. The configuration below refers to nodes by hostname; if you skip this step, substitute the IP addresses when opening the web UIs.
3. Set up passwordless SSH (run on every node)

# Generate a key pair
ssh-keygen -t rsa

# Copy the public key to each node
ssh-copy-id hadoop1
ssh-copy-id hadoop2
ssh-copy-id hadoop3
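As an aside (not part of the original steps), `ssh-keygen` can also run non-interactively, which helps if you script this step across many nodes; the flags below are standard OpenSSH options, and the temporary output path is only for illustration:

```shell
# Non-interactive key generation: -N "" sets an empty passphrase,
# -q suppresses output, -f chooses the key file path.
tmpdir=$(mktemp -d)
ssh-keygen -t rsa -N "" -q -f "$tmpdir/id_rsa"
ls "$tmpdir"
```

In a real run you would point `-f` at `~/.ssh/id_rsa` and follow with the `ssh-copy-id` commands above.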
II. Install Hadoop (run on every node)

tar -zxf hadoop-3.4.0.tar.gz
mv hadoop-3.4.0 /usr/local/

# Configure the environment: go to /etc/profile.d and create hadoop_env.sh
vim hadoop_env.sh

# hadoop
export HADOOP_HOME=/usr/local/hadoop-3.4.0
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin

# Check the version
hadoop version
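Both profile.d scripts have to exist on every node. One way to avoid editing them three times is a small helper like the hypothetical sketch below (the function name and `DRY_RUN` switch are illustrative, not from the original); by default it only prints the `scp` commands it would run:

```shell
# Distribute the environment scripts to the other nodes.
# DRY_RUN=1 (the default) prints the commands; DRY_RUN=0 executes them.
distribute_env() {
    for host in hadoop2 hadoop3; do
        for f in /etc/profile.d/java_env.sh /etc/profile.d/hadoop_env.sh; do
            if [ "${DRY_RUN:-1}" = "1" ]; then
                echo "scp $f root@$host:/etc/profile.d/"
            else
                scp "$f" "root@$host:/etc/profile.d/"
            fi
        done
    done
}
distribute_env
```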
III. Cluster start-up configuration (run on every node)

Edit the following files under /usr/local/hadoop-3.4.0/etc/hadoop.

1. core-site.xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License"); you may not
  use this file except in compliance with the License. You may obtain a copy
  of the License at http://www.apache.org/licenses/LICENSE-2.0. Unless
  required by applicable law or agreed to in writing, software distributed
  under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
  CONDITIONS OF ANY KIND, either express or implied. See the License for the
  specific language governing permissions and limitations under the License.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
    <!-- NameNode address -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop1:8020</value>
    </property>
    <!-- Hadoop data storage directory -->
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/usr/local/hadoop-3.4.0/data</value>
    </property>
    <!-- Static user for the HDFS web UI; in production, create a dedicated user -->
    <property>
        <name>hadoop.http.staticuser.user</name>
        <value>root</value>
    </property>
</configuration>

2. hdfs-site.xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
    <!-- NameNode web UI address -->
    <property>
        <name>dfs.namenode.http-address</name>
        <value>hadoop1:9870</value>
    </property>
    <!-- SecondaryNameNode web UI address -->
    <property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>hadoop3:9868</value>
    </property>
</configuration>

3. yarn-site.xml
<?xml version="1.0"?>
<configuration>
    <!-- Site specific YARN configuration properties -->
    <!-- Use the MapReduce shuffle service -->
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <!-- ResourceManager address -->
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>hadoop2</value>
    </property>
    <!-- Environment variables to inherit -->
    <property>
        <name>yarn.nodemanager.env-whitelist</name>
        <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_HOME,PATH,LANG,TZ,HADOOP_MAPRED_HOME</value>
    </property>
    <!-- Enable log aggregation -->
    <property>
        <name>yarn.log-aggregation-enable</name>
        <value>true</value>
    </property>
    <!-- Log aggregation server address (the JobHistory server on hadoop1) -->
    <property>
        <name>yarn.log.server.url</name>
        <value>http://hadoop1:19888/jobhistory/logs</value>
    </property>
    <!-- Retain logs for 7 days -->
    <property>
        <name>yarn.log-aggregation.retain-seconds</name>
        <value>604800</value>
    </property>
</configuration>

4. mapred-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
    <!-- Run MapReduce on YARN -->
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <!-- JobHistory server address -->
    <property>
        <name>mapreduce.jobhistory.address</name>
        <value>hadoop1:10020</value>
    </property>
    <!-- JobHistory web UI address -->
    <property>
        <name>mapreduce.jobhistory.webapp.address</name>
        <value>hadoop1:19888</value>
    </property>
</configuration>

5. workers
hadoop1
hadoop2
hadoop3

Note: entries in this file must not have trailing spaces, and the file must not contain blank lines.
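Because a stray space or blank line here fails silently, a small check script can validate the file before you distribute it. This is a hypothetical helper (the function name and sample files are illustrative, not from the original):

```shell
# Fail if a workers file contains trailing whitespace or blank lines.
check_workers() {
    if grep -qE '[[:space:]]$' "$1"; then
        echo "FAIL: trailing whitespace in $1"
        return 1
    fi
    if grep -qE '^$' "$1"; then
        echo "FAIL: blank line in $1"
        return 1
    fi
    echo "OK"
}

printf 'hadoop1\nhadoop2\nhadoop3\n' > /tmp/workers.good
printf 'hadoop1 \nhadoop2\n' > /tmp/workers.bad
check_workers /tmp/workers.good
check_workers /tmp/workers.bad || true
```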
IV. Start the cluster and test

1. Configure the Java environment (run on every node)

# Edit /usr/local/hadoop-3.4.0/etc/hadoop/hadoop-env.sh and set:
export JAVA_HOME=/usr/local/java
2. Allow root to start the daemons (run on every node)

# Add the following near the top of start-dfs.sh and stop-dfs.sh
HDFS_DATANODE_USER=root
HADOOP_SECURE_DN_USER=hdfs
HDFS_NAMENODE_USER=root
HDFS_SECONDARYNAMENODE_USER=root

# Add the following near the top of start-yarn.sh and stop-yarn.sh
YARN_RESOURCEMANAGER_USER=root
YARN_NODEMANAGER_USER=root

Note: Hadoop does not support starting as root by default. In production, create a dedicated group and user and grant that user the necessary privileges instead.
3. Start the cluster

3.1 If this is the cluster's first start

Format the NameNode on the hadoop1 node. Note: formatting generates a new cluster ID. If the NameNode's cluster ID no longer matches the DataNodes', the cluster cannot find its existing data. If the cluster breaks while running and the NameNode has to be reformatted, first stop the namenode and datanode processes and delete the data and logs directories on every machine, then format.

hdfs namenode -format
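The cleanup described above can be sketched as a loop over the nodes. This is a hypothetical helper (the function name and `DRY_RUN` switch are illustrative); by default it only prints the commands it would run, so nothing is deleted until you opt in with `DRY_RUN=0`:

```shell
# Remove the data and logs directories on every node before reformatting.
# DRY_RUN=1 (the default) prints the commands; DRY_RUN=0 runs them via ssh.
clean_before_format() {
    for host in hadoop1 hadoop2 hadoop3; do
        cmd="rm -rf /usr/local/hadoop-3.4.0/data /usr/local/hadoop-3.4.0/logs"
        if [ "${DRY_RUN:-1}" = "1" ]; then
            echo "ssh $host $cmd"
        else
            ssh "$host" "$cmd"
        fi
    done
}
clean_before_format
```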
3.2 Start HDFS (on the hadoop1 node)
/usr/local/hadoop-3.4.0/sbin/start-dfs.sh 
3.3 Start YARN (on hadoop2, the node configured as ResourceManager)
/usr/local/hadoop-3.4.0/sbin/start-yarn.sh 
3.4 View the HDFS NameNode web UI

http://192.168.139.176:9870/

3.5 View the YARN ResourceManager web UI

http://192.168.139.214:8088

4. Testing

4.1 Basic operations
# Create a directory in HDFS
hadoop fs -mkdir /input

# Create a local file
touch text.txt

# Upload the file
hadoop fs -put text.txt /input

# Delete a directory
hadoop fs -rm -r /output

4.2 Where HDFS stores the blocks on disk

/usr/local/hadoop-3.4.0/data/dfs/data/current/BP-511066843-192.168.139.176-1734965488199/current/finalized/subdir0/subdir0

4.3 Count the words in the text
hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.4.0.jar wordcount /input  /output 
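The Hadoop job needs a running cluster, but you can sanity-check what wordcount should produce with a local awk one-liner (an illustration, not part of the original steps) that computes the same per-word counts:

```shell
# Count word occurrences locally, mirroring the wordcount example's output.
printf 'hello hadoop\nhello yarn\n' > text.txt
awk '{ for (i = 1; i <= NF; i++) count[$i]++ }
     END { for (w in count) print w "\t" count[w] }' text.txt | sort
```

For this sample input, the Hadoop job's part-r-00000 output would list the same pairs (hadoop 1, hello 2, yarn 1).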
V. Hadoop helper scripts

1. Start/stop script hadoop.sh
#!/bin/bash

if [ $# -lt 1 ]
then
    echo "No Args Input..."
    exit
fi

case $1 in
"start")
    echo " =============== Starting the Hadoop cluster ==============="
    echo " --------------- starting hdfs ---------------"
    ssh hadoop1 "/usr/local/hadoop-3.4.0/sbin/start-dfs.sh"
    echo " --------------- starting yarn ---------------"
    ssh hadoop2 "/usr/local/hadoop-3.4.0/sbin/start-yarn.sh"
    echo " --------------- starting historyserver ---------------"
    ssh hadoop1 "/usr/local/hadoop-3.4.0/bin/mapred --daemon start historyserver"
;;
"stop")
    echo " =============== Stopping the Hadoop cluster ==============="
    echo " --------------- stopping historyserver ---------------"
    ssh hadoop1 "/usr/local/hadoop-3.4.0/bin/mapred --daemon stop historyserver"
    echo " --------------- stopping yarn ---------------"
    ssh hadoop2 "/usr/local/hadoop-3.4.0/sbin/stop-yarn.sh"
    echo " --------------- stopping hdfs ---------------"
    ssh hadoop1 "/usr/local/hadoop-3.4.0/sbin/stop-dfs.sh"
;;
*)
    echo "Input Args Error..."
;;
esac

# Make the script executable
chmod +x hadoop.sh
2. Process-listing script jpsall.sh

#!/bin/bash

for host in hadoop1 hadoop2 hadoop3
do
    echo " =============== $host ==============="
    ssh $host jps
done

3. Copy to the other servers
scp /usr/local/hadoop-3.4.0/hadoop.sh /usr/local/hadoop-3.4.0/jpsall.sh root@hadoop2:/usr/local/hadoop-3.4.0/
scp /usr/local/hadoop-3.4.0/hadoop.sh /usr/local/hadoop-3.4.0/jpsall.sh root@hadoop3:/usr/local/hadoop-3.4.0/