Installation environment:
Operating system: CentOS 8
JDK: 1.8
Hadoop: 2.7.4
Three nodes, with hostnames node-01, node-02, and node-03
Network configuration:
Configure a static IP:
```shell
vi /etc/sysconfig/network-scripts/ifcfg-ens33
```
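As a sketch, a static-IP ifcfg-ens33 might look like the following; the address, gateway, and DNS values are placeholders for illustration and must match your own network:

```
TYPE=Ethernet
BOOTPROTO=static        # static address instead of DHCP
NAME=ens33
DEVICE=ens33
ONBOOT=yes              # bring the interface up at boot
IPADDR=192.168.121.101  # placeholder address for node-01
NETMASK=255.255.255.0
GATEWAY=192.168.121.2   # placeholder gateway
DNS1=114.114.114.114    # placeholder DNS server
```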
To apply the changes, restart networking (this takes all interfaces down and brings them back up):
```shell
nmcli networking off
nmcli networking on
```
Disable the firewall:
```shell
systemctl stop firewalld
systemctl disable firewalld
systemctl status firewalld
```
Set the hostname (do the same on node-02 and node-03 with their names):
```shell
hostnamectl set-hostname node-01
```
Map IPs to hostnames:
```shell
vim /etc/hosts
```
Add an entry for each node's IP address and hostname.
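A sketch of the three entries, with placeholder addresses that must be replaced by each node's actual static IP:

```
192.168.121.101 node-01
192.168.121.102 node-02
192.168.121.103 node-03
```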
SSH passwordless login:
Generate a key pair on each node:
```shell
ssh-keygen -b 1024 -t rsa
```
Press Enter at every prompt to accept the defaults.
Copy node-01's public key to node-02 and node-03.
On node-01:
```shell
ssh-copy-id node-01
ssh-copy-id node-02
ssh-copy-id node-03
```
On node-02 and node-03:
```shell
ssh-copy-id node-01
```
Restrict the permissions on authorized_keys:
```shell
cd .ssh
chmod 600 authorized_keys
```
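The 600 mode matters because sshd refuses key-based login when authorized_keys is writable by other users. A small runnable sketch of the effect, using a scratch file instead of the real key file:

```shell
# Create a scratch file and restrict it to the owner only.
touch authorized_keys_demo
chmod 600 authorized_keys_demo
# Print the resulting octal mode (GNU coreutils stat); prints 600.
stat -c '%a' authorized_keys_demo
# Clean up the scratch file.
rm authorized_keys_demo
```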
Time synchronization, scheduled through cron:
```shell
crontab -e
```
Add the following line (syncs against cn.pool.ntp.org at 01:00 every day):
```shell
0 1 * * * /usr/sbin/ntpdate cn.pool.ntp.org
```
Install the JDK.
Upload the archive with the rz command, selecting the right package, then extract it:
```shell
tar -zxvf <archive name>
```
Configure the environment variables:
```shell
vim .bash_profile
```
Add:
```shell
export JAVA_HOME=/root/export/servers/jdk1.8.0_301
export PATH=$JAVA_HOME/bin:$PATH
```
Reload the profile:
```shell
source .bash_profile
```
Copy the JDK to node-02 and node-03:
```shell
scp -r export/servers/jdk1.8.0_301 root@node-02:/root/export/servers
scp -r export/servers/jdk1.8.0_301 root@node-03:/root/export/servers
```
Modify the Hadoop configuration files.
1. hadoop-env.sh and yarn-env.sh:
```shell
vim hadoop-env.sh
vim yarn-env.sh
```
Set JAVA_HOME in both files.
![image-20211031200037450](https://gitee.com/a-c-dream/cloudimage/raw/master/img/image-20211031200037450.png)
2. mapred-site.xml (create it from the template first):
```shell
cp mapred-site.xml.template mapred-site.xml
```
```xml
<configuration>
    <!-- Run MapReduce on YARN; the default is local -->
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
```
3. core-site.xml:
```xml
<configuration>
    <!-- Set Hadoop's default file system, specified by URI -->
    <property>
        <name>fs.defaultFS</name>
        <!-- The NameNode runs on node-01 -->
        <value>hdfs://node-01:9000</value>
    </property>
    <!-- Hadoop's temporary directory; the default is /tmp/hadoop-${user.name} -->
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/export/servers/hadoop-2.7.4/tmp</value>
    </property>
</configuration>
```
4. hdfs-site.xml:
```xml
<configuration>
    <!-- Number of HDFS replicas -->
    <property>
        <name>dfs.replication</name>
        <value>3</value>
    </property>
</configuration>
```
5. yarn-site.xml:
```xml
<configuration>
    <!-- Site specific YARN configuration properties -->
    <!-- Address of the YARN ResourceManager -->
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>node-01</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>
```
6. slaves: list the hostnames of the worker (DataNode/NodeManager) nodes in this file, one per line.
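Since dfs.replication is set to 3, a reasonable choice (an assumption here; list only the machines you want as workers) is to run a DataNode on every node, so the slaves file would contain:

```
node-01
node-02
node-03
```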
Copy Hadoop to node-02 and node-03:
```shell
scp -r export/servers/hadoop-2.7.4/ node-02:/root/export/servers
scp -r export/servers/hadoop-2.7.4/ node-03:/root/export/servers
```
Configure the Hadoop environment variables:
```shell
vim .bash_profile
```
Add:
```shell
export HADOOP_HOME=/root/export/servers/hadoop-2.7.4
export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH
```
Reload:
```shell
source .bash_profile
```
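The guide stops after configuration; the usual next steps on Hadoop 2.x are to format the NameNode once and start the daemons. A sketch, run on node-01:

```shell
# Format the NameNode - run this ONCE before the first start;
# re-formatting destroys existing HDFS metadata.
hdfs namenode -format
# Start the HDFS daemons (NameNode, DataNodes, SecondaryNameNode).
start-dfs.sh
# Start the YARN daemons (ResourceManager, NodeManagers).
start-yarn.sh
# List the running Java processes on each node to verify the daemons are up.
jps
```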