
Building a Spark Cluster with Ambari

株野, 2018-01-30

Original post: 2018-01-09 17:44:01

Source: http://f./thread-881024-1-1.html

Tags: ambari / hadoop / spark / cluster setup

This post records only the places that had pitfalls; the other steps went through normally by following the instructor's videos and are not covered in detail here.

Environment:
CentOS 7.2
CentOS 6.8
(We started with CentOS 7 on the server and every node, with Ambari 2.5.0.3, and hit OpenSSL errors.
After switching the server to CentOS 6.8 and Ambari to 2.6.0.0, and upgrading every node, Server and Agent alike, the errors stopped.)

IP addresses: 192.168.0.50 - 192.168.0.56

# uname -a
Linux k8s-m1 3.10.0-693.11.1.el7.x86_64 #1 SMP Mon Dec 4 23:52:40 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
# free -m
              total        used        free      shared  buff/cache   available
Mem:           1839         705          76          78        1057         848
Swap:          2047           5        2042


Part 1: Initialize the environment
1. Update the system
# yum update
# yum upgrade

# yum install wget
(We first picked 2.5.0.3, which hit errors we could not get past:)
# wget -nv  http://public-repo-1.hortonworks ... 2.5.0.3/ambari.repo -O /etc/yum.repos.d/ambari.repo
This one installs cleanly; choose it instead:
# wget -nv  http://public-repo-1.hortonworks ... 2.6.0.0/ambari.repo -O /etc/yum.repos.d/ambari.repo

(The version chosen here has a large impact on everything that follows.)

Check that the repo list picked up the new source:
# yum repolist
Loaded plugins: fastestmirror
ambari-2.5.0.3                                                                                                                        | 2.9 kB  00:00:00     
ambari-2.5.0.3/primary_db                                                                                                             | 8.5 kB  00:00:02     
Loading mirror speeds from cached hostfile
* base: mirror.bit.edu.cn
* extras: mirrors.shuosc.org
* updates: mirrors.shuosc.org
repo id                                                             repo name                                                                          status
ambari-2.5.0.3                                                      ambari Version - ambari-2.5.0.3                                                       12
base/7/x86_64                                                       CentOS-7 - Base                                                                    9,591
extras/7/x86_64                                                     CentOS-7 - Extras                                                                    327
kubernetes                                                          kubernetes                                                                            17
updates/7/x86_64                                                    CentOS-7 - Updates                                                                 1,573
repolist: 11,520

# yum update
# yum install -y ambari-server
Install Java (optional: ambari-server setup can install a JDK for you):
# wget http://download.oracle.com/otn-pub/java/jdk/8u151-b12/e758a0de34e24606bca991d704f6dcbf/jdk-8u151-linux-x64.tar.gz
# tar -zxvf jdk-8u151-linux-x64.tar.gz -C /opt
# mv /opt/jdk1.8.0_151 /opt/java    (the tarball unpacks to /opt/jdk1.8.0_151; rename it to match JAVA_HOME below)
# vi /etc/profile
Append at the end:
export JAVA_HOME=/opt/java
export PATH=$PATH:$JAVA_HOME/bin

# source /etc/profile
Verify that Java is installed correctly:
# java -version
java version "1.8.0_151"
Java(TM) SE Runtime Environment (build 1.8.0_151-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.151-b12, mixed mode)

Run the ambari-server setup:
# ambari-server setup
At the JDK selection step, choose the JDK just installed, at /opt/java.
Continue until you see:
Adjusting ambari-server permissions and ownership...
Ambari Server 'setup' completed successfully.

Start the server:
# ambari-server start

Wait until you see:
Waiting for server start........................................................
Server started listening on 8080

DB configs consistency check: no errors and warnings were found.
Ambari Server 'start' completed successfully.

Install lsof:
# yum install -y lsof

Check that port 8080 is listening:
# lsof -i:8080
COMMAND  PID USER   FD   TYPE DEVICE SIZE/OFF NODE NAME
java    4412 root 1437u  IPv6  52661      0t0  TCP *:webcache (LISTEN)


Open the web UI:
http://node0:8080/#/login
Username and password are both admin.
Choose "Launch Install Wizard".
Set the cluster name to spark and click Next.



Host Checks issues

1. "The following hosts have Transparent Huge Pages (THP) enabled. THP should be disabled to avoid potential Hadoop performance issues."
This warning appeared on the CentOS 6.8 nodes; the CentOS 7 nodes were fine.
On CentOS 6.x:
# echo never > /sys/kernel/mm/redhat_transparent_hugepage/defrag
# echo never > /sys/kernel/mm/redhat_transparent_hugepage/enabled
On CentOS 7.x:
[root@node0 ambari-server]# echo never > /sys/kernel/mm/transparent_hugepage/enabled
[root@node0 ambari-server]# echo never > /sys/kernel/mm/transparent_hugepage/defrag
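The echo commands above only last until the next reboot. One way to make the setting persistent, sketched here for CentOS 7 (on CentOS 6.x substitute the redhat_transparent_hugepage paths), is to replay them from /etc/rc.local:

```shell
# Re-apply the THP settings at every boot via rc.local (CentOS 7 paths;
# on CentOS 6.x use /sys/kernel/mm/redhat_transparent_hugepage instead).
cat >> /etc/rc.local <<'EOF'
echo never > /sys/kernel/mm/transparent_hugepage/enabled
echo never > /sys/kernel/mm/transparent_hugepage/defrag
EOF
# On CentOS 7, rc.local only runs at boot if it is executable.
chmod +x /etc/rc.local
```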


2. No time service is running on the nodes; the check reports:
The following services should be up
Service
ntpd or chronyd


Install chrony:
yum install chrony -y

Write the config (this example syncs from 192.168.1.1 and allows clients on 192.168.1.0/24; adjust for your network):
cat <<EOF> /etc/chrony.conf
server 192.168.1.1 iburst
stratumweight 0
driftfile /var/lib/chrony/drift
rtcsync
makestep 10 3
allow 192.168.1.0/24
bindcmdaddress 127.0.0.1
bindcmdaddress ::1
keyfile /etc/chrony.keys
commandkey 1
generatecommandkey
noclientlog
logchange 0.5
logdir /var/log/chrony
EOF

Enable and start the service (CentOS 7.x):
systemctl enable chronyd.service
systemctl start chronyd.service

CentOS 6.x needs the old-style SysV commands to start the service.
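A sketch of the equivalent SysV commands for CentOS 6.x (the service name chronyd is an assumption here; check /etc/init.d on your node):

```shell
# SysV equivalents of the systemctl lines above (CentOS 6.x).
chkconfig chronyd on     # enable at boot
service chronyd start    # start now
chronyc sources -v       # optional: confirm the time sources are reachable
```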

For the PostgreSQL driver problem, run on the server node:
# wget http://central./maven2/org/postgresql/postgresql/9.2-1002-jdbc4/postgresql-9.2-1002-jdbc4.jar
# ambari-server setup --jdbc-db=postgres --jdbc-driver=/root/postgresql-9.2-1002-jdbc4.jar
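Before re-running the setup, it can save a retry to confirm the jar actually contains the driver class. A quick check, assuming unzip is installed and the jar was saved to /root:

```shell
# A valid PostgreSQL JDBC jar must ship org/postgresql/Driver.class;
# grep exits non-zero (and prints nothing) if the download is broken.
unzip -l /root/postgresql-9.2-1002-jdbc4.jar | grep 'org/postgresql/Driver.class'
```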


The Install, Start and Test phase has to be retried multiple times. The failures looked like this:

2018-01-09 09:49:20,610 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1./HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://public-repo-1./HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-01-09 09:49:20,610 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-01-09 09:49:20,611 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-09 09:49:20,723 - Skipping installation of existing package unzip
2018-01-09 09:49:20,724 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-09 09:49:20,737 - Skipping installation of existing package curl
2018-01-09 09:49:20,738 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-09 09:49:20,757 - Skipping installation of existing package hdp-select
2018-01-09 09:49:20,766 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-01-09 09:49:20,774 - Skipping stack-select on SMARTSENSE because it does not exist in the stack-select package structure



2018-01-09 09:49:34,710 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-01-09 09:49:34,719 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-01-09 09:49:34,720 - Group['livy'] {}
2018-01-09 09:49:34,722 - Group['spark'] {}
2018-01-09 09:49:34,722 - Group['hdfs'] {}
2018-01-09 09:49:34,722 - Group['hadoop'] {}
2018-01-09 09:49:34,723 - Group['users'] {}
2018-01-09 09:49:34,723 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 09:49:34,725 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 09:49:34,726 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 09:49:34,727 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 09:49:34,728 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 09:49:34,729 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-01-09 09:49:34,730 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-01-09 09:49:34,731 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-01-09 09:49:34,732 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 09:49:34,733 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 09:49:34,734 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 09:49:34,735 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-09 09:49:34,737 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-01-09 09:49:34,743 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-01-09 09:49:34,744 - Group['hdfs'] {}
2018-01-09 09:49:34,744 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-01-09 09:49:34,745 - FS Type: 
2018-01-09 09:49:34,745 - Directory['/etc/hadoop'] {'mode': 0755}
2018-01-09 09:49:34,769 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-01-09 09:49:34,770 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match
2018-01-09 09:49:34,771 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-01-09 09:49:34,797 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1./HDP/centos7/2.x/updates/2.6.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-01-09 09:49:34,807 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1./HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-01-09 09:49:34,808 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-01-09 09:49:34,809 - Repository['HDP-UTILS-1.1.0.21-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1./HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-01-09 09:49:34,813 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1./HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://public-repo-1./HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-01-09 09:49:34,814 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-01-09 09:49:34,815 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-09 09:49:34,970 - Skipping installation of existing package unzip
2018-01-09 09:49:34,970 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-09 09:49:34,985 - Skipping installation of existing package curl
2018-01-09 09:49:34,986 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-09 09:49:35,000 - Skipping installation of existing package hdp-select
2018-01-09 09:49:35,007 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-01-09 09:49:35,015 - Skipping stack-select on SMARTSENSE because it does not exist in the stack-select package structure.
Command failed after 1 tries

It looks like the source referenced by ambari.repo is part of the problem: during installation Ambari writes an additional repo file of its own. Both files are pasted below for comparison.
# cat ambari-hdp-1.repo
[HDP-2.6-repo-1]
name=HDP-2.6-repo-1
baseurl=http://public-repo-1./HDP/centos7/2.x/updates/2.6.3.0

path=/
enabled=1
gpgcheck=0
[HDP-UTILS-1.1.0.21-repo-1]
name=HDP-UTILS-1.1.0.21-repo-1
baseurl=http://public-repo-1./HDP-UTILS-1.1.0.21/repos/centos7

path=/
enabled=1
gpgcheck=0


# cat ambari.repo
#VERSION_NUMBER=2.6.0.0-267
[ambari-2.6.0.0]
name=ambari Version - ambari-2.6.0.0
baseurl=http://public-repo-1./ambari/centos7/2.x/updates/2.6.0.0
gpgcheck=1
gpgkey=http://public-repo-1./ambari/centos7/2.x/updates/2.6.0.0/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1



2018-01-09 10:47:46,218 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-01-09 10:47:46,226 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2018-01-09 10:47:46,228 - Group['livy'] {}
2018-01-09 10:47:46,229 - Group['spark'] {}
2018-01-09 10:47:46,229 - Group['hdfs'] {}
2018-01-09 10:47:46,229 - Group['hadoop'] {}
2018-01-09 10:47:46,230 - Group['users'] {}
2018-01-09 10:47:46,230 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 10:47:46,232 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 10:47:46,233 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 10:47:46,234 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 10:47:46,235 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 10:47:46,236 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-01-09 10:47:46,237 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-01-09 10:47:46,238 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-01-09 10:47:46,239 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 10:47:46,240 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 10:47:46,242 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-09 10:47:46,242 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-09 10:47:46,244 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-01-09 10:47:46,249 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-01-09 10:47:46,250 - Group['hdfs'] {}
2018-01-09 10:47:46,250 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-01-09 10:47:46,251 - FS Type: 
2018-01-09 10:47:46,251 - Directory['/etc/hadoop'] {'mode': 0755}
2018-01-09 10:47:46,276 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-01-09 10:47:46,277 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-01-09 10:47:46,295 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1./HDP/centos7/2.x/updates/2.6.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-01-09 10:47:46,307 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1./HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-01-09 10:47:46,308 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-01-09 10:47:46,308 - Repository['HDP-UTILS-1.1.0.21-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1./HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-01-09 10:47:46,313 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1./HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://public-repo-1./HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-01-09 10:47:46,313 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-01-09 10:47:46,314 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-09 10:47:46,422 - Skipping installation of existing package unzip
2018-01-09 10:47:46,422 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-09 10:47:46,435 - Skipping installation of existing package curl
2018-01-09 10:47:46,435 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-09 10:47:46,448 - Skipping installation of existing package hdp-select
2018-01-09 10:47:46,454 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-01-09 10:47:46,761 - Command repositories: HDP-2.6-repo-1, HDP-UTILS-1.1.0.21-repo-1
2018-01-09 10:47:46,761 - Applicable repositories: HDP-2.6-repo-1, HDP-UTILS-1.1.0.21-repo-1
2018-01-09 10:47:46,763 - Looking for matching packages in the following repositories: HDP-2.6-repo-1, HDP-UTILS-1.1.0.21-repo-1
2018-01-09 10:47:49,160 - Package['slider_2_6_3_0_235'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-09 10:47:49,273 - Installing package slider_2_6_3_0_235 ('/usr/bin/yum -d 0 -e 0 -y install slider_2_6_3_0_235')
2018-01-09 10:52:20,943 - Package['Storm_2_6_3_0_235-slider-client'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-09 10:52:20,970 - Installing package storm_2_6_3_0_235-slider-client ('/usr/bin/yum -d 0 -e 0 -y install storm_2_6_3_0_235-slider-client')
Command failed after 1 tries

At this point, log in to the node in question and run the yum install from the error message by hand, for example:
# yum -d 0 -e 0 -y install storm_2_6_3_0_235-slider-client
# yum -d 0 -e 0 -y install slider_2_6_3_0_235
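With several nodes this gets repetitive. A small loop can preview (and, with the echo removed, actually run) the same installs over ssh on every agent; the node list below is illustrative, and the package names are copied from the failed-install log entries above:

```shell
# Hypothetical agent list; substitute your own hostnames or IPs.
NODES="192.168.0.51 192.168.0.52 192.168.0.53"
# Package names taken from the "Installing package ..." log entries.
PKGS="slider_2_6_3_0_235 storm_2_6_3_0_235-slider-client"

for node in $NODES; do
  for pkg in $PKGS; do
    # Preview only: drop the leading echo to actually run the installs.
    echo ssh root@"$node" "yum -d 0 -e 0 -y install $pkg"
  done
done
```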


Once all the errors are resolved, keep clicking Retry; after a few retries the step goes through.


