Qianjia Information Network (千家信息网)

Big Data: The Ultimate Guide to Compiling the Hive tar and Deploying It

Published 2024-11-26 by the Qianjia Information Network editors; last updated November 26, 2024.

1. Software environment

OS: RHEL 6
Software: JDK, hadoop-2.8.1.tar.gz, apache-maven-3.3.9, mysql-5.1

Role  Host      IP
NN    hadoop01  xx.xx.xx.xx
DN    hadoop02  xx.xx.xx.xx
DN    hadoop03  xx.xx.xx.xx
DN    hadoop04  xx.xx.xx.xx
DN    hadoop05  xx.xx.xx.xx

This walkthrough uses a pseudo-distributed deployment, so only host hadoop01 is needed; for the base software installation, see the pseudo-distributed deployment guide (ultimate edition).

Before compiling Hive, read the README.txt that ships with the Hive source so that we download the matching versions of the required software:

Requirements

============

- Java 1.6, 1.7   (the JDK can only be 1.6 or 1.7; 1.8 cannot be used)
- Hadoop 1.x, 2.x

2. Install the JDK


mkdir /usr/java && cd /usr/java/
tar -zxvf /tmp/server-jre-7u80-linux-x64.tar.gz
chown -R root:root /usr/java/jdk1.7.0_80/
echo 'export JAVA_HOME=/usr/java/jdk1.7.0_80'>>/etc/profile
source /etc/profile
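Since the README above pins the JDK to 1.6 or 1.7, it is worth confirming the installed version before compiling. The snippet below is a minimal sketch: the version string is a hard-coded stand-in for what `java -version 2>&1` would report on the real host.

```shell
# Stand-in for: java -version 2>&1 | awk -F'"' '/version/ {print $2}'
ver="1.7.0_80"
# Keep only the "1.x" prefix of the version string
major=$(echo "$ver" | cut -d. -f1,2)
case "$major" in
  1.6|1.7) status="ok" ;;            # supported for building Hive 1.1.0
  *)       status="unsupported" ;;   # e.g. 1.8 will break the build
esac
echo "JDK $ver: $status"
```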

3. Install Maven


cd /usr/local/
unzip /tmp/apache-maven-3.3.9-bin.zip
chown root: /usr/local/apache-maven-3.3.9 -R
echo 'export MAVEN_HOME=/usr/local/apache-maven-3.3.9'>>/etc/profile
echo 'export MAVEN_OPTS="-Xms256m -Xmx512m"'>>/etc/profile
echo 'export PATH=$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH'>>/etc/profile
source /etc/profile
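The reason `$MAVEN_HOME/bin` is prepended to PATH (rather than appended) is that the shell resolves the first matching executable, so the new Maven shadows any older system mvn. A throwaway sketch with stub scripts, not real installs, illustrates the lookup order:

```shell
# Build two fake mvn executables in a scratch directory
demo=$(mktemp -d)
mkdir -p "$demo/apache-maven-3.3.9/bin" "$demo/system/bin"
printf '#!/bin/sh\necho 3.3.9\n' > "$demo/apache-maven-3.3.9/bin/mvn"
printf '#!/bin/sh\necho 2.2.1\n' > "$demo/system/bin/mvn"
chmod +x "$demo/apache-maven-3.3.9/bin/mvn" "$demo/system/bin/mvn"
# With the maven dir first on PATH, its mvn wins
got=$(PATH="$demo/apache-maven-3.3.9/bin:$demo/system/bin" mvn)
echo "mvn resolves to version $got"
rm -rf "$demo"
```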

-- For JDK and Maven deployment details, see: Big Data: The Ultimate Guide to Deploying, Installing, Compiling and Packaging Hadoop

4. Install MySQL


yum -y install mysql-server mysql
/etc/init.d/mysqld start
chkconfig mysqld on
mysqladmin -u root password 123456
mysql -uroot -p123456
use mysql;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'localhost' IDENTIFIED BY '123456' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'127.0.0.1' IDENTIFIED BY '123456' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY '123456' WITH GRANT OPTION;
update user set password=password('123456') where user='root';
delete from user where not (user='root') ;
delete from user where user='root' and password='';
drop database test;
DROP USER ''@'%';
flush privileges;

5. Download the Hive source package:


# http://archive.cloudera.com/cdh6/cdh/5/
# Pick the Hive package that matches your CDH version:
# hive-1.1.0-cdh6.7.1-src.tar.gz
# After extracting, compile it into an installable package with Maven

6. Compile:


cd /tmp/
tar -xf hive-1.1.0-cdh6.7.1-src.tar.gz
cd /tmp/hive-1.1.0-cdh6.7.1
mvn clean package -DskipTests -Phadoop-2 -Pdist
# The compiled package ends up at:
# packaging/target/apache-hive-1.1.0-cdh6.7.1-bin.tar.gz
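A quick way to confirm the build actually produced the bin tarball is to look for it under packaging/target. The sketch below simulates the build output in a scratch directory so the check itself can be run anywhere; on a real build, point it at the extracted source tree instead.

```shell
# Simulate the source tree and the artifact the build would emit
src=$(mktemp -d)
mkdir -p "$src/packaging/target"
: > "$src/packaging/target/apache-hive-1.1.0-cdh6.7.1-bin.tar.gz"
# Count bin tarballs; anything other than 1 means the build did not finish
count=$(ls "$src/packaging/target" | grep -c -- '-bin\.tar\.gz$')
echo "bin tarballs found: $count"
rm -rf "$src"
```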

7. Install the compiled Hive package, then test


cd /usr/local/
tar -xf /tmp/apache-hive-1.1.0-cdh6.7.1-bin.tar.gz
ln -s apache-hive-1.1.0-cdh6.7.1-bin hive
chown -R hadoop:hadoop apache-hive-1.1.0-cdh6.7.1-bin
chown -R hadoop:hadoop hive
echo 'export HIVE_HOME=/usr/local/hive'>>/etc/profile
echo 'export PATH=$HIVE_HOME/bin:$PATH'>>/etc/profile

8. Configure Hive


su - hadoop
cd /usr/local/hive
cd conf

1. hive-env.sh
cp hive-env.sh.template hive-env.sh && vi hive-env.sh
HADOOP_HOME=/usr/local/hadoop

2. hive-site.xml
vi hive-site.xml

<?xml version="1.0"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/vincent_hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
  </property>
</configuration>

9. Copy the MySQL driver jar to $HIVE_HOME/lib


# The hive-site.xml above points at the MySQL JDBC driver,
# so the driver jar must be placed under Hive's lib directory.
# Extract mysql-connector-java-5.1.45.zip and copy the jar over:
cd /tmp
unzip mysql-connector-java-5.1.45.zip
cd mysql-connector-java-5.1.45
cp mysql-connector-java-5.1.45-bin.jar /usr/local/hive/lib/
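Why copying one jar is enough: Hive's launch script folds every jar under $HIVE_HOME/lib into the JVM classpath, which is where com.mysql.jdbc.Driver gets looked up. The sketch below simulates that folding with throwaway directories, not a real Hive install.

```shell
# Simulate a lib directory containing the connector jar
lib=$(mktemp -d)
: > "$lib/hive-exec-1.1.0.jar"
: > "$lib/mysql-connector-java-5.1.45-bin.jar"
# Fold every jar into a single colon-separated classpath string
classpath=""
for j in "$lib"/*.jar; do classpath="$classpath:$j"; done
case "$classpath" in
  *mysql-connector*) found="yes" ;;
  *)                 found="no"  ;;
esac
echo "driver on classpath: $found"
rm -rf "$lib"
```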

If the jar has not been copied, Hive fails with:
The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH.
Please check your CLASSPATH specification,
and the name of the driver.
