How to Recompile the Hadoop Source Code on CentOS
This post walks through recompiling the Hadoop source code on CentOS, step by step, in order to get rid of the native-library warning that stock Hadoop binaries produce.
A web search for the cause of the warning "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable" turns up the usual explanation: the native libraries bundled with Hadoop were compiled against C library versions that differ from the ones on the local machine, and recompiling Hadoop on the local machine makes the warning go away.
That said, the warning has little practical impact on using Hadoop.
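To see exactly which native libraries Hadoop can and cannot load, Hadoop 2.x ships a diagnostic command; running it before and after the rebuild makes the fix easy to verify:

hadoop checknative -a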
Still, as a programmer with a mild case of OCD, I tried a few workarounds, none of which helped, so the only option left was to compile the source myself.
Switch to the root user.
Download the tarballs for Ant, Maven, Protocol Buffers, FindBugs, and CMake into the /hadoop directory.
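For reference, something like the following should fetch them; the mirror URLs below are my assumption of where these old releases currently live (archive.apache.org and friends), so adjust them if a link has moved:

cd /hadoop
wget https://archive.apache.org/dist/ant/binaries/apache-ant-1.9.5-bin.tar.gz
wget https://archive.apache.org/dist/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.tar.gz
wget https://github.com/protocolbuffers/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz
wget https://sourceforge.net/projects/findbugs/files/findbugs/2.0.2/findbugs-2.0.2.tar.gz
wget https://cmake.org/files/v2.8/cmake-2.8.6.tar.gz
wget https://archive.apache.org/dist/hadoop/common/hadoop-2.7.0/hadoop-2.7.0-src.tar.gz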
The versions I used:
[hadoop@vm1 Downloads]$ ls
apache-ant-1.9.5.tar.gz    findbugs-2.0.2.tar.gz    jdk-8u45-linux-x64.gz
apache-maven-3.0.5.tar.gz  hadoop-2.7.0-src.tar.gz  protobuf-2.5.0
cmake-2.8.6                hadoop-2.7.0.tar.gz      protobuf-2.5.0.tar.gz
cmake-2.8.6.tar.gz         jdk-7u79-linux-x64.gz
yum -y install lzo-devel zlib-devel gcc autoconf automake libtool
tar zxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
At this point, because protobuf needs C++ support, ./configure will fail with the following error if no C++ compiler is installed on the machine:
checking whether to enable maintainer-specific portions of Makefiles... yes
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking target system type... x86_64-unknown-linux-gnu
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking for style of include used by make... GNU
checking dependency style of gcc... gcc3
checking for g++... no
checking for c++... no
checking for gpp... no
checking for aCC... no
checking for CC... no
checking for cxx... no
checking for cc++... no
checking for cl.exe... no
checking for FCC... no
checking for KCC... no
checking for RCC... no
checking for xlC_r... no
checking for xlC... no
checking whether we are using the GNU C++ compiler... no
checking whether g++ accepts -g... no
checking dependency style of g++... none
checking how to run the C++ preprocessor... /lib/cpp
configure: error: in `/hadoop/protobuf-2.5.0':
configure: error: C++ preprocessor "/lib/cpp" fails sanity check
See `config.log' for more details
To fix this, install the C++ toolchain:

yum install glibc-headers
yum install gcc-c++
Then go back into the protobuf directory and run ./configure again.
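A quick sanity check that the compiler is actually in place before reconfiguring (nothing Hadoop-specific, just confirming what yum installed):

which g++
g++ --version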
This time it gets through. Onward:
make
make check
make install
tar zxf apache-ant-1.9.2-bin.tar.gz
mv apache-ant-1.9.2 /hadoop/ant192
tar zxf apache-maven-3.0.5-bin.tar.gz
mv apache-maven-3.0.5 /hadoop/maven305
tar zxf findbugs-2.0.2.tar.gz
mv findbugs-2.0.2 /hadoop/findbugs202
tar zxf cmake-2.8.6.tar.gz
cd cmake-2.8.6
./bootstrap; make; make install
cd ..
tar zxf hadoop-2.7.0-src.tar.gz
mv hadoop-2.7.0-src /hadoop/hadoop270_src
chown -R hadoop:hadoop /hadoop/hadoop270_src
vi /etc/profile
export ANT_HOME=/hadoop/ant192
export MAVEN_HOME=/hadoop/maven305
export FINDBUGS_HOME=/hadoop/findbugs202
export PATH=${ANT_HOME}/bin:${MAVEN_HOME}/bin:${FINDBUGS_HOME}/bin:$PATH
source /etc/profile
su - hadoop
cd /hadoop/hadoop270_src
mvn clean package -DskipTests -Pdist,native,docs -Dtar
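Before kicking off the Maven build it is worth confirming the toolchain, in particular protoc: Hadoop 2.7 requires Protocol Buffers exactly 2.5.0, and a mismatched protoc is a common cause of build failures. A quick check:

protoc --version    # must print: libprotoc 2.5.0
mvn -version
ant -version
cmake --version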
If this is the first time Maven runs, this step can take quite a while because it downloads a lot of dependencies; it is best to configure a Maven mirror first.
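A minimal ~/.m2/settings.xml sketch for that; the mirror shown (Aliyun's public repository) is just one common choice, so substitute whatever mirror is fastest for you:

<settings>
  <mirrors>
    <mirror>
      <id>aliyun</id>
      <mirrorOf>central</mirrorOf>
      <url>https://maven.aliyun.com/repository/public</url>
    </mirror>
  </mirrors>
</settings>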
Near the end, the build may fail with this error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...... @ 5:124 in /home/hadoop/app/hadoop270_src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
The cause is missing zlib and OpenSSL development headers; building the native libraries requires both. (The error is usually reported online against the Debian package names zlib1g-dev and libssl-dev; on CentOS the equivalents are zlib-devel and openssl-devel.)
The fix:
yum install openssl-devel
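(zlib-devel was already pulled in by the earlier yum line.) As a sanity check, you can confirm both sets of headers are now present:

rpm -q openssl-devel zlib-devel
ls /usr/include/openssl/ssl.h /usr/include/zlib.h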
Then rerun:
mvn clean package -DskipTests -Pdist,native,docs -Dtar
Note: under JDK 1.8, you may hit this error:
[WARNING] The requested profile "native" could not be activated because it does not exist.
[WARNING] The requested profile "docs" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-dist: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...... @ 38:100 in /home/hadoop/app/hadoop270_src/hadoop-dist/target/antrun/build-main.xml
The fix: switch from JDK 1.8 to JDK 1.7.
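Switching amounts to pointing JAVA_HOME at a 1.7 JDK for the build shell; the install path below assumes the jdk-7u79 tarball from the download listing was unpacked under /usr/java, so adjust it to your layout:

export JAVA_HOME=/usr/java/jdk1.7.0_79
export PATH=$JAVA_HOME/bin:$PATH
java -version    # should now report 1.7.0_79
mvn clean package -DskipTests -Pdist,native,docs -Dtar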
With that, the build succeeds:
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:22.002s
[INFO] Finished at: Tue Jul 07 21:20:38 PDT 2015
[INFO] Final Memory: 131M/405M
[INFO] ------------------------------------------------------------------------
[hadoop@vm1 hadoop270_src]$ ls
BUILDING.txt           hadoop-dist               hadoop-project       NOTICE.txt
dev-support            hadoop-hdfs-project       hadoop-project-dist  pom.xml
hadoop-assemblies      hadoop-mapreduce-project  hadoop-tools         README.txt
hadoop-client          hadoop-maven-plugins      hadoop-yarn-project
hadoop-common-project  hadoop-minicluster        LICENSE.txt
[hadoop@vm1 hadoop270_src]$ cd hadoop-dist/
[hadoop@vm1 hadoop-dist]$ ls
pom.xml  target
[hadoop@vm1 hadoop-dist]$ cd target/
[hadoop@vm1 target]$ ls
antrun                    hadoop-2.7.0         hadoop-dist-2.7.0-javadoc.jar  test-dir
dist-layout-stitching.sh  hadoop-2.7.0.tar.gz  javadoc-bundle-options
dist-tar-stitching.sh     hadoop-dist-2.7.0.jar  maven-archiver
[hadoop@vm1 target]$ pwd
/hadoop/app/hadoop270_src/hadoop-dist/target
After setting up the environment with the freshly built Hadoop package, starting HDFS no longer prints the "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable" warning:
[hadoop@vm1 hadoop-2.7.0]$ ./sbin/start-dfs.sh
Starting namenodes on [vm1]
vm1: starting namenode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-namenode-vm1.out
vm1: starting datanode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-datanode-vm1.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-secondarynamenode-vm1.out
[hadoop@vm1 hadoop-2.7.0]$ ./sbin/start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /home/hadoop/app/hadoop-2.7.0/logs/yarn-hadoop-resourcemanager-vm1.out
vm1: starting nodemanager, logging to /home/hadoop/app/hadoop-2.7.0/logs/yarn-hadoop-nodemanager-vm1.out
[hadoop@vm1 hadoop-2.7.0]$ jps
3251 NodeManager
3540 Jps
3145 ResourceManager
2699 NameNode
2828 DataNode
2991 SecondaryNameNode
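As an aside, if you already have a working installation and only want the warning gone, it should be enough to copy the freshly built native libraries over the bundled ones rather than redeploying the whole tarball; a sketch using the paths from the output above:

cp -r /hadoop/app/hadoop270_src/hadoop-dist/target/hadoop-2.7.0/lib/native/* \
      /home/hadoop/app/hadoop-2.7.0/lib/native/
hadoop checknative -a    # the libraries listed should now resolve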
That wraps up recompiling the Hadoop source code on CentOS. Give it a try on your own machine, since working through the build is the best way to make it stick.