Compiling Hadoop from source turned out to be a bumpy ride, but after overcoming one obstacle after another the build finally succeeded. Here is the process I went through.
1. Download the hadoop-2.6.5 source code.
2. After extracting it, read the BUILD.txt file first; it describes the build prerequisites and caveats in detail:
* Unix System
* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

Building distributions:

Create binary distribution without native code and without documentation:
$ mvn package -Pdist -DskipTests -Dtar

Create binary distribution with native code and with documentation:
$ mvn package -Pdist,native,docs -DskipTests -Dtar

Create source distribution:
$ mvn package -Psrc -DskipTests

Create source and binary distributions with native code and documentation:
$ mvn package -Pdist,native,docs,src -DskipTests -Dtar

Create a local staging version of the website (in /tmp/hadoop-site):
$ mvn clean site; mvn site:stage -DstagingDirectory=/tmp/hadoop-site
3. Install the build environment on CentOS
yum install lzo-devel zlib-devel gcc gcc-c++
yum install openssl-devel
yum install ncurses-devel
yum install autoconf automake libtool cmake
The following is a supplementary list. Various errors during compilation may be related to these packages, so it is best to install them all up front (yum silently skips anything already installed):
sudo yum install kernel-devel
sudo yum -y install gcc*
sudo yum -y install cmake
sudo yum -y install glibc-headers
sudo yum -y install gcc-c++
sudo yum -y install zip-devel
sudo yum -y install openssl-devel
sudo yum -y install svn
sudo yum -y install git
sudo yum -y install ncurses-devel
sudo yum -y install lzo-devel
sudo yum -y install autoconf
sudo yum -y install libtool
sudo yum -y install automake
4. Install Maven
tar -zvxf maven-3.0.1.tar.gz
vim /root/.bashrc  => add the environment variables globally
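The article does not spell out the Maven variables themselves; a minimal sketch of what to append to /root/.bashrc, assuming Maven was extracted to /usr/local/maven-3.0.1 (a hypothetical path, adjust to your own layout):

```shell
# Hypothetical install path -- change to wherever you extracted Maven
export MAVEN_HOME=/usr/local/maven-3.0.1
export PATH=$PATH:$MAVEN_HOME/bin
```

Afterwards run `source /root/.bashrc` and verify with `mvn -version`.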
5. Install protobuf (must be installed as the root user)
tar -zvxf protobuf-2.5.0.tar.gz
./configure
make && make check && make install
(Note: chain the steps with &&, not a single &. A lone & would background each command and run them concurrently instead of in sequence.)
To uninstall later: make uninstall
6. Install Findbugs
tar -zvxf findbugs-3.0.0.tar.gz
vim /root/.bashrc  => add the environment variables globally:
export FINDBUGS_HOME=/usr/local/findbugs-3.0.0
export PATH=$PATH:$FINDBUGS_HOME/bin
7. Install Ant (supplementary)
tar -zvxf ant-1.9.9.tar.gz
vim /root/.bashrc  => add the environment variables globally
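As with Maven, the Ant variables are not listed in the article; a minimal sketch, assuming Ant was extracted to /usr/local/ant-1.9.9 (a hypothetical path, adjust to yours):

```shell
# Hypothetical install path -- change to wherever you extracted Ant
export ANT_HOME=/usr/local/ant-1.9.9
export PATH=$PATH:$ANT_HOME/bin
```

Afterwards run `source /root/.bashrc` and verify with `ant -version`.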
8. The mvn build command
mvn clean package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true -rf :hadoop-pipes
(The -rf :hadoop-pipes flag resumes the Maven reactor from the hadoop-pipes module, which is useful after a partial failure; for a fresh build, drop it so every module gets built.)
9. Errors encountered during compilation and how to fix them
(1) [ERROR] Unresolveable build extension: Plugin org.apache.felix:maven-bundle-plugin:2.4.0 or one of its dependencies could not be resolved: The following artifacts could not be resolved: biz.aQute.bnd:bndlib:jar:2.1.0, org.osgi:org.osgi.core:jar:4.2.0, org.apache.felix:org.apache.felix.bundlerepository:jar:1.6.6, org.easymock:easymock:jar:2.4, org.codehaus.plexus:plexus-interpolation:jar:1.15, org.apache.maven.shared:maven-dependency-tree:jar:2.1, org.codehaus.plexus:plexus-component-annotations:jar:1.5.5, org.eclipse.aether:aether-util:jar:0.9.0.M2: Could not transfer artifact biz.aQute.bnd:bndlib:jar:2.1.0 from/to central (http://repo.maven.apache.org/maven2): Connection to http://repo.maven.apache.org refused: connection timed out -> [Help 2]
repo.maven.apache.org times out: connections to this URL frequently fail from mainland China; it connects fine from outside the Great Firewall.
Fix:
Edit /path-to-maven/conf/settings.xml (see http://maven.apache.org/guides/mini/guide-mirror-settings.html) and add a mirror:
<mirror>
  <id>UK</id>
  <name>UK Central</name>
  <url>http://uk.maven.org/maven2</url>
  <mirrorOf>central</mirrorOf>
</mirror>
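For orientation, the mirror element belongs inside the <mirrors> section of settings.xml; a minimal sketch of the surrounding structure (structure only, keep any other settings you already have in the file):

```xml
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <mirrors>
    <mirror>
      <id>UK</id>
      <name>UK Central</name>
      <url>http://uk.maven.org/maven2</url>
      <mirrorOf>central</mirrorOf>
    </mirror>
  </mirrors>
</settings>
```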
(2) Exit code: 1 - /home/lpf/devTool/hadoop-2.6.0-src/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/interfaceStability.java:27: error: unexpected end tag: </ul>
Fix (found on Stack Overflow): append -Dmaven.javadoc.skip=true to the command:
mvn package -Pdist,native,docs -DskipTests -Dtar -Dmaven.javadoc.skip=true
(3) Error: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: input file /opt/hadoop-2.2.0-src/hadoop-hdfs-project/hadoop-hdfs/target/findbugsXml.xml does not exist
Fix:
Drop the docs profile:
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true -rf :hadoop-pipes
(4)Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-hdfs-httpfs: An Ant BuildException has occured: exec returned: 2
Fix:
In hadoop-hdfs-project/pom.xml, comment out the line <module>hadoop-hdfs-httpfs</module>, i.e. change it to <!--module>hadoop-hdfs-httpfs</module-->.
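If you prefer to script that edit, here is a minimal sketch using sed, demonstrated on a throwaway copy under /tmp (in the real source tree the target file is hadoop-hdfs-project/pom.xml, and -i.bak keeps a backup you can restore from):

```shell
# Demo on a throwaway copy; in the real tree the target file is
# hadoop-hdfs-project/pom.xml.
mkdir -p /tmp/hdfs-pom-demo
printf '<modules>\n  <module>hadoop-hdfs-httpfs</module>\n</modules>\n' \
  > /tmp/hdfs-pom-demo/pom.xml

# Comment the module out; -i.bak writes a backup to pom.xml.bak.
sed -i.bak 's|<module>hadoop-hdfs-httpfs</module>|<!--module>hadoop-hdfs-httpfs</module-->|' \
  /tmp/hdfs-pom-demo/pom.xml

cat /tmp/hdfs-pom-demo/pom.xml
```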
(5) [ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-kms: An Ant BuildException has occured: Can't get http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.43/bin/apache-tomcat-6.0.43.tar.gz to /home/liuwl/opt/datas/hadoop-2.5.0-cdh5.3.6/hadoop-common-project/hadoop-kms/downloads/apache-tomcat-6.0.43.tar.gz [ERROR] around Ant part ...<get dest="downloads/apache-tomcat-6.0.43.tar.gz" skipexisting="true" verbose="true" src="http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.43/bin/apache-tomcat-6.0.43.tar.gz"/>... @ 5:182 in /home/liuwl/opt/datas/hadoop-2.5.0-cdh5.3.6/hadoop-common-project/hadoop-kms/target/antrun/build-main.xml
Fix:
The apache-tomcat-6.0.43.tar.gz package was not fully downloaded. Download it manually into the hadoop-common-project/hadoop-kms/downloads directory; the download URL is given in hadoop-common-project/hadoop-kms/target/antrun/build-main.xml, e.g.:
wget -P hadoop-common-project/hadoop-kms/downloads http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.43/bin/apache-tomcat-6.0.43.tar.gz
Note: check build-main.xml for the exact Tomcat version required.
10. When compilation finishes, the distribution tarball is at hadoop-2.6.5-src/hadoop-dist/target/hadoop-2.6.5.tar.gz
libhadoop.so.1.0.0 and libhdfs.so.0.0.0 are under hadoop-2.6.5-src/hadoop-dist/target/hadoop-2.6.5/lib/native