Compiling and Installing Hadoop 2.7


0. Environment notes:

The build was performed on the following operating system:

    [root@host11 hadoop-2.7.1-src]# cat /etc/redhat-release
    CentOS release 6.5 (Final)

The Hadoop version is 2.7.1.

1. Install the dependency packages:

    yum install svn autoconf automake libtool cmake ncurses-devel openssl-devel gcc*
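Before moving on, it is worth confirming that the toolchain actually landed. A minimal sanity check, not part of the original write-up; the package names are simply the ones passed to yum above:

    rpm -q autoconf automake libtool cmake ncurses-devel openssl-devel
    gcc --version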
2. Set up the Java and Maven environments:

    wget -O jdk-8u60-linux-x64.tar.gz 'http://download.oracle.com/otn-pub/java/jdk/8u60-b27/jdk-8u60-linux-x64.tar.gz?AuthParam=1443446776_174368b9ab1a6a92468aba5cd4d092d0'
    tar -zxvf jdk-8u60-linux-x64.tar.gz -C /usr/local
    cd /usr/local
    ln -s jdk1.8.0_60 jdk
    echo 'export JAVA_HOME=/usr/local/jdk' >>/etc/profile
    echo 'export PATH=$JAVA_HOME/bin:$PATH' >>/etc/profile
    wget http://mirrors.hust.edu.cn/apache/maven/maven-3/3.3.3/binaries/apache-maven-3.3.3-bin.tar.gz
    tar -zxvf apache-maven-3.3.3-bin.tar.gz -C /usr/local
    cd /usr/local
    ln -s apache-maven-3.3.3 maven
    echo 'export PATH=/usr/local/maven/bin:$PATH' >/etc/profile.d/maven.sh
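With the profile entries in place, a quick verification is worthwhile. This is a sketch of my own, not from the original post; the expected version strings follow from the packages installed above:

    source /etc/profile
    java -version    # expect: java version "1.8.0_60"
    mvn -version     # expect: Apache Maven 3.3.3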
3. Download and install protobuf (version 2.5.0 is mandatory):

    wget https://codeload.github.com/google/protobuf/zip/v2.5.0 -O protobuf-2.5.0.zip
    unzip protobuf-2.5.0.zip
    wget http://googletest.googlecode.com/files/gtest-1.5.0.tar.bz2
    tar -jxvf gtest-1.5.0.tar.bz2
    mv gtest-1.5.0 ./protobuf-2.5.0/gtest
    cd protobuf-2.5.0
    ./autogen.sh
    ./configure
    make
    make check
    make install
    which protoc

    [root@host11 protobuf-master]# which protoc
    /usr/local/bin/protoc
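Since the Hadoop build hard-checks the protoc version (see the second failure in step 6 below), it pays to verify it right away. A minimal check, assuming protoc landed in /usr/local/bin as shown above:

    protoc --version    # must print: libprotoc 2.5.0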
4. Download and install Ant:

    wget http://mirrors.cnnic.cn/apache//ant/binaries/apache-ant-1.9.6-bin.zip
    unzip apache-ant-1.9.6-bin.zip
    mv apache-ant-1.9.6 /usr/local/ant
    echo 'export PATH=/usr/local/ant/bin:$PATH' >/etc/profile.d/ant.sh

5. Compile Hadoop:

    tar -zxvf hadoop-2.7.1-src.tar.gz
    cd hadoop-2.7.1-src
    mvn package -Pdist,native -DskipTests -Dtar

6. Troubleshooting:

The first build failure:

    [ERROR] Failed to execute goal on project hadoop-auth: Could not resolve dependencies for project org.apache.hadoop:hadoop-auth:jar:2.7.1: The following artifacts could not be resolved: org.mockito:mockito-all:jar:1.8.5, org.mortbay.jetty:jetty-util:jar:6.1.26, org.mortbay.jetty:jetty:jar:6.1.26, org.apache.tomcat.embed:tomcat-embed-core:jar:7.0.55, org.apache.httpcomponents:httpclient:jar:4.2.5, org.apache.zookeeper:zookeeper:jar:3.4.6: Could not transfer artifact org.mockito:mockito-all:jar:1.8.5 from/to central (https://repo.maven.apache.org/maven2): GET request of: org/mockito/mockito-all/1.8.5/mockito-all-1.8.5.jar from central failed: SSL peer shut down incorrectly -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
    [ERROR]
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR]   mvn <goals> -rf :hadoop-auth

Solution: this situation is very common and simply means some dependencies failed to download. Re-running the following command a few times resolves it, because Maven picks up from the artifacts already cached:

    mvn package -Pdist,native -DskipTests -Dtar

The second build failure:

    [ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.7.1:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: protoc version is 'libprotoc 3.0.0', expected version is '2.5.0' -> [Help 1]

The installed protobuf is too new; version 2.5 must be used (see step 3).
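Since the fix for the first failure is literally "re-run until the downloads complete", it can be scripted. A minimal sketch of my own (not from the original post) that retries the build a bounded number of times and stops on the first success:

    #!/bin/bash
    # Retry the Hadoop build up to 5 times; flaky mirror downloads
    # usually succeed within a few attempts.
    for i in 1 2 3 4 5; do
        mvn package -Pdist,native -DskipTests -Dtar && break
        echo "Build attempt $i failed; retrying..."
    done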
7. The log from a successful build:

    [INFO] Apache Hadoop Main ................................ SUCCESS [ 7.502 s]
    [INFO] Apache Hadoop Project POM ......................... SUCCESS [ 4.844 s]
    [INFO] Apache Hadoop Annotations ......................... SUCCESS [ 10.274 s]
    [INFO] Apache Hadoop Assemblies .......................... SUCCESS [ 0.477 s]
    [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [ 4.568 s]
    [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [ 11.000 s]
    [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [ 9.870 s]
    [INFO] Apache Hadoop Auth ................................ SUCCESS [ 9.003 s]
    [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [ 9.321 s]
    [INFO] Apache Hadoop Common .............................. SUCCESS [03:21 min]
    [INFO] Apache Hadoop NFS ................................. SUCCESS [ 20.029 s]
    [INFO] Apache Hadoop KMS ................................. SUCCESS [ 21.350 s]
    [INFO] Apache Hadoop Common Project ...................... SUCCESS [ 0.079 s]
    [INFO] Apache Hadoop HDFS ................................ SUCCESS [10:57 min]
    [INFO] Apache Hadoop HttpFS .............................. SUCCESS [01:15 min]
    [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [ 46.255 s]
    [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [ 21.495 s]
    [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [ 0.242 s]
    [INFO] hadoop-yarn ....................................... SUCCESS [ 0.137 s]
    [INFO] hadoop-yarn-api ................................... SUCCESS [01:34 min]
    [INFO] hadoop-yarn-common ................................ SUCCESS [01:31 min]
    [INFO] hadoop-yarn-server ................................ SUCCESS [ 0.291 s]
    [INFO] hadoop-yarn-server-common ......................... SUCCESS [ 35.037 s]
    [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [ 44.224 s]
    [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [ 4.315 s]
    [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [ 17.461 s]
    [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [ 46.435 s]
    [INFO] hadoop-yarn-server-tests .......................... SUCCESS [ 10.698 s]
    [INFO] hadoop-yarn-client ................................ SUCCESS [ 8.976 s]
    [INFO] hadoop-yarn-server-sharedcachemanager ............. SUCCESS [ 10.343 s]
    [INFO] hadoop-yarn-applications .......................... SUCCESS [ 0.113 s]
    [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [ 7.395 s]
    [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [ 4.006 s]
    [INFO] hadoop-yarn-site .................................. SUCCESS [ 0.108 s]
    [INFO] hadoop-yarn-registry .............................. SUCCESS [ 12.317 s]
    [INFO] hadoop-yarn-project ............................... SUCCESS [ 18.781 s]
    [INFO] hadoop-mapreduce-client ........................... SUCCESS [ 0.396 s]
    [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [ 46.350 s]
    [INFO] hadoop-mapreduce-client-common .................... SUCCESS [ 34.772 s]
    [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [ 8.779 s]
    [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [ 22.440 s]
    [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [ 12.865 s]
    [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [01:45 min]
    [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [ 6.051 s]
    [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [ 8.077 s]
    [INFO] hadoop-mapreduce .................................. SUCCESS [ 12.782 s]
    [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [ 24.680 s]
    [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [ 50.965 s]
    [INFO] Apache Hadoop Archives ............................ SUCCESS [ 6.861 s]
    [INFO] Apache Hadoop Rumen ............................... SUCCESS [ 12.928 s]
    [INFO] Apache Hadoop Gridmix ............................. SUCCESS [ 6.784 s]
    [INFO] Apache Hadoop Data Join ........................... SUCCESS [ 3.629 s]
    [INFO] Apache Hadoop Ant Tasks ........................... SUCCESS [ 7.135 s]
    [INFO] Apache Hadoop Extras .............................. SUCCESS [ 6.233 s]
    [INFO] Apache Hadoop Pipes ............................... SUCCESS [ 31.548 s]
    [INFO] Apache Hadoop OpenStack support ................... SUCCESS [ 10.084 s]
    [INFO] Apache Hadoop Amazon Web Services support ......... SUCCESS [35:23 min]
    [INFO] Apache Hadoop Azure support ....................... SUCCESS [ 36.126 s]
    [INFO] Apache Hadoop Client .............................. SUCCESS [ 24.463 s]
    [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [ 0.353 s]
    [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [ 12.506 s]
    [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [ 34.475 s]
    [INFO] Apache Hadoop Tools ............................... SUCCESS [ 0.159 s]
    [INFO] Apache Hadoop Distribution ........................ SUCCESS [02:37 min]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 01:12 h
    [INFO] Finished at: 2015-10-03T03:54:29+08:00
    [INFO] Final Memory: 91M/237M
    [INFO] ------------------------------------------------------------------------
    [root@host11 hadoop-2.7.1-src]#
8. Check the generated packages:

    cd /tmp/hadoop-2.7.1-src/hadoop-dist/target
    [root@host11 target]# ls -ld hadoop*
    drwxr-xr-x 9 root root      4096 Oct  3 03:51 hadoop-2.7.1
    -rw-r--r-- 1 root root 194796372 Oct  3 03:52 hadoop-2.7.1.tar.gz
    -rw-r--r-- 1 root root      2823 Oct  3 03:52 hadoop-dist-2.7.1.jar
    -rw-r--r-- 1 root root 390430395 Oct  3 03:54 hadoop-dist-2.7.1-javadoc.jar
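The whole point of the -Pdist,native profile is the bundled native libraries, so it is worth confirming they actually made it into the distribution. A sketch of my own (checknative is the stock Hadoop subcommand for this):

    cd /tmp/hadoop-2.7.1-src/hadoop-dist/target/hadoop-2.7.1
    bin/hadoop checknative -a
    # expect hadoop and zlib (plus any optional codecs present at
    # build time) to be reported as true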
9. At this point the build has completed successfully.
