The official contribution guide (http://wiki.apache.org/hadoop/HowToContribute) states that the Hadoop native libraries support only *nix platforms: they are widely used on GNU/Linux, but Cygwin and Mac OS X are not supported. A search turned up a patch that makes the native libraries build on Mac OS X, so this post walks through compiling the Hadoop native libraries on that platform in detail.
[1] Environment:
- Hadoop 2.2.0
- Mac OS X 10.9.1
For the detailed dependency setup (protoc, cmake, etc.), see the earlier post on building Hadoop 2.2.0 from source: http://www.micmiu.com/opensource/hadoop/hadoop-build-source-2-2-0/
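Before starting, it is worth confirming that the build prerequisites are actually on the PATH. A minimal sketch (the tool list here is just the ones mentioned above; your setup may need others, such as zlib or openssl headers):

```shell
# Report whether each build prerequisite is installed and where it lives.
for tool in mvn protoc cmake; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found at $(command -v "$tool")"
  else
    echo "$tool: MISSING - install it before building" >&2
  fi
done
```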
[2] Steps to build the native libraries on Mac OS X:
1. Check out the Hadoop 2.2.0 source:
$ svn co https://svn.apache.org/repos/asf/hadoop/common/tags/release-2.2.0/
2. Apply the patch
The issue is discussed in detail at https://issues.apache.org/jira/browse/HADOOP-9648
Patch download: https://issues.apache.org/jira/secure/attachment/12617363/HADOOP-9648.v2.patch
# switch to the root of the Hadoop source tree
$ wget https://issues.apache.org/jira/secure/attachment/12617363/HADOOP-9648.v2.patch
$ patch -p1 < HADOOP-9648.v2.patch
PS: to revert the patch, run: patch -RE -p1 < HADOOP-9648.v2.patch
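If you are unsure whether the patch still applies cleanly to your checkout, GNU patch's `--dry-run` reports the result without touching any files; in the source root that is simply `patch -p1 --dry-run < HADOOP-9648.v2.patch`. A self-contained illustration with a stand-in file and diff:

```shell
# Create a tiny tree plus a patch for it, then dry-run before applying.
mkdir -p demo/a
printf 'hello\n' > demo/a/greet.txt
cat > demo/fix.patch <<'EOF'
--- a/greet.txt
+++ a/greet.txt
@@ -1 +1 @@
-hello
+hello, hadoop
EOF
cd demo
patch -p0 --dry-run < fix.patch   # report only; files are left unchanged
patch -p0 < fix.patch             # apply for real
cat a/greet.txt                   # -> hello, hadoop
```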
3. Compile the native libraries
From the root of the Hadoop source tree, run the native build:
$ mvn package -Pdist,native -DskipTests -Dtar
A successful build ends with log output like this:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [1.511s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [0.493s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [0.823s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.561s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.245s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [2.465s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [0.749s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [0.832s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [2.070s]
[INFO] Apache Hadoop Common .............................. SUCCESS [1:00.030s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [0.285s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.049s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:13.339s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [20.259s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [0.767s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [0.279s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.046s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.239s]
[INFO] hadoop-yarn-api ................................... SUCCESS [7.641s]
[INFO] hadoop-yarn-common ................................ SUCCESS [5.479s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.114s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [1.743s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [6.381s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [0.259s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [0.578s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.303s]
[INFO] hadoop-yarn-client ................................ SUCCESS [0.233s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.062s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [0.253s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.074s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1.504s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [0.242s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.172s]
[INFO] hadoop-yarn-project ............................... SUCCESS [1.235s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [3.664s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [0.183s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [0.495s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [1.296s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [0.580s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [0.213s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [0.344s]
[INFO] hadoop-mapreduce .................................. SUCCESS [1.303s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [0.257s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [9.925s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [0.282s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [0.403s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [0.283s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [0.197s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [0.241s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [8.249s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [0.492s]
[INFO] Apache Hadoop Client .............................. SUCCESS [0.373s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.133s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [0.439s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [0.596s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.044s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [0.194s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3:44.266s
[INFO] Finished at: Fri Jan 17 10:06:17 CST 2014
[INFO] Final Memory: 66M/123M
[INFO] ------------------------------------------------------------------------
micmiu-mbp:trunk micmiu$
Once the build succeeds, the following appears under <HADOOP source root>/hadoop-dist/target/hadoop-2.2.0/lib/:
micmiu-mbp:lib micmiu$ tree
.
|____.DS_Store
|____native
| |____libhadoop.1.0.0.dylib
| |____libhadoop.a
| |____libhadoop.dylib
| |____libhadooppipes.a
| |____libhadooputils.a
| |____libhdfs.0.0.0.dylib
| |____libhdfs.a
| |____libhdfs.dylib
Then copy the generated native libraries to the corresponding location in your deployment environment and create symbolic links:
$ ln -s libhadoop.1.0.0.dylib libhadoop.so
$ ln -s libhdfs.0.0.0.dylib libhdfs.so
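After linking, it helps to verify that each .so actually resolves to a real file. A self-contained sketch (the dylibs here are empty stand-ins so the snippet runs anywhere; in practice you would run only the loop inside the deployed lib/native directory):

```shell
mkdir -p native && cd native
touch libhadoop.1.0.0.dylib libhdfs.0.0.0.dylib   # stand-ins for the built dylibs
ln -sf libhadoop.1.0.0.dylib libhadoop.so
ln -sf libhdfs.0.0.0.dylib libhdfs.so
# Confirm every .so symlink points at an existing file.
for link in libhadoop.so libhdfs.so; do
  target=$(readlink "$link")
  if [ -e "$link" ]; then
    echo "$link -> $target (ok)"
  else
    echo "$link -> $target (broken)" >&2
  fi
done
```

If your Hadoop version provides it, `hadoop checknative` is another way to confirm the native library is actually picked up at runtime.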
Download (prebuilt libraries): http://yun.baidu.com/s/1c0jBZDQ
[3] References:
- http://wiki.apache.org/hadoop/HowToContribute
- http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-common/NativeLibraries.html
- https://issues.apache.org/jira/browse/HADOOP-9648
- https://issues.apache.org/jira/browse/HADOOP-3659
—————– EOF @Michael Sun —————–
OP, my build failed right near the end:
[INFO] Apache Hadoop Pipes ………………………….. FAILURE [ 0.627 s]
[INFO] Apache Hadoop OpenStack support ……………….. SKIPPED
[INFO] Apache Hadoop Amazon Web Services support ………. SKIPPED
[INFO] Apache Hadoop Azure support …………………… SKIPPED
[INFO] Apache Hadoop Client …………………………. SKIPPED
[INFO] Apache Hadoop Mini-Cluster ……………………. SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator …………. SKIPPED
[INFO] Apache Hadoop Tools Dist ……………………… SKIPPED
[INFO] Apache Hadoop Tools ………………………….. SKIPPED
[INFO] Apache Hadoop Distribution ……………………. SKIPPED
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part …… @ 5:135 in /Users/wanwenqing/Hadoop/hadoop-2.7.2-src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
[ERROR] -> [Help 1]
Can you help take a look?
😛 Blogger, could you send me a copy too? The one I downloaded doesn't work.
It's on my Baidu Pan: http://yun.baidu.com/s/1c0rfIOo
Do you happen to have a build for 2.5.0?
Sorry, no. The build process should be much the same, though.
Could you send me a copy? simonyy81@gmail.com, thanks.
Download it from Baidu Pan: http://yun.baidu.com/s/1c0rfIOo
😛 Please send me one too, thanks. I've been at this for two days and still get lots of errors.
Download it from Baidu Pan: http://yun.baidu.com/s/1c0rfIOo
😛 Hi, your Baidu Pan link has expired. Could you post a new one?
Blame the "Clean Net" campaign; it seems a lot of shared folders are no longer accessible. Send me your email address and tell me which platform you need, and I'll send it to you.
Could you send me a copy?
I'm on Mac OS X and just starting to deploy Hadoop!
daweilang@163.com
Thanks!
Download it from Baidu Pan: http://yun.baidu.com/s/1c0rfIOo
467449165@qq.com