I have installed Hadoop 2.5.2 in a CentOS VM. I am very new to Hadoop and am trying to run C++ code on Hadoop 2.x, based on tutorials.
I found that in Hadoop 2.x there is no (HADOOP_INSTALL)/c++/$(PLATFORM)/ folder. I see header files under $(HADOOP_INSTALL)/include, and library files such as libhadooppipes.a under $HADOOP_INSTALL/lib/native/. I adjusted my makefile like this:
CC = g++
HADOOP_INSTALL = /usr/local/hadoop
CPPFLAGS = -m32 -I$(HADOOP_INSTALL)/include
wordcount: WordCount.cpp
	$(CC) $(CPPFLAGS) $< -Wall -L$(HADOOP_INSTALL)/lib/native -lhadooppipes \
	-lhadooputils -lpthread -g -O2 -o $@
and changed the includes in the code (the rest of the code is the same as in the link above):
#include "Pipes.hh"
#include "TemplateFactory.hh"
#include "StringUtils.hh"
When I compile, I get:
$ make wordcount
g++ -m32 -I/usr/local/hadoop/include WordCount.cpp -Wall -L/usr/local/hadoop/lib/native -lhadooppipes \
-lhadooputils -lpthread -g -O2 -o wordcount
/usr/bin/ld: skipping incompatible /usr/local/hadoop/lib/native/libhadooppipes.a when searching for -lhadooppipes
/usr/bin/ld: cannot find -lhadooppipes
collect2: ld returned 1 exit status
Can anyone give me some pointers on how to compile C++ on Hadoop 2.x? (I can run C++ programs on Hadoop 1.x, though. What I am interested in is how to run a C++ program with Hadoop Pipes on Hadoop 2.x.) Thanks in advance.
You may refer to the following approach:
Problem:
How to compile a 32-bit binary and link it against a Hadoop archive (file format elf64-x86-64, architecture i386:x86-64)? Please advise.
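The architecture mismatch can be confirmed by inspecting the archive directly. A minimal check, assuming the install path from the makefile above:

$ objdump -f /usr/local/hadoop/lib/native/libhadooppipes.a

Each archive member should be reported with file format elf64-x86-64 and architecture i386:x86-64 (as quoted in the problem statement), i.e. the library is 64-bit, so the linker skips it when -m32 asks for 32-bit objects.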
Answer: For this you need multilib. On a 64-bit installation of Ubuntu 14.04 or Arch Linux, 32-bit applications have to be enabled.
For example, as this source says:
Enabling the multilib repository allows the user to run and build 32-bit applications on 64-bit installations of Arch Linux. multilib creates a directory containing 32-bit instruction set libraries inside /usr/lib32/, which 32-bit binary applications may need when executed.
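On Arch Linux, enabling the multilib repository amounts to uncommenting its section in /etc/pacman.conf and refreshing the package database. A minimal sketch (the Include path is the default one shipped with pacman):

# /etc/pacman.conf -- uncomment these two lines:
[multilib]
Include = /etc/pacman.d/mirrorlist

$ sudo pacman -Syu
$ sudo pacman -S multilib-devel    # 32-bit toolchain, roughly the gcc-multilib equivalent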
Otherwise, you have to install multilib:
sudo apt-get install gcc-multilib
and
sudo apt-get install ia32-libs-dev
This link can also help. Google has more information on multilib support as well.
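Once multilib is in place, a quick sanity check is to compile a trivial program with -m32 before retrying the wordcount build (a minimal sketch; test.cpp is just a throwaway example file):

$ echo 'int main() { return 0; }' > test.cpp
$ g++ -m32 test.cpp -o test32
$ file test32

file should report a 32-bit ELF executable; if it does, the 32-bit toolchain and libraries are set up correctly.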