I am trying to build Hadoop 2.4.1 from source on Windows 7, and the build fails at the final stage with the error below. I have searched for a solution, but to no avail.
OS: Windows 7, 6.1 (32-bit)
Java: java version "1.8.0_11"
Protocol Buffers (protoc): 2.5.0
Apache Maven: 3.2.2
I followed these tutorials: https://wiki.apache.org/hadoop/Hadoop2OnWindows and https://www.srccodes.com/p/article/38/build-install-configure-run-apache-hadoop-2.2.0-microsoft-windows-os
Maven command: mvn package -Pdist,native-win -DskipTests -Dtar
Please help.
Build output:
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Distribution 2.4.1
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-dist ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-dist ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-dist ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-dist ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-dist ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hadoop-dist ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (dist) @ hadoop-dist ---
[INFO] Executing tasks
main:
[exec]
[exec] Current directory /cygdrive/c/hdc/hadoop-dist/target
[exec]
[exec] $ rm -rf hadoop-2.4.1
[exec] $ mkdir hadoop-2.4.1
[exec] $ cd hadoop-2.4.1
[exec] $ cp -r /cygdrive/c/hdc/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/bin /cygdrive/c/hdc/ha
doop-common-project/hadoop-common/target/hadoop-common-2.4.1/etc /cygdrive/c/hdc/hadoop-common-project/hadoop-common/targ
et/hadoop-common-2.4.1/libexec /cygdrive/c/hdc/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/sbin /cygdr
ive/c/hdc/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/share .
[exec] cp.exe: /cygdrive/c/hdc/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/bin: No such file or d
irectory
[exec] cp.exe: /cygdrive/c/hdc/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/etc: No such file or d
irectory
[exec] cp.exe: /cygdrive/c/hdc/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/libexec: No such file
or directory
[exec] cp.exe: /cygdrive/c/hdc/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/sbin: No such file or
directory
[exec] cp.exe: /cygdrive/c/hdc/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/share: No such file or
directory
[exec]
[exec] Failed!
[exec]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 4.006 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 1.592 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 4.343 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.233 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 3.262 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 4.871 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 4.450 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 3.150 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 3.384 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [02:26 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 9.297 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.058 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [03:31 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 34.454 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 9.568 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 4.962 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.071 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.055 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [ 42.210 s]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 36.877 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.048 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 7.941 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 14.402 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 2.625 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 4.543 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 16.911 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 1.438 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 3.495 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.042 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 2.033 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 1.682 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.051 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 3.453 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.117 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 33.057 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 23.963 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 2.181 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 13.202 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 8.889 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 7.842 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 1.379 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 5.019 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 3.700 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 3.600 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 12.786 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 2.083 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 6.695 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 4.275 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 1.995 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 2.223 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 0.042 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 4.186 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 7.937 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.158 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 3.874 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 5.658 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.038 s]
[INFO] Apache Hadoop Distribution ......................... FAILURE [ 2.740 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:23 min
[INFO] Finished at: 2014-07-18T17:43:34+04:00
[INFO] Final Memory: 65M/238M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-dist: An Ant
BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec failonerror="true" dir="C:\hdc\hadoop-dist\target" executable="sh">... @ 34:76 in C:\hdc
\hadoop-dist\target\antrun\build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-dist
Accepted answer
I ran into the same problem. There is additional information just above the "Reactor Summary" section, for example:

cp: cannot open `/cygdrive/c/hadoop/hadoop-common-project/hadoop-common/target/hadoop-common-3.0.0-SNAPSHOT/bin/hadoop.dll' for reading: Permission denied

Simply run the command prompt as Administrator and rebuild.
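The failing antrun step is just a `cp -r` of five directories (bin, etc, libexec, sbin, share) out of hadoop-common's target folder; if any of them is absent or unreadable, you get exactly the "No such file or directory" / "Permission denied" errors above. Before re-running the whole 12-minute build from an elevated prompt, you can sanity-check those directories from the Cygwin shell. This is only a sketch: `BASE` defaults to a throwaway demo directory here (which the script populates partially, to show the output format); on a real build you would point it at the hadoop-common target path from the log.

```shell
#!/bin/sh
# Sketch: check that the per-module target directories exist before the
# antrun "dist" step tries to cp them.  BASE is illustrative; on the
# asker's machine it would be
#   /cygdrive/c/hdc/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1
BASE="${BASE:-/tmp/hadoop-common-2.4.1-demo}"
DIRS="bin etc libexec sbin share"

# Demo setup only: create two of the five expected directories so the
# check below has something to report.
mkdir -p "$BASE/bin" "$BASE/etc"

missing=0
for d in $DIRS; do
  if [ ! -d "$BASE/$d" ]; then
    echo "missing: $BASE/$d"
    missing=$((missing + 1))
  fi
done
echo "missing count: $missing"
```

Each `missing:` line points at output that a module's packaging step never produced (or that the shell cannot read without Administrator rights). If everything is present and readable, `mvn <goals> -rf :hadoop-dist` from the log's resume hint avoids rebuilding the modules that already succeeded.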
About "windows - Ant BuildException error building Hadoop 2.4.1": a similar question was found on Stack Overflow: https://stackoverflow.com/questions/24827416/