I have installed Apache Hadoop 2.6.0 on Windows 10. I have been trying to resolve this issue, but I cannot make sense of the error from my side.
I have set up all the paths correctly, and Hadoop prints its version correctly in the command prompt.
I have created a temp directory inside the hadoop directory, i.e. c:\hadoop\temp.
When I try to format the NameNode, I get this error:
C:\hadoop\bin>hdfs namenode -format
18/07/18 20:44:55 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = TheBhaskarDas/192.168.44.1
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 2.6.5
STARTUP_MSG: classpath = C:\hadoop\etc\hadoop;C:\hadoop\share\hadoop\common\lib\activation-1.1.jar;C:\hadoop\share\hadoop\common\lib\apacheds-i18n-2.0.0-M15.jar;C:\hadoop\share\hadoop\common\lib\apacheds-kerberos-codec-2.0.0-M15.jar;C:\hadoop\share\hadoop\common\lib\api-asn1-api-1.0.0-M20.jar;C:\hadoop\share\hadoop\common\lib\api-util-1.0.0-M20.jar;C:\hadoop\share\hadoop\common\lib\asm-3.2.jar;C:\hadoop\share\hadoop\common\lib\avro-1.7.4.jar;C:\hadoop\share\hadoop\common\lib\commons-beanutils-1.7.0.jar;C:\hadoop\share\hadoop\common\lib\commons-beanutils-core-1.8.0.jar;C:\hadoop\share\hadoop\common\lib\commons-cli-1.2.jar;C:\hadoop\share\hadoop\common\lib\commons-codec-1.4.jar;C:\hadoop\share\hadoop\common\lib\commons-collections-3.2.2.jar;C:\hadoop\share\hadoop\common\lib\commons-compress-1.4.1.jar;C:\hadoop\share\hadoop\common\lib\commons-configuration-1.6.jar;C:\hadoop\share\hadoop\common\lib\commons-digester-1.8.jar;C:\hadoop\share\hadoop\common\lib\commons-el-1.0.jar;C:\hadoop\share\hadoop\common\lib\commons-httpclient-3.1.jar;C:\hadoop\share\hadoop\common\lib\commons-io-2.4.jar;C:\hadoop\share\hadoop\common\lib\commons-lang-2.6.jar;C:\hadoop\share\hadoop\common\lib\commons-logging-1.1.3.jar;C:\hadoop\share\hadoop\common\lib\commons-math3-3.1.1.jar;C:\hadoop\share\hadoop\common\lib\commons-net-3.1.jar;C:\hadoop\share\hadoop\common\lib\curator-client-2.6.0.jar;C:\hadoop\share\hadoop\common\lib\curator-framework-2.6.0.jar;C:\hadoop\share\hadoop\common\lib\curator-recipes-2.6.0.jar;C:\hadoop\share\hadoop\common\lib\gson-2.2.4.jar;C:\hadoop\share\hadoop\common\lib\guava-11.0.2.jar;C:\hadoop\share\hadoop\common\lib\hadoop-annotations-2.6.5.jar;C:\hadoop\share\hadoop\common\lib\hadoop-auth-2.6.5.jar;C:\hadoop\share\hadoop\common\lib\hamcrest-core-1.3.jar;C:\hadoop\share\hadoop\common\lib\htrace-core-3.0.4.jar;C:\hadoop\share\hadoop\common\lib\httpclient-4.2.5.jar;C:\hadoop\share\hadoop\common\lib\httpcore-4.2.5.jar;C:\hadoop\share\hadoop\common\lib\jackson-core-asl-1.
9.13.jar;C:\hadoop\share\hadoop\common\lib\jackson-jaxrs-1.9.13.jar;C:\hadoop\share\hadoop\common\lib\jackson-mapper-asl-1.9.13.jar;C:\hadoop\share\hadoop\common\lib\jackson-xc-1.9.13.jar;C:\hadoop\share\hadoop\common\lib\jasper-compiler-5.5.23.jar;C:\hadoop\share\hadoop\common\lib\jasper-runtime-5.5.23.jar;C:\hadoop\share\hadoop\common\lib\java-xmlbuilder-0.4.jar;C:\hadoop\share\hadoop\common\lib\jaxb-api-2.2.2.jar;C:\hadoop\share\hadoop\common\lib\jaxb-impl-2.2.3-1.jar;C:\hadoop\share\hadoop\common\lib\jersey-core-1.9.jar;C:\hadoop\share\hadoop\common\lib\jersey-json-1.9.jar;C:\hadoop\share\hadoop\common\lib\jersey-server-1.9.jar;C:\hadoop\share\hadoop\common\lib\jets3t-0.9.0.jar;C:\hadoop\share\hadoop\common\lib\jettison-1.1.jar;C:\hadoop\share\hadoop\common\lib\jetty-6.1.26.jar;C:\hadoop\share\hadoop\common\lib\jetty-util-6.1.26.jar;C:\hadoop\share\hadoop\common\lib\jsch-0.1.42.jar;C:\hadoop\share\hadoop\common\lib\jsp-api-2.1.jar;C:\hadoop\share\hadoop\common\lib\jsr305-1.3.9.jar;C:\hadoop\share\hadoop\common\lib\junit-4.11.jar;C:\hadoop\share\hadoop\common\lib\log4j-1.2.17.jar;C:\hadoop\share\hadoop\common\lib\mockito-all-1.8.5.jar;C:\hadoop\share\hadoop\common\lib\netty-3.6.2.Final.jar;C:\hadoop\share\hadoop\common\lib\paranamer-2.3.jar;C:\hadoop\share\hadoop\common\lib\protobuf-java-2.5.0.jar;C:\hadoop\share\hadoop\common\lib\servlet-api-2.5.jar;C:\hadoop\share\hadoop\common\lib\slf4j-api-1.7.5.jar;C:\hadoop\share\hadoop\common\lib\slf4j-log4j12-1.7.5.jar;C:\hadoop\share\hadoop\common\lib\snappy-java-1.0.4.1.jar;C:\hadoop\share\hadoop\common\lib\stax-api-1.0-2.jar;C:\hadoop\share\hadoop\common\lib\xmlenc-0.52.jar;C:\hadoop\share\hadoop\common\lib\xz-1.0.jar;C:\hadoop\share\hadoop\common\lib\zookeeper-3.4.6.jar;C:\hadoop\share\hadoop\common\hadoop-common-2.6.5-tests.jar;C:\hadoop\share\hadoop\common\hadoop-common-2.6.5.jar;C:\hadoop\share\hadoop\common\hadoop-nfs-2.6.5.jar;C:\hadoop\share\hadoop\hdfs;C:\hadoop\share\hadoop\hdfs\lib\asm-3.2.jar;C:\hadoop\share
\hadoop\hdfs\lib\commons-cli-1.2.jar;C:\hadoop\share\hadoop\hdfs\lib\commons-codec-1.4.jar;C:\hadoop\share\hadoop\hdfs\lib\commons-daemon-1.0.13.jar;C:\hadoop\share\hadoop\hdfs\lib\commons-el-1.0.jar;C:\hadoop\share\hadoop\hdfs\lib\commons-io-2.4.jar;C:\hadoop\share\hadoop\hdfs\lib\commons-lang-2.6.jar;C:\hadoop\share\hadoop\hdfs\lib\commons-logging-1.1.3.jar;C:\hadoop\share\hadoop\hdfs\lib\guava-11.0.2.jar;C:\hadoop\share\hadoop\hdfs\lib\htrace-core-3.0.4.jar;C:\hadoop\share\hadoop\hdfs\lib\jackson-core-asl-1.9.13.jar;C:\hadoop\share\hadoop\hdfs\lib\jackson-mapper-asl-1.9.13.jar;C:\hadoop\share\hadoop\hdfs\lib\jasper-runtime-5.5.23.jar;C:\hadoop\share\hadoop\hdfs\lib\jersey-core-1.9.jar;C:\hadoop\share\hadoop\hdfs\lib\jersey-server-1.9.jar;C:\hadoop\share\hadoop\hdfs\lib\jetty-6.1.26.jar;C:\hadoop\share\hadoop\hdfs\lib\jetty-util-6.1.26.jar;C:\hadoop\share\hadoop\hdfs\lib\jsp-api-2.1.jar;C:\hadoop\share\hadoop\hdfs\lib\jsr305-1.3.9.jar;C:\hadoop\share\hadoop\hdfs\lib\log4j-1.2.17.jar;C:\hadoop\share\hadoop\hdfs\lib\netty-3.6.2.Final.jar;C:\hadoop\share\hadoop\hdfs\lib\protobuf-java-2.5.0.jar;C:\hadoop\share\hadoop\hdfs\lib\servlet-api-2.5.jar;C:\hadoop\share\hadoop\hdfs\lib\xercesImpl-2.9.1.jar;C:\hadoop\share\hadoop\hdfs\lib\xml-apis-1.3.04.jar;C:\hadoop\share\hadoop\hdfs\lib\xmlenc-0.52.jar;C:\hadoop\share\hadoop\hdfs\hadoop-hdfs-2.6.5-tests.jar;C:\hadoop\share\hadoop\hdfs\hadoop-hdfs-2.6.5.jar;C:\hadoop\share\hadoop\hdfs\hadoop-hdfs-nfs-2.6.5.jar;C:\hadoop\share\hadoop\yarn\lib\activation-1.1.jar;C:\hadoop\share\hadoop\yarn\lib\aopalliance-1.0.jar;C:\hadoop\share\hadoop\yarn\lib\asm-3.2.jar;C:\hadoop\share\hadoop\yarn\lib\commons-cli-1.2.jar;C:\hadoop\share\hadoop\yarn\lib\commons-codec-1.4.jar;C:\hadoop\share\hadoop\yarn\lib\commons-collections-3.2.2.jar;C:\hadoop\share\hadoop\yarn\lib\commons-compress-1.4.1.jar;C:\hadoop\share\hadoop\yarn\lib\commons-httpclient-3.1.jar;C:\hadoop\share\hadoop\yarn\lib\commons-io-2.4.jar;C:\hadoop\share\hadoop\yarn\lib\commons-l
ang-2.6.jar;C:\hadoop\share\hadoop\yarn\lib\commons-logging-1.1.3.jar;C:\hadoop\share\hadoop\yarn\lib\guava-11.0.2.jar;C:\hadoop\share\hadoop\yarn\lib\guice-3.0.jar;C:\hadoop\share\hadoop\yarn\lib\guice-servlet-3.0.jar;C:\hadoop\share\hadoop\yarn\lib\jackson-core-asl-1.9.13.jar;C:\hadoop\share\hadoop\yarn\lib\jackson-jaxrs-1.9.13.jar;C:\hadoop\share\hadoop\yarn\lib\jackson-mapper-asl-1.9.13.jar;C:\hadoop\share\hadoop\yarn\lib\jackson-xc-1.9.13.jar;C:\hadoop\share\hadoop\yarn\lib\javax.inject-1.jar;C:\hadoop\share\hadoop\yarn\lib\jaxb-api-2.2.2.jar;C:\hadoop\share\hadoop\yarn\lib\jaxb-impl-2.2.3-1.jar;C:\hadoop\share\hadoop\yarn\lib\jersey-client-1.9.jar;C:\hadoop\share\hadoop\yarn\lib\jersey-core-1.9.jar;C:\hadoop\share\hadoop\yarn\lib\jersey-guice-1.9.jar;C:\hadoop\share\hadoop\yarn\lib\jersey-json-1.9.jar;C:\hadoop\share\hadoop\yarn\lib\jersey-server-1.9.jar;C:\hadoop\share\hadoop\yarn\lib\jettison-1.1.jar;C:\hadoop\share\hadoop\yarn\lib\jetty-6.1.26.jar;C:\hadoop\share\hadoop\yarn\lib\jetty-util-6.1.26.jar;C:\hadoop\share\hadoop\yarn\lib\jline-0.9.94.jar;C:\hadoop\share\hadoop\yarn\lib\jsr305-1.3.9.jar;C:\hadoop\share\hadoop\yarn\lib\leveldbjni-all-1.8.jar;C:\hadoop\share\hadoop\yarn\lib\log4j-1.2.17.jar;C:\hadoop\share\hadoop\yarn\lib\netty-3.6.2.Final.jar;C:\hadoop\share\hadoop\yarn\lib\protobuf-java-2.5.0.jar;C:\hadoop\share\hadoop\yarn\lib\servlet-api-2.5.jar;C:\hadoop\share\hadoop\yarn\lib\stax-api-1.0-2.jar;C:\hadoop\share\hadoop\yarn\lib\xz-1.0.jar;C:\hadoop\share\hadoop\yarn\lib\zookeeper-3.4.6.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-api-2.6.5.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-applications-distributedshell-2.6.5.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-applications-unmanaged-am-launcher-2.6.5.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-client-2.6.5.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-common-2.6.5.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-registry-2.6.5.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-server-applicationhistoryservice-
2.6.5.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-server-common-2.6.5.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-server-nodemanager-2.6.5.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-server-resourcemanager-2.6.5.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-server-tests-2.6.5.jar;C:\hadoop\share\hadoop\yarn\hadoop-yarn-server-web-proxy-2.6.5.jar;C:\hadoop\share\hadoop\mapreduce\lib\aopalliance-1.0.jar;C:\hadoop\share\hadoop\mapreduce\lib\asm-3.2.jar;C:\hadoop\share\hadoop\mapreduce\lib\avro-1.7.4.jar;C:\hadoop\share\hadoop\mapreduce\lib\commons-compress-1.4.1.jar;C:\hadoop\share\hadoop\mapreduce\lib\commons-io-2.4.jar;C:\hadoop\share\hadoop\mapreduce\lib\guice-3.0.jar;C:\hadoop\share\hadoop\mapreduce\lib\guice-servlet-3.0.jar;C:\hadoop\share\hadoop\mapreduce\lib\hadoop-annotations-2.6.5.jar;C:\hadoop\share\hadoop\mapreduce\lib\hamcrest-core-1.3.jar;C:\hadoop\share\hadoop\mapreduce\lib\jackson-core-asl-1.9.13.jar;C:\hadoop\share\hadoop\mapreduce\lib\jackson-mapper-asl-1.9.13.jar;C:\hadoop\share\hadoop\mapreduce\lib\javax.inject-1.jar;C:\hadoop\share\hadoop\mapreduce\lib\jersey-core-1.9.jar;C:\hadoop\share\hadoop\mapreduce\lib\jersey-guice-1.9.jar;C:\hadoop\share\hadoop\mapreduce\lib\jersey-server-1.9.jar;C:\hadoop\share\hadoop\mapreduce\lib\junit-4.11.jar;C:\hadoop\share\hadoop\mapreduce\lib\leveldbjni-all-1.8.jar;C:\hadoop\share\hadoop\mapreduce\lib\log4j-1.2.17.jar;C:\hadoop\share\hadoop\mapreduce\lib\netty-3.6.2.Final.jar;C:\hadoop\share\hadoop\mapreduce\lib\paranamer-2.3.jar;C:\hadoop\share\hadoop\mapreduce\lib\protobuf-java-2.5.0.jar;C:\hadoop\share\hadoop\mapreduce\lib\snappy-java-1.0.4.1.jar;C:\hadoop\share\hadoop\mapreduce\lib\xz-1.0.jar;C:\hadoop\share\hadoop\mapreduce\hadoop-mapreduce-client-app-2.6.5.jar;C:\hadoop\share\hadoop\mapreduce\hadoop-mapreduce-client-common-2.6.5.jar;C:\hadoop\share\hadoop\mapreduce\hadoop-mapreduce-client-core-2.6.5.jar;C:\hadoop\share\hadoop\mapreduce\hadoop-mapreduce-client-hs-2.6.5.jar;C:\hadoop\share\hadoop\mapreduce\hadoop-ma
preduce-client-hs-plugins-2.6.5.jar;C:\hadoop\share\hadoop\mapreduce\hadoop-mapreduce-client-jobclient-2.6.5-tests.jar;C:\hadoop\share\hadoop\mapreduce\hadoop-mapreduce-client-jobclient-2.6.5.jar;C:\hadoop\share\hadoop\mapreduce\hadoop-mapreduce-client-shuffle-2.6.5.jar;C:\hadoop\share\hadoop\mapreduce\hadoop-mapreduce-examples-2.6.5.jar
STARTUP_MSG: build = https://github.com/apache/hadoop.git -r e8c9fe0b4c252caf2ebf1464220599650f119997; compiled by 'sjlee' on 2016-10-02T23:43Z
STARTUP_MSG: java = 1.8.0_181
************************************************************/
18/07/18 20:44:55 INFO namenode.NameNode: createNameNode [-format]
[Fatal Error] core-site.xml:19:6: The processing instruction target matching "[xX][mM][lL]" is not allowed.
18/07/18 20:44:55 FATAL conf.Configuration: error parsing conf core-site.xml
org.xml.sax.SAXParseException; systemId: file:/C:/hadoop/etc/hadoop/core-site.xml; lineNumber: 19; columnNumber: 6; The processing instruction target matching "[xX][mM][lL]" is not allowed.
at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2432)
at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2420)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2491)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1099)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1071)
at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1409)
at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:319)
at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:485)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1375)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1512)
18/07/18 20:44:55 FATAL namenode.NameNode: Failed to start namenode.
java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:/C:/hadoop/etc/hadoop/core-site.xml; lineNumber: 19; columnNumber: 6; The processing instruction target matching "[xX][mM][lL]" is not allowed.
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2597)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1099)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1071)
at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1409)
at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:319)
at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:485)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1375)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1512)
Caused by: org.xml.sax.SAXParseException; systemId: file:/C:/hadoop/etc/hadoop/core-site.xml; lineNumber: 19; columnNumber: 6; The processing instruction target matching "[xX][mM][lL]" is not allowed.
at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2432)
at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2420)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2491)
... 11 more
18/07/18 20:44:55 INFO util.ExitUtil: Exiting with status 1
18/07/18 20:44:55 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at TheBhaskarDas/192.168.44.1
************************************************************/
C:\hadoop\bin>
core-site.xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>hadoop.tmp.dir</name>
<value>C:\hadoop\temp</value>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:50071</value>
</property>
</configuration>
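The [Fatal Error] above points at line 19, column 6 of this file, which is exactly where the second `<?xml ... ?>` declaration begins. An XML declaration is only legal at the very start of a document, so any later processing instruction whose target is "xml" is rejected, which is what the message "The processing instruction target matching \"[xX][mM][lL]\" is not allowed" means. A minimal sketch reproducing the same failure with Python's standard-library parser (Python is used here only for illustration; Hadoop itself parses the file with Xerces):

```python
import xml.etree.ElementTree as ET

# Mimics the broken core-site.xml: a second XML declaration appears
# after the first prolog and comment block. Declarations are only legal
# at the very start of a document, so the parser rejects this input.
broken = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<!-- license header -->\n'
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<configuration/>\n'
)

try:
    ET.fromstring(broken)
except ET.ParseError as e:
    # Same class of error Hadoop's Xerces parser reports for this file.
    print("parse failed:", e)

# With the duplicate declaration removed, the same content parses fine.
fixed = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<!-- license header -->\n'
    '<configuration>\n'
    '  <property>\n'
    '    <name>fs.default.name</name>\n'
    '    <value>hdfs://localhost:50071</value>\n'
    '  </property>\n'
    '</configuration>\n'
)
root = ET.fromstring(fixed)
print(root.find("property/value").text)
```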
Best answer
I have fixed it.
I removed all the characters/anything that appeared before <?xml and validated the XML file with https://www.w3schools.com/xml/xml_validator.asp
The new core-site.xml:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<!--
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>hadoop.tmp.dir</name>
<value>\hadoop\temp</value>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:50071</value>
</property>
</configuration>
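If you want to catch this kind of corruption across all the Hadoop config files at once rather than reading the stack trace file by file, a short scan for duplicate declarations works. This is only a sketch: the conf directory path is taken from the question's install layout, and the function names are made up here.

```python
from pathlib import Path

def count_xml_declarations(text: str) -> int:
    """Count '<?xml' occurrences; a well-formed config file has exactly one."""
    return text.count("<?xml")

def find_broken_configs(conf_dir: str):
    """Return (filename, count) for *.xml files that do not have exactly one
    XML declaration, e.g. files that were pasted together twice."""
    bad = []
    for f in Path(conf_dir).glob("*.xml"):
        n = count_xml_declarations(f.read_text(encoding="utf-8", errors="replace"))
        if n != 1:
            bad.append((f.name, n))
    return bad

# Example on the conf dir from the question (path is an assumption):
# print(find_broken_configs(r"C:\hadoop\etc\hadoop"))
```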
Regarding this Hadoop single-node cluster setup error during NameNode formatting, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/51405728/