草庐IT


jakarta-ee - java.lang.RuntimeException: java.net.ConnectException when running the Hadoop pi example

I have configured Hadoop on two machines and can ssh between them without a password. I successfully formatted the namenode with: bin/hadoop namenode -format. Then I tried to run the pi example that ships with the Hadoop tarball: sandip@master:~/hadoop-1.0.4$ bin/hadoop jar hadoop-examples-1.0.4.jar pi 5 500 Number of Maps = 5 Samples per Map = 500 13/04/14 04:13:04 INFO ipc.Client: Retrying connect to server: master/192.168.188.
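The "Retrying connect to server" loop in the excerpt above means the IPC client cannot reach the NameNode's RPC endpoint at all, before any MapReduce logic runs. As a quick sanity check, a plain `java.net.Socket` probe shows whether anything is listening at a given address. This is a minimal sketch unrelated to Hadoop itself; the class name and the host/port values are placeholders, not part of the original question:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class PortProbe {
    // Returns true if something accepts a TCP connection at host:port within timeoutMs.
    static boolean reachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // ConnectException ("Connection refused") or a timeout both land here.
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        // Grab a free local port, then close it so nothing is listening there.
        int freePort;
        try (ServerSocket ss = new ServerSocket(0)) {
            freePort = ss.getLocalPort();
        }
        System.out.println(reachable("127.0.0.1", freePort, 500)); // prints false
    }
}
```

If such a probe fails for the NameNode's host and RPC port, the cause is the daemon not running (or a firewall/bind-address issue), not the pi job itself.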

java - Hadoop OutputFormat RuntimeException when running an Apache Spark Kafka stream

I am running a program that uses Apache Spark to fetch data from an Apache Kafka cluster and write it into a Hadoop file. My program is as follows: public final class SparkKafkaConsumer { public static void main(String[] args) { SparkConf sparkConf = new SparkConf().setAppName("JavaKafkaWordCount"); JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, new Duration(2000)); Map topicMap = ne

java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMaster

When I run start-hbase.sh, HMaster and HRegionServer start, but after a while they are no longer visible. Looking through the logs I found this. HMaster: java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMaster at org.apache.hadoop.hbase.master.HMaster.constructMaster(HMaster.java:3150) at org.apache.hadoop.hbase.master.HMasterCommandLine.

hadoop - HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

When I try to start Hive, I get the following error: SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/home/ezhil/hadoop-ecosystem/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/home/ezhil/hadoop-ecosystem/hadoop/share/hadoop/common/lib/slf4j-

hadoop - FAILED: Error in metadata: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

I shut down my HDFS client while HDFS and the Hive instance were running. Now, when I log back into Hive, I cannot perform any DDL tasks such as "show tables" or "describe tablename". It gives me an error like this: ERROR exec.Task (SessionState.java:printError(401)) - FAILED: Error in metadata: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient org.apache.hadoop.hive.ql.metadata.HiveExcep

java - hadoop - java.lang.RuntimeException: java.lang.InstantiationException

I am trying to run a map-reduce program from a client machine (Windows 7). Here is the map-reduce class: Configuration conf = new Configuration(); conf.addResource(new Path("C:\\app\\hadoop-2.0.0-cdh4.3.0\\etc\\hadoop\\core-site.xml")); conf.addResource(new Path("C:\\app\\hadoop-2.0.0-cdh4.3.0\\etc\\hadoop\\hdfs-site.xml")); conf.set("fs.defaultFS", "hdfs://hos
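java.lang.InstantiationException is what the reflection machinery throws when a class cannot be created through a no-argument constructor (for example, it is abstract or only has constructors that take arguments), and Hadoop's utilities typically surface it wrapped in a RuntimeException, as in the title above. A minimal, Hadoop-free sketch of that failure mode — the class and method names here are made up for illustration and are not from the original question:

```java
public class InstDemo {
    // A mapper-like class with no no-arg constructor: reflective instantiation
    // of such a class fails with InstantiationException.
    static class NeedsArg {
        NeedsArg(int x) { }
    }

    // Mimics the common framework pattern: reflectively create an instance and
    // rethrow any checked reflection failure as a RuntimeException.
    static Object create(Class<?> c) {
        try {
            return c.newInstance(); // deprecated since Java 9, but shows the classic behavior
        } catch (InstantiationException | IllegalAccessException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        try {
            create(NeedsArg.class);
        } catch (RuntimeException e) {
            System.out.println(e.getCause().getClass().getName()); // prints java.lang.InstantiationException
        }
    }
}
```

In the Hadoop case the class being instantiated is usually the configured mapper, reducer, or input/output format, so the first thing to check is that the configured class is concrete and has a public no-arg constructor.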

hadoop - How to resolve java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 2?

I am trying to run NLTK in a Hadoop environment. Here is the command I used: bin/hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-streaming-1.0.4.jar -input /user/nltk/input/ -output /user/nltk/output1/ -file /home/hduser/softwares/NLTK/unsupervised_sentiment-master.zip -mapper /home/hduser/softwares/NLTK/unsupervised_sentiment-master/sentiment.py

java - A question about the Hadoop "java.lang.RuntimeException: java.lang.ClassNotFoundException:" error

Here is my source code: import java.io.DataInput; import java.io.DataOutput; import java.io.IOException; import java.util.ArrayList; import java.util.regex.Matcher; import java.util.regex.Pattern; import org.apache.hadoop.conf.Configuration; import org.apache.hadoop.fs.FileSystem; import org.apache.hadoop.fs.Path; import org.apach
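Hadoop raises the error in this title when the job's mapper or reducer class cannot be found on the task's classpath, and since ClassNotFoundException is a checked exception, the framework rethrows it wrapped in a RuntimeException. A standalone sketch of that wrapping pattern — the missing class name below is hypothetical, and the helper is not a real Hadoop API:

```java
public class CnfDemo {
    // Loads a class by name, rethrowing the checked ClassNotFoundException
    // as an unchecked RuntimeException, the way Hadoop's configuration code does.
    static String load(String name) {
        try {
            return Class.forName(name).getName();
        } catch (ClassNotFoundException e) {
            throw new RuntimeException("java.lang.ClassNotFoundException: " + name, e);
        }
    }

    public static void main(String[] args) {
        // Present on every JVM's classpath:
        System.out.println(load("java.lang.String")); // prints java.lang.String
        // A class that does not exist anywhere (hypothetical name):
        try {
            load("com.example.WordCount$TokenizerMapper");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

For a real Hadoop job, the usual remedy is to call job.setJarByClass(...) with a class from your own jar so the framework ships that jar to the task nodes.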

apache - java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

I installed Hadoop 2.7.1 and apache-hive-1.2.1 on Ubuntu 14.0. Why does this error occur? Is a separate metastore installation required? When we type the hive command in the terminal, how are the XML config files invoked internally, and what is the flow between them? Is any other configuration required? When I run the hive command in the Ubuntu 14.0 terminal, it throws the following exception: $ hive Logging initialized using configuration in jar:file:/usr/local/hive/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar!/hive-log4j.pro

php - Fatal error: Uncaught exception 'RuntimeException' with message 'Puli Factory is not available' while sending mail using Mailgun

I am trying to send mail with the following code, using guzzlehttp, but I get Fatal error: Uncaught exception 'RuntimeException' with message 'Puli Factory is not available'. Please help me find a solution, thanks! Here is my code: require 'vendor/autoload.php'; use Mailgun\Mailgun; # Instantiate the client. $mgClient = new Mailgun('key-'); $domain = "domain"; # Make the call to the client. $result = $mgClient->sendMessage("$domain", ar