This article explains how to resolve the java.lang.NoClassDefFoundError thrown by the Hadoop Basics MapReduce program. It should be a useful reference for anyone hitting the same problem; follow along below.

Problem Description

I'm trying Hadoop's basic MapReduce program, following the tutorial at http://java.dzone.com/articles/hadoop-basics-creating

The full code of the class is as follows (it is also available at the URL above):

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class Dictionary {
    public static class WordMapper extends Mapper<Text, Text, Text, Text> {
        private Text word = new Text();

        public void map(Text key, Text value, Context context) throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString(), ",");
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(key, word);
            }
        }
    }

    public static class AllTranslationsReducer extends Reducer<Text, Text, Text, Text> {
        private Text result = new Text();

        public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException {
            String translations = "";
            for (Text val : values) {
                translations += "|" + val.toString();
            }
            result.set(translations);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("welcome to Java 1");
        Configuration conf = new Configuration();
        System.out.println("welcome to Java 2");
        Job job = new Job(conf, "dictionary");
        job.setJarByClass(Dictionary.class);
        job.setMapperClass(WordMapper.class);
        job.setReducerClass(AllTranslationsReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setInputFormatClass(KeyValueTextInputFormat.class);
        FileInputFormat.addInputPath(job, new Path("/tmp/hadoop-cscarioni/dfs/name/file"));
        FileOutputFormat.setOutputPath(job, new Path("output"));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

But after running it in Eclipse, I get the following error:

welcome to Java 1
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:73)
at Dictionary.main(Dictionary.java:43)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 2 more
Solution

NoClassDefFoundError is thrown when a class that was visible at compile time cannot be found at run time. It is usually a classpath problem: a JAR containing a required class was present when the code was compiled but is missing when the program is run.
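
Note that Dictionary.java never refers to LogFactory itself; Hadoop's Configuration class does, inside its static initializer, which is why the failure only shows up at run time when the first Configuration object is created. A minimal command-line sketch of the same situation, using a generic hadoop-core.jar as a stand-in for whatever Hadoop jar your project actually compiles against:

# Compiling needs only the Hadoop classes that Dictionary.java references directly.
javac -cp hadoop-core.jar Dictionary.java

# Running also executes Configuration's static initializer, which needs commons-logging;
# with that jar missing from the classpath, this fails with
# java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
java -cp .:hadoop-core.jar Dictionary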

So try adding commons-logging-1.1.1.jar to your classpath; you can download it from http://commons.apache.org/logging/download_logging.cgi
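
Since the program is launched from Eclipse, the practical fix is to add that jar to the project's build path (Project → Properties → Java Build Path → Libraries → Add External JARs…), because Eclipse uses the build path as the runtime classpath when it runs the class. The command-line equivalent, again with illustrative jar names and with ':' as the classpath separator (use ';' on Windows), would be:

java -cp .:hadoop-core.jar:commons-logging-1.1.1.jar Dictionary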

This concludes the article on java.lang.NoClassDefFoundError in the Hadoop Basics MapReduce program. We hope the answer above is helpful, and thank you for your continued support!
