Type mismatch in key from map: expected org.apache.hadoop.io.IntWritable, received org.apache.hadoop.io.LongWritable

qkf9rpyu · posted 2021-05-29 in Hadoop

I have checked the code, but I cannot see why I am hitting this error.

The mapper:

public class movieMapper extends Mapper<LongWritable, Text, IntWritable, Text> {

    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        String[] token = value.toString().trim().split("::");
        int movieID = Integer.parseInt(token[0].trim());
        context.write(new IntWritable(movieID), new Text(token[1].trim()));
    }
}

The reducer:

public class joinReducer extends Reducer<IntWritable, Text, Text, Text> {

    public void reduce(IntWritable key, Iterable<Text> values, Context context) throws IOException, InterruptedException {
        float avgRating = 0.0f;
        int tokenCount = 0;
        float ratingSum = 0.0f;
        int count = 0;
        String movieName = "";

        for (Text val : values) {
            tokenCount += 1;
        }

        // If we have more than 40 views/ratings
        if (tokenCount - 1 > 40) {

            for (Text val : values) {
                String temp = val.toString();

                if (temp.equals("1") || temp.equals("2") || temp.equals("3") || temp.equals("4") || temp.equals("5")) {
                    float tempRating = Float.parseFloat(temp.trim());
                    ratingSum += tempRating;
                    count++;
                } else {
                    movieName = temp.trim();
                }
            }

            avgRating = ratingSum / (float) count;

            context.write(new Text(Float.toString(avgRating)), new Text(movieName));
        }
    }
}

The driver configuration:

Configuration conf = new Configuration();
String[] parameter = new GenericOptionsParser(conf, args).getRemainingArgs();

if (parameter.length != 3) {
    System.err.println("Three arguments needed: <File1> <File2> <Out>");
    System.exit(2);
}

// Set driver class
Job job1 = Job.getInstance(conf, "Join");
job1.setJarByClass(MyDriver.class);
job1.setReducerClass(joinReducer.class);

MultipleInputs.addInputPath(job1, new Path(parameter[0]), TextInputFormat.class, movieMapper.class);
MultipleInputs.addInputPath(job1, new Path(parameter[1]), TextInputFormat.class, ratingMapper.class);

job1.setMapOutputKeyClass(IntWritable.class);
job1.setMapOutputValueClass(Text.class);

job1.setOutputKeyClass(Text.class);
job1.setOutputValueClass(Text.class);

FileOutputFormat.setOutputPath(job1, new Path(parameter[2] + "/temp"));

job1.waitForCompletion(true);

The job log:

18/06/13 09:47:20 INFO mapreduce.Job: Job job_1528823320386_0018 running in uber mode : false
18/06/13 09:47:20 INFO mapreduce.Job:  map 0% reduce 0%
18/06/13 09:47:24 INFO mapreduce.Job: Task Id : attempt_1528823320386_0018_m_000000_0, Status : FAILED
Error: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.IntWritable, received org.apache.hadoop.io.LongWritable
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1069)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:712)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.mapreduce.Mapper.map(Mapper.java:124)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
18/06/13 09:47:25 INFO mapreduce.Job:  map 50% reduce 0%
18/06/13 09:47:29 INFO mapreduce.Job: Task Id : attempt_1528823320386_0018_m_000000_1, Status : FAILED
Error: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.IntWritable, received org.apache.hadoop.io.LongWritable
    (same stack trace as above)

Answer by siv3szwd:

Two mappers run in this job: movieMapper and ratingMapper. ratingMapper has a typo in its method declaration: the map method is named "reduce" instead of "map". Because the misnamed method does not override Mapper.map, Hadoop invokes the inherited default Mapper.map, which simply passes the input key/value pair through unchanged. Per your job configuration the reducer expects keys of type IntWritable, but TextInputFormat produces keys of type LongWritable (byte offsets) and values of type Text, hence the mismatch error. Note the org.apache.hadoop.mapreduce.Mapper.map frame in the stack trace, which shows the default map running. Renaming the method to map fixes the job.
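The mechanism can be reproduced in plain Java without Hadoop. The classes below are illustrative stand-ins (not the Hadoop API): a subclass method whose name does not match the parent's is a new method, not an override, so callers dispatching through the parent type still hit the parent's default. The @Override annotation turns this kind of typo into a compile-time error, which is why adding it to every mapper and reducer method is good practice.

```java
// Stand-in for Hadoop's Mapper: the default map() is an identity pass-through.
class BaseMapper {
    String map(String key) { return key; } // like Mapper.map emitting the input unchanged
}

// Misnamed method (the ratingMapper mistake): this does NOT override map(),
// so calls to map() on this object still run the identity default above.
class BrokenMapper extends BaseMapper {
    String reduce(String key) { return "movieID:" + key; } // never reached via map()
}

// Corrected version: @Override makes the compiler reject a misspelled name.
class FixedMapper extends BaseMapper {
    @Override
    String map(String key) { return "movieID:" + key; }
}

public class OverrideDemo {
    public static void main(String[] args) {
        BaseMapper broken = new BrokenMapper();
        BaseMapper fixed = new FixedMapper();
        System.out.println(broken.map("42")); // prints "42" -- the default identity ran
        System.out.println(fixed.map("42"));  // prints "movieID:42"
    }
}
```

In the real job the same thing happens: the identity default emits the TextInputFormat key (a LongWritable offset) where the framework expects the IntWritable movie ID.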
