
Unit testing Hadoop with Mockito

2014-09-04
Because a Hadoop mapper or reducer depends on the framework components it runs against (such as its Context), a traditional JUnit test on its own may not meet our needs. With Mockito we can mock those Hadoop components and test the map and reduce functions in isolation. Below I show how to unit test the map function from the NCDC maximum-temperature example.

1. Add the Mockito jar to your project. Hadoop projects usually already bundle it; if yours does not, download it yourself (not covered here).

 

2. Write the Mapper class that we will test with Mockito. The MaxTemperatureMapper class below reads NCDC weather records and emits the year as the key and the air temperature as the value:

package org.wucl.hadoop.maxtemperature;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MaxTemperatureMapper extends
        Mapper<LongWritable, Text, Text, IntWritable> {

    // NCDC records mark a missing temperature reading with 9999
    private static final int MISSING = 9999;

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        String id = line.substring(15, 19); // the year, e.g. "1950"
        int airTemperature;
        if (line.charAt(87) == '+') {
            // skip the leading plus sign before parsing
            airTemperature = Integer.parseInt(line.substring(88, 92));
        } else {
            airTemperature = Integer.parseInt(line.substring(87, 92));
        }
        String quality = line.substring(92, 93);
        // emit (year, temperature) only for readings that are present and of acceptable quality
        if (airTemperature != MISSING && quality.matches("[01459]")) {
            context.write(new Text(id), new IntWritable(airTemperature));
        }
    }

}

(If you are not familiar with how NCDC weather records are laid out, a quick search for the NCDC record format will fill you in; it is worth working through it yourself.)
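To make the substring offsets used above concrete, here is a small standalone sketch (not part of the original post; the class name is mine) that prints the fields the mapper reads, using the same sample record that appears in the tests below:

public class NcdcRecordPeek {
    public static void main(String[] args) {
        String line =
                "0043011990999991950051518004+68750+023550FM-12+0382" +
                "99999V0203201N00261220001CN9999999N9-00111+99999999999";
        System.out.println("year        = " + line.substring(15, 19)); // 1950
        System.out.println("temperature = " + line.substring(87, 92)); // -0011, in tenths of a degree Celsius
        System.out.println("quality     = " + line.substring(92, 93)); // 1
    }
}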

3. Build a Context instance. To unit test the map function above, we first need a Context instance to pass to it, which we can create with Mockito's mock() method:

Context context = mock(Context.class);

(Of course, the test class needs static imports from the Mockito package so that mock() can be called directly like this; the complete code is given further below.)
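For reference, these are the static imports in question (they also appear in the complete listing at the end):

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;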

 

4. Call the map function. Once the context instance has been created, we can create the Mapper object and call its map method:

MaxTemperatureMapper mapper = new MaxTemperatureMapper();
Text value = new Text(
        "0043011990999991950051518004+68750+023550FM-12+0382" +    // year "1950" at offset 15
        "99999V0203201N00261220001CN9999999N9-00111+99999999999"); // temperature "-0011" at offset 87
mapper.map(null, value, context); // the mapper never uses the key, so null is fine here


5. Verify. Mockito's verify() plays a role similar to JUnit's Assert; reading the code is the quickest way to understand it:

verify(context).write(new Text("1950"), new IntWritable(-11));

If the mapper wrote the pair ("1950", -11) to the context exactly once, verify() returns quietly and the test passes; otherwise Mockito fails the test and reports which interaction was expected.



The complete Mockito test code is below:

package org.wucl.hadoop.maxtemperature;

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper.Context;
import org.junit.Test;

public class MaxTemperatureMapperTest {

    @Test
    public void processesValidRecord() throws IOException, InterruptedException {
        Context context = mock(Context.class);
        MaxTemperatureMapper mapper = new MaxTemperatureMapper();
        Text value = new Text(
                "0043011990999991950051518004+68750+023550FM-12+0382" +    // year "1950" at offset 15
                "99999V0203201N00261220001CN9999999N9-00111+99999999999"); // temperature "-0011" at offset 87
        mapper.map(null, value, context);
        verify(context).write(new Text("1950"), new IntWritable(-11));
    }

    // a second valid record, this time with a positive temperature (+0020, i.e. 20)
    @Test
    public void processesValidRecord2() throws IOException,
            InterruptedException {
        Context context = mock(Context.class);
        MaxTemperatureMapper mapper = new MaxTemperatureMapper();
        Text value = new Text(
                "0043011990999991950051518004+68750+023550FM-12+0382" +    // year "1950" at offset 15
                "99999V0203201N00261220001CN9999999N9+00201+99999999999"); // temperature "+0020" at offset 87
        mapper.map(null, value, context);
        verify(context).write(new Text("1950"), new IntWritable(20));
    }

}
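Mockito can also verify that an interaction did not happen, which is useful for exercising the filter branch of the mapper. Below is a sketch of a third test (not in the original post) that feeds in a record whose temperature field is 9999, the MISSING marker, and asserts that nothing was written. It needs two extra static imports: never() from org.mockito.Mockito, and any() from org.mockito.Matchers (org.mockito.ArgumentMatchers in newer Mockito versions):

@Test
public void ignoresMissingTemperatureRecord() throws IOException,
        InterruptedException {
    Context context = mock(Context.class);
    MaxTemperatureMapper mapper = new MaxTemperatureMapper();
    Text value = new Text(
            "0043011990999991950051518004+68750+023550FM-12+0382" +    // year "1950" at offset 15
            "99999V0203201N00261220001CN9999999N9+99991+99999999999"); // temperature "+9999" = MISSING
    mapper.map(null, value, context);
    // never() makes verify fail if write() was called with any arguments at all
    verify(context, never()).write(any(Text.class), any(IntWritable.class));
}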


 