For a Maven build, add the dependencies to the pom (the spark-core test-jar supplements the regular spark-core dependency and provides the test utilities SparkFunSuite and LocalSparkContext used below):
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>${spark.version}</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.mockito</groupId>
  <artifactId>mockito-all</artifactId>
  <version>1.9.5</version>
  <scope>test</scope>
</dependency>
For an sbt build, add the dependencies (the tests classifier is the sbt equivalent of the Maven test-jar):
"org.apache.spark" %% "spark-core" % "2.0.0" % "test" classifier "tests",
"org.mockito" % "mockito-all" % "1.9.5" % "test"
To stub a method of a class (member variables of a class cannot be stubbed), use spy. An example:
ReadFile.scala:
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

class ReadFile(sc: SparkContext) extends Serializable {
  // input is a def (a method), so it can be stubbed with spy; a val could not be
  def input: RDD[String] = {
    sc.textFile("hdfs://ip_address/xxx/data.txt")
  }
  def output: Array[String] = input.collect()
}
ReadFileSuite.scala:
import org.apache.spark.LocalSparkContext.withSpark
import org.apache.spark.{LocalSparkContext, SparkContext, SparkFunSuite}
import org.mockito.Mockito.{spy, when}

class ReadFileSuite extends SparkFunSuite with LocalSparkContext {
  test("test1") {
    withSpark(new SparkContext("local", "test")) { sc =>
      // spy wraps a real instance: unstubbed methods keep their real behavior
      val data = spy(new ReadFile(sc))
      // replace the HDFS input with a local file from test/resources
      val stubFile = sc.textFile(getClass.getResource("/data.txt").getFile)
      // note: when(...) on a spy invokes the real input once; harmless here
      // because textFile is lazy, but doReturn(stubFile).when(data).input is
      // the safer idiom for spies
      when(data.input).thenReturn(stubFile)
      // output calls the stubbed input, so it reads the local file
      assert(data.output.length === 3)
    }
  }
}
In ReadFileSuite the file is read from test/resources, not from the HDFS path hard-coded in ReadFile; the expected layout is sketched below.
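Assuming the standard Maven/sbt project layout (not shown in the original post), the fixture would live at src/test/resources/data.txt, and for the assertion above (length === 3) to pass it could contain any three lines, for example:

src/test/resources/data.txt:
line1
line2
line3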
If the class has no constructor parameters, you can create a full mock with mock(classOf[YourClass]) instead of a spy; a sketch follows.
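A minimal sketch, assuming a hypothetical Config class (not part of the example above). Unlike spy, a mock never runs the real implementation, and unstubbed methods return defaults such as null:

import org.mockito.Mockito.{mock, when}

class Config {
  def inputPath: String = "hdfs://ip_address/xxx/data.txt"
}

val conf = mock(classOf[Config])                  // no real instance is constructed
when(conf.inputPath).thenReturn("/tmp/data.txt")  // stub only the method under test
assert(conf.inputPath == "/tmp/data.txt")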
References: http://qiuguo0205.iteye.com/blog/1443344 and http://qiuguo0205.iteye.com/blog/1456528