Spark transform operator: map
2017-07-18 21:43
`map` is a transformation: it applies the given function to every element of an RDD and returns a new RDD of the results. The original snippet assigned the result of `foreach` (which returns `Unit`) to `resultRDD`; the fixed version keeps the transformed RDD and the action separate, and stops the context when done.

```scala
import org.apache.spark.{SparkConf, SparkContext}

/**
  * Created by liupeng on 2017/6/15.
  */
object T_map {
  System.setProperty("hadoop.home.dir", "F:\\hadoop-2.6.5")

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("map_test").setMaster("local")
    val sc = new SparkContext(conf)

    val numbers = Array(1, 2, 3, 4, 5)
    val numberRDD = sc.parallelize(numbers)

    // map visits every element and applies the function to it,
    // producing a new RDD of the transformed values
    val resultRDD = numberRDD.map(a => a * 10)
    resultRDD.foreach(println)

    sc.stop()
  }
}
```
Run result (each element multiplied by 10):

```
10
20
30
40
50
```
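Outside Spark, `map` on an ordinary Scala collection has the same element-wise contract, so it is a convenient way to reason about what the RDD version does: each input element is transformed independently, and the output has exactly as many elements as the input. A minimal plain-Scala sketch (no SparkContext needed; the object name is illustrative):

```scala
// Demonstrates the map contract that RDD.map shares with Scala collections:
// one output element per input element, produced by applying the function.
object MapSemantics {
  def main(args: Array[String]): Unit = {
    val numbers = Array(1, 2, 3, 4, 5)

    // Same function as in the Spark example above
    val result = numbers.map(a => a * 10)

    result.foreach(println)
  }
}
```

The difference in Spark is laziness: `numberRDD.map(...)` only records the transformation, and nothing is computed until an action such as `foreach` or `collect` runs.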