
Implementing k-means on Hadoop, Part 2

2015-07-28 12:38
Input data, stored in 2.txt: (1,1) (9,9) (2,3) (10,30) (4,4) (34,40) (5,6) (15,20)

3.txt stores the intermediate cluster centers.

part-r-00000 stores the reduce output.

The program's MapReduce trace and results:


Initialization: initial centers (10,30) (2,3)
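In each map step below, the mapper computes the distance from a point to every current center and emits the point keyed by its nearest center (the `0list`/`0c` lines print the point's and the center's coordinates). A minimal sketch of that assignment step, with illustrative names (`nearestCenter`, `dist2`) that are not taken from the original code:

```java
public class KMeansAssign {
    // Squared Euclidean distance between two 2-D points.
    static double dist2(double[] a, double[] b) {
        double dx = a[0] - b[0], dy = a[1] - b[1];
        return dx * dx + dy * dy;
    }

    // Index of the center closest to point p; the mapper would emit
    // (centers[index], p) as its key/value pair.
    static int nearestCenter(double[][] centers, double[] p) {
        int best = 0;
        double bestD = dist2(centers[0], p);
        for (int i = 1; i < centers.length; i++) {
            double d = dist2(centers[i], p);
            if (d < bestD) { bestD = d; best = i; }
        }
        return best;
    }
}
```

With the initial centers (10,30) and (2,3), point (1,1) is closer to (2,3), matching the first mapper record in the log.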

13/01/26 08:58:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

13/01/26 08:58:38 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.

13/01/26 08:58:38 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).

13/01/26 08:58:38 INFO input.FileInputFormat: Total input paths to process : 2

13/01/26 08:58:38 WARN snappy.LoadSnappy: Snappy native library not loaded

13/01/26 08:58:38 INFO mapred.JobClient: Running job: job_local_0001

13/01/26 08:58:39 INFO util.ProcessTree: setsid exited with exit code 0

13/01/26 08:58:39 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@15718f2

13/01/26 08:58:39 INFO mapred.MapTask: io.sort.mb = 100

13/01/26 08:58:39 INFO mapred.MapTask: data buffer = 79691776/99614720

13/01/26 08:58:39 INFO mapred.MapTask: record buffer = 262144/327680

0list:1

0c:10

1list:1

1c:30

Point (1,1) assigned to center (2,3)

Mapper output: (2,3) (1,1)

0list:9

0c:10

1list:9

1c:30

Point (9,9) assigned to center (2,3)

Mapper output: (2,3) (9,9)

0list:2

0c:10

1list:3

1c:30

Point (2,3) assigned to center (2,3)

Mapper output: (2,3) (2,3)

0list:10

0c:10

1list:30

1c:30

Point (10,30) assigned to center (10,30)

Mapper output: (10,30) (10,30)

0list:4

0c:10

1list:4

1c:30

Point (4,4) assigned to center (2,3)

Mapper output: (2,3) (4,4)

0list:34

0c:10

1list:40

1c:30

Point (34,40) assigned to center (10,30)

Mapper output: (10,30) (34,40)

0list:5

0c:10

1list:6

1c:30

Point (5,6) assigned to center (2,3)

Mapper output: (2,3) (5,6)

0list:15

0c:10

1list:20

1c:30

Point (15,20) assigned to center (10,30)

Mapper output: (10,30) (15,20)

13/01/26 08:58:39 INFO mapred.MapTask: Starting flush of map output

13/01/26 08:58:39 INFO mapred.MapTask: Finished spill 0

13/01/26 08:58:39 INFO mapred.Task: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting

13/01/26 08:58:39 INFO mapred.JobClient: map 0% reduce 0%

13/01/26 08:58:42 INFO mapred.LocalJobRunner:

13/01/26 08:58:42 INFO mapred.Task: Task 'attempt_local_0001_m_000000_0' done.

13/01/26 08:58:42 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@77eaf8

13/01/26 08:58:42 INFO mapred.MapTask: io.sort.mb = 100

13/01/26 08:58:42 INFO mapred.MapTask: data buffer = 79691776/99614720

13/01/26 08:58:42 INFO mapred.MapTask: record buffer = 262144/327680

0list:2

0c:10

1list:3

1c:30

Point (2,3) assigned to center (2,3)

Mapper output: (2,3) (2,3)

0list:10

0c:10

1list:30

1c:30

Point (10,30) assigned to center (10,30)

Mapper output: (10,30) (10,30)

0list:34

0c:10

1list:40

1c:30

Point (34,40) assigned to center (10,30)

Mapper output: (10,30) (34,40)

0list:1

0c:10

1list:1

1c:30

Point (1,1) assigned to center (2,3)

Mapper output: (2,3) (1,1)

13/01/26 08:58:42 INFO mapred.MapTask: Starting flush of map output

13/01/26 08:58:42 INFO mapred.MapTask: Finished spill 0

13/01/26 08:58:42 INFO mapred.Task: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting

13/01/26 08:58:42 INFO mapred.JobClient: map 100% reduce 0%

13/01/26 08:58:45 INFO mapred.LocalJobRunner:

13/01/26 08:58:45 INFO mapred.Task: Task 'attempt_local_0001_m_000001_0' done.

13/01/26 08:58:45 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@18d7ace

13/01/26 08:58:45 INFO mapred.LocalJobRunner:

13/01/26 08:58:45 INFO mapred.Merger: Merging 2 sorted segments

13/01/26 08:58:45 INFO mapred.Merger: Down to the last merge-pass, with 2 segments left of total size: 192 bytes

13/01/26 08:58:45 INFO mapred.LocalJobRunner:

Reduce phase, pass 1

(10,30) Reduce

val:(10,30)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

val:(34,40)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

val:(10,30)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

val:(34,40)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

val:(15,20)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

count:5

outVal:(10,30) (34,40) (10,30) (34,40) (15,20) /outVal

ave0i103.0

ave1i160.0

Written to part: (10,30) (10,30) (34,40) (10,30) (34,40) (15,20) (20.6,32.0)

Reduce phase, pass 1

(2,3) Reduce

val:(1,1)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

val:(9,9)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

val:(2,3)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

val:(4,4)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

val:(5,6)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

val:(2,3)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

val:(1,1)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@141fab6

temlength:2

count:7

outVal:(1,1) (9,9) (2,3) (4,4) (5,6) (2,3) (1,1) /outVal

ave0i24.0

ave1i27.0

Written to part: (2,3) (1,1) (9,9) (2,3) (4,4) (5,6) (2,3) (1,1) (3.4285715,3.857143)
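The reducer walks all points grouped under one center, sums each coordinate (the `ave0i`/`ave1i` lines above are the coordinate sums, e.g. 1+9+2+4+5+2+1 = 24), and divides by the count to produce the new center. A hedged sketch of that update step, with a hypothetical `newCenter` name:

```java
public class KMeansUpdate {
    // Sum-then-divide center update, mirroring the reducer's
    // ave0i/ave1i accumulation in the log. Uses float, matching
    // the precision of the logged centers (3.4285715, 3.857143).
    static float[] newCenter(float[][] points) {
        float sx = 0, sy = 0;
        for (float[] p : points) {
            sx += p[0];
            sy += p[1];
        }
        return new float[]{sx / points.length, sy / points.length};
    }
}
```

For the seven points grouped under (2,3) this yields (24/7, 27/7) = (3.4285715, 3.857143), exactly the value written to part above.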

13/01/26 08:58:45 INFO mapred.Task: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting

13/01/26 08:58:45 INFO mapred.LocalJobRunner:

13/01/26 08:58:45 INFO mapred.Task: Task attempt_local_0001_r_000000_0 is allowed to commit now

13/01/26 08:58:45 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to hdfs://localhost:9000/home/administrator/hadoop/kmeans/output

13/01/26 08:58:48 INFO mapred.LocalJobRunner: reduce > reduce

13/01/26 08:58:48 INFO mapred.Task: Task 'attempt_local_0001_r_000000_0' done.

13/01/26 08:58:48 INFO mapred.JobClient: map 100% reduce 100%

13/01/26 08:58:48 INFO mapred.JobClient: Job complete: job_local_0001

13/01/26 08:58:48 INFO mapred.JobClient: Counters: 22

13/01/26 08:58:48 INFO mapred.JobClient: File Output Format Counters

13/01/26 08:58:48 INFO mapred.JobClient: Bytes Written=129

13/01/26 08:58:48 INFO mapred.JobClient: FileSystemCounters

13/01/26 08:58:48 INFO mapred.JobClient: FILE_BYTES_READ=1818

13/01/26 08:58:48 INFO mapred.JobClient: HDFS_BYTES_READ=450

13/01/26 08:58:48 INFO mapred.JobClient: FILE_BYTES_WRITTEN=122901

13/01/26 08:58:48 INFO mapred.JobClient: HDFS_BYTES_WRITTEN=171

13/01/26 08:58:48 INFO mapred.JobClient: File Input Format Counters

13/01/26 08:58:48 INFO mapred.JobClient: Bytes Read=82

13/01/26 08:58:48 INFO mapred.JobClient: Map-Reduce Framework

13/01/26 08:58:48 INFO mapred.JobClient: Map output materialized bytes=200

13/01/26 08:58:48 INFO mapred.JobClient: Map input records=2

13/01/26 08:58:48 INFO mapred.JobClient: Reduce shuffle bytes=0

13/01/26 08:58:48 INFO mapred.JobClient: Spilled Records=24

13/01/26 08:58:48 INFO mapred.JobClient: Map output bytes=164

13/01/26 08:58:48 INFO mapred.JobClient: Total committed heap usage (bytes)=498860032

13/01/26 08:58:48 INFO mapred.JobClient: CPU time spent (ms)=0

13/01/26 08:58:48 INFO mapred.JobClient: SPLIT_RAW_BYTES=262

13/01/26 08:58:48 INFO mapred.JobClient: Combine input records=0

13/01/26 08:58:48 INFO mapred.JobClient: Reduce input records=12

13/01/26 08:58:48 INFO mapred.JobClient: Reduce input groups=2

13/01/26 08:58:48 INFO mapred.JobClient: Combine output records=0

13/01/26 08:58:48 INFO mapred.JobClient: Physical memory (bytes) snapshot=0

13/01/26 08:58:48 INFO mapred.JobClient: Reduce output records=2

13/01/26 08:58:48 INFO mapred.JobClient: Virtual memory (bytes) snapshot=0

13/01/26 08:58:48 INFO mapred.JobClient: Map output records=12


Previous MapReduce result, line 1: (10,30) (10,30) (34,40) (10,30) (34,40) (15,20) (20.6,32.0)

Line 2: (2,3) (1,1) (9,9) (2,3) (4,4) (5,6) (2,3) (1,1) (3.4285715,3.857143)



Center 0 distance (squared): 116.36001

Center 1 distance (squared): 2.7755103

New centers: (20.6,32.0) (3.4285715,3.857143)
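The driver compares each new center with its predecessor; the logged values match the squared Euclidean distance (e.g. (20.6-10)² + (32.0-30)² = 116.36), and iteration stops once every center's movement falls below a threshold. A small sketch assuming that squared-distance convention; `movement` is an illustrative name, not from the original code:

```java
public class KMeansConverge {
    // Squared distance a center moved between two iterations.
    // The driver would keep looping while any movement exceeds
    // some epsilon threshold.
    static float movement(float[] oldCenter, float[] newCenter) {
        float dx = newCenter[0] - oldCenter[0];
        float dy = newCenter[1] - oldCenter[1];
        return dx * dx + dy * dy;
    }
}
```

Plugging in the old center (10,30) and new center (20.6,32.0) reproduces the logged 116.36001.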

13/01/26 08:58:49 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.

13/01/26 08:58:49 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).

13/01/26 08:58:49 INFO input.FileInputFormat: Total input paths to process : 2

13/01/26 08:58:49 INFO mapred.JobClient: Running job: job_local_0002

13/01/26 08:58:49 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@18aab40

13/01/26 08:58:49 INFO mapred.MapTask: io.sort.mb = 100

13/01/26 08:58:49 INFO mapred.MapTask: data buffer = 79691776/99614720

13/01/26 08:58:49 INFO mapred.MapTask: record buffer = 262144/327680

0list:1

0c:20.6

1list:1

1c:32.0

Point (1,1) assigned to center (3.4285715,3.857143)

Mapper output: (3.4285715,3.857143) (1,1)

0list:9

0c:20.6

1list:9

1c:32.0

Point (9,9) assigned to center (3.4285715,3.857143)

Mapper output: (3.4285715,3.857143) (9,9)

0list:2

0c:20.6

1list:3

1c:32.0

Point (2,3) assigned to center (3.4285715,3.857143)

Mapper output: (3.4285715,3.857143) (2,3)

0list:10

0c:20.6

1list:30

1c:32.0

Point (10,30) assigned to center (20.6,32.0)

Mapper output: (20.6,32.0) (10,30)

0list:4

0c:20.6

1list:4

1c:32.0

Point (4,4) assigned to center (3.4285715,3.857143)

Mapper output: (3.4285715,3.857143) (4,4)

0list:34

0c:20.6

1list:40

1c:32.0

Point (34,40) assigned to center (20.6,32.0)

Mapper output: (20.6,32.0) (34,40)

0list:5

0c:20.6

1list:6

1c:32.0

Point (5,6) assigned to center (3.4285715,3.857143)

Mapper output: (3.4285715,3.857143) (5,6)

0list:15

0c:20.6

1list:20

1c:32.0

Point (15,20) assigned to center (20.6,32.0)

Mapper output: (20.6,32.0) (15,20)

13/01/26 08:58:49 INFO mapred.MapTask: Starting flush of map output

13/01/26 08:58:49 INFO mapred.MapTask: Finished spill 0

13/01/26 08:58:49 INFO mapred.Task: Task:attempt_local_0002_m_000000_0 is done. And is in the process of commiting

13/01/26 08:58:50 INFO mapred.JobClient: map 0% reduce 0%

13/01/26 08:58:52 INFO mapred.LocalJobRunner:

13/01/26 08:58:52 INFO mapred.Task: Task 'attempt_local_0002_m_000000_0' done.

13/01/26 08:58:52 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@147358f

13/01/26 08:58:52 INFO mapred.MapTask: io.sort.mb = 100

13/01/26 08:58:52 INFO mapred.MapTask: data buffer = 79691776/99614720

13/01/26 08:58:52 INFO mapred.MapTask: record buffer = 262144/327680

0list:2

0c:20.6

1list:3

1c:32.0

Point (2,3) assigned to center (3.4285715,3.857143)

Mapper output: (3.4285715,3.857143) (2,3)

0list:10

0c:20.6

1list:30

1c:32.0

Point (10,30) assigned to center (20.6,32.0)

Mapper output: (20.6,32.0) (10,30)

0list:34

0c:20.6

1list:40

1c:32.0

Point (34,40) assigned to center (20.6,32.0)

Mapper output: (20.6,32.0) (34,40)

0list:1

0c:20.6

1list:1

1c:32.0

Point (1,1) assigned to center (3.4285715,3.857143)

Mapper output: (3.4285715,3.857143) (1,1)

13/01/26 08:58:52 INFO mapred.MapTask: Starting flush of map output

13/01/26 08:58:52 INFO mapred.MapTask: Finished spill 0

13/01/26 08:58:52 INFO mapred.Task: Task:attempt_local_0002_m_000001_0 is done. And is in the process of commiting

13/01/26 08:58:53 INFO mapred.JobClient: map 100% reduce 0%

13/01/26 08:58:55 INFO mapred.LocalJobRunner:

13/01/26 08:58:55 INFO mapred.Task: Task 'attempt_local_0002_m_000001_0' done.

13/01/26 08:58:55 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@2798e7

13/01/26 08:58:55 INFO mapred.LocalJobRunner:

13/01/26 08:58:55 INFO mapred.Merger: Merging 2 sorted segments

13/01/26 08:58:55 INFO mapred.Merger: Down to the last merge-pass, with 2 segments left of total size: 317 bytes

13/01/26 08:58:55 INFO mapred.LocalJobRunner:

Reduce phase, pass 1

(20.6,32.0) Reduce

val:(10,30)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

val:(34,40)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

val:(10,30)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

val:(34,40)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

val:(15,20)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

count:5

outVal:(10,30) (34,40) (10,30) (34,40) (15,20) /outVal

ave0i103.0

ave1i160.0

Written to part: (20.6,32.0) (10,30) (34,40) (10,30) (34,40) (15,20) (20.6,32.0)

Reduce phase, pass 1

(3.4285715,3.857143) Reduce

val:(1,1)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

val:(9,9)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

val:(2,3)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

val:(4,4)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

val:(5,6)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

val:(2,3)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

val:(1,1)

values:org.apache.hadoop.mapreduce.ReduceContext$ValueIterable@13043d2

temlength:2

count:7

outVal:(1,1) (9,9) (2,3) (4,4) (5,6) (2,3) (1,1) /outVal

ave0i24.0

ave1i27.0

Written to part: (3.4285715,3.857143) (1,1) (9,9) (2,3) (4,4) (5,6) (2,3) (1,1) (3.4285715,3.857143)

13/01/26 08:58:55 INFO mapred.Task: Task:attempt_local_0002_r_000000_0 is done. And is in the process of commiting

13/01/26 08:58:55 INFO mapred.LocalJobRunner:

13/01/26 08:58:55 INFO mapred.Task: Task attempt_local_0002_r_000000_0 is allowed to commit now

13/01/26 08:58:55 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0002_r_000000_0' to hdfs://localhost:9000/home/administrator/hadoop/kmeans/output

13/01/26 08:58:58 INFO mapred.LocalJobRunner: reduce > reduce

13/01/26 08:58:58 INFO mapred.Task: Task 'attempt_local_0002_r_000000_0' done.

13/01/26 08:58:59 INFO mapred.JobClient: map 100% reduce 100%

13/01/26 08:58:59 INFO mapred.JobClient: Job complete: job_local_0002

13/01/26 08:58:59 INFO mapred.JobClient: Counters: 22

13/01/26 08:58:59 INFO mapred.JobClient: File Output Format Counters

13/01/26 08:58:59 INFO mapred.JobClient: Bytes Written=148

13/01/26 08:58:59 INFO mapred.JobClient: FileSystemCounters

13/01/26 08:58:59 INFO mapred.JobClient: FILE_BYTES_READ=4442

13/01/26 08:58:59 INFO mapred.JobClient: HDFS_BYTES_READ=1262

13/01/26 08:58:59 INFO mapred.JobClient: FILE_BYTES_WRITTEN=246235

13/01/26 08:58:59 INFO mapred.JobClient: HDFS_BYTES_WRITTEN=676

13/01/26 08:58:59 INFO mapred.JobClient: File Input Format Counters

13/01/26 08:58:59 INFO mapred.JobClient: Bytes Read=82

13/01/26 08:58:59 INFO mapred.JobClient: Map-Reduce Framework

13/01/26 08:58:59 INFO mapred.JobClient: Map output materialized bytes=325

13/01/26 08:58:59 INFO mapred.JobClient: Map input records=2

13/01/26 08:58:59 INFO mapred.JobClient: Reduce shuffle bytes=0

13/01/26 08:58:59 INFO mapred.JobClient: Spilled Records=24

13/01/26 08:58:59 INFO mapred.JobClient: Map output bytes=289

13/01/26 08:58:59 INFO mapred.JobClient: Total committed heap usage (bytes)=667418624

13/01/26 08:58:59 INFO mapred.JobClient: CPU time spent (ms)=0

13/01/26 08:58:59 INFO mapred.JobClient: SPLIT_RAW_BYTES=262

13/01/26 08:58:59 INFO mapred.JobClient: Combine input records=0

13/01/26 08:58:59 INFO mapred.JobClient: Reduce input records=12

13/01/26 08:58:59 INFO mapred.JobClient: Reduce input groups=2

13/01/26 08:58:59 INFO mapred.JobClient: Combine output records=0

13/01/26 08:58:59 INFO mapred.JobClient: Physical memory (bytes) snapshot=0

13/01/26 08:58:59 INFO mapred.JobClient: Reduce output records=2

13/01/26 08:58:59 INFO mapred.JobClient: Virtual memory (bytes) snapshot=0

13/01/26 08:58:59 INFO mapred.JobClient: Map output records=12


Previous MapReduce result, line 1: (20.6,32.0) (10,30) (34,40) (10,30) (34,40) (15,20) (20.6,32.0)

Line 2: (3.4285715,3.857143) (1,1) (9,9) (2,3) (4,4) (5,6) (2,3) (1,1) (3.4285715,3.857143)



Center 0 distance (squared): 0.0

Center 1 distance (squared): 0.0

New centers: (20.6,32.0) (3.4285715,3.857143)

Iterator: 2
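Since neither center moved, the algorithm converged after two MapReduce rounds. The whole loop can be sketched locally, without Hadoop, to reproduce the logged centers; this is an illustrative reimplementation, not the original driver. Note the 12-point input is the multiset the mappers actually processed, since the second input split repeats four points:

```java
public class KMeansLoop {
    // Local sketch of the driver loop: assign each point to its
    // nearest center, average each cluster, repeat until the total
    // squared movement of the centers drops below eps.
    static float[][] run(float[][] points, float[][] centers, float eps) {
        while (true) {
            float[][] sums = new float[centers.length][2];
            int[] counts = new int[centers.length];
            for (float[] p : points) {
                int best = 0;
                float bestD = Float.MAX_VALUE;
                for (int i = 0; i < centers.length; i++) {
                    float dx = p[0] - centers[i][0], dy = p[1] - centers[i][1];
                    float d = dx * dx + dy * dy;
                    if (d < bestD) { bestD = d; best = i; }
                }
                sums[best][0] += p[0];
                sums[best][1] += p[1];
                counts[best]++;
            }
            float moved = 0;
            float[][] next = new float[centers.length][2];
            for (int i = 0; i < centers.length; i++) {
                next[i][0] = sums[i][0] / counts[i];
                next[i][1] = sums[i][1] / counts[i];
                float dx = next[i][0] - centers[i][0], dy = next[i][1] - centers[i][1];
                moved += dx * dx + dy * dy;
            }
            centers = next;
            if (moved < eps) return centers;
        }
    }
}
```

Starting from (10,30) and (2,3), this settles on (20.6,32.0) and (3.4285715,3.857143) after the same two passes the log shows.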