
Using Mat Data for K-Nearest Neighbor Classification in OpenCV

2015-07-17 14:14

K-Nearest Neighbors (KNN) is an easy-to-understand classification algorithm. In short, it finds the K training samples closest to the sample to be classified and assigns that sample to whichever class is most common among those K neighbors.

The main steps of the KNN algorithm (a plain C++ sketch of these steps follows the list):

1. Compute the distance between the query point and every point in the labeled training set;

2. Select the K points closest to the query point;

3. Count how often each class appears among those K points;

4. Return the most frequent class among the K points as the predicted class of the query point.
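
As a reference for the OpenCV code that follows, here is a minimal, self-contained sketch of these four steps in plain C++ (the Sample struct, the knnPredict helper, and the tiny data set in main are made up for illustration):

#include <algorithm>
#include <iostream>
#include <map>
#include <utility>
#include <vector>

// A brute-force 3-D KNN classifier following the four steps above.
struct Sample { float x, y, z; int label; };

int knnPredict(const std::vector<Sample>& train, float qx, float qy, float qz, int K)
{
    // Step 1: squared Euclidean distance from the query to every training point.
    std::vector<std::pair<float, int> > distLabel;
    for (const Sample& s : train)
        distLabel.push_back(std::make_pair(
            (s.x - qx) * (s.x - qx) + (s.y - qy) * (s.y - qy) + (s.z - qz) * (s.z - qz),
            s.label));

    // Step 2: move the K smallest distances to the front.
    std::partial_sort(distLabel.begin(), distLabel.begin() + K, distLabel.end());

    // Step 3: count how often each label occurs among the K nearest neighbors.
    std::map<int, int> votes;
    for (int i = 0; i < K; ++i)
        ++votes[distLabel[i].second];

    // Step 4: return the label with the most votes.
    int bestLabel = -1, bestVotes = -1;
    for (const auto& v : votes)
        if (v.second > bestVotes) { bestVotes = v.second; bestLabel = v.first; }
    return bestLabel;
}

int main()
{
    std::vector<Sample> train = { {510, 510, 10, 0}, {405, 10, 510, 1}, {10, 20, 510, 2} };
    // Nearest point to (310, 5, 339) is (405, 10, 510), so this prints 1.
    std::cout << knnPredict(train, 310, 5, 339, 1) << std::endl;
    return 0;
}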

Main functions (from the OpenCV 2.x CvKNearest class):

CV_WRAP virtual bool train( const cv::Mat& trainData, const cv::Mat& responses,
                            const cv::Mat& sampleIdx=cv::Mat(), bool isRegression=false,
                            int maxK=32, bool updateBase=false );
CV_WRAP virtual float find_nearest( const cv::Mat& samples, int k, CV_OUT cv::Mat& results,
                                    CV_OUT cv::Mat& neighborResponses, CV_OUT cv::Mat& dists ) const;
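
For reference, an annotated call sketch of these two methods (a minimal sketch, assuming the OpenCV 2.x CvKNearest interface above; trainData, responses, samples, and k are placeholder names):

CvKNearest knn;
knn.train(trainData,    // CV_32FC1 matrix, one training sample per row
          responses,    // CV_32FC1 column vector of class labels, one per sample
          cv::Mat(),    // sampleIdx: optional index/mask of samples to use (empty = use all)
          false,        // isRegression: false = classification, true = regression
          32,           // maxK: the largest k that may later be passed to find_nearest
          false);       // updateBase: true adds the data to an already trained model instead of retraining

cv::Mat results, neighborResponses, dists;
// results: predicted response per query row; neighborResponses: labels of the k nearest neighbors;
// dists: distances from each query sample to those neighbors.
float response = knn.find_nearest(samples, k, results, neighborResponses, dists);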
A concrete implementation of simple KNN training and prediction:
#include "stdafx.h"

#include <opencv2/core/core.hpp>  

#include <opencv2/highgui/highgui.hpp>  

#include <opencv2/ml/ml.hpp>  

using namespace cv;  

using namespace std;

int main()  

{  
const
int K = 4;
int accuracy = 0;
//Set
up labels
float labels[10] = {0.0, 1.0, 1.0, 2.0,2.0,0.0, 1.0,1.0, 2.0,2.0};  
Mat labelsMat(10, 1, CV_32FC1, labels);
// Set up training data
float trainingData[10][3] = { {510, 510,10}, {405, 10,510}, {501, 45,420}, {10,20, 510},{35,45,515},{540,420,40},{380,30,300},{400,70,500},{30,60,410},{54,23,543}};  
Mat trainingDataMat(10, 3, CV_32FC1, trainingData);  
CvKNearest knn;
knn.train(trainingDataMat,labelsMat,Mat(),false,K,false);
Mat
sampleMat = (Mat_<float>(1,3) << 310,5,339);  
/*Mat results;
Mat dists;*/
Mat neighborResponses;
float response = knn.find_nearest(sampleMat,K,Mat(),neighborResponses,Mat()); //results,neighborResponses,dists
cout<<"response="<<response<<endl;
for(int k = 0; k < K; k++ )
{
if( neighborResponses.at<float>(k) == response)
accuracy++;
}
cout<<"accuracy="<<accuracy<<endl;
system("pause");
waitKey();  

}
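
With this training set, the four nearest neighbors of the query (310, 5, 339) are all samples labeled 1, so the program is expected to print response=1 and accuracy=4 (all K neighbors agree with the prediction).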
Tags: OpenCV, K-Nearest Neighbors, Mat