
[Learning OpenCV from Scratch] Getting Started with OpenCV Development on iOS 7 (Xcode 5.1.1 & OpenCV 2.4.9)

2014-05-25 19:42

Preface:

I have been developing for iOS 7 for about a month. Lately, while preparing my graduate school application, I have been thinking of moving toward computer vision, so I started teaching myself OpenCV.

I have followed CSDN for a long time and have learned a lot from many bloggers here, so this week I am starting my own blog journey: learning OpenCV from scratch, and taking notes and blogging from scratch as I go, haha~

Alright, enough chatter; let's get down to business.

Using OpenCV on iOS 7

To use OpenCV on Mac OS you normally have to build it yourself, but if you only want to use OpenCV on iOS, it is enough to add the prebuilt opencv2.framework to your project.

opencv2.framework can be downloaded from the official OpenCV site: opencv2.framework

Next, create a new Single View Application in Xcode and name it CvForIOS.

Copy the downloaded opencv2.framework into the new project directory and add it to the project's Frameworks group; libc++.dylib from the iOS 7.1 SDK also needs to be added.



Then, in Build Settings, set "C++ Standard Library" to libstdc++.



Because OpenCV's MIN macro conflicts with UIKit's MIN macro, the OpenCV headers must be imported first in the .pch file; otherwise you will get compile errors.

Find CvForIOS-Prefix.pch and modify it as follows:

//
//  Prefix header
//
//  The contents of this file are implicitly included at the beginning of every source file.
//

#import <Availability.h>

#ifndef __IPHONE_5_0
#warning "This project uses features only available in iOS SDK 5.0 and later."
#endif

#ifdef __cplusplus
#import <opencv2/opencv.hpp>
#endif

#ifdef __OBJC__
    #import <UIKit/UIKit.h>
    #import <Foundation/Foundation.h>
#endif


At this point, the preliminary configuration is complete.
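Before moving on, it is worth a quick sanity check that the framework links and that Objective-C++ compiles. A minimal sketch (my own addition, not part of the original project; logOpenCVVersion is just a hypothetical helper) is to log OpenCV's version macro from any .mm file:

#import "opencv2/opencv.hpp"      // already pulled in for C++ sources via the .pch above
#import <Foundation/Foundation.h>

// CV_VERSION is defined by opencv2/core/version.hpp, which opencv.hpp includes.
// With the framework used here it should expand to "2.4.9".
static void logOpenCVVersion(void)
{
    NSLog(@"Linked against OpenCV %s", CV_VERSION);
}

If this compiles and logs the expected version, the framework and the prefix header are wired up correctly.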


First example: box-filtering a photo on the phone with OpenCV

Since iOS generally stores and displays images as UIImage, to process an image with OpenCV we need to be able to convert freely between cv::Mat and UIImage.
Here we use open-source code from GitHub: aptogo

The code is as follows:

#import <UIKit/UIKit.h>

@interface UIImage (UIImage_OpenCV)

@property(nonatomic, readonly) cv::Mat CVMat;
@property(nonatomic, readonly) cv::Mat CVGrayscaleMat;

+(UIImage *)imageWithCVMat:(const cv::Mat&)cvMat;
-(id)initWithCVMat:(const cv::Mat&)cvMat;
+ (cv::Mat)cvMatFromUIImage:(UIImage *)image;  // newly added method

@end

#import "OpenCV.h"

static void ProviderReleaseDataNOP(void *info, const void *data, size_t size)
{
    // Do not release memory
    return;
}

@implementation UIImage (UIImage_OpenCV)

-(cv::Mat)CVMat
{
    CGColorSpaceRef colorSpace = CGImageGetColorSpace(self.CGImage);
    CGFloat cols = self.size.width;
    CGFloat rows = self.size.height;

    cv::Mat cvMat(rows, cols, CV_8UC4); // 8 bits per component, 4 channels

    CGContextRef contextRef = CGBitmapContextCreate(cvMat.data,                 // Pointer to backing data
                                                    cols,                       // Width of bitmap
                                                    rows,                       // Height of bitmap
                                                    8,                          // Bits per component
                                                    cvMat.step[0],              // Bytes per row
                                                    colorSpace,                 // Colorspace
                                                    kCGImageAlphaNoneSkipLast |
                                                    kCGBitmapByteOrderDefault); // Bitmap info flags

    CGContextDrawImage(contextRef, CGRectMake(0, 0, cols, rows), self.CGImage);
    CGContextRelease(contextRef);

    return cvMat;
}

-(cv::Mat)CVGrayscaleMat
{
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
    CGFloat cols = self.size.width;
    CGFloat rows = self.size.height;

    cv::Mat cvMat = cv::Mat(rows, cols, CV_8UC1); // 8 bits per component, 1 channel

    CGContextRef contextRef = CGBitmapContextCreate(cvMat.data,                 // Pointer to backing data
                                                    cols,                       // Width of bitmap
                                                    rows,                       // Height of bitmap
                                                    8,                          // Bits per component
                                                    cvMat.step[0],              // Bytes per row
                                                    colorSpace,                 // Colorspace
                                                    kCGImageAlphaNone |
                                                    kCGBitmapByteOrderDefault); // Bitmap info flags

    CGContextDrawImage(contextRef, CGRectMake(0, 0, cols, rows), self.CGImage);
    CGContextRelease(contextRef);
    CGColorSpaceRelease(colorSpace);

    return cvMat;
}

+ (UIImage *)imageWithCVMat:(const cv::Mat&)cvMat
{
    return [[[UIImage alloc] initWithCVMat:cvMat] autorelease];
}

- (id)initWithCVMat:(const cv::Mat&)cvMat
{
    NSData *data = [NSData dataWithBytes:cvMat.data length:cvMat.elemSize() * cvMat.total()];

    CGColorSpaceRef colorSpace;

    if (cvMat.elemSize() == 1)
    {
        colorSpace = CGColorSpaceCreateDeviceGray();
    }
    else
    {
        colorSpace = CGColorSpaceCreateDeviceRGB();
    }

    CGDataProviderRef provider = CGDataProviderCreateWithCFData((CFDataRef)data);

    CGImageRef imageRef = CGImageCreate(cvMat.cols,                                   // Width
                                        cvMat.rows,                                   // Height
                                        8,                                            // Bits per component
                                        8 * cvMat.elemSize(),                         // Bits per pixel
                                        cvMat.step[0],                                // Bytes per row
                                        colorSpace,                                   // Colorspace
                                        kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault, // Bitmap info flags
                                        provider,                                     // CGDataProviderRef
                                        NULL,                                         // Decode
                                        false,                                        // Should interpolate
                                        kCGRenderingIntentDefault);                   // Intent

    self = [self initWithCGImage:imageRef];
    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);

    return self;
}

// Newly added method
+ (cv::Mat)cvMatFromUIImage:(UIImage *)image
{
    CGColorSpaceRef colorSpace = CGImageGetColorSpace(image.CGImage);
    CGFloat cols = image.size.width;
    CGFloat rows = image.size.height;
    cv::Mat cvMat(rows, cols, CV_8UC4);
    CGContextRef contextRef = CGBitmapContextCreate(cvMat.data, cols, rows, 8, cvMat.step[0], colorSpace, kCGImageAlphaNoneSkipLast | kCGBitmapByteOrderDefault);
    CGContextDrawImage(contextRef, CGRectMake(0, 0, cols, rows), image.CGImage);
    CGContextRelease(contextRef);
    //CGColorSpaceRelease(colorSpace);
    return cvMat;
}

@end

Note: the .mm extension indicates that a file mixes Objective-C and C++. Since OpenCV is written in C++, every file that uses OpenCV must have its extension changed to .mm.
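As a quick illustration of both points (my own sketch, not part of the aptogo code; toGrayscale is a hypothetical helper), the category can be used from any .mm file like this:

#import "OpenCV.h"              // the UIImage (UIImage_OpenCV) category above
#import "opencv2/opencv.hpp"

// UIImage -> cv::Mat -> grayscale -> UIImage round trip.
// This only compiles in a .mm file, because cv::Mat and cv::cvtColor are C++.
static UIImage *toGrayscale(UIImage *input)
{
    cv::Mat rgba = [UIImage cvMatFromUIImage:input];   // 4-channel RGBA Mat
    cv::Mat gray;
    cv::cvtColor(rgba, gray, CV_RGBA2GRAY);            // a plain C++ OpenCV call
    return [UIImage imageWithCVMat:gray];              // back to UIImage for UIKit
}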

Alright, now we can finally get to the main task. First, add a button, a UIImageView, and a UISlider to Main.storyboard; the layout is as follows:



The controls' action and outlet connections are as follows:





Change the extension of the view controller's implementation file to .mm, and have the view controller adopt the UIImagePickerControllerDelegate and UIActionSheetDelegate protocols.

The implementation code is as follows:

Header file:

//
//  xhtViewController.h
//  CvForIOS
//
//  Created by Panda on 14-5-25.
//  Copyright (c) 2014 xht. All rights reserved.
//

#import <UIKit/UIKit.h>

@interface xhtViewController : UIViewController <UIImagePickerControllerDelegate, UINavigationControllerDelegate, UIActionSheetDelegate>

- (IBAction)pickImageClicked:(id)sender;
@property (retain, nonatomic) IBOutlet UIImageView *imageView;
- (IBAction)sliderChanged:(id)sender;

@end


Implementation file:

//
//  xhtViewController.m
//  CvForIOS
//
//  Created by Panda on 14-5-25.
//  Copyright (c) 2014 xht. All rights reserved.
//

#import "xhtViewController.h"
#import "opencv2/opencv.hpp"
#import "OpenCV.h"
#import <MobileCoreServices/MobileCoreServices.h>   // for kUTTypeImage

UIImage *image = nil;   // the most recently picked image (file-scope, retained manually under MRC)

@interface xhtViewController ()

@end

@implementation xhtViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (IBAction)pickImageClicked:(id)sender {
    // Autoreleased so it is not leaked under MRC.
    UIActionSheet *actionSheet = [[[UIActionSheet alloc] initWithTitle:@"Pick" delegate:self cancelButtonTitle:@"Cancel" destructiveButtonTitle:nil otherButtonTitles:@"Pick From Library", nil] autorelease];
    [actionSheet showInView:self.view];
}

- (void)dealloc {
    [_imageView release];
    [super dealloc];
}

#pragma mark - UIActionSheetDelegate methods
- (void)actionSheet:(UIActionSheet *)actionSheet clickedButtonAtIndex:(NSInteger)buttonIndex
{
    if (buttonIndex == actionSheet.cancelButtonIndex) return; // don't present the picker when Cancel is tapped

    // Autoreleased so it is not leaked under MRC.
    UIImagePickerController *mediaUI = [[[UIImagePickerController alloc] init] autorelease];
    mediaUI.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    mediaUI.mediaTypes = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeSavedPhotosAlbum];
    mediaUI.allowsEditing = NO;
    mediaUI.delegate = self;
    [self presentViewController:mediaUI animated:YES completion:nil];
}

#pragma mark - UIImagePickerControllerDelegate methods
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    if (picker.sourceType == UIImagePickerControllerSourceTypeCamera)
    {
        UIImage *picked = [[info objectForKey:UIImagePickerControllerOriginalImage] retain]; // retained so it outlives the info dictionary (MRC)
        [image release];
        image = picked;
    }
    else
    {
        NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
        if (CFStringCompare((CFStringRef)mediaType, kUTTypeImage, 0) == kCFCompareEqualTo)
        {
            UIImage *picked = [(UIImage *)[info objectForKey:UIImagePickerControllerOriginalImage] retain];
            [image release];
            image = picked;
        }
    }

    self.imageView.image = image;

    [picker dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [picker dismissViewControllerAnimated:YES completion:nil];
}

// Box-filter the image with a (value+1) x (value+1) averaging kernel.
- (UIImage *)ImageBlurwithValue:(int)value withImage:(UIImage *)sourceImage
{
    cv::Mat img = [UIImage cvMatFromUIImage:sourceImage];
    cv::boxFilter(img, img, -1, cv::Size(value + 1, value + 1));
    return [UIImage imageWithCVMat:img];
}

- (IBAction)sliderChanged:(id)sender {
    if (image == nil) return; // nothing to filter until a photo has been picked
    UISlider *slider = (UISlider *)sender;
    self.imageView.image = [self ImageBlurwithValue:slider.value withImage:image];
}
@end


The call to cv::boxFilter performs the box filtering, and the slider adjusts the kernel (window) size; a short sketch of the underlying call follows the result below. The final effect:



The doge still looks pretty cool after box filtering; that hazy gaze, haha!
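For reference, cv::boxFilter with its default normalize=true simply averages the pixels inside each kernel window, which is the same operation as cv::blur. The sketch below (my own; boxBlur is a hypothetical helper) spells out the parameters that ImageBlurwithValue: leaves at their defaults:

#import "opencv2/opencv.hpp"

// ddepth = -1 keeps the 8-bit depth of the source, and normalize = true makes every
// kernel coefficient equal to 1 / (kernelSize * kernelSize), i.e. a plain average.
static cv::Mat boxBlur(const cv::Mat &src, int kernelSize)
{
    cv::Mat dst;
    cv::boxFilter(src, dst, -1, cv::Size(kernelSize, kernelSize),
                  cv::Point(-1, -1), true, cv::BORDER_DEFAULT);
    // cv::blur(src, dst, cv::Size(kernelSize, kernelSize)); // would produce the same result
    return dst;
}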

With the groundwork finally in place, we can now start playing with OpenCV on the phone~

In future posts I will use OpenCV on iOS 7 to build more interesting features. Finally, the complete code for this example: cvForIOS

(When reposting, please credit the author and source: Shawn-HT, http://blog.csdn.net/shawn_ht. Do not use for commercial purposes without permission.)

Reference: http://blog.devtang.com/blog/2012/10/27/use-opencv-in-ios/