
iOS 7 What's New in AV Foundation: QR Code Scanning (Part 1)

2013-10-18 20:00
iOS 7 brings even more improvements to AV Foundation, such as:

• Barcode reading support

• Speech synthesis

• Improved zoom functionality

Getting Started

Open Xcode and create a new project using the iOS\Application\Single View Application template.
Click Next, give the project any name you like, choose iPhone for Devices, click Next again, and choose a folder to store your project.


With that out of the way, let's get started.

Open Main.storyboard, select the view controller in the scene, and select Editor\Embed In\Navigation Controller.
Finally, select the navigation bar that appears and set its title to ColloQR.

The base project is all set up; it's time to get cracking on the camera work!

Working with the camera

First you need to import the AV Foundation framework in order to work with its juicy new features. Open ViewController.m and add the following import to the top of the file:

@import AVFoundation;

Next, add the following instance variables to the implementation declaration:

@implementation ViewController {
    AVCaptureSession *_captureSession;
    AVCaptureDevice *_videoDevice;
    AVCaptureDeviceInput *_videoInput;
    AVCaptureVideoPreviewLayer *_previewLayer;
    BOOL _running;
}

Here’s a quick rundown of these instance variables:

1. _captureSession – AVCaptureSession is the core media handling class in AV Foundation. It talks to the hardware to retrieve, process, and output video. A capture session wires together inputs and outputs, and controls the format and resolution of the output frames.

2. _videoDevice – AVCaptureDevice encapsulates the physical camera on a device. Modern iPhones have both front and rear cameras, while other devices may only have a single camera.

3. _videoInput – To add an AVCaptureDevice to a session, wrap it in an AVCaptureDeviceInput. A capture session can have multiple inputs and multiple outputs.

4. _previewLayer – AVCaptureVideoPreviewLayer provides a mechanism for displaying the current frames flowing through a capture session; it allows you to display the camera output in your UI.

5. _running – This holds the state of the session; either the session is running or it's not.

Your instance variables are declared; now you need to initialize them. Add the following method to ViewController.m:

- (void)setupCaptureSession
{
    // 1
    if (_captureSession) return;

    // 2
    _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (!_videoDevice) {
        NSLog(@"No video camera on this device!");
        return;
    }

    // 3
    _captureSession = [[AVCaptureSession alloc] init];

    // 4
    _videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_videoDevice error:nil];

    // 5
    if ([_captureSession canAddInput:_videoInput]) {
        [_captureSession addInput:_videoInput];
    }

    // 6
    _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    _previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
}

The above method sets up the capture session. The following points explain the code, comment by comment:

1. If the session has already been created, then exit early as there's no need to set things up again.

2. Initialize the video device by obtaining the default device for video media. This returns the most relevant device available; in practice, this generally references the device's rear camera. If there's no camera available, the method returns nil, so log a message and exit early.

3. Initialize the capture session so you're prepared to receive input.

4. Create the capture input from the device obtained in step 2.

5. Query the session with canAddInput: to determine if it will accept an input. If so, call addInput: to add the input to the session.

6. Finally, create and initialize a preview layer and indicate which capture session to preview. Set the gravity to "resize aspect fill" so that frames will scale to fill the layer, clipping them if required to maintain the aspect ratio.
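One small hardening worth considering, which is not part of the original listing: initWithDevice:error: takes an NSError out-parameter, and passing nil as above silently discards any failure (for example, missing camera permission). A sketch of the same input-creation step with the error surfaced might look like this:

```objc
// Sketch only (not from the original chapter): capture the NSError
// instead of passing nil, so a failed input creation is logged.
NSError *error = nil;
_videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_videoDevice
                                                     error:&error];
if (!_videoInput) {
    // initWithDevice:error: returns nil on failure and fills in error.
    NSLog(@"Couldn't create video input: %@", error.localizedDescription);
    return;
}
```

The tutorial's version works fine on the happy path; the variant above simply makes failures visible during development.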

Creating the preview view

Open Main.storyboard, drag a UIView onto the view controller, and make it fill the entire view. Next, add an outlet for the new view, name it previewView, and wire it up. This serves as a container for the preview layer.

Back in ViewController.m, modify viewDidLoad as shown below:

- (void)viewDidLoad
{
    [super viewDidLoad];

    [self setupCaptureSession];

    _previewLayer.frame = _previewView.bounds;
    [_previewView.layer addSublayer:_previewLayer];
}

The code above creates the capture session, sets up the preview layer to fill the container view, and adds it as a sublayer.

Next, add the following two methods to ViewController.m:

- (void)startRunning
{
    if (_running) return;

    [_captureSession startRunning];
    _running = YES;
}

- (void)stopRunning
{
    if (!_running) return;

    [_captureSession stopRunning];
    _running = NO;
}

These methods start and stop the session as required. The _running instance variable prevents unnecessary actions, like starting an already-running session or stopping one that has already been stopped.

The app should be a good citizen and start and stop the session as necessary. In your app, sessions run only when the view controller is on screen.

Add the following two methods to ViewController.m:

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [self startRunning];
}

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    [self stopRunning];
}

The above methods hook into the standard UIViewController methods to ensure the session only runs when the view controller is visible. However, that's only half of the solution; you also need to stop the session when the app is put into the background and restart it when the app comes back to the foreground.

Add the following notification registrations to viewDidLoad in ViewController.m:

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(applicationWillEnterForeground:)
                                             name:UIApplicationWillEnterForegroundNotification
                                           object:nil];

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(applicationDidEnterBackground:)
                                             name:UIApplicationDidEnterBackgroundNotification
                                           object:nil];

The code above takes care of starting and stopping the session depending on whether the app is in the foreground or background.

Next, add the following implementation of the registered selectors to ViewController.m:

- (void)applicationWillEnterForeground:(NSNotification *)note
{
    [self startRunning];
}

- (void)applicationDidEnterBackground:(NSNotification *)note
{
    [self stopRunning];
}
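One housekeeping detail the chapter doesn't show: on iOS 7 (before iOS 9 relaxed this), observers registered with NSNotificationCenter must be removed before the observer is deallocated, or the notification center can message a dead object. A minimal sketch of that cleanup, assuming no other observers need finer-grained removal, might be:

```objc
// Sketch only (not in the original chapter): unregister this controller
// from all notifications when it is deallocated.
- (void)dealloc
{
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}
```

For this single-screen sample the view controller lives as long as the app, so the original omission is harmless, but it's a habit worth keeping.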

Now the session will start and stop as required.

Build and run your project; as noted at the beginning of this chapter, you will need to run your app on a physical device that has at least one camera.
The simulator is a pretty useful tool, but it can't simulate video capture devices.

Once your app is running, you'll see the camera's images displayed on-screen, similar to the image below:



If you see the exact same image as above, then you have your camera pointed at the author's laptop, and that's just plain creepy!

The video capture is working well; it’s time to do something with that video input. 

Continued in the next part.