iOS: Using AVCaptureVideoDataOutput for Continuous Capture Without the Shutter Sound
2017-08-26 17:34
I recently had a requirement to take photos silently. Capturing with AVCaptureStillImageOutput plays a shutter sound, and the trick found online of playing an inverted copy of the shutter audio to cancel it falls apart during burst shooting: the audio occasionally drifts out of sync and some shutter sounds still slip through. After some digging I switched to AVCaptureVideoDataOutput and grab frames directly from the video stream instead, which produces no shutter sound at all. The code is as follows:
// Initialization
- (void)initAVCaptureSession
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    // Assign session to an ivar.
    [self setSession:session];

    NSError *error = nil;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // The device must be locked before changing its configuration and
    // unlocked afterwards; otherwise the app crashes.
    if ([device lockForConfiguration:&error]) {
        // Turn the flash off
        if ([device isFlashModeSupported:AVCaptureFlashModeOff]) {
            [device setFlashMode:AVCaptureFlashModeOff];
        }
        [device unlockForConfiguration];
    }

    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    if (error) {
        NSLog(@"%@", error);
    }
    if ([self.session canAddInput:self.videoInput]) {
        [self.session addInput:self.videoInput];
    }

    // Create a video data output and add it to the session
    self.imageOutput = [[AVCaptureVideoDataOutput alloc] init];

    // Configure the output: deliver sample buffers on a serial background queue
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [self.imageOutput setSampleBufferDelegate:self queue:queue];

    // Specify the pixel format
    self.imageOutput.videoSettings = [NSDictionary dictionaryWithObject:
                                         [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                                 forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    if ([self.session canAddOutput:self.imageOutput]) {
        [self.session addOutput:self.imageOutput];
    }

    // Switch to the front camera
    AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionFront;
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([d position] == desiredPosition) {
            [self.previewLayer.session beginConfiguration];
            AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:d error:nil];
            for (AVCaptureInput *oldInput in self.previewLayer.session.inputs) {
                [[self.previewLayer session] removeInput:oldInput];
            }
            [self.previewLayer.session addInput:input];
            [self.previewLayer.session commitConfiguration];
            break;
        }
    }

    // Start the session running to start the flow of data
    [self.session startRunning];
}
// Map device orientation to video orientation so the front camera image is not flipped
- (AVCaptureVideoOrientation)avOrientationForDeviceOrientation:(UIDeviceOrientation)deviceOrientation
{
    AVCaptureVideoOrientation result = (AVCaptureVideoOrientation)deviceOrientation;
    if (deviceOrientation == UIDeviceOrientationLandscapeLeft)
        result = AVCaptureVideoOrientationLandscapeRight;
    else if (deviceOrientation == UIDeviceOrientationLandscapeRight)
        result = AVCaptureVideoOrientationLandscapeLeft;
    return result;
}
// Lock the interface to portrait orientation
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    return (interfaceOrientation == UIInterfaceOrientationPortrait);
}
// Frame capture and further processing
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (sampleBuffer == NULL) {
        return;
    }

    // Create a UIImage from the sample buffer data
    UIImage *oldImage = [self imageFromSampleBuffer:sampleBuffer];
    NSData *newData = UIImageJPEGRepresentation(oldImage, 0.3);
    [dataArray addObject:newData];

    // Save the image to local storage
    NSString *imageName = [NSString stringWithFormat:@"%@.jpg", [self getCurrentTimeInterval]];
    NSString *savedImagePath = [LOCATION_IMAGES_PATH stringByAppendingPathComponent:imageName];
    // NSLog(@"%@", savedImagePath);
    [newData writeToFile:savedImagePath atomically:NO];
}
// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get the sample buffer's Core Video image buffer
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}
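One caveat worth noting: the delegate above fires for every frame the session produces (typically around 30 times per second), so writing a JPEG on each callback fills storage very quickly. A common refinement is to gate the delegate with a flag and persist a frame only when the user actually requests a shot. A minimal sketch of that idea, assuming a hypothetical `wantsPhoto` property and `takeSilentPhoto` method added to the same class (neither is in the original post):

```objective-c
// Hypothetical additions to the view controller shown above.
@property (atomic, assign) BOOL wantsPhoto; // set to YES when the user taps "capture"

// Called by the capture button; the next frame that arrives will be saved.
- (void)takeSilentPhoto {
    self.wantsPhoto = YES;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Ignore the steady stream of preview frames until a photo is requested.
    if (!self.wantsPhoto || sampleBuffer == NULL) {
        return;
    }
    self.wantsPhoto = NO; // consume the request so only one frame is saved

    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.3);
    // ...persist jpegData the same way as in the delegate implementation above...
}
```

This keeps the session (and the live preview) running while saving only the frames the user asked for. The property is declared atomic because the delegate runs on the background queue created in the setup code, while the capture button fires on the main thread.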