iOS AudioUnit Bluetooth Recording and Playback Implementation
2017-07-10 15:45
When shooting a short video, the phone is sometimes quite far from its owner, so recording quality suffers. The most convenient way to still capture sound clearly is a Bluetooth headset, but material on using Bluetooth headsets on iOS is scarce; below is what I have put together.

```objectivec
#import "XYRecorder.h"
#import <AVFoundation/AVFoundation.h>

#define INPUT_BUS  1
#define OUTPUT_BUS 0

static AudioUnit audioUnit;
static AudioBufferList *buffList;

// Forward declarations so the callbacks can be referenced before they are defined.
static OSStatus RecordCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber,
                               UInt32 inNumberFrames, AudioBufferList *ioData);
static OSStatus PlayCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber,
                             UInt32 inNumberFrames, AudioBufferList *ioData);

@implementation XYRecorder

#pragma mark - init

- (instancetype)init {
    self = [super init];
    if (self) {
        [self initRemoteIO];
    }
    return self;
}

- (void)initRemoteIO {
    [self initAudioSession];
    // Create the audio unit first; every property below is set on it.
    [self initAudioComponent];
    [self initBuffer];
    [self initFormat];
    [self initAudioProperty];
    [self initRecordCallback];
    [self initPlayCallback];
    // Initialize only after the unit exists and is fully configured.
    AudioUnitInitialize(audioUnit);
}

- (void)initAudioSession {
    NSError *error;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    // AllowBluetooth is the key option: it makes Bluetooth (HFP) headsets
    // available as an input route for recording.
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord
                  withOptions:AVAudioSessionCategoryOptionAllowBluetooth
                        error:&error];
    [audioSession setPreferredSampleRate:44100 error:&error];
    [audioSession setPreferredInputNumberOfChannels:1 error:&error];
    [audioSession setPreferredIOBufferDuration:0.023 error:&error];
    [audioSession setActive:YES error:&error];
}

- (void)initAudioComponent {
    AudioComponentDescription audioDesc;
    audioDesc.componentType         = kAudioUnitType_Output;
    audioDesc.componentSubType      = kAudioUnitSubType_RemoteIO;
    audioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    audioDesc.componentFlags        = 0;
    audioDesc.componentFlagsMask    = 0;
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &audioDesc);
    AudioComponentInstanceNew(inputComponent, &audioUnit);
}

- (void)initBuffer {
    // We render into our own buffer list, so tell the unit not to allocate one.
    UInt32 flag = 0;
    AudioUnitSetProperty(audioUnit,
                         kAudioUnitProperty_ShouldAllocateBuffer,
                         kAudioUnitScope_Output,
                         INPUT_BUS,
                         &flag,
                         sizeof(flag));
    buffList = (AudioBufferList *)malloc(sizeof(AudioBufferList));
    buffList->mNumberBuffers = 1;
    buffList->mBuffers[0].mNumberChannels = 1;
    buffList->mBuffers[0].mDataByteSize = 1024 * sizeof(short);
    buffList->mBuffers[0].mData = (short *)malloc(sizeof(short) * 1024);
}

- (void)initFormat {
    AudioStreamBasicDescription audioFormat = {0};
    audioFormat.mSampleRate       = 44100;
    audioFormat.mFormatID         = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioFormat.mFramesPerPacket  = 1;
    audioFormat.mChannelsPerFrame = 1;
    audioFormat.mBitsPerChannel   = 16;
    audioFormat.mBytesPerPacket   = 2;
    audioFormat.mBytesPerFrame    = 2;
    // The format goes on the application-facing sides of the two buses:
    // the output scope of the input bus (data coming out of the mic element)
    // and the input scope of the output bus (data fed to the speaker element).
    AudioUnitSetProperty(audioUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output,
                         INPUT_BUS,
                         &audioFormat,
                         sizeof(audioFormat));
    AudioUnitSetProperty(audioUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input,
                         OUTPUT_BUS,
                         &audioFormat,
                         sizeof(audioFormat));
}

- (void)initAudioProperty {
    UInt32 flag = 1;
    // Enable recording on the input scope of the input bus, and playback on
    // the output scope of the output bus (output is on by default for
    // RemoteIO, but setting it explicitly does no harm).
    AudioUnitSetProperty(audioUnit,
                         kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input,
                         INPUT_BUS,
                         &flag,
                         sizeof(flag));
    AudioUnitSetProperty(audioUnit,
                         kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Output,
                         OUTPUT_BUS,
                         &flag,
                         sizeof(flag));
}

- (void)initRecordCallback {
    AURenderCallbackStruct recordCallback;
    recordCallback.inputProc = RecordCallback;
    recordCallback.inputProcRefCon = (__bridge void *)self;
    AudioUnitSetProperty(audioUnit,
                         kAudioOutputUnitProperty_SetInputCallback,
                         kAudioUnitScope_Global,
                         INPUT_BUS,
                         &recordCallback,
                         sizeof(recordCallback));
}

- (void)initPlayCallback {
    AURenderCallbackStruct playCallback;
    playCallback.inputProc = PlayCallback;
    playCallback.inputProcRefCon = (__bridge void *)self;
    AudioUnitSetProperty(audioUnit,
                         kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Global,
                         OUTPUT_BUS,
                         &playCallback,
                         sizeof(playCallback));
}

#pragma mark - callback functions

static OSStatus RecordCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData) {
    // Pull the captured frames into our own buffer list.
    AudioUnitRender(audioUnit, ioActionFlags, inTimeStamp, inBusNumber,
                    inNumberFrames, buffList);
    short *data = (short *)buffList->mBuffers[0].mData; // the captured PCM samples
    (void)data;
    NSLog(@"%u %u %u", (unsigned int)buffList->mBuffers[0].mDataByteSize,
          (unsigned int)inBusNumber, (unsigned int)inNumberFrames);
    return noErr;
}

static OSStatus PlayCallback(void *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp *inTimeStamp,
                             UInt32 inBusNumber,
                             UInt32 inNumberFrames,
                             AudioBufferList *ioData) {
    // Render the input bus straight into the output buffer (live monitoring).
    // Alternatively, copy the last recorded buffer:
    // memcpy(ioData->mBuffers[0].mData, buffList->mBuffers[0].mData,
    //        buffList->mBuffers[0].mDataByteSize);
    AudioUnitRender(audioUnit, ioActionFlags, inTimeStamp, INPUT_BUS,
                    inNumberFrames, ioData);
    return noErr;
}

#pragma mark - public methods

- (void)startRecorder {
    AudioOutputUnitStart(audioUnit);
}

- (void)stopRecorder {
    AudioOutputUnitStop(audioUnit);
}

@end
```