
Detecting headphone plug-in/unplug on iPhone

2014-05-12 14:56
While implementing recording and playback we ran into three main problems:

Detecting whether an audio input device is present

Selecting which output device to use when several audio output devices are present

Detecting when headphones are plugged in or unplugged

For the first problem, devices without a built-in microphone, such as the iPod touch and iPad, require checking whether a headset with recording capability is plugged in; the iPhone has a built-in microphone, so it is the easier case. For the second, plugging headphones (or another output device) into a device that has its own speaker means there are multiple outputs, and the program needs to specify where the sound should go. For the third, plugging in or unplugging headphones always changes the audio output device, and on an iPod touch or iPad, plugging in or unplugging a headset with a microphone also changes the audio input device.

1. Detecting the audio input device


- (BOOL)hasMicphone {
    return [[AVAudioSession sharedInstance] inputIsAvailable];
}

2. Detecting audio output devices

For output detection we only consider two cases: the device's own speaker (the iPod touch, iPad, and iPhone all have one) and whether headphones with their own output are currently plugged in. iOS already provides a way to query the current audio route, so we only need to check whether the devices we care about appear in it.

Querying the current audio route:


CFStringRef route;
UInt32 propertySize = sizeof(CFStringRef);
AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);

All possible audio route values on iOS are:


/* Known values of route:
 * "Headset"
 * "Headphone"
 * "Speaker"
 * "SpeakerAndMicrophone"
 * "HeadphonesAndMicrophone"
 * "HeadsetInOut"
 * "ReceiverAndMicrophone"
 * "Lineout"
 */

See the iOS documentation for the device each value represents. Here we only care about whether headphones are present, so it is enough to check whether the route contains Headphone or Headset:


- (BOOL)hasHeadset {
#if TARGET_IPHONE_SIMULATOR
    #warning *** Simulator mode: audio session code works only on a device
    return NO;
#else
    CFStringRef route;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);
    if ((route == NULL) || (CFStringGetLength(route) == 0)) {
        // Silent Mode
        NSLog(@"AudioRoute: SILENT, do nothing!");
    } else {
        NSString *routeStr = (NSString *)route;
        NSLog(@"AudioRoute: %@", routeStr);
        /* Known values of route:
         * "Headset"
         * "Headphone"
         * "Speaker"
         * "SpeakerAndMicrophone"
         * "HeadphonesAndMicrophone"
         * "HeadsetInOut"
         * "ReceiverAndMicrophone"
         * "Lineout"
         */
        NSRange headphoneRange = [routeStr rangeOfString:@"Headphone"];
        NSRange headsetRange = [routeStr rangeOfString:@"Headset"];
        if (headphoneRange.location != NSNotFound) {
            return YES;
        } else if (headsetRange.location != NSNotFound) {
            return YES;
        }
    }
    return NO;
#endif
}

Note that the AudioRoute APIs cannot run on the simulator (they crash immediately), so the simulator case must be handled up front.

3. Setting the audio output device

In our project, users may plug in or unplug headphones while audio is playing. If headphones are plugged in during playback, Apple automatically routes the sound to the headphones and adjusts the volume to a suitable level; if headphones are unplugged during playback, the sound automatically comes out of the device's own speaker, but the volume is not automatically raised.

In our testing we found two problems when headphones are unplugged during playback (perhaps not problems for you, but they affected our app):

Music playback stops automatically

The volume is not raised automatically; the system keeps playing through the speaker at the quieter level that was appropriate for headphones

The first problem amounts to detecting the headphone-unplug event; the second requires forcing the system output device back to the speaker when headphones are unplugged.

Forcing the system audio output device:


- (void)resetOutputTarget {
    BOOL hasHeadset = [self hasHeadset];
    NSLog(@"Will set output target, is_headset = %@.", hasHeadset ? @"YES" : @"NO");
    UInt32 audioRouteOverride = hasHeadset ?
        kAudioSessionOverrideAudioRoute_None : kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
}

As you can see, we modify the AudioSession property kAudioSessionProperty_OverrideAudioRoute, which the iOS documentation describes as follows:

kAudioSessionProperty_OverrideAudioRoute
Specifies whether or not to override the audio session category's normal audio route. Can be set with one of two values: kAudioSessionOverrideAudioRoute_None, which specifies that you want to use the normal audio route; and kAudioSessionOverrideAudioRoute_Speaker, which sends output audio to the speaker. A write-only UInt32 value.
Upon an audio route change (such as by plugging in or unplugging a headset), or upon interruption, this property reverts to its default value.
This property can be used only with the kAudioSessionCategory_PlayAndRecord (or the equivalent AVAudioSessionCategoryPlayAndRecord) category.
So this property works only when the category is kAudioSessionCategory_PlayAndRecord (i.e. AVAudioSessionCategoryPlayAndRecord), which means we also need to be able to set the AudioSession category.

4. Setting the audio category (we think of a category as a working mode)

iOS audio supports several categories; to use a given feature, the AudioSession must first be set to a category that supports it. The supported categories are:


Audio Session Categories
Category identifiers for audio sessions, used as values for the setCategory:error: method.

NSString *const AVAudioSessionCategoryAmbient;
NSString *const AVAudioSessionCategorySoloAmbient;
NSString *const AVAudioSessionCategoryPlayback;
NSString *const AVAudioSessionCategoryRecord;
NSString *const AVAudioSessionCategoryPlayAndRecord;
NSString *const AVAudioSessionCategoryAudioProcessing;

See the iOS documentation for what each category does. AVAudioSessionCategoryRecord is a record-only mode, AVAudioSessionCategoryPlayAndRecord supports both recording and playback, and AVAudioSessionCategoryPlayback is a plain playback mode.

Setting the category:


- (BOOL)checkAndPrepareCategoryForRecording {
    recording = YES;
    BOOL hasMicphone = [self hasMicphone];
    NSLog(@"Will set category for recording! hasMicophone = %@", hasMicphone ? @"YES" : @"NO");
    if (hasMicphone) {
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
                                               error:nil];
    }
    [self resetOutputTarget];
    return hasMicphone;
}

- (void)resetCategory {
    if (!recording) {
        NSLog(@"Will set category to static value = AVAudioSessionCategoryPlayback!");
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                               error:nil];
    }
}

5. Detecting headphone plug/unplug events

Plug/unplug detection works by listening for the AudioSession RouteChange event and then checking the headphone state. There are two steps: first register a listener callback, then determine the headphone state inside that callback.

Registering the listener:


AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange,
                                audioRouteChangeListenerCallback,
                                self);

We want to react when headphones are plugged in or unplugged, but a RouteChange event can have several causes, so each cause must be handled and combined with the current headphone state. The iOS documentation lists the possible causes as follows:


Audio Session Route Change Reasons
Identifiers for the various reasons that an audio route can change while your iOS application is running.

enum {
    kAudioSessionRouteChangeReason_Unknown = 0,
    kAudioSessionRouteChangeReason_NewDeviceAvailable = 1,
    kAudioSessionRouteChangeReason_OldDeviceUnavailable = 2,
    kAudioSessionRouteChangeReason_CategoryChange = 3,
    kAudioSessionRouteChangeReason_Override = 4,
    // this enum has no constant with a value of 5
    kAudioSessionRouteChangeReason_WakeFromSleep = 6,
    kAudioSessionRouteChangeReason_NoSuitableRouteForCategory = 7
};

See the iOS documentation for the meaning of each value. The ones we care about are kAudioSessionRouteChangeReason_NewDeviceAvailable (a new device was attached), kAudioSessionRouteChangeReason_OldDeviceUnavailable (an existing device was removed), and kAudioSessionRouteChangeReason_NoSuitableRouteForCategory (the current category has no suitable device).

When a new device is attached and we detect headphones, we treat it as a plug-in event; when an existing device is removed and we can no longer detect headphones, we treat it as an unplug event; when "no suitable route for the category" occurs, we conclude the microphone was unplugged during recording.

Clearly this logic is not strictly accurate: if headphones were already present and some other audio device is attached, or no headphones were present and some other device is removed, the judgment is wrong. But our project does not actually care whether it was specifically headphones that were plugged in or unplugged; what matters is that whenever an audio device is attached or removed, we re-adjust the settings according to the current headphone/microphone state, so for our purposes this implementation is correct.

The listener callback:


void audioRouteChangeListenerCallback(
    void *inUserData,
    AudioSessionPropertyID inPropertyID,
    UInt32 inPropertyValueSize,
    const void *inPropertyValue
) {
    if (inPropertyID != kAudioSessionProperty_AudioRouteChange) return;
    // Determine the reason for the route change, to ensure that it is not
    // because of a category change.
    CFDictionaryRef routeChangeDictionary = inPropertyValue;
    CFNumberRef routeChangeReasonRef =
        CFDictionaryGetValue(routeChangeDictionary,
                             CFSTR(kAudioSession_AudioRouteChangeKey_Reason));
    SInt32 routeChangeReason;
    CFNumberGetValue(routeChangeReasonRef, kCFNumberSInt32Type, &routeChangeReason);
    NSLog(@"======================= RouteChangeReason: %d", routeChangeReason);
    AudioHelper *_self = (AudioHelper *)inUserData;
    if (routeChangeReason == kAudioSessionRouteChangeReason_OldDeviceUnavailable) {
        [_self resetSettings];
        if (![_self hasHeadset]) {
            [[NSNotificationCenter defaultCenter] postNotificationName:@"ununpluggingHeadse"
                                                                object:nil];
        }
    } else if (routeChangeReason == kAudioSessionRouteChangeReason_NewDeviceAvailable) {
        [_self resetSettings];
        if (![_self hasMicphone]) {
            [[NSNotificationCenter defaultCenter] postNotificationName:@"pluggInMicrophone"
                                                                object:nil];
        }
    } else if (routeChangeReason == kAudioSessionRouteChangeReason_NoSuitableRouteForCategory) {
        [_self resetSettings];
        [[NSNotificationCenter defaultCenter] postNotificationName:@"lostMicroPhone"
                                                            object:nil];
    }
    // else if (routeChangeReason == kAudioSessionRouteChangeReason_CategoryChange) {
    //     [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    // }
    [_self printCurrentCategory];
}

When one of these events is detected, observers are notified of the headphone (with or without microphone) plug/unplug event via NSNotificationCenter, which then triggers the corresponding handling.
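The code above only shows the posting side. For completeness, an observer can subscribe to these notifications roughly as sketched below; the selector name headsetUnplugged: is our own hypothetical choice, while the notification name comes from the callback above:

```objc
// Sketch: subscribing to the unplug notification posted by the route-change
// callback. Register this in your controller's setup code and remember to
// call removeObserver: in dealloc.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(headsetUnplugged:)
                                             name:@"ununpluggingHeadse"
                                           object:nil];
```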

6. Event handling

For headphone (with or without microphone) plug/unplug events, the handling generally needed is:

Force-reset the system audio output device (so the system does not keep playing quietly through the speaker)

If playback was in progress before the unplug, restart the playback that was paused (the system pauses playback automatically when headphones are unplugged)

If recording was in progress before the unplug, check the microphone state and decide whether to stop recording (for the case where a headset with a microphone is unplugged from an iPod touch/iPad)
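The three steps above can be sketched as a notification handler. Apart from resetSettings and hasMicphone (which come from AudioHelper), every name here (audioHelper, player, wasPlayingBeforeUnplug, isRecording, stopRecording) is a hypothetical placeholder for your app's own playback/recording objects:

```objc
// Sketch of an unplug handler, assuming hypothetical app-side properties.
- (void)headsetUnplugged:(NSNotification *)notification {
    [self.audioHelper resetSettings];        // 1. force output back to the speaker
    if (self.wasPlayingBeforeUnplug) {
        [self.player play];                  // 2. resume the auto-paused playback
    }
    if (self.isRecording && ![self.audioHelper hasMicphone]) {
        [self stopRecording];                // 3. no input device left, stop recording
    }
}
```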

Full code

AudioHelper.h


#import <Foundation/Foundation.h>

@interface AudioHelper : NSObject {
    BOOL recording;
}

- (void)initSession;
- (BOOL)hasHeadset;
- (BOOL)hasMicphone;
- (void)cleanUpForEndRecording;
- (BOOL)checkAndPrepareCategoryForRecording;

@end

AudioHelper.m


#import "AudioHelper.h"
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

@implementation AudioHelper

- (BOOL)hasMicphone {
    return [[AVAudioSession sharedInstance] inputIsAvailable];
}

- (BOOL)hasHeadset {
#if TARGET_IPHONE_SIMULATOR
    #warning *** Simulator mode: audio session code works only on a device
    return NO;
#else
    CFStringRef route;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);
    if ((route == NULL) || (CFStringGetLength(route) == 0)) {
        // Silent Mode
        NSLog(@"AudioRoute: SILENT, do nothing!");
    } else {
        NSString *routeStr = (NSString *)route;
        NSLog(@"AudioRoute: %@", routeStr);
        /* Known values of route:
         * "Headset"
         * "Headphone"
         * "Speaker"
         * "SpeakerAndMicrophone"
         * "HeadphonesAndMicrophone"
         * "HeadsetInOut"
         * "ReceiverAndMicrophone"
         * "Lineout"
         */
        NSRange headphoneRange = [routeStr rangeOfString:@"Headphone"];
        NSRange headsetRange = [routeStr rangeOfString:@"Headset"];
        if (headphoneRange.location != NSNotFound) {
            return YES;
        } else if (headsetRange.location != NSNotFound) {
            return YES;
        }
    }
    return NO;
#endif
}

- (void)resetOutputTarget {
    BOOL hasHeadset = [self hasHeadset];
    NSLog(@"Will set output target, is_headset = %@.", hasHeadset ? @"YES" : @"NO");
    UInt32 audioRouteOverride = hasHeadset ?
        kAudioSessionOverrideAudioRoute_None : kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
    [self hasHeadset];
}

- (BOOL)checkAndPrepareCategoryForRecording {
    recording = YES;
    BOOL hasMicphone = [self hasMicphone];
    NSLog(@"Will set category for recording! hasMicophone = %@", hasMicphone ? @"YES" : @"NO");
    if (hasMicphone) {
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
                                               error:nil];
    }
    [self resetOutputTarget];
    return hasMicphone;
}

- (void)resetCategory {
    if (!recording) {
        NSLog(@"Will set category to static value = AVAudioSessionCategoryPlayback!");
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                               error:nil];
    }
}

- (void)resetSettings {
    [self resetOutputTarget];
    [self resetCategory];
    BOOL isSucced = [[AVAudioSession sharedInstance] setActive:YES error:NULL];
    if (!isSucced) {
        NSLog(@"Reset audio session settings failed!");
    }
}

- (void)cleanUpForEndRecording {
    recording = NO;
    [self resetSettings];
}

- (void)printCurrentCategory {
    return; // debug output disabled
    UInt32 audioCategory;
    UInt32 size = sizeof(audioCategory);
    AudioSessionGetProperty(kAudioSessionProperty_AudioCategory, &size, &audioCategory);
    if (audioCategory == kAudioSessionCategory_UserInterfaceSoundEffects) {
        NSLog(@"current category is: kAudioSessionCategory_UserInterfaceSoundEffects");
    } else if (audioCategory == kAudioSessionCategory_AmbientSound) {
        NSLog(@"current category is: kAudioSessionCategory_AmbientSound");
    } else if (audioCategory == kAudioSessionCategory_SoloAmbientSound) {
        NSLog(@"current category is: kAudioSessionCategory_SoloAmbientSound");
    } else if (audioCategory == kAudioSessionCategory_MediaPlayback) {
        NSLog(@"current category is: kAudioSessionCategory_MediaPlayback");
    } else if (audioCategory == kAudioSessionCategory_LiveAudio) {
        NSLog(@"current category is: kAudioSessionCategory_LiveAudio");
    } else if (audioCategory == kAudioSessionCategory_RecordAudio) {
        NSLog(@"current category is: kAudioSessionCategory_RecordAudio");
    } else if (audioCategory == kAudioSessionCategory_PlayAndRecord) {
        NSLog(@"current category is: kAudioSessionCategory_PlayAndRecord");
    } else if (audioCategory == kAudioSessionCategory_AudioProcessing) {
        NSLog(@"current category is: kAudioSessionCategory_AudioProcessing");
    } else {
        NSLog(@"current category is: unknown");
    }
}

void audioRouteChangeListenerCallback(
    void *inUserData,
    AudioSessionPropertyID inPropertyID,
    UInt32 inPropertyValueSize,
    const void *inPropertyValue
) {
    if (inPropertyID != kAudioSessionProperty_AudioRouteChange) return;
    // Determine the reason for the route change, to ensure that it is not
    // because of a category change.
    CFDictionaryRef routeChangeDictionary = inPropertyValue;
    CFNumberRef routeChangeReasonRef =
        CFDictionaryGetValue(routeChangeDictionary,
                             CFSTR(kAudioSession_AudioRouteChangeKey_Reason));
    SInt32 routeChangeReason;
    CFNumberGetValue(routeChangeReasonRef, kCFNumberSInt32Type, &routeChangeReason);
    NSLog(@"===================================== RouteChangeReason: %d", routeChangeReason);
    AudioHelper *_self = (AudioHelper *)inUserData;
    if (routeChangeReason == kAudioSessionRouteChangeReason_OldDeviceUnavailable) {
        [_self resetSettings];
        if (![_self hasHeadset]) {
            [[NSNotificationCenter defaultCenter] postNotificationName:@"ununpluggingHeadse"
                                                                object:nil];
        }
    } else if (routeChangeReason == kAudioSessionRouteChangeReason_NewDeviceAvailable) {
        [_self resetSettings];
        if (![_self hasMicphone]) {
            [[NSNotificationCenter defaultCenter] postNotificationName:@"pluggInMicrophone"
                                                                object:nil];
        }
    } else if (routeChangeReason == kAudioSessionRouteChangeReason_NoSuitableRouteForCategory) {
        [_self resetSettings];
        [[NSNotificationCenter defaultCenter] postNotificationName:@"lostMicroPhone"
                                                            object:nil];
    }
    // else if (routeChangeReason == kAudioSessionRouteChangeReason_CategoryChange) {
    //     [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    // }
    [_self printCurrentCategory];
}

- (void)initSession {
    recording = NO;
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    [self resetSettings];
    AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange,
                                    audioRouteChangeListenerCallback,
                                    self);
    [self printCurrentCategory];
    [[AVAudioSession sharedInstance] setActive:YES error:NULL];
}

- (void)dealloc {
    [super dealloc];
}

@end