
iPhone/iPad Development Notes 2012/03/11 - 2: Audio Unit

1. Reading

"Audio Unit Hosting Guide for iOS"/

"Audio Unit Processing Graph Service References"/ - for managing the graph ( a collection of units )

"Audio Component Services Reference"/ - for locating/loading a audio unit component

"Audio Unit Component Services References"- for managing a specific audio unit itself

* threading and locks

A graph object’s state can be manipulated in both the rendering thread and in other threads. Consequently, any activities that affect the state of the graph are guarded
with locks and a messaging model between any calling thread and the thread upon which the graph object’s output unit is called (the render thread).

* identification of an Audio Unit

typedef struct AudioComponentDescription {
    OSType componentType;          // mandatory
    OSType componentSubType;       // mandatory
    OSType componentManufacturer;  // mandatory
    UInt32 componentFlags;         // always 0
    UInt32 componentFlagsMask;     // always 0
} AudioComponentDescription;

Note: iOS has already registered 7 audio units for us to use. Use the structure above to find one, as in the sketch below.
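A minimal sketch of the lookup, assuming we want the Remote I/O unit (the function name createRemoteIOUnit is mine; error handling omitted):

#include <AudioUnit/AudioUnit.h>

static AudioUnit createRemoteIOUnit(void) {
    AudioComponentDescription desc = {0};
    desc.componentType         = kAudioUnitType_Output;
    desc.componentSubType      = kAudioUnitSubType_RemoteIO;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;
    desc.componentFlags        = 0;    // always 0
    desc.componentFlagsMask    = 0;    // always 0

    // NULL = start searching from the beginning of the system registry
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit unit = NULL;
    if (comp != NULL) {
        AudioComponentInstanceNew(comp, &unit);
    }
    return unit;
}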

* AURenderCallback

Called by the system when an audio unit requires input samples, or before and after a render operation.

^ What's this?
typedef OSStatus (*AURenderCallback) (
    void                        *inRefCon,      // meaning "reference context"
    AudioUnitRenderActionFlags  *ioActionFlags,
    const AudioTimeStamp        *inTimeStamp,
    UInt32                      inBusNumber,
    UInt32                      inNumberFrames,
    AudioBufferList             *ioData
);


ioData

The AudioBufferList that will be used to contain the rendered or provided audio data.
^ When "rendered"and when "provided"?

* scope and element

scope = input (stream) / output (stream) / global (applies to the whole audio unit)

element (bus) = some kind of function

A mixer unit, for example, might have several input elements but a single output element.

Note: this means a particular element (bus) is identified only by the combination of scope + bus number, as the sketch below shows.
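A quick sketch of how scope + element appear in the API (helper name is mine; the unit is assumed to be an already-created Remote I/O unit):

#include <AudioUnit/AudioUnit.h>

// Every property access is addressed by (scope, element).
// Here: the stream format on the INPUT scope of element 1,
// which on a Remote I/O unit is the microphone bus.
static OSStatus getInputFormat(AudioUnit ioUnit,
                               AudioStreamBasicDescription *outASBD) {
    UInt32 size = sizeof(*outASBD);
    return AudioUnitGetProperty(ioUnit,
                                kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Input,   // scope
                                1,                       // element (bus) number
                                outASBD,
                                &size);
}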

* I/O Unit Essentials

always used (virtually every iOS audio unit app needs one)

the only kind of unit that can start/stop the AUGraph's processing

input element + output element + callback (my own app)

* AUGraph / AUNode

Provides the capability of managing all units and callbacks.

It is THREAD SAFE, which means you can change the configuration while the audio is playing. (This is the only place this kind of thing can happen.)

AUNode = a proxy for an Audio Unit in the AUGraph context (e.g., when adding an Audio Unit into an AUGraph); however, you still need to handle the Audio Unit directly when you want to configure it.

* Constructing an AUGraph

Adding nodes to a graph
Directly configuring the audio units represented by the nodes
Interconnecting the nodes

* Figure 4-1: a simple AUGraph for playback

Mixer Unit + Remote I/O Unit

Q: The mixer unit's output element is connected to the output element of the I/O unit (why not the input element?)
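A (my reading of the hosting guide): the connection actually terminates on the input scope of the I/O unit's output element (element 0); the input element (element 1) is wired to the microphone hardware, so it is not a valid destination for the mixer. A sketch of building Figure 4-1, tying together the three construction steps above (all names mine, error handling omitted):

#include <AudioToolbox/AudioToolbox.h>

static AUGraph buildPlaybackGraph(void) {
    AUGraph graph = NULL;
    NewAUGraph(&graph);

    AudioComponentDescription ioDesc = {0}, mixDesc = {0};
    ioDesc.componentType          = kAudioUnitType_Output;
    ioDesc.componentSubType       = kAudioUnitSubType_RemoteIO;
    ioDesc.componentManufacturer  = kAudioUnitManufacturer_Apple;
    mixDesc.componentType         = kAudioUnitType_Mixer;
    mixDesc.componentSubType      = kAudioUnitSubType_MultiChannelMixer;
    mixDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // 1. Adding nodes (an AUNode is only a proxy)
    AUNode ioNode, mixNode;
    AUGraphAddNode(graph, &ioDesc, &ioNode);
    AUGraphAddNode(graph, &mixDesc, &mixNode);
    AUGraphOpen(graph);    // instantiates the underlying audio units

    // 2. To configure a unit, fetch the real AudioUnit from its node
    AudioUnit mixerUnit = NULL;
    AUGraphNodeInfo(graph, mixNode, NULL, &mixerUnit);
    // ... AudioUnitSetProperty(mixerUnit, ...) calls go here ...

    // 3. Interconnect: mixer output bus 0 -> input scope of the
    //    I/O unit's OUTPUT element (bus 0). Bus 1 is the mic element.
    AUGraphConnectNodeInput(graph, mixNode, 0, ioNode, 0);

    AUGraphInitialize(graph);    // stream formats propagate here
    return graph;
}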

* Render Callback Function

The render callback function lives on a real-time priority thread, and subsequent render calls arrive asynchronously, which means the render callback must run FAST ENOUGH!

Do NOT do the following (see the sketch after this list):

# take locks

# allocate memory

# touch the file system

# make network connections
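A minimal sketch of a callback that respects these rules: it only touches preallocated memory. Here it just writes silence; a real app would copy samples from a lock-free ring buffer filled on another thread (the state type is hypothetical):

#include <AudioUnit/AudioUnit.h>
#include <string.h>

typedef struct {
    SInt32 *ring;        // preallocated ring buffer, filled elsewhere
    UInt32  readIndex;
} MyPlayerState;         // hypothetical state type

static OSStatus myRenderCallback(void                        *inRefCon,
                                 AudioUnitRenderActionFlags  *ioActionFlags,
                                 const AudioTimeStamp        *inTimeStamp,
                                 UInt32                       inBusNumber,
                                 UInt32                       inNumberFrames,
                                 AudioBufferList             *ioData) {
    MyPlayerState *state = (MyPlayerState *)inRefCon;  // "reference context"
    // No locks, no malloc, no file/network I/O in here.
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
        // Canonical format is noninterleaved: one channel per buffer.
        memset(ioData->mBuffers[i].mData, 0,
               ioData->mBuffers[i].mDataByteSize);     // silence for now
    }
    (void)state; (void)ioActionFlags; (void)inTimeStamp;
    (void)inBusNumber; (void)inNumberFrames;
    return noErr;
}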

* AURenderCallback

As a notification listener, the system invokes this callback before and after an audio unit’s render operations.

As a render operation input callback, it is invoked when an audio unit requires input samples for the input bus that the callback is attached to.

* Parameter: inNumberFrames

inNumberFrames parameter indicates the number of audio sample frames that the callback is being asked to provide on the current invocation. You provide those frames to the buffers in the ioData parameter.

Q: How does the system decide the value of inNumberFrames??? (It should follow from the session's I/O buffer duration below: roughly sample rate × buffer duration, e.g. 44.1 kHz × 23 ms ≈ 1,024 frames.)

* AudioUnitSampleType

The canonical audio data sample type for audio processing.

typedef SInt32 AudioUnitSampleType;
#define kAudioUnitSampleFractionBits 24

Discussion: The canonical audio sample type for audio units and other audio processing in iPhone OS is noninterleaved (?) linear PCM with 8.24-bit fixed-point samples.

Q: What does this really mean? (See the sketch below.)
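My reading: the SInt32 carries 8 integer bits and 24 fraction bits (so full scale 1.0 = 1 << 24), and "noninterleaved" means each channel lives in its own buffer of the AudioBufferList. A sketch of converting ordinary 16-bit PCM to and from this format (helper names are mine):

#include <CoreAudio/CoreAudioTypes.h>

// 8.24 fixed point: value = storedInt / 2^24.
// A 16-bit sample s maps to (s / 2^15) * 2^24 = s << 9.
static inline AudioUnitSampleType int16ToCanonical(SInt16 s) {
    return ((AudioUnitSampleType)s) << 9;
}

static inline SInt16 canonicalToInt16(AudioUnitSampleType s) {
    return (SInt16)(s >> 9);   // drops the extra fractional precision
}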

* ASBD

layout of bits (note: the kAudioFormatFlagIsFloat variant quoted in the docs is the Mac OS X definition; on iOS, where the canonical sample type is 8.24 fixed point, the flags are):

kAudioFormatFlagsAudioUnitCanonical =
    kAudioFormatFlagIsSignedInteger |
    kAudioFormatFlagsNativeEndian |
    kAudioFormatFlagIsPacked |
    kAudioFormatFlagIsNonInterleaved |
    (kAudioUnitSampleFractionBits << kLinearPCMFormatFlagsSampleFractionShift)
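A sketch of filling an ASBD with this canonical layout (function name is mine; note the memset, which is also the answer to the "zeroing the ASBD" question further below):

#include <CoreAudio/CoreAudioTypes.h>
#include <string.h>

static AudioStreamBasicDescription makeCanonicalASBD(Float64 sampleRate,
                                                     UInt32  channels) {
    AudioStreamBasicDescription asbd;
    memset(&asbd, 0, sizeof(asbd));    // zero unset fields first!
    asbd.mSampleRate       = sampleRate;
    asbd.mFormatID         = kAudioFormatLinearPCM;
    asbd.mFormatFlags      = kAudioFormatFlagsAudioUnitCanonical;
    asbd.mChannelsPerFrame = channels;
    asbd.mFramesPerPacket  = 1;
    asbd.mBitsPerChannel   = 8 * sizeof(AudioUnitSampleType);
    // Noninterleaved: each buffer holds one channel, so the per-frame
    // byte counts describe a single channel only.
    asbd.mBytesPerFrame    = sizeof(AudioUnitSampleType);
    asbd.mBytesPerPacket   = asbd.mBytesPerFrame;
    return asbd;
}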

* unit connection & stream format propagation

propagates in one direction only, from the output of the source unit to the input of the destination unit

happens when the AUGraph is initialized

* hardware sample rate

Use the hardware sample rate whenever possible.

Q: Should we use 44,100 Hz rather than 16,000 Hz for our iPad conference client?

* configure Audio Session: sample rate

Set the hardware sample rate to the intended sample rate, which avoids sample rate conversion (IMPORTANT!)

[mySession setPreferredHardwareSampleRate: graphSampleRate
                                    error: &audioSessionError];

* configure Audio Session: ioBufferDuration

There’s one other hardware characteristic you may want to configure: audio hardware I/O buffer duration. The default duration is about 23 ms at a 44.1 kHz sample rate, equivalent to a slice size of 1,024 samples. If I/O latency is critical in your app, you can request a smaller duration, down to about 0.005 s (equivalent to 256 samples), as shown here:

self.ioBufferDuration = 0.005;    // seconds, not milliseconds

[mySession setPreferredIOBufferDuration: ioBufferDuration
                                  error: &audioSessionError];
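Putting both session settings together, a sketch using the 2012-era AVAudioSession API (function name is mine; remember the "preferred" values are only hints, so read back the actual rate after activation):

#import <AVFoundation/AVFoundation.h>

static Float64 configureAudioSession(Float64 graphSampleRate) {
    NSError *audioSessionError = nil;
    AVAudioSession *mySession = [AVAudioSession sharedInstance];

    [mySession setPreferredHardwareSampleRate: graphSampleRate
                                        error: &audioSessionError];
    [mySession setPreferredIOBufferDuration: 0.005   // seconds, not ms
                                      error: &audioSessionError];
    [mySession setActive: YES error: &audioSessionError];

    // The hardware may not grant the preferred values; use the actual
    // rate when configuring the AUGraph's stream formats.
    return [mySession currentHardwareSampleRate];
}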

* Miscellaneous questions

Q: I don't seem to see anywhere that the ASBD gets zeroed out? (The memset in the ASBD sketch above is exactly that.)

2. Studying the MixerHost sample project

3. Write my own Audio Unit code in the conference project.

4. Conference project

1:30 PM: The Notification error message also appears in other programs. The following questions need to be cleared up:

* Does the problem show up only in the speex-related log output?

* Why didn't the problem appear yesterday? What changed? Yesterday I used a Hackintosh, today a real device?

5. The 深度 G711 version of the conference project

* Starting the meeting fails

Root cause: the *.ini file needs both the meeting no and the start time changed.

* iPad fails to join the meeting

Console shows: 40206 ????

[Session started at 2012-03-11 20:05:53 +0800.]

2012-03-11 20:05:54.509 test555[921:207] xxxxxxxxxxxx

2012-03-11 20:05:54.613 test555[921:6803] onResetVideoSize

2012-03-11 20:05:54.614 test555[921:6803] joinConf

2012-03-11 20:05:54.749 test555[921:6803] 40206