
use ffmpeg to set up a streaming server on android

ffmpeg is a powerful media framework. It includes the ffserver tool, which can be used to set up a streaming server.

Here is how to compile ffmpeg for Android using CodeSourcery's cross compiler.

1. Download and extract ffmpeg source code.

2. Use the commands below to compile ffmpeg:

./configure --arch=arm --cross-prefix=arm-none-linux-gnueabi- --extra-ldflags=-static --target-os=linux

make
3. Run file ffserver && readelf -d ffserver or
arm-none-linux-gnueabi-objdump -x ffserver | grep NEEDED to make sure ffserver is statically linked

4. Transfer the ffserver tool, the ffserver.conf file (which defines what media files will be served by ffserver) and the media files to the Android device with
the adb push command

5. Start the streaming server with ./ffserver -f ffserver.conf in the Android shell

Below is a sample ffserver.conf file, which tells ffserver to listen on RTSP port 7654. It defines two streams backed by /data/1.mp3 and /data/1.mp4, so make sure these files exist.

# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 8090

# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0

# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
RTSPPort 7654

# Address on which the server is bound. Only useful if you have
# several network interfaces.
RTSPBindAddress 0.0.0.0

# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 1000

# This is the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 1000

# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -

# Suppress that if you want to launch ffserver as a daemon.
NoDaemon

<Stream 1.mp4>
Format rtp
File "/data/1.mp4"
</Stream>

<Stream 1.mp3>
Format rtp
File "/data/1.mp3"
</Stream>

To test ffserver, we can start a media player that supports media streaming and open the URL rtsp://{ip address of the android device}:7654/1.mp3.
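
If the test client is itself an Android device, the platform MediaPlayer class can also open rtsp URLs (on builds that ship RTSP support). A minimal sketch, assuming the device running ffserver is reachable at 192.168.1.100, which is a placeholder address:

import android.media.MediaPlayer;
import java.io.IOException;

// Minimal sketch of testing the stream from an Android client.
// The IP address below is a placeholder for the device running ffserver.
public class RtspTest {
    public static MediaPlayer playStream() throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource("rtsp://192.168.1.100:7654/1.mp3");
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start(); // start playback once the stream is prepared
            }
        });
        player.prepareAsync(); // prepare without blocking the caller
        return player;
    }
}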

streaming audio on android

Streaming media refers to the capability of playing media data while the data is still being transferred from the server. The user doesn't need to wait until the full media content has been downloaded before playback starts. In media streaming, the media content is split into small chunks that serve as the transport unit. Once the player has received sufficient chunks, it starts playing.

From the developer's perspective, media streaming consists of two tasks: transferring data and rendering data. Application developers usually concentrate more on transferring data than on rendering it, because the codec and media renderer are usually already available.

On Android, streaming audio is somewhat easier than streaming video, because Android provides a friendlier API for rendering audio data in small chunks. No matter what our transfer mechanism is, RTP, raw UDP or raw file reading, we need to feed the chunks we receive to the renderer. The AudioTrack.write function enables us to do so.
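
A minimal stream-mode sketch is shown below. It assumes the chunks arrive as 16-bit, 44.1 kHz stereo PCM through some InputStream; where that stream comes from (a socket, an RTP depacketizer, or a local file) is left abstract:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import java.io.IOException;
import java.io.InputStream;

public class PcmStreamPlayer {
    // Feed PCM chunks to AudioTrack as they arrive. The InputStream is a
    // stand-in for whatever transfer mechanism delivers the audio data.
    public static void play(InputStream pcmSource) throws IOException {
        int sampleRate = 44100;
        int minBufSize = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO,
                AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
                sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO,
                AudioFormat.ENCODING_PCM_16BIT,
                minBufSize,
                AudioTrack.MODE_STREAM);
        track.play();                    // stream mode: play() first, then write()
        byte[] chunk = new byte[minBufSize];
        int read;
        while ((read = pcmSource.read(chunk)) > 0) {
            track.write(chunk, 0, read); // blocks until the chunk is queued
        }
        track.stop();
        track.release();
    }
}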

An AudioTrack object runs in one of two modes, static or stream. In static mode, we write the whole audio file to the audio hardware at once. In stream mode, audio data is written in small chunks. Static mode is more efficient because it doesn't have the overhead of copying data from the Java layer to the native layer, but it's not suitable if the audio file is too big to fit in memory. It's important to note that we call play at different times in the two modes. In static mode, we must call write first and then call play; otherwise, AudioTrack raises an exception complaining that the AudioTrack object isn't properly initialized. In stream mode, they are called in the reverse order. Under the hood, the static and stream modes determine the memory model: in static mode, audio data is passed to the renderer via shared memory, which is why it is more efficient.
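
For comparison, a static-mode sketch, assuming the whole PCM clip already fits in a byte array; note that write is called before play:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class StaticClipPlayer {
    // Static mode: the whole buffer is handed over first, then play() starts it.
    public static AudioTrack play(byte[] pcm, int sampleRate) {
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
                sampleRate,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                pcm.length,                 // buffer must hold the entire clip
                AudioTrack.MODE_STATIC);
        track.write(pcm, 0, pcm.length);    // write() before play() in static mode
        track.play();
        return track;
    }
}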

From a bird's eye view, the architecture of a typical audio streaming application is:

[Figure: application -> Java AudioTrack -> JNI -> native AudioTrack proxy -> binder -> audioflinger -> audio hardware]
Our application receives data from the network. The data is then passed to a Java-layer AudioTrack object, which internally calls through JNI into a native AudioTrack object. The native AudioTrack object in our application is a proxy that refers, through the binder IPC mechanism, to the implementation AudioTrack object residing in the audioflinger process. The audioflinger process then interacts with the audio hardware.

Because our application and audioflinger are separate processes, once our application has written data to audioflinger, playback will not stop even if our application exits.

AudioTrack only supports raw PCM audio data. In other words, we can't stream mp3 audio directly. We have to deal with the decoding ourselves and feed the decoded data to AudioTrack.
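
MediaCodec and MediaExtractor, which were added to the platform well after this post was written, are one way to handle that decoding. The rough sketch below assumes API 21+ and a local mp3 path; it decodes to PCM and feeds the output to a stream-mode AudioTrack:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.io.IOException;
import java.nio.ByteBuffer;

public class Mp3Decoder {
    // Decode an mp3 file to PCM with MediaCodec and hand the decoded chunks
    // to AudioTrack. Error handling is omitted to keep the sketch short.
    public static void decodeAndPlay(String path) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);
        MediaFormat format = extractor.getTrackFormat(0);
        extractor.selectTrack(0);
        int sampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);
        int channelConfig = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT) == 1
                ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO;
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                channelConfig, AudioFormat.ENCODING_PCM_16BIT,
                AudioTrack.getMinBufferSize(sampleRate, channelConfig,
                        AudioFormat.ENCODING_PCM_16BIT),
                AudioTrack.MODE_STREAM);
        track.play();

        MediaCodec codec = MediaCodec.createDecoderByType(
                format.getString(MediaFormat.KEY_MIME));
        codec.configure(format, null, null, 0);
        codec.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        while (true) {
            if (!inputDone) {
                int in = codec.dequeueInputBuffer(10000);
                if (in >= 0) {
                    ByteBuffer buf = codec.getInputBuffer(in);
                    int size = extractor.readSampleData(buf, 0);
                    if (size < 0) {                       // no more compressed data
                        codec.queueInputBuffer(in, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(in, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int out = codec.dequeueOutputBuffer(info, 10000);
            if (out >= 0) {
                ByteBuffer pcmBuf = codec.getOutputBuffer(out);
                byte[] pcm = new byte[info.size];
                pcmBuf.get(pcm);
                track.write(pcm, 0, pcm.length);          // feed decoded PCM chunks
                codec.releaseOutputBuffer(out, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    break;
                }
            }
        }
        codec.stop();
        codec.release();
        extractor.release();
        track.stop();
        track.release();
    }
}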

Sample

http://code.google.com/p/rxwen-blog-stuff/source/browse/trunk/android/streaming_audio/

For demonstration purposes, this sample chooses a very simple transfer mechanism: it reads data from a wav file on disk in chunks, but we can treat the data as if it were delivered from a media server over the network. The idea is the same.

Original article: http://rxwen.blogspot.com/2010/05/use-ffmpeg-to-setup-streaming-server-on.html