
Android: Playing Video and Grabbing the Frame at a Given Time

2016-08-12 11:12
A recent project required both playing a video (like a regular video player) and grabbing a frame roughly every second so the image could be processed further. After a week of investigation (and a week of frustration) I finally found an approach that more or less meets the requirement. This post first gives a quick overview of the methods I looked at along the way, then focuses on the one I ended up using. Let's get to it.

1. MediaMetadataRetriever

For playing a video and grabbing a single frame, this class is probably the first thing that comes to mind, and it was the first one I tested. Playback here is done with MediaPlayer, and the video is rendered into a SurfaceView.

public class PlayerMainActivity extends Activity implements OnClickListener,
SurfaceHolder.Callback, Runnable {

private static final String TAG = "Movie";
private MediaPlayer mediaPlayer;
private SurfaceView surfaceView;
private SurfaceHolder surfaceHolder;
private Button play_btn;
private int currentPosition = 0;
private Bitmap bitmap = null;
private String dataPath = Environment.getExternalStorageDirectory() + "/Video/Test_movie.AVI";

@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);

surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
play_btn = (Button) findViewById(R.id.play_btn);
play_btn.setOnClickListener(this);

surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
}

@Override
public void run() {
mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDisplay(surfaceHolder);
try {
mediaPlayer.setDataSource(dataPath);
mediaPlayer.prepare();
MediaMetadataRetriever mediaMetadataRetriever = new MediaMetadataRetriever();
mediaMetadataRetriever.setDataSource(dataPath);
int millis = mediaPlayer.getDuration();
Log.i(TAG, "millis: " + millis/1000);
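// Grab one frame every 0.5 s between the 10 s and 20 s marks; getFrameAtTime() takes microseconds, so the loop counter is in µs.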
for (int i = 10000*1000; i < 20*1000*1000; i+=500*1000) {

bitmap = mediaMetadataRetriever.getFrameAtTime(i, MediaMetadataRetriever.OPTION_CLOSEST);

String path = Environment.getExternalStorageDirectory() + "/bitmap/" + i + ".png";
FileOutputStream fileOutputStream = null;
try {
fileOutputStream = new FileOutputStream(path);
bitmap.compress(CompressFormat.PNG, 100, fileOutputStream);
Log.i(TAG, "i: " + i/1000/1000);
} catch (Exception e) {
Log.i(TAG, "Error: " + i/1000/1000);
e.printStackTrace();
}
finally {
if (fileOutputStream != null) {
fileOutputStream.close();
}
}
bitmap.recycle();
}
} catch (Exception e) {
e.printStackTrace();
}
}

@Override
public void surfaceCreated(SurfaceHolder holder) {
Thread t = new Thread(this);
t.start();
}

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width,    int height) {
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
}

@Override
public void onClick(View view) {
switch (view.getId()) {
case R.id.play_btn:
if (mediaPlayer.isPlaying()) {
mediaPlayer.pause();
play_btn.setText(getResources().getText(R.string.play));
} else {
mediaPlayer.start();
play_btn.setText(getResources().getText(R.string.pause));
}
break;
default:
break;
}
}

@Override
protected void onDestroy() {
super.onDestroy();
if (mediaPlayer.isPlaying()) {
mediaPlayer.stop();
}
mediaPlayer.release();
}
}


 

The key call for grabbing a frame is:

Bitmap bitmap = mediaMetadataRetriever.getFrameAtTime(timeMs * 1000, MediaMetadataRetriever.OPTION_CLOSEST);


public Bitmap getFrameAtTime(long timeUs, int option) 

The first parameter is the target time, and it must be in microseconds (µs).

The second parameter is one of:
OPTION_CLOSEST — retrieve the frame closest to the given time; it is not necessarily a key frame.
OPTION_CLOSEST_SYNC — retrieve the sync (key) frame closest to the given time.
OPTION_NEXT_SYNC — retrieve the first sync frame located after the given time.
OPTION_PREVIOUS_SYNC — retrieve the last sync frame located before the given time.

Since we want the frame at the exact requested time rather than a key frame, OPTION_CLOSEST is used here; a minimal standalone call is sketched below.
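As a quick self-contained reference, this is a minimal sketch of the call; the 12500 ms target time is just an example, and dataPath is assumed to be the same video path as in the listing above:

MediaMetadataRetriever retriever = new MediaMetadataRetriever();
try {
    retriever.setDataSource(dataPath);
    // getFrameAtTime() expects microseconds; OPTION_CLOSEST asks for the frame
    // nearest to that time, whether or not it is a key frame.
    long timeMs = 12500;
    Bitmap frame = retriever.getFrameAtTime(timeMs * 1000, MediaMetadataRetriever.OPTION_CLOSEST);
    // frame may be null if the grab fails; otherwise process or save it here.
} finally {
    retriever.release();
}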

 

The test results were disappointing. Out of 20 consecutive grabs only 7 were actually distinct; the rest were duplicates. Even with OPTION_CLOSEST the retriever still snapped to a key frame near the requested time, so for example every request between 10 s and 15 s returned the same image. This method is therefore unusable for our purpose.

Raising the video quality (perhaps with a shorter key-frame interval) might help, but I did not try it.

 

A few additional notes on MediaMetadataRetriever

// Get the total duration of the video (in milliseconds)

String time = mediaMetadataRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
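extractMetadata() returns a String (or null if the key is absent), so the value has to be parsed; a minimal sketch, with the null handling being my own addition:

String time = mediaMetadataRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
long durationMs = (time != null) ? Long.parseLong(time) : 0;  // total duration in milliseconds

// Other keys exist alongside METADATA_KEY_DURATION, e.g. the video dimensions:
String width  = mediaMetadataRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH);
String height = mediaMetadataRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT);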

In practice, MediaMetadataRetriever is mainly useful for retrieving thumbnails.

 

2. ThumbnailUtils

ThumbnailUtils is likewise meant for video thumbnails and is not reliable for grabbing arbitrary frames, so I did not dig into it further; a minimal usage sketch follows.
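For completeness, the typical one-liner looks roughly like this (file_path as in the later listings; requires android.media.ThumbnailUtils and android.provider.MediaStore). The result is a small thumbnail derived from a key frame, not an arbitrary frame:

Bitmap thumb = ThumbnailUtils.createVideoThumbnail(file_path, MediaStore.Video.Thumbnails.MINI_KIND);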

 

The two methods above show that the frame-grabbing facilities in the Android API mostly return thumbnails; they cannot produce an image for an arbitrary frame of the video. The investigation therefore had to shift towards decoding the video and extracting frames from the decoder.

 

3. MediaCodec

Hardware decoding. I tried to pull frame data out of the inputBuffers/outputBuffers, but this failed: the data behind the resulting bitmap was always 0 KB.

public class MoviePlayerActivity extends Activity implements OnTouchListener, OnClickListener, SurfaceHolder.Callback {

private static final String TAG = "Image";
private String file_path;
private Button movie_play;
private boolean playButtonVisible;
private boolean playPause;
private SurfaceView surfaceView;
private SurfaceHolder surfaceHolder;
private PlayerThread playerThread = null;
private ByteBuffer mPixelBuf;

@Override
protected void onCreate(Bundle savedInstanceState) {
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
requestWindowFeature(Window.FEATURE_NO_TITLE);  
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

super.onCreate(savedInstanceState);
setContentView(R.layout.movie_player_activity);
movie_play = (Button) findViewById(R.id.movie_videoview_play);
movie_play.setOnClickListener(this);
movie_play.setText("Play");

Intent intent = getIntent();
file_path = intent.getStringExtra("file_path");  // This Activity is launched from another screen; file_path is the path of the video to play.

surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
mPixelBuf = ByteBuffer.allocateDirect(640*480*4);
mPixelBuf.order(ByteOrder.LITTLE_ENDIAN);
}

@Override
public void surfaceCreated(SurfaceHolder holder) {
if (playerThread == null) {
playerThread = new PlayerThread(holder.getSurface());
playerThread.start();
}
}

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width,    int height) {
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
if (playerThread != null) {
playerThread.interrupt();
}
}

@Override
public boolean onTouch(View v, MotionEvent event) {
if (!playButtonVisible) {
movie_play.setVisibility(View.VISIBLE);
movie_play.setEnabled(true);
} else {
movie_play.setVisibility(View.INVISIBLE);
}
playButtonVisible = !playButtonVisible;
return false;
}

@Override
public void onClick(View view) {
switch (view.getId()) {
case R.id.movie_videoview_play:
if (!playPause) {
movie_play.setText("Pause");
} else {
movie_play.setText("Play");
}
playPause = !playPause;
break;
default:
break;
}
}

private void writeFrameToSDCard(byte[] bytes, int i, int sampleSize) {
i++;
if (i%10 == 0) {
try {
Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, sampleSize);
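// Note: bytes is raw decoder output rather than an encoded image, so decodeByteArray() normally returns null here and the code falls into the catch block below.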
mPixelBuf.rewind();
bmp.copyPixelsFromBuffer(mPixelBuf);
String path = Environment.getExternalStorageDirectory() + "/bitmap/" + i + ".png";
FileOutputStream fileOutputStream = null;
try {
fileOutputStream = new FileOutputStream(path);
bmp.compress(CompressFormat.PNG, 90, fileOutputStream);
bmp.recycle();
Log.i(TAG, "i: " + i);
} catch (Exception e) {
Log.i(TAG, "Error: " + i);
e.printStackTrace();
}
finally {
if (fileOutputStream != null) {
fileOutputStream.close();
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
}

private class PlayerThread extends Thread {

private MediaExtractor extractor;
private MediaCodec mediaCodec;
private Surface surface;

public PlayerThread(Surface surface) {
this.surface = surface;
}

@Override
public void run() {
extractor = new MediaExtractor();
try {
extractor.setDataSource(file_path);
} catch (IOException e1) {
Log.i(TAG, "Error");
e1.printStackTrace();
}

for (int i = 0; i < extractor.getTrackCount(); i++) {
MediaFormat format = extractor.getTrackFormat(i);
String mime = format.getString(MediaFormat.KEY_MIME);
if (mime.startsWith("video/")) {
extractor.selectTrack(i);
mediaCodec = MediaCodec.createDecoderByType(mime);
mediaCodec.configure(format, surface, null, 0);
break;
}
}

if (mediaCodec == null) {
Log.e(TAG, "Can't find video info!");
return;
}

mediaCodec.start();

ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
BufferInfo info = new BufferInfo();
boolean isEOS = false;
long startMs = System.currentTimeMillis();

int i = 0;
while (!Thread.interrupted()) {
if (!isEOS) {
int inIndex = mediaCodec.dequeueInputBuffer(10000);
if (inIndex >= 0) {
ByteBuffer buffer = inputBuffers[inIndex];
int sampleSize = extractor.readSampleData(buffer, 0);
if (sampleSize < 0) {
// We shouldn't stop the playback at this point, just pass the EOS
// flag to mediaCodec, we will get it again from the dequeueOutputBuffer
Log.d(TAG, "InputBuffer BUFFER_FLAG_END_OF_STREAM");
mediaCodec.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
isEOS = true;
} else {
mediaCodec.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
extractor.advance();
}
}
}

int outIndex = mediaCodec.dequeueOutputBuffer(info, 100000);

switch (outIndex) {
case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
Log.d(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
outputBuffers = mediaCodec.getOutputBuffers();
break;
case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
Log.d(TAG,"New format " + mediaCodec.getOutputFormat());
break;
case MediaCodec.INFO_TRY_AGAIN_LATER:
Log.d(TAG, "dequeueOutputBuffer timed out!");
break;
default:
ByteBuffer buffer = outputBuffers[outIndex];
Log.v(TAG,"We can't use this buffer but render it due to the API limit, " + buffer);

// We use a very simple clock to keep the video FPS, or the video
// playback will be too fast
while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
try {
sleep(10);
} catch (InterruptedException e) {
e.printStackTrace();
break;
}
}
mediaCodec.releaseOutputBuffer(outIndex, true);
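// Note: per the MediaCodec contract, an output buffer must not be used after releaseOutputBuffer(), so the reads from outputBuffers[outIndex] below are already invalid.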

/* saves frame to SDcard */
mPixelBuf.rewind();
GLES20.glReadPixels(0, 0, 640, 480, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, mPixelBuf);
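// glReadPixels() requires a current EGL/GLES context, which this thread never creates, so mPixelBuf never actually receives frame pixels.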
try {
ByteBuffer outByteBuffer = outputBuffers[outIndex];
outByteBuffer.position(info.offset);
outByteBuffer.limit(info.offset + info.size);  // info.offset and info.size are always 0 here, so every saved .png ends up 0 KB.
outByteBuffer.limit(2);
byte[] dst = new byte[outByteBuffer.capacity()];
outByteBuffer.get(dst);
writeFrameToSDCard(dst, i, dst.length);
i++;
} catch (Exception e) {
Log.d(TAG, "Error while creating bitmap with: " + e.getMessage());
}
break;
}

// All decoded frames have been rendered, we can stop playing now
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
Log.d(TAG,    "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
break;
}
}

mediaCodec.stop();
mediaCodec.release();
extractor.release();
}
}
}


The saved images are always 0 bytes for a reason much like method 5 below: because the decoder was configured with an output Surface, the decoded pixels are sent straight to that Surface, and the ByteBuffers returned by dequeueOutputBuffer carry no pixel data (hence info.size of 0). A sketch of the usual workaround follows.
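If the goal is to read decoded pixels directly, the usual workaround is to configure the codec without an output Surface (mediaCodec.configure(format, null, null, 0)); dequeueOutputBuffer then returns buffers that actually contain YUV data. The helper below is only a sketch of that idea, not what I shipped: the method name is mine, and it assumes the decoder emits an NV21-compatible layout, whereas real devices often report other color formats (e.g. COLOR_FormatYUV420Planar) that need their own conversion. It requires android.graphics.{ImageFormat, Rect, YuvImage} in addition to the classes already used above.

// Hypothetical helper, called from the decode loop in place of writeFrameToSDCard()
// when the codec was configured with a null Surface.
private void saveYuvFrame(MediaCodec codec, ByteBuffer buffer, MediaCodec.BufferInfo info, String path) throws IOException {
    // Copy the decoded YUV bytes out of the output buffer.
    byte[] yuv = new byte[info.size];
    buffer.position(info.offset);
    buffer.limit(info.offset + info.size);
    buffer.get(yuv);

    // Frame dimensions as reported by the decoder.
    MediaFormat fmt = codec.getOutputFormat();
    int width = fmt.getInteger(MediaFormat.KEY_WIDTH);
    int height = fmt.getInteger(MediaFormat.KEY_HEIGHT);

    // Assumes an NV21-compatible layout (device-dependent!).
    YuvImage image = new YuvImage(yuv, ImageFormat.NV21, width, height, null);
    FileOutputStream out = new FileOutputStream(path);
    try {
        image.compressToJpeg(new Rect(0, 0, width, height), 90, out);
    } finally {
        out.close();
    }
}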

If you want to dig deeper into this approach, see:
http://stackoverflow.com/questions/19754547/mediacodec-get-all-frames-from-video
http://stackoverflow.com/questions/23321880/how-to-get-bitmap-frames-from-video-using-mediacodec
 

4. JCodec
http://jcodec.org/
jcodec-samples-0.1.7.apk

Decoding the video file with JCodec was slow and performance was poor: on average it took about 1.5 s to extract a single frame, which does not meet the requirement.

 

5. Play the video with VideoView and capture the view with getDrawingCache

VideoView is the video playback widget that Android provides, and it is very easy to use. Here getDrawingCache is used to capture the widget's rendered view.

However, the drawing cache only captures the non-video part of the screen; the small window where the video plays stays black. The reason is that the Activity's UI goes through the framebuffer, while the video is hardware-decoded and pushed directly to the display output. Hardware decoding never writes into the framebuffer, so reading /dev/graphics/fb0 yields a black rectangle where the video is.

public class MoviePlayerActivity extends Activity implements OnTouchListener, OnClickListener, Runnable {

private static final String TAG = "Image";
private String file_path;
private VideoView videoView;
private Button movie_play;
private boolean playButtonVisible;
private boolean playPause;

@Override
protected void onCreate(Bundle savedInstanceState) {
// full screen
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
// no title
requestWindowFeature(Window.FEATURE_NO_TITLE);
// landscape or horizontal screen
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

super.onCreate(savedInstanceState);
setContentView(R.layout.movie_player_activity);
movie_play = (Button) findViewById(R.id.movie_videoview_play);
movie_play.setOnClickListener(this);
movie_play.setText("Play");

Intent intent = getIntent();
file_path = intent.getStringExtra("file_path");

videoView = (VideoView) findViewById(R.id.movie_palyer_videoview);
videoView.setMediaController(null);
// videoView.setMediaController(new MediaController(this));
videoView.setVideoPath(file_path);
videoView.start();
videoView.requestFocus();

Thread screenShootThread = new Thread(this);
screenShootThread.start();
videoView.setOnTouchListener(this);
}

@Override
public void run() {
// Take screenshots automatically in the background while the video plays; note that the view passed to screenShot() is videoView.
for (int i = 10000 * 1000; i < 20 * 1000 * 1000; i += 500 * 1000) {
int nowTime = videoView.getCurrentPosition();
try {
screenShot(videoView, i);
} catch (Exception e1) {
Log.i(TAG, "Error: screenShot. ");
e1.printStackTrace();
}

try {
Thread.sleep(500);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}

@Override
public boolean onTouch(View v, MotionEvent event) {
if (!playButtonVisible) {
movie_play.setVisibility(View.VISIBLE);
movie_play.setEnabled(true);
} else {
movie_play.setVisibility(View.INVISIBLE);
}
playButtonVisible = !playButtonVisible;

return false;
}

@Override
public void onClick(View view) {
switch (view.getId()) {
case R.id.movie_videoview_play:
if (!playPause) {
movie_play.setText("Pause");
} else {
movie_play.setText("Play");
}
playPause = !playPause;

int nowTime = videoView.getCurrentPosition();
Log.i(TAG, "nowTime: " + nowTime);
try {
screenShot(videoView, nowTime);  // Take a screenshot when the button is clicked; again the view passed in is videoView.
} catch (Exception e) {
e.printStackTrace();
}

break;
default:
break;
}
}

public void screenShot(View view, int nowTime) throws Exception {

view.measure(MeasureSpec.makeMeasureSpec(0, MeasureSpec.UNSPECIFIED), MeasureSpec.makeMeasureSpec(0, MeasureSpec.UNSPECIFIED));
view.layout(0, 0, view.getMeasuredWidth(), view.getMeasuredHeight());
view.setDrawingCacheEnabled(true);
view.buildDrawingCache();
Bitmap bitmap = Bitmap.createBitmap(view.getDrawingCache());
String path = Environment.getExternalStorageDirectory() + "/bitmap/" + nowTime + ".png";
FileOutputStream fileOutputStream = null;
try {
fileOutputStream = new FileOutputStream(path);
bitmap.compress(Bitmap.CompressFormat.PNG, 90, fileOutputStream);
Log.i(TAG, "i: " + nowTime);
} catch (Exception e) {
Log.i(TAG, "Error: " + nowTime);
e.printStackTrace();
} finally {
if (fileOutputStream != null) {
fileOutputStream.close();
}
}
bitmap.recycle();
view.setDrawingCacheEnabled(false);
}
}


The drawing cache works normally for other views such as Buttons; only the video region comes out black.

 

6. VideoView for playback, MediaMetadataRetriever for frame grabs (this is the one!)

This combination grabs frames correctly and without duplicates: it returns the frame at the requested time rather than only key frames. If the bitmap is not saved as an image file (.png/.jpg), a grab takes at most about 0.4 s, which basically meets the requirement.

Reference: http://yashirocc.blog.sohu.com/175636801.html

public class MoviePlayerActivity extends Activity implements OnTouchListener, OnClickListener, Runnable {

private static final String TAG = "ImageLight";
private String file_path;
private VideoView videoView;
private Button movie_play;
private boolean playButtonVisible;
private boolean playPause;

@Override
protected void onCreate(Bundle savedInstanceState) {
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);    // full screen
requestWindowFeature(Window.FEATURE_NO_TITLE);                            // no title
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);        // landscape or horizontal screen

super.onCreate(savedInstanceState);
setContentView(R.layout.movie_player_activity);
movie_play = (Button) findViewById(R.id.movie_videoview_play);
movie_play.setOnClickListener(this);
movie_play.setText("Play");

Intent intent = getIntent();
file_path = intent.getStringExtra("file_path");

videoView = (VideoView) findViewById(R.id.movie_palyer_videoview);
videoView.setMediaController(null);
// videoView.setMediaController(new MediaController(this));
videoView.setVideoPath(file_path);
videoView.start();
videoView.requestFocus();

Thread screenShootThread = new Thread(this);
screenShootThread.start();
videoView.setOnTouchListener(this);
}

@Override
public void run() {
MediaMetadataRetriever metadataRetriever = new MediaMetadataRetriever();
metadataRetriever.setDataSource(file_path);

for (int i = 40000 * 1000; i < 50 * 1000 * 1000; i += 500 * 1000) {
//            try {
//                Thread.sleep(500);
//            } catch (InterruptedException e) {
//                e.printStackTrace();
//            }
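// Grab the frame at the current playback position (ms -> µs); the loop index i only paces the loop and names the output file.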
Bitmap bitmap = metadataRetriever.
getFrameAtTime(videoView.getCurrentPosition()*1000, MediaMetadataRetriever.OPTION_CLOSEST);
Log.i(TAG, "bitmap---i: " + i/1000);

String path = Environment.getExternalStorageDirectory() + "/bitmap/" + i + ".png";
FileOutputStream fileOutputStream = null;
try {
fileOutputStream = new FileOutputStream(path);
bitmap.compress(Bitmap.CompressFormat.PNG, 90, fileOutputStream);
Log.i(TAG, "i: " + i/1000);
} catch (Exception e) {
Log.i(TAG, "Error: " + i/1000);
e.printStackTrace();
} finally {
if (fileOutputStream != null) {
try {
fileOutputStream.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
bitmap.recycle();
}
}

@Override
public boolean onTouch(View v, MotionEvent event) {
if (!playButtonVisible) {
movie_play.setVisibility(View.VISIBLE);
movie_play.setEnabled(true);
} else {
movie_play.setVisibility(View.INVISIBLE);
}
playButtonVisible = !playButtonVisible;
return false;
}

@Override
public void onClick(View view) {
switch (view.getId()) {
case R.id.movie_videoview_play:
if (!playPause) {
movie_play.setText("Pause");
} else {
movie_play.setText("Play");
}
playPause = !playPause;
break;
default:
break;
}
}
}