
MediaCodec example

2015-06-09 14:28


MediaCodec and camera: color space incorrect




By referring to Aegonis's work 1 and work 2, I also got an H.264 stream, but the color is not correct. I am using an HTC Butterfly for development. Here is part of my code:

Camera:
parameters.setPreviewSize(width, height);
parameters.setPreviewFormat(ImageFormat.YV12);
parameters.setPreviewFrameRate(frameRate);


MediaCodec:
mediaCodec = MediaCodec.createEncoderByType("video/avc");
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 320, 240);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mediaCodec.start();


When using COLOR_FormatYUV420Planar the error shows "[OMX.qcom.video.encoder.avc] does not support color format 19", so I can only use COLOR_FormatYUV420SemiPlanar.
Does anyone know why this format is not supported?

Got it, by using:
MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
for (int format : capabilities.colorFormats) {
    Log.i(TAG, "Supported color format: " + format);
}


we can see that the device reports color formats 21 (COLOR_FormatYUV420SemiPlanar) and 2130708361 (COLOR_FormatSurface, which had no named constant at the time). I think the reported formats will vary from device to device.
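The enumeration above can be folded into a small selection helper. The helper name `selectColorFormat` and the preference order are illustrative assumptions, not part of the original code; the numeric constants match `MediaCodecInfo.CodecCapabilities`:

```java
// Hypothetical helper: pick the first enumerated color format that the
// conversion code below knows how to produce. Preferring semi-planar (21)
// over planar (19) is an assumption made for this sketch.
public class ColorFormatSelector {
    static final int COLOR_FormatYUV420Planar = 19;
    static final int COLOR_FormatYUV420SemiPlanar = 21;

    static int selectColorFormat(int[] supported) {
        for (int format : supported) {
            if (format == COLOR_FormatYUV420SemiPlanar
                    || format == COLOR_FormatYUV420Planar) {
                return format;
            }
        }
        return 0; // nothing usable; the caller must handle this
    }

    public static void main(String[] args) {
        // On the HTC Butterfly the reported list was {21, 2130708361}.
        System.out.println(selectColorFormat(new int[]{21, 2130708361}));
    }
}
```

On a device such as the one above, this would pick 21; on the encoders that do accept format 19, it would fall back to planar.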

Then, I tried the color transforms suggested in work 1 and work 2:
public static byte[] YV12toYUV420PackedSemiPlanar(final byte[] input, final byte[] output, final int width, final int height) {
    /*
     * COLOR_TI_FormatYUV420PackedSemiPlanar is NV12.
     * We convert by putting the corresponding U and V bytes together (interleaved).
     */
    final int frameSize = width * height;
    final int qFrameSize = frameSize / 4;

    System.arraycopy(input, 0, output, 0, frameSize); // Y

    for (int i = 0; i < qFrameSize; i++) {
        output[frameSize + i*2] = input[frameSize + i + qFrameSize]; // Cb (U)
        output[frameSize + i*2 + 1] = input[frameSize + i]; // Cr (V)
    }
    return output;
}

public static byte[] YV12toYUV420Planar(byte[] input, byte[] output, int width, int height) {
    /*
     * COLOR_FormatYUV420Planar is I420, which is like YV12 but with U and V reversed.
     * So we just have to swap the U and V planes.
     */
    final int frameSize = width * height;
    final int qFrameSize = frameSize / 4;

    System.arraycopy(input, 0, output, 0, frameSize); // Y
    System.arraycopy(input, frameSize, output, frameSize + qFrameSize, qFrameSize); // Cr (V)
    System.arraycopy(input, frameSize + qFrameSize, output, frameSize, qFrameSize); // Cb (U)

    return output;
}

public static byte[] swapYV12toI420(byte[] yv12bytes, int width, int height) {
    byte[] i420bytes = new byte[yv12bytes.length];
    for (int i = 0; i < width*height; i++)
        i420bytes[i] = yv12bytes[i];
    for (int i = width*height; i < width*height + (width/2*height/2); i++)
        i420bytes[i] = yv12bytes[i + (width/2*height/2)];
    for (int i = width*height + (width/2*height/2); i < width*height + 2*(width/2*height/2); i++)
        i420bytes[i] = yv12bytes[i - (width/2*height/2)];
    return i420bytes;
}


Obviously, the YV12toYUV420PackedSemiPlanar transform performs better than the other two. It is relatively better, but the result still looks different from the real colors. Is there something wrong with my code? Any comments would be appreciated.
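For reference, the chroma interleave in YV12toYUV420PackedSemiPlanar can be sanity-checked on a tiny frame. This sketch duplicates the function from the question (minus the Android context) and runs it on a hypothetical 4x2 YV12 buffer, where YV12 stores the Y plane, then the V (Cr) plane, then the U (Cb) plane:

```java
// Standalone copy of the question's YV12 -> NV12 interleave, for a quick check.
public class Yv12Check {
    public static byte[] YV12toYUV420PackedSemiPlanar(byte[] input, byte[] output, int width, int height) {
        int frameSize = width * height;
        int qFrameSize = frameSize / 4;
        System.arraycopy(input, 0, output, 0, frameSize); // Y plane unchanged
        for (int i = 0; i < qFrameSize; i++) {
            output[frameSize + i*2] = input[frameSize + qFrameSize + i]; // Cb (U)
            output[frameSize + i*2 + 1] = input[frameSize + i];          // Cr (V)
        }
        return output;
    }

    public static void main(String[] args) {
        // 4x2 frame: 8 Y bytes, then 2 V bytes {100,101}, then 2 U bytes {50,51}.
        byte[] yv12 = {0, 1, 2, 3, 4, 5, 6, 7, 100, 101, 50, 51};
        byte[] nv12 = YV12toYUV420PackedSemiPlanar(yv12, new byte[12], 4, 2);
        // NV12 chroma should come out interleaved as U0,V0,U1,V1 = 50,100,51,101.
        System.out.println(nv12[8] + "," + nv12[9] + "," + nv12[10] + "," + nv12[11]);
    }
}
```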



android colors h.264

asked Apr 1 '13, edited Apr 2 '13 – Albert

+1 for writing the question in such a way. – Ameer Moaaviah Apr 1 '13
"Different" like the chroma channels are backward, or "different" like things are subtly off? (If you rearrange YV12toYUV420PackedSemiPlanar to swap the Cb/Cr channels, does it look right?) – fadden Apr 2 '13
I tried swapping Cb/Cr and the color is still incorrect. I also tried showing only the Y channel, and the video looks as if a green mask has been put over it, which is not what I expect. I really cannot figure out what is happening. – Albert Apr 2 '13
If it's relatively subtle it could be a gamut problem. BT.601 says the Y channel should go from 16 to 235, but maybe the camera is outputting 0-255? If that's the issue you could try scaling it (set it to Y*(219/255)+16 as you copy it). – fadden Apr 2 '13
Hi fadden, thanks for your suggestion. I tested it, but the color still seems incorrect. I also observed that the pixel values vary from -128 to 127 for Y, Cb and Cr; I thought pixel values should be between 0 and 255. – Albert Apr 3 '13
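The -128..127 range in the last comment is not a camera problem: Java's `byte` type is signed, so raw YUV bytes print as negative values. Masking with 0xFF recovers the unsigned 0..255 value the camera actually produced, as this small demo shows:

```java
// Java bytes are signed two's-complement, so a pixel value of 235 stored in
// a byte prints as -21. Masking with 0xFF recovers the unsigned value.
public class SignedByteDemo {
    public static void main(String[] args) {
        byte pixel = (byte) 235;          // bit pattern 0xEB, stored as -21
        System.out.println(pixel);        // prints -21
        System.out.println(pixel & 0xFF); // prints 235
    }
}
```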


2 Answers

Got it, now the color looks good; this test is based on the HTC Butterfly. With the resolution set to 320x240, the color transform should look like:
System.arraycopy(input, 0, output, 0, frameSize);
for (int i = 0; i < qFrameSize; i++) {
    output[frameSize + i*2] = input[frameSize + qFrameSize + i - 32 - 320];
    output[frameSize + i*2 + 1] = input[frameSize + i - 32 - 320];
}


For resolutions of 640x480 and above:
System.arraycopy(input, 0, output, 0, frameSize);
for (int i = 0; i < qFrameSize; i++) {
    output[frameSize + i*2] = input[frameSize + qFrameSize + i];
    output[frameSize + i*2 + 1] = input[frameSize + i];
}


For the frame rate issue, we can use getSupportedPreviewFpsRange() to check the supported frame rate ranges of the device:
List<int[]> fpsRange = parameters.getSupportedPreviewFpsRange();
for (int[] range : fpsRange) {
    System.out.println(Arrays.toString(range));
}
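Rather than hard-coding one of the printed ranges, the choice can be made programmatically. This sketch is an assumption, not part of the original answer: `pickRange` picks the narrowest supported range containing the desired rate (Camera.Parameters reports fps values scaled by 1000), which on this device would select the ~30 fps range instead of the wide 4000-60000 one that caused fast playback:

```java
import java.util.Arrays;
import java.util.List;

public class FpsRangePicker {
    // Hypothetical helper: pick the narrowest supported range that contains
    // the desired rate (in fps * 1000, matching Camera.Parameters units).
    static int[] pickRange(List<int[]> ranges, int desiredFps) {
        int[] best = null;
        for (int[] r : ranges) {
            if (r[0] <= desiredFps && desiredFps <= r[1]
                    && (best == null || r[1] - r[0] < best[1] - best[0])) {
                best = r;
            }
        }
        return best; // null if no range contains the desired rate
    }

    public static void main(String[] args) {
        List<int[]> supported = Arrays.asList(
                new int[]{4000, 60000},    // wide range: caused fast playback
                new int[]{29000, 30000});  // narrow ~30 fps range
        System.out.println(Arrays.toString(pickRange(supported, 30000)));
    }
}
```

The selected pair would then be passed to `parameters.setPreviewFpsRange(range[0], range[1])`.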


And the following setting works correctly when playing the encoded H.264 elementary stream:
parameters.setPreviewFpsRange(29000, 30000);
//parameters.setPreviewFpsRange(4000, 60000); // this one results in fast playback when I use the front camera


answered Apr 10 '13, edited Apr 30 '13 – Albert

Thanks for your answer, Albert; great research. I am wondering why there is -352 bytes of padding when writing the U and V components at 320x240 resolution? And which padding should I use for 960x720? – Andrey Chernih Dec 6 '13
Related question: stackoverflow.com/questions/17493169/… It seems that when sending a frame to the encoder, the Y plane should be aligned to some number of bytes for certain resolutions. – Andrey Chernih Dec 19 '13


After reading this discussion, it turns out that a more generalised way to encode frames of various resolutions is to align the chroma plane to a 2048-byte boundary before sending the frame to MediaCodec. This applies to the Qualcomm (OMX.qcom.video.encoder.avc) encoder, which I believe the HTC Butterfly has, but it still does not work well for all resolutions: 720x480 and 176x144 still have the chroma plane misaligned in the output video. Also, avoid resolutions whose dimensions are not divisible by 16.

The transformation is pretty simple:
int padding = 0;
if (mediaCodecInfo.getName().contains("OMX.qcom")) {
    padding = (width * height) % 2048;
}
byte[] inputFrameBuffer = new byte[frame.length];
byte[] inputFrameBufferWithPadding = new byte[padding + frame.length];

ColorHelper.NV21toNV12(frame, inputFrameBuffer, width, height);
// copy Y plane
System.arraycopy(inputFrameBuffer, 0, inputFrameBufferWithPadding, 0, inputFrameBuffer.length);
int offset = width * height;
// copy U and V planes shifted by <padding> bytes
System.arraycopy(inputFrameBuffer, offset, inputFrameBufferWithPadding, offset + padding, inputFrameBuffer.length - offset);
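`ColorHelper.NV21toNV12` is the poster's own helper and its body is not shown. A minimal sketch of what it presumably does, given that NV21 and NV12 differ only in the byte order of each chroma pair (NV21 interleaves V,U while NV12 interleaves U,V):

```java
// Hypothetical reconstruction of the poster's ColorHelper.NV21toNV12:
// the Y plane is copied unchanged and each V,U chroma pair is swapped to U,V.
public class ColorHelper {
    public static void NV21toNV12(byte[] nv21, byte[] nv12, int width, int height) {
        int frameSize = width * height;
        System.arraycopy(nv21, 0, nv12, 0, frameSize); // Y plane is identical
        for (int i = frameSize; i + 1 < nv21.length; i += 2) {
            nv12[i] = nv21[i + 1];     // U (stored second in NV21)
            nv12[i + 1] = nv21[i];     // V (stored first in NV21)
        }
    }

    public static void main(String[] args) {
        // 2x2 frame: 4 Y bytes, then interleaved chroma V,U,V,U.
        byte[] nv21 = {0, 1, 2, 3, 9, 7, 11, 13};
        byte[] nv12 = new byte[8];
        NV21toNV12(nv21, nv12, 2, 2);
        System.out.println(java.util.Arrays.toString(nv12));
    }
}
```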


answered Dec 19 '13 – Andrey Chernih
