Recording MP4 with Camera, AudioRecord, MediaCodec, and MediaMuxer

I. Introduction

The two earlier articles, "AAC audio encoding, saving, and decoded playback" and "Camera video capture with H264 encoding and saving", covered how to record AAC audio with AudioRecord and MediaCodec, and how to record H264 video with Camera and MediaCodec. This article describes how to mux those two streams into an MP4 file with MediaMuxer.

MP4

As introduced in the audio/video development basics article, MP4 (also called MPEG-4) is a standard digital multimedia container format that can hold both audio and video data. The video codec is commonly H264 or H265; the audio codec is usually AAC.

MediaMuxer

MediaMuxer is the Android API for producing a single file that muxes audio and video. It only supports the following output formats:

public static final int MUXER_OUTPUT_3GPP = 2;
public static final int MUXER_OUTPUT_HEIF = 3;
public static final int MUXER_OUTPUT_MPEG_4 = 0;
public static final int MUXER_OUTPUT_OGG = 4;
public static final int MUXER_OUTPUT_WEBM = 1;
1. Initialization
mMediaMuxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

path is the output path of the MP4 file.

2. Add the audio track and video track
if (type == AAC_ENCODER) {
    mAudioTrackIndex = mMediaMuxer.addTrack(mediaFormat);
}
if (type == H264_ENCODER) {
    mVideoTrackIndex = mMediaMuxer.addTrack(mediaFormat);
}

The MediaFormat object passed in is obtained from the MediaCodec encoder, as sketched below.
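
The safest format to hand to addTrack is the one the encoder reports after it has started, via MediaCodec.INFO_OUTPUT_FORMAT_CHANGED, because that format already carries the codec-specific data (csd-0/csd-1: SPS/PPS for H264, AudioSpecificConfig for AAC). A minimal sketch, with the mediaCodec and bufferInfo variable names assumed:

int outputIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 10_000);
if (outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    MediaFormat format = mediaCodec.getOutputFormat();
    // Once per track, and only before mMediaMuxer.start() is called.
    int trackIndex = mMediaMuxer.addTrack(format);
}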

3. Start muxing
mMediaMuxer.start();
4. Write sample data
mMediaMuxer.writeSampleData(avData.trackIndex, avData.byteBuffer, avData.bufferInfo);
5. Stop and release resources
mMediaMuxer.stop();
mMediaMuxer.release();
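
Putting the five steps together, here is a small self-contained sketch (not from the article) that exercises the full MediaMuxer lifecycle by remuxing the tracks of an existing MP4 with MediaExtractor. The recording pipeline below makes the same calls, only with encoder output instead of extractor output:

import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaMuxer;
import java.nio.ByteBuffer;

public class MuxerLifecycleDemo {
    public static void remux(String inputPath, String outputPath) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(inputPath);
        // 1. Initialization
        MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

        // 2. Add every input track and remember the input -> output index mapping.
        int trackCount = extractor.getTrackCount();
        int[] trackMap = new int[trackCount];
        for (int i = 0; i < trackCount; i++) {
            extractor.selectTrack(i);
            trackMap[i] = muxer.addTrack(extractor.getTrackFormat(i));
        }

        // 3. start() may only be called after all tracks have been added.
        muxer.start();

        // 4. Copy samples across.
        ByteBuffer buffer = ByteBuffer.allocate(1 << 20);
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            info.size = extractor.readSampleData(buffer, 0);
            if (info.size < 0) {
                break;
            }
            info.offset = 0;
            info.presentationTimeUs = extractor.getSampleTime();
            info.flags = (extractor.getSampleFlags() & MediaExtractor.SAMPLE_FLAG_SYNC) != 0
                    ? MediaCodec.BUFFER_FLAG_KEY_FRAME : 0;
            muxer.writeSampleData(trackMap[extractor.getSampleTrackIndex()], buffer, info);
            extractor.advance();
        }

        // 5. Stop and release.
        muxer.stop();
        muxer.release();
        extractor.release();
    }
}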

II. Recording MP4

The flow of recording MP4 with AudioRecord, Camera, MediaCodec, and MediaMuxer is shown in the figure below:


[Figure: MP4 recording flow — AudioRecord/Camera capture → MediaCodec encoders → MediaMuxer]
1. Audio capture

Audio is captured with AudioRecord:


public class AudioRecorder {

    private int mAudioSource;
    private int mSampleRateInHz;
    private int mChannelConfig;
    private int mAudioFormat;
    private int mBufferSizeInBytes;

    private AudioRecord mAudioRecord;
    private volatile boolean mIsRecording;
    private Callback mCallback;
    private byte[] mBuffer;

    public void setCallback(Callback callback) {
        mCallback = callback;
    }

    public interface Callback {
        void onAudioOutput(byte[] data);
    }

    public AudioRecorder(int audioSource, int sampleRateInHz, int channelConfig, int audioFormat, int bufferSizeInBytes) {
        mAudioSource = audioSource;
        mSampleRateInHz = sampleRateInHz;
        mChannelConfig = channelConfig;
        mAudioFormat = audioFormat;
        mBufferSizeInBytes = bufferSizeInBytes;

        mAudioRecord = new AudioRecord(audioSource, sampleRateInHz, channelConfig, audioFormat, bufferSizeInBytes);
        mIsRecording = false;
        int minBufferSize = AudioRecord.getMinBufferSize(sampleRateInHz, channelConfig, audioFormat);

        // Read PCM in chunks of at most 2048 bytes per callback.
        mBuffer = new byte[Math.min(2048, minBufferSize)];
    }

    public void start() {
        if (mIsRecording) {
            return;
        }
        new Thread(new Runnable() {
            @Override
            public void run() {
                onStart();
            }
        }).start();
    }

    public void onStart() {
        if (mAudioRecord == null) {
            mAudioRecord = new android.media.AudioRecord(mAudioSource, mSampleRateInHz, mChannelConfig, mAudioFormat, mBufferSizeInBytes);
        }
        mAudioRecord.startRecording();
        mIsRecording = true;
        while (mIsRecording) {
            int len = mAudioRecord.read(mBuffer, 0, mBuffer.length);
            if (len > 0 && mCallback != null) {
                // Hand out a copy: mBuffer is reused on the next read(), so the
                // consumer (the encoder's queue) must not keep a reference to it.
                mCallback.onAudioOutput(Arrays.copyOf(mBuffer, len));
            }
        }
        mAudioRecord.stop();
        mAudioRecord.release();
        mAudioRecord = null;
    }

    public void stop() {
        mIsRecording = false;
    }
}
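
A quick usage sketch (the direct wiring into the AAC encoder is an assumption; the article actually connects them inside its AacAudioRecord class), assuming the RECORD_AUDIO permission has already been granted:

int bufferSize = AudioRecord.getMinBufferSize(44100,
        AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecorder recorder = new AudioRecorder(MediaRecorder.AudioSource.MIC,
        44100, AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
recorder.setCallback(new AudioRecorder.Callback() {
    @Override
    public void onAudioOutput(byte[] data) {
        aacEncoder.queueData(data); // aacEncoder: the AacEncoder from the next section
    }
});
recorder.start();
// ... later
recorder.stop();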

2. Audio encoding

Audio encoding buffers the PCM data in a BlockingQueue and encodes it to AAC.


public class AacEncoder {
    public static final int AAC_ENCODER = 2;
    private MediaCodec mAudioEncoder;
    private MediaFormat mMediaFormat;
    private BlockingQueue<byte[]> mDataQueue;
    private volatile boolean mIsEncoding;
    private Callback mCallback;

    private static final String AUDIO_MIME_TYPE = "audio/mp4a-latm";// i.e. AAC

    public void setCallback(Callback callback) {
        mCallback = callback;
    }

    public interface Callback {
        void outputMediaFormat(int type, MediaFormat mediaFormat);

        void onEncodeOutput(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo);

        void onStop(int type);
    }

    public AacEncoder(int sampleRateInHz, int channelConfig, int bufferSizeInBytes) {
        try {
            mAudioEncoder = MediaCodec.createEncoderByType(AUDIO_MIME_TYPE);
            mMediaFormat = MediaFormat.createAudioFormat(AUDIO_MIME_TYPE, sampleRateInHz, channelConfig == AudioFormat.CHANNEL_IN_MONO ? 1 : 2);
            mMediaFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
            mMediaFormat.setInteger(MediaFormat.KEY_CHANNEL_MASK, AudioFormat.CHANNEL_IN_STEREO);// CHANNEL_IN_STEREO = stereo
            // Parenthesize the ternary: without the parentheses the == binds first and bitRate becomes 1 or 2.
            int bitRate = sampleRateInHz * 16 * (channelConfig == AudioFormat.CHANNEL_IN_MONO ? 1 : 2);
            mMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
            mMediaFormat.setInteger(MediaFormat.KEY_CHANNEL_COUNT, channelConfig == AudioFormat.CHANNEL_IN_MONO ? 1 : 2);
            mMediaFormat.setInteger(MediaFormat.KEY_SAMPLE_RATE, sampleRateInHz);
            mAudioEncoder.configure(mMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        } catch (IOException e) {
            e.printStackTrace();
        }
        mDataQueue = new ArrayBlockingQueue<>(10);
        mIsEncoding = false;
    }

    public void start() {
        if (mIsEncoding) {
            return;
        }
        new Thread(new Runnable() {
            @Override
            public void run() {
                onStart();
            }
        }).start();
    }

    public void stop() {
        mIsEncoding = false;
    }

    private void onStart() {
        mIsEncoding = true;
        mAudioEncoder.start();
        byte[] pcmData;
        int inputIndex;
        ByteBuffer inputBuffer;
        ByteBuffer[] inputBuffers = mAudioEncoder.getInputBuffers();

        int outputIndex;
        ByteBuffer outputBuffer;
        ByteBuffer[] outputBuffers = mAudioEncoder.getOutputBuffers();

        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        while (mIsEncoding || !mDataQueue.isEmpty()) {
            pcmData = dequeueData();
            if (pcmData == null) {
                continue;
            }
            long pts = System.currentTimeMillis() * 1000 - AVTimer.getBaseTimestampUs();
            inputIndex = mAudioEncoder.dequeueInputBuffer(10_000);
            if (inputIndex >= 0) {
                inputBuffer = inputBuffers[inputIndex];
                inputBuffer.clear();
                inputBuffer.limit(pcmData.length);
                inputBuffer.put(pcmData);
                mAudioEncoder.queueInputBuffer(inputIndex, 0, pcmData.length, pts, 0);
            }

            outputIndex = mAudioEncoder.dequeueOutputBuffer(bufferInfo, 10_000);

            if (outputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                outputBuffers = mAudioEncoder.getOutputBuffers();
            } else if (outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                MediaFormat newFormat = mAudioEncoder.getOutputFormat();
                if (null != mCallback) {
                    mCallback.outputMediaFormat(AAC_ENCODER, newFormat);
                }
            }
            while (outputIndex >= 0) {
                outputBuffer = outputBuffers[outputIndex];
                if (mCallback != null) {
                    mCallback.onEncodeOutput(outputBuffer, bufferInfo);
                }
                mAudioEncoder.releaseOutputBuffer(outputIndex, false);
                outputIndex = mAudioEncoder.dequeueOutputBuffer(bufferInfo, 10_000);
            }
        }
        mAudioEncoder.stop();
        mAudioEncoder.release();
        mAudioEncoder = null;
        if (mCallback != null) {
            mCallback.onStop(AAC_ENCODER);
        }
    }

    private byte[] dequeueData() {
        if (mDataQueue.isEmpty()) {
            return null;
        }
        try {
            return mDataQueue.take();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return null;
    }

    public void queueData(byte[] data) {
        if (!mIsEncoding) {
            return;
        }
        try {
            mDataQueue.put(data);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
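
Both encoders derive their presentation timestamps from a shared recording start time via AVTimer.getBaseTimestampUs(). That class is not shown in the article; a minimal sketch of what it presumably looks like:

// Assumed helper: one base timestamp so audio and video PTS start near zero and share a clock.
public class AVTimer {
    private static volatile long sBaseTimestampUs;

    // Call once when recording starts.
    public static void start() {
        sBaseTimestampUs = System.currentTimeMillis() * 1000;
    }

    public static long getBaseTimestampUs() {
        return sBaseTimestampUs;
    }
}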

3. Video capture

Video is captured with the Camera API. Because Camera delivers frames in landscape orientation by default, they also have to be rotated and converted through YUVEngine.


public class H264VideoRecord implements CameraHelper.PreviewCallback, H264Encoder.Callback {

    private CameraHelper mCameraHelper;
    private H264Encoder mH264Encoder;

    private Callback mCallback;

    public void setCallback(Callback callback) {
        mCallback = callback;
    }

    public interface Callback {
        void outputMediaFormat(int type, MediaFormat mediaFormat);

        void outputVideo(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo);

        void onStop(int type);
    }

    public H264VideoRecord(Activity activity, SurfaceView surfaceView) {
        mCameraHelper = new CameraHelper(surfaceView, activity);
        mCameraHelper.setPreviewCallback(this);
    }

    public void start() {
        mH264Encoder.start();
    }

    public void stop() {
        mH264Encoder.stop();
        mCameraHelper.stop();
    }

    @Override
    public void onFrame(byte[] data) {
        mH264Encoder.queueData(data);
    }

    @Override
    public void outputMediaFormat(int type, MediaFormat mediaFormat) {
        if (mCallback == null) {
            return;
        }
        mCallback.outputMediaFormat(type, mediaFormat);
    }

    @Override
    public void onEncodeOutput(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        if (mCallback == null) {
            return;
        }
        mCallback.outputVideo(byteBuffer, bufferInfo);
    }

    @Override
    public void onStop(int type) {
        if (mCallback == null) {
            return;
        }
        mCallback.onStop(type);
    }

    @Override
    public void onOperate(int width, int height, int fps) {
        mH264Encoder = new H264Encoder(width, height, fps);
        mH264Encoder.setCallback(this);
    }
}


public class CameraHelper {

    private int mPreWidth;
    private int mPreHeight;
    private int mFrameRate;


    private Camera mCamera;
    private Camera.Size mPreviewSize;
    private Camera.Parameters mCameraParameters;
    private boolean mIsPreviewing = false;
    private Activity mContext;
    private SurfaceView mSurfaceView;
    private SurfaceHolder mSurfaceHolder;
    private CameraPreviewCallback mCameraPreviewCallback;
    private PreviewCallback mPreviewCallback;


    public void setPreviewCallback(PreviewCallback previewCallback) {
        mPreviewCallback = previewCallback;
    }

    public interface PreviewCallback {
        void onFrame(byte[] data);

        void onOperate(int width, int height, int fps);
    }

    public CameraHelper(SurfaceView surfaceView, Activity context) {
        mSurfaceView = surfaceView;
        mContext = context;
        mSurfaceView.setKeepScreenOn(true);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        mSurfaceHolder.addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(SurfaceHolder surfaceHolder) {
                doOpenCamera();
                doStartPreview(mContext, surfaceHolder);
            }

            @Override
            public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {

            }

            @Override
            public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
                if (mCamera == null) {
                    return;
                }
                mCamera.stopPreview();
                mCamera.release();
                mCamera = null;
            }
        });

    }

    private void doOpenCamera() {
        if (mCamera != null) {
            return;
        }
        mCamera = Camera.open();
    }

    private void doStartPreview(Activity activity, SurfaceHolder surfaceHolder) {
        if (mIsPreviewing) {
            return;
        }
        mContext = activity;
        setCameraDisplayOrientation(activity, Camera.CameraInfo.CAMERA_FACING_BACK);
        setCameraParameters(surfaceHolder);
        try {
            mCamera.setPreviewDisplay(surfaceHolder);
        } catch (IOException e) {
            e.printStackTrace();
        }
        mCamera.startPreview();
        mIsPreviewing = true;
        mPreviewCallback.onOperate(mPreWidth, mPreHeight, mFrameRate);

    }

    public void stop() {
        if (mCamera != null) {
            mCamera.setPreviewCallbackWithBuffer(null);
            if (mIsPreviewing) {
                mCamera.stopPreview();
            }
            mIsPreviewing = false;
            mCamera.release();
            mCamera = null;
        }
    }

    private void setCameraDisplayOrientation(Activity activity, int cameraId) {
        Camera.CameraInfo info = new Camera.CameraInfo();
        Camera.getCameraInfo(cameraId, info);
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        int degrees = 0;
        switch (rotation) {
            case Surface.ROTATION_0:
                degrees = 0;
                break;
            case Surface.ROTATION_90:
                degrees = 90;
                break;
            case Surface.ROTATION_180:
                degrees = 180;
                break;
            case Surface.ROTATION_270:
                degrees = 270;
                break;
        }
        int result = 0;
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            result = (info.orientation + degrees) % 360;
            result = (360 - result) % 360;
        } else {
            result = (info.orientation - degrees + 360) % 360;
        }
        mCamera.setDisplayOrientation(result);
    }

    private void setCameraParameters(SurfaceHolder surfaceHolder) {
        if (!mIsPreviewing && mCamera != null) {
            mCameraParameters = mCamera.getParameters();
            mCameraParameters.setPreviewFormat(ImageFormat.NV21);
            List<Camera.Size> supportedPreviewSizes = mCameraParameters.getSupportedPreviewSizes();
            Collections.sort(supportedPreviewSizes, new Comparator<Camera.Size>() {
                @Override
                public int compare(Camera.Size o1, Camera.Size o2) {
                    Integer left = o1.width;
                    Integer right = o2.width;
                    return left.compareTo(right);
                }
            });

            DisplayMetrics displayMetrics = mContext.getResources().getDisplayMetrics();
            for (Camera.Size size : supportedPreviewSizes) {
                if (size.width >= displayMetrics.heightPixels && size.height >= displayMetrics.widthPixels) {
                    if ((1.0f * size.width / size.height) == (1.0f * displayMetrics.heightPixels / displayMetrics.widthPixels)) {
                        mPreviewSize = size;
                        break;
                    }
                }
            }
            if (mPreviewSize != null) {
                mPreWidth = mPreviewSize.width;
                mPreHeight = mPreviewSize.height;
            } else {
                mPreWidth = 1280;
                mPreHeight = 720;
            }


            mCameraParameters.setPreviewSize(mPreWidth, mPreHeight);

            //set fps range.
            int defminFps = 0;
            int defmaxFps = 0;
            List<int[]> supportedPreviewFpsRange = mCameraParameters.getSupportedPreviewFpsRange();
            for (int[] fps : supportedPreviewFpsRange) {
                if (defminFps <= fps[PREVIEW_FPS_MIN_INDEX] && defmaxFps <= fps[PREVIEW_FPS_MAX_INDEX]) {
                    defminFps = fps[PREVIEW_FPS_MIN_INDEX];
                    defmaxFps = fps[PREVIEW_FPS_MAX_INDEX];
                }
            }
            // Set the camera preview fps range
            mCameraParameters.setPreviewFpsRange(defminFps, defmaxFps);
            mFrameRate = defmaxFps / 1000;
            surfaceHolder.setFixedSize(mPreWidth, mPreHeight);
            mCameraPreviewCallback = new CameraPreviewCallback();
            mCamera.addCallbackBuffer(new byte[mPreHeight * mPreWidth * 3 / 2]);
            mCamera.setPreviewCallbackWithBuffer(mCameraPreviewCallback);
            List<String> focusModes = mCameraParameters.getSupportedFocusModes();
            for (String focusMode : focusModes) {// pick a supported focus mode
                if (focusMode.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
                    mCameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
                } else if (focusMode.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
                    mCameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
                } else if (focusMode.contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
                    mCameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
                }
            }
            mCamera.setParameters(mCameraParameters);
        }
    }


    class CameraPreviewCallback implements Camera.PreviewCallback {
        private CameraPreviewCallback() {

        }

        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            if (!mIsPreviewing || mCamera == null) {
                return;
            }
            Camera.Size size = camera.getParameters().getPreviewSize();
            //The data delivered by this callback is the raw NV21 preview frame.
            //It is handed off to the encoder thread, where MediaCodec encodes it to H264.
            if (data != null) {
                if (mPreviewCallback != null) {
                    mPreviewCallback.onFrame(data);
                }
                camera.addCallbackBuffer(data);
            } else {
                camera.addCallbackBuffer(new byte[size.width * size.height * 3 / 2]);
            }
        }
    }
}

4. Video encoding

Video encoding buffers frames in a BlockingQueue and encodes them to H264.


public class H264Encoder {
    public static final String VIDEO_MIME_TYPE = "video/avc";// i.e. H264
    public static final int H264_ENCODER = 1;
    private MediaCodec mMediaCodec;
    private MediaFormat mMediaFormat;
    private BlockingQueue<byte[]> mQueue;
    private MediaCodecInfo mMediaCodecInfo;
    private int mColorFormat;
    private int mWidth;
    private int mHeight;
    private int mBitRate;
    private byte[] mYUVBuffer;
    private byte[] mRotatedYUVBuffer;

    private int[] mOutWidth;
    private int[] mOutHeight;
    private ExecutorService mExecutorService;
    private volatile boolean mIsEncoding;

    private Callback mCallback;

    public void setCallback(Callback callback) {
        mCallback = callback;
    }

    public interface Callback {
        void outputMediaFormat(int type, MediaFormat mediaFormat);

        void onEncodeOutput(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo);

        void onStop(int type);
    }

    public H264Encoder(int width, int height, int fps) {
        Log.e("eee", "w:" + width + "h:" + height + "fps:" + fps);
        mWidth = width;
        mHeight = height;
        mQueue = new LinkedBlockingQueue<>();
        mMediaCodecInfo = selectCodecInfo();
        mColorFormat = selectColorFormat(mMediaCodecInfo);
        mBitRate = (mWidth * mHeight * 3 / 2) * 8 * fps;
        // Width and height are swapped here because each frame is rotated 90 degrees before encoding.
        mMediaFormat = MediaFormat.createVideoFormat(VIDEO_MIME_TYPE, mHeight, mWidth);
        mMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, mBitRate);// without this line configure fails with "configureCodec returning error -38"
        mMediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, fps);
        mMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, mColorFormat);
        mMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

        try {
            mMediaCodec = MediaCodec.createByCodecName(mMediaCodecInfo.getName());
        } catch (IOException e) {
            e.printStackTrace();
        }
        mMediaCodec.configure(mMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mExecutorService = Executors.newFixedThreadPool(1);

        mYUVBuffer = new byte[YUVUtil.getYUVBuffer(width, height)];
        mRotatedYUVBuffer = new byte[YUVUtil.getYUVBuffer(width, height)];
        mOutWidth = new int[1];
        mOutHeight = new int[1];
        YUVEngine.startYUVEngine();
    }


    public void start() {
        if (mIsEncoding) {
            return;
        }


        mExecutorService.execute(new Runnable() {
            @Override
            public void run() {
                mIsEncoding = true;
                mMediaCodec.start();
                while (mIsEncoding) {
                    byte[] data = dequeueData();
                    if (data == null) {
                        continue;
                    }
                    encodeVideoData(data);
                }

                mMediaCodec.stop();
                mMediaCodec.release();
                if (mCallback != null) {
                    mCallback.onStop(H264_ENCODER);
                }

            }
        });
    }

    public void stop() {
        mIsEncoding = false;
    }

    private byte[] dequeueData() {
        if (mQueue.isEmpty()) {
            return null;
        }
        try {
            return mQueue.take();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return null;
    }

    public void queueData(byte[] data) {
        if (data == null || !mIsEncoding) {
            return;
        }
        try {
            mQueue.put(data);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

    private void encodeVideoData(byte[] data) {
        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        mRotatedYUVBuffer = transferFrameData(data, mYUVBuffer, mRotatedYUVBuffer);
        ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
        int inputIndex = mMediaCodec.dequeueInputBuffer(10_000);
        if (inputIndex >= 0) {
            ByteBuffer byteBuffer = inputBuffers[inputIndex];
            byteBuffer.clear();
            byteBuffer.put(mRotatedYUVBuffer);
            long pts = System.currentTimeMillis() * 1000 - AVTimer.getBaseTimestampUs();
            mMediaCodec.queueInputBuffer(inputIndex, 0, mRotatedYUVBuffer.length, pts, 0);
        }

        ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();
        int outputIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 10_000);
        if (outputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            outputBuffers = mMediaCodec.getOutputBuffers();
        } else if (outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            MediaFormat newFormat = mMediaCodec.getOutputFormat();
            if (null != mCallback) {
                mCallback.outputMediaFormat(H264_ENCODER, newFormat);
            }
        }
        while (outputIndex >= 0) {
            ByteBuffer byteBuffer = outputBuffers[outputIndex];
            if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                bufferInfo.size = 0;
            }
            if (bufferInfo.size != 0 && mCallback != null) {
//                boolean keyFrame = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0;
//                Log.i(TAG, "is key frame :%s"+keyFrame);
                mCallback.onEncodeOutput(byteBuffer, bufferInfo);
            }
            mMediaCodec.releaseOutputBuffer(outputIndex, false);
            outputIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 10_000);
        }
    }

    private byte[] transferFrameData(byte[] data, byte[] yuvBuffer, byte[] rotatedYuvBuffer) {
        //Camera delivers NV21 frames
        //Convert to a color format the MediaCodec encoder supports
        switch (mColorFormat) {
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:// corresponds to Camera preview format I420 (YUV420P)
                YUVEngine.Nv21ToI420(data, yuvBuffer, mWidth, mHeight);
                YUVEngine.I420ClockWiseRotate90(yuvBuffer, mWidth, mHeight, rotatedYuvBuffer, mOutWidth, mOutHeight);
                //Log.i("transferFrameData", "COLOR_FormatYUV420Planar");
                break;
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar: // corresponds to Camera preview format NV12
                YUVEngine.Nv21ToNv12(data, yuvBuffer, mWidth, mHeight);
                YUVEngine.Nv12ClockWiseRotate90(yuvBuffer, mWidth, mHeight, rotatedYuvBuffer, mOutWidth, mOutHeight);
                //Log.i("transferFrameData", "COLOR_FormatYUV420SemiPlanar");
                break;
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:// corresponds to Camera preview format NV21
                System.arraycopy(data, 0, yuvBuffer, 0, mWidth * mHeight * 3 / 2);
                YUVEngine.Nv21ClockWiseRotate90(yuvBuffer, mWidth, mHeight, rotatedYuvBuffer, mOutWidth, mOutHeight);
                //Log.i("transferFrameData", "COLOR_FormatYUV420PackedSemiPlanar");
                break;
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar: // corresponds to Camera preview format YV12
                YUVEngine.Nv21ToYV12(data, yuvBuffer, mWidth, mHeight);
                YUVEngine.Yv12ClockWiseRotate90(yuvBuffer, mWidth, mHeight, rotatedYuvBuffer, mOutWidth, mOutHeight);
                //Log.i("transferFrameData", "COLOR_FormatYUV420PackedPlanar");
                break;
        }
        return rotatedYuvBuffer;
    }

    private MediaCodecInfo selectCodecInfo() {
        int numCodecs = MediaCodecList.getCodecCount();
        for (int i = 0; i < numCodecs; i++) {
            MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(i);
            if (!codecInfo.isEncoder()) {
                continue;
            }
            String[] types = codecInfo.getSupportedTypes();
            for (int j = 0; j < types.length; j++) {
                if (types[j].equalsIgnoreCase(VIDEO_MIME_TYPE)) {
                    return codecInfo;
                }
            }
        }
        return null;
    }

    // Query a color format the encoder supports as input
    private int selectColorFormat(MediaCodecInfo codecInfo) {
        if (codecInfo == null) {
            return -1;
        }
        MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(VIDEO_MIME_TYPE);
        int[] colorFormats = capabilities.colorFormats;
        for (int i = 0; i < colorFormats.length; i++) {
            if (isRecognizedFormat(colorFormats[i])) {
                return colorFormats[i];
            }
        }
        return -1;
    }

    private boolean isRecognizedFormat(int colorFormat) {
        switch (colorFormat) {
            // these are the formats we know how to handle for this test
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:// Camera preview format I420 (YUV420P)
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar: // Camera preview format NV12
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:// Camera preview format NV21
            case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar: {// Camera preview format YV12
                return true;
            }
            default:
                return false;
        }
    }
}
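
YUVUtil.getYUVBuffer() is also not shown in the article; for 4:2:0 data (NV21/NV12/I420/YV12) it presumably just returns the size of one frame in bytes, i.e. width * height * 3 / 2:

// Assumed helper: byte size of one 4:2:0 frame (full-resolution Y plane plus two quarter-resolution chroma planes).
public class YUVUtil {
    public static int getYUVBuffer(int width, int height) {
        return width * height * 3 / 2;
    }
}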

5. Muxing the MP4

MediaMuxer requires every track to be added with addTrack before start can be called, but the audio and video encoder callbacks arrive on different threads. The code therefore uses Object.wait/notify on a shared lock to block until both tracks have been added.


public class Mp4Record implements H264VideoRecord.Callback, AacAudioRecord.Callback {
    private static int index = 0;
    private H264VideoRecord mH264VideoRecord;
    private AacAudioRecord mAacAudioRecord;
    private MediaMuxer mMediaMuxer;

    private boolean mHasStartMuxer;
    private boolean mHasStopAudio;
    private boolean mHasStopVideo;
    private int mVideoTrackIndex = -1;
    private int mAudioTrackIndex = -1;
    private final Object mLock;
    private BlockingQueue<AVData> mDataBlockingQueue;
    private volatile boolean mIsRecoding;

    public Mp4Record(Activity activity, SurfaceView surfaceView, int audioSource, int sampleRateInHz, int channelConfig, int audioFormat, int bufferSizeInBytes, String path) {
        mH264VideoRecord = new H264VideoRecord(activity, surfaceView);
        mAacAudioRecord = new AacAudioRecord(audioSource, sampleRateInHz, channelConfig, audioFormat, bufferSizeInBytes);
        mH264VideoRecord.setCallback(this);
        mAacAudioRecord.setCallback(this);
        try {
            mMediaMuxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        } catch (IOException e) {
            e.printStackTrace();
        }
        mHasStartMuxer = false;
        mLock = new Object();
        mDataBlockingQueue = new LinkedBlockingQueue<>();
    }

    public void start() {
        mIsRecoding = true;
        mAacAudioRecord.start();
        mH264VideoRecord.start();
    }

    public void stop() {
        mAacAudioRecord.stop();
        mH264VideoRecord.stop();
        mIsRecoding = false;
    }


    @Override
    public void outputAudio(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        writeMediaData(mAudioTrackIndex, byteBuffer, bufferInfo);
    }

    @Override
    public void outputMediaFormat(int type, MediaFormat mediaFormat) {
        checkMediaFormat(type, mediaFormat);
    }

    @Override
    public void outputVideo(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        writeMediaData(mVideoTrackIndex, byteBuffer, bufferInfo);
    }

    private void writeMediaData(int trackIndex, ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        mDataBlockingQueue.add(new AVData(index++, trackIndex, byteBuffer, bufferInfo));
    }

    private void checkMediaFormat(int type, MediaFormat mediaFormat) {
        synchronized (mLock) {
            if (type == AAC_ENCODER) {
                mAudioTrackIndex = mMediaMuxer.addTrack(mediaFormat);
            }
            if (type == H264_ENCODER) {
                mVideoTrackIndex = mMediaMuxer.addTrack(mediaFormat);
            }
            startMediaMuxer();
        }
    }

    private void startMediaMuxer() {
        if (mHasStartMuxer) {
            return;
        }
        if (mAudioTrackIndex != -1 && mVideoTrackIndex != -1) {
            Log.e(TAG, "video track index:" + mVideoTrackIndex + "audio track index:" + mAudioTrackIndex);
            mMediaMuxer.start();
            mHasStartMuxer = true;
            new Thread(new Runnable() {
                @Override
                public void run() {
                    while (mIsRecoding || !mDataBlockingQueue.isEmpty()) {
                        AVData avData = mDataBlockingQueue.poll();
                        if (avData == null) {
                            continue;
                        }
                        boolean keyFrame = (avData.bufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0;
                        Log.e(TAG, avData.index + "trackIndex:" + avData.trackIndex + ",writeSampleData:" + keyFrame);
                        mMediaMuxer.writeSampleData(avData.trackIndex, avData.byteBuffer, avData.bufferInfo);
                    }
                }
            }).start();
            mLock.notifyAll();
        } else {
            try {
                mLock.wait();
            } catch (InterruptedException e) {

            }
        }
    }

    @Override
    public void onStop(int type) {
        synchronized (mLock) {
            if (type == H264_ENCODER) {
                mHasStopVideo = true;
            }
            if (type == AAC_ENCODER) {
                mHasStopAudio = true;
            }
            if (mHasStopAudio && mHasStopVideo && mHasStartMuxer) {
                mHasStartMuxer = false;
                mMediaMuxer.stop();
                mMediaMuxer.release();
            }
        }
    }

    private class AVData {
        int index = 0;
        int trackIndex;
        ByteBuffer byteBuffer;
        MediaCodec.BufferInfo bufferInfo;

        public AVData(int index, int trackIndex, ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
            this.index = index;
            this.trackIndex = trackIndex;
            this.byteBuffer = byteBuffer;
            this.bufferInfo = bufferInfo;
            boolean keyFrame = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0;
            Log.e(TAG, index + "trackIndex:" + trackIndex + ",AVData:" + keyFrame);
        }
    }
}
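
A usage sketch from an Activity (the parameter values are assumptions; CAMERA, RECORD_AUDIO, and storage permissions must already be granted):

int bufferSize = AudioRecord.getMinBufferSize(44100,
        AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);
String path = new File(getExternalFilesDir(null), "record.mp4").getAbsolutePath();
Mp4Record mp4Record = new Mp4Record(this, surfaceView,
        MediaRecorder.AudioSource.MIC, 44100, AudioFormat.CHANNEL_IN_STEREO,
        AudioFormat.ENCODING_PCM_16BIT, bufferSize, path);
mp4Record.start();   // starts audio capture/encoding and video capture/encoding
// ... when finished
mp4Record.stop();    // encoders drain their queues; the muxer stops once both tracks report onStop()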

6. Problems encountered

The recorded video plays back too fast

The cause was dropped video frames; it was fixed by adjusting the key-frame interval parameter:

mMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

MediaMuxer reports "Video skip non-key frame"

The cause was that the first video key frame never got written: the muxer consumes the queue on a different thread, and by then the original ByteBuffer/BufferInfo had already been released and reused by the encoder. A defensive copy before queueing avoids this, as sketched below.
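
One way to avoid this (a sketch, not necessarily the article's final code) is to deep-copy both the encoded bytes and the BufferInfo before putting them into the queue:

private void writeMediaData(int trackIndex, ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
    // Copy the payload: the codec reuses this output buffer after releaseOutputBuffer().
    ByteBuffer copy = ByteBuffer.allocate(bufferInfo.size);
    byteBuffer.position(bufferInfo.offset);
    byteBuffer.limit(bufferInfo.offset + bufferInfo.size);
    copy.put(byteBuffer);
    copy.flip();

    // Copy the metadata too: BufferInfo is reused by the encoder's dequeue loop.
    MediaCodec.BufferInfo infoCopy = new MediaCodec.BufferInfo();
    infoCopy.set(0, bufferInfo.size, bufferInfo.presentationTimeUs, bufferInfo.flags);

    mDataBlockingQueue.add(new AVData(index++, trackIndex, copy, infoCopy));
}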

GitHub demo

