FFmpeg learning notes: native rendering

Last time the FFmpeg shared libraries were compiled, and FFmpeg's decoding was used to successfully convert an MP4 video into YUV. Building on that demo, this post uses a SurfaceView to display the decoded pixel data.

Layout and permissions

Layout

<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >

    <com.cj5785.ffmpegnativeplayer.view.MySurfaceView
        android:id="@+id/surface_view"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"/>
    
    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="开始"
        android:onClick="mPlay" />

</FrameLayout>

Permissions

<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.MOUNT_UNMOUNT_FILESYSTEMS" />
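
On Android 6.0 and later (when targetSdkVersion is 23 or higher), the storage permissions above also have to be granted at runtime. The original Eclipse-era project predates runtime permissions, so the following is only a hypothetical helper for newer devices (the request code 1 is arbitrary); it would be called from the Activity before starting playback:

//Additional imports needed in the Activity
import android.Manifest;
import android.content.pm.PackageManager;
import android.os.Build;

//Hypothetical helper inside an Activity such as MainActivity
private void requestStoragePermissionIfNeeded() {
	if (Build.VERSION.SDK_INT >= 23
			&& checkSelfPermission(Manifest.permission.WRITE_EXTERNAL_STORAGE)
					!= PackageManager.PERMISSION_GRANTED) {
		requestPermissions(new String[] {
				Manifest.permission.READ_EXTERNAL_STORAGE,
				Manifest.permission.WRITE_EXTERNAL_STORAGE }, 1);
	}
}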

Writing the custom view and the controller

Custom view

package com.cj5785.ffmpegnativeplayer.view;

import android.content.Context;
import android.graphics.PixelFormat;
import android.util.AttributeSet;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class MySurfaceView extends SurfaceView {

	public MySurfaceView(Context context) {
		super(context);
		init();
	}
	
	public MySurfaceView(Context context, AttributeSet attrs) {
		super(context, attrs);
		init();
	}
	
	public MySurfaceView(Context context, AttributeSet attrs, int defStyle) {
		super(context, attrs, defStyle);
		init();
	}

	//Initialize the pixel format of the surface
	private void init() {
		SurfaceHolder holder = getHolder();
		holder.setFormat(PixelFormat.RGBA_8888);
	}
}
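
init() above only fixes the surface's pixel format. If the activity wants to be sure the surface actually exists before handing it to native code, one possible extension (not in the original code) is to register a SurfaceHolder.Callback; a minimal sketch of such an addition to MySurfaceView:

//Hypothetical extension: track surface availability so render() is only
//called after surfaceCreated() has fired
private boolean surfaceReady = false;

private void init() {
	SurfaceHolder holder = getHolder();
	holder.setFormat(PixelFormat.RGBA_8888);
	holder.addCallback(new SurfaceHolder.Callback() {
		@Override
		public void surfaceCreated(SurfaceHolder holder) {
			surfaceReady = true;
		}

		@Override
		public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
		}

		@Override
		public void surfaceDestroyed(SurfaceHolder holder) {
			surfaceReady = false;
		}
	});
}

public boolean isSurfaceReady() {
	return surfaceReady;
}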

Controller

package com.cj5785.ffmpegnativeplayer;

import android.view.Surface;

public class NativePlayer {
	
	public native void render(String input, Surface surface);
	
	static {
		System.loadLibrary("avutil-54");
		System.loadLibrary("swresample-1");
		System.loadLibrary("avcodec-56");
		System.loadLibrary("avformat-56");
		System.loadLibrary("swscale-3");
		System.loadLibrary("postproc-53");
		System.loadLibrary("avfilter-5");
		System.loadLibrary("avdevice-56");
		System.loadLibrary("ffmpeg_native_player");
	}
	
}

Implementing the controller's native method

  • Use javah to generate the header file. If javah complains that the Surface signature cannot be found, specify the classpath explicitly:
    javah -classpath E:\eclipse-adt\sdk\platforms\android-15\android.jar;. com.cj5785.ffmpegnativeplayer.NativePlayer
    Format: -classpath is followed by the path to android.jar, and the final argument is the fully qualified name of the class that declares the native method

  • Create a jni folder, move the generated header into it, and add native support to the project

  • Copy the include directory and the .so shared libraries produced by the FFmpeg build into the jni directory

  • Copy the Android.mk and Application.mk from the previous project into the jni folder and adjust them
    Android.mk: mainly change the module name so that it matches the library name loaded by the controller
    Application.mk: change APP_PLATFORM := android-8 to APP_PLATFORM := android-9
    Note: without this Application.mk change, android/native_window_jni.h cannot be found; and because that header is used, -landroid must be added to LOCAL_LDLIBS in Android.mk

  • Use the open-source libyuv library to convert YUV to RGBA_8888
    Download the open-source libyuv library
    Put all of the libyuv files into a jni directory (by NDK project convention, the jni directory must exist)
    In libyuv's Android.mk, change the final include $(BUILD_STATIC_LIBRARY) to include $(BUILD_SHARED_LIBRARY) so that a .so shared library is produced
    Optionally change LOCAL_MODULE := libyuv_static to LOCAL_MODULE := libyuv to make the .so easier to manage
    Run ndk-build in that jni directory to compile libyuv
    The generated .so ends up in the libs directory next to jni
    Add the library to this project's jni directory. For easier management, the jni include directory is reorganized as follows (libyuv's include has been added to the project; the header files inside each subdirectory are not listed):

│  Android.mk
│  Application.mk
│  com_cj5785_ffmpegnativeplayer_NativePlayer.h
│  ffmpeg_native_player.c
│  
└─include
    ├─ffmpeg
    │  │  libavcodec-56.so
    │  │  libavdevice-56.so
    │  │  libavfilter-5.so
    │  │  libavformat-56.so
    │  │  libavutil-54.so
    │  │  libpostproc-53.so
    │  │  libswresample-1.so
    │  │  libswscale-3.so
    │  │  
    │  ├─libavcodec
    │  ├─libavdevice
    │  ├─libavfilter
    │  ├─libavformat
    │  ├─libavutil
    │  ├─libpostproc
    │  ├─libswresample
    │  └─libswscale
    │          
    └─libyuv
        │  libyuv.h
        │  libyuv.so
        │  
        └─libyuv
  • Modify Android.mk so that the prebuilt .so libraries can be found:
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE := avcodec
LOCAL_SRC_FILES := include/ffmpeg/libavcodec-56.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avdevice
LOCAL_SRC_FILES := include/ffmpeg/libavdevice-56.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avfilter
LOCAL_SRC_FILES := include/ffmpeg/libavfilter-5.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avformat
LOCAL_SRC_FILES := include/ffmpeg/libavformat-56.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avutil
LOCAL_SRC_FILES := include/ffmpeg/libavutil-54.so
include $(PREBUILT_SHARED_LIBRARY)


include $(CLEAR_VARS)
LOCAL_MODULE := postproc
LOCAL_SRC_FILES := include/ffmpeg/libpostproc-53.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := swresample
LOCAL_SRC_FILES := include/ffmpeg/libswresample-1.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := swscale
LOCAL_SRC_FILES := include/ffmpeg/libswscale-3.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := yuv
LOCAL_SRC_FILES := include/libyuv/libyuv.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg_native_player
LOCAL_SRC_FILES := ffmpeg_native_player.c
LOCAL_C_INCLUDES += $(LOCAL_PATH)/include/ffmpeg
LOCAL_C_INCLUDES += $(LOCAL_PATH)/include/libyuv
LOCAL_LDLIBS := -llog -landroid
LOCAL_SHARED_LIBRARIES := avcodec avdevice avfilter avformat avutil postproc swresample swscale yuv
include $(BUILD_SHARED_LIBRARY)
  • Modify Application.mk, changing APP_PLATFORM so that the android/native_window_jni.h and android/native_window.h headers can be used:
APP_ABI := armeabi armeabi-v7a
APP_PLATFORM := android-9
  • Implement the function declared in the JNI header:
#include <android/log.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <android/native_window.h>
#include <android/native_window_jni.h>

#include "com_cj5785_ffmpegnativeplayer_NativePlayer.h"

//Container formats (demuxing)
#include "include/ffmpeg/libavformat/avformat.h"
//Decoding
#include "include/ffmpeg/libavcodec/avcodec.h"
//Pixel processing
#include "include/ffmpeg/libswscale/swscale.h"
//libyuv header
#include "include/libyuv/libyuv.h"

#define LOGI(FORMAT,...) __android_log_print(ANDROID_LOG_INFO,"cj5785",FORMAT,##__VA_ARGS__);
#define LOGE(FORMAT,...) __android_log_print(ANDROID_LOG_ERROR,"cj5785",FORMAT,##__VA_ARGS__);

JNIEXPORT void JNICALL Java_com_cj5785_ffmpegnativeplayer_NativePlayer_render
  (JNIEnv *env, jobject jobj, jstring jstr_path, jobject obj_surface)
{
	LOGE("%s", "开始");
	const char *input_cstr = (*env)->GetStringUTFChars(env, jstr_path, NULL);

	//1. Register all components
	av_register_all();

	AVFormatContext *pFormatCtx = avformat_alloc_context();
	//2. Open the video file
	if(avformat_open_input(&pFormatCtx, input_cstr, NULL, NULL) != 0)
	{
		LOGE("%s", "打开文件失败!");
		return;
	}

	//3. Retrieve stream information
	if(avformat_find_stream_info(pFormatCtx, NULL) < 0)
	{
		LOGE("%s", "获取视频信息失败!");
		return;
	}

	int i = 0;
	int video_stream_index = -1;
	for (i = 0; i < pFormatCtx->nb_streams; i++) {
		if(pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO)
		{
			video_stream_index = i;
			break;
		}
	}

	if (video_stream_index == -1)
	{
		LOGE("%s", "找不到视频流");
		return;
	}
	//4. Find the decoder
	AVCodecContext *pCodecCtx = pFormatCtx->streams[video_stream_index]->codec;
	AVCodec *pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
	if(pCodec == NULL)
	{
		LOGE("%s", "无法解码!");
		return;
	}

	//5. Open the decoder
	if(avcodec_open2(pCodecCtx, pCodec, NULL) < 0)
	{
		LOGE("%s", "解码失败!");
		return;
	}

	//6. Read the file packet by packet
	AVPacket *packet = (AVPacket *)av_malloc(sizeof(AVPacket));
	AVFrame *pFrame = av_frame_alloc();
	AVFrame *pRGBFrame = av_frame_alloc();

	//Native drawing
	//Get a native window from the Java Surface
	ANativeWindow *nativeWindow = ANativeWindow_fromSurface(env, obj_surface);
	//Buffer of the native window used for drawing
	ANativeWindow_Buffer outBuffer;
	int len, got_frame, frame_count = 0;
	while(av_read_frame(pFormatCtx, packet) >= 0)
	{
		if(packet->stream_index == video_stream_index)
		{
			len = avcodec_decode_video2(pCodecCtx, pFrame, &got_frame, packet);
			if(len < 0)
			{
				LOGE("%s","解码错误!");
				return;
			}
			if(got_frame)
			{
				LOGI("解码第%d帧", frame_count++);
				//a. lock
				//Set the buffer geometry (width, height, pixel format)
				ANativeWindow_setBuffersGeometry(nativeWindow, pCodecCtx->width, pCodecCtx->height, WINDOW_FORMAT_RGBA_8888);
				ANativeWindow_lock(nativeWindow, &outBuffer, NULL);
				//b. fill the buffer
				//Bind the RGB frame to outBuffer.bits (pixel format, width, height); both refer to the same memory
				avpicture_fill((AVPicture *)pRGBFrame, outBuffer.bits, AV_PIX_FMT_RGBA, pCodecCtx->width, pCodecCtx->height);
				//Convert YUV to RGBA (note: the U and V plane pointers are swapped here, see the notes at the end)
				I420ToARGB(pFrame->data[0], pFrame->linesize[0],
						pFrame->data[2], pFrame->linesize[2],
						pFrame->data[1], pFrame->linesize[1],
						pRGBFrame->data[0], pRGBFrame->linesize[0],
						pCodecCtx->width, pCodecCtx->height);
				//c. unlock and post the buffer
				ANativeWindow_unlockAndPost(nativeWindow);
				//Crude frame pacing: roughly 16 ms per frame
				usleep(16 * 1000);
			}
		}

		av_free_packet(packet);
	}
	ANativeWindow_release(nativeWindow);
	av_free(packet);
	av_frame_free(&pFrame);
	av_frame_free(&pRGBFrame);
	avcodec_close(pCodecCtx);
	avformat_close_input(&pFormatCtx);
	(*env)->ReleaseStringUTFChars(env, jstr_path, input_cstr);
}

Calling the native method to play the video

package com.cj5785.ffmpegnativeplayer;

import java.io.File;

import com.cj5785.ffmpegnativeplayer.view.MySurfaceView;

import android.app.Activity;
import android.os.Bundle;
import android.os.Environment;
import android.view.Surface;
import android.view.View;

public class MainActivity extends Activity {

	private NativePlayer player;
	private MySurfaceView mySurfaceView;
	
	@Override
	protected void onCreate(Bundle savedInstanceState) {
		super.onCreate(savedInstanceState);
		setContentView(R.layout.activity_main);
		mySurfaceView = (MySurfaceView) findViewById(R.id.surface_view);
		player = new NativePlayer();
	}

	public void mPlay(View view) {
		String input = Environment.getExternalStorageDirectory().getAbsolutePath() + File.separatorChar + "oneplus.mp4";
		Surface surface = mySurfaceView.getHolder().getSurface();
		player.render(input, surface);
	}
}

At this point the project builds into an APK, and it runs fine when tested on a phone.

Changing the layout and main activity to play multiple test videos

MainActivity.java

package com.cj5785.ffmpegnativeplayer;

import java.io.File;

import com.cj5785.ffmpegnativeplayer.view.MySurfaceView;

import android.app.Activity;
import android.os.Bundle;
import android.os.Environment;
import android.view.Surface;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.Spinner;

public class MainActivity extends Activity {

	private NativePlayer player;
	private MySurfaceView mySurfaceView;
	private Spinner sp_video;
	
	@Override
	protected void onCreate(Bundle savedInstanceState) {
		super.onCreate(savedInstanceState);
		setContentView(R.layout.activity_main);
		mySurfaceView = (MySurfaceView) findViewById(R.id.surface_view);
		sp_video = (Spinner)findViewById(R.id.sp_video);
		player = new NativePlayer();
		//Video list
		String[] videoArray = getResources().getStringArray(R.array.video_list);
		ArrayAdapter<String> adapter = new ArrayAdapter<String>(this, 
				android.R.layout.simple_list_item_1, android.R.id.text1,videoArray);
		sp_video.setAdapter(adapter);
		
	}

	public void mPlay(View view) {
		String filename = sp_video.getSelectedItem().toString();
		String input = Environment.getExternalStorageDirectory().getAbsolutePath() + File.separatorChar + filename;
		Surface surface = mySurfaceView.getHolder().getSurface();
		player.render(input, surface);
	}
}

activity_main.xml

<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >

    <com.cj5785.ffmpegnativeplayer.view.MySurfaceView
        android:id="@+id/surface_view"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"/>
    <LinearLayout
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:orientation="horizontal">
        
        <Spinner 
            android:id="@+id/sp_video"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"/>
        
        <Button
        	android:layout_width="wrap_content"
        	android:layout_height="wrap_content"
        	android:text="开始"
        	android:onClick="mPlay" />
        
    </LinearLayout>

</FrameLayout>

Add the array values to string.xml

<string-array name="video_list">
    <item>naxienian.mp4</item>
    <item>cuc_ieschool.mkv</item>
    <item>sintel.wmv</item>
    <item>Nocturne.m4a</item>
</string-array>

Issues to note

When calling I420ToARGB() in the native implementation, the U and V planes come out reversed, so their positions have to be swapped in the call.
This demo only aims to show how native drawing works; the code has serious shortcomings, for example the rendering runs on the main (UI) thread (a sketch of a worker-thread call follows below).
Some videos show garbled frames; that problem will be dealt with later when decoding is moved to multiple threads.
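
For the main-thread issue above, a minimal sketch of what an off-UI-thread call could look like (this is not how the original demo works, and cleanly stopping playback would still need extra signaling into the native loop):

//Hypothetical variant of mPlay(): run the blocking native render loop on a
//worker thread so the UI thread stays responsive during playback
public void mPlay(View view) {
	final String filename = sp_video.getSelectedItem().toString();
	final String input = Environment.getExternalStorageDirectory().getAbsolutePath()
			+ File.separatorChar + filename;
	final Surface surface = mySurfaceView.getHolder().getSurface();
	new Thread(new Runnable() {
		@Override
		public void run() {
			player.render(input, surface); //blocks until decoding finishes
		}
	}).start();
}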

Original post: https://www.cnblogs.com/cj5785/p/10664660.html