* If you need hardware decoding support, this article is recommended: "FFmpeg one-click build for Android armv7-a / arm64"
<https://blog.csdn.net/bobcat_kay/article/details/88843778>
1. Setting up the build environment

1.1 Install Ubuntu 14.04; after installation, run the following commands

apt-get update
apt-get install yasm
apt-get install pkg-config

1.2 Download the NDK

Here we use the latest stable NDK, r19c. Download ndk-r19c:
<https://dl.google.com/android/repository/android-ndk-r19c-linux-x86_64.zip>
Download the NDK into the /home/ndk/ directory, then run unzip android-ndk-r19c-linux-x86_64.zip to extract it.
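As a minimal sketch of this step (assuming wget is available and /home/ndk/ is writable):

cd /home/ndk
wget https://dl.google.com/android/repository/android-ndk-r19c-linux-x86_64.zip
unzip android-ndk-r19c-linux-x86_64.zip   # produces /home/ndk/android-ndk-r19c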

1.3 Download FFmpeg 4.1.3

Download FFmpeg n4.1.3: <https://github.com/FFmpeg/FFmpeg/archive/n4.1.3.tar.gz>
After the download completes, run tar -zxvf n4.1.3.tar.gz to extract it.
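Again as a sketch, assuming the tarball is fetched into your working directory:

wget https://github.com/FFmpeg/FFmpeg/archive/n4.1.3.tar.gz
tar -zxvf n4.1.3.tar.gz    # extracts to FFmpeg-n4.1.3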

2. Building FFmpeg

2.1 Modify configure

Go to the source root directory, open configure with vim, and find:
SLIBNAME_WITH_MAJOR='$(SLIBNAME).$(LIBMAJOR)'
LIB_INSTALL_EXTRA_CMD='$$(RANLIB) "$(LIBDIR)/$(LIBNAME)"'
SLIB_INSTALL_NAME='$(SLIBNAME_WITH_VERSION)'
SLIB_INSTALL_LINKS='$(SLIBNAME_WITH_MAJOR) $(SLIBNAME)'
Change them to:
SLIBNAME_WITH_MAJOR='$(SLIBPREF)$(FULLNAME)-$(LIBMAJOR)$(SLIBSUF)'
LIB_INSTALL_EXTRA_CMD='$$(RANLIB) "$(LIBDIR)/$(LIBNAME)"'
SLIB_INSTALL_NAME='$(SLIBNAME_WITH_MAJOR)'
SLIB_INSTALL_LINKS='$(SLIBNAME)'
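If you would rather script the edit, the following sed sketch applies the same changes (assuming the stock FFmpeg 4.1.3 configure and GNU sed; back the file up first):

cp configure configure.bak
sed -i "s|SLIBNAME_WITH_MAJOR='\$(SLIBNAME).\$(LIBMAJOR)'|SLIBNAME_WITH_MAJOR='\$(SLIBPREF)\$(FULLNAME)-\$(LIBMAJOR)\$(SLIBSUF)'|" configure
sed -i "s|SLIB_INSTALL_NAME='\$(SLIBNAME_WITH_VERSION)'|SLIB_INSTALL_NAME='\$(SLIBNAME_WITH_MAJOR)'|" configure
sed -i "s|SLIB_INSTALL_LINKS='\$(SLIBNAME_WITH_MAJOR) \$(SLIBNAME)'|SLIB_INSTALL_LINKS='\$(SLIBNAME)'|" configure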
2.2 Create the build script

In the source root directory, create build.sh with the following content:
#!/bin/bash
NDK=/home/ndk/android-ndk-r19c
ADDI_LDFLAGS="-fPIE -pie"
ADDI_CFLAGS="-fPIE -pie -march=armv7-a -mfloat-abi=softfp -mfpu=neon"
CPU=armv7-a
ARCH=arm
HOST=arm-linux
SYSROOT=$NDK/toolchains/llvm/prebuilt/linux-x86_64/sysroot
CROSS_PREFIX=$NDK/toolchains/llvm/prebuilt/linux-x86_64/bin/armv7a-linux-androideabi21-
PREFIX=$(pwd)/android/$CPU
x264=$(pwd)/x264/android/$CPU

configure() {
    ./configure \
        --prefix=$PREFIX \
        --disable-encoders --disable-decoders --disable-avdevice \
        --disable-static --disable-doc --disable-ffplay \
        --disable-network --disable-symver \
        --enable-neon --enable-shared --enable-libx264 --enable-gpl \
        --enable-pic --enable-jni --enable-pthreads --enable-mediacodec \
        --enable-encoder=aac --enable-encoder=gif --enable-encoder=libopenjpeg \
        --enable-encoder=libmp3lame --enable-encoder=libwavpack --enable-encoder=libx264 \
        --enable-encoder=mpeg4 --enable-encoder=pcm_s16le --enable-encoder=png \
        --enable-encoder=srt --enable-encoder=subrip --enable-encoder=yuv4 \
        --enable-encoder=text \
        --enable-decoder=aac --enable-decoder=aac_latm --enable-decoder=libopenjpeg \
        --enable-decoder=mp3 --enable-decoder=mpeg4_mediacodec --enable-decoder=pcm_s16le \
        --enable-decoder=flac --enable-decoder=flv --enable-decoder=gif \
        --enable-decoder=png --enable-decoder=srt --enable-decoder=xsub \
        --enable-decoder=yuv4 --enable-decoder=vp8_mediacodec \
        --enable-decoder=h264_mediacodec --enable-decoder=hevc_mediacodec \
        --enable-ffmpeg \
        --enable-bsf=aac_adtstoasc --enable-bsf=h264_mp4toannexb \
        --enable-bsf=hevc_mp4toannexb --enable-bsf=mpeg4_unpack_bframes \
        --enable-cross-compile \
        --cross-prefix=$CROSS_PREFIX \
        --target-os=android \
        --arch=$ARCH \
        --sysroot=$SYSROOT \
        --extra-cflags="-I$x264/include $ADDI_CFLAGS" \
        --extra-ldflags="-L$x264/lib $ADDI_LDFLAGS"
}

build() {
    make clean
    configure
    make -j4
    make install
}

build
You can trim the enabled modules to fit your own needs:
List all configure options: ./configure --help
List the available decoders: ./configure --list-decoders
List the available encoders: ./configure --list-encoders
List the available hardware accelerators: ./configure --list-hwaccels

Make the script executable: chmod +x build.sh
Run the script to start the build: ./build.sh
If everything goes well, you will find the .so files we need under android/armv7-a/lib/ in the source root directory.
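As a quick sanity check (a sketch, assuming the file utility is installed), confirm that the libraries exist and target 32-bit ARM:

ls android/armv7-a/lib
file android/armv7-a/lib/libavcodec-58.so   # expect: ELF 32-bit LSB shared object, ARM (the -58 major-version suffix may differ)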


3. Porting FFmpeg into an Android app

3.1 JNI build

(1) Pack up the compiled FFmpeg source directory and copy it to Windows. Create a jni folder in any directory, then copy the .so files from android\armv7-a\lib into jni\prebuilt\;
copy all the folders under android\armv7-a\include into the jni directory;
copy config.h from the source root into the jni directory;
copy the following files from fftools\ into the jni directory (a shell sketch of these copies follows the list):
cmdutils.h
ffmpeg.h
ffmpeg.c
ffmpeg_opt.c
ffmpeg_filter.c
cmdutils.c
ffmpeg_hw.c
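If you stage these files on the Linux side before packing them up, the copies might look like this sketch (run from the FFmpeg source root; the jni destination directory is an assumption of this example):

mkdir -p jni/prebuilt
cp android/armv7-a/lib/*.so jni/prebuilt/
cp -r android/armv7-a/include/* jni/
cp config.h jni/
cp fftools/cmdutils.h fftools/cmdutils.c fftools/ffmpeg.h fftools/ffmpeg.c \
   fftools/ffmpeg_opt.c fftools/ffmpeg_filter.c fftools/ffmpeg_hw.c jni/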

(2) Modify cmdutils
Open cmdutils.h and change
void show_help_children(const AVClass *class, int flags);
to
void show_help_children(const AVClass *clazz, int flags);
otherwise it will not compile together with C++ (class is a reserved keyword in C++).

(3) Modify ffmpeg.c
Find the entry function int main(int argc, char **argv)
and rename it to int ffmpeg_exec(int argc, char **argv).
Comment out every call to exit_program() inside this function, and add the following code at the end of the function to reset the global state:
nb_filtergraphs = 0;
nb_output_files = 0;
nb_output_streams = 0;
nb_input_files = 0;
nb_input_streams = 0;
Also add the function declaration in ffmpeg.h:
int ffmpeg_exec(int argc, char **argv);

(4) Output logs to adb logcat
Include the log header in ffmpeg.c and define the log macros:
#include "android/log.h"
#define LOGD(...) __android_log_print(ANDROID_LOG_DEBUG, "ffmpeg.c", __VA_ARGS__)
#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR, "ffmpeg.c", __VA_ARGS__)
Implement the log_callback_null function:
static void log_callback_null(void *ptr, int level, const char *fmt, va_list vl) {
    static int print_prefix = 1;
    static int count;
    static char prev[1024];
    char line[1024];
    static int is_atty;
    // Format the message into a single line, then route it to logcat by level
    av_log_format_line(ptr, level, fmt, vl, line, sizeof(line), &print_prefix);
    strcpy(prev, line);
    if (level <= AV_LOG_WARNING) {
        LOGE("%s", line);
    } else {
        LOGD("%s", line);
    }
}
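Note that defining the callback is not enough on its own; it also has to be registered with FFmpeg, for example by calling av_log_set_callback(log_callback_null) near the start of ffmpeg_exec() (where exactly you register it is up to you).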
(5) Implement the JNI interface
Create ffmpeg-invoke.cpp and put it in the jni directory:
#include <jni.h>
#include <string>
#include "android/log.h"

#define LOGD(...) __android_log_print(ANDROID_LOG_DEBUG, "ffmpeg-invoke", __VA_ARGS__)

extern "C" {
#include "ffmpeg.h"
#include "libavcodec/jni.h"
}

extern "C" JNIEXPORT jint JNICALL
Java_com_github_ffmpegtest_jni_FFmpegCmd_run(JNIEnv *env, jclass type, jint cmdLen, jobjectArray cmd) {
    // Pass the Java VM to FFmpeg so the MediaCodec components can use JNI
    JavaVM *jvm = NULL;
    env->GetJavaVM(&jvm);
    av_jni_set_java_vm(jvm, NULL);

    // Convert the Java String[] into a C-style argv array
    char *argCmd[cmdLen];
    jstring buf[cmdLen];
    for (int i = 0; i < cmdLen; ++i) {
        buf[i] = static_cast<jstring>(env->GetObjectArrayElement(cmd, i));
        char *string = const_cast<char *>(env->GetStringUTFChars(buf[i], JNI_FALSE));
        argCmd[i] = string;
        LOGD("argCmd=%s", argCmd[i]);
    }

    int retCode = ffmpeg_exec(cmdLen, argCmd);
    LOGD("ffmpeg-invoke: retCode=%d", retCode);
    return retCode;
}
Note: change com_github_ffmpegtest_jni_FFmpegCmd to match the package name of the Java class in your own project.
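One detail the code above glosses over: the strings obtained with GetStringUTFChars are never released. If you invoke commands repeatedly, consider calling env->ReleaseStringUTFChars(buf[i], argCmd[i]) for each argument after ffmpeg_exec() returns.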
(6) Write the .mk files
Create Android.mk and put it in the jni directory:
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE := libavutil
LOCAL_SRC_FILES := prebuilt/libavutil.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libswresample
LOCAL_SRC_FILES := prebuilt/libswresample.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libswscale
LOCAL_SRC_FILES := prebuilt/libswscale.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavcodec
LOCAL_SRC_FILES := prebuilt/libavcodec.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavformat
LOCAL_SRC_FILES := prebuilt/libavformat.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavfilter
LOCAL_SRC_FILES := prebuilt/libavfilter.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libpostproc
LOCAL_SRC_FILES := prebuilt/libpostproc.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-invoke
LOCAL_SRC_FILES := ffmpeg-invoke.cpp \
    cmdutils.c \
    ffmpeg_filter.c \
    ffmpeg_opt.c \
    ffmpeg_hw.c \
    ffmpeg.c
LOCAL_C_INCLUDES := D:\libs\FFmpeg-n4.1.3
LOCAL_LDLIBS := -llog -ljnigraphics -lz -landroid -lm -pthread -L$(SYSROOT)/usr/lib
LOCAL_SHARED_LIBRARIES := libavcodec libavfilter libavformat libavutil \
    libswresample libswscale libpostproc
include $(BUILD_SHARED_LIBRARY)
Note: change LOCAL_C_INCLUDES to the directory where the FFmpeg source lives on your machine.

Create Application.mk and put it in the …\jni\ directory:
APP_ABI := armeabi-v7a
APP_PLATFORM=android-21
APP_OPTIM := release
APP_STL := stlport_static
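Note: the APP_STL value above is kept from the original setup. If you run ndk-build from r19c itself, be aware that stlport is no longer shipped with recent NDKs (only the libc++ runtimes c++_static / c++_shared remain), so you may need to change APP_STL accordingly.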
(7) Build
Confirm that all the required files are in place.

Open CMD, cd into that directory, and run ndk-build to start the build (make sure the NDK directory has been added to the PATH environment variable).
Errors you may hit during the build:
1. error: invalid suffix on literal; C++11 requires a space between literal and identifier [-Wreserved-user-defined-literal]
   snprintf(name, sizeof(name), "0x%"PRIx64, ch_layout);
Fix: find "0x%"PRIx64 in cmdutils.h and add a space before PRIx64.

2. error: assigning to 'BenchmarkTimeStamps' (aka 'struct BenchmarkTimeStamps') from incompatible type 'int64_t' (aka 'long long')
   current_time = ti = getutime();
Fix: find current_time = ti = getutime(); in ffmpeg.c and change it to current_time.sys_usec = ti = getutime();

3. error: undefined reference to 'postproc_version'
Fix: comment out this line in cmdutils.c:
PRINT_LIB_INFO(postproc, POSTPROC, flags, level);

4. error: undefined reference to 'getutime'
Fix: add the following function to ffmpeg.c:
static int64_t getutime(void)
{
#if HAVE_GETRUSAGE
    struct rusage rusage;
    getrusage(RUSAGE_SELF, &rusage);
    return (rusage.ru_utime.tv_sec * 1000000LL) + rusage.ru_utime.tv_usec;
#elif HAVE_GETPROCESSTIMES
    HANDLE proc;
    FILETIME c, e, k, u;
    proc = GetCurrentProcess();
    GetProcessTimes(proc, &c, &e, &k, &u);
    return ((int64_t) u.dwHighDateTime << 32 | u.dwLowDateTime) / 10;
#else
    return av_gettime_relative();
#endif
}
When the build succeeds, you can see the generated .so files.


Next to the jni directory you will find a libs directory; the .so files under libs\armeabi-v7a are the shared libraries we ultimately need.


3.2 Create an Android Studio test project: FFmpegTest

(1) Create a jniLibs directory under FFmpegTest\app\src\main\, and copy the \libs\armeabi-v7a folder produced in the previous step into jniLibs;
then create FFmpegCmd.java under the com.github.ffmpegtest.jni package:
public class FFmpegCmd {
    static {
        System.loadLibrary("avutil");
        System.loadLibrary("avcodec");
        System.loadLibrary("swresample");
        System.loadLibrary("avformat");
        System.loadLibrary("swscale");
        System.loadLibrary("avfilter");
        System.loadLibrary("ffmpeg-invoke");
    }

    private static native int run(int cmdLen, String[] cmd);

    public static native String test();

    public static int run(String[] cmd) {
        return run(cmd.length, cmd);
    }
}


(2) Test an FFmpeg command
public class MainActivity extends AppCompatActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE)
                != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(this,
                    new String[]{Manifest.permission.READ_EXTERNAL_STORAGE}, 100);
        }
        TextView tvMessage = findViewById(R.id.tv_message);
        tvMessage.setText(FFmpegCmd.test());
        ffmpegTest();
    }

    private void ffmpegTest() {
        new Thread() {
            @Override
            public void run() {
                long startTime = System.currentTimeMillis();
                String input = "/sdcard/Movies/Replay_2018.05.08-13.46.mp4";
                String output = "/sdcard/Movies/output.mp4";
                // Cut the 00:20-00:28 segment out of the video
                String cmd = "ffmpeg -ss 00:00:20 -t 00:00:08 -i %s -vcodec copy -acodec copy %s";
                cmd = String.format(cmd, input, output);
                FFmpegCmd.run(cmd.split(" "));
                Log.d("FFmpegTest", "run: elapsed " + (System.currentTimeMillis() - startTime) + " ms");
            }
        }.start();
    }
}
After it finishes, find output.mp4 under /sdcard/Movies/; opening and playing it gives exactly the result we wanted, which shows that the FFmpeg command executed successfully.
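To double-check the result from a PC, a quick sketch using adb (device paths as above; having ffprobe on the host is an assumption):

adb shell ls -l /sdcard/Movies/output.mp4
adb pull /sdcard/Movies/output.mp4 .
ffprobe output.mp4    # the reported duration should be about 8 seconds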
