Using the FFmpeg command in iOS

Brief introduction

FFmpeg is a set of open-source programs and libraries for recording, converting, and streaming digital audio and video, licensed under the LGPL or GPL. It provides a complete solution for recording, converting, and streaming audio and video, and includes libavcodec, a leading audio/video codec library.


The following is a brief description of each module:

  • libavformat: muxing and demuxing (generation and parsing) of various audio/video container formats;
  • libavcodec: encoding and decoding of various audio and video codecs;
  • libavutil: common utility functions;
  • libswscale: video scaling and pixel/color format conversion;
  • libpostproc: post-processing effects;
  • ffmpeg: the command-line tool provided by the project; it can be used for format conversion, decoding, or live encoding from a TV capture card;
  • ffserver: an HTTP streaming server for live multimedia broadcasting;
  • ffplay: a simple player that uses ffmpeg for parsing and decoding and displays the result through SDL.

Implemented features

Code address

  • Synthesizing a video from pictures and sound.
  • Video transcoding.
  • Adding a watermark to a video.
  • Applying video filters.

Library file compilation

1. Compiling the codec library files

Because compiling the FFmpeg library files by hand is fairly difficult, I mainly use an open-source script from GitHub to do the build. Script address: FFmpeg-iOS-build-script. This script is updated very promptly and already supports versions 3.x and above. Here I use FFmpeg 3.0, so I modified the ffmpeg version in the shell script:

SOURCE="ffmpeg-3.0"

After this modification, execute the build script (named build-ffmpeg.sh in the repository):

./build-ffmpeg.sh

The script automatically pulls the ffmpeg source code from GitHub and starts compiling; when it finishes, it generates an FFmpeg-iOS folder in the current directory.

Note:

In the scratch directory, each architecture has a configuration file, config.h. This file is important: it records the configuration parameters of the library files being compiled, for example which decoders and encoders are enabled.

(Image: configuration file)
2. Compiling the command-line support library files

Once the library files are compiled, we can see the FFmpeg-iOS directory generated in the current directory.

(Image: library files)

Because we used the --disable-programs configure option at compile time, as shown below:

CONFIGURE_FLAGS="--enable-cross-compile --disable-programs --disable-doc --disable-debug --enable-pic"

the command-line tools were not compiled. We therefore need to compile the source files that implement FFmpeg's command-line parsing ourselves.

To compile the command-line parsing support into a library, we mainly use the following source files:

cmdutils.c ffmpeg.c ffmpeg_opt.c ffmpeg_filter.c ffmpeg_videotoolbox.c ffprobe.c

At compile time we need to modify the main function in ffmpeg.c, because a program cannot have two main functions. Here we rename it to ffmpeg_main, as shown below:

int ffmpeg_main(int argc, char **argv)
{
    int ret;
    int64_t ti;

    register_exit(ffmpeg_cleanup);

    setvbuf(stderr, NULL, _IONBF, 0); /* win32 runtime needs this */

    av_log_set_flags(AV_LOG_SKIP_REPEATED);
    parse_loglevel(argc, argv, options);

    if (argc > 1 && !strcmp(argv[1], "-d")) {
        run_as_daemon = 1;
        av_log_set_callback(log_callback_null);
        argc--;
        argv++;
    }

    /* the rest is omitted ... */
}

We also need to modify the exit_program function in cmdutils.c: delete the original body of the function, return ret instead, and change the function's return type to int. Without this modification, the whole program exits as soon as an FFmpeg command finishes executing.

int exit_program(int ret);

int exit_program(int ret)
{
    //if (program_exit)
    //    program_exit(ret);
    //exit(ret);
    return ret;
}

After the modification, we use Xcode to create a static library project and add the source files to it. We need to configure the header search paths here: search in the FFmpeg-iOS folder ($(SRCROOT)/../FFmpeg-iOS/include) and in the ffmpeg-3.0 source directory ($(SRCROOT)/../ffmpeg-3.0). The source directory must be searched as well because the build script does not copy every header into the FFmpeg-iOS folder, only the necessary ones. We do not need to link against the previously compiled library files here, because a static library is just the product of compiling (clang -c) and archiving (ar -r); no linking takes place.

(Image: static library project)

Finally, we use the lipo -create command to merge the simulator and device builds into a universal library:

lipo -create /Users/qinmin/Library/Developer/Xcode/DerivedData/FFmpeg-cvfzxtnwpwznsfclqrttxwgczhjv/Build/Products/Debug-iphonesimulator/libFFmpeg.a /Users/qinmin/Library/Developer/Xcode/DerivedData/FFmpeg-cvfzxtnwpwznsfclqrttxwgczhjv/Build/Products/Debug-iphoneos/libFFmpeg.a -output /Users/qinmin/Desktop/libFFmpeg.a

Using the library files

1. Splitting a video into pictures.

extern int ffmpeg_main(int argc, char *argv[]);

- (IBAction)sliceBtnClick:(UIButton *)sender
{
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        char *movie  = (char *)[BundlePath(@"1.mp4") UTF8String];
        char *outPic = (char *)[DocumentPath(@"%05d.jpg") UTF8String];
        char *a[] = {"ffmpeg", "-i", movie, "-r", "10", outPic};
        ffmpeg_main(sizeof(a) / sizeof(*a), a);
    });
}

2. Synthesizing a video from pictures and sound.

extern int ffmpeg_main(int argc, char *argv[]);

- (IBAction)composeBtnClick:(UIButton *)sender
{
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        char *outPic = (char *)[DocumentPath(@"%05d.jpg") UTF8String];
        char *movie  = (char *)[DocumentPath(@"1.mp4") UTF8String];
        char *a[] = {"ffmpeg", "-i", outPic, "-vcodec", "mpeg4", movie};
        ffmpeg_main(sizeof(a) / sizeof(*a), a);
    });
}

3. Video transcoding.

extern int ffmpeg_main(int argc, char *argv[]);

- (IBAction)transBtnClick:(UIButton *)sender
{
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        char *outPic = (char *)[DocumentPath(@"out.avi") UTF8String];
        char *movie  = (char *)[BundlePath(@"1.mp4") UTF8String];
        char *a[] = {"ffmpeg", "-i", movie, "-vcodec", "mpeg4", outPic};
        ffmpeg_main(sizeof(a) / sizeof(*a), a);
    });
}

4. Adding a video watermark.

extern int ffmpeg_main(int argc, char *argv[]);

- (IBAction)logoBtnClick:(UIButton *)sender
{
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        char *outPic = (char *)[DocumentPath(@"logo.mp4") UTF8String];
        char *movie  = (char *)[BundlePath(@"1.mp4") UTF8String];
        char logo[1024];

        // top left
        sprintf(logo, "movie=%s [logo]; [in][logo] overlay=30:10 [out]", [BundlePath(@"ff.jpg") UTF8String]);
        // bottom left
        //sprintf(logo, "movie=%s [logo]; [in][logo] overlay=30:main_h-overlay_h-10 [out]", [BundlePath(@"ff.jpg") UTF8String]);
        // bottom right
        //sprintf(logo, "movie=%s [logo]; [in][logo] overlay=main_w-overlay_w-10:main_h-overlay_h-10 [out]", [BundlePath(@"ff.jpg") UTF8String]);
        // top right
        //sprintf(logo, "movie=%s [logo]; [in][logo] overlay=main_w-overlay_w-10:10 [out]", [BundlePath(@"ff.jpg") UTF8String]);

        char *a[] = {"ffmpeg", "-i", movie, "-vf", logo, outPic};
        ffmpeg_main(sizeof(a) / sizeof(*a), a);
    });
}

5. Applying a video filter.

extern int ffmpeg_main(int argc, char *argv[]);

- (IBAction)filterBtnClick:(id)sender
{
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        char *outPic = (char *)[DocumentPath(@"filter.mp4") UTF8String];
        char *movie  = (char *)[BundlePath(@"1.mp4") UTF8String];

        // grid
        //char *filter = "drawgrid=w=iw/3:h=ih/3:t=2:c=white@0.5";
        // rectangle
        char *filter = "drawbox=x=10:y=20:w=200:h=60:c=red@0.5";
        // moving crop
        //char *filter = "crop=in_w/2:in_h/2:(in_w-out_w)/2+((in_w-out_w)/2)*sin(n/10):(in_h-out_h)/2+((in_h-out_h)/2)*sin(n/7)";

        char *a[] = {"ffmpeg", "-i", movie, "-vf", filter, outPic};
        ffmpeg_main(sizeof(a) / sizeof(*a), a);
    });
}

Results

(Images: segmentation result and other effect screenshots)


Note that on mobile devices, using FFmpeg to encode, decode, add filters, and so on is still very CPU-intensive, so these operations should be used with care.
