How to Port ffmpeg (the Program) to Android: Ideas and Thoughts

Why ffmpeg for Android and The App

I was working on a project that requires transcoding video files and then playing them on an Android device. The definitive tool for this is ffmpeg. I could transcode on the desktop and transfer the files to Android, but why not just transcode on Android itself?

Then I searched for ffmpeg and transcoding tools on Android, but could not find any. Since I’ve built ffmpeg libraries for Android before, I believed it was also doable to port the ffmpeg command line utility to Android. So I started doing it.

Below is a screen capture of the Android app (I released it as ffmpeg for Android Beta),


Figure 1. Screenshots of ffmpeg for Android

Currently it only supports the armv7 processor, the one my Nexus One device uses.

How did I Port it?

The main design goal is to change as few lines of code as possible in the ffmpeg.c file. To meet this goal, I decoupled the UI and the ffmpeg processing into two processes.

Running ffmpeg in a separate process has a few advantages,

  • ffmpeg calls the exit function in many places, which terminates the process. Running ffmpeg in a separate process ensures that its termination won’t affect the front-end UI.
  • ffmpeg is processing intensive; running it in a separate process keeps the front-end UI responsive.
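The exit concern can be demonstrated with a small experiment. On Android the UI and the service run in separate Linux processes; the sketch below (plain POSIX, not Android code) uses fork() just to illustrate the principle that a child calling exit() does not bring down its parent:

```c
#include <assert.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

/* Runs worker() in a child process and returns its exit status.
 * Even if worker() calls exit(), only the child terminates;
 * the parent survives and can inspect the status. */
int run_isolated(void (*worker)(void)) {
    pid_t pid = fork();
    if (pid == 0) {          /* child: run the worker, which may exit() */
        worker();
        _exit(0);            /* in case the worker returns normally */
    }
    int status = 0;
    waitpid(pid, &status, 0);    /* parent blocks until the child ends */
    return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
}

static void failing_worker(void) {
    exit(7);                 /* simulates ffmpeg calling exit() on error */
}
```

Here run_isolated(failing_worker) returns 7 while the caller keeps running, which is exactly why the front-end UI is unaffected when the ffmpeg process terminates.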

There’s an inter-process communication (IPC) problem to solve so the two processes can communicate with each other. The UI must be able to start and stop ffmpeg, and ffmpeg must send its output messages to the UI.

From the UI to ffmpeg is simple: I implemented a service that calls ffmpeg, and the UI starts the service using an Intent. The stop command works the same way.

From ffmpeg to the UI, there are a few choices: pipes, sockets, etc. But since only a very simple one-way channel is needed (ffmpeg writes, the UI reads), I just use text files. ffmpeg dumps all output to text files, and the UI reads them every second or two (you can also use a file observer and update the UI only when the text files change).
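On the ffmpeg side, the one-way channel is nothing more than appending lines to a log file and flushing after each write, so the reader process sees them promptly. A minimal sketch (the helper name and file path are illustrative choices, not the app’s actual code):

```c
#include <assert.h>
#include <stdio.h>

/* Append one message line to the shared log file and flush immediately,
 * so the reading process never waits for a buffered write.
 * Returns 0 on success, -1 if the file cannot be opened. */
int log_to_ui(const char *path, const char *msg) {
    FILE *f = fopen(path, "a");   /* append: never clobber earlier output */
    if (f == NULL)
        return -1;
    fprintf(f, "%s\n", msg);
    fflush(f);                    /* make the line visible to the UI now */
    fclose(f);
    return 0;
}
```

In the ported ffmpeg.c, messages that used to go to stdout would go through something like this instead, while the UI side polls the file every second or two.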

What are the Changes for ffmpeg.c

With the design above, the changes to ffmpeg.c are simple.

  • Change the main(int argc, char **argv) function prototype to a JNI interface prototype that Android Java code can call.
  • Redirect the output from stdout to the text files we created.
  • Comment out a few lines that cause compilation errors.
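The first two changes can be sketched in plain C. The function and parameter names below are hypothetical (the real JNI signature depends on the package and class name); the redirection uses freopen() so every existing printf in ffmpeg.c lands in the text file without touching those call sites:

```c
#include <assert.h>
#include <stdio.h>

/* Originally: int main(int argc, char **argv)
 * After the port, main is renamed so a JNI wrapper such as
 * Java_..._naRun(JNIEnv*, jobject, ...) can build argc/argv
 * on the Java side and then call this function. */
int ffmpeg_main(int argc, char **argv, const char *log_path) {
    /* redirect stdout into the text file the UI polls */
    if (freopen(log_path, "w", stdout) == NULL)
        return -1;
    printf("%s: started with %d argument(s)\n", argv[0], argc - 1);
    /* ... the largely unchanged body of ffmpeg's main() would follow ... */
    fflush(stdout);
    return 0;
}
```

After the freopen call, every printf/fprintf(stdout, ...) already present in ffmpeg.c writes to the log file, which is why so few lines need changing.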

A little More Thoughts

I’m thinking this method can be used to port lots of other applications to Android too: two processes, running the background processing utility as a service, changing the main method to a native interface that Java code can call, using Intents and text files for communication, etc.

If you would like to try out this app, go to https://market.android.com/developer?pub=roman10. Note that it’s for armv7 processors only.

How to Build Android Applications Based on FFmpeg by An Example

This is a follow-up post to the previous blog How to Build FFmpeg for Android. You can read the previous tutorial first, or refer back to it when you feel it’s necessary.

This blog covers how to build a simple Android app based on the FFmpeg library. The app detects the input video file’s resolution and codec information through interfaces provided by FFmpeg.

Below is a screenshot of the app,


Figure 1. Screen shot of the sample android app based on FFmpeg

0. Create a new Android Project FFmpegTest.

When you’re asked to select the target platform, select 2.2, as it’s the platform used in the previous blog. But you’re free to change it.

Also create a folder named jni under the root directory “FFmpegTest” of this project.

1. Download the ffmpeg Source Code and Extract it to jni Folder

Follow the previous blog How to Build FFmpeg for Android to build the library.

2. Write the Native Code that Uses FFmpeg’s libavcodec Library.

You can copy and paste the code below and save it as ffmpeg-test-jni.c under the FFmpegTest/jni/ directory. Note that the code below is not complete; you can download the entire source at the end of the post.

#include <jni.h>
#include <android/log.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/*for android logs*/
#define LOG_TAG "FFmpegTest"
#define LOG_LEVEL 10
#define LOGI(level, ...) if (level <= LOG_LEVEL) {__android_log_print(ANDROID_LOG_INFO, LOG_TAG, __VA_ARGS__);}
#define LOGE(level, ...) if (level <= LOG_LEVEL) {__android_log_print(ANDROID_LOG_ERROR, LOG_TAG, __VA_ARGS__);}

char *gFileName;          //the file name of the video
AVFormatContext *gFormatCtx;
int gVideoStreamIndex;    //video stream index
AVCodecContext *gVideoCodecCtx;

static void get_video_info(char *prFilename);

static void get_video_info(char *prFilename) {
    AVCodec *lVideoCodec;
    int lError;
    /*register the codec*/
    extern AVCodec ff_h264_decoder;
    avcodec_register(&ff_h264_decoder);
    /*register the demuxer*/
    extern AVInputFormat ff_mov_demuxer;
    av_register_input_format(&ff_mov_demuxer);
    /*register the protocol*/
    extern URLProtocol ff_file_protocol;
    av_register_protocol2(&ff_file_protocol, sizeof(ff_file_protocol));
    /*open the video file*/
    if ((lError = av_open_input_file(&gFormatCtx, gFileName, NULL, 0, NULL)) != 0) {
        LOGE(1, "Error open video file: %d", lError);
        return;    //open file failed
    }
    /*retrieve stream information*/
    if ((lError = av_find_stream_info(gFormatCtx)) < 0) {
        LOGE(1, "Error find stream information: %d", lError);
        return;
    }
    /*find the video stream and its decoder*/
    gVideoStreamIndex = av_find_best_stream(gFormatCtx, AVMEDIA_TYPE_VIDEO, -1, -1, &lVideoCodec, 0);
    if (gVideoStreamIndex == AVERROR_STREAM_NOT_FOUND) {
        LOGE(1, "Error: cannot find a video stream");
        return;
    }
    /*check for a decoder before dereferencing lVideoCodec*/
    if (gVideoStreamIndex == AVERROR_DECODER_NOT_FOUND) {
        LOGE(1, "Error: video stream found, but no decoder is found!");
        return;
    }
    LOGI(10, "video codec: %s", lVideoCodec->name);
    /*open the codec*/
    gVideoCodecCtx = gFormatCtx->streams[gVideoStreamIndex]->codec;
    LOGI(10, "open codec: (%d, %d)", gVideoCodecCtx->height, gVideoCodecCtx->width);
    if (avcodec_open(gVideoCodecCtx, lVideoCodec) < 0) {
        LOGE(1, "Error: cannot open the video codec!");
        return;
    }
}

JNIEXPORT void JNICALL Java_roman10_ffmpegTest_VideoBrowser_naInit(JNIEnv *pEnv, jobject pObj, jstring pFileName) {
    /*get the video file name*/
    gFileName = (char *)(*pEnv)->GetStringUTFChars(pEnv, pFileName, NULL);
    if (gFileName == NULL) {
        LOGE(1, "Error: cannot get the video file name!");
        return;
    }
    LOGI(10, "video file name is %s", gFileName);
    get_video_info(gFileName);
}

JNIEXPORT jstring JNICALL Java_roman10_ffmpegTest_VideoBrowser_naGetVideoCodecName(JNIEnv *pEnv, jobject pObj) {
    const char *lCodecName = gVideoCodecCtx->codec->name;
    return (*pEnv)->NewStringUTF(pEnv, lCodecName);
}

If you’re not familiar with Java JNI, you may need to read about JNI first in order to understand the code. But that is not the focus of this tutorial, so it’s not covered here.

3. Build the Native Code.

Create a file named Android.mk under the jni directory and copy and paste the content below,

LOCAL_PATH := $(call my-dir)

#declare the prebuilt library
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-prebuilt
LOCAL_SRC_FILES := ffmpeg-0.8/android/armv7-a/libffmpeg.so
LOCAL_EXPORT_C_INCLUDES := ffmpeg-0.8/android/armv7-a/include
LOCAL_EXPORT_LDLIBS := ffmpeg-0.8/android/armv7-a/libffmpeg.so
LOCAL_PRELINK_MODULE := true
include $(PREBUILT_SHARED_LIBRARY)

#the ffmpeg-test-jni library
include $(CLEAR_VARS)
LOCAL_ALLOW_UNDEFINED_SYMBOLS := false
LOCAL_MODULE := ffmpeg-test-jni
LOCAL_SRC_FILES := ffmpeg-test-jni.c
LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg-0.8/android/armv7-a/include
LOCAL_SHARED_LIBRARIES := ffmpeg-prebuilt
LOCAL_LDLIBS := -llog -ljnigraphics -lz -lm $(LOCAL_PATH)/ffmpeg-0.8/android/armv7-a/libffmpeg.so
include $(BUILD_SHARED_LIBRARY)

Create another file named Application.mk under the jni directory and copy and paste the content below,

# ARMv7 is significantly faster due to the use of the hardware FPU
APP_ABI := armeabi-v7a
APP_PLATFORM := android-8

For more information about what Android.mk and Application.mk do, you can refer to the documentation that comes with the Android NDK.

Your folder structure should look like the one below after you finish all the steps above,


Figure 2. Folder structure of project FFmpegTest and jni

4. Call the Native Code from FFmpegTest Java Code.

You’ll need to load the libraries and declare the JNI functions. The code below is extracted from the app source code, which is available for download at the end of this tutorial.

/*this part communicates with native code through jni (java native interface)*/
    //load the native libraries
    static {
        System.loadLibrary("ffmpeg");
        System.loadLibrary("ffmpeg-test-jni");
    }

    //declare the jni functions
    private static native void naInit(String _videoFileName);
    private static native int[] naGetVideoResolution();
    private static native String naGetVideoCodecName();
    private static native String naGetVideoFormatName();
    private static native void naClose();

    private void showVideoInfo(final File _file) {
        String videoFilename = _file.getAbsolutePath();
        naInit(videoFilename);
        int[] prVideoRes = naGetVideoResolution();
        String prVideoCodecName = naGetVideoCodecName();
        String prVideoFormatName = naGetVideoFormatName();
        naClose();
        String displayText = "Video: " + videoFilename + "\n";
        displayText += "Video Resolution: " + prVideoRes[0] + "x" + prVideoRes[1] + "\n";
        displayText += "Video Codec: " + prVideoCodecName + "\n";
        displayText += "Video Format: " + prVideoFormatName + "\n";
        text_titlebar_text.setText(displayText);
    }

5. You can Download the Entire Source Code from here.

Note: the code is tested on Ubuntu 10.04 with Android NDK r5b, but it should work on other platforms (except perhaps Windows, which I’m not sure about).

How to Build FFmpeg for Android

For how to build ffmpeg 2.0.1 with NDK r9, please refer to: http://www.roman10.net/how-to-build-ffmpeg-with-ndk-r9/

ffmpeg is an open-source solution for recording, converting, playing and streaming video and audio. It includes libavcodec, a popular video/audio codec library.

Several popular Android applications are built on FFmpeg, including RockPlayer, MoboPlayer, acrMedia, vitalPlayer, V-Cut Express, etc. If you’re developing a multimedia application that needs a video/audio codec, ffmpeg is a good choice.

This blog covers how to compile ffmpeg for Android; the next blog will cover how to use ffmpeg to build a simple application.

The steps below were done on Ubuntu 10.10, Android NDK r5b, and ffmpeg 0.8. They should work with other versions of the Android NDK and ffmpeg, but minor changes may be required.

0. Download Android NDK r5b

You can download the NDK here. Once downloaded, simply extract the file, and you’ll have a folder named android-ndk-r5b. You’ll need the folder location for configuration later.

1. Download Source Code for FFmpeg

You can download the source code from here. If you want the latest code, you can use git or svn; the link has detailed instructions. For this tutorial, the FFmpeg 0.8 “Love” release is used.

After downloading the source, extract it and you’ll have a folder named ffmpeg-0.8.

2. Build FFmpeg (The script is based on RockPlayer build script)

2.1 Copy and Paste the bash script from here to a text editor, and save it as build_android.sh under ffmpeg-0.8 folder.

Note that the NDK location has to be changed according to your android-ndk-r5b folder location. On my machine, it’s at ~/Desktop/android/, so it’s set as

NDK=~/Desktop/android/android-ndk-r5b
PLATFORM=$NDK/platforms/android-8/arch-arm/
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86

You may also need to adjust PLATFORM based on which version of the SDK you’re using; android-8 corresponds to Android SDK 2.2.

The default configuration in the script disables a lot of components to speed up the build; you can change the configuration to suit your needs. You can also compile for multiple hardware platforms, but here we only enable ARM v7vfpv3 to speed up the build process.

2.2 Make sure the bash script is executable. Go to the ffmpeg-0.8 directory in a terminal, then type the following command,

sudo chmod 755 build_android.sh

2.3 Then execute the script, by typing the following command,

./build_android.sh

The compilation will take a while (several minutes or more, depending on your machine) to finish.

Update for NDK-r6:

For android NDK-r6, the build_android.sh script might not work. You can try the script here.

Note that you may need to create ./android/armv7-a/ folder in the ffmpeg directory yourself. (Thanks to mgg28831 for this).

If you encounter permission denied error, you can try sudo ./build_android.sh.

3. The Output of the Build

Once the script finishes execution, there’ll be a folder called android under ffmpeg-0.8 directory, which contains all the output of the build.

4. To be Continued

Once the library is compiled successfully, the next step is to use it to build Android apps. This is covered in the next blog, How to Build Android Apps Based on FFmpeg By an Example.

Reference:

RockPlayer open source component: http://rockplayer.freecoder.org/tech.html

My ffmpeg Commands List

Side note: first draft on Apr 1 2011. It turns out ffmpeg can be really handy when you want to create some image or video data that works with your implementation. This list is ongoing…

ffmpeg is a powerful open-source video and audio processing tool. It supports many different video/audio formats, and it allows you to record, convert, stream and manipulate audio and video files.

This article collects my favourite ffmpeg commands,

1. Extract Frames from Video

1.1 Extract the first frame from video

ffmpeg -i input_video_file_name -vframes 1 -f image2 output_frame.bmp

ffmpeg also supports other image formats like .png or .jpg.

1.2 Extract all frames from video

ffmpeg -i input_video_file_name -f image2 output_file_name_format.bmp

Note that output_file_name_format.bmp can be, for example, %05d.bmp
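The output pattern is a printf-style format: %05d expands to a zero-padded frame counter, so frame 3 becomes 00003.bmp. A quick sketch of the expansion (a hypothetical helper, not ffmpeg's actual code):

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Expand the printf-style pattern with frame number n into buf,
 * e.g. pattern "%05d.bmp" and n = 3 give "00003.bmp". */
void frame_name(char *buf, size_t bufsize, const char *pattern, int n) {
    snprintf(buf, bufsize, pattern, n);
}

/* Small self-check helper: does pattern/n expand to expected? */
int frame_name_is(const char *pattern, int n, const char *expected) {
    char buf[64];
    frame_name(buf, sizeof buf, pattern, n);
    return strcmp(buf, expected) == 0;
}
```

This is why a 2000-frame video extracted with %05d.bmp yields 00001.bmp through 02000.bmp, sorted correctly by any file browser.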

2. Crop Video

ffmpeg -i input_video_file_name -vf crop=w:h:x:y output_file_name

Note that w and h are the width and height of the cropped output, and (x, y) is the position of its top-left corner in the input frame. vf means video filter; crop is one kind of video filter in ffmpeg.
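A common case is cutting a centered w×h region out of the input, which fixes x and y as below (a small sketch of the arithmetic, not part of ffmpeg):

```c
#include <assert.h>

/* For a centered crop of size w x h from an in_w x in_h frame,
 * the top-left corner passed to the crop filter is: */
int centered_crop_x(int in_w, int w) { return (in_w - w) / 2; }
int centered_crop_y(int in_h, int h) { return (in_h - h) / 2; }
```

For example, cropping a centered 1280x720 region out of a 1920x1080 input gives crop=1280:720:320:180.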

3. Convert Image between Different Formats

ffmpeg -i input_image_file_name.ext1 output_image_file_name.ext2

4. Create Video from Sequences of Images

ffmpeg -r 20 -i input_image_file_name_format.ext video_file_name.ext2

As an example, ffmpeg -r 20 -i %05d.bmp test.avi

5. Convert Video to Different Container Formats

ffmpeg -i input_video_file_name.ext1 output_video_file_name.ext2

Example 1: Convert Video from AVI to FLV

Sometimes ffmpeg cannot figure out everything by itself, and we may need to supply more information. I encountered a case where I needed to specify the output video codec, audio codec and scale.

ffmpeg -i recording_3.avi -f flv -vcodec copy -acodec copy -s 1280x720 recording3.flv

Note that the command above only changes the container format; it specifies the same audio/video codecs (using the “copy” option) as the avi file for the output flv file. This command runs very fast, as it leaves the codecs untouched.

6. Convert Video to Different Codec (Transcoding)

ffmpeg -i input_video_file_name.ext1 -vcodec xxx -r xx -s aaaaxbbbb -aspect xx:xx -b xxxxk output_file_name.ext2

As an example,

ffmpeg -i h1.mp4 -vcodec mpeg4 -r 30 -s 1280x720 -aspect 16:9 -b 10000k h1_1280_720.mp4

This command takes the input h1.mp4 (encoded with the h264 codec in my case), transcodes it to the mpeg4 codec with a frame rate of 30 fps, a resolution of 1280x720, an aspect ratio of 16:9 and a bitrate of 10000 kbit/s, and outputs the result to the h1_1280_720.mp4 file.
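The -b value also lets you estimate the output size before transcoding: video data ≈ bitrate × duration. A quick sketch of the arithmetic (ignoring audio and container overhead):

```c
#include <assert.h>

/* Approximate video-only output size in kilobytes:
 * bitrate_kbps kilobits per second over duration_s seconds,
 * divided by 8 bits per byte. */
long estimated_size_kb(long bitrate_kbps, long duration_s) {
    return bitrate_kbps * duration_s / 8;
}
```

So at 10000 kbit/s, a 60-second clip carries about 75000 kB (roughly 73 MB) of video data, which is worth knowing before picking such a high bitrate on a phone.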