How to Build ffmpeg with NDK r9

This is an updated post for a previous post, where we built ffmpeg 0.8 with Android NDK r5 and r6. This post gives instructions on how to build ffmpeg 2.0.1 with Android NDK r9.

0. Download Android NDK

The latest version of the Android NDK can be downloaded from the Android NDK website. At the time of writing, the newest version is NDK r9. Note that the website provides both current and legacy toolchains. We only need the current toolchain to compile ffmpeg.

After downloading the NDK, simply decompress the archive. Note that we’ll use $NDK to represent the root path of the decompressed NDK.
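For example, assuming the Linux 64-bit package was downloaded (adjust the archive name for your platform),

tar -xjf android-ndk-r9-linux-x86_64.tar.bz2
export NDK=$(pwd)/android-ndk-r9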

1. Download ffmpeg source code

FFmpeg source code can be downloaded from the ffmpeg website. The latest stable release is 2.0.1. Download the source code and decompress it to the $NDK/sources folder. We’ll discuss the reason for doing this later.
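For example (the URL below points to ffmpeg’s release archive at the time of writing; adjust it if the release has moved),

cd $NDK/sources
wget http://ffmpeg.org/releases/ffmpeg-2.0.1.tar.bz2
tar -xjf ffmpeg-2.0.1.tar.bz2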

2. Update configure file

Open the ffmpeg-2.0.1/configure file with a text editor, and locate the following lines.

SLIBNAME_WITH_MAJOR='$(SLIBNAME).$(LIBMAJOR)'
LIB_INSTALL_EXTRA_CMD='$$(RANLIB) "$(LIBDIR)/$(LIBNAME)"'
SLIB_INSTALL_NAME='$(SLIBNAME_WITH_VERSION)'
SLIB_INSTALL_LINKS='$(SLIBNAME_WITH_MAJOR) $(SLIBNAME)'

These settings cause ffmpeg shared libraries to be compiled to libavcodec.so.<version> (e.g. libavcodec.so.55), which is not compatible with the Android build system. Therefore we’ll need to replace the above lines with the following ones.

SLIBNAME_WITH_MAJOR='$(SLIBPREF)$(FULLNAME)-$(LIBMAJOR)$(SLIBSUF)'
LIB_INSTALL_EXTRA_CMD='$$(RANLIB) "$(LIBDIR)/$(LIBNAME)"'
SLIB_INSTALL_NAME='$(SLIBNAME_WITH_MAJOR)'
SLIB_INSTALL_LINKS='$(SLIBNAME)'
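If you prefer to apply the edits non-interactively, the GNU sed commands below perform the same three replacements (a convenience sketch; editing the file by hand works just as well). Run them inside the ffmpeg-2.0.1 folder,

sed -i "s|SLIBNAME_WITH_MAJOR='\$(SLIBNAME)\.\$(LIBMAJOR)'|SLIBNAME_WITH_MAJOR='\$(SLIBPREF)\$(FULLNAME)-\$(LIBMAJOR)\$(SLIBSUF)'|" configure
sed -i "s|SLIB_INSTALL_NAME='\$(SLIBNAME_WITH_VERSION)'|SLIB_INSTALL_NAME='\$(SLIBNAME_WITH_MAJOR)'|" configure
sed -i "s|SLIB_INSTALL_LINKS='\$(SLIBNAME_WITH_MAJOR) \$(SLIBNAME)'|SLIB_INSTALL_LINKS='\$(SLIBNAME)'|" configure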

3. Build ffmpeg

Copy the following text into a text editor and save it as build_android.sh under the ffmpeg-2.0.1 folder.

#!/bin/bash
NDK=$HOME/Desktop/adt/android-ndk-r9
SYSROOT=$NDK/platforms/android-9/arch-arm/
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64
function build_one
{
./configure \
    --prefix=$PREFIX \
    --enable-shared \
    --disable-static \
    --disable-doc \
    --disable-ffmpeg \
    --disable-ffplay \
    --disable-ffprobe \
    --disable-ffserver \
    --disable-avdevice \
    --disable-symver \
    --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
    --target-os=linux \
    --arch=arm \
    --enable-cross-compile \
    --sysroot=$SYSROOT \
    --extra-cflags="-Os -fpic $ADDI_CFLAGS" \
    --extra-ldflags="$ADDI_LDFLAGS" \
    $ADDITIONAL_CONFIGURE_FLAG
make clean
make
make install
}
CPU=arm
PREFIX=$(pwd)/android/$CPU
ADDI_CFLAGS="-marm"
build_one

We disabled the static libraries and enabled the shared libraries. Note that the build script is not optimized for a particular CPU. Refer to the ffmpeg documentation for detailed information about the available configure options.
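For example (an illustrative variation, not part of the script above), a Cortex-A8 device with NEON could be targeted by replacing the last four lines of the script with something like,

CPU=armv7-a
PREFIX=$(pwd)/android/$CPU
ADDI_CFLAGS="-march=armv7-a -mfloat-abi=softfp -mfpu=neon -marm"
ADDITIONAL_CONFIGURE_FLAG="--cpu=cortex-a8 --enable-neon"
build_one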

Once the file is saved, make sure the script is executable with the command below,

chmod +x build_android.sh

Then execute the script with the command,

./build_android.sh

4. Build Output

The build can take a while to finish depending on your computer’s speed. Once it’s done, you should be able to find a folder $NDK/sources/ffmpeg-2.0.1/android, which contains arm/lib and arm/include folders.

The arm/lib folder contains the shared libraries, while the arm/include folder contains the header files for libavcodec, libavformat, libavfilter, libavutil, libswscale, etc.

Note that the arm/lib folder contains both the actual library files (e.g. libavcodec-55.so) and symbolic links to them (e.g. libavcodec.so). We can remove the symbolic links to avoid confusion.
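For example, assuming the output folder from the previous step, the links can be removed with,

cd $NDK/sources/ffmpeg-2.0.1/android/arm/lib
find . -maxdepth 1 -type l -delete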

5. Make ffmpeg Libraries available for Your Projects

Now that we’ve compiled the ffmpeg libraries, we are ready to use them. The Android NDK allows us to reuse a compiled module through the import-module build command.

The reason we built the ffmpeg source code under the $NDK/sources folder is that the NDK build system automatically searches the directories under this path for external modules. To declare the ffmpeg libraries as reusable modules, we’ll need to add a file named $NDK/sources/ffmpeg-2.0.1/android/arm/Android.mk with the following content,

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE := libavcodec
LOCAL_SRC_FILES := lib/libavcodec-55.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavformat
LOCAL_SRC_FILES := lib/libavformat-55.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libswscale
LOCAL_SRC_FILES := lib/libswscale-2.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavutil
LOCAL_SRC_FILES := lib/libavutil-52.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavfilter
LOCAL_SRC_FILES := lib/libavfilter-3.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libswresample
LOCAL_SRC_FILES := lib/libswresample-0.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)

Below is an example of how we can use the libraries in an Android project’s jni/Android.mk file,

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE    := tutorial03
LOCAL_SRC_FILES := tutorial03.c
LOCAL_LDLIBS := -llog -ljnigraphics -lz -landroid
LOCAL_SHARED_LIBRARIES := libavformat libavcodec libswscale libavutil

include $(BUILD_SHARED_LIBRARY)
$(call import-module,ffmpeg-2.0.1/android/arm)

Note that we call import-module with the path relative to $NDK/sources so the build system can locate the reusable ffmpeg libraries.
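If you’d rather keep the compiled ffmpeg outside $NDK/sources, the NDK build system also consults the NDK_MODULE_PATH environment variable when resolving import-module (an alternative setup, not needed for this tutorial’s layout),

export NDK_MODULE_PATH=/path/to/the/folder/containing/your/modules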

For real examples of how to use the ffmpeg libraries in an Android app, please refer to my github repo android-ffmpeg-tutorial.

Effective Color Conversion (YUV->RGB) for Android in Assembly

Please jump to “0. Android NDK” if you want the pure technical stuff; jump to the end of the post to download the sample code if you want to figure out everything yourself.

Recently I have been working on a project that decodes video using ffmpeg and renders it on Android using Bitmap. At first, I tried using sws_scale to do color conversion and scaling, but it was very slow (100~300 ms).

Then I found from reference 1 that ARM assembly code is available to do YUV to RGB/RGBA color conversion. The code downloaded from the website doesn’t compile for Android, so I changed it a little bit and made it work on Android.

The download also contains equivalent C procedures to do the conversion, which also run faster than sws_scale. In case the assembly code doesn’t work for your processor, you can use the C equivalents.

0. Android NDK

In order to do YUV to RGB color conversion in assembly code, you’ll need to have the Android NDK set up. I tested with NDK r5b, but other NDK versions should also work.
Note that you’ll need basic NDK knowledge in order to understand the example given. This includes how to pass bitmaps, strings, and primitive data between Java and C, how to manipulate bitmap objects in C, and basic JNI.

1. How the Sample Program Works
The sample program consists of both Java and native C code. The Java code copies the test.yuv file from the assets folder to /sdcard/test.yuv (main.java), creates a bitmap object (RenderView.java), calls the native code (also RenderView.java; the native method is defined in the jni/test.c file) to do the conversion, and renders the converted bitmap on screen.

The native code (jni/test.c) manipulates the bitmap buffer, opens the test.yuv file to get the YUV bytes, and calls the assembly procedure (defined in the jni/yuv2rgb/yuv4202rgb8888.s file) to do the yuv420-to-rgb8888 conversion.

The assembly procedure is modified based on the code from reference 1. I don’t understand it fully yet. 🙁

2. How to Compile
I’ve created the Android.mk file, so you can just go to the jni folder and type “ndk-build” to build the native library.
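For example (assuming the NDK root folder is on your PATH; otherwise use its full path, and the sample path below is a placeholder),

cd <path-to-the-sample-project>/jni
ndk-build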

3. Who Might Need this Code
The code contains the color conversion and also the rendering on Android. If you’re developing a video player using ffmpeg, or a video game, you may want to handle color conversion and rendering yourself, in which case you might find this code useful.

4. Final Note
The sample code only tests yuv420 to rgb8888, but the download actually contains a lot of other assembly procedures. You’ll need to adapt those procedures yourself and add them to Android.mk, but it should be straightforward, and you can use the yuv4202rgb8888.s file as a reference.

When you call the assembly procedure, you might need to swap the order of u and v if you find the colors are weird after conversion.

Note that it is the user’s responsibility to ensure that the techniques used are patent free before using the code.

5. Download
You can download the entire source code package here or from github here.

References:
1. YUV2RGB: http://wss.co.uk/pinknoise/yuv2rgb/

How to Port ffmpeg (the Program) to Android–Ideas and Thoughts

Why ffmpeg for Android and The App

I was working on a project that requires transcoding video files and then playing them on an Android device. The definitive tool for this is ffmpeg. I could do it on the desktop and transfer the files to Android, but why not just transcode on Android itself?

Then I searched for ffmpeg and transcoding tools on Android, but could not find any. Since I had built ffmpeg libraries for Android before, I believed it was also doable to port the ffmpeg command-line utility to Android. So I started doing it.

Below is the screen capture of the Android app (I released it as ffmpeg for Android Beta),


Figure 1. Screenshots of ffmpeg for Android

Currently it only supports the armv7 processor, which is what my Nexus One device uses.

How did I Port it?

The main design goal was to change as few lines of the ffmpeg.c file as possible. To meet this goal, I decoupled the UI and the ffmpeg processing into two separate processes.

Running ffmpeg in a separate process has a few advantages,

  • ffmpeg makes heavy use of the exit function, which terminates the calling process. Running ffmpeg in a separate process ensures that ffmpeg’s termination won’t affect the front-end UI.
  • ffmpeg is processing-intensive; running it in a separate process does not affect the responsiveness of the front-end UI.

There’s an inter-process communication (IPC) problem to solve so the two processes can talk to each other: the UI must be able to start and stop ffmpeg, and ffmpeg must send its output messages to the UI.

From UI to ffmpeg is simple: I implemented a service that calls ffmpeg, and we start the service using an Intent. The stop command works the same way.

From ffmpeg to UI, there are a few choices: pipes, sockets, etc. But since it requires only a very simple one-way communication channel (ffmpeg writes, the UI reads), I just use text files. ffmpeg dumps all output to text files, and the UI reads them every second or two (you could also use a file observer and update the UI only when the text files change).
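As a rough sketch of the channel in shell terms (hypothetical file names; in the app, the writer is the ffmpeg service process and the reader is the UI process),

# writer: ffmpeg dumps all of its console output into a text file
ffmpeg -i input.mp4 output.3gp > /sdcard/ffmpeg_out.txt 2>&1 &
# reader: poll the file every two seconds and display the latest lines
while sleep 2; do tail -n 5 /sdcard/ffmpeg_out.txt; done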

What are the Changes for ffmpeg.c

With the design above, the changes to ffmpeg.c are simple.

  • Change the main(int argc, char **argv) function prototype to a JNI interface prototype that the Android Java code can call.
  • Redirect the output from stdout to the text files we created.
  • Comment out a few lines that cause compilation errors.

A Little More Thoughts

I’m thinking this method can be used to port lots of other applications to Android too: two processes, running the background processing utility as a service, changing the main method to a native interface so Java code can call it, using intents and text files for communication, etc.

If you would like to try out this app, go to https://market.android.com/developer?pub=roman10. Note that it’s for armv7 processors only.

How to Build Android Applications Based on FFmpeg by An Example

This is a follow-up post to the previous blog, How to Build FFmpeg for Android. You can read the previous tutorial first, or refer back to it when you feel it’s necessary.

This blog covers how to build a simple Android app based on the FFmpeg library. The app detects the input video file’s resolution and codec information through the interfaces provided by FFmpeg.

Below is a screenshot of the app,


Figure 1. Screenshot of the sample Android app based on FFmpeg

0. Create a new Android Project FFmpegTest.

When you’re asked to select the target platform, select 2.2, as it’s the platform used in the previous blog. But you’re free to change it.

Also create a folder named jni under the root directory FFmpegTest of this project.

1. Download the ffmpeg Source Code and Extract it to the jni Folder

Follow the previous blog How to Build FFmpeg for Android to build the library.

2. Write the Native Code that Uses FFmpeg’s libavcodec Library.

You can copy and paste the code below and save it as ffmpeg-test-jni.c under the FFmpegTest/jni/ directory. Note that the code below is not complete; you can download the entire code at the end of the post.

/*headers needed by the snippet below; see the complete source at the end of the post*/
#include <jni.h>
#include <android/log.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
/*note: depending on the ffmpeg version, URLProtocol may need an extra header; the full source covers this*/

/*for android logs*/
#define LOG_TAG "FFmpegTest"
#define LOG_LEVEL 10
#define LOGI(level, ...) if (level <= LOG_LEVEL) {__android_log_print(ANDROID_LOG_INFO, LOG_TAG, __VA_ARGS__);}
#define LOGE(level, ...) if (level <= LOG_LEVEL) {__android_log_print(ANDROID_LOG_ERROR, LOG_TAG, __VA_ARGS__);}

char *gFileName;          //the file name of the video
AVFormatContext *gFormatCtx;
int gVideoStreamIndex;    //video stream index
AVCodecContext *gVideoCodecCtx;

static void get_video_info(char *prFilename);

static void get_video_info(char *prFilename) {
    AVCodec *lVideoCodec;
    int lError;
    /*register the codec*/
    extern AVCodec ff_h264_decoder;
    avcodec_register(&ff_h264_decoder);
    /*register the demuxer*/
    extern AVInputFormat ff_mov_demuxer;
    av_register_input_format(&ff_mov_demuxer);
    /*register the protocol*/
    extern URLProtocol ff_file_protocol;
    av_register_protocol2(&ff_file_protocol, sizeof(ff_file_protocol));
    /*open the video file*/
    if ((lError = av_open_input_file(&gFormatCtx, gFileName, NULL, 0, NULL)) != 0) {
        LOGE(1, "Error open video file: %d", lError);
        return;    //open file failed
    }
    /*retrieve stream information*/
    if ((lError = av_find_stream_info(gFormatCtx)) < 0) {
        LOGE(1, "Error find stream information: %d", lError);
        return;
    }
    /*find the video stream and its decoder*/
    gVideoStreamIndex = av_find_best_stream(gFormatCtx, AVMEDIA_TYPE_VIDEO, -1, -1, &lVideoCodec, 0);
    if (gVideoStreamIndex == AVERROR_STREAM_NOT_FOUND) {
        LOGE(1, "Error: cannot find a video stream");
        return;
    } else {
        LOGI(10, "video codec: %s", lVideoCodec->name);
    }
    if (gVideoStreamIndex == AVERROR_DECODER_NOT_FOUND) {
        LOGE(1, "Error: video stream found, but no decoder is found!");
        return;
    }
    /*open the codec*/
    gVideoCodecCtx = gFormatCtx->streams[gVideoStreamIndex]->codec;
    LOGI(10, "open codec: (%d, %d)", gVideoCodecCtx->height, gVideoCodecCtx->width);
    if (avcodec_open(gVideoCodecCtx, lVideoCodec) < 0) {
        LOGE(1, "Error: cannot open the video codec!");
        return;
    }
}

JNIEXPORT void JNICALL Java_roman10_ffmpegTest_VideoBrowser_naInit(JNIEnv *pEnv, jobject pObj, jstring pFileName) {
    /*get the video file name*/
    gFileName = (char *)(*pEnv)->GetStringUTFChars(pEnv, pFileName, NULL);
    if (gFileName == NULL) {
        LOGE(1, "Error: cannot get the video file name!");
        return;
    }
    LOGI(10, "video file name is %s", gFileName);
    get_video_info(gFileName);
}

JNIEXPORT jstring JNICALL Java_roman10_ffmpegTest_VideoBrowser_naGetVideoCodecName(JNIEnv *pEnv, jobject pObj) {
    const char *lCodecName = gVideoCodecCtx->codec->name;
    return (*pEnv)->NewStringUTF(pEnv, lCodecName);
}

If you’re not familiar with JNI, you may need to read about JNI first in order to understand the code. But this is not the focus of this tutorial and hence not covered.

3. Build the Native Code.

Create a file named Android.mk under the jni directory and copy and paste the content below,

LOCAL_PATH := $(call my-dir)

#declare the prebuilt library
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-prebuilt
LOCAL_SRC_FILES := ffmpeg-0.8/android/armv7-a/libffmpeg.so
LOCAL_EXPORT_C_INCLUDES := ffmpeg-0.8/android/armv7-a/include
LOCAL_EXPORT_LDLIBS := ffmpeg-0.8/android/armv7-a/libffmpeg.so
LOCAL_PRELINK_MODULE := true
include $(PREBUILT_SHARED_LIBRARY)

#the ffmpeg-test-jni library
include $(CLEAR_VARS)
LOCAL_ALLOW_UNDEFINED_SYMBOLS=false
LOCAL_MODULE := ffmpeg-test-jni
LOCAL_SRC_FILES := ffmpeg-test-jni.c
LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg-0.8/android/armv7-a/include
LOCAL_SHARED_LIBRARIES := ffmpeg-prebuilt
LOCAL_LDLIBS    := -llog -ljnigraphics -lz -lm $(LOCAL_PATH)/ffmpeg-0.8/android/armv7-a/libffmpeg.so
include $(BUILD_SHARED_LIBRARY)

Create another file named Application.mk under the jni directory and copy and paste the content below,

# The ARMv7 ABI is significantly faster due to the use of the hardware FPU
APP_ABI := armeabi-v7a
APP_PLATFORM := android-8

For more information about what Android.mk and Application.mk do, you can refer to the documentation that comes with the Android NDK.

Your folder structure should look like the one below after you finish all the steps above,


Figure 2. Folder structure of project FFmpegTest and jni
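With the two .mk files in place, you can build the native code by running the ndk-build script from the project root (using the NDK location from the previous post as an example; substitute your own),

cd FFmpegTest
~/Desktop/android/android-ndk-r5b/ndk-build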

4. Call the Native Code from FFmpegTest Java Code.

You’ll need to load the libraries and declare the JNI functions. The code below is extracted from the app source code, which is provided for download at the end of this tutorial.

/*this part communicates with native code through jni (java native interface)*/
    //load the native libraries
    static {
        System.loadLibrary("ffmpeg");
        System.loadLibrary("ffmpeg-test-jni");
    }

    //declare the jni functions
    private static native void naInit(String _videoFileName);
    private static native int[] naGetVideoResolution();
    private static native String naGetVideoCodecName();
    private static native String naGetVideoFormatName();
    private static native void naClose();

    private void showVideoInfo(final File _file) {
        String videoFilename = _file.getAbsolutePath();
        naInit(videoFilename);
        int[] prVideoRes = naGetVideoResolution();
        String prVideoCodecName = naGetVideoCodecName();
        String prVideoFormatName = naGetVideoFormatName();
        naClose();
        String displayText = "Video: " + videoFilename + "\n";
        displayText += "Video Resolution: " + prVideoRes[0] + "x" + prVideoRes[1] + "\n";
        displayText += "Video Codec: " + prVideoCodecName + "\n";
        displayText += "Video Format: " + prVideoFormatName + "\n";
        text_titlebar_text.setText(displayText);
    }

5. You can Download the Entire Source Code from here.

Note: the code is tested on Ubuntu 10.04 and Android NDK r5b, but it should work on other platforms (except for Windows, which I’m not sure about).

How to Build FFmpeg for Android

For how to build ffmpeg 2.0.1 with NDK r9, please refer to: http://www.roman10.net/how-to-build-ffmpeg-with-ndk-r9/

ffmpeg is an open-source platform for recording, converting, playing and streaming video and audio. It includes libavcodec, a popular video/audio codec library.

Several popular Android applications are built based on FFmpeg, including RockPlayer, MoboPlayer, acrMedia, vitalPlayer, V-Cut Express, etc. If you’re developing a multimedia application that needs a video/audio codec, ffmpeg is a good choice.

This blog covers how to compile ffmpeg for Android, and the next blog will cover how to use ffmpeg to build a simple application.

The steps below were done on Ubuntu 10.10, Android NDK r5b, and ffmpeg 0.8. They should work with other versions of the Android NDK and ffmpeg, but minor changes may be required.

0. Download Android NDK r5b

You can download the NDK here. Once downloaded, simply extract the file, and you’ll have a folder named android-ndk-r5b. You’ll need the folder location for configuration later.

1. Download Source Code for FFmpeg

You can download the source code from here. If you want the latest code, you can use git or svn; the link has detailed instructions. But for this tutorial, the FFmpeg 0.8 “Love” release is used.

After downloading the source, extract it and you’ll have a folder named ffmpeg-0.8.

2. Build FFmpeg (The script is based on RockPlayer build script)

2.1 Copy and paste the bash script from here into a text editor, and save it as build_android.sh under the ffmpeg-0.8 folder.

Note that the NDK location has to be changed according to your android-ndk-r5b folder location. On my machine, it’s at ~/Desktop/android/, so it’s set as

NDK=~/Desktop/android/android-ndk-r5b
PLATFORM=$NDK/platforms/android-8/arch-arm/
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86

You may also need to adjust PLATFORM based on which version of the SDK you’re using; android-8 corresponds to Android SDK 2.2.

The default configuration in the script disables a lot of stuff to speed up the build; you can change the configuration to suit your needs. Besides, you can compile for multiple hardware platforms, but here we only enable arm v7vfpv3 to speed up the build process.
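For reference, the arm v7vfpv3 target in scripts of this kind is typically selected with flags along the following lines (an excerpt-style sketch; see the actual script linked above for the full configuration),

PREFIX=./android/armv7-a
OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=armv7-a"

These are standard GCC ARM options passed to ffmpeg’s configure via --extra-cflags.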

2.2 Make sure the bash script is executable. Go to the ffmpeg-0.8 directory in a terminal, then type the following command,

sudo chmod 755 build_android.sh

2.3 Then execute the script by typing the following command,

./build_android.sh

The compilation will take a while (several minutes or more, depending on your machine) to finish.

Update for NDK-r6:

For Android NDK r6, the build_android.sh script might not work. You can try the script here.

Note that you may need to create the ./android/armv7-a/ folder in the ffmpeg directory yourself (thanks to mgg28831 for this).

If you encounter a permission denied error, you can try sudo ./build_android.sh.

3. The Output of the Build

Once the script finishes execution, there’ll be a folder called android under the ffmpeg-0.8 directory, which contains all the output of the build.

4. To be Continued

Once the library is compiled successfully, the next step is to use it to build Android apps. This is covered in the next blog, How to Build Android Apps Based on FFmpeg by an Example.

Reference:

RockPlayer open source component: http://rockplayer.freecoder.org/tech.html