How to Build Android Applications Based on FFmpeg by An Example

This is a follow-up to the previous post, How to Build FFmpeg for Android. You can read that tutorial first, or refer back to it whenever necessary.

This blog covers how to build a simple Android app based on the FFmpeg library. The app detects the input video file’s resolution and codec information through the interfaces provided by FFmpeg.

Below is a screenshot of the app:

Figure 1. Screenshot of the sample Android app based on FFmpeg

0. Create a new Android Project FFmpegTest.

When you’re asked to select the target platform, select 2.2, as it’s the platform used in the previous blog. But you’re free to change it.

Also create a folder named jni under the project’s root directory, FFmpegTest.

1. Download the ffmpeg Source Code and Extract it to jni Folder

Follow the previous blog How to Build FFmpeg for Android to build the library.
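If you want a concrete starting point, the sequence looks roughly like the following; the exact tarball name and URL are assumptions here, so substitute whatever FFmpeg version you built in the previous tutorial:

cd FFmpegTest/jni
wget http://ffmpeg.org/releases/ffmpeg-0.8.tar.bz2
tar xjf ffmpeg-0.8.tar.bz2
cd ffmpeg-0.8 && ./build_android.sh    # the build script from the previous post

After a successful build, jni/ffmpeg-0.8/android/armv7-a/ should contain libffmpeg.so and an include/ directory with the FFmpeg headers.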

2. Write the Native Code that Uses FFmpeg’s libavcodec Library.

You can copy and paste the code below and save it as ffmpeg-test-jni.c under the FFmpegTest/jni/ directory. Note that the code below is not complete; you can download the entire source code at the end of the post.

/*headers needed by the code below (the downloadable source contains the full list)*/
#include <jni.h>
#include <android/log.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/*for android logs*/
#define LOG_TAG "FFmpegTest"
#define LOG_LEVEL 10
#define LOGI(level, ...) if (level <= LOG_LEVEL) {__android_log_print(ANDROID_LOG_INFO, LOG_TAG, __VA_ARGS__);}
#define LOGE(level, ...) if (level <= LOG_LEVEL) {__android_log_print(ANDROID_LOG_ERROR, LOG_TAG, __VA_ARGS__);}

char *gFileName;          //the file name of the video
AVFormatContext *gFormatCtx;
int gVideoStreamIndex;    //video stream index
AVCodecContext *gVideoCodecCtx;

static void get_video_info(char *prFilename);

static void get_video_info(char *prFilename) {
    AVCodec *lVideoCodec;
    int lError;
    /*register the codec*/
    extern AVCodec ff_h264_decoder;
    avcodec_register(&ff_h264_decoder);
    /*register the demuxer*/
    extern AVInputFormat ff_mov_demuxer;
    av_register_input_format(&ff_mov_demuxer);
    /*register the protocol*/
    extern URLProtocol ff_file_protocol;
    av_register_protocol2(&ff_file_protocol, sizeof(ff_file_protocol));
    /*open the video file*/
    if ((lError = av_open_input_file(&gFormatCtx, gFileName, NULL, 0, NULL)) != 0) {
        LOGE(1, "Error open video file: %d", lError);
        return;    //open file failed
    }
    /*retrieve stream information*/
    if ((lError = av_find_stream_info(gFormatCtx)) < 0) {
        LOGE(1, "Error find stream information: %d", lError);
        return;
    }
    /*find the video stream and its decoder*/
    gVideoStreamIndex = av_find_best_stream(gFormatCtx, AVMEDIA_TYPE_VIDEO, -1, -1, &lVideoCodec, 0);
    if (gVideoStreamIndex == AVERROR_STREAM_NOT_FOUND) {
        LOGE(1, "Error: cannot find a video stream");
        return;
    } else {
        LOGI(10, "video codec: %s", lVideoCodec->name);
    }
    if (gVideoStreamIndex == AVERROR_DECODER_NOT_FOUND) {
        LOGE(1, "Error: video stream found, but no decoder is found!");
        return;
    }
    /*open the codec*/
    gVideoCodecCtx = gFormatCtx->streams[gVideoStreamIndex]->codec;
    LOGI(10, "open codec: (%d, %d)", gVideoCodecCtx->height, gVideoCodecCtx->width);
    if (avcodec_open(gVideoCodecCtx, lVideoCodec) < 0) {
        LOGE(1, "Error: cannot open the video codec!");
        return;
    }
}

JNIEXPORT void JNICALL Java_roman10_ffmpegTest_VideoBrowser_naInit(JNIEnv *pEnv, jobject pObj, jstring pFileName) {
    int l_mbH, l_mbW;
    /*get the video file name*/
    gFileName = (char *)(*pEnv)->GetStringUTFChars(pEnv, pFileName, NULL);
    if (gFileName == NULL) {
        LOGE(1, "Error: cannot get the video file name!");
        return;
    }
    LOGI(10, "video file name is %s", gFileName);
    get_video_info(gFileName);
}

JNIEXPORT jstring JNICALL Java_roman10_ffmpegTest_VideoBrowser_naGetVideoCodecName(JNIEnv *pEnv, jobject pObj) {
    const char *lCodecName = gVideoCodecCtx->codec->name;
    return (*pEnv)->NewStringUTF(pEnv, lCodecName);
}

If you’re not familiar with JNI (Java Native Interface), you may need to read up on it first in order to understand the code. JNI itself is not the focus of this tutorial, hence it is not covered here.
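For completeness, below is a minimal sketch of the remaining native functions that the Java code in step 4 declares (naGetVideoResolution, naGetVideoFormatName and naClose). It is an illustration written against the same FFmpeg 0.8 API used above, not the exact code from the download:

JNIEXPORT jintArray JNICALL Java_roman10_ffmpegTest_VideoBrowser_naGetVideoResolution(JNIEnv *pEnv, jobject pObj) {
    jint lVideoRes[2];
    jintArray lRes = (*pEnv)->NewIntArray(pEnv, 2);
    /*width first, then height, matching the "WxH" display on the Java side*/
    lVideoRes[0] = gVideoCodecCtx->width;
    lVideoRes[1] = gVideoCodecCtx->height;
    (*pEnv)->SetIntArrayRegion(pEnv, lRes, 0, 2, lVideoRes);
    return lRes;
}

JNIEXPORT jstring JNICALL Java_roman10_ffmpegTest_VideoBrowser_naGetVideoFormatName(JNIEnv *pEnv, jobject pObj) {
    /*short name of the demuxer, e.g. "mov,mp4,m4a,3gp,3g2,mj2"*/
    return (*pEnv)->NewStringUTF(pEnv, gFormatCtx->iformat->name);
}

JNIEXPORT void JNICALL Java_roman10_ffmpegTest_VideoBrowser_naClose(JNIEnv *pEnv, jobject pObj) {
    /*release the codec and format context opened in get_video_info()*/
    avcodec_close(gVideoCodecCtx);
    av_close_input_file(gFormatCtx);    /*FFmpeg 0.8 API; later releases use avformat_close_input()*/
}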

3. Build the Native Code.

Create a file named Android.mk under the jni directory and copy and paste the content below:

LOCAL_PATH := $(call my-dir)

#declare the prebuilt library
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-prebuilt
LOCAL_SRC_FILES := ffmpeg-0.8/android/armv7-a/libffmpeg.so
LOCAL_EXPORT_C_INCLUDES := ffmpeg-0.8/android/armv7-a/include
LOCAL_EXPORT_LDLIBS := ffmpeg-0.8/android/armv7-a/libffmpeg.so
LOCAL_PRELINK_MODULE := true
include $(PREBUILT_SHARED_LIBRARY)

#the ffmpeg-test-jni library
include $(CLEAR_VARS)
LOCAL_ALLOW_UNDEFINED_SYMBOLS := false
LOCAL_MODULE := ffmpeg-test-jni
LOCAL_SRC_FILES := ffmpeg-test-jni.c
LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg-0.8/android/armv7-a/include
LOCAL_SHARED_LIBRARIES := ffmpeg-prebuilt
LOCAL_LDLIBS    := -llog -ljnigraphics -lz -lm $(LOCAL_PATH)/ffmpeg-0.8/android/armv7-a/libffmpeg.so
include $(BUILD_SHARED_LIBRARY)

Create another file named Application.mk under the jni directory and copy and paste the content below:

# ARMv7-A is significantly faster due to the use of the hardware FPU
APP_ABI := armeabi-v7a
APP_PLATFORM := android-8

For more information about what Android.mk and Application.mk do, you can refer to the documentation that comes with the Android NDK.
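Android.mk and Application.mk only describe the build; to actually compile the native code you still need to run the ndk-build script that ships with the NDK, from the project’s root directory. The paths below are placeholders, so adjust them to your setup:

cd /path/to/FFmpegTest
/path/to/android-ndk/ndk-build

If everything is configured correctly, this copies both libffmpeg.so and libffmpeg-test-jni.so into FFmpegTest/libs/armeabi-v7a/, which is where the Android packaging step picks them up.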

Your folder structure should look something like the one below after you finish all the steps above:

Figure 2. Folder structure of the FFmpegTest project and its jni directory
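In case the figure does not display, the expected layout is roughly the following (assuming FFmpeg 0.8 was extracted into jni and built as described above):

FFmpegTest/
    AndroidManifest.xml
    src/roman10/ffmpegTest/...            (the Java sources)
    jni/
        Android.mk
        Application.mk
        ffmpeg-test-jni.c
        ffmpeg-0.8/
            android/armv7-a/libffmpeg.so
            android/armv7-a/include/...
    libs/armeabi-v7a/                     (populated by ndk-build)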

4. Call the Native Code from FFmpegTest Java Code.

You’ll need to load the libraries and declare the JNI functions. The code below is extracted from the app source code, which is provided for download at the end of this tutorial.

/*this part communicates with native code through JNI (Java Native Interface)*/
    //load the native libraries
    static {
        System.loadLibrary("ffmpeg");
        System.loadLibrary("ffmpeg-test-jni");
    }

    //declare the jni functions
    private static native void naInit(String _videoFileName);
    private static native int[] naGetVideoResolution();
    private static native String naGetVideoCodecName();
    private static native String naGetVideoFormatName();
    private static native void naClose();

    private void showVideoInfo(final File _file) {
        String videoFilename = _file.getAbsolutePath();
        naInit(videoFilename);
        int[] prVideoRes = naGetVideoResolution();
        String prVideoCodecName = naGetVideoCodecName();
        String prVideoFormatName = naGetVideoFormatName();
        naClose();
        String displayText = "Video: " + videoFilename + "\n";
        displayText += "Video Resolution: " + prVideoRes[0] + "x" + prVideoRes[1] + "\n";
        displayText += "Video Codec: " + prVideoCodecName + "\n";
        displayText += "Video Format: " + prVideoFormatName + "\n";
        text_titlebar_text.setText(displayText);
    }
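One detail worth highlighting, because it is a common source of UnsatisfiedLinkError (see the comments below): the native function names in ffmpeg-test-jni.c encode the Java package and class. For example, the declaration

    private static native void naInit(String _videoFileName);

inside class VideoBrowser of package roman10.ffmpegTest must be matched by a C function named Java_roman10_ffmpegTest_VideoBrowser_naInit. If you rename the package or the class on the Java side but leave the native code untouched, System.loadLibrary() will still succeed, but the first call to a native method will fail at runtime.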

5. You can Download the Entire Source Code from here.

Note: the code is tested on Ubuntu 10.04 with Android NDK r5b, but it should work on other platforms (except possibly Windows, which I’m not sure about).

Comments on “How to Build Android Applications Based on FFmpeg by An Example”

    1. No, the example is tested on a Nexus One mobile device. But it should work on other Android devices with no/slight modification.

  1. I have managed to compile ffmpeg and to make your example work. However, I want to extract frames from a video (using avcodec_video_decode) and I can’t, because this function always returns a negative value. I have tried it with all types of video formats and it keeps failing. Any ideas?

    Any help would be much appreciated.

    1. Check out the build_android.sh script. The script provided in the sample only configures the basic functions. You may need to change it.

  2. I have a question; I am new to JNI,
    and I could use some advice as to how to design my native code.
    I want to call the main function from the ffmpeg.c file.
    My idea was to write the native method in the ffmpeg.c file itself to avoid the problem that the file does not have a header file. But from this example I feel that maybe it should be included in a separate file to create the module. Or will it be OK to set the local source file to ffmpeg.c, or would I end up with linking errors or something?
    What do you think is the best way to approach this?

  3. My English is not good, but I hope you can help me.
    I hope you can understand.
    I get a compilation error when I use “ndk-build”.

    This is the error message:
    Prebuilt : libffmpeg.so libs/armeabi-v7a/libffmpeg.so
    Compile thumb : ffmpeg-test-jni <= ffmpeg-test-jni.c
    F:/liuhao/android/resource/ffmpegtest/jni/ffmpeg-test-jni.c: In function 'get_video_info':
    F:/liuhao/android/resource/ffmpegtest/jni/ffmpeg-test-jni.c:74: warning: 'av_register_protocol2' is deprecated (declared at F:/liuhao/android/resource/ffmpegtest/jni/ffmpeg-0.8/android/armv7-a/include/libavformat/avio.h:212)
    F:/liuhao/android/resource/ffmpegtest/jni/ffmpeg-test-jni.c:76: warning: 'av_open_input_file' is deprecated (declared at F:/liuhao/android/resource/ffmpegtest/jni/ffmpeg-0.8/android/armv7-a/include/libavformat/avformat.h:1083)
    F:/liuhao/android/resource/ffmpegtest/jni/ffmpeg-test-jni.c: In function 'Java_roman10_ffmpegTest_VideoBrowser_naGetVideoCodecName':
    F:/liuhao/android/resource/ffmpegtest/jni/ffmpeg-test-jni.c:132: warning: initialization discards qualifiers from pointer target type
    F:/liuhao/android/resource/ffmpegtest/jni/ffmpeg-test-jni.c: In function 'Java_roman10_ffmpegTest_VideoBrowser_naGetVideoFormatName':
    F:/liuhao/android/resource/ffmpegtest/jni/ffmpeg-test-jni.c:137: warning: initialization discards qualifiers from pointer target type
    SharedLibrary : libffmpeg-test-jni.so
    arm-linux-androideabi-g++.exe: /cygdrive/f/liuhao/android/resource/ffmpegtest/jni/ffmpeg-0.8/android/armv7-a/libffmpeg.so: No such file or directory
    make: *** [/cygdrive/f/liuhao/android/resource/ffmpegtest/obj/local/armeabi-v7a/libffmpeg-test-jni.so] Error 1

        1. Please confirm first whether the attached library code is right or wrong; I am using that one.

          If it is right, then after using this lib will I be able to play an mp4 file with any video resolution on a surface?

  4. Hi Roman,

    I have compiled your example and got libffmpeg.so, but in your source code you load two libraries:
    System.loadLibrary("ffmpeg");
    System.loadLibrary("ffmpeg-test-jni");
    I tried to test your example without the ffmpeg-test-jni library and it doesn’t work.
    Is there any problem with ffmpeg-test-jni? Do I need it, and how do I get this library (e.g. libffmpeg-test-jni)?
    Sorry if my question is stupid.

    1. The ffmpeg-test-jni is the library that you’ll get when you execute ndk-build. It’s the binary of the native code in the example.
      If you follow the instructions closely, I think you’ll get it. :)

  5. I am trying to run your sample code but am getting an UnsatisfiedLinkError: cannot find libffmpeg.so, although libffmpeg.so is generated under the android/armv7-a dir.

        1. I got an UnsatisfiedLinkError too, but in my case I had altered the package names in the Java classes but forgot to change them in the native code. The runtime complained about not being able to find the naInit method. Just in case someone else has the same problem.

        1. Even I am getting the same UnsatisfiedLinkError: cannot find libffmpeg.so when executing the source code. How do I solve it?

          1. I faced the same problem and finally found out the reason.

            To solve it you need to load the ffmpeg library first in the static { } block, before loading the other library,

            like this:

            static {
            System.loadLibrary("ffmpeg");
            System.loadLibrary("native-jni");
            }

  6. hi,
    The step by step explanation is just awesome.
    But I am facing a small issue:
    I keep getting “Installation failed due to invalid apk file”

    Is there anything I am missing here?

  7. I want to create an application that sends video as live RTP packets from the Android camera to a PC using FFmpeg, not from a file. Can you point out the steps? Thanks a lot.

    1. This involves low-level operation with the Android camera. You’ll need to grab the frames produced by the camera, packetize them in RTP format, and stream them to the PC. Basically an RTP server interfacing with the camera.
      FFmpeg has some support for RTP packetization, but how to grab the frames from the camera, or whether Android allows you to do it or not, I’m not sure.
      Hope it helps.

      1. I’m still wondering if there are some issues with the codec or encoding. I’ve tried other libraries such as SipDroid to simply pack the camera frames into RTP format, and then send them frame by frame to my PC’s address. Wireshark on my PC could catch the incoming packets well, but a player such as VLC couldn’t play them.

        1. Does Wireshark decode the RTP packets correctly? What are the errors given by VLC? RTP packetization is not easy to get right if you’re doing it without library support.

          1. It seems like the packets were decoded correctly, since Wireshark shows all field values in each packet and there’s no malformed-packet message found.

            VLC didn’t display any error log, yet it played nothing on its screen.

  8. “The code below is extracted from the app source code, which is provided for download at the end of this tutorial.” – I cannot find it anywhere

    1. Hi, Asdasd,

      Thank you for reminding me of this. The post was corrupted due to a WordPress plug-in update. I’ve fixed the issue; you can download the code at the end of the post now.

  9. I want to do live streaming using a camera feed which produces YUV data. How can I encode this data using H264 and send it across the network using RTP? Please suggest.

  10. Hey,

    So I’m hitting the UnsatisfiedLinkError as well. Can someone be kind enough to reply with a solution?

    P.S: Thanks for the wonderful tutorial btw. :)

    1. One situation I encountered is that the C JNI interface name doesn’t follow the naming rules. I suggest you read up on some material about JNI.

      But there might be other reasons.

  11. This is a fantastic tutorial!

    I had to make a few tweaks to get this to work on OSX Lion and android-ndk-r6b

    build_android_r6.sh:
    Add an argument to -i, otherwise it complains that c is not a command, e.g.
    sed -i 's/HAVE_LRINT 0/HAVE_LRINT 1/g' config.h
    ->
    sed -i .back1 's/HAVE_LRINT 0/HAVE_LRINT 1/g' config.h

    Android.mk
    the C_INCLUDES are incorrect:

    LOCAL_EXPORT_C_INCLUDES := ffmpeg-0.8/android/armv7-a/include
    ->
    LOCAL_EXPORT_C_INCLUDES := ffmpeg-0.8

    and

    LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg-0.8/android/armv7-a/include
    ->
    LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg-0.8

    Now build the jni bridge (i.e. missing part of step 3 above, Build the native code)

    cd [path to project]/FFmpegTest/jni
    [path to ndk]/ndk-build

    This compiles the file libffmpeg-test-jni.so to the …/FFmpegTest/libs directory. (libffmpeg.so is already in libs.)

    Finally, edit the Eclipse FFMpegTest project since the new .so libraries are not known to the project, and we need to add the libs directory.

    In the project browser, right-click the FFMpegTest project and select New Folder. This opens a ‘New Folder’ dialog. Click ‘Advanced’ and check the box ‘Link to folder in the file system’. Browse to the FFMpegTest/libs directory and hit ‘Open’ and ‘Finish’.

    This worked for me and the app works on a Nexus One.

  12. This is a wonderful tutorial..

    I followed the step by step procedure and got it built and
    the application also ran without any hitch.

    I just had a few queries:

    1. If I were to build an app with this ffmpeg port, how do I make it compatible with all the phones that support Android 2.2 and above?

    2. Can I use this port to convert audio and video on the phone? What functions would I have to export through JNI?

    3. Do I need all the contents of the ffmpeg folder in my project or just the shared library (.so) files? Just to reduce the size of the app.

    1. 1. You’ll need to build the library for different CPUs, check out android NDK doc, it has some comments on it.
      2. Yes, please refer to another article “how to port ffmpeg the program to android — ideas and thoughts”
      3. No. You only need the so files.

  13. I fail to get this app running and really would like to get started…

    I always get an UnsatisfiedLinkError upon running the application. Yet, I did compile the ffmpeg code so I got the .so file inside ffmpeg-0.8/android/armv7-a/ (libffmpeg.so). Also, ndk-build seems to run fine with your Android.mk file. It copies 2 .so files to the libs folder, like it should.

    Upon running the app, I get the error. I’m desperate, I can’t get it to work. I don’t understand how I could fix it, I already read the entire ANDROID-MK.html page.

    I am using NDK7 and SDK10 (2.3.3). Any help would be appreciated :)

  14. I’m sorry, but I tried it again with NDK5b and SDK8 (2.2). Again I get the UnsatisfiedLinkError. I really don’t understand why; both .so files are being generated into libs/armeabi-v7a.

    I also tried to change LOCAL_MODULE to “ffmpeg” instead of “ffmpeg-prebuilt”, because you refer to “ffmpeg” when you call System.loadLibrary(“ffmpeg”). But this didn’t solve the problem.

    I really, really don’t understand, as I previously managed to build my own JNI test app, which simply calls a C method. That does work… But this tutorial doesn’t.

    It would be great if I got it to work, as I want to continue to work on my app! Thanks in advance.

  15. OK, I finally managed to get it working! The information below may be useful for other users too.

    I got it working by simply setting “APP_ABI := armeabi” instead of “armeabi-v7a” in Application.mk. Of course I had the ffmpeg source code compiled for armeabi too, by uncommenting the corresponding lines in build_android.sh.

    I don’t know why your app does not work under armeabi-v7a for me. It really gave me a headache looking for answers and solutions, but OK, I finally managed to get it working. Up to the next level…

    Greetings and thanks for your tutorial.

  16. (Sorry for the spam…)

    In addition: the Android AVD Manager doesn’t let you choose another CPU when creating a virtual device. I implicitly thought that armeabi-v7 would be the default, but I think it isn’t.

    The next time I get my hands on a real device, I’ll test this tutorial again.

  17. Aight, I tested it on a real device and it works like a charm.

    It always failed when running on the official emulator, but on a real device, it’s perfect.

    So I can only advise people to work on real devices, as the emulator is outdated and runs extremely slowly.

  18. Hi ,

    Thank you for the great tutorial. I downloaded the source code and tried to run it on my Acer tablet without any changes to the source code or project, but I got an exception in the Eclipse LogCat: “Cann’t load ffmpeg , findlibaray function return null”. Any idea how to fix this?

    Thanks

  19. That’s a great step-by-step article.
    Is it possible to run the function static void video_encode_example(const char *filename) in
    url
    under this environment?

  20. Really good topic, thank you!

    Unfortunately I’ve got the following error while trying to run your code on Android 2.2 emulator:

    01-21 13:00:35.750: E/AndroidRuntime(321): Caused by: java.lang.UnsatisfiedLinkError: Library ffmpeg not found

    I’m newbie to Eclipse and Android…
    Could you please tell me how to resolve this?

    Thanks again!

        1. Hi,

          First of all thanks very much for the post!

          Secondly, what exactly does this mean? (specify CPU) More changes in Android.mk or Application.mk? I’ve copied the entire application in an attempt to make this run on target platform 2.3.3. I’m still left with the UnsatisfiedLinkError, cannot find ffmpeg, though it’s built and present under ffmpeg-0.8/android/.

          Also another question… Don’t i need to make the jni folder a source folder under eclipse? How else does ffmpeg get packaged in the apk? (excuse me if this is a newbish question, i am indeed one such individual)

          Any ideas are appreciated.

          Regards,
          DAV

  21. Hey Roman,

    I’m an Android newbie and I used your tutorials to successfully build ffmpeg and the test app on Windows with Cygwin, so if you had any doubts, know that this DOES work on Windows.
    The main difficulty on Windows is to work out where to use regular Windows paths (C:/bla/bla/bla) and where to use Cygwin paths (/cygdrive/c/bla/bla/bla), as make can only use Cygwin paths while the NDK tools are native Windows tools that should use Windows paths.
    The best solution is to use relative paths when applicable. Basically, changes are required in the ffmpeg build_android.sh and configure files, and in the Android project the path to libffmpeg.so should be a Windows path.
    Thanks again for the excellent stuff,
    Ziv

  22. Hi, I successfully compiled the FFmpeg library and ran the ndk-build but I’m unable to install the APK on my device or the emulator. Any pointers? Please help.

    01-30 23:55:38.858: W/PackageManager(59): Native ABI mismatch from package file
    01-30 23:55:38.858: W/PackageManager(59): Package couldn’t be installed in /data/app/roman10.ffmpegTest-1.apk

  23. Hi roman,
    Your article is very good and I have successfully compiled and finished the ndk-build. I used your ffmpeg-test-jni.c native code and VideoBrowser to check how it works on the emulator. I successfully installed the FFmpegTest.apk application on the emulator. But when I click on an .mp4 video in the FFmpegTest application I am getting an error:

    01-31 10:19:30.337: E/FFmpegTest(778): Error open video file: -1094995529
    01-31 10:18:01.047: A/libc(697): Fatal signal 11 (SIGSEGV) at 0x00000028 (code=1)

    I am unable to resolve the problem as I am new to android platform. Can you help me in resolving the problem.

  24. Thanks for the amazing tutorial but I couldn’t get it to run. I am using android ndk r6b, and I managed to compile ffmpeg. The problem is there is no “include” directory under ./android/armv7-a/. There is only one .o file there. Running ndk-build gives me all kinda errors.

    /home/ubuntu/Documents/Projects/FFmpegTest/jni/ffmpeg-test-jni.c:18:32: error: libavutil/avstring.h: No such file or directory
    /home/ubuntu/Documents/Projects/FFmpegTest/jni/ffmpeg-test-jni.c:19:31: error: libavutil/pixdesc.h: No such file or directory
    /home/ubuntu/Documents/Projects/FFmpegTest/jni/ffmpeg-test-jni.c:20:32: error: libavutil/imgutils.h: No such file or directory
    I guess this basically means the header files are not found. I even tried to run the application directly without ndk-build, but it winds up in an UnsatisfiedLinkError exception. Any suggestion would be highly appreciated.

  25. Just putting it here for other people who might be interested in the answer. I downloaded NDK r5. Used build instructions for r5 (build_android.sh for r5). Bingo! Got all the missing folders there (bin, include, lib, …). Replaced the new folder with existing ffmpeg folder under jni in my project. ndk-build and there you have it, shared library is made!

  26. Hey,
    I am getting the error :( what should I do?

    java.lang.UnsatisfiedLinkError: Couldn’t load ffmpeg: findLibrary returned null

    Here is something you need to be careful about.
    I suggest going through all the comments; it will help you understand the different ways you may be doing it wrong. It would also be very useful to go through the NDK basics on the Android developers site.

    Now follow the procedure below :

    Build your native code by running the ‘ndk-build’ script from your project’s directory. It is located in the top-level NDK directory:

    cd
    /ndk-build

    The build tools copy the stripped, shared libraries needed by your application to the proper location in the application’s project directory.

    I was able to solve the unsatisfied link error problem this way, it may differ in your case but still worth trying :)

  27. Hi to everyone! I’ve finally compiled the application with the following additional steps:
    1) Run “./ndk-build NDK_PROJECT_PATH = “;
    2) Modify APP_ABI in “Application.mk”:

    APP_ABI := armeabi armeabi-v7a

    But unfortunately I’ve faced another issue – “FFmpegTest” application closes unexpectedly when I select *.3gp file on the virtual SD card of the Android emulator.

    “LogCat” doesn’t print any warnings/errors when this happens.

    Could someone help me with this?

    Thanks in advance.

    Alex R.

    1. Hi Alex, I am getting the same problem: after compiling successfully, when I run the application it stops, and no errors or logs are printed in the console.

      If you find a solution for this,

      please help me. Thanks

  28. I was successful at compiling the first step but once I get to building this second step I’m stuck…

    Apparently my initial build gives me the libffmpeg.so, but there is no include folder created. Why? Here are the first errors I get after calling ndk-build:
    Prebuilt : libffmpeg.so libs/armeabi-v7a/libffmpeg.so
    Compile thumb : ffmpeg-test-jni <= ffmpeg-test-jni.c
    /Users/dev/android/workspace_dev/FFmpegTest/jni/ffmpeg-test-jni.c:18:32: error: libavutil/avstring.h: No such file or directory
    /Users/dev/android/workspace_dev/FFmpegTest/jni/ffmpeg-test-jni.c:19:31: error: libavutil/pixdesc.h: No such file or directory
    /Users/dev/android/workspace_dev/FFmpegTest/jni/ffmpeg-test-jni.c:20:32: error: libavutil/imgutils.h: No such file or directory
    /Users/dev/android/workspace_dev/FFmpegTest/jni/ffmpeg-test-jni.c:21:33: error: libavutil/samplefmt.h: No such file or directory
    /Users/dev/android/workspace_dev/FFmpegTest/jni/ffmpeg-test-jni.c:23:34: error: libavformat/avformat.h: No such file or directory
    /Users/dev/android/workspace_dev/FFmpegTest/jni/ffmpeg-test-jni.c:25:32: error: libswscale/swscale.h: No such file or directory
    /Users/dev/android/workspace_dev/FFmpegTest/jni/ffmpeg-test-jni.c:27:32: error: libavcodec/avcodec.h: No such file or directory

  29. hi Roman10 first of all thanks for such a nice article..

    After successfully completing the first part of this blog, I ran the application. It goes into the naInit method of ffmpeg-test-jni.c, and after that this part is not working:

    if ((lError = av_open_input_file(&gFormatCtx, gFileName, NULL, 0, NULL)) != 0) {
    LOGE(1, "Error open video file: %d", lError);
    return; //open file failed
    }

    so after that the code is not working.
    any suggestion
    Please help me.. thanks in advance

  30. Thanks a lot mate, your tutorial is very helpful. I could build my application with ffmpeg though your attached source code didn’t work for me.

    I used NDK r8 & ffmpeg 0.8 on Ubuntu 12.04. Anyone trying to build this on Windows: don’t do it. I wasted 3 days trying with Cygwin/MinGW etc. tools. Instead, install Ubuntu on a VirtualBox; it took me just half a day.

    1. I just observed that the code given by you only extracts the info from a video file and doesn’t play the file. In order to display the video file I guess we need another library (SDL). Do I need to build my ffmpeg sources again for SDL support? Any help in displaying the video?

      1. Harish, did you ever figure this out? This guide works for me, but I’ve been trying to just call the main() method of ffmpeg, and it looks like I need to compile additional libraries. Did you determine how to adjust his build script in order to build the additional libraries?

        1. Hi Wesley:

          I’d like to do the same thing. If you have done this successfully, could you post your modifications for us? Thanks.

  31. how to do?
    E/AndroidRuntime(332): Caused by: java.lang.UnsatisfiedLinkError: Couldn’t load ffmpeg: findLibrary returned null.

  32. how to do in windows?

    06-12 07:21:10.951: E/AndroidRuntime(332): Caused by: java.lang.UnsatisfiedLinkError: Couldn’t load ffmpeg: findLibrary returned null.

  33. I Have the same issue un mac os x.

    ” E/AndroidRuntime(576): Caused by: java.lang.UnsatisfiedLinkError: Couldn’t load ffmpeg: findLibrary returned null.”

  34. I implemented this in Windows and now facing the error, “Couldn’t load ffmpeg: findLibrary returned null.”

    Can you please tell me that How to do it in windows?

  35. Nice tutorial for getting started with ffmpeg. How long do you think it would take me to develop the functionality to take video files and re-encode them at a lower resolution (and lower file size)? Looking for a reasonable estimate in terms of hours :). I’ve been developing on Android for about 18 months.

    1. It depends on whether you’re familiar with ffmpeg and NDK development or not. It’s hard to estimate in hours.

  36. Thanks very much! But I have a question and I hope you can help me. With ffmpeg, for example, I’m using the terminal command: ffmpeg -i Video.mpg Pictures%d.jpg. How can I implement this command on Android with the ffmpeg library?

  37. Hi, there is no .so file in the code which I have downloaded; please help, as I am getting the following error.

    07-28 16:20:35.240: E/AndroidRuntime(19945): Caused by: java.lang.UnsatisfiedLinkError: Couldn’t load ffmpeg: findLibrary returned null

  38. Thanks for your valuable post :) I have a problem: when I try to run my project the following error occurs:
    java.lang.UnsatisfiedLinkError: Couldn’t load ffmpeg: findLibrary returned null

    However, I did build ffmpeg-0.8 and the .so file appears in ffmpeg-0.8/android/armv7-a.

    What’s wrong?

  39. I’m not that much of a online reader to be honest but your blogs really nice,
    keep it up! I’ll go ahead and bookmark your website to come back later. Cheers

  40. It generates this error when I run project: java.lang.UnsatisfiedLinkError: Couldn’t load ffmpeg: findLibrary returned null.
    I compile ffmpeg using NDK in CentOS system and get the folder named android and other libraries.
    What’s wrong with me?

  41. I downloaded the source code from the link, and when I try the NDK build, I get the following error:

    Android NDK: ERROR:/home/workspace/FFmpegTest/jni/Android.mk:ffmpeg-prebuilt: LOCAL_SRC_FILES points to a missing file. When I checked, I found it pointed to ffmpeg-0.8/android/armv7-a/libffmpeg.so, which was missing.

    What should I do in this case?

  42. Thanks for your wonderful tutorial!

    I have one question though:
    Is it possible to pass in a url instead of a file path?

    I tried something like this:
    av_open_input_file(&gFormatCtx, "http://www.test.com/test.mp4", NULL, 0, NULL)

    But it just returned that the file couldn’t be opened.

    Thanks for your help and keep up the good work! :)

  43. Hi,
    While trying to build this project using Android NDK r8 I am facing the following issues.

    $ /cygdrive/d/Android_NDK/android-ndk-r8-windows/android-ndk-r8/ndk-build
    Prebuilt : libffmpeg.so libs/armeabi-v7a/libffmpeg.so
    install: cannot open `./obj/local/armeabi-v7a/libffmpeg.so’ for reading: Permission denied
    /cygdrive/d/Android_NDK/android-ndk-r8-windows/android-ndk-r8/build/core/build-binary.mk:409: recipe for target `libs/armeabi-v7a/libffmpeg.so’ failed
    make: *** [libs/armeabi-v7a/libffmpeg.so] Error 1

  44. Hi, congratulations on your tutorial. I have a question: what do I have to do if I want to include other libraries in the file ffmpeg-test.c, for example .
    I always get an error when I try to include new libraries.
    Thanks in advance

  45. Thanks for this wonderful tutorial. I have a problem: when I ndk-build… it throws an error
    Android NDK: Could not find application project directory !
    Android NDK: Please define the NDK_PROJECT_PATH variable to point to it.
    /home/dilip/Desktop/Android/android-ndk-r8d/build/core/build-local.mk:130: *** Android NDK: Aborting . Stop.

  46. Thanks for the example. But how do I run it on Windows and Android.

    I get the following in logcat
    03-23 11:01:56.291: E/AndroidRuntime(1246): FATAL EXCEPTION: main
    03-23 11:01:56.291: E/AndroidRuntime(1246): java.lang.ExceptionInInitializerError
    03-23 11:01:56.291: E/AndroidRuntime(1246): at java.lang.Class.newInstanceImpl(Native Method)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at java.lang.Class.newInstance(Class.java:1319)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at android.app.Instrumentation.newActivity(Instrumentation.java:1023)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:1871)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:1981)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at android.app.ActivityThread.access$600(ActivityThread.java:123)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1147)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at android.os.Handler.dispatchMessage(Handler.java:99)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at android.os.Looper.loop(Looper.java:137)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at android.app.ActivityThread.main(ActivityThread.java:4424)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at java.lang.reflect.Method.invokeNative(Native Method)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at java.lang.reflect.Method.invoke(Method.java:511)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:784)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:551)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at dalvik.system.NativeStart.main(Native Method)
    03-23 11:01:56.291: E/AndroidRuntime(1246): Caused by: java.lang.UnsatisfiedLinkError: Couldn’t load ffmpeg: findLibrary returned null
    03-23 11:01:56.291: E/AndroidRuntime(1246): at java.lang.Runtime.loadLibrary(Runtime.java:365)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at java.lang.System.loadLibrary(System.java:535)
    03-23 11:01:56.291: E/AndroidRuntime(1246): at roman10.ffmpegTest.VideoBrowser.(VideoBrowser.java:37)
    03-23 11:01:56.291: E/AndroidRuntime(1246): … 15 more

    rgds

  47. Hi!
    I downloaded your example, but when I run it I get this exception (NDK r8d):
    04-24 08:48:52.062: W/dalvikvm(27491): Exception Ljava/lang/UnsatisfiedLinkError; thrown while initializing Lroman10/ffmpegTest/VideoBrowser;
    04-24 08:48:52.062: W/dalvikvm(27491): Class init failed in newInstance call (Lroman10/ffmpegTest/VideoBrowser;)
    04-24 08:49:05.414: D/AndroidRuntime(27491): Shutting down VM
    04-24 08:49:05.414: W/dalvikvm(27491): threadid=1: thread exiting with uncaught exception (group=0x40a9a300)
    04-24 08:49:05.480: E/AndroidRuntime(27491): FATAL EXCEPTION: main
    04-24 08:49:05.480: E/AndroidRuntime(27491): java.lang.ExceptionInInitializerError
    04-24 08:49:05.480: E/AndroidRuntime(27491): at java.lang.Class.newInstanceImpl(Native Method)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at java.lang.Class.newInstance(Class.java:1319)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at android.app.Instrumentation.newActivity(Instrumentation.java:1053)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2090)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2210)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at android.app.ActivityThread.access$600(ActivityThread.java:142)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1208)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at android.os.Handler.dispatchMessage(Handler.java:99)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at android.os.Looper.loop(Looper.java:137)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at android.app.ActivityThread.main(ActivityThread.java:4931)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at java.lang.reflect.Method.invokeNative(Native Method)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at java.lang.reflect.Method.invoke(Method.java:511)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:791)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:558)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at dalvik.system.NativeStart.main(Native Method)
    04-24 08:49:05.480: E/AndroidRuntime(27491): Caused by: java.lang.UnsatisfiedLinkError: Couldn’t load ffmpeg: findLibrary returned null
    04-24 08:49:05.480: E/AndroidRuntime(27491): at java.lang.Runtime.loadLibrary(Runtime.java:365)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at java.lang.System.loadLibrary(System.java:535)
    04-24 08:49:05.480: E/AndroidRuntime(27491): at roman10.ffmpegTest.VideoBrowser.(VideoBrowser.java:36)
    04-24 08:49:05.480: E/AndroidRuntime(27491): … 15 more

    Can you write to me what I have to change in order for the code to run?

  48. What folder exactly should I put in the jni folder? If I extract the source there, then what did we build ffmpeg for?

  49. Thank you very much. I have successfully built an Android application based on ffmpeg. But I have a question about streaming URLs.
    How do I convert a multicast URL to unicast?

  50. Hey, thank you for your tutorial, it is very useful. I only have one question: I compiled the ffmpeg library successfully, but when I try to run your test project the library is not found. My path for the prebuilt library is /home/user1/Documents/ffmpeg-0.8
    and I use this in the Android.mk

    LOCAL_PATH := $(call my-dir)
    #declare the prebuilt library
    include $(CLEAR_VARS)
    LOCAL_MODULE := ffmpeg-0.8-prebuilt
    LOCAL_SRC_FILES := /home/user1/Documents/ffmpeg-0.8/android/armv7-a/libffmpeg.so
    LOCAL_EXPORT_C_INCLUDES := /home/user1/Documents/ffmpeg-0.8/android/armv7-a/include
    LOCAL_EXPORT_LDLIBS := /home/user1/Documents/ffmpeg-0.8/android/armv7-a/libffmpeg.so
    LOCAL_PRELINK_MODULE := true
    include $(PREBUILT_SHARED_LIBRARY)

    #the andzop library
    include $(CLEAR_VARS)
    LOCAL_ALLOW_UNDEFINED_SYMBOLS=fasle
    LOCAL_MODULE := ffmpeg-test-jni
    LOCAL_SRC_FILES := ffmpeg-test-jni.c
    LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg-0.8/android/armv7-a/include
    LOCAL_SHARED_LIBRARY := ffmpeg-prebuilt
    LOCAL_LDLIBS := -llog -ljnigraphics -lz -lm $(LOCAL_PATH)/ffmpeg-0.8/android/armv7-q/libffmpeg.so
    include %(BUILD_SHARED_LIBRARY)

    but when I run the project the ffmpeg library is not found.
    What am I doing wrong?

    Thank you for your support

  51. Thank you!
    How do I run the ffmpeg command in an Android client?
    It shows “Permission denied”.
    When I build ffmpeg, I get /android/arm-v7a/bin/ffmpeg.
    In which place should I put ffmpeg?

  52. Hi Liu,
    I came across your posts related to building Android applications using FFMpeg in here & found those to be really interesting. Congratulations on the job done well! Moreover, these posts are particularly relevant to one of my hobby DIY projects. The goal for my project is to develop an Android application capable of streaming live feeds captured through web cams in a LAN environment using FFMpeg as the underlying engine. So far, following the instructions mentioned in your blog, I did the following –

    1. Compiling and generating FFMpeg related libraries for the following releases –

    FFMpeg version: 2.0
    NDK version: r8e & r9
    Android Platform version: android-16 & android-18
    Toolchain version: 4.6 & 4.8
    Platform built on: Fedora 18 (x86_64)

    2. Creating the files Android.mk & Application.mk in appropriate path (as suggested in the tutorial).

    However, when it came to writing the native C code for utilizing appropriate functionality of FFMpeg from the application layer using Java, the following questions came to my mind –

    a) Which all of FFMpeg’s features I need to make available from native to application layer?
    b) Whether the compilation options (as shown in the tutorial) are sufficient for handling *.sdp streams or do I need to modify it?
    c) Do I need to make use of live555?

    I am totally new to FFMpeg and Android application development and this is going to be my first serious project on Android platform. I have been searching for relevant tutorials dealing with RTSP streaming using FFMpeg for a while now without much success. Moreover, I tried the latest development build of VLC player and found it to be great in streaming RT feeds. However, it’s a complex beast and the goal for my project is of quite limited nature, mostly learning – within a short time span.

    Could you suggest some pointers (e.g. links, documents or sample code) on how can I write the native C code for utilizing FFMpeg and subsequently use those functionality from the app layer? Moreover, will really appreciate if you could let me know the kind of background knowledge necessary for this project from a functional standpoint (in a language agnostic sense).

    Thank you so much!

  53. Hi Liu,

    Hope you’re doing well!

    I came across your posts related to building Android applications using FFMpeg in here and found those to be really interesting. Congratulations on the job done well! Moreover, these posts are particularly relevant to one of my hobby DIY projects. The goal for my project is to develop an Android application capable of streaming live feeds captured through web cams in a LAN environment using FFMpeg as the underlying engine. So far, following the instructions mentioned in your blog, I did the following –

    1. Compiling and generating FFMpeg related libraries for the following releases –

    FFMpeg version: 2.0
    NDK version: r8e & r9
    Android Platform version: android-16 & android-18
    Toolchain version: 4.6 & 4.8
    Platform built on: Fedora 18 (x86_64)

    2. Creating the files Android.mk & Application.mk in appropriate path (as suggested in the tutorial).

    However, when it came to writing the native C code for utilizing appropriate functionality of FFMpeg from the application layer using Java, the following questions came to my mind –

    a) Which all of FFMpeg’s features I need to make available from native to app layer?
    b) Whether the compilation options (as shown in the tutorial) are sufficient for handling *.sdp streams or do I need to modify it?
    c) Do I need to make use of live555?

    I am totally new to FFMpeg and Android application development and this is going to be my first serious project for Android platform. I have been searching for relevant tutorials dealing with RTSP streaming using FFMpeg on Android platform for a while now without much success. Moreover, I tried the latest development build of VLC player and found it to be great in streaming real-time feeds. However, it’s a complex beast and the goal for my project is of quite limited nature, mostly learning – in a short time span.

    Could you suggest some pointers (e.g. links, documents or sample code) on how can I write the native C code for utilizing FFMpeg and subsequently use those functionality from the app layer? Moreover, will really appreciate if you could let me know the kind of background knowledge necessary for this project from a functional standpoint (in a language agnostic sense).

    Thank you so much!

    ~rurtle

    1. Hi, did you find the info you need?
      I have a similar project: display an H264 video stream over RTSP without latency.
      I did the job on Windows, but now I need to port it to Android. But I’m new to Android and I’m on Windows.

  54. Hi Roman, running the entire source code in Eclipse/SDK, it finds and cuts video files only.
    How can I do the same thing with mp3 files instead of video files?
    Thank you very much!
    Alex.

  55. Hey Roman
    I followed your steps. It works fine, but when it comes to avcodec_open2 it crashes. Can you please check the issue?

  56. How can I compile the ffmpeg JNI while using the Android SDK with API 19?
    I am getting this error:

    /root/android_ndk/android-ndk-r10c/ndk-build all
    Android NDK: ERROR:jni/Android.mk:ffmpeg-prebuilt: LOCAL_SRC_FILES points to a missing file
    Android NDK: Check that jni/ffmpeg-build/arm64-v8a/libffmpeg.so exists or that its path is correct
    /root/android_ndk/android-ndk-r10c/build/core/prebuilt-library.mk:45: *** Android NDK: Aborting . Stop.

    **** Build Finished ****

  57. Hi there,

    I’m also getting the “Android NDK: Check that libffmpeg.so exists or that its path is correct” error. I’m trying to build ffmpeg on a Mac, I assume I need to customize “build_android.sh”? libffmpeg.so seems to be missing.

    Thanks!

  58. Hi.
    The 1st great tutorial.
    I built all as you wrote.
    Unfortunately /android/arm/lib does not have the libffmpeg.so file.
    I am using NDK r10.
