Media (0)

No media matching your criteria is available on the site.

Other articles (51)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, announced here.
    The zip file provided here contains only the MediaSPIP sources in the standalone version.
    As with the previous version, all of the software dependencies must be installed manually on the server.
    If you wish to use this archive for a farm-mode installation, you will also need to make other modifications (...)

  • Making files available

    14 April 2011

    By default, on initialization, MediaSPIP does not allow visitors to download files, whether they are originals or the result of their transformation or encoding; it only allows them to be viewed.
    However, it is possible and easy to let visitors access these documents in various forms.
    All of this happens on the template configuration page. You need to go to the channel's administration area and choose in the navigation (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the MediaSPIP sources in the standalone version.
    To get a working installation, all of the software dependencies must be installed manually on the server.
    If you wish to use this archive for a farm-mode installation, you will also need to make other modifications (...)

On other sites (7064)

  • JavaCV generate video from images Crashes

    2 December 2014, by Mohammad Khatri

    I have been working with JavaCV on Android since yesterday and I am getting an error while generating a video from an image (IplImage) using FFmpegFrameRecorder (or FrameRecorder), even though image filtering (grayscale and flip) with cvCvtColor works fine.

    [screenshot: grayscale filter output]

    As shown in the picture, the grayscale and flip filters are applied by clicking the second button (Apply Effect).

    But when I click (Make Video), it crashes.

    Here is my code for making a video from an image:

    String path = Environment.getExternalStorageDirectory().getPath() + "/test.mp4";
    Log.i("path", path);
    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(path, 256, 256);
    try {

       recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
       // recorder.setCodecID(avcodec.AV_CODEC_ID_H263);
       recorder.setFormat("mp4");
       recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
       recorder.start();
       for (int i = 0; i < 10; i++) {

           recorder.record(image);
       }
       recorder.stop();
    } catch (Exception e) {

       e.printStackTrace();
    }

    image is of type IplImage.

    The error occurs when the recorder object is created.

    I am testing on 2 devices:

    1) Asus Zenfone 5

    Stacktrace

    Caused by: java.lang.NoClassDefFoundError: java.lang.ClassNotFoundException: org.bytedeco.javacpp.avcodec
               at org.bytedeco.javacpp.Loader.load(Loader.java:387)
               at org.bytedeco.javacpp.Loader.load(Loader.java:353)
               at org.bytedeco.javacpp.avformat.<clinit>(avformat.java:13)
               at org.bytedeco.javacv.FFmpegFrameRecorder.<clinit>(FFmpegFrameRecorder.java:106)
               at com.example.javacvex1.MainActivity$asyncImageProcVideo.makeVideo(MainActivity.java:191)
               at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:180)
               at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:152)
               at android.os.AsyncTask$2.call(AsyncTask.java:288)
               at java.util.concurrent.FutureTask.run(FutureTask.java:237)
                at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
                at java.lang.Thread.run(Thread.java:841)
        Caused by: java.lang.ClassNotFoundException: org.bytedeco.javacpp.avcodec
               at java.lang.Class.classForName(Native Method)
               at java.lang.Class.forName(Class.java:251)
               at org.bytedeco.javacpp.Loader.load(Loader.java:385)
                at org.bytedeco.javacpp.Loader.load(Loader.java:353)
                at org.bytedeco.javacpp.avformat.<clinit>(avformat.java:13)
                at org.bytedeco.javacv.FFmpegFrameRecorder.<clinit>(FFmpegFrameRecorder.java:106)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.makeVideo(MainActivity.java:191)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:180)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:152)
                at android.os.AsyncTask$2.call(AsyncTask.java:288)
                at java.util.concurrent.FutureTask.run(FutureTask.java:237)
                at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
                at java.lang.Thread.run(Thread.java:841)
        Caused by: java.lang.UnsatisfiedLinkError: dlopen failed: "/data/app-lib/com.example.javacvex1-1/libjniavcodec.so" has unexpected e_machine: 40
               at java.lang.Runtime.loadLibrary(Runtime.java:364)
               at java.lang.System.loadLibrary(System.java:526)
               at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:535)
               at org.bytedeco.javacpp.Loader.load(Loader.java:410)
               at org.bytedeco.javacpp.Loader.load(Loader.java:353)
               at org.bytedeco.javacpp.avcodec.<clinit>(avcodec.java:12)
                at java.lang.Class.classForName(Native Method)
                at java.lang.Class.forName(Class.java:251)
                at org.bytedeco.javacpp.Loader.load(Loader.java:385)
                at org.bytedeco.javacpp.Loader.load(Loader.java:353)
                at org.bytedeco.javacpp.avformat.<clinit>(avformat.java:13)
                at org.bytedeco.javacv.FFmpegFrameRecorder.<clinit>(FFmpegFrameRecorder.java:106)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.makeVideo(MainActivity.java:191)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:180)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:152)
                at android.os.AsyncTask$2.call(AsyncTask.java:288)
                at java.util.concurrent.FutureTask.run(FutureTask.java:237)
                at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
                at java.lang.Thread.run(Thread.java:841)
        Caused by: java.lang.UnsatisfiedLinkError: dlopen failed: "/data/app-lib/com.example.javacvex1-1/libavcodec.so" has unexpected e_machine: 40
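An editor's note on this first trace (not from the original post): 40 is the ELF machine id for ARM (EM_ARM), so "unexpected e_machine: 40" suggests ARM-built .so files being loaded on an x86 device; the Asus Zenfone 5 of that era used an Intel Atom CPU. As a sketch, the field can be read directly from a little-endian ELF header (as used on Android):

```java
// Sketch (editor's addition): e_machine is a little-endian 16-bit value at
// byte offset 18 of the ELF header. EM_ARM is 40, EM_386 is 3 -- seeing 40
// on an x86 device means the bundled library targets the wrong ABI.
public class ElfMachine {
    static int eMachine(byte[] elfHeader) {
        // Combine the two bytes at offsets 18 and 19, little-endian.
        return (elfHeader[18] & 0xFF) | ((elfHeader[19] & 0xFF) << 8);
    }

    public static void main(String[] args) {
        byte[] header = new byte[20];
        header[18] = 40; // fake header from an ARM build (EM_ARM)
        System.out.println(eMachine(header)); // prints 40
    }
}
```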

    2) HTC tablet

    (Stacktrace)

    Caused by: java.lang.ExceptionInInitializerError
               at com.example.javacvex1.MainActivity$asyncImageProcVideo.makeVideo(MainActivity.java:191)
               at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:180)
               at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:152)
               at android.os.AsyncTask$2.call(AsyncTask.java:264)
               at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:305)
                at java.util.concurrent.FutureTask.run(FutureTask.java:137)
                at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:208)
                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1076)
                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:569)
                at java.lang.Thread.run(Thread.java:856)
        Caused by: java.lang.ExceptionInInitializerError
               at org.bytedeco.javacv.FFmpegFrameRecorder.<clinit>(FFmpegFrameRecorder.java:106)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.makeVideo(MainActivity.java:191)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:180)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:152)
                at android.os.AsyncTask$2.call(AsyncTask.java:264)
                at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:305)
                at java.util.concurrent.FutureTask.run(FutureTask.java:137)
                at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:208)
                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1076)
                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:569)
                at java.lang.Thread.run(Thread.java:856)
        Caused by: java.lang.ExceptionInInitializerError
               at java.lang.Class.classForName(Native Method)
               at java.lang.Class.forName(Class.java:217)
               at org.bytedeco.javacpp.Loader.load(Loader.java:385)
               at org.bytedeco.javacpp.Loader.load(Loader.java:353)
               at org.bytedeco.javacpp.avformat.<clinit>(avformat.java:13)
                at org.bytedeco.javacv.FFmpegFrameRecorder.<clinit>(FFmpegFrameRecorder.java:106)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.makeVideo(MainActivity.java:191)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:180)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:152)
                at android.os.AsyncTask$2.call(AsyncTask.java:264)
                at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:305)
                at java.util.concurrent.FutureTask.run(FutureTask.java:137)
                at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:208)
                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1076)
                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:569)
                at java.lang.Thread.run(Thread.java:856)
        Caused by: java.lang.UnsatisfiedLinkError: Cannot load library: link_image[1936]:    76 could not load needed library 'libavcodec.so' for 'libjniavcodec.so' (find_library[1199]:    76 'libavcodec.so' failed to load previously)
               at java.lang.Runtime.loadLibrary(Runtime.java:370)
               at java.lang.System.loadLibrary(System.java:535)
               at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:535)
               at org.bytedeco.javacpp.Loader.load(Loader.java:410)
               at org.bytedeco.javacpp.Loader.load(Loader.java:353)
               at org.bytedeco.javacpp.avcodec.<clinit>(avcodec.java:12)
                at java.lang.Class.classForName(Native Method)
                at java.lang.Class.forName(Class.java:217)
                at org.bytedeco.javacpp.Loader.load(Loader.java:385)
                at org.bytedeco.javacpp.Loader.load(Loader.java:353)
                at org.bytedeco.javacpp.avformat.<clinit>(avformat.java:13)
                at org.bytedeco.javacv.FFmpegFrameRecorder.<clinit>(FFmpegFrameRecorder.java:106)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.makeVideo(MainActivity.java:191)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:180)
                at com.example.javacvex1.MainActivity$asyncImageProcVideo.doInBackground(MainActivity.java:152)
                at android.os.AsyncTask$2.call(AsyncTask.java:264)
                at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:305)
                at java.util.concurrent.FutureTask.run(FutureTask.java:137)
                at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:208)
                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1076)
                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:569)
                at java.lang.Thread.run(Thread.java:856)
        Caused by: java.lang.UnsatisfiedLinkError: Cannot load library: link_image[1936]:    76 could not load needed library 'libswresample.so' for 'libavcodec.so' (load_library[1091]: Library 'libswresample.so' not found)

    Finally, here is my build.gradle, with jniLibs shown on the left side.

    [screenshot: build.gradle and jniLibs folder]

    No luck with the answers to other questions. I have been stuck for the last 24 hours. :(

    Any help will be great.

  • Running Windows XP In 2016

    2 January 2016, by Multimedia Mike

    I have an interest in getting a 32-bit Windows XP machine up and running. I have a really good yet slightly dated and discarded computer that seemed like a good candidate for dedicating to this task. So the question is: Can Windows XP still be installed from scratch on a computer, activated, and used in 2016? I wasn’t quite sure, since I have heard stories about how Microsoft formally ended support for Windows XP as of the first half of 2014 and I wasn’t entirely sure what that meant.

    Spoiler: It’s still possible to install and activate Windows XP as of the writing of this post. It’s also possible to download and install all the updates published up until support ended.

    The Candidate Computer
    This computer was assembled either in late 2008 or early 2009. It was a beast at the time.


    New old Windows XP computer
    Click for a larger image

    It was built around the newly-released NVIDIA GTX 280 video card. The case is a Thermaltake DH-101, which is a home theater PC thing. The motherboard is an Asus P5N32-SLI Premium with a Core 2 Duo X6800 2.93 GHz CPU on board. 2 GB of RAM and a 1.5 TB hard drive are also present.

    The original owner handed it off to me because their family didn’t have much use for it anymore (too many other machines in the house). Plus it was really, obnoxiously loud. The noisy culprit was the stock blue fan that came packaged with the Intel processor (seen in the photo) whining at around 65 dB. I replaced the fan and brought the noise level way down.

    As for connectivity, the motherboard has dual gigabit NICs (of 2 different chipsets for some reason) and onboard wireless 802.11g. I couldn’t make the latter work and this project was taking place a significant distance from my wired network. Instead, I connected a USB 802.11ac dongle and antenna which is advertised to work in both Windows XP and Linux. It works great under Windows XP. Meanwhile, making the adapter work under Linux provided a retro-computing adventure in which I had to modify C code to make the driver work.

    So, score 1 for Windows XP over Linux here.

    The Simple Joy of Retro-computing
    One thing you have to watch out for when you get into retro-computing is fighting the urge to rant about the good old days of computing. Most long-time computer users have a good understanding of the frustration that computers keep getting faster by orders of magnitude and yet using them somehow feels slower and slower over successive software generations.

    This really hits home when you get old software running, especially on high-end hardware (relative to what was standard contemporary hardware). After I got this new Windows XP machine running, as usual, I was left wondering why software was so much faster a few generations ago.

    Of course, as mentioned, it helps when you get to run old software on hardware that would have been unthinkably high end at the software’s release. Apparently, the minimum WinXP specs as set by MS are a 233 MHz Pentium CPU and 64 MB of RAM, with 1.5 GB of hard drive space. This machine has more than 10x the clock speed (and 2 CPUs), 32x the RAM, and 1000x the HD space. Further, I’m pretty sure 100 Mbit ethernet was the standard consumer gear in 2001 while 802.11b wireless was gaining traction. The 802.11ac adapter makes networking quite pleasant.

    Purpose
    Retro-computing really seems to be ramping up in popularity lately. For some reason, I feel compelled to declare at this juncture that I was into it before it was cool.

    Why am I doing this? I have a huge collection of old DOS/Windows computer games. I also have this nerdy obsession with documenting old video games in the MobyGames database. I used to do a lot of this a few years ago, tracking the effort on my gaming blog. In the intervening years, I have still collected a lot of old, unused, unloved video games, usually either free or very cheap, while documenting my collection efforts on that same blog.

    So I want to work my way through some of this backlog, particularly the games that are not yet represented in the MobyGames database, and even more pressing, ones that the internet (viewed through Google at least) does not seem to know about. To that end, I thought this was a good excuse to get Windows XP on this old machine. A 32-bit Windows XP machine is capable of running any software advertised as supporting Windows XP, Windows ME, Windows 98, Windows 95, and even 16-bit Windows 3.x (I have games for all these systems). That covers a significant chunk of PC history. It can probably be made to run DOS games as well, but those are (usually) better run under DosBox. In order to get the right display feel, I even invested in a (used) monitor sporting a 4:3 aspect ratio. If I know these old games, most will be engineered and optimized for that ratio rather than the widescreen resolutions seen nowadays.

    I would also like to get back to that Xbox optical disc experimentation I was working on a few years ago. Another nice feature of this motherboard is that it still provides a 40-pin IDE/PATA adapter which makes the machine useful for continuing that old investigation (and explains why I have that long IDE cable to no where pictured hanging off the board).

    The Messy Details
    I did the entire installation process twice. The first time was a bumbling journey of discovery and copious note-taking. I still have Windows XP installation media that includes service pack 2 (SP2), along with 2 separate licenses that haven’t been activated for a long time. My plan was to install it fresh, then install the relevant drivers. Then I would investigate the Windows update and activation issues and everything should be fine.

    So what’s the deal with Windows Update for XP, and with activations? Second item first: it IS possible to still activate Windows XP. The servers are still alive and respond quickly. However, as always, you don’t activate until you’re sure everything is working at some baseline. It took a while to get there.

    As for whether Windows Update still works for XP, that’s a tougher question. Short answer is yes; longer answer is that it can be difficult to kick off the update process. At least on SP2, the “Windows Update” program launches IE6 and navigates to a special microsoft.com URL which initiates the update process (starting with an ActiveX control). This URL no longer exists.

    From what I can piece together from my notes, this seems to be the route I eventually took:

    1. Install Windows XP fresh
    2. Install drivers for the hardware; fortunately, Asus still has all the latest drivers necessary for the motherboard and its components, but it’s necessary to download these from another network-connected PC since the networking probably won’t be running “out of the box”
    3. Download the .NET 3.5 runtime, which is the last one supported by Windows XP, and install it
    4. Download the latest NVIDIA drivers; this needs to be done after the previous step because the installer requires the .NET runtime; run the driver installer and don’t try to understand why it insists on re-downloading the .NET 3.5 runtime before installation
    5. While you’re downloading stuff on other computers to be transported to this new machine, be sure to download either Chrome or Firefox per your preference; if you try to download via IE6, you may find that their download pages aren’t compatible with IE6
    6. Somewhere along the line (I’m guessing as a side effect of the .NET 3.5 installation), the proper, non-IE6-based Windows Update program magically springs to life; once this happens, there will be 144 updates (in my case anyway); installing these will probably require multiple reboots, but SP3 and all known pre-deprecation security fixes will be installed
    7. Expect that, even after installing all of these, a few more updates will appear; eventually, you’ll be at the end of the update road
    8. Once you’re satisfied everything is working satisfactorily, take the plunge and activate your installation

    Residual Quirks
    Steam runs great on Windows XP, as do numerous games I have purchased through the service. So that opens up a whole bunch more games that I could play on this machine. Steam’s installer highlights a curious legacy problem of Windows XP – it seems there are many languages that it does not support “out of the box”:


    Steam missing languages under Windows XP

    It looks like the Chinese options and a few others that are standard now weren’t standard 15 years ago.

    Also, a little while after booting up, I’ll get a crashing error concerning a process called geoforms.scr. This appears to be NVIDIA-related. However, I don’t notice anything obviously operationally wrong with the system.

    Regarding DirectX support, DirectX 9 is the highest version officially supported by Windows XP. There are allegedly methods to get DirectX 10 running as well, but I don’t care that much. I did care, briefly, when I realized that a bunch of the demos for the NVIDIA GTX 280 required DX10 which left me wondering why it was possible to install them on Windows XP.

    Eventually, by installing enough of these old games, I fully expect to have numerous versions of .NET, DirectX, QT, and Video for Windows installed side by side.

    Out of curiosity, I tried playing a YouTube HD/1080p video. I wanted to see if the video was accelerated through my card. The video played at full speed but I noticed some tearing. Then I inspected the CPU usage and noticed that the CPU was quite loaded. So either the GTX 280 doesn’t have video acceleration, or Windows XP doesn’t provide the right APIs, or Chrome is not able to access the APIs in Windows XP, or perhaps some combination of the foregoing.

    Games are working well, though. I tried one of my favorite casual games and got sucked into that for, like, an entire night because that’s what casual games do. But then, I booted up a copy of WarCraft III that I procured sometime ago. I don’t have any experience with the WarCraft universe (RTS or MMO) but I developed a keen interest in StarCraft II over the past few years and wanted to try WarCraft III. Unfortunately, I couldn’t get WarCraft III to work correctly on several different Windows 7 installations (movies didn’t play, which left me slightly confused as to what I was supposed to do).

    Still works beautifully on the new old Windows XP machine.

  • Feeding MediaCodec with byte data from AVPacket : problems with output buffers

    2 March 2016, by serg66

    Description of my task:
    I’m developing a video player on Android (API >= 17). It has to work both with HLS and multicast video. In addition, it has to support multiple audio tracks.

    Why I decided to use ffmpeg:

    • On some devices MediaPlayer doesn’t support multicast video
    • MediaExtractor doesn’t work with HLS (getTrackCount() returns 0)
    • ffmpeg works both with HLS and multicast

    My idea:
    I demux the stream with ffmpeg in a loop. I get the CSD from videoStream->codec->extradata and then properly configure the MediaFormat. On each iteration, when a new video AVPacket is available, I run its buffer through the h264_mp4toannexb bitstream filter (initialized with av_bitstream_filter_init). Then I call the Java method onNewVideoData, in which I get the AVPacket byte array. I clear the available input buffer, after which I fill it with the new data. I also get the pts. Since the stream has no beginning, I additionally calculate new PTS values by subtracting the PTS of the first AVPacket from all the following ones; the first PTS I set to 0. Then I call queueInputBuffer to send the buffer to the decoder.

    I use two threads: one for getting data and submitting it to the input buffers, and another for posting it to the Surface.
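The PTS rebasing described above can be sketched like this (editor's sketch, not the poster's code; the class and method names are invented):

```java
// Sketch of rebasing presentation timestamps so that the first packet of a
// stream with no defined beginning maps to 0, as described in the question.
public class PtsRebaser {
    // Sentinel meaning "no packet seen yet".
    private long firstPts = Long.MIN_VALUE;

    // Returns the timestamp relative to the first packet seen.
    public long rebase(long pts) {
        if (firstPts == Long.MIN_VALUE) {
            firstPts = pts;
        }
        return pts - firstPts;
    }

    public static void main(String[] args) {
        PtsRebaser r = new PtsRebaser();
        System.out.println(r.rebase(900000L)); // first packet -> prints 0
        System.out.println(r.rebase(903600L)); // prints 3600
        System.out.println(r.rebase(907200L)); // prints 7200
    }
}
```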

    The full player C code:

     #include <jni.h>
     #include <android/log.h>
     #include

     #include <libavformat/avformat.h>
     #include <libavcodec/avcodec.h>
     #include <libavutil/buffer.h>

    #define TAG "ffmpegPlayer"

    struct
    {
       const char* url;
       jint width;
       jint height;
       jfloat aspectRatio;
       jint streamsCount;
       AVFormatContext* formatContext;
       AVStream* videoStream;
    } context;

    AVPacket packet;
    AVBitStreamFilterContext* avBitStreamFilterContext;

    JNIEXPORT jbyteArray JNICALL Java_com_example_app_FfmpegPlayer_getCsdNative(JNIEnv* env, jobject x)
    {
       jbyteArray arr = (*env)->NewByteArray(env, context.videoStream->codec->extradata_size);
       (*env)->SetByteArrayRegion(env, arr, 0, context.videoStream->codec->extradata_size, (jbyte*)context.videoStream->codec->extradata);

       return arr;
    }

    JNIEXPORT jint JNICALL Java_com_example_app_FfmpegPlayer_getWidthNative(JNIEnv* env, jobject x)
    {
       return context.width;
    }

    JNIEXPORT jint JNICALL Java_com_example_app_FfmpegPlayer_getHeightNative(JNIEnv* env, jobject x)
    {
       return context.height;
    }

    JNIEXPORT jfloat JNICALL Java_com_example_app_FfmpegPlayer_getAspectRatioNative(JNIEnv* env, jobject x)
    {
       return context.aspectRatio;
    }

     JNIEXPORT jint JNICALL Java_com_example_app_FfmpegPlayer_getStreamsCountNative(JNIEnv* env, jobject x)
    {
       return context.streamsCount;
    }

    JNIEXPORT jlong JNICALL Java_com_example_app_FfmpegPlayer_getPtsNative(JNIEnv* env, jobject obj)
    {
       return packet.pts * av_q2d(context.videoStream->time_base) * 1000000;
    }

    JNIEXPORT jboolean JNICALL Java_com_example_app_FfmpegPlayer_initNative(JNIEnv* env, jobject obj, const jstring u)
    {
       av_register_all();
       avBitStreamFilterContext = av_bitstream_filter_init("h264_mp4toannexb");

       const char* url = (*env)->GetStringUTFChars(env, u , NULL);
       __android_log_print(ANDROID_LOG_DEBUG, TAG, "Init: %s", url);

       AVFormatContext* formatContext = NULL;
        if (avformat_open_input(&formatContext, url, NULL, NULL) < 0) {
           __android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to open input");
           return JNI_FALSE;
       }

        if (avformat_find_stream_info(formatContext, NULL) < 0) {
           __android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to find stream info");
           return JNI_FALSE;
       }

       AVInputFormat * iformat = formatContext->iformat;
       __android_log_print(ANDROID_LOG_DEBUG, TAG, "format: %s", iformat->name);

       context.streamsCount = formatContext->nb_streams;
       __android_log_print(ANDROID_LOG_DEBUG, TAG, "Streams count: %d", formatContext->nb_streams);

       int i = 0;
       AVStream* videoStream = NULL;
       AVDictionaryEntry* lang;
        for (i = 0; i < formatContext->nb_streams; i++) {
            int codecType = formatContext->streams[i]->codec->codec_type;
            if (videoStream == NULL && codecType == AVMEDIA_TYPE_VIDEO) {
               videoStream = formatContext->streams[i];
           }
           else if (codecType == AVMEDIA_TYPE_AUDIO) {
               lang = av_dict_get(formatContext->streams[i]->metadata, "language", NULL, 0);
               if (lang != NULL) {
                   __android_log_print(ANDROID_LOG_DEBUG, TAG, "Audio stream %d: %s", i, lang->value);
               }
           }
       }
       if (videoStream == NULL) {
           __android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to find video stream");
           return JNI_FALSE;
       }
       context.videoStream = videoStream;
       __android_log_print(ANDROID_LOG_DEBUG, TAG, "Video stream:  %d", videoStream->index);

       AVCodecContext *codecContext = formatContext->streams[videoStream->index]->codec;

       __android_log_print(ANDROID_LOG_DEBUG, TAG, "width: %d, height: %d", codecContext->width, codecContext->height);
       context.width = codecContext->width;
       context.height = codecContext->height;

       AVRational aspectRatio = codecContext->sample_aspect_ratio;
       __android_log_print(ANDROID_LOG_DEBUG, TAG, "aspect ratio: %d/%d", aspectRatio.num, aspectRatio.den);
        context.aspectRatio = (float)aspectRatio.num / aspectRatio.den;

       context.formatContext = formatContext;

       return JNI_TRUE;
    }

    void filterPacket()
    {
        av_bitstream_filter_filter(avBitStreamFilterContext, context.videoStream->codec, NULL, &packet.data, &packet.size, packet.data, packet.size, packet.flags);
    }

    JNIEXPORT void JNICALL Java_com_example_app_FfmpegPlayer_startNative(JNIEnv* env, jobject obj)
    {
       jclass cl = (*env)->GetObjectClass(env, obj);
       jmethodID updateMethodId = (*env)->GetMethodID(env, cl, "onNewVideoData", "()V");

        while (av_read_frame(context.formatContext, &packet) >= 0) {
           if (context.formatContext == NULL) {
               return;
           }
           if (packet.stream_index == context.videoStream->index) {
               filterPacket();
               (*env)->CallVoidMethod(env, obj, updateMethodId);
           }
       }
    }

    JNIEXPORT jbyteArray JNICALL Java_com_example_app_FfmpegPlayer_getVideoDataNative(JNIEnv* env, jobject obj)
    {
       AVBufferRef *buf = packet.buf;

       jbyteArray arr = (*env)->NewByteArray(env, buf->size);
       (*env)->SetByteArrayRegion(env, arr, 0, buf->size, (jbyte*)buf->data);

       return arr;
    }

     The full Java code:

    package com.example.app;


    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    import java.nio.ByteBuffer;

    public class FfmpegPlayer {

       static {
           System.loadLibrary("avutil-54");
           System.loadLibrary("swscale-3");
           System.loadLibrary("swresample-1");
           System.loadLibrary("avcodec-56");
           System.loadLibrary("avformat-56");
           System.loadLibrary("avfilter-5");
           System.loadLibrary("ffmpeg-player");
       }

       private native boolean initNative(String url);
        private native void startNative();
       private native int getWidthNative();
       private native int getHeightNative();
       private native float getAspectRatioNative();
       private native byte[] getVideoDataNative();
       private native long getPtsNative();
       private native byte[] getCsdNative();

       private String source;
       private PlayerThread playerThread;
       private int width;
       private int height;
       private MediaCodec decoder;
       private ByteBuffer[] inputBuffers;
       private Surface surface;
       private long firstPtsTime;

        public FfmpegPlayer(Surface surface) {
           this.surface = surface;
       }

       public void setDataSource(String source) {
           if (!initNative(source)) {
               return;
           }
           width = getWidthNative();
           height = getHeightNative();
           MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
           format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, width * height);
           format.setByteBuffer("csd-0", ByteBuffer.wrap(getCsdNative()));
           LogUtils.log("CSD: ");
           outputAsHex(getCsdNative());
           try {
               decoder = MediaCodec.createDecoderByType("video/avc");
               decoder.configure(format, surface, null, 0);
               decoder.start();

               playerThread = new PlayerThread();
               playerThread.start();

               new OutputThread().run();
           }
           catch (Exception e) {
               e.printStackTrace();
           }
       }

       public void onNewVideoData() {
           int index = decoder.dequeueInputBuffer(0);
           if (index >= 0) {
               byte[] data = getVideoDataNative();
               ByteBuffer byteBuffer = decoder.getInputBuffers()[index];
               byteBuffer.clear();
               byteBuffer.put(data);
               long pts = getPtsNative();

               LogUtils.log("Input AVPacket pts: " + pts);
               LogUtils.log("Input AVPacket data length: " + data.length);
               LogUtils.log("Input AVPacket data: ");
               outputAsHex(data);

               if (firstPtsTime == 0) {
                   firstPtsTime = pts;
                   pts = 0;
               }
               else {
                   pts -= firstPtsTime;
               }
               decoder.queueInputBuffer(index, 0, data.length, pts, 0);
           }
       }

       private void outputAsHex(byte[] data) {
           String[] test = new String[data.length];
            for (int i = 0; i < data.length; i++) {
               test[i] = String.format("%02x", data[i]);
           }
           LogUtils.log(test);
       }

       private class PlayerThread extends Thread {
           @Override
           public void run() {
               super.run();

               startNative();
           }
       }

       private class OutputThread extends Thread {

           @Override
           public void run() {
               super.run();
               MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
               while (true) {
                   int index = decoder.dequeueOutputBuffer(info, 0);
                   if (index >= 0) {
                       ByteBuffer buffer = decoder.getOutputBuffers()[index];
                       buffer.position(info.offset);
                       buffer.limit(info.offset + info.size);
                       byte[] test = new byte[info.size];
                        for (int i = 0; i < info.size; i++) {
                           test[i] = buffer.get(i);
                       }
                       LogUtils.log("Output info: size=" + info.size + ", presentationTimeUs=" + info.presentationTimeUs + ",offset=" + info.offset + ",flags=" + info.flags);
                       LogUtils.log("Output data: ");
                       outputAsHex(test);
                       decoder.releaseOutputBuffer(index, true);
                   }
               }
           }
       }
    }
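
     The pts handling in onNewVideoData() can be isolated into a small standalone sketch (the class name is mine) to show what the decoder actually receives: the first packet is rebased to 0 and later packets become offsets from it, which can go negative when packets arrive in decode order:

```java
// Standalone sketch of the pts rebasing done in onNewVideoData().
// Note: as in the original, firstPtsTime == 0 doubles as the "unset"
// sentinel, so a stream whose first real pts is 0 would never set it.
public class PtsRebaser {
    private long firstPtsTime;

    public long rebase(long pts) {
        if (firstPtsTime == 0) {
            firstPtsTime = pts;
            return 0;
        }
        return pts - firstPtsTime;
    }

    public static void main(String[] args) {
        // The first three input pts values from the log further down:
        PtsRebaser r = new PtsRebaser();
        System.out.println(r.rebase(351519222L)); // 0
        System.out.println(r.rebase(351539222L)); // 20000
        System.out.println(r.rebase(351439222L)); // -80000
    }
}
```

     The third packet ends up with a negative timestamp, which is consistent with the negative presentationTimeUs seen in the Asus Zenfone output log.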

     The problem:
     For the tests, I used a TS file with the following video stream:

    Codec: H264 - MPEG-4 AVC (part 10) (h264)
    Resolution: 720x578
    Frame rate: 25
    Decoded format: Planar 4:2:0 YUV

     The CSD is the following:

    [00, 00, 00, 01, 09, 10, 00, 00, 00, 01, 27, 4d, 40, 1e, 9a, 62, 01, 68, 48, b0, 44, 20, a0, a0, a8, 00, 00, 03, 00, 08, 00, 00, 03, 01, 94, a0, 00, 00, 00, 01, 28, ee, 3c, 80]
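
     As a quick sanity check (a hypothetical helper, not part of the project), the CSD can be split on its 00 00 00 01 start codes to list the NAL unit types it carries. It turns out to contain an access unit delimiter (type 9) in front of the SPS (type 7) and PPS (type 8), while, as far as I understand, MediaCodec's csd-0 is normally expected to carry only SPS/PPS:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper, not part of the project: walks an Annex-B byte
// stream, finds each 00 00 00 01 start code and reports the NAL unit
// type (the low 5 bits of the byte that follows the start code).
public class CsdInspector {

    public static List<Integer> nalTypes(byte[] annexB) {
        List<Integer> types = new ArrayList<>();
        for (int i = 0; i + 4 < annexB.length; i++) {
            if (annexB[i] == 0 && annexB[i + 1] == 0
                    && annexB[i + 2] == 0 && annexB[i + 3] == 1) {
                types.add(annexB[i + 4] & 0x1f);
            }
        }
        return types;
    }

    public static void main(String[] args) {
        // The CSD from the question, verbatim.
        byte[] csd = {
            0x00, 0x00, 0x00, 0x01, 0x09, 0x10,
            0x00, 0x00, 0x00, 0x01, 0x27, 0x4d, 0x40, 0x1e, (byte) 0x9a, 0x62,
            0x01, 0x68, 0x48, (byte) 0xb0, 0x44, 0x20, (byte) 0xa0, (byte) 0xa0,
            (byte) 0xa8, 0x00, 0x00, 0x03, 0x00, 0x08, 0x00, 0x00, 0x03, 0x01,
            (byte) 0x94, (byte) 0xa0,
            0x00, 0x00, 0x00, 0x01, 0x28, (byte) 0xee, 0x3c, (byte) 0x80
        };
        // 9 = access unit delimiter, 7 = SPS, 8 = PPS
        System.out.println(nalTypes(csd)); // prints [9, 7, 8]
    }
}
```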

     I get different results on different devices, but I could not get the video to display on the Surface.

     Input:

    Input AVPacket pts: 351519222
    Input AVPacket data length: 54941
    Input AVPacket data: [00, 00, 00, 01, 09, 10, 00, 00, 00, 01, 27, 4d, 40, 1e, 9a, 62, 01, 68, 48, b0, 44, 20, a0, a0, a8, 00, 00, 03, 00, 08, 00, 00, 03, 01, 94, a0, 00, 00, 00, 01,...]
    ------------------------------------
    Input AVPacket pts: 351539222
    Input AVPacket data length: 9605
    Input AVPacket data: [00, 00, 00, 01, 09, 30, 00, 00, 00, 01, 06, 01, 01, 24, 80, 00, 00, 00, 01, 21, e3, bd, da, e4, 46, c5, 8b, 6b, 7d, 07, 59, 23, 6f, 92, e9, fb, 3b, b9, 4d, f9,...]
    ------------------------------------
    Input AVPacket pts: 351439222
    Input AVPacket data length: 1985
    Input AVPacket data: [00, 00, 00, 01, 09, 50, 00, 00, 00, 01, 06, 01, 01, 14, 80, 00, 00, 00, 01, 21, a8, f2, 74, 69, 14, 54, 4d, c5, 8b, e8, 42, 52, ac, 80, 53, b4, 4d, 24, 1f, 6c,...]
    ------------------------------------
    Input AVPacket pts: 351459222
    Input AVPacket data length: 2121
    Input AVPacket data: [00, 00, 00, 01, 09, 50, 00, 00, 00, 01, 06, 01, 01, 24, 80, 00, 00, 00, 01, 21, a8, f3, 74, e9, 0b, 8b, 17, e8, 43, f8, 10, 88, ca, 2b, 11, 53, c8, 31, f0, 0b,...]
    ... on and on

     Asus Zenfone (Android 5.0.2) output thread (after decoding, strangely produces 25 buffers of only 8 bytes each):

    Output info: size=8, presentationTimeUs=-80001,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 90, c5, 99, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=0,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, 78, ea, 86, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=720000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e8, 86, b6, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=780000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, c0, cb, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=840000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 80, 87, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=960000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, 3f, 8b, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=1040000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, f8, 76, 85, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=1180000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, 87, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=1260000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e8, b5, d2, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=1800000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 90, c5, 99, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=1860000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, c0, 84, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=2080000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, c0, cb, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=3440000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, 80, 87, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=3520000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 78, ea, 86, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4160000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e8, 86, b6, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4300000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, 3f, 8b, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4400000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, 90, c5, 99, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4480000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, f8, 76, 85, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4680000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, c0, cb, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4720000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, c0, 84, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4760000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, 87, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4800000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 58, 54, 83, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=5040000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, e8, b5, d2, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=5100000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, 80, 87, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=5320000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 78, ea, 86, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=5380000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e8, 86, b6, ac]

     Other Asus Zenfone logs:

    01-25 17:11:36.859 4851-4934/com.example.app I/OMXClient: Using client-side OMX mux.
    01-25 17:11:36.865 317-1075/? I/OMX-VDEC-1080P: component_init: OMX.qcom.video.decoder.avc : fd=43
    01-25 17:11:36.867 317-1075/? I/OMX-VDEC-1080P: Capabilities: driver_name = msm_vidc_driver, card = msm_vdec_8974, bus_info = , version = 1, capabilities = 4003000
    01-25 17:11:36.881 317-1075/? I/OMX-VDEC-1080P: omx_vdec::component_init() success : fd=43
    01-25 17:11:36.885 4851-4934/com.example.app I/ACodec: [OMX.qcom.video.decoder.avc] DRC Mode: Dynamic Buffer Mode
    01-25 17:11:36.893 317-20612/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.933 317-12269/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.933 317-12269/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.935 317-5559/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.957 317-5559/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.957 4851-4934/com.example.app I/ExtendedCodec: Decoder will be in frame by frame mode
    01-25 17:11:36.963 317-1075/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.963 317-1075/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.964 317-20612/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.describeColorFormat not implemented
    01-25 17:11:37.072 317-20612/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.describeColorFormat not implemented
    01-25 17:11:37.072 4851-4934/com.example.app W/ACodec: do not know color format 0x7fa30c04 = 2141391876

     Asus Nexus 7 (Android 6.0.1) crashes:

    01-25 17:23:06.921 11602-11695/com.example.app I/OMXClient: Using client-side OMX mux.
    01-25 17:23:06.952 11602-11694/com.example.app I/MediaCodec: [OMX.qcom.video.decoder.avc] setting surface generation to 11880449
    01-25 17:23:06.954 194-194/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.storeANWBufferInMetadata not implemented
    01-25 17:23:06.954 194-194/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.storeMetaDataInBuffers not implemented
    01-25 17:23:06.954 194-194/? E/OMXNodeInstance: getExtensionIndex(45:qcom.decoder.avc, OMX.google.android.index.storeMetaDataInBuffers) ERROR: NotImplemented(0x80001006)
    01-25 17:23:06.954 11602-11695/com.example.app E/ACodec: [OMX.qcom.video.decoder.avc] storeMetaDataInBuffers failed w/ err -2147483648
    01-25 17:23:06.963 11602-11695/com.example.app D/SurfaceUtils: set up nativeWindow 0xa0b7a108 for 720x576, color 0x7fa30c03, rotation 0, usage 0x42002900
    01-25 17:23:06.967 194-604/? E/OMX-VDEC-1080P: GET_MV_BUFFER_SIZE returned: Size: 122880 and alignment: 8192
    01-25 17:23:07.203 11602-11695/com.example.app W/AHierarchicalStateMachine: Warning message AMessage(what = 'omxI') = {
                                                                            int32_t type = 0
                                                                            int32_t event = 2130706432
                                                                            int32_t data1 = 1
                                                                            int32_t data2 = 0
                                                                          } unhandled in root state.
    01-25 17:23:07.232 11602-11695/com.example.app D/SurfaceUtils: set up nativeWindow 0xa0b7a108 for 720x576, color 0x7fa30c03, rotation 0, usage 0x42002900
    01-25 17:23:07.241 194-194/? E/OMX-VDEC-1080P: GET_MV_BUFFER_SIZE returned: Size: 122880 and alignment: 8192
    01-25 17:23:07.242 194-194/? E/OMX-VDEC-1080P: Insufficient sized buffer given for playback, expected 671744, got 663552
    01-25 17:23:07.242 194-194/? E/OMXNodeInstance: useBuffer(45:qcom.decoder.avc, Output:1 671744@0xb60a0860) ERROR: BadParameter(0x80001005)
    01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: registering GraphicBuffer 0 with OMX IL component failed: -2147483648
    01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: Failed to allocate output port buffers after port reconfiguration: (-2147483648)
    01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: signalError(omxError 0x80001001, internalError -2147483648)
    01-25 17:23:07.243 11602-11694/com.example.app E/MediaCodec: Codec reported err 0x80001001, actionCode 0, while in state 6
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err: java.lang.IllegalStateException
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at android.media.MediaCodec.native_dequeueOutputBuffer(Native Method)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at android.media.MediaCodec.dequeueOutputBuffer(MediaCodec.java:2379)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at com.example.app.FfmpegPlayer$OutputThread.run(FfmpegPlayer.java:122)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at com.example.app.FfmpegPlayer.setDataSource(FfmpegPlayer.java:66)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at com.example.app.activities.TestActivity$2.surfaceCreated(TestActivity.java:151)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at android.view.SurfaceView.updateWindow(SurfaceView.java:583)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at android.view.SurfaceView$3.onPreDraw(SurfaceView.java:177)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.ViewTreeObserver.dispatchOnPreDraw(ViewTreeObserver.java:944)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:2055)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1107)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:6013)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.Choreographer$CallbackRecord.run(Choreographer.java:858)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.Choreographer.doCallbacks(Choreographer.java:670)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.Choreographer.doFrame(Choreographer.java:606)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:844)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.os.Handler.handleCallback(Handler.java:739)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.os.Handler.dispatchMessage(Handler.java:95)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.os.Looper.loop(Looper.java:148)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.app.ActivityThread.main(ActivityThread.java:5417)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at java.lang.reflect.Method.invoke(Native Method)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)

     Another device always has empty output buffers, though the indexes are >= 0.

     What am I doing wrong?