Advanced search

Media (0)


No media matching your criteria is available on this site.

Other articles (55)

  • Contribute to documentation

    13 April 2011

    Documentation is vital to the development of improved technical capabilities.
    MediaSPIP welcomes documentation by users as well as developers, including: critiques of existing features and functions; articles contributed by developers, administrators, content producers and editors; screenshots to illustrate the above; and translations of existing documentation into other languages.
    To contribute, register for the project users’ mailing (...)

  • Adding notes and captions to images

    7 February 2011, by

    To add notes and captions to images, the first step is to install the "Légendes" plugin.
    Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, editing and deleting notes. By default, only site administrators can add notes to images.
    Changes when adding a media item
    When adding a media item of type "image", a new button appears above the preview (...)

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed to extract the data needed for search engine indexing, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

On other sites (8715)

  • FFMPEG RTSP stream to MPEG4/H264 file using libx264

    16 October 2020, by Phi

    Heyo folks,

    



    I'm attempting to transcode/remux an RTSP stream in H264 format into an MPEG4 container, containing just the H264 video stream. Basically, webcam output into an MP4 container.

    



    I can get a poorly coded MP4 produced, using this code:

    



    // Variables here for demo
AVOutputFormat * video_file_output_format = nullptr;
AVFormatContext * video_file_format_context = nullptr;
AVFormatContext * rtsp_format_context = nullptr;
AVCodecContext * video_file_codec_context = nullptr;
AVCodecContext * rtsp_vidstream_codec_context = nullptr;
AVCodec * video_file_encoder_codec = nullptr;
AVPacket packet = {0};
AVStream * video_file_stream = nullptr;
AVCodec * rtsp_decoder_codec = nullptr;
int errorNum = 0, video_stream_index = 0;
std::string outputMP4file = "D:\\somemp4file.mp4";

// begin
AVDictionary * opts = nullptr;
av_dict_set(&opts, "rtsp_transport", "tcp", 0);

if ((errorNum = avformat_open_input(&rtsp_format_context, uriANSI.c_str(), NULL, &opts)) < 0) {
    errOut << "Connection failed: avformat_open_input failed with error " << errorNum << ":\r\n" << ErrorRead(errorNum);
    TacticalAbort();
    return;
}

rtsp_format_context->max_analyze_duration = 50000;
if ((errorNum = avformat_find_stream_info(rtsp_format_context, NULL)) < 0) {
    errOut << "Connection failed: avformat_find_stream_info failed with error " << errorNum << ":\r\n" << ErrorRead(errorNum);
    TacticalAbort();
    return;
}

video_stream_index = errorNum = av_find_best_stream(rtsp_format_context, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);

if (video_stream_index < 0) {
    errOut << "Connection in unexpected state; made a connection, but there was no video stream.\r\n"
        "Attempts to find a video stream resulted in error " << errorNum << ": " << ErrorRead(errorNum);
    TacticalAbort();
    return;
}

rtsp_vidstream_codec_context = rtsp_format_context->streams[video_stream_index]->codec;

av_init_packet(&packet);

if (!(video_file_output_format = av_guess_format(NULL, outputMP4file.c_str(),  NULL))) {
    TacticalAbort();
    throw std::exception("av_guess_format");
}

if (!(rtsp_decoder_codec = avcodec_find_decoder(rtsp_vidstream_codec_context->codec_id))) {
    errOut << "Connection failed: connected, but avcodec_find_decoder returned null.\r\n"
        "Couldn't find codec with an AV_CODEC_ID value of " << rtsp_vidstream_codec_context->codec_id << ".";
    TacticalAbort();
    return;
}

video_file_format_context = avformat_alloc_context();
video_file_format_context->oformat = video_file_output_format;

if (strcpy_s(video_file_format_context->filename, sizeof(video_file_format_context->filename), outputMP4file.c_str())) {
    errOut << "Couldn't open video file: strcpy_s failed with error " << errno << ".";
    std::string log = errOut.str();
    TacticalAbort();
    throw std::exception("strcpy_s");
}

if (!(video_file_encoder_codec = avcodec_find_encoder(video_file_output_format->video_codec))) {
    TacticalAbort();
    throw std::exception("avcodec_find_encoder");
}

// MARKER ONE

if (!outputMP4file.empty() &&
    !(video_file_output_format->flags & AVFMT_NOFILE) &&
    (errorNum = avio_open2(&video_file_format_context->pb, outputMP4file.c_str(), AVIO_FLAG_WRITE, nullptr, &opts)) < 0) {
    errOut << "Couldn't open video file \"" << outputMP4file << "\" for writing : avio_open2 failed with error " << errorNum << ": " << ErrorRead(errorNum);
    TacticalAbort();
    return;
}

// Create stream in MP4 file
if (!(video_file_stream = avformat_new_stream(video_file_format_context, video_file_encoder_codec))) {
    TacticalAbort();
    return;
}

video_file_codec_context = video_file_stream->codec;

// MARKER TWO

// error -22/-21 in avio_open2 if this is skipped
if ((errorNum = avcodec_copy_context(video_file_codec_context, rtsp_vidstream_codec_context)) != 0) {
    TacticalAbort();
    throw std::exception("avcodec_copy_context");
}

//video_file_codec_context->codec_tag = 0;

/*
// MARKER 3 - is this not needed? Examples suggest not.
if ((errorNum = avcodec_open2(video_file_codec_context, video_file_encoder_codec, &opts)) < 0)
{
    errOut << "Couldn't open video file codec context: avcodec_open2 failed with error " << errorNum << ": " << ErrorRead(errorNum);
    std::string log = errOut.str();
    TacticalAbort();
    throw std::exception("avcodec_open2, video file");
}*/

//video_file_format_context->flags |= AVFMT_FLAG_GENPTS;
if (video_file_format_context->oformat->flags & AVFMT_GLOBALHEADER)
{
    video_file_codec_context->flags |= CODEC_FLAG_GLOBAL_HEADER;
}

if ((errorNum = avformat_write_header(video_file_format_context, &opts)) < 0) {
    errOut << "Couldn't open video file: avformat_write_header failed with error " << errorNum << ":\r\n" << ErrorRead(errorNum);
    std::string log = errOut.str();
    TacticalAbort();
    return;
}


    



    However, there are several issues:

    1. I can't pass any x264 options to the output file. The output H264 matches the input H264's profile/level - switching cameras to a different model switches H264 level.
    2. The timing of the output file is off, noticeably.
    3. The duration of the output file is off, massively. A few seconds of footage becomes hours, although playtime doesn't match. (FWIW, I'm using VLC to play them.)

    Passing x264 options

    If I manually increment PTS per packet, and set DTS equal to PTS, it plays too fast: 2-3 seconds' worth of footage in one second of playtime, and the duration is hours long. The footage also blurs past several seconds at a time, about 10 seconds' footage in a second.

    



    If I let FFMPEG decide (with or without GENPTS flag), the file has a variable frame rate (probably as expected), but it plays the whole file in an instant and has a long duration too (over forty hours for a few seconds). The duration isn't "real", as the file plays in an instant.
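
    For reference, the usual remuxing pattern (a sketch, assuming a write loop built on the variables declared above, not code taken from this project) rescales each packet's timestamps from the input stream's time base to the output stream's time base instead of setting PTS/DTS by hand:

// Sketch only: remux write loop with timestamp rescaling.
while (av_read_frame(rtsp_format_context, &packet) >= 0) {
    if (packet.stream_index == video_stream_index) {
        // Convert pts/dts/duration from the RTSP stream's time base
        // to the time base the MP4 muxer chose for the output stream.
        av_packet_rescale_ts(&packet,
                             rtsp_format_context->streams[video_stream_index]->time_base,
                             video_file_stream->time_base);
        packet.stream_index = video_file_stream->index;
        av_interleaved_write_frame(video_file_format_context, &packet);
    }
    av_packet_unref(&packet);
}
av_write_trailer(video_file_format_context);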

    



    At Marker One, I try to set the profile by passing options to avio_open2. The options are simply ignored by libx264. I've tried:

    



    av_dict_set(&opts, "vprofile", "main", 0);
av_dict_set(&opts, "profile", "main", 0); // error, missing '('
// FF_PROFILE_H264_MAIN equals 77, so I also tried
av_dict_set(&opts, "vprofile", "77", 0); 
av_dict_set(&opts, "profile", "77", 0);


    



    It does seem to read the profile setting, but it doesn't use it. At Marker Two, I tried to set it after the avio_open2, before avformat_write_header.

    



    // I tried all 4 av_dict_set from earlier, passing it to avformat_write_header.
// None had any effect, they weren't consumed.
av_opt_set(video_file_codec_context, "profile", "77", 0);
av_opt_set(video_file_codec_context, "profile", "main", 0);
video_file_codec_context->profile = FF_PROFILE_H264_MAIN;
av_opt_set(video_file_codec_context->priv_data, "profile", "77", 0);
av_opt_set(video_file_codec_context->priv_data, "profile", "main", 0);


    



    Messing with privdata made the program unstable, but I was trying anything at that point.
I'd like to solve issue 1 with passing settings, since I imagine it'd bottleneck any attempt to solve issues 2 or 3.
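
    For what it's worth, libx264 only sees options that are handed to avcodec_open2 on an encoder context; dictionaries passed to avio_open2 or avformat_write_header go to the I/O protocol and the muxer respectively. A minimal sketch, assuming the stream were actually re-encoded rather than copied with avcodec_copy_context, and with assumed values for pixel format and frame rate:

// Sketch only: x264 settings go to the encoder, not the muxer or I/O layer.
AVDictionary * enc_opts = nullptr;
av_dict_set(&enc_opts, "profile", "main", 0);   // consumed by libx264
av_dict_set(&enc_opts, "preset", "veryfast", 0);
av_dict_set(&enc_opts, "level", "4.0", 0);

video_file_codec_context->width     = rtsp_vidstream_codec_context->width;
video_file_codec_context->height    = rtsp_vidstream_codec_context->height;
video_file_codec_context->pix_fmt   = AV_PIX_FMT_YUV420P;  // assumed
video_file_codec_context->time_base = { 1, 25 };           // assumed frame rate

if ((errorNum = avcodec_open2(video_file_codec_context, video_file_encoder_codec, &enc_opts)) < 0) {
    // handle the error; entries libx264 did not recognise are left in enc_opts
}
av_dict_free(&enc_opts);

    With plain stream copy (avcodec_copy_context and no encoder opened), libx264 never runs at all, which is why the output profile/level always follows whatever the camera produced.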

    



    I've been fiddling with this for the better part of a month now. I've been through dozens of documentation pages, Q&As and examples. It doesn't help that quite a few are outdated.

    



    Any help would be appreciated.

    



    Cheers

    


  • Can't decode h264 frame by frame

    17 May 2017, by J. Doe

    Here is the problematic code:

    int frame_count{0};
    int status = -1;
    while ((status = av_read_frame(ctx, pkt)) >= 0) {
       int got_frame;
       auto len =
           avcodec_decode_video2(video_ctx, frame, &got_frame, pkt);
       errcheck(len);

       if (got_frame == 0)
           errthrow("No frame could be decompressed");

       auto w = frame->width;
       auto h = frame->height;
       auto gray_convert_ctx = sws_getContext(
           w, h, input_pix_format, w, h, output_pix_format, SWS_POINT,
           nullptr, nullptr, nullptr);

       sws_scale(gray_convert_ctx, frame->data, frame->linesize, 0, h,
             frame_converted->data, frame_converted->linesize);

       f_(frame_converted->data[0], frame_converted->linesize[0], w,
          h);
       ++frame_count;
       sws_freeContext(gray_convert_ctx);

       if (pkt->data) {
           pkt->size -= len;
           pkt->data += len;
       }
    }
    if (status != AVERROR_EOF)
       errcheck(status);

    With vp8/vp9 everything is fine, but when I try to decode h264 I get this error:

    ➤ ./cv /tmp/x
    file size: 694KiB
    read /tmp/x: 694KiB
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x191fc00] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x191fc00] ISO: File Type Major Brand: mp42
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x191fc00] rfps: 31.000000 0.000599
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x191fc00] Before avformat_find_stream_info() pos: 3104 bytes read:32768 seeks:0
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x191fc00] All info found
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x191fc00] After avformat_find_stream_info() pos: 456697 bytes read:458752 seeks:0 frames:34
    [h264 @ 0x1926700] no frame!
    error: AV: Invalid data found when processing input

    Maybe that's because h264 does not support AV_CODEC_CAP_TRUNCATED? But I would suppose that should be handled in av_read_frame.

    If I ignore the error (just skipping that read when it occurs), no frame is decoded anyway until EOF. The video plays fine with ffplay/mpv/etc. and was recorded on Android.
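
    For comparison, a typical decode loop for containerized H264 with the same pre-send/receive API (a sketch, with handle_frame standing in for the sws_scale and callback step) feeds each packet whole, treats got_frame == 0 as "the decoder needs more input" rather than an error, and flushes the decoder at EOF:

    // Sketch only: tolerant decode loop for avcodec_decode_video2.
    while (av_read_frame(ctx, pkt) >= 0) {
        int got_frame = 0;
        int len = avcodec_decode_video2(video_ctx, frame, &got_frame, pkt);
        errcheck(len);               // only a negative return is a real error
        if (got_frame)
            handle_frame(frame);     // hypothetical helper: sws_scale + callback
        av_packet_unref(pkt);        // no manual pkt->data / pkt->size advancing
    }

    // Drain frames the decoder is still holding back.
    AVPacket flush_pkt;
    av_init_packet(&flush_pkt);
    flush_pkt.data = nullptr;
    flush_pkt.size = 0;
    int got_frame = 1;
    while (got_frame) {
        if (avcodec_decode_video2(video_ctx, frame, &got_frame, &flush_pkt) < 0)
            break;
        if (got_frame)
            handle_frame(frame);
    }

    The full source of the failing program follows, for reference.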

    #include <iostream>

    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    }

    #include <functional>

    class Video
    {
         public:
       static std::string TAG;

       Video();
       Video(void *data_ptr, size_t data_size);
       ~Video();

       void set(void *data_ptr, size_t data_size);
       void process(std::function<void(unsigned char *, int, int, int)> f_);

         private:
       static constexpr AVPixelFormat output_pix_format{AV_PIX_FMT_GRAY8};

       struct {
           uint8_t *ptr{nullptr};
           size_t size;
       } bd;

       bool video_ctx_opened{false};
       AVCodecContext *video_ctx{nullptr};
       AVStream *video_stream{nullptr};

       size_t width;
       size_t heigh;

       AVPixelFormat input_pix_format;

       size_t avio_ctx_buffer_size = 32 * 1024; // 32 KiB
       uint8_t *avio_ctx_buffer{nullptr};

       AVFormatContext *ctx{nullptr};
       AVIOContext *avio_ctx{nullptr};

       uint8_t *frame_converted_buffer{nullptr};
       AVFrame *frame_converted{nullptr};

       AVFrame *frame{nullptr};
       AVPacket *pkt{nullptr};

       void init_stream();
       void init_codec();
       void init_frame_converted();
    };


    extern "C" {
    #include <libswscale/swscale.h>
    }

    #include <iostream>
    #include <stdexcept>

    namespace
    {
    using str_t = decltype(Video::TAG);

    static str_t averr(int code)
    {
       static thread_local std::array<char, AV_ERROR_MAX_STRING_SIZE> buf;
       av_make_error_string(buf.data(), buf.size(), code);
       return str_t(buf.data(), buf.size());
    }

    static str_t errstr(int err) { return Video::TAG + ": " + averr(err); }

    static str_t errstr(const char *err) { return Video::TAG + ": " + err; }

    static void errthrow(str_t err) { throw std::runtime_error{std::move(err)}; }

    static void errcheck(int val)
    {
       if (val < 0)
           errthrow(errstr(val));
    }

    template <class T> static void errcheck(T *ptr, const char *errmsg)
    {
       if (!ptr)
           errthrow(errstr(errmsg));
    }

    static int read_packet(void *opaque, uint8_t *buf, int buf_size)
    {
       struct _bd {
           uint8_t *ptr;
           size_t size;
       };
       _bd *bd = static_cast<_bd *>(opaque);

       buf_size = FFMIN(buf_size, bd->size);

       memcpy(buf, bd->ptr, buf_size);
       bd->ptr += buf_size;
       bd->size -= buf_size;

       return buf_size;
    }
    }

    std::string Video::TAG = "AV";

    Video::Video()
    {
       av_register_all();
       avcodec_register_all();

       frame = av_frame_alloc();
       errcheck(frame, "Could not allocate frame");

       pkt = static_cast<AVPacket *>(av_malloc(sizeof(AVPacket)));
       errcheck(pkt, "Could not allocate packet");
       av_init_packet(pkt);
    }

    Video::Video(void *data_ptr, size_t data_size) : Video()
    {
       set(data_ptr, data_size);
    }

    Video::~Video()
    {
       avformat_close_input(&ctx);
       if (avio_ctx) {
           av_freep(&avio_ctx->buffer);
           av_freep(&avio_ctx);
       }

       if (video_ctx) {
           avcodec_close(video_ctx);
           av_free(video_ctx);
       }
       if (frame)
           av_frame_free(&frame);
       if (frame_converted_buffer)
           av_freep(&frame_converted_buffer);
       if (frame_converted)
           av_frame_free(&frame_converted);
       if (pkt) {
           av_free_packet(pkt);
           av_free(pkt);
       }
    }

    void Video::set(void *data_ptr, size_t data_size)
    {
       bd.ptr = static_cast<uint8_t *>(data_ptr);
       bd.size = data_size;

       init_stream();
       init_frame_converted();
       init_codec();
       pkt->data = nullptr;
       pkt->size = 0;
    }

    void Video::process(
       std::function<void(unsigned char *, int, int, int)> f_)
    {
       int frame_count{0};
       int status = -1;
       while ((status = av_read_frame(ctx, pkt)) >= 0) {
           int got_frame;
           auto len =
               avcodec_decode_video2(video_ctx, frame, &got_frame, pkt);
           errcheck(len);

           if (got_frame == 0)
               errthrow("No frame could be decompressed");

           auto w = frame->width;
           auto h = frame->height;
           auto gray_convert_ctx = sws_getContext(
               w, h, input_pix_format, w, h, output_pix_format, SWS_POINT,
               nullptr, nullptr, nullptr);

           sws_scale(gray_convert_ctx, frame->data, frame->linesize, 0, h,
                     frame_converted->data, frame_converted->linesize);

           f_(frame_converted->data[0], frame_converted->linesize[0], w,
              h);
           ++frame_count;
           sws_freeContext(gray_convert_ctx);

           if (pkt->data) {
               pkt->size -= len;
               pkt->data += len;
           }
       }
       if (status != AVERROR_EOF)
           errcheck(status);
    }

    void Video::init_stream()
    {
       ctx = avformat_alloc_context();
       errcheck(ctx, "Could not allocate format context");

       avio_ctx_buffer =
           static_cast<uint8_t *>(av_malloc(avio_ctx_buffer_size));
       errcheck(avio_ctx_buffer, "Could not allocate io buffer");

       avio_ctx = avio_alloc_context(avio_ctx_buffer, avio_ctx_buffer_size, 0,
                                     &bd, &read_packet, nullptr, nullptr);
       errcheck(avio_ctx, "Could not allocate io context");
       ctx->pb = avio_ctx;

       auto status = avformat_open_input(&ctx, nullptr, nullptr, nullptr);
       errcheck(status);

       status = avformat_find_stream_info(ctx, nullptr);
       errcheck(status);

       for (decltype(ctx->nb_streams) i = 0; i < ctx->nb_streams; ++i) {
           auto stream = ctx->streams[i];
           if (!stream || !stream->codec)
               continue;
           if (stream->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
               video_stream = stream;
               break;
           }
       }
       errcheck(video_stream, "Could not find valid video stream");

       width = video_stream->codec->width;
       heigh = video_stream->codec->height;

       input_pix_format = video_stream->codec->pix_fmt;
    }

    void Video::init_codec()
    {
       auto codec = avcodec_find_decoder(video_stream->codec->codec_id);
       errcheck(codec, "Codec not found");

       video_ctx = avcodec_alloc_context3(codec);
       errcheck(video_ctx, "Could not allocate video codec context");

       if (codec->capabilities & AV_CODEC_CAP_TRUNCATED)
           video_ctx->flags |=
               AV_CODEC_FLAG_TRUNCATED; // we do not send complete frames

       auto status = avcodec_open2(video_ctx, codec, nullptr);
       errcheck(status);
    }

    void Video::init_frame_converted()
    {
       frame_converted = av_frame_alloc();
       errcheck(frame_converted, "Could not allocate frame");

       int frame_converted_buffer_size =
           avpicture_get_size(output_pix_format, width, heigh);
       errcheck(frame_converted_buffer_size);

       frame_converted_buffer =
           static_cast<uint8_t *>(av_malloc(frame_converted_buffer_size));
       errcheck(frame_converted_buffer, "Could not allocate picture buffer");

       auto status = avpicture_fill(
           reinterpret_cast<AVPicture *>(frame_converted),
           frame_converted_buffer, output_pix_format, width, heigh);
       errcheck(status);
    }

    #include <vector>
    #include <fstream>

    std::vector<char> read_file(const std::string &fname)
    {
       std::ifstream file(fname, std::ios::binary | std::ios::ate);
       if (!file.is_open())
           throw std::runtime_error{"can't open " + fname};

       auto size = file.tellg();
       file.seekg(0, std::ios::beg);

       std::cout &lt;&lt; "file size: " &lt;&lt; std::to_string(size / 1024) &lt;&lt; "KiB\n";

       std::vector<char> buffer(size);

       if (file.read(buffer.data(), size))
           return buffer;
       return {};
    }

    int main(int argc, const char **argv)
    {
       if (argc < 2)
           return EXIT_FAILURE;

       av_log_set_level(AV_LOG_DEBUG);

       try {
           auto data = read_file(argv[1]);
           std::cout &lt;&lt; "read " &lt;&lt; argv[1] &lt;&lt; ": "
                     &lt;&lt; std::to_string(data.size() / 1024) &lt;&lt; "KiB\n";

           Video v;
           v.set(data.data(), data.size());

           v.process([](unsigned char *data, int wrap, int xsize,
                        int ysize) {
               std::cout << "w: " << xsize
                         << " h: " << ysize << '\n';
           });

       } catch (const std::runtime_error &e) {
           std::cout << "error: " << e.what() << '\n';
       }
    }

    Compile & run:
    g++ cv.cpp -std=c++14 -lavutil -lavcodec -lavformat -lswscale ; ./a.out file.name.here

  • Android plugin with .so lib's (UNITY)

    23 March 2017, by Vladimir Ilianov

    So I need FFmpeg on Android (for muxing audio and video).
    I found this awesome project for Android Studio (link at the end of the post, in the rar as ffmpeg4android_demo_studio2).
    It works perfectly, does the muxing and is overall an awesome project. So I decided to make an Android plugin out of that project, and reworked it like this (link at the end of the post, in the rar as ffmpeg4android_demo_studio; P.S. the GeneralUtils code is not included because it is big, please download the rar and open it):

    package com.netcompss.ffmpeg4android;

    public class CommandValidationException extends Exception {
       private static final long serialVersionUID = 1L;
    }

    =============================================

    package com.netcompss.ffmpeg4android;

       import android.content.Context;
       import android.util.Log;
       import android.widget.Toast;    
       public class FFMpeg {
       private Context context;
       private static FFMpeg instance;
       public FFMpeg (){
           this.instance = this;
       }

       public static  FFMpeg instance(){
           if(instance == null){
               instance = new FFMpeg();
           }
           return instance;
       }

       public void setContext(Context context){
           this.context = context;
       }

       public void mux(String video,String audio,String out){
           //GeneralUtils.checkForPermissionsMAndAbove(currentActivity, true);
           LoadJNI vk = new LoadJNI();
           try {
               String workFolder = context.getFilesDir().getAbsolutePath()+ "/";
               String cmd = "ffmpeg -i "+video+" -i "+audio+" -c copy -map 0:v:0 -map 1:a:0 -shortest "+out;
               vk.run(GeneralUtils.utilConvertToComplex(cmd) , workFolder , context);
               Log.i("test", "ffmpeg4android finished successfully");
           } catch (Throwable e) {
               Log.e("test", "vk run exception.", e);
           }
       }

       public void showMessage(String message){
           Toast.makeText(this.context,message,Toast.LENGTH_SHORT).show();
       }

       }

    ==================

    package com.netcompss.ffmpeg4android ;

    import android.app.Activity;
    import android.content.Context;
    import android.widget.TextView;
    import android.os.Bundle;


    public class LicenseCheckJNI
    {

       public int licenseCheck(String path, Context ctx) {
          String rcStr = "-100";
             rcStr = licenseCheckComplexJNI(path);
          int rc =Integer.decode(rcStr);
          return rc;
       }


       public native String licenseCheckComplexJNI(String path);
       public native String licenseCheckSimpleJNI(String path);




       static {
           System.loadLibrary("license-jni");
       }
    }

    package com.netcompss.ffmpeg4android;

    import java.io.File;
    import android.content.Context;
    import android.nfc.Tag;
    import android.util.Log;

    public final class LoadJNI {

      static {
         System.loadLibrary("loader-jni");
         System.loadLibrary("license-jni");
         System.loadLibrary("videokit");

      }

      /**
       *
       * @param args ffmpeg command
       * @param workFolder working directory
       * @param ctx Android context
       * @param isValidate apply validation to the command
       * @throws CommandValidationException
       */
      public void run(String[] args, String workFolder, Context ctx, boolean isValidate) throws CommandValidationException {
         Log.i(Prefs.TAG, "running ffmpeg4android_lib: " + Prefs.version);
         // delete previous log: this is essential for correct progress calculation
         String vkLogPath = workFolder + "vk.log";
         GeneralUtils.deleteFileUtil(vkLogPath);
         GeneralUtils.printCommand(args);

         //printInternalDirStructure(ctx);

         if (isValidate) {
            if (GeneralUtils.isValidCommand(args)) {
               Log.d(Prefs.TAG, "=LOAD================");
               load(args, workFolder, getVideokitLibPath(ctx), true);
            }
            else
               throw new CommandValidationException();
         }
         else {
            Log.d(Prefs.TAG, "=LOAD================");
            load(args, workFolder, getVideokitLibPath(ctx), true);
         }

      }

      /**
       *
       * @param args ffmpeg command
       * @param workFolder working directory
       * @param ctx Android context
       * @throws CommandValidationException
       */
      public void run(String[] args, String workFolder, Context ctx) throws CommandValidationException {
         run(args, workFolder, ctx, true);
      }


      private static void printInternalDirStructure(Context ctx) {
         Log.d(Prefs.TAG, "=printInternalDirStructure=");
         Log.d(Prefs.TAG, "==============================");
         File file = new File(ctx.getFilesDir().getParent());
         analyzeDir(file);
         Log.d(Prefs.TAG, "==============================");
      }

      private static void analyzeDir(File path) {
         if (path.isDirectory()) {
            Log.d(Prefs.TAG,"Scanning dir: " + path.getAbsolutePath());
            File[] files1 = path.listFiles();
            for (int i = 0; i < files1.length; i++) {
               analyzeDir(files1[i]);
            }
            Log.d(Prefs.TAG, "==========");
         }
         else {
            Log.d(Prefs.TAG, path.getAbsolutePath());

         }
      }

      private static String getVideokitLibPath(Context ctx) {

         //File file = new File(ctx.getFilesDir().getParent() + "/lib/");
         //analyzeDir(file);

         String videokitLibPath = ctx.getFilesDir().getParent()  + "/lib/libvideokit.so";

         File file = new File(videokitLibPath);
         if(file.exists())  {    
           Log.i(Prefs.TAG, "videokitLibPath exits");
         }
         else {
            Log.w(Prefs.TAG, "videokitLibPath not exits: " + videokitLibPath);
            videokitLibPath = ctx.getFilesDir().getParent()  + "/lib/arm64/libvideokit.so";
            Log.i(Prefs.TAG, "trying videokitLibPath: " + videokitLibPath);
            file = new File(videokitLibPath);
            if(file.exists())  {
               Log.i(Prefs.TAG, "videokitLibPath exits: " + videokitLibPath);
            }
            else {
               Log.w(Prefs.TAG, "videokitLibPath not exits: " + videokitLibPath);
               videokitLibPath = "/data/app/com.examples.ffmpeg4android_demo-1/lib/arm64/libvideokit.so";
               Log.i(Prefs.TAG, "trying videokitLibPath: " + videokitLibPath);
               file = new File(videokitLibPath);
               if(file.exists())  {
                  Log.i(Prefs.TAG, "videokitLibPath exits: " + videokitLibPath);
               }
               else {
                  Log.w(Prefs.TAG, "videokitLibPath not exits: " + videokitLibPath);
                  videokitLibPath = "/data/app/com.examples.ffmpeg4android_demo-2/lib/arm64/libvideokit.so";
                  Log.i(Prefs.TAG, "trying videokitLibPath: " + videokitLibPath);
                  if(file.exists())  {
                     Log.i(Prefs.TAG, "videokitLibPath exits: " + videokitLibPath);
                  }
                  else {
                     Log.e(Prefs.TAG, "can't find path of lib");
                  }
               }
            }
         }





         //String videokitLibPath = ctx.getFilesDir().getParent()  + "/lib/arm64/libvideokit.so";

         // only this works on Android M, and the number changes (demo-2, demo-1)
         //String videokitLibPath = "/data/app/com.examples.ffmpeg4android_demo-1/lib/arm64/libvideokit.so";


         //Log.i(Prefs.TAG, "videokitLibPath: " + videokitLibPath);
         return videokitLibPath;

      }



      public void fExit( Context ctx) {
         fexit(getVideokitLibPath(ctx));
      }

      public native String fexit(String videokitLibPath);
      public native String unload();
      public native String load(String[] args, String videokitSdcardPath, String videokitLibPath, boolean isComplex);
    }

    ============================

    package com.netcompss.ffmpeg4android;

    public class Prefs {
      public static final String TAG = "ffmpeg4android";
      public static final String version = "322.00.00_LM322";
    }
    ==================================
    package com.netcompss.ffmpeg4android;

    import java.text.ParseException;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.Locale;

    import android.util.Log;

    public class ProgressCalculator {

      private int _durationOfCurrentWaitIndex = 0;
      private final int DURATION_OF_CURRENT_WAIT_INDEX_LIMIT = 12;
      private String _durationOfCurrent;
      private long _lastVklogSize = -1;
      private int _vkLogNoChangeCounter = 0;
      private SimpleDateFormat _simpleDateFormat;
      long _timeRef = -1;
      int  _prevProgress = 0;
      private String vkLogPath = null;

      public ProgressCalculator(String vkLogPathIn) {
         vkLogPath = vkLogPathIn;
         _simpleDateFormat = new SimpleDateFormat("HH:mm:ss.SS");
         try {
            Date ref = _simpleDateFormat.parse("00:00:00.00");
            ref.setYear(112);
            _timeRef = ref.getTime();
         } catch (ParseException e) {
            Log.w(Prefs.TAG, "failed to set _timeRef");
         }
      }

      public void initCalcParamsForNextInter() {
         Log.i(Prefs.TAG, "initCalcParamsForNextInter");
         _lastVklogSize = -1;
         _vkLogNoChangeCounter = 0;
         _durationOfCurrent = null;

      }

      public int calcProgress() {
         return calcProgress(1);
      }


      public int calcProgress(int durationMultiplyer) {
         //Log.i(Prefs.TAG, "========calc progress======= " + durationMultiplyer);
         int progress  = 0;
         if (_durationOfCurrent == null) {
            String dur = GeneralUtils.getDutationFromVCLogRandomAccess(vkLogPath);
            Log.d(Prefs.TAG, "dur: " + dur);
            if (dur == null || dur.equals("") || dur.equals("null") ) {
               Log.i(Prefs.TAG, "dur is not good, not setting ");
               if (_durationOfCurrentWaitIndex < DURATION_OF_CURRENT_WAIT_INDEX_LIMIT) {
                  Log.i(Prefs.TAG, "waiting for real duration, going out of calcProgress with 0");
                  _durationOfCurrentWaitIndex ++;
                  return 0;
               }
               else {
                  Log.i(Prefs.TAG, "_durationOfCurrentWaitIndex is equal to: " + DURATION_OF_CURRENT_WAIT_INDEX_LIMIT + " reseting.");
                  _durationOfCurrentWaitIndex = 0;
                  Log.i(Prefs.TAG, "setting fake Prefs.durationOfCurrent");

                  _durationOfCurrent = "00:03:00.00";
                  Log.w(Prefs.TAG, "setting fake Prefs.durationOfCurrent (Cant get from file): " + _durationOfCurrent);

               }
            }
            else {
               _durationOfCurrent = GeneralUtils.getDutationFromVCLogRandomAccess(vkLogPath);
               Log.i(Prefs.TAG, "duration: " + _durationOfCurrent + " \nTranscoding...");
            }
         }


         if (_durationOfCurrent != null) {

            long currentVkLogSize = -1;
            currentVkLogSize = GeneralUtils.getVKLogSizeRandomAccess(vkLogPath);
            //Log.d(Prefs.TAG, "currentVkLogSize: " + currentVkLogSize + " _lastVklogSize: " + _lastVklogSize);

            if (currentVkLogSize > _lastVklogSize) {
               _lastVklogSize = currentVkLogSize;
               _vkLogNoChangeCounter = 0;
            }
            else {
               //Log.w(Prefs.TAG, "Looks like Vk log is not increasing in size");
               _vkLogNoChangeCounter++;
            }


            String currentTimeStr = GeneralUtils.readLastTimeFromVKLogUsingRandomAccess(vkLogPath);
            //Log.d(Prefs.TAG, "currentTimeStr: " + currentTimeStr);
            if (currentTimeStr.equals("exit")) {
               Log.d(Prefs.TAG, "============Found one of the exit tokens in the log============");
               return 100;
            }
            else if (currentTimeStr.equals("error") &amp;&amp; _prevProgress == 0) {
               Log.d(Prefs.TAG, "============Found error in the log============");
               return 100;
            }
            else if (_vkLogNoChangeCounter > 16) {
               Log.e(Prefs.TAG, "VK log is not changing in size, and no exit token found");
               return 100;
            }
            try {
               Date durationDate = _simpleDateFormat.parse(_durationOfCurrent);
               Date currentTimeDate = _simpleDateFormat.parse(currentTimeStr);
               currentTimeDate.setYear(112);
               durationDate.setYear(112);
               //Log.d(Prefs.TAG, " durationDate: " + durationDate + " currentTimeDate: " + currentTimeDate);

               long durationLong = durationDate.getTime() - _timeRef;
               if (durationMultiplyer != 1) {
                  //Log.i(Prefs.TAG, "====durationMultiplyer is not 1, handling===");
                  //Log.i(Prefs.TAG, "durationLong before: " + durationLong);
                  durationLong = durationLong * durationMultiplyer;
                  //Log.i(Prefs.TAG, "durationLong after: " + durationLong);
               }
               long currentTimeLong = currentTimeDate.getTime() - _timeRef;
               //Log.d(Prefs.TAG, " durationLong: " + durationLong + " currentTimeLong: " + currentTimeLong + " diff: " + (durationLong - currentTimeLong));
               progress  = Math.round(((float)currentTimeLong / durationLong) * 100);
               if (progress >= 100) {
                  Log.w(Prefs.TAG, "progress is 100, but can't find exit in the log, probably fake progress, still running...");
                  progress = 99;
               }
               _prevProgress = progress;


            } catch (ParseException e) {
               Log.w(Prefs.TAG, e.getMessage());
            }
         }

         return progress;
      }


    }

    ==================================

    Then I clicked build, copied ffmpeg4android_lib.aar into the Unity project under assets/Plugins/Android/libs/,
    and made this wrapper:

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.UI;
    using System.IO;


    public class Test : MonoBehaviour {

       private AndroidJavaObject FFMpeg = null;
       private AndroidJavaObject activityContext = null;

       public string Path1;
       public string Path2;
       public string Out3;

       public string path;

       public Text File1;
       public Text File2;
       public Text Context;
       public Text End;

       public void Convert(){
           File1.text = File.Exists (path+Path1).ToString();
           File2.text = File.Exists (path+Path2).ToString();
           if (FFMpeg == null) {
               using(AndroidJavaClass activityclass = new AndroidJavaClass("com.unity3d.player.UnityPlayer")){
               activityContext = activityclass.GetStatic<AndroidJavaObject>("currentActivity");
               }
               Context.text = "Context =" + activityContext;
           }
           using (AndroidJavaClass pluginClass = new AndroidJavaClass ("com.netcompss.ffmpeg4android.FFMpeg")) {
               if (pluginClass != null) {
                   FFMpeg = pluginClass.CallStatic<AndroidJavaObject>("instance");
                   FFMpeg.Call ("setContext", activityContext);
                   //activityContext.Call ("runOnUiThread", new AndroidJavaRunnable (() => {
                   FFMpeg.Call ("mux", path+Path1,path+Path2,path+Out3);
                   //}));
               }
           }
           End.text = "Done";

       }
    }

    =============================================

    And the problem is: when I launch the APK built from the original project in Android Studio, everything works fine.
    When I launch the APK built from Unity, it works fine until this part:
    load(args, workFolder, getVideokitLibPath(ctx), true);

    It does load all the libs OK, but it throws this error in logcat when that code runs:

    03-23 10:43:17.293 28263-28277/? W/dalvikvm: No implementation found for native Lcom/netcompss/ffmpeg4android/LoadJNI;.load:([Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Z)Ljava/lang/String;
    03-23 10:43:17.294 28263-28277/? E/test: vk run exception.
                                            java.lang.UnsatisfiedLinkError: Native method not found: com.netcompss.ffmpeg4android.LoadJNI.load:([Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Z)Ljava/lang/String;
                                                at com.netcompss.ffmpeg4android.LoadJNI.load(Native Method)
                                                at com.netcompss.ffmpeg4android.LoadJNI.run(LoadJNI.java:37)
                                                at com.netcompss.ffmpeg4android.LoadJNI.run(LoadJNI.java:57)
                                                at com.netcompss.ffmpeg4android.FFMpeg.mux(FFMpeg.java:36)
                                                at com.unity3d.player.UnityPlayer.nativeRender(Native Method)
                                                at com.unity3d.player.UnityPlayer.a(Unknown Source)
                                                at com.unity3d.player.UnityPlayer$b$1.handleMessage(Unknown Source)
                                                at android.os.Handler.dispatchMessage(Handler.java:103)
                                                at android.os.Looper.loop(Looper.java:194)
                                                at com.unity3d.player.UnityPlayer$b.run(Unknown Source)
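
    For context, ART/Dalvik resolves LoadJNI.load against a native symbol whose name is derived from the package, class and method names. A sketch of what the exporting library (libvideokit.so or one of the other loaded .so files) has to provide, assuming the standard JNI naming scheme; if that symbol is missing, or the .so is not packaged for the device's ABI inside the Unity APK, the call fails with exactly this UnsatisfiedLinkError:

    // Sketch only: the JNI entry point the runtime looks for when
    // com.netcompss.ffmpeg4android.LoadJNI.load(String[], String, String, boolean) is called.
    #include <jni.h>

    extern "C" JNIEXPORT jstring JNICALL
    Java_com_netcompss_ffmpeg4android_LoadJNI_load(JNIEnv *env, jobject thiz,
                                                   jobjectArray args,
                                                   jstring videokitSdcardPath,
                                                   jstring videokitLibPath,
                                                   jboolean isComplex)
    {
        // The real implementation lives inside the ffmpeg4android native library;
        // this stub only illustrates the required name and signature.
        return env->NewStringUTF("ok");
    }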

    Projects Dropbox link:
    https://www.dropbox.com/s/6vglcw7xk2n8lwu/AndroidStudioProjects.rar?dl=0