
Other articles (66)

  • Taking part in its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    To do this, we use SPIP’s translation interface, where all of MediaSPIP’s language modules are available. You only need to subscribe to the translators’ mailing list to ask for more information.
    MediaSPIP is currently only available in French and (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (13931)

  • Video frame difference with FFMPEG

    27 December 2015, by StepTNT

    I need to compute the frame differences between a source video and a compressed one.
    For now I’m using OpenCV with Java, extracting each frame and doing a simple difference, but it’s quite slow (running at about 0.5 fps, meaning a 500-frame video takes more than 15 minutes), so I was thinking of moving to FFmpeg.
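    For reference, the slow per-frame approach described above might look roughly like the sketch below with the OpenCV Java bindings (package names are from OpenCV 3.x; the file names and the FrameDiff class are placeholders, not the poster’s actual code):

    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.core.Scalar;
    import org.opencv.videoio.VideoCapture;

    public final class FrameDiff {
        public static void main(String[] args) {
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
            VideoCapture src = new VideoCapture("src.avi");          // placeholder paths
            VideoCapture cmp = new VideoCapture("compressed.avi");
            Mat a = new Mat(), b = new Mat(), diff = new Mat();
            int frame = 0;
            // read both videos frame by frame and compute the absolute difference
            while (src.read(a) && cmp.read(b)) {
                Core.absdiff(a, b, diff);                 // per-pixel |a - b|
                Scalar total = Core.sumElems(diff);       // summed difference for this frame
                System.out.println("frame " + frame++ + ": " + total);
            }
            src.release();
            cmp.release();
        }
    }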

    FFmpeg feels a lot faster (everything is done in under a minute), but it has one big issue that makes the results useless: when compressing the source file (also done with FFmpeg), an extra grey frame is added at the beginning, which skews the results because mismatched frames end up being compared.

    This is what I’m doing now (knowing that the extra frame throws everything off):

    ffmpeg -y -i src.avi -i compressed.avi -filter_complex "blend=all_mode=difference,hue=s=0" -c:v libx264 -crf 18 -c:a copy difference.avi

    To fix the frame issue I was trying to remove the first frame by re-encoding the compressed video with this command

    ffmpeg -y -ss 0.02 -i compressed.mpg -an -f mpeg2video compressed-cut.mpg

    (Note that -ss is 0.02 because it’s a 50 fps video, so I did 1/FPS as suggested here)

    But I get this response

    Output file is empty, nothing was encoded (check -ss / -t / -frames parameters if used)

    So, finally, the question is: since extracting all the frames and then computing differences with OpenCV is really slow, how can I use FFmpeg to produce a video containing the difference between two sources, keeping in mind that one of them has an extra frame at the beginning?
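    One way to avoid the intermediate re-encode, sketched below and not taken from the original post, is to drop the leading frame of the compressed input inside the same filtergraph (select=gte(n\,1), the filter the EDIT below also uses) and then blend it against the source; it is wrapped in a small Java launcher here since the rest of the pipeline is Java, and the file names are placeholders:

    import java.io.IOException;

    public final class DiffWithoutExtraFrame {
        public static void main(String[] args) throws IOException, InterruptedException {
            // [1:v] drops frame 0 of the compressed input and resets its timestamps,
            // then the blend/hue chain from the question is applied to both streams.
            String filter = "[1:v]select=gte(n\\,1),setpts=PTS-STARTPTS[cut];"
                          + "[0:v][cut]blend=all_mode=difference,hue=s=0[out]";
            Process p = new ProcessBuilder(
                    "ffmpeg", "-y",
                    "-i", "src.avi",
                    "-i", "compressed.avi",
                    "-filter_complex", filter,
                    "-map", "[out]",
                    "-c:v", "libx264", "-crf", "18",
                    "difference.avi")
                    .inheritIO()   // forward ffmpeg's console output
                    .start();
            System.exit(p.waitFor());
        }
    }

    Whether the blended frames then line up one-to-one still depends on both inputs running at the same frame rate (both report 50 fps in the logs below).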

    EDIT: I wanted to avoid posting endless console output, but since you asked for it, here we go.

    1) Encoding

    Input

    ffmpeg -i "720p50_mobcal_ter.avi" -an -f mpeg2video -y "720p50_mobcal_ter.mpg"

    Output

    ffmpeg version N-76684-g1fe82ab Copyright (c) 2000-2015 the FFmpeg developers
     built with gcc 5.2.0 (GCC)
     configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
     libavutil      55.  6.100 / 55.  6.100
     libavcodec     57. 15.100 / 57. 15.100
     libavformat    57. 14.100 / 57. 14.100
     libavdevice    57.  0.100 / 57.  0.100
     libavfilter     6. 15.100 /  6. 15.100
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, avi, from '720p50_mobcal_ter.avi':
     Metadata:
       encoder         : Lavf57.14.100
     Duration: 00:00:10.08, start: 0.000000, bitrate: 552974 kb/s
       Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 1280x720, 554059 kb/s, SAR 1:1 DAR 16:9, 50 fps, 50 tbr, 50 tbn, 50 tbc
    Output #0, mpeg2video, to '720p50_mobcal_ter.mpg':
     Metadata:
       encoder         : Lavf57.14.100
       Stream #0:0: Video: mpeg2video, yuv420p, 1280x720 [SAR 1:1 DAR 16:9], q=2-31, 200 kb/s, 50 fps, 50 tbn, 50 tbc
       Metadata:
         encoder         : Lavc57.15.100 mpeg2video
    Stream mapping:
     Stream #0:0 -> #0:0 (rawvideo (native) -> mpeg2video (native))
    Press [q] to stop, [?] for help
    frame=   41 fps=0.0 q=31.0 size=     984kB time=00:00:00.78 bitrate=10330.5kbits/s
    frame=   80 fps= 78 q=31.0 size=    1323kB time=00:00:01.56 bitrate=6948.1kbits/s
    frame=  124 fps= 80 q=31.0 size=    1725kB time=00:00:02.44 bitrate=5790.0kbits/s
    frame=  168 fps= 81 q=31.0 size=    2084kB time=00:00:03.32 bitrate=5142.8kbits/s
    frame=  212 fps= 81 q=31.0 size=    2482kB time=00:00:04.20 bitrate=4841.4kbits/s
    frame=  255 fps= 82 q=31.0 size=    2840kB time=00:00:05.06 bitrate=4597.2kbits/s
    frame=  296 fps= 82 q=31.0 size=    3133kB time=00:00:05.88 bitrate=4364.5kbits/s
    frame=  338 fps= 82 q=24.8 size=    3453kB time=00:00:06.72 bitrate=4209.2kbits/s
    frame=  382 fps= 82 q=31.0 size=    3723kB time=00:00:07.60 bitrate=4013.4kbits/s
    frame=  426 fps= 83 q=31.0 size=    4005kB time=00:00:08.48 bitrate=3869.1kbits/s
    frame=  470 fps= 83 q=24.8 size=    4276kB time=00:00:09.36 bitrate=3742.5kbits/s
    frame=  504 fps= 83 q=31.0 Lsize=    4469kB time=00:00:10.06 bitrate=3639.3kbits/s
    video:4469kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.000000%

    This adds the extra grey frame at the beginning; it simply duplicates the first one.

    2) Removing first frame

    Input

    ffmpeg -y -i "720p50_mobcal_ter.mpg" -an -f mpeg2video -vf select=gte(n\,1) "CUT-720p50_mobcal_ter.mpg"

    Output

    ffmpeg version N-76684-g1fe82ab Copyright (c) 2000-2015 the FFmpeg developers
     built with gcc 5.2.0 (GCC)
     configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
     libavutil      55.  6.100 / 55.  6.100
     libavcodec     57. 15.100 / 57. 15.100
     libavformat    57. 14.100 / 57. 14.100
     libavdevice    57.  0.100 / 57.  0.100
     libavfilter     6. 15.100 /  6. 15.100
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, mpegvideo, from '720p50_mobcal_ter.mpg':
     Duration: N/A, bitrate: N/A
       Stream #0:0: Video: mpeg2video (Main), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 50 fps, 50 tbr, 1200k tbn, 100 tbc
    Output #0, mpeg2video, to 'CUT-720p50_mobcal_ter.mpg':
     Metadata:
       encoder         : Lavf57.14.100
       Stream #0:0: Video: mpeg2video, yuv420p, 1280x720 [SAR 1:1 DAR 16:9], q=2-31, 200 kb/s, 50 fps, 50 tbn, 50 tbc
       Metadata:
         encoder         : Lavc57.15.100 mpeg2video
    Stream mapping:
     Stream #0:0 -> #0:0 (mpeg2video (native) -> mpeg2video (native))
    Press [q] to stop, [?] for help
    frame=  255 fps=0.0 q=31.0 size=    2781kB time=00:00:05.10 bitrate=4467.3kbits/s
    frame=  503 fps=0.0 q=31.0 Lsize=    4415kB time=00:00:10.08 bitrate=3588.5kbits/s
    video:4415kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.000000%

    3) Frame difference

    Input

    ffmpeg -y -i "720p50_mobcal_ter.avi" -i "CUT-720p50_mobcal_ter.mpg" -filter_complex "blend=all_mode=difference,hue=s=0" -c:v libx264 -crf 18 -c:a copy "DIFF-720p50_mobcal_ter.mpg"

    Output

    ffmpeg version N-76684-g1fe82ab Copyright (c) 2000-2015 the FFmpeg developers
     built with gcc 5.2.0 (GCC)
     configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
     libavutil      55.  6.100 / 55.  6.100
     libavcodec     57. 15.100 / 57. 15.100
     libavformat    57. 14.100 / 57. 14.100
     libavdevice    57.  0.100 / 57.  0.100
     libavfilter     6. 15.100 /  6. 15.100
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, avi, from '720p50_mobcal_ter.avi':
     Metadata:
       encoder         : Lavf57.14.100
     Duration: 00:00:10.08, start: 0.000000, bitrate: 552974 kb/s
       Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 1280x720, 554059 kb/s, SAR 1:1 DAR 16:9, 50 fps, 50 tbr, 50 tbn, 50 tbc
    Input #1, mpegvideo, from 'CUT-720p50_mobcal_ter.mpg':
     Duration: N/A, bitrate: N/A
       Stream #1:0: Video: mpeg2video (Main), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 50 fps, 50 tbr, 1200k tbn, 100 tbc
    [libx264 @ 000002784dbeb980] using SAR=1/1
    [libx264 @ 000002784dbeb980] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 AVX2 LZCNT BMI2
    [libx264 @ 000002784dbeb980] profile High, level 3.2
    [mpeg @ 000002784dbeaf20] VBV buffer size not set, using default size of 130KB
    If you want the mpeg file to be compliant to some specification
    Like DVD, VCD or others, make sure you set the correct buffer size
    Output #0, mpeg, to 'D:\DOWNLOADS\TMP\Video TDI\AVI\DIFF-720p50_mobcal_ter.mpg':
     Metadata:
       encoder         : Lavf57.14.100
       Stream #0:0: Video: h264 (libx264), yuv420p, 1280x720 [SAR 1:1 DAR 16:9], q=-1--1, 50 fps, 90k tbn, 50 tbc (default)
       Metadata:
         encoder         : Lavc57.15.100 libx264
    Stream mapping:
     Stream #0:0 (rawvideo) -> blend:top
     Stream #1:0 (mpeg2video) -> blend:bottom
     hue -> Stream #0:0 (libx264)
    Press [q] to stop, [?] for help
    frame=  504 fps= 39 q=-1.0 Lsize=   32182kB time=00:00:10.04 bitrate=26258.5kbits/s
    video:32061kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.377054%
    [libx264 @ 000002784dbeb980] frame I:30    Avg QP:19.69  size:149974
    [libx264 @ 000002784dbeb980] frame P:299   Avg QP:23.28  size: 69423
    [libx264 @ 000002784dbeb980] frame B:175   Avg QP:24.48  size: 43280
    [libx264 @ 000002784dbeb980] consecutive B-frames: 30.6% 69.4%  0.0%  0.0%
    [libx264 @ 000002784dbeb980] mb I  I16..4: 18.3% 51.4% 30.4%
    [libx264 @ 000002784dbeb980] mb P  I16..4:  0.6%  5.6%  2.4%  P16..4: 35.9% 22.9% 15.6%  0.0%  0.0%    skip:17.0%
    [libx264 @ 000002784dbeb980] mb B  I16..4:  0.2%  0.5%  0.3%  B16..8: 49.5% 12.4%  5.6%  direct:15.5%  skip:16.1%  L0:47.8% L1:42.1% BI:10.1%
    [libx264 @ 000002784dbeb980] 8x8 transform intra:57.5% inter:38.5%
    [libx264 @ 000002784dbeb980] coded y,uvDC,uvAC intra: 90.7% 0.0% 0.0% inter: 50.3% 0.0% 0.0%
    [libx264 @ 000002784dbeb980] i16 v,h,dc,p: 32% 23% 35% 10%
    [libx264 @ 000002784dbeb980] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 11% 11% 41%  7%  5%  6%  5%  6%  8%
    [libx264 @ 000002784dbeb980] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 18% 14% 23%  8%  7%  7%  7%  7%  8%
    [libx264 @ 000002784dbeb980] i8c dc,h,v,p: 100%  0%  0%  0%
    [libx264 @ 000002784dbeb980] Weighted P-Frames: Y:33.8% UV:0.0%
    [libx264 @ 000002784dbeb980] ref P L0: 58.1% 16.3% 14.2%  9.4%  2.0%
    [libx264 @ 000002784dbeb980] ref B L0: 79.3% 20.7%
    [libx264 @ 000002784dbeb980] kb/s:26056.02

    The second command made everything work, while the one with the -ss option in the first part didn’t. I could be quite happy with that, but I’m not sure whether FFmpeg duplicates the first frame for every video or whether it’s just related to the one I’m using now, so it would be better to start off with a compressed video that has the same frame count as the original one.

    So let’s get to one final question: why does FFmpeg add a duplicated first frame at the beginning of the compressed video, and how can I avoid that?

  • Android: Pass video path to FFmpeg

    7 January 2016, by marian

    I have developed an app that plays videos from the gallery. I would like to add a watermark to the selected video using an FFmpeg command, but I do not know how to pass the video’s path to that command. I could not find proper tutorials or references on this. My code is as follows:

    MainActivity.java:

    import android.app.Activity;
    import android.app.ProgressDialog;
    import android.content.DialogInterface;
    import android.content.Intent;
    import android.net.Uri;
    import android.os.Bundle;
    import android.os.Handler;
    import android.os.Message;
    import android.os.PowerManager;
    import android.util.Log;
    import android.view.View;
    import android.widget.Button;
    import android.widget.Toast;
    import android.widget.VideoView;

    import com.netcompss.ffmpeg4android.CommandValidationException;
    import com.netcompss.ffmpeg4android.GeneralUtils;
    import com.netcompss.ffmpeg4android.Prefs;
    import com.netcompss.ffmpeg4android.ProgressCalculator;
    import com.netcompss.loader.LoadJNI;

    public class MainActivity extends Activity {
    public ProgressDialog progressBar;

    String workFolder = null;
    String demoVideoFolder = null;
    String demoVideoPath = null;
    String vkLogPath = null;
    LoadJNI vk;
    private final int STOP_TRANSCODING_MSG = -1;
    private final int FINISHED_TRANSCODING_MSG = 0;
    private boolean commandValidationFailedFlag = false;

    Button button;
    VideoView videoView;
    private static final int PICK_FROM_GALLERY = 1;


    private void runTranscodingUsingLoader() {
       Log.i(Prefs.TAG, "runTranscodingUsingLoader started...");

       PowerManager powerManager = (PowerManager)MainActivity.this.getSystemService(Activity.POWER_SERVICE);
       PowerManager.WakeLock wakeLock = powerManager.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "VK_LOCK");
       Log.d(Prefs.TAG, "Acquire wake lock");
       wakeLock.acquire();



       String[] complexCommand = {"ffmpeg","-y" ,"-i", "/sdcard/videokit/in.mp4","-strict","experimental",
               "-vf", "movie=/sdcard/videokit/watermark.png [watermark];" +
               " [in][watermark] overlay=main_w-overlay_w-10:10 [out]","-s",
               "320x240","-r", "30", "-b", "15496k", "-vcodec", "mpeg4","-ab",
               "48000", "-ac", "2", "-ar", "22050", "/sdcard/videokit/out1.mp4"};
       ///////////////////////////////////////////////////////////////////////


       vk = new LoadJNI();
       try {
           // running complex command with validation
           vk.run(complexCommand, workFolder, getApplicationContext());

           // running without command validation
           //vk.run(complexCommand, workFolder, getApplicationContext(), false);

           // running regular command with validation
           //vk.run(GeneralUtils.utilConvertToComplex(commandStr), workFolder, getApplicationContext());

           Log.i(Prefs.TAG, "vk.run finished.");
           // copying vk.log (internal native log) to the videokit folder
           GeneralUtils.copyFileToFolder(vkLogPath, demoVideoFolder);

       } catch (CommandValidationException e) {
           Log.e(Prefs.TAG, "vk run exeption.", e);
           commandValidationFailedFlag = true;

       } catch (Throwable e) {
           Log.e(Prefs.TAG, "vk run exeption.", e);
       }
       finally {
           if (wakeLock.isHeld()) {
               wakeLock.release();
               Log.i(Prefs.TAG, "Wake lock released");
           }
           else{
               Log.i(Prefs.TAG, "Wake lock is already released, doing nothing");
           }
       }

       // finished Toast
       String rc = null;
       if (commandValidationFailedFlag) {
           rc = "Command Vaidation Failed";
       }
       else {
           rc = GeneralUtils.getReturnCodeFromLog(vkLogPath);
       }
       final String status = rc;
       MainActivity.this.runOnUiThread(new Runnable() {
           public void run() {
               Toast.makeText(MainActivity.this, status, Toast.LENGTH_LONG).show();
               if (status.equals("Transcoding Status: Failed")) {
                   Toast.makeText(MainActivity.this, "Check: " + vkLogPath + " for more information.", Toast.LENGTH_LONG).show();
               }
           }
       });
    }


    @Override
    public void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_main);

       button = (Button) findViewById(R.id.button);

       videoView = (VideoView) findViewById(R.id.videoview);

       button.setOnClickListener(new View.OnClickListener() {

           public void onClick(View v) {
               // TODO Auto-generated method stub
               Intent intent = new Intent();

               intent.setType("video/*");
               intent.setAction(Intent.ACTION_GET_CONTENT);

               startActivityForResult(Intent.createChooser(intent, "Complete action using"), PICK_FROM_GALLERY);
           }
       });

    }

    @Override
    public void onActivityResult(int requestCode, int resultCode, Intent data) {
       if (resultCode != RESULT_OK) return;

       if (requestCode == PICK_FROM_GALLERY) {
           Uri mVideoURI = data.getData();
           videoView.setVideoURI(mVideoURI);
           videoView.start();
           demoVideoFolder = mVideoURI.getPath();
           demoVideoPath = demoVideoFolder;
           savevideo(mVideoURI);

       }


    }
    private Handler handler = new Handler() {
       @Override
       public void handleMessage(Message msg) {
           Log.i(Prefs.TAG, "Handler got message");
           if (progressBar != null) {
               progressBar.dismiss();

               // stopping the transcoding native
               if (msg.what == STOP_TRANSCODING_MSG) {
                   Log.i(Prefs.TAG, "Got cancel message, calling fexit");
                   vk.fExit(getApplicationContext());


               }
           }
       }
    };

    public void runTranscoding() {
       progressBar = new ProgressDialog(MainActivity.this);
       progressBar.setProgressStyle(ProgressDialog.STYLE_HORIZONTAL);
       progressBar.setTitle("FFmpeg4Android Direct JNI");
       progressBar.setMessage("Press the cancel button to end the operation");
       progressBar.setMax(100);
       progressBar.setProgress(0);

       progressBar.setCancelable(false);
       progressBar.setButton(DialogInterface.BUTTON_NEGATIVE, "Cancel", new DialogInterface.OnClickListener() {
           @Override
           public void onClick(DialogInterface dialog, int which) {
               handler.sendEmptyMessage(STOP_TRANSCODING_MSG);
           }
       });

       progressBar.show();

       new Thread() {
           public void run() {
               Log.d(Prefs.TAG,"Worker started");
               try {
                   //sleep(5000);
                   runTranscodingUsingLoader();
                   handler.sendEmptyMessage(FINISHED_TRANSCODING_MSG);

               } catch(Exception e) {
                   Log.e("threadmessage",e.getMessage());
               }
           }
       }.start();

       // Progress update thread
       new Thread() {
           ProgressCalculator pc = new ProgressCalculator(vkLogPath);
           public void run() {
               Log.d(Prefs.TAG,"Progress update started");
               int progress = -1;
               try {
                   while (true) {
                       sleep(300);
                       progress = pc.calcProgress();
                       if (progress != 0 && progress < 100) {
                           progressBar.setProgress(progress);
                       }
                       else if (progress == 100) {
                           Log.i(Prefs.TAG, "==== progress is 100, exiting Progress update thread");
                           pc.initCalcParamsForNextInter();
                           break;
                       }
                   }

               } catch(Exception e) {
                   Log.e("threadmessage",e.getMessage());
               }
           }
       }.start();
    }

    public void savevideo (Uri mVideoURI){
       demoVideoFolder = mVideoURI.getPath();
       demoVideoPath = demoVideoFolder;
       Log.i(Prefs.TAG, getString(R.string.app_name) + " version: " + GeneralUtils.getVersionName(getApplicationContext()));

       Button invoke = (Button) findViewById(R.id.button);
       invoke.setOnClickListener(new View.OnClickListener() {
           public void onClick(View v) {
               Log.i(Prefs.TAG, "run clicked.");
               runTranscoding();
           }
       });

       workFolder = getApplicationContext().getFilesDir() + "/";
       Log.i(Prefs.TAG, "workFolder (license and logs location) path: " + workFolder);
       vkLogPath = workFolder + "vk.log";
       Log.i(Prefs.TAG, "vk log (native log) path: " + vkLogPath);
       GeneralUtils.copyLicenseFromAssetsToSDIfNeeded(this, workFolder);
       GeneralUtils.copyDemoVideoFromAssetsToSDIfNeeded(this, demoVideoFolder);
       int rc = GeneralUtils.isLicenseValid(getApplicationContext(), workFolder);
       Log.i(Prefs.TAG, "License check RC: " + rc);

    }
    }

    ffmpeg command:

    String[] complexCommand = {"ffmpeg","-y" ,"-i",  "/sdcard/videokit/in.mp4","-strict","experimental",
               "-vf", "movie=/sdcard/videokit/watermark.png [watermark];" +
               " [in][watermark] overlay=main_w-overlay_w-10:10 [out]","-s",
               "320x240","-r", "30", "-b", "15496k", "-vcodec", "mpeg4","-ab",
               "48000", "-ac", "2", "-ar", "22050", "/sdcard/videokit/out1.mp4"};

    This command is from a sample project. How do I pass the video path to this command? I do not know how to edit the command to support my requirement. Can someone guide me through this? Any help will be really appreciated. Thank you.
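    As a rough sketch (not part of the original post), the hard-coded "/sdcard/videokit/in.mp4" in the sample command can simply be replaced by the path already captured in onActivityResult/savevideo. buildWatermarkCommand below is a hypothetical helper meant to live inside MainActivity, and it assumes demoVideoPath holds a real filesystem path (a content:// Uri returned by the gallery picker would first have to be resolved to a file path):

    // Hypothetical helper for MainActivity: builds the same FFmpeg4Android command
    // array as above, but with the input and output paths passed in as parameters.
    private String[] buildWatermarkCommand(String inputPath, String outputPath) {
        return new String[]{
                "ffmpeg", "-y", "-i", inputPath, "-strict", "experimental",
                "-vf", "movie=/sdcard/videokit/watermark.png [watermark];"
                        + " [in][watermark] overlay=main_w-overlay_w-10:10 [out]",
                "-s", "320x240", "-r", "30", "-b", "15496k", "-vcodec", "mpeg4",
                "-ab", "48000", "-ac", "2", "-ar", "22050", outputPath};
    }

    // inside runTranscodingUsingLoader(), instead of the hard-coded array:
    // String[] complexCommand = buildWatermarkCommand(demoVideoPath, "/sdcard/videokit/out1.mp4");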

  • Unable to merge videos in Android using JavaCV ("Sample Description" Error)

    15 December 2015, by San

    I am creating a video from images via FFmpeg, and I am able to get the video from the images. I am also using JavaCV to merge two videos, and I can join them without any issues provided both videos were taken with the camera, i.e. actually recorded with the mobile camera.

    Issue that I’m facing:

    I am not able to merge the video generated from the images via FFmpeg with the video the user has chosen, which will usually be a video that was not generated this way but taken with the mobile camera.

    CODE:
    Code to generate a video from images:

                     FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(path + "/" + "dec16.mp4", 800, 400);
                               try {
                                   recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
                                   //recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
                                   recorder.setVideoCodecName("H264");
                                   recorder.setVideoOption("preset", "ultrafast");
                                   recorder.setFormat("mp4");
                                   recorder.setFrameRate(frameRate);
                                   recorder.setVideoBitrate(60);
                                   recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
                                   startTime = System.currentTimeMillis();
                                   recorder.start();
                                    for (int j = 0; j < frameCount; j++) {
                                        // 'frameCount', 't' and 'image' are placeholders: the loop bound
                                        // and the per-frame image/timestamp setup are missing from the
                                        // snippet as posted.
                                        if (t > recorder.getTimestamp()) {
                                            recorder.setTimestamp(t);
                                            recorder.record(image);
                                        }
                                    }
                                   recorder.stop();
                               } catch (Exception e) {
                                   e.printStackTrace();
                               }

    Code to merge videos:

    int count = file_path.size();
               System.out.println("final_joined_list size " + file_path.size());
               if (file_path.size() != 1) {
                   try {
                       Movie[] inMovies = new Movie[count];
                       mediaStorageDir = new File(
                               Environment.getExternalStorageDirectory()
                                       + "/Pictures");

                       for (int i = 0; i < count; i++) {
                           File file = new File(file_path.get(i));
                           System.out.println("fileeeeeeeeeeeeeeee " + file);
                           System.out.println("file exists!!!!!!!!!!");

                           FileInputStream fis = new FileInputStream(file);
                           FileChannel fc = fis.getChannel();
                           inMovies[i] = MovieCreator.build(fc);
                           fis.close();
                           fc.close();

                       }
                        List<Track> videoTracks = new LinkedList<Track>();
                        List<Track> audioTracks = new LinkedList<Track>();
                       Log.d("Movies length", "isss  " + inMovies.length);
                       if (inMovies.length != 0) {

                           for (Movie m : inMovies) {

                               for (Track t : m.getTracks()) {
                                   if (t.getHandler().equals("soun")) {
                                       audioTracks.add(t);
                                   }
                                   if (t.getHandler().equals("vide")) {
                                       videoTracks.add(t);
                                   }
                                   if (t.getHandler().equals("")) {

                                   }
                               }

                           }
                       }

                       Movie result = new Movie();

                       System.out.println("audio and videoo tracks : "
                               + audioTracks.size() + " , " + videoTracks.size());
                       if (audioTracks.size() > 0) {
                           result.addTrack(new AppendTrack(audioTracks
                                   .toArray(new Track[audioTracks.size()])));
                       }
                       if (videoTracks.size() > 0) {
                           result.addTrack(new AppendTrack(videoTracks
                                   .toArray(new Track[videoTracks.size()])));
                       }
                       IsoFile out = null;
                       try {
                           out = (IsoFile) new DefaultMp4Builder().build(result);
                       } catch (Exception e) {
                           // TODO Auto-generated catch block
                           e.printStackTrace();
                       }

                       long timestamp = new Date().getTime();
                       String timestampS = "" + timestamp;

                       File storagePath = new File(mediaStorageDir
                               + File.separator);
                       storagePath.mkdirs();
                       File myMovie = new File(storagePath, String.format("%s.mp4", timestampS));
                       FileOutputStream fos = new FileOutputStream(myMovie);
                       FileChannel fco = fos.getChannel();
                       fco.position(0);
                       out.getBox(fco);
                       fco.close();
                       fos.close();

                   } catch (FileNotFoundException e) {
                       // TODO Auto-generated catch block
                       e.printStackTrace();
                   } catch (IOException e) {
                       // TODO Auto-generated catch block
                       e.printStackTrace();
                   }
                   String mFileName = Environment.getExternalStorageDirectory()
                           .getAbsolutePath();
                   // mFileName += "/output.mp4";

                   File sdCardRoot = Environment.getExternalStorageDirectory();
                   File yourDir = new File(mediaStorageDir + File.separator);
                   for (File f : yourDir.listFiles()) {
                       if (f.isFile())
                           name = f.getName();
                       // make something with the name
                   }
                   mFileName = mediaStorageDir.getPath() + File.separator
                           + "output-%s.mp4";
                   System.out.println("final filename : "
                           + mediaStorageDir.getPath() + File.separator
                           + "output-%s.mp4" + "names of files : " + name);
                   single_video = false;
                   return name;
               } else {
                   single_video = true;
                   name = file_path.get(0);
                   return name;
               }

    Error:

    The error that I am facing while trying to merge a video generated from images with a normal video is:

    12-15 12:26:06.155  26022-26111/? W/System.err﹕ java.io.IOException: Cannot append com.googlecode.mp4parser.authoring.Mp4TrackImpl@45417c38 to com.googlecode.mp4parser.authoring.Mp4TrackImpl@44ffac60 since their Sample Description Boxes differ
    12-15 12:26:06.155  26022-26111/? W/System.err﹕ at com.googlecode.mp4parser.authoring.tracks.AppendTrack.<init>(AppendTrack.java:48)

    Fix that I tried:

    Googling advised me to change the codec in JavaCV from avcodec.AV_CODEC_ID_MPEG4 to avcodec.AV_CODEC_ID_H264. But when I did that, I could no longer generate the video from images; it throws the following error:

    12-15 12:26:05.840  26022-26089/? W/linker﹕ libavcodec.so has text relocations. This is wasting memory and is a security risk. Please fix.
    12-15 12:26:05.975  26022-26089/? W/System.err﹕ com.googlecode.javacv.FrameRecorder$Exception: avcodec_open2() error -1: Could not open video codec.
    12-15 12:26:05.975  26022-26089/? W/System.err﹕ at com.googlecode.javacv.FFmpegFrameRecorder.startUnsafe(FFmpegFrameRecorder.java:492)
    12-15 12:26:05.975  26022-26089/? W/System.err﹕ at com.googlecode.javacv.FFmpegFrameRecorder.start(FFmpegFrameRecorder.java:267)

    What I need:

    Creating the video from images is unavoidable, and that video will definitely have to be merged with other videos which might use any codec. So I need to find a way to merge any kind of videos irrespective of their codecs or any other parameters. I am trying to keep it simple by just using the JARs and .so files, and I don’t want to drive myself crazy with a full-scale implementation of the FFmpeg library. That said, I am also ready to look into that library if there is no other way to achieve what I want, but a solid resource with almost-working code would be much appreciated. Cheers.
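    Since AppendTrack refuses tracks whose Sample Description Boxes differ, one codec-agnostic (if slower) route is to decode every clip and re-encode all frames with a single recorder, so every output sample shares the same parameters. The sketch below is an illustration, not a drop-in fix: it uses the same com.googlecode.javacv classes as the post (exact package names differ in newer JavaCV releases), handles video only, and ignores audio:

    import com.googlecode.javacv.FFmpegFrameGrabber;
    import com.googlecode.javacv.FFmpegFrameRecorder;
    import com.googlecode.javacv.cpp.avcodec;
    import com.googlecode.javacv.cpp.avutil;
    import com.googlecode.javacv.cpp.opencv_core.IplImage;
    import java.util.List;

    public final class ReencodeConcat {
        // Decodes each input in order and re-encodes the frames into one MP4,
        // so the merged file ends up with a single, consistent sample description.
        public static void concat(List<String> inputs, String output) throws Exception {
            FFmpegFrameGrabber probe = new FFmpegFrameGrabber(inputs.get(0));
            probe.start();
            FFmpegFrameRecorder recorder =
                    new FFmpegFrameRecorder(output, probe.getImageWidth(), probe.getImageHeight());
            recorder.setFormat("mp4");
            recorder.setFrameRate(probe.getFrameRate());
            recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);     // same codec as the image-based video above
            recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
            probe.stop();
            recorder.start();
            for (String path : inputs) {
                FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(path);
                grabber.start();
                IplImage image;
                while ((image = grabber.grab()) != null) {          // video frames only; audio is ignored here
                    recorder.record(image);
                }
                grabber.stop();
            }
            recorder.stop();
        }
    }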

    Update:
    I looked through the issues on OpenCV’s GitHub but didn’t find anything solid there.
    OpenCV Issues