Other articles (105)

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in OGV and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in OGG (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text documents are analyzed in order to extract the data needed for search-engine indexing, and are then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
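
    MediaSPIP performs these conversions automatically when a file is uploaded. Purely as an illustration of the kind of conversions involved (the file names and encoder settings below are assumptions, not MediaSPIP's actual invocation), the equivalent ffmpeg commands would look roughly like this:

    ffmpeg -i source.avi -c:v libtheora -c:a libvorbis video.ogv
    ffmpeg -i source.avi -c:v libvpx -c:a libvorbis video.webm
    ffmpeg -i source.avi -c:v libx264 -c:a aac video.mp4
    ffmpeg -i recording.wav -c:a libvorbis audio.ogg
    ffmpeg -i recording.wav -c:a libmp3lame audio.mp3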

  • Adding user-specific information and other author-related behaviour changes

    12 April 2011

    The simplest way to add information to authors is to install the Inscription3 plugin. It also makes it possible to modify certain user-related behaviours (refer to its documentation for more information).
    It is also possible to add fields to authors by installing the plugins "champs extras 2" and "Interface pour champs extras".

  • Managing the farm

    2 March 2010

    The farm as a whole is managed by "super admins".
    Certain settings can be made in order to regulate the needs of the different channels.
    To begin with, it uses the "Gestion de mutualisation" plugin.

On other websites (13003)

  • Remuxing MP4 to MKV fails when using -map 0 but works fine when remuxing MKV to MP4

    20 September 2020, by AppetiteForDestruction

    I have an MKV file that contains 5 streams...

    0: video
    1: audio
    2: audio
    3: text
    4: menu

    I was able to remux the MKV into an MP4 using this command...

    > ffmpeg.exe -i input.mkv -map 0 -c copy -c:s mov_text output.mp4

    The resulting output MP4 contains all the streams from the original MKV.

    But if I try to remux the output MP4 back into an MKV using the same command, but with a different codec for the subtitles (srt) and a different file extension (mkv), I get an error. However, if I omit "-map 0" from the command, it works, but only ONE of the audio streams gets copied to the output MKV file.

    ====================
    Steps to Reproduce
    ====================

    These two commands work; however, the output MKV only contains ONE of the TWO audio streams...

    > ffmpeg.exe -i input.mp4 -c copy -c:s srt output.mkv
    > ffmpeg.exe -i input.mp4 -c:v copy -c:a copy -c:s srt output.mkv
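
    Note: without "-map 0", ffmpeg's default stream selection keeps only one stream per type, which would explain the single audio track. A possible workaround, sketched here against the stream layout described above but otherwise untested, is to map the video, audio and subtitle streams explicitly:

    > ffmpeg.exe -i input.mp4 -map 0:v -map 0:a -map 0:s -c copy -c:s srt output.mkv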


    


    When I try to use either of the two commands below with "-map 0", just like I did when I remuxed the original MKV into an MP4, the commands fail and both result in the "Full Output" shown at the bottom...

    > ffmpeg.exe -i input.mp4 -map 0 -c copy -c:s srt output.mkv
    > ffmpeg.exe -i input.mp4 -map 0 -c:v copy -c:a copy -c:s srt output.mkv

    Output Error

    Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument

    So why would using "-map 0" cause the command to fail, when I used the exact same command to remux the original MKV into an MP4? And why does omitting "-map 0" allow it to work, even though it obviously doesn't copy both audio streams? It seems like a BUG to me, because practically the exact same command works for going from MKV to MP4 but not from MP4 to MKV.

    Full Output

    ffmpeg version git-2020-07-20-43a08d9 Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 9.3.1 (GCC) 20200621
  configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libsrt --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --enable-libgsm --disable-w32threads --enable-libmfx --enable-ffnvcodec --enable-cuda-llvm --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
  libavutil      56. 55.100 / 56. 55.100
  libavcodec     58. 96.100 / 58. 96.100
  libavformat    58. 48.100 / 58. 48.100
  libavdevice    58. 11.101 / 58. 11.101
  libavfilter     7. 87.100 /  7. 87.100
  libswscale      5.  8.100 /  5.  8.100
  libswresample   3.  8.100 /  3.  8.100
  libpostproc    55.  8.100 / 55.  8.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'F:\input.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2mp41
    encoder         : Lavf58.48.100
  Duration: 00:11:03.90, start: 0.000000, bitrate: 5873 kb/s
    Chapter #0:0: start 0.000000, end 663.872000
    Metadata:
      title           : Chapter 01
    Stream #0:0(eng): Video: mpeg2video (Main) (mp4v / 0x7634706D), yuv420p(tv, smpte170m, top first), 720x480 [SAR 8:9 DAR 4:3], 5546 kb/s, 29.97 fps, 29.97 tbr, 16k tbn, 59.94 tbc (default)
    Metadata:
      handler_name    : VideoHandler
    Side data:
      cpb: bitrate max/min/avg: 8500000/0/0 buffer size: 1835008 vbv_delay: N/A
    Stream #0:1(eng): Audio: ac3 (ac-3 / 0x332D6361), 48000 Hz, stereo, fltp, 160 kb/s (default)
    Metadata:
      handler_name    : SoundHandler
    Side data:
      audio service type: main
    Stream #0:2(eng): Audio: ac3 (ac-3 / 0x332D6361), 48000 Hz, stereo, fltp, 160 kb/s
    Metadata:
      handler_name    : SoundHandler
    Side data:
      audio service type: main
    Stream #0:3(eng): Subtitle: mov_text (tx3g / 0x67337874), 0 kb/s (default)
    Metadata:
      handler_name    : SubtitleHandler
    Stream #0:4(eng): Data: bin_data (text / 0x74786574)
    Metadata:
      handler_name    : SubtitleHandler
Multiple -c, -codec, -acodec, -vcodec, -scodec or -dcodec options specified for stream 3, only the last option '-c:s srt' will be used.
[matroska @ 0000002df6f90f40] Only audio, video, and subtitles are supported for Matroska.
Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #0:1 -> #0:1 (copy)
  Stream #0:2 -> #0:2 (copy)
  Stream #0:3 -> #0:3 (mov_text (native) -> subrip (srt))
  Stream #0:4 -> #0:4 (copy)
    Last message repeated 1 times
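
    The muxer message in the log ("Only audio, video, and subtitles are supported for Matroska") points at the data stream #0:4 (bin_data) that "-map 0" pulls in: the Matroska muxer cannot store it, so writing the header fails. One possible workaround, sketched here but not verified against this exact file, is to keep "-map 0" and exclude data streams with a negative mapping:

    > ffmpeg.exe -i input.mp4 -map 0 -map -0:d -c copy -c:s srt output.mkv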


    


  • FFmpeg continues to process after time specified at "-to"

    4 August 2020, by Yamahabest

    I have a video where I want to cut a part from the beginning and from the end, apply some fade-ins/fade-outs, and add some text.

    


    So, I came up with the following syntax:

    


    -ss 10 -to 40 
-i "D:\DATA\Software\VideoProcessor_Files\20171015 Zelhem Tandem Frans met Mirthe.MP4" 
-loop 1 -i "Input_Files\logo maurik large.png" 
-loop 1 -i "Input_Files\logo maurik small.png" 
-filter_complex "
    color=0x7F7F7F@0.95:1920x1080[grey_for_fade_out];
    [grey_for_fade_out]fade=t=out:st=12:d=2:alpha=1[grey_fade_out];
    [0:v][grey_fade_out]overlay[video_grey_fade_out];
    color=0x7F7F7F@0.95:1920x1080[grey_for_fade_in];
    [grey_for_fade_in]fade=t=in:st=37:d=2:alpha=1[grey_fade_in];
    [video_grey_fade_out][grey_fade_in]overlay[video_grey_fade_out_in];
    [1:v]fade=t=out:st=13:d=2:alpha=1[over];
    [over]scale=iw/1.5:-1[scaled];
    [video_grey_fade_out_in][scaled]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/10[video_grey_fade_out_in_logo];
    [1:v]fade=t=in:st=36:d=2:alpha=1[over2];
    [over2]scale=iw/1.5:-1[scaled2];
    [video_grey_fade_out_in_logo][scaled2]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2[video_grey_fade_out_in_logo2];
    [2:v]colorchannelmixer=aa=0.5,fade=t=in:st=14:d=2:alpha=1,fade=t=out:st=35:d=2:alpha=1[over3];
    [over3]scale=iw/10:-1[scaled3];
    [video_grey_fade_out_in_logo2][scaled3]overlay=10:10[video_complete];
    [video_complete]drawtext=fontfile=Input_Files/Sansation-Bold.ttf:text='Tandemvlucht met Mirthe':fontsize=96:fontcolor=white:alpha='if(lt(t,11),1,(2-(t-11))/2)':x=(w-text_w)/2:y=((h-text_h)/2)+125,drawtext=fontfile=Input_Files/Sansation-Bold.ttf:text='Zeddam':fontsize=96:fontcolor=white:alpha='if(lt(t,11),1,(2-(t-11))/2)':x=(w-text_w)/2:y=((h-text_h)/2)+250,drawtext=fontfile=Input_Files/Sansation-Bold.ttf:text='4 augustus 2020':fontsize=96:fontcolor=white:alpha='if(lt(t,11),1,(2-(t-11))/2)':x=(w-text_w)/2:y=((h-text_h)/2)+375,drawtext=fontfile=Input_Files/Sansation-Bold.ttf:text='Ook een keer meevliegen?':fontsize=96:fontcolor=white:alpha='if(lt(t,37),0,(t-37)/2)':x=(w-text_w)/2:y=((h-text_h)/6),drawtext=fontfile=Input_Files/Sansation-Bold.ttf:text='Of bel 085 - 049 55 69':fontsize=96:fontcolor=white:alpha='if(lt(t,37),0,(t-37)/2)':x=(w-text_w)/2:y=((h-text_h)/2)+350"
-preset medium 
-crf 18 
-c:a copy 
-y ".\Output_Files\Video\Zeddam\2020-08-04\Mirthe\27ed390a-8497-4550-b93f-4f87d9f2c9f0\MP_Tandemvlucht met_Mirthe_Zeddam_2020-08-04.mp4"


    


    I am quite sure this has worked in the past, but now FFmpeg just keeps on processing endlessly. If I then stop the FFmpeg process and look at the resulting file, I see that the last frame (at the end of the specified period) just keeps on duplicating.

    


    The drop counter in the console output of FFmpeg also starts increasing at the end of the specified period:

    


    frame=  987 fps= 15 q=-1.0 Lsize=   31357kB time=00:00:41.04 bitrate=6259.0kbits/s dup=0 drop=10 speed=0.644x


    


    I am kind of lost on why this doesn't work anymore. I might have upgraded the FFmpeg executable in the meantime. Maybe my syntax was/is not correct, but I believe it just worked.

    


    It has to be in the complex filter, because when I remove that, it works alright.
It is not in the drawtext part of the complex filter, because it still occurs when I remove that. And when I only do the drawtext, FFmpeg stops correctly at the specified time.

    


    I have tried this, but then it still occurs:

    


    -filter_complex "
    color=0x7F7F7F@0.95:1920x1080[grey_for_fade_out];
    [grey_for_fade_out]fade=t=out:st=12:d=2:alpha=1[grey_fade_out];
    [0:v][grey_fade_out]overlay[video_grey_fade_out];
    color=0x7F7F7F@0.95:1920x1080[grey_for_fade_in];
    [grey_for_fade_in]fade=t=in:st=37:d=2:alpha=1[grey_fade_in];
    [video_grey_fade_out][grey_fade_in]overlay"


    


    Also with this, it still occurs:

    


    -filter_complex "
    [1:v]fade=t=out:st=13:d=2:alpha=1[over];
    [over]scale=iw/1.5:-1[scaled];
    [0:v][scaled]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/10[video_grey_fade_out_in_logo];
    [1:v]fade=t=in:st=36:d=2:alpha=1[over2];
    [over2]scale=iw/1.5:-1[scaled2];
    [video_grey_fade_out_in_logo][scaled2]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2[video_grey_fade_out_in_logo2];
    [2:v]colorchannelmixer=aa=0.5,fade=t=in:st=14:d=2:alpha=1,fade=t=out:st=35:d=2:alpha=1[over3];
    [over3]scale=iw/10:-1[scaled3];
    [video_grey_fade_out_in_logo2][scaled3]overlay=10:10"


    


    I just don't understand. All my fade-ins/-outs seem to be within the specified range:

    


      

    • fade=t=out:st=12:d=2: start at 12 seconds, duration 2 seconds. This ends at 14 seconds, which is smaller than 40 seconds.
    • fade=t=in:st=37:d=2: start at 37 seconds, duration 2 seconds. This ends at 39 seconds, which is smaller than 40 seconds.
    • fade=t=out:st=13:d=2: start at 13 seconds, duration 2 seconds. This ends at 15 seconds, which is smaller than 40 seconds.
    • fade=t=in:st=36:d=2: start at 36 seconds, duration 2 seconds. This ends at 38 seconds, which is smaller than 40 seconds.
    


    It is just like some sequence is not ended properly, which is causing FFmpeg to just continue.
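
    A pattern that is often suggested for this kind of symptom, shown here only as a rough, untested sketch with hypothetical file names (main.mp4, logo.png), is to stop the auxiliary inputs from outliving the trimmed main input: give each looped image an explicit input duration with -t, and/or end the final overlay together with its main input via shortest=1:

    ffmpeg -ss 10 -to 40 -i main.mp4 -loop 1 -t 30 -i logo.png -filter_complex "[1:v]format=rgba,fade=t=in:st=2:d=2:alpha=1[lg];[0:v][lg]overlay=10:10:shortest=1[v]" -map "[v]" -map 0:a -c:v libx264 -crf 18 -preset medium -c:a copy -y out.mp4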

    


  • how to send the input data to FFMPEG from a C# program

    18 October 2020, by jstuardo

    I need to send a binary stream to FFMPEG so that it sends it on to an RTMP server.

    


    I did it in a nodejs script using the socket.io library, on Linux. It works perfectly.

    


    I need to do the same, but in a Windows Forms application using C#.

    


    This is how I run the ffmpeg.exe application:

    


        _currentProcess = new Process();
        _currentProcess.StartInfo.FileName = _ffmpegExe;
        _currentProcess.StartInfo.Arguments = BuildOptions(framesPerSecond, audioBitRate, audioEncoding, rtmpServer);
        _currentProcess.StartInfo.UseShellExecute = false;
        _currentProcess.StartInfo.CreateNoWindow = true;
        _currentProcess.StartInfo.RedirectStandardInput = true;
        _currentProcess.StartInfo.RedirectStandardError = true;
        _currentProcess.ErrorDataReceived += CurrentProcess_ErrorDataReceived;
        _currentProcess.Start();
        _currentProcess.BeginErrorReadLine();


    


    The BuildOptions method is defined this way:

    


    private string BuildOptions(int framesPerSecond, int audioBitRate, string audioEncoding, string rtmpServer)
    {
        string options;
        if (framesPerSecond == 1)
        {
            options = $"-i - -c:v libx264 -preset ultrafast -tune zerolatency -r 1 -g 2 -keyint_min 2 -x264opts keyint=2 -crf 25 -pix_fmt yuv420p -profile:v baseline -level 3 -c:a aac -b:a {audioEncoding} -ar {audioBitRate}-f flv {rtmpServer}";
        }
        else if (framesPerSecond == 15)
        {
            options = $"-i - -c:v libx264 -preset ultrafast -tune zerolatency max_muxing_queue_size 1000 -bufsize 5000 -r 15 -g 30 -keyint_min 30 -x264opts keyint=30 -crf 25 -pix_fmt yuv420p -profile:v baseline -level 3 -c:a aac -b:a {audioEncoding} -ar {audioBitRate} -f flv {rtmpServer}";
        }
        else
        {
            options = $"-i - -c:v libx264 -preset ultrafast -tune zerolatency -c:a aac -ar {audioBitRate} -b:a {audioEncoding} -bufsize 5000 -f flv {rtmpServer}";
        }

        return options;
    }


    


    I am sending the data to the standard input this way:

    


    public void EncodeAndSend(byte[] data)
    {
        if (_currentProcess != null)
        {
            var streamWriter = _currentProcess.StandardInput;
            streamWriter.Write(Encoding.GetEncoding("ISO-8859-1").GetChars(data));
        }
    }


    


    And finally, this method is for receiving the standard error output, which carries the result from ffmpeg.exe:

    


    private void CurrentProcess_ErrorDataReceived(object sender, DataReceivedEventArgs e)
    {
        Console.WriteLine(e.Data);
    }


    


    When I run the application, this is shown in the console:

    


    ffmpeg version 4.3.1-2020-10-01-essentials_build-www.gyan.dev Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 10.2.0 (Rev3, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-libass --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
  libavutil      56. 51.100 / 56. 51.100
  libavcodec     58. 91.100 / 58. 91.100
  libavformat    58. 45.100 / 58. 45.100
  libavdevice    58. 10.100 / 58. 10.100
  libavfilter     7. 85.100 /  7. 85.100
  libswscale      5.  7.100 /  5.  7.100
  libswresample   3.  7.100 /  3.  7.100
  libpostproc    55.  7.100 / 55.  7.100
pipe:: Invalid data found when processing input


    


    If I change the EncodeAndSend method to be:

    


    public void EncodeAndSend(byte[] data)
    {
        if (_currentProcess != null)
        {
            var streamWriter = _currentProcess.StandardInput;
            streamWriter.Write(data);
        }
    }


    


    The "pipe:: Invalid data found when processing input" error is not produced, but no further output is shown, so it seems it is not working.

    


    What is wrong with this? How can I send the data to the FFMPEG process?

    


    Finally, I should mention that the binary stream comes from the camera by means of MediaRecorder in a web page (the same one used for my nodejs server program, so that is not the issue here).
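
    One detail worth checking, based only on the code shown above: Process.StandardInput is a StreamWriter, so streamWriter.Write(data) with a byte[] resolves to the Write(object) overload and writes the string "System.Byte[]" rather than the bytes, while the ISO-8859-1 char round-trip re-encodes and can corrupt binary data. A minimal sketch (method and field names taken from the question, otherwise an assumption) that writes the raw bytes to the underlying stream instead:

    public void EncodeAndSend(byte[] data)
    {
        if (_currentProcess != null)
        {
            // Bypass the StreamWriter text layer and write the raw bytes
            // straight to ffmpeg's standard input.
            var stdin = _currentProcess.StandardInput.BaseStream;
            stdin.Write(data, 0, data.Length);
            stdin.Flush();
        }
    }

    Whether ffmpeg can then make sense of the piped stream still depends on the container MediaRecorder produces (typically WebM), which is a separate question from the write path itself.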