
Media (91)

Other articles (35)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash fallback is used.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

On other sites (4030)

  • FFMPEG incorrect duration from raw video stream

    28 July 2020, by LMB

    I have the raw video data from a video stream (no audio). When I play the raw video in VLC, the duration is 67 seconds, but when I convert the file to MP4 using FFmpeg, the duration is 82.68 seconds.

    Video is available here: https://www.filedropper.com/showdownload.php/67seconds
    I am using FFmpeg/ffprobe version git-2020-07-21-b5f1e05 on Windows.

    Details of the raw video are here:

    ffprobe -i 67seconds.raw
ffprobe version git-2020-07-21-b5f1e05 Copyright (c) 2007-2020 the FFmpeg developers
  built with gcc 9.3.1 (GCC) 20200621
  configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libsrt --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --enable-libgsm --disable-w32threads --enable-libmfx --enable-ffnvcodec --enable-cuda-llvm --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
  libavutil      56. 55.100 / 56. 55.100
  libavcodec     58. 97.100 / 58. 97.100
  libavformat    58. 49.100 / 58. 49.100
  libavdevice    58. 11.101 / 58. 11.101
  libavfilter     7. 87.100 /  7. 87.100
  libswscale      5.  8.100 /  5.  8.100
  libswresample   3.  8.100 /  3.  8.100
  libpostproc    55.  8.100 / 55.  8.100
Input #0, h264, from '67seconds.raw':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p(progressive), 640x360 [SAR 1:1 DAR 16:9], 25 fps, 25 tbr, 1200k tbn, 50 tbc

    The MP4 duration matches my ffprobe calculation against the raw file itself (2067/25 = 82.68 seconds):

    ffprobe -show_entries stream=r_frame_rate,nb_read_frames -select_streams v -count_frames -of compact=p=0:nk=1 -v 0 67seconds.raw
25/1|2067
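
    For reference, the container duration that ends up in the converted file can be read the same way. A minimal sketch, assuming the converted file is named 67seconds.mp4 (the post does not give its actual name):

    # Hypothetical check: print the duration stored in the MP4 container.
    ffprobe -show_entries format=duration -of compact=p=0:nk=1 -v 0 67seconds.mp4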

    What would explain this difference in durations between the raw playback and the FFmpeg interpretation, and how do I encode to the same duration for this and other stream data?
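
    A minimal sketch of one way to influence the muxed duration, assuming the raw stream really holds 2067 frames meant to span roughly 67 seconds (about 30.85 fps): a raw H.264 elementary stream carries no container timestamps, so FFmpeg typically falls back to assuming 25 fps; forcing the input frame rate changes the timestamps that end up in the MP4. The rate below is only an estimate derived from the numbers above, not a confirmed property of the stream.

    # Hypothetical: tell the h264 demuxer the frame rate the stream was actually captured at.
    # 2067 frames / 67 s is roughly 30.85 fps; replace with the true capture rate if known.
    ffmpeg -framerate 2067/67 -i 67seconds.raw -c:v copy 67seconds.mp4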

  • Sound is lost when concatenating several videos with ffmpeg

    18 August 2020, by Yevhenii Molodtsov

    I have a Python script that generates an FFmpeg command and then runs it in a shell.
    Everything is going okay (the videos are concatenated, and the cross-dissolve effect I need works well), but there are two problems I am trying to fix:

    1. The output loses the sound.
    2. The output is freezing a little bit (it seems like there are about 10 fps).

    An example of a command generated by the script is here:

    ffmpeg
        -i /tmp/media/import-2020-Aug-18-Tuesday-08-31-10/temp_8ce3c8ac-1379-4a16-b141-ebdc910e1142.mp4
        -i /tmp/media/import-2020-Aug-18-Tuesday-08-31-10/temp_d9f60ffb-0a83-4052-8ebb-620a574d1871.mp4
        -i /tmp/media/import-2020-Aug-18-Tuesday-08-31-10/temp_93934eaa-9589-4736-8d11-fd8c1bdf0b92.mp4
        -filter_complex
        "[0:v]trim=start=0:end=4.038,setpts=PTS-STARTPTS[clip0];
        [1:v]trim=start=2:end=4.038,setpts=PTS-STARTPTS[clip1];
        [2:v]trim=start=2,setpts=PTS-STARTPTS[clip2];
        [0:v]trim=start=5.038:end=6.038,setpts=PTS-STARTPTS[out0];
        [1:v]trim=start=5.038:end=6.038,setpts=PTS-STARTPTS[out1];
        [1:v]trim=start=0:end=2,setpts=PTS-STARTPTS[in1];
        [2:v]trim=start=0:end=2,setpts=PTS-STARTPTS[in2];
        [in1]format=pix_fmts=yuva420p,fade=t=in:st=0:d=2:alpha=1[fadein1];
        [in2]format=pix_fmts=yuva420p,fade=t=in:st=0:d=2:alpha=1[fadein2];
        [out0]format=pix_fmts=yuva420p,fade=t=out:st=0:d=2:alpha=1[fadeout0];
        [out1]format=pix_fmts=yuva420p,fade=t=out:st=0:d=2:alpha=1[fadeout1];
        [fadein1]fifo[fadein1fifo];
        [fadein2]fifo[fadein2fifo];
        [fadeout0]fifo[fadeout0fifo];
        [fadeout1]fifo[fadeout1fifo];
        [fadeout0fifo][fadein1fifo]overlay[crossfade0];
        [fadeout1fifo][fadein2fifo]overlay[crossfade1];
        [clip0][crossfade0][clip1][crossfade1][clip2]concat=n=5[output]"
        -map "[output]" /tmp/media/final/some_filename_c8e898b2-e3f7-4656-b3b9-1e8fb93cd75c.mp4

    I would much appreciate any help, because I am new to this tool and this is where I got stuck.
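
    One detail worth noting in the command above: only the video label [output] is mapped, so no audio stream is written to the output file at all, which by itself would explain the missing sound. A minimal sketch of the general pattern for keeping audio alongside a complex video filtergraph, using hypothetical file names and assuming each input has one audio stream:

    # Hypothetical example: concatenate video and audio in one concat filter
    # and map both resulting labels explicitly.
    ffmpeg -i first.mp4 -i second.mp4 -filter_complex "[0:v][0:a][1:v][1:a]concat=n=2:v=1:a=1[vout][aout]" -map "[vout]" -map "[aout]" joined.mp4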

  • Getting "Buffer queue overflow, dropping" while concatenation in ffmpeg

    19 August 2020, by Yevhenii Molodtsov

    I have a Python script that generates the FFmpeg command to concatenate the video clips.
    Here is the command it generates:

            ffmpeg 
        -i /tmp/media/import-2020-Aug-19-Wednesday-08-08-26/temp_eba59811-bb5c-4e36-aa7d-b1731e70432c.mp4 
        -i /tmp/media/import-2020-Aug-19-Wednesday-08-08-26/temp_2a6d7907-7cad-4f38-ae82-4b475546d3e0.mp4 
        -i /tmp/media/import-2020-Aug-19-Wednesday-08-08-26/temp_d7371ecb-01df-4199-951f-336fa475998f.mp4 
        -i /tmp/media/import-2020-Aug-19-Wednesday-08-08-26/temp_cc5de0f1-84b4-47bc-b390-45b5f8711914.mp4 
        -i /tmp/media/import-2020-Aug-19-Wednesday-08-08-26/temp_bf796e0d-135f-40fd-ab8e-4780da6973fd.mp4  
        -filter_complex 
        "[0:v]trim=start=0:end=8.032,setpts=PTS-STARTPTS[clip0];
        [1:v]trim=start=2:end=13.047,setpts=PTS-STARTPTS[clip1];
        [2:v]trim=start=2:end=13.558,setpts=PTS-STARTPTS[clip2];
        [3:v]trim=start=2:end=13.186,setpts=PTS-STARTPTS[clip3];
        [4:v]trim=start=2,setpts=PTS-STARTPTS[clip4];
        [0:v]trim=start=9.032:end=10.032,setpts=PTS-STARTPTS[out0];
        [1:v]trim=start=14.047:end=15.047,setpts=PTS-STARTPTS[out1];
        [2:v]trim=start=14.558:end=15.558,setpts=PTS-STARTPTS[out2];
        [3:v]trim=start=14.186:end=15.186,setpts=PTS-STARTPTS[out3];
        [1:v]trim=start=0:end=2,setpts=PTS-STARTPTS[in1];
        [2:v]trim=start=0:end=2,setpts=PTS-STARTPTS[in2];
        [3:v]trim=start=0:end=2,setpts=PTS-STARTPTS[in3];
        [4:v]trim=start=0:end=2,setpts=PTS-STARTPTS[in4];
        [in1]format=pix_fmts=yuva420p,fade=t=in:st=0:d=2:alpha=1[fadein1];
        [in2]format=pix_fmts=yuva420p,fade=t=in:st=0:d=2:alpha=1[fadein2];
        [in3]format=pix_fmts=yuva420p,fade=t=in:st=0:d=2:alpha=1[fadein3];
        [in4]format=pix_fmts=yuva420p,fade=t=in:st=0:d=2:alpha=1[fadein4];
        [out0]format=pix_fmts=yuva420p,fade=t=out:st=0:d=2:alpha=1[fadeout0];
        [out1]format=pix_fmts=yuva420p,fade=t=out:st=0:d=2:alpha=1[fadeout1];
        [out2]format=pix_fmts=yuva420p,fade=t=out:st=0:d=2:alpha=1[fadeout2];
        [out3]format=pix_fmts=yuva420p,fade=t=out:st=0:d=2:alpha=1[fadeout3];
        [fadein1]fifo[fadein1fifo];
        [fadein2]fifo[fadein2fifo];
        [fadein3]fifo[fadein3fifo];
        [fadein4]fifo[fadein4fifo];
        [fadeout0]fifo[fadeout0fifo];
        [fadeout1]fifo[fadeout1fifo];
        [fadeout2]fifo[fadeout2fifo];
        [fadeout3]fifo[fadeout3fifo];
        [fadeout0fifo][fadein1fifo]overlay[crossfade0];
        [fadeout1fifo][fadein2fifo]overlay[crossfade1];
        [fadeout2fifo][fadein3fifo]overlay[crossfade2];
        [fadeout3fifo][fadein4fifo]overlay[crossfade3];
        [clip0][crossfade0][clip1][crossfade1][clip2][crossfade2][clip3][crossfade3][clip4]concat=n=9[output];
         [0:a][1:a]acrossfade=d=10:c1=tri:c2=tri[A1];
         [A1][2:a]acrossfade=d=10:c1=tri:c2=tri[A2];
         [A2][3:a]acrossfade=d=10:c1=tri:c2=tri[A3];
         [A3][4:a]acrossfade=d=10:c1=tri:c2=tri[audio] " -map "[output]" -map "[audio]" /tmp/media/final/some_filename_587463a2-e3ca-436a-874b-a72e295e46ac.mp4

    But I get the following error: Buffer queue overflow, dropping.
    I read that to solve this I need to use fifo, but I can't figure out where I should place it. In addition, it seems like fifo works only with video streams.
    I would much appreciate any help!
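
    On the last point: ffmpeg does provide an audio counterpart to fifo, the afifo filter, so the same buffering can be applied to the acrossfade inputs. A minimal sketch of how that fragment of the filtergraph might look, with hypothetical labels:

    # Hypothetical fragment: buffer each audio input with afifo before crossfading,
    # mirroring what fifo does for the video branches.
    [0:a]afifo[a0fifo];[1:a]afifo[a1fifo];
    [a0fifo][a1fifo]acrossfade=d=10:c1=tri:c2=tri[A1]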