Advanced search

Media (91)

Other articles (38)

  • Libraries and binaries specific to video and audio processing

    31 January 2010, by

    The following software and libraries are used by SPIPmotion in one way or another.
    Required binaries: FFMpeg: the main encoder, able to transcode almost every type of video and audio file into formats playable on the web. See this tutorial for its installation; Oggz-tools: inspection tools for Ogg files; Mediainfo: retrieves information from most video and audio formats;
    Complementary, optional binaries: flvtool2: (...)
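
    As a hedged illustration of the kind of transcoding FFmpeg performs for web playback (this command is not taken from SPIPmotion itself; the input and output file names are placeholders):

    # Transcode an arbitrary source video to a web-playable WebM file
    # (VP8 video via libvpx, Vorbis audio via libvorbis).
    ffmpeg -i input.avi -c:v libvpx -b:v 1M -c:a libvorbis output.webm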

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used as a fallback.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and audio both to conventional computers (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The HTML5 player has been created specifically for MediaSPIP and can easily be adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (5852)

  • avformat: refactor ff_stream_encode_params_copy() to stream_params_copy()

    6 August 2022, by Pierre-Anthony Lemieux
    avformat: refactor ff_stream_encode_params_copy() to stream_params_copy()
    

    Addresses http://ffmpeg.org/pipermail/ffmpeg-devel/2022-August/299726.html

    Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@outlook.com>

    • [DH] libavformat/avformat.c
    • [DH] libavformat/fifo.c
    • [DH] libavformat/internal.h
    • [DH] libavformat/mux.h
    • [DH] libavformat/mux_utils.c
    • [DH] libavformat/segment.c
    • [DH] libavformat/tee.c
    • [DH] libavformat/webm_chunk.c
  • How to decode two Dolby E audio streams simultaneously with ffmpeg?

    11 August 2022, by David

    I'm trying to decode 2 Dolby-E audio streams simultaneously.

    ffmpeg -non_pcm_mode copy -i C:\vlc-record.ts -map 0:3 -map 0:4 -c:a pcm_s24le -f s24le - | ffplay -

    Using -map 0:3 it works for one stream at a time, but when I add the second one with -map 0:4 it only processes the first one and there is just one stream on my dolby_e output pipe.
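
    As a hedged sketch only (not taken from the original post), one way to keep the two s302m streams separate is to give each its own raw output instead of interleaving both into a single pipe; the stream indexes 0:3 and 0:4 and the input file name are those from the question, while the output file names are placeholders:

    REM Write each s302m stream's payload to its own 24-bit raw output,
    REM so the two Dolby E carriers are not mixed into one pipe.
    ffmpeg -non_pcm_mode copy -i C:\vlc-record.ts ^
      -map 0:3 -c:a pcm_s24le -f s24le dolby_e_1.raw ^
      -map 0:4 -c:a pcm_s24le -f s24le dolby_e_2.raw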

    Is there a way to create a two-stream, multichannel, non-PCM output from the 2 Dolby E PCM streams?

    I tried to find a solution with named pipes, but it seems the Windows command line doesn't accept them. I also tried to merge the PCM streams, but then it says "Invalid data found when processing input".

    I can't do it in two passes because I'll be using it with live UDP video streams.

    ffmpeg version 2022-03-07-git-e645a1ddb9-full_build-www.gyan.dev Copyright (c) 2000-2022 the FFmpeg developers
      built with gcc 11.2.0 (Rev7, Built by MSYS2 project)
      configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
      libavutil      57. 22.100 / 57. 22.100
      libavcodec     59. 22.100 / 59. 22.100
      libavformat    59. 17.103 / 59. 17.103
      libavdevice    59.  5.100 / 59.  5.100
      libavfilter     8. 27.100 /  8. 27.100
      libswscale      6.  5.100 /  6.  5.100
      libswresample   4.  4.100 /  4.  4.100
      libpostproc    56.  4.100 / 56.  4.100
    ffplay version 2022-03-07-git-e645a1ddb9-full_build-www.gyan.dev Copyright (c) 2003-2022 the FFmpeg developers
      built with gcc 11.2.0 (Rev7, Built by MSYS2 project)
      configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
      libavutil      57. 22.100 / 57. 22.100
      libavcodec     59. 22.100 / 59. 22.100
      libavformat    59. 17.103 / 59. 17.103
      libavdevice    59.  5.100 / 59.  5.100
      libavfilter     8. 27.100 /  8. 27.100
      libswscale      6.  5.100 /  6.  5.100
      libswresample   4.  4.100 /  4.  4.100
      libpostproc    56.  4.100 / 56.  4.100
    Input #0, mpegts, from 'C:\vlc-record.ts':
      Duration: 00:00:15.76, start: 42530.445511, bitrate: 31505 kb/s
      Program 1
      Stream #0:0: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, progressive), 1920x1080 [SAR 1:1 DAR 16:9], 25 fps, 25 tbr, 90k tbn
      Stream #0:1: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, fltp, 256 kb/s
      Stream #0:2: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, fltp, 256 kb/s
      Stream #0:3: Audio: s302m (BSSD / 0x44535342), stereo, s32 (20 bit), 2304 kb/s
      Stream #0:4: Audio: s302m (BSSD / 0x44535342), stereo, s32 (20 bit), 2304 kb/s
    Stream mapping:
      Stream #0:3 -> #0:0 (s302m (native) -> pcm_s24le (native))
    Press [q] to stop, [?] for help
    Output #0, s24le, to 'pipe:':
      Metadata:
        encoder         : Lavf59.17.103
      Stream #0:0: Audio: pcm_s24le, 48000 Hz, stereo, s32 (20 bit), 2304 kb/s
        Metadata:
          encoder         : Lavc59.22.100 pcm_s24le
    [dolby_e @ 0000026b16da0640] Stream has 2 programs (configuration 0), channels will be output in native order.
    Input #0, s337m, from 'pipe:':
      Duration: N/A, bitrate: N/A
      Stream #0:0: Audio: dolby_e, 44800 Hz, 7.1, fltp
    [dolby_e @ 0000026b16d7ca80] Stream has 2 programs (configuration 0), channels will be output in native order.
       1.25 M-A:  0.000 fd=   0 aq=  292KB vq=    0KB sq=    0B f=0/0
    av_interleaved_write_frame(): Broken pipe
    Error writing trailer of pipe:: Broken pipe
    size=    1451kB time=00:00:05.20 bitrate=2286.3kbits/s speed=3.07x
    video:0kB audio:1462kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
    Error closing file pipe:: Broken pipe
    Conversion failed!

  • FFMPEG RTSP stream to multiple outputs, one mp4 and also restream

    16 August 2022, by csonty

    I would like to seek your help with FFMPEG and how to use it properly on Windows and Linux. I have to get the camera RTSP stream, record it into 1-minute segments, and also stream it at the same time. I did try to stream it to RTSP, but it didn't work at all ([f=RTSP]rtsp://ipaddress:554/mystream); it just got stuck and did nothing.

    I also tried to add -use_wallclock_as_timestamps 1, but it didn't do anything. I tried to add -flags +global_header as well, but I'm not sure how to add it to this script.

    I tried it on both Linux and Windows with the following commands (I'm not sure how to turn them into a bash script; see the sketch after the commands below).

    Linux:
    ./usr/bin/ffmpeg -i 'rtsp://username:password@192.168.68.99:554/Streaming/Channels/101/' -c copy -f tee -map 0:v '[f=segment:strftime=1:segment_time=60]wwcam99-%Y-%m-%d_%H-%M-%S.mp4|[f=mpegts]udp://10.10.3.251:5555'

    Windows:
    ffmpeg -i "rtsp://username:password@192.168.68.99:554/Streaming/Channels/101/" -c copy -f tee -map 0:v "[f=segment:strftime=1:segment_time=60]wwcam99-%%Y-%%m-%%d_%%H-%%M-%%S.mp4|[f=mpegts]udp://10.10.3.199:5555"
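
    As a hedged sketch only (it reuses the camera URL and outputs from the question; the placement of -flags +global_header is an assumption, not a verified fix), the Linux command can be wrapped in a bash script like this:

    #!/bin/bash
    # Record the RTSP stream into 1-minute MP4 segments and restream it over UDP.
    # -flags +global_header is added as an output option here; whether it resolves
    # the reported warnings is untested.
    /usr/bin/ffmpeg -i 'rtsp://username:password@192.168.68.99:554/Streaming/Channels/101/' \
      -c copy -map 0:v -flags +global_header \
      -f tee '[f=segment:strftime=1:segment_time=60]wwcam99-%Y-%m-%d_%H-%M-%S.mp4|[f=mpegts]udp://10.10.3.251:5555'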

    It records the files, and the stream can be viewed on localhost (I'm not sure why I can't view it from other machines even with the firewall off, but that might be a network issue).

    This is the output, with mostly error messages that I couldn't fix:

    [rtsp @ 0x563f2688cf80] max delay reached. need to consume packet
    [rtsp @ 0x563f2688cf80] RTP: missed 216 packets
    [h264 @ 0x563f26890c40] left block unavailable for requested intra4x4 mode -1
    [h264 @ 0x563f26890c40] error while decoding MB 0 12, bytestream 144004
    [h264 @ 0x563f26890c40] concealing 13993 DC, 13993 AC, 13993 MV errors in I frame
    Input #0, rtsp, from 'rtsp://username:password@192.168.68.99:554/Streaming/Channels/101/':
      Metadata:
        title           : Media Presentation
      Duration: N/A, start: 0.983333, bitrate: N/A
      Stream #0:0: Video: h264 (Main), yuvj420p(pc, bt709, progressive), 2688x1520 [SAR 1:1 DAR 168:95], 6 fps, 20 tbr, 90k tbn, 12 tbc
    [segment @ 0x563f268ced00] Opening 'wwcam99-2022-08-16_17-28-20.mp4' for writing
    Output #0, tee, to '[f=segment:strftime=1:segment_time=60]wwcam09-%Y-%m-%d_%H-%M-%S.mp4|[f=mpegts]udp://10.10.3.251:5555':
      Metadata:
        title           : Media Presentation
        encoder         : Lavf58.29.100
      Stream #0:0: Video: h264 (Main), yuvj420p(pc, bt709, progressive), 2688x1520 [SAR 1:1 DAR 168:95], q=2-31, 6 fps, 20 tbr, 6 tbn, 6 tbc
    Stream mapping:
      Stream #0:0 -> #0:0 (copy)
    Press [q] to stop, [?] for help
    [tee @ 0x563f268cbc00] Non-monotonous DTS in output stream 0:0; previous: 0, current: -5; changing to 1. This may result in incorrect timestamps in the output file.
    [tee @ 0x563f268cbc00] Non-monotonous DTS in output stream 0:0; previous: 1, current: -4; changing to 2. This may result in incorrect timestamps in the output file.
    [tee @ 0x563f268cbc00] Non-monotonous DTS in output stream 0:0; previous: 2, current: -3; changing to 3. This may result in incorrect timestamps in the output file.
    [tee @ 0x563f268cbc00] Non-monotonous DTS in output stream 0:0; previous: 3, current: -2; changing to 4. This may result in incorrect timestamps in the output file.
    [tee @ 0x563f268cbc00] Non-monotonous DTS in output stream 0:0; previous: 4, current: -1; changing to 5. This may result in incorrect timestamps in the output file.
    [tee @ 0x563f268cbc00] Non-monotonous DTS in output stream 0:0; previous: 5, current: 0; changing to 6. This may result in incorrect timestamps in the output file.
    [tee @ 0x563f268cbc00] Non-monotonous DTS in output stream 0:0; previous: 6, current: 1; changing to 7. This may result in incorrect timestamps in the output file.
    [tee @ 0x563f268cbc00] Non-monotonous DTS in output stream 0:0; previous: 7, current: 2; changing to 8. This may result in incorrect timestamps in the output file.
    [tee @ 0x563f268cbc00] Non-monotonous DTS in output stream 0:0; previous: 8, current: 3; changing to 9. This may result in incorrect timestamps in the output file.
    [tee @ 0x563f268cbc00] Non-monotonous DTS in output stream 0:0; previous: 9, current: 4; changing to 10. This may result in incorrect timestamps in the output file.

    Any help would be highly appreciated.
