Other articles (39)

  • General document management

    13 May 2011, by

    MediaSPIP never modifies the original document that is uploaded.
    For each uploaded document, it performs two successive operations: it creates an additional version that can easily be viewed online, while keeping the original available for download in case it cannot be read in a web browser; and it extracts the original document's metadata in order to describe the file textually.
    The tables below explain what MediaSPIP can do (...)

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player was created specifically for MediaSPIP and can easily be adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (6465)

  • ffmpeg: segmenting webcam video output on windows (dshow)

    2 May 2021, by Arthur Hebert-Ryan

    I want to do exactly what is described in the post linked below, except on Windows using dshow instead of v4l2:

    FFMPEG: Create contiguous videos from webcam

    My attempts all result in a single output file, as though the segment options are being ignored. Here is my most basic command to try to get a series of 30s video files from my webcam:

    ffmpeg -f dshow -video_pin_name 1 -i video="USB Camera" -copyinkf -vcodec copy -f segment -segment_time 30 video_segment_%03d.mp4

    Command output:

    D:\videos>ffmpeg -f dshow -video_pin_name 1 -i video="USB Camera" -copyinkf -vcodec copy -f segment -segment_time 30 video_segment_%03d.mp4
ffmpeg version 4.3.2-2021-02-27-essentials_build-www.gyan.dev Copyright (c) 2000-2021 the FFmpeg developers
built with gcc 10.2.0 (Rev6, Built by MSYS2 project)
configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-libass --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
libavutil      56. 51.100 / 56. 51.100
libavcodec     58. 91.100 / 58. 91.100
libavformat    58. 45.100 / 58. 45.100
libavdevice    58. 10.100 / 58. 10.100
libavfilter     7. 85.100 /  7. 85.100
libswscale      5.  7.100 /  5.  7.100
libswresample   3.  7.100 /  3.  7.100
libpostproc    55.  7.100 / 55.  7.100
Input #0, dshow, from 'video=USB Camera':
Duration: N/A, start: 869580.415000, bitrate: N/A
    Stream #0:0: Video: h264 (Main) (H264 / 0x34363248), yuv420p, 1920x1080, 30 fps, 30 tbr, 10000k tbn, 20000k tbc
[segment @ 000001e814655640] Opening 'video_segment_000.mp4' for writing
Output #0, segment, to 'video_segment_%03d.mp4':
Metadata:
    encoder         : Lavf58.45.100
    Stream #0:0: Video: h264 (Main) (H264 / 0x34363248), yuv420p, 1920x1080, q=2-31, 30 fps, 30 tbr, 10000k tbn, 10000k tbc
Stream mapping:
Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
frame=  171 fps= 34 q=-1.0 size=N/A time=00:00:05.26 bitrate=N/A speed=1.04x

    I've tried :

    


      

    • using ssegment instead of segment
    • 


    • different output containers (.mp4, .h264, .avi, .mkv)
    • 


    • adding -map 0 and -an options
    • 


    • verifying that I'm getting keyframes from the output every second using the ffprobe command described here
    • 


    • adding -g 60
    • 


    


    But I get only one output file, which grows indefinitely. How do I capture from my webcam continuously into files of equal duration?

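    For reference, a keyframe check along the lines of the one mentioned in the list above might look like the following. This is only an illustrative sketch, not necessarily the exact command from the linked answer, and it assumes one of the produced files (here video_segment_000.mp4) as input:

    ffprobe -v error -select_streams v:0 -show_entries frame=pts_time,pict_type -of csv=p=0 video_segment_000.mp4

    Each output line carries a frame timestamp and its picture type, so the spacing of the I frames can be read off directly.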

    Here is a log of the same command using the -report option during 2+ minutes of capture (the log is too big to paste into SO).

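    Purely as an illustrative sketch (untested with this dshow source, and not a confirmed fix), the segment muxer is sometimes combined with -reset_timestamps 1 and an explicit -segment_format when stream-copying, e.g.:

    ffmpeg -f dshow -video_pin_name 1 -i video="USB Camera" -copyinkf -vcodec copy -f segment -segment_time 30 -segment_format mp4 -reset_timestamps 1 video_segment_%03d.mp4

    Note that with -vcodec copy the muxer can only cut at keyframes already present in the incoming stream, so each segment boundary lands on the first keyframe at or after the 30-second mark.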

  • Performance issue in streaming desktop out of a Raspberry with FFMPEG

    30 April 2021, by skynet

    I'm quite a newbie in FFMPEG and I apologize in advance for any inaccuracy I may write.

    My goal is to stream over UDP a Full HD (1920x1080) desktop connected to a Raspberry PI 4 - 4 GB RAM.

    I made many attempts, and currently this is the setup with the best performance I have found (I used the knowledge from https://www.willusher.io/general/2020/11/15/hw-accel-encoding-rpi4).

    1. I installed the 64-bit Raspbian OS on the PI from this link: https://www.raspberrypi.org/forums/viewtopic.php?t=275370
       This is because the HW H264 encoder (h264_v4l2m2m) is faster than the 32-bit encoder h264_omx.

    2. I downloaded and installed FFMPEG 4.4, using this configuration:
       ./configure --prefix="$HOME/ffmpeg_build" --pkg-config-flags="pkg-config --static" --extra-cflags="-I$HOME/ffmpeg_build/include" --extra-ldflags="-L$HOME/ffmpeg_build/lib" --extra-libs="-lpthread -lm" --bindir="$HOME/bin" --enable-gpl --enable-gnutls --disable-libaom --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libvorbis --enable-libvpx --enable-libx264 --disable-libx265 --enable-nonfree --arch=aarch64 --disable-libxml2 --enable-libwebp --enable-libdrm
       I used FFMPEG 4.4 because of compatibility issues with h264_v4l2m2m in the shipped FFMPEG version 4.1.4.

    3. I use this command:
       ffmpeg -f x11grab -probesize 42M -s 1920x1080 -i :0.0 -c:a copy -c:v h264_v4l2m2m -num_output_buffers 32 -num_capture_buffers 16 -b:v 8M -minrate 8M -maxrate 8M -pix_fmt rgb24 -f mpegts udp://239.255.90.60:5004?pkt_size=1316
       The -pix_fmt rgb24 flag is needed because x11grab on the 64-bit OS uses BGR0 chroma subsampling... which means the red and blue colours are inverted!

    This is the output:

    ffmpeg version 4.4 Copyright (c) 2000-2021 the FFmpeg developers  
  built with gcc 8 (Debian 8.3.0-6)  
  configuration: --prefix=/home/pi/ffmpeg_build --pkg-config-flags='pkg-config --static' --extra-cflags=-I/home/pi/ffmpeg_build/include --extra-ldflags=-L/home/pi/ffmpeg_build/lib --extra-libs='-lpthread -lm' --enable-gpl --enable-gnutls --disable-libaom --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libvorbis --enable-libvpx --enable-libx264 --disable-libx265 --enable-nonfree --arch=aarch64 --disable-libxml2 --enable-libwebp --enable-libdrm  
  libavutil      56. 70.100 / 56. 70.100  
  libavcodec     58.134.100 / 58.134.100  
  libavformat    58. 76.100 / 58. 76.100  
  libavdevice    58. 13.100 / 58. 13.100  
  libavfilter     7.110.100 /  7.110.100  
  libswscale      5.  9.100 /  5.  9.100  
  libswresample   3.  9.100 /  3.  9.100  
  libpostproc    55.  9.100 / 55.  9.100  
Input #0, x11grab, from ':0.0':  
  Duration: N/A, start: 1619778505.990981, bitrate: 1988667 kb/s  
  Stream #0:0: Video: rawvideo (BGR[0] / 0x524742), bgr0, 1920x1080, 1988667 kb/s, 29.97 fps, 56   tbr, 1000k tbn, 1000k tbc  
Stream mapping:  
  Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (h264_v4l2m2m))  
Press [q] to stop, [?] for help  
[h264_v4l2m2m @ 0x5598d14070] Using device /dev/video11  
[h264_v4l2m2m @ 0x5598d14070] driver 'bcm2835-codec' on card 'bcm2835-codec-encode' in mplane mode  
[h264_v4l2m2m @ 0x5598d14070] requesting formats: output=RGB3 capture=H264  
[h264_v4l2m2m @ 0x5598d14070] Failed to set gop size: Invalid argument  
Output #0, mpegts, to 'udp://239.255.90.60:5004?pkt_size=1316':  
  Metadata:  
    encoder         : Lavf58.76.100  
  Stream #0:0: Video: h264, rgb24(pc, progressive), 1920x1080, q=2-31, 8000 kb/s, 56 fps, 90k tbn  
    Metadata:  
      encoder         : Lavc58.134.100 h264_v4l2m2m  
[mpegts @ 0x5598d12c10] Non-monotonous DTS in output stream 0:0; previous: 0, current: 0; changing   to 1. This may result in incorrect timestamps in the output file.  
frame=  799 fps= 16 q=-0.0 size=   16180kB time=00:00:49.19 bitrate=2694.2kbits/s speed=   1x 

    As you can see, my issue is that the fps value is about 15 when streaming an HD video with VLC in full-screen mode (the fps value depends on what is displayed on the screen, which I find odd, since the encoding is supposed to be done in hardware).

    So the question is: is there any hope of getting close to 25 fps, so as to have a smooth display on the receiver? Either by using a better FFMPEG command or by tweaking the Raspberry?

    Thanks for any help!

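    Purely as an untested sketch rather than a confirmed fix, one variant sometimes tried in this situation is to pin the x11grab rate explicitly and convert the BGR0 frames to yuv420p with a filter before handing them to the hardware encoder, for example:

    # untested sketch: fixed 25 fps grab, software conversion to yuv420p before h264_v4l2m2m
    ffmpeg -f x11grab -framerate 25 -video_size 1920x1080 -probesize 42M -i :0.0 \
      -vf format=yuv420p -c:v h264_v4l2m2m -num_output_buffers 32 -num_capture_buffers 16 \
      -b:v 8M -f mpegts "udp://239.255.90.60:5004?pkt_size=1316"

    The format conversion would also take care of the swapped red/blue channels, since swscale interprets the BGR0 layout correctly; whether that conversion itself becomes the bottleneck on the Pi 4 would need measuring.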

  • compiling ffmpeg for Mac OSX High Sierra 10.13

    29 July 2021, by Martin

    Hello, I made an Electron app that uses ffmpeg to combine audio and render video. It works fine on Windows, Linux, and modern macOS computers, but a user has reported to me that on an older version such as High Sierra 10.13, the way I have set up ffmpeg does not work.

    I have a virtual machine with High Sierra v10.13 where I install RenderTune-mac.dmg from my RenderTune releases page, then I download 2 audio files and the image from this link. I open RenderTune and try to render a video. My command to combine the audio files into a single mp3 works fine, but when I try to combine that mp3 with the image file, the ffmpeg build packaged with my Electron app fails with this error:

    Command was killed with SIGABRT (Aborted): /Applications/RenderTune.app/Contents/Resources/ffmpeg -loop 1 -framerate 2 -i /Users/martin/Downloads/R-3777978-1344032418-8379.jpeg.jpg -i /Users/martin/Downloads/output-871140.mp3 -y -acodec copy -b:a 320k -vcodec libx264 -b:v 8000k -maxrate 8000k -minrate 8000k -bufsize 3M -filter:v scale=w=1920:h=1954 -preset medium -tune stillimage -crf 18 -pix_fmt yuv420p -shortest /Users/martin/Downloads/concatVideo-871140.mp4
ffmpeg version git-2021-03-24-13335df Copyright (c) 2000-2021 the FFmpeg developers
  built with Apple LLVM version 10.0.1 (clang-1001.0.46.4)
  configuration: --pkgconfigdir=/Users/martinbarker/Documents/projects/rendertune-0.5.0/workspace/lib/pkgconfig --prefix=/Users/martinbarker/Documents/projects/rendertune-0.5.0/workspace --pkg-config-flags=--static --extra-cflags='-I/Users/martinbarker/Documents/projects/rendertune-0.5.0/workspace/include -mmacosx-version-min=10.10' --extra-ldflags='-L/Users/martinbarker/Documents/projects/rendertune-0.5.0/workspace/lib -mmacosx-version-min=10.10' --extra-libs='-lpthread -lm' --enable-static --disable-securetransport --disable-debug --disable-shared --disable-ffplay --disable-lzma --disable-doc --enable-version3 --enable-pthreads --enable-runtime-cpudetect --enable-avfilter --enable-filters --disable-libxcb --enable-gpl --enable-nonfree --disable-libass --enable-libfdk-aac --enable-libmp3lame --enable-libx264
  libavutil      56. 66.100 / 56. 66.100
  libavcodec     58.128.100 / 58.128.100
  libavformat    58. 69.100 / 58. 69.100
  libavdevice    58. 12.100 / 58. 12.100
  libavfilter     7.107.100 /  7.107.100
  libswscale      5.  8.100 /  5.  8.100
  libswresample   3.  8.100 /  3.  8.100
  libpostproc    55.  8.100 / 55.  8.100
Input #0, image2, from '/Users/martin/Downloads/R-3777978-1344032418-8379.jpeg.jpg':
  Duration: 00:00:00.50, start: 0.000000, bitrate: 1758 kb/s
  Stream #0:0: Video: mjpeg (Progressive), yuvj444p(pc, bt470bg/unknown/unknown), 590x600 [SAR 1:1 DAR 59:60], 2 fps, 2 tbr, 2 tbn, 2 tbc
Input #1, mp3, from '/Users/martin/Downloads/output-871140.mp3':
  Metadata:
    title           : My Little Grass Shack
    album           : Our Hawaii - A Collection Of Personal Favorites
    artist          : Society Of Seven
    track           : 11
    encoder         : Lavf58.69.100
  Duration: 00:06:25.59, start: 0.025057, bitrate: 320 kb/s
  Stream #1:0: Audio: mp3, 44100 Hz, stereo, fltp, 320 kb/s
    Metadata:
      encoder         : Lavc58.12
Stream mapping:
  Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
  Stream #1:0 -> #0:1 (copy)
Press [q] to stop, [?] for help
[swscaler @ 0x7fbad9167600] deprecated pixel format used, make sure you did set range correctly
[libx264 @ 0x7fbad9040400] using SAR=2681/2679
dyld: lazy symbol binding failed: Symbol not found: ____chkstk_darwin
  Referenced from: /Applications/RenderTune.app/Contents/Resources/ffmpeg
  Expected in: /usr/lib/libSystem.B.dylib

dyld: Symbol not found: ____chkstk_darwin
  Referenced from: /Applications/RenderTune.app/Contents/Resources/ffmpeg
  Expected in: /usr/lib/libSystem.B.dylib

    at makeError (/Applications/Render…eca/lib/error.js:59)
    at handlePromise (/Applications/Render…/execa/index.js:114)
    at async file:/Applicat…js/newindex.js:1323


    These files render fine on Windows/Linux and recent macOS versions. In order to package ffmpeg with my Electron app on Mac computers, I had to build a custom sandboxed version with no dynamically linked libraries. I have a .sh file that automatically downloads ffmpeg and builds it with all the necessary flags for Mac computers:
    https://github.com/MartinBarker/RenderTune/blob/master/buildffmpeg.sh
    Inside this .sh file I compile ffmpeg using these flags:

    ./configure \
    --pkgconfigdir="$WORKSPACE/lib/pkgconfig" \
    --prefix=${WORKSPACE} \
    --pkg-config-flags="--static" \
    --extra-cflags="-I$WORKSPACE/include -mmacosx-version-min=${MACOS_MIN}" \
    --extra-ldflags="-L$WORKSPACE/lib -mmacosx-version-min=${MACOS_MIN}" \
    --extra-libs="-lpthread -lm" \
        --enable-static \
        --disable-securetransport \
        --disable-debug \
        --disable-shared \
        --disable-ffplay \
        --disable-lzma \
        --disable-doc \
        --enable-version3 \
        --enable-pthreads \
        --enable-runtime-cpudetect \
        --enable-avfilter \
        --enable-filters \
        --disable-libxcb \
        --enable-gpl \
        --enable-nonfree \
        --disable-libass \
        --enable-libfdk-aac \
        --enable-libmp3lame \
        --enable-libx264 

    If I try to run this script in my High Sierra VM, it fails with this message:

    Unknown option "-extra-libs=-lpthread"

    If I remove that flag, it fails with a different message:

    Unknown option "--enable-static"

    I need this flag in order to release my Electron app on the Mac App Store. Can anyone help me compile a static version of ffmpeg that works on old versions like High Sierra 10.13 as well as on modern macOS systems?
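    Offered only as a hedged observation: ffmpeg's configure does normally accept --extra-libs and --enable-static, so "Unknown option" errors of this kind are often a sign that the script's dashes, quotes, or backslash line continuations were mangled in transit (typographic dashes or quotes are a common culprit when a script has passed through a browser or word processor). As a plain-ASCII sketch of the same invocation, assuming WORKSPACE and MACOS_MIN are set as in buildffmpeg.sh:

    # untested sketch: the same flags as buildffmpeg.sh, written with plain ASCII quoting
    ./configure \
      --pkgconfigdir="$WORKSPACE/lib/pkgconfig" \
      --prefix="$WORKSPACE" \
      --pkg-config-flags="--static" \
      --extra-cflags="-I$WORKSPACE/include -mmacosx-version-min=$MACOS_MIN" \
      --extra-ldflags="-L$WORKSPACE/lib -mmacosx-version-min=$MACOS_MIN" \
      --extra-libs="-lpthread -lm" \
      --enable-static --disable-shared --disable-securetransport --disable-debug \
      --disable-ffplay --disable-lzma --disable-doc \
      --enable-version3 --enable-pthreads --enable-runtime-cpudetect \
      --enable-avfilter --enable-filters --disable-libxcb \
      --enable-gpl --enable-nonfree --disable-libass \
      --enable-libfdk-aac --enable-libmp3lame --enable-libx264

    This does not touch the dyld "Symbol not found" error itself, which is generally tied to the SDK and deployment target used for the build and is a separate question from the configure flags.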