
Media (91)

Other articles (59)

  • Contribute to its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    To do this, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You just need to subscribe to the translators' mailing list to ask for more information.
    At the moment MediaSPIP is only available in French and (...)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors can edit their own information on the authors page

  • The statuses of mutualisation instances

    13 March 2010, by

    For general compatibility between the mutualisation (site farm) management plugin and SPIP's original functions, instance statuses are the same as for any other object (articles, etc.); only their names in the interface differ slightly.
    The possible statuses are: prepa (requested), which corresponds to an instance requested by a user. If the site had already been created in the past, it is switched to disabled mode. publie (validated), which corresponds to an instance validated by a (...)

On other sites (12528)

  • How to use ffmpeg to overlay waveforms on xstack mosaics and specify specific audio for playback

    1 May 2022, by kellib

    I would like to make a mosaic of multiple titled streams, 1) specifying which of the audio streams to play, and 2) overlaying waveforms at the bottom of each video tile for the audio it belongs to.

    


    I'm successfully able to create the titled mosaic of streams with the code below.

    


    However:

    1. I'm having a hard time figuring out how to specify just one of the audio sources. I found amix, but I don't really want to mix them; I just want to select audio [a0], or [a1], or [a2], etc.

    2. I'm having a hard time figuring out how to overlay the waveforms at the bottom of the video for each of the tiles. I struggled trying to figure out putting showwaves into the mix. Is it possible?


    


    I want each tile to look like this, but since these are rtmp streams, they need to play out the matching waveforms dynamically with each stream: https://dragonquest64.blogspot.com/2020/01/ffmpeg-audio-waveform.html

    


    If someone could point me in the right direction, that would be great. I'm getting close, but I'm pretty new to all of this and have already spent way more time than I should have, so I would love a little help.

    


    ffmpeg \
-i rtmp://my.cdn.com/srcEncoders/STREAM-1 \
-i rtmp://my.cdn.com/srcEncoders/STREAM-2 \
-i rtmp://my.cdn.com/srcEncoders/STREAM-3 \
-i rtmp://my.cdn.com/srcEncoders/STREAM-4 \
  -filter_complex " \
      [0:v] setpts=PTS-STARTPTS, scale=qvga \
    , drawtext=text=STREAM-1:fontsize=20:x=10:y=10:fontcolor=white:box=1:boxcolor=black@0.5:boxborderw=5 [a0]; \
      [1:v] setpts=PTS-STARTPTS, scale=qvga \
    , drawtext=text=STREAM-2:fontsize=20:x=10:y=10:fontcolor=white:box=1:boxcolor=black@0.5:boxborderw=5 [a1]; \
      [2:v] setpts=PTS-STARTPTS, scale=qvga \
    , drawtext=text=STREAM-3:fontsize=20:x=10:y=10:fontcolor=white:box=1:boxcolor=black@0.5:boxborderw=5 [a2]; \
      [3:v] setpts=PTS-STARTPTS, scale=qvga \
    , drawtext=text=STREAM-4:fontsize=20:x=10:y=10:fontcolor=white:box=1:boxcolor=black@0.5:boxborderw=5 [a3]; \
      [a0][a1][a2][a3]xstack=inputs=4:layout=0_0|0_h0|w0_0|w0_h0[out]; \
    amix=inputs=1
      " \
  -map "[out]" \
 -c:v libx264 -b:v 1000k -g 30 -keyint_min 120 -profile:v baseline -preset veryfast -f mpegts "udp://127.0.0.1:1234?pkt_size=1316"
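
    A minimal sketch of one possible way to handle both points (not from the original post): drop amix, pick the audio to play with a second -map instead, and give each tile its own showwaves rendering overlaid along its bottom edge before xstack. Only two inputs are shown for brevity, and the waveform size, the [vN]/[wN]/[tN] labels and the aac audio codec are assumptions:

    ffmpeg \
-i rtmp://my.cdn.com/srcEncoders/STREAM-1 \
-i rtmp://my.cdn.com/srcEncoders/STREAM-2 \
  -filter_complex " \
      [0:v] setpts=PTS-STARTPTS, scale=qvga \
    , drawtext=text=STREAM-1:fontsize=20:x=10:y=10:fontcolor=white:box=1:boxcolor=black@0.5:boxborderw=5 [v0]; \
      [1:v] setpts=PTS-STARTPTS, scale=qvga \
    , drawtext=text=STREAM-2:fontsize=20:x=10:y=10:fontcolor=white:box=1:boxcolor=black@0.5:boxborderw=5 [v1]; \
      [0:a] showwaves=s=320x60:mode=line:colors=white [w0]; \
      [1:a] showwaves=s=320x60:mode=line:colors=white [w1]; \
      [v0][w0] overlay=x=0:y=H-h [t0]; \
      [v1][w1] overlay=x=0:y=H-h [t1]; \
      [t0][t1] xstack=inputs=2:layout=0_0|w0_0 [out] \
      " \
  -map "[out]" -map 0:a:0 \
  -c:v libx264 -b:v 1000k -g 30 -keyint_min 120 -profile:v baseline -preset veryfast \
  -c:a aac -f mpegts "udp://127.0.0.1:1234?pkt_size=1316"

    Nothing is mixed here: -map 0:a:0 passes the first input's audio straight through, and each [wN] waveform is generated only from the audio that belongs to its own tile, so changing which audio plays is just a matter of changing the second -map.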


    


  • Debian php ffmpeg error "Could not get frame filename number 2 from pattern"

    22 August 2021, by Sam

    My code below works fine on my OS X laptop

    


    use FFMpeg\Coordinate\TimeCode;

$sec = 1;

// Point php-ffmpeg at the ffmpeg/ffprobe binaries installed on this host
$ffmpeg = \FFMpeg\FFMpeg::create([
   'ffmpeg.binaries'  => exec('which ffmpeg'),
   'ffprobe.binaries' => exec('which ffprobe')
]);

$video = $ffmpeg->open($file);
// Grab the frame at $sec seconds and save it if it does not exist yet
$frame = $video->frame(TimeCode::fromSeconds($sec));
if (!file_exists($newFilePath)) {
      $frame->save($newFilePath);
      $this->output->writeln('make: ' . $newFilePath);
}


    


    but on Debian I get the error "Could not get frame filename number 2 from pattern"

    


    Here is the error description from the ffmpeg website:

    


    


    This usually occurs because the output name is incorrect or some option was omitted.
If outputting a single image you need to include -frames:v 1.
If outputting a series of images you need to use the proper naming pattern as described in the image muxer documentation. For example, output_%03d.png will make a series named output_001.png, output_002.png, output_003.png, etc.
If outputting a single image that is continuously overwritten with new images, add -update 1.
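
    For reference, a rough sketch of what those three cases look like as concrete commands (the input and output names here are placeholders, not taken from the post):

    # a single image:
ffmpeg -i input.mov -frames:v 1 thumb.png
# a numbered series of images:
ffmpeg -i input.mov output_%03d.png
# a single image that is continuously overwritten:
ffmpeg -i input.mov -update 1 latest.png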

    


    


    My filename looks fine to me

    


    Here is the detailed error:

    


    ffmpeg failed to execute command '/usr/bin/ffmpeg' '-y' '-ss' '00:00:01.00' '-i' '/var/www/web/public/data/post/J4ukGXDowqFFA6hNIuP11606468845/BuF8e7HrkX1606468845.mov' '-vframes' '1' '-f' 'image2' '/var/www/web/public/data/post/J4ukGXDowqFFA6hNIuP11606468845/BuF8e7HrkX1606468845-preview.webp':

Error Output:

ffmpeg version 4.1.6-1~deb10u1 Copyright (c) 2000-2020 the FFmpeg developers
built with gcc 8 (Debian 8.3.0-6)
configuration: --prefix=/usr --extra-version='1~deb10u1' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
libavutil      56. 22.100 / 56. 22.100
libavcodec     58. 35.100 / 58. 35.100
libavformat    58. 20.100 / 58. 20.100
libavdevice    58.  5.100 / 58.  5.100
libavfilter     7. 40.101 /  7. 40.101
libavresample   4.  0.  0 /  4.  0.  0
libswscale      5.  3.100 /  5.  3.100
libswresample   3.  3.100 /  3.  3.100
libpostproc    55.  3.100 / 55.  3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/var/www/web/public/data/post/J4ukGXDowqFFA6hNIuP11606468845/BuF8e7HrkX1606468845.mov':                               
Metadata:
major_brand     : qt
minor_version   : 0
compatible_brands: qt
creation_time   : 2020-11-27T09:17:34.000000Z
Duration: 00:00:03.18, start: 0.000000, bitrate: 932 kb/s
Stream #0:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 138 kb/s (default)                                                                 
Metadata:
creation_time   : 2020-11-27T09:17:34.000000Z
handler_name    : Core Media Audio
Stream #0:1(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, smpte170m/smpte170m/bt709), 480x264, 780 kb/s, 30 fps, 30 tbr, 600 tbn, 1200 tbc (default)  
Metadata:
creation_time   : 2020-11-27T09:17:34.000000Z
handler_name    : Core Media Video
encoder         : H.264
Stream mapping:
Stream #0:1 -> #0:0 (h264 (native) -> webp (libwebp_anim))                                                                                                          
Press [q] to stop, [?] for help
Output #0, image2, to '/var/www/web/public/data/post/J4ukGXDowqFFA6hNIuP11606468845/BuF8e7HrkX1606468845-preview.webp':                                        
Metadata:
major_brand     : qt
minor_version   : 0
compatible_brands: qt
encoder         : Lavf58.20.100
Stream #0:0(und): Video: webp (libwebp_anim), yuv420p, 480x264, q=2-31, 200 kb/s, 30 fps, 30 tbn, 30 tbc (default)                                                
Metadata:
creation_time   : 2020-11-27T09:17:34.000000Z
handler_name    : Core Media Video
encoder         : Lavc58.35.100 libwebp_anim
[image2 @ 0x55bc313ccfc0] Could not get frame filename number 2 from pattern '/var/www/web/public/data/post/J4ukGXDowqFFA6hNIuP11606468845/BuF8e7HrkX1606468845-preview.webp' (either set update or use a pattern like %03d within the filename pattern)
av_interleaved_write_frame(): Invalid argument
frame=    1 fps=0.0 q=-0.0 Lsize=N/A time=00:00:00.06 bitrate=N/A speed=1.54x
video:18kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Conversion failed!
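
    Going by the FAQ entry quoted above, one thing worth trying (an assumption, not a confirmed fix) is the exact command from the log with -update 1 added, since the output is a single fixed filename that the image2 muxer would otherwise expect to contain a %d-style pattern:

    ffmpeg -y -ss 00:00:01.00 \
  -i '/var/www/web/public/data/post/J4ukGXDowqFFA6hNIuP11606468845/BuF8e7HrkX1606468845.mov' \
  -vframes 1 -update 1 -f image2 \
  '/var/www/web/public/data/post/J4ukGXDowqFFA6hNIuP11606468845/BuF8e7HrkX1606468845-preview.webp'

    If that works from the shell, what remains is finding a way to pass the extra option through the PHP wrapper, which depends on the php-ffmpeg version in use.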


    


  • ffmpeg Output file #0 does not contain any stream when rtsp to png

    10 April 2022, by Alan Sereb

    I want to take one frame from a video, save it as a PNG thumbnail, and refresh it once in a while, so I run:

    


    ffmpeg -y -i 'rtsp://192.168.2.123' -s 240x135 -vf fps=1 %d.jpg


    


    I get the output:

    


    Input #0, rtsp, from 'rtsp://192.168.2.123':
  Metadata:
    title           : RTSP Server
  Duration: N/A, bitrate: 64 kb/s
    Stream #0:0: Video: h264, none, 90k tbr, 90k tbn, 180k tbc
    Stream #0:1: Audio: pcm_mulaw, 8000 Hz, mono, s16, 64 kb/s
Output #0, image2, to '%d.jpg':
Output file #0 does not contain any stream


    


    ffmpeg version 4.2.4-1ubuntu0.1 Copyright (c) 2000-2020 the FFmpeg developers


    


    What am I doing wrong?
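
    The probe output above shows Video: h264, none, which suggests ffmpeg never received enough data to fill in the video stream's parameters before setting up the output. A sketch of a variation that is commonly tried in this situation (an assumption, not a verified fix for this camera; thumb.jpg is a placeholder name): force RTSP over TCP, give the probe more data, and write a single continuously refreshed thumbnail:

    ffmpeg -y -rtsp_transport tcp -probesize 10M -analyzeduration 10M \
  -i 'rtsp://192.168.2.123' \
  -vf "fps=1,scale=240:135" -update 1 thumb.jpg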