
Other articles (57)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or later. If necessary, contact the administrator of your MédiaSpip to find out.

  • Adding notes and captions to images

    7 February 2011

    To be able to add notes and captions to images, the first step is to install the "Légendes" plugin.
    Once the plugin is activated, you can configure it in the configuration area to change the rights to create, modify and delete notes. By default, only site administrators can add notes to images.
    Changes when adding a media item
    When adding a media item of type "image", a new button appears above the preview (...)

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010

    The central/master site of the farm needs several plugins in addition to those of the channel sites in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-verification API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

On other sites (11720)

  • ffmpeg doesn't use all the pictures when creating a video

    9 September 2022, by Mikhael Karabas

    I have 75 pictures of the same size for an animation, named 0.png ... 74.png.
    When running ffmpeg to create a video out of them at 24 fps (command and log below), the resulting video, instead of the expected 75/24 = 3.125 s, is only 2.667 s long and contains just the first 64 frames (pictures), although ffmpeg reports that it processed 75 frames.
    I have checked by running

    ffmpeg -i output.webm out%%d.png

    on the resulting video: it indeed exports only the first 64 frames, not the remaining 11.
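
    As an additional cross-check (not part of the original question), the number of frames that actually decode from output.webm can also be counted directly with the FFmpeg C API. The following is a minimal sketch, assuming an FFmpeg 5.x-style API and a build linked against libavformat, libavcodec and libavutil:

// count_frames.cpp - minimal sketch: count the decodable video frames in output.webm
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
}
#include <cstdio>

int main() {
    const char* path = "output.webm";   // file produced by the ffmpeg command in the question
    AVFormatContext* fmt = nullptr;
    if (avformat_open_input(&fmt, path, nullptr, nullptr) < 0) return 1;
    if (avformat_find_stream_info(fmt, nullptr) < 0) return 1;

    int video_idx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, nullptr, 0);
    if (video_idx < 0) return 1;

    const AVCodec* dec = avcodec_find_decoder(fmt->streams[video_idx]->codecpar->codec_id);
    AVCodecContext* ctx = avcodec_alloc_context3(dec);
    avcodec_parameters_to_context(ctx, fmt->streams[video_idx]->codecpar);
    if (avcodec_open2(ctx, dec, nullptr) < 0) return 1;

    AVPacket* pkt = av_packet_alloc();
    AVFrame* frame = av_frame_alloc();
    long frames = 0;

    // Feed every packet of the video stream to the decoder and count the
    // frames it emits; flush the decoder with a null packet at the end.
    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == video_idx) {
            avcodec_send_packet(ctx, pkt);
            while (avcodec_receive_frame(ctx, frame) == 0) ++frames;
        }
        av_packet_unref(pkt);
    }
    avcodec_send_packet(ctx, nullptr);
    while (avcodec_receive_frame(ctx, frame) == 0) ++frames;

    std::printf("decoded %ld frames\n", frames);

    av_frame_free(&frame);
    av_packet_free(&pkt);
    avcodec_free_context(&ctx);
    avformat_close_input(&fmt);
    return 0;
}

    If this also prints 64, the frames are missing from the encoded file itself rather than being dropped by the PNG-export step.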

    


    I can't understand what I am doing wrong; please advise.

    


    Brief output below.

    


    Complete log: https://drive.google.com/file/d/1_J7wLPU9PJZ7jztpiJ8g_bZKPZfiK02L/view?usp=sharing

    


    D:\ffmpeg\ffmpeg-64.exe -report -framerate 24 -f image2 -i %01d.png -c:v libvpx-vp9 -pix_fmt yuva420p -crf 10 -b:v 0 output.webm
ffmpeg started on 2022-09-09 at 19:03:15
Report written to "ffmpeg-20220909-190315.log"
Log level: 48
ffmpeg version 2021-12-17-git-b780b6db64-essentials_build-www.gyan.dev Copyright (c) 2000-2021 the FFmpeg developers
  built with gcc 11.2.0 (Rev2, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-libass --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
  libavutil      57. 11.100 / 57. 11.100
  libavcodec     59. 14.100 / 59. 14.100
  libavformat    59. 10.100 / 59. 10.100
  libavdevice    59.  0.101 / 59.  0.101
  libavfilter     8. 20.100 /  8. 20.100
  libswscale      6.  1.101 /  6.  1.101
  libswresample   4.  0.100 /  4.  0.100
  libpostproc    56.  0.100 / 56.  0.100
Input #0, image2, from '%01d.png':
  Duration: 00:00:03.13, start: 0.000000, bitrate: N/A
  Stream #0:0: Video: png, rgba(pc), 300x400, 24 fps, 24 tbr, 24 tbn
File 'output.webm' already exists. Overwrite? [y/N] y
Stream mapping:
  Stream #0:0 -> #0:0 (png (native) -> vp9 (libvpx-vp9))
Press [q] to stop, [?] for help
[libvpx-vp9 @ 000002dad505c8c0] v1.11.0-62-g7f45e94d9
Output #0, webm, to 'output.webm':
  Metadata:
    encoder         : Lavf59.10.100
  Stream #0:0: Video: vp9, yuva420p(tv, progressive), 300x400, q=2-31, 24 fps, 1k tbn
    Metadata:
      encoder         : Lavc59.14.100 libvpx-vp9
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
frame=   75 fps=9.9 q=0.0 Lsize=    1056kB time=00:00:02.58 bitrate=3347.2kbits/s speed=0.342x
video:1036kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.942318%


    


  • FFmpeg C API HLS real time timestamps

    7 September 2022, by Robin

    I'm using the FFmpeg C API in C++ to read from an HLS stream. I need to know the wall-clock time of each AVPacket. I can extract the pts via AVPacket::pts, but that is relative to the start of the stream.

    


    This is how the .m3u8 file looks:

    


    #EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:NO
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0

#EXT-X-PROGRAM-DATE-TIME:2022-09-07T14:01:56.612+02:00
#EXTINF:10.322561783,
1662552116.ts
#EXT-X-PROGRAM-DATE-TIME:2022-09-07T14:02:06.935+02:00
#EXTINF:10.320075936,
1662552126.ts

...


    


    The .m3u8 file contains an accurate EXT-X-PROGRAM-DATE-TIME, but how can I extract the one for the currently playing segment?

    


    Alternatively, the file name of each .ts file is the Unix timestamp in seconds. Can I extract that somehow?

    


    If neither of those is possible, can I control the exact number of preloaded segments? I know the (approximate) segment length is 10 seconds, so I could just do the following when receiving the first AVPacket:

    


    start_time = current_time - segment_count * segment_length
start_pts = first_av_packet.pts


    


    And then, to get the time of a later AVPacket, I could do:

    


    packet_time = start_time + new_packet.pts - start_pts


    


    This wouldn't give the same accuracy since the segments are not exactly the same length, but that is okay.
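
    One detail worth noting: AVPacket::pts is expressed in the stream's time_base, not in seconds, so the delta has to be rescaled before it is added to a wall-clock time. Below is a minimal sketch of that last formula with the FFmpeg C API (an illustration, not the asker's code), assuming start_time was captured as a std::chrono::system_clock time point when the first packet arrived:

extern "C" {
#include <libavformat/avformat.h>
#include <libavutil/rational.h>
}
#include <chrono>

using Clock = std::chrono::system_clock;

// Estimate the wall-clock time of a packet from the pts of the first packet
// seen (start_pts) and the wall-clock time recorded at that moment (start_time).
// pts values are in stream->time_base units, so the delta is rescaled to seconds.
Clock::time_point packet_wall_time(const AVStream* stream,
                                   const AVPacket* pkt,
                                   int64_t start_pts,
                                   Clock::time_point start_time)
{
    double delta_seconds = (pkt->pts - start_pts) * av_q2d(stream->time_base);
    return start_time + std::chrono::duration_cast<Clock::duration>(
                            std::chrono::duration<double>(delta_seconds));
}

    The segment-count estimate for start_time (current_time - segment_count * segment_length) plugs in unchanged; only the pts arithmetic needs the time_base conversion.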

    


  • FFmpeg ERROR: Feed with name 'ffm' for stream 'm3u8' is not defined

    7 September 2022, by H.W.Kim

    I'm running ffmpeg on:

    


    Ubuntu 18.04.5 LTS
ffmpeg version 3.4.11-0ubuntu0.1


    


    I'm trying to run a video server using ffserver, so I modified /etc/ffserver.conf and added the tags below to that .conf file for HLS streaming.

    


    /etc/ffserver.conf - added part:

    <Stream test001.m3u8>
    Feed /etc/feed1.ffm
    File "/stream_src/test001/test001.m3u8"
    Format mp4
    VideoCodec mpeg2video
    VideoFrameRate 1
    VideoIntraOnly
    NoAudio
    </Stream>


    But when I ran ffserver -f /etc/ffserver.conf, it threw the error shown below:


    root@server-itsteam:/home/server# ffserver -f /etc/ffserver.conf
ffserver version 3.4.11-0ubuntu0.1 Copyright (c) 2000-2022 the FFmpeg developers
  built with gcc 7 (Ubuntu 7.5.0-3ubuntu1~18.04)
  configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-librsvg --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
  libavutil      55. 78.100 / 55. 78.100
  libavcodec     57.107.100 / 57.107.100
  libavformat    57. 83.100 / 57. 83.100
  libavdevice    57. 10.100 / 57. 10.100
  libavfilter     6.107.100 /  6.107.100
  libavresample   3.  7.  0 /  3.  7.  0
  libswscale      4.  8.100 /  4.  8.100
  libswresample   2.  9.100 /  2.  9.100
  libpostproc    54.  7.100 / 54.  7.100
/etc/ffserver.conf:164: Setting default value for video bit rate tolerance = 64000. Use NoDefaults to disable it.
/etc/ffserver.conf:164: Setting default value for video rate control equation = tex^qComp. Use NoDefaults to disable it.
/etc/ffserver.conf:164: Setting default value for video max rate = 128000. Use NoDefaults to disable it.
/etc/ffserver.conf:219: Setting default value for audio sample rate = 22050. Use NoDefaults to disable it.
/etc/ffserver.conf:219: Setting default value for audio channel count = 1. Use NoDefaults to disable it.
/etc/ffserver.conf:219: Setting default value for video bit rate tolerance = 64000. Use NoDefaults to disable it.
/etc/ffserver.conf:219: Setting default value for video rate control equation = tex^qComp. Use NoDefaults to disable it.
/etc/ffserver.conf:219: Setting default value for video max rate = 512000. Use NoDefaults to disable it.
/etc/ffserver.conf:375: Feed with name '/etc/feed1.ffm' for stream 'test001.m3u8' is not defined
Error reading configuration file '/etc/ffserver.conf': Invalid argument


    Why does this error occur? And is there anything in the syntax I wrote that is causing it?
