
Other articles (70)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, as announced here.
    The zip file provided here contains only the MediaSPIP sources in standalone form.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you wish to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (13016)

  • kmsgrab : Use GetFB2 if available

    5 July 2020, by Mark Thompson
    kmsgrab : Use GetFB2 if available
    

    The most useful feature here is the ability to automatically extract the
    framebuffer format and modifiers. It also makes support for multi-plane
    framebuffers possible, though none are added to the format table in this
    patch.

    This requires libdrm 2.4.101 (from April 2020) to build, so it includes a
    configure check to allow compatibility with existing distributions. Even
    with libdrm support, it still won't do anything at runtime if you are
    running Linux < 5.7 (before June 2020).
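
    As a usage illustration (not from the patch itself): on a build with GetFB2 support, a capture command along the lines of the documented kmsgrab/VAAPI example should no longer need the framebuffer format forced by hand. The device path, resolution and encoder below are assumptions, not taken from the commit.

    # Hedged sketch: kmsgrab discovers the framebuffer format and modifier via
    # GetFB2, so -format / -format_modifier can usually be left out. Needs
    # CAP_SYS_ADMIN (or root), Linux >= 5.7 and libdrm >= 2.4.101.
    ffmpeg -device /dev/dri/card0 -f kmsgrab -i - \
           -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1920:h=1080:format=nv12' \
           -c:v h264_vaapi -t 10 output.mp4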

    • [DH] configure
    • [DH] libavdevice/kmsgrab.c
  • ffmpeg extract frame and get timestamp based on creation time for each extracted frame in exif or txt

    4 August 2021, by Scaling

    I am trying to extract frames from an underwater survey video using ffmpeg. The extracted frames should later be merged with a GPS position based on a timestamp. The command I use to extract the frames is:


    ffmpeg -i GOPR0173.MP4 -qscale:v 2 -r 1 frames/%4d.jpg


    I need a way to get the real timestamp of each frame based on the survey date and time, so that each frame's timestamp equals the real time at which it was recorded. Ideally the timestamp is saved in the image's EXIF data or, if that is not possible, in a separate text file, and the timestamp is used as the file name, e.g. 2020_01_01_HH_MM_SS.jpg.


    My idea is to take the creation_time of the video, add the frame's offset to it, and store the result in the EXIF data of the extracted frame or, failing that, in a txt file.


    I am new to ffmpeg, really lost, and would appreciate any help with this task! If anyone could provide the necessary code, that would be awesome!


    C:\Unterwasserfotos>ffmpeg -i GOPR0173.MP4 -vstats -qscale:v 2 -r 0.1 frames/%4d.jpg
    ffmpeg version git-2020-01-21-fc6fde2 Copyright (c) 2000-2020 the FFmpeg developers
      built with gcc 9.2.1 (GCC) 20200122
      configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --enable-libmfx --enable-ffnvcodec --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
      libavutil      56. 38.100 / 56. 38.100
      libavcodec     58. 66.100 / 58. 66.100
      libavformat    58. 35.104 / 58. 35.104
      libavdevice    58.  9.103 / 58.  9.103
      libavfilter     7. 71.100 /  7. 71.100
      libswscale      5.  6.100 /  5.  6.100
      libswresample   3.  6.100 /  3.  6.100
      libpostproc    55.  6.100 / 55.  6.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'GOPR0173.MP4':
      Metadata:
        major_brand     : mp41
        minor_version   : 538120216
        compatible_brands: mp41
        creation_time   : 2020-01-21T12:33:10.000000Z
        firmware        : HD4.02.05.00.00
      Duration: 00:02:21.85, start: 0.000000, bitrate: 30152 kb/s
        Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 30000 kb/s, 50 fps, 50 tbr, 90k tbn, 100 tbc (default)
        Metadata:
          creation_time   : 2020-01-21T12:33:10.000000Z
          handler_name    :         GoPro AVC
          encoder         : GoPro AVC encoder
          timecode        : 12:33:11:16
        Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 127 kb/s (default)
        Metadata:
          creation_time   : 2020-01-21T12:33:10.000000Z
          handler_name    :         GoPro AAC
          timecode        : 12:33:11:16
        Stream #0:2(eng): Data: none (tmcd / 0x64636D74) (default)
        Metadata:
          creation_time   : 2020-01-21T12:33:10.000000Z
          handler_name    :         GoPro TCD
          timecode        : 12:33:11:16
        Stream #0:3(eng): Data: none (fdsc / 0x63736466), 12 kb/s (default)
        Metadata:
          creation_time   : 2020-01-21T12:33:10.000000Z
          handler_name    :         GoPro SOS
    Stream mapping:
      Stream #0:0 -> #0:0 (h264 (native) -> mjpeg (native))
    Press [q] to stop, [?] for help
    Output #0, image2, to 'frames/%4d.jpg':
      Metadata:
        major_brand     : mp41
        minor_version   : 538120216
        compatible_brands: mp41
        firmware        : HD4.02.05.00.00
        encoder         : Lavf58.35.104
        Stream #0:0(eng): Video: mjpeg, yuvj420p(pc), 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 200 kb/s, 0.10 fps, 0.10 tbn, 0.10 tbc (default)
        Metadata:
          creation_time   : 2020-01-21T12:33:10.000000Z
          handler_name    :         GoPro AVC
          timecode        : 12:33:11:16
          encoder         : Lavc58.66.100 mjpeg
        Side data:
          cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
    frame=   16 fps=0.3 q=2.0 Lsize=N/A time=00:02:40.00 bitrate=N/A dup=0 drop=7076 speed=3.26x
    video:1938kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
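
    A rough outline of the idea described above, untested and resting on a few assumptions (a GNU date binary, an FFmpeg build whose image2 muxer supports the frame_pts option, and extraction at exactly one frame per second):

    # Read the container creation_time and convert it to epoch seconds.
    created=$(ffprobe -v error -show_entries format_tags=creation_time \
              -of default=noprint_wrappers=1:nokey=1 GOPR0173.MP4)
    base=$(date -u -d "$created" +%s)

    # -frame_pts 1 makes the image2 muxer name each file after its PTS, which
    # at -r 1 output is simply the frame's offset in whole seconds.
    ffmpeg -i GOPR0173.MP4 -qscale:v 2 -r 1 -frame_pts 1 frames/%d.jpg

    # Rename every frame to creation_time + offset.
    for f in frames/*.jpg; do
        offset=$(basename "$f" .jpg)
        stamp=$(date -u -d "@$((base + offset))" +%Y_%m_%d_%H_%M_%S)
        mv "$f" "frames/${stamp}.jpg"
    done

    Writing the same stamp into the images' EXIF data would likely need an external tagger such as exiftool on top of this.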

  • Concat mp4 videos and merge their audios to the final output

    24 August 2020, by Yevhenii Molodtsov

    I have several videos and photos and need to merge them with a cross-dissolve effect. The algorithm is as follows:


    1. Create videos from images and add silent audio to them (so they will also have a sound stream):

    ffmpeg -y -f lavfi -i anullsrc -loop 1 -i /tmp/media/import-2020-Aug-19-Wednesday-05-40-34/ea5c93fd-d946-4742-b8f7-ea9ae4d43441.jpg -c:v libx264 -t 10 -pix_fmt yuv420p -vf scale=750:1280 /tmp/media/import-2020-Aug-19-Wednesday-05-40-34/ea5c93fd-d946-4742-b8f7-ea9ae4d43441.mp4


    2. Combine all the videos and audios into one using this command:

    ffmpeg
    -i /tmp/media/import-2020-Aug-19-Wednesday-05-40-34/temp_68d437c0-f5e2-4651-b07e-91533480b6ef.mp4
    -i /tmp/media/import-2020-Aug-19-Wednesday-05-40-34/temp_48f3c111-610d-40c7-ac71-6ce2fbb16184.mp4
    -i /tmp/media/import-2020-Aug-19-Wednesday-05-40-34/temp_1593b5d8-7e16-417d-9372-2267581cd504.mp4
    -i /tmp/media/import-2020-Aug-19-Wednesday-05-40-34/temp_1ac7f6be-1b12-4e31-b904-1491cc9b9494.mp4
    -i /tmp/media/import-2020-Aug-19-Wednesday-05-40-34/temp_ea5c93fd-d946-4742-b8f7-ea9ae4d43441.mp4
    -filter_complex
    "[0:v]trim=start=0:end=8.032,setpts=PTS-STARTPTS[clip0];
    [1:v]trim=start=2:end=13.047,setpts=PTS-STARTPTS[clip1];
    [2:v]trim=start=2:end=13.558,setpts=PTS-STARTPTS[clip2];
    [3:v]trim=start=2:end=13.186,setpts=PTS-STARTPTS[clip3];
    [4:v]trim=start=2,setpts=PTS-STARTPTS[clip4];
    [0:v]trim=start=9.032:end=10.032,setpts=PTS-STARTPTS[out0];
    [1:v]trim=start=14.047:end=15.047,setpts=PTS-STARTPTS[out1];
    [2:v]trim=start=14.558:end=15.558,setpts=PTS-STARTPTS[out2];
    [3:v]trim=start=14.186:end=15.186,setpts=PTS-STARTPTS[out3];
    [1:v]trim=start=0:end=2,setpts=PTS-STARTPTS[in1];
    [2:v]trim=start=0:end=2,setpts=PTS-STARTPTS[in2];
    [3:v]trim=start=0:end=2,setpts=PTS-STARTPTS[in3];
    [4:v]trim=start=0:end=2,setpts=PTS-STARTPTS[in4];
    [in1]format=pix_fmts=yuva420p,fade=t=in:st=0:d=2:alpha=1[fadein1];
    [in2]format=pix_fmts=yuva420p,fade=t=in:st=0:d=2:alpha=1[fadein2];
    [in3]format=pix_fmts=yuva420p,fade=t=in:st=0:d=2:alpha=1[fadein3];
    [in4]format=pix_fmts=yuva420p,fade=t=in:st=0:d=2:alpha=1[fadein4];
    [out0]format=pix_fmts=yuva420p,fade=t=out:st=0:d=2:alpha=1[fadeout0];
    [out1]format=pix_fmts=yuva420p,fade=t=out:st=0:d=2:alpha=1[fadeout1];
    [out2]format=pix_fmts=yuva420p,fade=t=out:st=0:d=2:alpha=1[fadeout2];
    [out3]format=pix_fmts=yuva420p,fade=t=out:st=0:d=2:alpha=1[fadeout3];
    [fadein1]fifo[fadein1fifo];
    [fadein2]fifo[fadein2fifo];
    [fadein3]fifo[fadein3fifo];
    [fadein4]fifo[fadein4fifo];
    [fadeout0]fifo[fadeout0fifo];
    [fadeout1]fifo[fadeout1fifo];
    [fadeout2]fifo[fadeout2fifo];
    [fadeout3]fifo[fadeout3fifo];
    [fadeout0fifo][fadein1fifo]overlay[crossfade0];
    [fadeout1fifo][fadein2fifo]overlay[crossfade1];
    [fadeout2fifo][fadein3fifo]overlay[crossfade2];
    [fadeout3fifo][fadein4fifo]overlay[crossfade3];
    [clip0][crossfade0][clip1][crossfade1][clip2][crossfade2][clip3][crossfade3][clip4]concat=n=9[output];
    [0:a][1:a]acrossfade=d=10:c1=tri:c2=tri[A1];
    [A1][2:a]acrossfade=d=10:c1=tri:c2=tri[A2];
    [A2][3:a]acrossfade=d=10:c1=tri:c2=tri[A3];
    [A3][4:a]acrossfade=d=10:c1=tri:c2=tri[audio]"
    -vsync 0 -map "[output]" -map "[audio]" /tmp/media/final/some_filename_d0d2aab0-792a-4540-b2d3-e64abe98bf5c.mp4


    And it all works pretty well, but if I have, for example:


    • picture
    • video
    • video
    • picture

    Then the sound from the second video is mapped to the first picture, the sound from the third video to the second video, and the third video actually ends up without sound.


    It seems like this happens because the silent audio of the first picture is pretty short. Am I right? If so, how can I increase its duration?


    I would much appreciate any help with this!
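
    For what it is worth, here is a possible variant of the step-1 command that makes the length of the generated silence explicit; it is untested, and the duration, sample rate, channel layout and file names are assumptions to be matched to the real clips:

    # Bound the anullsrc input itself and match the real clips' audio parameters,
    # so acrossfade has a full-length silent track on the picture-based segments.
    ffmpeg -y \
        -f lavfi -t 10 -i anullsrc=channel_layout=stereo:sample_rate=48000 \
        -loop 1 -framerate 25 -i picture.jpg \
        -t 10 -c:v libx264 -pix_fmt yuv420p -vf scale=750:1280 -c:a aac \
        picture.mp4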
