Other articles (56)

  • Authorizations overridden by plugins

    27 April 2010, by Mediaspip core

    autoriser_auteur_modifier() so that visitors can edit their information on the authors page

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player has been created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used.
    The HTML5 player used was created specifically for MediaSPIP: its appearance can be fully customised to match a chosen theme.
    These technologies make it possible to deliver video and sound both to conventional computers (...)

On other sites (9765)

  • FFmpeg creating multicast with multiple streams from one source (live source)

    1 May 2018, by Netheme

    Hi :) I’m facing problems when creating a multicast with multiple streams from one (live) source. The issue is that the encoding is slower than realtime, which is a big problem because I’m processing a live feed. The exact command and its output are below:

    Command

    ffmpeg -y -hwaccel cuvid -c:v h264_cuvid -vsync 0 -deint 2 -drop_second_field 1 -surfaces 10 \
      -i 'udp://@239.10.150.2:5004?fifo_size=100000000&bitrate=12000000' \
      -map i:0xc77 -r:v:0 25 -filter:v:0 fps=25,scale_npp=640:360:format=same:interp_algo=lanczos -aspect:v:0 16:9 -g:v:0 80 -vcodec:v:0 h264_nvenc -b:v:0 400K -minrate:v:0 400k -maxrate:v:0 400k \
      -map i:0xc77 -r:v:1 25 -filter:v:1 fps=25,scale_npp=720:405:format=same:interp_algo=lanczos -aspect:v:1 16:9 -g:v:1 80 -vcodec:v:1 h264_nvenc -b:v:1 1000K -minrate:v:1 1000k -maxrate:v:1 1000k \
      -map i:0xc77 -r:v:2 25 -filter:v:2 fps=25,scale_npp=1280:720:format=same:interp_algo=lanczos -aspect:v:2 16:9 -g:v:2 80 -vcodec:v:2 h264_nvenc -b:v:2 2500K -minrate:v:2 2500K -maxrate:v:2 2500K \
      -map i:0xc77 -r:v:3 25 -filter:v:3 fps=25,scale_npp=1920:1080:format=same:interp_algo=lanczos -aspect:v:3 16:9 -g:v:3 80 -vcodec:v:3 h264_nvenc -b:v:3 4500K -minrate:v:3 4500K -maxrate:v:3 4500K \
      -map i:0xc7a -ab:a:0 128k -ar:a:0 48k -acodec:a:0 aac -ac:a:0 2 \
      -map i:0xc7b -ab:a:1 128k -ar:a:1 48k -acodec:a:1 aac -ac:a:1 2 \
      -map i:0xc7c -ab:a:2 128k -ar:a:2 48k -acodec:a:2 aac -ac:a:2 2 \
      -map i:0xc7d -ab:a:3 128k -ar:a:3 48k -acodec:a:3 aac -ac:a:3 2 \
      -f mpegts "udp://@239.100.0.23:3234?overrun_nonfatal_option=1&pkt_size=1316"

    And the output (the speed should be at least 1x, I hope):

    Stream #0:0 -> #0:0 (h264 (h264_cuvid) -> h264 (h264_nvenc))
    Stream #0:0 -> #0:1 (h264 (h264_cuvid) -> h264 (h264_nvenc))
    Stream #0:0 -> #0:2 (h264 (h264_cuvid) -> h264 (h264_nvenc))
    Stream #0:0 -> #0:3 (h264 (h264_cuvid) -> h264 (h264_nvenc))
    Stream #0:1 -> #0:4 (mp2 (native) -> aac (native))
    Stream #0:2 -> #0:5 (mp2 (native) -> aac (native))
    Stream #0:3 -> #0:6 (mp2 (native) -> aac (native))
    Stream #0:4 -> #0:7 (mp2 (native) -> aac (native))
    Press [q] to stop, [?] for help
    Output #0, mpegts, to 'udp://@239.100.0.23:3234?overrun_nonfatal_option=1&pkt_size=1316':
    Metadata:
      encoder         : Lavf57.83.100
      Stream #0:0: Video: h264 (h264_nvenc) (Main), cuda, 640x360 [SAR 1:1 DAR 16:9], q=-1--1, 400 kb/s, 25 fps, 90k tbn, 25 tbc
      Metadata:
        encoder         : Lavc57.107.100 h264_nvenc
      Side data:
        cpb: bitrate max/min/avg: 400000/0/400000 buffer size: 800000 vbv_delay: -1
      Stream #0:1: Video: h264 (h264_nvenc) (Main), cuda, 720x405 [SAR 1:1 DAR 16:9], q=-1--1, 1000 kb/s, 25 fps, 90k tbn, 25 tbc
      Metadata:
        encoder         : Lavc57.107.100 h264_nvenc
      Side data:
        cpb: bitrate max/min/avg: 1000000/0/1000000 buffer size: 2000000 vbv_delay: -1
      Stream #0:2: Video: h264 (h264_nvenc) (Main), cuda, 1280x720 [SAR 1:1 DAR 16:9], q=-1--1, 2500 kb/s, 25 fps, 90k tbn, 25 tbc
      Metadata:
        encoder         : Lavc57.107.100 h264_nvenc
      Side data:
        cpb: bitrate max/min/avg: 2500000/0/2500000 buffer size: 5000000 vbv_delay: -1
      Stream #0:3: Video: h264 (h264_nvenc) (Main), cuda, 1920x1080 [SAR 1:1 DAR 16:9], q=-1--1, 4500 kb/s, 25 fps, 90k tbn, 25 tbc
      Metadata:
        encoder         : Lavc57.107.100 h264_nvenc
      Side data:
        cpb: bitrate max/min/avg: 4500000/0/4500000 buffer size: 9000000 vbv_delay: -1
      Stream #0:4(cze): Audio: aac (LC), 48000 Hz, stereo, fltp, 128 kb/s
      Metadata:
        encoder         : Lavc57.107.100 aac
      Stream #0:5(eng): Audio: aac (LC), 48000 Hz, stereo, fltp, 128 kb/s
      Metadata:
        encoder         : Lavc57.107.100 aac
      Stream #0:6(hun): Audio: aac (LC), 48000 Hz, stereo, fltp, 128 kb/s
      Metadata:
        encoder         : Lavc57.107.100 aac
      Stream #0:7(ron): Audio: aac (LC), 48000 Hz, stereo, fltp, 128 kb/s
      Metadata:
        encoder         : Lavc57.107.100 aac
    frame= 2868 fps= 19 q=18.0 Lq=14.0 q=15.0 q=16.0 size=  134155kB time=00:01:55.88 bitrate=9484.0kbits/s speed=0.768x

    Does anyone have experience with something like this?
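
    A minimal diagnostic sketch, not part of the original post: before tuning the command above, it can help to measure which stage falls below realtime. The input URL, stream id 0xc77 and the filter chain are copied from the command above; -benchmark, the null muxer and the single-rendition reduction are editor assumptions about how one might isolate the bottleneck.

    # 1) Decode-only check: can h264_cuvid alone keep speed >= 1x on this feed?
    #    (Frames are copied back to system memory here, unlike the full pipeline.)
    ffmpeg -benchmark -c:v h264_cuvid \
      -i 'udp://@239.10.150.2:5004?fifo_size=100000000&bitrate=12000000' \
      -f null -

    # 2) Single-rendition check: decode + one scale_npp + one h264_nvenc encode,
    #    discarding the output. Repeat with the other renditions, then add them
    #    back one by one, to see which combination drops the speed below 1x.
    ffmpeg -benchmark -hwaccel cuvid -c:v h264_cuvid \
      -i 'udp://@239.10.150.2:5004?fifo_size=100000000&bitrate=12000000' \
      -map i:0xc77 -filter:v fps=25,scale_npp=1920:1080:interp_algo=lanczos \
      -c:v h264_nvenc -b:v 4500k -f null -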

  • ffmpeg - Different number of macroblocks per frame

    18 June 2018, by fabridigua

    I am studying MPEG compression, in particular how macroblocks are "labeled" by ffmpeg, using the command:

    ffmpeg -debug mb_type -i input.mp4 out.mp4

    From theory, I know that a macroblock is typically a 16x16-pixel block.

    So if a frame is e.g. 1920x1080, I suppose the number of macroblocks is

    (1920*1080)/(16*16) = 8100

    Now, analysing the ffmpeg report, I’ve seen that each frame has far fewer macroblocks (labelled with characters indicating their type).

    Can you explain why this happens?
    Is there an error in my reasoning?

    I’ve seen that when "visualizing" macroblocks with -debug vis_mb_type there are some gray blocks, corresponding to the (macro)blocks that do not satisfy any of the criteria ffmpeg is looking for. Maybe in my example each frame has

    8100 - #"gray" macroblocks

    Is this possible? And why?
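
    A small arithmetic note, not from the original question: the 8100 figure assumes both dimensions divide evenly by 16, but 1080 does not (1080 / 16 = 67.5). The encoder rounds each dimension up to a whole number of macroblocks, so a 1920x1080 frame is coded as 120 x 68 = 8160 macroblocks, with the extra rows cropped on display. The helper below is a hypothetical sketch of that rounding.

    # Hypothetical helper: macroblocks needed for a WxH frame, rounding each
    # dimension up to a whole number of 16x16 blocks.
    mb_count() {
      w=$1; h=$2
      echo $(( ((w + 15) / 16) * ((h + 15) / 16) ))
    }

    mb_count 1920 1080   # 120 * 68 = 8160 (not 8100)
    mb_count 1280 720    # 80 * 45 = 3600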

  • ffmpeg cross fade 2 pictures in 1080p

    11 May 2018, by user3574681

    I use this command to cross-fade 2 pictures, and it works perfectly with 1024 × 768 test pictures.

    ffmpeg -loop 1 -i 1-test.jpg -loop 1 -i 2-test.jpg -filter_complex "[1:v][0:v]blend=all_expr='A*(if(gte(T,3),1,T/3))+B*(1-(if(gte(T,3),1,T/3)))'" -t 4 frames_%04d.png

    But with my original pictures at 1920 × 1080, this error shows up:

    [Parsed_blend_0 @ 0x97c8240] First input link top parameters (size 1920x1080, SAR 0:1) do not match the corresponding second input link bottom parameters (1920x1080, SAR 1:1)
    [Parsed_blend_0 @ 0x97c8240] Failed to configure output pad on Parsed_blend_0

    Why is this happening, and how can I fix it?
    Thank you for helping!
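
    One possible fix, offered as an assumption rather than taken from the thread: the error says the two inputs reach blend with different sample aspect ratios (SAR 0:1, i.e. undefined, versus 1:1), so normalising the SAR of both inputs with setsar before blending should make the filter links match.

    # Same command as above, with setsar=1 applied to both inputs (assumed fix).
    ffmpeg -loop 1 -i 1-test.jpg -loop 1 -i 2-test.jpg \
      -filter_complex "[0:v]setsar=1[bg];[1:v]setsar=1[fg];[fg][bg]blend=all_expr='A*(if(gte(T,3),1,T/3))+B*(1-(if(gte(T,3),1,T/3)))'" \
      -t 4 frames_%04d.png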