Other articles (62)

  • Encoding and transformation into formats readable on the Internet

    10 April 2011

    MediaSPIP transforms and re-encodes uploaded documents in order to make them readable on the Internet and automatically usable without any intervention from the content creator.
    Videos are automatically encoded into the formats supported by HTML5: MP4, OGV and WebM. The "MP4" version is also used for the fallback Flash player needed by older browsers.
    Audio documents are likewise re-encoded into the two formats usable with HTML5: MP3 and Ogg. The "MP3" version (...)
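
    A minimal sketch (not MediaSPIP's actual commands) of what such a transcode can look like with ffmpeg; the source file names, codec choices and absence of quality settings are assumptions for illustration only:

    # Video: one HTML5-compatible output per container/codec pair
    ffmpeg -i source.mov -c:v libx264 -c:a aac -movflags +faststart video.mp4
    ffmpeg -i source.mov -c:v libtheora -c:a libvorbis video.ogv
    ffmpeg -i source.mov -c:v libvpx -c:a libvorbis video.webm
    # Audio: the two HTML5 formats mentioned above
    ffmpeg -i source.wav -c:a libmp3lame audio.mp3
    ffmpeg -i source.wav -c:a libvorbis audio.ogg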

  • Permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

  • APPENDIX: Plugins used specifically for the farm

    5 March 2010, by

    The central/master site of the farm needs several additional plugins, compared with the channels, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to handle registrations and requests to create a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-verification API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

On other sites (9516)

  • ffmpeg: How does one designate what parts of an overlay video stream let the underlay video stream show through?

    21 February 2023, by Walt Howard

    Stream 0 is a rotating planet created using povray. (https://www.povray.org/)

    Stream 1 is just a static jpeg of stars.

    I'm using overlay like this:

    nice ffmpeg -i protoplanet.mp4 -i stars.jpg \
         -filter_complex "[0:v]scale=1024:768[scaled];[1:v][scaled]overlay=0:0" \
         -movflags faststart -pix_fmt yuv420p -r 10 planet.mp4


    I had it working when I used the output from povray directly. I didn't have to know why that worked, because it just did. However, after adding some post processing to the planet video, the entire video has no alpha channel (educated guess) so the background stream (Stream 1) cannot show through.

    The post-processing I did was this (which works great): https://www.reddit.com/r/ffmpeg/comments/im2mkp/creating_a_retro_glow_effect_with_ffmpeg/

    But that made the video unable to take an overlay background, possibly because it destroyed the alpha channel and turned many of the blacks into dark grey.

    I can merge the pure POVRAY output and the background and then add the glow effect, but it adds the effect to the background stars and captions also which ruins the effect to some degree. I want to glo-ify the planet first, then stick it on a pure starfield background.

    In thinking this over I may have to recreate the alpha channel after adding the glow effect, using a nearest match to black and dark grey to alpha.

    Hmmm. It might be a codec issue as I didn't specify any -c:v in any of my commands....
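
    A minimal sketch of the re-keying idea above (turning near-black pixels back into transparency after the glow pass, then overlaying onto the starfield); the glowing_planet.mp4 name and the colorkey similarity/blend values are assumptions to tune, not tested settings:

    # colorkey makes pixels close to the key colour transparent, which rebuilds
    # the alpha channel the glow pass removed; raise 0.12/0.2 until the grey
    # background drops out without eating into the planet itself.
    nice ffmpeg -i glowing_planet.mp4 -i stars.jpg \
         -filter_complex "[0:v]scale=1024:768,colorkey=black:0.12:0.2[keyed];[1:v][keyed]overlay=0:0" \
         -movflags faststart -pix_fmt yuv420p -r 10 planet_on_stars.mp4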

  • MSE does not show duration when MPEG-DASH segments are created with FFmpeg

    15 March 2023, by WaveLink

    I am trying to create a system that streams DASH segments to the browser, where they are decoded by Media Source Extensions. When using MP4Box, the init segment loads the duration into the player, but the version generated by FFmpeg does not.

    I believe this is due to some sort of header missing from the init segment generated by FFmpeg, but I have not been able to figure it out. Here is the command I use to generate the files:

    ffmpeg -i .\source.opus -c:a libopus -c:v none -seg_duration 1 -dash_segment_type mp4 -preset ultrafast -use_timeline 1 -use_template 1 -streaming 1 -init_seg_name "segment_init.mp4" -media_seg_name "segment_$Number$.m4s" -fflags +genpts -f dash ./out.mpd

    When using MP4Box, this is the config I use:

    MP4Box -dash 1000 -rap -profile live -segment-name "segment_$Number" .\out.mp4

    In both cases, the audio does play, and both of them contain similar headers, but the FFmpeg version just does not contain the media duration in the init segment, or at least the MSE implementation cannot read it.

    I tested this in Chrome, Firefox, and Edge, and all of them behave the same.

    Thanks for any help.
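
    One way to test the missing-header theory above: for fragmented MP4, MSE takes the initial duration from the init segment's mvhd box or from a mehd (Movie Extends Header) box inside mvex, which MP4Box writes but an on-the-fly FFmpeg muxer may not. A small diagnostic sketch, assuming the segment_init.mp4 name from the FFmpeg command above and that ffprobe and MP4Box are installed; it only inspects the file, it does not fix anything:

    # If this prints no duration (N/A), the init segment carries no usable
    # mvhd/mehd duration and MSE has nothing to display.
    ffprobe -v error -show_entries format=duration -of default=nw=1 segment_init.mp4
    # MP4Box prints the movie info it can read from the same file, for comparison.
    MP4Box -info segment_init.mp4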

  • Subtitles added to MKV with ffmpeg don't show

    15 April 2023, by Efraín

    I have a weird problem and I don't quite understand what's happening. Basically, I have an MKV with 2 subtitle tracks and I want to add a new one, so after searching a little I put together this command:

    ffmpeg -i "file v.mkv" -i "file s.ass" -c copy -map 0 -map 1 -metadata:s:s:2 language=eng -metadata:s:s:2 handler_name=English -metadata:s:s:2 title=English-dl output.mkv


    But when I open output.mkv and select the 3rd subtitle track, it doesn't show the subtitles. I used MKVExtract to extract the track, and it is indeed the same .ass file that I merged, so I don't know why it doesn't show.

    I used this other command as an example to build the previous one:

    ffmpeg -i "file v.mkv" -i "file s.ass" -c copy -map 0 -dn -map "-0:s" -map "-0:d" -map "1:0" "-metadata:s:s:0" "language=eng" "-metadata:s:s:0" "handler_name=English"  "-metadata:s:s:0" "title=English-dl" "output.mkv"


    And this one correctly shows the subtitles, but it erases the 2 original tracks of the MKV (which I don't want to happen). What is the difference here that makes the second one show the subtitles but the first one not? Do I have to lose the original subs for this?
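
    A sketch of one variation worth trying while keeping every original track; whether the player really only auto-selects a subtitle track flagged as default is only a guess, and the -disposition option with the s:s:2 specifier is the single change from the first command:

    # Same mapping as the first command, but additionally mark the added track
    # as the default subtitle stream so players that auto-select only the
    # default track will pick it up.
    ffmpeg -i "file v.mkv" -i "file s.ass" -c copy -map 0 -map 1 \
           -metadata:s:s:2 language=eng -metadata:s:s:2 handler_name=English \
           -metadata:s:s:2 title=English-dl -disposition:s:s:2 default output.mkv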