
Media (91)

Other articles (45)

  • Updating from version 0.1 to 0.2

    24 June 2013, by

    An explanation of the notable changes when upgrading MediaSPIP from version 0.1 to version 0.2. What's new
    Software dependencies: the latest versions of FFmpeg (>= v1.2.1) are used; the dependencies for Smush are installed; MediaInfo and FFprobe are installed to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php is no longer installed, as it is no longer maintained (...)

  • Customising by adding your logo, banner, or background image

    5 September 2013, by

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present the changes to your MediaSPIP, or news about your projects, on your MediaSPIP via the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the form used to create a news item.
    News-item creation form: for a document of the news-item type, the default fields are: publication date (customise the publication date) (...)

On other sites (9437)

  • How to play video with Media Source Extensions when the audio start is delayed? Or how to fix it with ffmpeg?

    11 December 2020, by sheodox

    I have a video that I'm splitting into its individual video/audio streams and then DASHing with MP4Box; I'm playing them with Media Source Extensions, appending byte ranges from the MPD files to the video/audio source buffers. It's all working nicely, but one video has audio that is delayed by about 1.1 seconds. I couldn't get it to sync up, and the audio would always play ahead of the video.
    Currently I'm setting audioBuffer.timestampOffset = 1.1, and that gets it to sync up perfectly. The issue I'm running into now, though, is that the video refuses to play unless the audio source buffer has data, so the video stalls right away. If I skip a few seconds ahead (past the offset), everything works, because both video and audio are buffered.
    Is there a way to get around this? Either make it play without the audio loaded, somehow fill the audio buffer with silence (can I generate something with the Web Audio API?), add silence to the audio file in ffmpeg? Something else?
    I first tried adding a delay in ffmpeg with ffmpeg -i video.mkv -map 0:a:0 -acodec aac -af "adelay=1.1s:all=true" out.aac, but nothing seemed to change. Was I doing something wrong? Is there a better way to demux audio while keeping exactly the same timing it had in the container with the video, so I don't have to worry about delays/offsets at all?
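    One possibility for the ffmpeg attempt above, offered as an assumption rather than a confirmed diagnosis: adelay has traditionally taken its delays in milliseconds (so "1.1s" may not parse as intended on all builds), and a raw ADTS .aac output carries no container timestamps. A minimal variant worth trying (video.mkv and out.m4a are placeholder names; the all option requires a reasonably recent FFmpeg, older builds want the delay repeated per channel, e.g. adelay=1100|1100):

```shell
# Insert 1100 ms of actual silence at the start of the audio stream, then
# encode to AAC. Because adelay writes real samples (not just a timestamp
# shift), the delay survives demuxing and DASH packaging. Writing to .m4a
# keeps the result in an MP4 container rather than a raw ADTS stream.
ffmpeg -i video.mkv -map 0:a:0 -af "adelay=1100:all=1" -c:a aac out.m4a
```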

  • FFmpeg - Generate VTT File From Sprite, Using Spatial Media Fragment

    20 October 2019, by David

    Hi, I am looking to create a .VTT file from a sprite that I have generated using FFmpeg.

    FFmpeg command:

    $"-i {inputMediaFile} -vf \"select = not(mod(n\\, 30)),scale = 120:80,tile = 8x8\" -an -vsync 0 {outputMediaFile}"

    This selects every 30th frame, scales it to 120x80 pixels, and tiles the frames into an 8x8 grid in the output image.

    I would like to generate a .VTT from the resulting image in C#; I know the height and width of the individual images in the sprite (120x80), and there are 64 images in total in the output image.

    From this I need to produce a VTT like this:

    WEBVTT

    1
    00:00:00.000 --> 00:00:01.000
    test-00001.jpg#xywh=0,0,120,80

    2
    00:00:01.000 --> 00:00:02.000
    test-00001.jpg#xywh=120,0,120,80

    3
    00:00:02.000 --> 00:00:03.000
    test-00001.jpg#xywh=240,0,120,80

    4
    00:00:03.000 --> 00:00:04.000
    test-00001.jpg#xywh=360,0,120,80

    5
    00:00:04.000 --> 00:00:05.000
    test-00001.jpg#xywh=480,0,120,80

    6
    00:00:05.000 --> 00:00:06.000
    test-00001.jpg#xywh=600,0,120,80

    7
    00:00:06.000 --> 00:00:07.000
    test-00001.jpg#xywh=720,0,120,80

    8
    00:00:07.000 --> 00:00:08.000
    test-00001.jpg#xywh=840,0,120,80

    9
    00:00:08.000 --> 00:00:09.000
    test-00001.jpg#xywh=0,80,120,80

    There are also situations where there are n sprite files.

    I'm hoping there may be a library out there that can handle this, or better still a way to keep it contained within FFmpeg, though based on the FFmpeg docs I don't think that is possible.

    Thanks in advance if anyone has any ideas; it's doable, as I've seen Node.js and Ruby examples.
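    Lacking a built-in FFmpeg option for this, a small script is the usual route. A minimal POSIX-shell sketch; the sprite name, 8x8 grid, 120x80 tile size, and one-second cue interval are assumptions taken from the sample above, so adjust them to your actual layout:

```shell
# Sketch: generate a WEBVTT thumbnail track for a single sprite sheet.
ts() {
  # Format a second count as an hh:mm:ss.mmm WebVTT timestamp (< 1 hour).
  printf '00:%02d:%02d.000' $(($1 / 60)) $(($1 % 60))
}

make_vtt() {
  sprite=$1 cols=$2 rows=$3 w=$4 h=$5
  printf 'WEBVTT\n'
  i=0
  while [ "$i" -lt $((cols * rows)) ]; do
    printf '\n%d\n' $((i + 1))                          # cue identifier
    printf '%s --> %s\n' "$(ts "$i")" "$(ts $((i + 1)))"
    # Tiles are laid out row-major: x walks the columns, y the rows.
    printf '%s#xywh=%d,%d,%d,%d\n' "$sprite" \
      $(((i % cols) * w)) $(((i / cols) * h)) "$w" "$h"
    i=$((i + 1))
  done
}

make_vtt test-00001.jpg 8 8 120 80 > thumbs.vtt
```

    For n sprite files, the same function can be called in a loop, offsetting the cue start time and substituting each sheet's filename.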

  • How to fade video using a custom curve function in ffmpeg?

    2 April 2018, by Zoltan

    I would like to fade out a video using a different transition curve than what the fade filter uses (which is probably linear).

    For audio, I can easily choose from a wide variety of curves that the afade filter provides, or I can supply a custom expression to the volume filter, like (sin(PI/2 * min(1\, max(-1\, 1/2 * (t - 3)))) + 1)/2. For video, however, I could not find similar possibilities.

    Based on its description, it seems to me that the geq filter could be misused to achieve this, but I couldn’t get it to work. Additionally, it is very slow, probably because it works on individual pixels, even though the expression only depends on time.
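    For what it's worth, the geq route can be written down, with the caveats noted. An untested sketch, assuming 8-bit YUV (where 128 is neutral chroma) and placeholder filenames in.mp4/out.mp4; the fade factor is the sine curve quoted above for the volume filter, with t replaced by geq's frame-time variable T:

```shell
# The factor ((sin(PI/2*min(1,max(-1,0.5*(T-3))))+1)/2) rises smoothly
# from 0 to 1 between t=1s and t=5s (a fade-in); for a fade-out, use
# (1 - factor) instead. Luma is scaled by the factor and chroma is pulled
# toward neutral 128, so the frame fades to black along the custom curve.
# geq still evaluates the expression once per pixel, hence the slowness.
ffmpeg -i in.mp4 -vf "geq=\
lum='lum(X,Y)*((sin(PI/2*min(1,max(-1,0.5*(T-3))))+1)/2)':\
cb='128+(cb(X,Y)-128)*((sin(PI/2*min(1,max(-1,0.5*(T-3))))+1)/2)':\
cr='128+(cr(X,Y)-128)*((sin(PI/2*min(1,max(-1,0.5*(T-3))))+1)/2)'" \
-c:a copy out.mp4
```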