
Other articles (53)
-
General document management
13 May 2011 — MediaSPIP never modifies the original document that is put online.
For each uploaded document it performs two successive operations: creating an additional version that can easily be viewed online, while keeping the original available for download in case the original document cannot be read in a web browser; and retrieving the metadata of the original document in order to describe the file textually.
The tables below explain what MediaSPIP can do (...)
-
Sites built with MediaSPIP
2 May 2011 — This page presents a few of the sites running MediaSPIP.
You can of course add yours using the form at the bottom of the page.
-
APPENDIX: Extensions, the SPIP plugins of the channels
11 February 2010 — A plugin is a functional addition to the SPIP core. MediaSPIP is a deliberate selection of plugins, whether or not they previously existed in the SPIP community; some of them required either being created from scratch or having features added.
The extensions MediaSPIP requires in order to work
Since version 2.1.0, SPIP allows plugins to be added to the extensions/ directory.
"Extensions" are nothing more nor less than plugins whose particularity is that they (...)
On other sites (5772)
-
How to play video with Media Source Extensions when the audio start is delayed? Or how to fix it with ffmpeg?
11 December 2020, by sheodox — I have a video that I'm splitting into separate video/audio streams and then DASHing with MP4Box, and I'm playing them with Media Source Extensions, appending byte ranges from the MPD files to the video/audio source buffers. It's all working nicely, but one video I have has audio that is delayed by about 1.1 seconds. I couldn't get it to sync up; the audio would always play ahead of the video.


Currently I'm trying to set audioBuffer.timestampOffset = 1.1, and that gets it to sync up perfectly. The issue I'm running into now, though, is that the video refuses to play unless the audio source buffer has data, so the video stalls right away. If I skip a few seconds ahead (past the offset), everything works because both video and audio are buffered.
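Roughly, the setup being described looks like the sketch below; the element lookup, codec strings, and segment URLs are placeholders rather than details from the original post.

const video = document.querySelector('video')!;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  // Codec strings are examples; use whatever the real MP4Box output declares.
  const videoBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f"');
  const audioBuffer = mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');

  // Shift everything appended to the audio buffer 1.1 s later on the media timeline.
  audioBuffer.timestampOffset = 1.1;

  // Hypothetical init segments; the byte-range appends driven by the MPD follow the same pattern.
  videoBuffer.appendBuffer(await (await fetch('dash/video-init.mp4')).arrayBuffer());
  audioBuffer.appendBuffer(await (await fetch('dash/audio-init.mp4')).arrayBuffer());
});

With only the offset set, nothing covers the first 1.1 s of the audio timeline, which matches the stall described above.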

Is there a way to get around this? Either make it play without the audio loaded, or somehow fill the audio buffer with silence (can I generate something with the Web Audio API)? Or add silence to the audio file in ffmpeg? Something else?


I first tried adding a delay in ffmpeg with
ffmpeg -i video.mkv -map 0:a:0 -acodec aac -af "adelay=1.1s:all=true" out.aac
but nothing seemed to change. Was I doing something wrong? Is there a better way to demux the audio while keeping exactly the same timing it had in the container with the video, so I don't have to worry about delays/offsets at all?

-
FFmpeg - Generate VTT File From Sprite, Using Spatial Media Fragment
20 October 2019, by David — Hi, I am looking to create a .VTT file from a sprite that I have generated using FFmpeg.
FFmpeg command:
$"-i {inputMediaFile} -vf \"select = not(mod(n\\, 30)),scale = 120:80,tile = 7x7\" -an -vsync 0 {outputMediaFile}"
This selects every 30th frame, scales it to 120x80 pixels, and creates 8x8 tiles in the output image.
I would like to make a .VTT from the generated image in C#; I know the height and width of the individual images in the sprite (120x80), and there are 64 images in total in the output image.
From this I need to produce a VTT like this:
WEBVTT
1
00:00:00.000 --> 00:00:01.000
test-00001.jpg#xywh=0,0,120,80
2
00:00:01.000 --> 00:00:02.000
test-00001.jpg#xywh=120,0,120,80
3
00:00:02.000 --> 00:00:03.000
test-00001.jpg#xywh=240,0,120,80
4
00:00:03.000 --> 00:00:04.000
test-00001.jpg#xywh=360,0,120,80
5
00:00:04.000 --> 00:00:05.000
test-00001.jpg#xywh=480,0,120,80
6
00:00:05.000 --> 00:00:06.000
test-00001.jpg#xywh=600,0,120,80
7
00:00:06.000 --> 00:00:07.000
test-00001.jpg#xywh=720,0,120,80
8
00:00:07.000 --> 00:00:08.000
test-00001.jpg#xywh=840,0,120,80
9
00:00:08.000 --> 00:00:09.000
test-00001.jpg#xywh=0,80,120,80

There are also situations where there are n sprite files.
I'm hoping there may be a library out there that can handle this, or even better that I can keep it contained within FFmpeg, though based on the FFmpeg docs I don't think that is possible.
Thanks in advance if anyone has any ideas; it's doable, as I've seen Node.js and Ruby examples.
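Here is a rough sketch of that mapping, in TypeScript rather than the C# asked about (the function name and parameters are made up for illustration): one cue per second, tiles read left to right, top to bottom.

// Build a WEBVTT thumbnail track for a sprite sheet of fixed-size tiles.
function spriteToVtt(
  imageName: string,      // e.g. "test-00001.jpg"
  tileCount: number,      // e.g. 64 tiles in the sheet
  columns: number,        // e.g. 8 tiles per row
  tileWidth = 120,
  tileHeight = 80,
  secondsPerTile = 1
): string {
  const pad = (n: number) => String(n).padStart(2, '0');
  const timestamp = (totalSeconds: number) => {
    const h = Math.floor(totalSeconds / 3600);
    const m = Math.floor((totalSeconds % 3600) / 60);
    const s = Math.floor(totalSeconds % 60);
    return `${pad(h)}:${pad(m)}:${pad(s)}.000`;
  };

  const lines = ['WEBVTT', ''];
  for (let i = 0; i < tileCount; i++) {
    const x = (i % columns) * tileWidth;              // column offset in pixels
    const y = Math.floor(i / columns) * tileHeight;   // row offset in pixels
    lines.push(
      String(i + 1),
      `${timestamp(i * secondsPerTile)} --> ${timestamp((i + 1) * secondsPerTile)}`,
      `${imageName}#xywh=${x},${y},${tileWidth},${tileHeight}`,
      ''
    );
  }
  return lines.join('\n');
}

When there are n sprite files, the same loop could simply switch imageName every columns × rows cues.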
-
How to fade video using a custom curve function in ffmpeg?
2 April 2018, by Zoltan — I would like to fade out a video using a different transition curve than the one the fade filter uses (which is probably linear).
For audio, I can easily choose from a wide variety of curves that the afade filter provides, or I can supply a custom expression to the volume filter, like (sin(PI/2 * min(1\, max(-1\, 1/2 * (t - 3)))) + 1)/2. For video, however, I could not find similar possibilities.
Based on its description, it seems to me that the geq filter could be misused to achieve this, but I couldn’t get it to work. Additionally, it is very slow, probably because it works on individual pixels, even though the expression only depends on time.