
Other articles (46)
-
Sites built with MediaSPIP
2 May 2011 — This page presents some of the sites running MediaSPIP.
You can of course add your own via the form at the bottom of the page.
-
Contribute to translation
13 April 2011 — You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, allowing it to spread to new linguistic communities.
To do this, we use the SPIP translation interface, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
MediaSPIP is currently available in French and English (...)
-
Libraries and binaries specific to video and audio processing
31 January 2010 — The following software and libraries are used by SPIPmotion in one way or another.
Required binaries: FFmpeg: the main encoder, used to transcode almost any type of video or audio file into formats playable on the Internet (see this tutorial for how to install it); Oggz-tools: tools for inspecting Ogg files; MediaInfo: retrieves information from most video and audio formats;
Optional, complementary binaries: flvtool2: (...)
On other sites (4248)
-
FFMPEG Seeking with concat demuxer causes video & audio to be out of sync
20 February 2023, by Garuuk — I have a very simple use case that's driving me bananas.


My problem and question:


I'm using ffmpeg version 5.1.2 on macOS, and I'm using ffmpeg seeking and the concat demuxer to cut many 1-minute videos down to 15 seconds spread over 12 clips, where every clip is just 2 seconds from the same video (kind of like a mini teaser for the video). I would really like to avoid re-encoding, to keep the video processing as fast as possible.


First, I take each 1-minute video and cut it into 12 clips (I do all this programmatically in Python, FWIW):


ffmpeg -ss 0 -i input.mp4 -t 2 -c copy -y cut_1.mp4
ffmpeg -ss 4 -i input.mp4 -t 2 -c copy -y cut_2.mp4
ffmpeg -ss 8 -i input.mp4 -t 2 -c copy -y cut_3.mp4
...
...



I then write all the output file names to my concat_manifest.txt:


file cut_1.mp4
file cut_2.mp4
...
...



Then I run my concat command:


ffmpeg -f concat -i concat_manifest.txt -c copy -y concat_video.mp4
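
Put together, the cutting, manifest, and concat steps above could be scripted along these lines. This is a minimal sketch: the 4-second spacing between clip start times and the file names are taken from the example commands, and the 12-clip count is assumed from the description.

#!/bin/sh
# Cut twelve 2-second clips (start offsets 0, 4, 8, ... 44 assumed), build the manifest, then concatenate.
: > concat_manifest.txt
i=1
for start in $(seq 0 4 44); do
  ffmpeg -ss "$start" -i input.mp4 -t 2 -c copy -y "cut_${i}.mp4"
  printf 'file cut_%d.mp4\n' "$i" >> concat_manifest.txt
  i=$((i + 1))
done
ffmpeg -f concat -i concat_manifest.txt -c copy -y concat_video.mp4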



This works really fast, but the audio and video get out of sync at the stitch points, and sometimes the video just chokes and lags. It's mostly not a smooth experience.


What I have tried:


- Using the concat protocol with intermediate profiles: ffmpeg.org/wiki/Concatenate#demuxer
- Putting the -ss after the -i when I seek. This makes everything worse.
- Playing around with different -ss values. This has some noticeable effects, but it's not obvious why yet.
- I've also read the ffmpeg documentation regarding seeking and stream copying.

Which leads me to believe that maybe, because ffmpeg uses timestamps instead of frames, -ss seeking isn't accurate for cuts that are later fed to the concat demuxer.
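
If that is the case, two things commonly suggested for this situation may be worth trying; both are sketches, not confirmed fixes. The first keeps -c copy but normalises the timestamps of each cut; the second accepts a re-encode so that -ss/-t become frame-accurate instead of keyframe-bound (the codec and preset choices are assumptions):

# Variant 1: still stream-copying, but resetting timestamps on each cut
ffmpeg -ss 4 -i input.mp4 -t 2 -c copy -avoid_negative_ts make_zero -y cut_2.mp4

# Variant 2: re-encode, so every clip starts on its own keyframe with clean timestamps
ffmpeg -ss 4 -i input.mp4 -t 2 -c:v libx264 -preset veryfast -c:a aac -y cut_2.mp4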


Is there a way to cut and concatenate with the concat demuxer so that the audio stays at least roughly in sync with the video?


Thanks


EDIT: I found an answer and I'll be posting the solution in the coming days.


-
FFmpeg, how to skip late input?
14 November 2017, by user3343357 — I'm running ffmpeg to display an incoming stream on a DeckLink BlackMagic card with the following command line:

ffmpeg -y -f ourFmt -probesize 32 -i - -f decklink -preset ultrafast -pix_fmt uyvy422 -s 1920x1080 -r 30 -af volume=0.1 -max_delay 10000 DeckLink Mini Monitor

Basically, I get the video over the Internet by UDP and stream it to ffmpeg's stdin. Both audio and video streams have PTS and DTS and are fully in sync; if the connection is good, there are no problems.
However, if there are issues with the connection, I start getting errors; sometimes the video delay grows significantly and the audio stops working.
The errors I get are:

ffmpeg: [decklink @ 0x26cc600] There are not enough buffered video frames. Video may misbehave!
ffmpeg: [decklink @ 0x26cc600] There's no buffered audio. Audio will misbehave!
ffmpeg: Last message repeated 4 times
ffmpeg: [decklink @ 0x26cc600] There are not enough buffered video frames. Video may misbehave!
ffmpeg: [decklink @ 0x26cc600] There's no buffered audio. Audio will misbehave!
ffmpeg: Last message repeated 3 times
ffmpeg: frame= 5204 fps= 30 q=-0.0 size=N/A time=00:02:53.76 bitrate=N/A dup=385 drop=5 speed=0.993x
ffmpeg: [decklink @ 0x26cc600] There's no buffered audio. Audio will misbehave!
ffmpeg: Last message repeated 18 times
ffmpeg: [decklink @ 0x26cc600] There are not enough buffered video frames. Video may misbehave!
ffmpeg: [decklink @ 0x26cc600] There's no buffered audio. Audio will misbehave!

The problem is that when the connection is back to normal, the video keeps misbehaving until I restart the stream. What I want is for FFmpeg to skip to the content of the last second and play synchronized video from there, dropping all the late data in between. Is that possible?
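
One knob that may be worth experimenting with is the input buffering behaviour, for example adding -fflags nobuffer+discardcorrupt on the input side so that late or corrupt packets are not queued up. This is a sketch to try, not a confirmed fix for the DeckLink output path:

ffmpeg -y -fflags nobuffer+discardcorrupt -f ourFmt -probesize 32 -i - -f decklink -preset ultrafast -pix_fmt uyvy422 -s 1920x1080 -r 30 -af volume=0.1 -max_delay 10000 'DeckLink Mini Monitor'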
-
avutil/pixdesc: deprecate AV_PIX_FMT_FLAG_PSEUDOPAL
29 March 2018, by wm4
PSEUDOPAL pixel formats are not paletted, but carried a palette with the intention of allowing code to treat unpaletted formats as paletted. The palette simply mapped the byte values to the resulting RGB values, making it some sort of LUT for RGB conversion.

It was used for 1-byte formats only: RGB4_BYTE, BGR4_BYTE, RGB8, BGR8, GRAY8. The first 4 are awfully obscure, used only by some ancient bitmap formats. The last one, GRAY8, is more common, but its treatment is grossly incorrect. It considers full range GRAY8 only, so GRAY8 coming from typical Y video planes was not mapped to the correct RGB values. This cannot be fixed, because AVFrame.color_range can be freely changed at runtime, and there is nothing to ensure the pseudo palette is updated.

Also, nothing actually used the PSEUDOPAL palette data, except xwdenc (trivially changed in the previous commit). All other code had to treat it as a special case, just to ignore or to propagate palette data.

In conclusion, this was just a very strange old mechanism that has no real justification to exist anymore (although it may have been nice and useful in the past). Now it's an artifact that makes the API harder to use: API users who allocate their own pixel data have to be aware that they need to allocate the palette, or FFmpeg will crash on them in _some_ situations. On top of this, there was no API to allocate the pseudo palette outside of av_frame_get_buffer().

This patch not only deprecates AV_PIX_FMT_FLAG_PSEUDOPAL, but also makes the pseudo palette optional. Nothing accesses it anymore, though if it's set, it's propagated. It's still allocated and initialized for compatibility with API users that rely on this feature. But new API users do not need to allocate it. This was an explicit goal of this patch.

Most changes replace AV_PIX_FMT_FLAG_PSEUDOPAL with FF_PSEUDOPAL. I first tried #ifdefing all code, but it was a mess. The FF_PSEUDOPAL macro reduces the mess, and still allows defining FF_API_PSEUDOPAL to 0.

Passes FATE with FF_API_PSEUDOPAL enabled and disabled. In addition, FATE passes with FF_API_PSEUDOPAL set to 1, but with the allocation functions manually changed to not allocate a palette.

- [DH] doc/APIchanges
- [DH] fftools/ffprobe.c
- [DH] libavcodec/decode.c
- [DH] libavcodec/ffv1dec.c
- [DH] libavcodec/rawdec.c
- [DH] libavcodec/smvjpegdec.c
- [DH] libavfilter/drawutils.c
- [DH] libavfilter/framepool.c
- [DH] libavfilter/vf_crop.c
- [DH] libavfilter/vf_pixdesctest.c
- [DH] libavfilter/vf_scale.c
- [DH] libavfilter/vf_shuffleplanes.c
- [DH] libavutil/frame.c
- [DH] libavutil/imgutils.c
- [DH] libavutil/internal.h
- [DH] libavutil/pixdesc.c
- [DH] libavutil/pixdesc.h
- [DH] libavutil/version.h
- [DH] libswscale/swscale_internal.h