Other articles (68)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. To help us fix it, please provide the following information: the browser you are using, including its exact version; as precise a description of the problem as possible; if possible, the steps that lead to the problem; and a link to the site/page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • Automatic backup of SPIP channels

    1 April 2010

    When setting up an open platform, it is important for hosts to have reasonably regular backups available in order to cope with any problem that may arise.
    To carry out this task, two SPIP plugins are used: Saveauto, which performs a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which produces a zip archive of the site’s important data (the documents, the elements (...)

On other sites (9027)

  • Writing A Dreamcast Media Player

    6 January 2017, by Multimedia Mike — Sega Dreamcast

    I know I’m not the only person to have the idea to port a media player to the Sega Dreamcast video game console. But I did make significant progress on an implementation. I’m a little surprised to realize that I haven’t written anything about it on this blog yet, given my propensity for publishing my programming misadventures.


    [Image: 3 Dreamcast consoles in a row]

    This old effort had been on my mind lately due to its architectural similarities to something else I was recently brainstorming.

    Early Days
    Porting a multimedia player was one of the earliest endeavors that I embarked upon in the multimedia domain. It’s a bit fuzzy for me now, but I’m pretty sure that my first exposure to the MPlayer project in 2001 arose from looking for a multimedia player to port. I fed it through the Dreamcast development toolchain but encountered roadblocks pretty quickly. However, this got me looking at the MPlayer source code and made me wonder how I could contribute, which is how I finally broke into practical open source multimedia hacking after studying the concepts and technology for more than a year at that point.

    Eventually, I jumped over to the xine project. After hacking on that for a while, I remembered my DC media player efforts and endeavored to compile xine to the console. The first attempt was to simply compile the codebase using the Dreamcast hobbyist community’s toolchain. This is when I came to fear the multithreaded snake pit in xine’s core. Again, my memories are hazy on the specifics, but I remember the engine having a bunch of threading hacks with comments along the lines of “this code deadlocks sometimes, so on shutdown, monitor this lock and deliberately break it if it has been more than 3 seconds”.

    Something Workable
    Eventually, I settled on a combination of FFmpeg’s libavcodec library for audio and video decoders, xine’s demuxer library, and xine’s input API, combined with my own engine code to tie it all together along with video and audio output drivers provided by the KallistiOS hobbyist OS for Dreamcast. Here is a simple diagram of the data movement through this player:


    [Image: Architecture diagram for a Sega Dreamcast media player]
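
    Purely to illustrate that data flow, here is a self-contained toy sketch of mine (none of these names are the real xine, libavcodec, or KallistiOS interfaces):

    from collections import namedtuple

    Packet = namedtuple("Packet", "kind data")

    def input_open(url):
        # plays the role of xine's input API (file, CD, network, ...)
        return iter([Packet("video", b"V0"), Packet("audio", b"A0"),
                     Packet("video", b"V1"), Packet("audio", b"A1")])

    def demux(stream):
        # plays the role of xine's demuxer library: container -> packets
        for packet in stream:
            yield packet

    def decode(packet):
        # plays the role of a libavcodec decoder: packet -> frame or samples
        return packet.data.decode()

    def video_out(frame):
        # plays the role of the KallistiOS video output driver
        print("display", frame)

    def audio_out(samples):
        # plays the role of the KallistiOS audio output driver
        print("play", samples)

    for packet in demux(input_open("movie.avi")):
        if packet.kind == "video":
            video_out(decode(packet))
        else:
            audio_out(decode(packet))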

    Details and Challenges
    This is a rare occasion when I actually got to write the core of a media player engine. I made some mistakes.

    xine’s internal clock ran at 90000 Hz. At least, its internal timestamps were all in reference to a 90 kHz clock. I got this brilliant idea to trigger timer interrupts at 6000 Hz to drive the engine. Given whatever timer facilities the Dreamcast offered, 6 kHz was the highest rate I could get that divides evenly into 90 kHz. In other words, if I could have found an even higher common-divisor frequency, I would have used that instead.

    So the idea was that, for a 30 fps video, the engine would know to render a frame on every 200th timer interrupt. I eventually realized that servicing 6000 timer interrupts every second would incur a ridiculous amount of overhead. After that, my engine’s philosophy was to set a timer to fire for the next frame while beginning to process the current frame. I.e., when rendering a frame, set a timer to call back in 1/30th of a second. That worked a lot better.
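
    As a toy illustration of that second approach (my own sketch in Python rather than the player's C, with a stock threading.Timer standing in for the Dreamcast timer hardware):

    import threading
    import time

    FPS = 30
    FRAME_INTERVAL = 1.0 / FPS   # ~33 ms; the interrupt-counting scheme instead
                                 # waited 6000 Hz / 30 fps = 200 interrupts per frame

    def render_frame(n):
        # stand-in for decoding and displaying frame n
        print("frame", n, "at", round(time.monotonic(), 3))

    def schedule_frame(n):
        # arm the timer for the next frame before doing this frame's work,
        # so decode/render time does not push later frames back
        if n + 1 < 90:   # stop after ~3 seconds of toy "playback"
            threading.Timer(FRAME_INTERVAL, schedule_frame, args=(n + 1,)).start()
        render_frame(n)

    schedule_frame(0)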

    As I was still keen on 8-bit paletted image codecs at the time (especially since they were simple and small for bootstrapping this project), I got to output the palette images directly thanks to the Dreamcast’s paletted textures. So that was exciting. The engine didn’t need to convert the paletted images to a different colorspace before rendering. However, I seem to recall that the Dreamcast’s PowerVR graphics hardware required that 8-bit textures be twiddled/swizzled. Thus, it was still required to manipulate the 8-bit image before rendering.
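
    For context on that last point (my addition, not from the original post): the twiddled layout is a Morton/Z-order arrangement in which the bits of the x and y texel coordinates are interleaved. A small sketch of remapping an 8-bit image that way, assuming a square, power-of-two texture (which bit position x and y each take is a hardware convention, so swap them if needed):

    def morton_index(x, y, bits):
        # interleave the low 'bits' bits of x and y into one index
        index = 0
        for b in range(bits):
            index |= ((y >> b) & 1) << (2 * b)       # y bit -> even position (assumed)
            index |= ((x >> b) & 1) << (2 * b + 1)   # x bit -> odd position (assumed)
        return index

    def twiddle(pixels, width, height):
        # pixels: flat list of width*height 8-bit palette indices
        bits = width.bit_length() - 1
        out = [0] * (width * height)
        for y in range(height):
            for x in range(width):
                out[morton_index(x, y, bits)] = pixels[y * width + x]
        return out

    print(twiddle(list(range(16)), 4, 4))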

    I made good progress on this player concept. However, a huge blocker for me was that I didn’t know how to make a proper user interface for the media player. Obviously, programming the Dreamcast occurred at a very low level (at least with the approach I was using), so there were no UI widgets easily available.

    This was circa 2003. I assumed there must have been some embedded UI widget libraries with amenable open source licenses that I could leverage. I remember searching and checking out a library named libSTK. I think STK stood for “set-top toolkit” and was positioned specifically for doing things like media player UIs on low-spec embedded computing devices. The domain hosting the project is no longer useful but this appears to be a backup of the core code.

    It sounded promising, but the libSTK developers had a different definition of “low-spec embedded” device than I did. I seem to recall that they were targeting something along the lines of a Pentium III clocked at 800 MHz with 128 MB RAM. The Dreamcast, by contrast, has a 200 MHz SH-4 CPU and 16 MB RAM. LibSTK was also authored in C++ and leveraged the Boost library (my first exposure to that code), and this all had the effect of making binaries quite large while I was trying to keep the player in lean C.

    Regrettably, I never made any serious progress on a proper user interface. I think that’s when the player effort ran out of steam.

    The Code
    So, that’s another project that I never got around to finishing or publishing. I was able to find the source code so I decided to toss it up on github, along with 2 old architecture outlines that I was able to dig up. It looks like I was starting small, just porting over a few of the demuxers and decoders that I knew well.

    I’m wondering if it would still be as straightforward to separate out such components now, more than 13 years later?

    The post Writing A Dreamcast Media Player first appeared on Breaking Eggs And Making Omelettes.

  • FFMPEG : How to avoid audio/video desync in output of crossfaded clips when input is variable frame rate video

    25 December 2018, by Anders Lunde

    I’m doing screen recordings of gameplay (Dota 2) using my NVIDIA graphics card’s GeForce Experience hardware recording (NVENC encoder). This creates a variable frame rate output video. My NVIDIA settings are 60 fps, 15000 kbps. I have paid a guy to make a program that generates scripts which, given start/stop timepoints, extract clips from the video and merge them with a crossfade. See example code below. The script works for many input recordings but often fails: the audio and video are desynchronized (usually an audio delay) in many of the clips, by about 0.5 seconds. I think it fails more when the frame rate dropped more during recording. He does not know how to fix the problem, and I wonder if anyone could point out whether anything could be fixed in the script (example below)?

    Processing speed is quite important (currently, making a 10 min ’highlight’ video takes about 7-10 min), so solutions that increase that time very much are unfortunately not of much interest. His approach has been to work separately with audio and video and merge them at the end. He already has a program that generates ffmpeg code for different scenarios (also adding overlays, adding music, intro/outro), so some easy fixes to his code would be preferable to a dramatic redesign of the logic. But if nothing else can fix the problem, a redesign of the logic is OK. Using tools other than ffmpeg is also OK, but it should be automatable (scripts/CLI) and not increase processing times too much.

    Running the program "mediainfo" on the input video shows that the frame rate dropped quite low:

    Frame rate mode : Variable

    Frame rate : 60.000 FPS

    Minimum frame rate : 3.059 FPS

    Maximum frame rate : 63.739 FPS

    Full report here: https://pastebin.com/TX061Wih

    The input video can be downloaded from Dropbox here (6 GB):
    https://www.dropbox.com/s/ftwdgapazbi62pr/fullgame.mp4?dl=0

    Here is an example of a script asked to extract two clips from the input video, at 9:57 (41 sec long) and 15:45 (28 sec long), and crossfade-merge them with a 0.5 s crossfade. There might be some code remnants from options that are not used in this example (overlays, music, intro/outro). Using the input video above, this creates audio/video desync.

    6 commands executed in sequence:

    ffmpeg.exe -loglevel warning -ss 00:09:57 -i fullgame.mp4 -t 00:00:41 -filter_complex "[0:a]afade=t=out:st=40.5:d=0.5[a1]" -map "[a1]" -y out_temp_00.mp4.wav

    ffmpeg.exe -loglevel warning -i fullgame.mp4 -ss 00:09:57 -t 00:00:41 -an -vcodec copy -f mpegts -avoid_negative_ts make_zero -y out_temp_00.mp4.ts

    ffmpeg.exe -loglevel warning -ss 00:15:45 -i fullgame.mp4 -t 00:00:28 -filter_complex "[0:a]afade=t=in:st=0:d=0.5[a1]" -map "[a1]" -y out_temp_01.mp4.wav

    ffmpeg.exe -loglevel warning -i fullgame.mp4 -ss 00:15:45 -t 00:00:28 -an -vcodec copy -f mpegts -avoid_negative_ts make_zero -y out_temp_01.mp4.ts

    ffmpeg.exe -loglevel warning -i out_temp_00.mp4.wav -i out_temp_01.mp4.wav -y -filter_complex "[0:a]adelay=0|0[a0];[1:a]adelay=40500|40500[a1];[a0][a1]amix=inputs=2:dropout_transition=68.5,atrim=duration=68.5[outa0];[outa0]loudnorm[outa]" -map "[outa]" -ar 48000 -acodec aac -strict -2 fullgame_Output.mp4.aac

    ffmpeg.exe -loglevel warning -i out_temp_00.mp4.ts -i out_temp_01.mp4.ts -y -i fullgame_Output.mp4.aac  -filter_complex "[0:v]trim=start=0.5,setpts=PTS-STARTPTS[0c];[1:v]trim=start=0.5,setpts=PTS-STARTPTS[1c];[0:v]trim=40.5:41,setpts=PTS-STARTPTS[fo];[1:v]trim=0:0.5[fi];[fi]format=pix_fmts=yuva420p,fade=t=in:st=0:d=0.5:alpha=1[z];[fo]format=pix_fmts=yuva420p,fade=t=out:st=0:d=0.5:alpha=1[x];[z]fifo[w];[x]fifo[q];[q][w]overlay[r];[0c][r][1c]concat=n=3[outv]" -map "[outv]" -map 2:a -shortest -acodec copy -vcodec libx264 -preset ultrafast -b 15000k -aspect 1920:1080 fullgame_Output.mp4

    P.S.

    I already asked for help in an ffmpeg chat room. One guy said he knew what the problem was, but didn’t know how to fix it(?):

    [00:10] <kepstin> oh, wait, you're using -vcodec copy
    [00:10] <kepstin> that explains everything.
    [00:10] <kepstin> when you're using -vcodec copy, the start time (set with -ss) is rounded to the nearest keyframe
    [00:10] <kepstin> it's not exact
    [00:11] <kepstin> depending on the keyframe interval, this will result in possibly quite large shifts
    [00:11] <kepstin> (also, your commands are applying audio filters on commands with -an, which is confusing/contradictory)
    [00:12] <birdboy88> so the problem is that the audio temporary clips are not being extracted from the same exact timepoints?
    [00:13] <kepstin> birdboy88: yeah, your audio is being re-encoded to wav so it's being cut sample-accurate, but the video's not being precisely cut.
    [00:16] <birdboy88> kepstin: so I need to use slow seek (?) to extract video accurately? Or somehow extract audio only where there are video keyframes?
    [00:17] <kepstin> birdboy88: i don't know how to extract audio starting at video keyframes with ffmpeg cli. You're already doing slow seek, which doesn't help (you should move the -ss option to before the -i option to speed it up)
    [00:17] <kepstin> if you want accurate video cutting when saving to a file, you have to re-encode the video
    [00:18] <kepstin> (doing this in a single ffmpeg command means you don't have to save to a file, so you can avoid the issue)
    [00:18] * kepstin is off for a bit now
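
    To make kepstin’s remarks concrete, here is a comparison of my own (not taken from the chat), with -ss placed before -i in both commands, which is the fast input-seek form:

    ffmpeg -ss 00:09:57 -i fullgame.mp4 -t 00:00:41 -an -c:v copy -avoid_negative_ts make_zero -y cut_copy.ts

    ffmpeg -ss 00:09:57 -i fullgame.mp4 -t 00:00:41 -an -c:v libx264 -preset ultrafast -y cut_accurate.mp4

    The first command copies the video stream, so the cut can only begin on a keyframe and its real start time may differ from the sample-accurate audio cut by up to a keyframe interval; the second re-encodes, so it can start exactly at 00:09:57 at the cost of encoding time.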

    EDIT:
    Everything is done with the latest ffmpeg version.

    I was unable to get Gyan’s code to work. It always loses some audio (the audio is either 40.5 s or 27.5 s long, so only one of the audio clips is used). This is the only one working for me (the changes were adelay=40500|40500 and amix=inputs=2[a0];[a0]loudnorm):

    ffmpeg -i fullgame.mp4 -filter_complex "[0]split=2[vpre][vpost];
    [0]asplit=2[apre][apost];
    [vpre]trim=start='00:09:57':duration='00:00:41',setpts=PTS-STARTPTS[vpre-t];
    [apre]atrim=start='00:09:57':duration='00:00:41',asetpts=PTS-STARTPTS,afade=t=out:st=40.5:d=0.5[apre-t];
    [vpost]trim=start='00:15:45':duration='00:00:28',setpts=PTS-STARTPTS,format=yuva420p,fade=t=in:st=0:d=0.5:alpha=1,setpts=PTS+40.5/TB[vpost-t];
    [apost]atrim=start='00:15:45':duration='00:00:28',asetpts=PTS-STARTPTS,afade=t=in:st=0:d=0.5,adelay=40500|40500[apost-t];
    [vpre-t][vpost-t]overlay[v];
    [apre-t][apost-t]amix=inputs=2[a0];[a0]loudnorm[a]" -map "[v]" -map "[a]" -y -c:v libx264 -preset ultrafast -b:v 15000k -aspect 1920:1080 -c:a aac fullgame_Output.mp4

    Then I tried a similar setup but with 3 clips, but on one machine I got the error: "Error while filtering: Cannot allocate memory". And on my machine with 16 GB of memory the processing speed is 0.02x! Any way to avoid this? This is the code I tried:

    ffmpeg -i fullgame.mp4 -filter_complex "[0]split=3[vpre][vpost][v3];
    [0]asplit=3[apre][apost][a3];
    [vpre]trim=start=357:duration=41,setpts=PTS-STARTPTS[vpre-t];
    [apre]atrim=start=357:duration=41,asetpts=PTS-STARTPTS,afade=t=out:st=40.5:d=0.5[apre-t];
    [vpost]trim=start=795:duration=28,setpts=PTS-STARTPTS,format=yuva420p,fade=t=in:st=0:d=0.5:alpha=1,fade=t=out:st=40.5:d=0.5:alpha=1,setpts=PTS+40.5/TB[vpost-t];
    [apost]atrim=start=795:duration=28,asetpts=PTS-STARTPTS,afade=t=in:st=0:d=0.5,afade=t=out:st=27.5:d=0.5,adelay=40500|40500[apost-t];
    [v3]trim=start=95:duration=30,setpts=PTS-STARTPTS,format=yuva420p,fade=t=in:st=0:d=0.5,setpts=PTS+68.5/TB[v3-t];
    [a3]atrim=start=95:duration=30,asetpts=PTS-STARTPTS,afade=t=in:st=0:d=0.5,adelay=68500|68500[a3-t];
    [vpre-t][vpost-t]overlay[v1];
    [v1][v3-t]overlay[v];
    [apre-t][apost-t][a3-t]amix=inputs=3[a0];
    [a0]loudnorm[a]" -map "[v]" -map "[a]" -y -c:v libx264 -preset ultrafast -b:v 15000k -aspect 1920:1080 -c:a aac fullgame_Output.mp4

  • ffmpeg does not recognize long string filter in execv

    4 May 2023, by incertia

    I am writing a simple Python script to call ffmpeg and concat some clips together. However, it doesn’t work, for reasons I am unable to explain.


    Below is a working version of the code, after some debugging:


    # imports implied by the snippet; 'quality' and 'out' are defined elsewhere in the script
    import itertools
    import os
    from tkinter.filedialog import askopenfilenames

    inputs = sorted(list(askopenfilenames()))
    n = len(inputs)

    # build "[0:v] [0:a] [1:v] [1:a] ... concat=n:v=1:a=1 [v] [a]"
    filter = []
    for i in range(n):
        filter.append("[{}:v]".format(i))
        filter.append("[{}:a]".format(i))
    filter.append("concat={}:v=1:a=1".format(n))
    filter.append("[v]")
    filter.append("[a]")
    filter = " ".join(filter)

    fargs = zip(itertools.repeat('-i'), inputs)
    fargs = itertools.chain(
        ["ffmpeg"],
        itertools.chain.from_iterable(fargs),
        ["-filter_complex", '"{}"'.format(filter), "-vsync", "vfr", "-map", "[v]", "-map", "[a]"],
        ["-c:v", "libx264", "-crf", "{}".format(quality)],
        ["-c:a", "aac", "-b:a", "192k"],
        [out]
        )

    os.execvp("ffmpeg", list(fargs))


    But the same fargs construction makes ffmpeg complain about the filter chain when the extra quotes are not used, e.g. with the variant below:


    fargs = itertools.chain(
        ["ffmpeg", "-loglevel", "debug"],
        itertools.chain.from_iterable(fargs),
        #["-filter_complex", '"{}"'.format(filter), "-vsync", "vfr", "-map", "[v]", "-map", "[a]"],
        ["-filter_complex", filter, "-vsync", "vfr", "-map", "[v]", "-map", "[a]"],
        ["-c:v", "libx264", "-crf", "{}".format(quality)],
        ["-c:a", "aac", "-b:a", "192k"],
        [out]
        )


    We see that ffmpeg somehow sees this as multiple arguments:


    Reading option '-filter_complex' ... matched as option 'filter_complex' (create a complex filtergraph) with argument '[0:v]'.
    Reading option '[0:a]' ... matched as output url.
    Reading option '[1:v]' ... matched as output url.
    Reading option '[1:a]' ... matched as output url.
    Reading option '[2:v]' ... matched as output url.
    Reading option '[2:a]' ... matched as output url.
    Reading option 'concat=3:v=1:a=1' ... matched as output url.
    Reading option '[v]' ... matched as output url.
    Reading option '[a]' ... matched as output url.


    even though a simple print(list(fargs)) yields


    ['ffmpeg', '-loglevel', 'debug', '-i', 'a.mp4', '-i', 'b.mp4', '-i', 'c.mp4', '-filter_complex', '[0:v] [0:a] [1:v] [1:a] [2:v] [2:a] concat=3:v=1:a=1 [v] [a]', '-vsync', 'vfr', '-map', '[v]', '-map', '[a]', '-c:v', 'libx264', '-crf', '20', '-c:a', 'aac', '-b:a', '192k', 'asdf.mp4']


    implying that the long filter string is being passed to ffmpeg as a single argument.
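
    One way to take os.execvp’s argument handling out of the equation (my suggestion, not something raised in the question) is to launch ffmpeg through subprocess.run, which receives the list as-is, one argument per element:

    import subprocess

    # same construction as above, but with the plain, unquoted filter string;
    # subprocess does not re-split list elements, so no extra quoting is needed
    args = list(fargs)
    print(args)                  # the filter graph should appear as a single element
    subprocess.run(args, check=True)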
