Other articles (101)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • The farm's regular Cron tasks

    1 December 2010

    Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of all the instances in the shared-hosting farm on a regular basis. Combined with a system Cron on the farm's central site, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)

  • Emballe Médias: Putting documents online simply

    29 October 2010

    The emballe médias plugin was developed primarily for the mediaSPIP distribution, but it is also used in other related projects, such as géodiversité. Required and compatible plugins
    To work, this plugin requires other plugins to be installed: CFG, Saisies, SPIP Bonux, Diogène, swfupload, jqueryui
    Other plugins can be used alongside it to extend its capabilities: Ancres douces, Légendes, photo_infos, spipmotion (...)

On other sites (11616)

  • FFmpeg: Concatenated clips extracted using -ss and -t hang or go out of sync while playing

    20 July 2013, by Yogi

    I have a set of movies in different formats, and I am trying to extract small clips from these source movies and concatenate them into one movie.

    My workflow has been the following:

    1. Convert all the source movies to the same format (width, height, fps, codec). The scale and pad options ensure that all movies end up the same size even if their aspect ratios differ (a worked example is sketched after the command).

      ffmpeg -i $infile -vcodec libx264 -strict -2  -vf scale=iw*sar*min(${MAX_WIDTH}/(iw*sar)\,${MAX_HEIGHT}/ih):ih*min(${MAX_WIDTH}/(iw*sar)\,${MAX_HEIGHT}/ih),pad=${MAX_WIDTH}:${MAX_HEIGHT}:(ow-iw)/2:(oh-ih)/2 -b:v 500k -b:a 64k -movflags +faststart -g 10  -r 25 ${outbasename}.mp4
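    As an illustration (with assumed numbers, not the asker's): if MAX_WIDTH=1280 and MAX_HEIGHT=720, a 1440x1080 source with square pixels gives min(1280/1440, 720/1080) = 2/3, so the clip is scaled to 960x720 and then padded to 1280x720 with 160-pixel bars on the left and right.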

    2. Extract clips:

      ffmpeg -ss $starttime -t $duration -i $in_file -vcodec copy -acodec copy $out_file

    3. Finally, combine the clips by first creating a concat_list.txt file that lists the clips to be concatenated and their durations (an illustrative example follows the command), and then using ffmpeg's concat demuxer:

      ffmpeg -f concat -i concat_list.txt -c copy -movflags +faststart $oname
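
    For reference, the concat demuxer reads a plain text list of file and duration directives along the lines of the sketch below; the clip names and durations here are illustrative placeholders, not the asker's actual values:

      file 'clip_001.mp4'
      duration 4.0
      file 'clip_002.mp4'
      duration 6.5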

    The problem I am facing is that many of the final videos hang or go out of sync somewhere in the middle of playing. I have tried using mjpeg as the codec, but I still get the same behavior. I can play the individual extracted clips, and they all seem to play fine in most players. Does anybody know what I am doing wrong? I am using ffmpeg version 1.2.1.

  • Is there a faster way to generate video from pixel arrays using Python and ffmpeg?

    8 May 2019, by devneal17

    I've found a few sources that use Python and ffmpeg to generate video from pixel arrays by passing the -f rawvideo flag [1][2]. However, this is very slow for high-definition video, since each individual pixel must be piped into ffmpeg.
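
    For context, the rawvideo piping approach those sources describe typically looks something like the sketch below; the frame size, frame rate, frame count, and output name are assumed for illustration and are not the asker's actual code:

      # Minimal sketch of the "-f rawvideo" piping approach (assumed sizes and names).
      import subprocess
      import numpy as np

      width, height, fps, n_frames = 1280, 720, 30, 120  # illustrative values

      proc = subprocess.Popen(
          ["ffmpeg", "-y",
           "-f", "rawvideo", "-pix_fmt", "rgb24",   # raw, headerless frames arrive on stdin
           "-s", f"{width}x{height}", "-r", str(fps),
           "-i", "-",                               # read the frames from the pipe
           "-c:v", "libx264", "-pix_fmt", "yuv420p",
           "out.mp4"],
          stdin=subprocess.PIPE,
      )

      for _ in range(n_frames):
          frame = np.zeros((height, width, 3), dtype=np.uint8)  # placeholder pixel array
          proc.stdin.write(frame.tobytes())                     # one whole frame per write

      proc.stdin.close()
      proc.wait()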

    In fact this is provably wasteful, as I've found that 2.5 GB of pixel arrays generates about 80 KB of video. I've also chanced upon some examples where JavaScript can render high-quality animations in near-real time [1], which makes me even more suspicious that I'm doing something wrong.

    Is there a way to do this more efficiently, perhaps by piping the differences between pixel arrays into ffmpeg rather than the pixels themselves?

    (edit) This is the line I’m using. Most executions take the else path that follows.

  • ffmpeg specify image start/end time by seconds in slideshow

    31 December 2022, by Martin

    I have an ffmpeg command that, when run from the command prompt in Windows 10, combines two mp3 files and one image file into a low-resolution .mkv video file.

      06:23 = 383 = song1.mp3 length
      05:40 = 340 = song2.mp3 length
      12:03 = 723 = estimated total video length
      12:04 = 724 = actual video length

    The command that generates the video file:

    ffmpeg -loop 1 -framerate 2 -i images/img1.png  -i "audio files/song1.mp3"  -i "audio files/song2.mp3"  -c:a pcm_s32le  -filter_complex concat=n=2:v=0:a=1  -vcodec libx264  -bufsize 3M  -filter:v "scale=w=640:h=638,pad=ceil(iw/2)*2:ceil(ih/2)*2"  -crf 18  -pix_fmt yuv420p  -shortest  -tune stillimage  -t 724 audioAndImageIntoVideo.mkv 

    The current command uses -i images/img1.png as a static image for the entire video, but I want one image for the duration of the first song and a second image for the duration of the second song, with a timeline like this:

      song1.mp3 and img1.png start at 00:00 and end at 06:23 (383 seconds)
      song2.mp3 and img2.png start at 06:23 (383 seconds) and end at 12:03 (723 seconds)

    Is there any flag to specify the timeline of the two images? Right now I am just trying to get the images in the right order in the video; after that I can adjust each image's resolution, size, and stretching for how it fills the frame.