Advanced search

Media (1)

Keyword: - Tags -/école

Other articles (103)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
    Once it is activated, MediaSPIP init automatically applies a preconfiguration so that the new feature is operational right away. No configuration step is therefore required for this.

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Contributing to its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. Simply subscribe to the translators' mailing list to ask for more information.
    MediaSPIP is currently only available in French and (...)

On other sites (10963)

  • ffmpeg specify image start/end time by seconds in slideshow

    31 December 2022, by Martin

    I have an ffmpeg command that, when run from the command prompt on Win10, combines 2 mp3 files and 1 image file into a low-resolution .mkv video file.

    06:23 = 383 = song1.mp3 length
    05:40 = 340 = song2.mp3 length
    12:03 = 723 = estimated total video length
    12:04 = 724 = actual video length

    Command that generates the video file:

    ffmpeg -loop 1 -framerate 2 -i images/img1.png  -i "audio files/song1.mp3"  -i "audio files/song2.mp3"  -c:a pcm_s32le  -filter_complex concat=n=2:v=0:a=1  -vcodec libx264  -bufsize 3M  -filter:v "scale=w=640:h=638,pad=ceil(iw/2)*2:ceil(ih/2)*2"  -crf 18  -pix_fmt yuv420p  -shortest  -tune stillimage  -t 724 audioAndImageIntoVideo.mkv


    The current command just uses -i images/img1.png as a static image for the entire video, but I want one image for the duration of the first song and a second image for the duration of the second song, with a timeline like so:

    song1.mp3 and img1.png start at 00:00 and end at 06:23 ( 383 seconds )
    song2.mp3 and img2.png start at 06:23 ( 383 seconds ) and end at 12:03 ( 723 seconds )


    Is there any flag to specify the timeline of the two images? Right now I am just trying to get them in order in a video, and then I can change the individual image resolution / size / stretching details for how it fills the frame.
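
    One possible direction (a hedged sketch, not a tested answer): loop each image as its own input, cap it with an input-side -t equal to the matching song's length, then concatenate the two video streams and the two audio streams separately. The 383/340 second durations come from the question above; this assumes both PNGs have the same dimensions, otherwise a scale/pad step is needed before the video concat. The ^ characters are cmd.exe line continuations.

    ffmpeg -loop 1 -framerate 2 -t 383 -i images/img1.png ^
           -loop 1 -framerate 2 -t 340 -i images/img2.png ^
           -i "audio files/song1.mp3" -i "audio files/song2.mp3" ^
           -filter_complex "[0:v][1:v]concat=n=2:v=1:a=0[v];[2:a][3:a]concat=n=2:v=0:a=1[a]" ^
           -map "[v]" -map "[a]" -c:v libx264 -crf 18 -pix_fmt yuv420p -tune stillimage ^
           -c:a pcm_s32le audioAndImageIntoVideo.mkv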

    


  • Is there a faster way to generate video from pixel arrays using python and ffmpeg?

    8 May 2019, by devneal17

    I've found a few sources which use python and ffmpeg to generate video from pixel arrays by passing the -f rawvideo flag [1][2]. However, this is very slow for high-definition video, since each individual pixel must be piped into ffmpeg.

    In fact this seems provably wasteful, as I've found that 2.5Gb of pixel arrays generates only about 80Kb of video. I've also come across examples where JavaScript can render high-quality animations in near-real time [1], which makes me even more suspicious that I'm doing something wrong.

    Is there a way to do this more efficiently, perhaps by piping the differences between pixel arrays into ffmpeg rather than the pixels themselves?

    (edit) This is the line I’m using. Most executions take the else path that follows.
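
    For context, a minimal sketch of the -f rawvideo piping approach the question describes (this is not the asker's code, which is not shown in this excerpt), assuming numpy frames of a fixed size; the key point is that each write sends a whole frame with tobytes() rather than individual pixels:

    import subprocess
    import numpy as np

    width, height, fps = 1280, 720, 30    # assumed frame geometry

    # Start ffmpeg reading raw RGB frames from stdin; the output name is hypothetical.
    proc = subprocess.Popen(
        ["ffmpeg", "-y",
         "-f", "rawvideo", "-pix_fmt", "rgb24",
         "-s", f"{width}x{height}", "-r", str(fps),
         "-i", "-",
         "-c:v", "libx264", "-pix_fmt", "yuv420p",
         "out.mp4"],
        stdin=subprocess.PIPE,
    )

    for _ in range(fps * 5):                                   # 5 seconds of video
        frame = np.zeros((height, width, 3), dtype=np.uint8)   # placeholder pixel array
        proc.stdin.write(frame.tobytes())                      # one full frame per write

    proc.stdin.close()
    proc.wait()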

  • FFmpeg: Concatenated clips extracted using -ss and -t hang or go out of sync while playing

    20 July 2013, by Yogi

    I have a set of movies in different formats, and I am trying to extract small clips from these source movies and concatenate them into one movie.

    My workflow has been the following:

     1. Convert all the source movies to the same format (width, height, fps, codec). The scale and pad options ensure that all movies are of the same size, even if their aspect ratios differ.

      ffmpeg  -i $infile -vcodec libx264 -strict -2  -vf "scale=iw*sar*min(${MAX_WIDTH}/(iw*sar)\,${MAX_HEIGHT}/ih):ih*min(${MAX_WIDTH}/(iw*sar)\,${MAX_HEIGHT}/ih),pad=${MAX_WIDTH}:${MAX_HEIGHT}:(ow-iw)/2:(oh-ih)/2" -b:v 500k -b:a 64k -movflags +faststart -g 10  -r 25 ${outbasename}.mp4

     2. Extract clips:

      ffmpeg -ss $starttime -t $duration -i $in_file -vcodec copy -acodec copy $out_file

     3. Then finally combine the clips by first making a concat_list.txt file, which contains the list of clips to be concatenated and their durations (an example of the format is shown after this list), and then using ffmpeg's concat demuxer:

      ffmpeg -f concat -i concat_list.txt -c copy -movflags +faststart $oname
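
     For reference, the concat demuxer's list file is plain text with one file directive per clip, optionally followed by a duration directive for that clip; the names and durations below are hypothetical:

      # concat_list.txt
      file 'clip_001.mp4'
      duration 12.5
      file 'clip_002.mp4'
      duration 8.0
      file 'clip_003.mp4'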

    The problem I am facing is that many of the final videos hang or go out of sync somewhere in the middle of playback. I have tried using mjpeg as the codec, but I still get the same behavior. I can play the individual extracted clips, and they all seem to play fine in most players. Does anybody know what I am doing wrong? I am using ffmpeg version 1.2.1.