
Other articles (50)
-
List of compatible distributions
26 April 2011 — The table below lists the Linux distributions compatible with MediaSPIP's automated installation script.
Distribution name | Version name         | Version number
Debian            | Squeeze              | 6.x.x
Debian            | Wheezy               | 7.x.x
Debian            | Jessie               | 8.x.x
Ubuntu            | The Precise Pangolin | 12.04 LTS
Ubuntu            | The Trusty Tahr      | 14.04
If you want to help us improve this list, you can give us access to a machine whose distribution is not mentioned above, or send us the fixes needed to add (...) -
Submit enhancements and plugins
13 April 2011 — If you have developed a new extension that adds one or more useful features to MediaSPIP, let us know: its integration into the core MediaSPIP functionality will be considered.
You can use the development discussion list to request help with creating a plugin. As MediaSPIP is based on SPIP, you can also use the SPIP discussion list, SPIP-Zone. -
Sites built with MediaSPIP
2 May 2011 — This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page.
On other sites (6358)
-
Loop a song for a certain amount of time using FFMPEG
10 April 2017, by Eduardo Perez — So, I am trying to loop a song from a video game for one hour, using FFmpeg on Linux or in Windows 10 Bash. I have three versions of the song, as follows:
1. The full song in OGG format, with LOOPSTART and LOOPLENGTH metadata.
2. The same song in OGG format, but only the segment before LOOPSTART.
3. The same song in OGG format, but only the segment from LOOPSTART to LOOPSTART+LOOPLENGTH.
What I want to do is use either the full song file for the loop, which I don't think FFmpeg supports, or use the separate files for the beginning and looping portions of the song. Basically, I want FFmpeg to create a track that starts with the beginning segment and then repeats the looping segment until an hour has played. If possible, I would also like a second command that shows an image, or an MP4 video played in a loop, while the song plays; when a video is used, its own audio should not be merged with the song. What command should I use for this?
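One possible approach (a sketch, not from the original post; the filenames intro.ogg, loop.ogg, cover.png and the output names are placeholder assumptions) is to repeat the looping segment with -stream_loop -1, join it to the intro with the concat filter, and cut the output at one hour with -t:

```shell
# Sketch, assuming intro.ogg is the pre-LOOPSTART segment (file 2) and
# loop.ogg the looping segment (file 3). -stream_loop -1 repeats the second
# input indefinitely; concat plays intro then loop; -t 3600 stops the output
# after one hour.
ffmpeg -i intro.ogg -stream_loop -1 -i loop.ogg \
  -filter_complex "[0:a][1:a]concat=n=2:v=0:a=1[out]" \
  -map "[out]" -t 3600 song_1h.ogg

# Second command: loop a still image over the finished song. Mapping only
# 0:v and 1:a means any audio in the visual input is ignored; -shortest ends
# the video when the song ends.
ffmpeg -loop 1 -i cover.png -i song_1h.ogg \
  -map 0:v -map 1:a -c:v libx264 -tune stillimage -pix_fmt yuv420p \
  -shortest video_1h.mp4
```

For an MP4 visual instead of an image, replacing `-loop 1 -i cover.png` with `-stream_loop -1 -i clip.mp4` should loop the video the same way, again with its audio excluded by the explicit -map options.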
-
ffmpeg with cuda and concat videos [closed]
15 September 2024, by Petr Šimůnek — I have an ffmpeg command with NVIDIA hardware acceleration that I use to join two videos:


/home/videotest/ffmpeg-dev/ffmpeg -benchmark -hide_banner -loglevel warning -y -hwaccel_device 0 -hwaccel cuda -hwaccel_output_format cuda -i /mnt/video-storage/test/cuda/surfing/intermediate1.mp4 -i /mnt/video-storage/test/cuda/surfing/intermediate2.mp4 -filter_complex 'hwupload_cuda,[0:v:0][0:a:0][1:v:0][1:a:0]concat=n=2:v=1:a=1 [outv] [outa]' -map [outv] -map [outa] -pix_fmt yuv420p -c:v h264_nvenc -preset p6 -profile:v high -force_key_frames 'expr:gte(t,n_forced*5)' -bf 0 -movflags +faststart -b:v 3000k -maxrate 4000k /mnt/video-storage/test/cuda/surfing/game.mp4


I get an error message after starting :


Impossible to convert between the formats supported by the filter 'graph 0 input from stream 0:0' and the filter 'auto_scale_0'
[fc#0 @ 0x61ae2b37a1c0] Error reinitializing filters!


I can't figure out how to fix it. Can you help me? Thank you.


If I use nvdec/nvenc instead of cuda, this error does not appear and the video is created fine.
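For context, a plausible workaround sketch (an assumption consistent with the nvdec/nvenc observation above, not a confirmed fix from the thread): with -hwaccel_output_format cuda, decoded frames stay in GPU memory, which the CPU-side concat filter cannot read. Dropping that option keeps decoded frames in system memory so concat can run, while h264_nvenc still encodes on the GPU. Shortened paths are used here for readability:

```shell
# Sketch: decode with CUDA but leave frames in system memory (no
# -hwaccel_output_format cuda and no hwupload_cuda), so the software concat
# filter can process them; h264_nvenc then uploads and encodes on the GPU.
ffmpeg -y -hwaccel cuda \
  -i intermediate1.mp4 -i intermediate2.mp4 \
  -filter_complex '[0:v:0][0:a:0][1:v:0][1:a:0]concat=n=2:v=1:a=1 [outv] [outa]' \
  -map '[outv]' -map '[outa]' -pix_fmt yuv420p \
  -c:v h264_nvenc -preset p6 -profile:v high \
  -b:v 3000k -maxrate 4000k game.mp4
```

The trade-off is an extra GPU-to-CPU-to-GPU copy per frame, which is presumably why the fully CUDA pipeline was attempted in the first place.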


-
Use ffmpeg to compile RGBA image sequence into two separate videos (RGB + Alpha)
27 November 2018, by MirceaKitsune — I plan on using Blender to render animated sequences for a game. I need the videos to have transparent backgrounds in the format its engine expects: the transparency is defined as a separate grayscale video of equal FPS and duration. Since splitting the RGB and alpha channels in Blender is more complicated, I'd prefer doing this directly from ffmpeg.
The input directory contains an image sequence of png files in RGBA format (e.g. 0000.png to 0100.png). What I need ffmpeg to do is compile them into two separate videos: one containing only the RGB channels (leaving transparency black) and another containing only the grayscale alpha channel. In the end I'd have something like my_video.mp4 + my_video_mask.mp4.
I’m familiar with compiling non-transparent image sequences into video using :
ffmpeg -f image2 -i /path/to/sequence/%04d.png output.mp4
But I don’t know how to extract the alpha channel and make it a separate video file. What is the simplest ffmpeg command to achieve this result ?
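One way to do this (a sketch; the framerate of 30 and the libx264 codec choice are assumptions) is to encode the sequence twice, once as-is and once through the alphaextract filter, which replaces each frame with its alpha channel rendered as grayscale:

```shell
# RGB video: the alpha channel is simply dropped by the yuv420p conversion.
ffmpeg -framerate 30 -i /path/to/sequence/%04d.png \
  -c:v libx264 -pix_fmt yuv420p my_video.mp4

# Alpha mask video: format=rgba guarantees the filter sees an alpha plane,
# then alphaextract outputs it as a grayscale image.
ffmpeg -framerate 30 -i /path/to/sequence/%04d.png \
  -vf "format=rgba,alphaextract" -c:v libx264 -pix_fmt yuv420p my_video_mask.mp4
```

One caveat: dropping the alpha plane does not composite transparent pixels onto black; it keeps whatever RGB values the pngs store there. If Blender writes non-black color in fully transparent areas, an extra compositing step over a black background would be needed to meet the "leave transparency black" requirement exactly.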