
Media (1)
-
Bee video in portrait orientation
14 May 2011
Updated: February 2012
Language: French
Type: Video
Other articles (33)
-
Authorizations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors can edit their own information on the authors page
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or later. If necessary, contact the administrator of your MediaSPIP to find out.
-
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded as MP4, Ogv and WebM (supported by HTML5), with MP4 also readable by Flash.
Audio files are encoded as MP3 and Ogg (supported by HTML5), with MP3 also readable by Flash.
Where possible, text is analyzed to extract the data search engines need to index it, and is then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
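For readers who want a concrete picture of the conversion step described above, commands along the following lines produce the three formats mentioned. They are illustrative only, not MediaSPIP's actual encoding settings, and assume an ffmpeg build with libvpx, libtheora and libx264; "source.mov" is a placeholder input name.
ffmpeg -i source.mov -c:v libvpx -b:v 1M -c:a libvorbis output.webm
ffmpeg -i source.mov -c:v libtheora -q:v 6 -c:a libvorbis output.ogv
ffmpeg -i source.mov -c:v libx264 -crf 23 -c:a aac output.mp4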
On other sites (5838)
-
Concatenate chunk containing headers to another chunk in h264
18 November 2014, by Ortixx
I'm trying to extract thumbnails from a torrent stream by downloading the first couple of chunks to get the headers and another set of chunks from the middle, and then concatenating them into a single video file.
For this I'm using Node.js, but I'm having trouble with the concatenation part. Obviously the headers include the length of the video, so if I simply concatenate another chunk onto the end of the header chunk, it won't work.
In other words, I have two chunks of a video file: the first contains the headers and some material, and the other consists entirely of video stream data. I want to combine the two into a single video file.
So my question is: how can I make this work properly, if at all?
-
Add an image behind a video file (.mp4) in python
12 April 2020, by George Lua
I am attempting to add an image behind a video using moviepy and cannot come up with any way to do it. My image dimensions are 1080 x 720 and my video dimensions are 200 x 200. I am trying to put the video file on top of the image file, place it in the middle, and export the final video with the dimensions of the image. This is a visual representation of what I am trying to achieve:
[image omitted: the 200 x 200 video centred on the 1080 x 720 background image]
I tried doing this with the following code, but could not manage to get it to work. Any help is seriously appreciated :)



import moviepy.editor as mp

# Load the 200 x 200 video and the 1080 x 720 background image.
video = mp.VideoFileClip("/media/pi/vid.mp4")
logo = mp.ImageClip("/media/pi/pic.png")

# Preferably I would want the image to be the same duration as the video file.
# The following was just for testing.
# This overlays the image on top of the video and keeps the video's
# 200 x 200 frame size, so it does not give the result described above.
final = mp.CompositeVideoClip([video, logo.set_duration(3)])
final.write_videofile("test.mp4")



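One possible way to get the layout described above, sketched with moviepy 1.x (the moviepy.editor API used in the question); it has not been tested against the asker's files, and the paths and output name are placeholders:

import moviepy.editor as mp

video = mp.VideoFileClip("/media/pi/vid.mp4")   # 200 x 200 foreground clip

# Use the image as the bottom layer and give it the video's duration.
background = mp.ImageClip("/media/pi/pic.png").set_duration(video.duration)

# Centre the video on the background. CompositeVideoClip takes its frame
# size (1080 x 720) from the first clip in the list.
composite = mp.CompositeVideoClip([background, video.set_position("center")])

composite.write_videofile("composited.mp4", fps=video.fps)

The same idea applies in moviepy 2.x, but the clip methods are renamed there (with_duration, with_position).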
-
Got two working ffmpeg commands, help me transform them
7 March 2019, by lukistar
Here are two different commands.
ffmpeg -i input.mp3 -i second.mp3 -filter_complex "[0:a]atrim=end=10,asetpts=N/SR/TB[begin];[0:a]atrim=start=10,asetpts=N/SR/TB[end];[begin][1:a][end]concat=n=3:v=0:a=1[a]" -map "[a]" output
This command inserts second.mp3 into input.mp3. It seems to always keep the parameters of input.mp3, and it inserts it at exactly 10 seconds into input.mp3.
Here is the second command:
ffmpeg -i input.mp3 -i second.mp3 -filter_complex "[1:a]adelay=10000|10000[1a];[0:a][1a]amix=duration=first" output
This command is closer to my final goal. It plays input.mp3, and at exactly 10 seconds it plays second.mp3 along with it, without stopping input.mp3's sound.
My final goal is to create final.mp3.
Its duration must always equal input.mp3's duration, and it must keep the sample rate, channel count, etc. of input.mp3.
When playing final.mp3, it must play the whole input.mp3.
But every 10-15 seconds, it must play second.mp3 without stopping input.mp3.
You could say that I need to use the second command, but in a loop.
It would be great if there were a one-line ffmpeg command for that.
I am working with flac, mp3 and wav, and both of the commands I have work great for what they do. For example:
input.mp3 could be 40 seconds long.
second.mp3 could be 2 seconds long.
When I play final.mp3 it will be 40 seconds long, but every 10-15 seconds (at random) it will play second.mp3 at the same time as input.mp3.
Sadly I have no experience with ffmpeg; both of the commands I have are answers to questions here on Stack Overflow. Hope somebody can help me. Thank you!
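A hedged sketch of the kind of one-liner being asked for, not a verified answer: split second.mp3 into one delayed copy per insertion point and mix the copies over input.mp3, keeping input.mp3's length with duration=first. The 10000 ms and 25000 ms offsets below are hand-picked example values; truly random 10-15 second gaps would still have to be generated by a small wrapper script, since filter arguments are fixed when ffmpeg is invoked.
ffmpeg -i input.mp3 -i second.mp3 -filter_complex "[1:a]asplit=2[s1][s2];[s1]adelay=10000|10000[d1];[s2]adelay=25000|25000[d2];[0:a][d1][d2]amix=inputs=3:duration=first" output.mp3
Note that amix scales the input levels down when mixing, so a volume filter may be needed afterwards to bring the result back up.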