
Media (91)
-
Spoon - Revenge!
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
My Morning Jacket - One Big Holiday
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Zap Mama - Wadidyusay?
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
David Byrne - My Fair Lady
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Beastie Boys - Now Get Busy
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Granite de l’Aber Ildut
9 September 2011
Updated: September 2011
Language: French
Type: Text
Other articles (78)
-
Websites made with MediaSPIP
2 May 2011
This page lists some websites based on MediaSPIP.
-
Sites built with MediaSPIP
2 May 2011
This page presents some of the sites running MediaSPIP.
You can of course add yours using the form at the bottom of the page.
-
Creating farms of unique websites
13 April 2011
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things):
- implementation costs to be shared between several different projects/individuals
- rapid deployment of multiple unique sites
- creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (11935)
-
Ffmpeg: Move a slider "image" over a "background" from 0% to 100% in sync with the audio
7 February 2014, by Znuff
I'm trying to create a video using the following code:
$ ffmpeg -loop 1 -r 5 -i video.png -r 5 -i progress.png -filter_complex "overlay=x='if(gte(t,0), -W+(t)*5, NAN)':y=H-h" -i video.mp3 -acodec copy video.mp4
I have the following files
- video.png
- this is a 1280x720 px still frame that is simply a background with a waveform of the video.mp3 file
- progress.png
- this is simply a 1280x100 px semi-transparent image that should move from 0 to 100% of the width of the video.png file, in order to simulate a "fill up" animation.
My issues are as follows:
- The video is not in sync with the audio: the progress bar is way off, and instead of finishing at the end of the song it just keeps going.
- Also, the output never ends on its own: I let it run and it produced a one-hour video without ever stopping.
I know I'm missing something in the filter, but I have no idea how I could fix it.
Could someone lend me some help?
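For what it's worth, here is a minimal sketch of one way to tie the slider to the track, assuming video.mp3 is 180 seconds long (an assumed figure; the real duration could be read beforehand with ffprobe -show_entries format=duration video.mp3). The x expression is scaled by that duration so the overlay travels its full width over the whole song, and -shortest stops the output together with the audio instead of letting the looped images run forever:

# assumes a 180-second video.mp3; replace 180 with the real duration
ffmpeg -loop 1 -r 5 -i video.png -loop 1 -r 5 -i progress.png -i video.mp3 \
  -filter_complex "[0:v][1:v]overlay=x='-W+W*t/180':y=H-h[v]" \
  -map "[v]" -map 2:a -acodec copy -shortest video.mp4

At t=0 the bar sits at x=-W (fully off-screen) and at t=180 it reaches x=0, so the "fill up" finishes exactly when the song does.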
-
command line audio equalizer ffmpeg sox
5 November 2016, by user1320370
I need to add an equalizer effect to some FLAC files:
f=4043, 1.65q, g=9.5; f=7024, 1.09q, g=3.7; f=9254, 0.94q, g=-2.5
I tried ffmpeg:
ffmpeg -i solovoce_compress.flac -af "equalizer=f=4043:width_type=q:w=1.65:g=9.5,equalizer=f=7024:width_type=q:w=1.09:g=3.7,equalizer=f=9254:width_type=q:w=0.94:g=-2.5" solovoce_equalizzato.flac
but the result is quite different from what I expect; these values were calculated with iZotope Ozone (professional audio editing software) and tested.
I tried the equivalent command with SoX and the result is the same.
At this point I would like to try ffmpeg's 'anequalizer' filter, which can display the frequency response so I can 'see' the difference, but the documentation on showing the graph is not clear and I could not find anything about it on the web.
Could someone please send me an example of 'anequalizer' with the plot enabled?
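In case it helps, below is a minimal, untested sketch of anequalizer with its curves display turned on. The band widths are assumptions derived from the Q values with w ≈ f/Q (4043/1.65 ≈ 2450 Hz, and so on), the file is assumed to be mono (channel c0), and when curves=1 the filter exposes an extra video output that has to be mapped, so the details are worth double-checking against ffmpeg -h filter=anequalizer:

# widths converted from Q to Hz with w ≈ f/Q; c0 = first channel only (assumed mono input)
ffmpeg -i solovoce_compress.flac -filter_complex \
  "anequalizer=params='c0 f=4043 w=2450 g=9.5|c0 f=7024 w=6444 g=3.7|c0 f=9254 w=9845 g=-2.5':curves=1[a][v]" \
  -map "[a]" -c:a flac -map "[v]" solovoce_eq_curves.mkv

For a stereo file each band would be repeated for c1, and the plotted response can then be compared against the iZotope curve to see where the two filters disagree.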
-
Matplotlib pipe canvas.draw() to ffmpeg - unexpected result [duplicate]
31 July 2022, by Narusan
I'm using this code from here to try and pipe multiple matplotlib plots into ffmpeg to write a video file:


import numpy as np
import matplotlib.pyplot as plt
import subprocess

xlist = np.random.randint(100,size=100)
ylist = np.random.randint(100, size=100)
color = np.random.randint(2, size=100)

f = plt.figure(figsize=(5,5), dpi = 300)
canvas_width, canvas_height = f.canvas.get_width_height()
ax = f.add_axes([0,0,1,1])
ax.axis('off')


# Open an ffmpeg process
outf = 'ffmpeg.mp4'
cmdstring = ('ffmpeg',
 '-y', '-r', '30', # overwrite, 30fps
 '-s', '%dx%d' % (canvas_width, canvas_height), # size of image string
 '-pix_fmt', 'argb', # format
 '-f', 'rawvideo', '-i', '-', # tell ffmpeg to expect raw video from the pipe
 '-vcodec', 'mpeg4', outf) # output encoding
p = subprocess.Popen(cmdstring, stdin=subprocess.PIPE)

# Draw 10 frames and write each one to the pipe
for frame in range(10):
 print("Working on frame")
 # draw the frame
 f = plt.figure(figsize=(5,5), dpi=300)
 ax = f.add_axes([0,0,1,1])
 ax.scatter(xlist, ylist,
 c=color, cmap = 'viridis')
 f.canvas.draw()
 plt.show()

 # extract the image as an ARGB string
 string = f.canvas.tostring_argb()
 # write to pipe
 p.stdin.write(string)

# Finish up
p.communicate()



While plt.show() does show the correct plot (see image below), the video that ffmpeg creates is a bit different from what plt.show() shows. I am presuming the issue is with f.canvas.draw(), but I'm not sure how to get a look at what canvas.draw() actually plots.



ffmpeg video (imgur link)
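A low-tech way to look at what canvas.draw() / tostring_argb() actually produced, independent of the pipe, is to dump a single frame's bytes to a file from inside the loop (for example open('frame.argb', 'wb').write(string), a hypothetical addition) and then have ffmpeg decode just that frame with the same size and pixel format used on the command line:

# decode one raw ARGB frame into a PNG for inspection; 1500x1500 assumes figsize=(5,5) at dpi=300
ffmpeg -f rawvideo -pix_fmt argb -s 1500x1500 -i frame.argb -frames:v 1 canvas_check.png

If canvas_check.png already looks wrong, the mismatch is between the declared -s/-pix_fmt and the bytes the canvas really renders (on HiDPI displays, for instance, the rendered buffer can be larger than f.canvas.get_width_height() reports); if it looks fine, the problem is further down the pipeline.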