
Media (1)
-
Ogg detection bug
22 March 2013
Updated: April 2013
Language: French
Type: Video
Other articles (37)
-
Customising by adding your logo, banner or background image
5 September 2013
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013
Present the changes to your MédiaSPIP, or news about your projects, on your MédiaSPIP using the news section.
In the default MédiaSPIP theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customise the news item creation form.
News item creation form: for a document of type news item, the default fields are: publication date (customise the publication date) (...) -
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed Médiaspip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.
On other sites (7072)
-
How to fetch both live video frame and timestamp from ffmpeg to python on Windows
6 March 2017, by vijiboy
Searching for an alternative because OpenCV would not provide timestamps for a live camera stream (on Windows), which my computer vision algorithm requires, I found ffmpeg and this excellent article: https://zulko.github.io/blog/2013/09/27/read-and-write-video-frames-in-python-using-ffmpeg/
The solution uses ffmpeg and reads its standard output (stdout) stream. I extended it to read the standard error (stderr) stream as well. Working up the Python code on Windows, I received the video frames from ffmpeg's stdout, but stderr freezes after delivering the showinfo video-filter details (timestamp) for the first frame.
I recall seeing somewhere on the ffmpeg forum that video filters like showinfo are bypassed when output is redirected. Is this why the following code does not work as expected?
Expected: it should write the video frames to disk and print the timestamp details.
Actual: it writes the video frames but does not get the timestamp (showinfo) details. Here is the code I tried:
import subprocess as sp
import numpy
import cv2

command = ['ffmpeg',
           '-i', 'e:\sample.wmv',
           '-pix_fmt', 'rgb24',
           '-vcodec', 'rawvideo',
           '-vf', 'showinfo',  # video filter - showinfo will provide frame timestamps
           '-an', '-sn',       # -an, -sn disable audio and subtitle processing respectively
           '-f', 'image2pipe', '-']  # we need to output to a pipe

pipe = sp.Popen(command, stdout=sp.PIPE, stderr=sp.PIPE)  # TODO someone on the ffmpeg forum said video filters (e.g. showinfo) are bypassed when stdout is redirected to pipes???

for i in range(10):
    raw_image = pipe.stdout.read(1280*720*3)
    img_info = pipe.stderr.read(244)  # 244 characters is the current length of one showinfo line
    print "showinfo output", img_info
    image1 = numpy.fromstring(raw_image, dtype='uint8')
    image2 = image1.reshape((720, 1280, 3))
    # write the video frame to file just to verify
    videoFrameName = 'Video_Frame{0}.png'.format(i)
    cv2.imwrite(videoFrameName, image2)
    # throw away the data in the pipe's buffer
    pipe.stdout.flush()
    pipe.stderr.flush()

So how can I still get the frame timestamps from ffmpeg into the Python code so that they can be used in my computer vision algorithm?
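
One workaround that is often suggested for this kind of stall (the sketch below is an illustration, not part of the original question) is to drain ffmpeg's stderr on a background thread and hand the showinfo lines to the frame loop through a queue, instead of issuing fixed-size blocking reads on stderr. The input path, frame size and queue hand-off are assumptions carried over from the question's example:

import subprocess as sp
import threading
import Queue   # named 'queue' in Python 3
import numpy
import cv2

command = ['ffmpeg',
           '-i', r'e:\sample.wmv',  # assumed input, as in the question
           '-pix_fmt', 'rgb24',
           '-vcodec', 'rawvideo',
           '-vf', 'showinfo',       # showinfo logs one line per frame (n, pts, pts_time, ...) on stderr
           '-an', '-sn',
           '-f', 'image2pipe', '-']

pipe = sp.Popen(command, stdout=sp.PIPE, stderr=sp.PIPE)
info_lines = Queue.Queue()

def drain_stderr(stream, lines):
    # Read stderr line by line in the background so it can never block the frame loop.
    for line in iter(stream.readline, ''):
        if 'pts_time' in line:   # keep only the per-frame showinfo lines
            lines.put(line)

worker = threading.Thread(target=drain_stderr, args=(pipe.stderr, info_lines))
worker.daemon = True
worker.start()

for i in range(10):
    raw_image = pipe.stdout.read(1280*720*3)   # exactly one rgb24 frame
    if len(raw_image) < 1280*720*3:
        break
    frame = numpy.fromstring(raw_image, dtype='uint8').reshape((720, 1280, 3))
    cv2.imwrite('Video_Frame{0}.png'.format(i), frame)
    try:
        print "showinfo output", info_lines.get(timeout=2.0)
    except Queue.Empty:
        print "no showinfo line received for frame", i

With stderr consumed continuously, the stdout reads stay aligned with whole frames and the timestamp lines arrive roughly in step with them.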
-
Raw Audio from NAudio
20 February 2017, by Ken
I want to record raw audio from the WASAPI loopback with NAudio and pipe it to FFmpeg for streaming via a memory stream. According to this document, FFmpeg can take raw input. However, the result plays at 8-10x speed!
Here is my code:

waveInput = new WasapiLoopbackCapture();
waveInput.DataAvailable += new EventHandler<WaveInEventArgs>((object sender, WaveInEventArgs e) =>
{
    lock (e.Buffer)
    {
        if (waveInput == null)
            return;
        try
        {
            using (System.IO.MemoryStream memoryStream = new System.IO.MemoryStream())
            {
                memoryStream.Write(e.Buffer, 0, e.Buffer.Length);
                memoryStream.WriteTo(ffmpeg.StandardInput.BaseStream);
            }
        }
        catch (Exception)
        {
            throw;
        }
    }
});
waveInput.StartRecording();

FFmpeg arguments:
ffmpegProcess.StartInfo.Arguments = String.Format("-f s16le -i pipe:0 -y output.wav");
1. Can someone please explain this situation and give me a solution?
2. Should I add a WAV header to the memory stream and then pipe it to FFmpeg as WAV format?

The working solution:
waveInput = new WasapiLoopbackCapture();
waveInput.DataAvailable += new EventHandler<WaveInEventArgs>((object sender, WaveInEventArgs e) =>
{
    lock (e.Buffer)
    {
        if (waveInput == null)
            return;
        try
        {
            using (System.IO.MemoryStream memoryStream = new System.IO.MemoryStream())
            {
                // write only the bytes actually recorded in this callback, not the whole buffer
                memoryStream.Write(e.Buffer, 0, e.BytesRecorded);
                memoryStream.WriteTo(ffmpeg.StandardInput.BaseStream);
            }
        }
        catch (Exception)
        {
            throw;
        }
    }
});
waveInput.StartRecording();

FFmpeg arguments:
ffmpegProcess.StartInfo.Arguments = string.Format("-f f32le -ac 2 -ar 44.1k -i pipe:0 -c:a copy -y output.wav");
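
For readers following the Python examples earlier on this page, here is a minimal, self-contained sketch of the same raw-PCM piping idea (an illustration only, not part of Ken's answer; a generated sine wave stands in for the WASAPI loopback capture, and ffmpeg is assumed to be on the PATH). Because raw PCM carries no header, the -f f32le -ac 2 -ar flags must describe the bytes exactly, which is what the working arguments above do for the 32-bit float data that WasapiLoopbackCapture typically delivers:

import subprocess as sp
import numpy as np

sample_rate = 44100
duration_s = 2.0

# Generate 2 seconds of a 440 Hz tone as interleaved 2-channel float32 samples
# (this stands in for the bytes coming from the loopback capture).
t = np.arange(int(sample_rate * duration_s)) / float(sample_rate)
tone = (0.25 * np.sin(2 * np.pi * 440.0 * t)).astype(np.float32)
stereo = np.column_stack((tone, tone))   # shape (N, 2), row-major => L R L R ...

# Describe the raw stream to ffmpeg, then feed the bytes through stdin (pipe:0).
command = ['ffmpeg',
           '-f', 'f32le',            # 32-bit float little-endian samples
           '-ac', '2',               # 2 channels
           '-ar', str(sample_rate),  # sample rate of the raw data
           '-i', 'pipe:0',
           '-y', 'output.wav']

proc = sp.Popen(command, stdin=sp.PIPE)
proc.stdin.write(stereo.tobytes())
proc.stdin.close()
proc.wait()

If the declared sample format, channel count or rate does not match the bytes actually written, the encoded file comes out with the wrong duration and playback speed, which matches the symptom of the original s16le attempt.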