Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
Running cmd in python (ffmpeg)
29 March 2017, by Coolcrab
At the moment I have this as my code. The first line seems to work well, but the second gives errors.
os.chdir('C://Users/Alex/Dropbox/code stuff/test')
subprocess.call(['ffmpeg', '-i', 'test%d0.png', 'output.avi'])
error:
Traceback (most recent call last):
  File "C:\Users\Alex\Dropbox\code stuff\solarsystem.py", line 56, in <module>
    subprocess.call(['ffmpeg', '-i', 'test%d0.png', 'output.avi'])
  File "C:\Python27\lib\subprocess.py", line 524, in call
    return Popen(*popenargs, **kwargs).wait()
  File "C:\Python27\lib\subprocess.py", line 711, in __init__
    errread, errwrite)
  File "C:\Python27\lib\subprocess.py", line 948, in _execute_child
    startupinfo)
WindowsError: [Error 2] The system cannot find the file specified
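WindowsError: [Error 2] from subprocess.call on Windows usually means the executable itself ('ffmpeg') cannot be found on the PATH seen by the Python process, not that the PNG files are missing. A minimal sketch, assuming a hypothetical install location C:/ffmpeg/bin/ffmpeg.exe (point it at wherever ffmpeg.exe actually lives, or add that folder to PATH so plain 'ffmpeg' resolves):

import os
import subprocess

os.chdir('C:/Users/Alex/Dropbox/code stuff/test')

# Hypothetical location of ffmpeg.exe; adjust to the real install path.
ffmpeg_exe = 'C:/ffmpeg/bin/ffmpeg.exe'

ret = subprocess.call([ffmpeg_exe, '-i', 'test%d0.png', 'output.avi'])
print ret  # 0 means ffmpeg exited successfully
-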
opencv rtsp stream protocol
29 March 2017, by k_kaz
I want to process and display a network RTSP stream that is produced by a Raspberry Pi camera. I have this code:
#include
#include
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>

int main(int argc, char** argv)
{
    cv::VideoCapture * stream = new cv::VideoCapture("rtsp://192.168.55.151:8554/");
    if (!stream->isOpened())
        return -1;
    cv::namedWindow("rtsp_stream", CV_WINDOW_AUTOSIZE);
    cv::Mat frame;
    while (true)
    {
        if (!stream->read(frame))
            return -1;
        cv::imshow("rtsp_stream", frame);
        cv::waitKey(15);
    }
    return 1;
}

When the stream is not live, running this results in:
[tcp @ 0xa12480] Connection to tcp://192.168.55.151:8554?timeout=0 failed: Connection refused
This means that the client tries to connect over TCP. When the stream is live, the execution results in:
[rtsp @ 0xb07960] method SETUP failed: 461 Client error
From internet research I found that the problem may be that the stream uses UDP. If I change the URL to:
"udp://192.168.55.151:8554/"
then the execution freezes in the call to
cv::VideoCapture("udp://192.168.55.151:8554/");
VLC is able to open the RTSP stream. As I understand it, OpenCV uses ffmpeg to decode the stream. When I run:
ffmpeg -rtsp_transport udp -i rtsp://192.168.55.151:8554/ -t 5 test.mp4
the stream is decoded and saved successfully. So how can I specify UDP as the lower-level transport in the OpenCV code? Is there another way to do it with OpenCV?
EDIT: If I change the ffmpeg command to use TCP, i.e.:
ffmpeg -rtsp_transport tcp -i rtsp://192.168.55.151:8554/ -t 5 test.mp4
then I get exactly the same error as with the C++ code: 461 Client error.
EDIT: When the code uses the udp:// URL, execution freezes for about 15 seconds and then returns with an error.
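One thing worth trying, assuming a reasonably recent OpenCV build (older builds may ignore it, so verify against the installed version): OpenCV's FFmpeg backend reads the OPENCV_FFMPEG_CAPTURE_OPTIONS environment variable, with options given as key;value pairs separated by |, so the RTSP transport can be forced to UDP without changing the C++ code:

export OPENCV_FFMPEG_CAPTURE_OPTIONS="rtsp_transport;udp"
./rtsp_viewer   # hypothetical name of the compiled program above

If the installed version does not honour that variable, another option is to restream locally with ffmpeg (as in the working command above) and point cv::VideoCapture at the restreamed output.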
-
Video from set of images with fading
29 March 2017, by MarkusH
Hi, I need to generate a video (mp4) from a set of images in a folder. My command works fine, but I want the frames to fade in/out, crossfade, or something similar to make the video "smoother".
My command:
exec("ffmpeg -framerate 30 -y -pattern_type glob -i 'images/*.jpg' -filter:v scale=1280:-1 -q:v 0 -vcodec mpeg4 -strict -2 -movflags faststart folder/video.mp4 2> ffmpeg_log.txt");
Thanks for any assistance
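One option to try, as a sketch only (the rest of the original command is kept as-is): the framerate filter builds its output frames by blending neighbouring input frames, so feeding the stills at a low rate such as 1 fps and letting the filter interpolate up to 30 fps gives a crossfade-like transition between images:

ffmpeg -framerate 1 -y -pattern_type glob -i 'images/*.jpg' -filter:v "scale=1280:-1,framerate=fps=30" -q:v 0 -vcodec mpeg4 -strict -2 -movflags faststart folder/video.mp4 2> ffmpeg_log.txt

Each image then holds for roughly a second, with the in-between frames blended from its neighbours; the input -framerate and the filter's fps value control how long each image stays and how long the blend lasts.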
-
ffmpeg - extract exact number of frames from video
29 March 2017, by Michael B
I want to create a maximum of 30 images from a video (and tile them for a sprite sheet).
I've tried using 'select' with 'mod', but if the total number of frames does not divide neatly by the desired number of images (30), then I sometimes end up with more images, sometimes fewer.
For example if my video is 72 frames long, my 'mod' would be 72 / 30, which is 2.4.
I'm running this from a Python script, so I'm doing something like the following for the filter:
select='not(mod(n\," + str(mod) + "))'
I think the mod has to be an integer (?), so I could either round down and use 2, which gives me 36 images, or round up to 3, which gives me 24 images.
What's the best way to get exactly 30? Obviously the interval wouldn't be identical, but that's fine.
Maybe I could use a for loop to generate a list of the frames closest to the desired interval and then pass that in as the select filter?
e.g. to get the frames I would do something like this:
import math

nframes = 72                       # number of frames in video
outImages = 30                     # number of images I want
mod = float(nframes) / outImages   # 2.4

frames = []
idx = 1
i = 0
while i < nframes:
    print str(idx) + ": " + str(math.floor(i + 0.5))
    frames.append(int(math.floor(i + 0.5)))
    idx += 1
    i += mod
Then am I able to pass that (the frames list) into the ffmpeg command? Or can I tell ffmpeg to do something similar?
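The frame list can be turned into an explicit select expression of the form eq(n,F1)+eq(n,F2)+..., which ffmpeg evaluates per frame; combined with -vsync vfr, only the listed frames are written. A rough sketch (untested, with placeholder file names):

import subprocess

# frames computed as above, e.g. [0, 2, 5, 7, ...]
select_expr = "+".join("eq(n,%d)" % f for f in frames)

cmd = [
    "ffmpeg", "-i", "input.mp4",           # placeholder input name
    "-vf", "select='%s'" % select_expr,
    "-vsync", "vfr",                       # drop everything except the selected frames
    "-frames:v", str(len(frames)),
    "thumb%02d.png",                       # placeholder output pattern
]
subprocess.call(cmd)

With 30 eq() terms the expression stays short enough to pass as a single -vf argument, and the tiling step can then run on the 30 output images as before.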
-
ffmpeg- blend transition between images specified in a text file
29 March 2017, by hack
How can I create a blend transition effect between specified images? I have the file paths to the images in a text file. My requirement is that I should be able to do a blend transition between specific images (whose paths are in the text file) for a specific duration. I also have to annotate these images with text; for that I am using the drawtext filter. The blend transition should be between the annotated images.
Thanks
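A rough sketch of one way to handle a single pair (file names, captions, sizes, and the one-second fade are placeholders): loop each still as a short clip, apply drawtext to each, then crossfade with the blend filter, whose expression here moves from the first image to the second during the first second:

ffmpeg -loop 1 -t 2 -i img1.jpg -loop 1 -t 2 -i img2.jpg -filter_complex "[0:v]scale=1280:720,setsar=1,drawtext=text='Caption 1':fontsize=48:x=20:y=20[a];[1:v]scale=1280:720,setsar=1,drawtext=text='Caption 2':fontsize=48:x=20:y=20[b];[b][a]blend=all_expr='A*(if(gte(T,1),1,T))+B*(1-(if(gte(T,1),1,T)))'" -t 2 pair1.mp4

Depending on the build, drawtext may need an explicit fontfile= option. Reading the paths from the text file, producing one such clip per consecutive pair, and then joining the clips with the concat demuxer would cover the whole list.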