Newest 'ffmpeg' Questions - Stack Overflow
-
FFMPEG retrieve time duration not working for MOV files
1 April 2015, by Calvin Moss
I am running an ffprobe command on files from PHP and grabbing the output (to get the duration, time, etc.):
exec("usr/bin/ffprobe -v quiet-print_format json -show_format -show_streams $location", $output, $exitCode);
where $location is the file's path plus its filename. This command works perfectly for .mp4, .avi, and everything else I have tried, except .mov: with a .mov file my $output variable comes back empty. Is there any way to fix or change this?
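A first step is usually to look at what ffprobe itself reports for the .mov file: -v quiet suppresses its error messages, and exec() only captures stdout while ffprobe writes errors to stderr, so a failure simply shows up as an empty $output. In PHP you could append 2>&1 to the command, or pass $location through escapeshellarg() in case the path contains spaces. As a rough illustration of the same debugging idea outside PHP, here is a sketch in Java; the ffprobe flags are standard, the file path is hypothetical:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ProbeDebug {
    public static void main(String[] args) throws Exception {
        // Hypothetical path; replace with the real .mov location.
        String location = "/path/to/video.mov";

        // Deliberately no "-v quiet" here, so ffprobe's error text stays visible,
        // and stderr is merged into stdout so nothing is dropped.
        ProcessBuilder pb = new ProcessBuilder(
                "/usr/bin/ffprobe",
                "-print_format", "json",
                "-show_format", "-show_streams",
                location);
        pb.redirectErrorStream(true);
        Process p = pb.start();

        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line); // JSON on success, error text on failure
            }
        }
        System.out.println("exit code: " + p.waitFor());
    }
}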
-
rotate Frame in JavaCV
1 April 2015, by Vahagn
I am writing a simple app for Android which uses FFmpeg + JavaCV. I can capture an image from the camera and record video, but I cannot manage to rotate the frame while recording video :-(
Stop and save code:
public void stopRecording() {
    runAudioThread = false;
    try {
        audioThread.join();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    audioRecordRunnable = null;
    audioThread = null;

    if (recorder != null && recording) {
        if (RECORD_LENGTH > 0) {
            Log.v(LOG_TAG, "Writing frames");
            try {
                // Write the buffered video frames in chronological order.
                int firstIndex = imagesIndex % images.length;
                int lastIndex = (imagesIndex - 1) % images.length;
                if (imagesIndex <= images.length) {
                    firstIndex = 0;
                    lastIndex = imagesIndex - 1;
                }
                if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
                    startTime = 0;
                }
                if (lastIndex < firstIndex) {
                    lastIndex += images.length;
                }
                for (int i = firstIndex; i <= lastIndex; i++) {
                    long t = timestamps[i % timestamps.length] - startTime;
                    if (t >= 0) {
                        if (t > recorder.getTimestamp()) {
                            recorder.setTimestamp(t);
                        }
                        recorder.record(images[i % images.length]);
                    }
                }

                // Write the buffered audio samples.
                firstIndex = samplesIndex % samples.length;
                lastIndex = (samplesIndex - 1) % samples.length;
                if (samplesIndex <= samples.length) {
                    firstIndex = 0;
                    lastIndex = samplesIndex - 1;
                }
                if (lastIndex < firstIndex) {
                    lastIndex += samples.length;
                }
                for (int i = firstIndex; i <= lastIndex; i++) {
                    recorder.recordSamples(samples[i % samples.length]);
                }
            } catch (FFmpegFrameRecorder.Exception e) {
                Log.v(LOG_TAG, e.getMessage());
                e.printStackTrace();
            }
        }

        recording = false;
        Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");
        try {
            recorder.stop();
            recorder.release();
        } catch (FFmpegFrameRecorder.Exception e) {
            e.printStackTrace();
        }
        recorder = null;
    }
}
At the step
recorder.record(images[i % images.length]);
I need to rotate the frame (the images[i] argument) before it is recorded. How can I do this? Thank you.
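One possible approach, sketched below and not specific to this project, is to convert each Frame to an OpenCV Mat, rotate it with transpose plus flip, and convert it back before handing it to recorder.record(). This assumes a JavaCV release that ships OpenCVFrameConverter (0.11 or newer); package paths differ between JavaCV versions, and FrameRotation / rotate90Clockwise are hypothetical names:

import org.bytedeco.javacpp.opencv_core.Mat;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.OpenCVFrameConverter;

import static org.bytedeco.javacpp.opencv_core.flip;
import static org.bytedeco.javacpp.opencv_core.transpose;

public class FrameRotation {
    private final OpenCVFrameConverter.ToMat converter = new OpenCVFrameConverter.ToMat();

    // Hypothetical helper: returns the frame rotated 90 degrees clockwise.
    public Frame rotate90Clockwise(Frame frame) {
        Mat src = converter.convert(frame);   // wrap the frame's pixel data as a Mat
        Mat tmp = new Mat();
        transpose(src, tmp);                  // swap rows and columns
        Mat dst = new Mat();
        flip(tmp, dst, 1);                    // flip around the y-axis -> 90 degrees clockwise
        return converter.convert(dst);
    }
}

In stopRecording() the call could then become recorder.record(rotate90Clockwise(images[i % images.length])); for a 90 degree rotation the recorder would most likely also need to be created with its width and height swapped.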
-
Muxing in audio to gstreamer RTMP stream kills both video and Audio
1 April 2015, by Adam
I need some genius help here: I'm trying to set up a live stream for my upcoming wedding, and I have it ALMOST working; audio seems to be the problem.
This is my setup:
- Raspberry Pi Model B+
- Logitech C920 (with onboard h264 encoding that I am utilising)
- on-camera (C920) microphone
- USB wifi to iPhone 4G connection
- gstreamer1.0
- Amazon EC2 Wowza RTMP server
I have it all set up, but as soon as I mux in the audio, the stream won't play in any player.
What works:
- my gstreamer pipeline WITHOUT the audio muxed in
- Wowza receives a consistent stream, no failures
- the various Flash players / iOS / Android and VLC all play back the video
What doesn't:
- enabling audio in the mux (using the pipeline below)
- BUT gstreamer doesn't complain
- BUT Wowza receives a consistent stream, no failures
- the various Flash players fail to play both audio and video; some just display the first video frame
- VLC plays 1 video frame and about 100 ms of audio, then stops
Ideally I'd like the muxed audio/video FLV stored on the SD card too in case the network goes down - but if the 'tee' needs to be sacrificed to make it work, so be it.
This is my current FAILING pipeline. I assume there's something really stupid in it, because I know practically nothing about gstreamer. The first frame loads in all the players (except iOS, which never shows anything):
# set camera resolution to 720p, and the data format to H264 (alternatives are YUV and JPG)
v4l2-ctl --device=/dev/video0 --set-fmt-video=width=1280,height=720,pixelformat=1
# set the frame rate
v4l2-ctl --device=/dev/video0 --set-parm=10

gst-launch-1.0 -v -e uvch264src initial-bitrate=300000 average-bitrate=300000 device=/dev/video0 name=src auto-start=true src.vidsrc \
    ! queue \
    ! video/x-h264,width=1280,height=720,framerate=10/1 \
    ! h264parse \
    ! flvmux streamable=true name=mux \
    ! queue \
    ! tee name=t \
    ! queue \
    ! filesink location=/home/pi/wedding.flv t. \
    ! queue \
    ! rtmpsink location='rtmp://wowzaserver/live/wedding live=1' >>/home/pi/wedding.log 2>&1
Some of the things I can't really afford to change at this late stage are the encapsulation (FLV) and wowza RTMP because I've built everything around that...
Please Help!! Thanks!
UPDATE
Given that I am also saving the FLV file, I have found that if I use ffmpeg to send that FLV file (with audio copy, video copy) to the RTMP server, everything works (but obviously it's not live)! So I am now starting to believe this is a problem with the way gstreamer encapsulates RTMP, and putting ffmpeg in the middle fixes it... but of course that isn't live. Is it possible to pipe my output to ffmpeg and use ffmpeg's RTMP output?
-
Minimum of 10 fps with FFmpeg and libvp9
1 April 2015, by Richy321
I'm encoding a video with ffmpeg and the VP9 encoder (using image2pipe input), with the following command line called from C#:
ffmpeg.exe" -r 5 -f image2pipe -i pipe:.bmp -vcodec libvpx -b:v 2M -crf 10 -qmin 4 -qmax 40 -threads 8 -slices 8 "fpsTest.avi"
If I set the fps below 10 (using the -r parameter), all I get is a repeat of the first frame for the whole duration of the video. If I set the fps to 10 or above, everything works correctly.
Any ideas?
-
Encoding a live stream of bitmaps using FFMPEG
1 April 2015, by user3088260
I've got a constant stream of JPEGs that needs to be converted to a video (of any kind).
The problem is that the stream is virtually infinite, and I don't know of any method of creating a video from pictures that doesn't keep the whole file in memory until the stream stops. Is there a way I can feed FFmpeg with bitmaps or JPEGs continuously from C#, or at least make ffmpeg read images from a folder containing more than 432000 images (on the command line)?
Please note that there will be no sound, the images will be low quality, they will arrive at a rate of about 5 FPS, and this will run on a fairly high-end laptop. I can provide code, but I assume it is unnecessary at the moment since I am only creating bitmaps and saving them.
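A common pattern here is to start ffmpeg with -f image2pipe reading from its standard input and write each image's bytes into the pipe as it is produced, so nothing accumulates in memory and the output file grows incrementally. The question asks about C#, but the mechanics are the same from any language; below is a rough sketch of the idea in Java, where the executable path, output name, and frame source are placeholders:

import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;

public class JpegStreamEncoder {
    public static void main(String[] args) throws Exception {
        // Placeholder command; paths and the output name are illustrative.
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg",
                "-f", "image2pipe",   // single images arriving on a pipe
                "-r", "5",            // rate at which the pictures arrive
                "-c:v", "mjpeg",      // the piped images are JPEGs (use "bmp" for bitmaps)
                "-i", "-",            // read them from standard input
                "-c:v", "libx264",    // encode to H.264; any encoder works here
                "-pix_fmt", "yuv420p",
                "out.mkv");           // Matroska output stays largely playable if writing stops early
        // Send ffmpeg's log to the console instead of letting it fill an unread pipe.
        pb.redirectOutput(ProcessBuilder.Redirect.INHERIT);
        pb.redirectError(ProcessBuilder.Redirect.INHERIT);
        Process ffmpeg = pb.start();

        try (OutputStream toFfmpeg = ffmpeg.getOutputStream()) {
            // Placeholder loop: in the real application each JPEG would come
            // from the capture code rather than from files on disk.
            for (int i = 0; i < 100; i++) {
                byte[] jpeg = Files.readAllBytes(Paths.get(String.format("frame_%05d.jpg", i)));
                toFfmpeg.write(jpeg);
                toFfmpeg.flush();
            }
        } // closing stdin tells ffmpeg the stream has ended so it can finalize the file

        System.out.println("ffmpeg exited with code " + ffmpeg.waitFor());
    }
}

For the fallback of reading an existing folder from the command line, the image2 demuxer can also take a numbered pattern such as -i frame_%05d.jpg (the name here is illustrative), which avoids listing hundreds of thousands of file names but only covers images already on disk.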