Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
How to add a cover image to an mp4 file using FFmpeg?
27 January 2019, by IremadzeArchil19910311
I need to know how to assign a cover image to an mp4 file using FFmpeg.
Let me explain what I mean: when I navigate to an mp4 video file in Windows Explorer
and the view is set to View -> Large icons or View -> Extra large icons,
Explorer shows a thumbnail image for the file.
But in most cases this image is a frame from the middle of the mp4 video file.
If you play the mp4 file and seek to the middle, you will see exactly the image shown in Explorer. What I want is for a chosen cover image to appear during Explorer navigation instead of this middle-frame image.
So which FFmpeg commands can help me add a cover image to an mp4 video file?
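The closest I have found so far (untested on my side; the file names are placeholders) is to mux the picture in as a second video stream flagged as cover art via the attached_pic disposition:
ffmpeg -i input.mp4 -i cover.jpg -map 0 -map 1 -c copy -disposition:v:1 attached_pic output.mp4
But I do not know whether Windows Explorer actually uses such an embedded cover for its thumbnails, or whether a different approach is needed.
-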
JavaCV grab frame method delays and returns old frames
27 January 2019, by Null Pointer
I'm trying to create a video player in Java using JavaCV and its FFmpegFrameGrabber class. Simply put, inside a loop I call
.grab() to get a frame and then paint it on a panel.
The problem is that the player falls behind: for example, after 30 seconds of real time, only 20 seconds of video have played.
The source is fine; other players can play the stream normally. The problem is possibly the long painting time.
What I do not understand is this: why does the .grab() method bring me a frame from 10 seconds ago? Shouldn't it just grab the frame that is being streamed at the moment?
(Sorry for not providing working code; it is spread across several large classes.)
I use the following grabber options (selected by another colleague):
grabber.setImageHeight(480);
grabber.setImageWidth(640);
grabber.setOption("reconnect", "1");
grabber.setOption("reconnect_at_eof", "1");
grabber.setOption("reconnect_streamed", "1");
grabber.setOption("reconnect_delay_max", "2");
grabber.setOption("preset", "veryfast");
grabber.setOption("probesize", "192");
grabber.setOption("tune", "zerolatency");
grabber.setFrameRate(30.0);
grabber.setOption("buffer_size", "" + this.bufferSize);
grabber.setOption("max_delay", "500000");
grabber.setOption("stimeout", String.valueOf(6000000));
grabber.setOption("loglevel", "quiet");
grabber.start();
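As an extra sanity check on the source, independent of JavaCV, I can read the stream with plain ffmpeg and watch the speed= figure in its progress output (a value near 1x means the source itself keeps up; the URL below is just a placeholder for my stream):
ffmpeg -i "<stream-url>" -f null -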
Thanks
-
Poor resolution in video output from ffmpeg
27 January 2019, by Clinton Winant
I make animations from png or jpg frames output by a fluid flow simulation code. An example image looks like this:
The ffmpeg command I use is
ffmpeg -f image2 -r 20 -i img%04d.png -vcodec mpeg4 animate.avi
The resulting video is very poor as can be seen in the screen capture of a single frame of the output file animate.avi:
There are elongated shaded areas on either side of the yellow streaks, and the ffmpeg-produced frame is much grainier than the original .png.
I understand that there has to be substantial compression in producing the video, and I assume the problem lies there: the compression is poor for the kind of images I am trying to animate. How could I use ffmpeg to produce a video with quality comparable to the original images I have?
PS: I am using Debian Buster; the png images have a resolution of 100 dpi.
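One variant I am considering (a sketch only, assuming my ffmpeg build includes libx264; I have not tested it yet) is to encode with libx264 at a low CRF instead of the default mpeg4 settings, trading file size for quality:
ffmpeg -framerate 20 -i img%04d.png -c:v libx264 -crf 18 -preset slow -pix_fmt yuv420p animate.mp4
But I would still like to understand whether this is the right way to preserve the detail of the original frames.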
-
Creating automated ffmpeg converter
27 January 2019, by Jeffrey Moser
I am trying to create an automated process using ffmpeg and am having trouble understanding how to adapt the code to Automator.
The code works beautifully in the terminal, but requires each filename to be entered manually. I am trying to automate that step.
for f in "$@"
do
    FILE= "$f"
    OUT=${FILE/mp4/png}
    OUT_WIDTH=10800
    OUT_HEIGHT=3600
    eval $(ffprobe -v error -of flat=s=_ -select_streams v:0 -show_entries stream=height,width "${FILE}")
    IN_WIDTH=${streams_stream_0_width}
    IN_HEIGHT=${streams_stream_0_height}
    eval $(ffprobe -v error -of flat=s=_ -count_frames -select_streams v:0 -show_entries stream=nb_read_frames "${FILE}")
    IN_FRAMES=${streams_stream_0_nb_read_frames}
    W_TILES=$(bc <<< "(sqrt($OUT_WIDTH*$IN_FRAMES*$IN_HEIGHT/($IN_WIDTH*$OUT_HEIGHT))+3)")
    H_TILES=$(bc <<< "(sqrt($OUT_HEIGHT*$IN_FRAMES*$IN_WIDTH/($OUT_WIDTH*$IN_HEIGHT)))")
    W_SHRUNK=$(bc <<< "$OUT_WIDTH/$W_TILES")
    H_SHRUNK=$(bc <<< "$OUT_HEIGHT/$H_TILES")
    ffmpeg -i "${FILE}" -vf "hflip,transpose=cclock,scale=${H_SHRUNK}:${W_SHRUNK},tile=${H_TILES}x${W_TILES},transpose=clock,hflip,scale=${OUT_WIDTH}x${OUT_HEIGHT}" -an -sn -vsync 0 "${OUT}"
done
I expected the code to produce a single image containing every frame of the film. Instead I got a massive error message:
The action “Run Shell Script” encountered an error:
-: line 5: /Users/jeffreymoser/21 A New Dawn.mp4: Permission denied
-: line 14: ffprobe: command not found
-: line 18: ffprobe: command not found
(standard_in) 1: parse error
(standard_in) 1: parse error
(standard_in) 1: parse error
(standard_in) 2: parse error
(standard_in) 2: parse error
-: line 29: ffmpeg: command not found
-: line 5: /Users/jeffreymoser/22 Battle Hymn.mp4: Permission denied
-: line 14: ffprobe: command not found
-: line 18: ffprobe: command not found
(standard_in) 1: parse error
(standard_in) 1: parse error
(standard_in) 1: parse error
(standard_in) 2: parse error
(standard_in) 2: parse error
-: line 29: ffmpeg: command not found
-: line 5: /Users/jeffreymoser/23 Reckless.mp4: Permission denied
-: line 14: ffprobe: command not found
-: line 18: ffprobe: command not found
(standard_in) 1: parse error
(standard_in) 1: parse error
(standard_in) 1: parse error
(standard_in) 2: parse error
(standard_in) 2: parse error
-: line 29: ffmpeg: command not found
-: line 5: /Users/jeffreymoser/24 Showdown.mp4: Permission denied
-: line 14: ffprobe: command not found
-: line 18: ffprobe: command not found
(standard_in) 1: parse error
(standard_in) 1: parse error
(standard_in) 1: parse error
(standard_in) 2: parse error
(standard_in) 2: parse error
-: line 29: ffmpeg: command not found
-: line 5: /Users/jeffreymoser/25 Wedding Bells.mp4: Permission denied
-: line 14: ffprobe: command not found
-: line 18: ffprobe: command not found
(standard_in) 1: parse error
(standard_in) 1: parse error
(standard_in) 1: parse error
(standard_in) 2: parse error
(standard_in) 2: parse error
-: line 29: ffmpeg: command not found
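From the repeated "command not found" lines I suspect the shell Automator runs does not have the directory containing ffmpeg and ffprobe on its PATH, so one thing I plan to try (assuming they live in /usr/local/bin, e.g. from Homebrew; adjust if not) is adding this at the top of the Run Shell Script action:
export PATH="/usr/local/bin:$PATH"   # assumption: ffmpeg/ffprobe installed here
I also notice the space in FILE= "$f"; I suspect that makes the shell try to execute the movie file itself (hence the "Permission denied" lines), so it probably needs to be FILE="$f".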
-
Android FFmpeg - Mux encoded h264 frames from MediaCodec
27 January 2019, by Bob Ross
I'm trying to capture h264 frames from the Android camera (encoded by MediaCodec) and pass them to an FFmpeg process running on the same device.
I currently do this by writing the encoded byte arrays I receive from MediaCodec to a file called out.h264, like so:
FileOutputStream fosVideo = new ...
...
// encoder callback
@Override
public void onVideoData(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
    fosVideo.write(h264Buffer);
}
While the h264 file is still being written to, I start the FFmpeg process and provide the file as input.
ffmpeg -re -i out.h264 -c:v copy -r 30 -loglevel 48 a.mp4
I also tried
ffmpeg -re -framerate 25 -i out.h264 -c:v copy -r 30 -loglevel 48 a.mp4
The FFmpeg process runs anywhere from 10 seconds to a few minutes and then stops abruptly with:
frame= 330 fps= 29 q=31.0 size= 512kB time=00:00:10.98 bitrate= 381.8kbits/s dup=55 drop=0 speed=0.972x
[h264 @ 0xf1863800] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
[h264 @ 0xf1863b80] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
[h264 @ 0xf1863f00] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
*** 1 dup!
[h264 @ 0xf1864280] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
Clipping frame in rate conversion by 0.199989
[h264 @ 0xf1864600] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
[h264 @ 0xf1862a00] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
[h264 @ 0xf1862d80] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
[h264 @ 0xf1863100] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
*** 1 dup!
Clipping frame in rate conversion by 0.199989
*** 1 dup!
frame= 347 fps= 29 q=31.0 size= 768kB time=00:00:11.53 bitrate= 545.5kbits/s dup=58 drop=0 speed=0.974x
Clipping frame in rate conversion by 0.199989
[out_0_0 @ 0xf182e1e0] EOF on sink link out_0_0:default.
No more output streams to write to, finishing.
frame= 349 fps= 29 q=24.8 Lsize= 920kB time=00:00:17.68 bitrate= 426.1kbits/s dup=58 drop=0 speed=1.48x
video:631kB audio:282kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.732886%
Input file #0 (/storage/emulated/0/MOVIES/out.h264):
  Input stream #0:0 (video): 291 packets read (6065016 bytes); 291 frames decoded;
  Total: 291 packets (6065016 bytes) demuxed
Input file #1 (/storage/emulated/0/MOVIES/out.aac):
  Input stream #1:0 (audio): 830 packets read (289119 bytes);
  Total: 830 packets (289119 bytes) demuxed
Output file #0 (/storage/emulated/0/hls/a.mp4):
  Output stream #0:0 (video): 349 frames encoded; 349 packets muxed (645952 bytes);
  Output stream #0:1 (audio): 830 packets muxed (289119 bytes);
  Total: 1179 packets (935071 bytes) muxed
291 frames successfully decoded, 0 decoding errors
This happens even though the out.h264 file is still being written to. It is as if the ffmpeg process thinks that the file has ended.
Any idea what it could be?
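One idea I have not tried yet (just a sketch; the paths are placeholders and I am not sure the FUSE-backed /storage/emulated/0 storage even supports FIFOs) is to let FFmpeg read from a named pipe instead of a growing regular file, so that it blocks waiting for more data rather than hitting EOF:
mkfifo /data/local/tmp/out.h264          # create the pipe (path is an assumption)
ffmpeg -re -framerate 25 -i /data/local/tmp/out.h264 -c:v copy a.mp4
with the MediaCodec callback writing into that same path. But I would still like to understand why FFmpeg stops early when reading the plain file.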