Newest 'ffmpeg' Questions - Stack Overflow
-
FFMpeg process created from Java on CentOS doesn't exit
21 June 2017, by Donz
I need to convert a lot of wave files simultaneously, about 300 files in parallel, and new files arrive constantly. I call an ffmpeg process from my Java 1.8 app, which is running on CentOS. I know that I have to read the error and input streams so that the process created from Java is able to exit.
My code after several experiments:
private void ffmpegconverter(String fileIn, String fileOut) {
    String[] command = new String[]{"ffmpeg", "-v", "-8", "-i", fileIn, "-acodec", "pcm_s16le", fileOut};
    Process process = null;
    BufferedReader reader = null;
    try {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true);
        process = pb.start();
        // Read the merged error and standard output of the process.
        // Otherwise it could halt because nobody drains its buffer.
        reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        String s;
        while ((s = reader.readLine()) != null) {
            log.info(Thread.currentThread().getName() + " with fileIn " + fileIn
                    + " and fileOut " + fileOut + " writes " + s);
            // Ignored as we just need to empty the output buffer of the process
        }
        log.info(Thread.currentThread().getName() + " ffmpeg process will be waited for");
        if (process.waitFor(10, TimeUnit.SECONDS)) {
            log.info(Thread.currentThread().getName() + " ffmpeg process exited normally");
        } else {
            log.info(Thread.currentThread().getName() + " ffmpeg process timed out and will be killed");
        }
    } catch (IOException | InterruptedException e) {
        log.error(Thread.currentThread().getName() + " Error during ffmpeg process executing", e);
    } finally {
        if (process != null) {
            if (reader != null) {
                try {
                    reader.close();
                } catch (IOException e) {
                    log.error("Error during closing the process streams reader", e);
                }
            }
            try {
                process.getOutputStream().close();
            } catch (IOException e) {
                log.error("Error during closing the process output stream", e);
            }
            process.destroyForcibly();
            log.info(Thread.currentThread().getName() + " ffmpeg process " + process + " must be dead now");
        }
    }
}
If I run a separate test with this code it behaves normally. But in my app I have hundreds of RUNNING daemon "process reaper" threads which are waiting for their ffmpeg process to finish. In my real app ffmpeg is started from a timer thread. I also have other activity in separate threads, but I don't think that is the problem. Max CPU consumption is about 10%.
Here is what I usually see in the thread dump:
"process reaper" #454 daemon prio=10 os_prio=0 tid=0x00007f641c007000 nid=0x5247 runnable [0x00007f63ec063000]
   java.lang.Thread.State: RUNNABLE
        at java.lang.UNIXProcess.waitForProcessExit(Native Method)
        at java.lang.UNIXProcess.lambda$initStreams$3(UNIXProcess.java:289)
        at java.lang.UNIXProcess$$Lambda$32/2113551491.run(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
What am I doing wrong?
UPD: My app accepts a lot of connections with voice traffic, so I have about 300-500 other "good" threads at any moment. Could that be the reason? Daemon threads have low priority, but I don't believe they really can't get their jobs done within an hour. Usually it takes some tens of milliseconds.
UPD2: My synthetic test, which runs fine. I tried it with the new-thread option and without it, just calling the run method directly.
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;

public class FFmpegConvert {

    public static void main(String[] args) throws Exception {
        FFmpegConvert f = new FFmpegConvert();
        f.processDir(args[0], args[1], args.length > 2);
    }

    private void processDir(String dirPath, String dirOutPath, boolean isNewThread) {
        File dir = new File(dirPath);
        File dirOut = new File(dirOutPath);
        if (!dirOut.exists()) {
            dirOut.mkdir();
        }
        for (int i = 0; i < 1000; i++) {
            for (File f : dir.listFiles()) {
                try {
                    System.out.println(f.getName());
                    FFmpegRunner fFmpegRunner = new FFmpegRunner(f.getAbsolutePath(),
                            dirOut.getAbsolutePath() + "/" + System.currentTimeMillis() + f.getName());
                    if (isNewThread) {
                        new Thread(fFmpegRunner).start();
                    } else {
                        fFmpegRunner.run();
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }

    class FFmpegRunner implements Runnable {
        private String fileIn;
        private String fileOut;

        FFmpegRunner(String fileIn, String fileOut) {
            this.fileIn = fileIn;
            this.fileOut = fileOut;
        }

        @Override
        public void run() {
            try {
                ffmpegconverter(fileIn, fileOut);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        private void ffmpegconverter(String fileIn, String fileOut) throws Exception {
            String[] command = new String[]{"ffmpeg", "-i", fileIn, "-acodec", "pcm_s16le", fileOut};
            Process process = null;
            try {
                ProcessBuilder pb = new ProcessBuilder(command);
                pb.redirectErrorStream(true);
                process = pb.start();
                // Read the merged error and standard output of the process.
                // Otherwise it could halt because nobody drains its buffer.
                BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                    // Ignored as we just need to empty the output buffer of the process
                }
                process.waitFor();
            } catch (IOException | InterruptedException e) {
                throw e;
            } finally {
                if (process != null) process.destroy();
            }
        }
    }
}
UPD3: Sorry, I forgot to mention that I can see all these processes working: they create the new converted files, but still don't exit.
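For comparison, the effect the reader loop above is trying to achieve can also be sketched at the shell level, assuming ffmpeg is on the PATH and using hypothetical file names (in Java, the same redirections are available through ProcessBuilder's redirect methods):

```shell
# Hypothetical sketch: -nostdin stops ffmpeg from waiting on stdin, and
# redirecting stdout/stderr to /dev/null discards its output, so the
# parent process needs no reader thread for ffmpeg to be able to exit.
ffmpeg -nostdin -v -8 -i in.wav -acodec pcm_s16le out.wav </dev/null >/dev/null 2>&1
```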
-
ffmpeg libx264 settings to keep exact colors
21 June 2017, by vinni
Does anyone have an idea how I need to edit these ffmpeg settings to keep the original colors? I'm trying to convert a video, but absolutely need to keep the RGB colors, as it's going to be embedded into a website with a background color.
The parameters look currently like this:
-acodec aac -ac 2 -ab 160k -vcodec libx264 -preset slow -profile:v baseline -level 25 -maxrate 10000000 -bufsize 10000000 -vb 1200k -f mp4 -threads 0
Thanks!
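Not part of the original question, but one common cause of shifted colors on the web is missing colorimetry metadata after the RGB-to-YUV conversion, which leaves players guessing between BT.601 and BT.709. A hedged sketch with hypothetical file names:

```shell
# Hypothetical sketch: tag the output with explicit color metadata so
# players and browsers interpret the YUV data the way the encoder wrote it.
ffmpeg -i in.mp4 -vcodec libx264 -preset slow -pix_fmt yuv420p \
       -colorspace bt709 -color_primaries bt709 -color_trc bt709 \
       out.mp4
```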
-
ffmpeg doesn't seem to be working with multiple audio streams correctly
21 June 2017, by Caius Jard
I'm having an issue with ffmpeg 3.2.2. Ordinarily I ask it to make an MP4 video file with two audio streams. The command line looks like this:
ffmpeg.exe -rtbufsize 256M -f dshow -i video="screen-capture-recorder" -thread_queue_size 512 -f dshow -i audio="Line 2 (Virtual Audio Cable)" -f dshow -i audio="Line 3 (Virtual Audio Cable)" -map 0:v -map 1:a -map 2:a -af silencedetect=n=-50dB:d=60 -pix_fmt yuv420p -y "c:\temp\2channelvideo.mp4"
I've wrapped it for legibility. This once worked fine, but something is wrong lately: it doesn't seem to record any audio, even though I can use other tools like Audacity to record audio from these devices just fine.
I'm trying to diagnose it by dropping the video component and asking ffmpeg to record the two audio devices to two separate files:
ffmpeg.exe -f dshow -i audio="Line 2 (Virtual Audio Cable)" "c:\temp\line2.mp3" -f dshow -i audio="Line 3 (Virtual Audio Cable)" "c:\temp\line3.mp3"
ffmpeg's console output looks like:
Guessed Channel Layout for Input Stream #0.0 : stereo
Input #0, dshow, from 'audio=Line 2 (Virtual Audio Cable)':
  Duration: N/A, start: 5935.810000, bitrate: 1411 kb/s
    Stream #0:0: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s
Guessed Channel Layout for Input Stream #1.0 : stereo
Input #1, dshow, from 'audio=Line 3 (Virtual Audio Cable)':
  Duration: N/A, start: 5936.329000, bitrate: 1411 kb/s
    Stream #1:0: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s
Output #0, mp3, to 'c:\temp\line2.mp3':
  Metadata:
    TSSE            : Lavf57.56.100
    Stream #0:0: Audio: mp3 (libmp3lame), 44100 Hz, stereo, s16p
    Metadata:
      encoder         : Lavc57.64.101 libmp3lame
Output #1, mp3, to 'c:\temp\line3.mp3':
  Metadata:
    TSSE            : Lavf57.56.100
    Stream #1:0: Audio: mp3 (libmp3lame), 44100 Hz, stereo, s16p
    Metadata:
      encoder         : Lavc57.64.101 libmp3lame
Stream mapping:
  Stream #0:0 -> #0:0 (pcm_s16le (native) -> mp3 (libmp3lame))
  Stream #0:0 -> #1:0 (pcm_s16le (native) -> mp3 (libmp3lame))
Press [q] to stop, [?] for help
The problem I'm currently having is that the produced MP3s are identical copies of line 2 only; line 3 audio is not recorded. The last line is of concern; it seems to say that stream 0 is being mapped to both output 0 and output 1. Do I need a map command for each file as well? I thought it would be implicit due to the way I specified the arguments.
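A hedged sketch of what explicit mapping could look like: with several outputs on one command line, each output falls back to ffmpeg's default stream selection unless it is given its own -map, so tying each MP3 to its own input would read roughly:

```shell
# Sketch: one -map per output file, so the second MP3 pulls from input 1
# (by default each output is fed from ffmpeg's "best" pick, here input 0).
ffmpeg -f dshow -i audio="Line 2 (Virtual Audio Cable)" \
       -f dshow -i audio="Line 3 (Virtual Audio Cable)" \
       -map 0:a "c:\temp\line2.mp3" \
       -map 1:a "c:\temp\line3.mp3"
```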
-
using FFMPEG library to perform amix from C code
21 June 2017, by user3396714
I have been exploring FFMPEG recently for a requirement in my team.
We have two rtp streams which we receive on two different sockets. We need to overlay the streams and then send the output as rtp to another socket.
Using command line I can do the same thing as:
ffmpeg -f rtp -i rtp://196.1.110.249:8977 -f rtp -i rtp://196.1.110.249:8999 -filter_complex amix=inputs=2:duration=longest:dropout_transition=3 -f rtp rtp://192.168.105.207:8004
Now I wish to use the FFMPEG libraries to write a C program that does the above. Kindly provide some starting point, as the FFMPEG documentation and its examples do not mention how to read from a socket and output to a socket.
-
How to know video input height and use it inside ffmpeg
21 June 2017, by Ode
I want to encode videos with ffmpeg using the -crf rate control mode. However, I want to use a different crf value for different resolution videos.
e.g. for 320x240 videos I want to use -crf 23 and for 640x480 videos I want to use -crf 26.
I tried to use the "if" function

-crf "'if(lte(ih,240),1,51)'"

combined with the input height constant ih, but it seems to me that you can only use the ih constant inside video filters (-vf), because I get the error

Undefined constant or missing '(' in 'ih,240),1,51)'

when encoding. Otherwise the function is working great.

Is there a way to use the ih constant somewhere else than inside -vf, or is there an easy way to know a video file's height? I'm doing batch processing so I can't do it manually.
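One way to sidestep the expression limitation, sketched under the assumption that ffprobe ships alongside ffmpeg: probe each file's height first, then pick the crf value in the batch script itself (pick_crf is a hypothetical helper name):

```shell
# Hypothetical helper: choose a CRF value from the input height.
pick_crf() {
    if [ "$1" -le 240 ]; then echo 23; else echo 26; fi
}

# Per file, the height could be read with ffprobe and fed back to ffmpeg,
# e.g. (commented out because it needs real input files):
#   height=$(ffprobe -v error -select_streams v:0 \
#            -show_entries stream=height -of csv=p=0 "$f")
#   ffmpeg -i "$f" -vcodec libx264 -crf "$(pick_crf "$height")" "out/$f"

pick_crf 240   # prints 23
pick_crf 480   # prints 26
```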