Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
Windows Batch-File to execute n other batch-files in parallel
11 December 2015, by Jos
I use ffmpeg to convert video files. The codec I'm encoding with is single-threaded. There are a lot of video files to convert, so instead of using multithreaded encoding I can simply encode multiple video files with 1 thread each.
To automate the encoding I use the following commands in a .bat:
SET maindir=%~dp0
FOR /r /d %%f in (*) do pushd "%%f" & Call %maindir%convert-with-ffmpeg.bat & popd
This calls "convert-with-ffmpeg.bat" sequentially to convert each file/folder in the main dir.
What would my .bat file have to look like if I want to do this in parallel (say, 12 at a time) until there are no more folders left to process?
EDIT: Let me be clear that I want to LIMIT the number of running processes to 12. When one .bat is done, I want the main .bat to start a new process until all files are converted (100+).
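One way to sketch the "at most 12 at a time" behaviour, if a small helper program is acceptable instead of a pure .bat solution, is a Java driver around a fixed-size thread pool: the pool never runs more than 12 conversions, and as soon as one convert-with-ffmpeg.bat finishes the next queued folder starts. The folder layout and the convert-with-ffmpeg.bat name come from the question; the class name and argument handling are purely illustrative.

import java.nio.file.*;
import java.util.concurrent.*;
import java.util.stream.Stream;

public class ParallelConvert {
    public static void main(String[] args) throws Exception {
        // Main dir that holds the sub-folders and convert-with-ffmpeg.bat (argument handling is illustrative).
        Path root = Paths.get(args.length > 0 ? args[0] : ".").toAbsolutePath();
        String script = root.resolve("convert-with-ffmpeg.bat").toString();

        ExecutorService pool = Executors.newFixedThreadPool(12);   // hard cap: 12 conversions at a time

        try (Stream<Path> dirs = Files.walk(root)) {
            dirs.filter(Files::isDirectory)
                .filter(dir -> !dir.equals(root))                  // mirrors FOR /r /d (sub-folders only)
                .forEach(dir -> pool.submit(() -> {
                    // Equivalent of: pushd "<dir>" & call convert-with-ffmpeg.bat & popd
                    try {
                        new ProcessBuilder("cmd", "/c", script)
                                .directory(dir.toFile())
                                .inheritIO()
                                .start()
                                .waitFor();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }));
        }
        pool.shutdown();                         // no new work; as each job finishes the next queued one starts
        pool.awaitTermination(1, TimeUnit.DAYS);
    }
}

A batch-only equivalent would have to count running processes and poll in a loop, which is exactly the fiddly part the thread pool handles for free.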
-
What is the best way to implement ffmpeg for mp3 compression with php on the fly? [on hold]
11 December 2015, by Akash Bose
I am trying to compress mp3 files to 92 kbps on the fly when a user uploads an mp3 file. It looks like the conversion takes about 35% of the CPU for a single file, so if three conversions happen at the same time they use my entire CPU resource. I am using the following command:
$cmd = 'ffmpeg -i ' . $mp3path . ' -ab 92k ' . $smp3path . '.mp3';
exec($cmd);
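The usual way to keep this from saturating the CPU is not to convert "on the fly" inside the upload request at all, but to queue each uploaded file and let a bounded number of background workers drain the queue. The question is about PHP; the sketch below only illustrates the queueing idea, in Java, with a single worker thread (so at most one ffmpeg process, roughly the 35% CPU measured for one conversion). The output file name is a placeholder.

import java.nio.file.Path;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class Mp3CompressionQueue {
    private final BlockingQueue<Path> uploads = new LinkedBlockingQueue<>();

    // Call this right after an upload has been saved to disk; it returns immediately.
    public void enqueue(Path mp3) {
        uploads.add(mp3);
    }

    // One background worker means at most one ffmpeg process at a time.
    public void start() {
        Thread worker = new Thread(() -> {
            while (true) {
                try {
                    Path src = uploads.take();                                     // blocks until something is queued
                    Path dst = src.resolveSibling(src.getFileName() + ".92k.mp3"); // placeholder output name
                    new ProcessBuilder("ffmpeg", "-i", src.toString(), "-ab", "92k", dst.toString())
                            .inheritIO().start().waitFor();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        });
        worker.setDaemon(true);
        worker.start();
    }
}

In a PHP stack the same idea is normally implemented with a job queue and a separate worker process rather than calling exec() from the upload handler.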
-
FFMPEG - showfreqs and showwaves over background image?
11 December 2015, by Jeremy
I want FFmpeg to export my podcast audio to a visually interesting file I can upload to YouTube.
Currently I am using the following command, which I don't fully grasp:
ffmpeg -i E04_ProphetsPrey.wav -filter_complex \
" [0:a]showfreqs=mode=line:ascale=log:fscale=rlog:s=1280x518,pad=1280:720[vs]; \
  [0:a]showfreqs=mode=line:ascale=log:fscale=rlog:s=1x1[ss]; \
  [0:a]showwaves=s=1280x202:mode=p2p[sw]; \
  [vs][ss]overlay=w[bg]; \
  [bg][sw]overlay=0:H-h,drawtext=fontfile=/usr/share/fonts/TTF/Vera.ttf:fontcolor=white:x=10:y=10:text='\"Rated80s Prophets Prey\" by Comics On Film'[out]" \
-map "[out]" -map 0:a -c:v libx264 -preset fast -crf 18 -c:a copy -threads 0 output.mkv
What I would like to do is set a (branded) background image and have showfreqs render over its top half and showwaves over its bottom half.
Is that possible, and if so could you provide me a detailed example?
(I'm on arch linux)
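The usual approach is to add the image as a second input, scale it to the full frame, and overlay the two generated streams onto its top and bottom halves. The sketch below (untested) reuses the showfreqs/showwaves options from the question; "background.png", the 1280x720 split and the output name are placeholders, and the command is wrapped in a small Java launcher only so that it can be shown argument by argument.

import java.io.IOException;

public class PodcastVideo {
    public static void main(String[] args) throws IOException, InterruptedException {
        String graph =
              "[1:v]scale=1280:720[bg];"
            + "[0:a]showfreqs=mode=line:ascale=log:fscale=rlog:s=1280x360[freqs];"
            + "[0:a]showwaves=s=1280x360:mode=p2p[waves];"
            + "[bg][freqs]overlay=0:0[tmp];"      // spectrum on the top half of the image
            + "[tmp][waves]overlay=0:360[out]";   // waveform on the bottom half

        new ProcessBuilder(
                "ffmpeg",
                "-i", "E04_ProphetsPrey.wav",          // input 0: the podcast audio (from the question)
                "-loop", "1", "-i", "background.png",  // input 1: the branded still, looped
                "-filter_complex", graph,
                "-map", "[out]", "-map", "0:a",
                "-c:v", "libx264", "-preset", "fast", "-crf", "18",
                "-c:a", "copy", "-shortest",           // stop when the audio ends
                "output.mkv")
            .inheritIO()
            .start()
            .waitFor();
    }
}

-loop 1 keeps the still image alive for the whole file, and -shortest ends the output when the audio does; the drawtext branding from the original command can be appended to the last filter chain unchanged.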
-
How to get video frame for a specific time from mp4
11 December 2015, by man-r
I have an mp4 video byte array and I need to generate a thumbnail for it using its middle frame (e.g. if the video length is 10 seconds then I need to get the picture from the 5th second).
I managed to parse through the file and extract its boxes (atoms). I have also managed to get the video length from the mvhd box. I also managed to extract:
1. the time-to-sample table from the stts box,
2. the sample-to-chunk table from the stsc box,
3. the chunk-offset table from the stco box,
4. the sample-size table from the stsz box,
5. the sync-sample table from the stss box.
I know that all the actual media data is in the mdat box and that I need to correlate the above tables to find the exact frame offset in the file, but my question is how? The tables' data seems to be compressed (especially the time-to-sample table), but I don't know how to decode them.
Any help is appreciated.
Below are my code samples.
Code to convert bytes to hex:
// Lookup table of hex digits; the original snippet referenced hexArray without showing it,
// so the usual definition is assumed here.
final static char[] hexArray = "0123456789ABCDEF".toCharArray();

public static char[] bytesToHex(byte[] bytes) {
    char[] hexChars = new char[bytes.length * 2];
    for (int j = 0; j < bytes.length; j++) {
        int v = bytes[j] & 0xFF;
        hexChars[j * 2]     = hexArray[v >>> 4];    // high nibble
        hexChars[j * 2 + 1] = hexArray[v & 0x0F];   // low nibble
    }
    return hexChars;
}
Code for getting the box offset (these snippets use java.util.Arrays and work on the hex-character array produced by bytesToHex above):

final static String MOOV = "6D6F6F76";
final static String MOOV_MVHD = "6D766864";
final static String MOOV_TRAK = "7472616B";
final static String MOOV_TRAK_MDIA = "6D646961";
final static String MOOV_TRAK_MDIA_MINF = "6D696E66";
final static String MOOV_TRAK_MDIA_MINF_STBL = "7374626C";
final static String MOOV_TRAK_MDIA_MINF_STBL_STSD = "73747364";
final static String MOOV_TRAK_MDIA_MINF_STBL_STTS = "73747473";
final static String MOOV_TRAK_MDIA_MINF_STBL_STSS = "73747373";
final static String MOOV_TRAK_MDIA_MINF_STBL_STSC = "73747363";
final static String MOOV_TRAK_MDIA_MINF_STBL_STCO = "7374636F";
final static String MOOV_TRAK_MDIA_MINF_STBL_STSZ = "7374737A";

// The loop body of getBox was partly swallowed when the question was posted (everything between
// the "<" of the loop condition and the next ">" is missing). The version below reconstructs the
// apparent intent: walk the boxes starting at `offset` (4-byte size followed by 4-byte type) and
// return the byte offset of the first box whose type matches. The lost part may also have
// descended into container boxes such as moov/trak/mdia/minf/stbl.
static int getBox(char[] s, int offset, String type) {
    int typeOffset = -1;
    for (int i = offset * 2; i + 8 * 2 <= s.length; ) {
        int size = Integer.parseInt(new String(Arrays.copyOfRange(s, i, i + 4 * 2)), 16);
        String boxType = new String(Arrays.copyOfRange(s, i + 4 * 2, i + 8 * 2));
        if (boxType.equals(type)) {
            typeOffset = i / 2;            // hex-character index back to byte offset
            if (typeOffset > -1) {
                break;
            }
        }
        if (size <= 0) {
            break;                         // malformed box; avoid an infinite loop
        }
        i += (size * 2);                   // jump to the next sibling box
    }
    return typeOffset;
}
Code for getting the duration and timescale (this assumes a version-0 mvhd box, i.e. 32-bit timescale and duration fields):

static int[] getDuration(char[] s) {
    int mvhdOffset = getBox(s, 0, MOOV_MVHD);
    // Skip size(4) + type(4) + version(1) + flags(3) + creation(4) + modification(4) bytes;
    // each byte occupies two hex characters, hence the *2.
    int timeScaleStart = (mvhdOffset * 2) + (4 + 4 + 1 + 3 + 4 + 4) * 2;
    int timeScaleEnd = (mvhdOffset * 2) + (4 + 4 + 1 + 3 + 4 + 4 + 4) * 2;
    int durationStart = (mvhdOffset * 2) + (4 + 4 + 1 + 3 + 4 + 4 + 4) * 2;
    int durationEnd = (mvhdOffset * 2) + (4 + 4 + 1 + 3 + 4 + 4 + 4 + 4) * 2;
    String timeScaleHex = new String(Arrays.copyOfRange(s, timeScaleStart, timeScaleEnd));
    String durationHex = new String(Arrays.copyOfRange(s, durationStart, durationEnd));
    int timeScale = Integer.parseInt(timeScaleHex, 16);
    int duration = Integer.parseInt(durationHex, 16);   // duration is in movie-timescale units
    int[] result = {duration, timeScale};
    return result;
}
Code to get the time-to-sample table:

// As with getBox, the loop body was swallowed by the "<" in the posted question; the entry-reading
// loop below is a reconstruction. Each stts entry is a pair of 32-bit values: a sample count and a
// sample delta (the duration of each of those samples, in the track's timescale).
static int[][] getTimeToSampleTable(char[] s, int trakOffset) {
    int offset = getBox(s, trakOffset, MOOV_TRAK_MDIA_MINF_STBL_STTS);
    int sizeStart = offset * 2;
    int sizeEnd = offset * 2 + (4) * 2;
    int typeStart = offset * 2 + (4) * 2;
    int typeEnd = offset * 2 + (4 + 4) * 2;
    int noOfEntriesStart = offset * 2 + (4 + 4 + 1 + 3) * 2;
    int noOfEntriesEnd = offset * 2 + (4 + 4 + 1 + 3 + 4) * 2;
    String sizeHex = new String(Arrays.copyOfRange(s, sizeStart, sizeEnd));
    String typeHex = new String(Arrays.copyOfRange(s, typeStart, typeEnd));
    String noOfEntriesHex = new String(Arrays.copyOfRange(s, noOfEntriesStart, noOfEntriesEnd));
    int size = Integer.parseInt(sizeHex, 16);
    int noOfEntries = Integer.parseInt(noOfEntriesHex, 16);
    int[][] timeToSampleTable = new int[noOfEntries][2];
    for (int i = 0; i < noOfEntries; i++) {
        int entryStart = noOfEntriesEnd + i * (4 + 4) * 2;
        timeToSampleTable[i][0] = Integer.parseInt(
                new String(Arrays.copyOfRange(s, entryStart, entryStart + 4 * 2)), 16);               // sample count
        timeToSampleTable[i][1] = Integer.parseInt(
                new String(Arrays.copyOfRange(s, entryStart + 4 * 2, entryStart + (4 + 4) * 2)), 16); // sample delta
    }
    return timeToSampleTable;
}
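On the "how do I correlate the tables" part: the stts data is not compressed, it is run-length encoded; each entry says "the next N samples each last D timescale units". The sketch below is only a sketch: it assumes the tables have already been parsed into arrays shaped like the getTimeToSampleTable output above, that stsz holds one size per sample, and that the target time is expressed in the track's mdhd timescale (not the mvhd movie timescale). It walks the chain from stts to a sample number, from stss to the previous key frame, and through stsc, stco and stsz to an absolute file offset.

class SampleLocator {

    /** 1-based sample number whose time span contains targetTime (in media timescale units). */
    static int sampleForTime(int[][] stts, long targetTime) {
        long elapsed = 0;
        int sample = 1;
        for (int[] entry : stts) {
            int count = entry[0], delta = entry[1];
            if (targetTime < elapsed + (long) count * delta) {
                return sample + (int) ((targetTime - elapsed) / delta);
            }
            elapsed += (long) count * delta;
            sample += count;
        }
        return sample - 1;                       // past the end: fall back to the last sample
    }

    /** Largest sync (key-frame) sample number <= sample, from the stss table. */
    static int previousSyncSample(int[] stss, int sample) {
        int sync = 1;
        for (int s : stss) {
            if (s > sample) break;
            sync = s;
        }
        return sync;
    }

    /** Absolute file offset of a 1-based sample number, via stsc + stco + stsz. */
    static long sampleOffset(int[][] stsc, int[] stco, int[] stsz, int sample) {
        // stsc entries are (firstChunk, samplesPerChunk); each run lasts until the next entry's firstChunk.
        int chunk = 1, firstSampleOfChunk = 1;
        for (int i = 0; i < stsc.length; i++) {
            int firstChunk = stsc[i][0], perChunk = stsc[i][1];
            int lastChunkOfRun = (i + 1 < stsc.length) ? stsc[i + 1][0] - 1 : stco.length;
            int samplesInRun = (lastChunkOfRun - firstChunk + 1) * perChunk;
            if (sample < firstSampleOfChunk + samplesInRun) {
                int indexInRun = sample - firstSampleOfChunk;
                chunk = firstChunk + indexInRun / perChunk;
                firstSampleOfChunk += (chunk - firstChunk) * perChunk;  // first sample of that chunk
                break;
            }
            firstSampleOfChunk += samplesInRun;
        }
        long offset = stco[chunk - 1];           // chunk start from the chunk-offset table
        for (int s = firstSampleOfChunk; s < sample; s++) {
            offset += stsz[s - 1];               // skip the earlier samples in the same chunk
        }
        return offset;
    }
}

For the middle frame, the target time is half the track duration expressed in the stts timescale (the mvhd values above use the movie timescale, so go through seconds: duration / timeScale, then multiply by the mdhd timescale). Snap the result of sampleForTime to previousSyncSample so the frame is actually decodable, then read stsz[sample - 1] bytes starting at sampleOffset(...); turning those bytes into a picture still requires decoding them with the track's codec.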
-
FFmpeg ubuntu auto transcoding script [on hold]
11 December 2015, by Sambir
Hi guys, I want to create an ffmpeg script which picks up files from one folder, transcodes them, and then deletes the source files.
So think of the following scenario:
I place file A.mp4 and A.srt in folder "input". I also place file B.mp4 and B.srt in folder "input".
In the night a script is executed which picks up these files, places them in the folder "transcoding", and then starts to transcode the movies one by one, so not in parallel.
When finished, the ffmpeg script should create new files in the output folder, and the files in the folder "transcoding" should be deleted.
Can anyone help me out here? It does not seem to be that complex, but since I am not that good at coding any help is appreciated :)
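On Ubuntu this is normally a small shell script run from cron; purely as an illustration of the workflow described above (move everything from "input" to "transcoding", transcode the videos one by one, write the results to "output", delete the sources), here is a sketch in Java. The folder names come from the question; the ffmpeg output options are placeholders, since the question does not say which target format is wanted.

import java.io.IOException;
import java.nio.file.*;

public class NightlyTranscode {
    public static void main(String[] args) throws IOException, InterruptedException {
        Path input = Paths.get("input");
        Path work = Paths.get("transcoding");
        Path output = Paths.get("output");
        Files.createDirectories(work);
        Files.createDirectories(output);

        // 1. Move everything out of "input" so files arriving later don't interfere with this run.
        try (DirectoryStream<Path> files = Files.newDirectoryStream(input)) {
            for (Path f : files) {
                Files.move(f, work.resolve(f.getFileName()), StandardCopyOption.REPLACE_EXISTING);
            }
        }

        // 2. Process the files in "transcoding" one by one (sequentially, as requested).
        try (DirectoryStream<Path> files = Files.newDirectoryStream(work)) {
            for (Path src : files) {
                Path dst = output.resolve(src.getFileName());
                if (src.toString().endsWith(".mp4")) {
                    // Placeholder output options; pick whatever codecs the target player needs.
                    int exit = new ProcessBuilder("ffmpeg", "-i", src.toString(),
                            "-c:v", "libx264", "-crf", "23", "-c:a", "aac", dst.toString())
                            .inheritIO().start().waitFor();
                    if (exit == 0) {
                        Files.delete(src);          // 3. delete the source only if ffmpeg succeeded
                    }
                } else {
                    // e.g. the .srt subtitle files: just pass them through to the output folder.
                    Files.move(src, dst, StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }
}

Running it "in the night" is then a single cron entry that starts the program (or an equivalent shell script) at the desired hour.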