Newest 'ffmpeg' Questions - Stack Overflow
-
Gaps when recording using MediaRecorder API(audio/webm opus)
9 August 2018, by Jack Juiceson

----- UPDATE HAS BEEN ADDED BELOW -----
I have an issue with MediaRecorder API (https://www.w3.org/TR/mediastream-recording/#mediarecorder-api).
I'm using it to record speech from a web page (Chrome was used in this case) and save it as chunks. I need to be able to play it both while and after it is recorded, so it's important to keep those chunks.
Here is the code which is recording data:
navigator.mediaDevices.getUserMedia({ audio: true, video: false }).then(function(stream) {
  recorder = new MediaRecorder(stream, { mimeType: 'audio/webm; codecs="opus"' });
  recorder.ondataavailable = function(e) {
    // Read the blob from `e.data`, base64-encode it and send it to the server
  };
  recorder.start(1000);
});
The issue is that the WebM file I get when I concatenate all the parts is occasionally corrupted. I can play it as WebM, but when I try to convert it to something else (with ffmpeg), it gives me a file with shifted timings.
For example, I'm trying to convert a file with a duration of 00:36:27.78 to WAV, but I get a file with a duration of 00:36:26.04, which is 1.74 s shorter. At the beginning of the file the audio is the same, but after about 10 minutes the WebM file plays with a small delay.
After some research, I found out that it also does not play correctly with the browser's MediaSource API, which I use for playing the chunks. I tried 2 ways of playing those chunks:
In the case when I just merge all the parts into a single blob, it works fine. When I add them via the sourceBuffer object, there are some gaps (I can see them by inspecting the buffered property):
- 697.196 - 697.528 (~330ms)
- 996.198 - 996.754 (~550ms)
- 1597.16 - 1597.531 (~370ms)
- 1896.893 - 1897.183 (~290ms)
Those gaps are 1.55s in total, and they are exactly in the places where the desync between the WAV and WebM files starts. Unfortunately, the file where this is reproducible cannot be shared because it is a customer's private data, and I have not been able to reproduce the issue on other media yet.
What can be the cause for such an issue?
----- UPDATE -----

I was able to reproduce the issue on https://jsfiddle.net/96uj34nf/4/
In order to see the problem, click the "Print buffer zones" button and it will display the buffered time ranges: 0 - 136.349, 141.388 - 195.439, 197.57 - 198.589. You can see that there are two gaps between them:
- 136.349 - 141.388
- 195.439 - 197.57
So, as you can see, there are 5-second and 2-second gaps. I would be happy if someone could shed some light on why this is happening or how to avoid the issue.
Thank you
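In case it helps others hitting the same drift, a common mitigation sketch (assuming the concatenated recording is saved as concatenated.webm, a name used here only for illustration; this is not a guaranteed fix for this particular file) is to let ffmpeg regenerate timestamps while remuxing, or to let the resampler compensate when converting:

```shell
# Remux without re-encoding; +genpts asks ffmpeg to regenerate missing
# presentation timestamps. Input options must come before -i.
ffmpeg -fflags +genpts -i concatenated.webm -c copy remuxed.webm

# When converting to WAV, aresample with async=1 stretches/squeezes the
# audio slightly so it stays aligned with the (gappy) timestamps.
ffmpeg -i concatenated.webm -af aresample=async=1 output.wav
```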
-
FFMPEG filter complex concat video with slide transition between image
9 August 2018, by Karate_Dog

I've been working on video to create a slideshow with slide transitions; the input receives multiple images. The input will be scaled first, before drawtext is drawn on each video; then transition effects are applied using overlay, and finally the results are concatenated into one video.
I am having trouble getting the result of the drawtext to make an overlay slide transition.
ffmpeg -i image1.png -i image2.png -filter_complex "
  nullsrc=size=720x720[background];
  [0:v]scale=720:720,setsar=1[scl1];
  [1:v]scale=720:720,setsar=1[scl2];
  [scl1]zoompan=z=if(lte(zoom,1.0),1.5,max(1.001,zoom-0.0025)):fps=45:s=720x720:d=360[v0];
  [scl2]zoompan=z=if(lte(zoom,1.0),1.5,max(1.001,zoom-0.0025)):fps=45:s=720x720:d=360[v1];
  [v0]drawtext=fontfile=Lato-Bold.ttf:text='Example 1':x=10:y=h-220:fontsize=80:fontcolor=white[text1];
  [v1]drawtext=fontfile=Lato-Bold.ttf:text='Example 2':x=10:y=h-220:fontsize=80:fontcolor=white[text2];
  [background][text1]overlay=x=min(-w+(t*w/0.5),0):shortest=1[ovr1];
  [ovr1][text2]overlay=x=min(-w+(t*w/0.5),0):shortest=1[ovr2];
  [ovr1][ovr2]concat=n=2:v=1:a=0,format=yuv420p[video]" -map "[video]" outputvideo.mp4
I got an error saying that my label was invalid:
[png_pipe @ 0xf3fc2000] Invalid stream specifier: ovr1. Last message repeated 1 times Stream specifier 'ovr1' in filtergraph description
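For what it's worth, the message points at a general filtergraph rule: each labeled pad can be consumed only once. [ovr1] is already consumed by the second overlay, so it no longer exists by the time concat asks for it. A sketch of one possible restructuring (zoompan omitted for brevity, though in the real command it would still extend each image's duration as before; filenames and labels are assumptions), where the background is split so each slide is built independently and concat receives two fresh labels:

```shell
ffmpeg -i image1.png -i image2.png -filter_complex "
  nullsrc=size=720x720,split=2[bg1][bg2];
  [0:v]scale=720:720,setsar=1,drawtext=fontfile=Lato-Bold.ttf:text='Example 1':x=10:y=h-220:fontsize=80:fontcolor=white[text1];
  [1:v]scale=720:720,setsar=1,drawtext=fontfile=Lato-Bold.ttf:text='Example 2':x=10:y=h-220:fontsize=80:fontcolor=white[text2];
  [bg1][text1]overlay=x='min(-w+(t*w/0.5),0)':shortest=1[slide1];
  [bg2][text2]overlay=x='min(-w+(t*w/0.5),0)':shortest=1[slide2];
  [slide1][slide2]concat=n=2:v=1:a=0,format=yuv420p[video]" -map "[video]" outputvideo.mp4
```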
-
Using ffmpeg to read SRTP input
9 August 2018, by mulg0r

Related to the question and answer in using-ffmpeg-for-stream-encryption-by-srtp-in-windows, I see how to transmit a file as an SRTP output flow and play it with ffplay.
Well, I'm trying to do the opposite operation: I need to launch an ffmpeg that reads SRTP input and saves an mpegts file to disk.
I've tried something like this:
Launch ffmpeg to generate an SRTP output flow (same step as in the previous link):
ffmpeg -re -i input.avi -f rtp_mpegts -acodec mp3 -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz srtp://127.0.0.1:20000
Launch ffmpeg (not ffplay) to take this output as a new input and save it to an mpegts file, according to the ffmpeg SRTP documentation:
ffmpeg -i srtp://127.0.0.1:20000 -srtp_in_suite AES_CM_128_HMAC_SHA1_80 -srtp_in_params zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz -vcodec copy -acodec copy -f mpegts myfile.ts
And I get:
srtp://127.0.0.1:19000: Invalid data found when processing input
Can anyone help me, please?
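One thing worth checking (an assumption about the cause, noting also that the error mentions port 19000 while the command shown uses 20000): ffmpeg applies per-input options only when they appear before the -i they belong to. In the command above, -srtp_in_suite and -srtp_in_params come after -i, so they are parsed as output options and the SRTP stream is read as if it were unencrypted. A sketch with the options moved in front of the input:

```shell
# SRTP input options must precede the -i they configure
ffmpeg -srtp_in_suite AES_CM_128_HMAC_SHA1_80 \
       -srtp_in_params zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz \
       -i srtp://127.0.0.1:20000 \
       -vcodec copy -acodec copy -f mpegts myfile.ts
```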
-
Not able to decode the audio(mp3) file using C API
9 August 2018, by IronMario

I am executing the decode_audio.c file. I compiled it successfully, but I get a segmentation fault when I execute it. I included the avformat.h header file and changed the codec logic according to the mp3 format. I am using the following commands to compile and execute:
mycode$ gcc -o decode_audio decode_audio.c -lavutil -lavformat -lavcodec -lswresample -lz -lm
mycode$ ./decode_audio audio.mp3 raw.bin
What is the reason for this segmentation fault in my program?
I am using Ubuntu 16.04 LTS and ffmpeg 3.4.4. Please help me.
Thanks in advance.
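Without seeing the modified source, the crash cannot be pinpointed, but a general first step (a debugging sketch, not specific to this program) is to rebuild with debug symbols and get a backtrace showing which call faults. A frequent culprit when adapting the upstream decode_audio.c example from MP2 to MP3 is an unchecked NULL return from avcodec_find_decoder or av_parser_init:

```shell
# Rebuild with debug info and no optimization, then run under gdb
gcc -g -O0 -o decode_audio decode_audio.c -lavutil -lavformat -lavcodec -lswresample -lz -lm
gdb --args ./decode_audio audio.mp3 raw.bin
# inside gdb:
#   (gdb) run
#   (gdb) bt     # backtrace at the point of the segmentation fault
```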
-
ffmpeg read the current segmentation file
9 August 2018, by Guaronet

I'm developing a system that uses ffmpeg to store some IP camera videos. I'm using the segment command to store a 5-minute video per camera. I have a WPF view where I can search historical videos by date; in that case I use the ffmpeg concat command to generate a video with the desired duration. All this works excellently. My question is: is it possible to concatenate the current file of the segmentation? I need, for example, to make a search from date X to the current time, but the last file has not been generated yet by ffmpeg. When I concatenate the files, the last one is not included because its segment is not finished.
I hope someone can give me some guidance on what I can do.
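One workaround sketch (assuming the segments are MPEG-TS, a format that tolerates being cut mid-write; all filenames here are hypothetical): take a snapshot copy of the still-open segment, remux it so it ends cleanly, and append that copy to the concat list:

```shell
# Snapshot the segment that ffmpeg is still writing
cp segment_current.ts snapshot.ts

# Remux the snapshot without re-encoding so it gets a clean end
ffmpeg -i snapshot.ts -c copy snapshot_fixed.ts

# list.txt contains the finished segments plus the fixed snapshot:
#   file 'segment_0001.ts'
#   file 'segment_0002.ts'
#   file 'snapshot_fixed.ts'
ffmpeg -f concat -safe 0 -i list.txt -c copy result.ts
```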