Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
net.ypresto.qtfaststartjava not supported in Maven (Java)
9 May 2018, by User. I am using Qtfaststart to make MP4 video stream faster (it places the MOOV atom at the front of the file). I got the Maven repository from the link https://javalibs.com/artifact/net.ypresto.qtfaststartjava/qtfaststart and added the dependency:

<dependency>
    <groupId>net.ypresto.qtfaststartjava</groupId>
    <artifactId>qtfaststart</artifactId>
    <version>0.1.0</version>
</dependency>

I am getting
Missing artifact net.ypresto.qtfaststartjava:qtfaststart:jar:0.1.0
error in the pom.xml. This dependency cannot be resolved. Can anyone help me solve this, or suggest another library that places the MOOV atom first, like Qtfaststart?
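As an aside, ffmpeg itself can relocate the MOOV atom without any extra Java library. A minimal sketch of that alternative, assuming ffmpeg is installed and using placeholder file names:

```shell
# Remux without re-encoding and move the moov atom to the front of the
# file, which is the same relocation qt-faststart performs.
# input.mp4 and output.mp4 are placeholder names.
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4
```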
-
FFMPEG Video Recording in Android getting overlay of Green latches
9 May 2018, by Appoorva Faldu. I have used FFmpeg and OpenCV to integrate the video player into an Android application.
Build Gradle:

compile('org.bytedeco:javacv-platform:1.4') {
    exclude group: 'org.bytedeco.javacpp-presets'
}
compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.4.0-1.4'
compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '3.4.1-1.4'
compile files('libs/ffmpeg-android-arm.jar')
compile files('libs/ffmpeg-android-x86.jar')
compile files('libs/opencv-android-arm.jar')
compile files('libs/opencv-android-x86.jar')
I have included 'jniLibs' in the 'main' folder with 'armeabi', 'armeabi-v7a', and 'x86' folders.
I am able to open Camera and record the video.
The output video is not coming out as expected, although the audio quality is fine. Please see the image below.
The code I used for integration: https://github.com/CrazyOrr/FFmpegRecorder
Thanks in advance!!
-
wowza + live + ffmpeg + hls player: how to create the playlist.m3u8?
9 May 2018, by Ziv Barber. I'm trying to set up a Wowza live test server so that I can play HLS from my mobile app. It works without any problem for VOD: I can play it in my app, and I can also see the .m3u8 file if I enter its URI in the browser. I tried to do the same in live mode (my goal is to test some streaming parameters for live streaming). I used ffmpeg to create the live stream:
ffmpeg -re -i "myInputTestVideo.mp4" -vcodec libx264 -vb 150000 -g 60 -vprofile baseline -level 2.1 -acodec aac -ab 64000 -ar 48000 -ac 2 -vbsf h264_mp4toannexb -strict experimental -f mpegts udp://127.0.0.1:10000
I created a "source file" and connected it to the "Incoming Streams". I can see in my application's Monitoring / Network tab that it does get the data from ffmpeg.
My problem is how to get the playlist.m3u8 file so I can play it from inside my (HLS-based) app.
Again, for now I just need a way to experiment with the streaming settings; in real live use I'll have a real live streaming source.
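For testing without Wowza in the loop, ffmpeg can also segment to HLS directly and write the playlist itself. This is only a sketch, reusing the encoding settings from the command above with illustrative segment options and placeholder paths:

```shell
# Write an HLS playlist plus .ts segments directly, instead of pushing
# MPEG-TS over UDP; playlist.m3u8 can then be served by any web server.
# Segment length and list size are illustrative values.
ffmpeg -re -i myInputTestVideo.mp4 \
  -c:v libx264 -profile:v baseline -level 2.1 -g 60 -b:v 150k \
  -c:a aac -b:a 64k -ar 48000 -ac 2 \
  -f hls -hls_time 6 -hls_list_size 5 playlist.m3u8
```

When Wowza is used, the playlist is normally generated by the server rather than by ffmpeg, typically at a URL of the form http://[wowza-host]:1935/[application]/[streamName]/playlist.m3u8.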
-
Error: No input specified at FfmpegCommand.proto.setStartTime.proto.seekInput (ffmpeg)
9 May 2018, by omkar mestry.

Error: No input specified
    at FfmpegCommand.proto.setStartTime.proto.seekInput (/Users/omkar/Desktop/whatsappvideo/node_modules/fluent-ffmpeg/lib/options/inputs.js:147:13)
    at process_video (/Users/omkar/Desktop/whatsappvideo/app.js:25:10)
    at /Users/omkar/Desktop/whatsappvideo/app.js:15:7
    at Layer.handle [as handle_request] (/Users/omkar/Desktop/whatsappvideo/node_modules/express/lib/router/layer.js:95:5)
    at next (/Users/omkar/Desktop/whatsappvideo/node_modules/express/lib/router/route.js:137:13)
    at Route.dispatch (/Users/omkar/Desktop/whatsappvideo/node_modules/express/lib/router/route.js:112:3)
    at Layer.handle [as handle_request] (/Users/omkar/Desktop/whatsappvideo/node_modules/express/lib/router/layer.js:95:5)
    at /Users/omkar/Desktop/whatsappvideo/node_modules/express/lib/router/index.js:281:22
    at Function.process_params (/Users/omkar/Desktop/whatsappvideo/node_modules/express/lib/router/index.js:335:12)
    at Busboy.next (/Users/omkar/Desktop/whatsappvideo/node_modules/express/lib/router/index.js:275:10)
Code: app.js
var express = require("express"),
    app = express(),
    http = require("http").Server(app).listen(8080),
    upload = require("express-fileupload");
var video = null;
app.use(upload());
console.log("Server Started!");
app.get("/", function(req, res) {
    res.sendFile(__dirname + "/index.html");
});
app.post("/", function(req, res) {
    if (req.files) {
        video = req.files;
        process_video(req.files.upfile.data);
        //console.log(req.files.upfile.data);
    }
});
function process_video(video) {
    var ffmpeg = require('fluent-ffmpeg');
    ffmpeg(video)
        .setStartTime(120)
        .seekInput(0)
        .setDuration(10)
        .output('test.mp4')
        .on('start', function(commandLine) {
            console.log('start : ' + commandLine);
        })
        .on('progress', function(progress) {
            console.log('In Progress !!' + Date());
        })
        .on('end', function(err) {
            if (!err) {
                console.log('conversion Done');
            }
        })
        .on('error', function(err) {
            console.log('error: ', +err);
        })
        .run();
}
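For reference, the trim this fluent-ffmpeg chain is trying to express corresponds roughly to the following standalone command. This is a sketch: input.mp4 stands for the uploaded file written to disk first, since fluent-ffmpeg accepts a file path or readable stream rather than a raw Buffer (which is one common cause of "No input specified"):

```shell
# Seek 120 s into the input, then write the next 10 s to test.mp4.
# -ss before -i performs a fast input seek; input.mp4 is a placeholder.
ffmpeg -ss 120 -i input.mp4 -t 10 test.mp4
```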
-
ffmpeg output in Audacity
9 May 2018, by siamak. I wanted to split the audio track of an MP4 file, apply a different filter to each copy, and then merge them into an output MP4 file. Please note I do not want the filters in series, but rather parallel filters that are then merged.
I came up with the following command.
ffmpeg -i input.mp4 -filter_complex "[0:a]asplit[audio1][audio2];[audio1]highpass=f=200:p=1:t=h:w=50;[audio2]lowpass=f=700:p=1:t=h:w=200;[audio1][audio2]amerge=inputs=2[out]" -map "[out]" -map 0:v -c:v copy -map 0:s? -c:s copy -ac 2 -y output.mp4
This output plays in VLC and mpv. However, when I try to open it in Audacity, I get:
Why do I get this? I assume index [05] is the correct audio output. This raises the question: which audio track is playing when the file is opened in VLC? How can I create an output that has only one final audio track?
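One way to avoid extra tracks is to label every intermediate pad so that each filtered branch feeds amerge exactly once; unlabeled filter outputs in -filter_complex are mapped into the output file as additional streams, which may be where the extra tracks in Audacity come from. A sketch of the same graph with all pads labelled:

```shell
# Split the audio, high-pass one copy, low-pass the other, then merge
# both labelled branches into a single output audio track.
ffmpeg -i input.mp4 -filter_complex \
  "[0:a]asplit[a1][a2];[a1]highpass=f=200[hp];[a2]lowpass=f=700[lp];[hp][lp]amerge=inputs=2[out]" \
  -map "[out]" -map 0:v -c:v copy -ac 2 -y output.mp4
```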