Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
How can I find video rotation and rotate the clip accordingly using moviepy?
26 March 2017, by Gloin
I'm using moviepy to import some videos, but the videos that should be in portrait mode are imported in landscape. I need to check whether the rotation has been changed and, if it has, rotate the clip back.
Is this functionality built into moviepy? If not, how else can I check it?
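One common approach (a sketch, not an authoritative answer) is to read the container's `rotate` tag yourself and counter-rotate the clip. The sketch below assumes ffprobe is on the PATH; `get_rotation` and `correction_angle` are illustrative helper names, not moviepy API:

```python
import json
import subprocess

def get_rotation(path):
    """Return the rotation (degrees) stored in the video's metadata, or 0."""
    cmd = ["ffprobe", "-v", "quiet", "-print_format", "json",
           "-show_streams", path]
    out = subprocess.check_output(cmd)
    for stream in json.loads(out).get("streams", []):
        if stream.get("codec_type") == "video":
            # The 'rotate' tag is absent when no rotation was recorded.
            return int(stream.get("tags", {}).get("rotate", 0))
    return 0

def correction_angle(rotation):
    """Angle to pass to clip.rotate() so the clip displays upright."""
    return -rotation % 360
```

The clip could then be fixed with something like `clip.rotate(correction_angle(get_rotation(path)))`, though newer moviepy releases may already honor the rotation metadata on import.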
-
avformat_write_header is not working properly in FFmpeg
26 March 2017, by ganesh kumar
I am working on an MP4 file creation project using FFmpeg. I tried to set the stream information of the video packets based on FFmpeg's muxing/remuxing example code, but the header gets damaged during conversion, so the resulting file is corrupted.
/* This code sets the stream information. */
AVFormatContext *input_context, *output_context;
AVDictionary *opt;
AVStream *out_stream;
AVCodecContext *newcontext = NULL;

out_stream = avformat_new_stream(output_context, NULL);
newcontext = avcodec_alloc_context3(codec);
newcontext->codec_id = Output_fmt->video_codec;
newcontext->bit_rate = in_stream->codec->bit_rate;
newcontext->width = in_stream->codec->width;
newcontext->height = in_stream->codec->height;
newcontext->timecode_frame_start = in_stream->codec->timecode_frame_start;
newcontext->gop_size = in_stream->codec->gop_size;
newcontext->profile = in_stream->codec->profile;
newcontext->level = in_stream->codec->level;
newcontext->pix_fmt = PIX_FMT_YUV420P;
newcontext->frame_size = in_stream->codec->frame_size;
newcontext->sample_fmt = in_stream->codec->sample_fmt;
newcontext->sample_rate = in_stream->codec->sample_rate;
time_base = (double)in_stream->time_base.num / (double)in_stream->time_base.den;
duration = (double)in_stream->duration * time_base * 1000.0;
if (!out_stream) {
    fprintf(stderr, "Failed allocating output stream\n");
    ret = AVERROR_UNKNOWN;
    return;
}
ret = avcodec_copy_context(out_stream->codec, newcontext);
if (ret < 0) {
    fprintf(stderr, "Failed to copy context from input to output stream codec context\n");
    goto end;
}
out_stream->codec->codec_tag = 0;
if (output_context->oformat->flags & AVFMT_GLOBALHEADER)
    out_stream->codec->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
I changed the header information using:
/* This code sets the metadata. */
av_dict_set(&opt, "major_brand", "mp42", 0);
av_dict_set(&opt, "minor_version", "512", 0);
av_dict_set(&opt, "compatible_brands", "isomiso2avc1mp41", 0);
av_dict_set(&opt, "comment", "Hash=855738390", 0);
output_context->metadata = opt;
ret = avformat_write_header(output_context, NULL);
After creating the MP4 file, I checked it with ffmpeg in a terminal and got an error like this:
[mpeg4 @ 0x7ff2b9811c00] header damaged
    Last message repeated 39 times
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7ff2ba800000] decoding for stream 0 failed
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7ff2ba800000] Could not find codec parameters for stream 0 (Video: mpeg4 (mp4v / 0x7634706D), none, 376 kb/s): unspecified size
Consider increasing the value for the 'analyzeduration' and 'probesize' options
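As a diagnostic aid (an assumption about workflow, not a fix for the damaged header itself), the 'analyzeduration'/'probesize' hint from the log can be tried from a script before digging into the muxing code. The sketch below only assembles the ffprobe invocation; `probe_cmd` and the limit values are illustrative:

```python
def probe_cmd(path, probesize="50M", analyzeduration="100M"):
    """Assemble an ffprobe command that raises the probing limits
    mentioned by the error message (values are illustrative)."""
    return ["ffprobe",
            "-probesize", probesize,
            "-analyzeduration", analyzeduration,
            "-show_streams", path]
```

If the file still reports "unspecified size" with generous limits, the header really is broken rather than merely under-probed.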
-
FFmpeg gives an error while extracting MP3 from video
26 March 2017, by Urvashi Patel
I want to execute an FFmpeg command to extract audio in MP3 format from a video in the app's asset folder. For more, see http://writingminds.github.io/ffmpeg-android-java/
String sourceFilePath = "file:///android_asset/SampleVideo.mp4";
destFilePath = mp3File.getAbsolutePath();
FFmpeg ffmpeg = FFmpeg.getInstance(MainActivity.this);
try {
    String cmd[] = new String[]{
            "/data/data/com.example.bansi_vasundhara.videotomp3_demo/files/ffmpeg",
            "-i", sourceFilePath, "-vn", "-acodec copy", destFilePath};
    ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
        @Override
        public void onStart() { Log.e("gc", "Command Started"); }

        @Override
        public void onProgress(String message) { Log.e("gc", "onProgress" + message); }

        @Override
        public void onFailure(String message) { Log.e("gc", "onFailure command" + message); }

        @Override
        public void onSuccess(String message) { Log.e("gc", "onSuccess command" + message); }

        @Override
        public void onFinish() { Log.e("gc", "onFinish command"); }
    });
} catch (FFmpegCommandAlreadyRunningException e) {
    // Handle if FFmpeg is already running
    e.printStackTrace();
}
But I get the following error. Please suggest a solution.
java.io.IOException: Error running exec(). Command: [/data/data/com.example.bansi_vasundhara.videotomp3_demo/files/ffmpeg, /data/data/com.example.bansi_vasundhara.videotomp3_demo/files/ffmpeg, -i, file:///android_asset/SampleVideo.mp4, -vn, -acodec copy, /storage/sdcard0/VideoToMP3/SampleMP3.mp3] Working Directory: null Environment: null
    at java.lang.ProcessManager.exec(ProcessManager.java:211)
    at java.lang.Runtime.exec(Runtime.java:173)
    at java.lang.Runtime.exec(Runtime.java:128)
    at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
    at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
    at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
    at android.os.AsyncTask$2.call(AsyncTask.java:288)
    at java.util.concurrent.FutureTask.run(FutureTask.java:237)
    at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
    at java.lang.Thread.run(Thread.java:818)
Caused by: java.io.IOException: No such file or directory
    at java.lang.ProcessManager.exec(Native Method)
    at java.lang.ProcessManager.exec(ProcessManager.java:209)
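The trace hints at two things worth checking (assumptions drawn from the argv shown, not a confirmed diagnosis): the ffmpeg binary path appears twice because the library already prepends it, and "-acodec copy" is passed as a single token, so exec() hands FFmpeg one unrecognized argument. The tokenization rule is language-agnostic; a minimal Python sketch (`build_extract_cmd` is an illustrative name):

```python
def build_extract_cmd(src, dst):
    """When exec() receives an argv array, no shell splits the strings:
    each flag and its value must be separate tokens, so "-acodec copy"
    as one string is wrong."""
    return ["-i", src, "-vn", "-acodec", "copy", dst]
```

Separately, stream-copying an AAC track into an .mp3 container will likely fail even with correct tokens; re-encoding with `-acodec libmp3lame` may be needed.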
-
Android JavaCV FFmpeg webstream to local static website
26 March 2017, by Thomas Devoogdt
For my integration test I'm working on an application that needs to provide a live stream to a locally hosted website. I've already built a working site that runs on NanoHTTPD. The application also performs special image processing, for which I use JavaCV. The library is working perfectly and all C++ bindings are working too.
My question: How do I set up a live stream that can be played directly in a static site hosted by NanoHTTPD? Am I on the right track?
My code:
init:
private void initLiveStream() throws FrameRecorder.Exception {
    /* ~~~ https://github.com/bytedeco/javacv/issues/598 ~~~ */
    frameRecorder = new FFmpegFrameRecorder("http://localhost:9090", imageWidth, imageHeight, 0);
    frameRecorder.setVideoOption("preset", "ultrafast");
    frameRecorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
    frameRecorder.setAudioCodec(0);
    frameRecorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
    frameRecorder.setFormat("webm");
    frameRecorder.setGopSize(10);
    frameRecorder.setFrameRate(frameRate);
    frameRecorder.setVideoBitrate(5000);
    frameRecorder.setOption("content_type", "video/webm");
    frameRecorder.setOption("listen", "1");
    frameRecorder.start();
}
In my CameraView:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();
    Frame frame = new AndroidFrameConverter().convert(data, size.width, size.height);
    try {
        if (frameRecorder != null) {
            frameRecorder.record(frame);
        }
    } catch (FrameRecorder.Exception e) {
        e.printStackTrace();
    }
}
Here is one of the stack traces that appeared frequently while I searched for a solution:
org.bytedeco.javacv.FrameRecorder$Exception: avio_open error() error -111: Could not open 'http://localhost:9090'
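For reference (an interpretation, not part of the original question): FFmpeg's AVERROR codes wrap negative POSIX errno values, and on Linux errno 111 is ECONNREFUSED, i.e., nothing was accepting connections at http://localhost:9090 when avio_open ran. A quick check:

```python
import errno
import os

# AVERROR wraps negative POSIX errno values; the -111 in
# "avio_open error() error -111" is errno 111, ECONNREFUSED on Linux.
print(os.strerror(errno.ECONNREFUSED))  # -> "Connection refused"
```

That would be consistent with the recorder acting as a client while no server socket (despite the "listen" option) was open yet on that port.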
I couldn't find any other thread addressing this specific issue.
Thanks in advance
EDIT
Thanks to Chester Cobus, here is the code I used:
WebSocket:
// Constructor
AsyncHttpServer serverStream = new AsyncHttpServer();
List<WebSocket> sockets = new ArrayList<>();

// http://stackoverflow.com/a/33021907/5500092
// I'm planning to use more sockets. This is the only uniform expression I found.
serverStream.websocket("/((?:[^/]*/)*)(.*)", new AsyncHttpServer.WebSocketRequestCallback() {
    @Override
    public void onConnected(final WebSocket webSocket, AsyncHttpServerRequest request) {
        String uri = request.getPath();
        if (uri.equals("/live")) {
            sockets.add(webSocket);

            // Use this to clean up any references to your websocket
            webSocket.setClosedCallback(new CompletedCallback() {
                @Override
                public void onCompleted(Exception ex) {
                    try {
                        if (ex != null) Log.e("WebSocket", "Error");
                    } finally {
                        sockets.remove(webSocket);
                    }
                }
            });
        }
    }
});

// Updater (Observer pattern)
@Override
public void updated(byte[] data) {
    for (WebSocket socket : sockets) {
        socket.write(new ByteBufferList(data));
    }
}

Record Activity:
private long start_time = System.currentTimeMillis();

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    long now_time = System.currentTimeMillis();
    if ((now_time - start_time) > 250) {
        start_time = now_time;
        // https://forums.xamarin.com/discussion/40991/onpreviewframe-issue-converting-preview-byte-to-android-graphics-bitmap
        Camera.Size size = camera.getParameters().getPreviewSize();
        YuvImage image = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
        image.compressToJpeg(new Rect(0, 0, size.width, size.height), 60, byteArrayOutputStream);
        MainActivity.getWebStreamer().updated(byteArrayOutputStream.toByteArray());
    }
}
JavaScript:
var socket;
var imageElement;

/**
 * path  - String.Format("ws://{0}:8090/live", Window.Location.HostName)
 * image - HTMLImageElement
 */
function imageStreamer(path, image) {
    imageElement = image;
    socket = new WebSocket(path);

    socket.onmessage = function(msg) {
        var arrayBuffer = msg.data;
        var reader = new FileReader();
        reader.onload = function(e) {
            imageElement.src = e.target.result;
        };
        reader.readAsDataURL(arrayBuffer);
    };
}
-
What is the best way to merge (concat) mp4 files in Android?
26 March 2017, by Glimpse
I am building a camera app that will allow users to pause and resume recording of a video. Pause and resume is currently possible using MediaRecorder, but only for SDK >= 24.
For SDK < 24 I am using mediaRecorder.stop() and mediaRecorder.start(). Now I must merge those separate MP4 files into one.
This is my current approach using FFmpeg (with no luck so far):
private void mergeVideos() {
    if (filePaths.size() == 1) {
        cropVideoToSquare();
        return;
    }
    List<String> cmdList = new ArrayList<>();
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < filePaths.size(); i++) {
        cmdList.add("-i");
        cmdList.add(filePaths.get(i));
        sb.append("[").append(i).append(":0] [").append(i).append(":1]");
    }
    sb.append(" concat=n=").append(filePaths.size()).append(":v=1:a=1 [v] [a]");
    cmdList.add("-filter_complex");
    cmdList.add("\'" + sb.toString() + "\'");
    cmdList.add("-map");
    cmdList.add("\'[v]\'");
    cmdList.add("-map");
    cmdList.add("\'[a]\'");
    cmdList.add("-preset");
    cmdList.add("ultrafast");
    final String mergedVideoLocation = CameraHelper.generateVideoNameWithPath();
    cmdList.add(mergedVideoLocation);
    sb = new StringBuilder();
    for (String str : cmdList) {
        sb.append(str).append(" ");
    }
    String[] cmd = cmdList.toArray(new String[cmdList.size()]);
    try {
        ffmpeg = FFmpeg.getInstance(this);
        ffmpeg.execute(cmd, new MergingHandler() {
            @Override
            public void onSuccess(String message) {
                Log.d("mergeVideos", "onSuccess");
                currentFilePath = mergedVideoLocation;
                cropVideoToSquare();
            }

            @Override
            public void onFailure(String message) {
                Log.d("mergeVideos", "onFailure: " + message);
                pDialog.dismiss();
            }

            @Override
            public void onStart() {
                Log.d("mergeVideos", "onStart");
                pDialog.show();
            }
        });
    } catch (FFmpegCommandAlreadyRunningException e) {
        e.printStackTrace();
    }
}

The exception that I am getting:
ffmpeg version n3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
  built with gcc 4.8 (GCC)
  configuration: --target-os=linux --cross-prefix=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/vagrant/SourceCode/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/vagrant/SourceCode/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
  libavutil      55. 17.103 / 55. 17.103
  libavcodec     57. 24.102 / 57. 24.102
  libavformat    57. 25.100 / 57. 25.100
  libavdevice    57.  0.101 / 57.  0.101
  libavfilter     6. 31.100 /  6. 31.100
  libswscale      4.  0.100 /  4.  0.100
  libswresample   2.  0.101 /  2.  0.101
  libpostproc    54.  0.100 / 54.  0.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Pictures/CameraFeatures/VID_20170326_100318.mp4':
  Metadata:
    major_brand     : 3gp4
    minor_version   : 0
    compatible_brands: isom3gp4
    creation_time   : 2017-03-26 08:03:24
  Duration: 00:00:04.82, start: 0.000000, bitrate: N/A
    Stream #0:0(eng): Video: mpeg4 (Simple Profile) (mp4v / 0x7634706D), yuv420p, 1280x720 [SAR 1:1 DAR 16:9], 7764 kb/s, 30.02 fps, 30 tbr, 90k tbn, 300 tbc (default)
    Metadata:
      rotate          : 90
      creation_time   : 2017-03-26 08:03:24
      handler_name    : VideoHandle
    Side data:
      displaymatrix: rotation of -90.00 degrees
    Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 127 kb/s (default)
    Metadata:
      creation_time   : 2017-03-26 08:03:24
      handler_name    : SoundHandle
/storage/emulated/0/Pictures/CameraFeatures/VID_20170326_100328.mp4: No such file or directory
Can someone point out what I am doing wrong?
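One plausible culprit (an assumption, not a confirmed fix; the log's "No such file or directory" for the second input should be ruled out first): when the command is passed to exec() as an argv array, no shell strips quotes, so wrapping the filter graph and map labels in literal single quotes makes the quotes part of the arguments. A minimal Python sketch of assembling the same argv without quoting (`build_concat_cmd` and the file names are illustrative):

```python
def build_concat_cmd(inputs, output):
    """Build an ffmpeg argv for the concat filter.

    Arguments passed as an argv array must be plain tokens: the filter
    graph and the [v]/[a] map labels carry no quotes of their own.
    """
    cmd = []
    for path in inputs:
        cmd += ["-i", path]
    graph = " ".join("[{0}:0] [{0}:1]".format(i) for i in range(len(inputs)))
    graph += " concat=n={}:v=1:a=1 [v] [a]".format(len(inputs))
    cmd += ["-filter_complex", graph,
            "-map", "[v]", "-map", "[a]",
            "-preset", "ultrafast", output]
    return cmd
```

If all clips share the same codec and parameters (as segments from one MediaRecorder session usually do), the concat demuxer with stream copy may also be a faster, re-encode-free alternative.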