Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
Is there a way to use InputStream to get the media details
13 May 2019, by Naman
I am currently accepting an InputStream from a client of my server for a File uploaded via multipart/form-data. I am currently using the ffmpeg-cli-wrapper library to run ffprobe and ffmpeg. The challenge I see up front is that the APIs exposed by the client don't make use of a stream, but rather a mediaPath. Is there a way, or a library similar to this one, that can provide me an FFProbe instance as shown in the library's usage example?
To add to the pain, I am aware of transforming the InputStream into a file and then passing that mediaPath, but that is unnecessary space on my processing disk and an additional cleanup step as well.
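ffprobe itself can read from standard input, so one way to avoid the temporary file is to stream the upload straight into an ffprobe process (e.g. via ProcessBuilder in Java) instead of going through the wrapper's path-based API. A minimal shell sketch of the idea, with an assumed file name (sample.mp4 is synthesized here so the example runs end to end); note the container must be probe-able from a non-seekable stream, which is why the sample is muxed with -movflags +faststart:

```shell
# Generate a short sample clip (assumed name, for illustration only).
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x240:rate=25 \
       -pix_fmt yuv420p -movflags +faststart sample.mp4

# Probe the media by streaming it on stdin instead of passing a path.
cat sample.mp4 | ffprobe -v error -print_format json -show_format pipe:0 > probe.json
```

Whether this can be wired into ffmpeg-cli-wrapper directly is unclear; falling back to a plain Process around ffprobe may be simpler.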
ffmpeg - Text fontcolor is changed (faded) because of alpha (I think). Is there any way to avoid this behavior?
13 May 2019, by Bedrule Paul
I am using the following command:
ffmpeg -i ~/Desktop/input.mp4 -filter_complex \
"color=black:100x100[c]; \
 [c][0]scale2ref[ct][mv]; \
 [ct]setsar=1,split=1[t1]; \
 [t1]drawtext=text='Test Text 1':fontsize=36:fontcolor=#13348b,split[text1][alpha1]; \
 [text1][alpha1]alphamerge,rotate=30:ow=rotw(30):oh=roth(30):c=black@0[txta1]; \
 [mv][txta1]overlay=x='min(0,-H*sin(30))+500':y='min(0,W*sin(30))+350':shortest=1" \
~/Desktop/result.mp4 -y
I think alpha is the problem, but I don't know how to avoid it.
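The fade is likely the alphamerge: it replaces the first input's alpha with the grayscale of the second, and here both inputs are the same rendered text, so the dark blue glyphs receive a low alpha and wash out. One possible workaround (a sketch with assumed file names and placement, not the only fix) is to draw the text on an already-transparent RGBA canvas so no alphamerge is needed at all:

```shell
# Synthesize an input clip so the example is self-contained.
ffmpeg -y -f lavfi -i testsrc=duration=1:size=640x360:rate=25 input.mp4

# Draw the text on a transparent canvas; the glyphs keep their exact
# fontcolor and only the background stays transparent.
ffmpeg -y -i input.mp4 -filter_complex \
"color=black@0:300x100,format=rgba, \
 drawtext=text='Test Text 1':fontsize=36:fontcolor=0x13348b, \
 rotate=30:ow=rotw(30):oh=roth(30):c=black@0[txt]; \
 [0][txt]overlay=x=100:y=50:shortest=1" result.mp4
```

drawtext needs a font available via fontconfig (or an explicit fontfile=), which is an assumption about the build.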
-
How to convert a JavaScript animation to video on the server side using Node.js?
13 May 2019, by user9964622
I have an app where users can create animations. I want to be able to convert these animations to video on the server side, so users can save and share them, e.g. on YouTube.
Here is what I have so far: an animation created using CreateJS and ffmpegserver.js.
ffmpegserver.js is a simple Node server and library that sends canvas frames to the server and uses FFmpeg to compress the video. It can be used standalone or with CCapture.js.
Test3.html
<body onload="init();">
Simple Tween Demo
<script src="http://localhost:8081/ffmpegserver/CCapture.js"></script>
<script src="http://localhost:8081/ffmpegserver/ffmpegserver.js"></script>
<script src="https://code.createjs.com/1.0.0/createjs.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/tween.js/17.2.0/Tween.js"></script>
<script src='test3.js'></script>
Test3.js
/* eslint-disable eol-last */
/* eslint-disable no-undef */
/* eslint-disable quotes */
var canvas, stage;

function init() {
  var framesPerSecond = 60;
  var numFrames = framesPerSecond * 5; // a 5 second 60fps video
  var frameNum = 0;

  var progressElem = document.getElementById("progress");
  var progressNode = document.createTextNode("");
  progressElem.appendChild(progressNode);

  function onProgress(progress) {
    progressNode.nodeValue = (progress * 100).toFixed(1) + "%";
  }

  function showVideoLink(url, size) {
    size = size ? (" [size: " + (size / 1024 / 1024).toFixed(1) + "meg]") : " [unknown size]";
    var a = document.createElement("a");
    a.href = url;
    var filename = url;
    var slashNdx = filename.lastIndexOf("/");
    if (slashNdx >= 0) {
      filename = filename.substr(slashNdx + 1);
    }
    a.download = filename;
    a.appendChild(document.createTextNode("Download"));
    var container = document.getElementById("container").insertBefore(a, progressElem);
  }

  var capturer = new CCapture({
    format: 'ffmpegserver',
    //workersPath: "3rdparty/",
    //format: 'gif',
    //verbose: true,
    framerate: framesPerSecond,
    onProgress: onProgress,
    //extension: ".mp4",
    //codec: "libx264",
  });
  capturer.start();

  canvas = document.getElementById("testCanvas");
  stage = new createjs.Stage(canvas);

  var ball = new createjs.Shape();
  ball.graphics.setStrokeStyle(5, 'round', 'round');
  // eslint-disable-next-line quotes
  ball.graphics.beginStroke('#000000');
  ball.graphics.beginFill("#FF0000").drawCircle(0, 0, 50);
  ball.graphics.setStrokeStyle(1, 'round', 'round');
  ball.graphics.beginStroke('#000000');
  ball.graphics.moveTo(0, 0);
  ball.graphics.lineTo(0, 50);
  ball.graphics.endStroke();
  ball.x = 200;
  ball.y = -50;

  createjs.Tween.get(ball, {loop: -1})
    .to({x: ball.x, y: canvas.height - 55, rotation: -360}, 1500, createjs.Ease.bounceOut)
    .wait(1000)
    .to({x: canvas.width - 55, rotation: 360}, 2500, createjs.Ease.bounceOut)
    .wait(1000)
    .to({scaleX: 2, scaleY: 2}, 2500, createjs.Ease.quadOut)
    .wait(1000);

  stage.addChild(ball);
  createjs.Ticker.addEventListener("tick", stage);

  function render() {
    requestAnimationFrame(render);
    capturer.capture(canvas);
    ++frameNum;
    if (frameNum < numFrames) {
      progressNode.nodeValue = "rendered frame# " + frameNum + " of " + numFrames;
    } else if (frameNum === numFrames) {
      capturer.stop();
      capturer.save(showVideoLink);
    }
  }
  render();
}
Everything works fine; you can test it yourself by cloning the repo.
Right now the animation rendering happens on the client side; I would like it to happen on the backend instead.
What do I need to change to make this animation render on the backend server side using Node.js? Any help or suggestions would be appreciated.
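Server side, the problem splits in two: something has to render each frame to an image (node-canvas or a headless browser are common choices; both are assumptions here, not part of ffmpegserver.js), and FFmpeg then turns the numbered frames into a video. A minimal sketch of the encoding half, with the frame sequence synthesized by ffmpeg's testsrc just so the example runs end to end (the frames/ layout and 60 fps match the demo's settings but are assumptions):

```shell
# Stand-in for the server-side renderer: write 60 numbered PNG frames.
mkdir -p frames
ffmpeg -y -f lavfi -i testsrc=duration=1:size=640x360:rate=60 frames/frame-%05d.png

# Encode the numbered frame sequence into an MP4 at the demo's 60 fps.
ffmpeg -y -framerate 60 -i frames/frame-%05d.png -c:v libx264 -pix_fmt yuv420p animation.mp4
```

If the frames can be produced in order, they can also be piped to ffmpeg's stdin (-f image2pipe) to avoid writing them to disk.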
-
Extract audio with ffmpeg, Linux
13 May 2019, by Puffin
I'm trying to extract the audio tracks from some AVI videos and save them to their own files, ideally without re-encoding.
I've had a look through https://www.ffmpeg.org/ffmpeg.html#Audio-Options and the "ffmpeg to extract audio from video" question here, though I'm getting errors regardless of the approach I try.
My latest command is:
ffmpeg -i /home/d/Pictures/Test/input-video.AVI -map 0:a -vn -acodec copy /home/d/Pictures/Test/output-audio.m4a
The key part of the output is:
Guessed Channel Layout for Input Stream #0.1 : mono
Input #0, avi, from '/home/d/Pictures/Test/input-video.AVI':
  Duration: 00:00:05.94, start: 0.000000, bitrate: 18131 kb/s
    Stream #0:0: Video: mjpeg (MJPG / 0x47504A4D), yuvj422p(pc, bt470bg/unknown/unknown), 1280x720, 17995 kb/s, 30.28 fps, 30.28 tbr, 30.28 tbn, 30.28 tbc
    Stream #0:1: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 11025 Hz, 1 channels, s16, 176 kb/s
File '/home/d/Pictures/Test/output-audio.m4a' already exists. Overwrite ? [y/N] y
[ipod @ 0x1d89520] Codec for stream 0 does not use global headers but container format requires global headers
[ipod @ 0x1d89520] Could not find tag for codec pcm_s16le in stream #0, codec not currently supported in container
Output #0, ipod, to '/home/d/Pictures/Test/output-audio.m4a':
  Metadata:
    encoder         : Lavf56.40.101
    Stream #0:0: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 11025 Hz, mono, 176 kb/s
Stream mapping:
  Stream #0:1 -> #0:0 (copy)
Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
I believe I have the right audio stream number from this output, and thus am assuming the "-map 0:a" part isn't the problem.
I'm running Linux Mint 18.1.
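The log points at the container rather than the mapping: "Could not find tag for codec pcm_s16le" means the ipod/m4a muxer cannot hold raw PCM. Keeping -acodec copy just needs a container that can, such as WAV; re-encoding to AAC is the alternative when a .m4a output is required. A sketch with a synthesized input (all file names here are assumptions standing in for the real paths):

```shell
# Build a small AVI with PCM audio, mirroring the stream layout in the log.
ffmpeg -y -f lavfi -i "sine=frequency=440:duration=1" -c:a pcm_s16le input-audio.wav
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x240:rate=25 -i input-audio.wav \
       -c:v mjpeg -c:a pcm_s16le -shortest input-video.avi

# Stream-copy the PCM audio into a container that accepts it.
ffmpeg -y -i input-video.avi -map 0:a -vn -acodec copy output-audio.wav

# Or re-encode to AAC if an .m4a file is required.
ffmpeg -y -i input-video.avi -map 0:a -vn -c:a aac output-audio.m4a
```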
-
FFMPEG decoding video produces discolored YUV frame
13 May 2019, by Jeff Gong
I am using FFmpeg directly to decode a single H.264-encoded video frame from a test video, using the following command:
ffmpeg -i test.mp4 -ss 00:00:00 -vframes 1 -pix_fmt yuv420p output.yuv
For some reason, when I open the file on a YUV viewer, I can distinctly tell that the colors are slightly off compared to the original input. I've tried to play with the colorspace and color matrix options, but nothing I do seems to replicate exactly the original colors.
For example, I've also tried the following commands:
ffmpeg -i test.mp4 -ss 00:00:00 -vframes 1 -pix_fmt yuv420p -vf colormatrix=bt470:bt709 output.yuv
and
ffmpeg -i test.mp4 -ss 00:00:00 -vframes 1 -pix_fmt yuv420p -color_primaries bt709 -color_trc linear -colorspace bt709 output.yuv
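A raw .yuv file carries no resolution, pixel-format, or colorspace metadata, so the viewer has to guess all three; HD H.264 is typically BT.709 while many YUV viewers assume BT.601, and that mismatch shows up as exactly this kind of slight discoloration. A sketch of an explicit matrix conversion via the scale filter (test.mp4 is synthesized here, and whether bt601 output matches a given viewer's assumption needs verifying):

```shell
# Stand-in for the real source clip (assumed name).
ffmpeg -y -f lavfi -i testsrc=duration=1:size=1280x720:rate=25 \
       -c:v libx264 -pix_fmt yuv420p test.mp4

# Dump one frame, converting BT.709 samples to BT.601 for viewers
# that assume BT.601 on untagged raw YUV.
ffmpeg -y -i test.mp4 -vframes 1 \
       -vf scale=in_color_matrix=bt709:out_color_matrix=bt601 \
       -pix_fmt yuv420p output.yuv
```

Checking the viewer's matrix/range settings (or switching it to BT.709) is worth trying before re-converting the data.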