Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
ffmpeg carrierwave-video always returns "unknown encoder libfaac"
17 July 2015, by olgash
I can give ffmpeg videos to convert via the command line, and it converts them happily, but when I ask it to convert things in Rails, it returns "Unknown encoder libfaac" no matter which video I give it.
I call it using this line:
process encode_video: [:mp4, resolution: "640x480"]
I've already spent hours trying (unsuccessfully) to compile ffmpeg with libfaac on Windows, but now it just seems ridiculous, because not everything I pass it is even AAC. What's going on?
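The libfaac encoder is nonfree and is left out of typical prebuilt Windows ffmpeg binaries, and the error refers to the requested *output* encoder, not the input's codec — which would explain why it fails regardless of the input file. A minimal sketch of the same conversion using ffmpeg's built-in aac encoder instead (input.mov and output.mp4 are placeholder names; the command is built and echoed here so the full command line is visible without running ffmpeg):

```shell
# Build the ffmpeg command with the built-in AAC encoder instead of libfaac.
# In ffmpeg releases of this era the native encoder needs -strict experimental.
cmd="ffmpeg -i input.mov -s 640x480 -c:v libx264 -c:a aac -strict experimental output.mp4"
echo "$cmd"
```

If carrierwave-video exposes an audio-codec option in its processing options (check the gem's README), overriding its default there should have the same effect without touching ffmpeg itself.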
-
Create a video from images in the Camera folder using ffmpeg and OpenCV
17 July 2015, by Nikhil Borad

    new AsyncTask<Void, Void, Void>() {
        ProgressDialog dialog;

        @Override
        protected void onPreExecute() {
            dialog = new ProgressDialog(MainActivity.this);
            dialog.setMessage("Generating video, please wait.........");
            dialog.setCancelable(false);
            dialog.show();
        }

        @Override
        protected Void doInBackground(Void... arg0) {
            File folder = Environment
                    .getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM);
            String path = folder.getAbsolutePath() + "/Camera";
            ArrayList<String> paths = new ArrayList<String>(Arrays.asList(folder.list()));
            FFmpegFrameRecorder recorder =
                    new FFmpegFrameRecorder(path + "/" + "test.mp4", 400, 400);
            videoPath = path + "/" + "test.mp4";
            try {
                //recorder.setVideoCodec(5);
                recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
                //recorder.setFormat("3gp");
                recorder.setFormat("mp4");
                recorder.setFrameRate(frameRate);
                recorder.setVideoBitrate(30);
                startTime = System.currentTimeMillis();
                recorder.start();
                for (int i = 0; i < paths.size(); i++) {
                    // ... (image-loading statements lost in formatting) ...
                    System.out.println(paths.get(i)); // original format string lost
                    long t = 3000 * (System.currentTimeMillis() - startTime);
                    if (t > recorder.getTimestamp()) {
                        recorder.setTimestamp(t);
                        recorder.record(image); // <-- error here
                    }
                }
                System.out.println("Total Time:- " + recorder.getTimestamp());
                recorder.stop();
            } catch (Exception e) {
                e.printStackTrace();
            }
            return null;
        }

        @Override
        protected void onPostExecute(Void result) {
            dialog.dismiss();
            Intent intent = new Intent(Intent.ACTION_VIEW);
            intent.setDataAndType(Uri.parse(videoPath), "video/mp4");
            startActivity(intent);
            Toast.makeText(MainActivity.this, "Done", Toast.LENGTH_SHORT).show();
        }
    }.execute();

I get an error in recorder.record(image);
The error says:
Error:(69, 37) error: no suitable method found for record(IplImage) method FFmpegFrameRecorder.record(AVFrame) is not applicable (actual argument IplImage cannot be converted to AVFrame by method invocation conversion) method FFmpegFrameRecorder.record(Frame) is not applicable (actual argument IplImage cannot be converted to Frame by method invocation conversion) method FrameRecorder.record(Frame) is not applicable (actual argument IplImage cannot be converted to Frame by method invocation conversion)
Thanks in advance.
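The compile error shows that this FFmpegFrameRecorder overload takes a Frame (or AVFrame), not an IplImage, so the image generally has to go through a converter such as JavaCV's OpenCVFrameConverter.ToIplImage before being recorded. Independently of the Java code, the same encode can be sanity-checked with the ffmpeg CLI; a sketch where the path, frame rate, and glob pattern are assumptions (the command is built and echoed rather than run):

```shell
# Encode the Camera-folder JPEGs into an mp4 (paths, -framerate and the
# glob pattern are assumptions; needs an ffmpeg built with glob support).
dir="/sdcard/DCIM/Camera"
cmd="ffmpeg -framerate 1 -pattern_type glob -i '$dir/*.jpg' -c:v mpeg4 -s 400x400 $dir/test.mp4"
echo "$cmd"
```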
-
Can someone please explain to me why this is happening
17 July 2015, by user2270995
I am trying to show a frame decoded with FFmpeg in a UIImageView. The stepFrame method given below is called at an interval of 1/30 second. The setImage method does the conversion for the frame and sets the image on the image view. Now, if I call setImage inside stepFrame, after the frame has finished decoding, the image doesn't show in the image view. But if I call setImage after calling stepFrame, the image shows fine in the image view.

    - (BOOL)stepFrame {
        // AVPacket packet;
        int frameFinished = 0;
        while (!frameFinished && av_read_frame(pFormatCtx, &packet) >= 0) {
            // Is this a packet from the video stream?
            if (packet.stream_index == videoStream) {
                // Decode video frame
                avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
            }
            if (frameFinished) {
                // this block of code is my source of confusion
                [self setImage];
            }
            if (packet.stream_index == audioStream) {
                // NSLog(@"audio stream");
                // audio stuff
            }
        }
        return frameFinished != 0;
    }

    - (void)setImage {
        if (!pFrame->data[0])
            return;
        [self convertFrameToRGB]; // uses sws_scale to write the converted frame to picture
        imageview.image = [self imageFromAVPicture:picture width:outputWidth height:outputHeight];
    }
-
Upload a photo into an online video [on hold]
17 July 2015, by jennie788
I'd like to allow my online users to upload their photo to my web server. Then I want their photo to be embedded into an online video.
The idea is to make it look as if their photo appears in an empty frame on a shelf for example. I want them to be the hero of the video.
I'd also like them to be able to view the final video with their uploaded photo seconds after the upload.
I know how to do this in Flash, but what other tools can I use to achieve this?
I thought about using ffmpeg to add the picture on the fly. Is that the right approach?
All I need is somebody to point me in the right direction. What tool should I use to do this?
Thanks
Jennie
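ffmpeg can do this kind of compositing server-side with its overlay filter: scale the uploaded photo and place it at the position of the empty picture frame in the template video. A sketch where the file names, size, and coordinates are all placeholders (the command is built and echoed rather than run):

```shell
# Scale the uploaded photo to 200x300 and overlay it at (100,50) on the
# template video; every name and number here is a placeholder.
cmd="ffmpeg -i template.mp4 -i photo.jpg -filter_complex \"[1:v]scale=200:300[p];[0:v][p]overlay=100:50\" -c:a copy output.mp4"
echo "$cmd"
```

Whether the result is ready "seconds after the upload" depends on re-encoding time, so a short, low-resolution template helps.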
-
Record local audio in a docker container
17 July 2015, by pablo
How can I record the audio of an application like Firefox inside a Docker container with ffmpeg? I've found examples of how to forward PulseAudio to the host (Netflix, Skype).
When I try to use pactl:
pactl list sources
Or
docker exec -it bash
apt-get install pulseaudio
pactl load-module module-native-protocol-unix auth-anonymous=1 socket=/tmp/.pulse-socket
I'm getting an error:
Connection failure: Connection refused pa_context_connect() failed: Connection refused
This also fails:
ffmpeg -f pulse -i default /tmp/pulse.wav
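"Connection refused" usually means no PulseAudio daemon is reachable inside the container. One common pattern is to mount the host's PulseAudio socket into the container and point clients at it via the PULSE_SERVER environment variable; a sketch where the socket path, UID, and image name are assumptions (the command is built and echoed rather than run):

```shell
# Share the host's PulseAudio socket with the container so ffmpeg's pulse
# input can reach a running daemon (paths, UID and image name are assumptions).
cmd="docker run -it -v /run/user/1000/pulse/native:/tmp/pulse-socket -e PULSE_SERVER=unix:/tmp/pulse-socket myimage ffmpeg -f pulse -i default /tmp/pulse.wav"
echo "$cmd"
```

This records whatever the host daemon exposes as the default source; capturing a single application's audio would additionally need a dedicated sink or monitor source on the PulseAudio side.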