Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Camera2 API switching camera during recording

    27 June, by Max Efimov

    I'm working with the Camera2 API, and in my app it's important to switch cameras during recording. After each camera switch I save the video fragment to disk, and after stopping the recording I merge the fragments into one video using the -f concat -safe 0 -i listPath -c copy finalPath command of FFmpeg for Android. When I open a separate video fragment, it looks correct, but in the merged video the fragments recorded on the back camera are rotated 180 degrees. The problem appears only in the merged video.

    In the video metadata, the "rotate" tag is 90 degrees for fragments recorded on the back camera and 270 degrees otherwise.

    As I understand it, when FFmpeg merges videos it automatically rotates them when their "rotate" metadata tags differ.

    How can I solve this problem?

    P.S. I also tried the MP4Parser library for merging, but it has the same problem.
    P.P.S. The -noautorotate parameter for ffmpeg does not help.
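    A common workaround (a sketch, not from the question; the fragment file names are hypothetical): since -c copy cannot reconcile conflicting "rotate" tags, re-encode each fragment first so the rotation is baked into the pixels and the tag is cleared, then run the same concat command on uniformly oriented input.

    ```shell
    # Hypothetical file names. FFmpeg applies the "rotate" tag on decode by
    # default, so a re-encode produces upright pixels with no rotate tag left.
    for f in fragment_*.mp4; do
      ffmpeg -y -i "$f" -c:v libx264 -preset ultrafast -c:a copy "norm_$f"
    done
    # Then concatenate the normalized fragments as before:
    # ffmpeg -f concat -safe 0 -i list.txt -c copy final.mp4
    ```

    The re-encode costs time and some quality, but it is the price of keeping the final concat step a pure stream copy.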

  • use ffmpeg to combine raspberry pi audio and video to send to youtube [closed]

    27 June, by CrabbyPete

    I can stream video from my pi camera to youtube using this script

    libcamera-vid --inline --nopreview -t 0 --width 640 --height 480 --framerate 15 --codec h264 -o - | ffmpeg -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 -thread_queue_size 1024 -use_wallclock_as_timestamps 1 -i pipe:0 -c:v copy -c:a aac -preset fast -strict experimental -f flv 
    

    I have an i2s microphone board on the Pi and can record audio through ALSA and PulseAudio using the command

    arecord -D pulse -f S16_LE -r 48000 -c 1 -d 10 test.wav

    Is there a way to edit the script to combine the audio and video streams on the Pi?
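    One way to combine them (a sketch, assuming ffmpeg was built with PulseAudio support and that the default pulse source is the i2s microphone; the RTMP destination is a placeholder, since the question's command omits it): replace the anullsrc dummy audio input with a pulse input.

    ```shell
    # Video from libcamera-vid on stdin, audio from the PulseAudio default
    # source; video is stream-copied, audio is encoded to AAC for FLV/RTMP.
    libcamera-vid --inline --nopreview -t 0 --width 640 --height 480 \
      --framerate 15 --codec h264 -o - | \
    ffmpeg -f pulse -thread_queue_size 1024 -i default \
           -thread_queue_size 1024 -use_wallclock_as_timestamps 1 -i pipe:0 \
           -c:v copy -c:a aac -ar 44100 \
           -f flv rtmp://a.rtmp.youtube.com/live2/STREAM_KEY   # placeholder
    ```

    The -thread_queue_size on both inputs matters here, because the audio and video arrive from independent real-time sources.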

  • How to encode jpeg images to H264 very fast (transform images to video)

    27 June, by Paul

    I have 30 JPEG images (.jpg) at a resolution of 480 x 640. Each image takes about 20 KB (all of them together take about 600 KB).

    I am using FFmpeg command to encode these images into a video in H264 format.

    I need this to be done very fast - about 1 second.

    Using the classic command:

    ffmpeg -y  -f  image2   -r 1/5   -i image_%d.jpg   -c:v libx264   -r 30   video.mp4
    

    takes about 90 seconds.

    After adding -preset ultrafast:

    ffmpeg -y  -f  image2   -r 1/5   -i image_%d.jpg   -c:v libx264   -preset ultrafast    -r 30   video.mp4
    

    the encoding takes about 15 seconds, which is much better, but still not fast enough.

    I've also tried other parameters, like:

    -profile:v baseline
    
    -qscale:v
    
    -b:v 1000k
    
    -crf 24
    

    but the encoding time does not fall below 10 seconds.

    I'm not familiar with FFmpeg commands or the parameters I need to use, which is why I'm posting this question here.

    The video quality needs to be ok, doesn't need to be perfect.

    As a note: I am running these commands in an Android application that bundles the ffmpeg executable, using a ProcessBuilder.
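    A direction worth trying (a sketch, not a guaranteed fix): the commands above read the images at 1/5 fps (-r 1/5 before -i) and output at 30 fps (-r 30 after), so each of the 30 images is duplicated roughly 150 times and ffmpeg encodes on the order of 4500 frames. Reading the images at the target rate with the input-side -framerate flag encodes only 30 frames.

    ```shell
    # One encoded frame per input image instead of ~150 duplicates each;
    # ultrafast + zerolatency minimizes per-frame work, -crf 28 keeps the
    # quality "ok", and yuv420p keeps the file playable on Android devices.
    ffmpeg -y -framerate 30 -i image_%d.jpg \
           -c:v libx264 -preset ultrafast -tune zerolatency \
           -crf 28 -pix_fmt yuv420p video.mp4
    ```

    If a slower slideshow-style video is wanted, change the output duration with a lower -framerate value rather than an output -r, so the frame count stays at 30.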

    Reply1 (to Robert Rowntree):

    ArrayList<String> l2 = new ArrayList<>();
    //l2.add("ffmpeg");
    l2.add("/data/data/" + packageName + "/ffmpeg");
    l2.add("-y");
    l2.add("-loop");
    l2.add("1");

    l2.add("-i");
    l2.add("frame_%d.jpg");

    //l2.add("-t");
    //l2.add(strngs[3]);

    l2.add("-r");
    l2.add("1/2");
    l2.add("-preset");
    l2.add("superfast");
    l2.add("-tune");
    l2.add("zerolatency");

    //l2.add("-pass");
    //l2.add(Integer.valueOf(pass).toString());

    l2.add("-vcodec");
    l2.add("libx264");
    l2.add("-b:v");
    l2.add("200k");
    l2.add("-bt");
    l2.add("50k");
    l2.add("-threads");
    l2.add("0");
    l2.add("-b_strategy");
    l2.add("1");

    //if (pass == 1) {
    //    l2.add("-an");
    //} else {
    //    l2.add("-acodec");
    //    l2.add("copy");
    //}

    l2.add("-f");
    l2.add("mp4");
    l2.add("-strict");
    l2.add("-2");
    //l2.add("-passlogfile");
    //l2.add(strngs[4]);

    //if (pass == 1) {
    //    l2.add("/dev/null");
    //} else {
    //    l2.add(strngs[5]);
    //}

    l2.add("video.mp4");
    //return l2;
    
  • FFmpeg: high quality animated GIF? [closed]

    26 June, by David Hope

    I'm generating an animated GIF from a video on my server.

    The generated GIF is not really high quality and it looks like the pixels are huge.

    Example:

    example GIF

    This is how I generate the GIF:

    /usr/bin/ffmpeg -i video.mkv -vf scale=500:-1 -t 10 -r 10 image.gif
    

    I did a search on Google and came across this:

    /usr/bin/ffmpeg -i video.mkv -r 20 -f image2pipe -vcodec ppm - | convert -delay 5 - output.gif
    

    But the command above doesn't do anything, and no output.gif is generated at all.

    There are some tutorials I came across, but none of them worked for me, and some involve ImageMagick, which I don't have access to.

    Could someone please let me know if there is a clear way to generate a high-quality GIF using FFmpeg?
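    The usual FFmpeg-only approach (a sketch using FFmpeg's real palettegen and paletteuse filters; file names match the question): generate a custom 256-color palette from the clip in a first pass, then map the frames through it in a second pass, which removes most of the blocky dithering of the default GIF palette.

    ```shell
    # Pass 1: build an optimized palette from the first 10 seconds.
    ffmpeg -t 10 -i video.mkv \
           -vf "fps=10,scale=500:-1:flags=lanczos,palettegen" palette.png
    # Pass 2: render the GIF through that palette instead of the generic one.
    ffmpeg -t 10 -i video.mkv -i palette.png \
           -filter_complex "fps=10,scale=500:-1:flags=lanczos[x];[x][1:v]paletteuse" \
           image.gif
    ```

    The lanczos scaler is optional but usually sharper than the default when downscaling to 500 pixels wide.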

  • Use an IP-camera with webRTC

    26 June, by Minz

    I want to use an IP camera with WebRTC. However, WebRTC seems to support only webcams, so I am trying to convert the IP camera's stream into a virtual webcam.

    I found software like IP Camera Adapter, but it doesn't work well (2-3 frames per second with a 2-second delay) and it is Windows-only; I would prefer to use Linux if possible.

    I tried ffmpeg/avconv:

    • first, I created a virtual device with v4l2loopback (the command was: sudo modprobe v4l2loopback). The virtual device is detected and can be fed a video (.avi) with a command like: ffmpeg -re -i testsrc.avi -f v4l2 /dev/video1

    • the stream from the IP camera is available at rtsp://IP/play2.sdp for a D-Link DCS-5222L camera; this stream can be captured by ffmpeg.

    My problem is linking these two steps (receiving the RTSP stream and writing it to the virtual webcam). I tried ffmpeg -re -i rtsp://192.168.1.16/play2.sdp -f video4linux2 -input_format mjpeg -i /dev/video0, but there is an error with v4l2 (v4l2 not found).

    Does anyone have an idea how to use an IP camera with WebRTC?
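    Combining the two working steps from the question (a sketch, assuming the v4l2loopback device is /dev/video1 as in the test above): the RTSP URL is the input and the v4l2 device is the output. The attempted command used -input_format mjpeg with a second -i, which treats /dev/video0 as an extra capture input rather than a destination; that is likely what triggered the error.

    ```shell
    # Pull the camera's RTSP stream (TCP is usually more reliable than UDP)
    # and write decoded frames to the loopback device; WebRTC-capable
    # browsers then see /dev/video1 as an ordinary webcam.
    ffmpeg -re -rtsp_transport tcp -i rtsp://192.168.1.16/play2.sdp \
           -pix_fmt yuv420p -f v4l2 /dev/video1
    ```

    The yuv420p pixel format is a conservative choice that most v4l2 consumers accept; drop it if the loopback device negotiates a format on its own.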