Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Is it possible to convert m2ts with Dolby Vision into mp4?

    24 August 2018, by Godmax

    Is it possible to convert an m2ts file with dual-layer Dolby Vision information into an MP4 without any loss of that information?
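
    A stream-copy remux is the usual no-loss route; a minimal sketch follows, using a synthetic sample as a stand-in for the real m2ts. Whether the dual-layer Dolby Vision RPU/enhancement-layer metadata actually survives into mp4 depends on the ffmpeg version, so this is something to verify, not a guarantee:

```shell
# Create a small transport stream as a stand-in for the real m2ts
# (testsrc is synthetic video; it carries no Dolby Vision data):
ffmpeg -y -f lavfi -i testsrc=duration=1:size=128x72:rate=10 \
       -c:v mpeg2video input.m2ts

# Remux without re-encoding: -c copy transfers every selected stream
# bit-for-bit, so the remux itself loses no quality.
ffmpeg -y -i input.m2ts -map 0 -c copy output.mp4
```

    If the mp4 muxer rejects one of the streams, `-map` can be narrowed to the streams it does accept.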

  • ffmpeg Error extracting one frame

    24 August 2018, by Efrem Blazquez

    I need to extract a frame from a video to generate a thumbnail, and I'm using the following call to ffmpeg:

    ffmpeg -i /projectes/macba/TEMP/video_test/DIG_A-HIS-04943_001_h.mov -r 1 -ss 00:00:59 -t 120:-1 test.jpg
    

    and it fails with the following error:

    [buffer @ 0x174efe0] Buffering several frames is not supported. Please consume all available frames before adding a new one.

    It doesn't generate the image. I'm not very used to ffmpeg, and I'm having trouble spotting the problem.

    Using ffmpeg version 0.8.17-4:0.8.17-0ubuntu0.12.04.2

    Any hints?

    Thanks!
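
    The `-t 120:-1` argument is not a valid duration value, and `-r 1` is unnecessary for a single thumbnail; a sketch of a one-frame grab that avoids both (a synthetic clip stands in for the real .mov):

```shell
# Generate a short sample clip (stand-in for the real input file):
ffmpeg -y -f lavfi -i testsrc=duration=2:size=128x72:rate=10 sample.mp4

# Seek to the wanted timestamp and stop after exactly one video
# frame -- no -r or -t is needed for a thumbnail:
ffmpeg -y -ss 00:00:01 -i sample.mp4 -frames:v 1 test.jpg
```

    On the old 0.8.x build mentioned above, `-vframes 1` is the older spelling of `-frames:v 1`.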

  • Creating own video server with ffserver/ffmpeg streaming RTSP

    24 August 2018, by Brykyz

    I have created an app that plays video when a URL is provided. It can receive video via RTSP (I have already tried it with the famous Big Buck Bunny stream), but now I want to debug it with my own server. I have downloaded ffserver/ffmpeg and want to create my own video server, but I don't know how.

    I have already tried

    ffserver &
    sudo ffmpeg -r 24 -i "video.mp4" http://localhost:8090/feed1.ffm
    ffplay http://localhost:8090/feed1.ffm
    

    but it doesn't work. I am trying to achieve the same thing that happens when I type into the terminal

    ffplay rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov
    

    (it should play my video), but I want to use a private server.

    How can I create a server that will stream video via RTSP?

    Thank you for your help!

    //Edit:

    ffserver.conf

    HttpPort 8090
    RtspPort 5554
    HttpBindAddress 0.0.0.0
    MaxClients 1000
    MaxBandwidth 10000
    NoDaemon

    <Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 1000M
    </Feed>

    <Stream test.mpeg4>
    Feed feed1.ffm
    Format rtp
    VideoCodec mpeg4
    VideoFrameRate 25
    VideoBufferSize 80000
    VideoBitRate 100
    VideoQMin 1
    VideoQMax 5
    VideoSize 1920x1080
    PreRoll 0
    NoAudio
    </Stream>
    
    

    and I am trying to launch it via

    ffserver -d & ffmpeg -re -i "simpsons.mp4" http://localhost:8090/feed1.ffm
    

    and then ffplay "rtsp://localhost:5554/test.mpeg4". The output is:

    [tcp @ 0x7fb0880079c0] Connection to tcp://localhost:5554?timeout=0 failed: Connection refused
    rtsp://localhost:5554/test.mpeg4: Connection refused
    
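    One detail worth checking before anything else: started without `-f`, ffserver reads /etc/ffserver.conf and silently ignores a config file in the current directory, which would leave RTSP port 5554 closed and produce exactly this "Connection refused". A sketch of the launch sequence with the config path passed explicitly (using the file names from the question):

```shell
# Point ffserver at the local config; without -f it looks only at
# /etc/ffserver.conf:
ffserver -d -f ./ffserver.conf &

# Publish the input into the feed declared in the config:
ffmpeg -re -i simpsons.mp4 http://localhost:8090/feed1.ffm &

# Play the stream name declared in the <Stream> section:
ffplay rtsp://localhost:5554/test.mpeg4
```

    Note that ffserver was removed from FFmpeg 4.0, so this setup only works with older releases.
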
  • How to send only video to a bitstream filter through the tee option

    24 August 2018, by Benji Joa

    I want to encode video in mpeg4 format and get two outputs from it: one in h264 format (without sound) and the other in MPEG-TS format. I can't find the option to disable the sound in the first output. Here is my command line:

    ./ffmpeg -i input -map 0 -c:a aac -c:v libx264  -y -f tee "[f=h264:bsfs/v=bitstreamfilter1]out.264|[f=mpegts:bsfs/v=bitstreamfilter2]out.ts"
    

    Thank you
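
    The tee muxer has a per-output `select` option that takes a stream specifier, which is the usual way to keep only video in one slave output. A self-contained sketch with synthetic test input (the placeholder bitstream filters are left out; they can be re-added per output with `bsfs/v=...`):

```shell
# Synthetic A/V input so the command runs standalone:
# testsrc supplies video, sine supplies audio.
ffmpeg -y \
  -f lavfi -i testsrc=duration=1:size=128x72:rate=10 \
  -f lavfi -i sine=duration=1 \
  -map 0:v -map 1:a -c:v libx264 -c:a aac \
  -f tee "[f=h264:select=v]out.264|[f=mpegts]out.ts"
```

    Here `select=v` limits the first output to video streams, while the mpegts output receives both video and audio.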

  • org.bytedeco.javacv.CanvasFrame showImage is hanging

    24 August 2018, by user3911119

    An iOS device is uploading h264 files (3-second videos) to a server. Each file plays correctly in VLC.

    Using FFmpegFrameGrabber, I grab each frame and try to display it using CanvasFrame.showImage as below. However, the method call hangs.

    CanvasFrame canvas = new CanvasFrame("ios");
    canvas.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
    canvas.setAlwaysOnTop(true);
    canvas.setResizable(true);
    try(FileInputStream fis = new FileInputStream(file))
    {
        try(FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(fis))
        {
            grabber.setFormat("h264");
            grabber.start();
            while(true)
            {
                Frame frame = grabber.grabImage();
                if(frame != null)
                {
                    canvas.showImage(frame);
                }
            }
        }
    }
    

    Am I doing anything wrong in the above code?
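
    One thing that stands out in the loop above: `grabImage()` returns null once the stream is exhausted, so after the 3-second clip ends the `while(true)` spins forever, which can present as a hang. A sketch with an explicit exit condition (same JavaCV API as the question; the class name and argument handling are hypothetical):

```java
import java.io.FileInputStream;
import javax.swing.JFrame;
import org.bytedeco.javacv.CanvasFrame;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;

public class PlayH264
{
    public static void main(String[] args) throws Exception
    {
        CanvasFrame canvas = new CanvasFrame("ios");
        canvas.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        try(FileInputStream fis = new FileInputStream(args[0]);
            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(fis))
        {
            grabber.setFormat("h264");
            grabber.start();
            Frame frame;
            // grabImage() yields null at end of stream: stop there
            // instead of busy-looping.
            while((frame = grabber.grabImage()) != null)
            {
                canvas.showImage(frame);
            }
            grabber.stop();
        }
        canvas.dispose();
    }
}
```

    If showImage still blocks even while frames are arriving, the next thing to investigate would be which thread drives the Swing canvas, but that is beyond what the posted code shows.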

    EDIT#1: When I try to save the buffered image for the frame, a valid image is saved.

    BufferedImage image = converter.getBufferedImage(frame);
    File outputfile = new File("png_file");
    ImageIO.write(image, "png", outputfile);