Newest 'x264' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/x264

Articles published on the site

  • What is ffmpeg, avcodec, x264? [closed]

    26 February 2016, by onmyway133

    From wiki, I read that

    FFmpeg is a free software project that produces libraries and programs for handling multimedia data. The most notable parts of FFmpeg are libavcodec, an audio/video codec library used by several other projects, libavformat, an audio/video container mux and demux library, and the ffmpeg command line program for transcoding multimedia files.

    So ffmpeg is a wrapper of avcodec? And I often hear that people encode video with x264 using ffmpeg. So ffmpeg is also a wrapper of x264?

    How are they related?
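The layering the question asks about can be seen from the command line: the ffmpeg program sits on top of libavcodec, and libavcodec in turn wraps the external libx264 library when that encoder is selected. A minimal sketch, assuming an ffmpeg build with --enable-libx264 (input.mp4 is a placeholder file name):

```shell
# Transcode with the libx264 encoder: ffmpeg (CLI) -> libavcodec -> x264
ffmpeg -i input.mp4 -c:v libx264 -crf 23 -preset medium output.mp4

# List the H.264 encoders this libavcodec build knows about
ffmpeg -encoders | grep 264
```

So ffmpeg is not strictly "a wrapper of x264": it links libavcodec, and libavcodec contains a wrapper that calls into libx264 when you pick that codec.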

  • Unable to build x264 for Android: configure doesn't work with cross-compile flags

    18 February 2016, by Pavel S.

    There is a problem when I try to build the x264 lib so that it works with the ffmpeg lib. I use Ubuntu 14.04 and have cloned fresh x264 sources. But when I run the ./configure script, I get several issues:

    1. It doesn't accept cross-compile flags (--cross-prefix, --host, --sysroot). Here's how I run the configure script:

      ./configure     --enable-pic \
                  --enable-static \
                  --disable-cli \
                  --host=arm-linux (or ARM, it doesn't work either) \
                  --cross-prefix=$ANDROID_NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-$HOST_ARCH/bin/arm-linux-androideabi- \
                  --sysroot=$ANDROID_NDK/platforms/android-14/arch-arm

    In this case, the default configuration is used.

    When I don't pass the last three flags, flags 1-3 are successfully used in the config.

    2. With any flags passed to the configure script, I see these errors in config.log:

      checking for -lpostproc... no
      Failed commandline was:
      gcc conftest.c -m64  -Wall -I. -I$(SRCPATH) -std=gnu99 -mpreferred-stack-boundary=5  -lpostproc  -m64  -lm -lpthread -o conftest
      ...
      /usr/bin/ld: cannot find -lpostproc
      collect2: error: ld returned 1 exit status
      

    I see the same for:

    ...
    conftest.c:1:32: fatal error: libswscale/swscale.h: No such file or directory
    #include <libswscale/swscale.h>
    

    Here's full config.log: http://pastebin.com/U6aHKc28

    I guess I probably need to install ffmpeg on my Ubuntu to build x264 properly?

    Any advice?
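For what it's worth, a configure invocation along these lines is commonly used for NDK builds. This is a sketch, not a verified recipe: the NDK paths, toolchain version, and API level are assumptions to adjust to your setup. Two hedged observations: --host usually wants a full triplet like arm-linux-androideabi, and the -lpostproc / swscale.h failures in config.log look like probes for optional dependencies of the x264 CLI, which should not matter when building only the library with --disable-cli.

```shell
# Assumed NDK layout; adjust ANDROID_NDK and HOST_ARCH for your machine.
TOOLCHAIN=$ANDROID_NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-$HOST_ARCH
./configure --enable-pic \
            --enable-static \
            --disable-cli \
            --host=arm-linux-androideabi \
            --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
            --sysroot=$ANDROID_NDK/platforms/android-14/arch-arm
```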

  • How to enable hardware-accelerated decoding in libVLC?

    7 February 2016, by Gerardo Hernandez

    I am receiving multiple x264 RTSP streams and decoding them with libVLC, and I would like to use hardware acceleration for the task.

    If I use the VLC player itself on Windows, I can choose "DirectX Video Acceleration (DXVA) 2.0" in Simple Preferences->Input/Codecs->Hardware-accelerated decoding and I can see a significant drop in CPU utilization when compared to disabling that option.

    In the C++ code, I tried adding the option "--avcodec-hw=dxva2" to the arguments of libvlc_new(), but no luck: hardware acceleration does not seem to be used (I would say the decoding is 50% slower than in the player with DXVA2 on).
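One thing worth trying: avcodec-hw is an input/codec option, so it is typically set per media item with libvlc_media_add_option() (note the leading ':' instead of '--') rather than passed to libvlc_new(). A minimal sketch under stated assumptions: the RTSP URL is a placeholder, and whether DXVA2 actually engages still depends on the stream, the GPU, and the libVLC build.

```cpp
#include <vlc/vlc.h>

int main() {
    libvlc_instance_t *vlc = libvlc_new(0, nullptr);

    // Placeholder URL; replace with your real RTSP stream.
    libvlc_media_t *media =
        libvlc_media_new_location(vlc, "rtsp://example.com/stream");

    // Per-media option: ':' prefix instead of the CLI's '--'
    libvlc_media_add_option(media, ":avcodec-hw=dxva2");

    libvlc_media_player_t *player = libvlc_media_player_new_from_media(media);
    libvlc_media_player_play(player);

    // ... run your event loop / frame processing here ...

    libvlc_media_player_release(player);
    libvlc_media_release(media);
    libvlc_release(vlc);
    return 0;
}
```

To confirm whether the hardware path was taken, running with verbose logging (passing "-vvv" to libvlc_new) and looking for DXVA2 messages is a reasonable check.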

  • How to fix "Could not open codec 'libx264': Unspecified error"?

    3 February 2016, by Nakeun Choi

    I am using a BeagleBone Black in my project. When I try the x264 codec in my OpenCV code, it shows an error like this:

    [libx264 @ 0x2b1600] broken ffmpeg default settings detected
    [libx264 @ 0x2b1600] use an encoding preset (e.g. -vpre medium)
    [libx264 @ 0x2b1600] preset usage: -vpre <speed> -vpre <profile>
    [libx264 @ 0x2b1600] speed presets are listed in x264 --help
    [libx264 @ 0x2b1600] profile is optional; x264 defaults to high
    Could not open codec 'libx264': Unspecified error

    How can I fix this?
    I have to encode with the H.264 codec.
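The log itself points at the direction: this (old) ffmpeg build refuses libx264's defaults unless an encoding preset is supplied. When invoking ffmpeg directly, that means passing a preset; the option name varies by ffmpeg version. A hedged sketch (input.avi/output.mp4 are placeholders):

```shell
# Newer ffmpeg builds:
ffmpeg -i input.avi -c:v libx264 -preset medium -crf 23 output.mp4

# Older builds, matching the -vpre hint in the log above:
ffmpeg -i input.avi -vcodec libx264 -vpre medium output.mp4
```

When the encoder is opened through OpenCV's VideoWriter there is no direct way to pass these flags, so the usual route reported for this error is rebuilding OpenCV against a newer ffmpeg/libav that no longer requires the preset workaround.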

  • How to encode a video from several images generated in a C++ program without writing the separate frame images to disk?

    29 January 2016, by ksb496

    I am writing C++ code in which a sequence of N different frames is generated after performing some operations implemented therein. After each frame is completed, I write it to disk as IMG_%d.png, and finally I encode them into a video with ffmpeg using the x264 codec.

    The summarized pseudocode of the main part of the program is the following:

    std::vector<uint8_t> B(width*height*3);
    for (i=0; i<N; i++)
    {
      generateframe(B, i); // void generateframe(std::vector<uint8_t> &, int); returns different images for different i values.
      sprintf(s, "IMG_%d.png", i+1);
      WriteToDisk(B, s); // void WriteToDisk(std::vector<uint8_t>, char[])
    }
    

    The problem with this implementation is that the number of desired frames, N, is usually high (N ~ 100000), as is the resolution of the pictures (1920x1080), resulting in heavy disk load and write cycles of dozens of GB after each execution.

    In order to avoid this, I have been trying to find documentation about passing each image stored in the vector B directly to an encoder such as x264 (without having to write the intermediate image files to disk). Although some interesting topics were found, none of them solved exactly what I want: many concern running the encoder with existing image files on disk, whilst others provide solutions for other programming languages such as Python (here you can find a fully satisfactory solution for that platform).

    The pseudocode of what I would like to obtain is something similar to this:

    std::vector<uint8_t> B(width*height*3);
    video_file = open_video("Generated_Video.mp4", ...[encoder options]...);
    for (i=0; i<N; i++)
    {
      generateframe(B, i); // Returns different images for different i values.
      add_frame(video_file, B); // Encode B as the next frame of the video.
    }
    close_video(video_file);

    According to what I have read on related topics, the x264 C++ API might be able to do this, but, as stated above, I did not find a satisfactory answer to my specific question. I tried learning and using the ffmpeg source code directly, but its steep learning curve and compilation issues forced me to discard this possibility, as I am merely a non-professional programmer (I take it just as a hobby and unfortunately cannot spend that much time learning something so demanding).

    Another possible solution that came to my mind is to find a way to call the ffmpeg binary from the C++ code and somehow transfer the image data of each iteration (stored in B) to the encoder, adding each frame without "closing" the video file, so that more frames can be appended until the N-th one is reached, at which point the video file will be "closed". In other words, call ffmpeg.exe from the C++ program to write the first frame to a video, but make the encoder "wait" for more frames; then call ffmpeg again to add the second frame, and so on until the last frame, where the video is finished. However, I do not know how to proceed or whether it is actually possible.
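One way to realize the "call ffmpeg and keep it waiting for frames" idea without any intermediate files is to start ffmpeg once as a child process and stream raw frames into its stdin through popen(). A sketch under stated assumptions: frames are raw RGB24 bytes, POSIX popen() is available (it is under Cygwin and Linux), and the ffmpeg command line in the comment is illustrative, not verified against a particular build.

```cpp
#include <cstdio>
#include <cstdint>
#include <vector>

// Start an external encoder process whose stdin we can write to.
// With ffmpeg, cmd would be something like:
//   ffmpeg -y -f rawvideo -pix_fmt rgb24 -s 1920x1080 -r 25 -i - \
//          -c:v libx264 -preset slow -crf 20 Generated_Video.mp4
FILE *open_encoder(const char *cmd) {
    return popen(cmd, "w"); // POSIX popen; the process stays alive until pclose
}

// Write one raw frame (e.g. the vector B) to the encoder's stdin.
bool write_frame(FILE *pipe, const std::vector<uint8_t> &frame) {
    return fwrite(frame.data(), 1, frame.size(), pipe) == frame.size();
}

// Closing the pipe flushes it and waits for the encoder to finalize the file.
void close_encoder(FILE *pipe) {
    pclose(pipe);
}
```

The key point is that the single ffmpeg process stays open across all N write_frame calls, which is exactly the "encoder waits for more frames" behavior described above; pclose() at the end is what "closes" the video.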

    Edit 1:

    As suggested in the replies, I have been reading up on named pipes and tried to use them in my code. First of all, it should be noted that I am working with Cygwin, so my named pipes are created as they would be under Linux. The modified pseudocode I used (including the corresponding system libraries) is the following:

    FILE *fd;
    mkfifo("myfifo", 0666);

    for (i=0; i<N; i++)
    {
      fd = fopen("myfifo", "wb");
      generateframe(B, i);
      WriteToPipe(B, fd); // void WriteToPipe(std::vector<uint8_t>, FILE *&fd)
      fflush(fd);
      fclose(fd);
    }
    unlink("myfifo");

    WriteToPipe is a slight modification of the previous WriteToDisk function, in which I made sure that the write buffer used to send the image data is small enough to fit within the pipe buffering limits.

    Then I compile the program and run the following command in the Cygwin terminal:

    ./myprogram | ffmpeg -i pipe:myfifo -c:v libx264 -preset slow -crf 20 Video.mp4
    

    However, it remains stuck in the loop at i=0, on the "fopen" line (that is, the first fopen call). If ffmpeg had not been started, this would be natural, since the server (my program) would be waiting for a client to connect to the "other side" of the pipe, but that is not the case. It looks like they cannot be connected through the pipe somehow, and I have not been able to find further documentation to overcome this issue. Any suggestion?
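A hedged guess at the hang: in the command above, the shell pipe `|` feeds the program's stdout to ffmpeg, while `-i pipe:myfifo` is not a valid way to read the named fifo, so nothing ever opens "myfifo" for reading and the blocking fopen in the writer never returns. A sketch of an invocation that does open the fifo as input (pixel format, frame size, and rate are assumptions that must match what the program writes):

```shell
# Assumption: the program writes raw RGB24 1920x1080 frames into ./myfifo.
# Start the frame generator in the background, then let ffmpeg read the fifo.
./myprogram &
ffmpeg -f rawvideo -pix_fmt rgb24 -s 1920x1080 -r 25 -i myfifo \
       -c:v libx264 -preset slow -crf 20 Video.mp4
```

Alternatively, writing the frames to stdout in the program and using `./myprogram | ffmpeg -f rawvideo ... -i - ...` avoids the named pipe entirely.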