Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Is it possible to stream an mp3 playlist file with ffmpeg?

    31 January 2012, by faridv

    I have playlists for mp3 files stored in a MySQL database, so I can create any kind of playlist file (m3u, xml, etc.). I need a program to stream these mp3 files on the Internet. The streaming type is not that important; it can be RTP, UDP or whatever, as long as it plays in JWPlayer. How can I do this with FFmpeg, VLC or any other free software?

  • How to mirror swscale PIX_FMT_YUYV422

    31 January 2012, by superg

    I'm trying to mirror a libswscale PIX_FMT_YUYV422 image horizontally. Using a simple loop over each line, with 16 bits per pixel, results in wrong colors; for example, blue objects come out orange. Here is my code:

    typedef unsigned short YUVPixel; // 16 bits per pixel
    for (int y = 0; y < outHeight; y++)
    {
        YUVPixel *p1 = (YUVPixel*)pBits + y * outWidth;
        YUVPixel *p2 = p1 + outWidth - 1;
        for (int x = 0; x < outWidth/2; x++) // outWidth is image width in pixels
        {
            // packed YUV 4:2:2, 16bpp, Y0 Cb Y1 Cr
            unsigned short tmp;
            tmp = *p1;
            *p1 = *p2;
            *p2 = tmp;
            p1++; // walk inwards from both ends of the line
            p2--;
        }
    }
    

    Then I tried redefining YUVPixel as a 32-bit type and modifying my loop accordingly. This gives correct colors, but it looks like neighboring pixels are swapped. Any ideas? I'm totally lost with this.
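
    For reference, here is a minimal sketch (not from the original post; the function name and parameters are illustrative) of one way to mirror a YUYV422 row while keeping each Y sample with its Cb/Cr pair: reverse the row as 32-bit macropixels (Y0 Cb Y1 Cr), then swap Y0 and Y1 inside each macropixel.

    #include <stdint.h>

    /* Mirror one YUYV422 row in place; widthPixels is assumed to be even.
     * Each 32-bit macropixel packs two pixels as Y0 Cb Y1 Cr, so after
     * reversing the macropixels the two Y samples inside each one are
     * swapped as well, keeping every Y with its matching chroma pair. */
    static void mirror_yuyv_line(uint8_t *line, int widthPixels)
    {
        int macropixels = widthPixels / 2;
        uint32_t *mp = (uint32_t *)line;

        /* Reverse the order of the 4-byte macropixels. */
        for (int i = 0; i < macropixels / 2; i++) {
            uint32_t tmp = mp[i];
            mp[i] = mp[macropixels - 1 - i];
            mp[macropixels - 1 - i] = tmp;
        }

        /* Swap Y0 and Y1 (bytes 0 and 2) inside every macropixel. */
        for (int i = 0; i < macropixels; i++) {
            uint8_t *b = line + 4 * i;
            uint8_t tmpY = b[0];
            b[0] = b[2];
            b[2] = tmpY;
        }
    }

    Reversing the macropixels without the second pass leaves the two Y samples of each pair in their original order, which is consistent with the "neighboring pixels swapped" effect described for the 32-bit variant above.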

  • apache /etc/ld.so.conf path error

    31 January 2012, by user346443

    Hi, I have just installed ffmpeg and am getting the following error:

    ./ffmpeg: error while loading shared libraries: libavfilter.so.1: cannot open shared object file: No such file or directory
    

    So I added the paths /usr/local/lib and /usr/local/bin to my /etc/ld.so.conf so that it looks like this:

    include ld.so.conf.d/*.conf
    /usr/local/lib
    /usr/local/bin
    

    Then I ran ldconfig -v and ffmpeg worked successfully.

    My problem is that when I add the paths to the /etc/ld.so.conf file, all my Perl SQL modules stop working.

    How can I add the paths to the ld.so.conf file without disrupting the Perl SQL modules?

    Thanks in advance

    UPDATE: SOLUTION

    I removed the paths from /etc/ld.so.conf, leaving just include ld.so.conf.d/*.conf, then created a file /etc/ld.so.conf.d/ffmpeg.conf and added the paths to it. I ran ldconfig -v and was good to go.
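
    For example, the new /etc/ld.so.conf.d/ffmpeg.conf would contain just the two paths from the question:

    # /etc/ld.so.conf.d/ffmpeg.conf
    /usr/local/lib
    /usr/local/bin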

  • Perl FFMpeg print output

    31 January 2012, by user346443

    Hi, I'm using the following Perl command to convert files with ffmpeg:

    system ("/usr/local/bin/ffmpeg -i $inputFile $outputFile");
    

    I would like to know if it's possible to print the ffmpeg output.

    Cheers

    UPDATE

    The solution was to use backticks:

    my $output = qx{/usr/local/bin/ffmpeg -i $inputFile $outputFile 2>&1};
    print $output;
    

    This prints out the following:

    FFmpeg version SVN-r26402, Copyright (c) 2000-2011 the FFmpeg developers
      built on Jan 31 2012 12:30:35 with gcc 4.4.5 20110214 (Red Hat 4.4.5-6)
      configuration: --enable-libmp3lame --disable-mmx --enable-shared
      libavutil     50.36. 0 / 50.36. 0
      libavcore      0.16. 1 /  0.16. 1
      libavcodec    52.108. 0 / 52.108. 0
      libavformat   52.93. 0 / 52.93. 0
      libavdevice   52. 2. 3 / 52. 2. 3
      libavfilter    1.74. 0 /  1.74. 0
      libswscale     0.12. 0 /  0.12. 0
    [wav @ 0x8af94c0] max_analyze_duration reached
    Input #0, wav, from 'a.wav':
      Duration: 00:00:05.84, bitrate: 1537 kb/s
        Stream #0.0: Audio: pcm_s16le, 48000 Hz, 2 channels, s16, 1536 kb/s
    Output #0, mp3, to 'a.mp3':
      Metadata:
        TSSE : Lavf52.93.0
        Stream #0.0: Audio: libmp3lame, 48000 Hz, 2 channels, s16, 64 kb/s
    Stream mapping:
      Stream #0.0 -> #0.0
    Press [q] to stop encoding
    size=      42kB time=5.42 bitrate=  64.0kbits/s
    size=      46kB time=5.88 bitrate=  64.0kbits/s
    video:0kB audio:46kB global headers:0kB muxing overhead 0.070153%
    
  • Changing YUV values of AVFrames

    31 January 2012, by Kage

    I'm trying to apply an effect to a video by altering the YUV values using FFmpeg programmatically in C.

    Let's say I want to increase the luminance of each pixel of each frame by 10.

    At the moment I can open a video, get the video stream, find a frame, decode it, alter the YUV values in AVFrame->data, encode the frame, and save it to be played back.

    When I play back the video, the first frame looks as expected, but the next one is brighter, the one after that brighter still; each frame seems to have more and more added to its Y values instead of just 10.

    I tried changing the code so that instead of altering the pixels of every frame, it only alters the first AVFrame's data. However, when I play the video back, it is not only the first frame that has been altered but roughly the first 30 frames.

    Why are the other frames affected when I only change the AVFrame->data of the first frame?
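
    For reference, the per-frame adjustment itself can be written as a small helper along the lines of the sketch below. This is only an illustration (the function name and parameters are not from the original post) and it assumes the decoded frames use a planar pixel format such as YUV420P, where AVFrame->data[0] is the luma (Y) plane and AVFrame->linesize[0] is its stride in bytes.

    #include <stdint.h>
    #include <libavcodec/avcodec.h>

    /* Add 'delta' to every Y sample of a decoded frame, clamping to the
     * 8-bit range. Assumes a planar pixel format such as YUV420P, where
     * frame->data[0] is the luma plane and frame->linesize[0] its stride. */
    static void brighten_luma(AVFrame *frame, int width, int height, int delta)
    {
        for (int y = 0; y < height; y++) {
            uint8_t *row = frame->data[0] + y * frame->linesize[0];
            for (int x = 0; x < width; x++) {
                int v = row[x] + delta;
                row[x] = (uint8_t)(v > 255 ? 255 : (v < 0 ? 0 : v));
            }
        }
    }

    The clamp matters because each Y sample is an unsigned 8-bit value; without it, results above 255 wrap around to small values instead of saturating at white.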