Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • How to run FFMPEG with --enable-libfontconfig on Amazon Linux 2

    22 April, by Adrien Kaczmarek

    Problem

    I want to run FFmpeg on AWS Lambda (Amazon Linux 2) with --enable-libfontconfig enabled.

    Situation

    I already have FFmpeg running on AWS Lambda, just without --enable-libfontconfig.

    Here are the steps I took to get FFmpeg running on AWS Lambda (see the official guide); a condensed sketch of the commands follows the list:

    • Connect to an Amazon EC2 instance running AL2 (the environment used by Lambda for Python 3.11)
    • Download and package FFmpeg from John Van Sickle
    • Create a Lambda Layer with FFmpeg
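
    Condensed, the download-and-package step looked roughly like this (the URL is the static-build link from John Van Sickle's site; the layer name and runtime flag are just examples):

    # on the EC2 instance (AL2): fetch the static build and package it as a layer
    curl -LO https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz
    tar -xf ffmpeg-release-amd64-static.tar.xz
    mkdir -p ffmpeg-layer/bin
    cp ffmpeg-*-amd64-static/ffmpeg ffmpeg-layer/bin/
    (cd ffmpeg-layer && zip -r ../ffmpeg-layer.zip .)

    # publish the zip as a Lambda layer
    aws lambda publish-layer-version --layer-name ffmpeg \
        --zip-file fileb://ffmpeg-layer.zip --compatible-runtimes python3.11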

    Unfortunately, the build from John Van Sickle doesn't have --enable-libfontconfig enabled.

    Unsuccessful Trials

    I tried to rebuild it from scratch following the installation guide, but without success (the guide doesn't install the font-related dependencies).
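
    Roughly, the rebuild attempt looked like this (a sketch from memory; the package names are what I believe the font-related -devel packages are called on AL2, and I have left out the other codec flags):

    # build tools plus the fontconfig/freetype headers
    sudo yum groupinstall -y "Development Tools"
    sudo yum install -y fontconfig-devel freetype-devel

    # inside the ffmpeg source tree
    ./configure --enable-gpl --enable-libfreetype --enable-libfontconfig
    make -j"$(nproc)"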

    I tried to install it with Homebrew, but brew install ffmpeg didn't succeed on AL2.

    I tried to install ffmpeg from ffmpeg-master-latest-linux64-gpl.tar.xz. Unfortunately, this build of ffmpeg doesn't run on AL2:

    ffmpeg: /lib64/libm.so.6: version `GLIBC_2.27' not found (required by ffmpeg)
    ffmpeg: /lib64/libpthread.so.0: version `GLIBC_2.28' not found (required by ffmpeg)
    ffmpeg: /lib64/libc.so.6: version `GLIBC_2.27' not found (required by ffmpeg)
    ffmpeg: /lib64/libc.so.6: version `GLIBC_2.28' not found (required by ffmpeg)
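
    (AL2 ships an older glibc, 2.26 if I am not mistaken, which can be confirmed on the instance with:)

    ldd --version | head -n1
    rpm -q glibc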
    

    Any help would be greatly appreciated,

    Please make sure your answer is up to date and tested. Too many answers out there are auto-generated, too generic, or simply redirect without context.

    Thank you

  • Using ffmpeg to convert all files to a subdir with same name [duplicate]

    22 April, by Marc Mouton

    I cannot manage to make a working script for this under Linux. Basically I want a script that executes an ffmpeg command and outputs the files to a subdirectory with the same name as the source. Long story short, it is to convert the audio of several mkv files, but I want to keep the same file names, and since ffmpeg doesn't overwrite files, I need to output them to a subdirectory.

    There is this answer: https://superuser.com/questions/912730/ffmpeg-batch-convert-make-same-filename?newreg=4676efc538b54a178fcbcc17e1fd2127

    Its Linux solution is:

    mkdir outdir
    for i in *.wav; do
      ffmpeg -i $i -acodec pcm_s16le -ac 1 -ar 22050 outdir/$i;
    done
    

    In my case it should be:

    mkdir outdir
    for i in *.mkv; do
      ffmpeg -i $i -c:v copy -c:a flac -compression_level 12 -map 0 -c:s copy outdir/$i;
    done
    

    But it doesn't work:

    [in#0 @ 0x5fd710398840] Error opening input: No such file or directory
    Error opening input file AAA.
    Error opening input files: No such file or directory
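
    For what it's worth, this is the variant I am trying now, on the guess that the unquoted $i is what breaks on file names containing spaces (not sure that is the right fix):

    mkdir -p outdir
    for i in *.mkv; do
      # quote the name so spaces don't split it into several arguments
      ffmpeg -i "$i" -map 0 -c:v copy -c:s copy -c:a flac -compression_level 12 "outdir/$i"
    done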

  • ffmpeg output format mp3 not available [closed]

    22 April, by fazzal

    I am working on a Node.js application that uses ffmpeg to convert videos to mp3 files. I have installed the ffmpeg full build (libmp3lame included), and I have configured the path to ffmpeg correctly in my application. However, when I try to convert a video to mp3, I get the error "Output format mp3 is not available".


    // assuming the fluent-ffmpeg wrapper, which matches this API (require omitted in the original snippet)
    const ffmpeg = require('fluent-ffmpeg');

    ffmpeg(inputPath)
      .outputOptions('-vn', '-ab', '128k', '-ar', '44100')
      .toFormat('mp3')
      .save(outputPath)
      .on('error', (err) => console.error(`Error converting file: ${err}`))
      .on('end', () => console.log(`Converted ${file}`));

    I have verified that ffmpeg is correctly installed and that mp3 is included in the output of ffmpeg -formats.
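
    To double-check, this is how I have been inspecting the binary, using the same path my Node app is configured with (the path below is just a placeholder):

    /path/to/ffmpeg -hide_banner -muxers  | grep -w mp3
    /path/to/ffmpeg -hide_banner -encoders | grep -i mp3lame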

  • Websocket connection in Docker

    22 April, by Akhil Varghese

    I have a Node.js app that takes an RTSP URL as input and provides a jsmpeg websocket stream as output. The focus of the app is to generate a web-compatible stream from an RTSP stream, and I have achieved this using the rtsp-relay package. The app works fine locally, but after dockerizing it, I am not receiving the output stream; the stream gets terminated.

    The log was attached as a screenshot (not reproduced here).

    The Dockerfile is below; I am building ffmpeg from source in Docker.

    
    # (base image not shown here; a Debian/Ubuntu-family image, e.g. FROM ubuntu:22.04,
    # is assumed, since the packages below are installed with apt-get)

    # Install necessary packages
    RUN apt-get update && \
        DEBIAN_FRONTEND=noninteractive apt-get install -yq \
        autoconf \
        automake \
        build-essential \
        cmake \
        git \
        libass-dev \
        libfreetype6-dev \
        libgnutls28-dev \
        libmp3lame-dev \
        libsdl2-dev \
        libtool \
        libva-dev \
        libvdpau-dev \
        libvorbis-dev \
        libxcb1-dev \
        libxcb-shm0-dev \
        libxcb-xfixes0-dev \
        meson \
        ninja-build \
        pkg-config \
        texinfo \
        wget \
        yasm \
        zlib1g-dev \
        libunistring-dev \
        libaom-dev \
        libdav1d-dev \
        libnuma-dev \
        libopus-dev && \
        rm -rf /var/lib/apt/lists/*
    
    # Create directories
    RUN mkdir -p /ffmpeg_sources /ffmpeg_build /bin
    RUN apt-get update
    RUN  apt-get install -y nasm
    RUN  apt-get install -y yasm
    
    # Build and install NASM
    WORKDIR /ffmpeg_sources
    RUN wget https://www.nasm.us/pub/nasm/releasebuilds/2.16.01/nasm-2.16.01.tar.bz2 && \
        tar xjvf nasm-2.16.01.tar.bz2 && \
        cd nasm-2.16.01 && \
        ./autogen.sh && \
        ./configure --prefix="/ffmpeg_build" --bindir="/bin" && \
        make && \
        make install
    
    # Install x264 (prebuilt development package)
    RUN apt-get update
    RUN apt-get install -y libx264-dev
    
    
    # Install x265 (prebuilt development packages)
    RUN apt-get install -y libx265-dev libnuma-dev
    
    # Build and install libvpx
    RUN git clone --depth 1 https://chromium.googlesource.com/webm/libvpx.git && \
        cd libvpx && \
        ./configure --prefix="/ffmpeg_build" --disable-examples --disable-unit-tests --enable-vp9-highbitdepth --as=yasm && \
        make && \
        make install
    
    # Build and install fdk-aac
    RUN git clone --depth 1 https://github.com/mstorsjo/fdk-aac && \
        cd fdk-aac && \
        autoreconf -fiv && \
        ./configure --prefix="/ffmpeg_build" --disable-shared && \
        make && \
        make install
    
    # Build and install opus
    RUN git clone --depth 1 https://github.com/xiph/opus.git && \
        cd opus && \
        ./autogen.sh && \
        ./configure --prefix="/ffmpeg_build" --disable-shared && \
        make && \
        make install
    
    # Build and install SVT-AV1
    RUN git clone --depth=1 https://gitlab.com/AOMediaCodec/SVT-AV1.git && \
        cd SVT-AV1 && \
        cd Build && \
        cmake .. -G "Unix Makefiles" -DCMAKE_BUILD_TYPE=Release && \
        make && \
        make install
    
    
    # Build and install dav1d
    RUN git clone --depth 1 https://code.videolan.org/videolan/dav1d.git && \
        mkdir -p dav1d/build && \
        cd dav1d/build && \
        meson setup -Denable_tools=false -Denable_tests=false --default-library=static .. --prefix "/ffmpeg_build" --libdir="/ffmpeg_build/lib" && \
        ninja && \
        ninja install
    
    # Build and install ffmpeg
    RUN wget -O ffmpeg-snapshot.tar.bz2 https://ffmpeg.org/releases/ffmpeg-snapshot.tar.bz2 && \
        tar xjvf ffmpeg-snapshot.tar.bz2 && \
        cd ffmpeg && \
        ./configure \
          --prefix="/ffmpeg_build" \
          --pkg-config-flags="--static" \
          --extra-cflags="-I/ffmpeg_build/include" \
          --extra-ldflags="-L/ffmpeg_build/lib" \
          --extra-libs="-lpthread -lm" \
          --ld="g++" \
          --bindir="/bin" \
          --enable-gpl \
          --enable-gnutls \
          --enable-libaom \
          --enable-libass \
          --enable-libfdk-aac \
          --enable-libfreetype \
          --enable-libmp3lame \
          --enable-libopus \
          --enable-libsvtav1 \
          --enable-libdav1d \
          --enable-libvorbis \
          --enable-libvpx \
          --enable-libx264 \
          --enable-libx265 \
          --enable-nonfree \
          --extra-ldflags="-lm -lstdc++ -L/ffmpeg_build/lib -Wl,-rpath,/ffmpeg_build/lib" \
          --extra-cflags="-I/ffmpeg_build/include" \
          --extra-libs="-lpthread -lm -lstdc++" && \
        make && \
        make install && \
        hash -r
    
    # Update PATH (note: with --bindir="/bin" above, the ffmpeg binaries are actually
    # installed to /bin, which is already on PATH)
    ENV PATH="/ffmpeg_build/bin:${PATH}"
    
    # Cleanup
    RUN rm -rf /ffmpeg_sources
    
    # Source profile
    RUN echo "source /root/.profile" >> ~/.bashrc
    
    
    
    WORKDIR /usr/src/app
    RUN apt-get update
    RUN apt-get install -y nodejs
    RUN apt-get install -y npm
    COPY package*.json ./
    RUN npm install
    COPY . .
    EXPOSE 2000
    CMD ["npm", "start"]
    

    While checking the network rules in the Docker container with the command iptables -L, I get the response shown in the attached screenshot (not reproduced here).

    I am able to access every API of the dockerized app, and I receive the responses and logs, but I'm not receiving the jsmpeg stream through the websocket connection.
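
    For debugging, these are the checks I can run; the container name, image name and camera URL below are placeholders:

    # can the ffmpeg inside the container reach and decode the camera at all?
    docker exec -it <container> /bin/ffmpeg -rtsp_transport tcp -i "rtsp://<camera-url>" -t 5 -f null -

    # is the websocket port actually published when the container starts?
    docker run -p 2000:2000 <image-name>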

    I'm uncertain about the exact problem or how to resolve it. If you have encountered a similar issue before and have a solution, I would greatly appreciate your assistance.

  • FFMPEG : how to overlay a moving PNG with multiply blend effect (overlay filter + blend filter) [closed]

    22 April, by user24565885

    I tried hunting for all existing questions and couldn't find a match :-(

    I have a video and a PNG image that I overlay on top of it. The PNG needs to move across the video (this part I have working), but I want the PNG to use a "multiply" blend effect instead of a standard alpha overlay. Whenever I try to apply both filters, I get an error.

    This code works and moves the image (the last one) across as expected:

    ffmpeg -i d6b5ec90-8823-41cf-9b81-3086ce83054a.mp4   -loop 1 -i 593e677d-02bb-49c2-a0c6-c3dd8f8c2f72.png  -loop 1 -i 32fea58f-ebe1-447e-b079-06f8ebb9df52.png  -loop 1 -i cb7435fb-5aff-4eeb-af2a-22b92c900db6.png -filter_complex "[0]fps=30,scale=3840:3840[o1];[1]fps=30,setpts=PTS+0.000/TB[t_1];[o1][t_1]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2:enable='between(t,0.000,5.000)':shortest=1[o2];[2]fps=30,setpts=PTS+1.000/TB[t_2];[o2][t_2]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2:enable='between(t,1.000,5.000)':shortest=1[o3];[3]fps=30[t_3];[o3][t_3]overlay='main_w*(1.5+(0-1.5)*max(0,1-(n-1.000*30)/(4.5*30)))':0:enable='between(t,1.000,4.500)':shortest=1" -r 30 -pix_fmt yuv420p -c:a aac -b:a 128k -c:v libx264 -b:v 8000k out.mp4
    

    I now want that last effect to use multiply for blending, and that's what I can't figure out. Applying both filters chained with a comma seems reasonable but fails (note: the MP4 and all PNGs are 3840x3840, so it's not a size issue):

    [o3][t_3]overlay='main_w*(1.5+(0-1.5)*max(0,1-(n-1.000*30)/(4.5*30)))':0:enable='between(t,1.000,4.500)':shortest=1,blend=all_mode='multiply':all_opacity=1
    

    The error I get is "Cannot find a matching stream for unlabeled input pad 1 on filter Parsed_blend_10".

    I've looked at this one: "In FFmpeg, using -filter_complex, how can I overlay and blend at the same time?", but can't quite figure out how I would split, blend, then overlay in my situation, where the last input PNG ([3]) needs to blend and move...
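
    Based on that answer, the direction I am considering is to pre-compose the moving PNG onto a white canvas the size of the video (white being neutral for multiply), and then blend that full-frame stream with the main video. Something like the untested sketch below (file names simplified, enable/timing expressions trimmed):

    # 1) put the moving PNG on a white canvas, 2) multiply-blend that canvas with the video
    ffmpeg -i main.mp4 -loop 1 -i moving.png \
      -filter_complex "color=c=white:s=3840x3840:r=30[bg];[bg][1]overlay='main_w*(1.5+(0-1.5)*max(0,1-(n-1.000*30)/(4.5*30)))':0[mul];[0]fps=30,format=gbrp[base];[mul]fps=30,format=gbrp[mulp];[base][mulp]blend=all_mode=multiply:all_opacity=1:shortest=1,format=yuv420p[v]" \
      -map "[v]" -map 0:a? -c:v libx264 -b:v 8000k -c:a aac -b:a 128k out.mp4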

    Would really appreciate the help if anyone has a solution.

    THANKS!