Advanced search

Media (91)

Other articles (68)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things):
    - implementation costs to be shared between several different projects / individuals
    - rapid deployment of multiple unique sites
    - creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

On other sites (11110)

  • Resizing AVIF images with transparency with FFmpeg [closed]

    4 October 2024, by Calebmer

    I'm trying to resize an image with transparency using FFmpeg; however, the output appears to be only a resized version of the alpha layer.

    When I try to do a no-op transform of the AVIF image with an alpha layer:

    ffmpeg -i input.avif output.avif


    output.avif appears to be the alpha layer with black representing alpha 0 and white representing alpha 1.

    ffprobe input.avif gives me:

    ffprobe version 7.0.2 Copyright (c) 2007-2024 the FFmpeg developers
  built with Apple clang version 15.0.0 (clang-1500.3.9.4)
  configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0.2 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon
  libavutil      59.  8.100 / 59.  8.100
  libavcodec     61.  3.100 / 61.  3.100
  libavformat    61.  1.100 / 61.  1.100
  libavdevice    61.  1.100 / 61.  1.100
  libavfilter    10.  1.100 / 10.  1.100
  libswscale      8.  1.100 /  8.  1.100
  libswresample   5.  1.100 /  5.  1.100
  libpostproc    58.  1.100 / 58.  1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'input.avif':
  Metadata:
    major_brand     : avif
    minor_version   : 0
    compatible_brands: avifmif1miaf
  Duration: N/A, start: 0.000000, bitrate: N/A
  Stream #0:0[0x1]: Video: av1 (libdav1d) (Main) (av01 / 0x31307661), gray(pc), 336x252 [SAR 1:1 DAR 4:3], 1 fps, 1 tbr, 1 tbn (default)
  Stream #0:1[0x2]: Video: av1 (libdav1d) (High) (av01 / 0x31307661), yuv444p(pc, smpte170m/bt709/iec61966-2-1), 336x252 [SAR 1:1 DAR 4:3], 1 fps, 1 tbr, 1 tbn


    Seeing that there are two streams (the first stream, gray(pc), probably being the alpha layer), I next tried the following to view the second stream:

    ffmpeg -i input.avif -map 0:v:1 output.avif


    This gave me the image without any alpha channel; transparent pixels were black.

    Ultimately I want to resize the AVIF file with ffmpeg -i input.avif -vf "scale=iw/2:-1" output.avif, but that appears to resize only the greyscale alpha channel. Furthermore, this will be part of a script that operates on some AVIF files with an alpha channel and some without, and I don't know ahead of time which files have one. ffmpeg -i input.avif -vf "scale=iw/2:-1" output.avif works for files without an alpha channel.
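
    Since the ffprobe output above shows the alpha arriving as a separate gray video stream, one way to handle the "unknown ahead of time" part is to count the video streams first and only special-case files that report two. This doesn't solve the alpha-preserving resize itself, but it lets the script tell the two cases apart; a sketch:

    # count video streams; 2 presumably means colour + alpha, 1 means no alpha
    streams=$(ffprobe -v error -select_streams v -show_entries stream=index -of csv=p=0 input.avif | wc -l)
    if [ "$streams" -ge 2 ]; then
      echo "input.avif appears to carry a separate alpha stream"   # needs the alpha-aware path
    else
      ffmpeg -i input.avif -vf "scale=iw/2:-1" output.avif         # the simple case already works
    fi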

  • ffmpeg minimal linux build with only one filter [closed]

    31 July 2024, by at8993

    I am aiming to compile a minimal build of ffmpeg for Ubuntu 22.04. The application requires only the concat filter for .mp4 files using the H.264 video encoder.

    I have followed this guide and used the following commands to include libx264 only:

    sudo apt-get update -qq && sudo apt-get -y install \
  autoconf \
  automake \
  build-essential \
  cmake \
  git-core \
  libass-dev \
  libfreetype6-dev \
  libgnutls28-dev \
  libmp3lame-dev \
  libsdl2-dev \
  libtool \
  libva-dev \
  libvdpau-dev \
  libvorbis-dev \
  libxcb1-dev \
  libxcb-shm0-dev \
  libxcb-xfixes0-dev \
  meson \
  ninja-build \
  pkg-config \
  texinfo \
  wget \
  yasm \
  zlib1g-dev

mkdir -p ~/ffmpeg_sources ~/bin

sudo apt-get install libx264-dev


    For the configure command, I have used the following options with a view to enabling only what is necessary for concat.

    cd ~/ffmpeg_sources && \
wget -O ffmpeg-snapshot.tar.bz2 https://ffmpeg.org/releases/ffmpeg-snapshot.tar.bz2 && \
tar xjvf ffmpeg-snapshot.tar.bz2 && \
cd ffmpeg && \

PATH="$HOME/bin:$PATH" PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig" ./configure \
  --prefix="$HOME/ffmpeg_build" \
  --pkg-config-flags="--static" \
  --extra-cflags="-I$HOME/ffmpeg_build/include" \
  --extra-ldflags="-L$HOME/ffmpeg_build/lib" \
  --extra-libs="-lpthread -lm" \
  --ld="g++" \
  --bindir="$HOME/bin" \
  --disable-everything \
  --enable-filter=concat \
  --enable-avformat \
  --enable-avdevice \
  --enable-avcodec \
  --enable-decoder=h264 \
  --enable-libx264 \
  --enable-muxer=mp4 \
  --enable-gpl

PATH="$HOME/bin:$PATH" make && \
make install && \
hash -r


    However running

    ffmpeg -f concat -i files_to_merge.txt -c copy output.mp4


    returns this error:

    ffmpeg -f concat -i files_to_merge.txt -c copy output.mp4
ffmpeg version N-116445-gb1d410716b Copyright (c) 2000-2024 the FFmpeg developers
  built with gcc 11 (Ubuntu 11.4.0-1ubuntu1~22.04)
  configuration: --prefix=/home/user/ffmpeg_build 
--extra-cflags=-I/home/user/ffmpeg_build/include --extra-ldflags=-L/home/user/ffmpeg_build/lib 
--extra-libs='-lpthread -lm' 
--ld=g++ --bindir=/home/user/bin --disable-everything 
--enable-filter=concat --enable-avformat 
--enable-avdevice --enable-avcodec --enable-decoder=h264
 --enable-libx264 --enable-muxer=mp4 
--enable-gpl
  libavutil      59. 30.100 / 59. 30.100
  libavcodec     61. 10.100 / 61. 10.100
  libavformat    61.  5.101 / 61.  5.101
  libavdevice    61.  2.100 / 61.  2.100
  libavfilter    10.  2.102 / 10.  2.102
  libswscale      8.  2.100 /  8.  2.100
  libswresample   5.  2.100 /  5.  2.100
  libpostproc    58.  2.100 / 58.  2.100
[in#0 @ 0x555605fab640] Unknown input format: 'concat'


    I assume this is due to the omission of an enable option in the configure command.
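
    If so, the missing piece is presumably the concat demuxer: -f concat selects an input format (a demuxer), which is a separate component from the concat filter, and --disable-everything switches it off along with the demuxer and protocol needed to read the listed .mp4 files. A sketch of the extra configure options that would likely cover a stream-copy concat (untested, added on top of the options above):

  --enable-demuxer=concat \
  --enable-demuxer=mov \
  --enable-protocol=file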

    Thanks!

  • How to write a video stream to a server?

    14 August, by The Mask

    Recently I've been playing with FFmpeg and its powerful abilities. I'm working on a cool project where I'm trying to create a live video stream using FFmpeg. The client (ReactJS) and server (NodeJS) are connected via WebSocket. The client sends the byte packets to the server, and the server then spawns an FFmpeg process and serves the stream to an nginx server.

    Client (live-stream.js):

    const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });
  videoRef.current.srcObject = stream;

  const ingestUrl = `ws://localhost:8081/ws`
  const socket = new WebSocket(ingestUrl);
  socket.binaryType = "arraybuffer";
  socket.onopen = () => {
    console.log("✅ WebSocket connection established");
    socket.send(JSON.stringify({ type: "start", stream_key: streamKey }));
    mediaRecorderRef.current.start(500);
  };
  socketRef.current = socket;

  socket.onerror = (error) => {
    console.error("❌ WebSocket error:", error);
  };

  mediaRecorderRef.current = new MediaRecorder(stream, {
    mimeType: "video/webm;codecs=vp8,opus",
    videoBitsPerSecond: 1000000,
    audioBitsPerSecond: 128000
  });
  mediaRecorderRef.current.ondataavailable = (event) => {
    if (event.data.size > 0 && socket.readyState === WebSocket.OPEN) {
      event.data.arrayBuffer().then((buffer) => socket.send(buffer));
    }
  };


    Server (index.js):

const http = require('http');
const WebSocket = require('ws');
const { spawn } = require('child_process');
const fs = require('fs');

// Plain HTTP server that the WebSocket server attaches to
const httpServer = http.createServer();
const server = new WebSocket.Server({ server: httpServer, path: '/ws' });

const startFFmpeg = (stream_key) => {
  return spawn("ffmpeg", [
    "-re",
    "-f", "matroska",
    "-i", "pipe:0",
    "-map", "0:v:0",
    "-map", "0:a:0",
    "-c:v", "libx264",
    "-c:a", "aac ",
    "-b:v", "6000k",
    "-maxrate", "6000k ",
    "-bufsize", "6000k ",
    "-pix_fmt", "yuv420p ",
    "-f", "flv",
    `rtmp://localhost/live/${stream_key}`,
  ]);
}
server.on('connection', (ws) => {
  console.log('📡 New WebSocket connection');

  let ffmpeg = null;
  let buffer = Buffer.alloc(0);
  let streamStarted = false;

  ws.on('message', (msg, isBinary) => {
    if (!isBinary) {
      const parsed = JSON.parse(msg);
      if (parsed.type === "start") {
        const { stream_key } = parsed;
        console.log(`🔑 Stream key: ${stream_key}`);
        console.log(`🎥 Starting ingest for stream key: ${stream_key}`);

        ffmpeg = startFFmpeg(stream_key)
        ffmpeg.stdin.on("error", (e) => {
          console.error("FFmpeg stdin error:", e.message);
        });

        ffmpeg.stderr.on("data", (data) => {
          console.log(`FFmpeg Data: ${data}`);
        });

        ffmpeg.on("close", (code) => {
          console.log(`FFmpeg exited with code ${code}`);
        });

        ffmpeg.on("exit", (code, signal) => {
          console.log(`FFmpeg exited with code: ${code}, signal: ${signal}`);
          if (signal === 'SIGSEGV') {
            console.log('🔄 FFmpeg segfaulted, attempting restart...');
            setTimeout(() => {
              if (ws.readyState === WebSocket.OPEN) {
                startFFmpeg(stream_key);
              }
            }, 1000);
          }
        });

        streamStarted = true;
      }
    } else if (isBinary && ffmpeg && ffmpeg.stdin.writable) {
        try {
        // Convert to Buffer if it's an ArrayBuffer
        let data;
        if (msg instanceof ArrayBuffer) {
          data = Buffer.from(msg);
        } else {
          data = Buffer.from(msg);
        }

        // Buffer the data
        buffer = Buffer.concat([buffer, data]);
        
        // Write in larger chunks to reduce overhead
        if (buffer.length >= 8192) { // 8KB threshold
          console.log(`📥 Writing ${buffer.length} bytes to FFmpeg`);
          
          if (ffmpeg.stdin.write(buffer)) {
            buffer = Buffer.alloc(0);
          } else {
            // Handle backpressure
            ffmpeg.stdin.once('drain', () => {
              buffer = Buffer.alloc(0);
              ffmpeg.stdin.setMaxListeners(20); // or a safe upper bound
            });
          }
        }
      } catch (e) {
        console.error("FFmpeg write error:", e);
      }
    }
  });
  
  ws.on('close', () => {
    console.log('❌ WebSocket closed');
    streamStarted = false;

    if (ffmpeg){     // Write any remaining buffer
      if (buffer.length > 0 && ffmpeg.stdin.writable) {
        console.log(`📥 Writing final ${buffer.length} bytes to FFmpeg`);
        ffmpeg.stdin.write(buffer);
      }
            
      // Gracefully close FFmpeg
      if (ffmpeg.stdin.writable) {
        ffmpeg.stdin.end();
      }
      
      setTimeout(() => {
        if (ffmpeg && !ffmpeg.killed) {
          ffmpeg.kill('SIGTERM');
          setTimeout(() => {
            if (ffmpeg && !ffmpeg.killed) {
              ffmpeg.kill('SIGKILL');
            }
          }, 5000);
        }
      }, 1000);
    }
  });
});

httpServer.listen(8081, "localhost", () => {
  console.log("🛰️ Server listening on http://localhost:8081/ws");
});


    The problem statement:
    Been facing errors like pixel drops in the video and bad quality. FFmpeg is crashing with this error:

    FFmpeg Data: Input #0, matroska,webm, from 'pipe:0':

FFmpeg Data:   Metadata:
    encoder         : Chrome
  Duration: N/A, start: 0.000000, bitrate: N/A
  Stream #0:0(eng): Audio: opus, 48000 Hz, mono, fltp (default)

FFmpeg Data:   Stream #0:1(eng): Video: vp8, yuv420p(progressive), 640x480, SAR 1:1 DAR 4:3, 
FFmpeg Data: 1k tbr, 1k tbn (default)
    Metadata:
      alpha_mode      : 1

FFmpeg Data: Unknown pixel format requested: yuv420p .

FFmpeg stdin error: write EPIPE
FFmpeg exited with code: 1, signal: null
FFmpeg exited with code 1
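
    The "Unknown pixel format requested: yuv420p ." line (note the space before the period) suggests that the trailing spaces inside the spawn() argument list ("aac ", "6000k ", "yuv420p ") are reaching FFmpeg verbatim, since spawn() passes each array element as-is with no shell-style trimming. A sketch of startFFmpeg with the stray spaces removed:

    const startFFmpeg = (stream_key) => {
      return spawn("ffmpeg", [
        "-re",
        "-f", "matroska",
        "-i", "pipe:0",
        "-map", "0:v:0",
        "-map", "0:a:0",
        "-c:v", "libx264",
        "-c:a", "aac",          // no trailing space
        "-b:v", "6000k",
        "-maxrate", "6000k",    // no trailing space
        "-bufsize", "6000k",    // no trailing space
        "-pix_fmt", "yuv420p",  // no trailing space; this is the value the error message quotes
        "-f", "flv",
        `rtmp://localhost/live/${stream_key}`,
      ]);
    };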