
Media (91)

Other articles (31)

  • Libraries and binaries specific to video and audio processing

    31 January 2010, by

    The following software packages and libraries are used by SPIPmotion in one way or another.
    Required binaries: FFmpeg: the main encoder, able to transcode almost every type of video and audio file into formats readable on the web. See this tutorial for its installation; Oggz-tools: inspection tools for Ogg files; MediaInfo: retrieves information from most video and audio formats;
    Complementary, optional binaries: flvtool2: (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used instead.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both to conventional computers (...)

  • From upload to the final video [standalone version]

    31 January 2010, by

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are performed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
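
    A minimal sketch of what those two extra actions can look like with the command-line tools this site relies on (the file names here are hypothetical, and SPIPmotion's actual implementation may differ):

    # read the technical information of the audio and video streams
    ffprobe -v error -show_streams source.mp4
    # extract a single frame to use as a thumbnail
    ffmpeg -i source.mp4 -ss 1 -frames:v 1 thumbnail.png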

On other sites (4772)

  • When I use ffmpeg to go from a video to frames, and then back to video, the duration is different between the videos

    24 February 2024, by bluepanda

    I am trying to use ffmpeg to convert from a .mp4 (or .mov) video into individual frames, do some processing on those frames, and then convert back to .mp4. The problem is that the resulting video I create is a different duration than the input - I can see this visually when I play the two videos side by side. The difference is not large (i.e. 00:00:00.50 for the input video and 00:00:00.52 for the output video), but when the videos are looped next to each other they get out of sync.

    


    Here is information about the input video retrieved using fluent-ffmpeg's ffmpeg.ffprobe(videoPath):

    


    metadata {
  streams: [
    {
      index: 0,
      codec_name: 'h264',
      codec_long_name: 'H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10',
      profile: 'High',
      codec_type: 'video',
      codec_tag_string: 'avc1',
      codec_tag: '0x31637661',
      width: 1080,
      height: 1920,
      coded_width: 1080,
      coded_height: 1920,
      closed_captions: 0,
      has_b_frames: 2,
      sample_aspect_ratio: 'N/A',
      display_aspect_ratio: 'N/A',
      pix_fmt: 'yuv420p',
      level: 40,
      color_range: 'tv',
      color_space: 'bt709',
      color_transfer: 'bt709',
      color_primaries: 'bt709',
      chroma_location: 'left',
      field_order: 'unknown',
      refs: 1,
      is_avc: 'true',
      nal_length_size: 4,
      id: 'N/A',
      r_frame_rate: '30000/1001',
      avg_frame_rate: '27000/1001',
      time_base: '1/30000',
      start_pts: 0,
      start_time: 0,
      duration_ts: 15100,
      duration: 0.503333,
      bit_rate: 5660223,
      max_bit_rate: 'N/A',
      bits_per_raw_sample: 8,
      nb_frames: 36,
      nb_read_frames: 'N/A',
      nb_read_packets: 'N/A',
      tags: [Object],
      disposition: [Object]
    },
    {
      index: 1,
      codec_name: 'aac',
      codec_long_name: 'AAC (Advanced Audio Coding)',
      profile: 'LC',
      codec_type: 'audio',
      codec_tag_string: 'mp4a',
      codec_tag: '0x6134706d',
      sample_fmt: 'fltp',
      sample_rate: 48000,
      channels: 2,
      channel_layout: 'stereo',
      bits_per_sample: 0,
      id: 'N/A',
      r_frame_rate: '0/0',
      avg_frame_rate: '0/0',
      time_base: '1/48000',
      start_pts: 0,
      start_time: 0,
      duration_ts: 24160,
      duration: 0.503333,
      bit_rate: 248416,
      max_bit_rate: 'N/A',
      bits_per_raw_sample: 'N/A',
      nb_frames: 27,
      nb_read_frames: 'N/A',
      nb_read_packets: 'N/A',
      tags: [Object],
      disposition: [Object]
    }
  ],
  format: {
    filename: '/Users/name/images/input.mp4',
    nb_streams: 2,
    nb_programs: 0,
    format_name: 'mov,mp4,m4a,3gp,3g2,mj2',
    format_long_name: 'QuickTime / MOV',
    start_time: 0,
    duration: 0.503333,
    size: 963879,
    bit_rate: 15319941,
    probe_score: 100,
    tags: {
      major_brand: 'mp42',
      minor_version: '1',
      compatible_brands: 'isommp41mp42',
      creation_time: '2024-02-14T01:21:12.000000Z'
    }
  },
  chapters: []
}


    


    and here is the output from running ffprobe directly:

    


    ffprobe '/Users/name/images/input.mp4'
ffprobe version 6.1.1 Copyright (c) 2007-2023 the FFmpeg developers
  built with Apple clang version 15.0.0 (clang-1500.1.0.2.5)
  configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/6.1.1_2 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopenvino --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon
  libavutil      58. 29.100 / 58. 29.100
  libavcodec     60. 31.102 / 60. 31.102
  libavformat    60. 16.100 / 60. 16.100
  libavdevice    60.  3.100 / 60.  3.100
  libavfilter     9. 12.100 /  9. 12.100
  libswscale      7.  5.100 /  7.  5.100
  libswresample   4. 12.100 /  4. 12.100
  libpostproc    57.  3.100 / 57.  3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/Users/name/images/input.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 1
    compatible_brands: isommp41mp42
    creation_time   : 2024-02-14T01:21:12.000000Z
  Duration: 00:00:00.50, start: 0.000000, bitrate: 15319 kb/s
  Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709, progressive), 1080x1920, 5660 kb/s, 26.97 fps, 29.97 tbr, 30k tbn (default)
    Metadata:
      creation_time   : 2024-02-14T01:21:12.000000Z
      handler_name    : Core Media Video
      vendor_id       : [0][0][0][0]
      encoder         : AVC Coding
  Stream #0:1[0x2](und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 248 kb/s (default)
    Metadata:
      creation_time   : 2024-02-14T01:21:12.000000Z
      handler_name    : Core Media Audio
      vendor_id       : [0][0][0][0]


    


    And this is my command to go from video to frames:

    


    ffmpeg -i /Users/name/images/input.mp4 -y -f image2 /Users/name/images/frames/%d.png


    


    After which I convert the frames back to video with this - note that I get the frame rate by seeing that avg_frame_rate is 27000/1001 = 26.97302697:

    


    ffmpeg -r 26.973026973026972 -i /Users/name/images/frames/%d.png -y -r 26.973026973026972 -b:v 5660223k -f mp4 -pix_fmt yuv420p -t 0.503333 /Users/name/images/output.mp4
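
    (For reference, the same avg_frame_rate value can be read directly with ffprobe instead of from the full metadata dump - a minimal sketch, assuming the same input path:)

    ffprobe -v error -select_streams v:0 -show_entries stream=avg_frame_rate -of default=noprint_wrappers=1:nokey=1 /Users/name/images/input.mp4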


    


    And if I then run fluent-ffmpeg's ffmpeg.ffprobe(videoPath) I get:

    


    metadata {
  streams: [
    {
      index: 0,
      codec_name: 'h264',
      codec_long_name: 'H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10',
      profile: 'High',
      codec_type: 'video',
      codec_tag_string: 'avc1',
      codec_tag: '0x31637661',
      width: 1080,
      height: 1920,
      coded_width: 1080,
      coded_height: 1920,
      closed_captions: 0,
      has_b_frames: 2,
      sample_aspect_ratio: '1:1',
      display_aspect_ratio: '9:16',
      pix_fmt: 'yuv420p',
      level: 62,
      color_range: 'unknown',
      color_space: 'unknown',
      color_transfer: 'unknown',
      color_primaries: 'unknown',
      chroma_location: 'left',
      field_order: 'unknown',
      refs: 1,
      is_avc: 'true',
      nal_length_size: 4,
      id: 'N/A',
      r_frame_rate: '27000/1001',
      avg_frame_rate: '27000/1001',
      time_base: '1/27000',
      start_pts: 0,
      start_time: 0,
      duration_ts: 14014,
      duration: 0.519037,
      bit_rate: 52138429,
      max_bit_rate: 'N/A',
      bits_per_raw_sample: 8,
      nb_frames: 14,
      nb_read_frames: 'N/A',
      nb_read_packets: 'N/A',
      tags: [Object],
      disposition: [Object]
    }
  ],
  format: {
    filename: '/Users/name/images/output.mp4',
    nb_streams: 1,
    nb_programs: 0,
    format_name: 'mov,mp4,m4a,3gp,3g2,mj2',
    format_long_name: 'QuickTime / MOV',
    start_time: 0,
    duration: 0.52,
    size: 3383708,
    bit_rate: 52057046,
    probe_score: 100,
    tags: {
      major_brand: 'isom',
      minor_version: '512',
      compatible_brands: 'isomiso2avc1mp41',
      encoder: 'Lavf60.3.100'
    }
  },
  chapters: []
}


    


    and here is the output from running ffprobe directly:

    


    ffprobe '/Users/name/images/output.mp4'
ffprobe version 6.1.1 Copyright (c) 2007-2023 the FFmpeg developers
  built with Apple clang version 15.0.0 (clang-1500.1.0.2.5)
  configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/6.1.1_2 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopenvino --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon
  libavutil      58. 29.100 / 58. 29.100
  libavcodec     60. 31.102 / 60. 31.102
  libavformat    60. 16.100 / 60. 16.100
  libavdevice    60.  3.100 / 60.  3.100
  libavfilter     9. 12.100 /  9. 12.100
  libswscale      7.  5.100 /  7.  5.100
  libswresample   4. 12.100 /  4. 12.100
  libpostproc    57.  3.100 / 57.  3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/Users/name/images/output.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf60.3.100
  Duration: 00:00:00.52, start: 0.000000, bitrate: 52153 kb/s
  Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1080x1920 [SAR 1:1 DAR 9:16], 52138 kb/s, 26.97 fps, 26.97 tbr, 27k tbn (default)
    Metadata:
      handler_name    : VideoHandler
      vendor_id       : [0][0][0][0]
      encoder         : Lavc60.3.100 libx264


    


    This seems like it should be a fairly common scenario, but I have not been able to find examples of this, and the other questions about incorrect durations on Stack Overflow are about bigger differences (e.g. 3 seconds instead of 10 seconds: Wrong video duration when recording with ffmpeg).

    


    Some other details:

    


      

    • I am running this through a Node.js script with fluent-ffmpeg, but I have also tried running the commands directly in the terminal and the result is the same.
    • I am fine with the output frames being .png / .jpg / other formats.
    • I am fine with setting this to a different frame rate than the original, as long as the two videos end up with the same duration.
    • One suspicious thing is that I set -t 0.503333 when creating the video, but it doesn't seem to work, as the result video shows duration: 0.519037 / 00:00:00.52 (see the arithmetic just below this list).


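    (Working through the numbers in the probe output above, and assuming the reported nb_frames of 14 is accurate:

    14 frames × 1001/27000 s per frame = 14014/27000 s ≈ 0.519037 s

    which matches the reported duration exactly - so the -t cut appears to fall partway through the 14th frame, and the container duration runs to the end of that frame.)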
    


    Thank you for any help!

    


  • Using ffmpeg to merge video segments created by the MediaRecorder API

    10 April 2023, by Dario Cimmino

    I am recording live video from a webcam using the MediaRecorder API in chunks of 3 seconds:

    


    startButton.addEventListener('click', () => {
navigator.mediaDevices.getUserMedia({
    video: {
        width: 1280,
        height: 720,
        frameRate: { ideal: 30, max: 30 }
    }
})
    .then(stream => {
        video.srcObject = stream;
        mediaRecorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
        mediaRecorder.ondataavailable = async (event) => {
            const blob = new Blob([event.data], { type: 'video/mp4' });
            const formData = new FormData();
            formData.append('segment', blob, `segment${segmentNumber}.mp4`);

            // When a new video segment is ready
            fetch('http://localhost:3000/upload', {
                method: 'POST',
                body: formData
            })
                .then((response) => response.text())
                .then((result) => {
                    console.log('Upload result:', result);
                })
                .catch((error) => {
                    console.error('Error uploading video segment:', error);
                });
            //Upload data to mysql
            fetch('upload.php', {
                method: 'POST',
                body: formData
            })
                .then(response => response.text())
                .then(result => {
                    console.log('Upload result to MYSQL:', result);
                })
                .catch(error => {
                    console.error('Error uploading video segment to MYSQL:', error);
                });
            segmentNumber++;
        };

        mediaRecorder.start(3000);
    })
    .catch(error => {
        console.error('Error accessing camera:', error);
    });
});


    I am left with only the first segment playable, as is expected.

    


    However, when the recording stops, I'd like to merge all of those recorded segments using ffmpeg (or any other tool) with the help of my Node.js server.
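
    (For comparison, a common way to join independently playable files with ffmpeg is the concat demuxer - a minimal sketch, where segments.txt is a hypothetical list file; whether it applies to these MediaRecorder chunks at all is unclear:)

    # segments.txt
    file 'segment1.mp4'
    file 'segment2.mp4'
    file 'segment3.mp4'

    ffmpeg -f concat -safe 0 -i segments.txt -c copy out.mp4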

    


    I am having difficulty understanding how mp4 files are parsed.

    


    If I try the command:

    


    ffmpeg -i segment1.mp4 -i segment2.mp4 -i segment3.mp4  out.mp4


    


    I get the following error:

    


        ffmpeg version N-110223-gb18a9c2971-20230410 Copyright (c) 2000-2023 the FFmpeg developers
  built with gcc 12.2.0 (crosstool-NG 1.25.0.152_89671bf)
  configuration: --prefix=/ffbuild/prefix --pkg-config-flags=--static --pkg-config=pkg-config --cross-prefix=x86_64-w64-mingw32- --arch=x86_64 --target-os=mingw32 --enable-gpl --enable-version3 --disable-debug --disable-w32threads --enable-pthreads --enable-iconv --enable-libxml2 --enable-zlib --enable-libfreetype --enable-libfribidi --enable-gmp --enable-lzma --enable-fontconfig --enable-libvorbis --enable-opencl --disable-libpulse --enable-libvmaf --disable-libxcb --disable-xlib --enable-amf --enable-libaom --enable-libaribb24 --enable-avisynth --enable-chromaprint --enable-libdav1d --enable-libdavs2 --disable-libfdk-aac --enable-ffnvcodec --enable-cuda-llvm --enable-frei0r --enable-libgme --enable-libkvazaar --enable-libass --enable-libbluray --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librist --enable-libssh --enable-libtheora --enable-libvpx --enable-libwebp --enable-lv2 --disable-libmfx --enable-libvpl --enable-openal --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-libopenjpeg --enable-libopenmpt --enable-librav1e --enable-librubberband --enable-schannel --enable-sdl2 --enable-libsoxr --enable-libsrt --enable-libsvtav1 --enable-libtwolame --enable-libuavs3d --disable-libdrm --disable-vaapi --enable-libvidstab --enable-vulkan --enable-libshaderc --enable-libplacebo --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libzimg --enable-libzvbi --extra-cflags=-DLIBTWOLAME_STATIC --extra-cxxflags= --extra-ldflags=-pthread --extra-ldexeflags= --extra-libs=-lgomp --extra-version=20230410
  libavutil      58.  6.100 / 58.  6.100
  libavcodec     60.  9.100 / 60.  9.100
  libavformat    60.  4.101 / 60.  4.101
  libavdevice    60.  2.100 / 60.  2.100
  libavfilter     9.  5.100 /  9.  5.100
  libswscale      7.  2.100 /  7.  2.100
  libswresample   4. 11.100 /  4. 11.100
  libpostproc    57.  2.100 / 57.  2.100
Input #0, matroska,webm, from 'segment1.mp4':
  Metadata:
    encoder         : Chrome
  Duration: N/A, start: 0.000000, bitrate: N/A
  Stream #0:0(eng): Video: h264 (Constrained Baseline), yuv420p(progressive), 1280x720 [SAR 1:1 DAR 16:9], 30.30 fps, 30 tbr, 1k tbn (default)
[mov,mp4,m4a,3gp,3g2,mj2 @ 000001d93cf25fc0] Format mov,mp4,m4a,3gp,3g2,mj2 detected only with low score of 1, misdetection possible!
[mov,mp4,m4a,3gp,3g2,mj2 @ 000001d93cf25fc0] moov atom not found
segment2.mp4: Invalid data found when processing input


    


    Any help or input is appreciated. Thanks!

    


  • Cannot duplicate virtual camera on Orange Pi 5B [closed]

    21 January 2024, by Denis Kotov

    I am facing an issue where I cannot duplicate the virtual camera for a SkyDroid 5.8GHz OTG Receiver on an Orange Pi 5B:

    


    orangepi@orangepi-desktop:~$ gst-launch-1.0 v4l2src device=/dev/video0 ! xvimagesink device=/dev/video2

Setting pipeline to PAUSED ...
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
../sys/xvimage/xvimagesink.c(1806): gst_xv_image_sink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Could not open display (null)
ERROR: pipeline doesn't want to preroll.
Failed to set pipeline to PAUSED.
Setting pipeline to NULL ...
Freeing pipeline ...

orangepi@orangepi-desktop:~$ gst-launch-1.0 v4l2src device=/dev/video0 ! v4l2sink device=/dev/video2

Setting pipeline to PAUSED ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Cannot identify device '/dev/video2'.
Additional debug info:
../sys/v4l2/v4l2_calls.c(608): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
system error: No such file or directory
ERROR: pipeline doesn't want to preroll.
Failed to set pipeline to PAUSED.
Setting pipeline to NULL ...
Freeing pipeline ...


    


    I also tried to duplicate the video camera using ffmpeg:

    


    orangepi@orangepi-desktop:~$ sudo ffmpeg -f video4linux2 -i /dev/video0 -codec copy -f v4l2 /dev/video2

ffmpeg version 6.0-5+git230804.e243e8d001~j1 Copyright (c) 2000-2023 the FFmpeg developers
  built with gcc 11 (Ubuntu 11.4.0-1ubuntu1~22.04)
  configuration: --prefix=/usr --extra-version='5+git230804.e243e8d001~j1' --toolchain=hardened --libdir=/usr/lib/aarch64-linux-gnu --incdir=/usr/include/aarch64-linux-gnu --arch=arm64 --enable-gpl --disable-stripping --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libglslang --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librist --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-version3 --enable-rkmpp --enable-sdl2 --disable-sndio --enable-pocketsphinx --enable-librsvg --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-libplacebo --enable-shared
  libavutil      58.  2.100 / 58.  2.100
  libavcodec     60.  3.100 / 60.  3.100
  libavformat    60.  3.100 / 60.  3.100
  libavdevice    60.  1.100 / 60.  1.100
  libavfilter     9.  3.100 /  9.  3.100
  libswscale      7.  1.100 /  7.  1.100
  libswresample   4. 10.100 /  4. 10.100
  libpostproc    57.  1.100 / 57.  1.100
[mjpeg @ 0xaaab04104190] EOI missing, emulating
Input #0, video4linux2,v4l2, from '/dev/video0':
  Duration: N/A, start: 3317.330868, bitrate: N/A
  Stream #0:0: Video: mjpeg (Baseline), yuvj422p(pc, bt470bg/unknown/unknown), 640x480, 30 fps, 30 tbr, 1000k tbn
[video4linux2,v4l2 @ 0xaaab04105750] Unable to open V4L2 device '/dev/video2'
[out#0/video4linux2,v4l2 @ 0xaaab040b9200] Could not write header (incorrect codec parameters ?): No such file or directory
[vost#0:0/copy @ 0xaaab04107640] Error initializing output stream: 
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
    Last message repeated 1 times
[video4linux2,v4l2 @ 0xaaab041039d0] Some buffers are still owned by the caller on close.
ioctl(VIDIOC_QBUF): Bad file descriptor


    


    But that did not work either...

    


    The reason I want this is that I am trying to do additional processing with OpenCV, but the camera is already opened by another application, and it is not possible to open the camera in two applications simultaneously ...
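
    (A minimal sketch of the loopback approach this seems to call for, assuming the v4l2loopback kernel module is available for this kernel - the module and the device number are assumptions, not something already set up above:)

    # create a loopback device at /dev/video2 (assumes v4l2loopback is installed)
    sudo modprobe v4l2loopback video_nr=2
    # feed it a decoded copy of the real camera
    sudo ffmpeg -f v4l2 -i /dev/video0 -pix_fmt yuv420p -f v4l2 /dev/video2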

    


    Has anybody faced a similar issue?