
Other articles (99)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, a preconfiguration is automatically set up by MediaSPIP init so that the new feature is immediately operational. A configuration step is therefore not required for this.

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

On other sites (11254)

  • Remove obsolete bower.json.

    20 July 2019, by blueimp
  • What is the reason for getting the libavcodec AVPacket presentation time as 0?

    13 August 2019, by Chamara Manoj

    Recently I have been using the FFmpeg library with VS 2017 to decode and encode some video data. I used the example code provided with the FFmpeg library packages, transcoding.c.

    After encoding, I tried to extract the frames of the encoded and muxed video.mp4. However, when I read the frames using av_read_frame(AVFormatContext *s, AVPacket *pkt), the pts and dts of the last received packet are both 0.

    What could be the reason for this? The video plays smoothly, and I used the H.265 codec.
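
    For reference, the following is a minimal sketch (not the asker's code; the file name "video.mp4" and modern FFmpeg headers are assumptions) of the kind of av_read_frame() loop described above. It prints the pts/dts of every packet and is a way to check whether the last packet really carries an explicit 0 rather than AV_NOPTS_VALUE, which is what FFmpeg uses for a missing timestamp.

    #include <stdio.h>
    #include <inttypes.h>
    #include <libavformat/avformat.h>

    int main(void)
    {
        AVFormatContext *fmt = NULL;
        AVPacket *pkt = av_packet_alloc();

        /* "video.mp4" is a placeholder input; on FFmpeg older than 4.0,
         * av_register_all() would also need to be called first. */
        if (!pkt || avformat_open_input(&fmt, "video.mp4", NULL, NULL) < 0)
            return 1;
        if (avformat_find_stream_info(fmt, NULL) < 0)
            return 1;

        while (av_read_frame(fmt, pkt) >= 0) {
            /* pts/dts are expressed in the stream's time_base;
             * AV_NOPTS_VALUE means "unset", which is not the same as 0. */
            printf("stream %d  pts=%" PRId64 "  dts=%" PRId64 "\n",
                   pkt->stream_index, pkt->pts, pkt->dts);
            av_packet_unref(pkt);
        }

        av_packet_free(&pkt);
        avformat_close_input(&fmt);
        return 0;
    }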

  • FFMPEG to create an MPEG-DASH stream with VP8

    16 September 2019, by Kenny Worden

    I’m trying to use FFMPEG to stream a live video feed from my webcam /dev/video0. Following scattered tutorials and scarce documentation (is this a known problem in the encoding community?), I arrived at the following bash script:

    #!/bin/bash

    ffmpeg \
       -y \
       -f v4l2 \
           -i /dev/video0 \
           -s 640x480 \
           -input_format mjpeg \
           -r 24 \
       -map 0:0 \
       -pix_fmt yuv420p \
       -codec:v libvpx \
           -s 640x480 \
           -threads 4 \
           -b:v 50k \
           -tile-columns 4 \
           -frame-parallel 1 \
           -keyint_min 24 -g 24 \
       -f webm_chunk \
           -header "stream.hdr" \
           -chunk_start_index 1 \
       stream_%d.chk &

    sleep 2

    ffmpeg \
       -f webm_dash_manifest -live 1 \
       -i stream.hdr \
       -c copy \
       -map 0 \
       -f webm_dash_manifest -live 1 \
           -adaptation_sets "id=0,streams=0" \
           -chunk_start_index 1 \
           -chunk_duration_ms 1000 \
           -time_shift_buffer_depth 30000 \
           -minimum_update_period 60000 \
       stream_manifest.mpd

    When I run this script, my webcam light turns on, the stream.hdr and stream_manifest.mpd files are written, and chunks start to be created (i.e. stream_1.chk, stream_2.chk, etc.). However, FFMPEG throws the following error:

    Could not write header for output file #0 (incorrect codec parameters ?): Invalid data found when processing input

    I will explain what I think I am doing with this script, and hopefully this will expose any errors in my thinking.

    First, I invoke FFMPEG to use Video4Linux2 (v4l2) to read from my webcam (/dev/video0) at a resolution of 640x480. The input format is mjpeg with a framerate of 24 fps.

    I then declare that FFMPEG should "map" (copy) the video stream output by v4l2 to a file. I specify the pixel format (YUV420P) and use libvpx (VP8 encoding) to encode the video stream. I set the size to be 640x480, use 4 threads, set the bitrate to be 50kbps, do some magic with tile-columns and frame-parallel options, and set the I-frames to be 24 frames apart.

    I then create a stream.hdr file. The starting chunk index is 1. This command keeps running indefinitely until I kill it, grabbing new video from my webcam and outputting it into chunks.

    I then sleep for 2 seconds to give the previous command time to generate a header file.

    And that’s really it. The next invocation of FFMPEG simply creates the MPEG-DASH manifest file given the header generated in the previous step.

    So what’s going on? Why can I not view the video in a web browser (I’m using Dash.js)? I serve the manifest, header, and chunks from a Node.js server, so that trivial issue is not the problem.


    Edit: Here is my full console output.

    ffmpeg version 3.0.7-0ubuntu0.16.10.1 Copyright (c) 2000-2017 the FFmpeg developers
     built with gcc 6.2.0 (Ubuntu 6.2.0-5ubuntu12) 20161005
     configuration: --prefix=/usr --extra-version=0ubuntu0.16.10.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librubberband --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-chromaprint --enable-libx264
     libavutil      55. 17.103 / 55. 17.103
     libavcodec     57. 24.102 / 57. 24.102
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 31.100 /  6. 31.100
     libavresample   3.  0.  0 /  3.  0.  0
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    [video4linux2,v4l2 @ 0x55847e244ea0] The driver changed the time per frame from 1/24 to 1/30
    [mjpeg @ 0x55847e245c00] Changing bps to 8
    Input #0, video4linux2,v4l2, from '/dev/video0':
     Duration: N/A, start: 64305.102081, bitrate: N/A
       Stream #0:0: Video: mjpeg, yuvj422p(pc, bt470bg/unknown/unknown), 640x480, -5 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
    Codec AVOption frame-parallel (Enable frame parallel decodability features) specified for output file #0 (stream_%d.chk) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
    Codec AVOption tile-columns (Number of tile columns to use, log2) specified for output file #0 (stream_%d.chk) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
    [swscaler @ 0x55847e24b720] deprecated pixel format used, make sure you did set range correctly
    [libvpx @ 0x55847e248a20] v1.5.0
    Output #0, webm_chunk, to 'stream_%d.chk':
     Metadata:
       encoder         : Lavf57.25.100
       Stream #0:0: Video: vp8 (libvpx), yuv420p, 640x480, q=-1--1, 50 kb/s, 30 fps, 30 tbn, 30 tbc
       Metadata:
         encoder         : Lavc57.24.102 libvpx
       Side data:
         unknown side data type 10 (24 bytes)
    Stream mapping:
     Stream #0:0 -> #0:0 (mjpeg (native) -> vp8 (libvpx))
    Press [q] to stop, [?] for help
    frame=   21 fps=0.0 q=0.0 size=N/A time=00:00:00.70 bitrate=N/A dup=5 drop=frame=   36 fps= 35 q=0.0 size=N/A time=00:00:01.20 bitrate=N/A dup=5 drop=frame=   51 fps= 33 q=0.0 size=N/A time=00:00:01.70 bitrate=N/A dup=5 drop=ffmpeg version 3.0.7-0ubuntu0.16.10.1 Copyright (c) 2000-2017 the FFmpeg developers
     built with gcc 6.2.0 (Ubuntu 6.2.0-5ubuntu12) 20161005
     configuration: --prefix=/usr --extra-version=0ubuntu0.16.10.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librubberband --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-chromaprint --enable-libx264
     libavutil      55. 17.103 / 55. 17.103
     libavcodec     57. 24.102 / 57. 24.102
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 31.100 /  6. 31.100
     libavresample   3.  0.  0 /  3.  0.  0
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, webm_dash_manifest, from 'stream.hdr':
     Metadata:
       encoder         : Lavf57.25.100
     Duration: N/A, bitrate: N/A
       Stream #0:0: Video: vp8, yuv420p, 640x480, SAR 1:1 DAR 4:3, 30 fps, 30 tbr, 1k tbn, 1k tbc (default)
       Metadata:
         webm_dash_manifest_file_name: stream.hdr
         webm_dash_manifest_track_number: 1
    Output #0, webm_dash_manifest, to 'stream_manifest.mpd':
     Metadata:
       encoder         : Lavf57.25.100
       Stream #0:0: Video: vp8, yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 30 fps, 30 tbr, 1k tbn, 1k tbc (default)
       Metadata:
         webm_dash_manifest_file_name: stream.hdr
         webm_dash_manifest_track_number: 1
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
    Could not write header for output file #0 (incorrect codec parameters ?): Invalid data found when processing input
    frame=   67 fps= 33 q=0.0 size
    frame=   82 fps= 32 q=0.0 size=N/A time=00:00:02.73 bitrate=N/A dup=5 drop=
    frame=   97 fps= 32 q=0.0 size=N/A time=00:00:03.23 bitrate=N/A dup=5 drop=
    frame=  112 fps= 32 q=0.0 size=N/A time=00:00:03.73 bitrate=N/A dup=5 ...