
Other articles (63)

  • User profiles

    12 April 2011

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
    The user can reach the profile editing page from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" section where support for new languages can be enabled.
    Each newly added language can still be disabled as long as no object has been created in that language; once one has, the language is greyed out in the configuration and (...)

  • XMP PHP

    13 May 2011

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it manages a set of dynamic tags for use in the Semantic Web.
    XMP makes it possible to store, as an XML document, information about a file: title, author, history (...)

On other sites (7600)

  • FFMPEG to create an MPEG-DASH stream with VP8

    24 April 2017, by Kenneth Worden

    I’m trying to use FFMPEG to stream a live video feed from my webcam /dev/video0. Following scattered tutorials and scarce documentation (is this a known problem for the encoding community?), I arrived at the following bash script:

    #!/bin/bash

    ffmpeg \
       -y \
       -f v4l2 \
           -i /dev/video0 \
           -s 640x480 \
           -input_format mjpeg \
           -r 24 \
       -map 0:0 \
       -pix_fmt yuv420p \
       -codec:v libvpx \
           -s 640x480 \
           -threads 4 \
           -b:v 50k \
           -tile-columns 4 \
           -frame-parallel 1 \
           -keyint_min 24 -g 24 \
       -f webm_chunk \
           -header "stream.hdr" \
           -chunk_start_index 1 \
       stream_%d.chk &

    sleep 2

    ffmpeg \
       -f webm_dash_manifest -live 1 \
       -i stream.hdr \
       -c copy \
       -map 0 \
       -f webm_dash_manifest -live 1 \
           -adaptation_sets "id=0,streams=0" \
           -chunk_start_index 1 \
           -chunk_duration_ms 1000 \
           -time_shift_buffer_depth 30000 \
           -minimum_update_period 60000 \
       stream_manifest.mpd

    When I run this script, my webcam light turns on, the stream.hdr and stream_manifest.mpd files are written, and chunks start to be created (i.e. stream_1.chk, stream_2.chk, etc.). However, FFMPEG throws the following error:

    Could not write header for output file #0 (incorrect codec parameters ?): Invalid data found when processing input

    I will explain what I think I am doing with this script, and hopefully this will expose any errors in my thinking.

    First, we invoke FFMPEG to use Video for Linux 2 (v4l2) to read from my webcam (/dev/video0) at a resolution of 640x480. The input format is mjpeg with a frame rate of 24 fps.
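
    For reference, ffmpeg applies options placed before an -i to that input; written after -i they are parsed as options for the next output file. A minimal capture-only sketch with the v4l2 options moved in front of -i (capture_test.webm is only a placeholder output name, not part of my script) would be:

    # Sketch only: ask the webcam itself for mjpeg, 24 fps and 640x480 by
    # placing the input options before -i.
    ffmpeg \
       -f v4l2 \
       -input_format mjpeg \
       -framerate 24 \
       -video_size 640x480 \
       -i /dev/video0 \
       -c:v libvpx -b:v 50k \
       capture_test.webm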

    I then declare that FFMPEG should “map” (copy) the video stream output by v4l2 to a file. I specify the pixel format (yuv420p) and use libvpx (VP8 encoding) to encode the video stream. I set the size to 640x480, use 4 threads, set the bitrate to 50 kbps, do some magic with the tile-columns and frame-parallel options, and set the I-frames to be 24 frames apart.
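
    One caveat I should flag (the console output below also warns about it): -tile-columns and -frame-parallel appear to be private options of the libvpx-vp9 (VP9) encoder, so with -codec:v libvpx (VP8) they are reported as not used for any stream. A VP8-only encode sketch without them, reusing the placeholder capture_test.webm from above as input, would be:

    # Sketch only: VP8 encode without the VP9-specific options
    # (tile-columns, frame-parallel); vp8_test.webm is a placeholder name.
    ffmpeg -y -i capture_test.webm \
       -pix_fmt yuv420p -codec:v libvpx -threads 4 -b:v 50k \
       -keyint_min 24 -g 24 \
       vp8_test.webm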

    I then create a stream.hdr file. The starting index is 1. This command keeps running until I kill it, grabbing new video from my webcam and writing it out in chunks.

    I then sleep for 2 seconds to give the previous command time to generate a header file.

    And that’s really it. The next invocation of FFMPEG simply creates the MPEG-DASH manifest file given the header generated in the previous step.

    So what’s going on? Why can I not view the video in a web browser (I’m using Dash.js)? I serve the manifest, header, and chunks from a Node.js server, so that trivial issue is not the problem.


    Edit: Here is my full console output.

    ffmpeg version 3.0.7-0ubuntu0.16.10.1 Copyright (c) 2000-2017 the FFmpeg developers
     built with gcc 6.2.0 (Ubuntu 6.2.0-5ubuntu12) 20161005
     configuration: --prefix=/usr --extra-version=0ubuntu0.16.10.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librubberband --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-chromaprint --enable-libx264
     libavutil      55. 17.103 / 55. 17.103
     libavcodec     57. 24.102 / 57. 24.102
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 31.100 /  6. 31.100
     libavresample   3.  0.  0 /  3.  0.  0
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    [video4linux2,v4l2 @ 0x55847e244ea0] The driver changed the time per frame from 1/24 to 1/30
    [mjpeg @ 0x55847e245c00] Changing bps to 8
    Input #0, video4linux2,v4l2, from '/dev/video0':
     Duration: N/A, start: 64305.102081, bitrate: N/A
       Stream #0:0: Video: mjpeg, yuvj422p(pc, bt470bg/unknown/unknown), 640x480, -5 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
    Codec AVOption frame-parallel (Enable frame parallel decodability features) specified for output file #0 (stream_%d.chk) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
    Codec AVOption tile-columns (Number of tile columns to use, log2) specified for output file #0 (stream_%d.chk) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
    [swscaler @ 0x55847e24b720] deprecated pixel format used, make sure you did set range correctly
    [libvpx @ 0x55847e248a20] v1.5.0
    Output #0, webm_chunk, to 'stream_%d.chk':
     Metadata:
       encoder         : Lavf57.25.100
       Stream #0:0: Video: vp8 (libvpx), yuv420p, 640x480, q=-1--1, 50 kb/s, 30 fps, 30 tbn, 30 tbc
       Metadata:
         encoder         : Lavc57.24.102 libvpx
       Side data:
         unknown side data type 10 (24 bytes)
    Stream mapping:
     Stream #0:0 -> #0:0 (mjpeg (native) -> vp8 (libvpx))
    Press [q] to stop, [?] for help
    frame=   21 fps=0.0 q=0.0 size=N/A time=00:00:00.70 bitrate=N/A dup=5 drop=frame=   36 fps= 35 q=0.0 size=N/A time=00:00:01.20 bitrate=N/A dup=5 drop=frame=   51 fps= 33 q=0.0 size=N/A time=00:00:01.70 bitrate=N/A dup=5 drop=ffmpeg version 3.0.7-0ubuntu0.16.10.1 Copyright (c) 2000-2017 the FFmpeg developers
     built with gcc 6.2.0 (Ubuntu 6.2.0-5ubuntu12) 20161005
     configuration: --prefix=/usr --extra-version=0ubuntu0.16.10.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librubberband --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-chromaprint --enable-libx264
     libavutil      55. 17.103 / 55. 17.103
     libavcodec     57. 24.102 / 57. 24.102
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 31.100 /  6. 31.100
     libavresample   3.  0.  0 /  3.  0.  0
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, webm_dash_manifest, from 'stream.hdr':
     Metadata:
       encoder         : Lavf57.25.100
     Duration: N/A, bitrate: N/A
       Stream #0:0: Video: vp8, yuv420p, 640x480, SAR 1:1 DAR 4:3, 30 fps, 30 tbr, 1k tbn, 1k tbc (default)
       Metadata:
         webm_dash_manifest_file_name: stream.hdr
         webm_dash_manifest_track_number: 1
    Output #0, webm_dash_manifest, to 'stream_manifest.mpd':
     Metadata:
       encoder         : Lavf57.25.100
       Stream #0:0: Video: vp8, yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 30 fps, 30 tbr, 1k tbn, 1k tbc (default)
       Metadata:
         webm_dash_manifest_file_name: stream.hdr
         webm_dash_manifest_track_number: 1
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
    Could not write header for output file #0 (incorrect codec parameters ?): Invalid data found when processing input
    frame=   67 fps= 33 q=0.0 size
    frame=   82 fps= 32 q=0.0 size=N/A time=00:00:02.73 bitrate=N/A dup=5 drop=
    frame=   97 fps= 32 q=0.0 size=N/A time=00:00:03.23 bitrate=N/A dup=5 drop=
    frame=  112 fps= 32 q=0.0 size=N/A time=00:00:03.73 bitrate=N/A dup=5 ...
  • Streamio::FFMPEG Error while opening encoder for output stream #0:1

    28 February 2015, by user2547496

    Getting this error using the Streamio::FFMPEG wrapper when converting .MOV to .MP4

    Error while opening encoder for output stream #0:1 - maybe incorrect parameters such as bit_rate, rate, width or height

    Here are my commands

    require 'streamio-ffmpeg'
    movie = FFMPEG::Movie.new("tmp/movie.ogg")
    movie.transcode("tmp/movie.mp4")

    Here’s the output =>

    Running transcoding...
    ffmpeg -y -i tmp/movie.ogg  tmp/movie.mp4

    E, [2015-02-28T03:06:00.797778 #6381] ERROR -- : Failed encoding...
    ffmpeg -y -i tmp/movie.ogg  tmp/movie.mp4

    Output #0, mp4, to 'tmp/movie.mp4':
       Stream #0:0: Video: h264, yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=-1--1, 90k tbn, 24 tbc

    Stream mapping:
     Stream #0:0 -> #0:0 (theora -> libx264)
     Stream #0:1 -> #0:1 (flac -> libaacplus)
    Error while opening encoder for output stream #0:1 - maybe incorrect parameters such as bit_rate, rate, width or height

    Errors: encoded file is invalid.

    Stream mapping:
     Stream #0:0 -> #0:0 (theora -> libx264)
     Stream #0:1 -> #0:1 (flac -> libaacplus)
    Error while opening encoder for output stream #0:1 - maybe incorrect parameters such as bit_rate, rate, width or height
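
    For comparison, the command streamio-ffmpeg shells out to can be run by hand with the codecs named explicitly; a sketch, assuming this ffmpeg build has libx264 and its native AAC encoder available (rather than libaacplus):

    # Sketch only: same conversion with explicit codec choices.
    # -strict experimental enables the built-in AAC encoder on older ffmpeg builds.
    ffmpeg -y -i tmp/movie.ogg -c:v libx264 -c:a aac -strict experimental tmp/movie.mp4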
  • How to pipe output from ffmpeg

    27 February 2015, by Andrew Simpson

    I have a C# desktop app.

    I want to stream an RTSP feed from my camera to my app, and in my app save jpegs as they come through.

    I use mpdecimate to keep only the frames that differ from the previous frame.

    If I open a DOS window and type in:

    ffmpeg -i rtsp://admin:admin@192.168.0.17:554/video_0 -vf mpdecimate -r 10 output.avi 2>log.txt

    it works.

    If I put this into a C# app, the log file reports:

      ffmpeg version N-69779-g2a72b16 Copyright (c) 2000-2015 the FFmpeg developers
     built with gcc 4.9.2 (GCC)
     configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-lzma --enable-decklink --enable-zlib
     libavutil      54. 18.100 / 54. 18.100
     libavcodec     56. 22.100 / 56. 22.100
     libavformat    56. 21.100 / 56. 21.100
     libavdevice    56.  4.100 / 56.  4.100
     libavfilter     5. 11.100 /  5. 11.100
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  1.100 /  1.  1.100
     libpostproc    53.  3.100 / 53.  3.100
    Input #0, rtsp, from 'rtsp://admin:admin@192.168.0.17:554/video_0':
     Metadata:
       title           : YOKO Live Streaming
       comment         : video_0
     Duration: N/A, start: 0.000000, bitrate: N/A
       Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 352x288 [SAR 1:1 DAR 11:9], 90k tbr, 90k tbn, 90k tbc
    [NULL @ 04d6a1e0] Unable to find a suitable output format for 'mpdecimate'
    mpdecimate: Invalid argument

    So I tweaked it with things I thought would make it work, but still the same error.

    This is my last attempt at this:

    Process process = new Process();
    try
    {
       FileStream baseStream = null;  
       string arguments = "-i rtsp://admin:admin@192.168.0.17:554/video_0 -vf image2pipe mpdecimate -r 10 - 0";
       string file = @"C:\bin\ffmpeg.exe";
       process.StartInfo.FileName = file;
       process.StartInfo.Arguments = arguments;
       process.StartInfo.CreateNoWindow = true;
       process.StartInfo.RedirectStandardError = true;
       process.StartInfo.RedirectStandardOutput = true;
       process.StartInfo.UseShellExecute = false;
       Task.Run(() =>
       {
           try
           {
               process.Start();
               byte[] buffer = new byte[200000];
               var error = process.StandardError.ReadToEnd();
               baseStream = process.StandardOutput.BaseStream as FileStream;
               bool gotHeader = false;
               bool imgFound = false;
               int counter = 0;
               while (true)
               {
                   byte bit1 = (byte)baseStream.ReadByte();
                   buffer[counter] = bit1;
                   if (bit1 == 217 && gotHeader)
                   {
                       if (buffer[counter - 1] == 255)
                       {
                           byte[] jpeg = new byte[counter];
                           Buffer.BlockCopy(buffer, 0, jpeg, 0, counter);
                           using (MemoryStream ms = new MemoryStream(jpeg))
                           {
                               pictureBox1.Image = Image.FromStream(ms);
                               ms.Close();
                           }
                           imgFound = true;
                       }
                   }
                   else if (bit1 == 216)
                   {
                       if (buffer[counter - 1] == 255)
                       {
                           gotHeader = true;
                       }
                   }
                   if (imgFound)
                   {
                       counter = 0;
                       buffer = new byte[200000];
                       imgFound = false;
                       gotHeader = false;
                   }
                   else
                   {
                       counter++;
                   }
               }
           }
           catch (Exception ex2)
           {
               //log error here
           }
       });
    }
    catch (Exception ex)
    {
       //log error here
    }

    Any suggestion I will gladly try out...
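
    For reference, the command-line version of what the C# code above is trying to reproduce might look roughly like this; a sketch, with image2pipe used as an output format (selected with -f, not passed to -vf) and the JPEG frames written to stdout:

    # Sketch only: drop near-duplicate frames and pipe individual JPEGs to
    # stdout, where the reader can split them on the JPEG SOI/EOI markers.
    ffmpeg -i rtsp://admin:admin@192.168.0.17:554/video_0 \
       -vf mpdecimate -r 10 \
       -f image2pipe -vcodec mjpeg -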