
Other articles (33)

  • The SPIPmotion processing queue

    28 November 2010

    A queue stored in the database
    On installation, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table consists of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)
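The queue table described above can be sketched as follows. This is a hypothetical illustration only: SPIP itself targets MySQL, and the SQLite usage and column types here are assumptions; only the table and field names come from the article.

```python
import sqlite3

# Hypothetical sketch of the spip_spipmotion_attentes queue table.
# SQLite and the column types are assumptions for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE spip_spipmotion_attentes (
        id_spipmotion_attente INTEGER PRIMARY KEY, -- unique id of the task to process
        id_document           INTEGER,             -- original document to encode
        id_objet              INTEGER,             -- object to attach the encoded file to
        objet                 TEXT                 -- type of that object
    )""")

# Enqueue one encoding task and read it back.
conn.execute("INSERT INTO spip_spipmotion_attentes VALUES (1, 42, 7, 'article')")
row = conn.execute(
    "SELECT objet FROM spip_spipmotion_attentes WHERE id_spipmotion_attente = 1"
).fetchone()
```

A worker would poll this table, encode the document referenced by id_document, then attach the result to the object (id_objet, objet) and delete the row.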

  • The farm's regular Cron tasks

    1 December 2010

    Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance of the shared installation on a regular basis. Combined with a system Cron on the central site of the installation, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
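The super Cron mechanism can be sketched as a simple loop over all instances. This is a minimal sketch, not SPIP's actual implementation: `trigger` stands in for the real mechanism (presumably an HTTP request to each instance's cron entry point) and is an assumption.

```python
def super_cron(instances, trigger):
    """Sketch of gestion_mutu_super_cron: visit every instance so its own
    Cron tasks get a chance to run, even on rarely visited sites.

    `trigger` is a stand-in for the real mechanism (e.g. an HTTP request
    to the instance's cron URL) and is an assumption for illustration."""
    for site in instances:
        trigger(site)

# A system scheduler would call super_cron once per minute with the
# full list of mutualised instances.
visited = []
super_cron(["site-a.example", "site-b.example"], visited.append)
```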

  • Contribute to documentation

    13 April 2011

    Documentation is vital to the development of improved technical capabilities.
    MediaSPIP welcomes documentation by users as well as developers, including:
    critique of existing features and functions
    articles contributed by developers, administrators, content producers and editors
    screenshots to illustrate the above
    translations of existing documentation into other languages
    To contribute, register to the project users’ mailing (...)

On other sites (7957)

  • Pass camera input to ffmpeg

    30 May 2017, by Sachin Sonone

    I am trying to develop a Delphi app which shows a preview of the camera in the Delphi form and then sends the video frames, as images, to ffmpeg on its stdin. ffmpeg then broadcasts the video to a Wowza server over RTMP, but ffmpeg gives the error 'pipe:0: Bad file descriptor'.

       FFmpeg command:
       ffmpeg -f image2pipe -i pipe:0  -c:v libx264 -y -report -f flv wowza_url

    I followed this article, https://msdn.microsoft.com/en-us/library/ms682499(VS.85).aspx, to build the parent app, which opens ffmpeg as a child process and uses ffmpeg's stdin to pass input.

    // Create a pipe for the child process's STDOUT.
    CreatePipe(STDOut_ReadPipe, STDOut_WritePipe, @Security, 0);

    // Ensure the read handle to the pipe for STDOUT is not inherited.
    SetHandleInformation(STDOut_ReadPipe, HANDLE_FLAG_INHERIT, 0);

    // Create a pipe for the child process's STDIN.
    if CreatePipe(STDIn_ReadPipe, STDIn_WritePipe, @Security, 0) then
      // Ensure the write handle to the pipe for STDIN is not inherited.
      SetHandleInformation(STDIn_WritePipe, HANDLE_FLAG_INHERIT, 0);

    This is the code to start the ffmpeg process:

    FillChar(StartupInfo,Sizeof(StartupInfo),0);
    StartupInfo.cb := Sizeof(StartupInfo);
    StartupInfo.dwFlags := STARTF_USESTDHANDLES;


    StartupInfo.wShowWindow := 1;
    S := args;
    UniqueString(S);

    StartupInfo.hStdError := STDOut_WritePipe;
    StartupInfo.hStdOutput := STDOut_WritePipe;
    StartupInfo.hStdInput := STDIn_ReadPipe;

    but as soon as the program starts, ffmpeg exits with the error.

    This is the full console log of ffmpeg:

    ffmpeg started on 2017-05-30 at 18:45:36
    Report written to "ffmpeg-20170530-184536.log"
    Command line:
    "ffmpeg.exe" -f image2pipe -i pipe:0 -report -movflags +faststart -tune zerolatency -an -pix_fmt yuv420p -preset ultrafast -vcodec libx264 -b:v 250k -y -f flv test.flv
    ffmpeg version N-85750-ga75ef15 Copyright (c) 2000-2017 the FFmpeg developers
    built with gcc 6.3.0 (GCC)
    configuration: --enable-gpl --enable-version3 --enable-cuda --enable-cuvid --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-nvenc --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-zlib
    libavutil      55. 61.100 / 55. 61.100
    libavcodec     57. 93.100 / 57. 93.100
    libavformat    57. 72.101 / 57. 72.101
    libavdevice    57.  7.100 / 57.  7.100
    libavfilter     6. 88.100 /  6. 88.100
    libswscale      4.  7.101 /  4.  7.101
    libswresample   2.  8.100 /  2.  8.100
    libpostproc    54.  6.100 / 54.  6.100
    Splitting the commandline.
    Reading option '-f' ... matched as option 'f' (force format) with argument 'image2pipe'.
    Reading option '-i' ... matched as input url with argument 'pipe:0'.
    Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
    Reading option '-movflags' ... matched as AVOption 'movflags' with argument '+faststart'.
    Reading option '-tune' ... matched as AVOption 'tune' with argument 'zerolatency'.
    Reading option '-an' ... matched as option 'an' (disable audio) with argument '1'.
    Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'yuv420p'.
    Reading option '-preset' ... matched as AVOption 'preset' with argument 'ultrafast'.
    Reading option '-vcodec' ... matched as option 'vcodec' (force video codec ('copy' to copy stream)) with argument 'libx264'.
    Reading option '-b:v' ... matched as option 'b' (video bitrate (please use -b:v)) with argument '250k'.
    Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'.
    Reading option '-f' ... matched as option 'f' (force format) with argument 'flv'.
    Reading option 'test.flv' ... matched as output url.
    Finished splitting the commandline.
    Parsing a group of options: global .
    Applying option report (generate a report) with argument 1.
    Applying option y (overwrite output files) with argument 1.
    Successfully parsed a group of options.
    Parsing a group of options: input url pipe:0.
    Applying option f (force format) with argument image2pipe.
    Successfully parsed a group of options.
    Opening an input file: pipe:0.
    [pipe @ 003bdea0] Setting default whitelist 'crypto'
    [AVIOContext @ 03115e60] Statistics: 0 bytes read, 0 seeks
    pipe:0: Bad file descriptor
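The parent-process pattern from the question can be sketched in Python, where subprocess takes care of the pipe plumbing and handle inheritance that the CreatePipe/SetHandleInformation calls above manage by hand. This is a sketch under assumptions, not the asker's Delphi code; output_url is a placeholder for the Wowza RTMP URL.

```python
import subprocess

def build_ffmpeg_cmd(output_url):
    # Mirrors the command from the question: read an image stream on stdin
    # and stream it out as FLV. output_url is a placeholder.
    return ["ffmpeg", "-f", "image2pipe", "-i", "pipe:0",
            "-c:v", "libx264", "-y", "-f", "flv", output_url]

def stream_frames(frames, output_url):
    # subprocess wires the child's stdin to an inheritable pipe for us.
    proc = subprocess.Popen(build_ffmpeg_cmd(output_url),
                            stdin=subprocess.PIPE)
    for frame in frames:       # each frame: one encoded image (e.g. JPEG bytes)
        proc.stdin.write(frame)
    proc.stdin.close()         # EOF tells ffmpeg the input stream is finished
    return proc.wait()
```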
  • Strange artifacts in image - where does a frame start?

    26 May 2017, by user3387542

    We are live broadcasting a webcam stream. No audio, video only. The current command works well:

    # Direct replay works well:
    ffmpeg -f video4linux2 -r 10 -s 1280x720 -i /dev/video0 -c:v rawvideo -f rawvideo -pix_fmt yuv420p - | \
    ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 10 -

    but as soon as we try to send this data over the network (UDP broadcast / gigabit LAN) we get strange artefacts in the image.

    # server command:
    ffmpeg -f video4linux2 -r 10 -s 1280x720 -i /dev/video0 -c:v rawvideo -f rawvideo -pix_fmt yuv420p - | \
    socat - UDP-DATAGRAM:10.0.0.255:12345,broadcast

    # client command:
    socat -u udp-recv:12345,reuseaddr - | \
    ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 10 -

    Where do these artifacts come from, and how do we get rid of them? Does this have something to do with the client not knowing where a given video frame starts?

    We have chosen to stream raw video to reduce latency. The final goal would be to apply opencv tools to the video and react live depending on the situation. Which works great, as long as the camera is plugged in directly into this computer. But we need to set it apart and need multiple clients.

    (Screenshot of the artifacts omitted.)

    The camera used is a Microsoft® LifeCam Studio(TM).

    $ v4l2-ctl -d 0 --list-formats
    ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'YUYV'
    Name        : YUYV 4:2:2

    Index       : 1
    Type        : Video Capture
    Pixel Format: 'MJPG' (compressed)
    Name        : Motion-JPEG

    Index       : 2
    Type        : Video Capture
    Pixel Format: 'M420'
    Name        : YUV 4:2:0 (M420)

    Update

    To narrow down the issue, I tried to split it into separate tasks:

    1.0. Writing the stream to a file:

    ffmpeg -f video4linux2 -r 10 -s 1280x720 -i /dev/video0 -c:v rawvideo -f rawvideo -pix_fmt yuv420p - > ~/deltemp/rawout

    1.1. Reading the file: the result looks great, no artefacts:

    cat ~/deltemp/rawout | ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 10 -

    2.0 Starting the stream and broadcasting the stream as mentioned in the server command above

    2.1 Writing the UDP stream to a file. And watching the file (artifacts are back again)

    socat -u udp-recv:12345,reuseaddr - > ~/deltemp/rawout
    cat ~/deltemp/rawout | ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 10 -

    As test 1 showed no artifacts and test 2 did, it must be related to UDP packet loss.

    Test 3: reducing the quality to 640x480 did not help either.
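A raw yuv420p frame at 1280x720 is 1280 × 720 × 1.5 = 1,382,400 bytes, so each frame necessarily spans many UDP datagrams; if even one is lost or reordered, every following byte lands at the wrong offset in the frame, which matches the artifacts described. One common remedy is to tag each datagram with a frame id and chunk index so the receiver can detect loss and resynchronise at the next frame boundary. A minimal sketch of the sender side (the header layout and payload size are assumptions, not part of the original setup):

```python
import struct

HEADER = struct.Struct("!IHH")   # frame_id, chunk_index, chunk_count
MAX_PAYLOAD = 1400               # stay well under a typical Ethernet MTU

def make_chunks(frame_id, frame):
    # Split one raw frame into sequence-tagged datagram payloads so the
    # receiver can detect loss and resynchronise at the next frame start.
    count = -(-len(frame) // MAX_PAYLOAD)   # ceiling division
    return [HEADER.pack(frame_id, i, count)
            + frame[i * MAX_PAYLOAD:(i + 1) * MAX_PAYLOAD]
            for i in range(count)]
```

The receiver would discard any frame with a missing chunk index instead of feeding misaligned bytes to the decoder; in practice a transport that already does this framing (e.g. RTP, or MPEG-TS over udp://) avoids hand-rolling it.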

  • Has anyone used Azure VMs for GPU-accelerated video encoding/decoding with FFmpeg/libav?

    4 May 2017, by user3776020

    I am trying to use NVIDIA hardware acceleration with FFmpeg/libav on a Microsoft Azure GPU VM (any of the new N-series). I am able to get it working on a comparable AWS VM, but am not having luck with Azure.

    I have a different question posted here detailing the specific problem I'm running into (Trouble with hardware-assisted encoding/decoding via FFmpeg on Azure GPU VMs (Ubuntu 16.04)), but have since realised that the better question is whether anyone has been able to achieve this at all.

    If so, would you please share the details of your environment (type of VM, OS version, installation commands, etc.)? Thanks!