
Other articles (112)

  • Automatic installation script for MediaSPIP

    25 April 2011, by

    To work around installation difficulties, mainly due to server-side software dependencies, an "all-in-one" bash installation script was created to make this step easier on a server running a compatible Linux distribution.
    To use it, you need SSH access to your server and a "root" account, which makes it possible to install the dependencies. Contact your hosting provider if you do not have these.
    The documentation for using the installation script (...)

  • Requesting the creation of a channel

    12 March 2010, by

    Depending on how the platform is configured, the user may have two different methods available for requesting the creation of a channel. The first is at the time of registration; the second, after registration, by filling in a request form.
    Both methods ask for the same information and work in much the same way: the prospective user must fill in a series of form fields which first of all give the administrators information about (...)

  • Automatic backup of SPIP channels

    1 April 2010, by

    When setting up an open platform, it is important for the hosts to have reasonably regular backups available in order to guard against any potential problem.
    This task relies on two SPIP plugins: Saveauto, which performs a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which creates a zip archive of the site's important data (the documents, the elements (...)

On other sites (10978)

  • FFmpeg with Pipe - how can I periodically grab real-time frames out of live streams in C#?

    2 March 2020, by BBy

    I am new to FFmpeg and C#, and I want to grab frames from an IP camera to do image processing.

    I have made the following C# class, with which I can get a single frame from the IP camera.

    class FFmpegHandler
    {
        public Process ffmpeg = new Process();
        public Image image;

        public Image init()
        {
            ffmpeg = new Process()
            {
                StartInfo =
                {
                    FileName = @"./ffmpeg/ffmpeg.exe",
                    //Arguments = "-i http://admin:@192.168.10.1/videostream.asf -an -f image2pipe -preset ultrafast -tune zerolatency -s 320x240 pipe:1", // Hangs
                    Arguments = "-i http://admin:@192.168.10.1/videostream.asf -vframes 1 -an -f image2pipe -preset ultrafast -tune zerolatency -s 320x240 pipe:1",
                    UseShellExecute = false,
                    RedirectStandardOutput = true,
                    RedirectStandardError = true,
                    CreateNoWindow = true,
                    WorkingDirectory = "./ffmpeg/"
                }
            };

            ffmpeg.EnableRaisingEvents = true;
            ffmpeg.Start();

            var stream = ffmpeg.StandardOutput.BaseStream;
            var img = Image.FromStream(stream);
            //ffmpeg.WaitForExit();

            return img;
        }
    }

    The problem is that I want to grab the latest real-time image whenever I request one.

    If I run FFmpegHandler.init(), it takes 2 seconds to give me a delayed image.

    I have tried removing the -vframes 1 argument, but then it hangs after image = Image.FromStream(stream);.

    When I check the ffmpeg output directly, it looks like ffmpeg keeps building up the stream:

    frame=    6 fps=0.0 q=2.2 size=      25kB time=00:00:00.24 bitrate= 861.9kbits/s dup=4 drop=0 speed=0.435x    
    frame=   65 fps= 60 q=24.8 size=     140kB time=00:00:02.60 bitrate= 440.9kbits/s dup=4 drop=0 speed=2.41x    
    frame=   77 fps= 49 q=24.8 size=     161kB time=00:00:03.08 bitrate= 428.0kbits/s dup=4 drop=0 speed=1.95x    
    frame=   89 fps= 43 q=24.8 size=     182kB time=00:00:03.56 bitrate= 418.6kbits/s dup=4 drop=0 speed= 1.7x    
    frame=  102 fps= 39 q=24.8 size=     205kB time=00:00:04.08 bitrate= 410.7kbits/s dup=4 drop=0 speed=1.57x    
    frame=  116 fps= 37 q=24.8 size=     229kB time=00:00:04.64 bitrate= 404.2kbits/s dup=4 drop=0 speed=1.49x    
    frame=  128 fps= 35 q=24.8 size=     250kB time=00:00:05.12 bitrate= 399.8kbits/s dup=4 drop=0 speed=1.41x    
    frame=  142 fps= 34 q=24.8 size=     274kB time=00:00:05.68 bitrate= 395.7kbits/s dup=4 drop=0 speed=1.36x    
    frame=  156 fps= 33 q=24.8 size=     299kB time=00:00:06.24 bitrate= 392.3kbits/s dup=4 drop=0 speed=1.32x    
    frame=  169 fps= 32 q=24.8 size=     322kB time=00:00:06.76 bitrate= 389.7kbits/s dup=4 drop=0 speed=1.29x    
    frame=  182 fps= 32 q=24.8 size=     344kB time=00:00:07.28 bitrate= 387.4kbits/s dup=4 drop=0 speed=1.26x    
    frame=  195 fps= 31 q=24.8 size=     367kB time=00:00:07.80 bitrate= 385.5kbits/s dup=4 drop=0 speed=1.24x    
    frame=  208 fps= 31 q=24.8 size=     390kB time=00:00:08.32 bitrate= 383.8kbits/s dup=4 drop=0 speed=1.22x    
    frame=  221 fps= 30 q=24.8 size=     413kB time=00:00:08.84 bitrate= 382.3kbits/s dup=4 drop=0 speed=1.21x  

    How can I grab the latest frames out of this live stream? (Or is there a thread-safe way to drain the stream and only get the latest frame when I request it?)
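    One common pattern (not from the question, and sketched in Python rather than C# for brevity) is to leave ffmpeg running without -vframes 1, emit MJPEG with -f image2pipe -vcodec mjpeg, append stdout bytes to a buffer on a background thread, and extract the most recent complete JPEG on demand. A JPEG starts with the SOI marker FF D8 and ends with the EOI marker FF D9, so the latest frame can be located by scanning backwards:

    ```python
    # Hypothetical sketch: pick the most recent complete JPEG out of a growing
    # MJPEG byte stream, e.g. the stdout of
    #   ffmpeg -i <url> -f image2pipe -vcodec mjpeg pipe:1
    # A background reader thread would append chunks to `buffer` and trim it;
    # the consumer calls latest_frame() whenever it wants the newest image.

    SOI = b"\xff\xd8"  # JPEG start-of-image marker
    EOI = b"\xff\xd9"  # JPEG end-of-image marker

    def latest_frame(buffer):
        """Return the last complete JPEG in `buffer`, or None if none yet."""
        end = buffer.rfind(EOI)
        if end == -1:
            return None
        start = buffer.rfind(SOI, 0, end)
        if start == -1:
            return None
        return buffer[start:end + len(EOI)]
    ```

    Marker scanning is a heuristic (FF D8 / FF D9 are not supposed to occur inside entropy-coded data, but a robust reader should parse segment lengths); in C# the same idea applies to bytes read from ffmpeg.StandardOutput.BaseStream.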

  • Faster Real Time A.R Drone Video Streaming

    5 September 2017, by mike

    I’ve attempted ffplay tcp://192.168.1.1:5555 to stream video from the AR Drone 2.0; however, the delay is way too high.

    My second attempt was the following:

    var arDrone = require('ar-drone');
    var http    = require('http');

    console.log('Connecting png stream ...');

    var pngStream = arDrone.createClient().getPngStream();

    var lastPng;
    pngStream
     .on('error', console.log)
     .on('data', function(pngBuffer) {
       lastPng = pngBuffer;
     });

    var server = http.createServer(function(req, res) {
     if (!lastPng) {
       res.writeHead(503);
       res.end('Did not receive any png data yet.');
       return;
     }

     res.writeHead(200, {'Content-Type': 'image/png'});
     res.end(lastPng);
    });

    server.listen(8080, function() {
     console.log('Serving latest png on port 8080 ...');
    });

    This only streamed still images; I had to refresh the browser every second.

    My third attempt used this approach:

    var arDrone = require('ar-drone');
    var client = arDrone.createClient();
    require('ar-drone-png-stream')(client, { port: 8000 });

    It streamed a lot of images in a short amount of time, but the delay is still significant, and I’m looking for video rather than stills.

    Are there other approaches that would significantly lower the delay of the video stream?
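    One approach worth trying (not from the question; sketched in Python): instead of serving one PNG per request, push frames over a single long-lived HTTP response using the multipart/x-mixed-replace content type, which browsers render as a continuously updating image with no refresh. The framing of each pushed part looks like this (the boundary token "frame" is an arbitrary choice):

    ```python
    # Sketch of multipart/x-mixed-replace framing. A server sends the header
    #   Content-Type: multipart/x-mixed-replace;boundary=frame
    # once, then writes one part like this per incoming frame; the browser
    # replaces the displayed image each time a new part arrives.

    BOUNDARY = b"frame"  # arbitrary boundary token, must match the header

    def multipart_part(image_bytes, content_type=b"image/png"):
        """Wrap one encoded image as a multipart/x-mixed-replace part."""
        return (b"--" + BOUNDARY + b"\r\n"
                b"Content-Type: " + content_type + b"\r\n"
                b"Content-Length: " + str(len(image_bytes)).encode() + b"\r\n"
                b"\r\n" + image_bytes + b"\r\n")
    ```

    The same idea drops into the http.createServer handler above: keep the response open and write a part whenever pngStream delivers a new buffer, instead of ending the response after one image.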

  • How to give real timestamp information to encoded frames inside mpeg1 container

    24 August 2021, by jackey balwani

    I referred to the following link for my implementation: https://ffmpeg.org/doxygen/trunk/muxing_8c_source.html
    I am converting raw RGB data to YUV420 format through the scaling and conversion APIs available in FFmpeg and then passing the frames to the MPEG-1 encoder.
    I observe that the encoded video plays too fast. Below is the code that encodes a frame and then writes it to the output file.

    static int write_frame(AVFormatContext *fmt_ctx, AVCodecContext *c,
                           AVStream *st, AVFrame *frame)
    {
        int ret;

        // send the frame to the encoder
        ret = avcodec_send_frame(c, frame);
        if (ret < 0) {
            fprintf(stderr, "Error sending a frame to the encoder: %s\n",
                    av_err2str(ret));
            exit(1);
        }

        while (ret >= 0) {
            AVPacket pkt = { 0 };

            ret = avcodec_receive_packet(c, &pkt);
            if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
                break;
            else if (ret < 0) {
                fprintf(stderr, "Error encoding a frame: %s\n", av_err2str(ret));
                exit(1);
            }

            /* rescale output packet timestamp values from codec to stream timebase */
            av_packet_rescale_ts(&pkt, c->time_base, st->time_base);
            pkt.stream_index = st->index;

            /* Write the compressed frame to the media file. */
            log_packet(fmt_ctx, &pkt);
            ret = av_interleaved_write_frame(fmt_ctx, &pkt);
            av_packet_unref(&pkt);
            if (ret < 0) {
                fprintf(stderr, "Error while writing output packet: %s\n", av_err2str(ret));
                exit(1);
            }
        }

        return ret == AVERROR_EOF ? 1 : 0;
    }
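    As background on the conversion step mentioned above (performed by FFmpeg's swscale in the question), the per-pixel math behind an RGB-to-YUV conversion is roughly the full-range BT.601 transform below; in YUV420 the U and V planes are additionally subsampled 2x2, i.e. one U and one V sample per four pixels. A sketch in Python:

    ```python
    # Sketch of the full-range BT.601 RGB -> YUV transform (the JPEG/JFIF
    # variant) for a single pixel. Real conversions (e.g. sws_scale) use
    # fixed-point versions of these coefficients plus chroma subsampling.

    def rgb_to_yuv(r, g, b):
        """Full-range BT.601 RGB -> (Y, U, V) for one 8-bit pixel."""
        y = 0.299 * r + 0.587 * g + 0.114 * b
        u = -0.169 * r - 0.331 * g + 0.5 * b + 128   # chroma centered at 128
        v = 0.5 * r - 0.419 * g - 0.081 * b + 128
        return round(y), round(u), round(v)
    ```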

    The resulting MPEG video's total playback time is much too short, so the video plays too fast.

    So, to make the output video's duration match the input video coming from the source, I tried passing the following real-time information to the AVFrame structure before calling avcodec_send_frame():

    1. A real-time PTS value (the current time in microseconds, obtained through av_gettime()) is assigned to the AVFrame structure before calling avcodec_send_frame().

    2. pkt_duration is populated with the time difference between frames (current_PTS - previous_PTS).

    3. The call av_packet_rescale_ts(&pkt, c->time_base, st->time_base), which was used after avcodec_receive_packet(), is removed.
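    The conversion these changes rely on is av_rescale_q(), which rescales a timestamp from one timebase to another, i.e. computes ts * src_tb / dst_tb in rational arithmetic. As a sanity check of the numbers involved (assuming STREAM_FRAME_RATE is 25, as in FFmpeg's muxing.c example, so one frame is 90000/25 = 3600 ticks), a small Python model:

    ```python
    # Model of av_rescale_q(ts, src_tb, dst_tb): convert a timestamp between
    # timebases by multiplying with src_tb/dst_tb in exact rational arithmetic.
    # (FFmpeg rounds halves away from zero; Python's round() differs only at
    # exact .5 values, which does not matter for this sanity check.)

    from fractions import Fraction

    def rescale_q(ts, src_tb, dst_tb):
        """ts expressed in src_tb, returned in dst_tb (nearest integer)."""
        return round(ts * Fraction(*src_tb) / Fraction(*dst_tb))

    MICROS = (1, 1_000_000)   # av_gettime() timebase
    MPEG_TB = (1, 90_000)     # the 1:90000 stream time base from the code below

    one_second = rescale_q(1_000_000, MICROS, MPEG_TB)  # 1 s -> 90000 ticks
    one_frame = rescale_q(40_000, MICROS, MPEG_TB)      # 40 ms -> 3600 ticks
    ```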


    Below are the changes made for the real-time info (the PTS/duration computation at the top of write_frame() and the removed av_packet_rescale_ts() call):

    static int64_t m_currVideopts = 0;
    static int64_t m_prevVideopts = 0;

    static int write_frame(AVFormatContext *fmt_ctx, AVCodecContext *c, AVStream *st, AVFrame *frame)
    {
        int ret;

        int64_t pts = av_gettime();  /* current time in microseconds */
        /* duration for the first frame defaults to 90000/STREAM_FRAME_RATE = 3600,
           since there is no previous frame to take a difference against */
        int64_t duration = 90000 / STREAM_FRAME_RATE;
        /* st->time_base is the stream time base (1:90000) */
        pts = av_rescale_q(pts, (AVRational){1, 1000000}, st->time_base);
        if ((m_prevVideopts > 0LL) && (pts > m_prevVideopts)) {
            duration = pts - m_prevVideopts;
        } else if (pts < m_prevVideopts) {
            pts = m_prevVideopts + duration;
        }
        m_prevVideopts = pts;
        frame->pts = m_currVideopts;    /* AV_NOPTS_VALUE; */
        m_currVideopts += duration;
        //pFfmpegVidCtx->next_pts = m_currVideopts;
        frame->pkt_duration = duration;

        // send the frame to the encoder
        ret = avcodec_send_frame(c, frame);
        if (ret < 0) {
            fprintf(stderr, "Error sending a frame to the encoder: %s\n",
                    av_err2str(ret));
            exit(1);
        }
        ....
        // receive the packet with avcodec_receive_packet()
        ...

        // removed: av_packet_rescale_ts(&pkt, c->time_base, st->time_base);
        ret = av_interleaved_write_frame(fmt_ctx, &pkt);
        ...
    }


    With the above changes, the video does not play properly. There are a couple of issues with the total duration (sometimes it is wrong in the player), and some frames get dropped or lost during playback in VLC or other media players.

    I am unable to find the cause of these frame losses during playback. Is this the correct way of passing real-time PTS information to the encoder, or is there a mistake in the above code? Any suggestion would help me proceed further. Thanks.