Advanced search

Media (0)

No media matching your criteria is available on the site.

Other articles (39)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. To help us fix it, please provide the following information: the browser you are using, including its exact version; as precise an explanation of the problem as possible; if possible, the steps taken that resulted in the problem; and a link to the site/page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

  • List of compatible distributions

    26 April 2011

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

    Distribution   Version name           Version number
    Debian         Squeeze                6.x.x
    Debian         Wheezy                 7.x.x
    Debian         Jessie                 8.x.x
    Ubuntu         The Precise Pangolin   12.04 LTS
    Ubuntu         The Trusty Tahr        14.04

    If you want to help us improve this list, you can give us access to a machine whose distribution is not listed above, or send the necessary fixes to add (...)

  • Libraries and binaries specific to video and audio processing

    31 January 2010

    The following software and libraries are used by SPIPmotion in one way or another.
    Required binaries: FFMpeg: the main encoder, able to transcode almost every type of video and audio file into formats playable on the web (see this tutorial for its installation); Oggz-tools: inspection utilities for ogg files; Mediainfo: extracts information from most video and audio formats.
    Complementary and optional binaries: flvtool2: (...)

On other sites (7815)

  • Creating a usable H.264 video file

    4 May 2019, by Ethan McTague

    I am trying to use libavcodec to generate an mp4 video file from individual frames. Each input frame is a Qt QImage, and the output file is written using the Qt QFile class.

    I’ve done this through a VideoTarget class which opens the given ’target’ file when initialized, records frames when addFrame(image) is called, and then saves/closes the file when its destructor is called.
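
    For context, here is a minimal usage sketch of the class as described; the file name, frame size, and loop are hypothetical, and only the constructor / addFrame / destructor contract comes from the description above:

    // Hypothetical usage sketch of VideoTarget, based on the description above
    {
        VideoTarget target("output.mp4", nullptr);         // hypothetical target file
        QImage frame(1280, 720, QImage::Format_RGB888);    // matches the 1280x720 context below
        for (int i = 0; i < 120; ++i) {
            frame.fill(QColor(i * 2 % 256, 0, 0));          // arbitrary test pattern
            target.addFrame(frame);                         // encode one frame
        }
    } // destructor flushes the encoder and closes the file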

    The class has the following fields:

    AVCodec* m_codec = nullptr;
    AVCodecContext *m_context = nullptr;
    AVPacket* m_packet = nullptr;
    AVFrame* m_frame = nullptr;

    QFile m_target;

    It looks like this:

    VideoTarget::VideoTarget(QString target, QObject *parent) : QObject(parent), m_target(target)
    {
       // Find video codec
       m_codec = avcodec_find_encoder_by_name("libx264rgb");
       if (!m_codec) throw std::runtime_error("Unable to find codec.");

       // Make codec context
       m_context = avcodec_alloc_context3(m_codec);
       if (!m_context) throw std::runtime_error("Unable to allocate codec context.");

       // Make codec packet
       m_packet = av_packet_alloc();
       if (!m_packet) throw std::runtime_error("Unable to allocate packet.");

       // Configure context
       m_context->bit_rate = 400000;
       m_context->width = 1280;
       m_context->height = 720;
       m_context->time_base = (AVRational){1, 60};
       m_context->framerate = (AVRational){60, 1};
       m_context->gop_size = 10;
       m_context->max_b_frames = 1;
       m_context->pix_fmt = AV_PIX_FMT_RGB24;

       if (m_codec->id == AV_CODEC_ID_H264)
           av_opt_set(m_context->priv_data, "preset", "slow", 0);

       // Open Codec
       int ret = avcodec_open2(m_context, m_codec, nullptr);
       if (ret < 0) {
           throw std::runtime_error("Unable to open codec.");
       }

       // Open file
       if (!m_target.open(QIODevice::WriteOnly))
           throw std::runtime_error("Unable to open target file.");

       // Allocate frame
       m_frame = av_frame_alloc();
       if (!m_frame) throw std::runtime_error("Unable to allocate frame.");

       m_frame->format = m_context->pix_fmt;
       m_frame->width = m_context->width;
       m_frame->height = m_context->height;
       m_frame->pts = 0;

       ret = av_frame_get_buffer(m_frame, 24);
       if (ret < 0) throw std::runtime_error("Unable to allocate frame buffer.");
    }

    void VideoTarget::addFrame(QImage &image)
    {
       // Ensure frame data is writable
       int ret = av_frame_make_writable(m_frame);
       if (ret < 0) throw std::runtime_error("Unable to make frame writable.");

       // Prepare image
       for (int y = 0; y < m_context->height; y++) {
           for (int x = 0; x < m_context->width; x++) {
               auto pixel = image.pixelColor(x, y);
               int pos = (y * 1024 + x) * 3;
               m_frame->data[0][pos] = pixel.red();
               m_frame->data[0][pos + 1] = pixel.green();
               m_frame->data[0][pos + 2] = pixel.blue();
           }
       }

       m_frame->pts++;

       // Send the frame
       ret = avcodec_send_frame(m_context, m_frame);
       if (ret < 0) throw std::runtime_error("Unable to send AV frame.");

       while (ret >= 0) {
           ret = avcodec_receive_packet(m_context, m_packet);
           if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
               return;
           else if (ret < 0) throw std::runtime_error("Error during encoding.");

           m_target.write((const char*)m_packet->data, m_packet->size);
           av_packet_unref(m_packet);
       }
    }

    VideoTarget::~VideoTarget()
    {
       int ret = avcodec_send_frame(m_context, nullptr);
       if (ret < 0) throw std::runtime_error("Unable to send AV null frame.");

       while (ret >= 0) {
           ret = avcodec_receive_packet(m_context, m_packet);
           if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
               return;
           else if (ret < 0) throw std::runtime_error("Error during encoding.");

           m_target.write((const char*)m_packet->data, m_packet->size);
           av_packet_unref(m_packet);
       }

       // Magic number at the end of the file
       uint8_t endcode[] = { 0, 0, 1, 0xb7 };
       m_target.write((const char*)endcode, sizeof(endcode));
       m_target.close();

       // Free codec stuff
       avcodec_free_context(&m_context);
       av_frame_free(&m_frame);
       av_packet_free(&m_packet);
    }

    When used, the class seems to work, and data is written to the file, except I am unable to play back the resulting file in any application.

    My main suspects are these lines:

       // Prepare image
       for (int y = 0; y < m_context->height; y++) {
           for (int x = 0; x < m_context->width; x++) {
               auto pixel = image.pixelColor(x, y);
               int pos = (y * 1024 + x) * 3;
               m_frame->data[0][pos] = pixel.red();
               m_frame->data[0][pos + 1] = pixel.green();
               m_frame->data[0][pos + 2] = pixel.blue();
           }
       }

    The libavcodec documentation was extremely vague regarding the layout of image data, so I effectively had to guess and settle for the first thing that didn’t crash; chances are I’m writing this incorrectly. There’s also the issue of size mismatch between my pixel color data calls (giving int values) and the 24-bits-per-pixel RGB format I have selected.

    How do I tweak this code to output actual, functioning video files?
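
    For illustration, one possible shape of the copy loop that respects the frame's row stride (linesize) instead of a hard-coded width; this is only a sketch built on the variables from the code above, not a confirmed fix:

       // Sketch: copy pixels row by row, honouring the stride reported by libavcodec
       for (int y = 0; y < m_context->height; y++) {
           uint8_t *row = m_frame->data[0] + y * m_frame->linesize[0];
           for (int x = 0; x < m_context->width; x++) {
               auto pixel = image.pixelColor(x, y);
               row[x * 3]     = pixel.red();   // R
               row[x * 3 + 1] = pixel.green(); // G
               row[x * 3 + 2] = pixel.blue();  // B
           }
       }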

  • avfilter/graphparser: allow specifying filter@id as filter instance

    17 May 2017, by Muhammad Faiz
    avfilter/graphparser: allow specifying filter@id as filter instance
    

    See http://lists.ffmpeg.org/pipermail/ffmpeg-user/2017-April/035975.html
    Parsed_filter_X could remain and the user can override it with a custom one.

    Example:
    ffplay -f lavfi "nullsrc=s=640x360,
    sendcmd='1 drawtext@top reinit text=Hello ; 2 drawtext@bottom reinit text=World',
    drawtext@top=x=16:y=16:fontsize=20:fontcolor=Red:text='',
    drawtext@bottom=x=16:y=340:fontsize=16:fontcolor=Blue:text=''"

    Reviewed-by: Paul B Mahol <onemda@gmail.com>
    Signed-off-by: Muhammad Faiz <mfcc64@gmail.com>

    • [DH] doc/filters.texi
    • [DH] libavfilter/graphparser.c
  • unable to successfully close ffmpeg stream in node.js

    25 September 2022, by K. Russell Smith

    I'm trying to write a Node video app that generates frames using the canvas API (via node-canvas, the project's only npm dependency right now) and writes them to ffmpeg via a stream to generate a video:


    const { createCanvas } = require('canvas');
    const { spawn } = require('child_process');
    const fs = require('fs');
    const canvas = createCanvas(1280, 720);

    const ffmpeg = spawn('ffmpeg', [
        '-y',
        '-f', 'rawVideo',
        '-vcodec', 'rawVideo',
        '-pix_fmt', 'rgb24',
        '-s', `${ canvas.width }x${ canvas.height }`,
        '-r', '40',
        '-i', '-', '-f', 'mp4',
        '-q:v', '5',
        '-an', '-vcodec', 'mpeg4', 'output.mp4',
    ]);

    const ctx = canvas.getContext('2d');
    ctx.font = '30px Prime';
    ctx.fillStyle = 'blue';
    ctx.textAlign = 'center';
    ctx.textBaseline = 'middle';
    ctx.fillText('Hello Canvas', canvas.width / 2, canvas.height / 2);

    for (let i = 0; i < 250; ++i)
    {
        console.log(i);
        ffmpeg.stdin.write(Buffer.from(ctx.getImageData(0, 0, canvas.width, canvas.height).data));
    }
    ffmpeg.stdin.end();


    Unfortunately, when I run it, the program throws this after writing the frames:


    node:events:368
          throw er; // Unhandled 'error' event
          ^

    Error: write EPIPE
        at WriteWrap.onWriteComplete [as oncomplete] (node:internal/stream_base_commons:98:16)
    Emitted 'error' event on Socket instance at:
        at emitErrorNT (node:internal/streams/destroy:164:8)
        at emitErrorCloseNT (node:internal/streams/destroy:129:3)
        at processTicksAndRejections (node:internal/process/task_queues:83:21) {
      errno: -32,
      code: 'EPIPE',
      syscall: 'write'
    }

    Node.js v17.1.0


    What am I doing wrong?
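
    For reference, a small diagnostic sketch rather than a confirmed fix: it attaches listeners to the same ffmpeg child process spawned above, so ffmpeg's own messages and an early exit are reported instead of surfacing only as an unhandled EPIPE on stdin:

    // Diagnostic sketch (assumes the `ffmpeg` child process from the code above).
    // Forward ffmpeg's stderr so its error messages are visible:
    ffmpeg.stderr.on('data', (chunk) => process.stderr.write(chunk));
    // Report when and why ffmpeg exits:
    ffmpeg.on('close', (code, signal) => console.log(`ffmpeg exited: code=${code}, signal=${signal}`));
    // Catch write errors on stdin instead of letting them crash the process:
    ffmpeg.stdin.on('error', (err) => console.error('ffmpeg stdin error:', err.message));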
