Other articles (32)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • Contribute to translation

    13 April 2011

    You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, which allows it to spread to new linguistic communities.
    To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
    MediaSPIP is currently available in French and English (...)

On other sites (4432)

  • Colors are not correct for FFMPEG videos

    23 August 2016, by Mohammad Abu Musa

    I am writing a screen recorder. I managed to record videos and export them as WebM, but I have a color and timing issue (which I am working on).

    Regarding the colors: the video frames I get are formatted as FORMAT_I420, which I transfer to YUV buffers and encode with the following code:

    void EncoderInstance::OnGetFrame(int32_t result, pp::VideoFrame frame) {
       if (result != PP_OK)
           return;

       //const uint8_t* data = static_cast<const uint8_t*>(frame.GetDataBuffer());
       unsigned char* data = static_cast<unsigned char*>(frame.GetDataBuffer());
       pp::Size size;
       frame.GetSize(&size);
       uint32_t buffersize = frame.GetDataBufferSize();
       //frame.GetFormat() == PP_VIDEOFRAME_FORMAT_BGRA);
       //Logger::Log("Format is :%d", frame.GetFormat());
       //PP_VIDEOFRAME_FORMAT_I420

       Logger::Log("stream received is: %02X", data);

       if (is_recording_) {
           vpx_codec_iter_t iter = NULL;
           const vpx_codec_cx_pkt_t *pkt;

           int iSizeY = out_width * out_height;
           int iSizeUV = (out_width * out_height)/4;
           unsigned char* pY = data;
           unsigned char* pU = data + iSizeY; // there are width * height Y components
           unsigned char* pV = data + iSizeUV;    // skip the U components

           Logger::Log("pY: %02X", pY);
           Logger::Log("pU: %02X", pU);
           Logger::Log("pV: %02X", pV);

           // copy the pixels into our "raw input" container.
           int bytes_filled = avpicture_fill(&pic_raw, NULL, AV_PIX_FMT_YUV420P, out_width, out_height);
           pic_raw.data[0] = pY;
           pic_raw.data[1] = pU;
           pic_raw.data[2] = pU;

           Logger::Log("bytes filled: %d", bytes_filled);

           if(!bytes_filled) {
               Logger::Log("Cannot fill the raw input buffer");
               return;
           }

           // convert to i420 for vp8
           int h = sws_scale(sws, pic_raw.data, pic_raw.linesize, 0, out_height, raw.planes, raw.stride);

           if(h != out_height) {
               Logger::Log("scale failed: %d", h);
               return;
           }

           Logger::Log("Picture Raw data is: %d", pic_raw.data);

           if(vpx_codec_encode(&codec, &raw, frame_cnt, 1, flags, VPX_DL_REALTIME))
                 die_codec(&codec, "Failed to encode frame");

           while( (pkt = vpx_codec_get_cx_data(&codec, &iter)) ) {
               switch(pkt->kind) {
                   case VPX_CODEC_CX_FRAME_PKT:
                       Logger::Log("Compressed buffer is %02X:", pkt->data.frame.buf);
                       Logger::Log("Compressed buffer size %02X:", pkt->data.frame.sz);

                       glb_app_thread.message_loop().PostWork(callback_factory_.NewCallback(&EncoderInstance::write_ivf_frame_header, pkt));
                       glb_app_thread.message_loop().PostWork(callback_factory_.NewCallback(&EncoderInstance::WriteFile, pkt));
                       break;
                   default:break;
               }
           }

           frame_cnt++;
       }

       video_track_.RecycleFrame(frame);
       if (need_config_) {
           ConfigureTrack();
           need_config_ = false;
       } else {
           video_track_.GetFrame(
                   callback_factory_.NewCallbackWithOutput(
                           &EncoderInstance::OnGetFrame));
       }
    }

    Here is a screenshot of an image from the generated video. The colors are not correct; this is what I want to fix.
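
    For reference, I420 stores the full-resolution Y plane first, then the quarter-size U plane, then the quarter-size V plane, so the V pointer must skip both the Y and the U planes. A minimal sketch of that pointer arithmetic (the `I420Planes` struct and `SplitI420` helper are hypothetical names, not part of the Pepper API):

    ```cpp
    #include <cstddef>
    #include <cstdint>

    // Plane pointers into a contiguous I420 buffer.
    struct I420Planes {
        const uint8_t* y;
        const uint8_t* u;
        const uint8_t* v;
    };

    // In I420, Y is width*height bytes, and U and V are each
    // (width/2)*(height/2) = width*height/4 bytes.
    I420Planes SplitI420(const uint8_t* data, int width, int height) {
        const size_t size_y  = static_cast<size_t>(width) * height;
        const size_t size_uv = size_y / 4;   // each chroma plane is 1/4 of Y
        I420Planes p;
        p.y = data;
        p.u = data + size_y;                 // U starts right after Y
        p.v = data + size_y + size_uv;       // V starts after Y *and* U
        return p;
    }
    ```

    Mixing up these offsets (or assigning the same plane twice) shifts chroma data into the wrong channel, which shows up as globally wrong colors rather than noise.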

  • Video encoding green screen

    22 August 2016, by Mohammad Abu Musa

    I am building a screen recorder. The input stream is formatted as PP_VIDEOFRAME_FORMAT_I420 and the output as AV_PIX_FMT_YUV420P; below is the code I use to do the conversion:

    const uint8_t* data = static_cast<const uint8_t*>(frame.GetDataBuffer());
       pp::Size size;
       frame.GetSize(&size);
       uint32_t buffersize = frame.GetDataBufferSize();

       if (is_recording_) {
           vpx_codec_iter_t iter = NULL;
           const vpx_codec_cx_pkt_t *pkt;
           // copy the pixels into our "raw input" container.
           int bytes_filled = avpicture_fill(&pic_raw, data, AV_PIX_FMT_YUV420P, out_width, out_height);
           if(!bytes_filled) {
               Logger::Log("Cannot fill the raw input buffer");
               return;
           }

           if(vpx_codec_encode(&codec, &raw, frame_cnt, 1, flags, VPX_DL_REALTIME))
                 die_codec(&codec, "Failed to encode frame");

           while( (pkt = vpx_codec_get_cx_data(&codec, &iter)) ) {
               switch(pkt->kind) {
                   case VPX_CODEC_CX_FRAME_PKT:
                       glb_app_thread.message_loop().PostWork(callback_factory_.NewCallback(&EncoderInstance::write_ivf_frame_header, pkt));
                       glb_app_thread.message_loop().PostWork(callback_factory_.NewCallback(&EncoderInstance::WriteFile, pkt));
                       break;
                   default:break;
               }
           }

           frame_cnt++;

    I have three questions:

    1- Is the conversion done correctly? Do I have to investigate the image formats more? Are the data channels mapped correctly?

    2- What causes the green screen to show, and what does it mean?

    3- Is this a threading issue? That is, is the data passed and converted correctly, but the threads are racing?
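
    On question 2: a buffer that is all zeros decodes to green, because in YUV the neutral chroma value is 128, not 0. A quick sketch using the common BT.601 integer conversion formulas shows why (the `YuvToRgb` helper is a hypothetical name written for this illustration):

    ```cpp
    #include <algorithm>
    #include <cstdint>

    // Convert one YUV pixel to RGB with the widely used BT.601 integer
    // approximation, to see what a never-filled (all-zero) buffer renders as.
    void YuvToRgb(uint8_t y, uint8_t u, uint8_t v, int* r, int* g, int* b) {
        const int c = y - 16, d = u - 128, e = v - 128;
        *r = std::clamp((298 * c + 409 * e + 128) >> 8, 0, 255);
        *g = std::clamp((298 * c - 100 * d - 208 * e + 128) >> 8, 0, 255);
        *b = std::clamp((298 * c + 516 * d + 128) >> 8, 0, 255);
    }
    ```

    Feeding y = u = v = 0 through this yields roughly (0, 135, 0): red and blue clamp to zero while green stays high. So a solid green frame usually means the encoder received an empty or wrongly-offset buffer, not that the conversion math itself is off.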

  • Passing pointer to thread via message_loop

    21 August 2016, by Mohammad Abu Musa

    I am using FFMPEG to encode a video. I get a green screen for the packets, which I think means I am getting an empty buffer. I suspect I am passing the parameters wrong, and I want help getting them passed correctly.

       vpx_codec_iter_t iter = NULL;
       const vpx_codec_cx_pkt_t *pkt;
       // copy the pixels into our "raw input" container.
       int bytes_filled = avpicture_fill(&pic_raw, data, AV_PIX_FMT_YUV420P, out_width, out_height);
       if(!bytes_filled) {
           Logger::Log("Cannot fill the raw input buffer");
           return;
       }

       if(vpx_codec_encode(&codec, &raw, frame_cnt, 1, flags, VPX_DL_REALTIME))
             die_codec(&codec, "Failed to encode frame");

       while( (pkt = vpx_codec_get_cx_data(&codec, &iter)) ) {
           switch(pkt->kind) {
               case VPX_CODEC_CX_FRAME_PKT:
                   glb_app_thread.message_loop().PostWork(callback_factory_.NewCallback(&EncoderInstance::write_ivf_frame_header, pkt));
                   glb_app_thread.message_loop().PostWork(callback_factory_.NewCallback(&EncoderInstance::WriteFile, pkt));
                   break;
               default:break;
           }
       }

    For the event handler:

    void EncoderInstance::WriteFile(int32_t result, const vpx_codec_cx_pkt_t *pkt){
       lock_file_.Acquire();
       (void) fwrite(pkt->data.frame.buf, 1, pkt->data.frame.sz, outfile);
       Logger::Log("Packet written");
       lock_file_.Release();

    }

    My question is: am I passing the argument pkt correctly? Or should I put these packets into a queue and let the WriteFile function work on the queue?
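
    Worth noting: the pkt returned by vpx_codec_get_cx_data points into memory owned by the encoder, which may be reused on the next encode call, so posting the raw pointer to another thread is risky even if it sometimes appears to work. A minimal sketch of deep-copying the payload before handing it off (`OwnedPacket` and `CopyPacket` are hypothetical helper names, not libvpx API):

    ```cpp
    #include <cstdint>
    #include <cstring>
    #include <vector>

    // An owning copy of the data the writer thread actually needs.
    struct OwnedPacket {
        std::vector<uint8_t> bytes;  // deep copy of pkt->data.frame.buf
        uint64_t pts;                // presentation timestamp of the frame
    };

    // buf, sz, and pts mirror pkt->data.frame.{buf, sz, pts} as returned by
    // vpx_codec_get_cx_data for a VPX_CODEC_CX_FRAME_PKT packet.
    OwnedPacket CopyPacket(const void* buf, size_t sz, uint64_t pts) {
        OwnedPacket out;
        out.bytes.resize(sz);
        std::memcpy(out.bytes.data(), buf, sz);
        out.pts = pts;
        return out;  // safe to queue or post to another thread; the encoder
                     // can reuse its internal buffer without affecting this copy
    }
    ```

    With a copy like this, the queue approach in the question becomes straightforward: push OwnedPacket instances from the encoder callback and let the writer thread drain them at its own pace.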