
Other articles (51)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013 and it is announced here.
    The zip file provided here contains only the sources of MediaSPIP in standalone version.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • Making files available

    14 April 2011

    By default, when it is initialized, MediaSPIP does not allow visitors to download files, whether they are originals or the result of transformation or encoding; it only allows them to be viewed.
    However, it is possible and easy to give visitors access to these documents, in various forms.
    All of this is handled in the template configuration page: go to the channel's administration area and choose, in the navigation (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in standalone version.
    To get a working installation, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

On other sites (7178)

  • Fragmented MP4 - problem playing in browser

    12 juin 2019, par PookyFan

    I am trying to create fragmented MP4 from raw H264 video data so that I can play it in a web browser's player. My goal is to build a live streaming system in which a media server sends fragmented MP4 pieces to the browser. The server buffers input data from a Raspberry Pi camera, which sends video as H264 frames. It then muxes that video data and makes it available to the client. The browser plays the media data (muxed by the server and sent e.g. through a websocket) using Media Source Extensions.

    For test purposes I wrote the following pieces of code (using many examples I found on the internet):

    A C++ application using avcodec that muxes raw H264 video to fragmented MP4 and saves it to a file:

    #define READBUFSIZE 4096
    #define IOBUFSIZE 4096
    #define ERRMSGSIZE 128

    #include <cstdint>
    #include <iostream>
    #include <fstream>
    #include <string>
    #include <vector>

    extern "C"
    {
       #include <libavformat/avformat.h>
       #include <libavutil/error.h>
       #include <libavutil/opt.h>
    }

    enum NalType : uint8_t
    {
       //NALs containing stream metadata
       SEQ_PARAM_SET = 0x7,
       PIC_PARAM_SET = 0x8
    };

    std::vector<uint8_t> outputData;

    int mediaMuxCallback(void *opaque, uint8_t *buf, int bufSize)
    {
       outputData.insert(outputData.end(), buf, buf + bufSize);
       return bufSize;
    }

    std::string getAvErrorString(int errNr)
    {
       char errMsg[ERRMSGSIZE];
       av_strerror(errNr, errMsg, ERRMSGSIZE);
       return std::string(errMsg);
    }

    int main(int argc, char **argv)
    {
       if(argc < 2)
       {
           std::cout << "Missing file name" << std::endl;
           return 1;
       }

       std::fstream file(argv[1], std::ios::in | std::ios::binary);
       if(!file.is_open())
       {
           std::cout << "Couldn't open file " << argv[1] << std::endl;
           return 2;
       }

       std::vector<uint8_t> inputMediaData;
       do
       {
           char buf[READBUFSIZE];
           file.read(buf, READBUFSIZE);

           int size = file.gcount();
           if(size > 0)
               inputMediaData.insert(inputMediaData.end(), buf, buf + size);
       } while(!file.eof());
       file.close();

       //Initialize avcodec
       av_register_all();
       uint8_t *ioBuffer;
       AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
       AVCodecContext *codecCtxt = avcodec_alloc_context3(codec);
       AVCodecParserContext *parserCtxt = av_parser_init(AV_CODEC_ID_H264);
       AVOutputFormat *outputFormat = av_guess_format("mp4", nullptr, nullptr);
       AVFormatContext *formatCtxt;
       AVIOContext *ioCtxt;
       AVStream *videoStream;

       int res = avformat_alloc_output_context2(&formatCtxt, outputFormat, nullptr, nullptr);
       if(res < 0)
       {
           std::cout << "Couldn't initialize format context; the error was: " << getAvErrorString(res) << std::endl;
           return 3;
       }

       if((videoStream = avformat_new_stream( formatCtxt, avcodec_find_encoder(formatCtxt->oformat->video_codec) )) == nullptr)
       {
           std::cout << "Couldn't initialize video stream" << std::endl;
           return 4;
       }
       else if(!codec)
       {
           std::cout << "Couldn't initialize codec" << std::endl;
           return 5;
       }
       else if(codecCtxt == nullptr)
       {
           std::cout << "Couldn't initialize codec context" << std::endl;
           return 6;
       }
       else if(parserCtxt == nullptr)
       {
           std::cout << "Couldn't initialize parser context" << std::endl;
           return 7;
       }
       else if((ioBuffer = (uint8_t*)av_malloc(IOBUFSIZE)) == nullptr)
       {
           std::cout << "Couldn't allocate I/O buffer" << std::endl;
           return 8;
       }
       else if((ioCtxt = avio_alloc_context(ioBuffer, IOBUFSIZE, 1, nullptr, nullptr, mediaMuxCallback, nullptr)) == nullptr)
       {
           std::cout << "Couldn't initialize I/O context" << std::endl;
           return 9;
       }
           return 9;
       }

       //Set video stream data
       videoStream->id = formatCtxt->nb_streams - 1;
       videoStream->codec->width = 1280;
       videoStream->codec->height = 720;
       videoStream->time_base.den = 60; //FPS
       videoStream->time_base.num = 1;
       videoStream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
       formatCtxt->pb = ioCtxt;

       //Retrieve SPS and PPS for codec extdata
       const uint32_t synchMarker = 0x01000000;
       unsigned int i = 0;
       int spsStart = -1, ppsStart = -1;
       uint16_t spsSize = 0, ppsSize = 0;
       while(spsSize == 0 || ppsSize == 0)
       {
           uint32_t *curr =  (uint32_t*)(inputMediaData.data() + i);
           if(*curr == synchMarker)
           {
               unsigned int currentNalStart = i;
               i += sizeof(uint32_t);
               uint8_t nalType = inputMediaData.data()[i] & 0x1F;
               if(nalType == SEQ_PARAM_SET)
                   spsStart = currentNalStart;
               else if(nalType == PIC_PARAM_SET)
                   ppsStart = currentNalStart;

               if(spsStart >= 0 && spsSize == 0 && spsStart != i)
                   spsSize = currentNalStart - spsStart;
               else if(ppsStart >= 0 && ppsSize == 0 && ppsStart != i)
                   ppsSize = currentNalStart - ppsStart;
           }
           ++i;
       }

       videoStream->codec->extradata = inputMediaData.data() + spsStart;
       videoStream->codec->extradata_size = ppsStart + ppsSize;

       //Write main header
       AVDictionary *options = nullptr;
       av_dict_set(&options, "movflags", "frag_custom+empty_moov", 0);
       res = avformat_write_header(formatCtxt, &options);
       if(res < 0)
       {
           std::cout << "Couldn't write container main header; the error was: " << getAvErrorString(res) << std::endl;
           return 10;
       }

       //Retrieve frames from input video and wrap them in container
       int currentInputIndex = 0;
       int framesInSecond = 0;
       while(currentInputIndex < inputMediaData.size())
       {
           uint8_t *frameBuffer;
           int frameSize;
           res = av_parser_parse2(parserCtxt, codecCtxt, &frameBuffer, &frameSize, inputMediaData.data() + currentInputIndex,
               inputMediaData.size() - currentInputIndex, AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
           if(frameSize == 0) //No more frames while some data still remains (is that even possible?)
           {
               std::cout << "Some data left unparsed: " << std::to_string(inputMediaData.size() - currentInputIndex) << std::endl;
               break;
           }

           //Prepare packet with video frame to be dumped into container
           AVPacket packet;
           av_init_packet(&packet);
           packet.data = frameBuffer;
           packet.size = frameSize;
           packet.stream_index = videoStream->index;
           currentInputIndex += frameSize;

           //Write packet to the video stream
           res = av_write_frame(formatCtxt, &packet);
           if(res < 0)
           {
               std::cout << "Couldn't write packet with video frame; the error was: " << getAvErrorString(res) << std::endl;
               return 11;
           }
               return 11;
           }

           if(++framesInSecond == 60) //We want 1 segment per second
           {
               framesInSecond = 0;
               res = av_write_frame(formatCtxt, nullptr); //Flush segment
           }
       }
       res = av_write_frame(formatCtxt, nullptr); //Flush if something has been left

       //Write media data in container to file
       file.open("my_mp4.mp4", std::ios::out | std::ios::binary);
       if(!file.is_open())
       {
           std::cout &lt;&lt; "Couldn't open output file " &lt;&lt; std::endl;
           return 12;
       }

       file.write((char*)outputData.data(), outputData.size());
       if(file.fail())
       {
           std::cout &lt;&lt; "Couldn't write to file" &lt;&lt; std::endl;
           return 13;
       }

       std::cout &lt;&lt; "Media file muxed successfully" &lt;&lt; std::endl;
       return 0;
    }

    (I hardcoded a few values, such as the video dimensions and framerate, but as I said, this is just test code.)


    A simple HTML webpage using MSE to play my fragmented MP4:

       


       <video width="1280" height="720" controls="controls">
       </video>

    <script>
    var vidElement = document.querySelector('video');

    if (window.MediaSource) {
     var mediaSource = new MediaSource();
     vidElement.src = URL.createObjectURL(mediaSource);
     mediaSource.addEventListener('sourceopen', sourceOpen);
    } else {
     console.log("The Media Source Extensions API is not supported.")
    }

    function sourceOpen(e) {
     URL.revokeObjectURL(vidElement.src);
     var mime = 'video/mp4; codecs="avc1.640028"';
     var mediaSource = e.target;
     var sourceBuffer = mediaSource.addSourceBuffer(mime);
     var videoUrl = 'my_mp4.mp4';
     fetch(videoUrl)
       .then(function(response) {
         return response.arrayBuffer();
       })
       .then(function(arrayBuffer) {
         sourceBuffer.addEventListener('updateend', function(e) {
           if (!sourceBuffer.updating && mediaSource.readyState === 'open') {
             mediaSource.endOfStream();
           }
         });
         sourceBuffer.appendBuffer(arrayBuffer);
       });
    }
    </script>

    The output MP4 file generated by my C++ application can be played e.g. in MPC, but it doesn't play in any web browser I tested it with. It also doesn't have any duration (MPC keeps showing 00:00).

    To compare with the output MP4 file I got from my C++ application described above, I also used FFmpeg to create a fragmented MP4 file from the same source file with the raw H264 stream. I used the following command:

    ffmpeg -r 60 -i input.h264 -c:v copy -f mp4 -movflags empty_moov+default_base_moof+frag_keyframe test.mp4

    The file generated by FFmpeg plays correctly in every web browser I used for tests. It also has a correct duration (but it also has a trailing atom, which wouldn't be present in my live stream anyway; and since I need a live stream, it won't have any fixed duration in the first place).

    The MP4 atoms of both files look very similar (they certainly have an identical avcc section). What's interesting (though I'm not sure if it matters) is that both files use a different NAL format than the input file: the RPi camera produces the video stream in Annex-B format, while the output MP4 files contain NALs in AVCC format... or at least that appears to be the case when I compare the mdat atoms with the input H264 data.
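    For reference, the difference between the two NAL formats can be sketched in a few lines: Annex-B separates NAL units with 00 00 00 01 start codes, while AVCC prefixes each NAL unit with its length (4 bytes here, matching lengthSizeMinusOne = 3 in a typical avcC box). The following is a self-contained, illustrative conversion, not code from the question:

    ```cpp
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Convert an Annex-B H264 buffer (NAL units separated by 00 00 00 01
    // start codes) into AVCC format (each NAL unit prefixed with a 4-byte
    // big-endian length), which is what MP4 mdat atoms contain.
    std::vector<uint8_t> annexbToAvcc(const std::vector<uint8_t> &in)
    {
        std::vector<uint8_t> out;
        size_t i = 0, nalStart = 0;
        bool inNal = false;

        auto flushNal = [&](size_t end) {
            if (!inNal) return;
            uint32_t len = static_cast<uint32_t>(end - nalStart);
            out.push_back(static_cast<uint8_t>(len >> 24));
            out.push_back(static_cast<uint8_t>(len >> 16));
            out.push_back(static_cast<uint8_t>(len >> 8));
            out.push_back(static_cast<uint8_t>(len));
            out.insert(out.end(), in.begin() + nalStart, in.begin() + end);
        };

        while (i + 3 < in.size()) {
            if (in[i] == 0 && in[i+1] == 0 && in[i+2] == 0 && in[i+3] == 1) {
                flushNal(i);   // previous NAL unit ends at this start code
                i += 4;
                nalStart = i;
                inNal = true;
            } else {
                ++i;
            }
        }
        flushNal(in.size());   // last NAL unit runs to the end of the buffer
        return out;
    }
    ```

    (This only handles 4-byte start codes; real Annex-B streams may also use 3-byte 00 00 01 codes, which a robust converter would have to accept too.)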

    I assume there is some field (or a few fields) I need to set for avcodec to make it produce video stream that would be properly decoded and played by browsers players. But what field(s) do I need to set ? Or maybe problem lies somewhere else ? I ran out of ideas.


    EDIT 1:
    As suggested, I investigated the binary content of both MP4 files (the one generated by my app and the one from the FFmpeg tool) with a hex editor. What I can confirm:

    • both files have an identical avcc section (they match perfectly and are in AVCC format; I analyzed it byte by byte and there's no mistake about it)
    • both files have NALs in AVCC format (I looked closely at the mdat atoms and they don't differ between the two MP4 files)

    So I guess there's nothing wrong with the extradata creation in my code: avcodec takes care of it properly, even if I just feed it the SPS and PPS NALs. It converts them by itself, so there's no need for me to do it by hand. Still, my original problem remains.
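    For anyone comparing extradata by hand: the avcc section discussed here is an AVCDecoderConfigurationRecord as defined by ISO/IEC 14496-15. A minimal sketch of its layout (assuming a single SPS and a single PPS, and 4-byte NAL length prefixes) is shown below; this is an illustration of the record's structure, not code from the question:

    ```cpp
    #include <cstdint>
    #include <vector>

    // Build a minimal AVCDecoderConfigurationRecord ("avcC") from one SPS
    // and one PPS NAL unit (given without start codes). Field layout per
    // ISO/IEC 14496-15; lengthSizeMinusOne = 3 means 4-byte NAL lengths.
    std::vector<uint8_t> buildAvcC(const std::vector<uint8_t> &sps,
                                   const std::vector<uint8_t> &pps)
    {
        std::vector<uint8_t> rec;
        rec.push_back(1);        // configurationVersion
        rec.push_back(sps[1]);   // AVCProfileIndication (copied from SPS)
        rec.push_back(sps[2]);   // profile_compatibility
        rec.push_back(sps[3]);   // AVCLevelIndication
        rec.push_back(0xFF);     // 6 reserved bits + lengthSizeMinusOne = 3
        rec.push_back(0xE1);     // 3 reserved bits + numOfSequenceParameterSets = 1
        rec.push_back(static_cast<uint8_t>(sps.size() >> 8));
        rec.push_back(static_cast<uint8_t>(sps.size()));
        rec.insert(rec.end(), sps.begin(), sps.end());
        rec.push_back(1);        // numOfPictureParameterSets = 1
        rec.push_back(static_cast<uint8_t>(pps.size() >> 8));
        rec.push_back(static_cast<uint8_t>(pps.size()));
        rec.insert(rec.end(), pps.begin(), pps.end());
        return rec;
    }
    ```

    This matches the observation above that avcodec converts raw SPS/PPS extradata into this record by itself.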

    EDIT 2: I achieved partial success: the MP4 generated by my app now plays in Firefox. I added this line to the code (along with the rest of the stream initialization):

    videoStream->codec->time_base = videoStream->time_base;

    So now this section of my code looks like this :

    //Set video stream data
    videoStream->id = formatCtxt->nb_streams - 1;
    videoStream->codec->width = 1280;
    videoStream->codec->height = 720;
    videoStream->time_base.den = 60; //FPS
    videoStream->time_base.num = 1;
    videoStream->codec->time_base = videoStream->time_base;
    videoStream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
    formatCtxt->pb = ioCtxt;
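    One detail worth noting in the muxing loop above: packet.pts and packet.dts are never set before av_write_frame, which could plausibly contribute to the missing duration. For a constant-framerate stream the timestamp arithmetic is simple; the following self-contained sketch (illustrative only, not part of the original code) derives a pts from a frame index, the framerate, and a stream time base:

    ```cpp
    #include <cstdint>

    // Compute a pts for frame i, expressed in a stream time base of
    // tbNum/tbDen, for video captured at a constant fps. With the 1/60
    // time base and 60 fps used in the question, pts is just the frame index.
    int64_t ptsForFrame(int64_t i, int64_t fps, int64_t tbNum, int64_t tbDen)
    {
        // frame i starts at i/fps seconds; divide by the time base tbNum/tbDen
        return i * tbDen / (fps * tbNum);
    }
    ```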
  • x264 parameters configuration

    5 November 2013, by user1558688

    I need to configure x264 to reproduce the parameters below.

    The configuration was extracted with Elecard Stream Analyzer from the first keyframe received from a legacy SIP phone's H264 encoder. It is mission-critical to reproduce it in software to keep compatibility with other SIP clients.

    All the information I have is the set of parameters below.

    Is it possible at least to get close to the original ?

    Thanks a lot.

    0x00000000 H264 Sequence Parameter Set  
       profile_idc = 66 (PROFILE_IDC_Baseline)  
       constraint_set0_flag = 1  
       constraint_set1_flag = 0  
       constraint_set2_flag = 0  
       constraint_set3_flag = 0  
       constraint_set4_flag = 0  
       constraint_set5_flag = 0  
       reserved_zero_2bits = 0  
       level_idc = 12  
       seq_parameter_set_id = 4  
       log2_max_frame_num_minus4 = 4  
       pic_order_cnt_type = 2  
       num_ref_frames = 1  
       gaps_in_frame_num_value_allowed_flag = 0  
       pic_width_in_mbs_minus1 = 19 (320)  
       pic_height_in_map_units_minus1 = 14 (240)  
       frame_mbs_only_flag = 1  
       direct_8x8_inference_flag = 1  
       frame_cropping_flag = 0  
       vui_parameters_present_flag = 0  

    0x0000000C H264 Picture Parameter Set  
       pic_parameter_set_id = 4  
       seq_parameter_set_id = 4  
       entropy_coding_mode_flag = 0  
       pic_order_present_flag = 0  
       num_slice_groups_minus1 = 0  
       num_ref_idx_L0_active_minus1 = 0  
       num_ref_idx_L1_active_minus1 = 0  
       weighted_pred_flag = 0  
       weighted_bipred_idc = 0  
       pic_init_qp_minus26 = 2  
       pic_init_qs_minus26 = 0  
       chroma_qp_index_offset = 0  
       deblocking_filter_control_present_flag = 1  
       constrained_intra_pred_flag = 0  
       redundant_pic_cnt_present_flag = 0  

    0x00000014 H264 I slice #1 { frame_num = 0 }  
       first_mb_in_slice = 0  
       slice_type = 2  
       pic_parameter_set_id = 4  
       frame_num = 0  
       idr_pic_id = 4  
       dec_ref_pic_marking():  
           if(IdrPicFlag)  
               no_output_of_prior_pics_flag = 0  
               long_term_reference_flag = 0  
       slice_qp_delta = 0  
       disable_deblocking_filter_idc = 0  
       slice_alpha_c0_offset_div2 = 0  
       slice_beta_offset_div2 = 0  

    0x00000591 H264 I slice #1 { frame_num = 0 }  
       first_mb_in_slice = 126  
       slice_type = 2  
       pic_parameter_set_id = 4  
       frame_num = 0  
       idr_pic_id = 4  
       dec_ref_pic_marking():  
           if(IdrPicFlag)  
               no_output_of_prior_pics_flag = 0  
               long_term_reference_flag = 0  
       slice_qp_delta = 0  
       disable_deblocking_filter_idc = 0  
       slice_alpha_c0_offset_div2 = 0  
       slice_beta_offset_div2 = 0  

    0x00000B34 H264 I slice #1 { frame_num = 0 }  
       first_mb_in_slice = 201  
       slice_type = 2  
       pic_parameter_set_id = 4  
       frame_num = 0  
       idr_pic_id = 4  
       dec_ref_pic_marking():  
           if(IdrPicFlag)  
               no_output_of_prior_pics_flag = 0  
               long_term_reference_flag = 0  
       slice_qp_delta = 0  
       disable_deblocking_filter_idc = 0  
       slice_alpha_c0_offset_div2 = 0  
       slice_beta_offset_div2 = 0  
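    As a sanity check when matching these parameters: the SPS fields above already pin down the frame size. H264 counts in 16x16 macroblocks, so pic_width_in_mbs_minus1 = 19 and pic_height_in_map_units_minus1 = 14 (with frame_mbs_only_flag = 1 and no cropping) give exactly the 320x240 noted in the dump. A small illustrative check:

    ```cpp
    // Derive luma frame dimensions from SPS macroblock counts. This form is
    // valid when frame_mbs_only_flag = 1 and frame_cropping_flag = 0, as in
    // the SPS above; with cropping, the crop offsets would be subtracted.
    int lumaWidth(int picWidthInMbsMinus1)        { return (picWidthInMbsMinus1 + 1) * 16; }
    int lumaHeight(int picHeightInMapUnitsMinus1) { return (picHeightInMapUnitsMinus1 + 1) * 16; }
    ```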
  • More Imagick refresh

    4 October 2013, by Mikko Koppanen (Imagick, PHP stuff)

    I’ve committed quite a few changes lately, mainly removing excessive macro usage and making the code more robust. Large parts of the code were written about six years ago, and a lot of things have changed since. Among other things, I’ve probably become a lot better at C.

    Under the hood, ImagickPixelIterator went through an almost full rewrite, a lot of the internal routines have been renamed and improved, and I am happy to say that most of the (useless) macros have been removed.

    Some of the user-visible/interesting features added recently:

    Countable

    The Imagick class now implements the Countable interface, and calling count on the object returns the number of images currently in memory. For PDF files, for example, this is usually the number of pages. This is purely syntactic sugar, as the functionality was available before via the getNumberImages method. The usage of Countable is pretty simple: 021-countable.phpt.

    writeImageFile

    After tracking down (what I thought was) a bug related to writeImageFile not honouring the format set with setImageFormat, I was advised that the format actually depends on the filename. The filename is set when the image is read, so just calling setImageFormat and then writeImageFile would cause the original file format to be written to the handle.

    There is now an additional parameter for writeImageFile for setting the format during the operation. The following test demonstrates the functionality and the issue : 022-writeimagefileformat.phpt.

    Memory Management

    One of the things that pops up now and then (especially from shared hosting providers) is whether Imagick supports PHP memory limits. Before today the answer was no, and you needed to configure ImageMagick separately with reasonable limits.

    In the latest master version there is a new compile-time flag, --enable-imagick-zend-mm, which adds Zend Memory Manager support. This means that Imagick will honour the PHP memory limits, and an “Out of memory” error will be returned in case of overflow. The following test demonstrates the “usage”: 023-php-allocators.phpt.