Other articles (13)

  • Support de tous types de médias

    10 April 2011

    Unlike many programs and other modern document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other formats (OpenOffice, Microsoft Office (spreadsheets, presentations), web (html, css), LaTeX, Google Earth) (...)

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed to extract the data needed for indexing by search engines, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
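
    The excerpt does not show how MediaSPIP performs these conversions; purely as an illustration, a server-side converter could shell out to the ffmpeg command-line tool along these lines (file names, codec choices and options below are assumptions, not MediaSPIP's actual pipeline):

    #include <cstdlib>
    #include <string>

    // Illustration only: produce the HTML5-friendly formats mentioned above
    // (WebM and Ogv) plus an MP4 for the Flash fallback by invoking ffmpeg.
    int transcode_for_web(const std::string &input)
    {
        const std::string cmds[] = {
            "ffmpeg -i " + input + " -c:v libvpx -c:a libvorbis " + input + ".webm",
            "ffmpeg -i " + input + " -c:v libtheora -c:a libvorbis " + input + ".ogv",
            "ffmpeg -i " + input + " -c:v libx264 -c:a aac " + input + ".mp4",
        };
        for (const std::string &cmd : cmds) {
            if (std::system(cmd.c_str()) != 0)   // non-zero: ffmpeg reported a failure
                return -1;
        }
        return 0;
    }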

On other sites (4913)

  • Crash in ffmpeg avcodec_free_context in one application but not in other

    5 November 2017, by geekowl

    I am building an application in C++ on Windows 10 using Microsoft Visual Studio 2012 Professional.

    I have created a wrapper library around ffmpeg (libavcodec) to encode video in H.264 format using libx264. The wrapper contains the following functions (a rough interface sketch follows the list):

    Initialize()
    Open()
    EncodeFrame()
    Close()
    Uninitialize()
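
    The question does not show the wrapper's declarations; a rough sketch of what such an interface might look like (parameter lists and member names here are assumptions) is:

    extern "C" {
    #include <libavcodec/avcodec.h>
    }

    // Sketch of the wrapper interface described above; the real signatures are
    // not shown in the question.
    class H264EncoderWrapper {
    public:
        int Initialize();                              // locate/set up the H.264 encoder (libx264)
        int Open(int width, int height, int bitRate);  // allocate and open the AVCodecContext
        int EncodeFrame(const AVFrame *frame);         // encode one raw frame
        int Close();                                   // free the codec context (where the crash occurs)
        int Uninitialize();                            // release remaining resources

    private:
        AVCodec *pCodec;
        AVCodecContext *pCodecContext;
    };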

    I created a test application to test the wrapper library. The test application works perfectly fine.

    When I use the wrapper library in my actual main application, the main application crashes in the Close() API. Inside Close(), it crashes in avcodec_free_context(). The difference between the main application and the test application is that the main application links against several additional dependent libraries that the test application does not.

    To debug the problem in the main application, I placed avcodec_free_context() inside Open(), after the context is allocated. The crash only occurs when avcodec_free_context() is placed after a certain point, as shown below.

    pCodecContext = avcodec_alloc_context3(pCodec);

    // <---- No crash here.

    pCodecContext->bit_rate = 200000;
    pCodecContext->width = 320;
    pCodecContext->height = 240;
    .
    .
    .
    if (pCodec->id == AV_CODEC_ID_H264)
    {
       av_opt_set(pCodecContext->priv_data, "preset", "slow", 0);
       av_opt_set(pCodecContext->priv_data, "tune", "zerolatency", 0);
    }

    pCodecContext->flags |= CODEC_FLAG_GLOBAL_HEADER;

    // <-- No crash here

    if (avcodec_open2(pCodecContext, pCodec, NULL) < 0)
    {
       return -1;
    }

    // <-- Crash here [avcodec_free_context(&pCodecContext);]

    What is the correct approach to identify and resolve this problem?

    Thanks in advance.

    Update

    I found out that the ffmpeg binaries had been built with gcc, and that was what was causing the crash. I rebuilt ffmpeg with Visual Studio, and that solved the problem.
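
    One way to confirm which libavcodec binary and build a process is actually loading (and therefore whether a gcc-built DLL is being mixed with MSVC-built code) is to print the library's own version and configuration strings at runtime; a minimal diagnostic sketch:

    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavutil/avutil.h>
    }
    #include <cstdio>

    // Print the version and configure line of the libavcodec that is actually
    // loaded at runtime; comparing this output between the test application and
    // the main application can reveal that they pick up different FFmpeg builds.
    int main()
    {
        std::printf("libavcodec version: %u\n", avcodec_version());
        std::printf("built with        : %s\n", avcodec_configuration());
        std::printf("libavutil license : %s\n", avutil_license());
        return 0;
    }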

  • How to correctly calculate which segments are ready to be downloaded using MPEG-DASH

    24 April 2019, by igal k

    What I’m trying to do?

    Write a simple MPEG-DASH client using the SegmentTemplate pattern to calculate which segments are ready to be downloaded for a live source.

    A screenshot taken with Chrome’s developer tools at some moment X, showing an MPD request (8af651fd747.....mpd) and the actual segments fetched in response to that request:

    Given the following MPD

    <MPD availabilityStartTime="2019-04-24T06:43:32Z" maxSegmentDuration="PT4.096S" minBufferTime="PT4.096S" minimumUpdatePeriod="PT15.835S" profiles="urn:mpeg:dash:profile:isoff-live:2011" publishTime="2019-04-24T11:14:01Z" suggestedPresentationDelay="PT11.878S" timeShiftBufferDepth="PT65.536S" type="dynamic" xmlns="urn:mpeg:dash:schema:mpd:2011">
      <Location>https://content-aaps1.uplynk.com/channel/8af651fd7473474f86a05ffb0a1c8972.mpd?rmt=wv&amp;cid=8af651fd7473474f86a05ffb0a1c8972&amp;oid=600e5c27541344a1bf3818617ad712ce&amp;prettydash=1&amp;exp=1556091088&amp;rn=4138683939&amp;tc=1&amp;ct=c&amp;sig=5fb7f0c18f3f1d2ad4fdee53c02c1e1ed904bc5e8474f4ebf886d209ff7f21c9&amp;pbs=05b6594bcf4b4728ac1094976a80194d</Location>
      <Period start="PT2826.240S">
        <AdaptationSet maxFrameRate="30" maxHeight="720" maxWidth="1280" mimeType="video/mp4" segmentAlignment="true" startWithSAP="1">
          <Representation bandwidth="2604473" codecs="avc1.64001e" frameRate="30" height="360" scanType="progressive" width="640">
            <BaseURL>https://x-default-stgec.uplynk.com/aapm/slices/8c1/600e5c27541344a1bf3818617ad712ce/8c1027496a964b049f1bd5895f8f0412/</BaseURL>
            <SegmentTemplate duration="368640" initialization="https://x-default-stgec.uplynk.com/aapm/slices/8c1/600e5c27541344a1bf3818617ad712ce/8c1027496a964b049f1bd5895f8f0412/$RepresentationID$_init.mp4?pbs=05b6594bcf4b4728ac1094976a80194d&amp;_jt=l&amp;chid=8af651fd7473474f86a05ffb0a1c8972" media="$RepresentationID$$Number%08d$.m4f?pbs=05b6594bcf4b4728ac1094976a80194d&amp;_jt=l&amp;chid=8af651fd7473474f86a05ffb0a1c8972" presentationTimeOffset="254361599" startNumber="690" timescale="90000"/>
          </Representation>
        </AdaptationSet>
      </Period>
      <UTCTiming schemeIdUri="urn:mpeg:dash:utc:http-iso:2014" value="https://content-aaps1.uplynk.com/misc/utcservertime"/>
    </MPD>

    From the screenshot I can see that the next segment request should be #3955.

    What I have tried so far

    period.end = 1556104456;
    period.start = 2826;
    availability_start_time = 1556088212;
    max_segment_duration = 4;
    time_shift_buffer_depth = 65;
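
    As a cross-check, here is the raw arithmetic these values imply, using the SegmentTemplate fields from the MPD (a sketch added for clarity, not part of the original attempt; the publishTime epoch 1556104441 is derived from the MPD’s 2019-04-24T11:14:01Z):

    #include <cmath>
    #include <cstdio>

    int main()
    {
        // Values from the MPD and the listing above
        const double timescale    = 90000.0;
        const double seg_duration = 368640.0 / timescale;     // 4.096 s per segment
        const double period_start = 254361599.0 / timescale;  // ≈ 2826.24 s (= Period@start)
        const double ast          = 1556088212.0;             // availabilityStartTime (epoch seconds)
        const double publish_time = 1556104441.0;             // publishTime (epoch seconds, derived)
        const double tsb          = 65.536;                   // timeShiftBufferDepth

        // Whole segments produced since the Period started (index of the live edge)
        double k_live = std::floor((publish_time - ast - period_start) / seg_duration);  // ≈ 3272

        // Oldest segment still inside the time-shift buffer (≈ 16 segments back)
        double k_old = k_live - std::floor(tsb / seg_duration);                          // ≈ 3256

        std::printf("seg_duration = %.3f s, k_live = %.0f, k_old = %.0f\n",
                    seg_duration, k_live, k_old);
        return 0;
    }

    These match the live_edge and oldest values reported further down.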

    First of all, I read DASH-IF-IOP v4.3, section 4.3.4.2 (page 82), and implemented the following code:

    int k1 = 1;                                   // first segment index in the Period
    int period_duration = period.end - (period.start + data_.availability_start_time);
    int k2 = ceil((float)period_duration / (float)data_.max_segment_duration);   // upper bound on the segment index
    double duration = ((float)representation.duration / (float)representation.timeScale);   // 368640 / 90000 = 4.096 s

    // newest segment index available at publish_time, capped at k2
    size_t live_edge = std::min(
        (int)floor((float)((data_.publish_time - data_.availability_start_time - period.start) / duration)), k2);

    // oldest segment index still inside the time-shift buffer, at least k1
    size_t oldest = std::max(k1, (int)floor((float)((data_.publish_time - data_.availability_start_time - period.start -
                                                     data_.time_shift_buffer_depth) /
                                                    duration)));

    After calculating everything: k1 = 1, k2 = 3355, live_edge = 3272 and oldest = 3256.
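
    One step worth spelling out between these k values and the numbers seen on the wire: k is an index relative to the Period, while the request URL uses $Number$, which is offset by SegmentTemplate@startNumber. A minimal sketch of that mapping (whether k is read as 0- or 1-based is an assumption, not something stated above):

    // Map a Period-relative index k (as computed above) to the value substituted
    // for $Number%08d$ in SegmentTemplate@media. Assumes k = 1 is the first
    // segment of the Period; if k is 0-based, drop the "- 1".
    long long to_segment_number(long long k, long long start_number /* 690 in this MPD */)
    {
        return start_number + k - 1;   // e.g. 690 + 3272 - 1 = 3961 for the live edge
    }

    That 3961 lands next to the pmax value from the FFmpeg-style calculation below, and a few segments ahead of the #3955 seen in the screenshot, which is plausibly accounted for by suggestedPresentationDelay (≈11.9 s, about 3 segments) and by the screenshot being taken at a different instant than publishTime.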

    I also tried the approach used in FFmpeg’s dashdec.c.

    For min_segment:

    if (c->is_live && pls->fragment_duration)
    {
        num = pls->first_seq_no + (((get_current_time_in_sec() - c->availability_start_time) - c->time_shift_buffer_depth) * pls->fragment_timescale) / pls->fragment_duration;
    }

    For max_segment:

    num = pls->first_seq_no + (((get_current_time_in_sec() - c->availability_start_time)) * pls->fragment_timescale) / pls->fragment_duration;

    After a small modification:

    size_t pmax = (((data_.publish_time - data_.availability_start_time))) / duration;
    size_t pmin = ((data_.publish_time - data_.availability_start_time) - data_.time_shift_buffer_depth) / duration;

    This gives pmin = 3946 and pmax = 3961.

    In the FFmpeg example, I had to manually remove the first_seq_no variable because it looked like I was adding SegmentTemplate@startNumber twice.

    Even after succeeding in this, how exactly do I build the request list Segment(NOW) ----> Segment(LIVE_EDGE)?
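
    Not an authoritative answer, but under the assumptions above (segment numbers obtained from startNumber plus the Period-relative index, and the @media pattern from the MPD), one sketch for building that request list is to expand the template for every number in the window:

    #include <cstdio>
    #include <string>
    #include <vector>

    // Expand SegmentTemplate@media for every segment from `first` to `live_edge`
    // (inclusive). The template uses $RepresentationID$ and $Number%08d$ as in the
    // MPD above; representation_id (Representation@id) and the number range are
    // inputs computed elsewhere and are assumptions here. The query string is omitted.
    std::vector<std::string> build_request_list(const std::string &base_url,
                                                const std::string &representation_id,
                                                long long first, long long live_edge)
    {
        std::vector<std::string> urls;
        for (long long n = first; n <= live_edge; ++n) {
            char number[16];
            std::snprintf(number, sizeof(number), "%08lld", n);   // $Number%08d$
            urls.push_back(base_url + representation_id + number + ".m4f");
        }
        return urls;
    }

    A live client would then typically start a few segments behind the live edge (roughly suggestedPresentationDelay) and recompute the window each time the MPD is refreshed (minimumUpdatePeriod).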