
Other articles (38)

  • Contribute to documentation

    13 April 2011

    Documentation is vital to the development of improved technical capabilities.
    MediaSPIP welcomes documentation by users as well as developers, including:
    - critique of existing features and functions
    - articles contributed by developers, administrators, content producers and editors
    - screenshots to illustrate the above
    - translations of existing documentation into other languages
    To contribute, register to the project users’ mailing (...)

  • Emballe médias: what is it for?

    4 February 2011, by

    This plugin is designed to manage sites for publishing documents of all types.
    It creates "media": a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a given "media" article;

  • Selection of projects using MediaSPIP

    2 May 2011, by

    The examples below are representative of specific uses of MediaSPIP for specific projects.
    MediaSPIP farm @ Infini
    The non-profit organization Infini develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and at the national level, among the half-dozen such associations. Its members (...)

On other sites (5483)

  • Using libav to encode RGBA frames into MP4 but the output is a mess

    5 October 2019, by Cu2S

    I’m trying to decode a video into RGB frames, post-process the frames, and finally encode them back into a video. But the output video is a complete mess:
    Screenshot from PotPlayer

    I wrote a minimal example to illustrate my idea. First, I read some information from the source video:

       AVFormatContext* inputFormatCtx = nullptr;
       int ret = avformat_open_input(&inputFormatCtx, inputParamsVideo, nullptr, nullptr);
       assert(ret >= 0);
       ret = avformat_find_stream_info(inputFormatCtx, NULL);
       av_dump_format(inputFormatCtx, 0, inputParamsVideo, 0);

       assert(ret >= 0);
       AVStream* inputVideoStream = nullptr;
       for (int i = 0; i < inputFormatCtx->nb_streams; i++)
       {
           const auto inputStream = inputFormatCtx->streams[i];
           if (inputStream->codec->codec_type == AVMEDIA_TYPE_VIDEO)
           {
               inputVideoStream = inputStream;
               break;
           }
       }

       assert(inputVideoStream != nullptr);
       AVCodecParameters* inputParams = inputVideoStream->codecpar;
       AVRational framerate = inputVideoStream->codec->framerate;
       auto gop_size = inputVideoStream->codec->gop_size;
       auto maxBFrames = inputVideoStream->codec->max_b_frames;

    Then I assign the information to the output stream:

    AVFormatContext *outputAVFormat = nullptr;
    avformat_alloc_output_context2(&outputAVFormat, nullptr, nullptr, kOutputPath);
    assert(outputAVFormat);
    AVCodec* codec = avcodec_find_encoder(outputAVFormat->oformat->video_codec);
    assert(codec);
    AVCodecContext* encodingCtx = avcodec_alloc_context3(codec);
    avcodec_parameters_to_context(encodingCtx, inputParams);
    encodingCtx->time_base = av_inv_q(framerate);
    encodingCtx->max_b_frames = maxBFrames;
    encodingCtx->gop_size = gop_size;


    if (outputAVFormat->oformat->flags & AVFMT_GLOBALHEADER)
       encodingCtx->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
    AVStream* outStream = avformat_new_stream(outputAVFormat, nullptr);
    assert(outStream != nullptr);
    ret = avcodec_parameters_from_context(outStream->codecpar, encodingCtx);
    assert(ret >= 0);
    outStream->time_base = encodingCtx->time_base;

    Then I convert the RGBA frames (which are read from files) into YUV420P via sws_scale and encode them:

       ret = avcodec_open2(encodingCtx, codec, nullptr);
       assert(ret >= 0);
       av_dump_format(outputAVFormat, 0, kOutputPath, 1);

       ret = avio_open(&outputAVFormat->pb, kOutputPath, AVIO_FLAG_WRITE);
       assert(ret >= 0);
       ret = avformat_write_header(outputAVFormat, nullptr);
       assert(ret >= 0);

       AVFrame* frame = av_frame_alloc();
       frame->width = inputParams->width;
       frame->height = inputParams->height;
       frame->format = inputParams->format;
       frame->pts = 0;
       assert(ret >= 0);

       ret = av_frame_get_buffer(frame, 32);
       int frameCount = 0;
       assert(ret >= 0);
       ret = av_frame_make_writable(frame);
       assert(ret >= 0);
       SwsContext* swsContext = sws_getContext(inputParams->width, inputParams->height,
           AV_PIX_FMT_RGBA, frame->width,
           frame->height, static_cast<AVPixelFormat>(inputParams->format),
           SWS_BILINEAR, NULL, NULL, NULL);


       for (auto inputPicPath : std::filesystem::directory_iterator(kInputDir))
       {
           int width, height, comp;
           unsigned char* data = stbi_load(inputPicPath.path().string().c_str(), &width, &height, &comp, 4);
           int srcStrides[1] = { 4 * width };
           int ret = sws_scale(swsContext, &data, srcStrides, 0, height, frame->data,
               frame->linesize);
           assert(ret >= 0);
           frame->pts = frameCount;
           //frame->pict_type = AV_PICTURE_TYPE_I;
           frameCount += 1;
           encode(encodingCtx, frame, 0, outputAVFormat);

           stbi_image_free(data);
       }

       while (encode(encodingCtx, nullptr, 0, outputAVFormat))
       {
           ;
       }

       static bool encode(AVCodecContext* enc_ctx, AVFrame* frame, std::uint32_t streamIndex,
           AVFormatContext * formatCtx)
       {
           int ret;
           int got_output = 0;
           AVPacket packet = {};
           av_init_packet(&packet);
           ret = avcodec_encode_video2(enc_ctx, &packet, frame, &got_output);
           assert(ret >= 0);
           if (got_output) {
               packet.stream_index = streamIndex;
               av_packet_rescale_ts(&packet, enc_ctx->time_base, formatCtx->streams[streamIndex]->time_base);
               ret = av_interleaved_write_frame(formatCtx, &packet);
               assert(ret >= 0);
               return true;
           }
           else {
               return false;
           }
       }
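    As a side note, avcodec_encode_video2() is deprecated in current FFmpeg releases. A minimal sketch of an equivalent helper using the send/receive API (same assumptions about enc_ctx, streamIndex and formatCtx as in the code above; illustrative only, not the code used in the question) could look like this:

       // Hypothetical send/receive version of encode(). Returns true while the encoder
       // still produced packets during this call (so the flush loop above keeps working
       // when called with frame == nullptr), false once it is fully drained.
       static bool encode(AVCodecContext* enc_ctx, AVFrame* frame, std::uint32_t streamIndex,
           AVFormatContext* formatCtx)
       {
           int ret = avcodec_send_frame(enc_ctx, frame);   // frame == nullptr enters flush mode
           assert(ret >= 0 || ret == AVERROR_EOF);

           bool wrotePacket = false;
           AVPacket* packet = av_packet_alloc();
           while ((ret = avcodec_receive_packet(enc_ctx, packet)) >= 0) {
               packet->stream_index = streamIndex;
               av_packet_rescale_ts(packet, enc_ctx->time_base,
                   formatCtx->streams[streamIndex]->time_base);
               ret = av_interleaved_write_frame(formatCtx, packet);  // writes and unrefs the packet
               assert(ret >= 0);
               wrotePacket = true;
           }
           av_packet_free(&packet);
           return wrotePacket;   // AVERROR(EAGAIN) or AVERROR_EOF ends the loop
       }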

    Finally, I cleaned everything up:

       av_write_trailer(outputAVFormat);
       sws_freeContext(swsContext);
       avcodec_free_context(&encodingCtx);
       avio_closep(&outputAVFormat->pb);
       avformat_free_context(outputAVFormat);
       av_frame_free(&frame);

    I dumped my input format and my output format:

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'H:\Me.MP4':
     Metadata:
       major_brand     : mp42
       minor_version   : 1
       compatible_brands: mp41mp42isom
       creation_time   : 2019-04-03T05:44:22.000000Z
     Duration: 00:00:06.90, start: 0.000000, bitrate: 1268 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 540x960, 1238 kb/s, 29.86 fps, 30 tbr, 600 tbn, 1200 tbc (default)
       Metadata:
         creation_time   : 2019-04-03T05:44:22.000000Z
         handler_name    : Core Media Video
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 8000 Hz, stereo, fltp, 24 kb/s (default)
       Metadata:
         creation_time   : 2019-04-03T05:44:22.000000Z
         handler_name    : Core Media Audio
    [libx264 @ 000002126F90C1C0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
    [libx264 @ 000002126F90C1C0] profile High, level 3.1, 4:2:0, 8-bit
    [libx264 @ 000002126F90C1C0] 264 - core 157 - H.264/MPEG-4 AVC codec - Copyleft 2003-2018 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=2 keyint=12 keyint_min=1 scenecut=40 intra_refresh=0 rc_lookahead=12 rc=abr mbtree=1 bitrate=1238 ratetol=1.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
    Output #0, mp4, to './output.mp4':
       Stream #0:0: Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 540x960, q=2-31, 1238 kb/s, 29.86 tbn

    Update:

    After I deleted

    encodingCtx->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;

    the output video is correct. Also, outputting to AVI works, too.
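    A likely explanation (my assumption, not something stated in the question) is that with AV_CODEC_FLAG_GLOBAL_HEADER set, the SPS/PPS extradata is only produced by avcodec_open2(), while the code above copies the codec parameters to the output stream before opening the encoder, so the MP4 muxer never sees a global header. Keeping the flag but reordering the setup would look roughly like this (a sketch, not verified against the question's full code):

       // Sketch: open the encoder first so extradata exists, then copy it to the stream.
       if (outputAVFormat->oformat->flags & AVFMT_GLOBALHEADER)
           encodingCtx->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;

       int ret = avcodec_open2(encodingCtx, codec, nullptr);     // fills encodingCtx->extradata
       assert(ret >= 0);

       AVStream* outStream = avformat_new_stream(outputAVFormat, nullptr);
       assert(outStream != nullptr);
       ret = avcodec_parameters_from_context(outStream->codecpar, encodingCtx);  // extradata copied too
       assert(ret >= 0);
       outStream->time_base = encodingCtx->time_base;

    This would also be consistent with AVI working: the AVI muxer typically does not request global headers, so the H.264 parameter sets are carried in-band instead.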

  • FFmpeg "Illegal instruction" Raspberry Pi Zero W

    20 August 2019, by ErezM

    I’ve been struggling to get FFmpeg running on the Pi Zero W. As a newbie, I can’t say where the root of the problem is, whether it’s the compilation configuration of FFmpeg or maybe the syntax of the command line itself.

    The current configuration is:

    Raspberry Pi Zero Wireless
    Set GPU memory to 256

    Running OS:

    Linux raspberrypi 4.19.66+ #1253 Thu Aug 15 11:37:30 BST 2019 armv6l GNU/Linux

    PRETTY_NAME="Raspbian GNU/Linux 10 (buster)"
    NAME="Raspbian GNU/Linux"
    VERSION_ID="10"
    VERSION="10 (buster)"
    VERSION_CODENAME=buster
    ID=raspbian
    ID_LIKE=debian

    No LSB modules are available.
    Distributor ID: Raspbian
    Description: Raspbian GNU/Linux 10 (buster)
    Release: 10
    Codename: buster

    FFmpeg version and configuration:

    ffmpeg version N-94582-gd0fa1a58da Copyright (c) 2000-2019 the FFmpeg developers
    built with gcc 8 (Raspbian 8.3.0-6+rpi1)
    configuration: --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-omx --enable-omx-rpi --enable-nonfree
    libavutil 56. 33.100 / 56. 33.100
    libavcodec 58. 55.100 / 58. 55.100
    libavformat 58. 31.101 / 58. 31.101
    libavdevice 58. 9.100 / 58. 9.100
    libavfilter 7. 58.100 / 7. 58.100
    libswscale 5. 6.100 / 5. 6.100
    libswresample 3. 6.100 / 3. 6.100
    libpostproc 55. 6.100 / 55. 6.100
    Hyper fast Audio and Video encoder
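
    Since the question suspects the build configuration, one possible direction (purely an assumption, not a verified fix) would be to rebuild FFmpeg with the target pinned to the Zero's ARMv6 core so that no ARMv7-only instructions end up in the binary, for example:

    ./configure --arch=arm --cpu=arm1176jzf-s --extra-cflags="-march=armv6 -mfpu=vfp -mfloat-abi=hard" --enable-gpl --enable-libx264 --enable-omx --enable-omx-rpi --enable-nonfree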

    Command to copy a 30-second clip from the 10-second mark:

    sudo ffmpeg -ss 00:00:10 -t 00:00:30 -i GOPR2546.MP4 -c:v libx264 -c:a copy out.MP4

    The FFmpeg output:

    ffmpeg version N-94582-gd0fa1a58da Copyright (c) 2000-2019 the FFmpeg developers
    built with gcc 8 (Raspbian 8.3.0-6+rpi1)
    configuration: --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-omx --enable-omx-rpi --enable-nonfree
    libavutil 56. 33.100 / 56. 33.100
    libavcodec 58. 55.100 / 58. 55.100
    libavformat 58. 31.101 / 58. 31.101
    libavdevice 58. 9.100 / 58. 9.100
    libavfilter 7. 58.100 / 7. 58.100
    libswscale 5. 6.100 / 5. 6.100
    libswresample 3. 6.100 / 3. 6.100
    libpostproc 55. 6.100 / 55. 6.100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x210d360] Using non-standard frame rate 59/1
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'GOPR2546.MP4':
    Metadata:
    major_brand : mp41
    minor_version : 538120216
    compatible_brands: mp41
    creation_time : 2019-08-20T11:09:41.000000Z
    firmware : HD7.01.01.80.00
    Duration: 00:01:00.84, start: 0.000000, bitrate: 45293 kb/s
    Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 45043 kb/s, 59.94 fps, 59.94 tbr, 60k tbn, 119.88 tbc (default)
    Metadata:
    creation_time : 2019-08-20T11:09:41.000000Z
    handler_name : GoPro AVC
    encoder : GoPro AVC encoder
    timecode : 11:20:21:32
    Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 189 kb/s (default)
    Metadata:
    creation_time : 2019-08-20T11:09:41.000000Z
    handler_name : GoPro AAC
    timecode : 11:20:21:32
    Stream #0:2(eng): Data: none (tmcd / 0x64636D74) (default)
    Metadata:
    creation_time : 2019-08-20T11:09:41.000000Z
    handler_name : GoPro TCD
    timecode : 11:20:21:32
    Stream #0:3(eng): Data: bin_data (gpmd / 0x646D7067), 41 kb/s (default)
    Metadata:
    creation_time : 2019-08-20T11:09:41.000000Z
    handler_name : GoPro MET
    Stream #0:4(eng): Data: none (fdsc / 0x63736466), 13 kb/s (default)
    Metadata:
    creation_time : 2019-08-20T11:09:41.000000Z
    handler_name : GoPro SOS
    Stream mapping:
    Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
    Stream #0:1 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    Illegal instruction

    The result is an empty out.MP4 file... I’ve tried different configurations but never got it to work.

    The one-minute input file I’m using for testing can be downloaded from https://dl.dropboxusercontent.com/s/dxd285zrz6d57da/GOPR2546.MP4

    Thank you all in advance !!

  • Media player gets stuck in the middle of a buffered range on Chrome

    29 September 2019, by Feng Yu

    WHAT IS MY PROBLEM?

    My website’s live-streaming player uses hls.js. According to my server stats, there are many cases where the player gets stuck in the middle of a buffered range.

    Here is my server’s raw stat log (some useless params removed):

    tm=2019-09-27 12:04:41`bufferLevel=8.447303999999974`currentTime=158.4`buffered=[6.024,166.832]`readyState=4`ua=Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.116 Safari/537.36 QBCore/3.53.1153.400 QQBrowser/9.0.2524.400 Tencent AppMarket/4.8 GameCenter

    currentTime is obtained from HTMLMediaElement.currentTime and buffered from HTMLMediaElement.buffered:

    currentTime=158.4
    buffered=[6.024,166.832]
    readyState=4

    From the W3C:

    If HTMLMediaElement.buffered contains a TimeRange that includes the current playback position and enough data to ensure uninterrupted playback:

    1. Set the HTMLMediaElement.readyState attribute to HAVE_ENOUGH_DATA.
    2. Playback may resume at this point if it was previously suspended by a transition to HAVE_CURRENT_DATA.

    In this case, 613.3 is in the middle of [469.277, 677.612]; the video should be progressing, but it is not.

    Hls.js periodically checks every 100 ms whether currentTime has progressed. If currentTime has not progressed for 1000 ms, hls.js triggers a STALL event and I send a stall stat to the server.

    I cannot reproduce this problem on my side; it only appears in my server stats.

    WHAT I’VE TRIED

    Shaka Player has a module that detects this case (https://www.ellealcatrase.eu/player2/docs/api/lib_media_stall_detector.js.html); its comment says:

    Some platforms/browsers can get stuck in the middle of a
    buffered range (e.g. when seeking in a background tab). Detect when
    we get stuck so that the player can respond.

    but I cannot reproduce it when my browser is in a background tab.