Other articles (25)

  • The SPIPmotion queue

    28 November 2010, by

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database called spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document is to be attached automatically; objet, the type of object to which (...)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (7197)

  • Use libav to copy raw H264 stream as mp4 without codec

    11 December 2018, by cKt 1010

    Since my platform doesn't include libx264, I can't use the H264 codec in libav.
    I know this question is similar to Raw H264 frames in mpegts container using libavcodec. But trust me, I tested bob2's method and it didn't work.
    There are some problems here:

    1. How to set PTS and DTS?

    If I use the settings below, which bob2 suggested, libav prints: "Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly"

    packet.flags |= AV_PKT_FLAG_KEY;
    packet.pts = packet.dts = 0;

    So should I use the frame rate to calculate them manually? (See the sketch after this list.)

    2. How to set PPS and SPS?

    bob2 didn't show this in his code, but it seems we can't skip this step. Someone told me they should be set in extradata, which is in the AVCodecContext struct. But what is the format? Should it include the H264 start code? (Also covered in the sketch after this list.)

    3. Should we delete the 0x00 0x00 0x00 0x01 start codes one by one?

    It seems we must strip the start code from every H264 frame, but that costs time. Do we really have to do it?
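
    Here is a rough sketch of what I think questions 1 and 2 would look like (my own guess, not bob2's code; frame_index, sps_pps_buf and sps_pps_size are placeholders, while STREAM_FRAME_RATE and video_st come from my code further below):

    /* Guess at questions 1 and 2 (not bob2's code):
     * 1) timestamps: rescale a running frame counter into the stream time base,
     *    assuming a constant frame rate and no B-frames (so pts == dts);
     * 2) extradata: store SPS + PPS, each prefixed with a 0x00 0x00 0x00 0x01
     *    start code, before avformat_write_header(); I believe the mp4 muxer
     *    can build the avcC box from annex-b extradata, but I'm not sure. */
    static int64_t frame_index = 0;                 /* placeholder frame counter */
    AVRational frame_tb = (AVRational){ 1, STREAM_FRAME_RATE };

    pkt.pts      = av_rescale_q(frame_index, frame_tb, video_st->st->time_base);
    pkt.dts      = pkt.pts;                         /* no B-frames in the raw stream */
    pkt.duration = av_rescale_q(1, frame_tb, video_st->st->time_base);
    frame_index++;

    /* sps_pps_buf / sps_pps_size: placeholder annex-b buffer
     * (start code + SPS + start code + PPS), mirroring the st->codec usage below */
    video_st->st->codec->extradata = av_mallocz(sps_pps_size + AV_INPUT_BUFFER_PADDING_SIZE);
    memcpy(video_st->st->codec->extradata, sps_pps_buf, sps_pps_size);
    video_st->st->codec->extradata_size = sps_pps_size;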

    My code is a mess (I tried too many methods and am lost in it now...). I paste it below and hope it doesn't confuse you.

    Init:

    AVOutputFormat *fmt;
    AVFormatContext *oc;
    AVCodec *audio_codec = NULL, *video_codec = NULL;
    Int32 ret;

    assert(video_st != NULL);
    assert(audio_st != NULL);
    /* Initialize libavcodec, and register all codecs and formats. */
    av_register_all();
    /* allocate the output media context */
    printf("MediaSave: save file to %s", pObj->filePath);
    avformat_alloc_output_context2(&oc, NULL, NULL, pObj->filePath);
    if (!oc) {
       Vps_printf(
           "Could not deduce output format from file extension: using MPEG.");
       avformat_alloc_output_context2(&oc, NULL, "mpeg", pObj->filePath);
    }
    if (!oc)
       return SYSTEM_LINK_STATUS_EFAIL;
    pObj->oc = oc;

    fmt = oc->oformat;
    fmt->video_codec = AV_CODEC_ID_H264;
    Vps_printf("MediaSave: codec is %s", fmt->name);
    /* Add the video streams using the default format codecs
    * and initialize the codecs. */
    if ((fmt->video_codec != AV_CODEC_ID_NONE) &&
       (pObj->formate_type & MEDIA_SAVE_TYPE_VIDEO)) {
       add_stream(video_st, oc, &video_codec, fmt->video_codec);
       //open_video(oc, video_codec, video_st);
       pObj->video_st = video_st;
       pObj->video_codec = video_codec;
    }
    if ((fmt->audio_codec != AV_CODEC_ID_NONE) &&
       (pObj->formate_type & MEDIA_SAVE_TYPE_AUDIO)) {
       add_stream(audio_st, oc, &audio_codec, fmt->audio_codec);
       //open_audio(oc, audio_codec, audio_st);
       pObj->audio_codec = audio_codec;
       pObj->audio_st = audio_st;
    }

    /* open the output file, if needed */
    if (!(fmt->flags & AVFMT_NOFILE)) {
       ret = avio_open(&oc->pb, pObj->filePath, AVIO_FLAG_WRITE);
       if (ret < 0) {
           Vps_printf("Could not open '%s': %s", pObj->filePath,
                   av_err2str(ret));
           return SYSTEM_LINK_STATUS_EFAIL;
       }
    }

    Write H264:

    /* Write the stream header, if any. */
    ret = avformat_write_header(oc, NULL);

    int nOffset = 0;
    int  nPos =0;
    uint8_t sps_pps[4] = { 0x00, 0x00, 0x00, 0x01 };
    while(1)
    {
       AVFormatContext *oc = pObj->oc;
       nPos = ReadOneNaluFromBuf(&naluUnit, bitstreamBuf->bufAddr + nOffset, bitstreamBuf->fillLength - nOffset);
       if(naluUnit.type == 7 || naluUnit.type == 8) {
           Vps_printf("Get type 7 or 8, Store it to extradata");
           video_st->st->codec->extradata_size = naluUnit.size + sizeof(sps_pps);
           video_st->st->codec->extradata = OSA_memAllocSR(OSA_HEAPID_DDR_CACHED_SR1, naluUnit.size + AV_INPUT_BUFFER_PADDING_SIZE, 32U);
           memcpy(video_st->st->codec->extradata, sps_pps , sizeof(sps_pps));
           memcpy(video_st->st->codec->extradata + sizeof(sps_pps), naluUnit.data, naluUnit.size);
           break;
       }
       nOffset += nPos;
       write_video_frame(oc, video_st, naluUnit);
       if (nOffset >= bitstreamBuf->fillLength) {
           FrameCounter++;
           break;
       }
    }

    static Int32 write_video_frame(AVFormatContext *oc, OutputStream *ost,
       NaluUnit bitstreamBuf) {
    Int32 ret;
    static Int32 waitkey = 1, ptsInc = 0;

    if (0 > ost->st->index) {
       Vps_printf("Stream index less than 0");
       return SYSTEM_LINK_STATUS_EFAIL;
    }
    AVStream *pst = oc->streams[ost->st->index];

    // Init packet
    AVPacket pkt;
    av_init_packet(&pkt);
    pkt.flags |= (0 >= getVopType(bitstreamBuf.data, bitstreamBuf.size))
       ? AV_PKT_FLAG_KEY
       : 0;
    pkt.stream_index = pst->index;

    // Wait for key frame
    if (waitkey) {
       if (0 == (pkt.flags & AV_PKT_FLAG_KEY)){
           return SYSTEM_LINK_STATUS_SOK;
       }
       else {
           waitkey = 0;
           Vps_printf("First frame");
       }
    }

    pkt.pts = (ptsInc) * (90000 / STREAM_FRAME_RATE);
    pkt.dts = (ptsInc) * (90000/STREAM_FRAME_RATE);

    pkt.duration = 3000;
    pkt.pos = -1;
    ptsInc++;
    ret = av_interleaved_write_frame(oc, &pkt);
    if (ret < 0) {
       Vps_printf("cannot write frame");
    }
    return SYSTEM_LINK_STATUS_SOK;

    }
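
    One more thing: none of the snippets above finalize the file. The closing sequence I have in mind (assuming oc is the context opened in the init code) is roughly the following; without av_write_trailer() the resulting mp4 has no moov atom and will not play:

    /* Finalize the container: for mp4 this writes the moov atom. */
    av_write_trailer(oc);
    if (!(oc->oformat->flags & AVFMT_NOFILE))
       avio_closep(&oc->pb);      /* close the file opened with avio_open() */
    avformat_free_context(oc);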

  • Node js request huge amount of pictures

    5 January 2019, by Manos Koutselakis

    I am trying to request a huge number of images in Node.js (n > 3000)
    and then save them into a folder.
    I have tried using the request library and request-promise.
    The problem is that if the number of pictures is too big, some pictures do not finish downloading, leaving incomplete data, or fail with an error (an empty .jpeg file). Is there a better way of downloading huge numbers of pictures?
    Also, once all the pictures have downloaded, I need to call a function compile()
    to turn them into a video. I am using ffmpeg for that.

    Below are the 2 ways I tried doing this.

    const req = require('request');
    const request = require('request-promise');
    const fs = require('fs');
    const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path;
    const ffmpeg = require('fluent-ffmpeg');
    ffmpeg.setFfmpegPath(ffmpegPath);

    function downloadImgs(imgUrls) {
     let promises = [];
     let prom;
     for (let i = 0; i < imgUrls.length; i++) {
       imgPath = `assets/pics/st_${pad(i + 1, 3)}.jpg`

       prom = request(imgUrls[i]);
       prom.on('error', () => console.log('err'));
       prom.pipe(fs.createWriteStream(imgPath));
       promises.push(prom);
     }

     Promise.all(promises).then(() => {
       console.log('I Run');
       compilee();
     });

     //SECOND TRY-----------------------------------------
     for (let i = 0; i < imgUrls.length; i++) {
       imgUrl = imgUrls[i];
       req(imgUrl)
         .on('error', () => console.log("img error", imgUrl))
         .pipe(fs.createWriteStream(`assets/pics/st_${pad(i + 1, 3)}.jpg`)).on('close', () => {
           console.log('downloaded', x)
         })
     }
     compilee();

    //---------------------------------------------------------

     function compilee() {
       const command = ffmpeg(); // assumed: create the fluent-ffmpeg command object (not shown in the original)
       command
         .on('end', onEnd)
         // .on('progress', onProgress)
         .on('error', onError)
         .input('assets/pics/st_%03d.jpg')
         .inputFPS(5)
         .output('assets/output/pepe.mp4')
         .outputFps(30)
         .run();
     }
    }

    The error I am getting for the first one: UnhandledPromiseRejectionWarning: RequestError: Error: socket hang up

    and for the second one:
    Error: socket hang up
        at createHangUpError (_http_client.js:330:15)
        at TLSSocket.socketOnEnd (_http_client.js:433:23)
        at TLSSocket.emit (events.js:187:15)
        at endReadableNT (_stream_readable.js:1098:12)
        at process.internalTickCallback (internal/process/next_tick.js:72:19)
      code: 'ECONNRESET' }

    Also, should I download the pictures on the server or on the client if I want to send the video to the client immediately?

  • ffmpeg - Stream media file with alpha channel

    14 December 2018, by cub33

    What I would like to achieve is to stream a PNG image (with an alpha background) to localhost in order to use it as an input to watermark the main video stream. For example, I would have an ffmpeg process that streams the PNG:

    ffmpeg -loop 1 -i image_with_alpha_background.png -options options protocol://localhost:3000

    Main ffmpeg process:

    ffmpeg -i main_video -i protocol://localhost:3000 -filter_complex overlay -f flv rtmp://streaming_server

    The main ffmpeg process would listen for that PNG watermark stream and would insert it only while it is being streamed from the PNG ffmpeg process. How is this achievable? What I have tried is streaming the PNG with the image2pipe format, but it transforms the images into mjpeg. I also tried streaming .webm files, since the vp9 codec supports alpha, but when receiving the webm stream the main process doesn't like the input format. Thank you for your attention.