
Media (91)

Other articles (22)

  • Final creation of the channel

    12 March 2010, by

    Once your request has been approved, you can proceed with the actual creation of the channel. Each channel is a fully fledged site placed under your responsibility; the platform administrators have no access to it.
    Upon approval, you receive an email inviting you to create your channel.
    To do so, simply go to its address, in our example "http://votre_sous_domaine.mediaspip.net".
    At that point a password is requested; you just have to (...)

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of all the instances of the shared-hosting farm on a regular basis. Combined with a system Cron on the central site of the farm, this generates regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)

  • Configurable image and logo sizes

    9 February 2011, by

    In many places on the site, logos and images are resized to fit the slots defined by the themes. Since all of these sizes can vary from one theme to another, they can be defined directly in the theme, sparing the user from having to configure them manually after changing the appearance of their site.
    These image sizes are also available in the specific configuration of MediaSPIP Core. The maximum size of the site logo in pixels, which allows (...)

On other sites (3788)

  • (FFMPEG) avformat_write_header crashes (MSVC2013) (C++) (Qt)

    29 April 2015, by user3502626

    I just downloaded FFMPEG and now I’m trying to use it in Qt with MSVC2013 compiler.

    To understand how it works, I started reading the documentation and the API.
    According to this figure, I was trying to make a little test with libavformat.

    I did everything described in the demuxing module, then in the muxing module, but my program crashes when I call the avformat_write_header() function.

    I would like to know what I did wrong, and I would appreciate any help understanding it.

    In main():

    av_register_all();

    if(!decode())
       return;

    The decode() method:

    bool MainWindow::decode()
    {
    AVFormatContext *formatContext = NULL;
    AVPacket packet;

    /**************** muxing variables ******************/

    AVFormatContext *muxingContext = avformat_alloc_context();
    AVOutputFormat *outputFormat = NULL;
    AVIOContext *contextIO = NULL;
    AVCodec *codecEncode = avcodec_find_encoder(AV_CODEC_ID_WMAV2);
    AVStream *avStream =  NULL;
    AVCodecContext *codecContext = NULL;


    /******************* demuxing **************************/

    //open a media file
    if(avformat_open_input(&formatContext,"h.mp3",NULL,NULL)!=0)
    {
       qDebug() << "paka ouve fichier";
       return false;
    }

    //function which tries to read and decode a few frames to find missing information
    if(avformat_find_stream_info(formatContext,NULL)<0)
    {
       qDebug()<<"cannot find stream info";
       return false;
    }


    /**************** muxing *************************/

    //The oformat field must be set to select the muxer that will be used.
    muxingContext->oformat = outputFormat;

    //Unless the format is of the AVFMT_NOFILE type, the pb field must be set to
    //an opened IO context, either returned from avio_open2() or a custom one.
    if(avio_open2(&contextIO,"out.wma",AVIO_FLAG_WRITE,NULL,NULL)<0)
    {
       qDebug() <<"paka kreye fichier soti";
       return false;
    }
    muxingContext->pb = contextIO;

    //Unless the format is of the AVFMT_NOSTREAMS type, at least
    //one stream must be created with the avformat_new_stream() function.
    avStream = avformat_new_stream(muxingContext,codecEncode);

    //The caller should fill the stream codec context information,
    //such as the codec type, id and other parameters
    //(e.g. width / height, the pixel or sample format, etc.) as known

    codecContext = avStream->codec;
    codecContext->codec_type = AVMEDIA_TYPE_AUDIO;
    codecContext->codec_id = AV_CODEC_ID_WMAV2;
    codecContext->sample_fmt = codecEncode->sample_fmts[0];
    codecContext->bit_rate = 128000;
    codecContext->sample_rate = 44000;
    codecContext->channels = 2;

    //The stream timebase should be set to the timebase that the caller desires
    //to use for this stream (note that the timebase actually used by the muxer
    //can be different, as will be described later).

    avStream->time_base = formatContext->streams[0]->time_base;
    qDebug() << formatContext->streams[0]->time_base.num << "/"
             << formatContext->streams[0]->time_base.den;


    //When the muxing context is fully set up, the caller must call    
    //avformat_write_header()
    //to initialize the muxer internals and write the file header

    qDebug() << "does not crash yet";
    if(avformat_write_header(muxingContext,NULL) <0)
    {
       qDebug()<<"cannot write header";
       return false;
    }
    qDebug() << "OOps you can't see me (John Cena)";

    ///////////////////// Reading from an opened file //////////////////////////
    while(av_read_frame(formatContext,&packet)==0)
    {
       //The data is then sent to the muxer by repeatedly calling
       //av_write_frame() or av_interleaved_write_frame()
       if(av_write_frame(muxingContext,&packet)<0)
           qDebug()<<"paka write frame";
       else
           qDebug()<<"writing";
    }

    //Once all the data has been written, the caller must call
    //av_write_trailer() to flush any buffered packets and finalize
    //the output file, then close the IO context (if any) and finally
    //free the muxing context with avformat_free_context().

    if(av_write_trailer(muxingContext)!=0)
    {
       qDebug()<<"paka ekri trailer";
       return false;
    }


    return true;
    }

    The program prints the message "does not crash yet", but never "OOps you can't see me (John Cena)".

    And there is no error. I used an MP3 file as input and I would like to output it as WMA.
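
    One plausible cause, for context: outputFormat is never assigned, so muxingContext->oformat stays NULL when avformat_write_header() runs. Below is a minimal sketch (not the asker's exact code, and assuming a similarly dated FFmpeg API) where the output format is instead guessed from the "out.wma" name via avformat_alloc_output_context2():

    /* Sketch only: allocate the output context so that oformat is filled in,
       then open the IO context unless the muxer is of the AVFMT_NOFILE type. */
    AVFormatContext *muxingContext = NULL;

    if (avformat_alloc_output_context2(&muxingContext, NULL, NULL, "out.wma") < 0
        || muxingContext == NULL)
        return false;                              // no muxer matches this name

    if (!(muxingContext->oformat->flags & AVFMT_NOFILE))
    {
        if (avio_open(&muxingContext->pb, "out.wma", AVIO_FLAG_WRITE) < 0)
            return false;                          // cannot create output file
    }

    // ... create the stream and fill its codec parameters as in the question ...

    if (avformat_write_header(muxingContext, NULL) < 0)
        return false;                              // header could not be written

    Even once the header writes, packet timestamps may still need av_packet_rescale_ts() before av_write_frame() if the input and output stream time bases differ.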

  • x86: Serialize rdtsc in read_time()

    8 July 2015, by Henrik Gramner
    x86: Serialize rdtsc in read_time()

    Improves the accuracy of measurements, especially in short sections.

    To quote the Intel 64 and IA-32 Architectures Software Developer's Manual:
    "The RDTSC instruction is not a serializing instruction. It does not necessarily
    wait until all previous instructions have been executed before reading the counter.
    Similarly, subsequent instructions may begin execution before the read operation
    is performed. If software requires RDTSC to be executed only after all previous
    instructions have completed locally, it can either use RDTSCP (if the processor
    supports that instruction) or execute the sequence LFENCE;RDTSC."

    SSE2 is a requirement for lfence so only use it on SSE2-capable systems.
    Prefer lfence;rdtsc over rdtscp since rdtscp is supported on fewer systems.

    Signed-off-by: Luca Barbato <lu_zero@gentoo.org>

    • [DBH] libavutil/x86/timer.h
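
    As a rough illustration only (this is not the actual libavutil/x86/timer.h change), a serialized TSC read along the lines the commit message describes could look like this with compiler intrinsics:

    /* Sketch: wait for preceding instructions to complete locally (lfence,
       which requires SSE2), then sample the time-stamp counter. On GCC/Clang,
       x86intrin.h declares both intrinsics used here. */
    #include <stdint.h>
    #include <x86intrin.h>

    static inline uint64_t serialized_read_time(void)
    {
        _mm_lfence();      /* earlier instructions finish before the read */
        return __rdtsc();  /* then read the TSC */
    }

    On pre-SSE2 systems a plain rdtsc read would remain the fallback, as the commit notes.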
  • ffmpeg command never works in lambda function using nodejs [closed]

    4 December 2022, by Santosh swain

    I am trying to implement FFmpeg video streaming functionality similar to Instagram's countdown feature. In this code, I first get records (URLs) from the S3 bucket, split them as I need, then build the command and execute it with exec() from child_process. I am trying to store the output in a specific folder inside the Lambda function, but it is never stored. I was not sure whether Lambda allows writing files locally, so I am also trying to upload directly to the S3 bucket using the stdout parameter of exec()'s callback. Please help me with this. My question is: does Lambda allow writing content to its local folder, and if not, what is the way to do it? I am sharing my code; please guide me.

        // dependencies
    var AWS = require('aws-sdk');
    var s3 = new AWS.S3();
    var { exec } = require('child_process');
    var path = require('path')
    var AWS_ACCESS_KEY = '';
    var AWS_SECRET_ACCESS_KEY = '';
    var fs = require('fs')

    s3 = new AWS.S3({
        accessKeyId: AWS_ACCESS_KEY,
        secretAccessKey: AWS_SECRET_ACCESS_KEY
    });

    exports.handler = async function (event, context) {

        var bucket_name = "sycu-game";
        var bucketName = "sycu-test";

        //CREATE OVERLAY AND BG_VALUE PATH TO GET VALUE FROM S3
        const bgValue = (event.Records[0].bg_value).split('/');
        const overlayImage = (event.Records[0].overlay_image_url).split('/');

        var s3_bg_value = bgValue[3] + "/" + bgValue[4];
        var s3_overlay_image = overlayImage[4] + "/" + overlayImage[5] + "/" + overlayImage[6];
        const signedUrlExpireSeconds = 60 * 5;

        //RETRIEVE BG_VALUE FROM S3 AND CREATE URL FOR FFMPEG INPUT VALUE
        var bg_value_url = s3.getSignedUrl('getObject', {
            Bucket: bucket_name,
            Key: s3_bg_value,
            Expires: signedUrlExpireSeconds
        });
        bg_value_url = bg_value_url.split("?");
        bg_value_url = bg_value_url[0];

        //RETRIEVE OVERLAY IMAGE FROM S3 AND CREATE URL FOR FFMPEG INPUT VALUE
        var overlay_image_url = s3.getSignedUrl('getObject', {
            Bucket: bucket_name,
            Key: s3_overlay_image,
            Expires: signedUrlExpireSeconds
        });
        overlay_image_url = overlay_image_url.split("?");
        overlay_image_url = overlay_image_url[0];

        //MANUALLY ASSIGN VARIABLES FOR FFMPEG COMMAND
        var command,
            ExtraTimerSec = event.Records[0].timer_seconds + 5,
            TimerSec = event.Records[0].timer_seconds + 1,
            BackgroundWidth = 1080,
            BackgroundHeight = 1920,
            videoPath = (__dirname + '/tmp/' + event.Records[0].name);
        console.log("path", videoPath)
        //TEMP DIRECTORY

        var videoPath = '/media/volume-d/generatedCountdownS3/tmp/' + event.Records[0].name
        var tmpFile = fs.createWriteStream(videoPath)
        //FFMPEG COMMAND
        if (event.Records[0].bg_type == 2) {
            if (event.Records[0].is_rotate) {
                command = ' -stream_loop -1 -t ' + ExtraTimerSec + ' -i ' + bg_value_url + ' -i ' + overlay_image_url + ' -filter_complex "color=color=0x000000@0.0:s= ' + event.Records[0].resized_box_width + 'x' + event.Records[0].resized_box_height + ',drawtext=fontcolor=' + event.Records[0].time_text_color + ':fontsize=' + event.Records[0].time_text_size + ':x=' + event.Records[0].minute_x + ':y=' + event.Records[0].minute_y + ':text=\'%{eif\\:trunc(mod(((' + TimerSec + '-if(between(t, 0, 1),1,if(gte(t,' + TimerSec + '),' + TimerSec + ',t)))/60),60))\\:d\\:2}\',drawtext=fontcolor=' + event.Records[0].time_text_color + ':fontsize=' + event.Records[0].time_text_size + ':x=' + event.Records[0].second_x + ':y=' + event.Records[0].second_y + ':text=\'%{eif\\:trunc(mod(' + TimerSec + '-if(between(t, 0, 1),1,if(gte(t,' + TimerSec + '),' + TimerSec + ',t))\,60))\\:d\\:2}\'[txt]; [txt] rotate=' + event.Records[0].box_angle + '*PI/180:fillcolor=#00000000 [rotated];[0] scale=w=' + BackgroundWidth + ':h=' + BackgroundHeight + '[t];[1] scale=w=' + BackgroundWidth + ':h=' + BackgroundHeight + '[ot];[t][ot] overlay = :x=0 :y=0 [m1];[m1][rotated]overlay = :x=' + event.Records[0].flat_box_coordinate_x + ' :y=' + event.Records[0].flat_box_coordinate_x + ' [m2]" -map "[m2]" -pix_fmt yuv420p -t ' +
                    ExtraTimerSec + ' -r 24 -c:a copy ' + videoPath + "";
            }
            else {
                command = ' -stream_loop -1 -t ' + ExtraTimerSec + ' -i ' + bg_value_url + ' -i ' + overlay_image_url + ' -filter_complex "color=color=0x000000@0.0:s= ' + event.Records[0].resized_box_width + 'x' + event.Records[0].resized_box_height + ',drawtext=fontcolor=' + event.Records[0].time_text_color + ':fontsize=' + event.Records[0].time_text_size + ':x=' + event.Records[0].minute_x + ':y=' + event.Records[0].minute_y + ':text=\'%{eif\\:trunc(mod(((' + TimerSec + '-if(between(t, 0, 1),1,if(gte(t,' + TimerSec + '),' + TimerSec + ',t)))/60),60))\\:d\\:2}\',drawtext=fontcolor=' + event.Records[0].time_text_color + ':fontsize=' + event.Records[0].time_text_size + ':x=' + event.Records[0].second_x + ':y=' + event.Records[0].second_y + ':text=\'%{eif\\:trunc(mod(' + TimerSec + '-if(between(t, 0, 1),1,if(gte(t,' + TimerSec + '),' + TimerSec + ',t))\,60))\\:d\\:2}\'[txt]; [txt] rotate=' + event.Records[0].box_angle + '*PI/180:fillcolor=#00000000 [rotated];[0] scale=w=' + BackgroundWidth + ':h=' + BackgroundHeight + '[t];[1] scale=w=' + BackgroundWidth + ':h=' + BackgroundHeight + '[ot];[t][ot] overlay = :x=0 :y=0 [m1];[m1][rotated]overlay = :x=' + event.Records[0].flat_box_coordinate_x + ' :y=' + event.Records[0].flat_box_coordinate_x + ' [m2]" -map "[m2]" -pix_fmt yuv420p -t ' +
                    ExtraTimerSec + ' -r 24 -c:a copy ' + videoPath + "";
            }
        }
        var final_command = '/usr/bin/ffmpeg' + command;

        //COMMAND EXECUTED HERE

        await exec(final_command, function (err, stdout, stderr) {
            console.log("data is here")
            console.log('err:', err);
            console.log('stdout:', stdout);
            console.log('stderr:', stderr);
            const params = {
                Bucket: bucketName,
                Key: "countdown/output.mp4",
                Body: stdout,
            }
            s3.upload(params).promise().then(data => {
                console.log("data is here -->", data)
            });
        });
        var tmpFile = fs.createReadStream(videoPath)
        console.log('temp file data:', tmpFile.toString())
    };