Other articles (74)

  • Updating from version 0.1 to 0.2

    24 June 2013

    An explanation of the notable changes made when moving from MediaSPIP version 0.1 to version 0.2. What's new?
    Regarding software dependencies: the latest versions of FFmpeg (>= v1.2.1) are now used; the dependencies for Smush are installed; MediaInfo and FFprobe are installed to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favor of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • Customizing by adding your logo, banner or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present the changes to your MediaSPIP, or news about your projects, through the news section of your MediaSPIP.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorial items.
    You can customize the form used to create a news item.
    News creation form: for a document of type "news item", the fields offered by default are: publication date (customize the publication date) (...)

On other sites (14681)

  • Treating a video stream as playback with pausing

    21 January 2020, by kealist

    I am working on an application that streams multiple H.264 video streams to a video wall. I am using the libav/FFmpeg libraries to stream multiple video files at once from inside the application. The application controls playback speed, seeking, pausing, resuming and stopping; the video wall only receives UDP streams.

    I want to implement streaming such that, when the videos are paused, the same frame is sent continually, so that it looks like a video window in a paused state.

    How can I insert copies of the same H.264 frame into the stream without breaking the sending of later frames?

    My code is almost an exact port of transcoding.c from the FFmpeg examples. I am planning to retain a copy of the last frame/packet and, when paused, send that. Is this likely to work, or should I approach it a different way?

    while (true)
    {
       if (paused) {
           // USE LAST PACKET
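            // Sketch of one possible (untested) approach: keep a copy of the last
            // video packet that was written (e.g. via av_packet_clone, taken before
            // the packet is freed at the bottom of the loop), advance its pts/dts by
            // one frame duration, and write the copy with av_interleaved_write_frame
            // so the receiver keeps getting the same picture. Timestamps must keep
            // increasing, otherwise the muxer may complain about non-monotonic dts.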
       }
       else
       {
           if ((ret = ffmpeg.av_read_frame(ifmt_ctx, &packet)) < 0)
               break;
       }
       stream_index = packet.stream_index;

       type = ifmt_ctx->streams[packet.stream_index]->codec->codec_type;
       Console.WriteLine("Demuxer gave frame of stream_index " + stream_index);
       if (filter_ctx[stream_index].filter_graph != null)
       {
           Console.WriteLine("Going to reencode&filter the frame\n");
           frame = ffmpeg.av_frame_alloc();
           if (frame == null)
           {
               ret = ffmpeg.AVERROR(ffmpeg.ENOMEM);
               break;
           }

           packet.dts = ffmpeg.av_rescale_q_rnd(packet.dts,
                   ifmt_ctx->streams[stream_index]->time_base,
                   ifmt_ctx->streams[stream_index]->codec->time_base,
                   AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
           packet.pts = ffmpeg.av_rescale_q_rnd(packet.pts,
                   ifmt_ctx->streams[stream_index]->time_base,
                   ifmt_ctx->streams[stream_index]->codec->time_base,
                   AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);



           if (type == AVMediaType.AVMEDIA_TYPE_VIDEO)
           {

               ret = ffmpeg.avcodec_decode_video2(stream_ctx[packet.stream_index].dec_ctx, frame,
                   &got_frame, &packet);

           }
           else
           {
               ret = ffmpeg.avcodec_decode_audio4(stream_ctx[packet.stream_index].dec_ctx, frame,
                   &got_frame, &packet);
           }
           if (ret < 0)
           {
               ffmpeg.av_frame_free(&frame);
               Console.WriteLine("Decoding failed\n");
               break;
           }
           if (got_frame != 0)
           {
               frame->pts = ffmpeg.av_frame_get_best_effort_timestamp(frame);
               ret = filter_encode_write_frame(frame, (uint)stream_index);
               // SAVE LAST FRAME/PACKET HERE
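            // Note: `packet` is freed at the bottom of the loop (av_free_packet)
            // and `frame` is freed just below, so anything kept here for the
            // paused case needs its own reference (e.g. av_packet_clone /
            // av_frame_clone) rather than a pointer to these locals.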
               ffmpeg.av_frame_free(&frame);
               if (ret < 0)
                   goto end;
           }
           else
           {
               ffmpeg.av_frame_free(&frame);
           }
       }
       else
       {
           /* remux this frame without reencoding */
           packet.dts = ffmpeg.av_rescale_q_rnd(packet.dts,
                   ifmt_ctx->streams[stream_index]->time_base,
                   ofmt_ctx->streams[stream_index]->time_base,
                   AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
           packet.pts = ffmpeg.av_rescale_q_rnd(packet.pts,
                   ifmt_ctx->streams[stream_index]->time_base,
                   ofmt_ctx->streams[stream_index]->time_base,
                   AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
           ret = ffmpeg.av_interleaved_write_frame(ofmt_ctx, &packet);
           if (ret < 0)
               goto end;
       }
       ffmpeg.av_free_packet(&packet);
    }
  • Serving video stream in Node with ffmpeg

    30 March 2023, by Spedwards

    I have a local-only utility where the backend is Adonis.js and the frontend is Vue.js. I'm trying to get a readable stream and have it play in my frontend. I have the very basics down: the video plays, but I can't skip to anywhere else in the video; it just jumps back to where it left off and continues playing.

    I've been told that this requires a bi-directional data flow. What I was planning to do was update the frontend stream URL to add a query string with the timestamp the user (me) skips to. This would go to the backend, where I'd use ffmpeg to create a new stream of the video starting at that timestamp.

    The problem is that I've never really worked with streams before and I find all of this very confusing. I'm able to get a ReadStream of my video and serve it, but I can't write to it. I can create a WriteStream and have it start at my timestamp (I think), but I can't serve it, because I can only return a ReadStream, ReadWriteStream, or ReadableStream. ReadWriteStream sounds perfect, but I have no idea how to create one; I couldn't find anything fruitful after a few hours of searching, nor could I find any way of converting a WriteStream into a ReadStream.

    There's also the problem I alluded to: I have no idea whether my ffmpeg method actually works, since I can't serve it to test.

    My working controller method, without any of the timestamp handling, is as follows:

    public async stream({ params, response }: HttpContextContract) {
        const file = await File.find(params.id)
        if (!file) {
            return response.badRequest()
        }
        const stream = await Drive.getStream(file.path) // this creates a ReadableStream
        return response.stream(stream)
    }

    For all the ffmpeg stuff, I'm using fluent-ffmpeg, as it was the best wrapper I could find.

    This was my first attempt:

    public async stream({ params, request, response }: HttpContextContract) {
        const file = await File.find(params.id)
        if (!file) {
            return response.badRequest()
        }
        const stream = await Drive.getStream(file.path) // this creates a ReadableStream
        if (request.input('t')) {
            const timestamp = request.input('t')
            ffmpeg()
                .input(stream)
                .seekInput(timestamp)
                .output(stream)
        }
        return response.stream(stream)
    }
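
    That attempt never actually starts ffmpeg (nothing calls run() or pipe()), and it uses the same stream object as both input and output; seeking a piped input also tends to be slow or unreliable. A rough, untested sketch of the seeking piece, using a hypothetical streamFrom helper and assuming that fluent-ffmpeg's pipe() output (a PassThrough stream) can be returned the same way as the Drive stream, and that the MP4 has to be made pipe-friendly with fragmented output:

    import ffmpeg from 'fluent-ffmpeg'

    // Hypothetical helper: returns a readable stream of the file at `path`,
    // transcoded starting at `timestamp` seconds.
    function streamFrom(path: string, timestamp: number) {
        const command = ffmpeg(path) // give ffmpeg the file path rather than a pipe, so it can seek
            .seekInput(timestamp) // seek on the input before decoding
            .outputOptions(['-movflags', 'frag_keyframe+empty_moov']) // fragmented MP4 that can be written to a pipe
            .format('mp4')
        return command.pipe() // PassThrough stream that a response can read from
    }

    // e.g. in the controller:
    //   return response.stream(streamFrom(file.path, request.input('t', 0)))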


    


    How can I achieve what I want? Am I going about this the wrong way, and/or is there a better way?

    


  • Invalid too big FFmpeg Crop

    15 April 2021, by cropperteam

    I have a video cropper that works fine with every video I have tested, but a video from an S20 Ultra gives me an error in FFmpeg. I don't really know whether the video is oversized or not. My source code: https://github.com/rushidevmurari/RushiCropVideo/blob/213166834b0bc6ce83ab3f4ac494f261a57d7436/app/src/main/java/com/androworld/allinonevideoeditor/videocrop/VideoCropActivity.java

    


    I found this link, but nothing in it helped my case: Error when cropping video using FFMPEG

    


    [graph 0 input from stream 0:0 @ 0x7673807a4b40] sws_param option is deprecated and ignored
    [Parsed_crop_0 @ 0x7673807a4c80] Invalid too big or non positive size for width '3840' or height '1920'
    [Parsed_crop_0 @ 0x7673807a4c80] Failed to configure input pad on Parsed_crop_0
    Error reinitializing filters!
    Failed to inject frame into filter network: Invalid argument
    Error while processing the decoded data for stream #0:0
    Conversion failed!


    


    My command:

    I/System.out: "-y" "-ss" "0" "-t" "31" "-i" "/storage/emulated/0/Download/20210227_175547.mp4" "-strict" "experimental" "-vf" "crop=w=3837:h=2160:x=1:y=-2070" "-r" "15" "-ab" "128k" "-vcodec" "mpeg4" "-acodec" "copy" "-b:v" "2500k" "-sample_fmt" "s16" "-ss" "0" "-t" "31" "/storage/emulated/0/VEditor/VideoCroper/20210227_175547-0-13.mp4"


    


    I/mobile-ffmpeg: Loading mobile-ffmpeg.

    


    My full log: https://ghostbin.co/paste/s2qdw
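
    One hedged observation on the command above: the crop offset y=-2070 is negative, and the "Invalid too big or non positive size" message means the requested crop width/height is larger than the frame the filter actually sees (rotation metadata on some phone recordings can make the effective width/height differ from what the app measured). Since the crop filter accepts expressions, one illustrative way to keep the region inside the frame is to clamp it, for example:

       crop=w='min(in_w,3837)':h='min(in_h,2160)':x='max(0,1)':y='max(0,-2070)'

    The cleaner fix would be to clamp x, y, width and height in VideoCropActivity.java before the command is built, so that x >= 0, y >= 0, x + width <= input width and y + height <= input height.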