Other articles (73)

  • Updating from version 0.1 to 0.2

    24 June 2013, by

    An explanation of the notable changes when moving MediaSPIP from version 0.1 to version 0.3. What is new?
    Regarding software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • Customising by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present changes to your MediaSPIP, or news about your projects, through the news section of your MediaSPIP.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the form used to create a news item.
    News item creation form: for a document of the news type, the default fields are: publication date (customise the publication date) (...)

On other sites (14566)

  • How do I configure ffmpeg & openh264 so that the video file can be opened in Windows Media Player 12

    10 March 2017, by Sacha Guyer

    I have successfully created h264/mp4 movie files with ffmpeg and the x264 library.

    Now I would like to switch the h264 library from x264 to openH264. I was able to replace the x264 library with openH264, recompile ffmpeg, and produce movie files without changing my sources that produce the movie. The resulting movie opens fine in QuickTime on Mac, but Windows Media Player 12 on Windows cannot play it.

    The documentation about Windows Media Player's h264 support is unclear. The page "File types supported by Windows Media Player" states in its table that Windows Media Player 12 supports mp4, but the text below it says:

    Windows Media Player does not support the playback of the .mp4 file format.

    From what I have observed, Windows Media Player 12 IS capable of playing h264/mp4 files, but only when created with x264.

    Does anyone know how I need to adjust the configuration of the codec/context so that the movie plays in Windows Media Player? Does Windows Media Player only support certain h264 profiles?

    I noticed the warning:

    [libopenh264 @ 0x...] [OpenH264] this = 0x..., Warning:bEnableFrameSkip = 0,bitrate can’t be controlled for RC_QUALITY_MODE,RC_BITRATE_MODE and RC_TIMESTAMP_MODE without enabling skip frame

    With the configuration:

    av_dict_set(&options, "allow_skip_frames", "1", 0);

    I could get rid of this warning, but the movie still does not play. Are there other options that need to be set so that the movie plays in Windows Media Player?

    Thank you for your help

    ffprobe output of the file that does play fine in Windows Media Player:

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'test_x264.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       title           : retina
       encoder         : Lavf57.56.100
       comment         : Creation Date: 2017-03-10 07:47:39.601
     Duration: 00:00:04.17, start: 0.000000, bitrate: 17497 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661),
         yuv420p, 852x754, 17495 kb/s, 24 fps, 24 tbr, 24k tbn, 48 tbc (default)
       Metadata:
         handler_name    : VideoHandler

    ffprobe output of the file that does not play in Windows Media Player:

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'test_openh264.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       title           : retina
       encoder         : Lavf57.56.100
       comment         : Creation Date: 2017-03-10 07:49:27.024
     Duration: 00:00:04.17, start: 0.000000, bitrate: 17781 kb/s
       Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661),
         yuv420p, 852x754, 17779 kb/s, 24 fps, 24 tbr, 24k tbn, 48k tbc (default)
       Metadata:
         handler_name    : VideoHandler
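
    One difference that stands out in the two ffprobe outputs above is the H.264 profile: the x264 file is High profile, while the openH264 file is Constrained Baseline. As a sketch rather than a confirmed fix, one thing to try is requesting a different profile through the libopenh264 wrapper's private options; whether a given FFmpeg/openH264 build accepts anything beyond constrained baseline is version dependent. The helper name, stream parameters and the "high" value below are illustrative only:

    #include <libavcodec/avcodec.h>
    #include <libavutil/dict.h>

    /* Hedged sketch: open the libopenh264 encoder while requesting a profile via
     * its private options.  Dimensions/time base are placeholders taken from the
     * ffprobe output above; "high" is speculative and may be rejected. */
    static AVCodecContext *open_openh264(void)
    {
        AVCodec *enc = avcodec_find_encoder_by_name("libopenh264");
        if (!enc)
            return NULL;

        AVCodecContext *ctx = avcodec_alloc_context3(enc);
        if (!ctx)
            return NULL;

        ctx->width     = 852;
        ctx->height    = 754;
        ctx->pix_fmt   = AV_PIX_FMT_YUV420P;
        ctx->time_base = (AVRational){ 1, 24 };

        AVDictionary *opts = NULL;
        av_dict_set(&opts, "allow_skip_frames", "1", 0);  /* silences the rate-control warning */
        av_dict_set(&opts, "profile", "high", 0);         /* "main" or "constrained_baseline" may
                                                             be all the wrapper/encoder supports */
        int ret = avcodec_open2(ctx, enc, &opts);
        av_dict_free(&opts);
        if (ret < 0) {
            avcodec_free_context(&ctx);
            return NULL;
        }
        return ctx;
    }

    If the wrapper rejects the profile value, then Constrained Baseline is simply what this openH264 build produces, and the open question remains whether Windows Media Player 12 is limited to certain h264 profiles.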

  • ffmpeg crashes on crossfades between 3 clips if 2 clips come from the same input file [closed]

    14 April 2020, by Erik

    I observed that ffmpeg 4.2.2 (macOS) crashes in particular cases of crossfades between clips, when one clip comes from file 1.dv and two clips are cut out of file 2.dv, as shown below:

    ffmpeg -f lavfi -i color=black:size=720x576:duration=11:rate=25 -i 1.dv -i 2.dv -filter_complex "\
    [1:v]trim=5:10,setpts=expr=PTS-STARTPTS,yadif,fade=alpha=1:d=2:st=3:type=out,setpts=expr=PTS-STARTPTS,fifo[s5];\
    [2:v]split=2[s7][s8];\
    [s7]trim=5:10,setpts=expr=PTS-STARTPTS,yadif,fade=alpha=1:d=2:type=in,fade=alpha=1:d=2:st=6:type=out,setpts=expr=PTS-STARTPTS+(3/TB),fifo[s15];\
    [s8]trim=12:17,setpts=expr=PTS-STARTPTS,yadif,fade=alpha=1:d=2:type=in,setpts=expr=PTS-STARTPTS+(6/TB),fifo[s22];\
    [0:v][s5]overlay=eof_action=repeat[s6];\
    [s6][s15]overlay=eof_action=repeat[s16];\
    [s16][s22]overlay=eof_action=repeat[s24];\
    [1:a]atrim=5:10,asetpts=expr=PTS-STARTPTS[s26];\
    [2:a]asplit=2[s27][s28];\
    [s27]atrim=5:10,asetpts=expr=PTS-STARTPTS[s30];\
    [s28]atrim=12:17,asetpts=expr=PTS-STARTPTS[s33];\
    [s26][s30]acrossfade=d=2[s31];\
    [s31][s33]acrossfade=d=2[s36]" \
     -map "[s24]" -map "[s36]" -ab 128k -acodec aac -crf 23 -movflags faststart -preset medium -tune film -vcodec libx264 -aspect 1024:576 out.mp4 -y

    The order makes a difference: if the two clips from 2.dv are used first and the clip from 1.dv is appended afterwards, everything works fine. The same is true if all clips come from different files.

    ffmpeg 3.4.6 (Ubuntu 18.04) shows no issues in any case.

    A self-compiled ffmpeg version N-97322-gb1699f4 (commit 2020-04-13) works with short clips as above, but crashes if one of the two clips taken from 2.dv gets longer. In my tests, 1500 frames (64 sec) is OK, while 1700 frames (68 sec) leads to a segmentation fault. That is, the crash appears if, in the command line above, you replace:

    • [s7]trim=5:10... -> [s7]trim=0:68 and accordingly
    • [s27]atrim=5:10... -> [s27]atrim=0:68

    Interestingly, the length of the clip taken from 1.dv does not play a role.

    The ffmpeg output shows the following line about 20 times:

    frame=    0 fps=0.0 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x  

    before it continues (seg fault case):

    frame=    4 fps=0.3 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    
frame=   24 fps=1.6 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    
frame=   25 fps=1.5 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    
frame=   34 fps=1.9 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    
frame=   36 fps=2.0 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    
frame=   39 fps=2.1 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    
frame=   40 fps=2.0 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    
frame=   40 fps=2.0 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    
frame=   40 fps=1.9 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    

    success case:

    frame=    5 fps=0.4 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    
frame=   42 fps=3.2 q=28.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    
frame=   53 fps=3.9 q=28.0 size=       0kB time=00:00:00.36 bitrate=   2.9kbits/s speed=0.0264x    
frame=   65 fps=4.6 q=28.0 size=       0kB time=00:00:00.84 bitrate=   1.2kbits/s speed=0.0594x 
...   

    Slightly older versions included in the newest macOS builds from zeranoe.com (git-2020-04-13-59e3a9a) and evermeet.cx (N-97308-g14dd0a9057-tessus, from 2020-04-12) work nicely, also on my production cases (longer clips).

    Any feedback would be appreciated!

  • How to extract frames at 30 fps using FFMPEG APIs on Android?

    8 September 2016, by Amber Beriwal

    We are working on a project that uses the FFMPEG library for video frame extraction on the Android platform.

    On Windows, we have observed:

    • Using the CLI, ffmpeg is capable of extracting frames at 30 fps using the command ffmpeg -i input.flv -vf fps=1 out%d.png.
    • Using Xuggler, we are able to extract frames at 30 fps.
    • Using FFMPEG APIs directly in code, we are getting frames at 30 fps.

    But when we use the FFMPEG APIs directly on Android (see Hardware Details), we get the following results:

    • 720p video (1280 x 720) - 16 fps (approx. 60 ms/frame)
    • 1080p video (1920 x 1080) - 7 fps (approx. 140 ms/frame)

    We haven’t tested Xuggler/CLI on Android yet.

    Ideally, we should be able to get the data in constant time (approx. 30 ms/frame).

    How can we get 30 fps on Android?

    Code being used on Android:

    if (avformat_open_input(&pFormatCtx, pcVideoFile, NULL, NULL)) {
       iError = -1;  //Couldn't open file
    }

    if (!iError) {
       //Retrieve stream information
       if (avformat_find_stream_info(pFormatCtx, NULL) < 0)
           iError = -2; //Couldn't find stream information
    }

    //Find the first video stream
    if (!iError) {

       for (i = 0; i < pFormatCtx->nb_streams; i++) {
           if (AVMEDIA_TYPE_VIDEO
                   == pFormatCtx->streams[i]->codec->codec_type) {
               iFramesInVideo = pFormatCtx->streams[i]->nb_index_entries;
               duration = pFormatCtx->streams[i]->duration;
               begin = pFormatCtx->streams[i]->start_time;
               time_base = (pFormatCtx->streams[i]->time_base.num * 1.0f)
                       / pFormatCtx->streams[i]->time_base.den;

               pCodecCtx = avcodec_alloc_context3(NULL);
               if (!pCodecCtx) {
                   iError = -6;
                   break;
               }

               AVCodecParameters params = { 0 };
               iReturn = avcodec_parameters_from_context(&params,
                       pFormatCtx->streams[i]->codec);
               if (iReturn < 0) {
                   iError = -7;
                   break;
               }

               iReturn = avcodec_parameters_to_context(pCodecCtx, &params);
               if (iReturn < 0) {
                   iError = -7;
                   break;
               }

               //pCodecCtx = pFormatCtx->streams[i]->codec;

               iVideoStreamIndex = i;
               break;
           }
       }
    }

    if (!iError) {
       if (iVideoStreamIndex == -1) {
           iError = -3; // Didn't find a video stream
       }
    }

    if (!iError) {
       // Find the decoder for the video stream
       pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
       if (pCodec == NULL) {
           iError = -4;
       }
    }

    if (!iError) {
       // Open codec
       if (avcodec_open2(pCodecCtx, pCodec, NULL) < 0)
           iError = -5;
    }

    if (!iError) {
       iNumBytes = av_image_get_buffer_size(AV_PIX_FMT_RGB24, pCodecCtx->width,
               pCodecCtx->height, 1);

       // initialize SWS context for software scaling
       sws_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height,
               pCodecCtx->pix_fmt, pCodecCtx->width, pCodecCtx->height,
               AV_PIX_FMT_RGB24,
               SWS_BILINEAR,
               NULL,
               NULL,
               NULL);
       if (!sws_ctx) {
           iError = -7;
       }
    }
    clock_gettime(CLOCK_MONOTONIC_RAW, &end);
    delta_us = (end.tv_sec - start.tv_sec) * 1000000
           + (end.tv_nsec - start.tv_nsec) / 1000;
    start = end;
    //LOGI("Starting_Frame_Extraction: %lld", delta_us);
    if (!iError) {
       while (av_read_frame(pFormatCtx, &packet) == 0) {
           // Is this a packet from the video stream?
           if (packet.stream_index == iVideoStreamIndex) {
               pFrame = av_frame_alloc();
               if (NULL == pFrame) {
                   iError = -8;
                   break;
               }

               // Decode video frame
               avcodec_decode_video2(pCodecCtx, pFrame, &iFrameFinished,
                       &packet);
               if (iFrameFinished) {
                   //OUR CODE
               }
               av_frame_free(&pFrame);
               pFrame = NULL;
           }
           av_packet_unref(&packet);
       }
    }
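
    For reference, two things in the snippet above commonly limit pure decode throughput: the decoder runs single-threaded unless thread_count/thread_type are set before avcodec_open2(), and an AVFrame is allocated and freed for every packet instead of being reused. Below is a minimal, self-contained sketch of the same extraction loop with those two changes, assuming FFmpeg 3.1+ (avcodec_send_packet()/avcodec_receive_frame()); the input path, timing output and error handling are illustrative only:

    #include <stdio.h>
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    #include <libavutil/time.h>

    int main(int argc, char **argv)
    {
        const char *path = argc > 1 ? argv[1] : "input.flv";  /* illustrative input */
        AVFormatContext *fmt = NULL;
        AVCodec *dec = NULL;
        int vstream, frames = 0;

        av_register_all();                        /* required on FFmpeg 3.x */

        if (avformat_open_input(&fmt, path, NULL, NULL) < 0 ||
            avformat_find_stream_info(fmt, NULL) < 0)
            return 1;

        /* Let libavformat pick the video stream and its decoder. */
        vstream = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, &dec, 0);
        if (vstream < 0)
            return 1;

        AVCodecContext *ctx = avcodec_alloc_context3(dec);
        avcodec_parameters_to_context(ctx, fmt->streams[vstream]->codecpar);

        /* Multithreaded decoding: 0 lets FFmpeg pick the number of threads. */
        ctx->thread_count = 0;
        ctx->thread_type  = FF_THREAD_FRAME | FF_THREAD_SLICE;

        if (avcodec_open2(ctx, dec, NULL) < 0)
            return 1;

        /* Allocate the frame and packet once, outside the loop, and reuse them. */
        AVFrame  *frame = av_frame_alloc();
        AVPacket *pkt   = av_packet_alloc();
        int64_t   t0    = av_gettime_relative();

        while (av_read_frame(fmt, pkt) >= 0) {
            if (pkt->stream_index == vstream && avcodec_send_packet(ctx, pkt) >= 0) {
                /* avcodec_receive_frame unrefs the frame itself: no per-packet alloc/free */
                while (avcodec_receive_frame(ctx, frame) >= 0)
                    frames++;                     /* ...convert/consume the frame here... */
            }
            av_packet_unref(pkt);
        }

        /* Drain frames the decoder is still holding. */
        avcodec_send_packet(ctx, NULL);
        while (avcodec_receive_frame(ctx, frame) >= 0)
            frames++;

        double secs = (av_gettime_relative() - t0) / 1e6;
        printf("%d frames in %.2f s (%.1f fps)\n", frames, secs, frames / secs);

        av_packet_free(&pkt);
        av_frame_free(&frame);
        avcodec_free_context(&ctx);
        avformat_close_input(&fmt);
        return 0;
    }

    Frame threading usually gives the largest win on multi-core Android SoCs; reusing the frame and packet mainly removes per-packet allocation overhead. Whether this reaches a steady 30 fps still depends on the device and on how the Android build of FFmpeg was configured.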