Other articles (100)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is enabled, MediaSPIP init automatically sets up a preconfiguration so that the new feature is immediately operational. No separate configuration step is therefore required.

  • Enhancing it visually

    10 April 2011

    MediaSPIP is built on a system of themes and templates ("squelettes"). Templates define where information is placed on the page, defining a specific use of the platform, while themes provide the overall graphic design.
    Anyone can propose a new graphic theme or template and make it available to the community.

On other sites (12395)

  • ffmpeg : missing frames with mp4 encoding

    6 July 2016, by Sierra

    I’m currently developing a desktop app that generates videos from pictures (QImage, to be more specific). I’m working with Qt 5.6 and a recent build of ffmpeg (git-0a9e781, 2016-06-10).

    I encode several QImages to create an .mp4 video. I already get an output, but it seems that some frames are missing.

    Here is my code. I tried to keep it as clear as possible, removing comments and error handling.

    ## INITIALIZATION
    #####################################################################

    AVOutputFormat  * outputFormat  = Q_NULLPTR;
    AVFormatContext * formatContext = Q_NULLPTR;

    // filePath: "C:/Users/.../qt_temp.Jv7868.mp4"
    avformat_alloc_output_context2(&formatContext, NULL, NULL, filePath.data());

    outputFormat = formatContext->oformat;
    if (outputFormat->video_codec != AV_CODEC_ID_NONE) {
       // Finding a registered encoder with a matching codec ID...
       *codec = avcodec_find_encoder(outputFormat->video_codec);

       // Adding a new stream to a media file...
       stream = avformat_new_stream(formatContext, *codec);
       stream->id = formatContext->nb_streams - 1;


       AVCodecContext * codecContext = avcodec_alloc_context3(*codec);

       switch ((*codec)->type) {
       case AVMEDIA_TYPE_VIDEO:
           codecContext->codec_id  = outputFormat->video_codec;
           codecContext->bit_rate  = 400000;

           codecContext->width     = 1240;
           codecContext->height    = 874;

           // Timebase: this is the fundamental unit of time (in seconds) in terms of which frame
           // timestamps are represented. For fixed-fps content, timebase should be 1/framerate
           // and timestamp increments should be identical to 1.
           stream->time_base       = (AVRational){1, 24};
           codecContext->time_base = stream->time_base;

           // Emit 1 intra frame every 12 frames at most
           codecContext->gop_size  = 12;
           codecContext->pix_fmt   = AV_PIX_FMT_YUV420P;

           if (codecContext->codec_id == AV_CODEC_ID_H264) {
               av_opt_set(codecContext->priv_data, "preset", "slow", 0);
           }
           break;
       }

       if (formatContext->oformat->flags & AVFMT_GLOBALHEADER) {
           codecContext->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
       }
    }

    avcodec_open2(codecContext, *codec, NULL);

    // Allocating and initializing re-usable frames...
    frame = allocPicture(codecContext->width, codecContext->height, codecContext->pix_fmt);
    tmpFrame = allocPicture(codecContext->width, codecContext->height, AV_PIX_FMT_BGRA);

    avcodec_parameters_from_context(stream->codecpar, codecContext);

    av_dump_format(formatContext, 0, filePath.data(), 1);

    if (!(outputFormat->flags & AVFMT_NOFILE)) {
       avio_open(&formatContext->pb, filePath.data(), AVIO_FLAG_WRITE);
    }

    // Writing the stream header, if any...
    avformat_write_header(formatContext, NULL);

    ## RECEIVING A NEW FRAME
    #####################################################################

    // New QImage received: QImage image
    const qint32 width  = image.width();
    const qint32 height = image.height();

    // When we pass a frame to the encoder, it may keep a reference to it internally;
    // make sure we do not overwrite it here!
    av_frame_make_writable(tmpFrame);

    for (qint32 y = 0; y < height; y++) {
       const uint8_t * scanline = image.scanLine(y);

       for (qint32 x = 0; x < width * 4; x++) {
           tmpFrame->data[0][y * tmpFrame->linesize[0] + x] = scanline[x];
       }
    }

    // As we only generate a BGRA picture, we must convert it to the
    // codec pixel format if needed.
    if (!swsCtx) {
       swsCtx = sws_getContext(width, height,
                               AV_PIX_FMT_BGRA,
                               codecContext->width, codecContext->height,
                               codecContext->pix_fmt,
                               swsFlags, NULL, NULL, NULL);
    }

    sws_scale(swsCtx,
             (const uint8_t * const *)tmpFrame->data,
             tmpFrame->linesize,
             0,
             codecContext->height,
             frame->data,
             frame->linesize);

    ...

    // Re-use the AVFrame allocated during initialization; re-declaring it
    // as null here would discard the picture we just converted.
    AVPacket packet;
    int gotPacket = 0;

    av_init_packet(&packet);

    // Packet data will be allocated by the encoder
    packet.data = NULL;
    packet.size = 0;

    frame->pts = nextPts++; // nextPts starts at 0
    avcodec_encode_video2(codecContext, &packet, frame, &gotPacket);

    if (gotPacket) {
       // The encoder already sets AV_PKT_FLAG_KEY on key-frame packets,
       // so the deprecated coded_frame check is not needed.

       // Rescale output packet timestamp values from codec to stream timebase
       av_packet_rescale_ts(&packet, codecContext->time_base, stream->time_base);
       packet.stream_index = stream->index;

       // Write the compressed frame to the media file.
       av_interleaved_write_frame(formatContext, &packet);

       av_packet_unref(&packet);
    }

    ## FINISHING ENCODING
    #####################################################################

    // Retrieving delayed frames if any...
    for (int gotOutput = 1; gotOutput;) {
       avcodec_encode_video2(codecContext, &packet, NULL, &gotOutput);

       if (gotOutput) {
           // Rescale output packet timestamp values from codec to stream timebase
           av_packet_rescale_ts(&packet, codecContext->time_base, stream->time_base);
           packet.stream_index = stream->index;

           // Write the compressed frame to the media file.
           av_interleaved_write_frame(formatContext, &packet);
           av_packet_unref(&packet);
       }
    }

    av_write_trailer(formatContext);

    avcodec_free_context(&codecContext);
    av_frame_free(&frame);
    av_frame_free(&tmpFrame);
    sws_freeContext(swsCtx);

    if (!(outputFormat->flags & AVFMT_NOFILE)) {
       // Closing the output file...
       avio_closep(&formatContext->pb);
    }

    avformat_free_context(formatContext);

    A part of the last second is always cut off (e.g. when I send 48 frames at 24 fps, media players show 1.9 seconds of video). I analyzed the video (48 frames, 24 fps) with ffmpeg on the command line, and I found something weird:
    [screenshot omitted]
    When I re-encode the video with ffmpeg (on the command line) to the same format, I get a more logical output:
    [screenshot omitted]

    From what I read in different threads, I think it is closely connected to the h264 codec, but I have no idea how to fix it. I’m not familiar with ffmpeg, so any kind of help would be highly appreciated. Thank you.

    EDIT 06/07/2016
    Digging a little deeper into the ffmpeg examples, I noticed these lines when closing the media file:

    uint8_t endcode[] = { 0, 0, 1, 0xb7 };
    ...
    /* add sequence end code to have a real mpeg file */
    fwrite(endcode, 1, sizeof(endcode), f);

    Could that sequence be linked to my problem? I’m trying to add it to my code but, for now, it corrupts the media file. Any idea how I could use it in my case?

  • No such filter in ffmpeg

    13 April 2017, by sneha desai

    I am trying to create a slideshow with ffmpeg.

    Here is the command I executed:

    ffmpeg
    -loop 1 -t 1 -i /sdcard/input0.png
    -loop 1 -t 1 -i /sdcard/input1.png
    -loop 1 -t 1 -i /sdcard/input2.png
    -loop 1 -t 1 -i /sdcard/input3.png
    -loop 1 -t 1 -i /sdcard/input4.png
    -filter_complex
    "[0:v]trim=duration=15,fade=t=out:st=14.5:d=0.5[v0];
    [1:v]trim=duration=15,fade=t=in:st=0:d=0.5,fade=t=out:st=14.5:d=0.5[v1];
    [2:v]trim=duration=15,fade=t=in:st=0:d=0.5,fade=t=out:st=14.5:d=0.5[v2];
    [3:v]trim=duration=15,fade=t=in:st=0:d=0.5,fade=t=out:st=14.5:d=0.5[v3];
    [4:v]trim=duration=15,fade=t=in:st=0:d=0.5,fade=t=out:st=14.5:d=0.5[v4];
    [v0][v1][v2][v3][v4]concat=n=5:v=1:a=0,format=yuv420p[v]" -map "[v]" /sdcard/out.mp4

    On execution, this command gives an error like:

    onFailure: ffmpeg version n3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.8 (GCC)
     configuration: --target-os=linux --cross-prefix=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/vagrant/SourceCode/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/vagrant/SourceCode/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
     libavutil      55. 17.103 / 55. 17.103
     libavcodec     57. 24.102 / 57. 24.102
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 31.100 /  6. 31.100
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    [mjpeg @ 0x4362af10] Changing bps to 8
    Input #0, image2, from '/sdcard/img0001.jpg':
     Duration: 00:00:00.04, start: 0.000000, bitrate: 2410 kb/s
       Stream #0:0: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 259x194 [SAR 1:1 DAR 259:194], 25 fps, 25 tbr, 25 tbn, 25 tbc
    [mjpeg @ 0x436300a0] Changing bps to 8
    Input #1, image2, from '/sdcard/img0002.jpg':
     Duration: 00:00:00.04, start: 0.000000, bitrate: 2053 kb/s
       Stream #1:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 290x174 [SAR 1:1 DAR 5:3], 25 fps, 25 tbr, 25 tbn, 25 tbc
    [mjpeg @ 0x436383a0] Changing bps to 8
    Input #2, image2, from '/sdcard/img0003.jpg':
     Duration: 00:00:00.04, start: 0.000000, bitrate: 3791 kb/s
       Stream #2:0: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 300x168 [SAR 1:1 DAR 25:14], 25 fps, 25 tbr, 25 tbn, 25 tbc
    [mjpeg @ 0x43648f50] Changing bps to 8
    Input #3, image2, from '/sdcard/img0004.jpg':
     Duration: 00:00:00.04, start: 0.000000, bitrate: 1796 kb/s
       Stream #3:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 259x194 [SAR 1:1 DAR 259:194], 25 fps, 25 tbr, 25 tbn, 25 tbc
    [mjpeg @ 0x437b4070] Changing bps to 8
    Input #4, image2, from '/sdcard/img0005.jpg':
     Duration: 00:00:00.04, start: 0.000000, bitrate: 1083 kb/s
       Stream #4:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 212x160 [SAR 1:1 DAR 53:40], 25 fps, 25 tbr, 25 tbn, 25 tbc
    [AVFilterGraph @ 0x4393c960] No such filter: '"'
    Error initializing complex filters.
    Invalid argument

    I used this demo: https://github.com/WritingMinds/ffmpeg-android-java

  • gst-ffmpeg building from source failure

    30 June 2016, by Bharath Vemula

    I’m a non-root user on a cluster running CentOS 6.2, and I had to install gst-ffmpeg. Since I’m not root, I installed it from source in the usual 3 steps:

    1. ./configure --prefix="my home directory"
    2. make
    3. make install

    configure reports that all the dependencies have been resolved, but when I run make I get the following error:

    > gcc -g -O2 -Wall -Wno-switch -o ffplay ffplay-cmdutils.o
    > ffplay-ffplay.o  -lSDL -lpthread libavutil/.libs/libavutil.a
    > libavformat/.libs/libavformat.a -ldl libavcodec/.libs/libavcodec.a -lz
    > -lm ffplay-ffplay.o: In function `main': /home/internal/evol/sskadam/test/gst-ffmpeg-0.10.1/gst-libs/ext/ffmpeg/ffplay.c:2451:
    > undefined reference to `XOpenDisplay'
    > /home/internal/evol/sskadam/test/gst-ffmpeg-0.10.1/gst-libs/ext/ffmpeg/ffplay.c:2455:
    > undefined reference to `XCloseDisplay' collect2: ld returned 1 exit
    > status make[7]: *** [ffplay] Error 1 make[7]: Leaving directory
    > `/home/internal/evol/sskadam/test/gst-ffmpeg-0.10.1/gst-libs/ext/ffmpeg'
    > make[6]: *** [all-recursive] Error 1 make[6]: Leaving directory
    > `/home/internal/evol/sskadam/test/gst-ffmpeg-0.10.1/gst-libs/ext/ffmpeg'
    > make[5]: *** [all] Error 2 make[5]: Leaving directory
    > `/home/internal/evol/sskadam/test/gst-ffmpeg-0.10.1/gst-libs/ext/ffmpeg'
    > make[4]: *** [all-local] Error 2 make[4]: Leaving directory
    > `/home/internal/evol/sskadam/test/gst-ffmpeg-0.10.1/gst-libs/ext'
    > make[3]: *** [all-recursive] Error 1 make[3]: Leaving directory
    > `/home/internal/evol/sskadam/test/gst-ffmpeg-0.10.1/gst-libs/ext'
    > make[2]: *** [all-recursive] Error 1 make[2]: Leaving directory
    > `/home/internal/evol/sskadam/test/gst-ffmpeg-0.10.1/gst-libs' make[1]:
    > *** [all-recursive] Error 1 make[1]: Leaving directory `/home/internal/evol/sskadam/test/gst-ffmpeg-0.10.1' make: *** [all]
    > Error 2 [sskadam@ycn1 gst-ffmpeg-0.10.1]$ cd gst-libs/ext/ffmpeg/
    > [sskadam@ycn1 ffmpeg]$ vi ffplay.c

    Please help me resolve this.
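    For context: XOpenDisplay and XCloseDisplay are provided by libX11, which ffplay needs but which is missing from the link line here. A hedged sketch of a workaround for a standard autoconf build, not taken from the post (the library path is a placeholder; use wherever libX11.so lives on the cluster):

```shell
# Point the linker at libX11 explicitly, then rebuild.
# The -L path below is an assumption, not from the original post.
LDFLAGS="-L/usr/lib64 -lX11" ./configure --prefix="$HOME/gst-ffmpeg"
make
make install
```

    Since gst-ffmpeg itself only needs the ffmpeg libraries, another route is to avoid building ffplay at all, if the package's configure exposes a flag for that (check ./configure --help).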