Media (91)

Other articles (85)

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded to OGV and WebM (supported by HTML5) and to MP4 (supported by Flash).
    Audio files are encoded to OGG (supported by HTML5) and to MP3 (supported by Flash).
    Where possible, text is analyzed in order to retrieve the data needed for search-engine indexing, and it is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
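
    As an illustration of the kind of conversion described above, equivalent ffmpeg command lines might look like the following (codec choices and file names here are assumptions for the example, not MediaSPIP's actual encoding profiles):

    # WebM and OGV for HTML5 playback, MP4 for the Flash fallback
    ffmpeg -i source.mov -c:v libvpx -c:a libvorbis video.webm
    ffmpeg -i source.mov -c:v libtheora -c:a libvorbis video.ogv
    ffmpeg -i source.mov -c:v libx264 -c:a aac video.mp4
    # Ogg Vorbis for HTML5 playback, MP3 for the Flash fallback
    ffmpeg -i source.wav -c:a libvorbis audio.ogg
    ffmpeg -i source.wav -c:a libmp3lame audio.mp3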

  • MediaSPIP Player: potential problems

    22 February 2011

    The player does not work on Internet Explorer
    On Internet Explorer (at least versions 7 and 8), the plugin uses the Flash player flowplayer to play video and audio. If the player does not seem to work, the problem may come from the configuration of Apache's mod_deflate module.
    If the configuration of this Apache module contains a line that looks like the following, try removing it or commenting it out to see whether the player then works correctly: (...)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, no software is ever perfect.
    If you think you have found a bug, report it using our ticket system. To help us fix it, please provide the following information: the browser you are using, including its exact version; as precise an explanation of the problem as possible; if possible, the steps that led to the problem; and a link to the site/page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

On other sites (7147)

  • lavf/avio: Be more explicit in logging white/black list matches

    11 March 2017, by Alexander Strasser
    lavf/avio: Be more explicit in logging white/black list matches
    

    The current form of the messages indicating matches in the white or black lists seems to rely a bit too much on context.

    Make the messages more explicit.

    Signed-off-by: Alexander Strasser <eclipse7@gmx.net>

    • [DH] libavformat/avio.c
  • libavformat: calling avformat_open_input 2 times results in decoding white frames

    25 April 2017, by explodus
    • prebuilt ffmpeg libs (format/util/scale)
    • version 57.56.101
    • no deprecated functions are used
    • the current API style is used:
      • av_read_frame -> avcodec_send_packet -> avcodec_receive_frame -> sws_scale

    Everything is fine on the first run, but when I load/open another file I only get white frames.

    void video::app::flush_cached_frames() {
       if (nullptr == avcontext)
           return;
       if (nullptr == avpicture)
           return;

       // send an empty packet which instructs the codec to start flushing
       AVPacket pkt;
       av_init_packet(&pkt);
       pkt.data = NULL;
       pkt.size = 0;
       avcodec_send_packet(avcontext, &pkt);

       // drain the codec
       while (true) {
           int r = avcodec_receive_frame(avcontext, avpicture);
           if (r != 0)
               break;
       }
    }

    void video::app::close_avi() {
       flush_cached_frames();

       if (avformat && avstream)
           seek_to_frame(0);
       avstream = nullptr;

       if (avfile)
           fclose(avfile);
       avfile = nullptr;

       if (avcontext)
           avcodec_close(avcontext);
       avcontext = nullptr;

       if (avformat)
           avformat_free_context(avformat);
       avformat = nullptr;

       if (sws_ctx)
           sws_freeContext(sws_ctx);
       sws_ctx = nullptr;

       if (avparser)
           av_parser_close(avparser);
       avparser = nullptr;

       if (avinbuf)
           av_free(avinbuf);
       avinbuf = nullptr;
    }

    I think I close everything properly. Does anyone have an idea?

    edit 1: init/load

    unsigned video::app::load(const std::string& name_) {
       _file = name_;

       close_avi();

       av_register_all();
       avcodec_register_all();

       av_init_packet(&avpkt);

       AVCodecID codec_id = AV_CODEC_ID_H264;
       int64_t duration = 0;
       double fps = .0;
       int ret = 0;
       {
           av_log_set_level(1);

           avfile = fopen(name_.c_str(), "rb");

           avformat = avformat_alloc_context();
           ret = avformat_open_input(&avformat, name_.c_str(), nullptr, nullptr);
           ret = avformat_find_stream_info(avformat, nullptr);
           duration = avformat->duration;

           avstream = nullptr;
           if (avformat->nb_streams == 1) {
               avstream = avformat->streams[0];
           } else {
               avstream = avformat->streams[av_find_default_stream_index(avformat)];
           }

           if (avstream) {
               fps = (double(avstream->avg_frame_rate.num) / double(avstream->avg_frame_rate.den));
               codec_id = avstream->codecpar->codec_id;
               duration = avstream->duration;

               _vid.v_width = avstream->codecpar->width;
               _vid.v_height = avstream->codecpar->height;

               _vid.lastframe = duration / fps;
               _vid.lastframe = avstream->nb_frames;
           }

           avcodec = avcodec_find_decoder(avstream->codecpar->codec_id);
           avparser = av_parser_init(avcodec->id);
           avcontext = avcodec_alloc_context3(avcodec);

           avcontext->flags |= AVFMT_FLAG_NONBLOCK;
           avcontext->flags |= AVFMT_FLAG_FLUSH_PACKETS;
           avcontext->flags |= AVFMT_FLAG_DISCARD_CORRUPT;
           avcontext->flags |= AVFMT_FLAG_NOBUFFER;

           ret = avcodec_parameters_to_context(avcontext, avstream->codecpar);
           ret = avcodec_open2(avcontext, avcodec, nullptr);


           // Determine required buffer size and allocate buffer
           auto numBytes = av_image_get_buffer_size(
                 AV_PIX_FMT_BGRA
               , avcontext->width
               , avcontext->height
               , 1);
           if (avinbuf)
               av_free(avinbuf);
           avinbuf = nullptr;
           avinbuf = (uint8_t *)av_malloc(numBytes * sizeof(uint8_t));
           ret = av_image_fill_arrays(
                 avrgb->data
               , avrgb->linesize
               , avinbuf
               , AV_PIX_FMT_BGRA
               , avcontext->width
               , avcontext->height
               , 1);

           sws_ctx = sws_getContext(
                   avcontext->width
               , avcontext->height
               , avcontext->pix_fmt
               , avcontext->width
               , avcontext->height
               , AV_PIX_FMT_BGRA
               , SWS_BILINEAR
               , nullptr
               , nullptr
               , nullptr
           );
       }

       int err = (sws_ctx && avcontext && avformat) ? 0 : 1;
       // ...
    }

    getting the frame:

    uint8_t * video::app::get_frame(uint32_t frame) {
       if (!avcontext)
           return nullptr;
       if (!avformat)
           return nullptr;
       if (!avpicture)
           return nullptr;
       if (!avfile)
           return nullptr;

       try {
           int ret = 0;

           if (avpicture->data)
               av_frame_unref(avpicture);

           while (true) {
               if ((ret = av_read_frame(avformat, &avpkt)) < 0)
                   break;

               if (avpkt.stream_index == avstream->index) {
                   ret = avcodec_send_packet(avcontext, &avpkt);
                   if (ret < 0)
                       break;

                   while (ret >= 0) {
                       ret = avcodec_receive_frame(avcontext, avpicture);
                       if (ret == AVERROR_EOF) {
                           return nullptr;
                       } else if (ret == -11) { // -11 == AVERROR(EAGAIN) on most platforms
                           avpkt.data = nullptr;
                           avpkt.size = 0;
                           break;
                       } else if (ret < 0) {
                           return nullptr;
                       }

                       if (ret == AVERROR(EAGAIN)) {
                           avpkt.data = nullptr;
                           avpkt.size = 0;
                           break;
                       }

                       if (ret >= 0) {
                           int linesize[AV_NUM_DATA_POINTERS] = {
                                 avpicture->linesize[0]
                               , avpicture->linesize[1]
                               , avpicture->linesize[2]
                               , avpicture->linesize[3]
                               , avpicture->linesize[4]
                               , avpicture->linesize[5]
                               , avpicture->linesize[6]
                               , avpicture->linesize[7]
                           };
                           uint8_t * data[AV_NUM_DATA_POINTERS] = {
                                 avpicture->data[0]
                               , avpicture->data[1]
                               , avpicture->data[2]
                               , avpicture->data[3]
                               , avpicture->data[4]
                               , avpicture->data[5]
                               , avpicture->data[6]
                               , avpicture->data[7]
                           };

                           {
                               // flip the frame, never ever touch this thing again!
                               // If the planes in the image are of unequal size (e.g. YUV420) you need to adapt the height.
                               auto h = avcontext->height;
                               for (int i = 0; i < 4; i++) {
                                   if (i)
                                       data[i] += linesize[i] * ((h >> 1) - 1);
                                   else
                                       data[i] += linesize[i] * (h - 1);
                                   linesize[i] = -linesize[i];
                               }
                           }

                           ret = sws_scale(
                                 sws_ctx
                               , (uint8_t const * const *)data
                               , linesize
                               , 0
                               , avcontext->height
                               , avrgb->data
                               , avrgb->linesize);

                           // read the packet timestamp before releasing the packet
                           currPts = avpkt.dts;
                           currPts *= av_q2d(avstream->time_base);

                           av_packet_unref(&avpkt);

                           usleep(1000000 * (currPts - prevPts));
                           prevPts = currPts;

                           return avrgb->data[0];
                       }
                   }
               }
               av_packet_unref(&avpkt);
           }

       } catch (...) {
       }

       return nullptr;
    }
  • After scaling a video, the transparent background is converted to white

    8 March 2017, by Sohaib Raza

    I scaled a video using ffmpeg. After scaling, the output video has a white background, while the input video has a transparent background.
    The command below was used to scale the video.

    ffmpeg -i inputVideo.mp4 -vf scale=100:100 outputVideo.mp4

    You can see the images below, taken from my videos, to better illustrate the problem.

    Image from the input video (transparent background, no white):

    Image from the output video (white background):

    Please give me your answer regarding this question; any answer will be greatly appreciated.
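
    For reference, a commonly suggested way to scale a GIF while keeping its transparency is to regenerate the palette in the same filter graph with ffmpeg's palettegen/paletteuse filters. This is only a sketch, assuming a GIF output as in the log below (file names are placeholders), not a verified fix for this exact case:

    ffmpeg -i eye.gif -vf "scale=100:100,split[a][b];[a]palettegen[p];[b][p]paletteuse" scaled.gif

    Note also that an H.264 video in an MP4 container has no alpha channel, so an outputVideo.mp4 target cannot preserve transparency regardless of how the scaling is done. The log below is from a GIF-to-GIF run of the scaling.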

    D: Started command : ffmpeg [Ljava.lang.String;@422f62d0
    D: onSuccess: ffmpeg version n3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
        built with gcc 4.8 (GCC)
        configuration: --target-os=linux --cross-prefix=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/vagrant/SourceCode/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/vagrant/SourceCode/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
        libavutil      55. 17.103 / 55. 17.103
        libavcodec     57. 24.102 / 57. 24.102
        libavformat    57. 25.100 / 57. 25.100
        libavdevice    57.  0.101 / 57.  0.101
        libavfilter     6. 31.100 /  6. 31.100
        libswscale      4.  0.100 /  4.  0.100
        libswresample   2.  0.101 /  2.  0.101
        libpostproc    54.  0.100 / 54.  0.100
      Input #0, gif, from '/storage/emulated/0/Pictures/eye.gif':
        Duration: N/A, bitrate: N/A
          Stream #0:0: Video: gif, bgra, 480x190, 5 fps, 10 tbr, 100 tbn, 100 tbc
      Output #0, gif, to '/storage/emulated/0/GIF/REV20170308_101145.gif':
        Metadata:
          encoder         : Lavf57.25.100
          Stream #0:0: Video: gif, bgr8, 125x56, q=2-31, 200 kb/s, 10 fps, 100 tbn, 10 tbc
          Metadata:
            encoder         : Lavc57.24.102 gif
      Stream mapping:
        Stream #0:0 -> #0:0 (gif (native) -> gif (native))
      Press [q] to stop, [?] for help
  frame=   11 fps=0.0 q=-0.0 Lsize=      17kB time=00:00:02.90 bitrate=  48.8kbits/s speed=14.8x
      video:16kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 5.287261%