
Media (91)

Other articles (95)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, a preconfiguration is automatically put in place by MediaSPIP init so that the new feature is immediately operational. No separate configuration step is therefore required.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (7262)

  • Issue when calling ffmpeg.c twice that makes the app crash?

    6 February 2013, by Jimmy

    I'm trying to call ffmpeg.c to trim a video, based on the code in 'video-trimmer'. When I run the activity (which loads and uses the native lib), the first time I click trim it works and I can trim the video, but when I try to run it again it crashes (it only works again after the application restarts).

    I spent three days looking for a solution to this issue. Most of the answers say the problem is the static variables in ffmpeg.c, and that creating a lib that loads and unloads the class fixes it (answer1, answer2). So I tried to apply the solution based on those answers and this github repo to the video-trimmer project, but all my attempts failed.

    Does anyone know of a fork of the 'video-trimmer' project that fixes the issue? Or can anybody provide a step-by-step explanation of how to implement the solution in the 'video-trimmer' project? (I tried to follow all the solutions on the web and apply them to that project, but with no luck.)
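
    For reference, the workaround those answers describe amounts to compiling ffmpeg.c into its own shared library and loading a fresh copy of it on every invocation, so that its static variables start from their initial values each time. Below is a minimal JNI-side sketch of that idea, assuming a hypothetical libffmpeginvoke.so exporting an ffmpeg_main entry point (both names are illustrative, not taken from the linked answers):

        #include <dlfcn.h>

        // Hypothetical entry point: ffmpeg.c's main(), renamed and built into
        // its own shared library. Library and symbol names are assumptions.
        typedef int (*ffmpeg_main_t)(int argc, char** argv);

        int run_ffmpeg_once(int argc, char** argv)
        {
            // Load a fresh copy of the library so the static variables
            // in ffmpeg.c are back to their initial values.
            void* handle = dlopen("libffmpeginvoke.so", RTLD_NOW);
            if (handle == NULL)
                return -1;

            int result = -1;
            ffmpeg_main_t entry = (ffmpeg_main_t) dlsym(handle, "ffmpeg_main");
            if (entry != NULL)
                result = entry(argc, argv);

            // Unload so the next call starts from a clean state.
            dlclose(handle);
            return result;
        }

    One caveat: whether dlclose actually unloads a library differs across Android versions, which may be one reason attempts at this fix can fail.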

  • lavf/mxfenc : Write correct interlaced flag when muxing dnxhd.

    4 May 2015, by Carl Eugen Hoyos

    Reported and early patch version tested by forum user gridtank.
    Reviewed-by: Tomas Härdin

    • [DH] libavcodec/dnxhddata.c
    • [DH] libavcodec/dnxhddata.h
    • [DH] libavcodec/version.h
    • [DH] libavformat/mxfenc.c
    • [DH] libavformat/version.h
  • libav works with RTP stream on PC, but not on Android (Same RTP stream)

    25 September 2016, by Nitay

    I’m using libav to decode video received from a 3rd party. The video is received in an Android app and is then made into an RTP stream and published to another device.

    When I use the PC as the target device, the stream decodes properly and I see video. When I use Android (same code, compiled for Android), the video does not decode at all.
    This happens only with the video from the 3rd party. Other video streams work okay both on PC and on Android.

    To be clear:

    • If the stream is cast from the command line using ffmpeg -> video is displayed both on Android and on PC
    • If the stream is cast from the Android app -> video is displayed only on PC (the same code, compiled for different platforms)

    libav 11.7 was compiled for Android with the following configure invocation:

    NDK=/opt/android-ndk-r12b
    SYSROOT="${NDK}/platforms/android-23/arch-arm/"
    ECFLAGS="-march=armv7-a -mfloat-abi=softfp -I /usr/local/include"
    ELDFLAGS="-Wl,--fix-cortex-a8 -L /usr/local/lib"
    ARCH_SPECIFIC="--disable-asm --arch=arm --cpu=armv7-a --cross-prefix=/opt/android-ndk-r12b/prebuilt/linux-x86_64/bin/../../../toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64/bin/arm-linux-androideabi-"

    ./configure \
    ${ARCH_SPECIFIC} \
    --target-os=linux \
    --sysroot="$SYSROOT" \
    --extra-cflags="$ECFLAGS" \
    --extra-ldflags="$ELDFLAGS" \
    --enable-shared \
    --disable-symver

    (--disable-asm is unfortunately needed to avoid text relocations in the compiled library, which are not allowed on Android)

    Here are the libav logs from the android side : http://pastebin.com/MDE3N7BD

    The logs starting with LIBAV are libav messages; the ones without are my own messages wrapped around the libav functions.

    Logs from the PC side : http://pastebin.com/N0Fd18F9

    The loop that reads the frames:

           // If read frame fails (which happens), keep trying
           LOG_DEBUG("Before read frame");
           while (av_read_frame(formatContext, &packet) >= 0 && !terminate)
           {
               LOG_DEBUG1("Packet read. Size: %d", packet.size);

               this->DecodeFrame(videoStreamIndex, formatContext->streams[videoStreamIndex]->codec, &packet);

               av_free_packet(&packet);
               av_init_packet(&packet);
           }
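
    (For completeness: this loop assumes the packet was declared and initialized earlier in the same function. A minimal sketch of that setup under the libav 11.x API, with names taken from the code above; since av_read_frame fills the packet on every successful call, the av_init_packet after av_free_packet above is harmless but not strictly needed.)

           // Packet setup the loop above assumes (libav 11.x API)
           AVPacket packet;
           av_init_packet(&packet);
           packet.data = NULL;   // no payload yet; av_read_frame fills these
           packet.size = 0;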

    And here's the frame decoding code:

    void VideoDecoder::DecodeFrame(int videoStreamIndex, AVCodecContext* streamCodec, AVPacket* packet)
    {
       static bool save_file = false;

       AVPixelFormat destinationFormat = AV_PIX_FMT_RGBA;
       LOG_DEBUG("Decoding frame!");


       if (this->isFirstFrame)
       {
           LOG_DEBUG("Creating codecs");
           this->isFirstFrame = false;
           // For parsing the packets, we first need to create the right codec
           AVCodec* h264Codec = NULL;
            // (I'm not sure why ffmpeg needs this. It has an SDP file which states exactly that, but okay)
           h264Codec = avcodec_find_decoder(AV_CODEC_ID_H264);

           // Now make a copy of the codec for us to change
           codecContext = avcodec_alloc_context3(h264Codec);
           avcodec_get_context_defaults3(codecContext, h264Codec);
           avcodec_copy_context(codecContext, streamCodec);


           // Initialize codecContext to use codec
           if (avcodec_open2(codecContext, h264Codec, NULL) >= 0)
           {
               // There's a nasty edge case here that we need to handle first
               if (streamCodec->width == 0 || streamCodec->height == 0)
               {
                    // That means the stream was initialized before any packets were sent to it; we can't initialize
                    // any buffers without knowing their size. So to tackle this we'll initialize the largest buffer
                    // we can think of

                   codecContext->width = MAX_RESOLUTION_WIDTH;
                   codecContext->height = MAX_RESOLUTION_HEIGHT;
               }

               // Instantiate new buffers
               int size = avpicture_get_size(AV_PIX_FMT_YUV420P, codecContext->width, codecContext->height);
               originalPic = av_frame_alloc();
               originalPicBuffer = (uint8_t*)(av_malloc(size));

               avpicture_fill((AVPicture*)originalPic, originalPicBuffer, AV_PIX_FMT_YUV420P, codecContext->width, codecContext->height);
           }

           // Instantiate an output context, for usage in the conversion of the picture
           outputFormatContext = avformat_alloc_context();
       }

       if ((packet->stream_index == videoStreamIndex) && !terminate)
       {
           // Packet is video. Convert!

           if (outputStream == NULL)
           {
               //create stream in file
               outputStream = avformat_new_stream(outputFormatContext, streamCodec->codec);
               avcodec_copy_context(outputStream->codec, streamCodec);
               outputStream->sample_aspect_ratio = streamCodec->sample_aspect_ratio;
           }

           int pictureReceived = 0;
           packet->stream_index = outputStream->id;
           int result = avcodec_decode_video2(codecContext, originalPic, &pictureReceived, packet);
           //          std::cout << "Bytes decoded " << result << " check " << check << std::endl;

           if (pictureReceived)
           {
               LOG_DEBUG("New frame received");
                // NOTICE: It is generally not good practice to allocate on demand instead of at initialization.
                // In this case the edge cases demand it (what happens if width==0 on the first packet?)
               if (this->imageConvertContext == NULL)
               {
                   // Allocate pictures and buffers for conversion
                   this->imageConvertContext = sws_getContext(
                       codecContext->width,
                       codecContext->height,
                       codecContext->pix_fmt,
                       codecContext->width,
                       codecContext->height,
                       destinationFormat,
                       SWS_BICUBIC,
                       NULL, NULL, NULL);
               }

               if (this->convertedPic == NULL)
               {
                   int size_rgba = avpicture_get_size(destinationFormat, codecContext->width, codecContext->height);
                   convertedPicBuffer = (uint8_t*)(av_malloc(size_rgba));
                   convertedPic = av_frame_alloc();
                   avpicture_fill((AVPicture*)convertedPic, convertedPicBuffer, destinationFormat, codecContext->width, codecContext->height);
               }

               // Scale the image
               sws_scale(imageConvertContext, originalPic->data, originalPic->linesize, 0, codecContext->height, convertedPic->data, convertedPic->linesize);

               // We have a frame! Callback
               if (frameReadyCallback != NULL)
               {
                   LOG_DEBUG3("Updated frame [width=%d, height=%d, ptr=0x%08x]", codecContext->width, codecContext->height, convertedPic->data[0]);
                   if (save_file)
                   {
                       save_file = false;
                       std::string filename = "/storage/emulated/0/DCIM/saved_file.rgba";
                       save_buffer_to_file((unsigned char*)convertedPic->data[0], codecContext->width * codecContext->height * 4, filename.c_str());
                       LOG_DEBUG("Exported file");
                   }
                   frameReadyCallback((char*)convertedPic->data[0], codecContext->width, codecContext->height);
               }
           }
           else
           {
               LOG_DEBUG("Packet without frame");
           }
       }
    }

    Obviously the stream from the 3rd party is somehow different, and probably comes from a different encoder. But it works with libav (same version) on PC. What could be the difference on Android that causes it not to find the frames?
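
    One low-effort way to narrow this down, sketched here as a suggestion rather than a known fix: raise libav's log verbosity on both platforms and diff the output around the first few packets (e.g. H.264 SPS/PPS parsing, extradata handling). av_log_set_level is part of the libavutil API:

        // libav headers are C; wrap them when compiling as C++.
        extern "C" {
        #include <libavutil/log.h>
        }

        static void enable_verbose_libav_logs()
        {
            // Maximum detail, so the Android and PC runs can be
            // compared line by line.
            av_log_set_level(AV_LOG_DEBUG);
        }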