
Other articles (64)

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You are of course welcome to add your own using the form at the bottom of the page.

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer flash player is used instead.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (8961)

  • JNI crash when I split code into two functions

    13 September 2013, by Lescott

    I have a properly working native C function that I call from my Java code. But when I split this code into two functions and call them both sequentially, I get a fatal error.

    //global variables
    AVFormatContext *pFormatCtx;
    AVFrame         *pFrame;
    AVFrame         *pFrameRGB;
    AVCodecContext  *pCodecCtx;
    AVCodec         *pCodec;
    uint8_t         *buffer;
    int             videoStream;
    struct SwsContext      *sws_ctx = NULL;
    int outWidth, outHeight;

    Working unsplit function

    JNIEXPORT void JNICALL Java_foo(JNIEnv * env, jclass class) {
       av_register_all();
       const char* videoPath = "11.mp4";

       int             numBytes;
       AVDictionary    *optionsDict = NULL;

       pFrame = NULL;
       pFrameRGB = NULL;
       buffer = NULL;
       pCodec = NULL;
       pFormatCtx = NULL;

       // Open video file
       if(avformat_open_input(&pFormatCtx, videoPath, NULL, NULL)!=0)
               exit(1); // Couldn't open file


       // Retrieve stream information
       if(avformat_find_stream_info(pFormatCtx, NULL)<0)
               exit(1); // Couldn't find stream information

       av_dump_format(pFormatCtx, 0, videoPath, 0);


       // Find the first video stream
       videoStream=-1;
       int i;
       for(i=0; i<pFormatCtx->nb_streams; i++) {
               if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) {
                       videoStream=i;
                       break;
               }
       }

       if(videoStream==-1)
               exit(1); // Didn't find a video stream

       // Get a pointer to the codec context for the video stream
       pCodecCtx=pFormatCtx->streams[videoStream]->codec;


       // Find the decoder for the video stream
       pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
       if(pCodec==NULL) {
               fprintf(stderr, "Unsupported codec!\n");
               exit(1); // Codec not found
       }

       // Open codec
       if(avcodec_open2(pCodecCtx, pCodec, &optionsDict)<0)
               exit(1); // Could not open codec

       // Allocate video frame
       pFrame=avcodec_alloc_frame();

       // Allocate an AVFrame structure
       pFrameRGB=avcodec_alloc_frame();
       if(pFrameRGB==NULL)
               exit(1);

       outWidth = 128;
       outHeight = 128;

       // Determine required buffer size and allocate buffer
       numBytes=avpicture_get_size(PIX_FMT_RGB24, outWidth, outHeight);
       buffer=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));

       sws_ctx = sws_getContext(
                               pCodecCtx->width,
                               pCodecCtx->height,
                               pCodecCtx->pix_fmt,
                               outWidth,
                               outHeight,
                               PIX_FMT_RGB24,
                               SWS_BILINEAR,
                               NULL,
                               NULL,
                               NULL
                       );

       // Assign appropriate parts of buffer to image planes in pFrameRGB
       // Note that pFrameRGB is an AVFrame, but AVFrame is a superset
       // of AVPicture
       avpicture_fill((AVPicture *)pFrameRGB, buffer, PIX_FMT_RGB24, outWidth, outHeight);
    }

    Failing split functions

    JNIEXPORT void JNICALL Java_foo1(JNIEnv * env, jclass class) {
       av_register_all();
    }

    JNIEXPORT void JNICALL Java_foo2(JNIEnv * env, jclass class) {
       //all lines of code from Java_foo except the first
    }

    Java code

    System.loadLibrary("mylib");
    Mylib.foo1();
    Mylib.foo2(); //fatal error


    #
    # A fatal error has been detected by the Java Runtime Environment:
    #
    #  SIGSEGV (0xb) at pc=0x00007faab5012dc0, pid=15571, tid=140371352766208

    Any ideas?
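
    For reference, a minimal sketch of one way such a split can be structured, assuming a hypothetical class Mylib in the default package: the state shared between the two calls stays in globals (as in the question), the second entry point refuses to run before the first, and failures are returned to Java rather than calling exit(1), so the JVM can report them instead of dying.

    #include <jni.h>
    #include <libavformat/avformat.h>

    /* State shared between the two JNI entry points (hypothetical names). */
    static AVFormatContext *fmt_ctx = NULL;
    static int registered = 0;

    JNIEXPORT void JNICALL Java_Mylib_foo1(JNIEnv *env, jclass clazz) {
       av_register_all();
       registered = 1;
    }

    JNIEXPORT jint JNICALL Java_Mylib_foo2(JNIEnv *env, jclass clazz) {
       if (!registered)
               return -1; /* foo1() was never called */
       if (avformat_open_input(&fmt_ctx, "11.mp4", NULL, NULL) != 0)
               return -1; /* report the failure instead of exit(1) */
       if (avformat_find_stream_info(fmt_ctx, NULL) < 0)
               return -1;
       av_dump_format(fmt_ctx, 0, "11.mp4", 0);
       return 0;
    }

    Returning a status code keeps a failed open visible on the Java side, which helps narrow down whether the crash happens inside the FFmpeg calls or in the JNI plumbing around them.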

  • How can video decoding corruption be debugged?

    17 September 2013, by TopGunCoder

    I just started working for a new company, and my new role demands that I help debug the video corruption they are seeing in decoded frames. As much as I intend to dig deep into the code and look into the specifics of my problem, it made me think about video debugging in general.

    Since handling video is very new to me, the whole process seems pretty complex, and there seem to be a lot of places for corruption to present itself. The way I see it, there are at least three places where corruption could pop up (barring memory corruption on the machine):

    • Transporting the data before it is decoded
    • A decoding implementation that perpetuates corruption once it is encountered, or is altogether incorrect (which seems to be my problem)
    • Transportation to the monitor (which seems unlikely but possible)

    So what I'm really curious about is if/how people debug their video streams to pin down where any corruption they encounter is introduced. I'm sure there is no surefire method, but I am curious to see what problems are even possible and how they can be identified and triaged.

    P.S. - I'm not sure of the differences between decoding methods, but if this question seems too vague, it may help to mention that I am using ffmpeg and avcodec_decode_video2 for the decoding.
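
    One way to start localizing this is to make the decode loop itself loud: check the return value of every avcodec_decode_video2 call, so that transport-level damage (which the decoder usually detects and reports) can be told apart from a decoder that silently produces wrong pixels. A minimal sketch, assuming format and codec contexts already opened under the hypothetical names fmt_ctx, dec_ctx and video_stream:

    #include <stdio.h>
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    /* Decode every video packet and log each decoder-reported failure. */
    static void decode_and_check(AVFormatContext *fmt_ctx,
                                 AVCodecContext *dec_ctx, int video_stream) {
       AVFrame *frame = avcodec_alloc_frame();
       AVPacket pkt;
       int got_frame, n = 0, bad = 0;

       while (av_read_frame(fmt_ctx, &pkt) >= 0) {
               if (pkt.stream_index == video_stream) {
                       if (avcodec_decode_video2(dec_ctx, frame, &got_frame, &pkt) < 0) {
                               fprintf(stderr, "decode error at packet %d\n", n);
                               bad++;
                       }
                       n++;
               }
               av_free_packet(&pkt);
       }
       fprintf(stderr, "%d of %d video packets failed to decode\n", bad, n);
       av_free(frame);
    }

    Past that point, hashing every decoded frame and comparing against a known-good decode of the same input (ffmpeg's framemd5 output format exists for exactly this) pinpoints the first diverging frame, which usually settles whether the stream or the decoder is at fault.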

  • Playing MPEG-DASH streaming with FFmpeg [on hold]

    9 March 2019, by Thanh Nguyen

    I’m having a problem playing MPEG-DASH streams with ffmpeg. I know there are a lot of tools that can play MPEG-DASH, such as OSMO4 and DASH.js. However, I am wondering whether I can play MPEG-DASH with FFmpeg.
    One more question: how can I determine the startup time when playing an MPEG-DASH stream? That is, the time from when I request the MPD file from the server until the first picture of the video is displayed on screen?
    Thanks!
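
    For what it's worth, libavformat includes a DASH demuxer (enabled at build time), so where it is compiled in, an MPD can be opened like any other input and ffplay can play it directly. Below is a minimal sketch of measuring startup time through the library API: take a timestamp before opening the manifest and again when the first video frame comes out of the decoder. The URL is a placeholder and error handling is abbreviated.

    #include <stdio.h>
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    #include <libavutil/time.h>

    int main(void) {
       const char *url = "http://example.com/manifest.mpd"; /* placeholder */
       AVFormatContext *fmt = NULL;

       avformat_network_init();
       int64_t t0 = av_gettime(); /* wall clock, in microseconds */

       if (avformat_open_input(&fmt, url, NULL, NULL) < 0 ||
           avformat_find_stream_info(fmt, NULL) < 0)
               return 1;

       int vs = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
       if (vs < 0)
               return 1;
       const AVCodec *dec = avcodec_find_decoder(fmt->streams[vs]->codecpar->codec_id);
       AVCodecContext *ctx = avcodec_alloc_context3(dec);
       avcodec_parameters_to_context(ctx, fmt->streams[vs]->codecpar);
       if (avcodec_open2(ctx, dec, NULL) < 0)
               return 1;

       AVPacket pkt;
       AVFrame *frame = av_frame_alloc();
       while (av_read_frame(fmt, &pkt) >= 0) {
               if (pkt.stream_index == vs &&
                   avcodec_send_packet(ctx, &pkt) >= 0 &&
                   avcodec_receive_frame(ctx, frame) >= 0) {
                       /* first decoded picture: startup latency ends here */
                       printf("startup time: %.3f s\n", (av_gettime() - t0) / 1e6);
                       break;
               }
               av_packet_unref(&pkt);
       }
       return 0; /* cleanup omitted for brevity */
    }

    This captures the manifest download, probing, the first segment fetches and the first decode; time to actually light up pixels adds renderer latency on top, but DASH startup tends to be dominated by the network round trips measured here.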