
Media (91)

Other articles (48)

  • MediaSPIP: Modifying the rights for object creation and final publication

    11 November 2010, by

    By default, MediaSPIP allows the creation of 5 types of objects.
    Also by default, the rights to create these objects and to publish them definitively are reserved for administrators, but they can of course be configured by the webmasters.
    These rights are locked down for several reasons: because authorizing publication should be a deliberate choice of the webmaster rather than a platform-wide default; and because having an account can also serve other purposes, (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player has been created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used as a fallback.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both to conventional computers (...)

On other sites (8991)

  • Compiling x264 on a Mac: "No working C compiler found" and "arm-linux-androideabi-gcc: command not found"

    29 November 2014, by Xavi Gil

    I am trying to compile the x264 library for Android, following this post.

    I have cloned the x264 project (git clone git://git.videolan.org/x264.git) and tried to configure it as follows:

    NDK=~/development/android-ndk-r10c    
    TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86_64
    PLATFORM=$NDK/platforms/android-21/arch-arm

    ./configure \
    --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
    --sysroot=$PLATFORM \
    --host=arm-linux \
    --enable-pic \
    --enable-static \
    --disable-cli

    The problem is that I get a "No working C compiler found" error.

    The conftest.log output:

    $ cat conftest.log
    ./configure: line 153: arm-linux-androideabi-gcc: command not found

    But arm-linux-androideabi-gcc is right there in the toolchain's bin folder!

    Looking at this other question, it seems that even though the file exists, the 64-bit Mac cannot execute the arm-linux-androideabi-gcc binary (the toolchain pointed to is the linux-x86_64 prebuilt, a Linux executable), so configure reports this misleading error and log.
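
    A likely adjustment, sketched below on the assumption that the NDK installed on the Mac ships a darwin-x86_64 prebuilt toolchain (the GCC version directory is a guess; use whatever your NDK actually contains):

    # hypothetical paths - point TOOLCHAIN at the Darwin prebuilt instead of the Linux one
    NDK=~/development/android-ndk-r10c
    TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/darwin-x86_64
    PLATFORM=$NDK/platforms/android-21/arch-arm

    ./configure \
    --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
    --sysroot=$PLATFORM \
    --host=arm-linux \
    --enable-pic \
    --enable-static \
    --disable-cli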


    I am on Mac OS X 10.10 and I have installed the Xcode Command Line Tools:

    $ xcode-select -p
    /Applications/Xcode.app/Contents/Developer

    GCC version:

    $ gcc --version
    Configured with: --prefix=/Applications/Xcode.app/Contents/Developer/usr --with-gxx-include-dir=/usr/include/c++/4.2.1
    Apple LLVM version 6.0 (clang-600.0.54) (based on LLVM 3.5svn)
    Target: x86_64-apple-darwin14.0.0
    Thread model: posix

    Can anyone tell me how to fix this, please?

  • Streaming client over TCP and RTSP through Wi-Fi or LAN in Android

    6 January 2015, by Gowtham

    I am struggling to develop a streaming client for DVR cameras. I tried VLC Media Player over the RTSP protocol and got it working with standard Wi-Fi routers (Netgear, etc.), but the same code does not work with other Wi-Fi modems. I am now working with the FFmpeg framework to implement the streaming client on Android through the JNI API, but I don't have a clear idea of how to structure the JNI layer.

    The network camera works with the IP Cam Viewer app.

    My code is below:

    /*****************************************************/
    /* functional call */
    /*****************************************************/

    jboolean Java_FFmpeg_allocateBuffer( JNIEnv* env, jobject thiz )
    {

       // Allocate an AVFrame structure
       pFrameRGB=avcodec_alloc_frame();
       if(pFrameRGB==NULL)
           return 0;
    sprintf(debugMsg, "%d %d", screenWidth, screenHeight);
    INFO(debugMsg);
       // Determine required buffer size and allocate buffer
       numBytes=avpicture_get_size(dstFmt, screenWidth, screenHeight);
    /*
       numBytes=avpicture_get_size(dstFmt, pCodecCtx->width,
                     pCodecCtx->height);
    */
       buffer=(uint8_t *)av_malloc(numBytes * sizeof(uint8_t));

       // Assign appropriate parts of buffer to image planes in pFrameRGB
       // Note that pFrameRGB is an AVFrame, but AVFrame is a superset
       // of AVPicture
       avpicture_fill((AVPicture *)pFrameRGB, buffer, dstFmt, screenWidth, screenHeight);

       return 1;
    }


    /* for each decoded frame */
    jbyteArray Java_FFmpeg_getNextDecodedFrame( JNIEnv* env, jobject thiz )
    {


    av_free_packet(&packet);

    while(av_read_frame(pFormatCtx, &packet)>=0) {

       if(packet.stream_index==videoStream) {

           avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);

           if(frameFinished) {    

           img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt, screenWidth, screenHeight, dstFmt, SWS_BICUBIC, NULL, NULL, NULL);

    /*
    img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt, pCodecCtx->width, pCodecCtx->height, dstFmt, SWS_BICUBIC, NULL, NULL, NULL);
    */

           sws_scale(img_convert_ctx, (const uint8_t* const*)pFrame->data, pFrame->linesize,
        0, pCodecCtx->height, pFrameRGB->data, pFrameRGB->linesize);

    ++frameCount;

           /* uint8_t == unsigned 8 bits == jboolean */
           jbyteArray nativePixels = (*env)->NewByteArray(env, numBytes);
           (*env)->SetByteArrayRegion(env, nativePixels, 0, numBytes, buffer);
           return nativePixels;
           }

       }

       av_free_packet(&packet);
    }

    return NULL;
    }

    /*****************************************************/
    /* / functional call */
    /*****************************************************/


    jstring
    Java_FFmpeg_play( JNIEnv* env, jobject thiz, jstring jfilePath )
    {
       INFO("--- Play");
    char* filePath = (char *)(*env)->GetStringUTFChars(env, jfilePath, NULL);
    RE(filePath);

    /*****************************************************/

     AVFormatContext *pFormatCtx;
     int             i, videoStream;
     AVCodecContext  *pCodecCtx;
     AVCodec         *pCodec;
     AVFrame         *pFrame;
     AVPacket        packet;
     int             frameFinished;
     float           aspect_ratio;
     struct SwsContext *img_convert_ctx;

    INFO(filePath);

    /* FFmpeg */

     av_register_all();

     if(av_open_input_file(&pFormatCtx, filePath, NULL, 0, NULL)!=0)
       RE("failed av_open_input_file ");

     if(av_find_stream_info(pFormatCtx)<0)
           RE("failed av_find_stream_info");

     videoStream=-1;
     for(i=0; i < pFormatCtx->nb_streams; i++)
       if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) {
         videoStream=i;
         break;
       }
     if(videoStream==-1)
           RE("failed videostream == -1");

     pCodecCtx=pFormatCtx->streams[videoStream]->codec;

     pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
     if(pCodec==NULL) {
       RE("Unsupported codec!");
     }

     if(avcodec_open(pCodecCtx, pCodec)<0)
       RE("failed codec_open");

     pFrame=avcodec_alloc_frame();

    /* /FFmpeg */

    INFO("codec name:");
    INFO(pCodec->name);
    INFO("Getting into stream decode:");

    /* video stream */

     i=0;
     while(av_read_frame(pFormatCtx, &packet)>=0) {

       if(packet.stream_index==videoStream) {
         avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
         if(frameFinished) {
    ++i;
    INFO("frame finished");

       AVPicture pict;
    /*
       img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height,
    pCodecCtx->pix_fmt, pCodecCtx->width, pCodecCtx->height,
    PIX_FMT_YUV420P, SWS_BICUBIC, NULL, NULL, NULL);

       sws_scale(img_convert_ctx, (const uint8_t* const*)pFrame->data, pFrame->linesize,
    0, pCodecCtx->height, pict.data, pict.linesize);
    */
         }
       }
       av_free_packet(&packet);
     }

    /* /video stream */

     av_free(pFrame);

     avcodec_close(pCodecCtx);

     av_close_input_file(pFormatCtx);

     RE("end of main");
    }

    I am not able to get frames from the network camera.

    Could someone give me some ideas on how to implement a live-streaming client for a DVR camera on Android?
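
    Before digging further into the JNI layer, it can help to confirm from a desktop shell that FFmpeg itself can open the camera's RTSP URL, forcing TCP interleaving since the question is about RTSP over TCP. A minimal sketch with a placeholder URL (substitute your camera's actual address and path):

    # list the streams FFmpeg finds in the RTSP session (placeholder URL)
    ffprobe -v error -rtsp_transport tcp -show_streams "rtsp://CAMERA_ADDRESS:PORT/stream"

    # or play it directly to rule out network/codec problems before touching JNI
    ffplay -rtsp_transport tcp "rtsp://CAMERA_ADDRESS:PORT/stream"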

  • gstreamer h264 multicast missing SPS/PPS information

    6 March 2015, by RBI

    I have two video streaming units capable of streaming live video inputs:

    • AXIS Q7424-R Video Encoder
    • EPIPHAN VGADVI Broadcaster 99460

    I am using gstreamer to view these streams on client terminals running Linux. I am interested in the H.264 RTP multicast streams (which both units support).

    I can stream the Epiphan video using the following gstreamer pipeline:

    gst-launch-0.10 udpsrc multicast-group=ADDRESS port=PORT caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' ! rtph264depay ! ffdec_h264 ! autovideosink

    However, this pipeline does not work for the Axis unit, as I get the following error repeatedly:

    ffmpeg:0:: non-existing PPS referenced
    ffmpeg:0:: non-existing PPS 0 referenced
    ffmpeg:0:: decode_slice_header error
    ffmpeg:0:: no frame!
    ffdec_h264: decoding error (len:-1, have_data: 0)

    I have read that this error means the ffmpeg decoder is missing the SPS/PPS information normally provided with a keyframe. The Axis unit has a GOV parameter, which sets the interval at which I-frames are sent; it is set to 32.

    Note that I can view both units' RTP streams in unicast with the following:

    gst-launch-0.10 rtspsrc location=rtsp://ADDRESS:PORT/... latency=100 ! rtph264depay ! ffdec_h264 ! autovideosink

    Since unicast works and the unicast and multicast pipelines are the same (except for the source), my guess is either:

    • My udpsrc caps are simply incorrect for the Axis stream (and I don't really know where or how to verify this),

    • or the Axis multicast format/encoding is different and requires a modification to the pipeline (I find this unlikely, since unicast works and I don't see why the encoding would change between unicast and multicast).

    Any suggestions are appreciated as I am limited by my knowledge of gstreamer and media formats in terms of what to try next.
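
    One avenue consistent with the SPS/PPS hypothesis above, sketched here as an assumption rather than a verified fix: with rtspsrc, the SDP returned by the RTSP handshake carries the sprop-parameter-sets (SPS/PPS), whereas a bare udpsrc has no such side channel, so the depayloader only sees SPS/PPS if the encoder repeats them in-band. One option is to copy the payload type and sprop-parameter-sets from the working unicast session into the multicast caps (the payload value and SPROP_FROM_SDP below are placeholders):

    # 1) run the working unicast pipeline verbosely and note payload and sprop-parameter-sets in the printed caps
    gst-launch-0.10 -v rtspsrc location=rtsp://ADDRESS:PORT/... latency=100 ! rtph264depay ! fakesink

    # 2) paste those values into the multicast caps
    gst-launch-0.10 udpsrc multicast-group=ADDRESS port=PORT caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96, sprop-parameter-sets=(string)"SPROP_FROM_SDP"' ! rtph264depay ! ffdec_h264 ! autovideosink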