Other articles (65)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm" mode, you will also need to make other modifications (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Customize by adding your own logo, banner, or background image

    5 September 2013, by

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

On other sites (13269)

  • FFmpeg does not decode h264 stream

    5 July 2012, by HAPPY_TIGER

    I am trying to decode an h264 stream from an rtsp server and render it on iPhone.

    I found some libraries and read some articles about it.

    The libraries come from dropCam for iPhone and are called RTSPClient and DecoderWrapper.

    But I cannot decode the frame data with DecoderWrapper, which uses ffmpeg.

    Here is my code.

    VideoViewer.m

    - (void)didReceiveFrame:(NSData*)frameData presentationTime:(NSDate*)presentationTime
    {
       [VideoDecoder staticInitialize];
       mConverter = [[VideoDecoder alloc] initWithCodec:kVCT_H264 colorSpace:kVCS_RGBA32 width:320 height:240 privateData:nil];


       [mConverter decodeFrame:frameData];

       if ([mConverter isFrameReady]) {
           UIImage *imageData =[mConverter getDecodedFrame];
           if (imageData) {
               [mVideoView setImage:imageData];
               NSLog(@"decoded!");
           }
       }
    }

    VideoDecoder.m

    - (id)initWithCodec:(enum VideoCodecType)codecType
            colorSpace:(enum VideoColorSpace)colorSpace
                 width:(int)width
                height:(int)height
           privateData:(NSData*)privateData {
       if(self = [super init]) {

           codec = avcodec_find_decoder(CODEC_ID_H264);
           codecCtx = avcodec_alloc_context();

           // Note: for H.264 RTSP streams, the width and height are usually not specified (width and height are 0).  
           // These fields will become filled in once the first frame is decoded and the SPS is processed.
           codecCtx->width = width;
           codecCtx->height = height;

           codecCtx->extradata = av_malloc([privateData length]);
           codecCtx->extradata_size = [privateData length];
           [privateData getBytes:codecCtx->extradata length:codecCtx->extradata_size];
           codecCtx->pix_fmt = PIX_FMT_RGBA;
    #ifdef SHOW_DEBUG_MV
           codecCtx->debug_mv = 0xFF;
    #endif

           srcFrame = avcodec_alloc_frame();
           dstFrame = avcodec_alloc_frame();

           int res = avcodec_open(codecCtx, codec);
           if (res < 0)
           {
               NSLog(@"Failed to initialize decoder");
           }

       }

       return self;    
    }

    - (void)decodeFrame:(NSData*)frameData {


       AVPacket packet = {0};
       packet.data = (uint8_t*)[frameData bytes];
       packet.size = [frameData length];

       int frameFinished=0;
       NSLog(@"Packet size===>%d",packet.size);
       // Is this a packet from the video stream?
       if(packet.stream_index==0)
       {
           int res = avcodec_decode_video2(codecCtx, srcFrame, &frameFinished, &packet);
           NSLog(@"Res value===>%d",res);
           NSLog(@"frame data===>%d",(int)srcFrame->data);
           if (res < 0)
           {
               NSLog(@"Failed to decode frame");
           }
       }
       else
       {
           NSLog(@"No video stream found");
       }


       // Need to delay initializing the output buffers because we don't know the dimensions until we decode the first frame.
       if (!outputInit) {
           if (codecCtx->width > 0 && codecCtx->height > 0) {
    #ifdef _DEBUG
               NSLog(@"Initializing decoder with frame size of: %dx%d", codecCtx->width, codecCtx->height);
    #endif

               outputBufLen = avpicture_get_size(PIX_FMT_RGBA, codecCtx->width, codecCtx->height);
               outputBuf = av_malloc(outputBufLen);

               avpicture_fill((AVPicture*)dstFrame, outputBuf, PIX_FMT_RGBA, codecCtx->width, codecCtx->height);

               convertCtx = sws_getContext(codecCtx->width, codecCtx->height, codecCtx->pix_fmt,  codecCtx->width,
                                           codecCtx->height, PIX_FMT_RGBA, SWS_FAST_BILINEAR, NULL, NULL, NULL);

               outputInit = YES;
               frameFinished=1;
           }
           else {
               NSLog(@"Could not get video output dimensions");
           }
       }

       if (frameFinished)
           frameReady = YES;

    }

    The console output is as follows.

    2011-05-16 20:16:04.223 RTSPTest1[41226:207] Packet size===>359
    [h264 @ 0x5815c00] no frame!
    2011-05-16 20:16:04.223 RTSPTest1[41226:207] Res value===>-1
    2011-05-16 20:16:04.224 RTSPTest1[41226:207] frame data===>101791200
    2011-05-16 20:16:04.224 RTSPTest1[41226:207] Failed to decode frame
    2011-05-16 20:16:04.225 RTSPTest1[41226:207] decoded!
    2011-05-16 20:16:04.226 RTSPTest1[41226:207] Packet size===>424
    [h264 @ 0x5017c00] no frame!
    2011-05-16 20:16:04.226 RTSPTest1[41226:207] Res value===>-1
    2011-05-16 20:16:04.227 RTSPTest1[41226:207] frame data===>81002704
    2011-05-16 20:16:04.227 RTSPTest1[41226:207] Failed to decode frame
    2011-05-16 20:16:04.228 RTSPTest1[41226:207] decoded!
    2011-05-16 20:16:04.229 RTSPTest1[41226:207] Packet size===>424
    [h264 @ 0x581d000] no frame!
    2011-05-16 20:16:04.229 RTSPTest1[41226:207] Res value===>-1
    2011-05-16 20:16:04.230 RTSPTest1[41226:207] frame data===>101791616
    2011-05-16 20:16:04.230 RTSPTest1[41226:207] Failed to decode frame
    2011-05-16 20:16:04.231 RTSPTest1[41226:207] decoded!
    . . . .  .

    But the simulator shows nothing.

    What's wrong with my code?

    Help me solve this problem.

    Thanks for your answers.
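    A likely cause of the `no frame!` errors in the log above is the input format: libavcodec's H.264 decoder expects Annex B byte-stream input (each NAL unit preceded by a start code), while an RTSP depacketizer typically delivers raw NAL payloads, and this code also passes `privateData:nil`, so no SPS/PPS extradata is ever set. A minimal sketch of the conversion, assuming `nal`/`nal_len` hold one raw NAL unit as received from the depacketizer (the function name is illustrative):

    ```c
    #include <stdint.h>
    #include <stdlib.h>
    #include <string.h>

    /* Prepend the 4-byte Annex B start code (00 00 00 01) to a raw NAL unit,
     * as the H.264 decoder expects when no global extradata is set.
     * Returns a newly allocated buffer that the caller must free(). */
    static uint8_t *nal_to_annexb(const uint8_t *nal, size_t nal_len, size_t *out_len)
    {
        static const uint8_t start_code[4] = {0x00, 0x00, 0x00, 0x01};
        uint8_t *buf = malloc(sizeof(start_code) + nal_len);
        if (!buf)
            return NULL;
        memcpy(buf, start_code, sizeof(start_code));
        memcpy(buf + sizeof(start_code), nal, nal_len);
        *out_len = sizeof(start_code) + nal_len;
        return buf;
    }
    ```

    The SPS and PPS NAL units (available in the SDP's sprop-parameter-sets attribute) would need to be pushed through the same path before the first picture, since no extradata is provided at decoder setup.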

  • Piwik Mobile is now available!

    18 August 2010, by SteveG

    After a few months of development, the Piwik team is proud to present its mobile client. Piwik Mobile is already available in the app markets for phones running iOS (such as iPhone, iPod and iPad) or Android (1.6 or higher). Piwik Mobile in the markets: Android: http://www.androidpit.com/en/android/market/apps/app/org.piwik.mobile/Piwik-Mobile-Beta iOS: http://itunes.apple.com/us/app/piwikmobile/id385536442?mt=8 Piwik Mobile was developed using [...]


  • accelerate x264 encoding

    7 August 2012, by Saraswati

    I am using x264 to encode raw data captured from the iPhone camera, but the encoding is very slow. Can anyone help me accelerate the encoding speed?

    I have used the following settings to build the x264 library:

    //for armv6

    CC=/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc ./configure --host=arm-apple-darwin --sysroot=/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.3.sdk --prefix='dist' --extra-cflags='-arch armv6' --extra-ldflags=-L/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.3.sdk/usr/lib/system/ --extra-ldflags='-arch armv6' --enable-pic --disable-asm

    //for armv7

    CC=/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc ./configure --host=arm-apple-darwin --sysroot=/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.3.sdk --prefix='dist' --extra-cflags='-arch armv7' --extra-ldflags=-L/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.3.sdk/usr/lib/system/ --extra-ldflags='-arch armv7' --enable-pic

    I am using the default preset like this:

    x264_param_default_preset(param, "slow", "zerolatency");

    and setting a few parameters:

    param->i_bframe = 0;
    param->analyse.i_me_method = X264_ME_HEX;
    param->analyse.i_subpel_refine = 2;
    param->i_frame_reference = 1;
    param->analyse.b_mixed_references = 0;
    param->analyse.i_trellis = 0;
    param->rc.b_mb_tree = 0;
    param->analyse.i_weighted_pred = X264_WEIGHTP_NONE;


    param->rc.i_bitrate = 180;
    param->rc.i_qp_min = 20;
    param->rc.i_qp_max = 26;

    param->i_keyint_max = 15;
    param->i_keyint_min = 15;


    param->i_width = w;
    param->i_height = h;

    x264_param_apply_profile(param, "baseline");
    x264_picture_alloc( &(enc->pic), X264_CSP_I420, param->i_width, param->i_height );
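    Two observations on the settings above, offered as a sketch rather than a definitive fix. First, the armv6 build is configured with --disable-asm, which turns off x264's hand-written ARM assembly and can alone slow encoding down severalfold. Second, "slow" is one of the most expensive presets; for realtime camera capture, "ultrafast" or "veryfast" with the zerolatency tune is the usual starting point, and "ultrafast" already disables B-frames, mixed references, trellis, mb-tree and weighted prediction, making the manual overrides above unnecessary. A hypothetical replacement for the parameter setup (rate and GOP values copied from the original):

    ```c
    #include <x264.h>

    /* Hypothetical realtime-oriented x264 setup: start from the cheapest
     * preset instead of "slow", then apply the same rate/GOP constraints. */
    static int setup_fast_params(x264_param_t *param, int w, int h)
    {
        if (x264_param_default_preset(param, "ultrafast", "zerolatency") < 0)
            return -1;

        param->i_width  = w;
        param->i_height = h;

        param->rc.i_bitrate = 180;   /* kbit/s, as in the original settings */
        param->rc.i_qp_min  = 20;
        param->rc.i_qp_max  = 26;
        param->i_keyint_max = 15;
        param->i_keyint_min = 15;

        return x264_param_apply_profile(param, "baseline");
    }
    ```

    From there, presets can be stepped back toward "veryfast" or "faster" until the device can no longer keep up, trading speed for quality.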