Advanced search

Media (91)

Other articles (38)

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded as MP4, Ogv and WebM for HTML5 playback, with the MP4 version also serving Flash playback.
    Audio files are encoded as MP3 and Ogg for HTML5 playback, with the MP3 version also serving Flash playback.
    Where possible, text is analyzed to extract the data needed for search-engine indexing, then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
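
    The conversion step described above can be sketched as a set of ffmpeg invocations. This is only an illustrative guess at the kind of transcode pass such a pipeline runs, not MediaSPIP's actual implementation; the file names, codec choices and flags are assumptions, and DRY_RUN merely prints the commands.

    ```shell
    #!/bin/sh
    # Hypothetical transcode pass for one uploaded video (illustrative only;
    # MediaSPIP's real pipeline and flags may differ).
    SRC="upload.mov"   # assumed input name
    DRY_RUN=1          # print the commands instead of invoking ffmpeg

    run() {
        if [ "$DRY_RUN" -eq 1 ]; then echo "$@"; else "$@"; fi
    }

    # HTML5 targets: WebM (VP8/Vorbis) and Ogv (Theora/Vorbis)
    run ffmpeg -i "$SRC" -c:v libvpx -c:a libvorbis "${SRC%.*}.webm"
    run ffmpeg -i "$SRC" -c:v libtheora -c:a libvorbis "${SRC%.*}.ogv"
    # MP4 (H.264/AAC) serves both HTML5 and Flash players
    run ffmpeg -i "$SRC" -c:v libx264 -c:a aac "${SRC%.*}.mp4"
    ```

    The original upload would be kept alongside these derivatives, as the article notes.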

  • Keeping control of your media in your hands

    13 April 2011, by

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media-sharing.
    While using MediaSPIP, you are invited to avoid words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible, and development is based on expanding the (...)

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things):
    - implementation costs to be shared between several different projects/individuals
    - rapid deployment of multiple unique sites
    - creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
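
    The shared-core architecture described above can be sketched roughly as follows. This is a hypothetical layout, not MediaSPIP's documented installation procedure: one "core" checkout is shared, and each site directory symlinks to it (paths and directory names here are assumptions).

    ```shell
    #!/bin/sh
    # Sketch of a SPIP-style farm: one shared core, one lightweight
    # directory per site. Illustrative only; real layouts differ.
    ROOT="${ROOT:-$(mktemp -d)}"
    CORE="$ROOT/spip-core"
    mkdir -p "$CORE/ecrire"   # the single shared core

    for site in site-a.example site-b.example; do
        mkdir -p "$ROOT/sites/$site"
        ln -s "$CORE/ecrire" "$ROOT/sites/$site/ecrire"  # every site reuses the core
    done
    ```

    Upgrading the single core then upgrades every site in the farm at once, which is where the shared-cost benefit comes from.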

On other sites (4884)

  • Video created using H263 codec and ffmpeg does not play on android device [closed]

    21 March 2013, by susheel tickoo

    I have created a video using FFmpeg and the H263 codec, but when I play it on an Android device the player is unable to play it. I have tried both the .mp4 and .3gp extensions.

     void generate(JNIEnv *pEnv, jobject pObj, jobjectArray stringArray, int famerate, int width, int height, jstring videoFilename)
       {
           AVCodec *codec;
           AVCodecContext *c= NULL;
           //int framesnum=5;
           int i,looper, out_size, size, x, y,encodecbuffsize,j;
           __android_log_write(ANDROID_LOG_INFO, "record","************into generate************");
           int imagecount= (*pEnv)->GetArrayLength(pEnv, stringArray);
           __android_log_write(ANDROID_LOG_INFO, "record","************got imagecount************");
           int retval=-10;

           FILE *f;
           AVFrame *picture,*encoded_avframe;
           uint8_t  *encodedbuffer;
           jbyte *raw_record;
           char logdatadata[100];




           int returnvalue = -1,numBytes =-1;
           const char *gVideoFileName = (char *)(*pEnv)->GetStringUTFChars(pEnv, videoFilename, NULL);
           __android_log_write(ANDROID_LOG_INFO, "record","************got video file name************");

           /* find the video encoder (note: this requests CODEC_ID_H264, although the question targets H263) */
           codec = avcodec_find_encoder(CODEC_ID_H264);
           if (!codec) {
               __android_log_write(ANDROID_LOG_INFO, "record","codec not found");
               exit(1);
           }
           c= avcodec_alloc_context();
           /*c->bit_rate = 400000;

           c->width = width;
           c->height = height;

           c->time_base= (AVRational){1,famerate};
           c->gop_size = 12; // emit one intra frame every ten frames
           c->max_b_frames=0;
           c->pix_fmt = PIX_FMT_YUV420P;
           c->codec_type = AVMEDIA_TYPE_VIDEO;
           c->codec_id = CODEC_ID_H263;*/

           c->bit_rate = 400000;
           /* resolution must be a multiple of two */
           c->width = 176;
           c->height = 144;
           c->pix_fmt = PIX_FMT_YUV420P;

           c->qcompress = 0.0;
           c->qblur = 0.0;
           c->gop_size = 20;  // or 1
           c->sub_id = 1;
           c->workaround_bugs = FF_BUG_AUTODETECT;

           //pFFmpeg->c->time_base = (AVRational){1,25};
           c->time_base.num = 1;
           c->time_base.den = famerate;
           c->max_b_frames = 0; // no B-frames in H263

           // c->opaque = opaque;
           c->dct_algo = FF_DCT_AUTO;
           c->idct_algo = FF_IDCT_AUTO;
           //c->rtp_mode = 0;
           c->rtp_payload_size = 1000;
           c->rtp_callback = 0; // ffmpeg_rtp_callback;

           c->flags |= CODEC_FLAG_QSCALE;
           c->mb_decision = FF_MB_DECISION_RD;
           c->thread_count = 1;
       #define DEFAULT_RATE (16 * 8 * 1024)
           c->rc_min_rate = DEFAULT_RATE;
           c->rc_max_rate = DEFAULT_RATE;
           c->rc_buffer_size = DEFAULT_RATE * 64;
           c->bit_rate = DEFAULT_RATE;


           sprintf(logdatadata, "------width from c ---- = %d",width);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
           sprintf(logdatadata, "------height from c ---- = %d",height);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

           __android_log_write(ANDROID_LOG_INFO, "record","************Found codec and now opening it************");
           /* open it */
           retval = avcodec_open(c, codec);
           if ( retval < 0)
           {
               sprintf(logdatadata, "------avcodec_open ---- retval = %d",retval);
               __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
               __android_log_write(ANDROID_LOG_INFO, "record","could not open codec");
               exit(1);
           }
           __android_log_write(ANDROID_LOG_INFO, "record","statement 5");
           f = fopen(gVideoFileName, "ab");

           if (!f) {
               __android_log_write(ANDROID_LOG_INFO, "record","could not open video file");
               exit(1);
           }

           __android_log_write(ANDROID_LOG_INFO, "record", "***************Allocating encodedbuffer*********\n");
           encodecbuffsize = avpicture_get_size(PIX_FMT_RGB24, c->width, c->height);

           sprintf(logdatadata, "encodecbuffsize = %d",encodecbuffsize);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
           encodedbuffer = malloc(encodecbuffsize);

           jclass cls = (*pEnv)->FindClass(pEnv, "com/canvasm/mediclinic/VideoGenerator");
           jmethodID mid = (*pEnv)->GetMethodID(pEnv, cls, "videoProgress", "(Ljava/lang/String;)Ljava/lang/String;");
           jmethodID mid_delete = (*pEnv)->GetMethodID(pEnv, cls, "deleteTempFile", "(Ljava/lang/String;)Ljava/lang/String;");

           if (mid == 0)
               return;

           __android_log_write(ANDROID_LOG_INFO, "native","got method id");


           for(i=0;i<=imagecount;i++) {

               jboolean isCp;
               int progress = 0;
               float temp;
               jstring string;
               if(i==imagecount)
                   string = (jstring) (*pEnv)->GetObjectArrayElement(pEnv, stringArray, imagecount-1);
               else
                   string = (jstring) (*pEnv)->GetObjectArrayElement(pEnv, stringArray, i);

               const char *rawString = (*pEnv)->GetStringUTFChars(pEnv, string, &isCp);

               __android_log_write(ANDROID_LOG_INFO, "record",rawString);
               picture = OpenImage(rawString,width,height);
               //WriteJPEG(c,picture,i);
               //   encode video
               memset(encodedbuffer,0,encodecbuffsize);

               //do{

               for(looper=0;looper<5;looper++)
               {
                   memset(encodedbuffer,0,encodecbuffsize);
                   out_size = avcodec_encode_video(c, encodedbuffer, encodecbuffsize, picture);
                   sprintf(logdatadata, "avcodec_encode_video ----- out_size = %d \n",out_size );
                   __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
                   if(out_size>0)
                       break;
               }
               __android_log_write(ANDROID_LOG_INFO, "record","*************Start looping for same image*******");
               returnvalue = fwrite(encodedbuffer, 1, out_size, f);
               sprintf(logdatadata, "fwrite ----- returnvalue = %d \n",returnvalue );
               __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

               __android_log_write(ANDROID_LOG_INFO, "record","*************End looping for same image*******");

               // publishing progress
               progress = ((i*100)/(imagecount+1))+15;//+1 is for last frame duplicated entry
               if(progress<20 )
                   progress =20;
               if(progress>=95 )
                   progress =95;

               sprintf(logdatadata, "%d",progress );
               jstring jstrBuf = (*pEnv)->NewStringUTF(pEnv, logdatadata);
               (*pEnv)->CallObjectMethod(pEnv, pObj, mid,jstrBuf);

               if(i>0)
                   (*pEnv)->CallObjectMethod(pEnv, pObj, mid_delete,string);

           }


           /* get the delayed frames */
           for (; out_size; i++) {
               fflush(stdout);
               out_size = avcodec_encode_video(c, encodedbuffer, encodecbuffsize, NULL);
               fwrite(encodedbuffer, 1, out_size, f); /* element size must be 1 byte, not 20 */
           }

           /* add sequence end code to have a real mpeg file */
           encodedbuffer[0] = 0x00;
           encodedbuffer[1] = 0x00;
           encodedbuffer[2] = 0x01;
           encodedbuffer[3] = 0xb7;
           fwrite(encodedbuffer, 1, 4, f);
           fclose(f);
           free(encodedbuffer);
           avcodec_close(c);
           av_free(c);
           __android_log_write(ANDROID_LOG_INFO, "record","Video created ");

           // last updation of 100%
           sprintf(logdatadata, "%d",100 );
           jstring jstrBuf = (*pEnv)->NewStringUTF(pEnv, logdatadata);
           (*pEnv)->CallObjectMethod(pEnv, pObj, mid,jstrBuf);
       }



       AVFrame* OpenImage(const char* imageFileName,int w,int h)
       {
           AVFrame *pFrame;
           AVCodec *pCodec ;
           AVFormatContext *pFormatCtx;
           AVCodecContext *pCodecCtx;
           uint8_t *buffer;
           int frameFinished,framesNumber = 0,retval = -1,numBytes=0;
           AVPacket packet;
           char logdatadata[100];
           //__android_log_write(ANDROID_LOG_INFO, "OpenImage",imageFileName);
           if(av_open_input_file(&pFormatCtx, imageFileName, NULL, 0, NULL)!=0)
           //if(avformat_open_input(&pFormatCtx,imageFileName,NULL,NULL)!=0)
           {
               __android_log_write(ANDROID_LOG_INFO, "record",
                       "Can't open image file ");
               return NULL;
           }

           pCodecCtx = pFormatCtx->streams[0]->codec;
           pCodecCtx->width = w;
           pCodecCtx->height = h;
           pCodecCtx->pix_fmt = PIX_FMT_YUV420P;

           // Find the decoder for the video stream
           pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
           if (!pCodec)
           {
               __android_log_write(ANDROID_LOG_INFO, "record",
                       "Can't open image file ");
               return NULL;
           }

           pFrame = avcodec_alloc_frame();

           numBytes = avpicture_get_size(PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height);
           buffer = (uint8_t *) av_malloc(numBytes * sizeof(uint8_t));
           sprintf(logdatadata, "numBytes  = %d",numBytes);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

           retval = avpicture_fill((AVPicture *) pFrame, buffer, PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height);

           // Open codec
           if(avcodec_open(pCodecCtx, pCodec)<0)
           {
               __android_log_write(ANDROID_LOG_INFO, "record","Could not open codec");
               return NULL;
           }

           if (!pFrame)
           {
               __android_log_write(ANDROID_LOG_INFO, "record","Can't allocate memory for AVFrame\n");
               return NULL;
           }
           int readval = -5;
           while ((readval = av_read_frame(pFormatCtx, &packet)) >= 0)
           {
               if(packet.stream_index != 0)
                   continue;

               int ret = avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
               sprintf(logdatadata, "avcodec_decode_video2 ret = %d",ret);
               __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

               if (ret > 0)
               {
                   __android_log_write(ANDROID_LOG_INFO, "record","Frame is decoded\n");
                   pFrame->quality = 4;
                   av_free_packet(&packet);
                   av_close_input_file(pFormatCtx);
                   return pFrame;
               }
               else
               {
                   __android_log_write(ANDROID_LOG_INFO, "record","error while decoding frame \n");
               }
           }
           sprintf(logdatadata, "readval = %d",readval);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
           return NULL; /* no frame could be decoded */
       }

    The generate method takes a list of strings (paths to images) and converts them into a video; the OpenImage method is responsible for converting a single image into an AVFrame.
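
    For what it's worth, one likely culprit: generate writes the raw encoded frames straight to the file with fwrite, so the result is a bare elementary stream, and renaming it .mp4 or .3gp does not produce the container Android's MediaPlayer expects (note also that the code requests CODEC_ID_H264 while the question targets H263). A quick way to test this hypothesis with the ffmpeg CLI is to remux the raw stream into a real container; file names here are assumed, and DRY_RUN prints the commands instead of running them.

    ```shell
    #!/bin/sh
    # Sketch: wrap a raw H263 elementary stream in a 3GP container so
    # players can read it. Illustrative names; DRY_RUN only echoes.
    DRY_RUN=1
    run() { if [ "$DRY_RUN" -eq 1 ]; then echo "$@"; else "$@"; fi; }

    RAW="frames.h263"   # what the fwrite() loop actually produces
    run ffmpeg -f h263 -i "$RAW" -c:v copy output.3gp   # remux, no re-encode
    run ffprobe output.3gp                              # verify container/stream
    ```

    If remuxing this way makes the file playable, the in-code fix would be to mux through libavformat (an output AVFormatContext driven by av_write_header, av_interleaved_write_frame and av_write_trailer) rather than writing packets with fwrite.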

  • Revision 7d8fc26c9c : vpxdec : support scaling output Adds an option, --scale, that will rescale any f

    26 February 2013, by John Koleszar

    Changed Paths : Modify /examples.mk Modify /vpxdec.c vpxdec : support scaling output Adds an option, --scale, that will rescale any frames produced by the decoder that don't match the resolution of the first frame to that resolution. This is useful for playback of files that use spatial (...)
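
    As a sketch, the new option might be invoked as below. The --scale spelling comes from the commit message; the input/output names and the -o output flag are assumptions, and DRY_RUN only prints the command.

    ```shell
    #!/bin/sh
    # Hypothetical use of vpxdec's new --scale option from this revision.
    DRY_RUN=1
    run() { if [ "$DRY_RUN" -eq 1 ]; then echo "$@"; else "$@"; fi; }

    # Decode a file whose later frames change resolution, rescaling every
    # frame to the size of the first one before writing the output.
    run vpxdec --scale -o out.y4m spatially_scaled.webm
    ```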

  • Evolution #2919: Recursive loops and the {si ...} criterion of iterators

    2 March 2013, by marcimat

    Resolving one of the two points mentioned is quite delicate. 1) Criteria cannot be placed on a recursive loop: indeed, since such a loop simply calls the function of its parent loop, no more and no less, and it is the parent that holds the criteria. However, the criteria (which are placed (...)