
Media (91)

Other articles (42)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things):
    - implementation costs to be shared between several different projects / individuals
    - rapid deployment of multiple unique sites
    - creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Customising categories

    21 June 2013, by

    Category creation form
    For those who know SPIP well, a category can be thought of as a rubrique (section).
    For a document of the category type, the fields offered by default are: Texte
    This form can be modified under:
    Administration > Configuration des masques de formulaire.
    For a document of the media type, the fields not displayed by default are: Descriptif rapide
    It is also in this configuration section that you can specify the (...)

On other sites (9916)

  • Video created using H263 codec and ffmpeg does not play on android device [closed]

    21 March 2013, by susheel tickoo

    I have created a video using FFmpeg and the H.263 codec, but when I play the video on an Android device the player is unable to play it. I have tried both the .mp4 and .3gp extensions.

     void  generate(JNIEnv *pEnv, jobject pObj,jobjectArray stringArray,int famerate,int width,int height,jstring videoFilename)
       {
           AVCodec *codec;
           AVCodecContext *c= NULL;
           //int framesnum=5;
           int i,looper, out_size, size, x, y,encodecbuffsize,j;
           __android_log_write(ANDROID_LOG_INFO, "record","************into generate************");
           int imagecount= (*pEnv)->GetArrayLength(pEnv, stringArray);
           __android_log_write(ANDROID_LOG_INFO, "record","************got imagecount************");
           int retval=-10;

           FILE *f;
           AVFrame *picture,*encoded_avframe;
           uint8_t  *encodedbuffer;
           jbyte *raw_record;
           char logdatadata[100];




           int returnvalue = -1,numBytes =-1;
           const char *gVideoFileName = (char *)(*pEnv)->GetStringUTFChars(pEnv, videoFilename, NULL);
           __android_log_write(ANDROID_LOG_INFO, "record","************got video file name************");

           /* find the video encoder */
           codec = avcodec_find_encoder(CODEC_ID_H264);
           if (!codec) {
               __android_log_write(ANDROID_LOG_INFO, "record","codec not found");
               exit(1);
           }
           c= avcodec_alloc_context();
           /*c->bit_rate = 400000;

           c->width = width;
           c->height = height;

           c->time_base= (AVRational){1,famerate};
           c->gop_size = 12; // emit one intra frame every ten frames
           c->max_b_frames=0;
           c->pix_fmt = PIX_FMT_YUV420P;
           c->codec_type = AVMEDIA_TYPE_VIDEO;
           c->codec_id = CODEC_ID_H263;*/

           c->bit_rate = 400000;
           // resolution must be a multiple of two
           c->width = 176;
           c->height = 144;
           c->pix_fmt = PIX_FMT_YUV420P;

           c->qcompress = 0.0;
           c->qblur = 0.0;
           c->gop_size = 20;  // or 1
           c->sub_id = 1;
           c->workaround_bugs = FF_BUG_AUTODETECT;

           //pFFmpeg->c->time_base = (AVRational){1,25};
           c->time_base.num = 1;
           c->time_base.den = famerate;
           c->max_b_frames = 0; // no B-frames in H.263

           // c->opaque = opaque;
           c->dct_algo = FF_DCT_AUTO;
           c->idct_algo = FF_IDCT_AUTO;
           //c->rtp_mode = 0;
           c->rtp_payload_size = 1000;
           c->rtp_callback = 0; // ffmpeg_rtp_callback;

           c->flags |= CODEC_FLAG_QSCALE;
           c->mb_decision = FF_MB_DECISION_RD;
           c->thread_count = 1;
       #define DEFAULT_RATE    (16 * 8 * 1024)
           c->rc_min_rate = DEFAULT_RATE;
           c->rc_max_rate = DEFAULT_RATE;
           c->rc_buffer_size = DEFAULT_RATE * 64;
           c->bit_rate = DEFAULT_RATE;

           sprintf(logdatadata, "------width from c ---- = %d",width);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
           sprintf(logdatadata, "------height from c ---- = %d",height);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

           __android_log_write(ANDROID_LOG_INFO, "record","************Found codec and now opening it************");
           /* open it */
           retval = avcodec_open(c, codec);
           if ( retval < 0)
           {
               sprintf(logdatadata, "------avcodec_open ---- retval = %d",retval);
               __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
               __android_log_write(ANDROID_LOG_INFO, "record","could not open codec");
               exit(1);
           }
           __android_log_write(ANDROID_LOG_INFO, "record","statement 5");
           f = fopen(gVideoFileName, "ab");

           if (!f) {
               __android_log_write(ANDROID_LOG_INFO, "record","could not open video file");
               exit(1);
           }

           __android_log_write(ANDROID_LOG_INFO, "record", "***************Allocating encodedbuffer*********\n");
           encodecbuffsize = avpicture_get_size(PIX_FMT_RGB24, c->width, c->height);

           sprintf(logdatadata, "encodecbuffsize = %d",encodecbuffsize);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
           encodedbuffer = malloc(encodecbuffsize);

           jclass cls = (*pEnv)->FindClass(pEnv, "com/canvasm/mediclinic/VideoGenerator");
           jmethodID mid = (*pEnv)->GetMethodID(pEnv, cls, "videoProgress", "(Ljava/lang/String;)Ljava/lang/String;");
           jmethodID mid_delete = (*pEnv)->GetMethodID(pEnv, cls, "deleteTempFile", "(Ljava/lang/String;)Ljava/lang/String;");

           if (mid == 0)
               return;

           __android_log_write(ANDROID_LOG_INFO, "native","got method id");


           for(i=0;i<=imagecount;i++) {

               jboolean isCp;
               int progress = 0;
               float temp;
               jstring string;
               if(i==imagecount)
                   string = (jstring) (*pEnv)->GetObjectArrayElement(pEnv, stringArray, imagecount-1);
               else
                   string = (jstring) (*pEnv)->GetObjectArrayElement(pEnv, stringArray, i);

               const char *rawString = (*pEnv)->GetStringUTFChars(pEnv, string, &isCp);

               __android_log_write(ANDROID_LOG_INFO, "record",rawString);
               picture = OpenImage(rawString,width,height);
               //WriteJPEG(c,picture,i);
               //   encode video
               memset(encodedbuffer,0,encodecbuffsize);

               //do{

               for(looper=0;looper<5;looper++)
               {
                   memset(encodedbuffer,0,encodecbuffsize);
                   out_size = avcodec_encode_video(c, encodedbuffer, encodecbuffsize, picture);
                   sprintf(logdatadata, "avcodec_encode_video ----- out_size = %d \n",out_size );
                   __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
                   if(out_size>0)
                       break;
               }
               __android_log_write(ANDROID_LOG_INFO, "record","*************Start looping for same image*******");
               returnvalue = fwrite(encodedbuffer, 1, out_size, f);
               sprintf(logdatadata, "fwrite ----- returnvalue = %d \n",returnvalue );
               __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

               __android_log_write(ANDROID_LOG_INFO, "record","*************End looping for same image*******");

               // publishing progress
               progress = ((i*100)/(imagecount+1))+15;//+1 is for last frame duplicated entry
               if(progress<20 )
                   progress =20;
               if(progress>=95 )
                   progress =95;

               sprintf(logdatadata, "%d",progress );
               jstring jstrBuf = (*pEnv)->NewStringUTF(pEnv, logdatadata);
               (*pEnv)->CallObjectMethod(pEnv, pObj, mid,jstrBuf);

               if(i>0)
                   (*pEnv)->CallObjectMethod(pEnv, pObj, mid_delete,string);

           }


           /* get the delayed frames */
           for(; out_size; i++) {
               fflush(stdout);
               out_size = avcodec_encode_video(c, encodedbuffer, encodecbuffsize, NULL);
               fwrite(encodedbuffer, 20, out_size, f);
           }

           /* add sequence end code to have a real mpeg file */
           encodedbuffer[0] = 0x00;
           encodedbuffer[1] = 0x00;
           encodedbuffer[2] = 0x01;
           encodedbuffer[3] = 0xb7;
           fwrite(encodedbuffer, 1, 4, f);
           fclose(f);
           free(encodedbuffer);
           avcodec_close(c);
           av_free(c);
           __android_log_write(ANDROID_LOG_INFO, "record","Video created ");

           // last updation of 100%
           sprintf(logdatadata, "%d",100 );
           jstring jstrBuf = (*pEnv)->NewStringUTF(pEnv, logdatadata);
           (*pEnv)->CallObjectMethod(pEnv, pObj, mid,jstrBuf);
       }



       AVFrame* OpenImage(const char* imageFileName,int w,int h)
       {
           AVFrame *pFrame;
           AVCodec *pCodec ;
           AVFormatContext *pFormatCtx;
           AVCodecContext *pCodecCtx;
           uint8_t *buffer;
           int frameFinished,framesNumber = 0,retval = -1,numBytes=0;
           AVPacket packet;
           char logdatadata[100];
           //__android_log_write(ANDROID_LOG_INFO, "OpenImage",imageFileName);
           if(av_open_input_file(&pFormatCtx, imageFileName, NULL, 0, NULL)!=0)
           //if(avformat_open_input(&pFormatCtx,imageFileName,NULL,NULL)!=0)
           {
               __android_log_write(ANDROID_LOG_INFO, "record",
                       "Can't open image file ");
               return NULL;
           }

           pCodecCtx = pFormatCtx->streams[0]->codec;
           pCodecCtx->width = w;
           pCodecCtx->height = h;
           pCodecCtx->pix_fmt = PIX_FMT_YUV420P;

           // Find the decoder for the video stream
           pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
           if (!pCodec)
           {
               __android_log_write(ANDROID_LOG_INFO, "record",
                       "Can't open image file ");
               return NULL;
           }

           pFrame = avcodec_alloc_frame();

           numBytes = avpicture_get_size(PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height);
           buffer = (uint8_t *) av_malloc(numBytes * sizeof(uint8_t));
           sprintf(logdatadata, "numBytes  = %d",numBytes);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

           retval = avpicture_fill((AVPicture *) pFrame, buffer, PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height);

           // Open codec
           if(avcodec_open(pCodecCtx, pCodec)<0)
           {
               __android_log_write(ANDROID_LOG_INFO, "record","Could not open codec");
               return NULL;
           }

           if (!pFrame)
           {
               __android_log_write(ANDROID_LOG_INFO, "record","Can't allocate memory for AVFrame\n");
               return NULL;
           }
           int readval = -5;
           while ((readval = av_read_frame(pFormatCtx, &packet)) >= 0)
           {
               if(packet.stream_index != 0)
                   continue;

               int ret = avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
               sprintf(logdatadata, "avcodec_decode_video2 ret = %d",ret);
               __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);

               if (ret > 0)
               {
                   __android_log_write(ANDROID_LOG_INFO, "record","Frame is decoded\n");
                   pFrame->quality = 4;
                   av_free_packet(&packet);
                   av_close_input_file(pFormatCtx);
                   return pFrame;
               }
               else
               {
                   __android_log_write(ANDROID_LOG_INFO, "record","error while decoding frame \n");
               }
           }
           sprintf(logdatadata, "readval = %d",readval);
           __android_log_write(ANDROID_LOG_INFO, "record",logdatadata);
           return NULL; /* no frame could be decoded */
       }

    The generate method takes a list of strings (paths to images) and converts them into a video; the OpenImage method is responsible for converting a single image to an AVFrame.
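
    For reference, the JNI lookups in generate() (FindClass on com/canvasm/mediclinic/VideoGenerator and GetMethodID for videoProgress and deleteTempFile, both with signature (Ljava/lang/String;)Ljava/lang/String;) imply a Java-side class roughly like the sketch below. This is only an assumption drawn from those lookups: the native library name, the method bodies, and the registration of generate (for instance via RegisterNatives, since the C function name is not JNI-mangled) are not part of the original post.

        package com.canvasm.mediclinic;

        // Minimal sketch of the Java class implied by the JNI calls above.
        // Library name and method bodies are assumptions, for illustration only.
        public class VideoGenerator {

            static {
                System.loadLibrary("videogen"); // hypothetical name of the compiled native library
            }

            // Must be registered against the native generate() shown above,
            // e.g. with RegisterNatives, since the C function is not JNI-mangled.
            public native void generate(String[] imagePaths, int framerate,
                                        int width, int height, String videoFilename);

            // Called back from native code with the current progress as a string.
            public String videoProgress(String progress) {
                android.util.Log.i("record", "progress = " + progress + "%");
                return progress;
            }

            // Called back from native code when an input image can be deleted.
            public String deleteTempFile(String path) {
                new java.io.File(path).delete();
                return path;
            }
        }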

  • Using FFMPEG via command line in android

    30 December 2012, by user1662334

    I want to use FFmpeg via the command line in my Android application. For this purpose:

    1. I have cross-compiled the ffmpeg library and obtained libffmpeg.so
    2. I have stored libffmpeg.so and the ffmpeg executable in the files directory of my project.

    This is the code I am using:

     import java.io.File;

     import android.app.Activity;
     import android.os.Bundle;

     public class FFMPEGActivity extends Activity {

         Process p;

         @Override
         public void onCreate(Bundle savedInstanceState) {
             super.onCreate(savedInstanceState);
             setContentView(R.layout.activity_main);

             try {
                 String[] cmd = new String[4];
                 cmd[0] = "/data/data/com.example.ffmpegnew/files/libffmpeg";
                 cmd[1] = "-i";
                 cmd[2] = "mnt/sdcard/music/baba.mp4";
                 cmd[3] = "mnt/sdcard/music/outfile.mp4";

                 p = Runtime.getRuntime().exec(cmd, null, new File("/data/data/com.example.ffmpegnew/files"));
             }
             catch (Exception e)
             {
                 System.out.println("exception" + e);
             }
         }
     }

    This is the exception I am getting:

    09-17 13:47:01.679: I/System.out(3752): exceptionjava.io.IOException: Error running exec(). Command: [/data/data/com.example.ffmpegnew/files/libffmpeg.so, -i, mnt/sdcard/music/baba.mp4, mnt/sdcard/music/outfile.mp4] Working Directory: /data/data/com.example.ffmpegnew/files Environment: null

    Please tell me how to solve this problem. Thanks in advance.
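
    For comparison, here is a minimal sketch of how a standalone ffmpeg executable is typically launched from Java. It is not taken from the question: the binary name, its location and the absolute media paths are assumptions, and the point is only to illustrate exec'ing an executable (rather than the .so) with absolute paths and the execute bit set.

        import java.io.File;

        public class FfmpegExecSketch {
            public static void run() throws Exception {
                File dir = new File("/data/data/com.example.ffmpegnew/files");
                File ffmpeg = new File(dir, "ffmpeg");   // the executable, not libffmpeg.so
                ffmpeg.setExecutable(true);              // exec() requires the execute bit

                ProcessBuilder pb = new ProcessBuilder(
                        ffmpeg.getAbsolutePath(),
                        "-i", "/mnt/sdcard/music/baba.mp4",  // note the leading slash
                        "/mnt/sdcard/music/outfile.mp4");
                pb.directory(dir);
                pb.redirectErrorStream(true);            // merge stderr into stdout

                Process p = pb.start();
                int exitCode = p.waitFor();
                System.out.println("ffmpeg exited with " + exitCode);
            }
        }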

  • Getting exception while trying to run ffmpeg via command line in android

    17 September 2012, by user1662334

    I want to use ffmpeg via command-line arguments in an Android application. For this purpose:

    1. I have cross-compiled the ffmpeg library and obtained libffmpeg.so
    2. I have stored libffmpeg.so in the files directory of the app (a sketch of this copy step is shown after the exception below).

    This is the code I am using:

     import java.io.File;

     import android.app.Activity;
     import android.os.Bundle;

     public class MainActivity extends Activity {

         Process p;

         @Override
         public void onCreate(Bundle savedInstanceState) {
             super.onCreate(savedInstanceState);
             setContentView(R.layout.activity_main);

             try {
                 String[] cmd = new String[4];
                 cmd[0] = "/data/data/com.example.ffmpegnew/files/libffmpeg";
                 cmd[1] = "-i";
                 cmd[2] = "mnt/sdcard/music/baba.mp4";
                 cmd[3] = "mnt/sdcard/music/outfile.mp4";

                 p = Runtime.getRuntime().exec(cmd, null, new File("/data/data/com.example.ffmpegnew/files"));
             }
             catch (Exception e)
             {
                 System.out.println("exception" + e);
             }
         }
     }

    This is the exception I am getting:

    09-17 13:47:01.679: I/System.out(3752): exceptionjava.io.IOException: Error running exec(). Command: [/data/data/com.example.ffmpegnew/files/libffmpeg.so, -i, mnt/sdcard/music/baba.mp4, mnt/sdcard/music/outfile.mp4] Working Directory: /data/data/com.example.ffmpegnew/files Environment: null

    Please tell me how to solve this problem. Thanks in advance.
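
    Regarding step 2 above (getting a binary into the app's files directory), the following is a hedged sketch of copying a bundled file out of the APK's assets and marking it executable. The asset name "ffmpeg", the class name and the use of getAssets() are assumptions for illustration, not details from the question.

        import java.io.File;
        import java.io.FileOutputStream;
        import java.io.InputStream;
        import java.io.OutputStream;

        import android.content.Context;

        // Sketch: copy a bundled ffmpeg binary from assets into the files
        // directory and set the execute bit before calling Runtime.exec().
        public class BinaryInstaller {
            public static File install(Context context) throws Exception {
                File target = new File(context.getFilesDir(), "ffmpeg"); // hypothetical asset name
                try (InputStream in = context.getAssets().open("ffmpeg");
                     OutputStream out = new FileOutputStream(target)) {
                    byte[] buffer = new byte[8192];
                    int read;
                    while ((read = in.read(buffer)) != -1) {
                        out.write(buffer, 0, read);
                    }
                }
                target.setExecutable(true); // required before the binary can be exec'd
                return target;
            }
        }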