Advanced search

Media (91)

Other articles (70)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or later. If necessary, contact the administrator of your MédiaSpip to find out.

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010

    The central/master site of the farm needs several additional plugins, compared with the channel sites, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-verification API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

  • Contributing to its documentation

    10 April 2011

    Documentation is one of the most important and most demanding tasks in the development of a technical tool.
    Any outside contribution on this subject is essential: critiquing what already exists; helping to write articles aimed at users (MediaSPIP administrators or simply content producers) or at developers; creating explanatory screencasts; translating the documentation into a new language;
    To do so, you can sign up on (...)

On other sites (8179)

  • Anomalie #3223: List of keywords linked to an object: action buttons displayed regardless of...

    18 July 2014, by marcimat ☺☮☯♫

    … comment not intended for this ticket! …

  • How to play raw h264 produced by MediaCodec encoder?

    1 November 2014, by jackos2500

    I’m a bit new when it comes to MediaCodec (and video encoding/decoding in general), so correct me if anything I say here is wrong.

    I want to play the raw h264 output of MediaCodec with VLC/ffplay. I need this to play because my end goal is to stream some live video to a computer, and MediaMuxer only produces a file on disk rather than something I can stream with (very) low latency to a desktop. (I'm open to other solutions, but I have not found anything else that fits the latency requirement.)

    Here is the code I'm using to encode the video and write it to a file (it's based on the MediaCodec example found here, only with the MediaMuxer part removed):

    package com.jackos2500.droidtop;

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.opengl.EGL14;
    import android.opengl.EGLConfig;
    import android.opengl.EGLContext;
    import android.opengl.EGLDisplay;
    import android.opengl.EGLExt;
    import android.opengl.EGLSurface;
    import android.opengl.GLES20;
    import android.os.Environment;
    import android.util.Log;
    import android.view.Surface;

    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    public class StreamH264 {
       private static final String TAG = "StreamH264";
       private static final boolean VERBOSE = true;           // lots of logging

       // where to put the output file (note: /sdcard requires WRITE_EXTERNAL_STORAGE permission)
       private static final File OUTPUT_DIR = Environment.getExternalStorageDirectory();

       public static int MEGABIT = 1000 * 1000;
       private static final int IFRAME_INTERVAL = 10;

       private static final int TEST_R0 = 0;
       private static final int TEST_G0 = 136;
       private static final int TEST_B0 = 0;
       private static final int TEST_R1 = 236;
       private static final int TEST_G1 = 50;
       private static final int TEST_B1 = 186;

       private MediaCodec codec;
       private CodecInputSurface inputSurface;
       private BufferedOutputStream out;

       private MediaCodec.BufferInfo bufferInfo;
       public StreamH264() {

       }

       private void prepareEncoder() throws IOException {
           bufferInfo = new MediaCodec.BufferInfo();

           MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
           format.setInteger(MediaFormat.KEY_BIT_RATE, 2 * MEGABIT);
           format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
           format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
           format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);

           codec = MediaCodec.createEncoderByType("video/avc");
           codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
           inputSurface = new CodecInputSurface(codec.createInputSurface());
           codec.start();

           File dst = new File(OUTPUT_DIR, "test.264");
           out = new BufferedOutputStream(new FileOutputStream(dst));
       }
       private void releaseEncoder() throws IOException {
           if (VERBOSE) Log.d(TAG, "releasing encoder objects");
           if (codec != null) {
               codec.stop();
               codec.release();
               codec = null;
           }
           if (inputSurface != null) {
               inputSurface.release();
               inputSurface = null;
           }
           if (out != null) {
               out.flush();
               out.close();
               out = null;
           }
       }
       public void stream() throws IOException {
           try {
               prepareEncoder();
               inputSurface.makeCurrent();
               for (int i = 0; i < (30 * 5); i++) {
                   // Feed any pending encoder output into the file.
                   drainEncoder(false);

                   // Generate a new frame of input.
                   generateSurfaceFrame(i);
                   inputSurface.setPresentationTime(computePresentationTimeNsec(i, 30));

                   // Submit it to the encoder.  The eglSwapBuffers call will block if the input
                   // is full, which would be bad if it stayed full until we dequeued an output
                   // buffer (which we can't do, since we're stuck here).  So long as we fully drain
                   // the encoder before supplying additional input, the system guarantees that we
                   // can supply another frame without blocking.
                   if (VERBOSE) Log.d(TAG, "sending frame " + i + " to encoder");
                   inputSurface.swapBuffers();
               }
               // send end-of-stream to encoder, and drain remaining output
               drainEncoder(true);
           } finally {
               // release encoder, muxer, and input Surface
               releaseEncoder();
           }
       }

       private void drainEncoder(boolean endOfStream) throws IOException {
           final int TIMEOUT_USEC = 10000;
           if (VERBOSE) Log.d(TAG, "drainEncoder(" + endOfStream + ")");

           if (endOfStream) {
               if (VERBOSE) Log.d(TAG, "sending EOS to encoder");
               codec.signalEndOfInputStream();
           }
           ByteBuffer[] outputBuffers = codec.getOutputBuffers();
           while (true) {
               int encoderStatus = codec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
               if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                   // no output available yet
                   if (!endOfStream) {
                       break;      // out of while
                   } else {
                       if (VERBOSE) Log.d(TAG, "no output available, spinning to await EOS");
                   }
               } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                   // not expected for an encoder
                   outputBuffers = codec.getOutputBuffers();
               } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                   // should happen before receiving buffers, and should only happen once
                   MediaFormat newFormat = codec.getOutputFormat();
                   Log.d(TAG, "encoder output format changed: " + newFormat);
               } else if (encoderStatus < 0) {
                   Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " + encoderStatus);
                   // let's ignore it
               } else {
                   ByteBuffer encodedData = outputBuffers[encoderStatus];
                   if (encodedData == null) {
                       throw new RuntimeException("encoderOutputBuffer " + encoderStatus + " was null");
                   }

                   if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                       // The codec config data was pulled out and fed to the muxer when we got
                       // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                       if (VERBOSE) Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                       bufferInfo.size = 0;
                   }

                   if (bufferInfo.size != 0) {
                       // adjust the ByteBuffer values to match BufferInfo (not needed?)
                       encodedData.position(bufferInfo.offset);
                       encodedData.limit(bufferInfo.offset + bufferInfo.size);

                       byte[] data = new byte[bufferInfo.size];
                       encodedData.get(data);
                       out.write(data);
                       if (VERBOSE) Log.d(TAG, "sent " + bufferInfo.size + " bytes to file");
                   }

                   codec.releaseOutputBuffer(encoderStatus, false);

                   if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                       if (!endOfStream) {
                           Log.w(TAG, "reached end of stream unexpectedly");
                       } else {
                           if (VERBOSE) Log.d(TAG, "end of stream reached");
                       }
                       break;      // out of while
                   }
               }
           }
       }
       private void generateSurfaceFrame(int frameIndex) {
           frameIndex %= 8;

           int startX, startY;
           if (frameIndex < 4) {
               // (0,0) is bottom-left in GL
               startX = frameIndex * (1280 / 4);
               startY = 720 / 2;
           } else {
               startX = (7 - frameIndex) * (1280 / 4);
               startY = 0;
           }

           GLES20.glClearColor(TEST_R0 / 255.0f, TEST_G0 / 255.0f, TEST_B0 / 255.0f, 1.0f);
           GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

           GLES20.glEnable(GLES20.GL_SCISSOR_TEST);
           GLES20.glScissor(startX, startY, 1280 / 4, 720 / 2);
           GLES20.glClearColor(TEST_R1 / 255.0f, TEST_G1 / 255.0f, TEST_B1 / 255.0f, 1.0f);
           GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
           GLES20.glDisable(GLES20.GL_SCISSOR_TEST);
       }
       private static long computePresentationTimeNsec(int frameIndex, int frameRate) {
           final long ONE_BILLION = 1000000000;
           return frameIndex * ONE_BILLION / frameRate;
       }

       /**
        * Holds state associated with a Surface used for MediaCodec encoder input.
        * <p>
        * The constructor takes a Surface obtained from MediaCodec.createInputSurface(), and uses that
        * to create an EGL window surface.  Calls to eglSwapBuffers() cause a frame of data to be sent
        * to the video encoder.
        * </p><p>
        * This object owns the Surface -- releasing this will release the Surface too.
        */
       private static class CodecInputSurface {
           private static final int EGL_RECORDABLE_ANDROID = 0x3142;

           private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
           private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
           private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;

           private Surface mSurface;

           /**
            * Creates a CodecInputSurface from a Surface.
            */
           public CodecInputSurface(Surface surface) {
               if (surface == null) {
                   throw new NullPointerException();
               }
               mSurface = surface;

               eglSetup();
           }

           /**
            * Prepares EGL.  We want a GLES 2.0 context and a surface that supports recording.
            */
           private void eglSetup() {
               mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
               if (mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
                   throw new RuntimeException("unable to get EGL14 display");
               }
               int[] version = new int[2];
               if (!EGL14.eglInitialize(mEGLDisplay, version, 0, version, 1)) {
                   throw new RuntimeException("unable to initialize EGL14");
               }

               // Configure EGL for recording and OpenGL ES 2.0.
               int[] attribList = {
                       EGL14.EGL_RED_SIZE, 8,
                       EGL14.EGL_GREEN_SIZE, 8,
                       EGL14.EGL_BLUE_SIZE, 8,
                       EGL14.EGL_ALPHA_SIZE, 8,
                       EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                       EGL_RECORDABLE_ANDROID, 1,
                       EGL14.EGL_NONE
               };
               EGLConfig[] configs = new EGLConfig[1];
               int[] numConfigs = new int[1];
               EGL14.eglChooseConfig(mEGLDisplay, attribList, 0, configs, 0, configs.length,
                       numConfigs, 0);
               checkEglError("eglCreateContext RGB888+recordable ES2");

               // Configure context for OpenGL ES 2.0.
               int[] attrib_list = {
                       EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
                       EGL14.EGL_NONE
               };
               mEGLContext = EGL14.eglCreateContext(mEGLDisplay, configs[0], EGL14.EGL_NO_CONTEXT,
                       attrib_list, 0);
               checkEglError("eglCreateContext");

               // Create a window surface, and attach it to the Surface we received.
               int[] surfaceAttribs = {
                       EGL14.EGL_NONE
               };
               mEGLSurface = EGL14.eglCreateWindowSurface(mEGLDisplay, configs[0], mSurface,
                       surfaceAttribs, 0);
               checkEglError("eglCreateWindowSurface");
           }

           /**
            * Discards all resources held by this class, notably the EGL context.  Also releases the
            * Surface that was passed to our constructor.
            */
           public void release() {
               if (mEGLDisplay != EGL14.EGL_NO_DISPLAY) {
                   EGL14.eglMakeCurrent(mEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE,
                           EGL14.EGL_NO_CONTEXT);
                   EGL14.eglDestroySurface(mEGLDisplay, mEGLSurface);
                   EGL14.eglDestroyContext(mEGLDisplay, mEGLContext);
                   EGL14.eglReleaseThread();
                   EGL14.eglTerminate(mEGLDisplay);
               }

               mSurface.release();

               mEGLDisplay = EGL14.EGL_NO_DISPLAY;
               mEGLContext = EGL14.EGL_NO_CONTEXT;
               mEGLSurface = EGL14.EGL_NO_SURFACE;

               mSurface = null;
           }

           /**
            * Makes our EGL context and surface current.
            */
           public void makeCurrent() {
               EGL14.eglMakeCurrent(mEGLDisplay, mEGLSurface, mEGLSurface, mEGLContext);
               checkEglError("eglMakeCurrent");
           }

           /**
            * Calls eglSwapBuffers.  Use this to "publish" the current frame.
            */
           public boolean swapBuffers() {
               boolean result = EGL14.eglSwapBuffers(mEGLDisplay, mEGLSurface);
               checkEglError("eglSwapBuffers");
               return result;
           }

           /**
            * Sends the presentation time stamp to EGL.  Time is expressed in nanoseconds.
            */
           public void setPresentationTime(long nsecs) {
               EGLExt.eglPresentationTimeANDROID(mEGLDisplay, mEGLSurface, nsecs);
               checkEglError("eglPresentationTimeANDROID");
           }

           /**
            * Checks for EGL errors.  Throws an exception if one is found.
            */
           private void checkEglError(String msg) {
               int error;
               if ((error = EGL14.eglGetError()) != EGL14.EGL_SUCCESS) {
                   throw new RuntimeException(msg + ": EGL error: 0x" + Integer.toHexString(error));
               }
           }
       }
    }

    However, the file produced by this code does not play with VLC or ffplay. Can anyone tell me what I'm doing wrong? I believe it is due to an incorrect format (or total lack) of the headers required for playing raw h264, as I have had success playing .264 files downloaded from the internet with ffplay; a sketch of what I mean follows below. Also, I'm not sure exactly how I'm going to stream this video to a computer, so if somebody could give me some suggestions as to how I might do that, I would be very grateful! Thanks!
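    To be concrete about the header theory: the BUFFER_FLAG_CODEC_CONFIG buffer that drainEncoder() deliberately drops (the comment about feeding it to the muxer is left over from the MediaMuxer example) is, as far as I understand, exactly the SPS/PPS data, already in Annex-B form with start codes. So the change I have in mind is an untested sketch that lets that buffer fall through to the normal write instead of zeroing it out:

     if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
         // Untested sketch: keep the config buffer instead of discarding it.
         // It carries the SPS/PPS NAL units, and a raw Annex-B .264 file has
         // to start with those for ffplay/VLC to recognize the stream.
         if (VERBOSE) Log.d(TAG, "writing SPS/PPS config, " + bufferInfo.size + " bytes");
         // (do NOT set bufferInfo.size = 0 here; the write below handles it)
     }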

  • How to write NALs produced by x264_encoder_encode() using ffmpeg av_interleaved_write_frame()

    21 January 2013, by Haleeq Usman

    I have been trying to produce an "flv" video file in the following sequence:

    av_register_all();

    // Open video file
     if (avformat_open_input(&pFormatCtx, "6.mp4", NULL, NULL) != 0)
        return -1; // Couldn't open file

     // Retrieve stream information
     if (avformat_find_stream_info(pFormatCtx, NULL) < 0)
        return -1; // Couldn't find stream information

    // Dump information about file onto standard error
    av_dump_format(pFormatCtx, 0, "input_file.mp4", 0);

    // Find the first video stream
    videoStream = -1;
     for (i = 0; i < pFormatCtx->nb_streams; i++)
        if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
            videoStream = i;
            break;
        }
     if (videoStream == -1)
        return -1; // Didn't find a video stream

    // Get a pointer to the codec context for the video stream
    pCodecCtx = pFormatCtx->streams[videoStream]->codec;

    // Find the decoder for the video stream
    pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
    if (pCodec == NULL) {
       fprintf(stderr, "Unsupported codec!\n");
       return -1; // Codec not found
    }
    // Open codec
     if (avcodec_open2(pCodecCtx, pCodec, NULL) < 0)
       return -1; // Could not open codec

    // Allocate video frame
    pFrame = avcodec_alloc_frame();

    // Allocate an AVFrame structure
    pFrameYUV420 = avcodec_alloc_frame();
    if (pFrameYUV420 == NULL)
       return -1;

    // Determine required buffer size and allocate buffer
    numBytes = avpicture_get_size(pCodecCtx->pix_fmt, pCodecCtx->width, pCodecCtx->height);
    buffer = (uint8_t *) av_malloc(numBytes * sizeof(uint8_t));

    // Assign appropriate parts of buffer to image planes in pFrameYUV420
    // Note that pFrameYUV420 is an AVFrame, but AVFrame is a superset of AVPicture
     avpicture_fill((AVPicture *) pFrameYUV420, buffer, pCodecCtx->pix_fmt, pCodecCtx->width, pCodecCtx->height);

    // Setup scaler
    img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt, pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt, SWS_BILINEAR, 0, 0, 0);
    if (img_convert_ctx == NULL) {
       fprintf(stderr, "Cannot initialize the conversion context!\n");
       exit(1);
    }

    // Setup encoder/muxing now
    filename = "output_file.flv";
    fmt = av_guess_format("flv", filename, NULL);
    if (fmt == NULL) {
       printf("Could not guess format.\n");
       return -1;
    }
    /* allocate the output media context */
    oc = avformat_alloc_context();
    if (oc == NULL) {
       printf("could not allocate context.\n");
       return -1;
    }
    oc->oformat = fmt;
    snprintf(oc->filename, sizeof(oc->filename), "%s", filename);

    video_st = NULL;
    if (fmt->video_codec != AV_CODEC_ID_NONE) {
        video_st = add_stream(oc, &video_codec, fmt->video_codec);
    }

     // Let's see some information about our format
    av_dump_format(oc, 0, filename, 1);

    /* open the output file, if needed */
     if (!(fmt->flags & AVFMT_NOFILE)) {
        ret = avio_open(&oc->pb, filename, AVIO_FLAG_WRITE);
        if (ret < 0) {
            fprintf(stderr, "Could not open '%s': %s\n", filename, av_err2str(ret));
            return 1;
        }
     }
    /* Write the stream header, if any. */
    ret = avformat_write_header(oc, NULL);
     if (ret < 0) {
       fprintf(stderr, "Error occurred when opening output file: %s\n", av_err2str(ret));
       return 1;
    }

    // Setup x264 params
    x264_param_t param;
     x264_param_default_preset(&param, "veryfast", "zerolatency");
    param.i_threads = 1;
    param.i_width = video_st->codec->width;
    param.i_height = video_st->codec->height;
    param.i_fps_num = STREAM_FRAME_RATE; // 30 fps, same as video
    param.i_fps_den = 1;
    // Intra refres:
    param.i_keyint_max = STREAM_FRAME_RATE;
    param.b_intra_refresh = 1;
    // Rate control:
    param.rc.i_rc_method = X264_RC_CRF;
    param.rc.f_rf_constant = 25;
    param.rc.f_rf_constant_max = 35;
    // For streaming:
    param.b_repeat_headers = 1;
    param.b_annexb = 1;
     x264_param_apply_profile(&param, "baseline");

     x264_t* encoder = x264_encoder_open(&param);
     x264_picture_t pic_in, pic_out;
     x264_picture_alloc(&pic_in, X264_CSP_I420, video_st->codec->width, video_st->codec->height);

    x264_nal_t* nals;
    int i_nals;

    // The loop:
    // 1. Read frames
    // 2. Decode the frame
    // 3. Attempt to re-encode using x264
    // 4. Write the x264 encoded frame using av_interleaved_write_frame
     while (av_read_frame(pFormatCtx, &packet) >= 0) {
        // Is this a packet from the video stream?
        if (packet.stream_index == videoStream) {
            // Decode video frame
            avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);

            // Did we get a video frame?
            if (frameFinished) {
                sws_scale(img_convert_ctx, pFrame->data, pFrame->linesize, 0, pCodecCtx->height, pic_in.img.plane, pic_in.img.i_stride);
                int frame_size = x264_encoder_encode(encoder, &nals, &i_nals, &pic_in, &pic_out);

                if (frame_size >= 0) {
                    if (i_nals < 0)
                        printf("invalid frame size: %d\n", i_nals);
                    // write out NALs
                    for (i = 0; i < i_nals; i++) {
                        // initialize a packet
                        AVPacket p;
                        av_init_packet(&p);
                        p.data = nals[i].p_payload;
                        p.size = nals[i].i_payload;
                        p.stream_index = video_st->index;
                        p.flags = AV_PKT_FLAG_KEY;
                        p.pts = AV_NOPTS_VALUE;
                        p.dts = AV_NOPTS_VALUE;
                        ret = av_interleaved_write_frame(oc, &p);
                    }
                   }
               }
               printf("encoded frame #%d\n", frame_count);
               frame_count++;
           }
       }

       // Free the packet that was allocated by av_read_frame
        av_free_packet(&packet);
    }
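
     One part of this loop that I am not sure about (so treat the following as an assumption on my part): I write one packet per NAL, leave pts/dts at AV_NOPTS_VALUE, and flag every packet as a key frame. An untested sketch of the variant I have been considering sends each encoded frame as a single packet with real timestamps (x264 guarantees the output NAL payloads are sequential in memory, and this assumes pic_in.i_pts is set to the frame index before each x264_encoder_encode() call):

     if (frame_size > 0) {
        AVPacket p;
        av_init_packet(&p);
        p.data = nals[0].p_payload;   /* NAL payloads are contiguous, so one */
        p.size = frame_size;          /* packet can carry the whole frame    */
        p.stream_index = video_st->index;
        if (pic_out.b_keyframe)       /* only mark actual keyframes */
            p.flags |= AV_PKT_FLAG_KEY;
        /* rescale from x264's 1/fps units to the stream time base */
        AVRational x264_tb = { 1, STREAM_FRAME_RATE };
        p.pts = av_rescale_q(pic_out.i_pts, x264_tb, video_st->time_base);
        p.dts = av_rescale_q(pic_out.i_dts, x264_tb, video_st->time_base);
        ret = av_interleaved_write_frame(oc, &p);
     }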

    // Now we free up resources used/close codecs, and finally close our program.

    Here is the implementation of the add_stream() function:

    /* Add an output stream. */
    static AVStream *add_stream(AVFormatContext *oc, AVCodec **codec, enum AVCodecID codec_id) {
       AVCodecContext *c;
       AVStream *st;
       int r;
       /* find the encoder */
       *codec = avcodec_find_encoder(codec_id);
       if (!(*codec)) {
            fprintf(stderr, "Could not find encoder for '%s'\n",
                   avcodec_get_name(codec_id));
           exit(1);
       }
       st = avformat_new_stream(oc, *codec);
       if (!st) {
           fprintf(stderr, "Could not allocate stream\n");
           exit(1);
       }
       st->id = oc->nb_streams - 1;
       c = st->codec;
       switch ((*codec)->type) {
       case AVMEDIA_TYPE_AUDIO:
           st->id = 1;
           c->sample_fmt = AV_SAMPLE_FMT_FLTP;
           c->bit_rate = 64000;
           c->sample_rate = 44100;
           c->channels = 2;
           break;
       case AVMEDIA_TYPE_VIDEO:
           avcodec_get_context_defaults3(c, *codec);
           c->codec_id = codec_id;
           c->bit_rate = 500*1000;
           //c->rc_min_rate = 500*1000;
           //c->rc_max_rate = 500*1000;
           //c->rc_buffer_size = 500*1000;
           /* Resolution must be a multiple of two. */
           c->width = 1280;
           c->height = 720;
           /* timebase: This is the fundamental unit of time (in seconds) in terms
            * of which frame timestamps are represented. For fixed-fps content,
            * timebase should be 1/framerate and timestamp increments should be
            * identical to 1. */
           c->time_base.den = STREAM_FRAME_RATE;
           c->time_base.num = 1;
           c->gop_size = 12; /* emit one intra frame every twelve frames at most */
           c->pix_fmt = STREAM_PIX_FMT;
           if (c->codec_id == AV_CODEC_ID_MPEG2VIDEO) {
               /* just for testing, we also add B frames */
               c->max_b_frames = 2;
           }
           if (c->codec_id == AV_CODEC_ID_MPEG1VIDEO) {
               /* Needed to avoid using macroblocks in which some coeffs overflow.
                * This does not happen with normal video, it just happens here as
                * the motion of the chroma plane does not match the luma plane. */
               c->mb_decision = 2;
           }
           break;
       default:
           break;
       }
       /* Some formats want stream headers to be separate. */
        if (oc->oformat->flags & AVFMT_GLOBALHEADER)
           c->flags |= CODEC_FLAG_GLOBAL_HEADER;
       return st;
    }

    After the encoding is complete, I check the output file output_file.flv. I notice its size is very large (101MB) and it does not play. If I use ffmpeg to decode/encode the input file, then I get an output file about 83MB in size (which is about the same size as the original .mp4 file used as input). Also, the 83MB output from just using the ffmpeg C API, as opposed to using x264 for the encoding step, plays just fine. Does anyone know where I am going wrong? I have tried researching this for a few days now but with no luck :(. I feel that I am close to making it work, however, I just cannot figure out what I am doing wrong. Thank you!