Advanced search

Media (3)

Other articles (104)

  • Contributing to its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it reaches new linguistic communities.
    To do so, we use the SPIP translation interface, where all of MediaSPIP's language modules are available. You just need to subscribe to the translators' mailing list and ask for more information.
    At the moment MediaSPIP is only available in French and (...)

  • Accepted formats

    28 January 2010

    The following commands show which formats and codecs are supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use:
    h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
    m4v: raw MPEG-4 video format
    flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263
    Theora
    wmv:
    Possible output video formats
    To begin with, we (...)
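    The same capability check can be scripted. Here is a minimal Java sketch (an assumption of this document, not part of MediaSPIP itself) that builds the two query commands and, if an ffmpeg binary is on the PATH, runs one and collects its output:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class FfmpegCapabilities {
    // Assembles the ffmpeg query command for a capability flag ("-codecs" or "-formats").
    static List<String> buildQuery(String flag) {
        List<String> cmd = new ArrayList<>();
        cmd.add("ffmpeg");       // assumes ffmpeg is on the PATH
        cmd.add("-hide_banner"); // suppress the version banner for cleaner output
        cmd.add(flag);
        return cmd;
    }

    // Runs the query and returns its output lines (stdout and stderr merged).
    static List<String> run(String flag) throws Exception {
        Process p = new ProcessBuilder(buildQuery(flag)).redirectErrorStream(true).start();
        List<String> lines = new ArrayList<>();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) lines.add(line);
        }
        p.waitFor();
        return lines;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(String.join(" ", buildQuery("-formats")));
    }
}
```

    The list each command prints is what determines which input formats a given MediaSPIP installation can actually accept.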

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in MP3 and Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed in order to retrieve the data needed for search engine indexing, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

On other sites (10602)

  • How to play raw h264 produced by MediaCodec encoder?

    1 November 2014, by jackos2500

    I’m a bit new when it comes to MediaCodec (and video encoding/decoding in general), so correct me if anything I say here is wrong.

    I want to play the raw h264 output of MediaCodec with VLC/ffplay. I need this to play because my end goal is to stream some live video to a computer, and MediaMuxer only produces a file on disk rather than something I can stream with (very) low latency to a desktop. (I'm open to other solutions, but I have not found anything else that fits the latency requirement.)
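    For the transport side, one possibility (a hypothetical sketch, not something the question itself tested) is to write each encoded output buffer to a TCP socket instead of a file; the StreamSender class name and the host/port are made-up placeholders:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;

// Hypothetical sketch: forwards each encoded buffer to a peer instead of a file on disk.
public class StreamSender {
    private final OutputStream out;

    public StreamSender(OutputStream out) {
        this.out = out;
    }

    // Convenience constructor: connect to a receiving desktop (host/port are placeholders).
    public StreamSender(String host, int port) throws IOException {
        this(new Socket(host, port).getOutputStream());
    }

    // Called once per dequeued encoder output buffer.
    public void send(byte[] data, int offset, int length) throws IOException {
        out.write(data, offset, length);
        out.flush(); // flush per buffer to keep end-to-end latency low
    }

    public void close() throws IOException {
        out.close();
    }
}
```

    With something like this, the out.write(data) call in drainEncoder below could be replaced by sender.send(data, 0, data.length); the receiver would still need the raw stream to be playable, which is the headers problem discussed at the end of the question.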

    Here is the code I'm using to encode the video and write it to a file (it's based off the MediaCodec example found here, only with the MediaMuxer part removed):

    package com.jackos2500.droidtop;

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.opengl.EGL14;
    import android.opengl.EGLConfig;
    import android.opengl.EGLContext;
    import android.opengl.EGLDisplay;
    import android.opengl.EGLExt;
    import android.opengl.EGLSurface;
    import android.opengl.GLES20;
    import android.os.Environment;
    import android.util.Log;
    import android.view.Surface;

    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    public class StreamH264 {
       private static final String TAG = "StreamH264";
       private static final boolean VERBOSE = true;           // lots of logging

       // where to put the output file (note: /sdcard requires WRITE_EXTERNAL_STORAGE permission)
       private static final File OUTPUT_DIR = Environment.getExternalStorageDirectory();

       public static int MEGABIT = 1000 * 1000;
       private static final int IFRAME_INTERVAL = 10;

       private static final int TEST_R0 = 0;
       private static final int TEST_G0 = 136;
       private static final int TEST_B0 = 0;
       private static final int TEST_R1 = 236;
       private static final int TEST_G1 = 50;
       private static final int TEST_B1 = 186;

       private MediaCodec codec;
       private CodecInputSurface inputSurface;
       private BufferedOutputStream out;

       private MediaCodec.BufferInfo bufferInfo;
       public StreamH264() {

       }

       private void prepareEncoder() throws IOException {
           bufferInfo = new MediaCodec.BufferInfo();

           MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
           format.setInteger(MediaFormat.KEY_BIT_RATE, 2 * MEGABIT);
           format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
           format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
           format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);

           codec = MediaCodec.createEncoderByType("video/avc");
           codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
           inputSurface = new CodecInputSurface(codec.createInputSurface());
           codec.start();

           File dst = new File(OUTPUT_DIR, "test.264");
           out = new BufferedOutputStream(new FileOutputStream(dst));
       }
       private void releaseEncoder() throws IOException {
           if (VERBOSE) Log.d(TAG, "releasing encoder objects");
           if (codec != null) {
               codec.stop();
               codec.release();
               codec = null;
           }
           if (inputSurface != null) {
               inputSurface.release();
               inputSurface = null;
           }
           if (out != null) {
               out.flush();
               out.close();
               out = null;
           }
       }
       public void stream() throws IOException {
           try {
               prepareEncoder();
               inputSurface.makeCurrent();
               for (int i = 0; i < (30 * 5); i++) {
                   // Feed any pending encoder output into the file.
                   drainEncoder(false);

                   // Generate a new frame of input.
                   generateSurfaceFrame(i);
                   inputSurface.setPresentationTime(computePresentationTimeNsec(i, 30));

                   // Submit it to the encoder.  The eglSwapBuffers call will block if the input
                   // is full, which would be bad if it stayed full until we dequeued an output
                   // buffer (which we can't do, since we're stuck here).  So long as we fully drain
                   // the encoder before supplying additional input, the system guarantees that we
                   // can supply another frame without blocking.
                   if (VERBOSE) Log.d(TAG, "sending frame " + i + " to encoder");
                   inputSurface.swapBuffers();
               }
               // send end-of-stream to encoder, and drain remaining output
               drainEncoder(true);
           } finally {
               // release encoder, muxer, and input Surface
               releaseEncoder();
           }
       }

       private void drainEncoder(boolean endOfStream) throws IOException {
           final int TIMEOUT_USEC = 10000;
           if (VERBOSE) Log.d(TAG, "drainEncoder(" + endOfStream + ")");

           if (endOfStream) {
               if (VERBOSE) Log.d(TAG, "sending EOS to encoder");
               codec.signalEndOfInputStream();
           }
           ByteBuffer[] outputBuffers = codec.getOutputBuffers();
           while (true) {
               int encoderStatus = codec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
               if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                   // no output available yet
                   if (!endOfStream) {
                       break;      // out of while
                   } else {
                       if (VERBOSE) Log.d(TAG, "no output available, spinning to await EOS");
                   }
               } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                   // not expected for an encoder
                   outputBuffers = codec.getOutputBuffers();
               } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                   // should happen before receiving buffers, and should only happen once
                   MediaFormat newFormat = codec.getOutputFormat();
                   Log.d(TAG, "encoder output format changed: " + newFormat);
               } else if (encoderStatus < 0) {
                   Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " + encoderStatus);
                   // let's ignore it
               } else {
                   ByteBuffer encodedData = outputBuffers[encoderStatus];
                   if (encodedData == null) {
                       throw new RuntimeException("encoderOutputBuffer " + encoderStatus + " was null");
                   }

                   if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                       // The codec config data was pulled out and fed to the muxer when we got
                       // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                       if (VERBOSE) Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                       bufferInfo.size = 0;
                   }

                   if (bufferInfo.size != 0) {
                       // adjust the ByteBuffer values to match BufferInfo (not needed?)
                       encodedData.position(bufferInfo.offset);
                       encodedData.limit(bufferInfo.offset + bufferInfo.size);

                       byte[] data = new byte[bufferInfo.size];
                       encodedData.get(data);
                       out.write(data);
                       if (VERBOSE) Log.d(TAG, "sent " + bufferInfo.size + " bytes to file");
                   }

                   codec.releaseOutputBuffer(encoderStatus, false);

                   if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                       if (!endOfStream) {
                           Log.w(TAG, "reached end of stream unexpectedly");
                       } else {
                           if (VERBOSE) Log.d(TAG, "end of stream reached");
                       }
                       break;      // out of while
                   }
               }
           }
       }
       private void generateSurfaceFrame(int frameIndex) {
           frameIndex %= 8;

           int startX, startY;
           if (frameIndex < 4) {
               // (0,0) is bottom-left in GL
               startX = frameIndex * (1280 / 4);
               startY = 720 / 2;
           } else {
               startX = (7 - frameIndex) * (1280 / 4);
               startY = 0;
           }

           GLES20.glClearColor(TEST_R0 / 255.0f, TEST_G0 / 255.0f, TEST_B0 / 255.0f, 1.0f);
           GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

           GLES20.glEnable(GLES20.GL_SCISSOR_TEST);
           GLES20.glScissor(startX, startY, 1280 / 4, 720 / 2);
           GLES20.glClearColor(TEST_R1 / 255.0f, TEST_G1 / 255.0f, TEST_B1 / 255.0f, 1.0f);
           GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
           GLES20.glDisable(GLES20.GL_SCISSOR_TEST);
       }
       private static long computePresentationTimeNsec(int frameIndex, int frameRate) {
           final long ONE_BILLION = 1000000000;
           return frameIndex * ONE_BILLION / frameRate;
       }

       /**
        * Holds state associated with a Surface used for MediaCodec encoder input.
        * <p>
        * The constructor takes a Surface obtained from MediaCodec.createInputSurface(), and uses that
        * to create an EGL window surface.  Calls to eglSwapBuffers() cause a frame of data to be sent
        * to the video encoder.
        * </p><p>
        * This object owns the Surface -- releasing this will release the Surface too.
        */
       private static class CodecInputSurface {
           private static final int EGL_RECORDABLE_ANDROID = 0x3142;

           private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
           private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
           private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;

           private Surface mSurface;

           /**
            * Creates a CodecInputSurface from a Surface.
            */
           public CodecInputSurface(Surface surface) {
               if (surface == null) {
                   throw new NullPointerException();
               }
               mSurface = surface;

               eglSetup();
           }

           /**
            * Prepares EGL.  We want a GLES 2.0 context and a surface that supports recording.
            */
           private void eglSetup() {
               mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
               if (mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
                   throw new RuntimeException("unable to get EGL14 display");
               }
               int[] version = new int[2];
               if (!EGL14.eglInitialize(mEGLDisplay, version, 0, version, 1)) {
                   throw new RuntimeException("unable to initialize EGL14");
               }

               // Configure EGL for recording and OpenGL ES 2.0.
               int[] attribList = {
                       EGL14.EGL_RED_SIZE, 8,
                       EGL14.EGL_GREEN_SIZE, 8,
                       EGL14.EGL_BLUE_SIZE, 8,
                       EGL14.EGL_ALPHA_SIZE, 8,
                       EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                       EGL_RECORDABLE_ANDROID, 1,
                       EGL14.EGL_NONE
               };
               EGLConfig[] configs = new EGLConfig[1];
               int[] numConfigs = new int[1];
               EGL14.eglChooseConfig(mEGLDisplay, attribList, 0, configs, 0, configs.length,
                       numConfigs, 0);
               checkEglError("eglCreateContext RGB888+recordable ES2");

               // Configure context for OpenGL ES 2.0.
               int[] attrib_list = {
                       EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
                       EGL14.EGL_NONE
               };
               mEGLContext = EGL14.eglCreateContext(mEGLDisplay, configs[0], EGL14.EGL_NO_CONTEXT,
                       attrib_list, 0);
               checkEglError("eglCreateContext");

               // Create a window surface, and attach it to the Surface we received.
               int[] surfaceAttribs = {
                       EGL14.EGL_NONE
               };
               mEGLSurface = EGL14.eglCreateWindowSurface(mEGLDisplay, configs[0], mSurface,
                       surfaceAttribs, 0);
               checkEglError("eglCreateWindowSurface");
           }

           /**
            * Discards all resources held by this class, notably the EGL context.  Also releases the
            * Surface that was passed to our constructor.
            */
           public void release() {
               if (mEGLDisplay != EGL14.EGL_NO_DISPLAY) {
                   EGL14.eglMakeCurrent(mEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE,
                           EGL14.EGL_NO_CONTEXT);
                   EGL14.eglDestroySurface(mEGLDisplay, mEGLSurface);
                   EGL14.eglDestroyContext(mEGLDisplay, mEGLContext);
                   EGL14.eglReleaseThread();
                   EGL14.eglTerminate(mEGLDisplay);
               }

               mSurface.release();

               mEGLDisplay = EGL14.EGL_NO_DISPLAY;
               mEGLContext = EGL14.EGL_NO_CONTEXT;
               mEGLSurface = EGL14.EGL_NO_SURFACE;

               mSurface = null;
           }

           /**
            * Makes our EGL context and surface current.
            */
           public void makeCurrent() {
               EGL14.eglMakeCurrent(mEGLDisplay, mEGLSurface, mEGLSurface, mEGLContext);
               checkEglError("eglMakeCurrent");
           }

           /**
            * Calls eglSwapBuffers.  Use this to "publish" the current frame.
            */
           public boolean swapBuffers() {
               boolean result = EGL14.eglSwapBuffers(mEGLDisplay, mEGLSurface);
               checkEglError("eglSwapBuffers");
               return result;
           }

           /**
            * Sends the presentation time stamp to EGL.  Time is expressed in nanoseconds.
            */
           public void setPresentationTime(long nsecs) {
               EGLExt.eglPresentationTimeANDROID(mEGLDisplay, mEGLSurface, nsecs);
               checkEglError("eglPresentationTimeANDROID");
           }

           /**
            * Checks for EGL errors.  Throws an exception if one is found.
            */
           private void checkEglError(String msg) {
               int error;
               if ((error = EGL14.eglGetError()) != EGL14.EGL_SUCCESS) {
                   throw new RuntimeException(msg + ": EGL error: 0x" + Integer.toHexString(error));
               }
           }
       }
    }

    However, the file produced by this code does not play with VLC or ffplay. Can anyone tell me what I'm doing wrong? I believe it is due to an incorrect format (or total lack) of the headers required to play raw h264, as I have had success playing .264 files downloaded from the internet with ffplay. Also, I'm not sure exactly how I'm going to stream this video to a computer, so if somebody could give me some suggestions as to how I might do that, I would be very grateful! Thanks!
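    One way to check that theory is to scan the produced .264 file for Annex-B start codes and inspect the NAL unit types: for H.264, type 7 is the SPS and type 8 is the PPS, and raw-stream players generally need both near the start of the file. A minimal, plain-Java sketch (an illustration, not part of the question's code):

```java
import java.util.ArrayList;
import java.util.List;

public class NalScanner {
    // Returns the NAL unit types found after each Annex-B start code (00 00 01 or 00 00 00 01).
    static List<Integer> nalTypes(byte[] data) {
        List<Integer> types = new ArrayList<>();
        for (int i = 0; i + 3 < data.length; i++) {
            boolean long4 = i + 4 < data.length
                    && data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 0 && data[i + 3] == 1;
            boolean short3 = data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1;
            if (long4) {
                types.add(data[i + 4] & 0x1F); // low 5 bits of the NAL header byte
                i += 3;                        // skip past the start code
            } else if (short3) {
                types.add(data[i + 3] & 0x1F);
                i += 2;
            }
        }
        return types;
    }

    // True if the stream carries both an SPS (7) and a PPS (8) NAL unit.
    static boolean hasSpsAndPps(byte[] data) {
        List<Integer> t = nalTypes(data);
        return t.contains(7) && t.contains(8);
    }
}
```

    If a file that ffplay refuses to play turns out to lack types 7 and 8, the likely culprit in the code above is the branch that sets bufferInfo.size = 0 on BUFFER_FLAG_CODEC_CONFIG: that comment was written for the MediaMuxer flow, and for a raw stream the codec-config bytes would need to be written out instead of dropped.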

  • ffmpeg seeking to I-frame

    7 October 2014, by user3398748

    Is it possible to seek to an I-frame using the av_seek_frame() function?
    The problem I am facing is that if I seek in an AVC file I get a lot of noise if I don't flush the buffer. And if I flush the buffer, the decoder does not return a frame until it comes across an I-frame, which causes problems in the calculation of the total frame count at the end of the file when I am seeking.

    Thank you

  • ffmpeg-php to create thumbnail of video

    8 November 2014, by Wazzzy

    I am trying to use this script to create a thumbnail of a video using ffmpeg. At first I checked phpinfo(); and found that ffmpeg is installed on my server.

    Then I copied ffmpeg-php onto my server and ran the test file.

    I found that many functions are available on my server.

    I got output like this:

    Functions available in /usr/lib/php/extensions/no-debug-non-zts-20060613/ffmpeg.so extension:

    Methods available in class ffmpeg_movie:
    __construct
    getduration
    getframecount
    getframerate
    getfilename
    getcomment
    gettitle
    getauthor
    getartist
    getcopyright
    getalbum
    getgenre
    getyear
    gettracknumber
    getframewidth
    getframeheight
    getframenumber
    getpixelformat
    getbitrate
    hasaudio
    hasvideo
    getnextkeyframe
    getframe
    getvideocodec
    getaudiocodec
    getvideostreamid
    getaudiostreamid
    getaudiochannels
    getaudiosamplerate
    getaudiobitrate
    getvideobitrate
    getpixelaspectratio

    I wrote this code and tried every possible path to assign to $ffmpeg:

    <?php

       $thumb_stdout;
       $retval=0;
       $ffmpeg = '/home/lib/ffmpeg';

    // change "demo.mpg" to your mpg file name!
    $video  = dirname(__FILE__) . '/demo.mpg'; // note the leading slash: dirname() has no trailing separator

    // change "demo.jpg" to whichever name you like or don't
    // for this example, the name of the output jpg file does not matter
    $image  = dirname(__FILE__) . '/demo.jpg';

    $second = 1;

    $cmd = "$ffmpeg -i $video 2>&1";
    // match ffmpeg's actual output, not the literal command string
    if (preg_match('/Duration: ((\d+):(\d+):(\d+))/s', shell_exec($cmd), $time)) {
       $total = ($time[2] * 3600) + ($time[3] * 60) + $time[4];
       $second = rand(1, ($total - 1));
    }

    //$cmd = "$ffmpeg -i $video -deinterlace -an -ss $second -t 00:00:01 -r 1 -y -vcodec mjpeg -f mjpeg $image 2>&1";
    $cmd = "$ffmpeg -i $video -r 1 -ss 00:00:05 -t 00:00:01 -s 250x250 -f image2 $image";
    echo $cmd;
    exec($cmd);
    //$return = '$cmd';

    echo '<br />done!';
    ?>
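    The Duration-parsing step in the script above (regex on ffmpeg's "Duration: HH:MM:SS" line, converted to total seconds) can be checked in isolation. A standalone Java sketch of the same arithmetic (names are illustrative, not part of ffmpeg-php):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DurationParser {
    // Matches ffmpeg's "Duration: HH:MM:SS..." line; groups are hours, minutes, seconds.
    private static final Pattern DURATION = Pattern.compile("Duration: (\\d+):(\\d+):(\\d+)");

    // Returns the total duration in whole seconds, or -1 if no Duration line is found.
    static int totalSeconds(String ffmpegOutput) {
        Matcher m = DURATION.matcher(ffmpegOutput);
        if (!m.find()) return -1;
        return Integer.parseInt(m.group(1)) * 3600
             + Integer.parseInt(m.group(2)) * 60
             + Integer.parseInt(m.group(3));
    }
}
```

    The PHP version indexes $time[2] through $time[4] because its pattern wraps the whole HH:MM:SS in an extra outer group; the arithmetic is otherwise identical.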