
Media (0)


No media matching your criteria is available on the site.

Other articles (102)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • The farm's regular Cron tasks

    1 December 2010

    Managing the farm involves running several recurring tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance in the shared-hosting farm on a regular basis. Combined with a system Cron on the farm's central site, this generates regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)

  • Emballe Médias: a simple way to put documents online

    29 October 2010

    The emballe médias plugin was developed mainly for the mediaSPIP distribution, but it is also used in other related projects, such as géodiversité. Required and compatible plugins
    To work, this plugin requires other plugins to be installed: CFG Saisies SPIP Bonux Diogène swfupload jqueryui
    Other plugins can be used in addition to extend its capabilities: Ancres douces Légendes photo_infos spipmotion (...)

On other sites (13774)

  • FFmpeg muxing theora/vorbis unable to flush?

    11 November 2013, by user2979732

    I'm pretty new to ffmpeg, and it's confusing. I'm working on a basic muxer and have been at it for over a week. I don't normally post, as I solve 98% of my issues with Google, but I haven't been able to crack this one.

    The basis of my source is FFmpeg's own muxing.c example. When I force it to use libvorbis for audio and create "test.ogg", it demonstrates the same issues I'm having in my own derivation of muxing.c. The problem is with ogg/theora/vorbis. I'm forcing the audio codec like this:

    audio_st = add_stream(oc, &audio_codec, avcodec_find_encoder_by_name("libvorbis")->id);

    It seems the problem is that the muxing.c sample does not set the audio pts. There is general confusion about this; nobody apart from this poster has addressed what I am looking for: http://webcache.googleusercontent.com/search?q=cache:6ml82RMN3YYJ:ffmpeg.org/pipermail/libav-user/2013-April/004304.html+&cd=4&hl=en&ct=clnk&gl=cz

    Naturally, I couldn't find any answers to that. Why don't they set the audio pts? Laziness? Not needed? Do they believe all encoders will produce the pts for them (not true, as seen below)?

    Anyway, when you try muxing.c with mp4/libx264/forced libmp3lame, everything works, though the encoder warns "encoder did not produce valid pts, making some up.". With ogg/theora/vorbis, however, it is silent, as if the pts were valid(?), yet no audio packets end up in the stream(!), at least from what I saw using ffprobe. As a result the video won't even play until you remove the empty audio stream; then it plays fine, which shows the video stream itself is OK.

    Coming to my original issue: I tried setting the pts on the audio frame being sent to the encoder to fix that problem (which already feels wrong). I was unable to find a definitive answer on how to set the pts properly, which is the other big issue, since I'm trying things I'm not sure are correct. Anyway, setting "some" pts does in the end produce an ogg with sound:

    if (frame->pts == AV_NOPTS_VALUE) frame->pts = audio_sync_opts;
    audio_sync_opts = frame->pts + frame->nb_samples;

    I'm aware I should probably rescale to adjust for the container time base, etc. If this were present/explained in FFmpeg's own sample, I wouldn't have to guess now (I'm still not 100% sure about the time-base relationship between container and codec; I think the container time base somehow takes precedence over the codec one).
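
    For reference, here is the rescaling step I mean, as a minimal sketch only: it assumes the 2013-era avcodec_encode_audio2() API and reuses muxing.c's oc/audio_st names, so treat it as my guess at the right shape rather than known-working code.

    /* Sketch: encode one audio frame and rescale the packet timestamps
     * from the codec time base to the stream time base before muxing.
     * Returns 0 on success, -1 on error. */
    static int encode_audio_frame(AVFormatContext *oc, AVStream *audio_st,
                                  AVFrame *frame)
    {
        AVCodecContext *c = audio_st->codec;
        AVPacket pkt = { 0 };
        int got_packet = 0;

        av_init_packet(&pkt);
        if (avcodec_encode_audio2(c, &pkt, frame, &got_packet) < 0)
            return -1;

        if (got_packet) {
            /* codec time base -> stream time base */
            if (pkt.pts != AV_NOPTS_VALUE)
                pkt.pts = av_rescale_q(pkt.pts, c->time_base, audio_st->time_base);
            if (pkt.dts != AV_NOPTS_VALUE)
                pkt.dts = av_rescale_q(pkt.dts, c->time_base, audio_st->time_base);
            if (pkt.duration > 0)
                pkt.duration = av_rescale_q(pkt.duration, c->time_base, audio_st->time_base);
            pkt.stream_index = audio_st->index;
            if (av_interleaved_write_frame(oc, &pkt) < 0)
                return -1;
        }
        return 0;
    }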

    My other problem is flushing, but that might be a consequence of the broken pts, so I won't go into detail. The basic problem is: when I send a finite number of audio frames, say 20, I get only 2 packets out, for example. From my understanding, I need to flush the rest of the audio after all the encoding/muxing is done, which I managed to do with mp4/libx264/libmp3lame, but with ogg/theora/vorbis it doesn't flush. Why not, I have no idea.
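
    To show what I mean by flushing, this is the drain pattern I use, again only a sketch against the same era of the API and with muxing.c's names (the NULL frame asks a delayed encoder for its buffered packets):

    /* After the last real frame, drain a delayed audio encoder by
     * feeding NULL frames until it stops producing packets. */
    if (c->codec->capabilities & CODEC_CAP_DELAY) {
        for (;;) {
            AVPacket pkt = { 0 };
            int got_packet = 0;
            av_init_packet(&pkt);
            if (avcodec_encode_audio2(c, &pkt, NULL, &got_packet) < 0)
                break;
            if (!got_packet)
                break;                  /* encoder fully drained */
            pkt.stream_index = audio_st->index;
            av_interleaved_write_frame(oc, &pkt);  /* rescale ts first, as above */
        }
    }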

    If someone could rework muxing.c to send a finite number of audio/video frames (i.e., not until duration > X, but until it has sent, say, 20 video and 100 audio frames), so that the number of frames I have is what matters, not the video duration I end up with, and then encode/mux all the frames with proper video/audio pts, working with theora/ogg and flushing if needed, that would probably solve all of my issues; a loop of the shape sketched below is what I have in mind. I'm sure that for an expert ffmpeg'er, reworking muxing.c to address all those things would be a pretty quick exercise, and it could help more than one confused person.
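
    Concretely, something like this, using muxing.c's own write_video_frame()/write_audio_frame() helpers (the counts are just examples, and the interleaving/pts handling is exactly the part I can't get right):

    /* Frame-count-driven main loop instead of muxing.c's duration check. */
    int video_frames_sent = 0, audio_frames_sent = 0;
    while (video_frames_sent < 20 || audio_frames_sent < 100) {
        if (audio_frames_sent < 100) {
            write_audio_frame(oc, audio_st);
            audio_frames_sent++;
        }
        if (video_frames_sent < 20) {
            write_video_frame(oc, video_st);
            video_frames_sent++;
        }
    }
    /* ...then drain both encoders as above and call av_write_trailer(oc). */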

    Thanks!

  • Encoding images into a movie file

    5 April 2014, by RuAware

    I am trying to save JPGs into a movie. I have tried jcodec, and although my S3 plays the result fine, other devices do not, including VLC and Windows Media Player.

    I have just spent most of the day playing with MediaCodec. Although the required SDK level is high, it will at least help people on Jelly Bean and above. But I cannot work out how to get the files to the encoder and then write the output file.

    Ideally I want to support down to SDK 9/8.

    Has anyone got any code they can share, either to get MediaCodec to work or another option? If you say ffmpeg, I'd love to, but my JNI knowledge is non-existent and I would need a very good guide.

    Code for MediaCodec so far

    public class EncodeAndMux extends AsyncTask<Integer, Void, Boolean> {
       private static int bitRate = 2000000;
       private static int MAX_INPUT = 100000;
       private static String mimeType = "video/avc";

       private int frameRate = 15;    
       private int colorFormat;
       private int stride = 1;
       private int sliceHeight = 2;        

       private MediaCodec encoder = null;
       private MediaFormat inputFormat;
       private MediaCodecInfo codecInfo = null;
       private MediaMuxer muxer;
       private boolean mMuxerStarted = false;
       private int mTrackIndex = 0;  
       private long presentationTime = 0;
       private Paint bmpPaint;

       private static int WAITTIME = 10000;
       private static String TAG = "ENCODE";

       private ArrayList<String> mFilePaths;
       private String mPath;

       private EncodeListener mListener;
       private int width = 320;
       private int height = 240;
       private double mSpeed = 1;

       public EncodeAndMux(ArrayList<String> filePaths, String savePath) {
           mFilePaths = filePaths;
           mPath = savePath;  

           // Create paint to draw BMP
           bmpPaint = new Paint();
           bmpPaint.setAntiAlias(true);
           bmpPaint.setFilterBitmap(true);
           bmpPaint.setDither(true);
       }

       public void setListner(EncodeListener listener) {
           mListener = listener;
       }

       // set the speed, how many frames a second
       public void setSpeed(int speed) {
           mSpeed = speed;
       }

       public double getSpeed() {
           return mSpeed;
       }

       private long computePresentationTime(int frameIndex) {
           final long ONE_SECOND = 1000000;
           return (long) (frameIndex * (ONE_SECOND / mSpeed));
       }

       public interface EncodeListener {
           public void finished();
           public void errored();
       }

       @TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR2)
       @Override
       protected Boolean doInBackground(Integer... params) {

           try {
               muxer = new MediaMuxer(mPath, OutputFormat.MUXER_OUTPUT_MPEG_4);
           } catch (Exception e){
               e.printStackTrace();
           }

           // Find a codec that supports the mime type
           int numCodecs = MediaCodecList.getCodecCount();
           for (int i = 0; i < numCodecs && codecInfo == null; i++) {
               MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
               if (!info.isEncoder()) {
                   continue;
               }
               String[] types = info.getSupportedTypes();
               boolean found = false;

               for (int j = 0; j < types.length && !found; j++) {
                   if (types[j].equals(mimeType))
                       found = true;
               }

               if (!found)
                   continue;
               codecInfo = info;
           }


            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                    MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                    if (!info.isEncoder()) {
                        continue;
                    }

                    String[] types = info.getSupportedTypes();
                     for (int j = 0; j < types.length; ++j) {
                         if (!types[j].equals(mimeType))
                             continue;
                         MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(types[j]);
                         for (int k = 0; k < caps.profileLevels.length; k++) {
                             if (caps.profileLevels[k].profile == MediaCodecInfo.CodecProfileLevel.AVCProfileHigh && caps.profileLevels[k].level == MediaCodecInfo.CodecProfileLevel.AVCLevel4) {
                                codecInfo = info;
                            }
                        }
                    }
            }

           Log.d(TAG, "Found " + codecInfo.getName() + " supporting " + mimeType);

           MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
           for (int i = 0; i < capabilities.colorFormats.length && colorFormat == 0; i++) {
               int format = capabilities.colorFormats[i];
               switch (format) {
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
                   colorFormat = format;
                   break;
               }
           }
           Log.d(TAG, "Using color format " + colorFormat);

           // Determine width, height and slice sizes
           if (codecInfo.getName().equals("OMX.TI.DUCATI1.VIDEO.H264E")) {
               // This codec doesn't support a width not a multiple of 16,
               // so round down.
               width &= ~15;
           }

           stride = width;
           sliceHeight = height;

           if (codecInfo.getName().startsWith("OMX.Nvidia.")) {
               stride = (stride + 15) / 16 * 16;
               sliceHeight = (sliceHeight + 15) / 16 * 16;
           }

           inputFormat = MediaFormat.createVideoFormat(mimeType, width, height);
           inputFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
           inputFormat.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
           inputFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
           inputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    //          inputFormat.setInteger("stride", stride);
    //          inputFormat.setInteger("slice-height", sliceHeight);
           inputFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, MAX_INPUT);

           encoder = MediaCodec.createByCodecName(codecInfo.getName());
           encoder.configure(inputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
           encoder.start();

           ByteBuffer[] inputBuffers = encoder.getInputBuffers();
           ByteBuffer[] outputBuffers = encoder.getOutputBuffers();

           int inputBufferIndex= -1, outputBufferIndex= -1;
           BufferInfo info = new BufferInfo();
           for (int i = 0; i < mFilePaths.size(); i++) {

               // use decode sample to calculate inSample size and then resize
               Bitmap bitmapIn = Images.decodeSampledBitmapFromPath(mFilePaths.get(i), width, height);  

               // Create blank bitmap
               Bitmap bitmap = Bitmap.createBitmap(width, height, Config.ARGB_8888);                  

               // Center scaled image
               Canvas canvas = new Canvas(bitmap);                
               canvas.drawBitmap(bitmapIn,(bitmap.getWidth()/2)-(bitmapIn.getWidth()/2),(bitmap.getHeight()/2)-(bitmapIn.getHeight()/2), bmpPaint);

               Log.d(TAG, "Bitmap width: " + bitmapIn.getWidth() + " height: " + bitmapIn.getHeight() + " WIDTH: " + width + " HEIGHT: " + height);
               byte[] dat = getNV12(width, height, bitmap);
               bitmap.recycle();

               // Exception occurred on this below line in Emulator, LINE No. 182//**
               inputBufferIndex = encoder.dequeueInputBuffer(WAITTIME);
               Log.i("DAT", "Size= "+dat.length);

               if(inputBufferIndex >= 0){
                   int samplesiz= dat.length;
                   inputBuffers[inputBufferIndex].put(dat);
                   presentationTime = computePresentationTime(i);
                   if (i == mFilePaths.size() - 1) {   // last file: signal end of stream
                       encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                       Log.i(TAG, "Last Frame");
                   } else {
                       encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
                   }

                   while(true) {
                      outputBufferIndex = encoder.dequeueOutputBuffer(info, WAITTIME);
                      Log.i("BATA", "outputBufferIndex="+outputBufferIndex);
                      if (outputBufferIndex >= 0) {
                          ByteBuffer encodedData = outputBuffers[outputBufferIndex];
                          if (encodedData == null) {
                              throw new RuntimeException("encoderOutputBuffer " + outputBufferIndex +
                                      " was null");
                          }

                           if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                              // The codec config data was pulled out and fed to the muxer when we got
                              // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                              Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                              info.size = 0;
                          }

                          if (info.size != 0) {
                              if (!mMuxerStarted) {
                                  throw new RuntimeException("muxer hasn&#39;t started");
                              }

                              // adjust the ByteBuffer values to match BufferInfo (not needed?)
                              encodedData.position(info.offset);
                              encodedData.limit(info.offset + info.size);

                              muxer.writeSampleData(mTrackIndex, encodedData, info);
                              Log.d(TAG, "sent " + info.size + " bytes to muxer");
                          }

                          encoder.releaseOutputBuffer(outputBufferIndex, false);

                          inputBuffers[inputBufferIndex].clear();
                          outputBuffers[outputBufferIndex].clear();

                           if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                              break;      // out of while
                          }

                      } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                          // Subsequent data will conform to new format.
                          MediaFormat opmediaformat = encoder.getOutputFormat();
                          if (!mMuxerStarted) {
                              mTrackIndex = muxer.addTrack(opmediaformat);
                              muxer.start();
                              mMuxerStarted = true;
                          }
                          Log.i(TAG, "op_buf_format_changed: " + opmediaformat);
                      } else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                          outputBuffers = encoder.getOutputBuffers();
                          Log.d(TAG, "Output Buffer changed " + outputBuffers);
                      } else if(outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                          // No Data, break out
                          break;
                      } else {
                          // Unexpected State, ignore it
                          Log.d(TAG, "Unexpected State " + outputBufferIndex);
                      }
                   }

               }    
           }

           if (encoder != null) {
               encoder.flush();
               encoder.stop();
               encoder.release();
               encoder = null;
           }

           if (muxer != null) {
               muxer.stop();
               muxer.release();
               muxer = null;
           }

           return true;

       };

       @Override
       protected void onPostExecute(Boolean result) {
           if (result) {
               if (mListener != null)
                   mListener.finished();
           } else {
               if (mListener != null)
                   mListener.errored();
           }
           super.onPostExecute(result);
       }



       byte [] getNV12(int inputWidth, int inputHeight, Bitmap scaled) {
           int [] argb = new int[inputWidth * inputHeight];
           scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);
           byte [] yuv = new byte[inputWidth*inputHeight*3/2];
           encodeYUV420SP(yuv, argb, inputWidth, inputHeight);
           scaled.recycle();
           return yuv;
       }


       void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
           final int frameSize = width * height;
           int yIndex = 0;
           int uvIndex = frameSize;
           int a, R, G, B, Y, U, V;
           int index = 0;
           for (int j = 0; j < height; j++) {
               for (int i = 0; i < width; i++) {

                   a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
                   R = (argb[index] & 0xff0000) >> 16;
                   G = (argb[index] & 0xff00) >> 8;
                   B = (argb[index] & 0xff) >> 0;

                   // well known RGB to YVU algorithm
                   Y = ( (  66 * R + 129 * G +  25 * B + 128) >> 8) +  16;
                   V = ( ( -38 * R -  74 * G + 112 * B + 128) >> 8) + 128;
                   U = ( ( 112 * R -  94 * G -  18 * B + 128) >> 8) + 128;

                   yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
                   if (j % 2 == 0 && index % 2 == 0) {
                       yuv420sp[uvIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
                       yuv420sp[uvIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
                   }

                   index ++;
               }
           }
       }
    }

    This has now been tested on 4 of my devices and works fine. Is there a way to:

    1/ Calculate the MAX_INPUT (too high and it crashes on the N7 II; I don't want that happening once released)
    2/ Offer an API 16 solution?
    3/ Do I need stride and slice height?

    Thanks

  • How is video decoding corruption debugged?

    20 November 2013, by TopGunCoder

    I just started working for a new company, and my new role demands that I help debug the video corruption they are seeing in decoded frames. As much as I intend to dig deep into the code and the specifics of my problem, it made me think about video debugging in general.

    Since handling video is very new to me, the whole process seems pretty complex, and there appear to be many places for corruption to show up. The way I see it, there are at least three places where corruption could appear (barring memory corruption on the machine):

    • Transport of the data before it is decoded
    • A decoding implementation that perpetuates corruption once it is encountered, or is altogether incorrect (which seems to be my problem)
    • Transport to the monitor (which seems unlikely but possible)

    So what I'm really curious about is whether and how people debug their video streams to determine the location of any corruption they encounter. I'm sure there is no sure-fire method, but I am curious to see what problems are even possible and how they can be identified and triaged.

    P.S. I'm not sure of the differences between different decoding methods, but if this question seems too vague, maybe it helps to mention that I am using ffmpeg and avcodec_decode_video2 for the decoding.
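
    For concreteness, my decode loop is roughly the shape below. The error-surfacing flags are things I am experimenting with while investigating, not a known fix: my understanding is that AV_EF_EXPLODE makes the decoder return an error instead of concealing a corrupt frame, and that decode_error_flags marks frames that decoded but were damaged.

    /* Sketch of a decode loop that reports corruption instead of
     * concealing it (2013-era API: avcodec_decode_video2). Setup of
     * fmt_ctx/avctx/video_stream_index is elided. */
    #include <stdio.h>
    #include <inttypes.h>
    #include <libavformat/avformat.h>

    static void decode_and_report(AVFormatContext *fmt_ctx,
                                  AVCodecContext *avctx,
                                  int video_stream_index)
    {
        AVFrame *frame = avcodec_alloc_frame();   /* pre-av_frame_alloc() era */
        AVPacket pkt;
        int got_picture;

        /* Fail loudly on bitstream errors instead of concealing them. */
        avctx->err_recognition |= AV_EF_EXPLODE;

        while (av_read_frame(fmt_ctx, &pkt) >= 0) {
            if (pkt.stream_index == video_stream_index) {
                int ret = avcodec_decode_video2(avctx, frame, &got_picture, &pkt);
                if (ret < 0)
                    fprintf(stderr, "decode error at pts %" PRId64 "\n", pkt.pts);
                else if (got_picture && frame->decode_error_flags)
                    fprintf(stderr, "damaged frame, decode_error_flags=%d\n",
                            frame->decode_error_flags);
            }
            av_free_packet(&pkt);
        }
        av_free(frame);
    }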