Advanced search

Media (91)

Other articles (104)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Farm management

    2 March 2010

    The farm as a whole is managed by "super admins".
    Some settings can be adjusted to regulate the needs of the different channels.
    Initially, it uses the "Gestion de mutualisation" plugin.

On other sites (10916)

  • System.AccessViolationException on avcodec_open2 using FFMpeg.AutoGen

    5 December 2024, by forlayo

    I am using FFMpeg.AutoGen (v6.0.0.2) with hwencode support for Nvidia cards, and on a few devices (for example an NVIDIA GeForce MX350), when I call ffmpeg.avcodec_open2 I get the following FFmpeg message in the logs:

    [h264_nvenc @ 000001f473fcc480] OpenEncodeSessionEx failed: unsupported device (2): (no details)

    It then crashes with the following error:

    Fatal error. System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
       at FFmpeg.AutoGen.DynamicallyLoadedBindings+<>c.<Initialize>b__2_494(FFmpeg.AutoGen.AVCodecContext*, FFmpeg.AutoGen.AVCodec*, FFmpeg.AutoGen.AVDictionary**)
       at FFmpeg.AutoGen.ffmpeg.avcodec_open2(FFmpeg.AutoGen.AVCodecContext*, FFmpeg.AutoGen.AVCodec*, FFmpeg.AutoGen.AVDictionary**)
       at GraphicsCheck.FfmpegChecker.StartEncoder(Omni.Platforms.Windows.Services.AV.CodecInfo, Int32, Int32, Int32)
       at GraphicsCheck.FfmpegChecker.CheckConfig(CodecConfig)
       at Program.<Main>$(System.String[])

    I created a sample app to demo this issue. Following the path of the calls:

    1. The error happens at nvenc_open_session, which is called here.
    2. That failure jumps to the fail2 routine, which is this one:

    fail2:
        CHECK_CU(dl_fn->cuda_dl->cuCtxDestroy(ctx->cu_context_internal));
        ctx->cu_context_internal = NULL;

    3. ctx->cu_context_internal comes from the following (in the same method), so I suppose it was created properly (otherwise execution wouldn't have continued):

        ret = CHECK_CU(dl_fn->cuda_dl->cuCtxCreate(&ctx->cu_context_internal, 0, cu_device));
        if (ret < 0)
            goto fail;

    4. ctx in that method is the priv_data of the AVCodecContext, as we can see here:

    static av_cold int nvenc_check_device(AVCodecContext *avctx, int idx)
    {
        NvencContext *ctx = avctx->priv_data;

    5. The act of printing the error message, in nvenc_print_error, also interacts with the context:

    static int nvenc_print_error(AVCodecContext *avctx, NVENCSTATUS err,
                                 const char *error_string)
    {
        const char *desc;
        const char *details = "(no details)";
        int ret = nvenc_map_error(err, &desc);

    #ifdef NVENC_HAVE_GETLASTERRORSTRING
        NvencContext *ctx = avctx->priv_data;
        NV_ENCODE_API_FUNCTION_LIST *p_nvenc = &ctx->nvenc_dload_funcs.nvenc_funcs;

        if (p_nvenc && ctx->nvencoder)
            details = p_nvenc->nvEncGetLastErrorString(ctx->nvencoder);
    #endif

        av_log(avctx, AV_LOG_ERROR, "%s: %s (%d): %s\n", error_string, desc, err, details);

        return ret;
    }

    6. av_log is executed (we can see the error message in the logs), so the other lines shouldn't be the problem.
    7. Looking at the other calls: when nvenc_check_device fails, the error propagates up to nvenc_setup_device, but we aren't seeing any more logs, so the issue may be in nvenc_check_device's fail2 routine or in nvenc_print_error.
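    One way to narrow this down (a sketch of a workaround, not the library's prescribed approach): probe whether a CUDA device context can be created at all before touching h264_nvenc. av_hwdevice_ctx_create and av_buffer_unref are standard FFmpeg APIs exposed by FFmpeg.AutoGen; the method name and fallback policy here are my own:

    // Hedged sketch: probe CUDA device creation before opening the encoder.
    using FFmpeg.AutoGen;

    public static unsafe bool CudaDeviceAvailable()
    {
        AVBufferRef* hwDeviceCtx = null;
        int ret = ffmpeg.av_hwdevice_ctx_create(
            &hwDeviceCtx, AVHWDeviceType.AV_HWDEVICE_TYPE_CUDA, null, null, 0);
        if (ret < 0)
            return false; // no usable CUDA device; fall back to software encoding
        ffmpeg.av_buffer_unref(&hwDeviceCtx); // release the probe context
        return true;
    }

    This only verifies CUDA context creation, though; on the MX350 the failure happens later, inside OpenEncodeSessionEx, so a fully robust check may need to run the avcodec_open2 attempt in a separate process and inspect its exit code, since an AccessViolationException cannot be caught in modern .NET.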

    Any thoughts?


  • Pipe video frames from ffmpeg to numpy array without loading whole movie into memory

    2 May 2021, by marcman

    I'm not sure whether what I'm asking is feasible or functional, but I'm experimenting with trying to load frames from a video in an ordered, but "on-demand," fashion.


    Basically, what I have now reads the entire uncompressed video into a buffer by piping through stdout, e.g.:


    import subprocess

    import numpy as np

    H, W = 1080, 1920  # video dimensions
    video = '/path/to/video.mp4'  # path to video

    # ffmpeg command
    command = ["ffmpeg",
               '-i', video,
               '-pix_fmt', 'rgb24',
               '-f', 'rawvideo',
               'pipe:1']

    # run ffmpeg and load all frames into numpy array (num_frames, H, W, 3)
    pipe = subprocess.run(command, stdout=subprocess.PIPE, bufsize=10**8)
    video = np.frombuffer(pipe.stdout, dtype=np.uint8).reshape(-1, H, W, 3)

    # or alternatively load individual frames in a loop
    nb_img = H*W*3  # H * W * 3 channels * 1 byte/channel
    for i in range(0, len(pipe.stdout), nb_img):
        img = np.frombuffer(pipe.stdout, dtype=np.uint8, count=nb_img, offset=i).reshape(H, W, 3)


    I'm wondering if it's possible to do this same process, in Python, but without first loading the entire video into memory. In my mind, I'm picturing something like (see the sketch after this list):


    1. open a buffer
    2. seek to memory locations on demand
    3. load frames into numpy arrays
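    A minimal sketch of that idea, assuming the same rawvideo pipe as above. A pipe's stdout is not seekable, so this reads frames sequentially on demand rather than truly random-accessing them:

    import subprocess

    import numpy as np

    H, W = 1080, 1920
    frame_bytes = H * W * 3  # rgb24: 3 bytes per pixel

    proc = subprocess.Popen(
        ["ffmpeg", "-i", "/path/to/video.mp4",
         "-pix_fmt", "rgb24", "-f", "rawvideo", "pipe:1"],
        stdout=subprocess.PIPE)

    while True:
        # read exactly one frame's worth of bytes from the pipe
        buf = proc.stdout.read(frame_bytes)
        if len(buf) < frame_bytes:
            break  # end of stream
        frame = np.frombuffer(buf, dtype=np.uint8).reshape(H, W, 3)
        # ... process frame here; only one frame is held in memory at a time ...

    proc.stdout.close()
    proc.wait()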

    I know there are other libraries, like OpenCV for example, that enable this same sort of behavior, but I'm wondering:


    • Is it possible to do this operation efficiently using this sort of ffmpeg-pipe-to-numpy-array operation?
    • Does this defeat the speed-up benefit of using ffmpeg directly, rather than seeking/loading through OpenCV or first extracting frames and then loading individual files?
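    On the seeking question, one hedged alternative is to let ffmpeg itself seek and decode a single frame per request; -ss and -frames:v are standard ffmpeg options, while the function and its default dimensions here are illustrative:

    import subprocess

    import numpy as np

    def read_frame(video, t, H=1080, W=1920):
        """Decode the single frame nearest to timestamp t (in seconds)."""
        out = subprocess.run(
            ["ffmpeg", "-ss", str(t), "-i", video,
             "-frames:v", "1", "-pix_fmt", "rgb24", "-f", "rawvideo", "pipe:1"],
            stdout=subprocess.PIPE, check=True).stdout
        return np.frombuffer(out, dtype=np.uint8).reshape(H, W, 3)

    Because -ss appears before -i, ffmpeg seeks by keyframe before decoding, so each call costs roughly one GOP of decoding rather than a pass over the whole file.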

  • Encoding images into a movie file

    5 April 2014, by RuAware

    I am trying to save JPGs into a movie. I have tried jcodec, and although my S3 plays it fine, other devices do not, including VLC and Windows Media.

    I have just spent most of the day playing with MediaCodec; although the required SDK level is high, it will help people on Jelly Bean and above. But I cannot work out how to get the files to the encoder and then write the file.

    Ideally I want to support down to SDK 9/8.

    Has anyone got any code they can share, either to get MediaCodec to work or another option? If you say ffmpeg, I'd love to, but my JNI knowledge is non-existent and I will need a very good guide.

    Code for MediaCodec so far:

    public class EncodeAndMux extends AsyncTask<Integer, Void, Boolean> {
       private static int bitRate = 2000000;
       private static int MAX_INPUT = 100000;
       private static String mimeType = "video/avc";

       private int frameRate = 15;    
       private int colorFormat;
       private int stride = 1;
       private int sliceHeight = 2;        

       private MediaCodec encoder = null;
       private MediaFormat inputFormat;
       private MediaCodecInfo codecInfo = null;
       private MediaMuxer muxer;
       private boolean mMuxerStarted = false;
       private int mTrackIndex = 0;  
       private long presentationTime = 0;
       private Paint bmpPaint;

       private static int WAITTIME = 10000;
       private static String TAG = "ENCODE";

       private ArrayList<String> mFilePaths;
       private String mPath;

       private EncodeListener mListener;
       private int width = 320;
       private int height = 240;
       private double mSpeed = 1;

       public EncodeAndMux(ArrayList<String> filePaths, String savePath) {
           mFilePaths = filePaths;
           mPath = savePath;  

           // Create paint to draw BMP
           bmpPaint = new Paint();
           bmpPaint.setAntiAlias(true);
           bmpPaint.setFilterBitmap(true);
           bmpPaint.setDither(true);
       }

       public void setListner(EncodeListener listener) {
           mListener = listener;
       }

       // set the speed, how many frames a second
       public void setSpeed(int speed) {
           mSpeed = speed;
       }

       public double getSpeed() {
           return mSpeed;
       }

       private long computePresentationTime(int frameIndex) {
           final long ONE_SECOND = 1000000;
           return (long) (frameIndex * (ONE_SECOND / mSpeed));
       }

       public interface EncodeListener {
           public void finished();
           public void errored();
       }

       @TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR2)
       @Override
       protected Boolean doInBackground(Integer... params) {

           try {
               muxer = new MediaMuxer(mPath, OutputFormat.MUXER_OUTPUT_MPEG_4);
           } catch (Exception e){
               e.printStackTrace();
           }

           // Find a code that supports the mime type
           int numCodecs = MediaCodecList.getCodecCount();
           for (int i = 0; i < numCodecs && codecInfo == null; i++) {
               MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
               if (!info.isEncoder()) {
                   continue;
               }
               String[] types = info.getSupportedTypes();
               boolean found = false;

               for (int j = 0; j < types.length && !found; j++) {
                   if (types[j].equals(mimeType))
                       found = true;
               }

               if (!found)
                   continue;
               codecInfo = info;
           }


            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                    MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                    if (!info.isEncoder()) {
                        continue;
                    }

                    String[] types = info.getSupportedTypes();
                     for (int j = 0; j < types.length; ++j) {
                         if (!types[j].equals(mimeType)) // compare string contents, not references
                             continue;
                         MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(types[j]);
                         for (int k = 0; k < caps.profileLevels.length; k++) {
                             if (caps.profileLevels[k].profile == MediaCodecInfo.CodecProfileLevel.AVCProfileHigh && caps.profileLevels[k].level == MediaCodecInfo.CodecProfileLevel.AVCLevel4) {
                                codecInfo = info;
                            }
                        }
                    }
            }

           Log.d(TAG, "Found " + codecInfo.getName() + " supporting " + mimeType);

           MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
           for (int i = 0; i < capabilities.colorFormats.length && colorFormat == 0; i++) {
               int format = capabilities.colorFormats[i];
               switch (format) {
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
                   case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
                   colorFormat = format;
                   break;
               }
           }
           Log.d(TAG, "Using color format " + colorFormat);

           // Determine width, height and slice sizes
           if (codecInfo.getName().equals("OMX.TI.DUCATI1.VIDEO.H264E")) {
               // This codec doesn't support a width not a multiple of 16,
               // so round down.
               width &= ~15;
           }

           stride = width;
           sliceHeight = height;

           if (codecInfo.getName().startsWith("OMX.Nvidia.")) {
               stride = (stride + 15) / 16 * 16;
               sliceHeight = (sliceHeight + 15) / 16 * 16;
           }

           inputFormat = MediaFormat.createVideoFormat(mimeType, width, height);
           inputFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
           inputFormat.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
           inputFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
           inputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    //          inputFormat.setInteger("stride", stride);
    //          inputFormat.setInteger("slice-height", sliceHeight);
           inputFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, MAX_INPUT);

           encoder = MediaCodec.createByCodecName(codecInfo.getName());
           encoder.configure(inputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
           encoder.start();

           ByteBuffer[] inputBuffers = encoder.getInputBuffers();
           ByteBuffer[] outputBuffers = encoder.getOutputBuffers();

           int inputBufferIndex= -1, outputBufferIndex= -1;
           BufferInfo info = new BufferInfo();
           for (int i = 0; i < mFilePaths.size(); i++) {

               // use decode sample to calculate inSample size and then resize
               Bitmap bitmapIn = Images.decodeSampledBitmapFromPath(mFilePaths.get(i), width, height);  

               // Create blank bitmap
               Bitmap bitmap = Bitmap.createBitmap(width, height, Config.ARGB_8888);                  

               // Center scaled image
               Canvas canvas = new Canvas(bitmap);                
               canvas.drawBitmap(bitmapIn,(bitmap.getWidth()/2)-(bitmapIn.getWidth()/2),(bitmap.getHeight()/2)-(bitmapIn.getHeight()/2), bmpPaint);

               Log.d(TAG, "Bitmap width: " + bitmapIn.getWidth() + " height: " + bitmapIn.getHeight() + " WIDTH: " + width + " HEIGHT: " + height);
               byte[] dat = getNV12(width, height, bitmap);
               bitmap.recycle();

               // Exception occurred on the line below in the emulator (LINE No. 182)
               inputBufferIndex = encoder.dequeueInputBuffer(WAITTIME);
               Log.i("DAT", "Size= "+dat.length);

               if(inputBufferIndex >= 0){
                   int samplesiz= dat.length;
                   inputBuffers[inputBufferIndex].put(dat);
                   presentationTime = computePresentationTime(i);
                   if (i == mFilePaths.size() - 1) { // last frame (i could never equal size() inside the loop)
                       encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                       Log.i(TAG, "Last Frame");
                   } else {
                       encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
                   }

                   while(true) {
                      outputBufferIndex = encoder.dequeueOutputBuffer(info, WAITTIME);
                      Log.i("BATA", "outputBufferIndex="+outputBufferIndex);
                      if (outputBufferIndex >= 0) {
                          ByteBuffer encodedData = outputBuffers[outputBufferIndex];
                          if (encodedData == null) {
                              throw new RuntimeException("encoderOutputBuffer " + outputBufferIndex +
                                      " was null");
                          }

                           if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                              // The codec config data was pulled out and fed to the muxer when we got
                              // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                              Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                              info.size = 0;
                          }

                          if (info.size != 0) {
                              if (!mMuxerStarted) {
                                  throw new RuntimeException("muxer hasn&#39;t started");
                              }

                              // adjust the ByteBuffer values to match BufferInfo (not needed?)
                              encodedData.position(info.offset);
                              encodedData.limit(info.offset + info.size);

                              muxer.writeSampleData(mTrackIndex, encodedData, info);
                              Log.d(TAG, "sent " + info.size + " bytes to muxer");
                          }

                          encoder.releaseOutputBuffer(outputBufferIndex, false);

                          inputBuffers[inputBufferIndex].clear();
                          outputBuffers[outputBufferIndex].clear();

                           if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                              break;      // out of while
                          }

                      } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                          // Subsequent data will conform to new format.
                          MediaFormat opmediaformat = encoder.getOutputFormat();
                          if (!mMuxerStarted) {
                              mTrackIndex = muxer.addTrack(opmediaformat);
                              muxer.start();
                              mMuxerStarted = true;
                          }
                          Log.i(TAG, "op_buf_format_changed: " + opmediaformat);
                      } else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                          outputBuffers = encoder.getOutputBuffers();
                          Log.d(TAG, "Output Buffer changed " + outputBuffers);
                      } else if(outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                          // No Data, break out
                          break;
                      } else {
                          // Unexpected State, ignore it
                          Log.d(TAG, "Unexpected State " + outputBufferIndex);
                      }
                   }

               }    
           }

           if (encoder != null) {
               encoder.flush();
               encoder.stop();
               encoder.release();
               encoder = null;
           }

           if (muxer != null) {
               muxer.stop();
               muxer.release();
               muxer = null;
           }

           return true;

       };

       @Override
       protected void onPostExecute(Boolean result) {
           if (result) {
               if (mListener != null)
                   mListener.finished();
           } else {
               if (mListener != null)
                   mListener.errored();
           }
           super.onPostExecute(result);
       }



       byte [] getNV12(int inputWidth, int inputHeight, Bitmap scaled) {
           int [] argb = new int[inputWidth * inputHeight];
           scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);
           byte [] yuv = new byte[inputWidth*inputHeight*3/2];
           encodeYUV420SP(yuv, argb, inputWidth, inputHeight);
           scaled.recycle();
           return yuv;
       }


       void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
           final int frameSize = width * height;
           int yIndex = 0;
           int uvIndex = frameSize;
           int a, R, G, B, Y, U, V;
           int index = 0;
           for (int j = 0; j < height; j++) {
               for (int i = 0; i < width; i++) {

                   a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
                   R = (argb[index] & 0xff0000) >> 16;
                   G = (argb[index] & 0xff00) >> 8;
                   B = (argb[index] & 0xff) >> 0;

                   // well known RGB to YVU algorithm (note the labels are swapped
                   // relative to convention: 'V' here holds Cb and 'U' holds Cr,
                   // so writing V first below yields Cb,Cr order, i.e. NV12)
                   Y = ( (  66 * R + 129 * G +  25 * B + 128) >> 8) +  16;
                   V = ( ( -38 * R -  74 * G + 112 * B + 128) >> 8) + 128;
                   U = ( ( 112 * R -  94 * G -  18 * B + 128) >> 8) + 128;

                   yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
                   if (j % 2 == 0 && index % 2 == 0) {
                       yuv420sp[uvIndex++] = (byte)((V < 0) ? 0 : ((V > 255) ? 255 : V));
                       yuv420sp[uvIndex++] = (byte)((U < 0) ? 0 : ((U > 255) ? 255 : U));
                   }

                   index ++;
               }
           }
       }
                   }

                   index ++;
               }
           }
       }
    }

    This has now been tested on 4 of my devices and works fine. Is there a way to:

    1/ Calculate the MAX_INPUT (too high and on the N7 II it crashes; I don't want that happening once released)
    2/ Offer an API 16 solution?
    3/ Do I need stride and slice height?
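    For question 1, a minimal sketch (an assumption based on the YUV420 layout used above, not a documented formula): a raw YUV420 frame occupies width * height * 3/2 bytes, so the input size can be derived from the actual frame dimensions instead of hard-coding MAX_INPUT:

       // Hedged sketch: size the input buffer from the frame dimensions.
       // A YUV420 semi-planar frame holds width*height luma bytes plus
       // width*height/2 interleaved chroma bytes.
       private static int yuv420BufferSize(int width, int height) {
           return width * height * 3 / 2;
       }

       // usage when configuring the format:
       // inputFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, yuv420BufferSize(width, height));

    On question 2, note that MediaMuxer itself was added in API 18, so an API 16 solution would need a different muxer. On question 3, the code above only adjusts stride and sliceHeight for the OMX.TI and OMX.Nvidia codecs, and the "stride"/"slice-height" keys are left commented out, so they appear to matter only for codecs that require 16-aligned buffers.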

    Thanks