
Media (3)


Other articles (71)

  • Use, discuss, criticize

    13 April 2011, by

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

  • MediaSPIP Player: the controls

    26 May 2010, by

    The player's mouse controls
    In addition to the actions triggered by clicking the visible buttons of the player interface, other actions can also be performed with the mouse: Click: clicking on the video, or on the audio logo, toggles it between play and pause depending on its current state; Wheel (scrolling): when the mouse hovers over the area used by the media, the mouse wheel no longer scrolls the page as usual, but instead decreases or (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the MediaSPIP sources in the standalone version.
    To get a working installation, all the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

On other sites (10054)

  • How to extract the first frame from a video?

    10 August 2024, by Andrew

    I am trying to extract the first frame from a video to save it as a preview.

    


    Here's an almost working solution using ffmpeg:

    


    func extractFirstFrame(videoBytes []byte) ([]byte, error) {
        input := bytes.NewReader(videoBytes)

        var output bytes.Buffer

        err := ffmpeg.Input("pipe:0").
            Filter("select", ffmpeg.Args{"gte(n,1)"}).
            Output(
                "pipe:1",
                ffmpeg.KwArgs{
                    "vframes": 1,
                    "format":  "image2",
                    "vcodec":  "mjpeg",
                }).
            WithInput(input).
            WithOutput(&output).
            Run()

        if err != nil {
            return nil, fmt.Errorf("error extracting frame: %v", err)
        }

        return output.Bytes(), nil
    }


    


    The problem with this is that it can only process horizontal videos. For some reason it throws a 0xbebbb1b7 error for vertical videos. I don't understand why that is; probably because this is my very first time with ffmpeg.

    


    Also, I am concerned about whether this solution is optimal. My assumption is that the whole video is loaded into memory first, which is rather bad, and I would like to avoid that.

    


    I use https://github.com/u2takey/ffmpeg-go, but the errors are the same even if I run ffmpeg using exec:

    


    cmd := exec.Command(
        "ffmpeg",
        "-i", "pipe:0",
        "-vf", "select=eq(n\\,0)",
        "-q:v", "3",
        "-f", "image2",
        "pipe:1",
    )


    


    Interestingly enough, I noticed that running the command in the terminal works as expected with any type of video: ffmpeg -i .\video_2024-08-10_00-03-00.mp4 -vf "select=eq(n\,0)" -q:v 3 output_image.jpg. So the problem may lie in how I send those videos to my server. I use Fiber, send videos as multipart/form-data, and then read them like this:

    


    form_data, err := c.FormFile("image")
    file, err := form_data.Open()
    bytes, err := io.ReadAll(file)
    ....
    preview, err := extractFirstFrame(bytes)



    


    I also managed to find logs for the failed attempt:

    


    
ffmpeg version 7.0.2-full_build-www.gyan.dev Copyright (c) 2000-2024 the FFmpeg developers
  built with gcc 13.2.0 (Rev5, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libxevd --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxeve --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-dxva2 --enable-d3d11va --enable-d3d12va --enable-ffnvcodec --enable-libvpl --enable-nvdec --enable-nvenc --enable-vaapi --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
  libavutil      59.  8.100 / 59.  8.100
  libavcodec     61.  3.100 / 61.  3.100
  libavformat    61.  1.100 / 61.  1.100
  libavdevice    61.  1.100 / 61.  1.100
  libavfilter    10.  1.100 / 10.  1.100
  libswscale      8.  1.100 /  8.  1.100
  libswresample   5.  1.100 /  5.  1.100
  libpostproc    58.  1.100 / 58.  1.100
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002a562ff0e80] stream 0, offset 0x30: partial file
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002a562ff0e80] Could not find codec parameters for stream 1 (Video: h264 (avc1 / 0x31637661), none, 720x1280, 3093 kb/s): unspecified pixel format
Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'pipe:0':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    creation_time   : 2024-08-09T20:00:29.000000Z
  Duration: 00:00:06.80, start: 0.000000, bitrate: N/A
  Stream #0:0[0x1](eng): Audio: aac (mp4a / 0x6134706D), 22050 Hz, stereo, fltp, 74 kb/s (default)
      Metadata:
        creation_time   : 2024-08-09T20:00:23.000000Z
        handler_name    : SoundHandle
        vendor_id       : [0][0][0][0]
  Stream #0:1[0x2](eng): Video: h264 (avc1 / 0x31637661), none, 720x1280, 3093 kb/s, 30 fps, 30 tbr, 90k tbn (default)
      Metadata:
        creation_time   : 2024-08-09T20:00:23.000000Z
        handler_name    : VideoHandle
        vendor_id       : [0][0][0][0]
Stream mapping:
  Stream #0:1 -> #0:0 (h264 (native) -> mjpeg (native))
[mov,mp4,m4a,3gp,3g2,mj2 @ 000002a562ff0e80] stream 1, offset 0x5d: partial file
[in#0/mov,mp4,m4a,3gp,3g2,mj2 @ 000002a562fda2c0] Error during demuxing: Invalid data found when processing input
Cannot determine format of input 0:1 after EOF
[vf#0:0 @ 000002a5636c0ec0] Task finished with error code: -1094995529 (Invalid data found when processing input)
[vf#0:0 @ 000002a5636c0ec0] Terminating thread with return code -1094995529 (Invalid data found when processing input)
[vost#0:0/mjpeg @ 000002a5636c0400] Could not open encoder before EOF
[vost#0:0/mjpeg @ 000002a5636c0400] Task finished with error code: -22 (Invalid argument)
[vost#0:0/mjpeg @ 000002a5636c0400] Terminating thread with return code -22 (Invalid argument)
[out#0/image2 @ 000002a562ff5640] Nothing was written into output file, because at least one of its streams received no packets.
frame=    0 fps=0.0 q=0.0 Lsize=       0KiB time=N/A bitrate=N/A speed=N/A
Conversion failed!


    


    Logs for the same video, but run via the terminal:

    


    ffmpeg version 7.0.2-full_build-www.gyan.dev Copyright (c) 2000-2024 the FFmpeg developers
  built with gcc 13.2.0 (Rev5, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libxevd --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxeve --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-dxva2 --enable-d3d11va --enable-d3d12va --enable-ffnvcodec --enable-libvpl --enable-nvdec --enable-nvenc --enable-vaapi --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
  libavutil      59.  8.100 / 59.  8.100
  libavcodec     61.  3.100 / 61.  3.100
  libavformat    61.  1.100 / 61.  1.100
  libavdevice    61.  1.100 / 61.  1.100
  libavfilter    10.  1.100 / 10.  1.100
  libswscale      8.  1.100 /  8.  1.100
  libswresample   5.  1.100 /  5.  1.100
  libpostproc    58.  1.100 / 58.  1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '.\video_2024-08-10_00-03-00.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    creation_time   : 2024-08-09T20:00:29.000000Z
  Duration: 00:00:06.80, start: 0.000000, bitrate: 3175 kb/s
  Stream #0:0[0x1](eng): Audio: aac (HE-AAC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 74 kb/s (default)
      Metadata:
        creation_time   : 2024-08-09T20:00:23.000000Z
        handler_name    : SoundHandle
        vendor_id       : [0][0][0][0]
  Stream #0:1[0x2](eng): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p(tv, bt709, progressive), 720x1280, 3093 kb/s, 30 fps, 30 tbr, 90k tbn (default)
      Metadata:
        creation_time   : 2024-08-09T20:00:23.000000Z
        handler_name    : VideoHandle
        vendor_id       : [0][0][0][0]
Stream mapping:
  Stream #0:1 -> #0:0 (h264 (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 00000219eb4d0c00] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to 'output_image.jpg':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf61.1.100
  Stream #0:0(eng): Video: mjpeg, yuvj420p(pc, unknown/bt709/bt709, progressive), 720x1280, q=2-31, 200 kb/s, 30 fps, 30 tbn (default)
      Metadata:
        creation_time   : 2024-08-09T20:00:23.000000Z
        handler_name    : VideoHandle
        vendor_id       : [0][0][0][0]
        encoder         : Lavc61.3.100 mjpeg
      Side data:
        cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
[image2 @ 00000219e7aa51c0] The specified filename 'output_image.jpg' does not contain an image sequence pattern or a pattern is invalid.
[image2 @ 00000219e7aa51c0] Use a pattern such as %03d for an image sequence or use the -update option (with -frames:v 1 if needed) to write a single image.
[out#0/image2 @ 00000219e7b5e400] video:49KiB audio:0KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: unknown
frame=    1 fps=0.0 q=3.0 Lsize=N/A time=00:00:00.03 bitrate=N/A speed=0.322x


    


    I also realized that the problem was not related to the video orientation, but to videos from a specific source uploaded specifically to my server 🤡 Such a weird coincidence that I decided to test my video upload with those exact files, but anyway. And those are.... downloaded Instagram reels lol. But again, ffmpeg works just fine when run in the terminal.

    


  • FFmpeg.Autogen: Issue with Zero-Sized Atom Boxes in MP4 Output

    16 June 2024, by Alexander Jansson

    I just started learning ffmpeg, using the ffmpeg.autogen wrapper version 5.1 in C# with ffmpeg shared libs version 5.1. I'm trying to build a class which screen-records using gdigrab and produces streamable MP4 to a buffer/event. Everything seems to work as it is supposed to, with no errors, except that the output stream produces atom boxes with size 0 (and hence a small file size as well); no data seems to be produced in the boxes. The "debug test mp4 file" was analyzed with MP4Box, and the box info is provided in the thread.

    


    To be more specific: why does this code produce empty atom boxes? Is someone able to make the produced data actually contain any frame data from gdigrab by editing my code?

    


    Code:

    


     public unsafe class ScreenStreamer : IDisposable
 {
     private readonly AVCodec* productionCodec;
     private readonly AVCodec* screenCaptureAVCodec;
     private readonly AVCodecContext* productionAVCodecContext;
     private readonly AVFormatContext* productionFormatContext;
     private readonly AVCodecContext* screenCaptureAVCodecContext;
     private readonly AVDictionary* productionAVCodecOptions;
     private readonly AVInputFormat* screenCaptureInputFormat;
     private readonly AVFormatContext* screenCaptureInputFormatContext;
     private readonly int gDIGrabVideoStreamIndex;
     private readonly System.Drawing.Size screenBounds;
     private readonly int _produceAtleastAmount;
     public EventHandler OnNewVideoDataProduced;
     private MemoryStream unsafeToManagedBridgeBuffer;
     private CancellationTokenSource cancellationTokenSource;
     private Task recorderTask;

     public ScreenStreamer(int fps, int bitrate, int screenIndex, int produceAtleastAmount = 1000)
     {
         ffmpeg.avdevice_register_all();
         ffmpeg.avformat_network_init();
         recorderTask = Task.CompletedTask;
         cancellationTokenSource = new CancellationTokenSource();
         unsafeToManagedBridgeBuffer = new MemoryStream();
         _produceAtleastAmount = produceAtleastAmount;

         // Allocate and initialize production codec and context
         productionCodec = ffmpeg.avcodec_find_encoder(AVCodecID.AV_CODEC_ID_H264);
         if (productionCodec == null) throw new ApplicationException("Could not find encoder for codec ID H264.");

         productionAVCodecContext = ffmpeg.avcodec_alloc_context3(productionCodec);
         if (productionAVCodecContext == null) throw new ApplicationException("Could not allocate video codec context.");

         // Set codec parameters
         screenBounds = RetrieveScreenBounds(screenIndex);
         productionAVCodecContext->width = screenBounds.Width;
         productionAVCodecContext->height = screenBounds.Height;
         productionAVCodecContext->time_base = new AVRational() { den = fps, num = 1 };
         productionAVCodecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;
         productionAVCodecContext->bit_rate = bitrate;

         int result = ffmpeg.av_opt_set(productionAVCodecContext->priv_data, "preset", "veryfast", 0);
         if (result != 0)
         {
             throw new ApplicationException($"Failed to set options with error code {result}.");
         }

         // Open codec
         fixed (AVDictionary** pm = &productionAVCodecOptions)
         {
             result = ffmpeg.av_dict_set(pm, "movflags", "frag_keyframe+empty_moov+default_base_moof", 0);
             if (result < 0)
             {
                 throw new ApplicationException($"Failed to set dictionary with error code {result}.");
             }

             result = ffmpeg.avcodec_open2(productionAVCodecContext, productionCodec, pm);
             if (result < 0)
             {
                 throw new ApplicationException($"Failed to open codec with error code {result}.");
             }
         }

         // Allocate and initialize screen capture codec and context
         screenCaptureInputFormat = ffmpeg.av_find_input_format("gdigrab");
         if (screenCaptureInputFormat == null) throw new ApplicationException("Could not find input format gdigrab.");

         fixed (AVFormatContext** ps = &screenCaptureInputFormatContext)
         {
             result = ffmpeg.avformat_open_input(ps, "desktop", screenCaptureInputFormat, null);
             if (result < 0)
             {
                 throw new ApplicationException($"Failed to open input with error code {result}.");
             }

             result = ffmpeg.avformat_find_stream_info(screenCaptureInputFormatContext, null);
             if (result < 0)
             {
                 throw new ApplicationException($"Failed to find stream info with error code {result}.");
             }
         }

         gDIGrabVideoStreamIndex = -1;
         for (int i = 0; i < screenCaptureInputFormatContext->nb_streams; i++)
         {
             if (screenCaptureInputFormatContext->streams[i]->codecpar->codec_type == AVMediaType.AVMEDIA_TYPE_VIDEO)
             {
                 gDIGrabVideoStreamIndex = i;
                 break;
             }
         }

         if (gDIGrabVideoStreamIndex < 0)
         {
             throw new ApplicationException("Failed to find video stream in input.");
         }

         AVCodecParameters* codecParameters = screenCaptureInputFormatContext->streams[gDIGrabVideoStreamIndex]->codecpar;
         screenCaptureAVCodec = ffmpeg.avcodec_find_decoder(codecParameters->codec_id);
         if (screenCaptureAVCodec == null)
         {
             throw new ApplicationException("Could not find decoder for screen capture.");
         }

         screenCaptureAVCodecContext = ffmpeg.avcodec_alloc_context3(screenCaptureAVCodec);
         if (screenCaptureAVCodecContext == null)
         {
             throw new ApplicationException("Could not allocate screen capture codec context.");
         }

         result = ffmpeg.avcodec_parameters_to_context(screenCaptureAVCodecContext, codecParameters);
         if (result < 0)
         {
             throw new ApplicationException($"Failed to copy codec parameters to context with error code {result}.");
         }

         result = ffmpeg.avcodec_open2(screenCaptureAVCodecContext, screenCaptureAVCodec, null);
         if (result < 0)
         {
             throw new ApplicationException($"Failed to open screen capture codec with error code {result}.");
         }
     }

     public void Start()
     {
         recorderTask = Task.Run(() =>
         {
             AVPacket* packet = ffmpeg.av_packet_alloc();
             AVFrame* rawFrame = ffmpeg.av_frame_alloc();
             AVFrame* compatibleFrame = null;
             byte* dstBuffer = null;

             try
             {
                 while (!cancellationTokenSource.Token.IsCancellationRequested)
                 {
                     if (ffmpeg.av_read_frame(screenCaptureInputFormatContext, packet) >= 0)
                     {
                         if (packet->stream_index == gDIGrabVideoStreamIndex)
                         {
                             int response = ffmpeg.avcodec_send_packet(screenCaptureAVCodecContext, packet);
                             if (response < 0)
                             {
                                 throw new ApplicationException($"Error while sending a packet to the decoder: {response}");
                             }

                             response = ffmpeg.avcodec_receive_frame(screenCaptureAVCodecContext, rawFrame);
                             if (response == ffmpeg.AVERROR(ffmpeg.EAGAIN) || response == ffmpeg.AVERROR_EOF)
                             {
                                 continue;
                             }
                             else if (response < 0)
                             {
                                 throw new ApplicationException($"Error while receiving a frame from the decoder: {response}");
                             }

                             compatibleFrame = ConvertToCompatiblePixelFormat(rawFrame, out dstBuffer);

                             response = ffmpeg.avcodec_send_frame(productionAVCodecContext, compatibleFrame);
                             if (response < 0)
                             {
                                 throw new ApplicationException($"Error while sending a frame to the encoder: {response}");
                             }

                             while (response >= 0)
                             {
                                 response = ffmpeg.avcodec_receive_packet(productionAVCodecContext, packet);
                                 if (response == ffmpeg.AVERROR(ffmpeg.EAGAIN) || response == ffmpeg.AVERROR_EOF)
                                 {
                                     break;
                                 }
                                 else if (response < 0)
                                 {
                                     throw new ApplicationException($"Error while receiving a packet from the encoder: {response}");
                                 }

                                 using var packetStream = new UnmanagedMemoryStream(packet->data, packet->size);
                                 packetStream.CopyTo(unsafeToManagedBridgeBuffer);
                                 byte[] managedBytes = unsafeToManagedBridgeBuffer.ToArray();
                                 OnNewVideoDataProduced?.Invoke(this, managedBytes);
                                 unsafeToManagedBridgeBuffer.SetLength(0);
                             }
                         }
                     }
                     ffmpeg.av_packet_unref(packet);
                     ffmpeg.av_frame_unref(rawFrame);
                     if (compatibleFrame != null)
                     {
                         ffmpeg.av_frame_unref(compatibleFrame);
                         ffmpeg.av_free(dstBuffer);
                     }
                 }
             }
             finally
             {
                 ffmpeg.av_packet_free(&packet);
                 ffmpeg.av_frame_free(&rawFrame);
                 if (compatibleFrame != null)
                 {
                     ffmpeg.av_frame_free(&compatibleFrame);
                 }
             }
         });
     }

     public AVFrame* ConvertToCompatiblePixelFormat(AVFrame* srcFrame, out byte* dstBuffer)
     {
         AVFrame* dstFrame = ffmpeg.av_frame_alloc();
         int buffer_size = ffmpeg.av_image_get_buffer_size(productionAVCodecContext->pix_fmt, productionAVCodecContext->width, productionAVCodecContext->height, 1);
         byte_ptrArray4 dstData = new byte_ptrArray4();
         int_array4 dstLinesize = new int_array4();
         dstBuffer = (byte*)ffmpeg.av_malloc((ulong)buffer_size);
         ffmpeg.av_image_fill_arrays(ref dstData, ref dstLinesize, dstBuffer, productionAVCodecContext->pix_fmt, productionAVCodecContext->width, productionAVCodecContext->height, 1);

         dstFrame->format = (int)productionAVCodecContext->pix_fmt;
         dstFrame->width = productionAVCodecContext->width;
         dstFrame->height = productionAVCodecContext->height;
         dstFrame->data.UpdateFrom(dstData);
         dstFrame->linesize.UpdateFrom(dstLinesize);

         SwsContext* swsCtx = ffmpeg.sws_getContext(
             srcFrame->width, srcFrame->height, (AVPixelFormat)srcFrame->format,
             productionAVCodecContext->width, productionAVCodecContext->height, productionAVCodecContext->pix_fmt,
             ffmpeg.SWS_BILINEAR, null, null, null);

         if (swsCtx == null)
         {
             throw new ApplicationException("Could not initialize the conversion context.");
         }

         ffmpeg.sws_scale(swsCtx, srcFrame->data, srcFrame->linesize, 0, srcFrame->height, dstFrame->data, dstFrame->linesize);
         ffmpeg.sws_freeContext(swsCtx);
         return dstFrame;
     }

     private System.Drawing.Size RetrieveScreenBounds(int screenIndex)
     {
         return new System.Drawing.Size(1920, 1080);
     }

     public void Dispose()
     {
         cancellationTokenSource?.Cancel();
         recorderTask?.Wait();
         cancellationTokenSource?.Dispose();
         recorderTask?.Dispose();
         unsafeToManagedBridgeBuffer?.Dispose();

         fixed (AVCodecContext** p = &productionAVCodecContext)
         {
             if (*p != null)
             {
                 ffmpeg.avcodec_free_context(p);
             }
         }
         fixed (AVCodecContext** p = &screenCaptureAVCodecContext)
         {
             if (*p != null)
             {
                 ffmpeg.avcodec_free_context(p);
             }
         }

         if (productionFormatContext != null)
         {
             ffmpeg.avformat_free_context(productionFormatContext);
         }

         if (screenCaptureInputFormatContext != null)
         {
             ffmpeg.avformat_free_context(screenCaptureInputFormatContext);
         }

         if (productionAVCodecOptions != null)
         {
             fixed (AVDictionary** p = &productionAVCodecOptions)
             {
                 ffmpeg.av_dict_free(p);
             }
         }
     }
 }


    


    I call the Start method and wait 8 seconds; out of scope, I write the bytes to an mp4 file without writing the trailer, just to debug the atom boxes. The mp4 debugging box output I got:

    


    (Full OUTPUT)
https://pastebin.com/xkM4MfG7

    



    


    (Not full)

    


    <boxes>
    <uuidbox size="0" type="uuid" uuid="{00000000-00000000-00000000-00000000}" specification="unknown" container="unknown">
    </uuidbox>
    <trackreferencetypebox size="0" type="cdsc" specification="p12" container="tref">
    <trackreferenceentry trackid=""></trackreferenceentry>
    </trackreferencetypebox>
    <trackreferencetypebox size="0" type="hint" specification="p12" container="tref">
    <trackreferenceentry trackid=""></trackreferenceentry>
    </trackreferencetypebox>
    [... the dump continues in the same pattern, truncated here: every trackreferencetypebox, itemreferencebox and samplegroupdescriptionbox is reported with size="0" and empty attributes ...]
size=""></defaultsamplegroupdescriptionentry>&#xA;</samplegroupdescriptionbox>&#xA;<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">&#xA;<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>&#xA;</samplegroupdescriptionbox>&#xA;<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">&#xA;<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>&#xA;</samplegroupdescriptionbox>&#xA;<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">&#xA;<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>&#xA;</samplegroupdescriptionbox>&#xA;<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">&#xA;<syncsamplegroupentry></syncsamplegroupentry>&#xA;</samplegroupdescriptionbox>&#xA;<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">&#xA;<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>&#xA;</samplegroupdescriptionbox>&#xA;<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">&#xA;<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>&#xA;</samplegroupdescriptionbox>&#xA;<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">&#xA;<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>&#xA;</samplegroupdescriptionbox>&#xA;<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">&#xA;<subpictureorderentry refs=""></subpictureorderentry>&#xA;</samplegroupdescriptionbox>&#xA;<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="3gpp" 
container="stbl traf">&#xA;<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>&#xA;</samplegroupdescriptionbox>&#xA;<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="3gpp" container="stbl traf">&#xA;<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>&#xA;</samplegroupdescriptionbox>&#xA;<sampledescriptionentrybox size="0" type="GNRM" specification="unknown" container="stsd" extensiondatasize="0">&#xA;</sampledescriptionentrybox>&#xA;<visualsampledescriptionbox size="0" type="GNRV" specification="unknown" container="stsd" version="0" revision="0" vendor="0" temporalquality="0" spacialquality="0" width="0" height="0" horizontalresolution="4718592" verticalresolution="4718592" compressorname="" bitdepth="24">&#xA;</visualsampledescriptionbox>&#xA;<audiosampledescriptionbox size="0" type="GNRA" specification="unknown" container="stsd" version="0" revision="0" vendor="0" channelcount="2" bitspersample="16" samplerate="0">&#xA;</audiosampledescriptionbox>&#xA;<trackgrouptypebox size="0" type="msrc" version="0" flags="0" specification="p12" container="trgr">&#xA;</trackgrouptypebox>&#xA;<trackgrouptypebox size="0" type="ster" version="0" flags="0" specification="p12" container="trgr">&#xA;</trackgrouptypebox>&#xA;<trackgrouptypebox size="0" type="cstg" version="0" flags="0" specification="p15" container="trgr">&#xA;</trackgrouptypebox>&#xA;<freespacebox size="0" type="free" specification="p12" container="*">&#xA;</freespacebox>&#xA;<freespacebox size="0" type="free" specification="p12" container="*">&#xA;</freespacebox>&#xA;<mediadatabox size="0" type="mdat" specification="p12" container="file">&#xA;</mediadatabox>&#xA;<mediadatabox size="0" type="mdat" specification="p12" container="meta">&#xA;"&#xA;</mediadatabox></boxes>

    &#xA;

    • FFmpeg.Autogen Wrapper: Issue with Zero-Sized Atom Boxes in MP4 Output

    11 June 2024, by Alexander Jansson

    I just started learning FFmpeg using the FFmpeg.AutoGen wrapper version 5.1 in C#, with the FFmpeg shared libraries version 5.1. I'm trying to build a class that screen-records using gdigrab and delivers streamable MP4 data to a buffer/event. Everything seems to work as expected, with no errors, except that the output stream produces atom boxes with a size of 0 (and hence a small file size as well); no data seems to end up in the boxes. The "debug test MP4 file" was analyzed with MP4Box, and the box info is provided in the thread.


    To be more specific: why does this code produce empty atom boxes? Can someone edit my code so that the produced data actually contains frame data from the gdigrab capture?
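    One way to narrow this down is to look at the first bytes the event delivers. An MP4 stream begins with a box header whose four-character type (bytes 4–7 of the stream) is normally "ftyp", whereas raw H.264 straight out of the encoder begins with an Annex B start code (00 00 01 or 00 00 00 01). A small hypothetical helper, not part of the original class:

    ```csharp
    // Sketch of a diagnostic helper (hypothetical, for illustration only):
    // classify the first produced buffer as MP4, raw H.264 Annex B, or unknown.
    static string SniffBitstream(byte[] produced)
    {
        // MP4: 4-byte box size followed by the box type, usually "ftyp".
        if (produced.Length >= 8 &&
            produced[4] == (byte)'f' && produced[5] == (byte)'t' &&
            produced[6] == (byte)'y' && produced[7] == (byte)'p')
            return "mp4";

        // H.264 Annex B: 3- or 4-byte start code 00 00 (00) 01.
        if (produced.Length >= 4 &&
            produced[0] == 0 && produced[1] == 0 &&
            (produced[2] == 1 || (produced[2] == 0 && produced[3] == 1)))
            return "h264-annexb";

        return "unknown";
    }
    ```

    If the result is "h264-annexb", the bytes being written to the debug file are an elementary stream, not an MP4 container, which would explain why MP4Box finds no populated boxes.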


    Code:


    public unsafe class ScreenStreamer : IDisposable
    {
        private readonly AVCodec* productionCodec;
        private readonly AVCodec* screenCaptureAVCodec;
        private readonly AVCodecContext* productionAVCodecContext;
        private readonly AVFormatContext* productionFormatContext;
        private readonly AVCodecContext* screenCaptureAVCodecContext;
        private readonly AVDictionary* productionAVCodecOptions;
        private readonly AVInputFormat* screenCaptureInputFormat;
        private readonly AVFormatContext* screenCaptureInputFormatContext;
        private readonly int gDIGrabVideoStreamIndex;
        private readonly System.Drawing.Size screenBounds;
        private readonly int _produceAtleastAmount;
        public EventHandler<byte[]> OnNewVideoDataProduced;
        private MemoryStream unsafeToManagedBridgeBuffer;
        private CancellationTokenSource cancellationTokenSource;
        private Task recorderTask;

        public ScreenStreamer(int fps, int bitrate, int screenIndex, int produceAtleastAmount = 1000)
        {
            ffmpeg.avdevice_register_all();
            ffmpeg.avformat_network_init();
            recorderTask = Task.CompletedTask;
            cancellationTokenSource = new CancellationTokenSource();
            unsafeToManagedBridgeBuffer = new MemoryStream();
            _produceAtleastAmount = produceAtleastAmount;

            // Allocate and initialize production codec and context
            productionCodec = ffmpeg.avcodec_find_encoder(AVCodecID.AV_CODEC_ID_H264);
            if (productionCodec == null) throw new ApplicationException("Could not find encoder for codec ID H264.");

            productionAVCodecContext = ffmpeg.avcodec_alloc_context3(productionCodec);
            if (productionAVCodecContext == null) throw new ApplicationException("Could not allocate video codec context.");

            // Set codec parameters
            screenBounds = RetrieveScreenBounds(screenIndex);
            productionAVCodecContext->width = screenBounds.Width;
            productionAVCodecContext->height = screenBounds.Height;
            productionAVCodecContext->time_base = new AVRational() { den = fps, num = 1 };
            productionAVCodecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;
            productionAVCodecContext->bit_rate = bitrate;

            int result = ffmpeg.av_opt_set(productionAVCodecContext->priv_data, "preset", "veryfast", 0);
            if (result != 0)
            {
                throw new ApplicationException($"Failed to set options with error code {result}.");
            }

            // Open codec
            fixed (AVDictionary** pm = &productionAVCodecOptions)
            {
                result = ffmpeg.av_dict_set(pm, "movflags", "frag_keyframe+empty_moov+default_base_moof", 0);
                if (result < 0)
                {
                    throw new ApplicationException($"Failed to set dictionary with error code {result}.");
                }

                result = ffmpeg.avcodec_open2(productionAVCodecContext, productionCodec, pm);
                if (result < 0)
                {
                    throw new ApplicationException($"Failed to open codec with error code {result}.");
                }
            }

            // Allocate and initialize screen capture codec and context
            screenCaptureInputFormat = ffmpeg.av_find_input_format("gdigrab");
            if (screenCaptureInputFormat == null) throw new ApplicationException("Could not find input format gdigrab.");

            fixed (AVFormatContext** ps = &screenCaptureInputFormatContext)
            {
                result = ffmpeg.avformat_open_input(ps, "desktop", screenCaptureInputFormat, null);
                if (result < 0)
                {
                    throw new ApplicationException($"Failed to open input with error code {result}.");
                }

                result = ffmpeg.avformat_find_stream_info(screenCaptureInputFormatContext, null);
                if (result < 0)
                {
                    throw new ApplicationException($"Failed to find stream info with error code {result}.");
                }
            }

            gDIGrabVideoStreamIndex = -1;
            for (int i = 0; i < screenCaptureInputFormatContext->nb_streams; i++)
            {
                if (screenCaptureInputFormatContext->streams[i]->codecpar->codec_type == AVMediaType.AVMEDIA_TYPE_VIDEO)
                {
                    gDIGrabVideoStreamIndex = i;
                    break;
                }
            }

            if (gDIGrabVideoStreamIndex < 0)
            {
                throw new ApplicationException("Failed to find video stream in input.");
            }

            AVCodecParameters* codecParameters = screenCaptureInputFormatContext->streams[gDIGrabVideoStreamIndex]->codecpar;
            screenCaptureAVCodec = ffmpeg.avcodec_find_decoder(codecParameters->codec_id);
            if (screenCaptureAVCodec == null)
            {
                throw new ApplicationException("Could not find decoder for screen capture.");
            }

            screenCaptureAVCodecContext = ffmpeg.avcodec_alloc_context3(screenCaptureAVCodec);
            if (screenCaptureAVCodecContext == null)
            {
                throw new ApplicationException("Could not allocate screen capture codec context.");
            }

            result = ffmpeg.avcodec_parameters_to_context(screenCaptureAVCodecContext, codecParameters);
            if (result < 0)
            {
                throw new ApplicationException($"Failed to copy codec parameters to context with error code {result}.");
            }

            result = ffmpeg.avcodec_open2(screenCaptureAVCodecContext, screenCaptureAVCodec, null);
            if (result < 0)
            {
                throw new ApplicationException($"Failed to open screen capture codec with error code {result}.");
            }
        }

        public void Start()
        {
            recorderTask = Task.Run(() =>
            {
                AVPacket* packet = ffmpeg.av_packet_alloc();
                AVFrame* rawFrame = ffmpeg.av_frame_alloc();
                AVFrame* compatibleFrame = null;
                byte* dstBuffer = null;

                try
                {
                    while (!cancellationTokenSource.Token.IsCancellationRequested)
                    {
                        if (ffmpeg.av_read_frame(screenCaptureInputFormatContext, packet) >= 0)
                        {
                            if (packet->stream_index == gDIGrabVideoStreamIndex)
                            {
                                int response = ffmpeg.avcodec_send_packet(screenCaptureAVCodecContext, packet);
                                if (response < 0)
                                {
                                    throw new ApplicationException($"Error while sending a packet to the decoder: {response}");
                                }

                                response = ffmpeg.avcodec_receive_frame(screenCaptureAVCodecContext, rawFrame);
                                if (response == ffmpeg.AVERROR(ffmpeg.EAGAIN) || response == ffmpeg.AVERROR_EOF)
                                {
                                    continue;
                                }
                                else if (response < 0)
                                {
                                    throw new ApplicationException($"Error while receiving a frame from the decoder: {response}");
                                }

                                compatibleFrame = ConvertToCompatiblePixelFormat(rawFrame, out dstBuffer);

                                response = ffmpeg.avcodec_send_frame(productionAVCodecContext, compatibleFrame);
                                if (response < 0)
                                {
                                    throw new ApplicationException($"Error while sending a frame to the encoder: {response}");
                                }

                                while (response >= 0)
                                {
                                    response = ffmpeg.avcodec_receive_packet(productionAVCodecContext, packet);
                                    if (response == ffmpeg.AVERROR(ffmpeg.EAGAIN) || response == ffmpeg.AVERROR_EOF)
                                    {
                                        break;
                                    }
                                    else if (response < 0)
                                    {
                                        throw new ApplicationException($"Error while receiving a packet from the encoder: {response}");
                                    }

                                    using var packetStream = new UnmanagedMemoryStream(packet->data, packet->size);
                                    packetStream.CopyTo(unsafeToManagedBridgeBuffer);
                                    byte[] managedBytes = unsafeToManagedBridgeBuffer.ToArray();
                                    OnNewVideoDataProduced?.Invoke(this, managedBytes);
                                    unsafeToManagedBridgeBuffer.SetLength(0);
                                }
                            }
                        }
                        ffmpeg.av_packet_unref(packet);
                        ffmpeg.av_frame_unref(rawFrame);
                        if (compatibleFrame != null)
                        {
                            ffmpeg.av_frame_unref(compatibleFrame);
                            ffmpeg.av_free(dstBuffer);
                        }
                    }
                }
                finally
                {
                    ffmpeg.av_packet_free(&packet);
                    ffmpeg.av_frame_free(&rawFrame);
                    if (compatibleFrame != null)
                    {
                        ffmpeg.av_frame_free(&compatibleFrame);
                    }
                }
            });
        }

        public AVFrame* ConvertToCompatiblePixelFormat(AVFrame* srcFrame, out byte* dstBuffer)
        {
            AVFrame* dstFrame = ffmpeg.av_frame_alloc();
            int buffer_size = ffmpeg.av_image_get_buffer_size(productionAVCodecContext->pix_fmt, productionAVCodecContext->width, productionAVCodecContext->height, 1);
            byte_ptrArray4 dstData = new byte_ptrArray4();
            int_array4 dstLinesize = new int_array4();
            dstBuffer = (byte*)ffmpeg.av_malloc((ulong)buffer_size);
            ffmpeg.av_image_fill_arrays(ref dstData, ref dstLinesize, dstBuffer, productionAVCodecContext->pix_fmt, productionAVCodecContext->width, productionAVCodecContext->height, 1);

            dstFrame->format = (int)productionAVCodecContext->pix_fmt;
            dstFrame->width = productionAVCodecContext->width;
            dstFrame->height = productionAVCodecContext->height;
            dstFrame->data.UpdateFrom(dstData);
            dstFrame->linesize.UpdateFrom(dstLinesize);

            SwsContext* swsCtx = ffmpeg.sws_getContext(
                srcFrame->width, srcFrame->height, (AVPixelFormat)srcFrame->format,
                productionAVCodecContext->width, productionAVCodecContext->height, productionAVCodecContext->pix_fmt,
                ffmpeg.SWS_BILINEAR, null, null, null);

            if (swsCtx == null)
            {
                throw new ApplicationException("Could not initialize the conversion context.");
            }

            ffmpeg.sws_scale(swsCtx, srcFrame->data, srcFrame->linesize, 0, srcFrame->height, dstFrame->data, dstFrame->linesize);
            ffmpeg.sws_freeContext(swsCtx);
            return dstFrame;
        }

        private System.Drawing.Size RetrieveScreenBounds(int screenIndex)
        {
            return new System.Drawing.Size(1920, 1080);
        }

        public void Dispose()
        {
            cancellationTokenSource?.Cancel();
            recorderTask?.Wait();
            cancellationTokenSource?.Dispose();
            recorderTask?.Dispose();
            unsafeToManagedBridgeBuffer?.Dispose();

            fixed (AVCodecContext** p = &productionAVCodecContext)
            {
                if (*p != null)
                {
                    ffmpeg.avcodec_free_context(p);
                }
            }
            fixed (AVCodecContext** p = &screenCaptureAVCodecContext)
            {
                if (*p != null)
                {
                    ffmpeg.avcodec_free_context(p);
                }
            }

            if (productionFormatContext != null)
            {
                ffmpeg.avformat_free_context(productionFormatContext);
            }

            if (screenCaptureInputFormatContext != null)
            {
                ffmpeg.avformat_free_context(screenCaptureInputFormatContext);
            }

            if (productionAVCodecOptions != null)
            {
                fixed (AVDictionary** p = &productionAVCodecOptions)
                {
                    ffmpeg.av_dict_free(p);
                }
            }
        }
    }


    I call the Start method and wait 8 seconds; outside this scope I write the bytes to an MP4 file, without writing a trailer, just to debug the atom boxes. This is the MP4 debugging box output I got:
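    Reading the code, one likely explanation for the empty boxes is that no muxer is ever created: avcodec_receive_packet yields raw H.264 bitstream packets with no container around them, and the movflags dictionary is passed to avcodec_open2, where it has no effect, because movflags is an option of the MP4 *muxer*, not of the codec. Below is a minimal sketch of the muxing step that appears to be missing, assuming the FFmpeg.AutoGen 5.1 bindings and writing to a file for simplicity (streaming into the existing event would additionally need a custom write callback set up via avio_alloc_context); error handling is trimmed:

    ```csharp
    // Sketch (assumptions: FFmpeg.AutoGen 5.1; productionAVCodecContext is the
    // opened H.264 encoder context from the question; "debug.mp4" is arbitrary).
    AVFormatContext* muxerContext = null;
    ffmpeg.avformat_alloc_output_context2(&muxerContext, null, "mp4", "debug.mp4");

    // One video stream whose parameters mirror the encoder.
    AVStream* videoStream = ffmpeg.avformat_new_stream(muxerContext, null);
    ffmpeg.avcodec_parameters_from_context(videoStream->codecpar, productionAVCodecContext);
    videoStream->time_base = productionAVCodecContext->time_base;

    // The fragmentation flags belong here, on the muxer, not in the codec dict.
    AVDictionary* muxerOptions = null;
    ffmpeg.av_dict_set(&muxerOptions, "movflags",
        "frag_keyframe+empty_moov+default_base_moof", 0);

    ffmpeg.avio_open(&muxerContext->pb, "debug.mp4", ffmpeg.AVIO_FLAG_WRITE);
    ffmpeg.avformat_write_header(muxerContext, &muxerOptions);

    // Inside the encode loop, instead of copying packet->data into a MemoryStream:
    //     packet->stream_index = videoStream->index;
    //     ffmpeg.av_packet_rescale_ts(packet, productionAVCodecContext->time_base,
    //                                 videoStream->time_base);
    //     ffmpeg.av_interleaved_write_frame(muxerContext, packet);

    // On shutdown, finalize and release the container.
    ffmpeg.av_write_trailer(muxerContext);
    ffmpeg.avio_closep(&muxerContext->pb);
    ffmpeg.avformat_free_context(muxerContext);
    ```

    With frag_keyframe+empty_moov the muxer emits an initial ftyp/moov pair followed by moof/mdat fragments, which is what a streamable MP4 consumer (and MP4Box) expects to find.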


    (Full output) https://pastebin.com/xkM4MfG7




    (Not full)


    &#xA;&#xA;"&#xA;<boxes>&#xA;<uuidbox size="0" type="uuid" uuid="{00000000-00000000-00000000-00000000}" specification="unknown" container="unknown">&#xA;</uuidbox>&#xA;<trackreferencetypebox size="0" type="cdsc" specification="p12" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="hint" specification="p12" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="font" specification="p12" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="hind" specification="p12" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="vdep" specification="p12" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="vplx" specification="p12" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="subt" specification="p12" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="thmb" specification="p12" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="mpod" specification="p14" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="dpnd" specification="p14" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="sync" specification="p14" 
container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="ipir" specification="p14" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="sbas" specification="p15" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="scal" specification="p15" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="tbas" specification="p15" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="sabt" specification="p15" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="oref" specification="p15" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="adda" specification="p12" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="adrc" specification="p12" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="iloc" specification="p12" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="avcp" specification="p15" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="swto" specification="p15" container="tref">&#xA;<trackreferenceentry 
trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="swfr" specification="p15" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="chap" specification="apple" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="tmcd" specification="apple" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="cdep" specification="apple" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="scpt" specification="apple" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="ssrc" specification="apple" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<trackreferencetypebox size="0" type="lyra" specification="apple" container="tref">&#xA;<trackreferenceentry trackid=""></trackreferenceentry>&#xA;</trackreferencetypebox>&#xA;<itemreferencebox size="0" type="tbas" specification="p12" container="iref">&#xA;<itemreferenceboxentry itemid=""></itemreferenceboxentry>&#xA;</itemreferencebox>&#xA;<itemreferencebox size="0" type="iloc" specification="p12" container="iref">&#xA;<itemreferenceboxentry itemid=""></itemreferenceboxentry>&#xA;</itemreferencebox>&#xA;<itemreferencebox size="0" type="fdel" specification="p12" container="iref">&#xA;<itemreferenceboxentry itemid=""></itemreferenceboxentry>&#xA;</itemreferencebox>&#xA;<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p12" container="stbl 
traf">&#xA;<rollrecoveryentry></rollrecoveryentry>&#xA;</samplegroupdescriptionbox>&#xA;</boxes>

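When the ffmpeg-go wrapper misbehaves on piped input, one fallback is to shell out to the `ffmpeg` binary directly with `os/exec`. The sketch below is a minimal, hedged example — it assumes an `ffmpeg` binary is on `PATH`, and the function and argument names (`firstFrameArgs`, `ExtractFirstFrame`) are illustrative, not part of any library:

```go
package main

import (
	"bytes"
	"fmt"
	"os/exec"
)

// firstFrameArgs builds the ffmpeg arguments that read a video from stdin
// and write exactly one video frame to stdout as PNG data.
func firstFrameArgs() []string {
	return []string{
		"-i", "pipe:0", // read the container from stdin
		"-frames:v", "1", // stop after the first video frame
		"-f", "image2", // force the image2 muxer
		"-c:v", "png", // encode the frame as PNG
		"pipe:1", // write the result to stdout
	}
}

// ExtractFirstFrame pipes videoBytes through ffmpeg and returns the first
// frame as PNG bytes; ffmpeg's stderr is included in any returned error.
func ExtractFirstFrame(videoBytes []byte) ([]byte, error) {
	cmd := exec.Command("ffmpeg", firstFrameArgs()...)
	cmd.Stdin = bytes.NewReader(videoBytes)
	var out, errBuf bytes.Buffer
	cmd.Stdout = &out
	cmd.Stderr = &errBuf
	if err := cmd.Run(); err != nil {
		return nil, fmt.Errorf("ffmpeg failed: %w: %s", err, errBuf.String())
	}
	return out.Bytes(), nil
}
```

One caveat worth knowing: reading an MP4 from `pipe:0` can fail when the `moov` atom sits after `mdat`, because a pipe is not seekable; writing the bytes to a temporary file and passing that path to `-i` sidesteps the problem.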