
Media (3)
-
Elephants Dream - Cover of the soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Image
-
Valkaama DVD Label
4 October 2011
Updated: February 2013
Language: English
Type: Image
-
Publishing an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (88)
-
Submitting improvements and additional plugins
10 April 2011
If you have developed a new extension that adds one or more useful features to MediaSPIP, let us know and its integration into the official distribution will be considered.
You can use the development mailing list to announce it or to ask for help with building the plugin. Since MediaSPIP is based on SPIP, it is also possible to use SPIP's SPIP-zone mailing list to (...)
-
Organizing by category
17 May 2013
In MediaSPIP, a section has two names: category and section (rubrique).
The various documents stored in MediaSPIP can be filed in different categories. A category can be created by clicking on "publish a category" in the publish menu at the top right (after logging in). A category can also be filed inside another category, so a tree of categories can be built.
The next time a document is published, the newly created category will be offered (...)
-
Retrieving information from the master site when installing an instance
26 November 2010
Purpose
On the main site, a shared-hosting (mutualisation) instance is defined by several things: the data in the spip_mutus table; its logo; its main author (the id_admin in the spip_mutus table, corresponding to an id_auteur in the spip_auteurs table), who will be the only one able to definitively create the instance;
It can therefore be quite sensible to retrieve some of this information in order to complete the installation of an instance, for example to retrieve the (...)
On other sites (10752)
-
How can I convert a video raw frame to image using ffmpeg [closed]
8 May 2024, by Sean
I have a video stream (from a DJI drone) coming to a WebSocket server. The WebSocket server saves the data using the technique described in this answer.


Technical Details



The frame is:

- From a H265 video generated by the drone
- has size: height 1080, width 1440
- FPS: 30
- Size: 45235

This is the raw video data as we receive it from the getData() function of the video frame returned by the addStreamDataListener method.
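A quick size check (an editorial note, not part of the original post, but it follows from the numbers above and the ffmpeg logs below): a raw 1440x1080 frame needs 1440 × 1080 × 3 = 4,665,600 bytes in rgb24 and 1440 × 1080 × 1.5 = 2,332,800 bytes in yuv420p, while a 720x480 yuv420p frame needs 518,400 bytes. The 4,665,600 and 518,400 figures are exactly the "expected frame_size" values in the error logs below, and a 45,235-byte buffer is far too small for any of them, so the saved data is presumably still H265-compressed rather than raw pixels.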


Goal



I want the saved frame (note the peculiar size) to be converted to an image with ffmpeg.


Attempt(s) to solve



I have tried:


ffmpeg -f rawvideo -s 720x480 -i images/fileName1715180324575.dat output.jpg



as well as


ffmpeg -f rawvideo -pix_fmt rgb24 -s 1080x1440 -i images/fileName1715180324575.dat output.jpg



and various other combinations


Error



I get:


ffmpeg version n6.1.1 Copyright (c) 2000-2023 the FFmpeg developers


built with gcc 13.2.1 (GCC) 20230801
 configuration: --prefix=/usr --disable-debug --disable-static --disable-stripping --enable-amf --enable-avisynth --enable-cuda-llvm --enable-lto --enable-fontconfig --enable-frei0r --enable-gmp --enable-gnutls --enable-gpl --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libdav1d --enable-libdrm --enable-libfreetype --enable-libfribidi --enable-libgsm --enable-libharfbuzz --enable-libiec61883 --enable-libjack --enable-libjxl --enable-libmodplug --enable-libmp3lame --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libplacebo --enable-libpulse --enable-librav1e --enable-librsvg --enable-librubberband --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpl --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxml2 --enable-libxvid --enable-libzimg --enable-nvdec --enable-nvenc --enable-opencl --enable-opengl --enable-shared --enable-vapoursynth --enable-version3 --enable-vulkan
 libavutil 58. 29.100 / 58. 29.100
 libavcodec 60. 31.102 / 60. 31.102
 libavformat 60. 16.100 / 60. 16.100
 libavdevice 60. 3.100 / 60. 3.100
 libavfilter 9. 12.100 / 9. 12.100
 libswscale 7. 5.100 / 7. 5.100
 libswresample 4. 12.100 / 4. 12.100
 libpostproc 57. 3.100 / 57. 3.100
[rawvideo @ 0x5d0fc4e5f600] Packet corrupt (stream = 0, dts = 0).
[rawvideo @ 0x5d0fc4e5f600] Estimating duration from bitrate, this may be inaccurate
Input #0, rawvideo, from 'images/fileName1715180324575.dat':
 Duration: N/A, start: 0.000000, bitrate: 933120 kb/s
 Stream #0:0: Video: rawvideo (RGB[24] / 0x18424752), rgb24, 1080x1440, 933120 kb/s, 25 tbr, 25 tbn
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[in#0/rawvideo @ 0x5d0fc4e5f500] corrupt input packet in stream 0
[rawvideo @ 0x5d0fc4e697c0] Invalid buffer size, packet size 45235 < expected frame_size 4665600
[vist#0:0/rawvideo @ 0x5d0fc4e69640] Error submitting packet to decoder: Invalid argument
[swscaler @ 0x5d0fc4e86580] deprecated pixel format used, make sure you did set range correctly
[vost#0:0/mjpeg @ 0x5d0fc4e6c6c0] No filtered frames for output stream, trying to initialize anyway.
Output #0, image2, to 'output.jpg':
 Metadata:
 encoder : Lavf60.16.100
 Stream #0:0: Video: mjpeg, yuvj444p(pc, progressive), 1080x1440, q=2-31, 200 kb/s, 25 fps, 25 tbn
 Metadata:
 encoder : Lavc60.31.102 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
[vist#0:0/rawvideo @ 0x5d0fc4e69640] Decode error rate 1 exceeds maximum 0.666667
[out#0/image2 @ 0x5d0fc4e69380] video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
[out#0/image2 @ 0x5d0fc4e69380] Output file is empty, nothing was encoded(check -ss / -t / -frames parameters if used)
frame= 0 fps=0.0 q=0.0 Lsize=N/A time=N/A bitrate=N/A speed=N/A 
Conversion failed!



or


ffmpeg version n6.1.1 Copyright (c) 2000-2023 the FFmpeg developers
 built with gcc 13.2.1 (GCC) 20230801
 configuration: --prefix=/usr --disable-debug --disable-static --disable-stripping --enable-amf --enable-avisynth --enable-cuda-llvm --enable-lto --enable-fontconfig --enable-frei0r --enable-gmp --enable-gnutls --enable-gpl --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libdav1d --enable-libdrm --enable-libfreetype --enable-libfribidi --enable-libgsm --enable-libharfbuzz --enable-libiec61883 --enable-libjack --enable-libjxl --enable-libmodplug --enable-libmp3lame --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libplacebo --enable-libpulse --enable-librav1e --enable-librsvg --enable-librubberband --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpl --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxml2 --enable-libxvid --enable-libzimg --enable-nvdec --enable-nvenc --enable-opencl --enable-opengl --enable-shared --enable-vapoursynth --enable-version3 --enable-vulkan
 libavutil 58. 29.100 / 58. 29.100
 libavcodec 60. 31.102 / 60. 31.102
 libavformat 60. 16.100 / 60. 16.100
 libavdevice 60. 3.100 / 60. 3.100
 libavfilter 9. 12.100 / 9. 12.100
 libswscale 7. 5.100 / 7. 5.100
 libswresample 4. 12.100 / 4. 12.100
 libpostproc 57. 3.100 / 57. 3.100
[rawvideo @ 0x5a4c0b8685c0] Packet corrupt (stream = 0, dts = 0).
[rawvideo @ 0x5a4c0b8685c0] Estimating duration from bitrate, this may be inaccurate
Input #0, rawvideo, from 'images/fileName1715180324575.dat':
 Duration: N/A, start: 0.000000, bitrate: 103680 kb/s
 Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 720x480, 103680 kb/s, 25 tbr, 25 tbn
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[in#0/rawvideo @ 0x5a4c0b8684c0] corrupt input packet in stream 0
[rawvideo @ 0x5a4c0b872700] Invalid buffer size, packet size 45235 < expected frame_size 518400
[vist#0:0/rawvideo @ 0x5a4c0b872580] Error submitting packet to decoder: Invalid argument
[swscaler @ 0x5a4c0b88f480] deprecated pixel format used, make sure you did set range correctly
[vost#0:0/mjpeg @ 0x5a4c0b875600] No filtered frames for output stream, trying to initialize anyway.
Output #0, image2, to 'output.jpg':
 Metadata:
 encoder : Lavf60.16.100
 Stream #0:0: Video: mjpeg, yuvj420p(pc, progressive), 720x480, q=2-31, 200 kb/s, 25 fps, 25 tbn
 Metadata:
 encoder : Lavc60.31.102 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
[vist#0:0/rawvideo @ 0x5a4c0b872580] Decode error rate 1 exceeds maximum 0.666667
[out#0/image2 @ 0x5a4c0b8722c0] video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
[out#0/image2 @ 0x5a4c0b8722c0] Output file is empty, nothing was encoded(check -ss / -t / -frames parameters if used)
frame= 0 fps=0.0 q=0.0 Lsize=N/A time=N/A bitrate=N/A speed=N/A 
Conversion failed!



or similar errors.


Question



How can I extract the image from a video frame using ffmpeg? Thank you.


A sample of the image file may be found here: github
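Since the 45,235-byte buffer looks like a compressed H265 access unit rather than raw pixels, one sketch worth trying (an assumption on my part, not something stated in the question) is to let ffmpeg parse the file as a raw HEVC elementary stream instead of rawvideo:


ffmpeg -f hevc -i images/fileName1715180324575.dat -frames:v 1 output.jpg


This can only decode if the saved data contains Annex-B NAL units including the VPS/SPS/PPS parameter sets; if the drone SDK delivers bare slice data, the parameter sets from an earlier keyframe would have to be prepended first.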


-
FFmpeg.Autogen Wrapper : Issue with Zero-Sized Atom Boxes in MP4 Output
11 June 2024, by Alexander Jansson
I just started learning ffmpeg using the FFmpeg.Autogen wrapper version 5.1 in C#, with the ffmpeg shared libraries version 5.1. I am trying to build a class that screen-records using gdigrab and produces streamable mp4 into a buffer/event. Everything seems to work as expected with no errors, except that the output stream produces atom boxes of size 0, and hence a small file as well; no data seems to end up in the boxes. The "debug test mp4 file" was analyzed with MP4Box and the box info is provided in the thread.


To be more specific: why does this code produce empty atom boxes, and can someone edit my code so that the produced data actually contains frame data from the gdigrab capture?


Code:


public unsafe class ScreenStreamer : IDisposable
 {
 private readonly AVCodec* productionCodec;
 private readonly AVCodec* screenCaptureAVCodec;
 private readonly AVCodecContext* productionAVCodecContext;
 private readonly AVFormatContext* productionFormatContext;
 private readonly AVCodecContext* screenCaptureAVCodecContext;
 private readonly AVDictionary* productionAVCodecOptions;
 private readonly AVInputFormat* screenCaptureInputFormat;
 private readonly AVFormatContext* screenCaptureInputFormatContext;
 private readonly int gDIGrabVideoStreamIndex;
 private readonly System.Drawing.Size screenBounds;
 private readonly int _produceAtleastAmount;
 public EventHandler OnNewVideoDataProduced;
 private MemoryStream unsafeToManagedBridgeBuffer;
 private CancellationTokenSource cancellationTokenSource;
 private Task recorderTask;

 public ScreenStreamer(int fps, int bitrate, int screenIndex, int produceAtleastAmount = 1000)
 {
 ffmpeg.avdevice_register_all();
 ffmpeg.avformat_network_init();
 recorderTask = Task.CompletedTask;
 cancellationTokenSource = new CancellationTokenSource();
 unsafeToManagedBridgeBuffer = new MemoryStream();
 _produceAtleastAmount = produceAtleastAmount;

 // Allocate and initialize production codec and context
 productionCodec = ffmpeg.avcodec_find_encoder(AVCodecID.AV_CODEC_ID_H264);
 if (productionCodec == null) throw new ApplicationException("Could not find encoder for codec ID H264.");

 productionAVCodecContext = ffmpeg.avcodec_alloc_context3(productionCodec);
 if (productionAVCodecContext == null) throw new ApplicationException("Could not allocate video codec context.");

 // Set codec parameters
 screenBounds = RetrieveScreenBounds(screenIndex);
 productionAVCodecContext->width = screenBounds.Width;
 productionAVCodecContext->height = screenBounds.Height;
 productionAVCodecContext->time_base = new AVRational() { den = fps, num = 1 };
 productionAVCodecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;
 productionAVCodecContext->bit_rate = bitrate;

 int result = ffmpeg.av_opt_set(productionAVCodecContext->priv_data, "preset", "veryfast", 0);
 if (result != 0)
 {
 throw new ApplicationException($"Failed to set options with error code {result}.");
 }

 // Open codec
 fixed (AVDictionary** pm = &productionAVCodecOptions)
 {
 result = ffmpeg.av_dict_set(pm, "movflags", "frag_keyframe+empty_moov+default_base_moof", 0);
 if (result < 0)
 {
 throw new ApplicationException($"Failed to set dictionary with error code {result}.");
 }

 result = ffmpeg.avcodec_open2(productionAVCodecContext, productionCodec, pm);
 if (result < 0)
 {
 throw new ApplicationException($"Failed to open codec with error code {result}.");
 }
 }

 // Allocate and initialize screen capture codec and context
 screenCaptureInputFormat = ffmpeg.av_find_input_format("gdigrab");
 if (screenCaptureInputFormat == null) throw new ApplicationException("Could not find input format gdigrab.");

 fixed (AVFormatContext** ps = &screenCaptureInputFormatContext)
 {
 result = ffmpeg.avformat_open_input(ps, "desktop", screenCaptureInputFormat, null);
 if (result < 0)
 {
 throw new ApplicationException($"Failed to open input with error code {result}.");
 }

 result = ffmpeg.avformat_find_stream_info(screenCaptureInputFormatContext, null);
 if (result < 0)
 {
 throw new ApplicationException($"Failed to find stream info with error code {result}.");
 }
 }

 gDIGrabVideoStreamIndex = -1;
 for (int i = 0; i < screenCaptureInputFormatContext->nb_streams; i++)
 {
 if (screenCaptureInputFormatContext->streams[i]->codecpar->codec_type == AVMediaType.AVMEDIA_TYPE_VIDEO)
 {
 gDIGrabVideoStreamIndex = i;
 break;
 }
 }

 if (gDIGrabVideoStreamIndex < 0)
 {
 throw new ApplicationException("Failed to find video stream in input.");
 }

 AVCodecParameters* codecParameters = screenCaptureInputFormatContext->streams[gDIGrabVideoStreamIndex]->codecpar;
 screenCaptureAVCodec = ffmpeg.avcodec_find_decoder(codecParameters->codec_id);
 if (screenCaptureAVCodec == null)
 {
 throw new ApplicationException("Could not find decoder for screen capture.");
 }

 screenCaptureAVCodecContext = ffmpeg.avcodec_alloc_context3(screenCaptureAVCodec);
 if (screenCaptureAVCodecContext == null)
 {
 throw new ApplicationException("Could not allocate screen capture codec context.");
 }

 result = ffmpeg.avcodec_parameters_to_context(screenCaptureAVCodecContext, codecParameters);
 if (result < 0)
 {
 throw new ApplicationException($"Failed to copy codec parameters to context with error code {result}.");
 }

 result = ffmpeg.avcodec_open2(screenCaptureAVCodecContext, screenCaptureAVCodec, null);
 if (result < 0)
 {
 throw new ApplicationException($"Failed to open screen capture codec with error code {result}.");
 }
 }

 public void Start()
 {
 recorderTask = Task.Run(() =>
 {
 AVPacket* packet = ffmpeg.av_packet_alloc();
 AVFrame* rawFrame = ffmpeg.av_frame_alloc();
 AVFrame* compatibleFrame = null;
 byte* dstBuffer = null;

 try
 {
 while (!cancellationTokenSource.Token.IsCancellationRequested)
 {
 if (ffmpeg.av_read_frame(screenCaptureInputFormatContext, packet) >= 0)
 {
 if (packet->stream_index == gDIGrabVideoStreamIndex)
 {
 int response = ffmpeg.avcodec_send_packet(screenCaptureAVCodecContext, packet);
 if (response < 0)
 {
 throw new ApplicationException($"Error while sending a packet to the decoder: {response}");
 }

 response = ffmpeg.avcodec_receive_frame(screenCaptureAVCodecContext, rawFrame);
 if (response == ffmpeg.AVERROR(ffmpeg.EAGAIN) || response == ffmpeg.AVERROR_EOF)
 {
 continue;
 }
 else if (response < 0)
 {
 throw new ApplicationException($"Error while receiving a frame from the decoder: {response}");
 }

 compatibleFrame = ConvertToCompatiblePixelFormat(rawFrame, out dstBuffer);

 response = ffmpeg.avcodec_send_frame(productionAVCodecContext, compatibleFrame);
 if (response < 0)
 {
 throw new ApplicationException($"Error while sending a frame to the encoder: {response}");
 }

 while (response >= 0)
 {
 response = ffmpeg.avcodec_receive_packet(productionAVCodecContext, packet);
 if (response == ffmpeg.AVERROR(ffmpeg.EAGAIN) || response == ffmpeg.AVERROR_EOF)
 {
 break;
 }
 else if (response < 0)
 {
 throw new ApplicationException($"Error while receiving a packet from the encoder: {response}");
 }

 using var packetStream = new UnmanagedMemoryStream(packet->data, packet->size);
 packetStream.CopyTo(unsafeToManagedBridgeBuffer);
 byte[] managedBytes = unsafeToManagedBridgeBuffer.ToArray();
 OnNewVideoDataProduced?.Invoke(this, managedBytes);
 unsafeToManagedBridgeBuffer.SetLength(0);
 }
 }
 }
 ffmpeg.av_packet_unref(packet);
 ffmpeg.av_frame_unref(rawFrame);
 if (compatibleFrame != null)
 {
 ffmpeg.av_frame_unref(compatibleFrame);
 ffmpeg.av_free(dstBuffer);
 }
 }
 }
 finally
 {
 ffmpeg.av_packet_free(&packet);
 ffmpeg.av_frame_free(&rawFrame);
 if (compatibleFrame != null)
 {
 ffmpeg.av_frame_free(&compatibleFrame);
 }
 }
 });
 }

 public AVFrame* ConvertToCompatiblePixelFormat(AVFrame* srcFrame, out byte* dstBuffer)
 {
 AVFrame* dstFrame = ffmpeg.av_frame_alloc();
 int buffer_size = ffmpeg.av_image_get_buffer_size(productionAVCodecContext->pix_fmt, productionAVCodecContext->width, productionAVCodecContext->height, 1);
 byte_ptrArray4 dstData = new byte_ptrArray4();
 int_array4 dstLinesize = new int_array4();
 dstBuffer = (byte*)ffmpeg.av_malloc((ulong)buffer_size);
 ffmpeg.av_image_fill_arrays(ref dstData, ref dstLinesize, dstBuffer, productionAVCodecContext->pix_fmt, productionAVCodecContext->width, productionAVCodecContext->height, 1);

 dstFrame->format = (int)productionAVCodecContext->pix_fmt;
 dstFrame->width = productionAVCodecContext->width;
 dstFrame->height = productionAVCodecContext->height;
 dstFrame->data.UpdateFrom(dstData);
 dstFrame->linesize.UpdateFrom(dstLinesize);

 SwsContext* swsCtx = ffmpeg.sws_getContext(
 srcFrame->width, srcFrame->height, (AVPixelFormat)srcFrame->format,
 productionAVCodecContext->width, productionAVCodecContext->height, productionAVCodecContext->pix_fmt,
 ffmpeg.SWS_BILINEAR, null, null, null);

 if (swsCtx == null)
 {
 throw new ApplicationException("Could not initialize the conversion context.");
 }

 ffmpeg.sws_scale(swsCtx, srcFrame->data, srcFrame->linesize, 0, srcFrame->height, dstFrame->data, dstFrame->linesize);
 ffmpeg.sws_freeContext(swsCtx);
 return dstFrame;
 }

 private System.Drawing.Size RetrieveScreenBounds(int screenIndex)
 {
 return new System.Drawing.Size(1920, 1080);
 }

 public void Dispose()
 {
 cancellationTokenSource?.Cancel();
 recorderTask?.Wait();
 cancellationTokenSource?.Dispose();
 recorderTask?.Dispose();
 unsafeToManagedBridgeBuffer?.Dispose();

 fixed (AVCodecContext** p = &productionAVCodecContext)
 {
 if (*p != null)
 {
 ffmpeg.avcodec_free_context(p);
 }
 }
 fixed (AVCodecContext** p = &screenCaptureAVCodecContext)
 {
 if (*p != null)
 {
 ffmpeg.avcodec_free_context(p);
 }
 }

 if (productionFormatContext != null)
 {
 ffmpeg.avformat_free_context(productionFormatContext);
 }

 if (screenCaptureInputFormatContext != null)
 {
 ffmpeg.avformat_free_context(screenCaptureInputFormatContext);
 }

 if (productionAVCodecOptions != null)
 {
 fixed (AVDictionary** p = &productionAVCodecOptions)
 {
 ffmpeg.av_dict_free(p);
 }
 }
 }
 }



I call the Start method and wait 8 seconds; outside that scope I write the bytes to an mp4 file, without writing the trailer, just to debug the atom boxes. The mp4 box debugging output I got:


(Full OUTPUT)
https://pastebin.com/xkM4MfG7



(Not full)




"
<boxes>
<uuidbox size="0" type="uuid" uuid="{00000000-00000000-00000000-00000000}" specification="unknown" container="unknown">
</uuidbox>
<trackreferencetypebox size="0" type="cdsc" specification="p12" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="hint" specification="p12" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="font" specification="p12" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="hind" specification="p12" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="vdep" specification="p12" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="vplx" specification="p12" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="subt" specification="p12" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="thmb" specification="p12" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="mpod" specification="p14" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="dpnd" specification="p14" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="sync" specification="p14" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="ipir" specification="p14" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="sbas" specification="p15" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="scal" specification="p15" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="tbas" specification="p15" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="sabt" specification="p15" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="oref" specification="p15" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="adda" specification="p12" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="adrc" specification="p12" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="iloc" specification="p12" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="avcp" specification="p15" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="swto" specification="p15" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="swfr" specification="p15" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="chap" specification="apple" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="tmcd" specification="apple" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="cdep" specification="apple" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="scpt" specification="apple" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="ssrc" specification="apple" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<trackreferencetypebox size="0" type="lyra" specification="apple" container="tref">
<trackreferenceentry trackid=""></trackreferenceentry>
</trackreferencetypebox>
<itemreferencebox size="0" type="tbas" specification="p12" container="iref">
<itemreferenceboxentry itemid=""></itemreferenceboxentry>
</itemreferencebox>
<itemreferencebox size="0" type="iloc" specification="p12" container="iref">
<itemreferenceboxentry itemid=""></itemreferenceboxentry>
</itemreferencebox>
<itemreferencebox size="0" type="fdel" specification="p12" container="iref">
<itemreferenceboxentry itemid=""></itemreferenceboxentry>
</itemreferencebox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p12" container="stbl traf">
<rollrecoveryentry></rollrecoveryentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p12" container="stbl traf">
<audioprerollentry></audioprerollentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p12" container="stbl traf">
<visualrandomaccessentry></visualrandomaccessentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<cencsampleencryptiongroupentry isencrypted="" kid=""></cencsampleencryptiongroupentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<operatingpointsinformation>
 <profiletierlevel></profiletierlevel>
<operatingpoint minpicwidth="" minpicheight="" maxpicwidth="" maxpicheight="" maxchromaformat="" maxbitdepth="" avgframerate="" constantframerate="" maxbitrate="" avgbitrate=""></operatingpoint>

</operatingpointsinformation>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<layerinformation>
<layerinfoitem></layerinfoitem>
</layerinformation>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<tileregiongroupentry tilegroup="" independent="" x="" y="" w="" h="">
<tileregiondependency tileid=""></tileregiondependency>
</tileregiongroupentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<nalumap rle="">
<nalumapentry groupid=""></nalumapentry>
</nalumap>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p12" container="stbl traf">
<temporallevelentry></temporallevelentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p12" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p12" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p12" container="stbl traf">
<sapentry></sapentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<syncsamplegroupentry></syncsamplegroupentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="p15" container="stbl traf">
<subpictureorderentry refs=""></subpictureorderentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="3gpp" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<samplegroupdescriptionbox size="0" type="sgpd" version="0" flags="0" specification="3gpp" container="stbl traf">
<defaultsamplegroupdescriptionentry size=""></defaultsamplegroupdescriptionentry>
</samplegroupdescriptionbox>
<sampledescriptionentrybox size="0" type="GNRM" specification="unknown" container="stsd" extensiondatasize="0">
</sampledescriptionentrybox>
<visualsampledescriptionbox size="0" type="GNRV" specification="unknown" container="stsd" version="0" revision="0" vendor="0" temporalquality="0" spacialquality="0" width="0" height="0" horizontalresolution="4718592" verticalresolution="4718592" compressorname="" bitdepth="24">
</visualsampledescriptionbox>
<audiosampledescriptionbox size="0" type="GNRA" specification="unknown" container="stsd" version="0" revision="0" vendor="0" channelcount="2" bitspersample="16" samplerate="0">
</audiosampledescriptionbox>
<trackgrouptypebox size="0" type="msrc" version="0" flags="0" specification="p12" container="trgr">
</trackgrouptypebox>
<trackgrouptypebox size="0" type="ster" version="0" flags="0" specification="p12" container="trgr">
</trackgrouptypebox>
<trackgrouptypebox size="0" type="cstg" version="0" flags="0" specification="p15" container="trgr">
</trackgrouptypebox>
<freespacebox size="0" type="free" specification="p12" container="*">
</freespacebox>
<freespacebox size="0" type="free" specification="p12" container="*">
</freespacebox>
<mediadatabox size="0" type="mdat" specification="p12" container="file">
</mediadatabox>
<mediadatabox size="0" type="mdat" specification="p12" container="meta">
"
</mediadatabox></boxes>
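
For reference, here is a minimal sketch (my own illustration assuming the FFmpeg.AutoGen 5.x bindings; it is not part of the code above) of the muxing step that the class appears to be missing. The encoded H.264 packets need to pass through an mp4 muxer, an AVFormatContext whose movflags are set as muxer options rather than on the encoder dictionary, before MP4Box can see populated boxes; copying the raw encoder packets straight into the MemoryStream yields an H.264 elementary stream, not an MP4.


 // Illustrative only: hypothetical muxer wiring around productionAVCodecContext.
 AVFormatContext* muxCtx = null;
 ffmpeg.avformat_alloc_output_context2(&muxCtx, null, "mp4", "debug.mp4");
 AVStream* outStream = ffmpeg.avformat_new_stream(muxCtx, null);
 ffmpeg.avcodec_parameters_from_context(outStream->codecpar, productionAVCodecContext);
 outStream->time_base = productionAVCodecContext->time_base;

 // Fragmented-mp4 flags belong to the muxer options, not to the codec dictionary.
 AVDictionary* muxOpts = null;
 ffmpeg.av_dict_set(&muxOpts, "movflags", "frag_keyframe+empty_moov+default_base_moof", 0);

 if ((muxCtx->oformat->flags & ffmpeg.AVFMT_NOFILE) == 0)
 {
 ffmpeg.avio_open(&muxCtx->pb, "debug.mp4", ffmpeg.AVIO_FLAG_WRITE);
 }
 ffmpeg.avformat_write_header(muxCtx, &muxOpts);

 // Inside the encode loop, instead of copying packet->data into the MemoryStream:
 ffmpeg.av_packet_rescale_ts(packet, productionAVCodecContext->time_base, outStream->time_base);
 packet->stream_index = outStream->index;
 ffmpeg.av_interleaved_write_frame(muxCtx, packet);

 // On shutdown:
 ffmpeg.av_write_trailer(muxCtx);
 ffmpeg.avio_closep(&muxCtx->pb);
 ffmpeg.avformat_free_context(muxCtx);


A streamable variant would replace avio_open with a custom AVIOContext write callback that feeds OnNewVideoDataProduced, but the file-based version above is enough to check whether MP4Box then reports non-empty moov/moof/mdat boxes.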

