
Media (91)

Other articles (67)

  • Organising by category

    17 May 2013, by

    In MediaSPIP, a section has two names: catégorie and rubrique.
    The various documents stored in MediaSPIP can be filed under different categories. A category can be created by clicking "publish a category" in the publish menu at the top right (after logging in). A category can itself be filed under another category, which means you can build a tree of categories.
    The next time a document is published, the newly created category will be offered (...)

  • Final creation of the channel

    12 March 2010, by

    Once your request has been approved, you can proceed with the actual creation of the channel. Each channel is a fully fledged site placed under your responsibility. The platform administrators have no access to it.
    Upon approval, you receive an email inviting you to create your channel.
    To do so, simply go to its address, in our example "http://votre_sous_domaine.mediaspip.net".
    At that point you are asked for a password; you simply need to (...)

  • User profiles

    12 April 2011, by

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu entry is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in on the site.
    The user can edit their profile from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)

On other sites (5418)

  • Bash: bash script to download trimmed mp3 from youtube url

    25 August 2017, by Bhishan Poudel

    I would like to download an mp3 from a YouTube video URL with the first x seconds trimmed off.
    I found that youtube-dl can download the video from YouTube to the local machine. But when I looked at the youtube-dl man page, I could not find any trim option.

    So I tried to use ffmpeg to trim the downloaded mp3 file.
    Instead of doing this in two steps, I would like to write one bash script that does the same thing.
    My attempt is given below.

    However, I got stuck at one point:
    "HOW TO GET THE NAME OF THE OUTPUT MP3 FILE FROM YOUTUBE-DL?"
    The script is given below:

    # trim the initial x seconds of an mp3 file
    # e.g. mytrim https://www.youtube.com/watch?v=dD5RgCf1hrI 30
    function mytrim() {
        youtube-dl --extract-audio --embed-thumbnail --audio-format mp3 -o "%(title)s.%(ext)s" "$1"
        ffmpeg -ss "$2" -i "$OUTPUT_MP3" -acodec copy -y temp.mp3
        mv temp.mp3 "$OUTPUT_MP3"
    }

    How can I get the value of the variable $OUTPUT_MP3?
    echo "%(title)s.%(ext)s" prints the template verbatim; it does not give the output filename.

    How can we make the script work?

    Any help will be appreciated.
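
    For reference, here is a minimal sketch of the same two-step idea using youtube_dl's Python API instead of a shell function: prepare_filename() exposes the output name that the "%(title)s.%(ext)s" template resolves to, which is exactly the value the bash version is missing. The helper name download_and_trim and its arguments are made up for illustration, and the snippet has not been tested against this particular video:

    import os
    import subprocess
    import youtube_dl  # assumes the youtube_dl package is installed

    def download_and_trim(url, skip_seconds):
        """Download a video's audio as mp3, then cut off the first skip_seconds."""
        opts = {
            'format': 'bestaudio/best',
            'outtmpl': '%(title)s.%(ext)s',
            'postprocessors': [{
                'key': 'FFmpegExtractAudio',  # same effect as --extract-audio --audio-format mp3
                'preferredcodec': 'mp3',
            }],
        }
        with youtube_dl.YoutubeDL(opts) as ydl:
            info = ydl.extract_info(url, download=True)
            # prepare_filename() returns the name before post-processing;
            # the mp3 postprocessor keeps the basename and swaps the extension.
            mp3_path = os.path.splitext(ydl.prepare_filename(info))[0] + '.mp3'
        # trim the first skip_seconds, as in the original ffmpeg call
        subprocess.run(['ffmpeg', '-ss', str(skip_seconds), '-i', mp3_path,
                        '-acodec', 'copy', '-y', 'temp.mp3'], check=True)
        os.replace('temp.mp3', mp3_path)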

  • Python script and equivalent command do not run the same

    19 August 2021, by user32882

    I would like to use youtube-dl to download the audio from a YouTube video into an mp3 file. I came up with the following command to do so:

    


    youtube-dl -x --audio-format mp3 https://www.youtube.com/watch?v=SF8DGbfOFig&ab_channel=derang

    


    When I run the above command from my command line, it successfully downloads the file in mp3 format:

    


    [youtube] SF8DGbfOFig: Downloading webpage
[download] Destination: Total Science & S.P.Y - Piano Funk (Ft. Riya & DāM FunK) [320k]-SF8DGbfOFig.webm
[download] 100% of 5.57MiB in 00:03
[ffmpeg] Destination: Total Science & S.P.Y - Piano Funk (Ft. Riya & DāM FunK) [320k]-SF8DGbfOFig.mp3
Deleting original file Total Science & S.P.Y - Piano Funk (Ft. Riya & DāM FunK) [320k]-SF8DGbfOFig.webm (pass -k to keep)


    


    I then tried to convert the above command into an equivalent Python script as follows:

    


    import youtube_dl

    links = ["https://www.youtube.com/watch?v=SF8DGbfOFig&ab_channel=derang"]
    ydl_args = {
        'audioformat': 'mp3',
        'outtmpl': '%(title)s.%(ext)s',
        'extractaudio': True
    }
    with youtube_dl.YoutubeDL(ydl_args) as ydl:
        results = ydl.download(links)


    


    However, this does not succeed in generating an mp3 file of the audio. These are the logs I am getting:

    


    WARNING: Requested formats are incompatible for merge and will be merged into mkv.
[download] Destination: Total Science & S.P.Y - Piano Funk (Ft. Riya & DāM FunK) [320k].f135.mp4
[download] 100% of 4.42MiB in 00:02
[download] Destination: Total Science & S.P.Y - Piano Funk (Ft. Riya & DāM FunK) [320k].f251.webm
[download] 100% of 5.57MiB in 00:03
[ffmpeg] Merging formats into "Total Science & S.P.Y - Piano Funk (Ft. Riya & DāM FunK) [320k].mkv"
Deleting original file Total Science & S.P.Y - Piano Funk (Ft. Riya & DāM FunK) [320k].f135.mp4 (pass -k to keep)
Deleting original file Total Science & S.P.Y - Piano Funk (Ft. Riya & DāM FunK) [320k].f251.webm (pass -k to keep)


    


    What am I doing wrong here? Aren't my command and Python script equivalent?
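
    One likely gap, for what it's worth: on the command line, -x --audio-format mp3 is implemented as an ffmpeg post-processing step, and in the embedding API that step has to be requested explicitly through a format selector plus an FFmpegExtractAudio postprocessor. The sketch below follows the embedding example in the youtube-dl README; it has not been tested against this exact video:

    import youtube_dl

    links = ["https://www.youtube.com/watch?v=SF8DGbfOFig&ab_channel=derang"]

    ydl_opts = {
        # request a single audio stream, so nothing needs to be merged into mkv
        'format': 'bestaudio/best',
        'outtmpl': '%(title)s.%(ext)s',
        # run ffmpeg after the download, which is what -x does on the CLI
        'postprocessors': [{
            'key': 'FFmpegExtractAudio',
            'preferredcodec': 'mp3',
            'preferredquality': '192',
        }],
    }

    with youtube_dl.YoutubeDL(ydl_opts) as ydl:
        ydl.download(links)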

    


  • FFmpeg.Autogen Wrapper: Issue with Zero-Sized Atom Boxes in MP4 Output

    11 June 2024, by Alexander Jansson

    I just started learning ffmpeg using the ffmpeg.autogen wrapper version 5.1 in C#, with ffmpeg shared libs version 5.1. I am trying to write a class that screen-records using gdigrab and produces streamable mp4 to a buffer/event. Everything seems to work as it is supposed to, with no errors, except that the output stream produces atom boxes of size 0 (and therefore a small file as well); no data seems to end up in the boxes. The "debug test mp4 file" was analyzed with MP4Box, and the box info is provided in the thread.

    


    To be more specific: why does this code produce empty atom boxes, and can someone edit my code so that the output actually contains frame data from gdigrab?

    


    The code:

    


     public unsafe class ScreenStreamer : IDisposable
 {
     private readonly AVCodec* productionCodec;
     private readonly AVCodec* screenCaptureAVCodec;
     private readonly AVCodecContext* productionAVCodecContext;
     private readonly AVFormatContext* productionFormatContext;
     private readonly AVCodecContext* screenCaptureAVCodecContext;
     private readonly AVDictionary* productionAVCodecOptions;
     private readonly AVInputFormat* screenCaptureInputFormat;
     private readonly AVFormatContext* screenCaptureInputFormatContext;
     private readonly int gDIGrabVideoStreamIndex;
     private readonly System.Drawing.Size screenBounds;
     private readonly int _produceAtleastAmount;
     public EventHandler OnNewVideoDataProduced;
     private MemoryStream unsafeToManagedBridgeBuffer;
     private CancellationTokenSource cancellationTokenSource;
     private Task recorderTask;

     public ScreenStreamer(int fps, int bitrate, int screenIndex, int produceAtleastAmount = 1000)
     {
         ffmpeg.avdevice_register_all();
         ffmpeg.avformat_network_init();
         recorderTask = Task.CompletedTask;
         cancellationTokenSource = new CancellationTokenSource();
         unsafeToManagedBridgeBuffer = new MemoryStream();
         _produceAtleastAmount = produceAtleastAmount;

         // Allocate and initialize production codec and context
         productionCodec = ffmpeg.avcodec_find_encoder(AVCodecID.AV_CODEC_ID_H264);
         if (productionCodec == null) throw new ApplicationException("Could not find encoder for codec ID H264.");

         productionAVCodecContext = ffmpeg.avcodec_alloc_context3(productionCodec);
         if (productionAVCodecContext == null) throw new ApplicationException("Could not allocate video codec context.");

         // Set codec parameters
         screenBounds = RetrieveScreenBounds(screenIndex);
         productionAVCodecContext->width = screenBounds.Width;
         productionAVCodecContext->height = screenBounds.Height;
         productionAVCodecContext->time_base = new AVRational() { den = fps, num = 1 };
         productionAVCodecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;
         productionAVCodecContext->bit_rate = bitrate;

         int result = ffmpeg.av_opt_set(productionAVCodecContext->priv_data, "preset", "veryfast", 0);
         if (result != 0)
         {
             throw new ApplicationException($"Failed to set options with error code {result}.");
         }

         // Open codec
         fixed (AVDictionary** pm = &productionAVCodecOptions)
         {
             result = ffmpeg.av_dict_set(pm, "movflags", "frag_keyframe+empty_moov+default_base_moof", 0);
             if (result < 0)
             {
                 throw new ApplicationException($"Failed to set dictionary with error code {result}.");
             }

             result = ffmpeg.avcodec_open2(productionAVCodecContext, productionCodec, pm);
             if (result < 0)
             {
                 throw new ApplicationException($"Failed to open codec with error code {result}.");
             }
         }

         // Allocate and initialize screen capture codec and context
         screenCaptureInputFormat = ffmpeg.av_find_input_format("gdigrab");
         if (screenCaptureInputFormat == null) throw new ApplicationException("Could not find input format gdigrab.");

         fixed (AVFormatContext** ps = &screenCaptureInputFormatContext)
         {
             result = ffmpeg.avformat_open_input(ps, "desktop", screenCaptureInputFormat, null);
             if (result < 0)
             {
                 throw new ApplicationException($"Failed to open input with error code {result}.");
             }

             result = ffmpeg.avformat_find_stream_info(screenCaptureInputFormatContext, null);
             if (result < 0)
             {
                 throw new ApplicationException($"Failed to find stream info with error code {result}.");
             }
         }

         gDIGrabVideoStreamIndex = -1;
         for (int i = 0; i < screenCaptureInputFormatContext->nb_streams; i++)
         {
             if (screenCaptureInputFormatContext->streams[i]->codecpar->codec_type == AVMediaType.AVMEDIA_TYPE_VIDEO)
             {
                 gDIGrabVideoStreamIndex = i;
                 break;
             }
         }

         if (gDIGrabVideoStreamIndex < 0)
         {
             throw new ApplicationException("Failed to find video stream in input.");
         }

         AVCodecParameters* codecParameters = screenCaptureInputFormatContext->streams[gDIGrabVideoStreamIndex]->codecpar;
         screenCaptureAVCodec = ffmpeg.avcodec_find_decoder(codecParameters->codec_id);
         if (screenCaptureAVCodec == null)
         {
             throw new ApplicationException("Could not find decoder for screen capture.");
         }

         screenCaptureAVCodecContext = ffmpeg.avcodec_alloc_context3(screenCaptureAVCodec);
         if (screenCaptureAVCodecContext == null)
         {
             throw new ApplicationException("Could not allocate screen capture codec context.");
         }

         result = ffmpeg.avcodec_parameters_to_context(screenCaptureAVCodecContext, codecParameters);
         if (result < 0)
         {
             throw new ApplicationException($"Failed to copy codec parameters to context with error code {result}.");
         }

         result = ffmpeg.avcodec_open2(screenCaptureAVCodecContext, screenCaptureAVCodec, null);
         if (result < 0)
         {
             throw new ApplicationException($"Failed to open screen capture codec with error code {result}.");
         }
     }

     public void Start()
     {
         recorderTask = Task.Run(() =>
         {
             AVPacket* packet = ffmpeg.av_packet_alloc();
             AVFrame* rawFrame = ffmpeg.av_frame_alloc();
             AVFrame* compatibleFrame = null;
             byte* dstBuffer = null;

             try
             {
                 while (!cancellationTokenSource.Token.IsCancellationRequested)
                 {
                     if (ffmpeg.av_read_frame(screenCaptureInputFormatContext, packet) >= 0)
                     {
                         if (packet->stream_index == gDIGrabVideoStreamIndex)
                         {
                             int response = ffmpeg.avcodec_send_packet(screenCaptureAVCodecContext, packet);
                             if (response < 0)
                             {
                                 throw new ApplicationException($"Error while sending a packet to the decoder: {response}");
                             }

                             response = ffmpeg.avcodec_receive_frame(screenCaptureAVCodecContext, rawFrame);
                             if (response == ffmpeg.AVERROR(ffmpeg.EAGAIN) || response == ffmpeg.AVERROR_EOF)
                             {
                                 continue;
                             }
                             else if (response < 0)
                             {
                                 throw new ApplicationException($"Error while receiving a frame from the decoder: {response}");
                             }

                             compatibleFrame = ConvertToCompatiblePixelFormat(rawFrame, out dstBuffer);

                             response = ffmpeg.avcodec_send_frame(productionAVCodecContext, compatibleFrame);
                             if (response < 0)
                             {
                                 throw new ApplicationException($"Error while sending a frame to the encoder: {response}");
                             }

                             while (response >= 0)
                             {
                                 response = ffmpeg.avcodec_receive_packet(productionAVCodecContext, packet);
                                 if (response == ffmpeg.AVERROR(ffmpeg.EAGAIN) || response == ffmpeg.AVERROR_EOF)
                                 {
                                     break;
                                 }
                                 else if (response < 0)
                                 {
                                     throw new ApplicationException($"Error while receiving a packet from the encoder: {response}");
                                 }

                                 using var packetStream = new UnmanagedMemoryStream(packet->data, packet->size);
                                 packetStream.CopyTo(unsafeToManagedBridgeBuffer);
                                 byte[] managedBytes = unsafeToManagedBridgeBuffer.ToArray();
                                 OnNewVideoDataProduced?.Invoke(this, managedBytes);
                                 unsafeToManagedBridgeBuffer.SetLength(0);
                             }
                         }
                     }
                     ffmpeg.av_packet_unref(packet);
                     ffmpeg.av_frame_unref(rawFrame);
                     if (compatibleFrame != null)
                     {
                         ffmpeg.av_frame_unref(compatibleFrame);
                         ffmpeg.av_free(dstBuffer);
                     }
                 }
             }
             finally
             {
                 ffmpeg.av_packet_free(&packet);
                 ffmpeg.av_frame_free(&rawFrame);
                 if (compatibleFrame != null)
                 {
                     ffmpeg.av_frame_free(&compatibleFrame);
                 }
             }
         });
     }

     public AVFrame* ConvertToCompatiblePixelFormat(AVFrame* srcFrame, out byte* dstBuffer)
     {
         AVFrame* dstFrame = ffmpeg.av_frame_alloc();
         int buffer_size = ffmpeg.av_image_get_buffer_size(productionAVCodecContext->pix_fmt, productionAVCodecContext->width, productionAVCodecContext->height, 1);
         byte_ptrArray4 dstData = new byte_ptrArray4();
         int_array4 dstLinesize = new int_array4();
         dstBuffer = (byte*)ffmpeg.av_malloc((ulong)buffer_size);
         ffmpeg.av_image_fill_arrays(ref dstData, ref dstLinesize, dstBuffer, productionAVCodecContext->pix_fmt, productionAVCodecContext->width, productionAVCodecContext->height, 1);

         dstFrame->format = (int)productionAVCodecContext->pix_fmt;
         dstFrame->width = productionAVCodecContext->width;
         dstFrame->height = productionAVCodecContext->height;
         dstFrame->data.UpdateFrom(dstData);
         dstFrame->linesize.UpdateFrom(dstLinesize);

         SwsContext* swsCtx = ffmpeg.sws_getContext(
             srcFrame->width, srcFrame->height, (AVPixelFormat)srcFrame->format,
             productionAVCodecContext->width, productionAVCodecContext->height, productionAVCodecContext->pix_fmt,
             ffmpeg.SWS_BILINEAR, null, null, null);

         if (swsCtx == null)
         {
             throw new ApplicationException("Could not initialize the conversion context.");
         }

         ffmpeg.sws_scale(swsCtx, srcFrame->data, srcFrame->linesize, 0, srcFrame->height, dstFrame->data, dstFrame->linesize);
         ffmpeg.sws_freeContext(swsCtx);
         return dstFrame;
     }

     private System.Drawing.Size RetrieveScreenBounds(int screenIndex)
     {
         return new System.Drawing.Size(1920, 1080);
     }

     public void Dispose()
     {
         cancellationTokenSource?.Cancel();
         recorderTask?.Wait();
         cancellationTokenSource?.Dispose();
         recorderTask?.Dispose();
         unsafeToManagedBridgeBuffer?.Dispose();

         fixed (AVCodecContext** p = &productionAVCodecContext)
         {
             if (*p != null)
             {
                 ffmpeg.avcodec_free_context(p);
             }
         }
         fixed (AVCodecContext** p = &screenCaptureAVCodecContext)
         {
             if (*p != null)
             {
                 ffmpeg.avcodec_free_context(p);
             }
         }

         if (productionFormatContext != null)
         {
             ffmpeg.avformat_free_context(productionFormatContext);
         }

         if (screenCaptureInputFormatContext != null)
         {
             ffmpeg.avformat_free_context(screenCaptureInputFormatContext);
         }

         if (productionAVCodecOptions != null)
         {
             fixed (AVDictionary** p = &productionAVCodecOptions)
             {
                 ffmpeg.av_dict_free(p);
             }
         }
     }
 }


    


    I call the Start method and wait 8 seconds; outside that scope I write the bytes to an mp4 file, without writing the trailer, just to debug the atom boxes. The mp4 box debugging output I got:

    


    (Full OUTPUT)
https://pastebin.com/xkM4MfG7

    



    


    (Not full)

    


    <boxes>
    <uuidbox size="0" type="uuid" uuid="{00000000-00000000-00000000-00000000}" specification="unknown" container="unknown">
    </uuidbox>
    <trackreferencetypebox size="0" type="cdsc" specification="p12" container="tref">
    <trackreferenceentry trackid=""></trackreferenceentry>
    </trackreferencetypebox>
    <trackreferencetypebox size="0" type="hint" specification="p12" container="tref">
    <trackreferenceentry trackid=""></trackreferenceentry>
    </trackreferencetypebox>

    (The rest of the excerpt continues in the same pattern: every reported box — the remaining tref reference types, the iref item references, the sgpd sample group descriptions, the stsd sample description entries, the trgr track groups, the free boxes and the mdat boxes — has size="0" and empty attribute values; see the pastebin link above for the full output.)