
Other articles (74)

  • The SPIPmotion queue

    28 November 2010, by

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document must be attached automatically; objet, the type of object to which (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
    Once it is activated, a preconfiguration is applied automatically by MediaSPIP init so that the new feature is operational straight away. No separate configuration step is therefore required for this.

  • User profiles

    12 April 2011, by

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
    The user can edit their profile from their author page; a link in the navigation, "Edit your profile", is (...)

On other sites (3354)

  • Handling correctly the ffmpeg & ffprobe with php

    29 September 2014, by cocco


    Final goals (maybe not relevant):

    1. upload the clip with AJAX
    2. get the clip info from ffprobe via AJAX, using PHP, as JSON, executing ffprobe only once (no ffmpeg)
    3. handle all calculations in JavaScript
    4. maybe an extra PHP script that can create GIFs, extract frames (thumbs), or build a video grid preview
    5. when ready, AJAX the conversion info to the final PHP conversion script, executing ffmpeg only once (just the final ffmpeg string)

    I'm trying to write my own ffmpeg-based local web video editor that converts every format to mp4 automatically, since mp4 is currently the most compatible container and h264+aac (plus ac3) is one of the best compression combinations. I also want to be able to cut, crop, resize, remove streams, add streams and more. I'm stuck on some simple problems:

    1. HOW TO GET THE INFO?

    I'm using ffprobe to get the file information as JSON with the following command:

    ffprobe -v quiet -print_format json -show_format -show_streams -show_packets '.$video

    This gives you a lot of information, but some relevant values are not always present. I need the duration (in milliseconds), the fps (as a float) and the total number of frames (as an integer).

    I know that these values can sometimes be found inside this array:

    format.duration //Total duration
    streams[0].duration //Video duration
    streams[1].duration //Audio duration

    streams[0].avg_frame_rate //Average framerate
    streams[0].r_frame_rate //Video framerate

    streams[0].nb_frames //Total frames

    but most of the time nb_frames is missing; also, avg_frame_rate differs from r_frame_rate, which is not always available either.

    I know that I could use multiple commands to increase the chance of getting the correct values... but seriously?

    //fps
    ffmpeg -i INPUT 2>&1 | sed -n "s/.*, \(.*\) fp.*/\1/p"
    //duration
    ffmpeg -i INPUT 2>&1 | awk '/Duration/ {split($2,a,":");print a[1]*3600+a[2]*60+a[3]}'
    //frames
    ffmpeg -i INPUT -vcodec copy -f rawvideo -y /dev/null 2>&1 | tr ^M '\n' | awk '/^frame=/ {print $2}'|tail -n 1

    I don't want to execute ffmpeg 3 times to get this information; I'd prefer to just use ffprobe.

    So... is there an elegant way to get the extra info that is not always present in the ffprobe output (fps, frames, duration)?
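    For what it's worth, here is a minimal sketch (not from the original post) of deriving all three values from a single ffprobe run, falling back to duration × fps when nb_frames is absent; $video is assumed to hold the input path. (ffprobe can also count frames exactly with -count_frames, which adds nb_read_frames, but that decodes the whole file.)

    <?php
    // Sketch: one ffprobe call, with fallbacks for the values that are not
    // always present. Assumes $video holds the input path.
    $json = shell_exec('ffprobe -v quiet -print_format json -show_format -show_streams '
        . escapeshellarg($video));
    $info = json_decode($json, true);

    $vs = null;
    foreach ($info['streams'] as $s) {
        if ($s['codec_type'] === 'video') { $vs = $s; break; }
    }

    // duration in milliseconds: prefer the stream value, fall back to the container
    $durationSec = isset($vs['duration']) ? (float)$vs['duration']
                                          : (float)$info['format']['duration'];
    $durationMs = (int)round($durationSec * 1000);

    // fps as a float: frame rates are fractions such as "30000/1001"
    $rate = (isset($vs['avg_frame_rate']) && $vs['avg_frame_rate'] !== '0/0')
        ? $vs['avg_frame_rate'] : $vs['r_frame_rate'];
    list($num, $den) = explode('/', $rate);
    $fps = $den > 0 ? $num / $den : 0.0;

    // total frames: use nb_frames when present, otherwise estimate it
    $frames = isset($vs['nb_frames']) ? (int)$vs['nb_frames']
                                      : (int)round($durationSec * $fps);
    ?>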

    In the preview I want to be able to jump correctly to a specific frame (NOT a time). If the above parameters are available I can do that with this command:

    ffmpeg -i INPUT -vf 'select=gte(n\,FRAMENUMBER)' -vframes 1 -f image2 OUTPUT

    Using the above command with the frame number set to the last frame always returns a black frame.
    If there are 50 frames (for example), is the range 1-50? Frame 50 is black, frame 1 is ok, frame 0 returns an error...
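    As an editorial note (not part of the original post): the select filter counts frames from 0, so with 50 frames the valid values of n are 0-49 and n=50 falls past the end of the clip. A sketch of grabbing the last frame, assuming $input, $output and $totalFrames are already known:

    <?php
    // select's frame counter n is zero-based, so the last frame is $totalFrames - 1
    $last = $totalFrames - 1;
    $cmd = 'ffmpeg -i ' . escapeshellarg($input)
         . " -vf 'select=gte(n\\," . $last . ")' -vframes 1 -f image2 "
         . escapeshellarg($output);
    shell_exec($cmd);
    ?>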


    2. WHILE READING THE LOG, HOW TO SKIP ERRORS AND DETERMINE IF THE CONVERSION IS FINISHED?

    I'm able to upload one video at a time (per page), and I can read the current progress from the ffmpeg-generated log as long as I keep the page open. More control and multiple conversions would be nice.

    I'm reading the last line of the log with a custom tail function, but since the log also includes errors I don't always get a clean line containing the desired values. By the way, to check whether the progress is complete I check whether the last line CONTAINS the WORD frame...

    How can I find out when the conversion is finished?

    Maybe there is a way to delete the log with an ffmpeg command? And to skip or separate the errors?
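    One option the post does not mention (an editorial suggestion, so treat it as an assumption about your setup): ffmpeg can write machine-readable progress to a separate file with -progress FILE; that stream is free of error noise and its last block ends with a progress=end line once encoding stops, which is easier to test for than grepping the main log. A rough sketch:

    <?php
    // Sketch: run ffmpeg with "-progress progress.txt" in addition to the log,
    // then poll the progress file instead of tailing the error log.
    function conversionFinished($progressFile) {
        if (!is_file($progressFile)) {
            return false;
        }
        // -progress repeats key=value blocks; the final block ends with "progress=end"
        return strpos(file_get_contents($progressFile), 'progress=end') !== false;
    }
    ?>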

    I'm using server-sent events to read the log...
    Here is the PHP code:

    <?php
    setlocale(LC_CTYPE, "en_US.UTF-8");

    function tailCustom($filepath, $lines = 1, $adaptive = true) {
        // a custom function to get the last line of a text file.
    }

    function send($data) {
        echo "id: " . time() . PHP_EOL;
        echo "data: " . $data . PHP_EOL;
        echo PHP_EOL;
        ob_flush();
        flush();
    }

    header('Content-Type: text/event-stream');
    header('Cache-Control: no-cache');

    while (true) {
        send(tailCustom($_GET['log'] . ".log"));
        sleep(1);
    }
    ?>

    And here is the SSE JavaScript:

    function startSSE(fn) {
        sse = new EventSource("ffmpegProgress.php?log=" + encodeURIComponent(fn));
        sse.addEventListener('message', conversionProgress, false);
    }

    function conversionProgress(e) {
        if (e.data.substr(0, 6) == 'frame=') {
            inProgress = true;
            var x = e.data.match(/frame=\s*(.*?)\s*fps=\s*(.*?)\s*q=\s*(.*?)\s*size=\s*(.*?)\s*time=\s*(.*?)\s*bitrate=\s*(.*?)\s*$/);
            x.shift();
            x = {frame: x[0] * 1, fps: x[1] * 1, q: x[2], size: x[3], time: x[4], bitrate: x[5]};
            var elapsedTime = (new Date().getTime()) - startTime;
            var chunksPerTime = timeString2ms(x.time) / elapsedTime;
            var estimatedTotalTime = duration / chunksPerTime;
            var timeLeftInSeconds = Math.abs(elapsedTime - (estimatedTotalTime * 1000));
            var withOneDecimalPlace = Math.round(timeLeftInSeconds * 10) / 10;
            conversion.innerHTML = 'Time Left: ' + ms2TimeString(timeLeftInSeconds).split('.')[0] + '<br />' +
                'Time Left2: ' + (ms2TimeString(((frames - x.frame) / x.fps) * 1000) + (timeString2ms(x.time) / (duration * 1000) * 100 | 0)).split('.')[0] + '<br />' +
                'Estimated Total: ' + ms2TimeString(estimatedTotalTime * 1000).split('.')[0] + '<br />' +
                'Elapsed Time: ' + ms2TimeString(elapsedTime).split('.')[0];
        } else {
            if (inProgress) {
                sse.removeEventListener('message', conversionProgress, false);
                sse.close();
                sse = null;
                conversion.textContent = 'Finished in ' + ms2TimeString((new Date().getTime()) - startTime).split('.')[0];
                // delete log/old file??
                inProgress = false;
            }
        }
    }

    EDIT

    HERE IS A SAMPLE OUTPUT after detecting an h264 codec in an m2ts with ac3 audio

    As most devices can already play h264, I just need to convert the audio to aac, copy the original AC3 audio as a second track, and put everything inside an mp4 container, so that I get a file compatible with Android/Chrome/iOS and most browsers.

    $opt="-map 0:0 -map 0:1 -map 0:1 -c:v copy -c:a:0 libfdk_aac -metadata:s:a:0 language=ita -b:a 128k -ar 48000 -ac 2 -c:a:1 copy -metadata:s:a:1 language=ita -movflags +faststart";

    $i="in.m2ts";
    $o="out.mp4";
    $t="title";
    $y="2014";
    $progress="nameoftheLOG.log";

    $cmd="ffmpeg -y -i ".escapeshellarg($i)." -metadata title=".$t." -metadata date=".$y." ".$opt." ".$o." null >/dev/null 2>".$progress." &amp;";

    If you have any questions about the code or want to see more of it, just ask...

  • Save a stream of arrays to video using FFMPEG

    13 December 2022, by Gianluca Iacchini

    I made a simple fluid simulation using CUDA, and I'm trying to save it to a video using FFmpeg; however, I get the "Finishing stream 0:0 without any data written to it" warning.

    This is how I send the data:

    unsigned char* data = new unsigned char[SCR_WIDTH * SCR_HEIGHT * 4];
    uchar4* pColors = new uchar4[SCR_WIDTH * SCR_HEIGHT];

    for (int i = 0; i < N_FRAMES; i++)
    {
        // Computes a simulation step and sets pColors with the correct values.
        on_frame(pColors, timeStepSize);
        for (int j = 0; j < SCR_WIDTH * SCR_HEIGHT * 4; j += 4)
        {
            data[j] = pColors[j].x;
            data[j+1] = pColors[j].y;
            data[j+2] = pColors[j].z;
            data[j+3] = pColors[j].w;
        }
        std::cout.write(reinterpret_cast<char*>(data), SCR_WIDTH * SCR_HEIGHT * 4);
    }

    And then I pass it to FFmpeg using the following command:

    ./simulation.o | ffmpeg -y -f rawvideo -pixel_format rgba -video_size 1024x1024 -i - -c:v libx264 -pix_fmt yuv444p -crf 0 video.mp4

    This works fine if I hard-code the values (e.g. if I set data[j] = 255 I get a red screen, as expected), but when I use the pColors variable I get the following message from FFmpeg:

    Finishing stream 0:0 without any data written to it.


    Even though both pColors and data hold the correct values.

    Here is the full report from FFmpeg:

    ffmpeg started on 2022-12-13 at 14:28:34
    Report written to "ffmpeg-20221213-142834.log"
    Command line:
    ffmpeg -y -f rawvideo -report -pixel_format rgba -video_size 128x128 -i - -c:v libx264 -pix_fmt yuv444p -crf 0 video9.mp4
    ffmpeg version 3.4.11-0ubuntu0.1 Copyright (c) 2000-2022 the FFmpeg developers
      built with gcc 7 (Ubuntu 7.5.0-3ubuntu1~18.04)
      configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-librsvg --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chrom
      libavutil      55. 78.100 / 55. 78.100
      libavcodec     57.107.100 / 57.107.100
      libavformat    57. 83.100 / 57. 83.100
      libavdevice    57. 10.100 / 57. 10.100
      libavfilter     6.107.100 /  6.107.100
      libavresample   3.  7.  0 /  3.  7.  0
      libswscale      4.  8.100 /  4.  8.100
      libswresample   2.  9.100 /  2.  9.100
      libpostproc    54.  7.100 / 54.  7.100
    Splitting the commandline.
    Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'.
    Reading option '-f' ... matched as option 'f' (force format) with argument 'rawvideo'.
    Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
    Reading option '-pixel_format' ... matched as AVOption 'pixel_format' with argument 'rgba'.
    Reading option '-video_size' ... matched as AVOption 'video_size' with argument '128x128'.
    Reading option '-i' ... matched as input url with argument '-'.
    Reading option '-c:v' ... matched as option 'c' (codec name) with argument 'libx264'.
    Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'yuv444p'.
    Reading option '-crf' ... matched as AVOption 'crf' with argument '0'.
    Reading option 'video9.mp4' ... matched as output url.
    Finished splitting the commandline.
    Parsing a group of options: global .
    Applying option y (overwrite output files) with argument 1.
    Applying option report (generate a report) with argument 1.
    Successfully parsed a group of options.
    Parsing a group of options: input url -.
    Applying option f (force format) with argument rawvideo.
    Successfully parsed a group of options.
    Opening an input file: -.
    [rawvideo @ 0x558eba7b0000] Opening 'pipe:' for reading
    [pipe @ 0x558eba78a080] Setting default whitelist 'crypto'
    [rawvideo @ 0x558eba7b0000] Before avformat_find_stream_info() pos: 0 bytes read:0 seeks:0 nb_streams:1
    [rawvideo @ 0x558eba7b0000] After avformat_find_stream_info() pos: 0 bytes read:0 seeks:0 frames:0
    Input #0, rawvideo, from 'pipe:':
      Duration: N/A, bitrate: 13107 kb/s
        Stream #0:0, 0, 1/25: Video: rawvideo (RGBA / 0x41424752), rgba, 128x128, 13107 kb/s, 25 tbr, 25 tbn, 25 tbc
    Successfully opened the file.
    Parsing a group of options: output url video9.mp4.
    Applying option c:v (codec name) with argument libx264.
    Applying option pix_fmt (set pixel format) with argument yuv444p.
    Successfully parsed a group of options.
    Opening an output file: video9.mp4.
    [file @ 0x558eba78a200] Setting default whitelist 'file,crypto'
    Successfully opened the file.
    Stream mapping:
      Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
    cur_dts is invalid (this is harmless if it occurs once at the start per stream)
    No more output streams to write to, finishing.
    Finishing stream 0:0 without any data written to it.
    detected 2 logical cores
    [graph 0 input from stream 0:0 @ 0x558eba7a4a00] Setting 'video_size' to value '128x128'
    [graph 0 input from stream 0:0 @ 0x558eba7a4a00] Setting 'pix_fmt' to value '28'
    [graph 0 input from stream 0:0 @ 0x558eba7a4a00] Setting 'time_base' to value '1/25'
    [graph 0 input from stream 0:0 @ 0x558eba7a4a00] Setting 'pixel_aspect' to value '0/1'
    [graph 0 input from stream 0:0 @ 0x558eba7a4a00] Setting 'sws_param' to value 'flags=2'
    [graph 0 input from stream 0:0 @ 0x558eba7a4a00] Setting 'frame_rate' to value '25/1'
    [graph 0 input from stream 0:0 @ 0x558eba7a4a00] w:128 h:128 pixfmt:rgba tb:1/25 fr:25/1 sar:0/1 sws_param:flags=2
    [format @ 0x558eba7a4b40] compat: called with args=[yuv444p]
    [format @ 0x558eba7a4b40] Setting 'pix_fmts' to value 'yuv444p'
    [auto_scaler_0 @ 0x558eba7a4be0] Setting 'flags' to value 'bicubic'
    [auto_scaler_0 @ 0x558eba7a4be0] w:iw h:ih flags:'bicubic' interl:0
    [format @ 0x558eba7a4b40] auto-inserting filter 'auto_scaler_0' between the filter 'Parsed_null_0' and the filter 'format'
    [AVFilterGraph @ 0x558eba76d500] query_formats: 4 queried, 2 merged, 1 already done, 0 delayed
    [auto_scaler_0 @ 0x558eba7a4be0] w:128 h:128 fmt:rgba sar:0/1 -> w:128 h:128 fmt:yuv444p sar:0/1 flags:0x4
    [libx264 @ 0x558eba7cf900] using mv_range_thread = 24
    [libx264 @ 0x558eba7cf900] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2 AVX512
    [libx264 @ 0x558eba7cf900] profile High 4:4:4 Predictive, level 1.1, 4:4:4 8-bit
    [libx264 @ 0x558eba7cf900] 264 - core 152 r2854 e9a5903 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x1:0x111 me=hex subme=7 psy=0 mixed_ref=1 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=0 chroma_qp_offset=0 threads=3 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc=cqp mbtree=0 qp=0
    Output #0, mp4, to 'video9.mp4':
      Metadata:
        encoder         : Lavf57.83.100
        Stream #0:0, 0, 1/12800: Video: h264 (libx264) (avc1 / 0x31637661), yuv444p, 128x128, q=-1--1, 25 fps, 12800 tbn, 25 tbc
        Metadata:
          encoder         : Lavc57.107.100 libx264
        Side data:
          cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
    frame=    0 fps=0.0 q=0.0 Lsize=       0kB time=00:00:00.00 bitrate=N/A speed=   0x
    video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
    Input file #0 (pipe:):
      Input stream #0:0 (video): 0 packets read (0 bytes); 0 frames decoded;
      Total: 0 packets (0 bytes) demuxed
    Output file #0 (video9.mp4):
      Output stream #0:0 (video): 0 frames encoded; 0 packets muxed (0 bytes);
      Total: 0 packets (0 bytes) muxed
    0 frames successfully decoded, 0 decoding errors
    [AVIOContext @ 0x558eba7b4120] Statistics: 2 seeks, 3 writeouts
    [AVIOContext @ 0x558eba7b4000] Statistics: 0 bytes read, 0 seeks


    I've never used FFMPEG before so I'm having a hard time finding my mistake.
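    As an editorial aside (an assumption, not a confirmed fix): in the write loop above, pColors holds one uchar4 per pixel while j counts bytes, so indexing pColors by j reads past the end of the array. A sketch of the per-pixel indexing the code seems to intend, with an explicit flush so the raw frames actually reach the pipe:

    // Sketch of the inner loop: index pColors per pixel, keep stdout binary-clean,
    // and flush after each frame so ffmpeg receives the data.
    for (int i = 0; i < N_FRAMES; i++)
    {
        on_frame(pColors, timeStepSize);
        for (int p = 0; p < SCR_WIDTH * SCR_HEIGHT; p++)
        {
            data[p * 4]     = pColors[p].x;
            data[p * 4 + 1] = pColors[p].y;
            data[p * 4 + 2] = pColors[p].z;
            data[p * 4 + 3] = pColors[p].w;
        }
        std::cout.write(reinterpret_cast<char*>(data), SCR_WIDTH * SCR_HEIGHT * 4);
        std::cout.flush();
    }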


  • How to decode mp3 to raw sample data for FFMpeg using FFMediaToolkit

    28 December 2022, by Lee

    My objective is to create a video slideshow with audio using a database as the source. In the final implementation the video and audio inputs need to be memory streams or byte arrays, not file system paths. The sample code is file based for portability; it just tries to read a file-based mp3 and then write it to the output.

    I've tried a few FFmpeg wrappers and I'm open to alternatives. This code uses FFMediaToolkit. The video portion of the code works; it's the audio that I can't get to work.


    The input is described as "A 2D jagged array of multi-channel sample data with NumChannels rows and NumSamples columns." The datatype is float[][].

    My mp3 source is mono. I'm using NAudio.Wave to decode the mp3. It is then split into chunks equal to the frame size for the sample rate, and converted into the jagged float array with the data on channel 0.

    FFmpeg prints a long list of "buffer underflow" and "packet too large, ignoring buffer limits to mux it" messages. C# reports "Specified argument was out of the range of valid values.", the offending line of code being file.Audio.AddFrame(frameAudio).

    The source has 16-bit samples. PCM_S16BE is the only codec I could get to accept a 16-bit sample format. I could only get the MP3 encoder to work with "Signed 32-bit integer (planar)" as the sample format. I'm not certain whether the source data needs to be converted from 16 to 32 bit to use that codec.


    using FFMediaToolkit;
    using FFMediaToolkit.Decoding;
    using FFMediaToolkit.Encoding;
    using FFMediaToolkit.Graphics;
    using System;
    using System.Collections.Generic;
    using System.Drawing.Imaging;
    using System.Drawing;
    using System.IO;
    using System.Linq;
    using System.Text;
    using System.Threading.Tasks;
    using FFMediaToolkit.Audio;
    using NAudio.Wave;
    using FFmpeg.AutoGen;

    internal class FFMediaToolkitTest
    {
        const int frameRate = 30;
        const int vWidth = 1920;
        const int vHeight = 1080;
        const int aSampleRate = 24_000; // source sample rate
        //const int aSampleRate = 44_100;
        const int aSamplesPerFrame = aSampleRate / frameRate;
        const int aBitRate = 32_000;
        const string dirInput = @"D:\Websites\Vocabulary\Videos\source\";
        const string pathOutput = @"D:\Websites\Vocabulary\Videos\example.mpg";

        public FFMediaToolkitTest()
        {
            try
            {
                FFmpegLoader.FFmpegPath = ".";  // FFmpeg DLLs in root project directory
                var settings = new VideoEncoderSettings(width: vWidth, height: vHeight, framerate: frameRate, codec: VideoCodec.H264);
                settings.EncoderPreset = EncoderPreset.Fast;
                settings.CRF = 17;

                //var settingsAudio = new AudioEncoderSettings(aSampleRate, 1, (AudioCodec)AVCodecID.AV_CODEC_ID_PCM_S16BE);  // Won't run with low bitrate.
                var settingsAudio = new AudioEncoderSettings(aSampleRate, 1, AudioCodec.MP3); // mpg runs with SampleFormat.SignedDWordP
                settingsAudio.Bitrate = aBitRate;
                //settingsAudio.SamplesPerFrame = aSamplesPerFrame;
                settingsAudio.SampleFormat = SampleFormat.SignedDWordP;

                using (var file = MediaBuilder.CreateContainer(pathOutput).WithVideo(settings).WithAudio(settingsAudio).Create())
                {
                    var files = Directory.GetFiles(dirInput, "*.jpg");
                    foreach (var inputFile in files)
                    {
                        Console.WriteLine(inputFile);
                        var binInputFile = File.ReadAllBytes(inputFile);
                        var memInput = new MemoryStream(binInputFile);
                        var bitmap = Bitmap.FromStream(memInput) as Bitmap;
                        var rect = new System.Drawing.Rectangle(System.Drawing.Point.Empty, bitmap.Size);
                        var bitLock = bitmap.LockBits(rect, ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
                        var bitmapData = ImageData.FromPointer(bitLock.Scan0, ImagePixelFormat.Bgr24, bitmap.Size);

                        for (int i = 0; i < 60; i++)
                            file.Video.AddFrame(bitmapData);
                        bitmap.UnlockBits(bitLock);
                    }

                    var mp3files = Directory.GetFiles(dirInput, "*.mp3");
                    foreach (var inputFile in mp3files)
                    {
                        Console.WriteLine(inputFile);
                        var binInputFile = File.ReadAllBytes(inputFile);
                        var memInput = new MemoryStream(binInputFile);

                        foreach (float[][] frameAudio in GetFrames(memInput))
                        {
                            file.Audio.AddFrame(frameAudio); // encode the frame
                        }
                    }
                    //Console.WriteLine(file.Audio.CurrentDuration);
                    Console.WriteLine(file.Video.CurrentDuration);
                    Console.WriteLine(file.Video.Configuration);
                }
            }
            catch (Exception e)
            {
                Vocab.LogError("FFMediaToolkitTest", e.StackTrace + " " + e.Message);
                Console.WriteLine(e.StackTrace + " " + e.Message);
            }

            Console.WriteLine();
            Console.WriteLine("Done");
            Console.ReadLine();
        }

        public static List<float[][]> GetFrames(MemoryStream mp3stream)
        {
            List<float[][]> output = new List<float[][]>();

            int frameCount = 0;

            NAudio.Wave.StreamMediaFoundationReader smfReader = new StreamMediaFoundationReader(mp3stream);
            Console.WriteLine(smfReader.WaveFormat);
            Console.WriteLine(smfReader.WaveFormat.AverageBytesPerSecond); // 48000
            Console.WriteLine(smfReader.WaveFormat.BitsPerSample);         // 16
            Console.WriteLine(smfReader.WaveFormat.Channels);              // 1
            Console.WriteLine(smfReader.WaveFormat.SampleRate);            // 24000

            Console.WriteLine("PCM bytes: " + smfReader.Length);
            Console.WriteLine("Total Time: " + smfReader.TotalTime);

            int samplesPerFrame = smfReader.WaveFormat.SampleRate / frameRate;
            int bytesPerFrame = samplesPerFrame * smfReader.WaveFormat.BitsPerSample / 8;
            byte[] byteBuffer = new byte[bytesPerFrame];

            while (smfReader.Read(byteBuffer, 0, bytesPerFrame) != 0)
            {
                float[][] buffer = Convert16BitToFloat(byteBuffer);
                output.Add(buffer);
                frameCount++;
            }
            return output;
        }

        public static float[][] Convert16BitToFloat(byte[] input)
        {
            // Only works with single channel data
            int inputSamples = input.Length / 2;
            float[][] output = new float[1][];
            output[0] = new float[inputSamples];
            int outputIndex = 0;
            for (int n = 0; n < inputSamples; n++)
            {
                short sample = BitConverter.ToInt16(input, n * 2);
                output[0][outputIndex++] = sample / 32768f;
            }
            return output;
        }
    }


    I've tried multiple codecs with various settings. I couldn't get any of the codecs to accept an mp4 output file extension. FFmpeg will run, but errors out with mpg as the output file.
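    An editorial guess (an assumption, not a verified fix): the out-of-range exception may come from the last Read returning a partial frame, so the float[][] handed to AddFrame is shorter than the others; a sketch of GetFrames' read loop that zero-pads the final chunk, possibly combined with setting settingsAudio.SamplesPerFrame = aSamplesPerFrame so the encoder frame size matches the chunks:

    // Sketch: pad the last short read to a full frame so every chunk passed to
    // file.Audio.AddFrame has exactly samplesPerFrame samples.
    int bytesRead;
    while ((bytesRead = smfReader.Read(byteBuffer, 0, bytesPerFrame)) != 0)
    {
        if (bytesRead < bytesPerFrame)
            Array.Clear(byteBuffer, bytesRead, bytesPerFrame - bytesRead); // zero-pad
        output.Add(Convert16BitToFloat(byteBuffer));
    }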
