Advanced search

Media (91)

Other articles (23)

  • Final creation of the channel

    12 March 2010

    Once your request has been validated, you can proceed with the actual creation of the channel. Each channel is a full site in its own right, placed under your responsibility. The platform administrators have no access to it.
    Upon validation, you receive an email inviting you to create your channel.
    To do so, simply go to its address, in our example "http://votre_sous_domaine.mediaspip.net".
    At that point you are asked for a password; you simply need to (...)

  • The farm's regular Cron tasks

    1 December 2010

    Managing the farm relies on several repetitive tasks, known as Cron tasks, executed at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance of the shared-hosting farm (mutualisation) on a regular basis. Combined with a system Cron on the central site of the farm, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...) A minimal sketch of this pattern is given after this list.

  • Configurable image and logo sizes

    9 February 2011

    In many places on the site, logos and images are resized to fit the slots defined by the themes. Since all of these sizes can change from one theme to another, they can be defined directly in the theme, saving the user from having to configure them manually after changing the site's appearance.
    These image sizes are also available in the MediaSPIP Core specific configuration. The maximum size of the site logo in pixels, allowing (...)
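    As announced in the Cron item above, here is a minimal sketch of the "super cron" idea, written in Node rather than SPIP's actual PHP. The instance URLs and the cron endpoint are assumptions for illustration, not MediaSPIP's real API; the point is only the pattern of visiting every instance once a minute so that rarely visited sites still run their scheduled tasks.

// Hypothetical sketch of the super-cron pattern described above.
// Assumes Node 18+ (global fetch); URLs and the ?action=cron path are made up.
const instances = [
  'http://site1.mediaspip.net',
  'http://site2.mediaspip.net'
];

async function superCron() {
  for (const url of instances) {
    try {
      await fetch(`${url}/?action=cron`); // any regular hit on the site would do
      console.log('cron triggered for', url);
    } catch (err) {
      console.error('cron failed for', url, err.message);
    }
  }
}

// run once per minute, like the gestion_mutu_super_cron task
setInterval(superCron, 60 * 1000);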

On other sites (3732)

  • Stream sent via FFMPEG (NodeJS) to RTMP (YouTube) not being received

    10 December 2024, by Qumber

    I am writing a very basic Chrome extension that captures a video stream and sends it to a Node.js server, which in turn sends it to the YouTube Live server.

    


    Here is my implementation of the backend, which receives data via WebRTC and sends it to YouTube using FFmpeg:

    


    const express = require('express');
const cors = require('cors');
const { RTCPeerConnection, RTCSessionDescription } = require('@roamhq/wrtc');
const { spawn } = require('child_process');

const app = express();
app.use(express.json());
app.use(cors());

app.post('/webrtc', async (req, res) => {
  const peerConnection = new RTCPeerConnection();

  // Start ffmpeg process for streaming
  const ffmpeg = spawn('ffmpeg', [
    '-f', 'flv',
    '-i', 'pipe:0',
    '-c:v', 'libx264',
    '-preset', 'veryfast',
    '-maxrate', '3000k',
    '-bufsize', '6000k',
    '-pix_fmt', 'yuv420p',
    '-g', '50',
    '-f', 'flv',
    'rtmp://a.rtmp.youtube.com/live2/MY_KEY'
  ]);

  ffmpeg.on('error', (err) => {
    console.error('FFmpeg error:', err);
  });

  ffmpeg.stderr.on('data', (data) => {
    console.error('FFmpeg stderr:', data.toString());
  });

  ffmpeg.stdout.on('data', (data) => {
    console.log('FFmpeg stdout:', data.toString());
  });

  // Handle incoming tracks
  peerConnection.ontrack = (event) => {
    console.log('Track received:', event.track.kind);
    const track = event.track;

    // Stream the incoming track to FFmpeg
    track.onunmute = () => {
      console.log('Track unmuted:', track.kind);
      const reader = track.createReadStream();
      reader.on('data', (chunk) => {
        console.log('Forwarding chunk to FFmpeg:', chunk.length);
        ffmpeg.stdin.write(chunk);
      });
      reader.on('end', () => {
        console.log('Stream ended');
        ffmpeg.stdin.end();
      });
    };

    track.onmute = () => {
      console.log('Track muted:', track.kind);
    };
  };

  // Set the remote description (offer) received from the client
  await peerConnection.setRemoteDescription(new RTCSessionDescription(req.body.sdp));

  // Create an answer and send it back to the client
  const answer = await peerConnection.createAnswer();
  await peerConnection.setLocalDescription(answer);

  res.json({ sdp: peerConnection.localDescription });
});

app.listen(3000, () => {
  console.log('WebRTC to RTMP server running on port 3000');
});



    


    This is the output I get, but nothing gets sent to YouTube:

    


    

    FFmpeg stderr: ffmpeg version 7.0.2 Copyright (c) 2000-2024 the FFmpeg developers
  built with Apple clang version 15.0.0 (clang-1500.3.9.4)

FFmpeg stderr:   configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0.2_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon

FFmpeg stderr:   libavutil      59.  8.100 / 59.  8.100
  libavcodec     61.  3.100 / 61.  3.100
  libavformat    61.  1.100 / 61.  1.100
  libavdevice    61.  1.100 / 61.  1.100

FFmpeg stderr:   libavfilter    10.  1.100 / 10.  1.100
  libswscale      8.  1.100 /  8.  1.100
  libswresample   5.  1.100 /  5.  1.100
  libpostproc    58.  1.100 / 58.  1.100


    


    


    I do not understand what I am doing wrong. Any help would be appreciated.

    



    


    Optionally, here is the frontend code from the extension, which (to me) appears to be recording and sending the capture:

    


    popup.js & popup.html

    


    

    

    document.addEventListener('DOMContentLoaded', () => {
  document.getElementById('openCapturePage').addEventListener('click', () => {
    chrome.tabs.create({
      url: chrome.runtime.getURL('capture.html')
    });
  });
});

    


    popup.html: a minimal page titled "StreamSavvy" that loads popup.js (the rest of the markup is not recoverable from the feed).

    capture.js & capture.html

let peerConnection;

async function startStreaming() {
  try {
    const stream = await navigator.mediaDevices.getDisplayMedia({
      video: {
        cursor: "always"
      },
      audio: false
    });

    peerConnection = new RTCPeerConnection({
      iceServers: [{
        urls: 'stun:stun.l.google.com:19302'
      }]
    });

    stream.getTracks().forEach(track => peerConnection.addTrack(track, stream));

    const offer = await peerConnection.createOffer();
    await peerConnection.setLocalDescription(offer);

    const response = await fetch('http://localhost:3000/webrtc', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        sdp: peerConnection.localDescription
      })
    });

    const {
      sdp
    } = await response.json();
    await peerConnection.setRemoteDescription(new RTCSessionDescription(sdp));

    console.log("Streaming to server via WebRTC...");
  } catch (error) {
    console.error("Error starting streaming:", error.name, error.message);
  }
}

async function stopStreaming() {
  if (peerConnection) {
    // Stop all media tracks
    peerConnection.getSenders().forEach(sender => {
      if (sender.track) {
        sender.track.stop();
      }
    });

    // Close the peer connection
    peerConnection.close();
    peerConnection = null;
    console.log("Streaming stopped");
  }
}

document.addEventListener('DOMContentLoaded', () => {
  document.getElementById('startCapture').addEventListener('click', startStreaming);
  document.getElementById('stopCapture').addEventListener('click', stopStreaming);
});

    capture.html: a minimal page titled "StreamSavvy Capture" that loads capture.js, presumably containing the startCapture/stopCapture buttons that capture.js wires up (the markup itself is not recoverable from the feed).

    background.js (service worker)

chrome.runtime.onInstalled.addListener(() => {
  console.log("StreamSavvy Extension Installed");
});

chrome.runtime.onMessage.addListener((message, sender, sendResponse) => {
  if (message.type === 'startStreaming') {
    chrome.tabs.create({
      url: chrome.runtime.getURL('capture.html')
    });
    sendResponse({
      status: 'streaming'
    });
  } else if (message.type === 'stopStreaming') {
    chrome.tabs.query({
      url: chrome.runtime.getURL('capture.html')
    }, (tabs) => {
      if (tabs.length > 0) {
        chrome.tabs.sendMessage(tabs[0].id, {
          type: 'stopStreaming'
        });
        sendResponse({
          status: 'stopped'
        });
      }
    });
  }
  return true; // Keep the message channel open for sendResponse
});

  • FFMPEG is not working in app service post deployment

    1 March 2024, by Tushar Gupta

    I am using the Xabe.FFmpeg package to generate clips from a video. The code basically converts a video into multiple clips; it works fine locally, but whenever I upload the code to the App Service it stops working.

    Stack: .NET
    App Service OS: Windows

    Error:

2024-02-13 08:57:34.092 +00:00 [Error] Microsoft.AspNetCore.Server.IIS.Core.IISHttpServer: Connection ID "16573246629528734243", Request ID "40000a24-0000-e600-b63f-84710c7967bb": An unhandled exception was thrown by the application.
System.ComponentModel.Win32Exception (193): An error occurred trying to start process 'C:\home\site\wwwroot\Controllers\ffmpeg\bin\ffmpeg.exe' with working directory 'C:\home\site\wwwroot'. The specified executable is not a valid application for this OS platform.
   at System.Diagnostics.Process.StartWithCreateProcess(ProcessStartInfo startInfo)
   at System.Diagnostics.Process.Start()
   at Xabe.FFmpeg.FFmpeg.RunProcess(String args, String processPath, Nullable`1 priority, Boolean standardInput, Boolean standardOutput, Boolean standardError)
   at Xabe.FFmpeg.FFmpegWrapper.<>c__DisplayClass14_0.<RunProcess>b__0()
   at System.Threading.Tasks.Task`1.InnerInvoke()
   at System.Threading.Tasks.Task.<>c.<.cctor>b__281_0(Object obj)
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
--- End of stack trace from previous location ---
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
   at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot, Thread threadPoolThread)
--- End of stack trace from previous location ---
   at Xabe.FFmpeg.Conversion.Start(String parameters, CancellationToken cancellationToken)
   at ExtractResponseAPI.Controllers.HomeController.PrepareVideoClips(CloudBlockBlob sourceVideoBlob, TimeSpan startTime, TimeSpan endTime) in C:\Users\tushar.h.gupta\source\repos\ExtractResponseAPI\Controllers\HomeController.cs:line 164
   at ExtractResponseAPI.Controllers.HomeController.GetIntervalsAsync(String query) in C:\Users\tushar.h.gupta\source\repos\ExtractResponseAPI\Controllers\HomeController.cs:line 60
   at ExtractResponseAPI.Controllers.HomeController.Get(String query) in C:\Users\tushar.h.gupta\source\repos\ExtractResponseAPI\Controllers\HomeController.cs:line 24
   at lambda_method4(Closure, Object)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ActionMethodExecutor.AwaitableObjectResultExecutor.Execute(ActionContext actionContext, IActionResultTypeMapper mapper, ObjectMethodExecutor executor, Object controller, Object[] arguments)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeActionMethodAsync>g__Awaited|12_0(ControllerActionInvoker invoker, ValueTask`1 actionResultValueTask)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeNextActionFilterAsync>g__Awaited|10_0(ControllerActionInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.Rethrow(ActionExecutedContextSealed context)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.Next(State& next, Scope& scope, Object& state, Boolean& isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeInnerFilterAsync>g__Awaited|13_0(ControllerActionInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeFilterPipelineAsync>g__Awaited|20_0(ResourceInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeAsync>g__Awaited|17_0(ResourceInvoker invoker, Task task, IDisposable scope)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeAsync>g__Awaited|17_0(ResourceInvoker invoker, Task task, IDisposable scope)
   at Microsoft.AspNetCore.Authorization.AuthorizationMiddleware.Invoke(HttpContext context)
   at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware.Invoke(HttpContext context)
   at Microsoft.AspNetCore.Server.IIS.Core.IISHttpContextOfT`1.ProcessRequestAsync()

    Code:

    private async Task> PrepareVideoClips(CloudBlockBlob sourceVideoBlob, TimeSpan startTime, TimeSpan endTime)&#xA;{&#xA;    string tempDirectory = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString());&#xA;&#xA;    // Create a temporary directory to store clips locally&#xA;    Directory.CreateDirectory(tempDirectory);&#xA;&#xA;    // Download the source video&#xA;    string sourceVideoPath = Path.Combine(tempDirectory, "sourceVideo.mp4");&#xA;    await sourceVideoBlob.DownloadToFileAsync(sourceVideoPath, FileMode.Create);&#xA;    FFmpeg.SetExecutablesPath("Controllers\\ffmpeg\\bin\\");&#xA;    // Use FFmpegCore to trim the video&#xA;    string outputVideoPath = Path.Combine(tempDirectory, "output.mp4");&#xA;&#xA;    await FFmpeg.Conversions.New()&#xA;        .AddParameter($"-ss {startTime.TotalSeconds}") // Start time&#xA;        .AddParameter($"-i {sourceVideoPath}")&#xA;        .AddParameter($"-to {(endTime - startTime).TotalSeconds}") // Duration&#xA;        .SetOutput(outputVideoPath)&#xA;        .Start();&#xA;&#xA;    // Return a list of file paths for the clips&#xA;    return new List<string> { outputVideoPath };&#xA;}&#xA;</string>

    &#xA;

    I tried deploying the App Service with a different OS but I am still facing the same issue; I also tried using different packages, but the result is the same.

  • first audio lost when using ffmpeg to overlay one mp4 on top of a big mp4

    11 September 2024, by James Hao

    I searched a lot (including ChatGPT and Google) and tried a lot of methods, but none of them work.
    Below is my ffmpeg command line on Windows 10:

    ffmpeg -i video.mp4 -i b-.mp4 -filter_complex "[0:v]setpts=PTS-STARTPTS[b1];[1:v]scale=300:-1,setpts=PTS-STARTPTS+0.0/TB[top];[b1][top]overlay=x=50:y=50:enable='between(t\,0.0,5)'[outv];[1:a]adelay=0|0[a1]; [0:a][a1]amerge=inputs=2[outa]" -map "[outv]" -map "[outa]" -pix_fmt yuv420p -c:a aac -ac 2 -c:v libx264 -crf 18 final_video6.mp4

    There are two mp4 files: b-.mp4 should be overlaid on top of video.mp4, play from the 0th second and be scaled to 300:-1. [0:a][a1]amerge merges the two audio streams from the mp4 files, and "-ac 2" is used instead of a pan statement, according to enter link description here. An annotated restatement of this filtergraph is given after the console output below.

    In the resulting mp4 file, the audio from video.mp4 is lost; sometimes, with the same command but with b-.mp4 replaced by another mp4 file, the audio is only partially lost. Any help will be very appreciated.
    Below is the console output from ffmpeg:

ffmpeg version 7.0.1-full_build-www.gyan.dev Copyright (c) 2000-2024 the FFmpeg developers
  built with gcc 13.2.0 (Rev5, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libxevd --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxeve --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-dxva2 --enable-d3d11va --enable-d3d12va --enable-ffnvcodec --enable-libvpl --enable-nvdec --enable-nvenc --enable-vaapi --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
  libavutil      59.  8.100 / 59.  8.100
  libavcodec     61.  3.100 / 61.  3.100
  libavformat    61.  1.100 / 61.  1.100
  libavdevice    61.  1.100 / 61.  1.100
  libavfilter    10.  1.100 / 10.  1.100
  libswscale      8.  1.100 /  8.  1.100
  libswresample   5.  1.100 /  5.  1.100
  libpostproc    58.  1.100 / 58.  1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'video.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf61.1.100
  Duration: 00:00:09.40, start: 0.000000, bitrate: 72 kb/s
  Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 854x480, 21 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
      Metadata:
        handler_name    : VideoHandler
        vendor_id       : [0][0][0][0]
        encoder         : Lavc61.3.100 libx264
  Stream #0:1[0x2](und): Audio: mp3 (mp3float) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 47 kb/s (default)
      Metadata:
        handler_name    : SoundHandler
        vendor_id       : [0][0][0][0]
Input #1, mov,mp4,m4a,3gp,3g2,mj2, from 'b-.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf59.27.100
  Duration: 00:00:05.29, start: 0.030000, bitrate: 325 kb/s
  Stream #1:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 366x132, 196 kb/s, 25.15 fps, 50 tbr, 90k tbn (default)
      Metadata:
        handler_name    : VideoHandler
        vendor_id       : [0][0][0][0]
  Stream #1:1[0x2](und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 139 kb/s (default)
      Metadata:
        handler_name    : SoundHandler
        vendor_id       : [0][0][0][0]
Stream mapping:
  Stream #0:0 (h264) -> setpts:default
  Stream #0:1 (mp3float) -> amerge
  Stream #1:0 (h264) -> scale:default
  Stream #1:1 (aac) -> adelay:default
  overlay:default -> Stream #0:0 (libx264)
  amerge:default -> Stream #0:1 (aac)
Press [q] to stop, [?] for help
[Parsed_amerge_5 @ 000002a6613bf100] No channel layout for input 1
[vost#0:0/libx264 @ 000002a6612936c0] No information about the input framerate is available. Falling back to a default value of 25fps. Use the -r option if you want a different framerate.
[libx264 @ 000002a6612b8f40] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 000002a6612b8f40] profile High, level 3.0, 4:2:0, 8-bit
[libx264 @ 000002a6612b8f40] 264 - core 164 r3191 4613ac3 - H.264/MPEG-4 AVC codec - Copyleft 2003-2024 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=15 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=18.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'final_video6.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf61.1.100
  Stream #0:0: Video: h264 (avc1 / 0x31637661), yuv420p(progressive), 854x480, q=2-31, 25 fps, 12800 tbn
      Metadata:
        encoder         : Lavc61.3.100 libx264
      Side data:
        cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
  Stream #0:1: Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s
      Metadata:
        encoder         : Lavc61.3.100 aac
[out#0/mp4 @ 000002a66105e640] video:93KiB audio:78KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: 3.974328%
frame=  236 fps=0.0 q=-1.0 Lsize=     178KiB time=00:00:04.97 bitrate= 292.5kbits/s dup=0 drop=46 speed=16.9x
[libx264 @ 000002a6612b8f40] frame I:2     Avg QP: 4.89  size:  4803
[libx264 @ 000002a6612b8f40] frame P:63    Avg QP:14.43  size:   854
[libx264 @ 000002a6612b8f40] frame B:171   Avg QP:12.25  size:   181
[libx264 @ 000002a6612b8f40] consecutive B-frames:  2.1%  1.7%  6.4% 89.8%
[libx264 @ 000002a6612b8f40] mb I  I16..4: 86.1% 10.0%  3.8%
[libx264 @ 000002a6612b8f40] mb P  I16..4:  0.3%  0.1%  0.3%  P16..4:  1.4%  0.6%  0.1%  0.0%  0.0%    skip:97.1%
[libx264 @ 000002a6612b8f40] mb B  I16..4:  0.0%  0.0%  0.1%  B16..8:  1.4%  0.3%  0.0%  direct: 0.0%  skip:98.1%  L0:52.4% L1:39.0% BI: 8.6%
[libx264 @ 000002a6612b8f40] 8x8 transform intra:11.2% inter:13.3%
[libx264 @ 000002a6612b8f40] coded y,uvDC,uvAC intra: 11.2% 12.4% 12.0% inter: 0.2% 0.2% 0.1%
[libx264 @ 000002a6612b8f40] i16 v,h,dc,p: 90%  6%  4%  0%
[libx264 @ 000002a6612b8f40] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 28%  3% 67%  0%  0%  0%  0%  0%  1%
[libx264 @ 000002a6612b8f40] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 26% 35% 15%  3%  4%  4%  5%  3%  4%
[libx264 @ 000002a6612b8f40] i8c dc,h,v,p: 86% 10%  3%  0%
[libx264 @ 000002a6612b8f40] Weighted P-Frames: Y:1.6% UV:1.6%
[libx264 @ 000002a6612b8f40] ref P L0: 69.2%  3.3% 18.7%  8.8%
[libx264 @ 000002a6612b8f40] ref B L0: 64.6% 31.0%  4.3%
[libx264 @ 000002a6612b8f40] ref B L1: 94.4%  5.6%
[libx264 @ 000002a6612b8f40] kb/s:80.06
[aac @ 000002a6612d0fc0] Qavg: 498.856
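    As mentioned above, here is a hedged restatement of the same command as a Node child_process call, with comments describing what each filter in the graph does. It mirrors the command exactly (same file names, same options) and is only an annotated reading of it, not a fix for the missing audio.

const { spawn } = require('child_process');

// Same ffmpeg invocation as the command above, with each filter commented.
const ffmpeg = spawn('ffmpeg', [
  '-i', 'video.mp4',   // input 0: the base video (mono mp3 audio)
  '-i', 'b-.mp4',      // input 1: the video to overlay (stereo aac audio)
  '-filter_complex',
  [
    '[0:v]setpts=PTS-STARTPTS[b1]',                        // reset base video timestamps to start at 0
    '[1:v]scale=300:-1,setpts=PTS-STARTPTS+0.0/TB[top]',   // scale overlay to 300 px wide (keep aspect), reset timestamps, offset by 0 s
    "[b1][top]overlay=x=50:y=50:enable='between(t\\,0.0,5)'[outv]", // draw overlay at (50,50) only from t=0 to t=5 s
    '[1:a]adelay=0|0[a1]',                                  // delay the overlay audio by 0 ms on both channels
    '[0:a][a1]amerge=inputs=2[outa]'                        // merge both audio streams into one multichannel stream
  ].join(';'),
  '-map', '[outv]',
  '-map', '[outa]',
  '-pix_fmt', 'yuv420p',
  '-c:a', 'aac',
  '-ac', '2',          // downmix the merged channels to stereo, in place of an explicit pan filter
  '-c:v', 'libx264',
  '-crf', '18',
  'final_video6.mp4'
]);

ffmpeg.stderr.on('data', (d) => process.stderr.write(d));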