
Media (91)
-
Head down (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
Echoplex (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
Discipline (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
Letting you (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
1 000 000 (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
999 999 (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
Other articles (23)
-
Final creation of the channel
12 March 2010, by — Once your request has been validated, you can proceed with the actual creation of the channel. Each channel is a fully fledged site placed under your responsibility; the administrators of the platform have no access to it.
Upon validation, you receive an email inviting you to create your channel.
To do so, simply go to its address, in our example "http://votre_sous_domaine.mediaspip.net".
At that point you are asked for a password; you simply have to (...) -
The farm's regular Cron tasks
1 December 2010, by — Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of all the instances of the shared hosting on a regular basis. Coupled with a system Cron on the central site of the farm, this generates regular visits to the different sites and prevents the tasks of rarely visited sites from being too (...) (a minimal system-Cron sketch follows after this list) -
Definable image and logo sizes
9 February 2011, by — In many places on the site, logos and images are resized to fit the slots defined by the themes. Since all of these sizes can change from one theme to another, they can be defined directly in the theme, sparing the user from having to configure them manually after changing the appearance of the site.
These image sizes are also available in the specific configuration of MediaSPIP Core. The maximum size of the site logo in pixels (...)
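
As a complement to the Cron item above, here is a minimal sketch of what the system Cron on the central site of the farm could look like. The domain is a placeholder and the exact URL to request depends on the SPIP/MediaSPIP installation; the only point is to generate the regular visit that lets the super Cron (gestion_mutu_super_cron) run every minute.

# Hypothetical crontab entry on the central server of the farm:
# request the central site once a minute so that its super Cron can,
# in turn, trigger the Cron of every instance it manages.
* * * * * curl --silent --output /dev/null http://ferme.example.org/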
On other sites (3732)
-
Stream sent via FFMPEG (NodeJS) to RTMP (YouTube) not being received
10 December 2024, by Qumber
I am writing a very basic Chrome extension that captures a video stream and sends it to a Node.js server, which in turn sends it to the YouTube Live server.


Here is my implementation of the backend, which receives data via WebRTC and sends it to YouTube using FFmpeg:


const express = require('express');
const cors = require('cors');
const { RTCPeerConnection, RTCSessionDescription } = require('@roamhq/wrtc');
const { spawn } = require('child_process');

const app = express();
app.use(express.json());
app.use(cors());

app.post('/webrtc', async (req, res) => {
  const peerConnection = new RTCPeerConnection();

  // Start ffmpeg process for streaming
  const ffmpeg = spawn('ffmpeg', [
    '-f', 'flv',
    '-i', 'pipe:0',
    '-c:v', 'libx264',
    '-preset', 'veryfast',
    '-maxrate', '3000k',
    '-bufsize', '6000k',
    '-pix_fmt', 'yuv420p',
    '-g', '50',
    '-f', 'flv',
    'rtmp://a.rtmp.youtube.com/live2/MY_KEY'
  ]);

  ffmpeg.on('error', (err) => {
    console.error('FFmpeg error:', err);
  });

  ffmpeg.stderr.on('data', (data) => {
    console.error('FFmpeg stderr:', data.toString());
  });

  ffmpeg.stdout.on('data', (data) => {
    console.log('FFmpeg stdout:', data.toString());
  });

  // Handle incoming tracks
  peerConnection.ontrack = (event) => {
    console.log('Track received:', event.track.kind);
    const track = event.track;

    // Stream the incoming track to FFmpeg
    track.onunmute = () => {
      console.log('Track unmuted:', track.kind);
      const reader = track.createReadStream();
      reader.on('data', (chunk) => {
        console.log('Forwarding chunk to FFmpeg:', chunk.length);
        ffmpeg.stdin.write(chunk);
      });
      reader.on('end', () => {
        console.log('Stream ended');
        ffmpeg.stdin.end();
      });
    };

    track.onmute = () => {
      console.log('Track muted:', track.kind);
    };
  };

  // Set the remote description (offer) received from the client
  await peerConnection.setRemoteDescription(new RTCSessionDescription(req.body.sdp));

  // Create an answer and send it back to the client
  const answer = await peerConnection.createAnswer();
  await peerConnection.setLocalDescription(answer);

  res.json({ sdp: peerConnection.localDescription });
});

app.listen(3000, () => {
  console.log('WebRTC to RTMP server running on port 3000');
});




This is the output I get, but nothing gets sent to YouTube:




FFmpeg stderr: ffmpeg version 7.0.2 Copyright (c) 2000-2024 the FFmpeg developers
 built with Apple clang version 15.0.0 (clang-1500.3.9.4)

FFmpeg stderr: configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0.2_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon

FFmpeg stderr: libavutil 59. 8.100 / 59. 8.100
 libavcodec 61. 3.100 / 61. 3.100
 libavformat 61. 1.100 / 61. 1.100
 libavdevice 61. 1.100 / 61. 1.100

FFmpeg stderr: libavfilter 10. 1.100 / 10. 1.100
 libswscale 8. 1.100 / 8. 1.100
 libswresample 5. 1.100 / 5. 1.100
 libpostproc 58. 1.100 / 58. 1.100





I do not understand what I am doing wrong. Any help would be appreciated.
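
As a side note, the RTMP/encoding leg can be exercised on its own, independently of the WebRTC input, by pushing a synthetic source to the same ingest URL. This is only a diagnostic sketch (MY_KEY stands for the real stream key), not a fix for the extension itself:

# Push a test pattern plus silent audio straight to the YouTube ingest point
# to confirm the stream key and encoder settings work without WebRTC.
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
       -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 \
       -c:v libx264 -preset veryfast -pix_fmt yuv420p -g 50 \
       -c:a aac -b:a 128k \
       -f flv rtmp://a.rtmp.youtube.com/live2/MY_KEY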



Optionally, here's the frontend code from the extension, which (to me) appears to be recording and sending the capture:


popup.js & popup.html




document.addEventListener('DOMContentLoaded', () => {
  document.getElementById('openCapturePage').addEventListener('click', () => {
    chrome.tabs.create({
      url: chrome.runtime.getURL('capture.html')
    });
  });
});






 
[popup.html markup was stripped by the page; only its title, "StreamSavvy", and a reference to popup.js survive]
capture.js & capture.html




let peerConnection;

async function startStreaming() {
  try {
    const stream = await navigator.mediaDevices.getDisplayMedia({
      video: {
        cursor: "always"
      },
      audio: false
    });

    peerConnection = new RTCPeerConnection({
      iceServers: [{
        urls: 'stun:stun.l.google.com:19302'
      }]
    });

    stream.getTracks().forEach(track => peerConnection.addTrack(track, stream));

    const offer = await peerConnection.createOffer();
    await peerConnection.setLocalDescription(offer);

    const response = await fetch('http://localhost:3000/webrtc', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        sdp: peerConnection.localDescription
      })
    });

    const { sdp } = await response.json();
    await peerConnection.setRemoteDescription(new RTCSessionDescription(sdp));

    console.log("Streaming to server via WebRTC...");
  } catch (error) {
    console.error("Error starting streaming:", error.name, error.message);
  }
}

async function stopStreaming() {
  if (peerConnection) {
    // Stop all media tracks
    peerConnection.getSenders().forEach(sender => {
      if (sender.track) {
        sender.track.stop();
      }
    });

    // Close the peer connection
    peerConnection.close();
    peerConnection = null;
    console.log("Streaming stopped");
  }
}

document.addEventListener('DOMContentLoaded', () => {
  document.getElementById('startCapture').addEventListener('click', startStreaming);
  document.getElementById('stopCapture').addEventListener('click', stopStreaming);
});






 
[capture.html markup was stripped by the page; only its title, "StreamSavvy Capture", and a reference to capture.js survive]
background.js (service worker)




chrome.runtime.onInstalled.addListener(() => {
  console.log("StreamSavvy Extension Installed");
});

chrome.runtime.onMessage.addListener((message, sender, sendResponse) => {
  if (message.type === 'startStreaming') {
    chrome.tabs.create({
      url: chrome.runtime.getURL('capture.html')
    });
    sendResponse({
      status: 'streaming'
    });
  } else if (message.type === 'stopStreaming') {
    chrome.tabs.query({
      url: chrome.runtime.getURL('capture.html')
    }, (tabs) => {
      if (tabs.length > 0) {
        chrome.tabs.sendMessage(tabs[0].id, {
          type: 'stopStreaming'
        });
        sendResponse({
          status: 'stopped'
        });
      }
    });
  }
  return true; // Keep the message channel open for sendResponse
});







-
FFMPEG is not working in app service post deployment
1 March 2024, by Tushar Gupta
I am using the Xabe.FFmpeg package to generate clips from a video. The code basically converts a video into multiple clips; it works fine locally, but whenever I deploy it to the App Service it stops working.


Stack: .NET
App Service OS: Windows


Error:


2024-02-13 08:57:34.092 +00:00 [Error] Microsoft.AspNetCore.Server.IIS.Core.IISHttpServer: Connection ID "16573246629528734243", Request ID "40000a24-0000-e600-b63f-84710c7967bb": An unhandled exception was thrown by the application.
System.ComponentModel.Win32Exception (193): An error occurred trying to start process 'C:\home\site\wwwroot\Controllers\ffmpeg\bin\ffmpeg.exe' with working directory 'C:\home\site\wwwroot'. The specified executable is not a valid application for this OS platform.
   at System.Diagnostics.Process.StartWithCreateProcess(ProcessStartInfo startInfo)
   at System.Diagnostics.Process.Start()
   at Xabe.FFmpeg.FFmpeg.RunProcess(String args, String processPath, Nullable`1 priority, Boolean standardInput, Boolean standardOutput, Boolean standardError)
   at Xabe.FFmpeg.FFmpegWrapper.<>c__DisplayClass14_0.<RunProcess>b__0()
   at System.Threading.Tasks.Task`1.InnerInvoke()
   at System.Threading.Tasks.Task.<>c.<.cctor>b__281_0(Object obj)
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
--- End of stack trace from previous location ---
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
   at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot, Thread threadPoolThread)
--- End of stack trace from previous location ---
   at Xabe.FFmpeg.Conversion.Start(String parameters, CancellationToken cancellationToken)
   at ExtractResponseAPI.Controllers.HomeController.PrepareVideoClips(CloudBlockBlob sourceVideoBlob, TimeSpan startTime, TimeSpan endTime) in C:\Users\tushar.h.gupta\source\repos\ExtractResponseAPI\Controllers\HomeController.cs:line 164
   at ExtractResponseAPI.Controllers.HomeController.GetIntervalsAsync(String query) in C:\Users\tushar.h.gupta\source\repos\ExtractResponseAPI\Controllers\HomeController.cs:line 60
   at ExtractResponseAPI.Controllers.HomeController.Get(String query) in C:\Users\tushar.h.gupta\source\repos\ExtractResponseAPI\Controllers\HomeController.cs:line 24
   at lambda_method4(Closure, Object)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ActionMethodExecutor.AwaitableObjectResultExecutor.Execute(ActionContext actionContext, IActionResultTypeMapper mapper, ObjectMethodExecutor executor, Object controller, Object[] arguments)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeActionMethodAsync>g__Awaited|12_0(ControllerActionInvoker invoker, ValueTask`1 actionResultValueTask)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeNextActionFilterAsync>g__Awaited|10_0(ControllerActionInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.Rethrow(ActionExecutedContextSealed context)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.Next(State& next, Scope& scope, Object& state, Boolean& isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeInnerFilterAsync>g__Awaited|13_0(ControllerActionInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeFilterPipelineAsync>g__Awaited|20_0(ResourceInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeAsync>g__Awaited|17_0(ResourceInvoker invoker, Task task, IDisposable scope)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeAsync>g__Awaited|17_0(ResourceInvoker invoker, Task task, IDisposable scope)
   at Microsoft.AspNetCore.Authorization.AuthorizationMiddleware.Invoke(HttpContext context)
   at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware.Invoke(HttpContext context)
   at Microsoft.AspNetCore.Server.IIS.Core.IISHttpContextOfT`1.ProcessRequestAsync()

Code:


private async Task<List<string>> PrepareVideoClips(CloudBlockBlob sourceVideoBlob, TimeSpan startTime, TimeSpan endTime)
{
    string tempDirectory = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString());

    // Create a temporary directory to store clips locally
    Directory.CreateDirectory(tempDirectory);

    // Download the source video
    string sourceVideoPath = Path.Combine(tempDirectory, "sourceVideo.mp4");
    await sourceVideoBlob.DownloadToFileAsync(sourceVideoPath, FileMode.Create);
    FFmpeg.SetExecutablesPath("Controllers\\ffmpeg\\bin\\");
    // Use FFmpeg to trim the video
    string outputVideoPath = Path.Combine(tempDirectory, "output.mp4");

    await FFmpeg.Conversions.New()
        .AddParameter($"-ss {startTime.TotalSeconds}") // Start time
        .AddParameter($"-i {sourceVideoPath}")
        .AddParameter($"-to {(endTime - startTime).TotalSeconds}") // Duration
        .SetOutput(outputVideoPath)
        .Start();

    // Return a list of file paths for the clips
    return new List<string> { outputVideoPath };
}
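
For reference, the conversion built above boils down to a plain ffmpeg invocation along these lines; the numbers are illustrative, and because -ss is placed before -i the output timestamps restart near zero, so -to effectively acts as a duration here:

# Hypothetical equivalent of the AddParameter chain above
# (seek to startTime, read the downloaded source, stop after endTime - startTime seconds).
ffmpeg -ss 10 -i sourceVideo.mp4 -to 25 output.mp4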


I tried deploying the App Service with a different OS but am still facing the same issue; I also tried different packages, but the result is the same.


-
first audio lost when using ffmpeg to overlay one mp4 on top of a big mp4
11 September 2024, by James Hao
I searched a lot (including ChatGPT and Google) and tried a lot of methods, but none of them worked.
Below is my ffmpeg command line on Windows 10:


ffmpeg -i video.mp4 -i b-.mp4 -filter_complex "[0:v]setpts=PTS-STARTPTS[b1];[1:v]scale=300:-1,setpts=PTS-STARTPTS+0.0/TB[top];[b1][top]overlay=x=50:y=50:enable='between(t\,0.0,5)'[outv];[1:a]adelay=0|0[a1]; [0:a][a1]amerge=inputs=2[outa]" -map "[outv]" -map "[outa]" -pix_fmt yuv420p -c:a aac -ac 2 -c:v libx264 -crf 18 final_video6.mp4



There are two mp4 files: b-.mp4 should be overlaid on top of video.mp4, play from the 0th second, and be scaled to 300:-1. [0:a][a1]amerge is meant to merge the two audio streams from the mp4 files, with "-ac 2" used to replace the pan statement, according to enter link description here.


In the resulting mp4 file, the audio from video.mp4 is lost; sometimes, with the same command but another mp4 file in place of b-.mp4, the audio is only partially lost. Any help will be very appreciated. (A variant of the filtergraph is sketched after the console output below.)

Below is the console output from ffmpeg:


ffmpeg version 7.0.1-full_build-www.gyan.dev Copyright (c) 2000-2024 the FFmpeg developers
 built with gcc 13.2.0 (Rev5, Built by MSYS2 project)
 configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libxevd --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxeve --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-dxva2 --enable-d3d11va --enable-d3d12va --enable-ffnvcodec --enable-libvpl --enable-nvdec --enable-nvenc --enable-vaapi --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
 libavutil 59. 8.100 / 59. 8.100
 libavcodec 61. 3.100 / 61. 3.100
 libavformat 61. 1.100 / 61. 1.100
 libavdevice 61. 1.100 / 61. 1.100
 libavfilter 10. 1.100 / 10. 1.100
 libswscale 8. 1.100 / 8. 1.100
 libswresample 5. 1.100 / 5. 1.100
 libpostproc 58. 1.100 / 58. 1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'video.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 encoder : Lavf61.1.100
 Duration: 00:00:09.40, start: 0.000000, bitrate: 72 kb/s
 Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 854x480, 21 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
 Metadata:
 handler_name : VideoHandler
 vendor_id : [0][0][0][0]
 encoder : Lavc61.3.100 libx264
 Stream #0:1[0x2](und): Audio: mp3 (mp3float) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 47 kb/s (default)
 Metadata:
 handler_name : SoundHandler
 vendor_id : [0][0][0][0]
Input #1, mov,mp4,m4a,3gp,3g2,mj2, from 'b-.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 encoder : Lavf59.27.100
 Duration: 00:00:05.29, start: 0.030000, bitrate: 325 kb/s
 Stream #1:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 366x132, 196 kb/s, 25.15 fps, 50 tbr, 90k tbn (default)
 Metadata:
 handler_name : VideoHandler
 vendor_id : [0][0][0][0]
 Stream #1:1[0x2](und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 139 kb/s (default)
 Metadata:
 handler_name : SoundHandler
 vendor_id : [0][0][0][0]
Stream mapping:
 Stream #0:0 (h264) -> setpts:default
 Stream #0:1 (mp3float) -> amerge
 Stream #1:0 (h264) -> scale:default
 Stream #1:1 (aac) -> adelay:default
 overlay:default -> Stream #0:0 (libx264)
 amerge:default -> Stream #0:1 (aac)
Press [q] to stop, [?] for help
[Parsed_amerge_5 @ 000002a6613bf100] No channel layout for input 1
[vost#0:0/libx264 @ 000002a6612936c0] No information about the input framerate is available. Falling back to a default value of 25fps. Use the -r option if you want a different framerate.
[libx264 @ 000002a6612b8f40] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 000002a6612b8f40] profile High, level 3.0, 4:2:0, 8-bit
[libx264 @ 000002a6612b8f40] 264 - core 164 r3191 4613ac3 - H.264/MPEG-4 AVC codec - Copyleft 2003-2024 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=15 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=18.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'final_video6.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 encoder : Lavf61.1.100
 Stream #0:0: Video: h264 (avc1 / 0x31637661), yuv420p(progressive), 854x480, q=2-31, 25 fps, 12800 tbn
 Metadata:
 encoder : Lavc61.3.100 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
 Stream #0:1: Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s
 Metadata:
 encoder : Lavc61.3.100 aac
[out#0/mp4 @ 000002a66105e640] video:93KiB audio:78KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: 3.974328%
frame= 236 fps=0.0 q=-1.0 Lsize= 178KiB time=00:00:04.97 bitrate= 292.5kbits/s dup=0 drop=46 speed=16.9x
[libx264 @ 000002a6612b8f40] frame I:2 Avg QP: 4.89 size: 4803
[libx264 @ 000002a6612b8f40] frame P:63 Avg QP:14.43 size: 854
[libx264 @ 000002a6612b8f40] frame B:171 Avg QP:12.25 size: 181
[libx264 @ 000002a6612b8f40] consecutive B-frames: 2.1% 1.7% 6.4% 89.8%
[libx264 @ 000002a6612b8f40] mb I I16..4: 86.1% 10.0% 3.8%
[libx264 @ 000002a6612b8f40] mb P I16..4: 0.3% 0.1% 0.3% P16..4: 1.4% 0.6% 0.1% 0.0% 0.0% skip:97.1%
[libx264 @ 000002a6612b8f40] mb B I16..4: 0.0% 0.0% 0.1% B16..8: 1.4% 0.3% 0.0% direct: 0.0% skip:98.1% L0:52.4% L1:39.0% BI: 8.6%
[libx264 @ 000002a6612b8f40] 8x8 transform intra:11.2% inter:13.3%
[libx264 @ 000002a6612b8f40] coded y,uvDC,uvAC intra: 11.2% 12.4% 12.0% inter: 0.2% 0.2% 0.1%
[libx264 @ 000002a6612b8f40] i16 v,h,dc,p: 90% 6% 4% 0%
[libx264 @ 000002a6612b8f40] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 28% 3% 67% 0% 0% 0% 0% 0% 1%
[libx264 @ 000002a6612b8f40] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 26% 35% 15% 3% 4% 4% 5% 3% 4%
[libx264 @ 000002a6612b8f40] i8c dc,h,v,p: 86% 10% 3% 0%
[libx264 @ 000002a6612b8f40] Weighted P-Frames: Y:1.6% UV:1.6%
[libx264 @ 000002a6612b8f40] ref P L0: 69.2% 3.3% 18.7% 8.8%
[libx264 @ 000002a6612b8f40] ref B L0: 64.6% 31.0% 4.3%
[libx264 @ 000002a6612b8f40] ref B L1: 94.4% 5.6%
[libx264 @ 000002a6612b8f40] kb/s:80.06
[aac @ 000002a6612d0fc0] Qavg: 498.856
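
As mentioned above, here is a sketch of one variant worth comparing: the video part of the filtergraph is unchanged, but the two audio streams are first normalised to the same sample rate and channel layout and then combined with amix (padding to the longest input) instead of amerge, and the no-op adelay is dropped. Note that amix lowers each input's volume by default (see its normalize option). File names and values are the ones from the original command; this is an experiment to narrow the problem down, not a confirmed fix.

ffmpeg -i video.mp4 -i b-.mp4 -filter_complex "[0:v]setpts=PTS-STARTPTS[b1];[1:v]scale=300:-1,setpts=PTS-STARTPTS+0.0/TB[top];[b1][top]overlay=x=50:y=50:enable='between(t\,0.0,5)'[outv];[0:a]aformat=sample_rates=44100:channel_layouts=stereo[a0];[1:a]aformat=sample_rates=44100:channel_layouts=stereo[a1];[a0][a1]amix=inputs=2:duration=longest[outa]" -map "[outv]" -map "[outa]" -pix_fmt yuv420p -c:a aac -ac 2 -c:v libx264 -crf 18 final_video_amix.mp4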