
Other articles (69)
-
Improving the basic version
13 September 2013
Nice multiple selection
The Chosen plugin improves the usability of multiple-select fields. See the two images below for a comparison.
All you need to do is enable the Chosen plugin (General site configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-select lists (...)
-
Authorizations overridden by plugins
27 April 2010
Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page
-
Customizing categories
21 June 2013
Category creation form
For those who know SPIP well, a category can be likened to a rubrique (section).
For a document of type category, the fields offered by default are: Text
This form can be modified in:
Administration > Configuration of form masks.
For a document of type media, the fields not displayed by default are: Short description
It is also in this configuration section that you can specify the (...)
On other sites (15798)
-
ffmpeg version 2.6.8: Stream specifier ':a' in filtergraph description matches no streams
16 December 2020, by Ashitaka
I don't understand why this isn't working. I have tried selecting the video stream with [0:v]/[0:1]/[0:v:0] and the audio stream with [0:a]/[0:0]/[0:0:0].
Nothing worked.
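To see what a specifier such as [0:a] or [2:a] can actually match, one diagnostic (not part of the command itself) is to list the streams of each input with ffprobe; -show_entries restricts the output to the stream index and codec type, for example:

ffprobe -v error -show_entries stream=index,codec_type '/home/vidinflux/public_html/assets/user/736/video/Lines1.mp4'

A :a specifier can only match an input for which this reports a stream with codec_type=audio.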


Explaining the inputs:


1. The 1st input stream is a video that can be of varying resolution, on which the filter adds padding to make it 600x480.


2. The 2nd input is an overlay PNG that is already at a 5:4 ratio; it is just scaled to 600x480 before being overlaid in the filter.


3. The 3rd and 4th inputs are also videos; I don't care if they get stretched, and they are stretched to 600x480.


4. So finally there are 3 segments, 1 overlaid video and 2 stretched videos, which need to be concatenated.


Here's the command:


ffmpeg 
-i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612.mp4' 
-i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_overlay.png' 
-i '/home/vidinflux/public_html/assets/user/736/video/Lines1.mp4' 
-i '/home/vidinflux/public_html/assets/user/736/video/Lines11.mp4' 
-filter_complex
"[0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a]" 
-map "[v]" 
-map "[a]" 
-c:v libx264 
-shortest /home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_final.mp4



This is the complete error I am getting:


Stream specifier ':a' in filtergraph description [0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a] matches no streams.



There are also these warnings:


[Parsed_setsar_9 @ 0x219fba0] num:den syntax is deprecated, please use num/den or named options instead
[Parsed_setsar_11 @ 0x21a4840] num:den syntax is deprecated, please use num/den or named options instead



Complete log, as requested:


[root@cloud ~]# ffmpeg -i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612.mp4' -i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_overlay.png' -i '/home/vidinflux/public_html/assets/user/736/video/Lines1.mp4' -i '/home/vidinflux/public_html/assets/user/736/video/Lines11.mp4' -filter_complex \ "[0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a]" -map "[v]" -map "[a]" -c:v libx264 -shortest /home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_final.mp4
ffmpeg version 2.6.8 Copyright (c) 2000-2016 the FFmpeg developers
 built with gcc 4.4.7 (GCC) 20120313 (Red Hat 4.4.7-16)
 configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --optflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic' --enable-bzlib --disable-crystalhd --enable-gnutls --enable-ladspa --enable-libass --enable-libdc1394 --enable-libfaac --enable-nonfree --disable-indev=jack --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-openal --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libv4l2 --enable-libx264 --enable-libx265 --enable-libxvid --enable-x11grab --enable-avfilter --enable-avresample --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-runtime-cpudetect
 libavutil 54. 20.100 / 54. 20.100
 libavcodec 56. 26.100 / 56. 26.100
 libavformat 56. 25.101 / 56. 25.101
 libavdevice 56. 4.100 / 56. 4.100
 libavfilter 5. 11.102 / 5. 11.102
 libavresample 2. 1. 0 / 2. 1. 0
 libswscale 3. 1.101 / 3. 1.101
 libswresample 1. 1.100 / 1. 1.100
 libpostproc 53. 3.100 / 53. 3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 0
 compatible_brands: isommp42
 creation_time : 2017-08-21 02:23:24
 Duration: 00:02:17.23, start: 0.000000, bitrate: 417 kb/s
 Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p(tv, bt709), 640x360 [SAR 1:1 DAR 16:9], 318 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
 Metadata:
 handler_name : VideoHandler
 Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 96 kb/s (default)
 Metadata:
 creation_time : 2017-08-21 02:23:24
 handler_name : IsoMedia File Produced by Google, 5-11-2011
Input #1, png_pipe, from '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_overlay.png':
 Duration: N/A, bitrate: N/A
 Stream #1:0: Video: png, rgba, 600x479, 25 tbr, 25 tbn, 25 tbc
Input #2, mov,mp4,m4a,3gp,3g2,mj2, from '/home/vidinflux/public_html/assets/user/736/video/Lines1.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 1
 compatible_brands: mp41mp42isom
 creation_time : 2018-01-31 22:40:09
 Duration: 00:00:04.90, start: 0.103811, bitrate: 846 kb/s
 Stream #2:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt470bg/bt709), 1920x1080, 827 kb/s, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 90k tbn, 180k tbc (default)
 Metadata:
 creation_time : 2018-01-31 22:40:10
 handler_name : Core Media Video
Input #3, mov,mp4,m4a,3gp,3g2,mj2, from '/home/vidinflux/public_html/assets/user/736/video/Lines11.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 1
 compatible_brands: mp41mp42isom
 creation_time : 2018-01-31 22:40:09
 Duration: 00:00:04.90, start: 0.103811, bitrate: 846 kb/s
 Stream #3:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt470bg/bt709), 1920x1080, 827 kb/s, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 90k tbn, 180k tbc (default)
 Metadata:
 creation_time : 2018-01-31 22:40:10
 handler_name : Core Media Video
[Parsed_setsar_9 @ 0x17c8ba0] num:den syntax is deprecated, please use num/den or named options instead
[Parsed_setsar_11 @ 0x17cd840] num:den syntax is deprecated, please use num/den or named options instead
Stream specifier ':a' in filtergraph description [0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a] matches no streams.
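Rereading the log, I notice that Input #2 and Input #3 each expose only a single video stream (Stream #2:0 and Stream #3:0) and no audio stream at all, so presumably the labels [2:a] and [3:a] used by the concat part of the graph have nothing to match. If that is indeed the cause, one workaround I am considering (untested, just a sketch; the "..." stands for the unchanged trim/scale/pad/overlay chains of the original graph, and [sil0]/[sil1] are labels I made up) would be to synthesise silent audio for those two segments with the anullsrc source, trimmed to roughly the clip length reported in the log (about 4.9 s) and using 44100 Hz to match the audio of Input #0:

-filter_complex "... ; \
 anullsrc=channel_layout=stereo:sample_rate=44100,atrim=0:4.9,asetpts=PTS-STARTPTS[sil0]; \
 anullsrc=channel_layout=stereo:sample_rate=44100,atrim=0:4.9,asetpts=PTS-STARTPTS[sil1]; \
 [x0][sil0][z][a0][x1][sil1]concat=n=3:v=1:a=1[v][a]"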



-
ffmpeg UDP stream error (subtitles)
18 February 2017, by Peca
I have an HTTP stream which I would like to convert to UDP:
http://192.168.1.44:8001/1:0:1:1F8:1B:2C0:E080000:0:0:0:
The video, audio and subtitles work perfectly if I open this stream in VLC on Ubuntu.
So far, so good. Here is the output of ffprobe:
ffprobe -i http://192.168.1.44:8001/1:0:1:1F8:1B:2C0:E080000:0:0:0:
ffprobe version git-2017-01-22-f1214ad Copyright (c) 2007-2017 the FFmpeg developers
built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --mandir=/usr/share/man --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libfreetype --enable-gnutls --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvidstab --enable-libwavpack --enable-nvenc
libavutil 55. 44.100 / 55. 44.100
libavcodec 57. 75.100 / 57. 75.100
libavformat 57. 63.100 / 57. 63.100
libavdevice 57. 2.100 / 57. 2.100
libavfilter 6. 69.100 / 6. 69.100
libavresample 3. 2. 0 / 3. 2. 0
libswscale 4. 3.101 / 4. 3.101
libswresample 2. 4.100 / 2. 4.100
libpostproc 54. 2.100 / 54. 2.100
[mpeg2video @ 0xa56fde0] Invalid frame dimensions 0x0.
Last message repeated 2 times
Input #0, mpegts, from 'http://192.168.1.44:8001/1:0:1:1F8:1B:2C0:E080000:0:0:0:':
Duration: N/A, start: 35782.514200, bitrate: N/A
Program 501
Program 502
Program 503
Program 504
Stream #0:0[0x13b1]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv, top first), 720x576 [SAR 64:45 DAR 16:9], 25 fps, 25 tbr, 90k tbn, 50 tbc
Stream #0:1[0x13b2]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 256 kb/s
Stream #0:2[0x1541](srp): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream #0:3[0x1542](slv): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream #0:4[0x1543](hrv): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream #0:5[0x1544](cze): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream #0:6[0x1545](hun): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream #0:7[0x1546](ron): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream #0:8[0x1547](alb): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream #0:9[0x1548](bul): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream #0:10[0x13b9](eng): Subtitle: dvb_teletext ([6][0][0][0] / 0x0006)
Program 505
Program 506
Program 507
Program 508
Program 509
Program 510
Program 511
Program 515
Program 516
Program 517
Program 518
Program 519
Program 520
Program 521
Unsupported codec with id 94215 for input stream 10

So I decided to use FFmpeg to stream to UDP and filter out the unwanted subtitles.
Here is the FFmpeg command:
ffmpeg -reconnect 1 -reconnect_at_eof 1 -reconnect_streamed 1 -reconnect_delay_max 2048 \
-i "http://192.168.1.44:8001/1:0:1:1F8:1B:2C0:E080000:0:0:0:" \
-map 0:0 -vcodec copy \
-map 0:1 -acodec copy \
-map 0:2 -map 0:6 -scodec copy \
-f mpegts udp://239.0.10.3:40000?pkt_size=1316

And the output is:
Output #0, mpegts, to 'udp://239.0.10.3:40000?pkt_size=1316':
Metadata:
encoder : Lavf57.63.100
Stream #0:0: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv, top first), 720x576 [SAR 64:45 DAR 16:9], q=2-31, 25 fps, 25 tbr, 90k tbn, 90k tbc
Stream #0:1: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 256 kb/s
Stream #0:2(srp): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream #0:3(hun): Subtitle: dvb_subtitle ([6][0][0][0] / 0x0006)
Stream mapping:
Stream #0:0 -> #0:0 (copy)
Stream #0:1 -> #0:1 (copy)
Stream #0:2 -> #0:2 (copy)
Stream #0:6 -> #0:3 (copy)

The resulting stream is a disaster: the audio is choppy, the video falls apart, etc.
If I try the same thing without the subtitles (maps 2 and 6), the video and audio are crystal clear and the stream works well.
But I need these two subtitle streams.
To narrow down the problem, I tried capturing the incoming stream into a file:
ffmpeg -reconnect 1 -reconnect_at_eof 1 -reconnect_streamed 1 -reconnect_delay_max 2048 \
-i "http://192.168.1.44:8001/1:0:1:1F8:1B:2C0:E080000:0:0:0:" \
-map 0:0 -map 0:1 -map 0:2 -map 0:6 \
-codec copy \
-y -f mpegts /tmp/tst.ts

The resulting file is playable: sound OK, video OK, subtitles OK.
It looks like the problem is with the output to UDP.
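One way I can think of to isolate the UDP muxing from the live HTTP capture (just a diagnostic sketch, reusing the same multicast address and pkt_size) is to re-stream the captured file to UDP with -re, so it is read at its native rate, and check whether it falls apart in the same way:

ffmpeg -re -i /tmp/tst.ts -map 0 -codec copy -f mpegts "udp://239.0.10.3:40000?pkt_size=1316"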
Any solution for this?
-
FFmpeg - feed raw frames via pipe - FFmpeg does not detect pipe closure
8 September 2018, by Rumble
I am trying to follow these examples from C++ on Windows: Python example, C# example.
I have an application that produces raw frames that are to be encoded with FFmpeg. The raw frames are transferred via an IPC pipe to FFmpeg's stdin. That part works as expected; FFmpeg even displays the number of frames received so far.
The problem occurs when we are done sending frames. When I close the write end of the pipe, I would expect FFmpeg to detect that, finish up and output the video. But that does not happen: FFmpeg stays open and seems to wait for more data.
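The only lead I have so far (my own assumption, nothing the linked examples mention) is handle inheritance: the pipe handles are created inheritable, so the spawned FFmpeg process may get its own copy of the write end, and closing only my copy would then never deliver EOF on its stdin. A minimal sketch of what I mean, using SetHandleInformation to make the write end non-inheritable before CreateProcessA, and WaitForSingleObject to let FFmpeg finish afterwards (PipeWriteEnd and ProcInfo are the variables from the test code below):

// Before CreateProcessA: keep the write end private to this process, so that
// closing it later closes the last remaining handle and FFmpeg sees EOF on its stdin.
SetHandleInformation(PipeWriteEnd, HANDLE_FLAG_INHERIT, 0);

// After the frame loop:
CloseHandle(PipeWriteEnd);                          // signals EOF to FFmpeg
WaitForSingleObject(ProcInfo.hProcess, INFINITE);   // wait for FFmpeg to finish writing the file
CloseHandle(ProcInfo.hProcess);
CloseHandle(ProcInfo.hThread);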
I made a small test project in Visual Studio.
#include "stdafx.h"
//// stdafx.h
//#include "targetver.h"
//#include
//#include
//#include <iostream>
#include "Windows.h"
#include <cstdlib>
using namespace std;
bool WritePipe(void* WritePipe, const UINT8 *const Buffer, const UINT32 Length)
{
if (WritePipe == nullptr || Buffer == nullptr || Length == 0)
{
cout << __FUNCTION__ << ": Some input is useless";
return false;
}
// Write to pipe
UINT32 BytesWritten = 0;
UINT8 newline = '\n';
bool bIsWritten = WriteFile(WritePipe, Buffer, Length, (::DWORD*)&BytesWritten, nullptr);
cout << __FUNCTION__ << " Bytes written to pipe " << BytesWritten << endl;
//bIsWritten = WriteFile(WritePipe, &newline, 1, (::DWORD*)&BytesWritten, nullptr); // Do we need this? Actually this should destroy the image.
FlushFileBuffers(WritePipe); // Do we need this?
return bIsWritten;
}
#define PIXEL 80 // must be multiple of 8. Otherwise we get warning: Bytes are not aligned
int main()
{
HANDLE PipeWriteEnd = nullptr;
HANDLE PipeReadEnd = nullptr;
{
// create us a pipe for inter process communication
SECURITY_ATTRIBUTES Attr = { sizeof(SECURITY_ATTRIBUTES), NULL, true };
if (!CreatePipe(&PipeReadEnd, &PipeWriteEnd, &Attr, 0))
{
cout << "Could not create pipes" << ::GetLastError() << endl;
system("Pause");
return 0;
}
}
// Setup the variables needed for CreateProcess
// initialize process attributes
SECURITY_ATTRIBUTES Attr;
Attr.nLength = sizeof(SECURITY_ATTRIBUTES);
Attr.lpSecurityDescriptor = NULL;
Attr.bInheritHandle = true;
// initialize process creation flags
UINT32 CreateFlags = NORMAL_PRIORITY_CLASS;
CreateFlags |= CREATE_NEW_CONSOLE;
// initialize window flags
UINT32 dwFlags = 0;
UINT16 ShowWindowFlags = SW_HIDE;
if (PipeWriteEnd != nullptr || PipeReadEnd != nullptr)
{
dwFlags |= STARTF_USESTDHANDLES;
}
// initialize startup info
STARTUPINFOA StartupInfo = {
sizeof(STARTUPINFO),
NULL, NULL, NULL,
(::DWORD)CW_USEDEFAULT,
(::DWORD)CW_USEDEFAULT,
(::DWORD)CW_USEDEFAULT,
(::DWORD)CW_USEDEFAULT,
(::DWORD)0, (::DWORD)0, (::DWORD)0,
(::DWORD)dwFlags,
ShowWindowFlags,
0, NULL,
HANDLE(PipeReadEnd),
HANDLE(nullptr),
HANDLE(nullptr)
};
char ffmpegURL[] = "\"PATHTOFFMPEGEXE\" -y -loglevel verbose -f rawvideo -vcodec rawvideo -framerate 1 -video_size 80x80 -pixel_format rgb24 -i - -vcodec mjpeg -framerate 1/4 -an \"OUTPUTDIRECTORY\""; // CreateProcessA may modify the command line, so use a writable buffer rather than a string literal
// Finally create the process
PROCESS_INFORMATION ProcInfo;
if (!CreateProcessA(NULL, ffmpegURL, &Attr, &Attr, true, (::DWORD)CreateFlags, NULL, NULL, &StartupInfo, &ProcInfo))
{
cout << "CreateProcess failed " << ::GetLastError() << endl;
}
//CloseHandle(ProcInfo.hThread);
// Create images and write to pipe
#define MYARRAYSIZE (PIXEL*PIXEL*3) // each pixel has 3 bytes
UINT8* Bitmap = new UINT8[MYARRAYSIZE];
for (INT32 outerLoopIndex = 9; outerLoopIndex >= 0; --outerLoopIndex) // frame loop
{
for (INT32 innerLoopIndex = MYARRAYSIZE - 1; innerLoopIndex >= 0; --innerLoopIndex) // create the pixels for each frame
{
Bitmap[innerLoopIndex] = (UINT8)(outerLoopIndex * 20); // some gray color
}
system("pause");
if (!WritePipe(PipeWriteEnd, Bitmap, MYARRAYSIZE))
{
cout << "Failed writing to pipe" << endl;
}
}
// Done sending images. Tell the other process. IS THIS NEEDED? HOW TO TELL FFmpeg WE ARE DONE?
//UINT8 endOfFile = 0xFF; // EOF = -1 == 1111 1111 for uint8
//if (!WritePipe(PipeWriteEnd, &endOfFile, 1))
//{
// cout << "Failed writing to pipe" << endl;
//}
//FlushFileBuffers(PipeReadEnd); // Do we need this?
delete[] Bitmap; // Bitmap was allocated with new[], so it must be released with delete[]
system("pause");
// clean stuff up
FlushFileBuffers(PipeWriteEnd); // Do we need this?
if (PipeWriteEnd != NULL && PipeWriteEnd != INVALID_HANDLE_VALUE)
{
CloseHandle(PipeWriteEnd);
}
// We do not want to destroy the read end of the pipe? Should not as that belongs to FFmpeg
//if (PipeReadEnd != NULL && PipeReadEnd != INVALID_HANDLE_VALUE)
//{
// ::CloseHandle(PipeReadEnd);
//}
return 0;
}
And here is the output of FFmpeg:
ffmpeg version 3.4.1 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 7.2.0 (GCC)
configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
libavutil 55. 78.100 / 55. 78.100
libavcodec 57.107.100 / 57.107.100
libavformat 57. 83.100 / 57. 83.100
libavdevice 57. 10.100 / 57. 10.100
libavfilter 6.107.100 / 6.107.100
libswscale 4. 8.100 / 4. 8.100
libswresample 2. 9.100 / 2. 9.100
libpostproc 54. 7.100 / 54. 7.100
[rawvideo @ 00000221ff992120] max_analyze_duration 5000000 reached at 5000000 microseconds st:0
Input #0, rawvideo, from 'pipe:':
Duration: N/A, start: 0.000000, bitrate: 153 kb/s
Stream #0:0: Video: rawvideo, 1 reference frame (RGB[24] / 0x18424752), rgb24, 80x80, 153 kb/s, 1 fps, 1 tbr, 1 tbn, 1 tbc
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
[graph 0 input from stream 0:0 @ 00000221ff999c20] w:80 h:80 pixfmt:rgb24 tb:1/1 fr:1/1 sar:0/1 sws_param:flags=2
[auto_scaler_0 @ 00000221ffa071a0] w:iw h:ih flags:'bicubic' interl:0
[format @ 00000221ffa04e20] auto-inserting filter 'auto_scaler_0' between the filter 'Parsed_null_0' and the filter 'format'
[swscaler @ 00000221ffa0a780] deprecated pixel format used, make sure you did set range correctly
[auto_scaler_0 @ 00000221ffa071a0] w:80 h:80 fmt:rgb24 sar:0/1 -> w:80 h:80 fmt:yuvj444p sar:0/1 flags:0x4
Output #0, mp4, to 'c:/users/vr3/Documents/Guenni/sometest.mp4':
Metadata:
encoder : Lavf57.83.100
Stream #0:0: Video: mjpeg, 1 reference frame (mp4v / 0x7634706D), yuvj444p(pc), 80x80, q=2-31, 200 kb/s, 1 fps, 16384 tbn, 1 tbc
Metadata:
encoder : Lavc57.107.100 mjpeg
Side data:
cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 10 fps=6.3 q=1.6 size= 0kB time=00:00:09.00 bitrate= 0.0kbits/s speed=5.63x

As you can see in the last line of the FFmpeg output, the images got through: 10 frames are available. But after I close the pipe, FFmpeg does not exit; it is still expecting input.
As the linked examples show, this should be a valid method.
I have been trying for a week now...