
Media (3)
-
Example of action buttons for a collaborative collection
27 February 2013
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013
Updated: February 2013
Language: English
Type: Image
-
Collections - quick creation form
19 February 2013
Updated: February 2013
Language: French
Type: Image
Other articles (34)
-
Request to create a channel
12 March 2010
Depending on how the platform is configured, the user may have two different ways of requesting the creation of a channel: the first at the moment of registration, the second after registration, by filling in a request form.
Both approaches ask for the same information and work in much the same way: the prospective user fills in a series of form fields that first of all give the administrators information about (...)
-
Adding user-specific information and other changes to author-related behavior
12 April 2011
The simplest way to add information to authors is to install the Inscription3 plugin. It also makes it possible to change certain user-related behaviors (see its documentation for more information).
It is also possible to add fields to authors by installing the "champs extras 2" and "Interface pour champs extras" plugins.
-
Use, discuss, criticize
13 April 2011
Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
On other sites (6961)
-
FFmpeg image to video conversion error: [image2] Opening file for reading
14 February 2018, by user1690179
I am running ffmpeg on an AWS Lambda instance. The Lambda function takes an input image and transcodes it into a video segment using ffmpeg:
ffmpeg -loop 1 -i /tmp/photo-SNRUR7ZS13.jpg -c:v libx264 -t 7.00 -pix_fmt yuv420p -vf scale=1280x720 /tmp/output.mp4
I am seeing inconsistent behavior: sometimes the output video is shorter than the specified duration. It happens to seemingly random images, and the exact same image sometimes renders correctly and sometimes is cut short.
This behavior only happens on Lambda. I am not able to replicate it on my local computer or on a dedicated EC2 instance running the same environment as Lambda.
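For reference, one way to check the duration each run actually produced is to query the output with ffprobe (this is just an illustrative command, not part of the original workflow):
ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 /tmp/output.mp4
A correct render of the command above should report a duration close to 7 seconds; the broken runs report noticeably less.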
I noticed that when the output video is short, the ffmpeg log is different. The main difference is the repeated
[image2 @ 0x4b11140] Opening '/tmp/photo-2HD2Z3UN3W.jpg' for reading
lines. See the ffmpeg logs below.

Normal execution with the correct output video length:
ffmpeg -loop 1 -i /tmp/photo-SNRUR7ZS13.jpg -c:v libx264 -t 7.00 -pix_fmt yuv420p -vf scale=1280x720 /tmp/video-TMB6RNO0EE.mp4
ffmpeg version N-89773-g7fcbebbeaf-static https://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 6.4.0 (Debian 6.4.0-11) 20171206
configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc-6 --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gray --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-librtmp --enable-libsoxr --enable-libspeex --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libzimg
libavutil 56. 7.100 / 56. 7.100
libavcodec 58. 9.100 / 58. 9.100
libavformat 58. 3.100 / 58. 3.100
libavdevice 58. 0.100 / 58. 0.100
libavfilter 7. 11.101 / 7. 11.101
libswscale 5. 0.101 / 5. 0.101
libswresample 3. 0.101 / 3. 0.101
libpostproc 55. 0.100 / 55. 0.100
Input #0, image2, from '/tmp/photo-SNRUR7ZS13.jpg':
Duration: 00:00:00.04, start: 0.000000, bitrate: 18703 kb/s
Stream #0:0: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 687x860 [SAR 200:200 DAR 687:860], 25 fps, 25 tbr, 25 tbn, 25 tbc
Stream mapping:
Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[swscaler @ 0x5837900] deprecated pixel format used, make sure you did set range correctly
[libx264 @ 0x51c2340] using SAR=1477/3287
[libx264 @ 0x51c2340] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 0x51c2340] profile High, level 3.1
[libx264 @ 0x51c2340] 264 - core 155 r61 b00bcaf - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=3 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to '/tmp/video-TMB6RNO0EE.mp4':
Metadata:
encoder : Lavf58.3.100
Stream #0:0: Video: h264 (libx264) (avc1 / 0x31637661), yuv420p, 1280x720 [SAR 6183:13760 DAR 687:860], q=-1--1, 25 fps, 12800 tbn, 25 tbc
Metadata:
encoder : Lavc58.9.100 libx264
Side data:
cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
frame= 49 fps=0.0 q=28.0 size= 0kB time=-00:00:00.03 bitrate=N/A speed=N/A
frame= 69 fps= 66 q=28.0 size= 0kB time=00:00:00.76 bitrate= 0.5kbits/s speed=0.728x
frame= 89 fps= 57 q=28.0 size= 0kB time=00:00:01.56 bitrate= 0.2kbits/s speed=0.998x
frame= 109 fps= 53 q=28.0 size= 0kB time=00:00:02.36 bitrate= 0.2kbits/s speed=1.14x
frame= 129 fps= 50 q=28.0 size= 0kB time=00:00:03.16 bitrate= 0.1kbits/s speed=1.22x
frame= 148 fps= 48 q=28.0 size= 0kB time=00:00:03.92 bitrate= 0.1kbits/s speed=1.27x
frame= 168 fps= 47 q=28.0 size= 0kB time=00:00:04.72 bitrate= 0.1kbits/s speed=1.31x
No more output streams to write to, finishing.
frame= 175 fps= 39 q=-1.0 Lsize= 94kB time=00:00:06.88 bitrate= 112.2kbits/s speed=1.54x
video:91kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 3.161261%
Input file #0 (/tmp/photo-SNRUR7ZS13.jpg):
Input stream #0:0 (video): 176 packets read (16459168 bytes); 176 frames decoded;
Total: 176 packets (16459168 bytes) demuxed
Output file #0 (/tmp/video-TMB6RNO0EE.mp4):
Output stream #0:0 (video): 175 frames encoded; 175 packets muxed (93507 bytes);
Total: 175 packets (93507 bytes) muxed
[libx264 @ 0x51c2340] frame I:1 Avg QP:14.33 size: 73084
[libx264 @ 0x51c2340] frame P:44 Avg QP:14.09 size: 302
[libx264 @ 0x51c2340] frame B:130 Avg QP:23.31 size: 50
[libx264 @ 0x51c2340] consecutive B-frames: 0.6% 1.1% 0.0% 98.3%
[libx264 @ 0x51c2340] mb I I16..4: 3.3% 84.5% 12.1%
[libx264 @ 0x51c2340] mb P I16..4: 0.0% 0.0% 0.0% P16..4: 3.2% 0.1% 0.0% 0.0% 0.0% skip:96.7%
[libx264 @ 0x51c2340] mb B I16..4: 0.0% 0.0% 0.0% B16..8: 0.4% 0.0% 0.0% direct: 0.0% skip:99.6% L0:31.2% L1:68.8% BI: 0.0%
[libx264 @ 0x51c2340] 8x8 transform intra:84.5% inter:98.8%
[libx264 @ 0x51c2340] coded y,uvDC,uvAC intra: 95.1% 63.9% 51.6% inter: 0.1% 0.6% 0.0%
[libx264 @ 0x51c2340] i16 v,h,dc,p: 26% 21% 4% 49%
[libx264 @ 0x51c2340] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 20% 27% 21% 3% 5% 6% 6% 4% 9%
[libx264 @ 0x51c2340] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 23% 36% 10% 4% 7% 5% 6% 2% 6%
[libx264 @ 0x51c2340] i8c dc,h,v,p: 51% 29% 16% 4%
[libx264 @ 0x51c2340] Weighted P-Frames: Y:0.0% UV:0.0%
[libx264 @ 0x51c2340] ref P L0: 96.5% 0.0% 3.3% 0.2%
[libx264 @ 0x51c2340] ref B L0: 42.4% 57.6%
[libx264 @ 0x51c2340] ref B L1: 97.0% 3.0%
[libx264 @ 0x51c2340] kb/s:106.08

Log from a short video:
ffmpeg -framerate 25 -y -loop 1 -i /tmp/photo-2HD2Z3UN3W.jpg -t 15.00 -filter_complex "[0:v]crop=h=ih:w='if(gt(a,16/9),ih*16/9,iw)':y=0:x='if(gt(a,16/9),(ow-iw)/2,0)'[tmp];[tmp]scale=-1:4000,crop=w=iw:h='min(iw*9/16,ih)':x=0:y='0.17*ih-((t/15.00)*min(0.17*ih,(ih-oh)/6))',trim=duration=15.00[tmp1];[tmp1]zoompan=z='if(lte(pzoom,1.0),1.15,max(1.0,pzoom-0.0005))':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=1,setsar=sar=1:1[animated];[animated]fade=out:st=12.00:d=3.00:c=#000000[animated]" -map "[animated]" -pix_fmt yuv420p -s 1280x720 -y /tmp/video-QB1JCDT021.mp4
ffmpeg version N-89773-g7fcbebbeaf-static https://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 6.4.0 (Debian 6.4.0-11) 20171206
configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc-6 --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gray --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-librtmp --enable-libsoxr --enable-libspeex --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libzimg
libavutil 56. 7.100 / 56. 7.100
libavcodec 58. 9.100 / 58. 9.100
libavformat 58. 3.100 / 58. 3.100
libavdevice 58. 0.100 / 58. 0.100
libavfilter 7. 11.101 / 7. 11.101
libswscale 5. 0.101 / 5. 0.101
libswresample 3. 0.101 / 3. 0.101
libpostproc 55. 0.100 / 55. 0.100
Input #0, image2, from '/tmp/photo-2HD2Z3UN3W.jpg':
Duration: 00:00:00.04, start: 0.000000, bitrate: 373617 kb/s
Stream #0:0: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 1936x2592 [SAR 72:72 DAR 121:162], 25 fps, 25 tbr, 25 tbn, 25 tbc
Stream mapping:
Stream #0:0 (mjpeg) -> crop
fade -> Stream #0:0 (libx264)
Press [q] to stop, [?] for help
[swscaler @ 0x4d63b40] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x4df7340] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x50e97c0] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x50e97c0] Warning: data is not aligned! This can lead to a speed loss
[libx264 @ 0x4b17480] using SAR=1/1
[libx264 @ 0x4b17480] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 0x4b17480] profile High, level 3.1
[libx264 @ 0x4b17480] 264 - core 155 r61 b00bcaf - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=3 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to '/tmp/video-QB1JCDT021.mp4':
Metadata:
encoder : Lavf58.3.100
Stream #0:0: Video: h264 (libx264) (avc1 / 0x31637661), yuv420p, 1280x720 [SAR 1:1 DAR 16:9], q=-1--1, 25 fps, 12800 tbn, 25 tbc
Metadata:
encoder : Lavc58.9.100 libx264
Side data:
cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
[swscaler @ 0x5bd0380] deprecated pixel format used, make sure you did set range correctly
debug=1
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[image2 @ 0x4b11140] Opening '/tmp/photo-2HD2Z3UN3W.jpg' for reading
[AVIOContext @ 0x4b6ecc0] Statistics: 1868086 bytes read, 0 seeks
[mjpeg @ 0x4b14940] marker=d8 avail_size_in_buf=1868084
[mjpeg @ 0x4b14940] marker parser used 0 bytes (0 bits)
[mjpeg @ 0x4b14940] marker=e0 avail_size_in_buf=1868082
[mjpeg @ 0x4b14940] marker parser used 16 bytes (128 bits)
[mjpeg @ 0x4b14940] marker=db avail_size_in_buf=1868064
[mjpeg @ 0x4b14940] index=0
[mjpeg @ 0x4b14940] qscale[0]: 0
[mjpeg @ 0x4b14940] marker parser used 67 bytes (536 bits)
[mjpeg @ 0x4b14940] marker=db avail_size_in_buf=1867995
[mjpeg @ 0x4b14940] index=1
[mjpeg @ 0x4b14940] qscale[1]: 1
[mjpeg @ 0x4b14940] marker parser used 67 bytes (536 bits)
[mjpeg @ 0x4b14940] marker=c0 avail_size_in_buf=1867926
[mjpeg @ 0x4b14940] sof0: picture: 1936x2592
[mjpeg @ 0x4b14940] component 0 1:1 id: 0 quant:0
[mjpeg @ 0x4b14940] component 1 1:1 id: 1 quant:1
[mjpeg @ 0x4b14940] component 2 1:1 id: 2 quant:1
[mjpeg @ 0x4b14940] pix fmt id 11111100
[mjpeg @ 0x4b14940] marker parser used 17 bytes (136 bits)
[mjpeg @ 0x4b14940] marker=c4 avail_size_in_buf=1867907
[mjpeg @ 0x4b14940] class=0 index=0 nb_codes=11
[mjpeg @ 0x4b14940] marker parser used 30 bytes (240 bits)
[mjpeg @ 0x4b14940] marker=c4 avail_size_in_buf=1867875
[mjpeg @ 0x4b14940] class=1 index=0 nb_codes=242
[mjpeg @ 0x4b14940] marker parser used 82 bytes (656 bits)
[mjpeg @ 0x4b14940] marker=c4 avail_size_in_buf=1867791
[mjpeg @ 0x4b14940] class=0 index=1 nb_codes=8
[mjpeg @ 0x4b14940] marker parser used 27 bytes (216 bits)
[mjpeg @ 0x4b14940] marker=c4 avail_size_in_buf=1867762
[mjpeg @ 0x4b14940] class=1 index=1 nb_codes=241
[mjpeg @ 0x4b14940] marker parser used 51 bytes (408 bits)
[mjpeg @ 0x4b14940] escaping removed 7149 bytes
[mjpeg @ 0x4b14940] marker=da avail_size_in_buf=1867709
[mjpeg @ 0x4b14940] component: 0
[mjpeg @ 0x4b14940] component: 1
[mjpeg @ 0x4b14940] component: 2
[mjpeg @ 0x4b14940] marker parser used 1860559 bytes (14884468 bits)
[mjpeg @ 0x4b14940] marker=d9 avail_size_in_buf=0
[mjpeg @ 0x4b14940] decode frame unused 0 bytes
[swscaler @ 0x5bd42c0] deprecated pixel format used, make sure you did set range correctly
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[image2 @ 0x4b11140] Opening '/tmp/photo-2HD2Z3UN3W.jpg' for reading
[AVIOContext @ 0x4b6ecc0] Statistics: 1868086 bytes read, 0 seeks
[mjpeg @ 0x4b14940] marker=d8 avail_size_in_buf=1868084
[mjpeg @ 0x4b14940] marker parser used 0 bytes (0 bits)
[mjpeg @ 0x4b14940] marker=e0 avail_size_in_buf=1868082
[mjpeg @ 0x4b14940] marker parser used 16 bytes (128 bits)
[mjpeg @ 0x4b14940] marker=db avail_size_in_buf=1868064
[mjpeg @ 0x4b14940] index=0
[mjpeg @ 0x4b14940] qscale[0]: 0
[mjpeg @ 0x4b14940] marker parser used 67 bytes (536 bits)
[mjpeg @ 0x4b14940] marker=db avail_size_in_buf=1867995
[mjpeg @ 0x4b14940] index=1
[mjpeg @ 0x4b14940] qscale[1]: 1
[mjpeg @ 0x4b14940] marker parser used 67 bytes (536 bits)
[mjpeg @ 0x4b14940] marker=c0 avail_size_in_buf=1867926
[mjpeg @ 0x4b14940] sof0: picture: 1936x2592
[mjpeg @ 0x4b14940] component 0 1:1 id: 0 quant:0
[mjpeg @ 0x4b14940] component 1 1:1 id: 1 quant:1
[mjpeg @ 0x4b14940] component 2 1:1 id: 2 quant:1
[mjpeg @ 0x4b14940] pix fmt id 11111100
[mjpeg @ 0x4b14940] marker parser used 17 bytes (136 bits)
[mjpeg @ 0x4b14940] marker=c4 avail_size_in_buf=1867907
[mjpeg @ 0x4b14940] class=0 index=0 nb_codes=11
[mjpeg @ 0x4b14940] marker parser used 30 bytes (240 bits)
[mjpeg @ 0x4b14940] marker=c4 avail_size_in_buf=1867875
[mjpeg @ 0x4b14940] class=1 index=0 nb_codes=242
[mjpeg @ 0x4b14940] marker parser used 82 bytes (656 bits)
[mjpeg @ 0x4b14940] marker=c4 avail_size_in_buf=1867791
[mjpeg @ 0x4b14940] class=0 index=1 nb_codes=8
[mjpeg @ 0x4b14940] marker parser used 27 bytes (216 bits)
[mjpeg @ 0x4b14940] marker=c4 avail_size_in_buf=1867762
[mjpeg @ 0x4b14940] class=1 index=1 nb_codes=241
[mjpeg @ 0x4b14940] marker parser used 51 bytes (408 bits)
[mjpeg @ 0x4b14940] escaping removed 7149 bytes
[mjpeg @ 0x4b14940] marker=da avail_size_in_buf=1867709
[mjpeg @ 0x4b14940] component: 0
[mjpeg @ 0x4b14940] component: 1
[mjpeg @ 0x4b14940] component: 2
[mjpeg @ 0x4b14940] marker parser used 1860559 bytes (14884468 bits)
[mjpeg @ 0x4b14940] marker=d9 avail_size_in_buf=0
[mjpeg @ 0x4b14940] decode frame unused 0 bytes
[swscaler @ 0x5bd8200] deprecated pixel format used, make sure you did set range correctly
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
...
...
...

As requested, here is a link to the full log. In this log, ffmpeg renders only 323 out of 375 frames.
The
Opening '/tmp/photo-2HD2Z3UN3W.jpg'
segment repeats many times until ffmpeg finally renders a short video. Does anyone have insight into why it keeps reopening the image file? This must have something to do with the underlying issue.
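Note: in the debug log, each looped frame re-opens and re-decodes the input file, so the symptom appears to depend on how many loop iterations actually happen. A variant of the command that requests an exact number of output frames instead of a wall-clock duration can make the comparison more deterministic; this is only an illustrative sketch, not a confirmed fix (175 frames corresponds to 7 seconds at 25 fps):
ffmpeg -loop 1 -framerate 25 -i /tmp/photo-SNRUR7ZS13.jpg -c:v libx264 -frames:v 175 -pix_fmt yuv420p -vf scale=1280:720 /tmp/output.mp4
If this variant still produces short files on Lambda, the frame counts in the end-of-run summary ("packets read" / "frames encoded") give a concrete number to compare against the 175 requested.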
-
FFmpeg - feed raw frames via pipe - FFmpeg does not detect pipe closure
8 September 2018, by Rumble
I'm trying to follow these examples from C++ on Windows: Python Example, C# Example.
I have an application that produces raw frames that should be encoded with FFmpeg.
The raw frames are transferred via an IPC pipe to FFmpeg's STDIN. That works as expected; FFmpeg even displays the number of frames currently available.
The problem occurs when we are done sending frames. When I close the write end of the pipe, I would expect FFmpeg to detect that, finish up and output the video. But that does not happen: FFmpeg stays open and seems to wait for more data.
I made a small test project in Visual Studio.
#include "stdafx.h"
//// stdafx.h
//#include "targetver.h"
//#include
//#include
//#include <iostream>
#include "Windows.h"
#include <cstdlib>

using namespace std;

bool WritePipe(void* WritePipe, const UINT8 *const Buffer, const UINT32 Length)
{
    if (WritePipe == nullptr || Buffer == nullptr || Length == 0)
    {
        cout << __FUNCTION__ << ": Some input is useless";
        return false;
    }
    // Write to pipe
    UINT32 BytesWritten = 0;
    UINT8 newline = '\n';
    bool bIsWritten = WriteFile(WritePipe, Buffer, Length, (::DWORD*)&BytesWritten, nullptr);
    cout << __FUNCTION__ << " Bytes written to pipe " << BytesWritten << endl;
    //bIsWritten = WriteFile(WritePipe, &newline, 1, (::DWORD*)&BytesWritten, nullptr); // Do we need this? Actually this should destroy the image.
    FlushFileBuffers(WritePipe); // Do we need this?

    return bIsWritten;
}

#define PIXEL 80 // must be multiple of 8. Otherwise we get warning: Bytes are not aligned

int main()
{
    HANDLE PipeWriteEnd = nullptr;
    HANDLE PipeReadEnd = nullptr;
    {
        // create us a pipe for inter process communication
        SECURITY_ATTRIBUTES Attr = { sizeof(SECURITY_ATTRIBUTES), NULL, true };
        if (!CreatePipe(&PipeReadEnd, &PipeWriteEnd, &Attr, 0))
        {
            cout << "Could not create pipes" << ::GetLastError() << endl;
            system("Pause");
            return 0;
        }
    }

    // Setup the variables needed for CreateProcess
    // initialize process attributes
    SECURITY_ATTRIBUTES Attr;
    Attr.nLength = sizeof(SECURITY_ATTRIBUTES);
    Attr.lpSecurityDescriptor = NULL;
    Attr.bInheritHandle = true;

    // initialize process creation flags
    UINT32 CreateFlags = NORMAL_PRIORITY_CLASS;
    CreateFlags |= CREATE_NEW_CONSOLE;

    // initialize window flags
    UINT32 dwFlags = 0;
    UINT16 ShowWindowFlags = SW_HIDE;
    if (PipeWriteEnd != nullptr || PipeReadEnd != nullptr)
    {
        dwFlags |= STARTF_USESTDHANDLES;
    }

    // initialize startup info
    STARTUPINFOA StartupInfo = {
        sizeof(STARTUPINFO),
        NULL, NULL, NULL,
        (::DWORD)CW_USEDEFAULT,
        (::DWORD)CW_USEDEFAULT,
        (::DWORD)CW_USEDEFAULT,
        (::DWORD)CW_USEDEFAULT,
        (::DWORD)0, (::DWORD)0, (::DWORD)0,
        (::DWORD)dwFlags,
        ShowWindowFlags,
        0, NULL,
        HANDLE(PipeReadEnd),
        HANDLE(nullptr),
        HANDLE(nullptr)
    };

    LPSTR ffmpegURL = "\"PATHTOFFMPEGEXE\" -y -loglevel verbose -f rawvideo -vcodec rawvideo -framerate 1 -video_size 80x80 -pixel_format rgb24 -i - -vcodec mjpeg -framerate 1/4 -an \"OUTPUTDIRECTORY\"";

    // Finally create the process
    PROCESS_INFORMATION ProcInfo;
    if (!CreateProcessA(NULL, ffmpegURL, &Attr, &Attr, true, (::DWORD)CreateFlags, NULL, NULL, &StartupInfo, &ProcInfo))
    {
        cout << "CreateProcess failed " << ::GetLastError() << endl;
    }
    //CloseHandle(ProcInfo.hThread);

    // Create images and write to pipe
#define MYARRAYSIZE (PIXEL*PIXEL*3) // each pixel has 3 bytes

    UINT8* Bitmap = new UINT8[MYARRAYSIZE];

    for (INT32 outerLoopIndex = 9; outerLoopIndex >= 0; --outerLoopIndex) // frame loop
    {
        for (INT32 innerLoopIndex = MYARRAYSIZE - 1; innerLoopIndex >= 0; --innerLoopIndex) // create the pixels for each frame
        {
            Bitmap[innerLoopIndex] = (UINT8)(outerLoopIndex * 20); // some gray color
        }

        system("pause");

        if (!WritePipe(PipeWriteEnd, Bitmap, MYARRAYSIZE))
        {
            cout << "Failed writing to pipe" << endl;
        }
    }

    // Done sending images. Tell the other process. IS THIS NEEDED? HOW TO TELL FFmpeg WE ARE DONE?
    //UINT8 endOfFile = 0xFF; // EOF = -1 == 1111 1111 for uint8
    //if (!WritePipe(PipeWriteEnd, &endOfFile, 1))
    //{
    //    cout << "Failed writing to pipe" << endl;
    //}
    //FlushFileBuffers(PipeReadEnd); // Do we need this?

    delete Bitmap;

    system("pause");

    // clean stuff up
    FlushFileBuffers(PipeWriteEnd); // Do we need this?
    if (PipeWriteEnd != NULL && PipeWriteEnd != INVALID_HANDLE_VALUE)
    {
        CloseHandle(PipeWriteEnd);
    }

    // We do not want to destroy the read end of the pipe? Should not as that belongs to FFmpeg
    //if (PipeReadEnd != NULL && PipeReadEnd != INVALID_HANDLE_VALUE)
    //{
    //    ::CloseHandle(PipeReadEnd);
    //}

    return 0;
}
And here is the output of FFmpeg:
ffmpeg version 3.4.1 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 7.2.0 (GCC)
configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
libavutil 55. 78.100 / 55. 78.100
libavcodec 57.107.100 / 57.107.100
libavformat 57. 83.100 / 57. 83.100
libavdevice 57. 10.100 / 57. 10.100
libavfilter 6.107.100 / 6.107.100
libswscale 4. 8.100 / 4. 8.100
libswresample 2. 9.100 / 2. 9.100
libpostproc 54. 7.100 / 54. 7.100
[rawvideo @ 00000221ff992120] max_analyze_duration 5000000 reached at 5000000 microseconds st:0
Input #0, rawvideo, from 'pipe:':
Duration: N/A, start: 0.000000, bitrate: 153 kb/s
Stream #0:0: Video: rawvideo, 1 reference frame (RGB[24] / 0x18424752), rgb24, 80x80, 153 kb/s, 1 fps, 1 tbr, 1 tbn, 1 tbc
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
[graph 0 input from stream 0:0 @ 00000221ff999c20] w:80 h:80 pixfmt:rgb24 tb:1/1 fr:1/1 sar:0/1 sws_param:flags=2
[auto_scaler_0 @ 00000221ffa071a0] w:iw h:ih flags:'bicubic' interl:0
[format @ 00000221ffa04e20] auto-inserting filter 'auto_scaler_0' between the filter 'Parsed_null_0' and the filter 'format'
[swscaler @ 00000221ffa0a780] deprecated pixel format used, make sure you did set range correctly
[auto_scaler_0 @ 00000221ffa071a0] w:80 h:80 fmt:rgb24 sar:0/1 -> w:80 h:80 fmt:yuvj444p sar:0/1 flags:0x4
Output #0, mp4, to 'c:/users/vr3/Documents/Guenni/sometest.mp4':
Metadata:
encoder : Lavf57.83.100
Stream #0:0: Video: mjpeg, 1 reference frame (mp4v / 0x7634706D), yuvj444p(pc), 80x80, q=2-31, 200 kb/s, 1 fps, 16384 tbn, 1 tbc
Metadata:
encoder : Lavc57.107.100 mjpeg
Side data:
cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 10 fps=6.3 q=1.6 size= 0kB time=00:00:09.00 bitrate= 0.0kbits/s speed=5.63x

As you can see in the last line of the FFmpeg output, the images got through; 10 frames are available. But after closing the pipe, FFmpeg does not exit and still expects input.
As the linked examples show, this should be a valid method.
Trying for a week now...
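One detail in the code above that is easy to miss (an observation, not something confirmed in the post): CreatePipe is called with bInheritHandle = true and CreateProcessA is called with handle inheritance enabled, so the spawned ffmpeg process inherits a copy of the pipe's write end as well. An anonymous pipe only reports end-of-file to the reader once every write handle, including inherited copies, has been closed, which would explain ffmpeg waiting forever even after the parent closes its own handle. A minimal sketch of keeping the write end out of the child, reusing the variables from the code above and only standard Win32 calls:
    // Before CreateProcessA: only the read end (ffmpeg's stdin) should be inheritable.
    SetHandleInformation(PipeWriteEnd, HANDLE_FLAG_INHERIT, 0);

    // ... CreateProcessA(...) as in the code above, with PipeReadEnd as hStdInput ...

    // After the last frame has been written:
    CloseHandle(PipeWriteEnd);                          // last remaining write handle closes, so ffmpeg sees EOF on stdin
    WaitForSingleObject(ProcInfo.hProcess, INFINITE);   // wait for ffmpeg to finish writing the file
    CloseHandle(ProcInfo.hProcess);
    CloseHandle(ProcInfo.hThread);
Whether this is the cause here is an assumption; it is simply the most common reason a child process never sees EOF on an inherited anonymous pipe.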
-
Create stopmotion video with ffmpeg - error while rendering
30 May 2012, by Niklas
I'm currently creating a stop-motion video with the help of ffmpeg and some scripts I've made. However, the last clip I attempted to render contained a couple of frames that I had edited. The method I'm using has worked before, so I'm certain it has to do with GIMP changing something in the files. I work with .png images. This is the command and the output I get:
ffmpeg -sameq -f image2 -r 7 -i "$src_dir/frame-%06d.png" -r 25 "$dest_dir/$file_name.avi"
Output:
ffmpeg version 0.8.1-4:0.8.1-0ubuntu1, Copyright (c) 2000-2011 the Libav developers
built on Mar 22 2012 05:09:06 with gcc 4.6.3
This program is not developed anymore and is only provided for compatibility. Use avconv instead (see Changelog for the list of incompatible changes).
Input #0, image2, from '/mnt/storage/selected_frames/005-Middag-Animation/frame-%06d.png':
Duration: 00:00:02.85, start: 0.000000, bitrate: N/A
Stream #0.0: Video: png, bgra, 1280x720, 7 fps, 7 tbr, 7 tbn, 7 tbc
File '/mnt/storage/Videoklipp//005-Middag-Animation.avi' already exists. Overwrite ? [y/N] y
Incompatible pixel format 'bgra' for codec 'mpeg4', auto-selecting format 'yuv420p'
[buffer @ 0x1b88860] w:1280 h:720 pixfmt:bgra
[avsink @ 0x1b8a480] auto-inserting filter 'auto-inserted scaler 0' between the filter 'src' and the filter 'out'
[scale @ 0x1b8ab80] w:1280 h:720 fmt:bgra -> w:1280 h:720 fmt:yuv420p flags:0x4
Output #0, avi, to '/mnt/storage/Videoklipp//005-Middag-Animation.avi':
Metadata:
ISFT : Lavf53.21.0
Stream #0.0: Video: mpeg4, yuv420p, 1280x720, q=2-31, 2 kb/s, 25 tbn, 25 tbc
Stream mapping:
Stream #0.0 -> #0.0
Press ctrl-c to stop encoding
[buffer @ 0x1b88860] Changing frame properties on the fly is not supported.
Last message repeated 6 times
frame= 13 fps= 0 q=0.0 Lsize= 524kB time=2.20 bitrate=1951.2kbits/s
video:517kB audio:0kB global headers:0kB muxing overhead 1.323318%

"[buffer @ 0x1b88860] Changing frame properties on the fly is not supported"

What can I do to fix this? Any help is appreciated :)
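A diagnostic sketch (not from the original post): the "Changing frame properties on the fly is not supported" message suggests that the edited frames no longer have the same properties (size or pixel format) as the rest of the sequence, which is exactly what re-saving a PNG from GIMP can change. With a current ffprobe, the properties of individual frames can be compared; the frame number below is just an example:
ffprobe -v error -show_entries stream=width,height,pix_fmt -of csv=p=0 "$src_dir/frame-000001.png"
Frames that report a different pix_fmt (for example rgb24 instead of bgra/rgba, or a 16-bit format) or a different size would likely need to be re-exported to match the others before encoding.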