
Media (1)
-
Richard Stallman et le logiciel libre
19 October 2011
Updated: May 2013
Language: French
Type: Text
Other articles (105)
-
Improving the base version
13 September 2013, Nice multiple selection
The Chosen plugin improves the ergonomics of multiple-selection fields. See the following two images for a comparison.
To get it, simply activate the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen) by enabling the use of Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-selection lists (...)
-
Emballe médias: what is it for?
4 February 2011. This plugin aims to manage sites for publishing documents of all types.
It creates "media", namely: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a "media" article;
-
Custom menus
14 November 2010. MediaSPIP uses the Menus plugin to manage several configurable menus for navigation.
This lets channel administrators configure these menus in detail.
Menus created when the site is initialized
By default, three menus are created automatically when the site is initialized: The main menu; Identifier: barrenav; This menu is generally inserted at the top of the page, after the header block, and its identifier makes it compatible with skeletons based on Zpip; (...)
On other sites (9826)
-
ffmpeg pipe Invalid data found when processing input
28 March 2021, by Ankit Maheshwari
Here is my configuration.





const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path;
const spawn = require('child_process').spawn;

ffmpeg = spawn(ffmpegPath, [
 
// Remove this line, as well as `-shortest`, if you send audio from the browser.
 // '-f', 'lavfi', '-i', 'anullsrc',

 // FFmpeg will read input video from STDIN
 '-i', '-',

 // The -re flag means: read input at native frame rate.
 '-re', '-y',

 // thread_queue_size added to avoid err: Thread message queue blocking; consider raising the thread_queue_size option, required before each input - this is for image2
 // '-thread_queue_size', '2048',

 // REF TO OVERLAY 
 // https://stackoverflow.com/questions/10438713/overlay-animated-images-with-transparency-over-a-static-background-image-using-f?rq=1
 // -loop loops the background image input so that we don't just have one frame, crucial!

 // The image file muxer writes video frames to image files, http://underpop.online.fr/f/ffmpeg/help/image2-1.htm.gz
 '-f', 'image2',

 // The -loop option is specific to the image file demuxer and gif muxer, so it can't be used for typical video files, but it can be used to infinitely loop a series of input images.
 '-loop', '1',

 // pattern_type is used to determine the format of the images contained in the files.
 // Read images matching the "*.png" glob pattern, that is files terminating with the ".png" suffix
 '-pattern_type', 'glob',

 // '-i', `images/${streamConfigData.youtube_key}/destination/image-*.png`,

 '-i', `images/${streamConfigData.youtube_key}/overlay/abc.png`,

 // '-vf', 'scale=1920x1080:flags=lanczos',
 
 // -shortest ends encoding when the shortest input ends, which is necessary as looping the background means that that input will never end.
 // 'overlay=shortest=1',

 "-filter_complex", "[1:v]format=rgba,colorchannelmixer=aa=1[fg];[0][fg]overlay",
 
 // Because we're using a generated audio source which never ends,
 // specify that we'll stop at end of other input. Remove this line if you
 // send audio from the browser.
 // '-shortest',
 
 // If we're encoding H.264 in-browser, we can set the video codec to 'copy'
 // so that we don't waste any CPU and quality with unnecessary transcoding.
 // If the browser doesn't support H.264, set the video codec to 'libx264'
 // or similar to transcode it to H.264 here on the server.
 // '-vcodec', 'libx264',
 // it is not possible to filter and stream copy the same stream at the same time. https://stackoverflow.com/a/53526514/4057143
 '-vcodec', 'copy',
 
 // If the browser doesn't support AAC encoding, we must transcode the audio to AAC here on the server.
 // '-acodec', 'aac',

 // Use this rate control mode if you want to keep the best quality and care less about the file size. CRF scale is 0–51, where 0 is lossless, 23 is the default, and 51 is worst quality possible. A lower value generally leads to higher quality, https://trac.ffmpeg.org/wiki/Encode/H.264
 '-crf', '23',

 // A preset provides a certain encoding speed to compression ratio trade-off. A slower preset will provide better compression. medium is the default preset, https://trac.ffmpeg.org/wiki/Encode/H.264
 '-preset', 'ultrafast',

 // -r sets the frame rate. Use the fps filter when you need to change the framerate before applying further filters.
 // '-r', '30',
 // '-framerate', '30',

 //debug level logs
 '-loglevel', 'debug',
 '-v', 'verbose',

 // -g GOP_LEN_IN_FRAMES, -g sets the keyframe interval. https://superuser.com/a/908325
 '-g', '60',

 // video timescale, not sure what it is!
 '-video_track_timescale', '1000',

 // A live stream with a more or less constant bit rate, to be able to control the bandwidth used
 // (i.e. a live stream with a limited bit rate).
 '-b:v', '15000k',
 // '-maxrate', '4000k',
 // '-bufsize', '8000k',

 // FLV is the container format used in conjunction with RTMP
 '-f', 'flv',
 
 // The output RTMP URL.
 // For debugging, you could set this to a filename like 'test.flv', and play
 // the resulting file with VLC.
 rtmpUrl 
 ], {
 env: {
 NODE_ENV: 'production',
 PATH: process.env.PATH
 }
 });
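The configuration above only spawns the process; nothing in it shows what actually gets written to ffmpeg's stdin (the '-i -' input). A minimal sketch of that wiring is below, assuming the browser sends recorded chunks over a WebSocket; the ws server, the port, and the chunk format are illustrative assumptions, not part of the original setup.

// Hypothetical wiring (assumed, not from the original post): forward binary
// chunks received from the browser into ffmpeg's stdin, which feeds the
// '-i -' input above.
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 3000 });

wss.on('connection', (socket) => {
  socket.on('message', (chunk) => {
    // Each message is assumed to be a MediaRecorder blob (e.g. WebM/H.264).
    ffmpeg.stdin.write(chunk);
  });
  socket.on('close', () => {
    // Closing stdin lets ffmpeg flush and finish the RTMP stream.
    ffmpeg.stdin.end();
  });
});

ffmpeg.stderr.on('data', (data) => {
  // ffmpeg writes its logs to stderr; useful together with '-loglevel debug'.
  console.log(data.toString());
});

If the data written to stdin is empty or cannot be demuxed, ffmpeg will typically fail on that pipe input with an error like the "Invalid data found when processing input" mentioned in the title.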







-
Issues with encoded video in Chrome playback
4 November 2022, by Michael Joseph Aubry
Chrome is very particular about how a video is encoded. An issue I am facing with a specific type of video is that seeked either does not get called at all, or not within a 20-second window, for certain frames.

I can consistently repeat this issue using this muxer https://github.com/redbrain/ytdl-core-muxer/blob/main/index.js


When I paint each frame in Puppeteer on Lambda, there are certain frames where either seeked never gets called or the 20s timeout fires before the frame can resolve.
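To make the failure mode concrete, a stripped-down sketch of this kind of seek-and-wait step is shown below; the helper name, the promise wrapper, and the 20-second timeout value are illustrative assumptions, not the actual rendering code.

// Hypothetical sketch of the seek-and-paint step described above
// (runs in the page context, e.g. via page.evaluate in Puppeteer).
function seekTo(video, time, timeoutMs = 20000) {
  return new Promise((resolve, reject) => {
    const onSeeked = () => {
      clearTimeout(timer);
      resolve();
    };
    const timer = setTimeout(() => {
      // With the problematic encoding, this path is taken for some frames.
      video.removeEventListener('seeked', onSeeked);
      reject(new Error('seeked not fired within ' + timeoutMs + 'ms at t=' + time));
    }, timeoutMs);
    video.addEventListener('seeked', onSeeked, { once: true });
    video.currentTime = time;
  });
}

// Usage inside the paint loop:
// await seekTo(video, 117);
// ctx.drawImage(video, 0, 0, canvas.width, canvas.height);

With the re-encoded video every seek resolves; with the original encoding, some target times never fire seeked and the timeout path is taken instead.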

Here is an example of a log sequence for this particular frame that fails


When I re-encode this video, simply by running ffmpeg -i VIDEO_URL, and then try again with the new file, there are no issues.

In order to see the differences at the frame level I run
ffprobe -show_frames goodvideo.mp4 > good.txt && ffprobe -show_frames badvideo.mp4 > bad.txt
and here is what I see for the frames around the 117s mark, where the first sign of corruption occurs.

Good Frame @ 117s


[FRAME]
media_type=video
stream_index=0
key_frame=0
pts=7020013
pts_time=117.000217
pkt_dts=7020013
pkt_dts_time=117.000217
best_effort_timestamp=7020013
best_effort_timestamp_time=117.000217
pkt_duration=1001
pkt_duration_time=0.016683
pkt_pos=27029408
pkt_size=1741
width=1920
height=1080
pix_fmt=yuv420p
sample_aspect_ratio=1:1
pict_type=B
coded_picture_number=7014
display_picture_number=0
interlaced_frame=0
top_field_first=0
repeat_pict=0
color_range=tv
color_space=bt709
color_primaries=bt709
color_transfer=bt709
chroma_location=left
[/FRAME]



Bad Frame @ 117s


[FRAME]
media_type=video
stream_index=0
key_frame=0
pts=117000
pts_time=117.000000
pkt_dts=117000
pkt_dts_time=117.000000
best_effort_timestamp=117000
best_effort_timestamp_time=117.000000
pkt_duration=16
pkt_duration_time=0.016000
pkt_pos=20592998
pkt_size=18067
width=1920
height=1080
pix_fmt=yuv420p
sample_aspect_ratio=1:1
pict_type=P
coded_picture_number=7011
display_picture_number=0
interlaced_frame=0
top_field_first=0
repeat_pict=0
color_range=tv
color_space=bt709
color_primaries=bt709
color_transfer=bt709
chroma_location=left
[/FRAME]



Does anyone know what the differences mean and why the bad frame is causing my rendering function to have issues seeking?


This is how I am muxing the bad video. I'm trying to avoid re-encoding, but re-encoding seems to fix the issue. Are there any settings I can apply to avoid re-encoding while making the video more robust in Chrome?


// Assumed imports (not shown in the original snippet): ytdl-core, Node's
// child_process and stream modules, and an ffmpeg binary resolver such as
// @ffmpeg-installer/ffmpeg (which exposes .path).
import * as cp from "child_process";
import * as stream from "stream";
import { Readable } from "stream";
import ytdl, { videoInfo } from "ytdl-core";
import * as ffmpegPath from "@ffmpeg-installer/ffmpeg";

// default export: the ffmpeg muxer
const ytmux = (link, options: any = {}) => {
  const result = new stream.PassThrough({
    highWaterMark: options.highWaterMark || 1024 * 512
  });

  ytdl.getInfo(link, options).then((info: videoInfo) => {
    let audioStream: Readable = ytdl.downloadFromInfo(info, {
      ...options,
      quality: "highestaudio"
    });
    let videoStream: Readable = ytdl.downloadFromInfo(info, {
      ...options,
      quality: "highestvideo"
    });

    // create the ffmpeg process for muxing
    let ffmpegProcess: any = cp.spawn(
      ffmpegPath.path,
      [
        // suppress non-crucial messages
        "-loglevel", "8",
        "-hide_banner",

        // input audio and video by pipe
        "-i", "pipe:3",
        "-i", "pipe:4",

        // map audio and video correspondingly
        "-map", "0:v",
        "-map", "1:a",

        // no need to change the codec
        "-c", "copy",

        // allow output to be seekable, which is needed for mp4 output
        "-movflags", "frag_keyframe+empty_moov",

        "-fflags", "+genpts",

        "-f", "matroska",

        "pipe:5"
      ],
      {
        // no popup window for Windows users
        windowsHide: true,
        stdio: [
          // silence stdin/out, forward stderr,
          "inherit",
          "inherit",
          "inherit",
          // and pipe audio, video, output
          "pipe",
          "pipe",
          "pipe"
        ]
      }
    );

    audioStream.pipe(ffmpegProcess.stdio[4]);
    videoStream.pipe(ffmpegProcess.stdio[3]);
    ffmpegProcess.stdio[5].pipe(result);
  });
  return result;
};
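For reference, the PassThrough stream returned by ytmux can be consumed like any readable stream. A hedged usage sketch, where the URL and output file name are placeholders, could look like this:

// Hypothetical usage of the muxer above: write the muxed Matroska output to disk.
import * as fs from "fs";

ytmux("https://www.youtube.com/watch?v=VIDEO_ID")
  .pipe(fs.createWriteStream("output.mkv"))
  .on("finish", () => console.log("muxing finished"));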



-
Creating GIF from QImages with ffmpeg
17 August 2016, by Sierra
I would like to generate a GIF from QImages, using ffmpeg, all of that programmatically (C++). I'm working with Qt 5.6 and the latest build of ffmpeg (git-0a9e781, 2016-06-10).
I'm already able to convert these QImages to .mp4 and it works. I tried to use the same principle for the GIF, changing the pixel format and codec. The GIF is generated from two pictures (1 second each), at 15 FPS.
## INITIALIZATION
#####################################################################
// Filepath : "C:/Users/.../qt_temp.Jv7868.gif"
// Allocating an AVFormatContext for an output format...
avformat_alloc_output_context2(&formatContext, NULL, NULL, filepath);
...
// Adding the video stream using the default format codec and initializing the codec.
stream = avformat_new_stream(formatContext, *codec);
AVCodecContext * codecContext = avcodec_alloc_context3(*codec);
codecContext->codec_id = codecId;
codecContext->bit_rate = 400000;
...
codecContext->pix_fmt = AV_PIX_FMT_BGR8;
...
// Opening the codec...
avcodec_open2(codecContext, codec, NULL);
...
frame = allocPicture(codecContext->width, codecContext->height, codecContext->pix_fmt);
tmpFrame = allocPicture(codecContext->width, codecContext->height, AV_PIX_FMT_RGBA);
...
avformat_write_header(formatContext, NULL);
## ADDING A NEW FRAME
#####################################################################
// Getting in parameter the QImage: newFrame(const QImage & image)
const qint32 width = image.width();
const qint32 height = image.height();

// Converting QImage into AVFrame
for (qint32 y = 0; y < height; y++) {
    const uint8_t * scanline = image.scanLine(y);
    for (qint32 x = 0; x < width * 4; x++) {
        tmpFrame->data[0][y * tmpFrame->linesize[0] + x] = scanline[x];
    }
}
...
// Scaling...
if (codecContext->pix_fmt != AV_PIX_FMT_BGRA) {
    if (!swsCtx) {
        swsCtx = sws_getContext(codecContext->width, codecContext->height,
                                AV_PIX_FMT_BGRA,
                                codecContext->width, codecContext->height,
                                codecContext->pix_fmt,
                                SWS_BICUBIC, NULL, NULL, NULL);
    }
    sws_scale(swsCtx,
              (const uint8_t * const *)tmpFrame->data,
              tmpFrame->linesize,
              0,
              codecContext->height,
              frame->data,
              frame->linesize);
}
frame->pts = nextPts++;
...
int gotPacket = 0;
AVPacket packet = {0};
av_init_packet(&packet);
avcodec_encode_video2(codecContext, &packet, frame, &gotPacket);
if (gotPacket) {
    av_packet_rescale_ts(&packet, codecContext->time_base, stream->time_base);
    packet.stream_index = stream->index;
    av_interleaved_write_frame(formatContext, &packet);
}

But when I try to modify the video codec and pixel format to match the GIF specifications, I'm facing some issues.
I tried several codecs such as AV_CODEC_ID_GIF and AV_CODEC_ID_RAWVIDEO, but none of them seem to work. During the initialization phase, avcodec_open2() always returns this kind of error:
Specified pixel format rgb24 is invalid or not supported
Could not open video codec: gif

EDIT 17/06/2016
Digging a little bit more, avcodec_open2() returns -22:
#define EINVAL 22 /* Invalid argument */
EDIT 22/06/2016
Here are the flags used to compile ffmpeg:
"FFmpeg/Libav configuration: --disable-static --enable-shared --enable-gpl --enable-version3 --disable-w32threads --enable-nvenc --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib"
Did I miss a crucial one for GIF?
EDIT 27/06/2016
Thanks to Gwen, I have a first output: I set codecContext->pix_fmt to AV_PIX_FMT_BGR8. But I'm still facing some issues with the generated GIF: it's not playing and encoding appears to fail.
GIF generated on the command line with ffmpeg (left) . . . GIF generated programmatically (right)
It looks like some options are not defined... or maybe there is a wrong conversion between QImage and AVFrame? I updated the code above. It represents a lot of code, so I tried to keep it short. Don't hesitate to ask for more details.
End of EDIT
I’m not really familiar with ffmpeg, any kind of help would be highly appreciated. Thank you.