
Other articles (36)
-
The SPIPmotion queue
28 November 2010
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document should automatically be attached; objet, the type of object to which (...) -
Customising categories
21 June 2013
Category creation form
For those who know SPIP well, a category can be thought of as a section (rubrique).
For a document of type category, the fields offered by default are: Texte
This form can be modified in the section:
Administration > Configuration des masques de formulaire.
For a document of type media, the fields not displayed by default are: Descriptif rapide
It is also in this configuration section that you can specify the (...) -
Videos
21 April 2011
Like "audio" documents, MediaSPIP displays videos wherever possible using the HTML5 <video> tag.
One drawback of this tag is that it is not recognised correctly by some browsers (Internet Explorer, to name but one) and that each browser natively supports only certain video formats.
Its main advantage is native video playback in the browser, which makes it possible to do without Flash and (...)
On other sites (5959)
-
How to schedule ffmpeg audio streams of varying lengths with titles and metadata?
10 September 2021, by underdoot
I have a Shoutcast stream that I want to archive 24/7 with ffmpeg. Shows on this stream vary in length between 1 and 3 hours, but follow a regular weekly schedule.


How do I download these shows and give the files appropriate titles according to the show playing (determined by day of week and hour of day)? I want the file titles to be along the lines of "2021.09.21 - 0300-0400 — Radio Show Name" and the files to be added automatically to a set folder. I also want to run the program only once and have it running constantly in the background.
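
As far as I know, ffmpeg on its own will not map a weekly schedule to file names, so one way to approach it is a small wrapper that loops forever: look up the current slot in the schedule, record the stream with -c copy for the length of that slot, and build the output name from the schedule entry. The sketch below is only an illustration; the stream URL, the /archive folder and the current_slot() helper are placeholders, not details from the question.

use std::process::Command;

// Hypothetical schedule lookup: a real archiver would map the current day of
// week and hour to a show name and slot length (for example from a config
// file plus a date/time crate). Hard-coded here purely for illustration.
fn current_slot() -> (String, String, String, String, u32) {
    (
        "2021.09.21".to_string(),      // date
        "0300".to_string(),            // slot start
        "0400".to_string(),            // slot end
        "Radio Show Name".to_string(), // show title from the schedule
        3600,                          // slot length in seconds
    )
}

fn main() -> std::io::Result<()> {
    // Placeholder stream URL, not taken from the question.
    let source = "http://example.com:8000/stream";
    loop {
        let (date, start, end, show, seconds) = current_slot();
        let out = format!("/archive/{} - {}-{} — {}.mp3", date, start, end, show);
        let secs = seconds.to_string();
        // -c copy keeps the original audio untouched; -t stops after one slot.
        Command::new("ffmpeg")
            .args(["-hide_banner", "-y", "-i", source, "-t", secs.as_str(), "-c", "copy", out.as_str()])
            .status()?;
    }
}

Started once in the background (nohup, a systemd unit, or similar), a loop like this keeps recording slot after slot; whether -c copy is sufficient or a re-encode is needed depends on the codec the Shoutcast stream actually delivers.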


-
FFmpeg API conversion from YUV420P to RGB produces strange output
20 November 2024, by fasc8
I'm using the FFmpeg API in Rust to get RGB images from video files.


While some videos work correctly and I get the frames back as expected, others do not. Or at least the result is not what I expected.


The code I use in Rust:


ffmpeg::init().unwrap();

let in_ctx = input(&Path::new(source)).unwrap();
let input = in_ctx
 .streams()
 .best(Type::Video)
 .ok_or(ffmpeg::Error::StreamNotFound)?;

let decoder = input.codec().decoder().video()?;

let scaler = Context::get(
 decoder.format(),
 decoder.width(),
 decoder.height(),
 Pixel::RGB24,
 decoder.width(),
 decoder.height(),
 Flags::FULL_CHR_H_INT | Flags::ACCURATE_RND,
)?; // <--- Is basically sws_getContext

// later to get the actual frame
let mut decoded = Video::empty();
if self.decoder.receive_frame(&mut decoded).is_ok() {
 let mut rgb_frame = Video::empty();
 self.scaler.run(&decoded, &mut rgb_frame)?; // <--- Does sws_scale
 println!("Converted Pixel Format: {}", rgb_frame.format() as i32);
 Ok(Some(rgb_frame))
}



Which should roughly translate to C like so:


// Get the context and video stream
SwsContext * ctx = sws_getContext(imgWidth, imgHeight,
 imgFormat, imgWidth, imgHeight,
 AV_PIX_FMT_RGB24, 0, 0, 0, 0);
sws_scale(ctx, decoded.data, decoded.linesize, 0, decoded.height, rgb_frame.data, rgb_frame.linesize);
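
For orientation only, since the Rust snippet above cuts straight to receive_frame: in ffmpeg-next that block usually sits inside a packet loop roughly like the following. This is my sketch of the usual idiom, not the poster's code, and it assumes in_ctx, decoder and scaler are declared mut.

// Assumed ffmpeg-next idiom continuing the Rust snippet above
let video_stream_index = input.index();
for (stream, packet) in in_ctx.packets() {
    if stream.index() == video_stream_index {
        decoder.send_packet(&packet)?;
        let mut decoded = Video::empty();
        while decoder.receive_frame(&mut decoded).is_ok() {
            let mut rgb_frame = Video::empty();
            scaler.run(&decoded, &mut rgb_frame)?;
            // hand rgb_frame on to the caller or save it to disk
        }
    }
}
decoder.send_eof()?;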



And like I said earlier, sometimes it works fine and I get the expected frame back. But sometimes I get something like this:
[Image: weird result frame]


I saved the images as .ppm files for quick visual comparison. I used this method, which basically writes the bytes to a file with a simple .ppm header:


fn save_file(frame: &Video, index: usize) -> std::result::Result<(), std::io::Error> 
{
 let mut file = File::create(format!("frame{}.ppm", index))?;
 file.write_all(format!("P6\n{} {}\n255\n", frame.width(), frame.height()).as_bytes())?;
 file.write_all(frame.data(0))?;
 Ok(())
}



Here you can see a good image result on the left versus a bad image result on the right:
[Image: comparison of the .ppm files]


To come to the question now:


Why is this happening? I tested everything on my side and the only thing left is the ffmpeg conversion. FFmpeg seems to convert these two test files differently even though it reports YUV420P as the format for both. I cannot figure out what the difference may be...
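
One low-level difference worth ruling out (an assumption of mine, not something the question confirms): sws_scale may pad each output row to an alignment boundary, so rgb_frame.data(0) can hold stride(0) bytes per row rather than exactly width * 3. Writing data(0) straight into a P6 file then shears the image whenever the two values differ, and since the two files have different widths (480 vs 496 pixels) this could affect one and not the other. A stride-aware variant of save_file, assuming the ffmpeg-next stride() and data() accessors:

use std::fs::File;
use std::io::Write;
use ffmpeg::frame::Video;

// Hypothetical stride-aware variant of save_file: copies width * 3 bytes per
// row and skips any padding bytes the scaler may have added at the row ends.
fn save_file_stride_aware(frame: &Video, index: usize) -> std::io::Result<()> {
    let width = frame.width() as usize;
    let height = frame.height() as usize;
    let stride = frame.stride(0); // bytes per row in data(0), may exceed width * 3
    let data = frame.data(0);

    let mut file = File::create(format!("frame{}.ppm", index))?;
    file.write_all(format!("P6\n{} {}\n255\n", width, height).as_bytes())?;
    for row in 0..height {
        let start = row * stride;
        file.write_all(&data[start..start + width * 3])?;
    }
    Ok(())
}

If stride(0) already equals width * 3 for both files, this is not the cause and the two writes are equivalent; in that case the other differences visible in the MediaInfo dumps below (for example the colour range) would be where I would look next.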


Here is the info for the two video files I used:


Good video file:


General
Complete name : /mnt/smb/Snapchat-174933781.mp4
Format : MPEG-4
Format profile : Base Media / Version 2
Codec ID : mp42 (isom/mp42)
File size : 1.90 MiB
Duration : 9 s 612 ms
Overall bit rate : 1 661 kb/s
Encoded date : UTC 2021-07-28 22:09:36
Tagged date : UTC 2021-07-28 22:09:36
eng : -180.00

Video
ID : 512
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High@L3.1
Format settings : CABAC / 1 Ref Frames
Format settings, CABAC : Yes
Format settings, Reference frames : 1 frame
Format settings, GOP : M=1, N=30
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Duration : 9 s 598 ms
Bit rate : 1 597 kb/s
Width : 480 pixels
Height : 944 pixels
Display aspect ratio : 0.508
Frame rate mode : Variable
Frame rate : 29.797 FPS
Minimum frame rate : 15.000 FPS
Maximum frame rate : 30.000 FPS
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.118
Stream size : 1.83 MiB (96%)
Title : Snap Video
Language : English
Encoded date : UTC 2021-07-28 22:09:36
Tagged date : UTC 2021-07-28 22:09:36
Color range : Full
colour_range_Original : Limited
Color primaries : BT.709
Transfer characteristics : BT.601
transfer_characteristics_Original : BT.709
Matrix coefficients : BT.709
Codec configuration box : avcC

Audio
ID : 256
Format : AAC LC
Format/Info : Advanced Audio Codec Low Complexity
Codec ID : mp4a-40-2
Duration : 9 s 612 ms
Bit rate mode : Constant
Bit rate : 62.0 kb/s
Channel(s) : 1 channel
Channel layout : C
Sampling rate : 44.1 kHz
Frame rate : 43.066 FPS (1024 SPF)
Compression mode : Lossy
Stream size : 73.3 KiB (4%)
Title : Snap Audio
Language : English
Encoded date : UTC 2021-07-28 22:09:36
Tagged date : UTC 2021-07-28 22:09:36



Bad video file:


General
Complete name : /mnt/smb/Snapchat-1989594918.mp4
Format : MPEG-4
Format profile : Base Media / Version 2
Codec ID : mp42 (isom/mp42)
File size : 2.97 MiB
Duration : 6 s 313 ms
Overall bit rate : 3 948 kb/s
Encoded date : UTC 2019-07-11 06:43:04
Tagged date : UTC 2019-07-11 06:43:04
com.android.version : 9

Video
ID : 1
Format : AVC
Format/Info : Advanced Video Codec
Format profile : Baseline@L3.1
Format settings : 1 Ref Frames
Format settings, CABAC : No
Format settings, Reference frames : 1 frame
Format settings, GOP : M=1, N=30
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Duration : 6 s 313 ms
Bit rate : 3 945 kb/s
Width : 496 pixels
Height : 960 pixels
Display aspect ratio : 0.517
Frame rate mode : Variable
Frame rate : 29.306 FPS
Minimum frame rate : 19.767 FPS
Maximum frame rate : 39.508 FPS
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.283
Stream size : 2.97 MiB (100%)
Title : VideoHandle
Language : English
Encoded date : UTC 2019-07-11 06:43:04
Tagged date : UTC 2019-07-11 06:43:04
Color range : Limited
Color primaries : BT.709
Transfer characteristics : BT.709
Matrix coefficients : BT.709
Codec configuration box : avcC



Or as a diff image:
[Image: diff of the good and bad frames]


The problem is that I am not that familiar with ffmpeg yet, so I don't know all the quirks it has.


I hope someone can point me in the right direction.


-
node.js ffmpeg spawn child_process unexpected data output and not possible to kill or stop
1 September 2021, by PLNR
I'm rather new to backend stuff, so please excuse me if my question is trivial.

For an intranet project, I want to present a video element on a webpage (React, HLS.js player) with the possibility to reload the stream in case it hangs.

The video sources are MPEG-TS streams delivered as UDP multicast.

A small Node.js/Express server should handle the ffmpeg commands to transcode the multicast to HLS so it can be displayed in a browser.



My problem is that I am unable to stop the process...

I've tried to SIGINT and SIGTERM the process with its pid, but the encoding just keeps going.

Here is the code I wrote for transcoding so far:


const express = require("express");
const { spawn, exec } = require("child_process");
const cors = require("cors");
const process = require("process")

let ls;
let pid;

const app = express();

app.use(cors());

app.get('/cam/:source', (body) => {
 const cam = body.params.source;
 console.log(cam);

 let source = "udp://239.1.1.1:4444";
 let stream = "/var/www/html/streams/tmp/cam1.m3u8"


 stream = spawn("ffmpeg", ["-re", "-i", source, "-c:v", "libx264", "-crf", "21", "-preset", "veryfast", "-c:a", "aac", "-b:a", "128k", "-ac", "2", "-f", "hls", "-hls_list_size", "5", "-hls_flags", "delete_segments", stream], {detached: true});

 pid = stream.pid;

 stream.stdout.on("data", data => {
 console.log(`stdout: ${data}`);
 });

 stream.stderr.on("data", data => {
 console.log(`stderr: ${data}`);
 console.log("pid: ", pid);
 });

 stream.on("error", error => {
 console.log(`error: ${error.message}`);
 });

 stream.on("close", code => {
 console.log(`child process exited with code ${code}`);
 });
})

app.get('/cam/quit', () => {
 process.kill(pid, 'SIGINT')
})

app.listen(5000, ()=> {
 console.log('Listening');
})



On requesting /cam/quit I would expect the process to stop... but it doesn't.

I've tried:

app.get('/cam/quit', () => {
 process.kill(pid, 'SIGINT')
})



app.get('/cam/quit', () => {
 process.kill(stream.pid, 'SIGINT')
})



app.get('/cam/quit', () => {
 stream.kill(pid, 'SIGINT')
})



app.get('/cam/quit', () => {
 stream.kill(0, 'SIGINT')
})



But none of this worked... am I missing something? Is there something I need to change?


Problem 2 - Output:

The output is emitted on stderr... even though the process is working as expected.

This may only be cosmetic, but it makes me wonder.

Here is the terminal output:

[nodemon] starting `node server.js`
Listening
camera stream reloaded
stderr: ffmpeg version 4.3.2-0+deb11u1ubuntu1 Copyright (c) 2000-2021 the FFmpeg developers
 built with gcc 10 (Ubuntu 10.2.1-20ubuntu1)
 configuration: --prefix=/usr --extra-version=0+deb11u1ubuntu1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-pocketsphinx --enable-libmfx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
 libavutil 56. 51.100 / 56. 51.100
 libavcodec 58. 91.100 / 58. 91.100
 libavformat 58. 45.100 / 58. 45.100
 libavdevice 58. 10.100 / 58. 10.100
 libavfilter 7. 85.100 / 7. 85.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 7.100 / 5. 7.100
 libswresample 3. 7.100 / 3. 7.100
 libpostproc 55. 7.100 / 55. 7.100

pid: 4206
stderr: Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'udp://239.1.1.1':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 creation_time : 
pid: 4206
stderr: 1970-01-01T00:00:00.000000Z
 encoder : Lavf52.54.0
 Duration: 01:55:59.20, start: 0.000000, bitrate: 1436 kb/s
 Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 720x384 [SAR 1:1 DAR 15:8], 1272 kb/s, 25 fps, 25 tbr, 25 tbn, 50 tbc (default)
 Metadata:
 creation_time : 1970-01-01T00:00:00.000000Z
 handler_name : VideoHandler
 Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 159 kb/s (default)
 Metadata:
 creation_time : 1970-01-01T00:00:00.000000Z
 handler_name : SoundHandler

pid: 4206
stderr: Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
 Stream #0:1 -> #0:1 (aac (native) -> aac (native))
Press [q] to stop, [?] for help

pid: 4206
stderr: [libx264 @ 0x5616b38b5440] using SAR=1/1

pid: 4206
stderr: [libx264 @ 0x5616b38b5440] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2

pid: 4206
stderr: [libx264 @ 0x5616b38b5440] profile High, level 3.0, 4:2:0, 8-bit

pid: 4206
stderr: [libx264 @ 0x5616b38b5440] 264 - core 160 r3011 cde9a93 - H.264/MPEG-4 AVC codec - Copyleft 2003-2020 - http://www.videolan.org/x264.html - options: cabac=1 ref=1 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=2 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=1 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=10 rc=crf mbtree=1 crf=21.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, hls, to '/var/www/html/streams/tmp/aft.m3u8':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 encoder : Lavf58.45.100
 Stream #0:0(und): Video: h264 (libx264), yuv420p, 720x384 [SAR 1:1 DAR 15:8], q=-1--1, 25 fps, 90k tbn, 25 tbc (default)
 Metadata:
 creation_time : 1970-01-01T00:00:00.000000Z
 handler_name : VideoHandler
 encoder : Lavc58.91.100 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
 Stream #0:1(und): Audio: aac (LC), 48000 Hz, stereo, fltp, 128 kb/s (default)
 Metadata:
 creation_time : 1970-01-01T00:00:00.000000Z
 handler_name : SoundHandler
 encoder : Lavc58.91.100 aac

pid: 4206
stderr: frame= 8 fps=0.0 q=0.0 size=N/A time=00:00:00.46 bitrate=N/A speed=0.931x 
pid: 4206
stderr: frame= 21 fps= 21 q=26.0 size=N/A time=00:00:00.96 bitrate=N/A speed=0.95x 
pid: 4206
stderr: frame= 33 fps= 22 q=26.0 size=N/A time=00:00:01.49 bitrate=N/A speed=0.982x 
pid: 4206
stderr: frame= 46 fps= 23 q=26.0 size=N/A time=00:00:02.00 bitrate=N/A speed=0.989x 
pid: 4206
stderr: frame= 58 fps= 23 q=26.0 size=N/A time=00:00:02.49 bitrate=N/A speed=0.986x 
pid: 4206



and so on...



Any help would be highly appreciated!

Many thanks in advance.