
Other articles (40)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
From upload to the final video [standalone version]
31 January 2010
The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First of all, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...) -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used as a fallback.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
On other sites (5901)
-
avfilter/x86/vf_blend: use unaligned movs for output
19 January 2022, by Marton Balint -
node.js ffmpeg spawn child_process unexpected data output
5 September 2021, by PLNR
I'm rather new to backend stuff, so please excuse me if my question is trivial.

For an intranet project, I want to present a video element in a web page (React, HLS.js player).

The video sources are MPEG-TS streams delivered as UDP multicast.

A small Node.js/Express server should handle the ffmpeg commands to transcode the multicast streams to HLS so they can be displayed in a browser.



The problem is the output:

The output is emitted on stderr, even though the process is working as expected.

Here is the code I wrote for the transcoding so far:

const express = require("express");
const { spawn } = require("child_process");

const app = express();

app.get('/cam/:source', (req, res) => {
  const cam = req.params.source;
  console.log(cam);

  const source = "udp://239.1.1.1:4444";
  const playlist = "/var/www/html/streams/tmp/cam1.m3u8";

  // Transcode the multicast MPEG-TS source into an HLS playlist on disk.
  const stream = spawn("ffmpeg", ["-re", "-i", source, "-c:v", "libx264", "-crf", "21", "-preset", "veryfast", "-c:a", "aac", "-b:a", "128k", "-ac", "2", "-f", "hls", "-hls_list_size", "5", "-hls_flags", "delete_segments", playlist], { detached: true });

  stream.stdout.on("data", data => {
    console.log(`stdout: ${data}`);
  });

  stream.stderr.on("data", data => {
    console.log(`stderr: ${data}`);
  });

  stream.on("error", error => {
    console.log(`error: ${error.message}`);
  });

  stream.on("close", code => {
    console.log(`child process exited with code ${code}`);
  });

  res.sendStatus(200); // answer the HTTP request so it does not hang
})

app.listen(5000, () => {
  console.log('Listening');
})



This is maybe only cosmetic, but it makes me wonder.

Here is the terminal output:

[nodemon] starting `node server.js`
Listening
camera stream reloaded
stderr: ffmpeg version 4.3.2-0+deb11u1ubuntu1 Copyright (c) 2000-2021 the FFmpeg developers
 built with gcc 10 (Ubuntu 10.2.1-20ubuntu1)

 --shortened--


pid: 4206
stderr: frame= 8 fps=0.0 q=0.0 size=N/A time=00:00:00.46 bitrate=N/A speed=0.931x 
pid: 4206
stderr: frame= 21 fps= 21 q=26.0 size=N/A time=00:00:00.96 bitrate=N/A speed=0.95x 
pid: 4206
stderr: frame= 33 fps= 22 q=26.0 size=N/A time=00:00:01.49 bitrate=N/A speed=0.982x 
pid: 4206
stderr: frame= 46 fps= 23 q=26.0 size=N/A time=00:00:02.00 bitrate=N/A speed=0.989x 
pid: 4206
stderr: frame= 58 fps= 23 q=26.0 size=N/A time=00:00:02.49 bitrate=N/A speed=0.986x 
pid: 4206



and so on...
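
For what it's worth, ffmpeg writes its banner, log and progress lines to stderr by design, keeping stdout free for piped media data, so output on stderr does not by itself indicate a failure. Below is a minimal sketch (assuming the same spawn setup as above; -loglevel, -nostats and -progress are standard ffmpeg options) that asks ffmpeg to put machine-readable progress on stdout so log and progress can be told apart:

const { spawn } = require("child_process");

// Sketch only: same source, encoder settings and playlist path as above.
const args = [
  "-loglevel", "warning", // keep only warnings and errors on stderr
  "-nostats",             // drop the periodic "frame= ... fps= ..." line from stderr
  "-progress", "pipe:1",  // write key=value progress records to stdout instead
  "-re", "-i", "udp://239.1.1.1:4444",
  "-c:v", "libx264", "-crf", "21", "-preset", "veryfast",
  "-c:a", "aac", "-b:a", "128k", "-ac", "2",
  "-f", "hls", "-hls_list_size", "5", "-hls_flags", "delete_segments",
  "/var/www/html/streams/tmp/cam1.m3u8",
];

const child = spawn("ffmpeg", args, { detached: true });

child.stdout.on("data", data => console.log(`progress: ${data}`)); // frame=..., out_time=..., speed=...
child.stderr.on("data", data => console.error(`ffmpeg: ${data}`));  // genuine warnings and errors
child.on("close", code => console.log(`ffmpeg exited with code ${code}`));

With that split, anything arriving on stderr really is a diagnostic, while the periodic progress values show up on stdout.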



Any helpful information would be highly appreciated!

Many thanks in advance

-
How can I create a side-by-side horizontally stacked video with a delay in video and audio using ffmpeg-python?
3 September 2021, by kup
I am trying to create a video from two videos that will be stacked horizontally, but the resulting video is not what I expected. The same video appears on both sides (left and right). How can the intended result be achieved?


The ffmpeg command I tried, which kind of worked:


ffmpeg -i left.mp4 -i right.mp4 -filter_complex "
[0:v]setpts=PTS-STARTPTS[l];
[1:v]setpts=PTS-STARTPTS,tpad=start_duration=14:start_mode=add:color=black[r];
[l][r]hstack=inputs=2[stacked]; [0:a][1:a]amix=inputs=2[a]
" -map "[stacked]" -map "[a]" -c:a aac -preset superfast result.mp4



The ffmpeg-python script that's not working:


import ffmpeg

in0 = ffmpeg.input('0.mp4')
in1 = ffmpeg.input('right.mkv')

# Delay the second input's audio by 5 s and mix the two audio tracks.
aout = ffmpeg.filter([in0.audio, in1.audio.filter('adelay', '5000|5000')], 'amix')
# Pad the start of the second video by 5 s, then stack the two videos side by side.
vout = ffmpeg.filter([in0.video, in1.video.filter('tpad', start_duration=5, start_mode='add', color='black')], 'hstack')

(
    ffmpeg
    .concat(vout, aout, v=1, a=1)
    .output("out.mkv")
    .run()
)



This script does not produce the expected output, which is the two videos stacked side by side horizontally.
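
For comparison, here is a minimal ffmpeg-python sketch (assuming left.mp4 and right.mp4 as inputs and a 5 second offset for the right-hand video, as in the examples above) that maps the stacked video and the mixed audio straight into the output, mirroring the -map "[stacked]" -map "[a]" options of the CLI command:

import ffmpeg

# Sketch only: the file names and the 5 second delay are assumptions taken from the examples above.
left = ffmpeg.input('left.mp4')
right = ffmpeg.input('right.mp4')

# Pad the start of the right-hand video and delay its audio by the same amount.
right_v = right.video.filter('tpad', start_duration=5, start_mode='add', color='black')
right_a = right.audio.filter('adelay', '5000|5000')

stacked = ffmpeg.filter([left.video, right_v], 'hstack', inputs=2)
mixed = ffmpeg.filter([left.audio, right_a], 'amix', inputs=2)

# Passing both streams to output() maps them into the same file,
# the ffmpeg-python equivalent of -map "[stacked]" -map "[a]".
(
    ffmpeg
    .output(stacked, mixed, 'result.mp4', acodec='aac', preset='superfast')
    .run()
)

Note that hstack needs both video streams to have the same height, so a scale filter may be required first if the inputs differ.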