
Other articles (78)
-
Customizing by adding your logo, banner, or background image
5 September 2013
Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
User profiles
12 April 2011
Each user has a profile page for editing their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
The user can edit their profile from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...) -
Configuring language support
15 November 2010
Accessing the configuration and adding supported languages
To enable support for new languages, go to the "Administrer" section of the site.
From there, the navigation menu gives access to a "Gestion des langues" section where support for new languages can be activated.
Each newly added language can still be deactivated as long as no object has been created in that language. In that case, it becomes greyed out in the configuration and (...)
On other sites (7646)
-
ts video via UDP from ffmpeg to gstreamer [closed]
23 January 2024, by aron.h
Hardware: Jetson AGX ORIN
Software: Jetpack 5.0.2


I have been attempting to send a video file locally via UDP using ffmpeg:


ffmpeg -stream_loop -1 -re -i test.ts -map 0 -c copy -preset ultrafast -f mpegts "udp://127.0.0.1:5000"


And receiving the same stream via UDP using gstreamer:


gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264 ! rtph264depay ! decodebin ! videoconvert ! aasink


But I get an error on the receiving gstreamer end:


/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0: Could not decode stream.
Additional debug info:
gstrtpbasedepayload.c(505): gst_rtp_base_depayload_handle_buffer (): 
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
Received invalid RTP payload, dropping
ERROR: from element /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
The stream is in the wrong format.
Additional debug info:
gstrtph264depay.c(1298): gst_rtp_h264_depay_process ():
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
NAL unit type 27 not supported yet



More detailed information on the video file:


Original ID: 1002
Codec: H264 - MPEG-4 AVC (part 10) (h264)
Type: Video
Video resolution: 1920x1080
Buffer dimensions: 1920x1088
Frame rate: 30
Decoded format: 
Orientation: Top left
Chroma location: left



When I listen with the command gst-launch-1.0 -v udpsrc port=5000 ! fakesink dump=1, it is quite apparent that the packets from FFmpeg are being received.
I am not sure why gstreamer's rtph264depay says the stream is in the wrong format.
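One quick, generic check (not from the original post) is to look at the first byte of each incoming datagram: MPEG-TS packets start with the sync byte 0x47 and come in 188-byte multiples, while an RTP header's first byte carries version 2 in its top two bits. A minimal Python sketch of that idea; classify_payload and sniff_one are hypothetical helper names, and the port is assumed from the commands above:

```python
import socket

def classify_payload(data: bytes) -> str:
    """Rough guess at what a UDP datagram carries.

    MPEG-TS packets start with the sync byte 0x47 and are sent in
    188-byte multiples; an RTP header's first byte has version bits
    10 (i.e. value 2) in its top two bits.
    """
    if not data:
        return "empty"
    if data[0] == 0x47 and len(data) % 188 == 0:
        return "mpegts"
    if (data[0] >> 6) == 2:
        return "rtp"
    return "unknown"

def sniff_one(port: int = 5000) -> str:
    """Read a single datagram from the port and classify it (sketch)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("127.0.0.1", port))
        data, _ = sock.recvfrom(65536)
        return classify_payload(data)
```

If such a check reports mpegts rather than rtp, the receiving pipeline's caps (application/x-rtp) would not match what is on the wire.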

Would I have to check some details on the FFmpeg side?
This is the information FFmpeg shows by default while running.


Input #0, mpegts, from 'test.ts':
 Duration: 00:00:57.36, start: 20902.827056, bitrate: 2504 kb/s
 Program 1
 Stream #0:0[0x3ea]: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc
Output #0, mpegts, to 'udp://127.0.0.1:5000':
 Metadata:
 encoder : Lavf58.29.100
 Stream #0:0: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 29.97 fps, 29.97 tbr, 90k tbn, 90k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
frame= 611 fps= 30 q=-1.0 Lsize= 5847kB time=00:00:20.62 bitrate=2323.0kbits/s speed= 1x
video:5350kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 9.301388%



Any advice would be appreciated.


-
Problems decoding a gstreamer pipeline into images using node.js
18 September 2023, by JK2018
I have this GStreamer pipeline that launches a test video, encodes it in H.264, and sends it via UDP to localhost port 5000.


gst-launch-1.0 -v videotestsrc ! videoconvert ! x264enc tune=zerolatency ! h264parse ! queue ! rtph264pay ! udpsink host=127.0.0.1 port=5000



I run this pipeline in my terminal.


Then I have a minimal node.js server that is supposed to receive the UDP stream coming from the GStreamer pipeline and decode the video into images.
Finally, I emit each image (not a fragment of an image) over a socket.


I have tried several approaches, unsuccessfully.
My first attempt was using a node gstreamer library to decode the UDP stream and re-encode it as images.


My second attempt was using the fluent-ffmpeg library.
Here is the code below:


const express = require("express");
const http = require("http");
const socketIO = require("socket.io");
const cors = require("cors");
const dgram = require("dgram");
const ffmpeg = require("fluent-ffmpeg");

const app = express();
const server = http.createServer(app);

const io = socketIO(server, {
  cors: {
    origin: "http://localhost:3001",
    methods: ["GET", "POST"],
  },
});

app.use(
  cors({
    origin: "http://localhost:3001",
    methods: ["GET", "POST"],
  })
);

const udpServer = dgram.createSocket("udp4");

io.on("connection", (socket) => {
  console.log("A client connected");

  socket.on("disconnect", () => {
    console.log("A client disconnected");
  });
});

// Function to decode an H.264 video frame
function decodeH264Video(inputData, callback) {
  const command = ffmpeg()
    .input(inputData)
    .inputFormat("h264")
    .inputOptions(["-c:v h264"])
    .toFormat("image2")
    .on("end", () => {
      console.log("Decoding complete");
    })
    .on("error", (err) => {
      console.error("Error decoding video:", err);
    })
    .pipe();

  callback(command);
}

// Function to convert a decoded video frame to an image (JPEG format)
function convertVideoFrameToImage(decodedData, callback) {
  const imageStream = decodedData.pipe();
  const imageBuffer = [];

  imageStream.on("data", (chunk) => {
    imageBuffer.push(chunk);
  });

  imageStream.on("end", () => {
    const imageData = Buffer.concat(imageBuffer);
    callback(imageData);
  });
}

udpServer.on("message", (message) => {
  // Decode the UDP packet containing H.264 encoded video
  decodeH264Video(message, (decodedVideo) => {
    // Process the decoded video frame and convert it to an image
    convertVideoFrameToImage(decodedVideo, (imageData) => {
      // Send the image data to connected clients
      io.sockets.emit("image", { imageData: imageData.toString("base64") });
    });
  });
});

udpServer.bind(5000);

server.listen(3000, () => {
  console.log("Server is running on port 3000");
});



Any help is more than welcome.
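For reference, a common alternative pattern (a sketch, not the poster's code) is to keep one long-lived decoder process, write every incoming datagram to its stdin, and split the decoder's image2pipe/MJPEG output on JPEG markers, since each complete JPEG runs from the SOI marker FF D8 to the EOI marker FF D9. The frame-splitting step is shown here in Python to keep the sketch self-contained; split_jpeg_frames is a hypothetical helper:

```python
def split_jpeg_frames(buffer: bytes):
    """Split a byte stream from an MJPEG/image2pipe decoder into complete
    JPEG images, returning (frames, leftover_bytes).

    A complete JPEG runs from the SOI marker FF D8 to the EOI marker
    FF D9; any trailing partial frame is returned as leftover so it can
    be prepended to the next chunk of input.
    """
    SOI, EOI = b"\xff\xd8", b"\xff\xd9"
    frames = []
    pos = 0
    while True:
        start = buffer.find(SOI, pos)
        if start == -1:
            return frames, b""
        end = buffer.find(EOI, start + 2)
        if end == -1:
            # Partial frame: keep it for the next chunk of input.
            return frames, buffer[start:]
        frames.append(buffer[start:end + 2])
        pos = end + 2
```

Each complete frame can then be base64-encoded and emitted over the socket, as in the code above, without assuming that one UDP datagram equals one decodable video frame.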


-
FFmpeg zoompan animation results in zig-zag pattern [closed]
2 February 2024, by kregus
Please assist, as I am pulling my hair out!


The goal is for a user to specify a zoomPoint in a video as follows:


{
 "scale":4, # Scale factor, here x4
 "offset":0, # Start zooming in after 0ms
 "duration":5000, # Stay zoomed in for 5000ms
 "marginX":80, # Zoom x to 80% of video width 
 "marginY":10 # Zoom y to 10% of video height
}



I am running the following, simplified FFmpeg command:


ffmpeg -i "/tmp/input.mp4" -c:v libx264 -r 30 -vf "scale=iw*2:ih*2,zoompan=fps=30:z='if(gte(it,0.0)*lt(it,5.0), min(pzoom+0.2, 4),if(gte(it,5.0)*lt(it,5.5), max(pzoom-0.2, 1), 1))':d=1:s=1512x982:x='(iw - iw/4)*(80/100)':y='(ih - ih/4)*(10/100)'" /tmp/output.mp4



The animation duration is 0.5s, the video framerate is 30fps, and so there are 0.5 / (1/30) = 15 animation frames.

Zoom distance is scale - 1 = 3 in this case, which makes the zoom increment 3 / 15 = 0.2.
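The arithmetic above can be checked in a couple of lines of plain Python (values taken from the post, not from the command itself):

```python
fps = 30
duration_s = 0.5   # animation duration in seconds
scale = 4          # target zoom factor

frames = duration_s * fps             # same as 0.5 / (1/30): animation frames
increment = (scale - 1) / frames      # zoom increment per frame

print(frames, increment)  # 15.0 0.2
```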

This results in the following example video: click here


While the zoom animation ends in the correct position, you will notice it arrives at that position in a zig-zag pattern, where it starts by zooming into the top-right corner, before changing direction towards the correct position.


I cannot seem to figure out how to get it to animate the zoom in a straight line to the specified x/y position.


Any tips are welcome, thanks!