
Media (91)
-
Spoon - Revenge!
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
My Morning Jacket - One Big Holiday
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Zap Mama - Wadidyusay?
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
David Byrne - My Fair Lady
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Beastie Boys - Now Get Busy
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Granite de l’Aber Ildut
9 September 2011
Updated: September 2011
Language: French
Type: Text
Other articles (70)
-
Use, discuss, criticize
13 April 2011
Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
-
Managing creation and editing rights for objects
8 February 2011
By default, many features are restricted to administrators but remain independently configurable, so that the minimum status required to use them can be changed, in particular: writing content on the site, adjustable in the form template management; adding notes to articles; adding captions and annotations to images;
-
The farm’s regular Cron tasks
1 December 2010
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of all the farm’s instances on a regular basis. Combined with a system Cron on the farm’s central site, this makes it easy to generate regular visits to the various sites and to prevent the tasks of rarely visited sites from being too (...)
On other sites (10599)
-
Problems decoding a gstreamer pipeline into images using node.js
18 September 2023, by JK2018
I have this GStreamer pipeline that launches a test video, encodes it in H.264 and sends it over UDP to localhost port 5000.


gst-launch-1.0 -v videotestsrc ! videoconvert ! x264enc tune=zerolatency ! h264parse ! queue ! rtph264pay ! udpsink host=127.0.0.1 port=5000



I run this pipeline in my terminal.


Then I have a minimalistic Node.js server that is supposed to receive the UDP stream coming from the GStreamer pipeline and decode the video into images.
Finally, I emit each image (not a fragment of an image) over a socket.


I have tried several approaches, unsuccessfully.
My first attempt was using a Node GStreamer library to decode the UDP stream and re-encode it as images.


My second attempt was using the fluent-ffmpeg library.
Here is the code below:


const express = require("express");
const http = require("http");
const { Server } = require("socket.io");
const socketIO = require("socket.io");
const cors = require("cors");
const app = express();
const server = http.createServer(app);
const dgram = require("dgram");
const ffmpeg = require("fluent-ffmpeg");

const io = socketIO(server, {
  cors: {
    origin: "http://localhost:3001",
    methods: ["GET", "POST"],
  },
});

app.use(
  cors({
    origin: "http://localhost:3001",
    methods: ["GET", "POST"],
  })
);

const udpServer = dgram.createSocket("udp4");

io.on("connection", (socket) => {
  console.log("A client connected");

  socket.on("disconnect", () => {
    console.log("A client disconnected");
  });
});

// Function to decode an H.264 video frame
function decodeH264Video(inputData, callback) {
  const command = ffmpeg()
    .input(inputData)
    .inputFormat("h264")
    .inputOptions(["-c:v h264"])
    .toFormat("image2")
    .on("end", () => {
      console.log("Decoding complete");
    })
    .on("error", (err) => {
      console.error("Error decoding video:", err);
    })
    .pipe();

  callback(command);
}

// Function to convert a decoded video frame to an image (JPEG format)
function convertVideoFrameToImage(decodedData, callback) {
  const imageStream = decodedData.pipe();
  const imageBuffer = [];

  imageStream.on("data", (chunk) => {
    imageBuffer.push(chunk);
  });

  imageStream.on("end", () => {
    const imageData = Buffer.concat(imageBuffer);
    callback(imageData);
  });
}

udpServer.on("message", (message) => {
  // Decode the UDP packet containing H.264 encoded video
  decodeH264Video(message, (decodedVideo) => {
    // Process the decoded video frame and convert it to an image
    convertVideoFrameToImage(decodedVideo, (imageData) => {
      // Send the image data to connected clients
      io.sockets.emit("image", { imageData: imageData.toString("base64") });
    });
  });
});

udpServer.bind(5000);

server.listen(3000, () => {
  console.log("Server is running on port 3000");
});



Any help is more than welcome
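
For reference, here is a minimal, untested sketch of an alternative approach: instead of decoding each UDP datagram on its own, a spawned ffmpeg process consumes the RTP/H.264 stream and writes MJPEG frames to stdout, and the Node side splits the stream on JPEG markers before emitting each frame over Socket.IO. The stream.sdp file name, its contents and the -q:v value are assumptions; io is the Socket.IO server from the code above, and ffmpeg itself binds port 5000 here, so the dgram socket is not needed in this variant.

const { spawn } = require("child_process");

// stream.sdp is assumed to describe the GStreamer output, e.g.:
//   v=0
//   o=- 0 0 IN IP4 127.0.0.1
//   s=-
//   c=IN IP4 127.0.0.1
//   t=0 0
//   m=video 5000 RTP/AVP 96
//   a=rtpmap:96 H264/90000
const ff = spawn("ffmpeg", [
  "-protocol_whitelist", "file,udp,rtp",
  "-i", "stream.sdp",       // ffmpeg receives the RTP stream itself on port 5000
  "-f", "image2pipe",       // continuous stream of single images on stdout
  "-c:v", "mjpeg",
  "-q:v", "2",
  "pipe:1",
]);

let buffered = Buffer.alloc(0);
const SOI = Buffer.from([0xff, 0xd8]); // JPEG start-of-image marker
const EOI = Buffer.from([0xff, 0xd9]); // JPEG end-of-image marker

ff.stdout.on("data", (chunk) => {
  buffered = Buffer.concat([buffered, chunk]);
  let start;
  let end;
  // Emit every complete JPEG currently sitting in the buffer
  while ((start = buffered.indexOf(SOI)) !== -1 &&
         (end = buffered.indexOf(EOI, start)) !== -1) {
    const frame = buffered.subarray(start, end + 2);
    buffered = buffered.subarray(end + 2);
    io.sockets.emit("image", { imageData: frame.toString("base64") });
  }
});

ff.stderr.on("data", (d) => console.error(d.toString()));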


-
FFmpeg zoompan animation results in zig-zag pattern [closed]
2 February 2024, by kregus
Please assist, as I am pulling my hair out!


The goal is for a user to specify a zoomPoint in a video as follows:


{
 "scale":4, # Scale factor, here x4
 "offset":0, # Start zooming in after 0ms
 "duration":5000, # Stay zoomed in for 5000ms
 "marginX":80, # Zoom x to 80% of video width 
 "marginY":10 # Zoom y to 10% of video height
}



I am running the following simplified FFmpeg command:


ffmpeg -i "/tmp/input.mp4" -c:v libx264 -r 30 -vf "scale=iw*2:ih*2,zoompan=fps=30:z='if(gte(it,0.0)*lt(it,5.0), min(pzoom+0.2, 4),if(gte(it,5.0)*lt(it,5.5), max(pzoom-0.2, 1), 1))':d=1:s=1512x982:x='(iw - iw/4)*(80/100)':y='(ih - ih/4)*(10/100)'" /tmp/output.mp4



The animation duration is 0.5s, the video framerate is 30fps, and so there are 0.5 / (1/30) = 15 animation frames.

Zoom distance is scale - 1 = 3 in this case, which makes the zoom increment 3 / 15 = 0.2.
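
Written out as a quick sanity check (the numbers simply mirror the zoomPoint fields above):

const fps = 30;
const animationDuration = 0.5;                          // seconds
const scale = 4;                                        // zoomPoint.scale
const animationFrames = animationDuration / (1 / fps);  // 0.5 / (1/30) = 15
const zoomIncrement = (scale - 1) / animationFrames;    // 3 / 15 = 0.2
console.log(animationFrames, zoomIncrement);            // 15 0.2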

This results in the following example video: click here


While the zoom animation ends in the correct position, you will notice it arrives at that position in a zig-zag pattern, where it starts by zooming into the top-right corner, before changing direction towards the correct position.


I cannot seem to figure out how to get it to animate the zoom in a straight line to the specified x/y position.


Any tips are welcome, thanks!
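
For comparison, here is an untested sketch, based on the zoom variable that the zoompan documentation lists as available to the x and y expressions, in which the pan follows the zoom value computed for each frame instead of the hard-coded final iw/4 window; the z expression is kept exactly as in the command above.

// Illustrative only: same z ramp as in the question, but x/y are written in
// terms of the current per-frame zoom, so the input point at 80% of the width
// and 10% of the height stays put throughout the zoom (no zig-zag).
const z = "if(gte(it,0.0)*lt(it,5.0),min(pzoom+0.2,4),if(gte(it,5.0)*lt(it,5.5),max(pzoom-0.2,1),1))";
const x = "(iw-iw/zoom)*(80/100)";   // marginX = 80
const y = "(ih-ih/zoom)*(10/100)";   // marginY = 10
const vf = `scale=iw*2:ih*2,zoompan=fps=30:z='${z}':d=1:s=1512x982:x='${x}':y='${y}'`;
console.log(`ffmpeg -i /tmp/input.mp4 -c:v libx264 -r 30 -vf "${vf}" /tmp/output.mp4`);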


-
FFmpeg Wasm, error while creating video from canvas
12 October 2023, by NineCattoRules
I'm using ffmpeg.wasm in my Next.js app.


Here are my specs:


"@ffmpeg/ffmpeg": "^0.12.5",
"@ffmpeg/util": "^0.12.0",
"next": "^13.0.6",
"react": "^18.2.0",



I want to simply record a 5s video from a canvas, so I tried:


'use client'

import React, { useEffect, useRef, useState } from 'react';
import { FFmpeg } from '@ffmpeg/ffmpeg';
import { fetchFile } from '@ffmpeg/util';

const CanvasVideoRecorder = () => {
  const canvasRef = useRef(null);
  const videoChunksRef = useRef([]);
  const ffmpegRef = useRef(new FFmpeg({ log: true }));
  const [loaded, setLoaded] = useState(false);
  const [videoUrl, setVideoUrl] = useState(null);

  const load = async () => {
    await ffmpegRef.current.load({
      coreURL: '/js/ffmpeg-core.js',
      wasmURL: '/js/ffmpeg-core.wasm',
    });
    setLoaded(true);
  };

  useEffect(() => {
    const ctx = canvasRef.current.getContext('2d');
    function drawFrame(timestamp) {
      ctx.fillStyle = `rgb(${(Math.sin(timestamp / 500) * 128) + 128}, 0, 0)`;
      ctx.fillRect(0, 0, canvasRef.current.width, canvasRef.current.height);
      requestAnimationFrame(drawFrame);
    }
    requestAnimationFrame(drawFrame);
  }, []);

  const startRecording = async () => {
    const videoStream = canvasRef.current.captureStream(30);
    const videoRecorder = new MediaRecorder(videoStream, { mimeType: 'video/webm' });

    videoRecorder.ondataavailable = (event) => {
      if (event.data.size > 0) {
        videoChunksRef.current.push(event.data);
      }
    };

    videoRecorder.start();
    setTimeout(() => videoRecorder.stop(), 5000);

    videoRecorder.onstop = async () => {
      try {
        await ffmpegRef.current.writeFile('recorded.webm', await fetchFile(new Blob(videoChunksRef.current, { type: 'video/webm' })));

        await ffmpegRef.current.exec('-y', '-i', 'recorded.webm', '-an', '-c:v', 'copy', 'output_copy.webm');

        const data = await ffmpegRef.current.readFile('output_copy.webm');
        const url = URL.createObjectURL(new Blob([data.buffer], { type: 'video/webm' }));

        setVideoUrl(url);
      } catch (error) {
        console.error("Error during processing:", error);
      }
    };
  };

  return (
    <div>
      <canvas ref={canvasRef} width="640" height="480"></canvas>

      {loaded ? (
        <>
          <button onClick={startRecording}>Start Recording</button>
          {videoUrl && <video controls src={videoUrl}></video>}
        </>
      ) : (
        <button onClick={load}>Load FFmpeg</button>
      )}
    </div>
  );
};

export default CanvasVideoRecorder;



I don't know why, but it catches an error:


ErrnoError: FS error



This error occurs when I do this:


await ffmpegRef.current.exec('-y', '-i', 'recorded.webm', '-an', '-c:v', 'copy', 'output_copy.webm');
const data = await ffmpegRef.current.readFile('output_copy.webm');



The recorded.webm file is written correctly and I can read it, and ffmpegRef.current is well defined, so what's wrong with my logic? Why doesn't the exec command work?
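
One detail worth double-checking against the call above: in @ffmpeg/ffmpeg 0.12.x, exec() takes a single array of arguments rather than separate string parameters. Below is a minimal sketch of the same onstop sequence written against that signature; the file names are the ones from the question, everything else is unchanged, and the helper name transcodeChunks is only illustrative.

import { fetchFile } from '@ffmpeg/util';

// Sketch only: identical steps to the onstop handler, but exec() receives
// one args array, which is the call form documented for @ffmpeg/ffmpeg 0.12.x.
async function transcodeChunks(ffmpeg, chunks) {
  const input = new Blob(chunks, { type: 'video/webm' });
  await ffmpeg.writeFile('recorded.webm', await fetchFile(input));
  await ffmpeg.exec(['-y', '-i', 'recorded.webm', '-an', '-c:v', 'copy', 'output_copy.webm']);
  const data = await ffmpeg.readFile('output_copy.webm');
  return URL.createObjectURL(new Blob([data.buffer], { type: 'video/webm' }));
}

// e.g. inside videoRecorder.onstop:
//   setVideoUrl(await transcodeChunks(ffmpegRef.current, videoChunksRef.current));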