
Media (1)
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
Other articles (104)
-
Customisable form
21 June 2013, by
This page presents the fields available in the media publication form and indicates the different fields that can be added.
Media creation form
For a media-type document, the fields offered by default are: Text; Enable/disable the forum (the comment prompt can be disabled for each article); Licence; Add/remove authors; Tags
This form can be modified under:
Administration > Configuration des masques de formulaire. (...)
-
What is a form mask?
13 June 2013, by
A form mask is a customisation of the upload form for media, sections, news items, editorials and links to sites.
Each object publication form can therefore be customised.
To customise the form fields, go to the administration area of your MediaSPIP and select "Configuration des masques de formulaires".
Then select the form to modify by clicking on its object type. (...)
-
Monitoring MediaSPIP farms (and SPIP farms while we're at it)
31 May 2013, by
When you manage several (or even several dozen) MediaSPIP sites on the same installation, it can be very handy to get certain information at a glance.
This article documents the Munin monitoring scripts developed with the help of Infini.
These scripts are installed automatically by the automatic installation script if a Munin installation is detected.
Description of the scripts
Three Munin scripts have been developed:
1. mediaspip_medias
A script for (...)
On other sites (5212)
-
avfilter/vf_tinterlace: support full-range YUV
9 December 2022, by Niklas Haas
avfilter/vf_tinterlace: support full-range YUV
This filter, when used in the "pad" mode, currently makes the distinction between limited and full range solely by testing for YUVJ pixel formats at link setup time. This is deprecated and should be improved to perform the detection based on the per-frame metadata.
In order to make this distinction based on color range metadata, which is only known at the time of filtering frames, for simplicity, we simply allocate two copies of the "black" frame - one for limited range and the other for full range metadata. This could be done more dynamically (e.g. as-needed or simply by blitting the appropriate pixel value directly), but this change is relatively simple and preserves the structure of the existing code.
This commit actually fixes a bug in FATE - the new output is correct for the first time. The previous md5 ref was of a frame that incorrectly combined full-range pixel data with limited-range black fields. The corresponding result has been updated.
Signed-off-by: Niklas Haas <git@haasn.dev>
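As a rough illustration of the approach described in the commit message (a minimal C sketch with hypothetical names, not the actual vf_tinterlace patch), the per-frame selection could key off AVFrame's color_range field:

#include <libavutil/frame.h>
#include <libavutil/pixfmt.h>

/* Sketch only: return whichever pre-allocated "black" frame matches the
 * incoming frame's color-range metadata, so padded fields use full-range
 * black for AVCOL_RANGE_JPEG frames and limited-range black otherwise. */
static AVFrame *pick_black_frame(const AVFrame *in,
                                 AVFrame *black_limited,
                                 AVFrame *black_full)
{
    return in->color_range == AVCOL_RANGE_JPEG ? black_full : black_limited;
}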
-
JavaCV fetch RTSP meets "avcodec_open2() error -40: Could not open audio codec" error
6 April 2019, by Jeremy Lu
I am trying to use JavaCV to fetch an RTSP monitoring camera stream and grab the frames, but I get the "avcodec_open2() error -40: Could not open audio codec" error when the grabber is about to start, after setting up the parameters.
JavaCV version:
- org.bytedeco.javacv-platform, 1.4.1
- org.bytedeco.javacpp-presets.opencv-platform, 3.4.1-1.4.1
Here is the Java exception information:
org.bytedeco.javacv.FrameGrabber$Exception: avcodec_open2() error -40: Could not open audio codec.
at org.bytedeco.javacv.FFmpegFrameGrabber.startUnsafe(FFmpegFrameGrabber.java:835) ~[javacv-1.4.1.jar:1.4.1]
at org.bytedeco.javacv.FFmpegFrameGrabber.start(FFmpegFrameGrabber.java:663) ~[javacv-1.4.1.jar:1.4.1]
RTSP information:
For the RTSP source, I also downloaded FFplay and used the command
.\ffplay.exe -rtsp_transport tcp rtsp://admin:DWUUUP@10.193.8.71
to show the frames from the monitoring camera, and it works, with the following information:
Input #0, rtsp, from 'rtsp://admin:DWUUUP@10.193.8.71':
  Metadata:
    title           : Media Presentation
  Duration: N/A, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: h264 (Main), yuvj420p(pc, bt709, progressive), 1920x1080 [SAR 1:1 DAR 16:9], 15 fps, 24.17 tbr, 90k tbn, 30 tbc
    Stream #0:1: Audio: aac, 16000 Hz, 1 channels, fltp
    Metadata:
[aac @ 000001fc123be880] Audio object type 0 is not implemented. Update your FFmpeg version to the newest one from Git. If the problem still occurs, it means that your file has a feature which has not been implemented.
And here is the source code:
FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("rtsp://admin:DWUUUP@10.193.8.71");
grabber.setImageHeight(640);
grabber.setImageWidth(360);
grabber.setOption("rtsp_transport", "tcp");
grabber.start(); // where the problem occurs
Java2DFrameConverter converter = new Java2DFrameConverter();
BufferedImage bufferedImage = converter.convert(grabber.grab());
ByteArrayOutputStream bateArrayOutputStream = new ByteArrayOutputStream();
ImageIO.write(bufferedImage, "jpeg", bateArrayOutputStream);
byte[] data = bateArrayOutputStream.toByteArray();
bateArrayOutputStream.close();
I have tested changing the RTSP source to an mp4 file, and then the code works.
How can I fix this problem? Thanks very much!
-
How to fix this "EPIPE" error in Node.js Socket.io
23 April, by Mehdi008
Receiving this error:


Error: write EPIPE
 at afterWriteDispatched (node:internal/stream_base_commons:161:15)
 at writeGeneric (node:internal/stream_base_commons:152:3)
 at Socket._writeGeneric (node:net:958:11)
 at Socket._write (node:net:970:8)
 at doWrite (node:internal/streams/writable:598:12)
 at clearBuffer (node:internal/streams/writable:783:7)
 at onwrite (node:internal/streams/writable:653:7)
 at WriteWrap.onWriteComplete [as oncomplete] (node:internal/stream_base_commons:107:10)
 Emitted 'error' event on Socket instance at:
 at emitErrorNT (node:internal/streams/destroy:169:8)
 at emitErrorCloseNT (node:internal/streams/destroy:128:3)
 at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
 errno: -4047,
 code: 'EPIPE',
 syscall: 'write'
 }



// for this code:
const { spawn } = require("child_process");

module.exports = (socket, pool) => {
 
 let stream_req_token = "";
 let rtmps_array = [];

 socket.on("stream_request_token",async(token)=>{

 const stream_reqs = await pool`SELECT * FROM stream_request WHERE token=${token}`;


 if(stream_reqs.length === 0){
 
 socket.emit("token_validation_response",false);
 return;
 }

 const stream_req = stream_reqs[0];

 if(!stream_req.is_valid){

 socket.emit("token_validation_response",false);
 return;
 }

 socket.emit("token_validation_response",true);

 stream_req_token = token;


 })

 socket.on("rtmps_array",async(array)=>{

 try{

 const rtmps = JSON.parse(array);

 const streams_requests = await pool`SELECT id FROM stream_request WHERE token=${stream_req_token}`;
 const stream_req_id = streams_requests[0].id;

 rtmps_array = rtmps;

 rtmps.map(async(rtmp)=>{

 await pool`INSERT INTO stream (platform,url,url_key,stream_request_id) 
 VALUES(${rtmp.platform},${rtmp.url},${rtmp.key},${stream_req_id})`;
 })

 socket.emit("rtmps_array_response",true);

 } catch(err){

 console.log(err);
 socket.emit("rtmps_array_response",false);

 }


 })

 //Start Streaming
 let ffmpegProcess = null;
 let isStreaming = false; // Flag to track streaming state

 socket.on("stream", (chunk) => {
 if (!ffmpegProcess) {
 console.log('Initializing FFmpeg process...');

 // Spawn FFmpeg process
 const resolution = "1280x720"; // Change to "1920x1080" for 1080p

 const ffmpegArgs = [
 "-i", "pipe:0", // Input from stdin
 "-c:v", "libx264", // Video codec
 "-preset", "veryfast", // Low latency encoding
 "-b:v", "4500k", // Target average bitrate (4.5 Mbps)
 "-minrate", "2500k", // Minimum bitrate
 "-maxrate", "6000k", // Maximum bitrate
 "-bufsize", "16000k", // Buffer size (twice the max bitrate)
 "-r", "30", // **FORCE 30 FPS**
 "-g", "60", // Keyframe interval (every 2 seconds)
 "-tune", "zerolatency", // Low latency tuning
 "-sc_threshold", "0", // Constant bitrate enforcement
 "-flags", "+global_header",
 
 // **Resolution Fix**
 "-s", resolution, // **Set resolution to 720p or 1080p**
 "-aspect", "16:9", // **Maintain aspect ratio**
 
 // **Frame Rate Fix**
 "-vsync", "cfr", // **Forces Constant Frame Rate (CFR)**
 "-fps_mode", "cfr", // **Prevents FFmpeg from auto-adjusting FPS**
 
 // Audio settings
 "-c:a", "aac",
 "-b:a", "128k", // Audio bitrate
 "-ar", "44100", // Audio sample rate
 "-ac", "2", // Stereo audio
 
 "-f", "flv" // Output format
 ];
 
 // Map the streams to multiple RTMP destinations
 rtmps_array.forEach((rtmp) => {
 ffmpegArgs.push("-map", "0:v:0", "-map", "0:a:0", "-f", "flv", `${rtmp.url}/${rtmp.key}`);
 });
 
 // Spawn FFmpeg process
 ffmpegProcess = spawn("ffmpeg", ffmpegArgs);

 ffmpegProcess.stderr.on('data', (data) => {
 console.log(`FFmpeg STDERR: ${data}`);
 });

 ffmpegProcess.on('close', (code) => {
 console.log(`FFmpeg process closed with code ${code}`);
 ffmpegProcess = null; // Reset process
 isStreaming = false; // Reset streaming state
 });

 ffmpegProcess.on('error', (err) => {
 console.error(`FFmpeg process error: ${err.message}`);
 ffmpegProcess = null; // Reset process
 isStreaming = false; // Reset streaming state
 });

 console.log('FFmpeg process started.');
 isStreaming = true; // Set streaming state to true
 }

 // Write chunk to FFmpeg process
 if (isStreaming && ffmpegProcess && ffmpegProcess.stdin && !ffmpegProcess.stdin.destroyed) {
 try {
 ffmpegProcess.stdin.write(chunk); // Write chunk to stdin
 console.log('Chunk written to FFmpeg.');
 } catch (err) {
 console.error('Error writing chunk to FFmpeg stdin:', err.message);
 }
 } else {
 console.error('FFmpeg process or stdin is not ready.');
 }
});

socket.on("stop-stream", async() => {
 console.log('Stream Stopped.');

 if(stream_req_token.length !== 0){

 await pool`UPDATE stream_request 
 SET is_valid=false
 WHERE token=${stream_req_token}`

 await pool`DELETE FROM current_streams WHERE id=${stream_req_token}`; 
 }

 if (ffmpegProcess) {
 isStreaming = false; // Set streaming state to false

 try {
 // Check if stdin is open before closing
 if (ffmpegProcess.stdin) {
 ffmpegProcess.stdin.end(); // End stdin safely
 }

 // Wait for FFmpeg to close before setting to null
 ffmpegProcess.on("close", () => {
 console.log("FFmpeg process closed.");
 ffmpegProcess = null;
 });

 // Kill FFmpeg process
 ffmpegProcess.kill("SIGTERM"); // Use SIGTERM for graceful exit

 } catch (err) {
 console.error("Error while stopping FFmpeg:", err.message);
 }
 } else {
 console.log("No active FFmpeg process.");
 }
});


socket.on('error', (err) => {
 console.error('Socket error:', err);
});

socket.on("disconnect", async() => {
 console.log('Client Disconnected.');

 if(stream_req_token.length !== 0){

 await pool`UPDATE stream_request 
 SET is_valid=false
 WHERE token=${stream_req_token}`;

 await pool`DELETE FROM current_streams WHERE id=${stream_req_token}`; 

 }
 
 if (ffmpegProcess) {
 isStreaming = false; // Set streaming state to false

 try {
 // Check if stdin is open before closing
 if (ffmpegProcess.stdin) {
 ffmpegProcess.stdin.end(); // End stdin safely
 }

 // Wait for FFmpeg to close before setting to null
 ffmpegProcess.on("close", () => {
 console.log("FFmpeg process closed.");
 ffmpegProcess = null;
 });

 // Kill FFmpeg process
 ffmpegProcess.kill("SIGTERM"); // Use SIGTERM for graceful exit


 } catch (err) {
 console.error("Error while stopping FFmpeg:", err.message);
 }
 } else {
 console.log("No active FFmpeg process.");
 }
});

};