
Other articles (85)
- Websites made with MediaSPIP
2 May 2011
This page lists some websites based on MediaSPIP.
- Creating farms of unique websites
13 April 2011
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things):
  - implementation costs to be shared between several different projects / individuals
  - rapid deployment of multiple unique sites
  - creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
- Multilang: improving the interface for multilingual blocks
18 February 2011
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
Once activated, a preconfiguration is automatically set up by MediaSPIP init, making the new feature operational right away. It is therefore not necessary to go through a configuration step for this.
On other sites (8389)
- exit when done ffmpeg [closed]
28 December 2012, by Giacomo King Patermo
How do I close the Command Prompt when FFmpeg finishes converting? I tried this command:
ffmpeg.exe -loglevel quiet -i "input.avi" -qscale 0 "output.avi"
but it does not exit when FFmpeg finishes converting; it only hides the output.
Thanks.
- Run docker container with FFmpeg RTSP stream on WebSockets
18 February, by bmvr
I have created a Node.js application that uses FFmpeg via Node's spawn library.


Here's the backend sample:


const { spawn } = require("child_process"); // needed for spawning ffmpeg
// cameraStreams, ws (the "ws" package) and logErrorToFile are defined elsewhere in the application

const startStreamWs = (cameraId, rtsp_url) => {
  console.log(`Starting stream for camera: ${cameraId}`);

  const ffmpeg = spawn("ffmpeg", [
    "-rtsp_transport", "tcp",      // Use TCP for reliable streaming
    "-i", rtsp_url,
    "-analyzeduration", "5000000", // Increase analyzeduration
    "-probesize", "5000000",       // Increase probesize
    "-fflags", "nobuffer",         // Reduce buffering
    "-flags", "low_delay",         // Low latency
    "-strict", "experimental",
    "-max_delay", "200000",        // Reduce max delay for faster response
    "-f", "mpegts",                // MPEG-TS container for streaming
    "-codec:v", "mpeg1video",      // MPEG-1 video codec (what JSMpeg expects)
    "-s", "1280x720",              // Video resolution
    "-r", "25",                    // Frame rate (25 fps)
    "-b:v", "1500k",               // Bitrate for video
    "-maxrate", "2000k",           // Maximum bitrate
    "-bufsize", "2M",              // Buffer size (needed with maxrate)
    "-bf", "0",                    // Disable B-frames for lower latency
    "-an",                         // Disable audio
    "-"                            // Write the MPEG-TS stream to stdout
  ]);

  ffmpeg.stdout.on("data", (data) => {
    if (cameraStreams[cameraId]) {
      console.log(`Data sent for camera ${cameraId}`);
      // Broadcast stream data to all connected clients
      for (const client of cameraStreams[cameraId].clients) {
        if (client.readyState === ws.OPEN) {
          client.send(data);
        }
      }
    }
  });

  ffmpeg.stderr.on("data", (data) => {
    console.error(`FFmpeg stderr (Camera ${cameraId}): ${data.toString()}`);
    logErrorToFile(data);
  });

  ffmpeg.on("close", (code) => {
    console.log(`FFmpeg process exited for Camera ${cameraId} with code ${code}`);
    if (cameraStreams[cameraId]) {
      // Close all remaining clients
      for (const client of cameraStreams[cameraId].clients) {
        client.close();
      }
      delete cameraStreams[cameraId];
    }
  });

  return ffmpeg;
};
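
The sample above references cameraStreams, ws and logErrorToFile without showing how the WebSocket side is wired up. As a rough sketch only (not the author's code), assuming the `ws` npm package and a hypothetical getRtspUrlForCamera() lookup, the server that feeds clients into cameraStreams could look roughly like this:

const WebSocket = require("ws");

// One entry per camera: the ffmpeg child process plus the set of connected clients.
const cameraStreams = {};

const wss = new WebSocket.Server({ port: 8085 });

wss.on("connection", (socket, req) => {
  // The frontend connects with ws://host:8085?cameraId=<id>
  const cameraId = new URL(req.url, "http://localhost").searchParams.get("cameraId");
  const rtspUrl = getRtspUrlForCamera(cameraId); // hypothetical lookup, e.g. from config or a DB

  if (!cameraStreams[cameraId]) {
    cameraStreams[cameraId] = { clients: new Set(), process: null };
    cameraStreams[cameraId].process = startStreamWs(cameraId, rtspUrl);
  }
  cameraStreams[cameraId].clients.add(socket);

  socket.on("close", () => {
    const entry = cameraStreams[cameraId];
    if (!entry) return;
    entry.clients.delete(socket);
    // Stop ffmpeg when the last viewer disconnects
    if (entry.clients.size === 0) {
      entry.process.kill("SIGTERM");
      delete cameraStreams[cameraId];
    }
  });
});

The Angular component below connects with ?cameraId=<id> in the URL, which is why this sketch parses the camera id from the request URL.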



Front-end sample, my Angular component:


import { Component, OnDestroy, OnInit } from '@angular/core';
import { FormsModule } from '@angular/forms';
import { ActivatedRoute } from '@angular/router';

declare var JSMpeg: any; // Declare JSMpeg from the global script

@Component({
  selector: 'app-video-player',
  templateUrl: './video-player.component.html',
  styleUrls: ['./video-player.component.css'],
  standalone: false
})
export class VideoPlayerComponent implements OnInit, OnDestroy {
  stepSize: number = 0.1;
  private player: any;
  cameraId: string | null = null;
  ws: WebSocket | null = null;
  wsUrl: string | null = null;

  constructor(private route: ActivatedRoute) {
    this.cameraId = this.route.snapshot.paramMap.get('id');
    this.wsUrl = `ws://localhost:8085?cameraId=${this.cameraId}`;
    this.ws = new WebSocket(this.wsUrl);
  }

  async ngOnInit(): Promise<void> {
    const canvas = document.getElementById('videoCanvas') as HTMLCanvasElement;
    // Note: JSMpeg opens its own WebSocket to wsUrl, separate from this.ws
    this.player = new JSMpeg.Player(this.wsUrl, { canvas: canvas });
  }

  async ngOnDestroy(): Promise<void> {
    this.player?.destroy(); // also closes the player's own WebSocket
    this.ws?.close(1000, "Exiting");
  }

  getStepSize(): number {
    return this.stepSize;
  }
}




On localhost it works fine, but once I containerize it, it does not. I can serve the website but not the stream.
I have the same FFmpeg version (7.1) and the codec is available, although I run it locally on macOS and on Ubuntu inside Docker.
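
Nothing in the excerpt shows how the image is built, so one quick check (a diagnostic sketch, not part of the author's code) is to confirm at container startup that the ffmpeg binary the Node process spawns is actually present on the PATH inside the container:

const { execFile } = require("child_process");

// Startup check: confirm the ffmpeg binary is reachable inside the container
// and log its version, so a missing or incompatible binary shows up immediately.
execFile("ffmpeg", ["-version"], (err, stdout, stderr) => {
  if (err) {
    console.error("ffmpeg is not available inside the container:", err.message);
    return;
  }
  console.log("ffmpeg found:", stdout.split("\n")[0]);
});

If this logs an error inside the container but not on macOS, the image is missing ffmpeg or it lives at a different path than the one passed to spawn.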


- how to read/write streams of ProcessBuilder and finally create an output file (output.mp4) with ffmpeg
18 January 2015, by user2303069
I am trying to transcode a video using ffmpeg with java.ProcessBuilder. I have read the other threads regarding this, but they don't really solve my problem. I think everything goes well up to Process p = pb.start(). I know I have to read some stream of the ProcessBuilder, convert it to a byte[], and write that to a file. I thought ffmpeg would do all that for me since I'm passing the input and output file paths. Anyway, I post what I have so far. Can you provide me with the right code so that I can see a fully transcoded video file as a result?
Currently all the code executes with no errors and the output file is created, but its size is 0 KB. I'm running the code from a Windows server in a Java (.jar) program. This is a snippet of the program:
FileInputStream fileInputStream = null;
try {
    File encodingFile = new File("c:\\ffmpeg\\bin\\output.mp4");
    File errorFile = new File("c:\\ffmpeg\\bin\\error.txt");
    errorFile.createNewFile();
    encodingFile.createNewFile();
    // "-s" expects a single WIDTHxHEIGHT value
    ProcessBuilder pb = new ProcessBuilder("c:\\ffmpeg\\bin\\ffmpeg.exe", "-i", "c:\\ffmpeg\\bin\\input.mp4", "-y", "-s", "360x480", "-vcodec", "libx264", "c:\\ffmpeg\\bin\\output.mp4"); // or other command....
    // pb.redirectErrorStream(true); // would override redirectError(errorFile) and leave stderr on an unread pipe
    pb.redirectError(errorFile);
    //pb.redirectOutput(encodingFile);
    Process p = pb.start();
    p.waitFor(); // wait until ffmpeg has finished writing output.mp4
    byte[] videoIn;
    File file = new File("c:\\ffmpeg\\bin\\input.mp4");
    videoIn = new byte[(int) file.length()];
    fileInputStream = new FileInputStream(file);
    fileInputStream.read(videoIn);
    gui.textArea1.append("\n+++++++Finished while converting!++++++++\n");
} catch (IOException | InterruptedException ex) {
    gui.textArea1.append("\nException converting: " + ex.getMessage());
} finally {
    try {
        if (fileInputStream != null) {
            fileInputStream.close();
        }
    } catch (IOException ioe) {
        gui.textArea1.append("\nClosing streams while converting:" + ioe.getMessage());
    }
}
gui.textArea1.append("converting finished!");