
Media (0)


No media matching your criteria is available on the site.

Other articles (85)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
    Once it has been activated, a preconfiguration is put in place automatically by MediaSPIP init, so the new feature is operational straight away. It is therefore not necessary to go through a configuration step for this.

On other sites (8389)

  • exit when done ffmpeg [closed]

    28 December 2012, by Giacomo King Patermo

    How can I close the Command Prompt when FFmpeg finishes converting? I tried this code:

    ffmpeg.exe -loglevel quiet -i "input.avi" -qscale 0 "output.avi"

    but it does not exit when FFmpeg finishes converting; it only hides the output.

    Thanks.
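
    One way to get the Command Prompt window to close, sketched here as a suggestion rather than something taken from the thread: run the command from a .bat file that ends with exit, which terminates the cmd session once ffmpeg has returned.

    rem minimal sketch: a batch file that converts and then closes the window
    ffmpeg.exe -loglevel quiet -i "input.avi" -qscale 0 "output.avi"
    rem "exit" ends the Command Prompt session once ffmpeg has finished
    exit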

  • Run docker container with FFmpeg RTSP stream over WebSockets

    18 February, by bmvr

    I have created a Node.js application that spawns FFmpeg with Node's spawn library.

    Here's the backend sample:

    const startStreamWs = (cameraId, rtsp_url) => {
  console.log(`Starting stream for camera: ${cameraId}`);

  const ffmpeg = spawn("ffmpeg", [
    "-rtsp_transport", "tcp", // Use TCP for reliable streaming
    "-i", rtsp_url,
    "-analyzeduration", "5000000",  // Increase analyzeduration
    "-probesize", "5000000",       // Increase probesize
    "-fflags", "nobuffer",         // Reduce buffering
    "-flags", "low_delay",         // Low latency
    "-strict", "experimental",
    "-max_delay", "200000",        // Reduce max delay for faster response
    "-bufsize", "2M",              // Buffer size for smoother streaming
    "-f", "mpegts",                // MPEG-TS container for streaming
    "-codec:v", "mpeg1video",         // MPEG-1 video codec
    "-s", "1280x720",              // Video resolution
    "-r", "25",                    // Frame rate (25 fps)
    "-b:v", "1500k",               // Bitrate for video
    "-maxrate", "2000k",           // Maximum bitrate
    "-bufsize", "2M",              // Buffer size (needed with maxrate)
    "-bf", "0",                    // Disable B-frames for lower latency
    "-an",                         // Disable audio
    "-"
]);


  ffmpeg.stdout.on("data", (data) => {
    if (cameraStreams[cameraId]) {
      console.log(`Data sent for camera ${cameraId}`);
      // Broadcast stream data to all connected clients
      for (const client of cameraStreams[cameraId].clients) {
        if (client.readyState === ws.OPEN) {
          client.send(data);
        }
      }
    }
  });

  ffmpeg.stderr.on("data", (data) => {
    console.error(`FFmpeg stderr (Camera ${cameraId}): ${data.toString()}`);
    logErrorToFile(data);
  });

  ffmpeg.on("close", (code) => {
    console.log(`FFmpeg process exited for Camera ${cameraId} with code ${code}`);
    if (cameraStreams[cameraId]) {
      // Close all remaining clients
      for (const client of cameraStreams[cameraId].clients) {
        client.close();
      }
      delete cameraStreams[cameraId];
    }
  });

  return ffmpeg;
};
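
    The sample above uses ws, cameraStreams and logErrorToFile without showing where they come from. As a rough sketch of the surrounding setup it assumes (the port, the log file path and the rtspUrlFor() helper are assumptions, not taken from the post), the WebSocket side might look like this:

const { spawn } = require("child_process"); // used by startStreamWs above
const fs = require("fs");
const ws = require("ws"); // the "ws" package; ws.OPEN and ws.Server come from here

// cameraId -> { process, clients }
const cameraStreams = {};

// Append FFmpeg's stderr output to a log file (the path is an assumption)
const logErrorToFile = (data) => fs.appendFile("ffmpeg-errors.log", data, () => {});

// Listen on 0.0.0.0 so the server is reachable from outside a container,
// not only on the container's own loopback interface.
const wss = new ws.Server({ host: "0.0.0.0", port: 8085 });

wss.on("connection", (client, req) => {
  const cameraId = new URL(req.url, "http://localhost").searchParams.get("cameraId");
  if (!cameraStreams[cameraId]) {
    cameraStreams[cameraId] = { clients: new Set(), process: null };
    // rtspUrlFor() is a hypothetical lookup of the camera's RTSP URL
    cameraStreams[cameraId].process = startStreamWs(cameraId, rtspUrlFor(cameraId));
  }
  cameraStreams[cameraId].clients.add(client);
  client.on("close", () => cameraStreams[cameraId]?.clients.delete(client));
});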


    


    Front-end sample, my Angular component:

    


import { Component, OnDestroy, OnInit } from '@angular/core';
import { FormsModule } from '@angular/forms';
import { ActivatedRoute } from '@angular/router';

declare var JSMpeg: any; // Declare JSMpeg from the global script

@Component({
  selector: 'app-video-player',
  templateUrl: './video-player.component.html',
  styleUrls: ['./video-player.component.css'],
  standalone: false
})
export class VideoPlayerComponent implements OnInit, OnDestroy {
  stepSize: number = 0.1;
  private player: any;
  cameraId: string | null = null;
  ws: WebSocket | null = null;
  wsUrl: string | null = null;

  constructor(private route: ActivatedRoute) {
    this.cameraId = this.route.snapshot.paramMap.get('id');
    this.wsUrl = `ws://localhost:8085?cameraId=${this.cameraId}`;
    this.ws = new WebSocket(this.wsUrl);
  }

  async ngOnInit(): Promise<void> {
    const canvas = document.getElementById('videoCanvas') as HTMLCanvasElement;
    this.player = new JSMpeg.Player(this.wsUrl, { canvas: canvas });
  }

  async ngOnDestroy(): Promise<void> {
    this.ws?.close(1000, "Exiting");
  }

  getStepSize(): number {
    return this.stepSize;
  }
}


    On localhost it works fine, but once I containerize it, it doesn't: I can serve the website but not the stream. The container has the same FFmpeg version (7.1) and the codec is available, although locally I run on macOS while the Docker image runs Ubuntu.

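    A plausible cause to rule out, offered as a guess rather than a confirmed diagnosis: inside Docker the Node server has to listen on 0.0.0.0 (as in the sketch after the backend sample) and the WebSocket port has to be published, because the browser's ws://localhost:8085 resolves on the host machine, not inside the container. With a hypothetical image name, that would be something like:

    docker run -p 8085:8085 <image>

    If the port is not published (or the server only binds to 127.0.0.1 inside the container), the website can still be served over whatever port is exposed for HTTP while the MPEG-TS stream never reaches the client.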

  • how to read/write the streams of ProcessBuilder and finally create an output file (output.mp4) with ffmpeg

    18 January 2015, by user2303069

    I am trying to transcode a video using ffmpeg with java.ProcessBuilder. I have read the other threads on this, but they don't really solve my problem. I think everything goes well up to Process p = pb.start(). I know I have to read some stream from the ProcessBuilder, convert it to a byte[] and write that to a file, but I thought ffmpeg would do all that for me since I'm passing the "inputFile" and outputFile paths. Anyway, I am posting what I have so far. Can you provide the right code so that I end up with a fully transcoded video file as the result?
    Currently all the code executes with no errors and the output file is created, but its size is 0 KB. I'm running the code from a Windows server in a Java (.jar) program.

    This is a snippet of the program

    try {
        File encodingFile = new File("c:\\ffmpeg\\bin\\output.mp4");
        File errorFile = new File("c:\\ffmpeg\\bin\\error.txt");
        errorFile.createNewFile();
        encodingFile.createNewFile();

        // "-s" expects a single WxH value; the original "360" + "480" + "-1"
        // produced the invalid argument "360480-1", which makes ffmpeg fail
        // and leaves a 0 KB output file.
        ProcessBuilder pb = new ProcessBuilder(
                "c:\\ffmpeg\\bin\\ffmpeg.exe",
                "-i", "c:\\ffmpeg\\bin\\input.mp4",
                "-y",
                "-s", "360x480",
                "-vcodec", "libx264",
                "c:\\ffmpeg\\bin\\output.mp4");
        pb.redirectError(errorFile); // ffmpeg writes its log to stderr

        Process p = pb.start();
        // ffmpeg reads input.mp4 and writes output.mp4 itself, so there is no
        // need to copy bytes through the process streams; just wait for it.
        int exitCode = p.waitFor();

        gui.textArea1.append("\n+++++++Finished converting, ffmpeg exit code: " + exitCode + "++++++++\n");
    } catch (IOException | InterruptedException ex) {
        gui.textArea1.append("\nException while converting: " + ex.getMessage());
    }
    gui.textArea1.append("converting finished!");