
Other articles (13)

  • Other interesting software

    13 April 2011

    We don't claim to be the only ones doing what we do, and we certainly don't claim to be the best at it. We simply try to do it well and to keep getting better.
    The list below covers software that is more or less similar to MediaSPIP, or that aims at roughly the same things MediaSPIP does.
    We don't know these projects and we haven't tried them, but you can take a peek.
    Videopress
    Website: http://videopress.com/
    License: GNU/GPL v2
    Source code: (...)

  • The plugin: Podcasts

    14 July 2010

    Podcasting is, once again, a problem that reveals how data-transport standards are handled on the Internet.
    Two interesting formats exist: the one developed by Apple, strongly tied to iTunes, whose spec is here; and the "Media RSS Module" format, which is more "open" and is notably backed by Yahoo and the Miro software.
    File types supported in the feeds
    Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)
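
    As a hedged illustration of how such a whitelist is typically applied (only the extensions quoted above are included; the full list lives in Apple's spec, and the names below are made up for the example):

// Only the types quoted in the excerpt; the complete list is in Apple's podcast spec.
const appleFeedTypes = {
    '.mp3': 'audio/mpeg',
    '.m4a': 'audio/x-m4a',
    // '.mp4': ... (the corresponding MIME type is truncated in the excerpt)
};

// Returns the MIME type to put in the enclosure, or null if the file
// would not be accepted in an Apple-style podcast feed.
function enclosureTypeFor(filename) {
    const dot = filename.lastIndexOf('.');
    const ext = dot === -1 ? '' : filename.slice(dot).toLowerCase();
    return appleFeedTypes[ext] || null;
}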

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010

    The central/master site of the farm needs several plugins in addition to those of the channels in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to handle registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

On other sites (3992)

  • Fluent-FFMPEG redirects to home page when recording is finished

    25 May 2022, by Myles Jefferson

    I am using fluent-ffmpeg with my Node.js and Express server to record videos from an RTSP stream. The issue is that once the recording command finishes, my React client always redirects to the home page of my application, even though that is not the behaviour I want. I want the user to stay on the page with the RTSP stream and simply receive a toast notification saying that the recording is finished. As it stands, the page redirects before the notification has a chance to display. Is this an issue with my server-side or client-side code?

    


    Node.js

    


    export const startRecording = async (req, res) => {
  const camera = req.params;
  if (camera.id in runningCommands) { return res.json({ "status": "failure", "error": "Recording already in progress" }) }
  const { recordTime, uid } = req.body;
  let conn = createConnection(config);
  conn.connect();
  let query = 'SELECT * FROM cameras WHERE id = ?';
  conn.query(query, [camera.id], (error, rows) => {
    if (error) { return res.json({ "status": "failure", "error": error }) }
    const camera = rows[0];
    const { ip, fname } = camera;
    const currentDate = new Date().toLocaleString().replace(/[:/\s]/g, '-').replace(',', '');
    const filename = `${fname}-${currentDate}`;

    try {
      // FFmpeg command to start recording
      const command = ffmpeg(`rtsp://${ip}/axis-media/media.amp`)
        .videoCodec('libx264')
        .size('1280x720')
        .duration(recordTime)
        .on('start', commandLine => {
          runningCommands[camera.id] = command
          console.log(`Spawned Ffmpeg with command: ${commandLine}`)
        })
        .on('error', err => console.error(err))
        .on('end', () => {
          delete runningCommands[camera.id]
          console.log('Recording Complete')
          takeScreenshot(filename, `./public/recordings/mp4/${filename}.mp4`)
          conn.query('INSERT INTO recordings (uid, cid, filename) VALUES (?, ?, ?)', [uid, camera.id, filename], () => conn.end())
          res.json({ "status": "success", "message": "Recording ended" })
        })
        .save(`./public/recordings/mp4/${filename}.mp4`);
    } catch (error) { console.error(error)}
  })
}
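
    For context, here is a hedged sketch of how this handler might be mounted in Express; the route path is inferred from the client call to record/startRecording/${id}, and the file path is an assumption since the actual wiring is not shown in the post:

import express from 'express';
import { startRecording } from './controllers/recordController.js'; // hypothetical path

const recordRouter = express.Router();
// POST /record/startRecording/:id -> startRecording (matches req.params.id above)
recordRouter.post('/startRecording/:id', startRecording);

// app.use('/record', recordRouter); // mount point assumed from the client URL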


    


    React:

    


    const handleRecording = async () => {
    try {
      setIsRecording(true)
      const futureTime = new Date().getTime() + recordTime * 1000
      const finishedTime = new Date(futureTime).toLocaleTimeString()
      setTimeRemaining(finishedTime)
      const { data } = await publicRequest.post(`record/startRecording/${id}`, { recordTime, uid: user.id })
      window.location.reload(false)
      if (data.status === 'success') {
        setIsRecording(false)
        toast('Recording finished!', { type: 'success' })
      } else {
        setIsRecording(true)
        toast('Recording already in progress!', { type: 'error' })
      }
    } catch (error) {
      console.error(error)
    }
  }
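
    Worth noting about the handler above: window.location.reload(false) runs unconditionally as soon as the request resolves, before the toast has a chance to render. A minimal sketch of the same handler without the reload (an assumption about the intended behaviour, not something confirmed in the post) would be:

const handleRecording = async () => {
  try {
    setIsRecording(true)
    const futureTime = new Date().getTime() + recordTime * 1000
    setTimeRemaining(new Date(futureTime).toLocaleTimeString())
    // Wait for the server to answer, but do not reload the page,
    // so the component stays mounted and the toast can render.
    const { data } = await publicRequest.post(`record/startRecording/${id}`, { recordTime, uid: user.id })
    if (data.status === 'success') {
      setIsRecording(false)
      toast('Recording finished!', { type: 'success' })
    } else {
      setIsRecording(true)
      toast('Recording already in progress!', { type: 'error' })
    }
  } catch (error) {
    console.error(error)
    setIsRecording(false)
  }
}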


    


  • Deploying on Google App Engine: An error occurred: ffmpeg was killed with signal SIGABRT Error: ffmpeg was killed with signal SIGABRT

    27 August 2020, by Jérémy Gachon

    I wrote a Node.js API using fluent-ffmpeg:

    


    'use strict';
require('babel-register');
const path = require('path');    
const ffmpeg = require('fluent-ffmpeg');


    


    [...]

    


    var infs = new ffmpeg

infs.addInput(doc.data().url).outputOptions([
            '-preset slow', '-g 48', '-sc_threshold 0',
            '-map 0:0', '-map 0:1', '-map 0:0', '-map 0:1',
            '-s:v:0 1280x720', '-c:v:0 libx264', '-b:v:0 2000k',
            // "-var_stream_map", "'v:0,a:0 v:1,a:1'",
            '-master_pl_name ./' + req.params.id + '/master' + req.params.id + '.m3u8',
            '-f hls', '-hls_time 6', '-hls_list_size 0',
            '-hls_segment_filename ./' + req.params.id + '/fileSequence|' + req.params.id + '|%d|v%v.ts',
            '-max_muxing_queue_size 1024',
        ]).output('./' + req.params.id + '/video' + req.params.id + '.m3u8')
            .on('start', function (commandLine) {
                console.log('Spawned Ffmpeg with command: ' + commandLine);
            })
            .on('error', function (err, stdout, stderr) {
                console.log('An error occurred: ' + err.message, err, stderr);
            })
            .on('progress', function (progress) {
                console.log('Processing: ' + progress.percent + '% done')
            })
            .on('end', function (err, stdout, stderr) {

                console.log('Finished processing!' /*, err, stdout, stderr*/)
            })
            .run()
        res.status(200).send('GG').end();
    } 
   });


    


    [...]

    


    That works when I run

    node app.js

    on my MacBook Pro, but when I do

    gcloud app deploy

    and then call the public URL, I get these logs:

    


    Processing: undefined% done
An error occurred: ffmpeg was killed with signal SIGABRT Error: ffmpeg was killed with signal SIGABRT 
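
    One thing the code excerpt above does not show is whether the per-request output directory ./<req.params.id>/ exists before ffmpeg tries to write the playlist and segments into it; ffmpeg does not create missing directories. A small defensive check (a hedged suggestion, not a confirmed cause of the SIGABRT) could be added before run():

const fs = require('fs');

// Make sure the directory ffmpeg will write into exists; ffmpeg won't create it.
const outDir = './' + req.params.id;
if (!fs.existsSync(outDir)) {
    fs.mkdirSync(outDir, { recursive: true });
}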


    


    Here is my app.yaml:

    


        # Copyright 2017, Google, Inc.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# [START gae_flex_quickstart_yaml]
runtime: nodejs
env: flex

# This sample incurs costs to run on the App Engine flexible environment.
# The settings below are to reduce costs during testing and are not appropriate
# for production use. For more information, see:
# https://cloud.google.com/appengine/docs/flexible/nodejs/configuring-your-app-with-app-yaml
manual_scaling:
  instances: 1
resources:
  cpu: 1
  memory_gb: 6
  disk_size_gb: 30

# [END gae_flex_quickstart_yaml]
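
    As a first diagnostic (a hedged suggestion, not part of the original question), it can help to confirm that an ffmpeg binary is actually present and runnable inside the deployed App Engine container, for example by logging ffmpeg -version when the instance starts:

const { spawn } = require('child_process');

// Log the ffmpeg version (or the spawn error) at startup to confirm
// the binary exists inside the deployed container.
const probe = spawn('ffmpeg', ['-version']);
probe.stdout.on('data', d => console.log(d.toString()));
probe.on('error', err => console.error('ffmpeg not found:', err));
probe.on('close', code => console.log('ffmpeg -version exited with code', code));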


    



    


    How can I correctly deploy my Node.js API on Google App Engine?

    


    Thanks in advance.

    


    Jérémy.

    


  • How to stream to NestJS if the data is transmitted through a NATS broker?

    13 August 2023, by О. Войтенко

    I have an RTSP-to-MPEG-TS streamer (ffmpeg wrapped in Express) that publishes the stream to a NATS JetStream.

    


    const options =
     [
         "-rtsp_transport", "tcp",
         "-i", streamUrl,
         '-f', 'mpegts',
         '-pix_fmt', 'yuv420p',
         '-c:v', 'h264',
         '-b:v', '800k',
         '-r', '30',
         '-muxdelay', '0.4',
         '-movflags', 'frag_keyframe+empty_moov',
         '-'
     ];


    


    try {
         const natsConnection = await nats.connect({
             servers: 'ws://127.0.0.1:9222',
             preserveBuffers: true
         })

     const stream = spawn('ffmpeg', options);

     stream.stdout.on('data', (chunk) => {
         console.log(chunk);
         natsConnection.publish('stream.1', chunk);
     });

         stream.stdout.on('error', (error) => {
             console.error('FFmpeg error:', error);
         });

         stream.stdout.on('close', (code) => {
             console.log('FFmpeg process closed with code:', code);
         });
     } catch (e) {
         console.error(e);
     }


    


    On the NestJS side, the application subscribes to this subject when a visitor hits the route. The controller method itself:

    


    @Get()
   async findOne(
     @Headers() headers,
     @Res() res,
   ) {
     const streamObservable = this.natsService.subscribeStreamEvents('1');

     // Set appropriate headers for video streaming
     res.setHeader('Content-Type', 'video/mp4');
     res.setHeader('Transfer-Encoding', 'chunked');

     res.status(200);

     // Write data chunks directly to the response
     const subscription = streamObservable.subscribe({
       next: (chunk) => {
         console.log('wwww', chunk);
         res.write(chunk);
       },
       error: (error) => {
         console.error('Error streaming video:', error);
       },
       complete: () => {
         res.end();
       },
     });

     res.on('close', () => {
       subscription.unsubscribe();
     });
   }


    


    And here is the service method:

    


    subscribeToCameraEvents(cameraId: string) {
      const subject = new Subject<any>();

      this.natsConnection.subscribe(`stream.${cameraId}`, {
        callback: (err, msg: Msg) => {
          if (err) {
            subject.error(err);
            return;
          }
          // console.log(msg.data);
          subject.next(msg.data);
        },
      });

      return subject.asObservable();
    }

    The data is logged to the console as a Buffer.

    Trying to play the stream:

    <video controls="controls" width="640" height="480" style="background: black">
      <source src="http://localhost:3301/api/streams" type="video/mp4">
      Your browser does not support the video tag.
    </video>

    The route in the source attribute is correct.

    The problem is that in VLC and ffplay, localhost:3301/api/streams is displayed and everything works fine without any additional settings.

    But in the HTML page it refuses to play.

    **What am I doing wrong?**

    I will be glad to get any comments and additional questions.

    I have tried canvas and jsmpeg (not working either), and @Res({ passthrough: true }): with passthrough the stream does not work in VLC, ffplay or HTML (without passthrough the stream works only in VLC and ffplay).
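
    One possibly relevant detail in the publisher options above (an observation, not something confirmed in the question): -movflags frag_keyframe+empty_moov is an option of the mp4 muxer, so combined with -f mpegts it has no effect, and browsers generally cannot play a raw MPEG-TS byte stream in a <video> tag the way VLC and ffplay can. A minimal sketch of options that instead emit fragmented MP4 to stdout, assuming that is the intent since the response is served as video/mp4, might look like:

const options = [
    '-rtsp_transport', 'tcp',
    '-i', streamUrl,
    '-c:v', 'h264',
    '-pix_fmt', 'yuv420p',
    '-b:v', '800k',
    '-r', '30',
    // mp4 muxer, so the fragmentation flags below actually apply
    '-f', 'mp4',
    '-movflags', 'frag_keyframe+empty_moov',
    '-'
];

    The rest of the pipeline (spawn, publish to NATS, res.write in the controller) would stay the same.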
