Other articles (62)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, announced here.
    The zip file provided here contains only the MediaSPIP sources, in standalone form.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, further modifications are also required (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP deemed "usable".
    The zip file provided here contains only the MediaSPIP sources, in standalone form.
    For a working installation, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, further modifications are also required (...)

  • Authorizations overridden by plugins

    27 April 2010

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors can edit their own information on the authors page

On other sites (14189)

  • Decoding an FFmpeg buffer video stream sent over a websocket for processing with OpenCV?

    4 October 2019, by Alexis Meneses

    I am having a problem trying to get a frame from a stream that I am sending over a websocket.
    I send the data from a webcam with ffmpeg using this command:

    ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 -f mpeg1video -b:v 800k -r 30 http://localhost:8092

    Later, I receive that stream and broadcast it from a Node.js server with the following code:

    var childProcess = require('child_process')
     , express = require('express')
     , http = require('http')
     , morgan = require('morgan')
     , ws = require('ws');

    // configuration files
    var configServer = require('./lib/config/server');

    // app parameters
    var app = express();
    app.set('port', configServer.httpPort);
    app.use(express.static(configServer.staticFolder));
    app.use(morgan('dev'));

    // serve index
    require('./lib/routes').serveIndex(app, configServer.staticFolder);

    // HTTP server
    http.createServer(app).listen(app.get('port'), function () {
     console.log('HTTP server listening on port ' + app.get('port'));
    });


    var STREAM_MAGIC_BYTES = 'jsmp'; // Must be 4 bytes
    var width = 320;
    var height = 240;

    // WebSocket server
    var wsServer = new (ws.Server)({ port: configServer.wsPort });
    console.log('WebSocket server listening on port ' + configServer.wsPort);

    wsServer.on('connection', function(socket) {

     var streamHeader = new Buffer(8);

     streamHeader.write(STREAM_MAGIC_BYTES);
     streamHeader.writeUInt16BE(width, 4);
     streamHeader.writeUInt16BE(height, 6);
     socket.send(streamHeader, { binary: true });
     console.log(streamHeader);

     console.log('New WebSocket Connection (' + wsServer.clients.length + ' total)');

     socket.on('close', function(code, message){
       console.log('Disconnected WebSocket (' + wsServer.clients.length + ' total)');
     });
    });

    wsServer.broadcast = function(data, opts) {
     for(var i in this.clients) {
       if(this.clients[i].readyState == 1) {
         this.clients[i].send(data, opts);
       }
       else {
         console.log('Error: Client (' + i + ') not connected.');
       }
     }
    };

    // HTTP server to accept incoming MPEG1 stream
    http.createServer(function (req, res) {
     console.log(
       'Stream Connected: ' + req.socket.remoteAddress +
       ':' + req.socket.remotePort + ' size: ' + width + 'x' + height
     );

     req.on('data', function (data) {
       wsServer.broadcast(data, { binary: true });
     });
    }).listen(configServer.streamPort, function () {
     console.log('Listening for video stream on port ' + configServer.streamPort);
    });

    module.exports.app = app;

    I am successfully receiving the data sent by this.clients[i].send(data, opts) in my Python program, but I don't know how to decode the information in order to process the image with OpenCV. Any idea?

    What I want to do is:

    import asyncio
    import websockets
    import cv2

    async def hello():
        uri = "ws://192.168.1.170:8094"  # URL of the websocket server
        async with websockets.connect(uri) as websocket:
            inf = await websocket.recv()
            # Process the data in order to show the image with OpenCV.
            print(inf)

    asyncio.get_event_loop().run_until_complete(hello())
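
    A minimal sketch of that processing step (not from the original post), assuming the first websocket message is the 8-byte 'jsmp' header written by the Node server above, every later message is raw mpeg1video data, the ffmpeg binary is on the PATH, and the frame size is the 320x240 hard-coded in the server: pipe the received chunks into an ffmpeg child process and read back raw BGR24 frames that OpenCV can display.

    import asyncio
    import subprocess

    import cv2
    import numpy as np
    import websockets

    WIDTH, HEIGHT = 320, 240  # must match the width/height written into the stream header

    async def show_stream():
        uri = "ws://192.168.1.170:8094"  # URL of the websocket server
        # ffmpeg decodes the incoming MPEG-1 elementary stream and writes raw BGR24 frames to stdout
        decoder = subprocess.Popen(
            ["ffmpeg", "-loglevel", "quiet",
             "-f", "mpeg1video", "-i", "pipe:0",
             "-f", "rawvideo", "-pix_fmt", "bgr24", "pipe:1"],
            stdin=subprocess.PIPE, stdout=subprocess.PIPE)

        async def feed():
            # Forward every binary websocket message into the decoder's stdin
            async with websockets.connect(uri) as websocket:
                await websocket.recv()           # discard the 8-byte 'jsmp' stream header
                async for chunk in websocket:    # each further message is MPEG-1 data
                    decoder.stdin.write(chunk)
                    decoder.stdin.flush()

        asyncio.ensure_future(feed())

        frame_size = WIDTH * HEIGHT * 3
        loop = asyncio.get_running_loop()
        while True:
            # Read one decoded frame from ffmpeg's stdout without blocking the event loop
            raw = await loop.run_in_executor(None, decoder.stdout.read, frame_size)
            if len(raw) < frame_size:
                break
            frame = np.frombuffer(raw, np.uint8).reshape((HEIGHT, WIDTH, 3))
            cv2.imshow("stream", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break

    asyncio.get_event_loop().run_until_complete(show_stream())

    Depending on how OpenCV was built, cv2.VideoCapture may also be able to read the MPEG-1 HTTP stream directly, but the sketch keeps the websocket transport used above.
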
  • How to use ffmpeg to convert video to audio with Node.js and Angular in a web app

    31 August 2021, by Amir Shahzad

    I want to convert video to audio using ffmpeg in Node.js, but I do not know how to implement it in the Angular app with Node.js.

    This is my Node.js code

    const express = require('express');
    const ffmpeg = require('fluent-ffmpeg');
    const fileUpload = require('express-fileupload');
    const cors = require('cors');
    const app = express();

    app.use(fileUpload({
      useTempFiles: true,
      tempFileDir: 'temp/'
    }));
    app.use(express.json());
    app.use(express.urlencoded({ extended: true }));
    app.use(cors({ origin: 'http://localhost:4200' }));

    ffmpeg.setFfmpegPath('/usr/bin/ffmpeg');

    // Converting mp4 to audio
    app.post('/mp4tomp3', (req, res) => {
      res.contentType('video/avi');
      res.attachment('output.mp3');
      req.files.mp4.mv('temp/' + req.files.mp4.name, function(err) {
        if (err) {
          res.status(500).send(err);
        } else {
          console.log('File Uploaded Successfully!');
        }
      });
      ffmpeg('temp/' + req.files.mp4.name)
        .toFormat('mp3')
        .on('end', function() {
          console.log('Done');
        })
        .on('error', function(err) {
          console.log('An Error Occurred: ' + err.message);
        })
        .pipe(res, { end: true });
    });

    app.listen(3000, () => {
      console.log('Server Start On Port 3000');
    });

    This code works fine when I use an index.html file in the Node.js app, but when I remove index.html and use the Angular app as the frontend, Node.js gives errors saying mp4 is not defined and mv is not defined. Please tell me how I can implement this with the Angular framework.

    This is my app.component.html file

    <div class="container">
      <h1>Video Processing App</h1>
      <form>
        <input type="file" formcontrolname="mp4" />
        <input type="submit" value="Convert" />
      </form>
    </div>

    This is my app.component.ts file


    import { Component, OnInit } from '@angular/core';
    import { FormBuilder, FormGroup, Validators } from '@angular/forms';
    import { VideoConversionService } from 'src/services/video-conversion.service';

    @Component({
      selector: 'app-root',
      templateUrl: './app.component.html',
      styleUrls: ['./app.component.css']
    })
    export class AppComponent implements OnInit {

      submitted = false;
      form!: FormGroup;
      data: any;

      constructor(private formBuilder: FormBuilder,
        private videoService: VideoConversionService) {}

      creatForm() {
        this.form = this.formBuilder.group({
          mp4: ['', Validators.required],
        });
      }

      ngOnInit(): void {
        this.creatForm();
      }

      convertVideo() {
        this.submitted = true;
        this.videoService.conversion(this.form.value).subscribe(res => {
          this.data = res;
          // console.log(this.data)
          // console.log(this.form.value)
        });
      }
    }

    This is my service file that handles the backend API in my Angular app


    import { Injectable } from '@angular/core';
    import { HttpClient } from '@angular/common/http';

    @Injectable({
      providedIn: 'root'
    })
    export class VideoConversionService {

      constructor(private httpClient: HttpClient) { }

      conversion(data: any) {
        return this.httpClient.post('http://localhost:3000/mp4tomp3', data);
      }
    }

    This is the screenshot of the Chrome error when I click the convert button.

    This is the screenshot of the Node.js error when I click the convert button.

  • Node.js Live Streaming Audio with FFmpeg

    20 April 2021, by nicnacnic

    I'm trying to create an Express server to live stream audio captured from another application (Discord in this case). I'm able to get a server up and running, but there are a couple issues that need to be solved. Here's my server code so far.


    const express = require('express');
    const ffmpeg = require('fluent-ffmpeg');
    // `audio` is the readable stream of raw PCM captured from the other application

    const app = express();
    app.get("/", function(req, res) {
        res.sendFile(__dirname + "/index.html");
    });
    app.get("/audio", function(req, res) {
        const stream = ffmpeg(audio).inputOptions(["-f", "s16le", "-ar", "48k", "-ac", "2"]).format('wav');
        res.writeHead(200, { "Content-Type": "audio/wav" });
        stream.pipe(res);
    });
    app.listen(8080);

    1. Silent sections of audio need to be added. When there is no activity on the input, no data is written to the audio variable. This causes weird behavior: for example, I can speak and the audio comes through a second later; then, if I wait 10 seconds and speak again, the audio comes through 4-5 seconds later. I believe this is a problem with the way I'm using ffmpeg to transcode, but I have no idea how to fix it.

    2. Refreshing the client crashes the program. Every time I refresh the client I get an ffmpeg error: Error: Output stream closed. This error doesn't happen if I close the client, only on reload.

    3. The audio is not synced between clients. Every time I open a new connection, the audio starts playing from the beginning instead of being synced with the other clients and playing live.

    This is how it's supposed to work: it captures audio from my app in PCM, converts the audio to WAV with ffmpeg, and then streams the audio live to the clients. The audio needs to be synced across all the clients as well as possible to reduce delay. I'm using fluent-ffmpeg instead of plain ffmpeg for the transcoding.
    Thanks!